I'm creating a Windows Forms application that has a couple of clickable panels that require the touchscreen equivalent of the MouseDown and MouseUp events.
When I'm testing with a touchscreen, tapping the panels doesn't raise those events the way a mouse click does.
From a little reading, I think you need to override WndProc and look for WM_TOUCH messages.
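A minimal sketch of that approach, assuming Windows 7 or later: the window has to opt in with RegisterTouchWindow before WM_TOUCH is delivered, the class name TouchAwareForm is just a placeholder, and decoding the actual touch points (GetTouchInputInfo) is omitted here.

using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

public class TouchAwareForm : Form   // hypothetical name
{
    private const int WM_TOUCH = 0x0240;

    // Opts this window in to receiving WM_TOUCH messages (Windows 7+).
    [DllImport("user32.dll")]
    private static extern bool RegisterTouchWindow(IntPtr hWnd, uint ulFlags);

    protected override void OnHandleCreated(EventArgs e)
    {
        base.OnHandleCreated(e);
        RegisterTouchWindow(Handle, 0);
    }

    protected override void WndProc(ref Message m)
    {
        if (m.Msg == WM_TOUCH)
        {
            // The touch points sit behind m.LParam and are decoded with
            // GetTouchInputInfo; that plumbing is left out of this sketch.
        }
        base.WndProc(ref m);
    }
}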
Have a look at the Windows 7 Multitouch .NET Interop Sample Library, which has examples of handling touch and gestures in WinForms.
You have to override WndProc, capture the relevant messages, and raise your MouseDown and MouseUp events manually:
using System.Drawing;
using System.Security.Permissions;
using System.Windows.Forms;

// Inside your Panel (or Form) subclass.
// WM_POINTER messages are only sent on Windows 8 and later.
public const int WM_POINTERDOWN = 0x0246;
public const int WM_POINTERUP = 0x0247;

[PermissionSet(SecurityAction.Demand, Name = "FullTrust")]
protected override void WndProc(ref Message m)
{
    base.WndProc(ref m); // m must be passed by ref

    switch (m.Msg)
    {
        case WM_POINTERDOWN:
        {
            // MousePosition is in screen coordinates; convert to client
            // coordinates before raising the event. OnMouseDown raises the
            // MouseDown event; a base-class event can't be invoked directly
            // from a derived class.
            Point pt = PointToClient(MousePosition);
            OnMouseDown(new MouseEventArgs(MouseButtons.Left, 1, pt.X, pt.Y, 0));
            break;
        }
        case WM_POINTERUP:
        {
            Point pt = PointToClient(MousePosition);
            OnMouseUp(new MouseEventArgs(MouseButtons.Left, 1, pt.X, pt.Y, 0));
            break;
        }
    }
}
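With that override in place, ordinary event wiring picks up touch input as well. A minimal usage sketch, assuming the override lives in a hypothetical TouchPanel subclass of Panel:

var panel = new TouchPanel { Dock = DockStyle.Fill };  // TouchPanel is hypothetical
panel.MouseDown += (s, e) => Console.WriteLine("down at " + e.Location);
panel.MouseUp += (s, e) => Console.WriteLine("up at " + e.Location);
Controls.Add(panel);  // e.g. in your Form's constructor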
I'm not completely sure about this, but have you tried using a tap event to capture the touch, rather than the Click event?