Question
A utility which I wrote in C++ for testing purposes currently uses the SendInput function to emulate user input, i.e. mouse and keyboard events. I'd now like to extend my program so that it can emulate touch events, letting me verify that the tested program handles e.g. WM_TOUCH messages correctly (though that message is deprecated, I'd still like to verify that it's handled correctly). I don't need to send touch events to a specific HWND; sending them to some X/Y coordinates on the desktop is sufficient for my purposes.
I considered using HARDWAREINPUT structures with SendInput, as well as sending the WM_TOUCH (or WM_GESTURE?) message directly, but in either case it's not clear to me how to pass information about where or how the touch event was performed.
The WM_TOUCH documentation explains that the lParam argument:
Contains a touch input handle that can be used in a call to GetTouchInputInfo to retrieve detailed information about the touch points associated with this message.
...but it's not clear if and how I can allocate such a 'touch input handle' myself. Is it always just a plain pointer in the address space of the current process? If so, what value should I use for the hSource field of the TOUCHINPUT structure?
Answer 1:
You are probably looking for Touch injection: the Touch Injection API (InitializeTouchInjection / InjectTouchInput), available since Windows 8, lets a process synthesize touch contacts at arbitrary desktop coordinates without allocating touch input handles yourself.
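A minimal sketch of injecting a single tap with that API is below. InitializeTouchInjection, InjectTouchInput, and the POINTER_TOUCH_INFO structure are the real Win32 names; the coordinates, pressure, and contact-area values are arbitrary example choices.

```cpp
// Inject a single touch tap at desktop coordinates (400, 300)
// using the Windows Touch Injection API (Windows 8 and later).
#include <windows.h>
#include <cstdio>

int main() {
    // Register this process as a touch injector: at most 1
    // simultaneous contact, default visual feedback.
    if (!InitializeTouchInjection(1, TOUCH_FEEDBACK_DEFAULT)) {
        std::printf("InitializeTouchInjection failed: %lu\n", GetLastError());
        return 1;
    }

    POINTER_TOUCH_INFO contact = {};
    contact.pointerInfo.pointerType = PT_TOUCH;
    contact.pointerInfo.pointerId = 0;            // first (only) contact
    contact.pointerInfo.ptPixelLocation.x = 400;  // desktop coordinates
    contact.pointerInfo.ptPixelLocation.y = 300;
    contact.touchFlags = TOUCH_FLAG_NONE;
    contact.touchMask = TOUCH_MASK_CONTACTAREA | TOUCH_MASK_ORIENTATION |
                        TOUCH_MASK_PRESSURE;
    contact.orientation = 90;
    contact.pressure = 32000;
    // A small 4x4 pixel contact area around the touch point.
    contact.rcContact.left   = contact.pointerInfo.ptPixelLocation.x - 2;
    contact.rcContact.right  = contact.pointerInfo.ptPixelLocation.x + 2;
    contact.rcContact.top    = contact.pointerInfo.ptPixelLocation.y - 2;
    contact.rcContact.bottom = contact.pointerInfo.ptPixelLocation.y + 2;

    // Touch down...
    contact.pointerInfo.pointerFlags =
        POINTER_FLAG_DOWN | POINTER_FLAG_INRANGE | POINTER_FLAG_INCONTACT;
    if (!InjectTouchInput(1, &contact)) return 1;

    // ...and lift again, completing the tap.
    contact.pointerInfo.pointerFlags = POINTER_FLAG_UP;
    if (!InjectTouchInput(1, &contact)) return 1;

    return 0;
}
```

Note that a window which has called RegisterTouchWindow receives the injected input as WM_TOUCH messages; otherwise the input arrives as WM_POINTER messages, so this also covers the deprecated-message case the question asks about.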
Source: https://stackoverflow.com/questions/44063271/how-can-i-emulate-touch-events-on-windows