What you're facing is the "normal" yet ridiculously stupid behaviour of Windows with touch input devices. We've been facing the same issue for a while now, and have been actively trying to solve it for the last couple of weeks.
The trick is that Windows handles a touch input device as "a mouse that can be fully controlled with a single finger". Therefore, it has several states, more of them than events:
The events are: touch begin (down), move, up, and right click; and if you register for a callback you'll also get stationary (not moved since the last report, but still down).
Meanwhile, there are more states. When you first press your finger down, Windows internally detects a "touch down!" event. It then waits a certain amount of time (which may or may not be hack-changeable, depending on drivers etc.) to determine what you wanted: a right click (if you release after time X but !before time Y!, it fires a right-click event) or a left button down (after time Y, which is LONGER than X, expires, it fires a left-down event, not a click!).
Meanwhile, if the first touch down is interrupted, either by a move or by releasing the finger !!while still waiting for time X to expire!!, it immediately triggers a left down (on move) or a left click (on release). The same goes for any time after Y expires, or whenever a move is triggered.
To put it more simply and understandably, I'll try to give an example:
Let's say that you have a touch device, and the timings are as follows:
Point B is the beginning, positioned at 0 seconds of our timeline.
Point X is the first trigger breakpoint, positioned at 5 seconds of our timeline.
Point Y is the second trigger breakpoint, positioned at 10 seconds of our timeline.
You put your finger on the touch device, and that starts the timeline.
Possible scenarios:
- You release it any time before point X -> a left click is triggered (touch down and up events, in order, immediately one after another)
- You release it after point X but before point Y -> a right click is triggered (touch down and up events, with the right-click flag)
- You reach point Y -> left down is triggered!
- You release it any time after point Y, whether or not a move fires between point Y and the release (but no move was triggered before reaching this point in the timeline) -> left up is triggered.
- You trigger (do) a move event before reaching point X -> left down is triggered, followed by as many move events as it detects, until you release -> then left up is triggered
- You trigger (do) a move event after X but before Y -> same as above!
- You reach point Y and move afterwards -> see scenario no. 4 above.
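The whole decision tree above can be sketched as a tiny classifier. This is purely a model of the behaviour described, not Windows code, and it uses the hypothetical X = 5 s and Y = 10 s thresholds from the example (real drivers use much shorter timings):

```python
# Model of the Windows touch-promotion logic described above.
# X and Y are the example breakpoints, not real driver values.
X = 5.0   # first breakpoint: quick tap vs. press-and-hold
Y = 10.0  # second breakpoint: press-and-hold vs. left-down

def classify(release_time, move_time=None):
    """Return the event sequence Windows would synthesize.

    release_time: seconds after touch-down when the finger lifts.
    move_time:    seconds after touch-down when the finger first moves,
                  or None if it never moves before release.
    """
    # A move before release promotes the contact to a left drag right away.
    if move_time is not None and move_time < release_time:
        return ["left down", "move...", "left up"]
    if release_time < X:
        return ["left down", "left up"]     # quick tap -> left click
    if release_time < Y:
        return ["right click"]              # held past X -> right click
    return ["left down (at Y)", "left up"]  # held past Y -> down, then up

print(classify(2.0))                  # release before X
print(classify(7.0))                  # release between X and Y
print(classify(12.0))                 # release after Y
print(classify(3.0, move_time=1.0))   # move before X
```

Note that in the "held past Y" case the left down actually fires at point Y itself, before the finger lifts; the model only returns the sequence, not the timing of each event.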
Hope this makes some sense to you.
Now to jump to the possible solutions: a custom driver would be one option, but we're not there yet, so we're still trying other options. The most promising for now seems to be the use of RAW_INPUT and touch hooks; it looks like we'll have to combine them to get what we want, though.
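To illustrate the idea, conceptually, not with actual RAW_INPUT code: raw contact reports arrive the instant the finger touches, so if you consume those and suppress the promoted mouse messages, you can synthesize your own down/move/up without waiting out the X/Y timers. A rough model (the report stream here is made up):

```python
# Conceptual model only: 'raw_reports' stands in for what RAW_INPUT or a
# low-level hook would deliver. Each report is (timestamp, contact_down, x, y).
# The point: reports arrive immediately on contact, so we can fire our own
# events at once instead of waiting for Windows' promotion timers.

def synthesize(raw_reports):
    events, prev_down, prev_pos = [], False, None
    for t, down, x, y in raw_reports:
        if down and not prev_down:
            events.append((t, "down", x, y))   # fire the moment contact starts
        elif down and (x, y) != prev_pos:
            events.append((t, "move", x, y))   # position changed while down
        elif not down and prev_down:
            events.append((t, "up", x, y))     # contact ended
        prev_down, prev_pos = down, (x, y)
    return events

reports = [(0.0, True, 10, 10), (0.1, True, 12, 10), (0.2, False, 12, 10)]
for event in synthesize(reports):
    print(event)
```

The "down" here fires at t = 0.0, i.e. at first contact, which is exactly what the promoted mouse events refuse to give you.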
The bright side is that Windows itself DOES detect when a finger touches the device, regardless of the timings and events it wants to detect, determine and then forward to apps; they just, for some reason, made it hard to use directly. As proof, simply watch the transparent dot that appears underneath your finger the very first moment you touch the screen.
Meanwhile, Android handles these things way better...
Hope it helps, and I'm happy to come back with the complete solution once we pull through and get something good enough for use.
M.*