5 votes

I've created a TCustomControl derived class for a VCL application running on a Windows 8.1 tablet.

I'm using the OnMouseDown / OnMouseUp events even though this is obviously touch-based.

What I'd like to do is detect a long press, i.e. touch down and hold for 1 second. So in the OnMouseDown event I record the down timestamp, set a flag to indicate the mouse is down, and create an anonymous thread which sleeps for 1 second and then checks the flag.

In OnMouseUp I set the flag to false.
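
A minimal sketch of that approach (which, as described below, only works if the finger moves), overriding MouseDown/MouseUp on the control; FPressed and DoLongPress are invented names for this illustration:

    // Sketch only: FPressed is a Boolean field, DoLongPress a hypothetical handler.
    procedure TMyControl.MouseDown(Button: TMouseButton; Shift: TShiftState;
      X, Y: Integer);
    begin
      inherited;
      FPressed := True;
      TThread.CreateAnonymousThread(
        procedure
        begin
          Sleep(1000);                   // wait for the long-press interval
          if FPressed then               // finger still down after one second?
            TThread.Queue(nil,
              procedure
              begin
                DoLongPress;             // hypothetical long-press handler
              end);
        end).Start;
    end;

    procedure TMyControl.MouseUp(Button: TMouseButton; Shift: TShiftState;
      X, Y: Integer);
    begin
      inherited;
      FPressed := False;
    end;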

This works as long as you wiggle your finger on the control. Otherwise, if you just touch and hold, the mouse-down event is not called until you release your finger.

I've looked at gestures, but they look like complete overkill and, from what I understand, don't support long press anyway.

Thanks for any suggestions.

Richard

How about just activating a timer with a 1000 ms interval when MouseDown occurs and cancelling it on MouseUp (sketched below)? Put your code in the OnTimer event; it should only fire after the mouse has been held down for over a second. You could probably get away with one timer for the whole form as long as you record which control started it. Don't forget to disable the timer immediately in the OnTimer event. - penarthur66
Thanks, but that won't work for the same reason as the anonymous thread: MouseDown doesn't fire unless you wiggle your finger or lift it. - Richard Chamberlain
I don't have a Windows tablet, but I just added a TButton and a TMemo to a new VCL form (Berlin) and added code to append a line to the memo in the MouseDown and MouseUp event handlers. Both events clearly fire individually and in sync with the mouse clicks. There must be a difference between the way the tablet fires the events and the way a standard PC mouse does. - penarthur66
Thanks, yes, it is different from standard mouse handling. If I run the application on my desktop PC and use my mouse rather than my finger, it behaves as expected. - Richard Chamberlain
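
For illustration, here is roughly what the timer variant from the first comment looks like; as the follow-up comments note, it hits the same limitation on the tablet because OnMouseDown itself arrives late. Names such as FPressTimer and FPressedControl are made up for this sketch:

    // Sketch of the form-level timer approach suggested in the comments.
    procedure TForm1.ControlMouseDown(Sender: TObject; Button: TMouseButton;
      Shift: TShiftState; X, Y: Integer);
    begin
      FPressedControl := TControl(Sender); // remember which control started it
      FPressTimer.Interval := 1000;
      FPressTimer.Enabled := True;
    end;

    procedure TForm1.ControlMouseUp(Sender: TObject; Button: TMouseButton;
      Shift: TShiftState; X, Y: Integer);
    begin
      FPressTimer.Enabled := False;        // released before one second: no long press
    end;

    procedure TForm1.FPressTimerTimer(Sender: TObject);
    begin
      FPressTimer.Enabled := False;        // one-shot: disable immediately
      // FPressedControl has now been held down for over a second
    end;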

2 Answers

2 votes

What you're facing is the "normal" yet ridiculously stupid behaviour of Windows with touch input devices. We've been facing the same issue for a while now and have been actively trying to solve it for the last couple of weeks. The trick is that Windows handles a touch input device as a "mouse that can be fully controlled with a single finger". Therefore, it has several internal states, more than it has events:

The events are: touch begin (down), move, up, right click, and, if you really want to register for a callback, stationary (not moved since the last report but still down).

Meanwhile, there are more states than that. First, when you press your finger down, Windows internally detects a "touch down" event. It then waits a certain amount of time (which may or may not be changeable by hacking, depending on the drivers etc.) to determine whether you wanted a right click (if you release after time X but before time Y, it fires a right-click event) or a left button down (time Y is LONGER than time X; once Y expires it fires a left-down event, not a click, i.e. the TouchBegin event).
If, while waiting for time X to expire, the initial touch is interrupted by a move or by lifting the finger before X runs out, Windows immediately triggers a left down (on move) or a left click (on release). The same applies at any time after Y expires, or whenever a move is triggered.

To put it more simply and understandably, I'll try to give an example:

Let's say you have a touch device and the timings are as follows: point B is the beginning, at 0 seconds on our timeline; point X is the first trigger breakpoint, at 5 seconds; point Y is the second trigger breakpoint, at 10 seconds.

You put your finger on the touch screen, and that starts the timeline. Possible scenarios:

  • You release at any time before point X -> a left click is triggered (touch down and up events, immediately one after the other).
  • You release after point X but before point Y -> a right click is triggered (touch down and up events, with the right-click flag).
  • You reach point Y -> left down is triggered!
  • You release at any time after point Y, whether or not a move is triggered before the release (no move was triggered before reaching this point on the timeline) -> left up is triggered.
  • You perform a move before reaching point X -> left down is triggered, followed by as many move events as are detected, until you release -> then left up is triggered.
  • You perform a move after X but before Y -> same as above!
  • You reach point Y and move afterwards -> see scenario 4 above.

Hope this is making any sense to you.

Now to jump to the possible solutions: a custom driver would be one option, but we're not there yet, so we're still trying other options. The most promising for now seems to be the use of RAW_INPUT and touch hooks; it looks like we'll have to combine them to get what we want, though. The bright side is that Windows itself DOES detect the moment a finger touches the device, regardless of the timings and events it then wants to determine and forward to apps; for some reason they just made it hard to use that way. As proof, simply watch the transparent dot that appears underneath your finger the very first moment you touch the screen.
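
As a related illustration (not the RAW_INPUT/hook route described above, just a sketch): registering the control for raw WM_TOUCH messages via the Win32 RegisterTouchWindow API should deliver a TOUCHEVENTF_DOWN notification as soon as the finger lands, before the mouse-emulation state machine has decided anything. FLongPressTimer is an assumed one-shot timer (Interval = 1000) that runs the long-press code:

    // uses Winapi.Windows, Winapi.Messages, Vcl.Controls, Vcl.ExtCtrls
    type
      TMyControl = class(TCustomControl)
      private
        FLongPressTimer: TTimer;  // assumed: disabled by default, fires the long-press code
        procedure WMTouch(var Msg: TMessage); message WM_TOUCH;
      protected
        procedure CreateWnd; override;
      end;

    procedure TMyControl.CreateWnd;
    begin
      inherited;
      // Ask Windows for raw WM_TOUCH messages instead of waiting for the
      // emulated mouse events.
      RegisterTouchWindow(Handle, 0);
    end;

    procedure TMyControl.WMTouch(var Msg: TMessage);
    var
      Inputs: array of TTouchInput;
      Count, i: Integer;
    begin
      Count := LoWord(Msg.WParam);
      SetLength(Inputs, Count);
      if GetTouchInputInfo(HTOUCHINPUT(Msg.LParam), Count, @Inputs[0],
        SizeOf(TTouchInput)) then
      try
        for i := 0 to Count - 1 do
          if (Inputs[i].dwFlags and TOUCHEVENTF_DOWN) <> 0 then
            FLongPressTimer.Enabled := True    // finger down: start the countdown
          else if (Inputs[i].dwFlags and TOUCHEVENTF_UP) <> 0 then
            FLongPressTimer.Enabled := False;  // finger up: cancel it
      finally
        CloseTouchInputHandle(HTOUCHINPUT(Msg.LParam));
      end;
      Msg.Result := 0;  // handled
    end;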

Meanwhile, Android handles these things way better...

Hope it helps, and I'm happy to come back with a complete solution once we get something good enough for real use.

M.*

0 votes

If the MouseDown event is not triggered and you want to make your delay counter dependent on it, then you are basically screwed. That said, I assume you are open to slight changes of concept. :)

I think it can still be done if you change your button into something like a flip-switch. The user has to "grab" (touch) it, "pull it up" (drag, should trigger StartDrag event) and hold it there for the desired amount of time. Upon release of the switch (MouseDown event), it snaps back down.

To design such a button, you could use a tiny TPanel for the knob and a slightly larger one allowing limited vertical movement on it. Set the BevelInner property to bvLowered and BevelOuter to bvRaised to make it look like a frame, with the knob panel as its child. Or use a TImage to display the knob in its up/down positions. In any case, set the knob element's DragKind property to dkDrag or the StartDrag event won't occur. If still no event is fired, please try to detect touch input capabilities and report your findings.
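
A rough sketch of that setup, assuming design-time panels named FramePanel and KnobPanel whose OnStartDrag/OnEndDrag events are wired to the handlers below; FHoldStart, FKnobRestTop and DoLongPress are hypothetical names, and MilliSecondsBetween comes from System.DateUtils:

    // Sketch only: all identifiers except the VCL properties are invented.
    procedure TForm1.FormCreate(Sender: TObject);
    begin
      FramePanel.BevelInner := bvLowered;
      FramePanel.BevelOuter := bvRaised;
      KnobPanel.DragKind := dkDrag;          // without this, OnStartDrag never fires
      KnobPanel.DragMode := dmAutomatic;     // assumed, so no BeginDrag call is needed
      FKnobRestTop := KnobPanel.Top;
    end;

    procedure TForm1.KnobPanelStartDrag(Sender: TObject; var DragObject: TDragObject);
    begin
      FHoldStart := Now;                     // the user has "grabbed" the switch
    end;

    procedure TForm1.KnobPanelEndDrag(Sender, Target: TObject; X, Y: Integer);
    begin
      // treat it as a long press only if the switch was held for at least a second
      if MilliSecondsBetween(Now, FHoldStart) >= 1000 then
        DoLongPress;                         // hypothetical long-press handler
      KnobPanel.Top := FKnobRestTop;         // snap the knob back down
    end;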

Not sure this qualifies for an answer but maybe it's worth a swing.