I'm working on a Qt5 application that makes use of raw input data from a touchscreen, because touch is not fully supported by my Linux kernel (2.6.32). I wrote a parser for the raw data coming from /dev/input/eventX: even though X doesn't support touch, the events still show up there.
I'm able to read the events just fine, and I wrote a wrapper class called "TouchPoint" which holds the tracking ID and (x, y) coordinates of each touch point, as well as a boolean indicating whether the touch point is currently "active" (i.e. the user currently has that finger on the screen). It's worth noting that I also output the number of touch points currently on the screen, and that value is accurate.
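For context, the reading loop boils down to the usual evdev pattern. This is a heavily simplified sketch, not my exact code: readTouchEvents is just an illustrative name, "event2" is only an example device node, and the commented-out calls stand in for my real slot/TouchPoint bookkeeping.

#include <fcntl.h>
#include <unistd.h>
#include <linux/input.h>

// Heavily simplified reading loop; the commented-out calls show where the
// TouchPoint bookkeeping hooks in.
void readTouchEvents() {
    int fd = open("/dev/input/event2", O_RDONLY);
    if(fd < 0) return;
    struct input_event ev;
    while(read(fd, &ev, sizeof(ev)) == (ssize_t)sizeof(ev)) {
        if(ev.type == EV_ABS) {
            switch(ev.code) {
                case ABS_MT_SLOT:        /* cSlot = ev.value;                  */ break;
                case ABS_MT_TRACKING_ID: /* touchPoints[cSlot].setID(ev.value) */ break;
                case ABS_MT_POSITION_X:  /* touchPoints[cSlot].setX(ev.value)  */ break;
                case ABS_MT_POSITION_Y:  /* touchPoints[cSlot].setY(ev.value)  */ break;
            }
        } else if(ev.type == EV_SYN && ev.code == SYN_REPORT) {
            // A complete multi-touch frame has been delivered.
        }
    }
    close(fd);
}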
My issue is that I can't figure out how to accurately simulate a mouse click with this data. With multi-touch events in Linux, each touch point is assigned a "tracking ID" when the user presses a finger to the screen, and when that finger is lifted, an event setting that slot's tracking ID to -1 is generated. I use this transition of the ID to and from -1 to track whether a touch point is "down":
void TouchPoint::setID(int nid) {
    // Going from -1 to a valid tracking ID means this finger just touched down.
    if((id == -1) && (nid >= 0)) down = true;
    // A tracking ID of -1 means the finger was lifted from this slot.
    else if(nid == -1) down = false;
    id = nid;
}
So to try and simulate a mouse click, I do this when a tracking ID event is read:
else if(event.code == ABS_MT_TRACKING_ID) {
    bool before = touchPoints[cSlot].isDown(); // cSlot is the slot the current events refer to
    touchPoints[cSlot].setID(event.value);
    bool after = touchPoints[cSlot].isDown();
    // A transition from "up" to "down" is treated as a new touch.
    if(!before && after) touch(touchPoints[cSlot]);
}
And then the touch method does this:
void MainWindow::touch(TouchPoint tp) {
    if(ui->touchMe->rect().contains(QPoint(tp.getX(), tp.getY()))) {
        on_touchMe_clicked();
    }
}
The button does not respond when I touch it directly. Sometimes, if I flail my fingers wildly around the screen, the message box that should appear when the button is pressed does show, but when it does, my fingers are always somewhere else on the screen.
Can anyone think of something I might be doing wrong here? Is my method of checking to see if the touch point is within the button's bounds wrong? Or is my logic for keeping track of the "down" status of the touch points wrong? Or something else entirely?
UPDATE: I just noticed two things by outputting the screen coordinates of both the touch location and the button location.
- The digitizer resolution is larger than the screen resolution, so I needed to scale the X and Y coordinates coming from the raw events.
- My coordinates are offset because they are screen (global) coordinates, while the button's rect() is in widget-local coordinates (see the sketch after this list).
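Roughly, the correction I'm applying now looks like this. It's only a sketch: hitsButton is an illustrative helper name, fd is the already-open event device, and rawX/rawY come straight from the ABS_MT_POSITION_* values.

#include <sys/ioctl.h>
#include <linux/input.h>
#include <QApplication>
#include <QDesktopWidget>
#include <QPushButton>

// Map a raw digitizer coordinate pair into the button's local coordinates and
// hit-test it against the button's geometry.
bool hitsButton(int fd, int rawX, int rawY, QPushButton *button) {
    // Query the digitizer's axis ranges (could be cached at startup).
    struct input_absinfo absX, absY;
    ioctl(fd, EVIOCGABS(ABS_MT_POSITION_X), &absX);
    ioctl(fd, EVIOCGABS(ABS_MT_POSITION_Y), &absY);

    // Scale from the digitizer's resolution to the screen resolution.
    QRect screen = QApplication::desktop()->screenGeometry();
    int screenX = (rawX - absX.minimum) * screen.width()  / (absX.maximum - absX.minimum);
    int screenY = (rawY - absY.minimum) * screen.height() / (absY.maximum - absY.minimum);

    // screenX/screenY are global coordinates; rect() is in widget-local
    // coordinates, so convert before testing.
    QPoint local = button->mapFromGlobal(QPoint(screenX, screenY));
    return button->rect().contains(local);
}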
UPDATE 2: I've now dealt with the two issues mentioned in my last update, but I'm still not able to simulate a touch/mouse event accurately enough to interact with the interface. I also modified my TouchPoint class to have a boolean flag indicating whether the touch point was just updated; it is set when the tracking ID is updated and cleared when an EV_SYN event is raised, so that I have the X and Y position events before creating the event for the application. I tried the following:
- Using the QApplication class to post a mouse event carrying the position of the touch point to the QApplication::desktop()->screen() widget (rough sketch below).
- Using the QTest class to raise a touchEvent on the QApplication::desktop()->screen() widget, calling the press and release methods with the given slot and position of the touch point, and then using the bool event(QEvent*) method to try to catch the event (rough sketch below). I also enabled the WA_AcceptTouchEvents attribute on the main window.
Neither of these works. For some reason, when I have the bool event(QEvent*) method in place, the signal emitted from the thread reading /dev/input/eventX doesn't trigger the slot in the main window class, and I can't seem to find any other way to simulate the events. Anyone have any ideas?
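For reference, the mouse-event attempt was roughly along these lines (a sketch from memory rather than the exact code; postSyntheticClick is just an illustrative name and globalPos is the already-scaled screen position of the touch point):

#include <QApplication>
#include <QDesktopWidget>
#include <QMouseEvent>

// Post a synthetic press/release pair at the touch location.
void postSyntheticClick(const QPoint &globalPos) {
    QWidget *target = QApplication::desktop()->screen();
    QPoint local = target->mapFromGlobal(globalPos);

    QMouseEvent *press = new QMouseEvent(QEvent::MouseButtonPress, local, globalPos,
                                         Qt::LeftButton, Qt::LeftButton, Qt::NoModifier);
    QMouseEvent *release = new QMouseEvent(QEvent::MouseButtonRelease, local, globalPos,
                                           Qt::LeftButton, Qt::NoButton, Qt::NoModifier);
    QApplication::postEvent(target, press);    // postEvent takes ownership
    QApplication::postEvent(target, release);
}

And the QTest attempt was along these lines (again a sketch; it needs QT += testlib, and QTest::createTouchDevice() is an assumption on my part about newer Qt 5 releases, older ones need a manually registered QTouchDevice):

#include <QtTest/QtTest>
#include <QApplication>
#include <QDesktopWidget>
#include <QTouchDevice>

// Press and then release a touch point for the given slot at the given position.
void postSyntheticTouch(int slot, const QPoint &pos) {
    static QTouchDevice *device = QTest::createTouchDevice();
    QWidget *target = QApplication::desktop()->screen();
    // Each sequence auto-commits (sends its QTouchEvent) when it goes out of scope.
    QTest::touchEvent(target, device).press(slot, pos, target);
    QTest::touchEvent(target, device).release(slot, pos, target);
}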