2
votes

I'm working on determining if a certain touchscreen will be compatible with an application and recently got a loaner model of an Elo 2402L touchscreen. I've installed the driver the company provides and was able to see multi-touch events using the evtest utility (parser for /dev/input/eventX).

The thing is that I'm running Scientific Linux 6.4, which uses Linux kernel 2.6.32. I've seen a lot of mixed information on touchscreen compatibility for Linux kernels before 3.x.x. Elo says that their driver only supports single-touch for 2.6.32. Also, I've seen people say that the majority of the compatibility issues with touch events in this kernel version are with Xorg interfaces.

I developed a very simple Qt5 application to test whether Qt could detect the touch events or not, because I'm not sure whether Qt applications are X-based and if they read events directly from /dev/input or something else.

A simple mouse event handler registers mouse events correctly, but the touch event handler I added never fires when I touch the screen. The driver Elo provides does beep on each touch, so I know that SOMETHING is registering the contact, yet neither the desktop nor this application seems to recognize it as a touch event.

Also, yes, the WA_AcceptTouchEvents attribute is set to true in the window's constructor.

I have a simple mainwindow.h:

...
protected:
    int touchEvent(QTouchEvent *ev);
...

And mainwindow.cpp:

MainWindow::MainWindow(QWidget *parent) {
    ...
    setAttribute(Qt::WA_AcceptTouchEvents, true);
    touchPoints = 0;
}
...
int MainWindow::touchEvent(QTouchEvent *ev) {
    switch(ev->type()) {
        case QEvent::TouchBegin:
            touchPoints++;
            break;
        case QEvent::TouchEnd:
            touchPoints--;
            break;
    }

    ui->statusBar->showMessage("Touch Points: " + touchPoints);
}

Is there something wrong with the way I'm using the touch event handler? Or is there some issue with the device itself? Does Qt read input events directly from /dev/input, or does it get its input events from X?

Very confused here, as I haven't used Qt before and want to narrow down the cause before I say that it's the device causing the issue.

Also, if anyone has any insight into the device / kernel compatibility issue, that would be extremely helpful.


2 Answers

1
vote

The QTouchEvent documentation says:

Touch events occur when pressing, releasing, or moving one or more touch points on a touch device (such as a touch-screen or track-pad). To receive touch events, widgets have to have the Qt::WA_AcceptTouchEvents attribute set and graphics items need to have the acceptTouchEvents attribute set to true.

Probably you just need to call setAttribute(Qt::WA_AcceptTouchEvents, true) inside the MainWindow constructor.
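The quoted documentation covers two cases: widgets take the window attribute, while items in a QGraphicsScene opt in through a method on the item itself. A minimal, untested sketch of both (class and function names here are illustrative, not from the question):

```cpp
#include <QMainWindow>
#include <QGraphicsRectItem>

// Widget case: opt in to touch events in the constructor.
class TouchWindow : public QMainWindow {
public:
    explicit TouchWindow(QWidget *parent = nullptr) : QMainWindow(parent) {
        setAttribute(Qt::WA_AcceptTouchEvents, true);
    }
};

// Graphics-item case: the flag is a method on QGraphicsItem instead.
void makeItemTouchable(QGraphicsRectItem *item) {
    item->setAcceptTouchEvents(true);
}
```

Note that the attribute only makes Qt *deliver* touch events to the widget; the widget still has to handle them, typically in an overridden event().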

0
votes

Is there something wrong with the way I'm using the touch event handler?

There is no touch event handler. If you change:

int touchEvent(QTouchEvent *ev);

to:

int touchEvent(QTouchEvent *ev) override;

(which you should always do when you are trying to override virtual functions so you can catch exactly this kind of mistake), you'll see that there is no such function for you to override. What you need to override is the event() handler:

protected:
    bool event(QEvent *ev) override;

You need to check for touch events there:

bool MainWindow::event(QEvent *ev)
{
    switch (ev->type()) {
    case QEvent::TouchBegin:
        touchPoints++;
        break;
    case QEvent::TouchEnd:
        touchPoints--;
        break;
    default:
        return QMainWindow::event(ev);
    }

    // Note: "Touch Points: " + touchPoints would offset the string
    // literal's pointer, not append the number, so format explicitly.
    ui->statusBar->showMessage(QString("Touch Points: %1").arg(touchPoints));
    return true;
}

However, it might be better to work with gestures instead of touch events. But I don't know what kind of application you're writing. If you wanted to let Qt recognize gestures rather than implementing them yourself through touch events, you would first grab the gestures you want, in this case pinching:

setAttribute(Qt::WA_AcceptTouchEvents);
grabGesture(Qt::PinchGesture);

and then handle it:

bool MainWindow::event(QEvent *ev)
{
    if (ev->type() != QEvent::Gesture) {
        return QMainWindow::event(ev);
    }

    auto* gestEv = static_cast<QGestureEvent*>(ev);
    if (auto* gest = gestEv->gesture(Qt::PinchGesture)) {
        auto* pinchGest = static_cast<QPinchGesture*>(gest);
        auto sf = pinchGest->scaleFactor();

        // You could use the pinch scale factor here to zoom an image,
        // for example.

        ev->accept();
        return true;
    }
    return QMainWindow::event(ev);
}

Working with gestures instead of raw touch events has the advantage of using the platform's gesture-recognition facilities, like those of Android and iOS. But again, I don't know what kind of application you're writing or what platform you're targeting.