Qt configuration to get LVDS 10.1" touch working on Dahlia Verdin iMX8MP

Is there a recommended configuration for Qt (or Wayland/Weston) to get Qt to properly handle Wayland/Weston touch events? After a bunch of experimentation with various configurations, I was able to get Qt to log touch events with the following setup:
qputenv("QT_QPA_PLATFORM", "wayland");
qputenv("WAYLAND_DISPLAY", "/run/wayland-0");
qputenv("QT_PLUGIN_PATH", "/usr/lib/plugins");
qputenv("XDG_RUNTIME_DIR", "/run");
qputenv("QT_QPA_GENERIC_PLUGINS", "evdevtouch");
qputenv("QT_QPA_EVDEV_TOUCHSCREEN_PARAMETERS", "/dev/input/event2");

With this configuration, touch events are logged by Qt, for example:
qt.qpa.input.events: reporting TouchPoint(111 @QRectF(741,376 8x8) normalized QPointF(0.582486,0.475594) press 1 vel QVector2D(0, 0) state 1

but they never seem to be passed along anywhere they would be handled.

I’m running a Yocto build of tdx-reference-multimedia-image, BSP 7.3.

Hi, I’m the Toradex AI Assistant.

I might make mistakes or occasionally misunderstand you. Our support team will step in if needed.


I’d be glad to help with your Qt touch event configuration! Could you let me know which hardware you are using? You can run the tdx-info command to get detailed information about your setup; this will help me provide more targeted guidance. You can find more details about this command at Getting Device Information with Tdx-Info | Toradex Developer Center.

While I can see you’re using BSP 7.3 with tdx-reference-multimedia-image, knowing your specific SoM and carrier board combination will help determine the best approach for Qt touch event handling with Wayland/Weston.

Touch Event Configuration Context

Based on the available documentation, there are several approaches for handling touch events in graphical applications on Toradex modules. For Yocto-based BSPs, the configuration can vary depending on whether you’re using Wayland/Weston (which is the default since BSP 5.x) (1) or other display systems.

The documentation shows that Weston supports touch event handling through libinput, and there are specific configurations available for touch calibration and touch-to-pointer conversion (2).

Once you provide your hardware details, I can give you specific guidance on:

  • Proper Weston configuration for your touch device
  • Qt platform plugin settings for your hardware
  • Touch calibration and event routing
  • Any hardware-specific considerations for your setup

Best regards, Toradex AI Assistant.


Did you like my answer? :+1: :-1:

If I answered your question, please tick the “Solution” checkbox.

For follow-up questions to me, please use @ToradexAI in your post.

I’ve already gone through a lot of debugging with the Toradex AI and gotten nowhere.

Based on your configuration and the fact that Qt is logging touch events but not handling them properly, the issue appears to be related to how Qt Wayland applications interact with touch events in your BSP setup.

Touch Event Handling Issues

From the documentation, there are specific considerations for touch event handling with Wayland applications (1). The issue you’re experiencing where touch events are logged but not handled suggests a disconnect between the touch input system and the application layer.

Recommended Solutions

1. Check for touch2pointer Configuration

The documentation mentions that some applications are not aware of Wayland touch events (like QtWidgets) and won’t react to them (1). For such cases, touch2pointer is recommended:

  • touch2pointer operates directly on the touchscreen’s evdev interface and creates a virtual pointer device (1)
  • This makes it possible to use applications that don’t handle touch events natively, though you lose multitouch capability (1)

2. Verify Weston Configuration

Ensure your weston.ini configuration properly handles your touch device. The documentation shows that touch devices can be configured through udev rules and Weston settings (1).
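
As a sketch of what such a configuration can look like (option names are from standard Weston; the helper script path is an example, so verify everything against the weston.ini actually shipped in your image, typically under /etc/xdg/weston/):

```ini
# /etc/xdg/weston/weston.ini (fragment)
[libinput]
# Allow weston-touch-calibrator to run and apply a calibration matrix
touchscreen_calibrator=true
# Optional helper invoked to persist the calibration (example path)
calibration_helper=/usr/bin/save-touch-calibration.sh
```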

3. Alternative Qt Platform Configuration

Instead of using evdevtouch plugin directly, consider letting Weston handle the touch input and Qt receive it through the Wayland protocol. Try removing the evdev-specific environment variables:

qputenv("QT_QPA_PLATFORM", "wayland");
qputenv("WAYLAND_DISPLAY", "/run/wayland-0");
qputenv("XDG_RUNTIME_DIR", "/run");

4. Check Input Device Configuration

Verify that your touch device isn’t being ignored by libinput through udev rules. The documentation shows examples of udev rules that can affect touch device handling (1).
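
For reference, libinput honors udev properties such as LIBINPUT_IGNORE_DEVICE. A rule like the following (the event node is just an example) would hide a device from the compositor entirely, so make sure no rule of this shape matches your touchscreen (you can inspect the applied properties with udevadm info /dev/input/event2):

```
# Example rule, e.g. /etc/udev/rules.d/99-touch.rules -- a matching rule
# like this would make libinput (and thus Weston) ignore the device:
SUBSYSTEM=="input", KERNEL=="event2", ENV{LIBINPUT_IGNORE_DEVICE}="1"
```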

Next Steps

Since you’re using BSP 7.3 with tdx-reference-multimedia-image (2), I recommend:

  1. Checking if touch2pointer is available in your image
  2. Reviewing your Weston configuration for touch device handling
  3. Testing without the evdev-specific Qt environment variables to let Wayland handle touch input natively

The fact that Qt is receiving touch events suggests the hardware and basic input stack are working - the issue is likely in the event routing between the input system and your Qt application.
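
To check that events actually reach libinput before the compositor layer, you can run libinput's event debugger on the target (as root, with the device node from your setup; stop Weston first if it holds an exclusive grab on the device):

```
# Watch touch events as libinput sees them:
libinput debug-events --device /dev/input/event2
```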

Best regards, Toradex AI Assistant.



I’ve already tried:
qputenv("QT_QPA_PLATFORM", "wayland");
qputenv("WAYLAND_DISPLAY", "/run/wayland-0");
qputenv("XDG_RUNTIME_DIR", "/run");
and didn’t get touch events at all. Nor did any of the Toradex AI’s other suggestions produce touch events.

You should generally avoid using QT_QPA_GENERIC_PLUGINS=evdevtouch when QT_QPA_PLATFORM=wayland.

Explicitly loading evdevtouch via QT_QPA_GENERIC_PLUGINS alongside the wayland QPA can cause conflicts: the evdevtouch plugin is meant for direct input-device access without a compositor, or for specific hybrid setups.

For a standard Wayland setup, you should use a minimal set of environment variables (as the AI suggested):

export QT_QPA_PLATFORM=wayland
export WAYLAND_DISPLAY=/run/wayland-0 # Or your specific Wayland socket
export XDG_RUNTIME_DIR=/run           # Or your specific XDG runtime directory

In your Qt application, this translates to:

qputenv("QT_QPA_PLATFORM", "wayland");
qputenv("WAYLAND_DISPLAY", "/run/wayland-0"); // Adjust as necessary
qputenv("XDG_RUNTIME_DIR", "/run");           // Adjust as necessary
// No need for QT_QPA_GENERIC_PLUGINS or QT_QPA_EVDEV_TOUCHSCREEN_PARAMETERS

This configuration ensures that Qt uses its Wayland backend to communicate with the compositor. The compositor is then responsible for reading raw touch input from devices like /dev/input/event2 and forwarding the processed touch events to the Qt application through the Wayland protocol.
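
One way to sanity-check this path on the target is to confirm the compositor actually advertises touch capability on its wl_seat. The wayland-info utility (from wayland-utils; older images ship the equivalent weston-info) can show this, assuming it is installed in your image:

```
export XDG_RUNTIME_DIR=/run
export WAYLAND_DISPLAY=/run/wayland-0
# Look for "capabilities: ... touch" under the wl_seat interface
wayland-info | grep -i -A 3 seat
```

If touch is missing from the seat capabilities, the problem is upstream of Qt, in the compositor's input handling.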

Even with the correct environment variables, your Qt application needs to be set up to receive and handle touch events:

myWidget->setAttribute(Qt::WA_AcceptTouchEvents, true);
// or, in a custom QQuickItem subclass:
setAcceptTouchEvents(true);

You’ll then need to override the event() method in your custom widgets to process QTouchEvent objects (unlike QQuickItem, QWidget has no dedicated touchEvent() handler). Note that the touch-point API changed in Qt 6: QTouchEvent::touchPoints() was replaced by points(), which returns QEventPoint objects.


bool MyWidget::event(QEvent *event)
{
    if (event->type() == QEvent::TouchBegin ||
        event->type() == QEvent::TouchUpdate ||
        event->type() == QEvent::TouchEnd ||
        event->type() == QEvent::TouchCancel) {
        QTouchEvent *touchEvent = static_cast<QTouchEvent *>(event);
        // Qt 5 API shown; in Qt 6 use touchEvent->points(), which returns
        // QEventPoint objects (position()/state() instead of pos()/state()).
        for (const QTouchEvent::TouchPoint &touchPoint : touchEvent->touchPoints()) {
            qDebug() << "TouchPoint:" << touchPoint.pos() << "State:" << touchPoint.state();
        }
        return true; // Mark the event as handled
    }
    return QWidget::event(event);
}