Hi, I’m the Toradex AI Assistant.
I might make mistakes. Our support team will step in if needed. If I answered your question, please tick the “Solution” checkbox. For follow-up questions to me, please use @ToradexAI in your post.

Hello! Thank you for your detailed question.
For Kirkstone-based systems like yours (Verdin iMX8M Plus on Ivy carrier board), running Hailo AI pipelines within Docker containers is supported and recommended, especially for production use. Containerization provides benefits such as process isolation, simpler updates, portability, and easier rollbacks.
Docker Image for Kirkstone & GStreamer Plugins:
While there’s not a single Toradex-provided Docker image that matches every Kirkstone system configuration, you can easily build one that fits your needs: start from a Toradex-provided TorizonCore base image and install the required packages. For hardware-accelerated GStreamer elements (e.g., imxvideoconvert_g2d) and Hailo support, consider adding these packages in your container’s Dockerfile:
```dockerfile
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
        gstreamer1.0-tools \
        gstreamer1.0-plugins-base \
        gstreamer1.0-plugins-good \
        gstreamer1.0-plugins-bad \
        libhailort \
        hailortcli \
        pyhailort \
        libgsthailo \
    && rm -rf /var/lib/apt/lists/*
```
Note that the `hailo-pci` kernel driver and `hailo-firmware` packages belong on the host OS rather than in the container: containers cannot load kernel modules, so the PCIe driver and firmware must already be part of your TorizonCore image.
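For context, the snippet above would sit inside a complete Dockerfile built on a Torizon base image. A minimal sketch, assuming a Debian-based Torizon container (the base image tag is an assumption; match it to your TorizonCore release, and add Hailo’s apt feed before installing their packages):

```dockerfile
# Sketch of a container for the Hailo pipeline. The base image tag is an
# assumption -- pick the one matching your TorizonCore release.
ARG BASE_IMAGE=torizon/debian:3-bookworm
FROM ${BASE_IMAGE}

# Install GStreamer and the Hailo userspace packages listed above, e.g.:
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
        gstreamer1.0-tools \
        gstreamer1.0-plugins-base \
    && rm -rf /var/lib/apt/lists/*

# Default command: print the GStreamer version to sanity-check the install.
CMD ["gst-launch-1.0", "--version"]
```

Using an `ARG` for the base image keeps the Dockerfile reusable across TorizonCore releases without editing the file itself.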
The Accelerating Edge AI with Hailo blog post provides additional details on which packages to include for full hardware acceleration and Hailo integration.
Host OS vs. Container:
- Container: Strongly recommended for production since it maintains isolation, simplifies software lifecycle management, and integrates seamlessly with the Torizon platform features.
- Host: Technically possible; however, you lose many benefits of containerization. This method is typically reserved for advanced debugging or performance benchmarking.
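If you do go the container route, keep in mind that the Hailo accelerator and the i.MX GPU are host devices and must be passed through to the container explicitly. A minimal sketch, assuming the typical device nodes (`/dev/hailo0` for the Hailo-8 PCIe device, `/dev/galcore` for the Vivante GPU used by `imxvideoconvert_g2d`) and a hypothetical image name:

```shell
# Sketch: run the pipeline container with accelerator access. Device paths
# and the image name are assumptions -- verify them on your module, e.g.
# with `ls /dev/hailo* /dev/galcore`.
docker run --rm -it \
    --device /dev/hailo0 \
    --device /dev/galcore \
    my-hailo-pipeline:latest \
    gst-launch-1.0 videotestsrc num-buffers=30 ! fakesink
```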
Torizon Cloud, OTA Updates, and Fleet Management:
No matter which approach you choose (host or container), you can still perform OTA updates and manage your system reliably through Torizon Cloud. The Torizon platform was designed for managing both the OS and your applications, including secure OTA updates, fleet management, and remote debugging, regardless of whether your pipeline runs in a container or directly on the OS.
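In practice, Torizon Cloud delivers application updates as docker-compose files, so wrapping your pipeline container in a compose service is what makes it OTA-updatable. A minimal sketch (the service name, image reference, and device paths are placeholders to adapt):

```yaml
# Sketch: docker-compose.yml pushed to Torizon Cloud as an application
# update. Image reference and device nodes are assumptions.
version: "3.8"
services:
  hailo-pipeline:
    image: registry.example.com/my-hailo-pipeline:1.0.0
    restart: unless-stopped
    devices:
      - /dev/hailo0:/dev/hailo0
      - /dev/galcore:/dev/galcore
```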
Summary/Recommendation:
- Use containers for deployment and production—this is the most flexible and robust method.
- Build your container on top of a Torizon base image, installing GStreamer and Hailo dependencies as listed above.
- You can fully utilize Torizon Cloud for OTA and management regardless of your deployment choice.
For further details and step-by-step examples, refer to the official Toradex blog:
Accelerating Edge AI with Hailo: Experience on Toradex Verdin iMX8M Plus
If you have any follow-up questions or need more detailed guidance for your Dockerfile or deployment strategy, feel free to ask!
Best regards,
Toradex AI Assistant.