Manifest for weston-vivante no matching platform "linux/arm/v7"

Hello,

We are using an image based on weston-vivante:3 (I also tried the :stable tag). The relevant part of our docker-compose.yml looks like this:

services:
  our-service:
    build:
      context: .
      dockerfile: Dockerfile
    image: <OUR docker image>
    volumes:
      - /tmp:/tmp
      - /dev:/dev
      - /var/run/dbus:/var/run/dbus
      - /var/run/docker.sock:/var/run/docker.sock
      - application:/application  # for run_on_host
    device_cgroup_rules:
      # ... for tty0
      - "c 4:0 rmw"
      # ... for tty7
      - "c 4:7 rmw"
      # ... for /dev/input devices
      - "c 13:* rmw"
      - "c 199:* rmw"
      # ... for /dev/dri devices
      - "c 226:* rmw"
      # ... for our service
      - "c 509:0 rmw"
      - "c 510:0 rmw"
      - "c 511:0 rmw"
    depends_on: [
      weston
    ]
    privileged: true  # required to access the /proc/device-tree
    restart: always
    user: torizon

  weston:
    image: torizon/weston-vivante:stable
    environment:
      - ACCEPT_FSL_EULA=1
    # Required to get udev events from host udevd via netlink
    network_mode: host
    volumes:
      - type: bind
        source: /tmp
        target: /tmp
      - type: bind
        source: /dev
        target: /dev
      - type: bind
        source: /run/udev
        target: /run/udev
    cap_add:
      - CAP_SYS_TTY_CONFIG
    # Add device access rights through cgroup...
    device_cgroup_rules:
      # ... for tty0
      - "c 4:0 rmw"
      # ... for tty1
      - "c 4:1 rmw"
      # ... for tty7
      - "c 4:7 rmw"
      # ... for /dev/input devices
      - "c 13:* rmw"
      - "c 199:* rmw"
      # ... for /dev/dri devices
      - "c 226:* rmw"
    restart: always

volumes:
    application:  # for run_on_host

The lockbox is successfully created, uploaded to Torizon cloud and is visible via app.torizon.io site.
When trying to build an offline update using TCB 3.11, we get this error at the point where it tries to fetch the weston-vivante manifests.

Fetching docker-compose target 'spirometry-docker-apps-release-1.0.13a1'
Fetching target 'spirometry-docker-apps-release-1.0.13a1' from 'https://api.torizon.io/repo/api/v1/user_repo/targets/spirometry-docker-apps-release-1.0.13a1'...
Uptane info: target 'spirometry-docker-apps', version: 'release-1.0.13a1'
Fetching manifests for registry.gitlab.com/aktina/abc4/abc4-spirometry-module-software@sha256:e0dddf7758e63b3e07864a8f5b5975a949e3bd397411bed6e08eec8977379fc6...
Saving manifest of aktina/abc4/abc4-spirometry-module-software
Fetching manifests for torizon/weston-vivante@sha256:be8e1609c8e476e13b11890274454cf6c270b46500260ab0725091594a541cc0...
Saving manifest-list of torizon/weston-vivante
Saving manifest of torizon/weston-vivante [linux/arm64]
Saving manifest of torizon/weston-vivante [unknown/unknown]
An unexpected Exception occurred. Please provide the following stack trace to
the Toradex TorizonCore support team:
Traceback (most recent call last):
  File "/builder/torizoncore-builder", line 222, in <module>
    mainargs.func(mainargs)
  File "/builder/tcbuilder/cli/platform.py", line 320, in do_platform_lockbox
    platform_lockbox(
  File "/builder/tcbuilder/cli/platform.py", line 302, in platform_lockbox
    raise exc
  File "/builder/tcbuilder/cli/platform.py", line 281, in platform_lockbox
    fetch_offupdt_targets(
  File "/builder/tcbuilder/cli/platform.py", line 198, in fetch_offupdt_targets
    platform.fetch_compose_target(**params)
  File "/builder/tcbuilder/backend/platform.py", line 713, in fetch_compose_target
    images_selection = select_unique_images(
  File "/builder/tcbuilder/backend/platform.py", line 558, in select_unique_images
    images_selection = select_images(
  File "/builder/tcbuilder/backend/platform.py", line 500, in select_images
    assert _avail, \
AssertionError: There are no images matching platform 'linux/arm/v7' for 'torizon/weston-vivante@sha256:be8e1609c8e476e13b11890274454cf6c270b46500260ab0725091594a541cc0'
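For what it's worth, the assertion fires simply because no entry in the fetched manifest list declares the default platform linux/arm/v7. A minimal sketch of that selection logic (hypothetical illustration, not the actual TCB code; digests are made up):

```python
def select_image(manifest_list, platform):
    """Pick the digest of the manifest-list entry matching a platform string
    like 'linux/arm64' or 'linux/arm/v7'. Hypothetical sketch, not TCB code."""
    parts = platform.split("/")
    os_name, arch = parts[0], parts[1]
    variant = parts[2] if len(parts) > 2 else None
    for entry in manifest_list["manifests"]:
        p = entry.get("platform", {})
        if p.get("os") != os_name or p.get("architecture") != arch:
            continue
        if variant is not None and p.get("variant") != variant:
            continue
        return entry["digest"]
    raise AssertionError(f"There are no images matching platform '{platform}'")

# Trimmed-down manifest list as reported above for torizon/weston-vivante:
# only arm64 (plus an attestation entry), no arm/v7.
MANIFEST_LIST = {
    "manifests": [
        {"digest": "sha256:aaaa",
         "platform": {"os": "linux", "architecture": "arm64"}},
        {"digest": "sha256:bbbb",
         "platform": {"os": "unknown", "architecture": "unknown"}},
    ]
}

print(select_image(MANIFEST_LIST, "linux/arm64"))   # matches the arm64 entry
```

Asking for "linux/arm/v7" against this list raises the same AssertionError seen in the traceback.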

We don’t explicitly set linux/arm/v7 anywhere in our configuration, Dockerfiles, etc.
Thanks!

Hi, I’m the Toradex AI Assistant.

I might make mistakes. Our support team will step in if needed. If I answered your question, please tick the “Solution” checkbox. For follow-up questions to me, please use @ToradexAI in your post.



Hello,

Thank you for your detailed description and configuration. The error you’re experiencing is a known issue when working with multi-platform container images (such as weston-vivante) on i.MX8-based devices using TorizonCore Builder. By default, TorizonCore Builder expects images for the linux/arm/v7 (32-bit ARM) architecture, but the weston-vivante image is only available for linux/arm64 (64-bit ARM).

To resolve this, you need to explicitly specify the correct architecture for your device when running TorizonCore Builder commands. For i.MX8-based modules, you should always provide the --platform linux/arm64 option:

For example, when creating your offline update bundle, use:

torizoncore-builder bundle docker-compose.yml --platform linux/arm64

This tells TorizonCore Builder to fetch and process images matching the 64-bit ARM architecture, which matches what is available for the torizon/weston-vivante image.

Summary of steps:

  1. When running TorizonCore Builder for bundling or other operations, always specify the platform:
    torizoncore-builder bundle docker-compose.yml --platform linux/arm64
    
  2. Continue with your regular update and deployment workflows.

You can find more information about this in the official documentation.

Let me know if you need further assistance or if you would like more details about working with containers on Torizon OS.

Best regards,
Toradex AI Assistant.

Nice point, I wasn’t aware that the

torizoncore-builder platform lockbox 

also has a --platform parameter. Checking that solution…
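For other readers: assuming the --platform option works the same way for the lockbox subcommand (the lockbox name below is a placeholder), the invocation would look roughly like:

```shell
torizoncore-builder platform lockbox --platform linux/arm64 <lockbox-name>
```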

That worked!



Lovely, thanks for reporting. Some time ago we made the weston-vivante image arm64-only, precisely to prevent this issue: the TCB build here would otherwise succeed, but the software itself wouldn’t run. Nice to see that it’s working as intended.

I’ll let the team know about this ‘issue’ you ran into anyway, because maybe we can make the experience better (perhaps using heuristics or some other mechanism to make sure we’re getting the correct architecture, or just reviewing the documentation).

Also, please do use the :3 tag if you’re running Torizon OS 6, or the :4 tag if you’re running Torizon OS 7.

Cheers,