Video processing pipeline for i.MX8

Yes, I tested with v4l2jpegdec.


It seems like your problem is incredibly similar to ours. What did you end up doing or what are you planning to do?

I have an update on this. I decided to move away from MJPEG cameras and focus on H264 as I thought the H264 pipeline would have a better chance of working correctly. And I was right… partially.

The decoding part works as expected, constantly outputting frames at the maximum framerate defined by the camera. Unfortunately, my project involves overlaying text and images on top of the frames, so when I try to create a pipeline that decodes the H264 frames and then re-encodes them, I get a negotiation error.
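For reference, the failing pipeline has roughly this shape (file names are placeholders, and textoverlay stands in for my actual overlay elements):

gst-launch-1.0 filesrc location=input.mp4 ! qtdemux ! h264parse ! \
v4l2h264dec ! textoverlay text="example overlay" ! \
v4l2h264enc ! h264parse ! mp4mux ! filesink location=output.mp4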

@gclaudino.tx and @benjamin.tx , could you please try and get a pipeline working that involves H264 decoding and then re-encoding? It doesn’t need to be a camera source, mine fails even with a file input. I need to know if it’s feasible to keep trying this avenue or if I should consider alternative solutions.

PS: I’m using the v4l2h264enc/dec GStreamer elements. Are these not the right ones? There seems to be another project that provides GStreamer plugins with VPU support. Some parts of the community say that the plugins in that project are better than the ones in NXP’s BSP. I tried to include them in the Yocto image, but unfortunately, as you might expect, ‘apalis-imx8’ is not listed as a compatible machine in the recipe.
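From what I’ve read, the usual workaround for the COMPATIBLE_MACHINE restriction is a bbappend that extends the machine regex, something like the following (the recipe name here is a placeholder, and I haven’t verified this on my setup):

# <recipe-name>_%.bbappend -- recipe name is a placeholder
COMPATIBLE_MACHINE_append = "|apalis-imx8"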

Thank you!

Hi @alexxs, the video stream from an H.264 camera or file is already encoded in H.264 format. What do you mean by re-encoding? Do you want to transcode H.264 into another format? And what is the final format?

The stream is already in H264. I need to decode it, apply a text and image overlay, and then re-encode it in H264.


H.264 video encoding

gst-launch-1.0 v4l2src device=/dev/video0 num-buffers=300 ! \
'video/x-raw,format=(string)NV12,width=1920,height=1080,framerate=(fraction)30/1' \
! queue ! v4l2h264enc ! avimux ! filesink location=test.avi

Thanks for the reply!

I probably didn’t explain it well enough in the message. I need a pipeline that can decode the H264, add a text overlay on it (through a GStreamer element), and then re-encode it in H264 for RTMP. If I break the pipeline into two separate pipelines (with a file in between), it works.

Thanks for the explanation of the whole picture. Could you post the two pipelines with a file in between? Our default BSP doesn’t come with an rtmpsink element, which means you can’t stream the content to an RTMP server by default.

I have added the RTMP plugin to the image and confirmed it’s working (I’ve attached my local.conf file).
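The relevant part boils down to pulling the RTMP plugin from gst-plugins-bad into the image; roughly this (the exact package name may differ between BSP versions):

# local.conf: add the RTMP plugin from gst-plugins-bad
IMAGE_INSTALL_append = " gstreamer1.0-plugins-bad-rtmp"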

Regarding the pipeline: I seem to have remembered wrong. I tried to reproduce the two pipelines today on the Toradex i.MX8, but with no luck. It seems there’s no way to link v4l2h264dec to v4l2h264enc? I’ve tried adding the overlay elements in between. It seems odd because, on paper, it should work: it’s just a case of decoding an H264 stream to raw video and then re-encoding that. Also, I first tested the pipeline on Ubuntu, where it worked flawlessly.

gst-launch-1.0 -v v4l2src device=/dev/video4 ! video/x-h264 ! h264parse ! avdec_h264 ! video/x-raw ! x264enc bitrate=2000 byte-stream=false key-int-max=60 bframes=0 aud=true tune=zerolatency ! flvmux ! rtmpsink location=

PS: I did a bit of digging and it seems that this pipeline at least runs. The problem is that the Toradex build is missing the v4l2convert element:

gst-launch-1.0 -e filesrc location=test.mp4 ! qtdemux ! h264parse ! v4l2h264dec ! imxvideoconvert_g2d ! v4l2convert ! v4l2h264enc ! h264parse ! mp4mux ! filesink location="result.mp4"
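For anyone else checking: gst-inspect-1.0 reports whether an element is present in the build, so a missing v4l2convert shows up immediately:

# prints the element details if present, an error if it is missing
gst-inspect-1.0 v4l2convert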

local.conf (11.7 KB)

@benjamin.tx,

Is there any update on this, please?

Hi @alexxs, how are you?

I finally found some time to test it. I’ve been trying to find a usable pipeline for you using decodebin to check what we could add in between v4l2h264dec and v4l2h264enc, as v4l2convert is indeed missing. I’ve also set the dot-file environment variable (GST_DEBUG_DUMP_DOT_DIR) to monitor what decodebin picked as a possible solution, as described in Basic tutorial 11: Debugging tools.
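For reference, the dot-file setup looks roughly like this (the output directory is arbitrary):

# dump a pipeline graph on every state change
export GST_DEBUG_DUMP_DOT_DIR=/tmp
gst-launch-1.0 filesrc location=testvideo_h264.mp4 ! decodebin ! fakesink
# render the dumped graph on a host PC with graphviz installed
dot -Tpng /tmp/*PLAYING*.dot -o pipeline.png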

I’ve tested the following pipeline on the module:

gst-launch-1.0 -e filesrc location=testvideo_h264.mp4 ! qtdemux !  h264parse ! v4l2h264dec ! imxvideoconvert_g2d ! typefind ! videoconvert  ! v4l2h264enc ! h264parse ! mp4mux ! filesink location="result5.mp4"

I used the template videos that come with the Reference Multimedia Image for this. Does this work for you? I’ve been able to reproduce the original video, but without any audio playing in the background.

Moreover, decodebin once suggested using aiurdemux instead of qtdemux. It seems to work better if you want to keep the audio in the video. Have you tried it?
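That would be the same pipeline with only the demuxer swapped, roughly (I haven’t fully verified the audio path on my side):

gst-launch-1.0 -e filesrc location=testvideo_h264.mp4 ! aiurdemux ! h264parse ! v4l2h264dec ! imxvideoconvert_g2d ! typefind ! videoconvert ! v4l2h264enc ! h264parse ! mp4mux ! filesink location="result5.mp4"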

Best regards,


Thanks for your help. I’m not necessarily interested in the audio at this point.

Did you measure the FPS of the pipeline you suggested? It’s incredibly slow on my system. I’ve tried streaming the output from it and I’m getting <1 FPS. To clarify, here are the two pipelines I’m comparing:

<1 FPS:

gst-launch-1.0 -v v4l2src device=/dev/video4 io-mode=4 ! video/x-h264 ! h264parse ! v4l2h264dec ! imxvideoconvert_g2d ! typefind ! videoconvert  ! v4l2h264enc ! h264parse ! queue ! rtph264pay ! udpsink port=5000 host=192.168.1.39

32 FPS (camera max):

gst-launch-1.0 -v v4l2src device=/dev/video4 ! video/x-h264 ! h264parse ! v4l2h264dec ! video/x-raw ! fpsdisplaysink text-overlay=0 sync=false

I’ve tested streaming separately and it works absolutely fine with the original H264 frames.

gst-launch-1.0 -v v4l2src device=/dev/video4 io-mode=4 ! video/x-h264 ! h264parse ! queue ! rtph264pay ! udpsink port=5000 host=192.168.1.39

I’ve also run the pipeline on a file and it’s definitely not real-time: 2 min 35 s for the file you used as well.

gst-launch-1.0 -ve filesrc location=video/testvideo_h264.mp4 num-buffers=300 ! qtdemux !  h264parse ! v4l2h264dec ! imxvideoconvert_g2d ! typefind ! videoconvert  ! v4l2h264enc ! h264parse ! mp4mux ! filesink location="result5.mp4"
Got EOS from element "pipeline0".
Execution ended after 0:02:35.055801710

Hi @alexxs, sorry for the late reply.
The m2m driver needs to be enabled in the device tree, e.g. in imx8-apalis-ixora-v1.2.dtsi.
imx8qm-apalis-v1.1-ixora-v1.2.dtb (166.0 KB)
is the binary I built for a quick test. The fdt_board variable in U-Boot needs to be set to ixora-v1.2.
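For reference, setting it at the U-Boot prompt looks like this:

setenv fdt_board ixora-v1.2
saveenv
reset

And this is the snippet to enable the m2m device in the device tree: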

&isi_0 {
    status = "okay";

    cap_device {
        status = "okay";
    };

    m2m_device {
        status = "okay";
    };
};

With that, the v4l2convert plugin will be enabled:

# gst-inspect-1.0 |grep v4l2convert 
video4linux2:  v4l2convert: V4L2 Video Converter

This is the pipeline to decode an H.264 file, and then encode it again.

gst-launch-1.0 filesrc location=testvideo_h264.mp4 ! qtdemux ! queue ! h264parse ! \
v4l2h264dec ! queue ! imxvideoconvert_g2d ! queue ! v4l2convert ! queue ! \
v4l2h264enc ! h264parse ! matroskamux ! filesink location=testvideo_h264.mkv

Thank you! I’m not super familiar with U-Boot and have only just started to properly understand Yocto. Could you please offer a few more details on how to enable the m2m driver?

Can you tell me if you were able to get real-time performance with the above pipeline?

Alex

Hey @alexxs,

While these threads are not directly relevant to your original question, one shows you how to change variables in U-Boot and the other how to change the overlays:
Controlling the peripheral in IMX8 with M4 core - #21 by hfranco.tx → uboot
10.1 inch capacitive touch screen not functional → overlays

I hope this is useful until official support replies.

Best,
AM


This is the guide on building and customizing the device tree. To enable the m2m driver, append the snippet above to a device tree, e.g. imx8-apalis-ixora-v1.2.dtsi.
From what I tested, the transcoding performance was not real-time: the video file testvideo_h264.mp4 is 10 seconds long and the pipeline took around 20 s. The performance issue was also discussed on the NXP community. Replacing libgstvideo4linux2.so didn’t improve it on our BSP 5.7.0.


Thank you both! I will try this and let you know. I’m hoping for real-time, as I’m not planning to save to a file at this point, and from what I understood in the NXP thread, the I/O is the biggest time sink for them.

I’m going down the path of building a device tree overlay to test the solution, as that seemed the simplest way. However, I’m running into issues building the standard device tree overlays from the Git branch. I’ve created a separate thread so as not to mix up the two problems.
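For reference, the overlay I’m putting together is essentially Benjamin’s snippet wrapped in overlay syntax; a sketch only (the file name is my own, the node path is taken from the snippet above):

/* apalis-imx8_isi-m2m_overlay.dts -- hypothetical file name */
/dts-v1/;
/plugin/;

&isi_0 {
    status = "okay";

    cap_device {
        status = "okay";
    };

    m2m_device {
        status = "okay";
    };
};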

So I managed to compile the overlay and now I can see the v4l2convert plugin. I was also able to reproduce your pipeline and get the exact same result. And, even more good news!

By removing the disk I/O operations from the pipeline and streaming the frames instead, I was able to get realtime operation. That’s an amazing result, I’d say, and one that gives me hope we can actually use the i.MX8 further.
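Concretely, the realtime variant was along these lines: your file pipeline with the muxer and file sink swapped for RTP streaming (host and port are placeholders):

gst-launch-1.0 -e filesrc location=testvideo_h264.mp4 ! qtdemux ! queue ! h264parse ! \
v4l2h264dec ! queue ! imxvideoconvert_g2d ! queue ! v4l2convert ! queue ! \
v4l2h264enc ! h264parse ! rtph264pay ! udpsink port=5000 host=192.168.1.39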

HOWEVER:

The pipeline from the H264 camera still doesn’t work and throws an error:

gst-launch-1.0 -v v4l2src device=/dev/video4 ! video/x-h264 ! h264parse ! v4l2h264dec ! queue ! imxvideoconvert_g2d ! queue ! v4l2convert ! queue  ! v4l2h264enc ! h264parse ! queue ! rtph264pay ! udpsink port=5000 host=192.168.1.39

Later edit: I’ve managed to get the camera working. The problem was a rather silly mistake on my part. When the M2M and CAP devices got enabled, two extra /dev/videoX devices appeared, which meant that the command above was referring to the MJPEG camera stream rather than the H264 one (how I told the nodes apart is sketched after the list below). Anyway, with that sorted, the stream is working. But there are still a couple of problems:

  1. Full HD with the pipeline below yields 15 FPS max.
gst-launch-1.0 -v v4l2src device=/dev/video6 io-mode=4 ! video/x-h264 ! h264parse ! queue ! v4l2h264dec output-io-mode=2 ! queue ! imxvideoconvert_g2d ! queue ! v4l2convert ! queue ! v4l2h264enc capture-io-mode=2 ! rtph264pay ! udpsink port=5000 host=192.168.1.39
  2. Reducing the width of the frame to 1280 pixels makes it realtime (32 FPS), but I’m only getting that with 110% CPU use, which suggests that some parts of the pipeline are using the CPU rather than the VPU.
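In case anyone else hits the /dev/videoX mix-up from the later edit above, this is roughly how I told the nodes apart (v4l2-ctl comes with v4l-utils):

# list every video device together with its driver/card name
v4l2-ctl --list-devices
# show which pixel formats a given node offers (e.g. H264 vs MJPG)
v4l2-ctl -d /dev/video6 --list-formats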

Nevertheless, this is amazing progress from where I was just a few days ago, so thank you all for the help. It would be great if, with your help, we could tick off these last two small issues.

Thank you!

Sorry, I don’t have much experience with performance tuning. But for comparison, I tested a MIPI-CSI2 camera with raw NV12 output, so there are no conversion elements in the pipeline.

gst-launch-1.0 -v v4l2src device=/dev/video1 ! \
'video/x-raw,format=(string)NV12,width=1920,height=1080,framerate=(fraction)30/1' ! \
v4l2h264enc ! rtph264pay ! udpsink host=10.20.1.123 port=5000

Receiving the stream on an x86 PC, I could get around 25 fps:

gst-launch-1.0 -v udpsrc port=5000 ! \
"application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! \
rtph264depay ! h264parse ! decodebin ! videoconvert ! \
fpsdisplaysink text-overlay=false sync=false

/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 16, dropped: 0, current: 30.45, average: 30.45
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 29, dropped: 0, current: 24.81, average: 27.64
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 41, dropped: 0, current: 23.86, average: 26.41
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 53, dropped: 0, current: 23.88, average: 25.79
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 65, dropped: 0, current: 23.57, average: 25.35
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 78, dropped: 0, current: 24.45, average: 25.20
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 91, dropped: 0, current: 25.31, average: 25.21
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 103, dropped: 0, current: 23.45, average: 24.99

Besides some elements (maybe a data copy operation is involved), the camera interface could also affect the performance. You may connect your camera to a USB 3.0 port of the Apalis iMX8 and give it a try.

I was able to confirm the camera interface is not influencing the results by streaming the H264 frames from the camera directly and getting the maximum FPS. It’s only when the decoding and encoding are added that the FPS drops.