Hi, I’m the Toradex AI Assistant. I’m in early beta, and I would appreciate your feedback.
The Toradex Support team will follow up with you shortly!
Hello,
To provide both HD and SD feeds from a single camera sensor on a Toradex module, you have a few options to consider. Each approach has its trade-offs in terms of complexity, performance, and resource utilization. Here are the methods aligned with your suggestions and an additional one:
Use Separate Pipelines for HD and SD Streams: This approach duplicates the camera feed and processes it through two different GStreamer pipelines, each configured for the desired resolution. You would use imxg2dvideotransform or a similar element to scale the video to HD and SD resolutions accordingly. This method is conceptually straightforward, but it roughly doubles the CPU/GPU load since you are capturing, scaling, and encoding the video twice, and many V4L2 capture devices only allow a single reader at a time, so the feed duplication itself still has to be solved.
Utilize v4l2loopback for Duplicating the Video Feed: With v4l2loopback, you can create a virtual video device that duplicates the feed from your camera. You can then capture from this virtual device in two separate GStreamer pipelines, similar to the first option, but with the feed duplication handled at a lower level. This might offer a slight performance benefit over processing the duplication in GStreamer, but it introduces additional complexity in managing the loopback device.
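A minimal sketch of the v4l2loopback approach (module parameters, device nodes, resolutions, and the bracketed encoding stages are assumptions to adapt to your setup; v4l2loopback allows multiple readers on one virtual device):

```
# Create a virtual video device at /dev/video10 (node number is an assumption)
sudo modprobe v4l2loopback video_nr=10 card_label="duplicated-cam"

# Feed the physical camera into the virtual device once
gst-launch-1.0 v4l2src device=/dev/video0 ! v4l2sink device=/dev/video10 &

# Two independent consumers capture from the same virtual device
gst-launch-1.0 v4l2src device=/dev/video10 ! videoscale \
  ! video/x-raw,width=1280,height=720 ! [HD encoding and streaming elements] &
gst-launch-1.0 v4l2src device=/dev/video10 ! videoscale \
  ! video/x-raw,width=640,height=480 ! [SD encoding and streaming elements]
```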
Single Pipeline with tee Element: A more efficient way might be to use a single GStreamer pipeline that captures the feed from the camera and then uses a tee element to branch the pipeline into two paths: one for HD and another for SD processing. After the tee, you can scale, encode, and package each stream appropriately. This method is more resource-efficient since the capture and initial processing stages are shared. However, it requires careful pipeline construction to ensure that performance bottlenecks do not occur, especially in the scaling and encoding stages.
Here’s a simplified example of how the pipeline with a tee might look:
v4l2src device=/dev/{device} ! video/x-raw,width={max_width},height={max_height} ! tee name=t
t. ! queue ! imxg2dvideotransform ! video/x-raw,width={hd_width},height={hd_height} ! [HD encoding and streaming elements]
t. ! queue ! imxg2dvideotransform ! video/x-raw,width={sd_width},height={sd_height} ! [SD encoding and streaming elements]
This example assumes that {max_width} and {max_height} are the maximum resolution supported by your camera sensor.
The recommended way to achieve this would be to duplicate the frames using tee, similar to the suggestion from @ToradexAI.
Encoding the video twice is inevitable here, as you have two different resolutions.
If the camera's native resolution is higher than HD, a possible optimization is to avoid scaling from the native resolution twice: scale once to HD, then scale the HD output down again to SD:
Capture Pipeline -> Conversion to HD -> tee
                                         |-> HD Streaming Pipeline
                                         '-> Conversion to SD -> SD Streaming Pipeline
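A sketch of that cascaded layout as a single gst-launch command (element names and resolutions are illustrative; on the i.MX8 you would likely use imxg2dvideotransform in place of videoscale, and the bracketed stages are placeholders):

```
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! videoscale ! video/x-raw,width=1280,height=720 ! tee name=t \
  t. ! queue ! [HD encoding and streaming elements] \
  t. ! queue ! videoscale ! video/x-raw,width=640,height=480 \
     ! [SD encoding and streaming elements]
```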
Also, please note that GStreamer RTSP media factories can have issues with multiple clients, so it may be a good idea to set them as shared.
It may still be possible to do this with tee, but you will need to set up the origin pipeline and two media factories separately, assigning each factory to a different mount point in the RTSP server.
Maybe some delayed linking of the pads would be needed.
I don’t think this implementation would be that simple, but would be a path forward.
I then use shmsrc on the rtsp server to generate the two feeds reading from the shared memory.
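For reference, a minimal sketch of that shared-memory handoff (socket path, caps, and resolutions are assumptions; note that shmsrc must restate the caps, since only raw buffers cross the socket):

```
# Producer: capture once, publish raw frames to shared memory
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! video/x-raw,width=1280,height=720,format=YUY2,framerate=30/1 \
  ! shmsink socket-path=/tmp/cam.sock wait-for-connection=false sync=true

# Consumers (e.g. in each RTSP media factory), one per output resolution
gst-launch-1.0 shmsrc socket-path=/tmp/cam.sock is-live=true \
  ! video/x-raw,width=1280,height=720,format=YUY2,framerate=30/1 \
  ! queue ! [HD encoding and streaming elements]
```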
So, a few questions:
Is this madness?
We're using the ar0521 for these experiments. Every so often the sensor just seems to, for want of a better word, "die", and nothing but a power reboot will fix it (i.e. reloading the kernel modules does nothing). I've tested two different sensors and both have ended up locked this way. Have you seen this before?
Because of the above, I am wondering if there is another way of rebooting these cameras (I'm using the Toradex dev board, if that helps).
I'm looking at this plugin and am wondering if you think it could be used in a similar way to the shared-memory idea, but might be even more performant: the new unixfd plugin in GStreamer 1.24 (we would need to backport it somehow…).
It is great to know you were able to find a solution to the problem.
Well, I was not familiar with the shmsink and shmsrc pipeline elements, but they seem to use a socket to control the communication and deal with common producer/consumer issues.
So this seems to be a good solution without going too deep into GStreamer and manual pad linking.
The unixfd plugin appears to be a viable alternative, but as you said would require backporting to the version of GStreamer which can support the VPU on the Verdin iMX8MP.
We have seen similar issues if the camera is not properly connected or when using the older driver available on BSP5.
From our tests, the driver available on the meta-toradex-econ layer for BSP6 is stable.
To confirm, are you using this driver?
Can you also check the camera connection?
There is a reset pin going to the camera, but I need to check if and how this is exposed on the software side.
I will get back to you with this information.
Unfortunately the reset signal is not exposed in any way by the driver.
If you continue to see issues even when using the new pipeline, please let us know so we can try to reproduce the problem and try to fix it.
We seem to still be having stability issues with this camera, although it is working better…
I managed to backport the unixfd gstreamer plugins and it’s working great and vastly simplifies the pipeline.
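For anyone following along, a rough sketch of the unixfd handoff (element and property names are as I understand the GStreamer 1.24 unixfd plugin; verify with gst-inspect-1.0 on the backported build, and the bracketed stages are placeholders):

```
# Producer: buffers are passed as file descriptors, so frames are not copied
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! video/x-raw,width=1280,height=720 \
  ! unixfdsink socket-path=/tmp/cam.sock

# One consumer per output resolution
gst-launch-1.0 unixfdsrc socket-path=/tmp/cam.sock \
  ! queue ! [encoding and streaming elements]
```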
However, what is now more obvious in the logs is that we occasionally get this error:
v4l2src gstv4l2src.c:1264:gst_v4l2src_create:<v4l2src0> Timestamp does not correlate with any clock, ignoring driver timestamps
What is interesting is that this log line is thrown underneath this comment:
which at least strongly implies an issue with the driver?
Yes, this can indicate a bug in the driver, specifically where time-stamping is involved. Such a problem may or may not be related to the freezing issues you are seeing.
This should not be happening.
I will prepare a setup to try to reproduce this issue and look for a possible workaround.
The workaround may involve removing the reset/powerdown pins from the camera’s node on the device tree overlays so it can be reset outside of the driver, but this may interfere with the driver initialization.
When we have further updates, we will send them here.
Out of curiosity @bruno.tx, there are a few interesting commits in the upstream video4linux driver.
I am suspicious this is related, as I wouldn't have thought any bug in GStreamer itself would warrant a hard reboot to "fix" it; a restart of the pipeline should suffice. Would you agree with that hypothesis?
Another couple of interesting observations: the pipeline crashed again this morning and eventually required a hard reboot. However, even after the reboot the camera wouldn't stream using the exact same command as above, yet after another reboot it was "fixed" again.
I'll keep trying to find a way to reliably reproduce this issue, but it seems very tricky indeed.
The crashing of the camera could also be due to overheating. Could you do another test after adding some sort of ventilation?
If it still doesn’t solve the issue, another idea would be to power down the camera by pulling down the GPIO pin (that would be gpio1 8, this is pwn-gpios in the overlay file). So when the camera dies, you can try rmmod ar0521 && gpioset gpio1 8=0. Please let me know how that goes!
That module can’t be removed because it’s built in, unless I am doing something stupid?
root@:~# rmmod ar0521
rmmod: ERROR: Module ar0521 is builtin.
I've just updated my custom image to include the gpioset tooling, so I will give that a try next time it hangs (I am still experiencing the issue).
In terms of ventilation, the sensor is currently propped up on a shelf in my office, so there is plenty of airflow, but no fan/heatsink etc. Are these required for these sensors, in your experience?
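Since rmmod fails for a built-in driver, a possible (untested) alternative next time the sensor locks up is to detach the driver through sysfs before toggling the power-down GPIO; the I2C address below (1-0042) is a placeholder to look up on your board, and whether the driver re-initializes cleanly after a rebind is an open question:

```
# List devices bound to the built-in ar0521 driver (address is board-specific)
ls /sys/bus/i2c/drivers/ar0521/

# Detach the driver, power-cycle the sensor via the GPIO, then re-attach
echo 1-0042 > /sys/bus/i2c/drivers/ar0521/unbind
gpioset gpio1 8=0
sleep 1
gpioset gpio1 8=1
echo 1-0042 > /sys/bus/i2c/drivers/ar0521/bind
```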