I would like to send a webcam stream over the network. I added a configuration in Yocto similar to:
PREFERRED_VERSION_gstreamer1.0 = "1.16.imx"
PREFERRED_VERSION_gstreamer1.0-plugins-base = "1.16.imx"
PREFERRED_VERSION_gstreamer1.0-plugins-good = "1.16.imx"
PREFERRED_VERSION_gstreamer1.0-plugins-bad = "1.16.imx"
LICENSE_FLAGS_WHITELIST = "commercial"
PACKAGECONFIG_append_pn-gstreamer1.0-plugins-ugly = " orc a52dec mpeg2dec x264"
and installed all the gstreamer plugins (base, good, bad and imx-gst1.0-plugin). I tried with the following pipeline:
gst-launch-1.0 v4l2src device=/dev/video2 ! 'video/x-raw,width=640,height=480' ! decodebin ! videoconvert ! x264enc ! rtph264pay ! udpsink host=224.1.1.1 auto-multicast=true port=1234 multicast-iface=eth0
However, the CPU usage rises quite a lot, so I guess HW acceleration is not being used. Moreover, I don’t get any video on the client side with:
gst-launch-1.0 udpsrc multicast-group=224.1.1.1 auto-multicast=true port=1234 multicast-iface=enp7s0 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96 ! rtph264depay ! avdec_h264 ! autovideosink
- Is it supported on apalis i.mx8?
- What would be the correct pipeline?
- Are there gstreamer plugins or pkg-config flags that I am missing?
Dear @morandg,
Thank you for writing to Toradex Community!
- Is it supported on apalis i.mx8?

If you mean the VPU, yes, it is enabled. You have not mentioned which BSP you are using, but nevertheless this is not a problem. We recommend using the latest BSP.
- What would be the correct pipeline?

What solved a similar issue for me in the past is to use “sync=false”. This ignores the timestamps and displays the incoming frames as they arrive. If this doesn’t solve your issue and the video output is still slow/lagging, there should not be any frame loss, so observing this pipeline allows us to better understand the issue.
Another thing to try is rtpbin, as that bin usually does a lot of smart buffering. You could also try increasing the buffer latency by setting latency=1000, for example. Another useful element I love to use is autovideosink, which automatically decides which video sink to use.
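Put together, the suggestions above could look like this on the receiving side. This is only an untested sketch based on your original client command; rtpjitterbuffer is the element that does the buffering inside rtpbin, shown standalone here for clarity:

```shell
# Untested sketch: client pipeline with an increased jitter-buffer
# latency and sync=false on the sink, so late frames are shown as-is.
gst-launch-1.0 \
  udpsrc multicast-group=224.1.1.1 auto-multicast=true port=1234 multicast-iface=enp7s0 ! \
  'application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96' ! \
  rtpjitterbuffer latency=1000 ! \
  rtph264depay ! \
  avdec_h264 ! \
  autovideosink sync=false
```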
- Are there gstreamer plugins or pkg-config flags that I am missing?

Have you tried using the v4l2* plugins? The best place to check this would be the User Guide from NXP here: https://www.nxp.com/docs/en/user-guide/IMX_LINUX_USERS_GUIDE.pdf
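To see which v4l2 elements are actually available on the module, you can query GStreamer directly, for example:

```shell
# List all elements provided by the video4linux2 plugin; HW-accelerated
# codecs show up here as v4l2h264enc, v4l2jpegenc, etc.
gst-inspect-1.0 video4linux2

# Or grep the full element list for v4l2-based elements:
gst-inspect-1.0 | grep v4l2
```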
I hope this helps!
Best Regards,
Janani
Hi @saijanani.tx !
Thanks for the pointers, that is very helpful!
We are using the latest stable Yocto version, which is Gatesgarth as of today.
Following the guide from NXP, the vpuenc_xxx gstreamer plugins seem to be what I’m looking for! Unfortunately, they are not compatible with i.MX 8 and cannot be compiled into our image:
http://git.yoctoproject.org/cgit/cgit.cgi/meta-freescale/tree/recipes-multimedia/gstreamer/gstreamer1.0-plugins-imx_0.13.1.bb?h=gatesgarth#n71
Moreover, it is not listed by gst-inspect-1.0 on your multimedia image. That is why I was wondering whether this is supported on i.MX 8 platforms or whether I am missing something?!
Sorry I missed this line from NXP document:
“VPU encoder is v4l2h264enc on i.MX 8QuadMax and 8QuadXPlus.”
I will look into it more carefully and let you know.
Sorry @saijanani.tx, I have been doing other stuff, and since I’m a GStreamer newbie, it took me a lot of time to figure many things out. Anyway, here are my findings.
First, I could send a JPEG-encoded stream with v4l2jpegenc:
# Server pipeline
gst-launch-1.0 v4l2src device=/dev/video2 ! \
video/x-raw,width=640,height=480 ! \
v4l2jpegenc ! \
rtpjpegpay ! \
udpsink host=224.1.1.1 auto-multicast=true port=1234 multicast-iface=$IFACE
# Client pipeline
gst-launch-1.0 \
udpsrc multicast-group=224.1.1.1 auto-multicast=true port=1234 multicast-iface=$IFACE ! \
application/x-rtp,encoding-name=JPEG,payload=26 ! \
rtpjpegdepay ! \
jpegdec ! \
autovideosink
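Since the camera can also deliver MJPG natively, the raw capture plus v4l2jpegenc step can in principle be skipped by requesting image/jpeg caps directly from v4l2src. An untested variant of the server pipeline above:

```shell
# Untested sketch: request JPEG frames straight from the camera instead
# of capturing raw video and re-encoding it with v4l2jpegenc.
gst-launch-1.0 v4l2src device=/dev/video2 ! \
  'image/jpeg,width=640,height=480' ! \
  rtpjpegpay ! \
  udpsink host=224.1.1.1 auto-multicast=true port=1234 multicast-iface=$IFACE
```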
Where I struggled a bit more was encoding the stream with H.264. All the USB cameras I tested only support the MJPG and YUYV formats.
apalis-imx8 ~ # v4l2-ctl --list-formats-ext --device=/dev/video2
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
[0]: 'MJPG' (Motion-JPEG, compressed)
# ... Snipped out ...
Size: Discrete 960x720
Interval: Discrete 0.033s (30.000 fps)
[1]: 'YUYV' (YUYV 4:2:2)
# ... Snipped out ...
Size: Discrete 960x720
Interval: Discrete 0.067s (15.000 fps)
This is why I needed to add the videoconvert gst element:
# Server pipeline
gst-launch-1.0 v4l2src device=/dev/video2 ! \
video/x-raw,width=640,height=480 ! \
videoconvert ! \
v4l2h264enc ! \
h264parse ! \
rtph264pay ! \
udpsink sync=false host=224.1.1.1 auto-multicast=true port=1234 multicast-iface=$IFACE
# Client pipeline
gst-launch-1.0 \
udpsrc multicast-group=224.1.1.1 auto-multicast=true port=1234 multicast-iface=$IFACE ! \
application/x-rtp ! \
rtph264depay ! \
avdec_h264 ! \
autovideosink
Now I’m wondering what the benefit is, since a lot of CPU is wasted doing the conversion.
- Are there cameras supporting NV12 natively?
- Would there be a better solution?
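One avenue worth checking: on i.MX 8 the imx-gst1.0-plugin package typically provides HW-accelerated colorspace converters such as imxvideoconvert_g2d (running on the G2D 2D engine). The element name and its supported formats vary per BSP, so verify with gst-inspect-1.0 first; this is only an untested sketch replacing the software videoconvert:

```shell
# Untested sketch: offload the colorspace conversion to the G2D engine
# instead of the CPU-bound videoconvert element.
# Check first that the element exists: gst-inspect-1.0 imxvideoconvert_g2d
gst-launch-1.0 v4l2src device=/dev/video2 ! \
  'video/x-raw,width=640,height=480' ! \
  imxvideoconvert_g2d ! \
  'video/x-raw,format=NV12' ! \
  v4l2h264enc ! \
  h264parse ! \
  rtph264pay ! \
  udpsink sync=false host=224.1.1.1 auto-multicast=true port=1234 multicast-iface=$IFACE
```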
Thanks for your help anyway, and I hope my pointers will be useful for others.
Hello @morandg ,
Unfortunately, I don’t know of a camera that supports NV12 out of the box.
Yes, videoconvert does not use HW acceleration, hence the issue you are seeing. To improve the performance of the pipeline, here is a thread I found that could help:
http://gstreamer-devel.966125.n4.nabble.com/Videoconvert-needs-to-be-optimized-td4667003.html
It looks like adding a queue in front of videoconvert improves the pipeline, but I haven’t tested that myself.
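Applied to the server pipeline from earlier in this thread, the suggestion would look roughly like this (untested; the n-threads property of videoconvert can additionally spread the conversion over multiple cores, where 0 means auto-detect):

```shell
# Untested sketch: decouple capture from conversion with a queue and
# let videoconvert use multiple threads (n-threads=0 selects automatically).
gst-launch-1.0 v4l2src device=/dev/video2 ! \
  'video/x-raw,width=640,height=480' ! \
  queue ! \
  videoconvert n-threads=0 ! \
  v4l2h264enc ! \
  h264parse ! \
  rtph264pay ! \
  udpsink sync=false host=224.1.1.1 auto-multicast=true port=1234 multicast-iface=$IFACE
```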
I hope this helps anyways.
BR, Janani
Alright, thanks for the pointers!
Sorry again @saijanani.tx, but now I’m evaluating a Verdin i.MX 8M Plus SoC and don’t see any VPU video devices (/dev/video{11,12}). Is this already supported on that platform, or am I missing something?
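In case it helps with debugging: the i.MX 8M Plus VPU is a different IP block than on the 8QuadMax, so the device nodes and GStreamer element names can differ. These generic checks show what the kernel and GStreamer actually expose on the module:

```shell
# List all V4L2 devices the kernel has registered (names and node
# numbers are platform-specific):
v4l2-ctl --list-devices

# Check which encoder/decoder elements GStreamer sees on the image:
gst-inspect-1.0 | grep -iE 'v4l2|vpu|264'
```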