Colibri iMX6, gstreamer segfaults when streaming video from USB camera (MJPEG encoded)

Hi,
My configuration is as follows:
Colibri IMX6 DL 512MB V1.1A.
Carrier board: Iris Rev. 2.0
BSP 5, Reference Multimedia Image

I am trying to stream video data from a USB-connected camera to an LCD (resolution 800 x 480) using GStreamer.
The camera supports up to 1280x800 pixels at 60 fps, MJPEG-encoded, over a USB 2.0 cable.
The following GStreamer pipeline crashes with a segfault:
/usr/bin/gst-launch-1.0 v4l2src -v device=/dev/video0 ! 'image/jpeg, width=1280, height=720' ! vpudec output-format=4 ! imxv4l2sink overlay-width=800 overlay-height=480 device=/dev/video16

The same pipeline works if I add the frame-plus=0 option to vpudec:
/usr/bin/gst-launch-1.0 v4l2src -v device=/dev/video0 ! 'image/jpeg, width=1280, height=720' ! vpudec frame-plus=0 output-format=4 ! imxv4l2sink overlay-width=800 overlay-height=480 device=/dev/video16

So I implemented this pipeline in a GStreamer C program. But what I found is that I can set the pipeline to PAUSED and back to PLAYING only two times. Then it crashes again with a segfault, somewhere in imxv4l2sink (I guess).

Any ideas?

Regards,
Siegfried

Hi @s.steiger ,

Welcome and thank you for using the Toradex Community.

So if I get this right, the pipeline that works fails after two restarts with the same segmentation fault?

Do the two segmentation faults look the same? Could you attach the logs, which show the respective segmentation faults?

Best Regards
Kevin

Hi Kevin,
I am not sure if this is what you are asking for. If you are looking for more detail, I need some guidance on how to provide it.
Thanks,
Siegfried

root@colibri-imx6-10801658:~# /usr/bin/gst-launch-1.0 v4l2src -v device=/dev/video0 ! 'image/jpeg, width=1280, height=720' ! vpudec output-format=4 ! imxv4l2sink overlay-width=640 overlay-height=480 device=/dev/video16
[INFO]  Product Info: i.MX6Q/D/S
====== IMXV4L2SINK: 4.5.7 build on Nov 13 2020 08:36:18. ======
Setting pipeline to PAUSED ...
display(/dev/fb0) resolution is (640x480).
[INFO]  Product Info: i.MX6Q/D/S
====== VPUDEC: 4.5.7 build on Nov 13 2020 08:36:18. ======
        wrapper: 3.0.0 (VPUWRAPPER_ARM_LINUX Build on Aug 17 2020 07:03:22)
        vpulib: 5.4.39
        firmware: 3.1.1.46076
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
[INFO]  bitstreamMode 1, chromaInterleave 0, mapType 0, tiled2LinearEnable 0
/GstPipeline:pipeline0/GstVpuDec:vpudec0.GstPad:sink: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVpuDec:vpudec0.GstPad:src: caps = video/x-raw, format=(string)Y42B, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)2:4:7:1, framerate=(fraction)60/1
/GstPipeline:pipeline0/GstImxV4l2Sink:imxv4l2sink0.GstPad:sink: caps = video/x-raw, format=(string)Y42B, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)2:4:7:1, framerate=(fraction)60/1
v4l2sink need allocate 3 buffers.
[  350.579295] mxc_sdc_fb fb@0: 640x480 h_sync,r,l: 64,16,80  v_sync,l,u: 4,3,13 pixclock=23750000 Hz
Caught SIGSEGV
exec gdb failed: No such file or directory
Spinning.  Please run 'gdb gst-launch-1.0 614' to continue debugging, Ctrl-C to quit, or Ctrl-\ to dump core.
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:06.452933531
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
[  355.945458] mxc_sdc_fb fb@0: 640x480 h_sync,r,l: 64,16,80  v_sync,l,u: 4,3,13 pixclock=23750000 Hz
^C
root@colibri-imx6-10801658:~#

Hi @s.steiger ,

Thank you for the log. For now, it is certainly sufficient.

Are you using a Toradex Display?

Best Regards
Kevin

@s.steiger , the Reference Multimedia Image comes with Wayland. Can you try to use waylandsink instead of imxv4l2sink?

Hi Kevin and Denis,
the display I am using is an NHD-7.0-800480EF connected to the Iris carrier board through the Toradex display adapter. waylandsink does not work at all, as the supported output formats of vpudec do not match the supported input formats of waylandsink:

root@colibri-imx6-10801658:~# /usr/bin/gst-launch-1.0 v4l2src -v device=/dev/video0 ! 'image/jpeg, width=1280, height=720' ! vpudec frame-plus=0 output-format=4 ! waylandsink
[INFO]  Product Info: i.MX6Q/D/S
Setting pipeline to PAUSED ...
[INFO]  Product Info: i.MX6Q/D/S
====== VPUDEC: 4.5.7 build on Nov 13 2020 08:36:18. ======
        wrapper: 3.0.0 (VPUWRAPPER_ARM_LINUX Build on Aug 17 2020 07:03:22)
        vpulib: 5.4.39
        firmware: 3.1.1.46076
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
[INFO]  bitstreamMode 1, chromaInterleave 0, mapType 0, tiled2LinearEnable 0
/GstPipeline:pipeline0/GstVpuDec:vpudec0.GstPad:sink: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVpuDec:vpudec0.GstPad:src: caps = video/x-raw, format=(string)Y42B, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)2:4:7:1, framerate=(fraction)60/1
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
../git/libs/gst/base/gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:01.030391001
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

waylandsink together with imx_videoconvert_ipu does work, but the framerate is only 15 fps (I need 60 fps for my application, and imxv4l2sink works at 60 fps if I add the frame-plus=0 option).

Hi @s.steiger !

If possible, I would like to ask you some questions to better understand your objective and also your setup:

  • What would you like to accomplish? Simply to display the stream from the camera?
  • Which types/encodings is your camera capable of? You can get this by issuing v4l2-ctl --list-formats-ext -d /dev/video0, where /dev/video0 is your camera device

Also, after some research, we found the question "Issues decoding MJPEG with VPUdec" on the NXP Community forum, which states that vpudec only supports baseline JPEG. So you would probably need to convert your camera stream to that format/encoding before feeding it to vpudec.

Best regards,

Hi henrique,
the objective is to build a rearview display for motorsports. The display resolution is 800 x 480. The camera supports the following formats:

root@localhost:~# v4l2-ctl --list-formats-ext -d /dev/video0
ioctl: VIDIOC_ENUM_FMT
        Type: Video Capture

        [0]: 'MJPG' (Motion-JPEG, compressed)
                Size: Discrete 1920x1080
                        Interval: Discrete 0.033s (30.000 fps)
                Size: Discrete 1280x720
                        Interval: Discrete 0.017s (60.000 fps)
                Size: Discrete 1024x768
                        Interval: Discrete 0.033s (30.000 fps)
                Size: Discrete 640x480
                        Interval: Discrete 0.008s (120.101 fps)
                Size: Discrete 800x600
                        Interval: Discrete 0.017s (60.000 fps)
                Size: Discrete 1280x1024
                        Interval: Discrete 0.033s (30.000 fps)
                Size: Discrete 320x240
                        Interval: Discrete 0.008s (120.101 fps)
        [1]: 'YUYV' (YUYV 4:2:2)
                Size: Discrete 1920x1080
                        Interval: Discrete 0.167s (6.000 fps)
                Size: Discrete 1280x720
                        Interval: Discrete 0.111s (9.000 fps)
                Size: Discrete 1024x768
                        Interval: Discrete 0.167s (6.000 fps)
                Size: Discrete 640x480
                        Interval: Discrete 0.033s (30.000 fps)
                Size: Discrete 800x600
                        Interval: Discrete 0.050s (20.000 fps)
                Size: Discrete 1280x1024
                        Interval: Discrete 0.167s (6.000 fps)
                Size: Discrete 320x240
                        Interval: Discrete 0.033s (30.000 fps)

The customer requirement is 60 fps, so the only option is to use MJPEG-compressed data from the camera.
Basically this works fine using the pipeline described above, with vpudec decoding the MJPEG data in hardware. The main issue is that I need to change the crop settings of imxv4l2sink on the fly to adjust the zoom, and this fails on the second try with a SIGSEGV in GStreamer.

Regards,

Hi @s.steiger

I was testing with a Logitech C270 USB camera, and no pipeline was working.

Now, with a Logitech C920 USB camera, I finally get something.

With your first pipeline I had no problem:

/usr/bin/gst-launch-1.0 v4l2src -v device=/dev/video0 ! 'image/jpeg, width=1280, height=720' ! vpudec output-format=4 ! imxv4l2sink overlay-width=800 overlay-height=480 device=/dev/video16

And with your second pipeline I also had no problem:

/usr/bin/gst-launch-1.0 v4l2src -v device=/dev/video0 ! 'image/jpeg, width=1280, height=720' ! vpudec frame-plus=0 output-format=4 ! imxv4l2sink overlay-width=800 overlay-height=480 device=/dev/video16

Also, I could resize the video output by tweaking overlay-width and overlay-height when launching with gst-launch-1.0. I had no segmentation fault.

A side comment: your pipeline scales down from 1280x720 (~1.78) to 800x480 (~1.67), which is not the same aspect ratio, so the final image will be (in this case, just a little bit) distorted compared to the original. If this is a problem for you, there is the possibility of not using the whole screen, e.g. 800x450 (~1.78).

Best regards,

Hi henrique.tx,
thank you for your efforts on this topic.
I tried a Logitech C930 USB camera here. It works, too.
The reason is simply that the Logitech USB cameras support 30 fps only.

root@colibri-imx6-10801658:~# v4l2-ctl --list-formats-ext -d /dev/video0
[  119.453677] usb 1-1: reset high-speed USB device number 2 using ci_hdrc
ioctl: VIDIOC_ENUM_FMT
        Type: Video Capture

        [0]: 'YUYV' (YUYV 4:2:2)
                Size: Discrete 640x480
                        Interval: Discrete 0.033s (30.000 fps)
                        ...
        [1]: 'MJPG' (Motion-JPEG, compressed)
                Size: Discrete 640x480
                        Interval: Discrete 0.033s (30.000 fps)
                        ...
                Size: Discrete 1280x720
                        Interval: Discrete 0.033s (30.000 fps)
                        Interval: Discrete 0.042s (24.000 fps)
                        Interval: Discrete 0.050s (20.000 fps)
                        Interval: Discrete 0.067s (15.000 fps)
                        Interval: Discrete 0.100s (10.000 fps)
                        Interval: Discrete 0.133s (7.500 fps)
                        Interval: Discrete 0.200s (5.000 fps)
                Size: Discrete 1600x896
                        Interval: Discrete 0.033s (30.000 fps)
                        ...

It looks like the issue is related to 60 fps.
As said before, the customer requirement is 60 fps, and my USB camera supports 60 fps at 1280x720 MJPG.

Regards,
Siegfried

Yeah… I understand.

Unfortunately, I don’t have other cameras around to test it.

Is there anything else I can do to help?

Best regards,

Hi henrique.tx,
I managed to run GStreamer under gdbserver, and now I have a backtrace of the crash:

Thread 2 "v4l2src0:src" received signal SIGSEGV, Segmentation fault.
[Switching to Thread 331.336]
0x76f4a3fc in gst_mini_object_unref (mini_object=0xffffffff) at ../git/gst/gstminiobject.c:639
warning: Source file is more recent than executable.
639	  g_return_if_fail (mini_object != NULL);
(gdb) backtrace 
#0  0x76f4a3fc in gst_mini_object_unref (mini_object=0xffffffff) at ../git/gst/gstminiobject.c:639
#1  0x76755a78 in gst_buffer_unref (buf=<optimized out>) at /usr/include/gstreamer-1.0/gst/gstbuffer.h:444
#2  gst_vpu_dec_handle_frame (bdec=0x52c260, frame=<optimized out>) at ../../../git/plugins/vpu/gstvpudec.c:331
#3  0x76a69632 in gst_video_decoder_decode_frame (decoder=decoder@entry=0x52c260, frame=0x75612950) at ../git/gst-libs/gst/video/gstvideodecoder.c:3411
#4  0x76a6b792 in gst_video_decoder_chain_forward (decoder=decoder@entry=0x52c260, buf=buf@entry=0x547738, at_eos=at_eos@entry=0)
    at ../git/gst-libs/gst/video/gstvideodecoder.c:2132
#5  0x76a6bc68 in gst_video_decoder_chain (pad=<optimized out>, parent=0x52c260, buf=0x547738) at ../git/gst-libs/gst/video/gstvideodecoder.c:2447
#6  0x76f4cffe in gst_pad_chain_data_unchecked (pad=pad@entry=0x528190, type=type@entry=4112, data=<optimized out>, data@entry=0x547738) at ../git/gst/gstpad.c:4327
#7  0x76f4eace in gst_pad_push_data (pad=0x5286f0, type=type@entry=4112, data=0x547738) at ../git/gst/gstpad.c:4583
#8  0x76f54312 in gst_pad_push (pad=<optimized out>, buffer=<optimized out>) at ../git/gst/gstpad.c:4702
#9  0x76afd4d2 in gst_base_transform_chain (pad=<optimized out>, parent=0x53c190, buffer=<optimized out>) at ../git/libs/gst/base/gstbasetransform.c:2330
#10 0x76f4cffe in gst_pad_chain_data_unchecked (pad=pad@entry=0x528598, type=type@entry=4112, data=<optimized out>, data@entry=0x547738) at ../git/gst/gstpad.c:4327
#11 0x76f4eace in gst_pad_push_data (pad=pad@entry=0x528038, type=type@entry=4112, data=0x547738) at ../git/gst/gstpad.c:4583
#12 0x76f54312 in gst_pad_push (pad=pad@entry=0x528038, buffer=<optimized out>) at ../git/gst/gstpad.c:4702
#13 0x76af9e3c in gst_base_src_loop (pad=0x528038) at ../git/libs/gst/base/gstbasesrc.c:2974
#14 0x76f78d10 in gst_task_func (task=0x547028) at ../git/gst/gsttask.c:328
#15 0x76e5fa48 in g_thread_pool_thread_proxy (data=<optimized out>) at ../glib-2.62.6/glib/gthreadpool.c:308
#16 0x76e5f4be in g_thread_proxy (data=0x5244c0) at ../glib-2.62.6/glib/gthread.c:805
#17 0x76d99898 in start_thread (arg=0xcdea0d89) at pthread_create.c:477
#18 0x76d39f1c in ?? () at ../sysdeps/unix/sysv/linux/arm/clone.S:73 from /opt/tdx-xwayland/5.4.0/sysroots/armv7at2hf-neon-tdx-linux-gnueabi/lib/libc.so.6

#3  0x76a69632 in gst_video_decoder_decode_frame (decoder=decoder@entry=0x52c260, frame=0x75612950) at ../git/gst-libs/gst/video/gstvideodecoder.c:3411
3411	../git/gst-libs/gst/video/gstvideodecoder.c: No such file or directory.
(gdb) p *frame
$5 = {ref_count = 4393712, flags = 1, system_frame_number = 0, decode_frame_number = 0, presentation_frame_number = 1995577941, dts = 6290538081, 
  pts = 8458043396244635648, duration = 18446744073709551615, distance_from_sync = -1, input_buffer = 0xffffffff, output_buffer = 0xffffffff, 
  deadline = 18446744073709551615, events = 0xffffffff, user_data = 0xffffffff, user_data_destroy_notify = 0xa8, abidata = {ABI = {ts = 18446744071383891656, 
      ts2 = 925816908}, padding = {0x75613ec8, 0xffffffff, 0x372ed84c, 0x0 <repeats 14 times>, 0x75612180, 0x75612180, 0x0}}}

It looks like the problem occurs on the first frame, with the input_buffer/output_buffer pointers being 0xFFFFFFFF (the rest of the frame structure also looks like garbage).

Any idea whom to contact to get support on this?

Regards,
Siegfried

Hi @s.steiger!

I just sent you a private message.

Best regards,