GStreamer network stream (without SDP file)

I am trying to stream from an i.MX 6 Apalis device. I can successfully capture video and record it, but I cannot stream it to the network. At this stage I cannot use an SDP file on the client side. I got this working on both Raspberry Pi and NVIDIA devices; however, the pipeline elements I normally use do not exist on the Toradex image. Can you give me an example of network streaming, or point out the necessary changes to the NVIDIA example below?

My pipeline for the NVIDIA TX2:

gst-launch-1.0 nvcamerasrc fpsRange="30.0 30.0" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! nvtee ! nvvidconv flip-method=2 ! 'video/x-raw(memory:NVMM), format=(string)I420' ! nvoverlaysink -e ! videoconvert ! x264enc noise-reduction=10000 tune=zerolatency byte-stream=true threads=4 key-int-max=15 intra-refresh=true ! mpegtsmux alignment=7 name=mux ! rtpmp2tpay ! queue ! udpsink host=127.0.0.1 port=5000 sync=true uridec. ! audioconvert ! voaacenc ! audio/mpeg ! queue ! mux.

I edited it like this (with a regular video file):

gst-launch-1.0 -v uridecodebin name=uridec uri=file:/home/root/vid.mp4 ! videoconvert ! videoscale ! video/x-raw,format=I420,width=800,height=600,framerate=25/1  ! vpuenc_h264  ! rtph264pay ! queue ! udpsink host=127.0.0.1 port=5000 sync=true uridec.  ! queue ! mux.

I also tried different approaches, with no luck.

Thanks for your time.

I created my pipeline from scratch. The mux was the missing piece: by using mpegtsmux I can successfully stream from a PC. I translated x264enc to vpuenc_h264, but I could not find an equivalent of mpegtsmux. What should I use?

I guess if mpegtsmux is really missing in our regular demo images, you may easily add it to your custom image. It looks like mpegtsmux may be part of gstreamer1.0-plugins-bad. Alternatively, Fluendo may also have it available.
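For reference, once mpegtsmux is available on the module, a pipeline along these lines might work (an untested sketch; <client-ip> is a placeholder, and h264parse is inserted only to be safe between the encoder and the muxer):

gst-launch-1.0 -e videotestsrc ! vpuenc_h264 ! h264parse ! mpegtsmux alignment=7 ! udpsink host=<client-ip> port=5000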

OK. I am compressing video to H.264 and streaming it to the network. After that I am reading it with MPlayer (without any config file, just giving it the port).

To be quicker, I am cross-checking it from a PC.
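The receiving side is roughly the following (exact URL syntax can vary between MPlayer builds; this assumes the stream is sent to the local machine on port 5004):

mplayer udp://127.0.0.1:5004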

On Ubuntu:
gst-launch-1.0 -v ximagesrc use-damage=false ! videoconvert ! videoscale ! video/x-raw,width=1920,height=1080 ! queue ! x264enc bitrate=5000 ! mpegtsmux alignment=7 ! udpsink host=127.0.0.1 port=5004

This command works pretty well. Without mpegtsmux, GStreamer constantly reports that the packet is too large.

On the Apalis:
gst-launch-1.0 -vv -e videotestsrc ! queue ! vpuenc_h264 ! rndbuffersize max=1316 min=1316 ! udpsink host=192.168.1.34 port=5004

I tried this; it does not throw an error. However, MPlayer reads nothing and Wireshark shows strange packet behavior. So I believe I need a mux after vpuenc_h264. Is there anything I can try before building a custom image?
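As a quick sanity check of the encoder path before rebuilding the image, RTP payloading with a GStreamer receiver on the PC could be tried instead of MPlayer (a rough sketch, assuming rtph264pay and h264parse are present on the Apalis image; MPlayer itself would still need the MPEG-TS mux or an SDP file):

On the Apalis:
gst-launch-1.0 -e videotestsrc ! vpuenc_h264 ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.1.34 port=5004

On the PC:
gst-launch-1.0 udpsrc port=5004 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink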

Why don’t you just try running JetPack if you do not like everybody else’s Embedded Linux approach?