I'm trying to do simple network streaming of just video from the Leopard board 368 using the Evaluation SDK.
I've tried several of the suggested examples, including: https://www.ridgerun.com/developer/wiki/index.php/LeopardBoard_365_GStreamer_Pipelines_-_SDK_2011Q2#H264_video_streaming_pipelines_LeopardBoard_DM365

The Leopard board side seems to work. However, the Ubuntu side doesn't work; I get: "ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data flow error."
I tried to direct the stream to a Windows PC using VLC network streaming "rtp://@:3000" and I get nothing.
I also tried the network streaming post in the FAQ in this forum, "Video Streaming over RTP using RidgeRun SDK". Again, the Leopard board side seems to work; however, on the Ubuntu side I don't know what to put in for the caps parameter. There is a whole bunch of caps info displayed on the Leopard board side, but I don't know which of it to use.
It seems that a lot of these examples are incomplete, i.e. they don't include instructions for the prerequisite software required to make them work (it took me all morning to figure out how to get the ffdec_h264 component installed on my Ubuntu system), and/or they don't break things down and tell the noob what everything means and what's going on (for instance, I'm still trying to figure out what the -e option to gst-launch means; the help is no help at all).

Is there an example out there that is complete, up to date, and actually breaks down the commands so that I can actually learn and understand what's going on?
OK, I just found part of a solution. For streaming from the Leopard board to GStreamer on Ubuntu I had to do the following:
gst-launch -v -e v4l2src always-copy=FALSE input-src=composite chain-ipipe=true ! video/x-raw-yuv,format=\(fourcc\)NV12, width=1280, height=720 ! queue ! dmaiaccel ! dmaienc_h264 encodingpreset=2 ratecontrol=2 intraframeinterval=23 idrinterval=46 targetbitrate=3000000 ! rtph264pay ! udpsink port=$PORT host=$HOST_ADDR sync=false enable-last-buffer=false
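Since the original complaint was that nobody breaks these commands down, here is the same sender pipeline reformatted with comments. The per-element notes are my own reading of the gst-launch help text and the RidgeRun/TI DMAI plugin pages, so treat them as a best-effort explanation rather than authoritative documentation:

```shell
# -e (--eos-on-shutdown): push an EOS event through the pipeline on
#     Ctrl-C so the encoder can flush cleanly instead of being killed
#     mid-stream
# -v: verbose; prints the negotiated caps as the pipeline starts --
#     this is where the caps string for the receiver's udpsrc comes from
#
# v4l2src      : capture raw frames from the composite video input
# capsfilter   : request 1280x720 NV12 frames from the capture driver
# queue        : decouple the capture thread from the encode thread
# dmaiaccel    : hand buffers over to the DM36x DMA-accelerated path
# dmaienc_h264 : hardware H.264 encoder (IDR frame every 46 frames,
#                target bitrate ~3 Mbit/s)
# rtph264pay   : packetize the H.264 stream into RTP packets
# udpsink      : send the RTP packets to $HOST_ADDR:$PORT
gst-launch -v -e \
  v4l2src always-copy=FALSE input-src=composite chain-ipipe=true \
  ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1280,height=720' \
  ! queue \
  ! dmaiaccel \
  ! dmaienc_h264 encodingpreset=2 ratecontrol=2 intraframeinterval=23 \
      idrinterval=46 targetbitrate=3000000 \
  ! rtph264pay \
  ! udpsink port=$PORT host=$HOST_ADDR sync=false enable-last-buffer=false
```

This only runs on the board itself (the dmai* elements are DM36x-specific), so it can't be tried on a desktop machine.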
gst-launch -v udpsrc port=$PORT caps="application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, sprop-parameter-sets=\"Z2QAKK2EBUViuKxUdCAqKxXFYqOhAVFYrisVHQgKisVxWKjoQFRWK4rFR0ICorFcVio6ECSFITk8nyfk/k/J8nm5s00IEkKQnJ5Pk/J/J+T5PNzZprQCgC3I\\,aO48sA\\=\\=\", payload=96, ssrc=4122492447, clock-base=2438443486, seqnum-base=57603" ! rtph264depay ! queue ! ffdec_h264 ! ffmpegcolorspace ! videoscale ! ximagesink
Adding "ffmpegcolorspace ! videoscale" between ffdec_h264 and ximagesink did the trick.
I'd still like to know why this was required and how one would have worked this out simply from the gstreamer output.
The problem with the pipeline is the caps; that pipeline (on the Ubuntu client) should be:
gst-launch -v udpsrc port=3000 ! 'application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)"Z2QAKK2EBUViuKxUdCAqKxXFYqOhAVFYrisVHQgKisVxWKjoQFRWK4rFR0ICorFcVio6ECSFITk8nyfk/k/J8nm5s00IEkKQnJ5Pk/J/J+T5PNzZprQCgC3I\,aO48sA\=\=", payload=(int)96, ssrc=(guint)1335677188, clock-base=(guint)2580247201, seqnum-base=(guint)5999' ! rtph264depay ! 'video/x-h264' ! ffdec_h264 ! 'video/x-raw-yuv, width=(int)1280, height=(int)720, framerate=(fraction)25/1, format=(fourcc)I420, interlaced=(boolean)false' ! xvimagesink
It works with ffmpegcolorspace because ffmpegcolorspace converts video according to the needs of the downstream element; in your case, ffmpegcolorspace is doing a conversion for ximagesink.
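One way to work this out from GStreamer itself, rather than by trial and error, is to compare the pad templates of the adjacent elements with gst-inspect. The commands below assume the GStreamer 0.10 tools from this SDK era; on a 0.10 install, ximagesink advertises only RGB caps while xvimagesink also accepts YUV via the Xv extension:

```shell
# Show the SRC pad template of the decoder: it emits raw YUV
# (video/x-raw-yuv, e.g. I420), not RGB.
gst-inspect-0.10 ffdec_h264

# Show the SINK pad template of ximagesink: it accepts only
# video/x-raw-rgb, so the decoder cannot link to it directly;
# ffmpegcolorspace bridges the two formats.
gst-inspect-0.10 ximagesink

# xvimagesink (used in the corrected pipeline above) accepts
# video/x-raw-yuv through the Xv extension, so no converter is needed.
gst-inspect-0.10 xvimagesink
```

When two adjacent elements share no common caps, gst-launch aborts with a could-not-negotiate / internal data flow error; inserting ffmpegcolorspace (and videoscale, for size mismatches) between them is the standard fix.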
Carlos Aguero
Embedded Software Engineer
RidgeRun