gstreamer: cannot launch rtsp streaming

I am new to gstreamer. Although it sounds like a very entry-level question, I couldn't find a clear answer so far.
I tried to launch a server like below, following an example:
$ gst-launch-1.0 -v videotestsrc ! x264enc ! rtph264pay name=pay0 pt=96 ! udpsink rtsp://127.0.0.1:8554/test
Then I use VLC as the client (on the same computer):
$ vlc rtsp://127.0.0.1:8554/test
VLC reports an "Unable to connect..." error. But if I use "test-launch" in the first step, it works fine.
Another question: besides VLC, I tried to launch a client like this.
$ gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! rtph264depay ! ffdec_h264 ! xvimagesink
But gstreamer complains that there is no element "ffdec_h264" and no element "xvimagesink".
For extra info, I installed "gstreamer" and "gst-plugins-base/good/bad/ugly", all from git (1.2 version).
Thank you so much for any hint.

ffdec_h264 is from gst-0.10, so in gst-1.0 you need to use avdec_h264 instead. On the other hand, I usually use autovideosink sync=false as the pipeline sink in my UDP streams.
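So the client line from the question, ported to 1.0, might look like this (a sketch; untested, but all the elements exist in 1.x):
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! rtph264depay ! avdec_h264 ! autovideosink sync=false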
There is example code in gst-rtsp-0.10.8/examples that can help you with the RTSP stream server, but I suggest you receive the stream using udpsrc in gstreamer in order to reduce the delay (use the -v option on the sender to see the caps and configure them in the receiver).
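As a sketch of that udpsrc approach (the host, port, and caps here are placeholder values; paste the exact caps printed by the sender's -v output into the receiver):
Sender:
gst-launch-1.0 -v videotestsrc ! x264enc tune=zerolatency ! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=5000
Receiver:
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 ! autovideosink sync=false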
If you want VLC to play your UDP/RTP stream, you need to define an .sdp file that describes your stream session.
You should see this question for more info:
GStreamer rtp stream to vlc
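For reference, a minimal .sdp for the UDP sender sketched above might look like this (the address, port, and payload type are assumptions and must match your sender):
v=0
o=- 0 0 IN IP4 127.0.0.1
s=GStreamer H264 stream
c=IN IP4 127.0.0.1
t=0 0
m=video 5000 RTP/AVP 96
a=rtpmap:96 H264/90000
Save it as stream.sdp and open it with vlc stream.sdp.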

I don't know about VLC, but as far as the gstreamer launch line goes, you seem to be missing the FFmpeg-based plugin package (gst-libav in GStreamer 1.x). You can probably find it in the same place as you found the other plugins.
Also, replace xvimagesink with autovideosink, which will pick whatever video sink you have available.

Related

Using a video device in 2 applications simultaneously with Gstreamer

I am trying to use the camera feed on my Jetson Nano (running headless over SSH) in 2 different applications.
From the command line, I am able to run
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=3280, height=2464, format=NV12, framerate=(fraction)21/1' ! nvvidconv ! xvimagesink
which streams video from my camera (an IMX219 connected to the Jetson Nano) to my desktop via an X11 window.
What I would like to do is somehow use that same video stream in 2 different applications.
My first application is a Python program that runs some OpenCV stuff; the second is a simple bash script that records video to an *.mp4 file.
Is such a thing possible? I have looked into using v4l2loopback, but I'm unsure if that is really the simplest way to do this.
Well, I managed to figure it out thanks to both commenters. Here is my solution on my Jetson Nano, but it can be tweaked for any Gstreamer application.
First, use v4l2loopback to create 2 virtual devices like so:
sudo modprobe v4l2loopback video_nr=1,2
That will create /dev/video1 and /dev/video2.
Then you use tee to split the Gstreamer stream into each of these virtual devices; here is my line:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=3280, height=2464, format=NV12, framerate=(fraction)21/1' ! nvvidconv ! tee name=t ! queue ! v4l2sink device=/dev/video1 t. ! queue ! v4l2sink device=/dev/video2
This is specifically for my Jetson Nano and my specific camera, but you can change the gstreamer pipeline to do whatever you wish.
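From there, each consumer simply opens its own loopback device: OpenCV can open /dev/video1 (e.g. cv2.VideoCapture(1)), and a recording script could use something like the sketch below (the software x264enc encoder is an assumption; on a Jetson you would likely swap in the hardware encoder). The -e flag sends EOS on Ctrl+C so mp4mux can finalize the file:
gst-launch-1.0 -e v4l2src device=/dev/video2 ! videoconvert ! x264enc ! h264parse ! mp4mux ! filesink location=output.mp4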

Using the vlc command line to capture from a network URL

I'm trying to capture a video using vlc.
The procedure is standard using the GUI:
enter the URL, e.g. http://some_site_some_video.mp4/playlist.m3u8, in the network protocol capture device tool (Ctrl+N); in the next screen, enter the path to save the file, and that's it.
I tried the VLC docs, and the closest command I found was vlc -I dummy -vvv input_stream --sout, but
vlc -I dummy -vvv http://some_site_some_video.mp4/playlist.m3u8 --sout home/me/videos
didn't work.
Is it the right command to use?
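Not quite; --sout expects a module chain (or a shorthand) ending in a file path, not a bare directory. A hedged sketch of what a working invocation might look like (the destination path is a placeholder; vlc://quit makes VLC exit when the input ends):
vlc -I dummy -vvv "http://some_site_some_video.mp4/playlist.m3u8" --sout "#std{access=file,mux=ts,dst=/home/me/videos/capture.ts}" vlc://quit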

gstreamer 1.0 and raspberry pi streaming from webcam to ios browser

Using gstreamer 1.0, I would like to create a pipeline that streams video from a Logitech C920 webcam to an iPhone running Chrome for iOS. This pipeline would run on a Raspberry Pi Model B. I think I need to use hlssink and serve an .m3u8 file. I was thinking of running a python-tornado webserver to serve the .m3u8 file on the Raspberry Pi. I also know that the Logitech C920 supports hardware encoding for H.264 and would like to use that if possible. So far I've been unsuccessful and would appreciate any help or feedback.
Considering that the minimal hlssink pipeline is like this:
gst-launch-1.0 videotestsrc is-live=true ! x264enc ! mpegtsmux ! hlssink max-files=5
You need to encode the raw source from the camera with x264enc and then parse it with h264parse. After that you need to multiplex the media streams into an MPEG transport stream (in this case we have only video).
The final pipeline would be, for example:
gst-launch-1.0 videotestsrc is-live=true ! video/x-raw, framerate=25/1, width=720, height=576, format=I420 ! x264enc bitrate=1000 key-int-max=25 ! h264parse ! video/x-h264 ! queue ! mpegtsmux ! hlssink playlist-length=10 max-files=20 playlist-root="http://localhost/hls/" playlist-location="/var/www/html/hls/stream0.m3u8" location="/var/www/html/hls/fragment%06d.ts" target-duration=5
I have added some caps to help you; put them after v4l2src device=/dev/video0, though the exact values depend on the camera model. I also set several hlssink properties to show how to configure the different file locations. The pipeline above runs with videotestsrc and writes the fragments and playlist into the /var/www/html/hls folder.
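Since the question mentions the C920's built-in H.264 encoder, a variant that skips x264enc entirely might look like the sketch below (the device path and caps are assumptions; check what the camera actually offers with v4l2-ctl --list-formats-ext):
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-h264, width=1280, height=720, framerate=30/1 ! h264parse ! mpegtsmux ! hlssink playlist-length=10 max-files=20 playlist-root="http://localhost/hls/" playlist-location="/var/www/html/hls/stream0.m3u8" location="/var/www/html/hls/fragment%06d.ts" target-duration=5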
Tested with Apache; it is possible to view the result with vlc or by simply running:
gst-launch-1.0 playbin uri=http://localhost/hls/stream0.m3u8
If you have any doubts about capturing from a webcam, you can follow this link for more information.

ffmpeg opens remote video with high latency while gstreamer does not

I use MJPEG-Streamer to send my remote camera's video over Wi-Fi. I use the following commands to view the video:
gstreamer:
gst-launch -v souphttpsrc location= "http://192.168.1.1:8080/?action=stream&type=.mjpg" do-timestamp=true is_live=true ! multipartdemux ! jpegdec ! autovideosink
ffmpeg:
ffplay "http://192.168.1.1:8080/?action=stream&type=.mjpg"
or:
ffplay "http://192.168.1.1:8080/?action=stream&type=.mjpg" -fflags nobuffer
However, ffplay has a high latency of up to 3~10 seconds in my test, while gstreamer shows almost no latency.
When using MJPEG-Streamer on localhost, both methods show low latency. So what's the reason, and how can I decrease the latency?
More detail:
I want to use OpenCV to capture a remote camera. My OpenCV is compiled with ffmpeg support but without gstreamer support (I tried but failed; cmake did not seem to find my gstreamer, and I don't know which gstreamer library to install on openSUSE 13.1). I can get the video in OpenCV, but with high latency, so I compared ffmpeg with gstreamer; the result is as above. So how do I decrease the latency? I read this link, but still found no solution.
Thank you.
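One thing worth trying (a guess rather than a verified fix) is shrinking ffplay's stream probing and buffering, which often dominates startup latency:
ffplay -fflags nobuffer -flags low_delay -probesize 32 -analyzeduration 0 "http://192.168.1.1:8080/?action=stream&type=.mjpg"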

VLC screen:// won't produce a file using :sout=

Why won't this produce a file? It does everything right except saving to an actual file. I am using Linux, VLC 1.1.9, compiled without the skins2, qt, or ncurses interfaces.
vlc :sout=#transcode{vcodec=mp4,acodec=mp4a,vb=800,scale=1}:std{access=file,mux=mp4,dst="~/file.mp4"} screen:// screen-fps=12 screen-caching=100
Note that this also does the same thing: it shows the screen:// output fine, but will not write to a file:
vlc :sout=#transcode{vcodec=h264,acodec=none,vb=800}:std{access=file,mux=avi,dst="/root/file.avi"} screen:// screen-fps=12 screen-caching=100
Try this:
vlc -vvv (PORT (UDP, HTTP)) --ts-dump-file video.ts (and add more options)
vlc -vvv (PORT (UDP, HTTP)) --sout-all file/ogg,mpg,ts:filename (and add more options)
The above commands should work!
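For the screen:// case specifically, note that :-prefixed options apply to the MRL they follow, so they must come after screen://, and ~ is not expanded inside the --sout string. A hedged sketch with those two things fixed (paths are placeholders):
vlc screen:// :screen-fps=12 :screen-caching=100 --sout "#transcode{vcodec=h264,vb=800,acodec=none}:std{access=file,mux=mp4,dst=/home/me/file.mp4}" vlc://quit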
