Streaming a local video file as RTSP and publishing it to a port - docker

I am trying to set up a single Docker container that can stream a local video file as RTSP to a port.
Meaning, within the container there will be some local videos published as RTSP on a port of that container.
Then, externally, I can fetch the stream from rtsp://<host>:<port>/mystream.
I tried looking into rtsp-simple-server, but it does not seem to have an option for streaming a local file directly; instead, it requires first setting up a server container and then using ffmpeg to publish the video to that server.
Is there a way to achieve the desired single-container RTSP stream server?
There is another answer suggesting building a Docker image with VLC installed, but that seems bulky and overkill, and the resulting stream does not appear to be as smooth.
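For reference, the two-step rtsp-simple-server flow described above (the port, file path, and stream name are placeholders) would look roughly like this, with ffmpeg looping the local file and publishing it to the already-running server:

$ ffmpeg -re -stream-loop -1 -i /videos/sample.mp4 -c:v libx264 -f rtsp rtsp://localhost:8554/mystream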

Hi,
Most RTSP servers work the way you describe: you have a server instance and then publish a stream to it.
It's not hard to build your own RTSP server with GStreamer and Python; see the linked answer, they did exactly what you need (a sketch follows below).
A good start for a new programming project =)
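A minimal sketch of such a GStreamer/Python RTSP server, assuming the GObject introspection bindings and gst-rtsp-server are installed in the image (e.g. python3-gi plus the GStreamer plugins); the file path, port, and mount point are placeholders, and looping the file would need extra handling:

#!/usr/bin/env python3
# Serve a local video file over RTSP from inside the container.
import gi
gi.require_version("Gst", "1.0")
gi.require_version("GstRtspServer", "1.0")
from gi.repository import Gst, GstRtspServer, GLib

Gst.init(None)

server = GstRtspServer.RTSPServer()
server.set_service("8554")  # port to expose from the container

factory = GstRtspServer.RTSPMediaFactory()
# Decode the local file, re-encode to H.264, and payload it for RTP.
factory.set_launch(
    "( filesrc location=/videos/sample.mp4 ! decodebin ! videoconvert "
    "! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"
)
factory.set_shared(True)  # all clients share a single pipeline

server.get_mount_points().add_factory("/mystream", factory)
server.attach(None)

print("Serving rtsp://0.0.0.0:8554/mystream")
GLib.MainLoop().run()

Run this as the container's entrypoint, publish port 8554 (docker run -p 8554:8554 ...), and the stream is reachable at rtsp://<host>:8554/mystream.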

Related

Containerized applications execute local script?

I'm trying to self-host videos with nginx using nginx-rtmp-module (VOD), similar to YouTube.
I successfully hosted videos by using ffmpeg to convert an mp4 file to DASH chunks.
I want my site to be able to:
upload a video
have the containerized Golang app save the file locally
run an ffmpeg script to convert it to DASH chunks
How can I handle the third step?
Is there a better way to build a self-hosted VOD service?
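For reference, a DASH conversion along the lines described above (the filenames and codec settings are assumptions) might look like:

$ ffmpeg -i input.mp4 -map 0 -c:v libx264 -c:a aac -f dash output/manifest.mpd

The dash muxer writes the segment files alongside the manifest, so the output directory must exist beforehand.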
I run /usr/bin/ffmpeg but the output is "not found".
That is because the ffmpeg executable mounted from the host depends on dynamic libraries that are either not present in the Docker image or not at the right version (see ldd or lddtree for analysis).
It is better to build a dedicated image with the right tools installed in it rather than relying on host content for program execution.
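A minimal sketch of such a dedicated image (the base image and package names are assumptions):

FROM debian:bookworm-slim
# Install ffmpeg inside the image so the binary ships with its own dynamic libraries
RUN apt-get update \
 && apt-get install -y --no-install-recommends ffmpeg \
 && rm -rf /var/lib/apt/lists/*
ENTRYPOINT ["ffmpeg"]

The Go app can then invoke ffmpeg (for example via os/exec) knowing the binary and its libraries are baked in together.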

How to use guacenc in guacamole?

I am running Guacamole using a Docker image. I want to record RDP sessions. I've recorded an RDP session, which is in raw format. In the Guacamole docs there is a utility called guacenc which converts the recorded data into an .m4v video using this command:
guacenc /path/to/recording/NAME
What I do not know is where I have to run this command.
You have to be logged into a terminal session on your server.
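If guacd itself runs in a container and the guacenc utility is present in that image (an assumption; the container name, resolution, and recording path are placeholders), you can exec into that container and run it there:

$ docker exec -it guacd guacenc -s 1280x720 /path/to/recording/NAME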

Big Data project requirements using twitter streams

I am currently trying to break into data engineering, and I figured the best way to do this was to get a basic understanding of the Hadoop stack (played around with the Cloudera quickstart VM and went through the tutorial) and then try to build my own project. I want to build a data pipeline that ingests Twitter data, stores it in HDFS or HBase, and then runs some sort of analytics on the stored data. I would also prefer to use real-time streaming data, not historical/batch data. My data flow would look like this:
Twitter Stream API --> Flume --> HDFS --> Spark/MapReduce --> Some DB
Does this look like a good way to bring in my data and analyze it?
Also, how would you guys recommend I host/store all this?
Would it be better to have one instance on AWS EC2 for Hadoop to run on, or should I run it all in a local VM on my desktop?
I plan to have only a single-node cluster to start.
First of all, Spark Streaming can read from Twitter, and in CDH, I believe that is the streaming framework of choice.
Your pipeline is reasonable, though I might suggest using Apache NiFi (which is in the Hortonworks HDF distribution), or Streamsets, which is installable in CDH easily, from what I understand.
Note that these run completely independently of Hadoop. Hint: Docker works great with them. HDFS and YARN are really the only complex components that I would rely on a pre-configured VM for.
Both NiFi and Streamsets give you a drag-and-drop UI for hooking Twitter up to HDFS and "other DB".
Flume can work, and a single pipeline is easy, but it just hasn't matured to the level of the other streaming platforms. Personally, I like a Logstash -> Kafka -> Spark Streaming pipeline better, for example because Logstash configuration files are nicer to work with (the Twitter plugin is built in; a sketch follows below). And Kafka works with a bunch of tools.
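A minimal sketch of that Logstash -> Kafka leg (the credentials, keywords, broker address, and topic are placeholders):

input {
  twitter {
    consumer_key       => "<key>"
    consumer_secret    => "<secret>"
    oauth_token        => "<token>"
    oauth_token_secret => "<token-secret>"
    keywords           => ["hadoop"]
  }
}
output {
  kafka {
    bootstrap_servers => "localhost:9092"
    topic_id          => "tweets"
  }
}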
You could also try out Kafka with Kafka Connect, or use Apache Flink for the whole pipeline.
Primary takeaway: you can bypass Hadoop here, or at least have something like this
Twitter > Streaming Framework > HDFS
.. > Other DB
... > Spark
Regarding running locally or not, as long as you are fine with paying for idle hours on a cloud provider, go ahead.

creating a jack client from inside a docker container

I use JACK to route audio between multiple sound cards in my PC.
To record the audio I use a very convenient FFmpeg command which creates a writable JACK client:
ffmpeg -f jack -i <client_name> -strict -2 -y <output_file_name>
So far this works very well.
The problem starts here:
I also have an nginx Docker container which records my data and makes it available for streaming. When trying to use the same command inside the container, I get the following error: "Unable to register as a JACK client".
I started to look into the FFmpeg code and found out that the FFmpeg command calls jack_client_open from the JACK API, and that call fails.
It seems like there is some kind of problem in the connection between the FFmpeg process inside the container and the jackd server running on the host.
Is there a simple way to create a connection between the two [exposing ports]?
(I saw some solutions like netjack2, but before building a more complex server-client architecture I'd like to find a more elegant solution.)
Thanks for the help!
I've just got this working, and I required the following in my docker run command:
--volume=/dev/shm:/dev/shm:rw
--user=1000
so that the container runs as a user which can access the files in /dev/shm created by a jackd spawned from my host user account. This wouldn't be required if your jackd and the container were both running as root.
You can confirm it's working by running jack_simple_client in the container; you should get a beep.
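Putting those flags together (the image name, JACK client name, and output path are placeholders), the full command might look like:

$ docker run --volume=/dev/shm:/dev/shm:rw --user=1000 my-nginx-image \
    ffmpeg -f jack -i my_client -strict -2 -y /recordings/out.wav

JACK clients find the server through shared memory under /dev/shm, which is why bind-mounting it (with a matching UID) lets the containerized FFmpeg register with the host's jackd.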

How to capture an ip camera (mjpeg) stream and replay it later via commandline?

To test software which processes IP camera streams (e.g. MJPEG), I would like to capture a short sequence from a real camera and later stream the recording in a loop, as if it were coming from an IP camera. It should be command-line based to simplify automated integration testing.
I already figured out the recording part (capturing 10 seconds):
$ vlc -I dummy --run-time=10 http://192.168.0.142:8080/videofeed --sout=file/asf:test-stream.asf vlc://quit
How can I use VLC or something similar to loop this recording as an MJPEG stream served on http://localhost:8080 or similar?
I figured it out by myself:
$ vlc -I dummy -vvv test-stream.asf -L --sout '#standard{access=http,mux=mpjpeg,dst=:8080}'
(-L loops the input indefinitely, and the sout chain serves it over HTTP with the multipart JPEG muxer on port 8080.)
