How can I get the output file of this FFmpeg command saved to a variable?
def take_screenshot
  logger.debug "Trying to grab a screenshot from #{self.file}"
  system "ffmpeg -i #{self.file} -ss 00:00:02 -vframes 1 #{Rails.root}/public/uploads/tmp/screenshots/#{File.basename(self.file)}.jpg"
  self.save!
end
I have tried:
self.screenshot = system "ffmpeg -i #{self.file} -ss 00:00:02 -vframes 1 #{Rails.root}/public/uploads/tmp/screenshots/#{File.basename(self.file)}.jpg"
but this doesn't save anything.
Thanks in advance!
ffmpeg usually outputs nothing on stdout and all of its debug messages on stderr. You can make it output the video (or image) to stdout when you pass - as the output file. You'd then also need to suppress stderr.
system "ffmpeg -i #{self.file} -ss 00:00:02 -c:v mjpeg -f mjpeg -vframes 1 - 2>/dev/null"
This will output the raw data of the JPEG-encoded image to stdout. From there you can save the data to a variable and, for example, transfer it somewhere else.
To capture stdout from system calls, see here: Getting output of system() calls in Ruby – popen3 in particular should help in your case, since it lets you discard stderr from within Ruby.
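Putting the two together, a minimal sketch (the screenshot attribute and the idea of storing the raw bytes on the model are assumptions about your setup, not something from your code):

require "open3"

def take_screenshot
  logger.debug "Trying to grab a screenshot from #{self.file}"
  cmd = "ffmpeg -i #{self.file} -ss 00:00:02 -c:v mjpeg -f mjpeg -vframes 1 -"
  Open3.popen3(cmd) do |stdin, stdout, stderr, wait_thr|
    stdin.close
    drain = Thread.new { stderr.read }     # discard ffmpeg's log chatter
    self.screenshot = stdout.binmode.read  # raw JPEG bytes from stdout (assumed attribute)
    drain.join
  end
  self.save!
end

Draining stderr in a separate thread keeps ffmpeg from blocking if its log output fills the pipe buffer.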
Environment: Docker, Ubuntu 20.04, OpenCV 3.5.4, FFmpeg 4.2.4
I'm currently reading the output of a cv2.VideoCapture session using the CV_FFMPEG backend and successfully writing it back out in real time to a file using cv2.VideoWriter. The reason I am doing this is to draw bounding boxes on the input and save it to a new output.
The problem is that I am doing this in a headless environment (a Docker container), and I'd like to view what's being written to cv2.VideoWriter in real time.
I know there are ways to pass my display through, using XQuartz for example, so I could use cv2.imshow. But what I really want to do is write those frames to an RTSP server, so that not only my host can "watch" but other hosts could watch too.
After the video is released, I can easily stream it to my RTSP server using this command.
ffmpeg -re -stream_loop -1 -i output.mp4 -c copy -f rtsp rtsp://rtsp_server_host:8554/stream
Is there any way to pipe the frames to the above command as they come in? Can cv2.VideoWriter itself write frames to an RTSP server?
Any ideas would be much appreciated! Thank you.
After much searching I finally figured out how to do this with FFmpeg in a subprocess. Hopefully this helps someone else!
import subprocess

import cv2
import numpy as np

def open_ffmpeg_stream_process():
    # ffmpeg reads raw rgb24 frames from stdin (pipe:0) and publishes
    # them to the RTSP server as an encoded stream.
    args = (
        "ffmpeg -re -stream_loop -1 -f rawvideo -pix_fmt "
        "rgb24 -s 1920x1080 -i pipe:0 -pix_fmt yuv420p "
        "-f rtsp rtsp://rtsp_server:8554/stream"
    ).split()
    return subprocess.Popen(args, stdin=subprocess.PIPE)

def capture_loop():
    ffmpeg_process = open_ffmpeg_stream_process()
    capture = cv2.VideoCapture(<video/stream>)
    while True:
        grabbed, frame = capture.read()
        if not grabbed:
            break
        # OpenCV frames are BGR-ordered; if the colors look swapped, change
        # -pix_fmt rgb24 above to bgr24. Frame size must match the -s argument.
        ffmpeg_process.stdin.write(frame.astype(np.uint8).tobytes())
    capture.release()
    ffmpeg_process.stdin.close()
    ffmpeg_process.wait()
I have a simple Ruby script:
#! /usr/bin/env ruby
require 'fileutils'
FileUtils.rm "output.mkv" if File.exists?("output.mkv")
pid = Process.spawn("ffmpeg -i wrong_file.mp4 -c:v libx264 -preset veryslow -qp 0 output.mkv", STDOUT => "output.txt", STDERR => "error.txt")
puts "pid : #{pid}"
Process.wait(pid)
But both the STDOUT and STDERR output ends up in error.txt. Why?
It looks like ffmpeg uses different file descriptors? (Normally 1 is stdout and 2 is stderr.)
Note: I don't want to use native shell redirects like '> output.txt 2> error.txt', because I want the pid of the ffmpeg process, not the shell process, so I can kill it later.
According to the spawn method documentation, you should be doing this:
pid = Process.spawn("ffmpeg -i wrong_file.mp4 -c:v libx264 -preset veryslow -qp 0 output.mkv", :out => "output.txt", :err => "error.txt")
The problem was resolved: FFmpeg writes all of its console output to STDERR.
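Given that, if you'd rather have everything in one file, Process.spawn can merge the child's stderr into its stdout (a sketch using spawn's documented redirection syntax):

pid = Process.spawn("ffmpeg -i wrong_file.mp4 -c:v libx264 -preset veryslow -qp 0 output.mkv",
                    :out => "output.txt",
                    :err => [:child, :out])  # stderr follows stdout into output.txt
Process.wait(pid)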
I am streaming a video from a Raspberry Pi using this command:
ffmpeg -re -threads 2 -i sample_video.m2v -f mpegts - | \
ffmpeg -f mpegts -i - -c copy -f mpegts udp://192.168.1.100:12345
The remote PC at 192.168.1.100 uses the ffmpeg library to listen for the input stream. For example:
informat = ffmpeg::av_find_input_format("mpegts");
avformat_open_input(&pFormatCtx, "udp://192.168.1.100:12345", informat, options);
However, when I compute the hash value of each decoded frame on the two sides (i.e. the Raspberry Pi and the PC), they DON'T MATCH at all. A weird thing is that among ~2000 frames, there are in total only ~10 frames whose hash values are the same on the sender and receiver sides. The match result looks like this:
00000....00011000...00011110000...000
where 0 indicates a mismatch and 1 indicates a match. The matched frames appeared in runs of 2~6 and only rarely, while most of the other frames have different hash values.
The hash is computed on the frame data buffer extracted using avpicture_layout(). On the Pi side, I just stream the video to a local port and there's a local process using the same code to decode and hash the frames:
ffmpeg -re -threads 2 -i sample_video.m2v -f mpegts - | \
ffmpeg -f mpegts -i - -c copy -f mpegts udp://localhost:12345
...
The streaming source, the Raspberry Pi, is connected directly to the PC by cable. I don't think it is a packet-loss problem. First, I reran the same process several times and the hash values of the received frames were the same each time (otherwise the results should differ, because packet loss is probabilistic). Second, I even tried streaming over tcp://192.168.1.100:12345 (with "tcp://192.168.1.100:12345?listen" on the PC), and the received frame hashes were still the same: different from the hash results on the Pi.
So, does anyone know why streaming to a remote address yields different decoded frames? Maybe I am missing some detail.
Thanks in advance!!
I have installed ffmpeg and mjpeg-streamer. The latter reads a .jpg file from /tmp/stream and outputs it via http onto a website, so I can stream whatever is in that folder through a web browser.
I wrote a bash script that continuously captures a frame from the webcam and puts it in /tmp/stream:
while true
do
ffmpeg -f video4linux2 -i /dev/v4l/by-id/usb-Microsoft_Microsoft_LifeCam_VX-5000-video-index0 -vframes 1 /tmp/stream/pic.jpg
done
This works great, but is very slow (~1 fps). In the hopes of speeding it up, I want to use a single ffmpeg command which continuously updates the .jpg at, let's say 10 fps. What I tried was the following:
ffmpeg -f video4linux2 -r 10 -i /dev/v4l/by-id/usb-Microsoft_Microsoft_LifeCam_VX-5000-video-index0 /tmp/stream/pic.jpg
However this - understandably - results in the error message:
[image2 @ 0x1f6c0c0] Could not get frame filename number 2 from pattern '/tmp/stream/pic.jpg'
av_interleaved_write_frame(): Input/output error
...because the output pattern is bad for a continuous stream of images.
Is it possible to stream to just one jpg with ffmpeg?
Thanks...
You can use the -update option:
ffmpeg -y -f v4l2 -i /dev/video0 -update 1 -r 1 output.jpg
From the image2 file muxer documentation:
-update number
If number is nonzero, the filename will always be interpreted as just a
filename, not a pattern, and this file will be continuously overwritten
with new images.
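Applied to the webcam device from the question, at the 10 fps you were after, that would look something like this (an untested sketch of the same approach):
ffmpeg -f video4linux2 -i /dev/v4l/by-id/usb-Microsoft_Microsoft_LifeCam_VX-5000-video-index0 -r 10 -update 1 -y /tmp/stream/pic.jpg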
It is possible to achieve what I wanted by using:
./mjpg_streamer -i "input_uvc.so -r 1280x1024 -d /dev/video0 -y" -o "output_http.so -p 8080 -w ./www"
...from within the mjpg_streamer directory. It will do all the nasty work for you, displaying the stream in the browser at the address:
http://{IP-OF-THE-SERVER}:8080/
It's also light-weight enough to run on a Raspberry Pi.
Here is a good tutorial for setting it up.
Thanks for the help!
I'm running an ffmpeg command to try to get the duration of a video file. The command is as follows...
system('ffmpeg -i C:\Users\example\Desktop\video9.mp4 -f ffmetadata')
When I run that line, it outputs a lot of info to the Rails console, including the duration. But how can I capture that info so I can split it and grab the data I need? (I'm doing this inside a Rails controller.)
When I run something like this...
metadata = system('ffmpeg -i C:\Users\example\Desktop\video9.mp4 -f ffmetadata')
puts metadata
All it returns is false.
Use:
output = `ffmpeg -i C:\\Users\\example\\Desktop\\video9.mp4 -f ffmetadata`
The problem is that system doesn't capture the output of the command being run. Instead, use %x[...] or its backtick equivalent, which captures the sub-shell's STDOUT.
If you need more control, look at Open3.capture3.
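For example, a capture3 sketch (path taken from the question): ffmpeg prints the stream info, including Duration, to stderr, so look there rather than stdout.

require "open3"

_out, err, _status = Open3.capture3('ffmpeg -i C:\Users\example\Desktop\video9.mp4')
duration = err[/Duration: ([^,]+)/, 1]   # e.g. "00:00:42.52"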
Found it...
inspect_command = "ffmpeg -i " + file_location + " 2>&1 "
metadata = `#{inspect_command}`
If all you need is the video duration, use ffprobe instead of ffmpeg. It returns the video metadata directly.
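Something along these lines (a sketch; the flags are standard ffprobe options, and file_location is reused from the answer above):

duration_seconds = `ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 #{file_location}`.to_f
# => duration in seconds, e.g. 123.456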