Redirect shell output for ruby script - ruby-on-rails

I have a simple Ruby script:
#! /usr/bin/env ruby
require 'fileutils'
FileUtils.rm "output.mkv" if File.exist?("output.mkv")
pid = Process.spawn("ffmpeg -i wrong_file.mp4 -c:v libx264 -preset veryslow -qp 0 output.mkv", STDOUT => "output.txt", STDERR => "error.txt")
puts "pid : #{pid}"
Process.wait(pid)
But both STDOUT and STDERR end up in error.txt. Why?
Does ffmpeg write to a different file descriptor? (Normally fd 1 is stdout and fd 2 is stderr.)
Note: I don't want to use a native shell redirect like '> output.txt 2> error.txt' because I want the pid of the ffmpeg process, not the shell process, so I can kill it later.

According to the Process.spawn documentation, you should be doing this:
pid = Process.spawn("ffmpeg -i wrong_file.mp4 -c:v libx264 -preset veryslow -qp 0 output.mkv", :out => "output.txt", :err => "error.txt")

The problem was resolved: ffmpeg writes all of its log output to STDERR, which is why everything showed up in error.txt.
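For reference, here is a minimal sketch of the whole script with the fix applied (same file names as above). Because the command string contains no shell metacharacters, Ruby execs ffmpeg directly, so pid really is ffmpeg's pid and can be signalled later:
require 'fileutils'

FileUtils.rm "output.mkv" if File.exist?("output.mkv")

# ffmpeg logs everything to stderr, so expect error.txt to hold the
# log and output.txt to stay empty.
pid = Process.spawn(
  "ffmpeg -i wrong_file.mp4 -c:v libx264 -preset veryslow -qp 0 output.mkv",
  :out => "output.txt", :err => "error.txt"
)
puts "pid : #{pid}"

# Later, to stop the encode:
# Process.kill("TERM", pid)
Process.wait(pid)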

Related

Ruby capture stderr output from bash script execution

I can currently capture stdout into a string variable in Ruby/Rails by running the command through the shell and assigning the result, as follows:
val = %x[ #{cmd} ]
where cmd is a string that represents a bash command.
However, this only captures stdout; I want to capture stderr and set it to a string in Ruby. Any ideas?
Simply redirect it:
val = %x[ #{cmd} 2>&1 ]
If you want to capture output from stderr only, duplicate stderr onto stdout first, then redirect the original stdout to /dev/null (the order matters, since redirections are applied left to right):
val = %x[ #{cmd} 2>&1 >/dev/null ]
You can use Open3.popen3:
require 'open3'
stdin, stdout, stderr, wait_thread = Open3.popen3('ping -Z')
# => [#<IO:fd 9>, #<IO:fd 10>, #<IO:fd 12>, #<Thread:0x007fd3d30a0ce0 sleep>]
stderr.gets # => "ping: illegal option -- Z\n"
stdout.gets # => nil
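If you just want stderr in a string (rather than reading it stream by stream), Open3.capture3 is a shorter route; a minimal sketch using the same command:
require 'open3'

# capture3 runs the command and returns stdout, stderr and the exit
# status once the process has finished.
stdout_str, stderr_str, status = Open3.capture3('ping -Z')
val = stderr_str # => "ping: illegal option -- Z\n" (exact text varies by platform)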

save FFMPEG screenshot output file to variable

How can I get the output file of this FFMPEG code saved to a variable?
def take_screenshot
  logger.debug "Trying to grab a screenshot from #{self.file}"
  system "ffmpeg -i #{self.file} -ss 00:00:02 -vframes 1 #{Rails.root}/public/uploads/tmp/screenshots/#{File.basename(self.file)}.jpg"
  self.save!
end
I have tried:
self.screenshot = system "ffmpeg -i #{self.file} -ss 00:00:02 -vframes 1 #{Rails.root}/public/uploads/tmp/screenshots/#{File.basename(self.file)}.jpg"
but this doesn't save anything.
Thanks in advance!
system returns true or false (the command's exit status), not its output, so assigning its return value will never give you the file data. As for ffmpeg: it usually outputs nothing on stdout and writes all of its debug messages to stderr. You can make it output the video (or image) to stdout by passing - as the output file. You'd then also need to suppress stderr.
system "ffmpeg -i #{self.file} -ss 00:00:02 -c:v mjpeg -f mjpeg -vframes 1 - 2>/dev/null"
This will output the raw data of the JPEG-encoded image to stdout. From there you can save the data to a variable and, for example, transfer it somewhere else.
To get stdout from system calls, see here: Getting output of system() calls in Ruby – popen3 in particular should help in that case, since it lets you discard stderr from within Ruby.
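Putting that together, a minimal sketch of grabbing the JPEG bytes into a variable with Open3.capture3 (assuming self.file is a readable video path and self.screenshot can hold a binary string):
require 'open3'

# Ask ffmpeg to write the JPEG to stdout ('-' as the output file) and
# capture it; ffmpeg's log messages land in the stderr string instead.
jpeg_data, log, status = Open3.capture3(
  "ffmpeg -i #{self.file} -ss 00:00:02 -c:v mjpeg -f mjpeg -vframes 1 -",
  binmode: true # the image data is binary, so keep the pipes in binary mode
)
self.screenshot = jpeg_data if status.success?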

grep show all lines, not just matches, set exit status

I'm piping the output of a command to egrep, which I'm using to make sure a particular failure string doesn't appear in it.
The command itself, unfortunately, won't return a proper non-zero exit status on failure, that's why I'm doing this.
command | egrep -i -v "badpattern"
This works as far as giving me the exit code I want (1 if badpattern appears in the output, 0 otherwise), BUT, it'll only output lines that don't match the pattern (as the -v switch was designed to do). For my needs, those lines are the most interesting lines.
Is there a way to have grep just blindly pass through all lines it gets as input, and just give me the exit code as appropriate?
If not, I was thinking I could just use perl -ne "print; exit 1 if /badpattern/". I use -n rather than -p because -p won't print the offending line (it prints after the one-liner runs). So I use -n and call print myself, which at least gives me the first offending line, but then output (and execution) stops there, so I'd have to do something like
perl -e '$code = 0; while (<>) { print; $code = 1 if /badpattern/; } exit $code'
which does the whole deal but is a bit much. Is there a simple command-line switch for grep that will just do what I'm looking for?
Actually, your perl idea is not bad. Try:
perl -pe 'END { exit $status } $status=1 if /badpattern/;'
I bet this is at least as fast as the other options being suggested.
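If you're driving the command from Ruby anyway (as in the other questions here), the same pass-through-and-flag logic is only a few lines; a sketch, with `command` standing in for your real program:
status = 0
IO.popen("command") do |io|
  io.each_line do |line|
    print line # pass every line through
    status = 1 if line =~ /badpattern/ # but remember the failure
  end
end
exit status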
Another option is to tee the stream back to the terminal and let grep -q set the exit status:
$ tee /dev/tty < ~/.bashrc | grep -q spam && echo spam || echo no spam
How about redirecting to /dev/null? That discards all the output lines, but you still get the exit code:
$ grep spam .bashrc > /dev/null
$ echo $?
1
$ grep alias .bashrc > /dev/null
$ echo $?
0
Or you can simply use the -q switch:
-q, --quiet, --silent
Quiet; do not write anything to standard output. Exit
immediately with zero status if any match is found, even if an
error was detected. Also see the -s or --no-messages option.
(-q is specified by POSIX.)

cron job is not completing the process?

I have a Ruby program that converts video to MP4 format using ffmpeg, and I'm using crontab to run it every 15 minutes. crontab actually runs the Ruby program, but the conversion of the file never completes; the process stops before the conversion is done. My sample code for testing is below.
def convert(y)
  system "ffmpeg -i #{SOURCE_FOLDER + LOCATION_SOURCE}/#{y} -acodec libfaac -ar 44100 -ab 96k -vcodec libx264 #{DEST_FOLDER + LOCATION_DEST}/#{y}"
end

SOURCE_FOLDER = "/home/someone/work/videoapp/public/"
DEST_FOLDER = "/home/someone/work/videoapp/public/"
LOCATION_SOURCE = "source"
LOCATION_DEST = "dest"

files = Dir.new(SOURCE_FOLDER + LOCATION_SOURCE)
files.each do |x|
  convert(x)
end
This code works fine if I run it manually in the console.
My first guess is that it's dying on the "dot" entries. Every Unix directory contains two special entries: "." and "..". You'll either need to specifically skip those in your script:
next if File.directory?(x) # or
next if x.match(/^\.+$/)
-- OR --
Look specifically for whatever file types you want:
Dir[SOURCE_FOLDER + LOCATION_SOURCE + "/*.wav"].each do |file|
  convert(File.basename(file)) # convert expects a bare file name
end
Update 2011-04-01: Add Unix redirects to the crontab entry to see what the output is:
* * * * * /your/program/location/file.rb 1> /some/output/file.txt 2>&1
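For completeness, a sketch combining both suggestions, with ffmpeg's stderr captured so the cron run leaves a trace (the *.mp4 glob and log path are just placeholders for whatever you actually use):
Dir[SOURCE_FOLDER + LOCATION_SOURCE + "/*.mp4"].each do |path|
  name = File.basename(path)
  ok = system("ffmpeg -i #{path} -acodec libfaac -ar 44100 -ab 96k " \
              "-vcodec libx264 #{DEST_FOLDER + LOCATION_DEST}/#{name} " \
              "2>> /tmp/convert.log")
  warn "conversion failed for #{name}" unless ok
end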

Prevent ffmpeg from taking over stdout

When I do system "ffmpeg -i just-do-it.mp4 -ab 96k -ar 22050 -qscale 6 output.flv", ffmpeg takes over the Ruby process until the job is done, which sometimes takes a long time. I've tried using threads and fork in Ruby to no avail, as well as system-equivalent commands like exec and %x[]. I also tried the latest Fibers in Ruby 1.9.2, but I don't think I'm using them properly.
My question is: how do I run two ffmpeg processes from Ruby concurrently?
EDIT:
fork do
  fork do
    system "ffmpeg -i you-know.mp4 -ab 96k -ar 22050 -qscale 6 #{Time.now.sec}.flv"
  end
  fork do
    system "ffmpeg -i bangbang.mp4 -ab 96k -ar 22050 -qscale 6 #{Time.now.sec}.flv"
  end
end
fork/exec is the right solution. Since forked processes inherit the parent process's open file handles, you'll have to close (or redirect) the handles you don't want child processes to use.
For example:
# this will print nothing, but `yes` is running as a forked process.
# you'll want to `killall yes` after running this script.
fork do
  [$stdout, $stderr].each { |fh| fh.reopen File.open("/dev/null", "w") }
  exec "yes"
end
OK, some comments on the code you posted: the outer fork is pointless; just fork the two ffmpeg processes from the main process. Maybe write a helper function like:
def ffmpeg(mp4)
  fork do
    [$stdout, $stderr].each { ... }
    exec "ffmpeg -i #{mp4} ..."
  end
end

ffmpeg("you-know.mp4")
ffmpeg("bangbang.mp4")
Try the subprocess gem – that's what I'm using now for dealing with process forking, and I'm finding it much easier to use. E.g.:
work_list.each do |cmd|
  process = Subprocess::Popen.new(cmd)
  process.run
  process.wait
  #puts process.stdout
  #puts process.stderr
  puts process.status
end
