Capture output of shell command, line by line, into an array - ruby-on-rails

I want to capture the total number of rubocop offenses to determine whether my codebase is getting better or worse. I have almost no experience with ruby scripting.
This is my script so far:
#!/usr/bin/env ruby
#script/code_coverage
var = `rubocop ../ -f fuubar -o ../coverage/rubocop_results.txt -f offenses`
puts var
So I run ./rails-app/script/code_coverage and my terminal displays
...
--
6844 Total
When I call var.inspect I get one long string. I want to either read the output of rubocop ../ -f ... line by line (preferred) or capture each line of the output into an array.
I would like to be able to do something like this:
var.each do |line|
  if line.include?('total')
    ...
    total = ...
  end
end
Is this possible? I guess this would be similar to writing the output to a file and then reading the file back in line by line.
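(For reference, that file-based approach only takes a few lines. This is just a sketch, assuming the -o path from the script above; File.foreach is the standard Ruby way to read a file one line at a time:
File.foreach('../coverage/rubocop_results.txt') do |line|
  # each iteration yields one line of the rubocop report
  puts line if line.include?('Total')
end
)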

If you want to keep it simple, you can just split your var string.
var = `rubocop ../ -f fuubar -o ../coverage/rubocop_results.txt -f offenses`.split("\n")
Then you can iterate over var as you wanted to.
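For example, to pull the total out of the summary (a minimal sketch; matching on 'Total' and grabbing the first number is an assumption about the offenses output shown in the question):
lines = `rubocop ../ -f fuubar -o ../coverage/rubocop_results.txt -f offenses`.split("\n")
total = nil
lines.each do |line|
  # e.g. "6844 Total" at the end of the offenses report
  total = line[/\d+/].to_i if line.include?('Total')
end
puts total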

Use Ruby's Open3 library. Open3 grants you access to stdin, stdout, stderr and a thread to wait on the child process when running another program. You can specify various attributes, redirections, the current directory, etc. of the program, as with Process.spawn.
http://blog.bigbinary.com/2012/10/18/backtick-system-exec-in-ruby.html
require 'open3'
# Run lynx on this file.
cmd = "lynx -crawl -dump /data/feed/#{file_name}.html > /data/feed/#{file_name}"
Open3.popen3(cmd) do |stdin, stdout, stderr, wait_thr|
  # read stdout once and reuse it; a second stdout.read would return an empty string
  cmdout = stdout.read
  $logger.debug "stdout is:" + cmdout
  $logger.debug "stderr is:" + stderr.read
end
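Applied to the rubocop command from the question, you could also read stdout line by line instead of all at once. This is only a sketch (the 'Total' match is an assumption about the report format), but Open3.popen3 and IO#each_line are standard APIs:
require 'open3'
Open3.popen3("rubocop ../ -f fuubar -o ../coverage/rubocop_results.txt -f offenses") do |stdin, stdout, stderr, wait_thr|
  stdout.each_line do |line|
    # pick out the summary line, e.g. "6844 Total"
    puts line if line =~ /Total/
  end
end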

Related

Lua: forward output from Minicom

I have a Lua script, and in there I open a minicom session which executes a script (with the -S parameter).
local myFile = assert(io.popen('minicom -S myScript.sh ' .. myDevice .. ' -C myLogFile.log'))
local myFileOutput = myFile:read('*all')
myFile:close()
This works really fine.
But I would like to get the same output as if I executed the minicom command itself:
minicom -S myScript.sh myDevice -C myLogFile.log
Right now I don't get any output at all (I know that's somewhat obvious).
I would also like the output to appear at (at least nearly) the same time as with the minicom command itself, not as one big chunk of data at the end.
Does anyone know how to achieve that?
If I understand you correctly, you need something like
local myFile = assert(io.popen('minicom ...'))
for line in myFile:lines('l') do
  print(line)
end
myFile:close()

How to overwrite the system function

I'd like to overwrite the system() function. Is that possible?
Sure, you can overwrite nearly everything in Ruby (whether useful or not):
system "ls /" # returns "/etc /var...", normal behaviour
def system args
puts args
end
system "ls /" # returns "ls /"
If you don't need to use system itself, you can use backticks instead. Backticks execute the command and return the output as a string.
You can then assign the value to a variable like so:
output = `ls`
p output
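Since the result is a single string, you can also walk it line by line without splitting it yourself; a small sketch using String#each_line:
`ls`.each_line do |line|
  # each yielded line still ends with "\n", so chomp it if needed
  puts line.chomp
end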

Shell: Find Matching Lines Across Many Files

I am trying to use a shell script (well a "one liner") to find any common lines between around 50 files.
Edit: Note I am looking for a line (lines) that appears in all the files
So far I've tried grep -v -x -f file1.sp *, which just matches that file's contents across ALL the other files.
I've also tried grep -v -x -f file1.sp file2.sp | grep -v -x -f - file3.sp | grep -v -x -f - file4.sp | grep -v -x -f - file5.sp etc... but I believe that searches using the files to be searched as stdin, not as the pattern to match on.
Does anyone know how to do this with grep or another tool?
I don't mind if it takes a while to run. I've got to add a few lines of code to around 500 files and want to find a common line in each of them to insert 'after' (they were originally just copied and pasted from one file, so hopefully there are some common lines!).
Thanks for your time,
When I first read this I thought you were trying to find 'any common lines'. I took this as meaning "find duplicate lines". If this is the case, the following should suffice:
sort *.sp | uniq -d
Upon re-reading your question, it seems that you are actually trying to find lines that 'appear in all the files'. If this is the case, you will need to know the number of files in your directory:
find . -type f -name "*.sp" | wc -l
If this returns the number 50, you can then use awk like this:
WHINY_USERS=1 awk '{ array[$0]++ } END { for (i in array) if (array[i] == 50) print i }' *.sp
You can consolidate this process and write a one-liner like this:
WHINY_USERS=1 awk -v find=$(find . -type f -name "*.sp" | wc -l) '{ array[$0]++ } END { for (i in array) if (array[i] == find) print i }' *.sp
old, bash answer (O(n); opens 2 * n files)
From @mjgpy3's answer, you just have to make a for loop and use comm, like this:
#!/bin/bash
tmp1="/tmp/tmp1$RANDOM"
tmp2="/tmp/tmp2$RANDOM"
cp "$1" "$tmp1"
shift
for file in "$@"
do
  comm -1 -2 "$tmp1" "$file" > "$tmp2"
  mv "$tmp2" "$tmp1"
done
cat "$tmp1"
rm "$tmp1"
Save it as comm.sh, make it executable, and call
./comm.sh *.sp
assuming all your filenames end with .sp.
Updated answer, Python, opens each file only once
Looking at the other answers, I wanted to give one that opens each file only once, uses no temporary file, and supports duplicated lines. Additionally, let's process the files in parallel.
Here you go (in python3):
#!/usr/bin/env python3
import argparse
import sys
import multiprocessing
import os

EOLS = {'native': os.linesep.encode('ascii'), 'unix': b'\n', 'windows': b'\r\n'}

def extract_set(filename):
    with open(filename, 'rb') as f:
        return set(line.rstrip(b'\r\n') for line in f)

def find_common_lines(filenames):
    pool = multiprocessing.Pool()
    line_sets = pool.map(extract_set, filenames)
    return set.intersection(*line_sets)

if __name__ == '__main__':
    # usage info and argument parsing
    parser = argparse.ArgumentParser()
    parser.add_argument("in_files", nargs='+',
                        help="find common lines in these files")
    parser.add_argument('--out', type=argparse.FileType('wb'),
                        help="the output file (default stdout)")
    parser.add_argument('--eol-style', choices=EOLS.keys(), default='native',
                        help="(default: native)")
    args = parser.parse_args()

    # actual stuff
    common_lines = find_common_lines(args.in_files)

    # write results to output
    to_print = EOLS[args.eol_style].join(common_lines)
    if args.out is None:
        # find out stdout's encoding, utf-8 if absent
        encoding = sys.stdout.encoding or 'utf-8'
        sys.stdout.write(to_print.decode(encoding))
    else:
        args.out.write(to_print)
Save it as find_common_lines.py, and call
python ./find_common_lines.py *.sp
More usage info with the --help option.
Combining these two answers (ans1 and ans2), I think you can get the result you need without sorting the files:
#!/bin/bash
ans="matching_lines"
for file1 in *
do
  for file2 in *
  do
    if [ "$file1" != "$ans" ] && [ "$file2" != "$ans" ] && [ "$file1" != "$file2" ] ; then
      echo "Comparing: $file1 $file2 ..." >> $ans
      perl -ne 'print if ($seen{$_} .= @ARGV) =~ /10$/' $file1 $file2 >> $ans
    fi
  done
done
Simply save it, give it execution rights (chmod +x compareFiles.sh) and run it. It will take all the files present in the current working directory and do an all-vs-all comparison, leaving the result in the "matching_lines" file.
Things to be improved:
Skip directories
Avoid comparing all the files twice (file1 vs file2 and file2 vs file1).
Maybe add the line number next to the matching string
Hope this helps.
Best,
Alan Karpovsky
See this answer. I originally thought a diff sounded like what you were asking for, but this answer seems much more appropriate.

rails - Redirecting console output to a file

On a bash console, if I do this:
cd mydir
ls -l > mydir.txt
The > operator captures the standard output and redirects it to a file; so I get the listing of files in mydir.txt instead of in the standard output.
Is there any way to do something similar on the rails console?
I've got a ruby statement that generates lots of prints (~8k lines) and I'd like to be able to see it completely, but the console only "remembers" the last 1024 lines or so. So I thought about redirecting to a file - If anyone knows a better option, I'm all ears.
A quick one-off solution:
irb:001> f = File.new('statements.xml', 'w')
irb:002> f << Account.find(1).statements.to_xml
irb:003> f.close
Create a JSON fixture:
irb:004> f = File.new(Rails.root + 'spec/fixtures/qbo/amy_cust.json', 'w')
irb:005> f << JSON.pretty_generate((q.get :customer, 1).as_json)
irb:006> f.close
You can override $stdout to redirect the console output:
$stdout = File.new('console.out', 'w')
You may also need to call this once:
$stdout.sync = true
To restore:
$stdout = STDOUT
Apart from Veger's answer, there is one more way to do it, which also provides many additional options.
Just open your rails project directory and enter the command:
rails c | tee output.txt
The tee command also has many other options, which you can check out with:
man tee
If you write the following code in your environment file, it should work.
if "irb" == $0
config.logger = Logger.new(Rails.root.join('path_to_log_file.txt'))
end
You can also rotate the log file using
config.logger = Logger.new(Rails.root.join('path_to_log_file.txt'), number_of_files, file_rotation_size_threshold)
For logging only active record related operations, you can do
ActiveRecord::Base.logger = Logger.new(Rails.root.join('path_to_log_file.txt'))
This also lets you have different logger config/file for different environments.
Using Hirb, you can choose to log only the Hirb output to a text file.
That way you can still see the commands you type into the console window, while just the model output goes to the file.
From the Hirb readme:
Although views by default are printed to STDOUT, they can be easily modified to write anywhere:
# Setup views to write to file 'console.log'.
>> Hirb::View.render_method = lambda {|output| File.open("console.log", 'w') {|f| f.write(output) } }
# Doesn't write to file because Symbol doesn't have a view and thus defaults to irb's echo mode.
>> :blah
=> :blah
# Go back to printing Hirb views to STDOUT.
>> Hirb::View.reset_render_method
Use hirb. It automatically pages any output in irb that is longer than a screenful. Put this in a console session to see this work:
>> require 'rubygems'
>> require 'hirb'
>> Hirb.enable
For more on how this works, read this post.
Try using the script utility if you are on a Unix-based OS.
script -c "rails runner -e development lib/scripts/my_script.rb" report.txt
That helped me easily capture a Rails runner script's very long output to a file.
I tried redirecting to a file, but it only got written at the end of the script.
That didn't help me because I had a few interactive commands in my script.
Then I used just script and ran the rails runner inside the script session, but it didn't write everything. Then I found the form script -c "runner command here" output_file and it saved all the output as desired. This was on Ubuntu 14.04 LTS.
References:
https://askubuntu.com/questions/290322/how-to-get-and-copy-a-too-long-output-completely-in-terminal#comment1668695_715798
Writing Ruby Console Output to Text File

unix at command pass variable to shell script?

I'm trying to setup a simple timer that gets started from a Rails Application. This timer should wait out its duration and then start a shell script that will start up ./script/runner and complete the initial request. I need script/runner because I need access to ActiveRecord.
Here's my test lines in Rails
output = `at #{(Time.now + 60).strftime("%H:%M")} < #{Rails.root}/lib/parking_timer.sh STRING_VARIABLE`
return render :text => output
Then my parking_timer.sh looks like this
#!/bin/sh
~/PATH_TO_APP/script/runner -e development ~/PATH_TO_APP/lib/ParkingTimer.rb $1
echo "All Done"
Finally, ParkingTimer.rb reads the passed variable with
ARGV.each do |a|
  puts "Argument: #{a}"
end
The problem is that the Unix command at doesn't seem to like variables and only wants to deal with filenames. I get one of two errors depending on how I position the quotes.
If I put quotes around the right hand side like so
... "~/PATH_TO_APP/lib/parking_timer.sh STRING_VARIABLE"
I get,
-bash: ~/PATH_TO_APP/lib/parking_timer.sh STRING_VARIABLE: No such file or directory
If I leave the quotes out, I get,
at: garbled time
This is all happening on a Mac OS 10.6 box running Rails 2.3 & Ruby 1.8.6
I've already messed around with BackgrounDrb, and decided it's a total PITA. I need to be able to cancel the job at any time before it is due.
After playing around with irb a bit, here's what I found.
The backtick operator invokes the shell after ruby has done any interpretation necessary. For my test case, the strace output looked something like this:
execve("/bin/sh", ["sh", "-c", "echo at 12:57 < /etc/fstab"], [/* 67 vars */]) = 0
Since we know what it's doing, let's take a look at how your command will be executed:
/bin/sh -c "at 12:57 < RAILS_ROOT/lib/parking_timer.sh STRING_VARIABLE"
That looks very odd. Do you really want to pipe parking_timer.sh, the script, as input into the at command?
What you probably ultimately want is something like this:
/bin/sh -c "RAILS_ROOT/lib/parking_timer.sh STRING_VARIABLE | at 12:57"
Thus, the output of the parking_timer.sh command will become the input to the at command.
So, try the following:
output = `#{Rails.root}/lib/parking_timer.sh STRING_VARIABLE | at #{(Time.now + 60).strftime("%H:%M")}`
return render :text => output
You can always use strace or truss to see what's happening. For example:
strace -o strace.out -f -ff -p $IRB_PID
Then grep '^exec' strace.out* to see where the command is being executed.
