How do I temporarily redirect stderr in Ruby on Rails? - ruby-on-rails

This is my code (adapted from "How do I temporarily redirect stderr in Ruby?", which can't be used as-is because native extensions write to the stream directly):
def silence_stdout(log = '/dev/null')
  orig = $stdout.dup
  $stdout.reopen(File.new(log, 'w'))
  begin
    yield
  ensure
    $stdout = orig
  end
end

silence_stdout('ttt.log') do
  # do something
end
But I have a problem: the file only gets filled with output after Puma stops (Ctrl + C).
Should I perhaps close the file? I do not understand how to do it; all my attempts to close the file end with "log writing failed. closed stream" or "no block given (yield)".
I'd appreciate any advice.
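One possible fix (a sketch, not tested under Puma itself): keep a handle on the log file, enable sync so every write is flushed to disk immediately, and restore the original stdout at the file-descriptor level in the ensure block:

```ruby
# Sketch of a fix for the buffering problem: sync the redirected stream so
# the log fills while the server is still running, and close the log file
# when the block exits. Not tested under Puma itself.
def silence_stdout(log = '/dev/null')
  orig = $stdout.dup
  file = File.new(log, 'w')
  $stdout.reopen(file)
  $stdout.sync = true        # flush every write immediately
  begin
    yield
  ensure
    $stdout.reopen(orig)     # restore the real stdout at the fd level
    file.close               # release the log file handle
    orig.close
  end
end
```

It is called the same way: `silence_stdout('ttt.log') { ... }`.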

Related

How to "reload" a cloudflare 520 request with ruby?

I wrote a ruby script to download an image URL:
require 'open-uri'

imageAddress = ARGV[0]
targetPath = ARGV[1]
fullFileNamePath = "#{targetPath}test.jpg"

begin
  File.open(fullFileNamePath, 'wb') do |fo|
    fo.write open(imageAddress).read
  end
rescue OpenURI::HTTPError => ex
  puts ex
  File.delete(fullFileNamePath)
end
Example Usage:
ruby download_image.rb "https://images.genius.com/b015b15e476c92d10a834d523575d3c9.1000x1000x1.jpg" "/Users/Me/Downloads/"
The problem is, sometimes I run across this output error:
520 Origin Error
Then, when I try the same URL in my browser, I get something like this:
If I reload the page or click the 'Retry for a live version' button in the above image, the page loads.
Then if I run the script again it downloads the image just fine.
So how can I replicate this page reload / 'Retry for a live version' behavior using ruby and without switching to my browser? Running the script again doesn't do the job.
It sounds like you are looking for a delay: if the script fails (or encounters '520 Origin Error'), wait and re-try.
This is a quickly built recursive function; you may want to add a check for how many times you have looped, breaking after so many. (Also not tested, may contain errors; meant as an example.)
def getFile(params_you_need)
  begin
    File.open(fullFileNamePath, 'wb') do |fo|
      fo.write open(imageAddress).read
    end
  rescue OpenURI::HTTPError => ex
    puts ex
    File.delete(fullFileNamePath) if File.exist?(fullFileNamePath)
    # ex is an exception object, so compare its message, not the object itself
    if ex.message.start_with?('520')
      sleep(30) # generally a good time to pause
      getFile(params_you_need)
    end
  end
end
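An alternative sketch using Ruby's built-in `retry` keyword instead of recursion; the helper name, the three-attempt cap, and the pause length are assumptions, not tested against Cloudflare:

```ruby
require 'open-uri'

# Hypothetical retry wrapper: re-runs the block when a 520 error is raised,
# up to max_attempts times, pausing between attempts.
def with_retries(max_attempts = 3, delay = 30)
  attempts = 0
  begin
    attempts += 1
    yield
  rescue OpenURI::HTTPError => ex
    if ex.message.start_with?('520') && attempts < max_attempts
      sleep(delay) # give the origin server time to recover
      retry        # jumps back to the top of the begin block
    end
    raise          # out of attempts, or a different HTTP error
  end
end
```

The download from the question would then be wrapped as `with_retries { File.open(fullFileNamePath, 'wb') { |fo| fo.write open(imageAddress).read } }`.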

How to send binary file over Web Sockets with Rails

I have a Rails application where users upload Audio files. I want to send them to a third party server, and I need to connect to the external server using Web sockets, so, I need my Rails application to be a websocket client.
I'm trying to figure out how to properly set that up. I'm not committed to any gem just yet, but the 'faye-websocket' gem looks promising. I even found a similar answer in "Sending large file in websocket before timeout", however, using that code doesn't work for me.
Here is an example of my code:
@message = Array.new

EM.run {
  ws = Faye::WebSocket::Client.new("wss://example_url.com")

  ws.on :open do |event|
    File.open('path/to/audio_file.wav', 'rb') do |f|
      ws.send(f.gets)
    end
  end

  ws.on :message do |event|
    @message << [event.data]
  end

  ws.on :close do |event|
    ws = nil
    EM.stop
  end
}
When I use that, I get an error from the recipient server:
No JSON object could be decoded
This makes sense, because I don't believe it's properly formatted for faye-websocket. Their documentation says:
send(message) accepts either a String or an Array of byte-sized
integers and sends a text or binary message over the connection to the
other peer; binary data must be encoded as an Array.
I'm not sure how to accomplish that. How do I load binary into an array of integers with Ruby?
I tried modifying the send command to use the bytes method:
File.open('path/to/audio_file.wav', 'rb') do |f|
  ws.send(f.gets.bytes)
end
But now I receive this error:
Stream was 19 bytes but needs to be at least 100 bytes
I know my file is 286KB, so something is wrong here. I get confused as to when to use File.read vs File.open vs. File.new.
Also, maybe this gem isn't the best for sending binary data. Does anyone have success sending binary files in Rails with websockets?
Update: I did find a way to get this working, but it is terrible for memory. For other people who want to load small files, you can simply use File.binread and the unpack method:
ws.on :open do |event|
  f = File.binread 'path/to/audio_file.wav'
  ws.send(f.unpack('C*'))
end
However, if I use that same code on a mere 100MB file, the server runs out of memory. It depletes the entire available 1.5GB on my test server! Does anyone know how to do this in a memory-safe manner?
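One way to keep memory flat is to send the file in fixed-size chunks rather than as one giant array. A sketch (the helper name and chunk size are made up, and whether the receiving server accepts the upload split across several binary frames is an assumption):

```ruby
# Hypothetical helper: yield a file as successive Arrays of byte integers,
# the format faye-websocket's send() expects for binary frames. Only one
# chunk is held in memory at a time.
def each_binary_chunk(path, chunk_size = 65_536)
  File.open(path, 'rb') do |f|
    yield f.read(chunk_size).bytes until f.eof?
  end
end
```

Inside the :open handler this becomes `each_binary_chunk('path/to/audio_file.wav') { |chunk| ws.send(chunk) }`.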
Here's my take on it:
# do only once when initializing Rails:
require 'iodine/client'
Iodine.force_start!

# this sets the callbacks.
# on_message is always required by Iodine.
options = {}
options[:on_message] = Proc.new do |data|
  # this will never get called
  puts "incoming data ignored? for:\n#{data}"
end
options[:on_open] = Proc.new do
  # believe it or not - this variable belongs to the websocket connection.
  @started_upload = true
  # set a task to send the file,
  # so the on_open initialization doesn't block incoming messages.
  Iodine.run do
    # read the file and write to the websocket.
    File.open('filename', 'r') do |f|
      buffer = String.new # recycle the String's allocated memory
      write f.read(65_536, buffer) until f.eof?
      @started_upload = :done
    end
    # close the connection
    close
  end
end
options[:on_close] = Proc.new do |data|
  # can we notify the user that the file was uploaded?
  if @started_upload == :done
    # we did it :-)
  else
    # what happened?
  end
end

# will not wait for a connection:
Iodine::Http.ws_connect "wss://example_url.com", options
# OR
# will wait for a connection, raising errors if failed.
Iodine::Http::WebsocketClient.connect "wss://example_url.com", options
It's only fair to mention that I'm Iodine's author, which I wrote for use in Plezi (a RESTful Websocket real-time application framework you can use standalone or within Rails)... I'm super biased ;-)
I would avoid gets because its chunk size could be anything from a single byte to the whole file, depending on the location of the next End Of Line (EOL) marker... read gives you better control over each chunk's size.

Issue with testing Tempfile in ruby

I have a block of code that creates Tempfiles
@tmp_file = Tempfile.new("filename")
I keep them closed after creation,
@tmp_file.close unless @tmp_file.closed?
When there is a need to add data to the temp files, I open them and add data as below:
def add_row_to_file(row)
  @tmp_file.open
  @tmp_file.read
  @tmp_file.print(row.to_json + "\n")
end
All is well, but for testing I have stubbed Tempfile as below, and it raises an error when the test runs into add_row_to_file(row):
buffers = {}
Tempfile.stub(:new) do |file_name|
  buffer = StringIO.new
  buffers[file_name] = buffer
end
Error message is :
Failure/Error: ]],
NoMethodError:
private method `open' called for #<StringIO:0x00000010b867c0>
I want to keep the temp files closed on creation because there is a max-open-files limit at the OS level (I have to deal with uploading a lot of tempfiles to S3), but for testing I have a problem accessing the private method of StringIO.
Any idea how to solve this problem? Thanks.
I have a work-around, which is to skip closing the StringIO when in the test environment:
@tmp_file.close unless @tmp_file.closed? || Rails.env.test?
and update add_row_to_file(row) as below:
def add_row_to_file(row)
  @tmp_file.open unless Rails.env.test?
  @tmp_file.read unless Rails.env.test?
  @tmp_file.print(row.to_json + "\n")
end
Apart from the work-around provided, if we want 100% code coverage while in test we can try the below.
Do not close the stream if it is a StringIO:
@tmp_file.close unless @tmp_file.is_a?(StringIO)
so we don't need to open it while in test:
def add_row_to_file(row)
  @tmp_file = File.open(@tmp_file, 'a+') unless @tmp_file.is_a?(StringIO)
  @tmp_file.print(row.to_json + "\n")
end
In this way we can test the Tempfile handling without actually creating files in the test environment, while keeping 100% code coverage.
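Another possibility (an untested sketch) is to make the stub itself quack like a Tempfile, so the production code runs unchanged: StringIO's `open` is a private Kernel method, but a public singleton `open` (plus a no-op `close` that keeps the buffer writable) can be defined on each stubbed buffer. The helper name here is made up:

```ruby
require 'stringio'

# Hypothetical factory for Tempfile stand-ins: a StringIO decorated with a
# public `open` (mimicking Tempfile#open, which returns the file itself) and
# a no-op `close`, so the buffer survives the close-on-creation step.
def tempfile_stub_buffer
  buffer = StringIO.new
  def buffer.open; self; end # StringIO's own `open` is private
  def buffer.close; end      # keep the buffer alive after "closing"
  buffer
end
```

The stub then becomes `Tempfile.stub(:new) { |file_name| buffers[file_name] = tempfile_stub_buffer }`.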

Can I display the log of system call in Ruby?

I need to call a command (in a Sinatra or Rails app) like this:
`command sub`
Some log output is produced while the command is executing.
I want to see that log displayed continuously as the process runs.
But I can only get the log string after the command finishes, with:
result = `command sub`
So, is there a way to implement this?
On Windows I have the best experience with IO.popen.
Here is a sample:
require 'logger'

$log = Logger.new("#{__FILE__}.log", 'monthly')

# here comes the full command line; in this case it is a java program
command = %Q{java -jar getscreen.jar #{$userid} #{$password}}
$log.debug command
STDOUT.sync = true
begin
  # Note the somewhat strange 2> syntax. This denotes the file descriptor
  # to pipe to a file. By convention, 0 is stdin, 1 is stdout, 2 is stderr.
  IO.popen(command + " 2>&1") do |pipe|
    pipe.sync = true
    while str = pipe.gets # for every line the external program returns
      # do something with the captured line
    end
  end
rescue => e
  $log.error "#{__LINE__}:#{e}"
  $log.error e.backtrace
end
There are six ways to do it, but the way you're using isn't the correct one, because it waits for the process to return.
Pick one from here:
http://tech.natemurray.com/2007/03/ruby-shell-commands.html
I would use Open3.popen3 if I were you.
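A minimal sketch of the Open3 approach (the helper name is made up; the question's `command sub` is just one possible command): stream the combined stdout/stderr line by line while the process runs, instead of waiting for it to finish.

```ruby
require 'open3'

# Stream a command's combined stdout and stderr to the caller line by line,
# as the lines are produced. Returns the Process::Status when done.
def stream_command(*cmd)
  Open3.popen2e(*cmd) do |_stdin, out_and_err, wait_thr|
    out_and_err.each_line do |line|
      yield line        # hand over each line as soon as it arrives
    end
    wait_thr.value      # wait for the process to exit
  end
end
```

For the question's case: `stream_command('command', 'sub') { |line| print line }`.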

ipython redirect stdout display corruption

I'm developing a system in Python, and one piece of functionality I need is the ability to have console output go to both the console and a user-specified file, replicating the Diary function in MATLAB. I have the following, which works perfectly well both in IDLE on Windows and at the Python command line in Ubuntu (this all exists inside a module that gets loaded):
class diaryout(object):
    def __init__(self):
        self.terminal = sys.stdout
        self.save = None

    def __del__(self):
        try:
            self.save.flush()
            self.save.close()
        except:
            # do nothing, just catch the error; maybe self was instantiated, but never opened
            pass
        self.save = None

    def dclose(self):
        self.__del__()

    def write(self, message):
        self.terminal.write(message)
        self.save.write(message)

    def dopen(self, outfile):
        self.outfile = outfile
        try:
            self.save = open(self.outfile, "a")
        except Exception, e:
            # just pass out the error here so the Diary function can handle it
            raise e
def Diary(outfile = None):  # NEW TO TEST
    global this_diary
    if outfile == None:
        # None passed, so close the diary file if one is open
        if isinstance(this_diary, diaryout):
            sys.stdout = this_diary.terminal  # set the stdout back to stdout
            this_diary.dclose()  # flush and close the file
            this_diary = None  # "delete" it
    else:
        # file passed, so let's open it and set it for the output
        this_diary = diaryout()  # instantiate
        try:
            this_diary.dopen(outfile)  # open & test that it opened
        except IOError:
            this_diary = None  # must uninstantiate it, since we already instantiated
            raise IOError("Can't open %s for append!" % outfile)
        except TypeError:
            this_diary = None  # must uninstantiate it, since we already instantiated
            raise TypeError("Invalid input detected - must be string filename or None: %s" % Diary.__doc__)
        sys.stdout = this_diary  # set stdout to it
Far superior to both IDLE and the plain Python command line, I'm using IPython; herein lies my problem. I can turn on the "diary" perfectly fine with no error, but the display on the console gets messed up (the attached screenshot shows this). The output file also becomes similarly garbled. Everything goes back to normal when I undo the redirection with Diary(None). I have tried editing the code so that it never even writes to the file, with no effect. It seems almost as if something is forcing an unsupported character set, or something else I don't understand.
Anyone have an idea about this?