On my development environment, whenever I run delayed jobs whose method contains a logging call, the message shows up in the logs, but this is not the case in production. Example:
def save_my_model
  Rails.logger.info "MODEL SAVED!"
  p = Profile.new
  p.save
end
I'd see the info message in development.log, but it doesn't show up in production. How can I make it output?
You can try this:
RAILS_ENV=production script/delayed_job start
Have a look at these links: the Google discussion and the GitHub documentation.
Or simply create a delayed_job.rb file in config/initializers and add the following line of code:
Delayed::Worker.logger = Logger.new(File.join(Rails.root, 'log', 'dj.log'))
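With that in place, a job method could write to log/dj.log through the worker's logger. As a sketch, adapting the example from the question:
def save_my_model
  Delayed::Worker.logger.info "MODEL SAVED!"  # written to log/dj.log
  p = Profile.new
  p.save
end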
Related
I am trying to learn cron jobs and the whenever gem.
In my app I have created a rake task.
require 'rubygems'
namespace :cron_job do
  desc "To Check Users Inactivity"
  task user_inactivity: :environment do
    p "Inactive Users..."
  end
end
and in schedule.rb I wrote this:
every 1.minute do
  rake "cron_job:user_inactivity", environment: "development"
end
and in my terminal I ran two commands:
whenever --update-crontab
and then
sudo /etc/init.d/cron restart
but nothing happens after 1 minute.
I checked my console for the p messages and saw nothing. Am I missing something?
The output won't show up in your console.
You have to set the output log path with:
set :output, "/path/to/my/cron_log.log"
Then check the log.
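For reference, the full config/schedule.rb would then look something like this (the log path is only an example):
# config/schedule.rb
set :output, "/path/to/my/cron_log.log"  # cron output, including the p messages, goes here

every 1.minute do
  rake "cron_job:user_inactivity", environment: "development"
end
Remember to run whenever --update-crontab again after editing the file so the crontab entry picks up the output redirection.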
I can't get any log output from delayed_job, and I'm not sure my jobs are starting.
Here's my Procfile:
web: bundle exec rails server
worker: bundle exec rake jobs:work
worker: bundle exec clockwork app/clock.rb
And here's the job:
class ScanningJob
  def perform
    logger.info "logging from delayed_job"
  end

  def after(job)
    Rails.logger.info "logging from after delayed_job"
  end
end
I see that clockwork writes to standard output, and I can see the worker executor starting, but I never see my log statements. I tried puts as well, to no avail.
My clock file is pretty simple:
every(3.seconds, 'refreshlistings') { Delayed::Job.enqueue ScanningJob.new }
I just want to see this working, and lack of logging means I can't. What's going on here?
When I needed log output from Delayed Jobs, I found this question to be fairly helpful.
In config/initializers/delayed_job.rb I add the line:
Delayed::Worker.logger = Logger.new(File.join(Rails.root, 'log', 'dj.log'))
Then, in my job, I output to this log with:
Delayed::Worker.logger.info("Log Entry")
This results in the file log/dj.log being written to. This works in development, staging, and production environments.
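Applied to the ScanningJob from the question above, that would look roughly like this (a sketch, assuming the initializer is in place):
class ScanningJob
  def perform
    Delayed::Worker.logger.info "logging from delayed_job"
  end

  def after(job)
    Delayed::Worker.logger.info "logging from after delayed_job"
  end
end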
I have some Rake tasks that produce CSV output which I'd like to redirect to a file and open with other tools, but when I run heroku rake foo > foo.csv I get log messages (SQL queries, etc.) in my output.
I've tried Rails.logger = Logger.new('/dev/null') and Rails.logger = Logger.new(STDERR) at the top of the Rake task and while those function as expected locally, they don't have any noticeable effect when I run the task on Heroku.
I'm not too shocked that Heroku would squash STDOUT and STDERR together, but it's a mystery to me why sending to /dev/null would not kill the output.
Any help greatly appreciated.
Rails v3.0.0, Heroku bamboo-ree-1.8.7 stack, rake 0.9.2.
I was having the same problem, though I didn't run into it until I changed config/environments/production.rb to have this:
config.logger = Logger.new(STDOUT)
(I did this so that my app would log to the heroku log.)
My fix was this:
config.logger = Logger.new(STDOUT) unless 'rake' == File.basename($0)
From Heroku | Dev Center | Logging:
When a Rails app is pushed, we will automatically install the rails_log_stdout plugin into the application which will redirect logs to stdout.
I think Heroku includes (in the output sent through your git push command) a notification about this (and one other addition: for serving static/public content, if I remember correctly). You may only see the notifications for certain kinds of pushes though (complete slug rebuilds?). I remember seeing it when I recently pushed a new application to a Bamboo/MRI-1.9.2 stack, but I do not think I got the message every time I pushed changes to just the application’s code (maybe adding a new gem to the Gemfile is enough to trigger it?).
Several Rails subsystems keep their own logger binding (independent bindings whose values are often initialized from Rails.logger; reassigning the latter does not change the former):
ActiveRecord::Base.logger
ActionController::Base.logger
ActionMailer::Base.logger
Heroku’s changes probably set a new value for Rails.logger before ActiveRecord is initialized. When ActiveRecord is eventually loaded, it sets its own logger to be the same as Rails.logger (the Heroku/stdout one). When your task runs, it reassigns Rails.logger, but it is too late for this to have any effect on ActiveRecord::Base.logger (the one most likely to be handling the SQL logs).
You probably need to reassign some of these other logger bindings to squelch the logging going to STDOUT. Some other likely locations are listed in rails_log_stdout’s init.rb in the Rails 2 section.
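As a sketch (assuming ActiveRecord is the source of the SQL noise), you could reassign those loggers at the top of the rake task rather than only Rails.logger:
task :foo => :environment do
  quiet = Logger.new('/dev/null')
  Rails.logger = quiet
  ActiveRecord::Base.logger = quiet
  ActionController::Base.logger = quiet if defined?(ActionController::Base)
  ActionMailer::Base.logger = quiet if defined?(ActionMailer::Base)

  # ... generate the CSV and print it to STDOUT here,
  # so that `heroku rake foo > foo.csv` stays clean
end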
I faced the same problem and found the following to be a more convenient workaround:
Add the following to config/environments/production.rb
config.logger.level = Logger.const_get(ENV['LOG_LEVEL'] ? ENV['LOG_LEVEL'].upcase : 'INFO')
Push to Heroku, then when you run your rake tasks add LOG_LEVEL="fatal" to the end of the command (replace foo and foo.csv with your things):
heroku run rake foo LOG_LEVEL="fatal" > foo.csv
I have LOG_LEVEL set to fatal in the above example, but it can be any of the following: debug|info|warn|error|fatal. In our case, using the highest level means nothing but the most fatal errors are output into the CSV file.
Just to help anyone with a "fresh" Rails project pushing to Heroku:
You need a combination of @Matt Burke's and @Hengjie's answers:
Add these two lines to config/environments/production.rb:
config.logger = Logger.new(STDOUT)
config.logger.level = Logger.const_get(ENV['LOG_LEVEL'] ? ENV['LOG_LEVEL'].upcase : 'INFO')
This will set up a new STDOUT logger and allow you to easily control the log level with the LOG_LEVEL environment variable.
I solved this problem with the following change to production.rb:
if 'rake' == File.basename($0)
  ActiveRecord::Base.logger = Logger.new('rake.log', 'daily')
end
I suppose we could discard the output as well:
if 'rake' == File.basename($0)
  ActiveRecord::Base.logger = Logger.new('/dev/null')
end
I use RSpec for integration tests. Unfortunately when running those request specs I often miss important errors as I don't directly see the output of the test web server. Is there a way to get this stuff on the console, too?
If you mean server logs it should be something like this:
if rails_env = ENV['RAILS_ENV']
  require 'logger'
  logger = Logger.new(STDOUT)
  ActiveRecord::Base.logger = logger
  ActiveResource::Base.logger = logger
  Rails.logger = logger
end
Not sure about server output.
The output of the server during testing goes to log/test.log in your rails app directory. You can view it with
$ cat log/test.log
or, if you want something of a real time view,
$ watch -c -n 1 tail -40 log/test.log
In a rake task, if I use the puts command I see the output on the console. However, I will not see that message in the log file when the app is deployed to production.
On the other hand, if I use Rails.logger.info, then in development mode I see nothing on the console; I have to go to the log file and tail it.
I would ideally like to use Rails.logger.info and have the logger output inside the rake task also be sent to the console in development mode.
Is there a way to achieve that?
Put this in application.rb, or in your rake task's initialization code:
if defined?(Rails) && (Rails.env == 'development')
  Rails.logger = Logger.new(STDOUT)
end
This is Rails 3 code. Note that this will override logging to development.log. If you want both STDOUT and development.log, you'll need a wrapper function (see the sketch below).
If you'd like this behaviour only in the Rails console, place the same block of code in your ~/.irbrc.
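For the "both STDOUT and development.log" case mentioned above, a minimal wrapper might look like this (just a sketch; MultiLogger is a made-up name, not a Rails class):
# Forwards every logging call to all of its targets
class MultiLogger
  def initialize(*targets)
    @targets = targets
  end

  def method_missing(name, *args, &block)
    @targets.map { |t| t.send(name, *args, &block) }.last
  end

  def respond_to_missing?(name, include_private = false)
    @targets.all? { |t| t.respond_to?(name, include_private) }
  end
end

if defined?(Rails) && (Rails.env == 'development')
  Rails.logger = MultiLogger.new(Logger.new(STDOUT), Logger.new('log/development.log'))
end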
You could create a new rake task to get this to work.
desc "switch logger to stdout"
task :to_stdout => [:environment] do
Rails.logger = Logger.new(STDOUT)
end
This way, when you execute your rake task, you can add to_stdout first to get stdout log messages, or leave it out to have messages sent to the default log file:
rake to_stdout some_task
Rake tasks are run by a user, on a command-line. Anything they need to know right away ("processed 5 rows") should be output on the terminal with puts.
Anything that needs to be kept for posterity ("sent warning email to jsmith@example.com") should be sent to the Rails.logger.
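A tiny illustration of that split, reusing the examples above (the task itself is hypothetical):
desc "Send warning emails"
task :send_warnings => :environment do
  # immediate feedback for the person running the task
  puts "processed 5 rows"
  # kept for posterity in the Rails log
  Rails.logger.info "sent warning email to jsmith@example.com"
end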
Code
For Rails 4 and newer, you can use Logger broadcast.
If you want both STDOUT and file logging for rake tasks in development mode, you can add this code to config/environments/development.rb:
if File.basename($0) == 'rake'
  # http://stackoverflow.com/questions/2246141/puts-vs-logger-in-rails-rake-tasks
  log_file = Rails.root.join("log", "#{Rails.env}.log")
  Rails.logger = ActiveSupport::Logger.new(log_file)
  Rails.logger.extend(ActiveSupport::Logger.broadcast(ActiveSupport::Logger.new(STDOUT)))
end
Test
Here's a small Rake task to test the above code:
# lib/tasks/stdout_and_log.rake
namespace :stdout_and_log do
  desc "Test if Rails.logger outputs to STDOUT and log file"
  task :test => :environment do
    puts "HELLO FROM PUTS"
    Rails.logger.info "HELLO FROM LOGGER"
  end
end
Running rake stdout_and_log:test outputs
HELLO FROM PUTS
HELLO FROM LOGGER
while
HELLO FROM LOGGER
has been added to log/development.log.
Running rake stdout_and_log:test RAILS_ENV=production outputs
HELLO FROM PUTS
while
HELLO FROM LOGGER
has been added to log/production.log.
I'd say that using Rails.logger.info is the way to go.
You won't be able to see it in the server console because it won't run via the server. Just open up a new console and tail -f the log file, it'll do the trick.
Many users are aware of the UNIX® command 'tail', which can be used to display the last few lines of a large file. This can be useful for viewing log files, etc.

Even more useful in some situations is the '-f' parameter to the 'tail' command. This causes tail to 'follow' the output of the file. Initially, the response will be the same as for 'tail' on its own - the last few lines of the file will be displayed. However, the command does not return to the prompt, and instead continues to 'follow' the file. When additional lines are added to the file, they will be displayed on the terminal. This is very useful for watching log files, or any other file which may be appended over time. Type 'man tail' for more details on this and other tail options.
(via)
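For a Rails app in development, that typically means:
tail -f log/development.log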
In Rails 2.X to redirect the logger to STDOUT in models:
ActiveRecord::Base.logger = Logger.new(STDOUT)
To redirect logger in controllers:
ActionController::Base.logger = Logger.new(STDOUT)
Execute a background job with '&' and then open script/console or whatever.
That way you can run multiple commands in the same window.
tail -f log/development.log &
script/console
Loading development environment (Rails 2.3.5)
>> Product.all
2011-03-10 11:56:00 18062 DEBUG Product Load (6.0ms) SELECT * FROM "products"
[<Product.1>,<Product.2>]
Note: this can get sloppy quickly when there is a lot of logging output.
How about creating an application helper which detects which environment is running and does the right thing?
def output_debug(info)
  if RAILS_ENV == "development"
    puts info
  else
    logger.info info
  end
end
Then call output_debug instead of puts or logger.info
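For example, a call site would simply change from puts "saving profile" to:
output_debug "saving profile"  # prints in development, logs elsewhere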