Spring server: Timeout error

I am trying to run a cron task from my Rails app's schedule.rb file. The task invokes a function written in Ruby. The function runs perfectly fine on its own; however, when it runs under cron I get this error:
Starting Spring server with `/home/ubuntu/.rvm/gems/ruby-2.4.0/gems/spring-2.0.2/bin/spring server --background` timed out after 20 seconds
Spring (2.0.2) is installed and working perfectly.
Any idea how to solve this?

If the command works outside of cron, but not in the crontab, the problem is almost certainly that the command isn't picking up some necessary environment variable setting. There are several ways to get around the problem, but the simplest and best is to wrap your command in a shell script.
For initial testing, you can simply source your login environment:
. ~/.bash_profile
But eventually you'll want to just set the variables you need and not include anything extra. For more information, see Define your own job types.
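For example, with the whenever gem (which generates the crontab from config/schedule.rb), you can define a job type that sets the environment explicitly. This is a minimal sketch, not your exact setup: the PATH value, schedule, and task name are assumptions, and DISABLE_SPRING=1 tells Spring to stay out of the way so cron never has to boot its server:

# config/schedule.rb
job_type :plain_rake, "cd :path && PATH=/usr/local/bin:/usr/bin:/bin DISABLE_SPRING=1 bundle exec rake :task :output"

every 1.hour do
  plain_rake "my_app:my_task"   # placeholder task name
end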

Related

Launching a process from inside a controller in rails

I want to start a process from inside a controller.
I've tried the usual
pid = fork do
  code
end
Process.detach(pid)
But nothing is happening. When I try with eval(code) in the fork block, the code runs, but it's the actual Rails server (Puma) that runs it. This means that when I kill the process, I also shut down the whole server.
I previously had some working code (since lost), and I'm nearly sure it used exec or eval or something like that to create a process (and therefore returned a pid, so I could kill it later). I remember checking with ps that it was run by something Rails-related, but not the actual whole server.
Why isn't the fork do block enough for it to work? What's the way to do it?
And, for non-testing purposes and actual implementation, how can I make it run totally independent from the rails server?
You can launch a shell command from inside your Rails controller using Process.spawn, which starts the command in a new process and returns its pid. (A bare exec, by contrast, replaces the current process, i.e. your server, rather than starting a new one.) The spawned process is totally independent of the Rails server and will show up under the ps command while it is running.
Documentation: http://ruby-doc.org/core-2.5.1/Kernel.html#method-i-exec and http://ruby-doc.org/core-2.5.1/Kernel.html#method-i-spawn
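A minimal sketch of that in a controller action (the command and the pid bookkeeping are placeholders, not a prescribed API):

def start_worker
  # Start an independent child process; the server process is untouched.
  pid = Process.spawn('some_long_running_command', :out => '/dev/null', :err => '/dev/null')
  Process.detach(pid)          # reap the child automatically so it never becomes a zombie
  session[:worker_pid] = pid   # stash the pid somewhere if you want to kill it later
  head :ok
end

Killing it later is then just Process.kill('TERM', pid).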

Running node.js code from Rails application

I am trying to integrate some Node.js code with my Rails application. Basically, it's a JS file whose process is supposed to keep running in the background.
I have followed these steps:
Added the code in a test_node.js file at the root of the Rails app.
Then I invoke it from Ruby using the exec function, e.g. exec "node test_node.js".
This works perfectly fine, but it stops my server from processing any further requests.
To push it to the background I tried nohup, e.g.: exec "nohup node test_node.js".
When I run this code, my server crashes.
I am a Rails developer and have never worked on a Node app, so I have no idea whether I'm approaching this the right way.
exec replaces the currently running process with a new one. Thus, to run something in the background you should fork and exec in the child.
fork { exec('node', 'test_node.js', ...) }
nohup is not needed.
See also "Ruby, Difference between exec, system and %x() or Backticks".
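Fleshed out slightly (a sketch; test_node.js is the file from the question), with the child detached so it is reaped automatically:

pid = fork do
  # In the child: replace this forked copy of the process with node.
  # The parent (the Rails server) keeps serving requests.
  exec('node', 'test_node.js')
end
Process.detach(pid)   # avoid leaving a zombie when the child exits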

How to run a capistrano task on server start?

I have been thinking about this and searching for it for ages without finding anything, so I am going to assume I've hit the XY problem.
Let me describe my issue, this sounds common enough.
We use capistrano to deploy our web app and db. The relevant part is that we have a dedicated server for delayed job and we use capistrano to deploy to it and start/restart the processes. This is a custom number of workers with 2 different Gemfiles and 3 queues.
What I want to do is to start them up on server restart or, more to the point, on server clone + start.
I have tried calling cap production delayed_job:custom_start from the server itself... it didn't work. (This is the core of my question, adjusted for the XY problem.) Not sure it even makes sense, but I want to know whether it is possible. custom_start is a task that starts our set of workers.
Alternatively I am thinking of abstracting the code into a rake task or a script or something and calling it from both capistrano and where ever I would need to add it to start on startup. Does this make more sense?
Edit: just found this post... discouraging.
p.s. I just want to clarify that when I say server I mean my Machine/ec2 instance and not my web app restarting.
My Jenkins deploy jobs are littered with direct tasks developers can call, such as cap dev app:fetch_logs and cap qa sanitize_environment. This part of Capistrano is easy and well proven.
What I am guessing you want to do is use Capistrano to set up rc.d files. Yes, you can do this. Should you be using Chef/Puppet at this point? Worth considering. Scalr/RightScale are fun things to look at too.
You can write a bash script as an .erb template containing all your worker variables, then upload the rendered script into the deploy_to directory. Finally, you can set up another task (run with #{sudo}) that installs an rc.d wrapper for the script, or, instead of an rc.d wrapper, simply call the bash script from rc.local; sed works for appending the line to rc.local.
I ended up moving the delayed-job-related logic into its own script that accepts start/stop, and delegating to that script from Capistrano. This means I can also add the script to my rc.local.
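For reference, the glue can be this small (a sketch in Capistrano 2 style, matching the era of the question; the script path, role name, and rc.local line are assumptions):

# config/deploy.rb
namespace :delayed_job do
  task :custom_start, :roles => :worker do
    run "#{current_path}/script/workers start"   # hypothetical start/stop script
  end
  task :custom_stop, :roles => :worker do
    run "#{current_path}/script/workers stop"
  end
end

# /etc/rc.local on the worker machine reuses the same script, e.g.:
#   su - deploy -c '/var/www/app/current/script/workers start'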

Need working examples of whenever gem in action

I recently made a Ruby gem called dvi_scrape for a new Rails web site that I'm working on. The dvi_scrape gem downloads web pages, scrapes financial data from them, processes the data, and outputs the results to a Postgres database.
I'm able to get the Ruby command Dvi_scrape.dopeler to work as expected when executed manually. However, I'm unable to get it to work as expected when executed through cron.
How do I get this to work from cron on my WebFaction account?
The source code of the Rails site I'm working on is this. The source code of the dvi_scrape gem is at this place.
I understand that config/schedule.rb is where you specify what scripts need to be run and at what time intervals. config/environment.rb is where you specify the environment. config/deploy.rb is where you specify what happens when you enter "cap deploy".
Are there any good examples of scripts that execute commands from certain gems at regular intervals? Please point me to some good examples on GitHub.
I don't have any "good examples on GitHub", but...
Assuming you have the gem installed into some specific web application (i.e., you followed WebFaction's documentation for installing gems), you probably need to set some environment variables to make it usable when running from cron.
Try setting them in your cron job, like this (my example uses a 10-minute cron interval, change it to whatever you need):
*/10 * * * * GEM_HOME=$HOME/webapps/yourapp/gems RUBYLIB=$HOME/webapps/yourapp/lib PATH=$HOME/webapps/yourapp/bin your_command_here
Hope that helps!
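Since the question asks specifically about the whenever gem, here is a minimal schedule.rb sketch along the same lines; the paths mirror the crontab entry above and, like the 10-minute interval, are assumptions:

# config/schedule.rb
env 'GEM_HOME', '/home/youruser/webapps/yourapp/gems'
env 'PATH', '/home/youruser/webapps/yourapp/bin:/usr/local/bin:/usr/bin:/bin'

every 10.minutes do
  runner "Dvi_scrape.dopeler"   # the entry point named in the question
end

whenever then renders this into a crontab entry much like the one above when you run whenever --update-crontab.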

Delayed Jobs on Rails 2: any better way to run workers?

I finally got the DelayedJobs plugin working for Rails 2, and it does indeed work fine...as long as I run:
rake jobs:work
Just like the readme says, to be fair.
BUT, this doesn't fit my requirements... what kind of background task requires you to keep a shell open with a command running? That'd be like having to run script/server to start my Rails app with no -d option, so it would never keep running after I close my shell.
Is there ANY way to keep the workers processing in the background, in daemon mode, or whatever?
I had a ray of hope when I saw this line in the readme:

You can also run by writing a simple script/job_runner, and invoking it externally.

...but that just does the exact same thing the rake task does; you just call it a different way.
What I want:
I want to start my rails app, then start whatever will process the workers, and have BOTH of them run invisibly in the background, without the need for me to babysit it and keep the shell that started it running.
(My server is something I SSH into, so I don't want to keep the shell that SSHed into it open 24/7, especially since I like to turn off my local computer now and again.)
Is there any way to accomplish this?
You can make any *nix command run on the background by appending an & to its end:
rake jobs:work &
Just make sure you exit the shell (or use the disown command) to detach the process from your login session... Otherwise, if your session disconnects, the processes you own will be killed with it.
Perhaps Beanstalkd and Stalker?
Beanstalk is a fast and easy way to queue background tasks. Stalker provides a nice wrapper interface for creating these jobs.
See the RailsCast on it for more information.
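For flavor, a tiny sketch based on Stalker's README of that era (the job name and handler body are placeholders):

# jobs.rb
require 'stalker'
include Stalker

job 'report.generate' do |args|
  Report.generate(args['id'])   # hypothetical application call
end

# Enqueue from the app:  Stalker.enqueue('report.generate', :id => 42)
# Run a worker with the gem's binary, e.g.:  stalk jobs.rb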
Edit:
You could also run that rake task as a cronjob which would mean the server would run it periodically without you needing to be logged in
Use the collectiveidea fork of delayed_job... It's more actively developed and has support for running the jobs in a daemon without any extra messing about.
My capistrano script calls
RAILS_ENV=production script/delayed_job start
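Wired into the deploy, that can look roughly like this (Capistrano 2 style; the role and the deploy:restart hook point are assumptions about your setup):

namespace :delayed_job do
  task :restart, :roles => :app do
    run "cd #{current_path} && RAILS_ENV=production script/delayed_job restart"
  end
end
after "deploy:restart", "delayed_job:restart"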
