Run ruby script in background without using screen - ruby-on-rails

I have two scripts in the Rails environment which have to run 24/7. I'm working on a remote server, so I have to start the scripts over ssh, which means I'd need to keep the ssh session open all the time.
I'm searching for a simple way to run these scripts in the background so they aren't canceled as soon as I close the ssh connection.
I don't want to use screen. I think there must be a simpler way to handle this. Isn't there?

I think the most basic solution would be nohup:
nohup myscript &> /dev/null &
(The &> redirection shorthand is bash-specific; in POSIX sh, write > /dev/null 2>&1 instead.)

You can disown a script (the &! shorthand is zsh-specific):
ruby script.rb &!
STDOUT/STDERR are still attached to the current shell, but the ruby process is no longer a child of the shell, so it won't be killed when you close the terminal.
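In bash, which lacks zsh's &! shorthand, the equivalent is a background job followed by disown. A minimal self-contained sketch, with `sleep` standing in for the actual `ruby script.rb` invocation and the log path purely illustrative:

```shell
# `sleep 5` stands in for `ruby script.rb` so this sketch runs anywhere.
# Redirecting output to a file keeps it readable after the shell is gone.
sleep 5 > script.log 2>&1 &
disown $!
# The process is out of the job table, so a terminal hangup won't kill it:
kill -0 $! && echo "still running, disowned"
```

Unlike nohup, disown is applied after the fact, which is handy when you forgot to detach a job before starting it.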

Check out the Daemons gem. It's a toolkit for converting a script into a controllable daemon.
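Under the hood, toolkits like Daemons build on the classic daemonization primitive that Ruby's standard library exposes as Process.daemon. A self-contained sketch (the pid-file path and the sleep stand-in for the real 24/7 work loop are illustrative):

```ruby
require 'tmpdir'

pidfile = File.join(Dir.tmpdir, 'worker_demo.pid')
File.delete(pidfile) if File.exist?(pidfile)

# Fork, then daemonize the child: Process.daemon detaches it from the
# controlling terminal and points stdio at /dev/null, so closing the
# ssh session won't kill it.
child = fork do
  Process.daemon(true, false)
  File.write(pidfile, Process.pid.to_s)
  sleep 2 # stand-in for the real 24/7 work loop
end
Process.wait(child)

# Wait for the daemon to record its pid, then verify it is alive.
20.times do
  break if File.exist?(pidfile)
  sleep 0.1
end
daemon_pid = File.read(pidfile).to_i
Process.kill(0, daemon_pid) # raises Errno::ESRCH if the daemon is gone
puts "daemon running as pid #{daemon_pid}"
```

What the Daemons gem adds on top of this primitive is pid-file management and start/stop/restart commands, which is why it is the more practical choice for scripts you want to control.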

For daemonizing you can use
runit
supervisord
or some Ruby tooling for background processing: https://www.ruby-toolbox.com/#Background_Processing

Related

Docker Compose down. Can I run a command before the actual stop?

I have a setup with docker-compose which creates a screen and runs a process in it.
That's because when I run docker-compose with -d, the process runs in the background, and attaching to the container just gives me a new shell rather than the shell of the actual process.
What I need is the shell with the actual process...
So in my docker-compose script I use screen to run the process inside a screen instance. When I open a shell, I can connect to the shell of the running process with the screen -r <screen_name> command.
But because the process is running in a screen, the docker-compose down command won't stop it properly and gets stuck while trying. Instead, I have to force the stop, which is not what I want, because it isn't the proper way of ending the process I have.
So I thought I need a way to define a stop command before the actual stopping happens.
Any tips are appreciated.
PS: Yes, it's Minecraft
EDIT 1: After Calum Halpin's comment I don't need a screen anymore, so now I only need a way to pipe something like "exit" to stdin.
EDIT 2: I guess I still need screen. When attaching to the shell I can't escape from there without killing the terminal session and therefore the process...
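One way to pipe a command such as "exit" into the server's stdin without screen is a named pipe (FIFO). A sketch, with `cat` standing in for the real server process and all paths purely illustrative:

```shell
fifo=/tmp/server-in.fifo
log=/tmp/server-out.log
rm -f "$fifo" "$log"
mkfifo "$fifo"

# The "server": reads its commands from the FIFO. `cat` stands in for
# the process docker-compose would start with stdin redirected this way.
cat "$fifo" > "$log" &

# What a pre-stop hook would do before `docker-compose down`:
echo "exit" > "$fifo"
wait
grep -q exit "$log" && echo "stop command delivered"
```

The idea is that the container's entrypoint keeps the FIFO open as the server's stdin, so a shutdown hook can echo the graceful-stop command into it instead of force-killing the process.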

Starting Erlang service at boot time (using Relx for creating release)

I have a server written in Erlang, compiled with Rebar, and I make a release with Relx. Starts nicely with
/root/rel/share3/bin/share3 start
The next step is to start it when the server boots.
I have tried different approaches; the last one was using /etc/init.d/skeleton, in which I changed the following:
NAME=share3
DAEMON=/root/rel/share3/bin/share3
DAEMON_ARGS="$1"
After that I ran update-rc.d, but I have not gotten it to work (Ubuntu 14.04).
The service runs until the machine reboots, after which I need to log in and start it again.
On Windows this is really elegant, since the release can be installed as a Windows service.
Ubuntu 14.04 uses Upstart as its init system, so you could try something like this:
description "Start my awesome service"
start on runlevel [2345]
stop on runlevel [!2345]
respawn
exec /root/rel/share3/bin/share3
You have to place this script in the /etc/init/ directory with a '.conf' extension, e.g. /etc/init/share3.conf. To start it, invoke sudo start share3.
At last, I solved it!
I told relx to place the result in /home/mattias/rel, so the script it generates is /home/mattias/rel/share3/bin/share3.
Replace the line
SCRIPT_DIR="$(dirname "$0")"
with the following (adjust the path /home/mattias/rel to your setup):
HOME=/home/mattias
export HOME
SCRIPT_DIR="/home/mattias/rel/share3/bin"
Copy the file to /etc/init.d/share3 using
sudo cp ~/rel/share3/bin/share3 /etc/init.d/
Test that it works using
/etc/init.d/share3 start
and
/etc/init.d/share3 stop
To make it start at boot, install sysv-rc-conf:
sudo apt-get install sysv-rc-conf
Enable start at boot using
sudo sysv-rc-conf share3 on
and disable it again with
sudo sysv-rc-conf share3 off
Alternatives are welcome.

Run new ant target without killing previous target

I've got an ant target, ant server, that runs a Java application which logs to the console. I need to run another target, ant server-gui, which also logs to the console. But while ant server is running, its logging prevents me from running any new ant targets.
When I enter ^C (the only way I know of to get out of situations like that), it kills the Java application, and I need both to run. What keystroke will get me out of that "input" mode and able to run new terminal commands?
UPDATE: I haven't found a direct way out of the mode I mentioned, but opening a new terminal tab/window does the trick; I can run as many ant commands as I'd like that way. Still looking for a good way out of the "input" mode, though!
UPDATE 2: #abcdef pointed out another post that has an even more elegant solution.
There are a few ways to do this, assuming you are on *nix:
1) Run the ant command with an & at the end, which tells the shell to run the command in the background.
2) Prefix the command with nohup (https://en.wikipedia.org/wiki/Nohup) so it also survives the terminal closing.
3) While the process is running, press ctrl-z and then enter the command bg. This manually moves the running command into the background.
I hope this helps you out.
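Combining 1) and 2), with `sleep` standing in for `ant server` so the sketch is self-contained, and the log filename purely illustrative:

```shell
# Start the first target in the background, immune to hangups, with its
# console logging captured to a file instead of blocking the terminal.
nohup sleep 2 > server.log 2>&1 &
bg_pid=$!
kill -0 "$bg_pid" && echo "server target running in background (pid $bg_pid)"
# The terminal is now free for the second target, e.g.: ant server-gui
```

Redirecting the output matters here: it was the continuous console logging, not the process itself, that made the terminal unusable for further commands.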

Start god process on server startup (Ubuntu)

I'm currently struggling with executing a simple command which I know works when I run it manually when logged in as either root or non-root user:
god -c path/to/app/queue_worker.god
I'm trying to run this when the server starts (I'm running Ubuntu 12.04), and I've investigated adding it to /etc/rc.local just to see if it runs. I know I can add it to /etc/init.d and then use update-rc.d but as far as I understand it's basically the same thing.
My question is how to run this command after everything has booted up, as cleanly as possible and without any fuss.
I'm probably missing something about the lifecycle of how everything is initialized, so I'd gladly take some education! Are there alternative ways or places to put this command?
Thanks!
You could write a bash script that determines whether Apache has started and run it as a cron job at a set interval...
#!/bin/sh
if [ -n "$(pidof apache2)" ]; then
    # process was found; start god unless it is already running
    pgrep -x god > /dev/null || god -c path/to/app/queue_worker.god
else
    : # process not found; try again on the next cron run
fi
(Note the Apache binary is named apache2 on Ubuntu.) Of course, you'll then have a cron job running all the time even after it has done its work, and you'd have to flip a switch somewhere once it has run so it doesn't run again... but this should give you an idea to start from.
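Since Ubuntu 12.04 is also Upstart-based, a cleaner alternative to a polling cron job is an Upstart job that launches god at boot and respawns it if it dies. A sketch for /etc/init/god.conf (the path to the .god file is illustrative, and -D keeps god in the foreground so Upstart can supervise it):

description "god process monitor"
start on runlevel [2345]
stop on runlevel [!2345]
respawn
exec god -c path/to/app/queue_worker.god -D

With this in place the service comes up on every boot with no cron polling, and sudo start god / sudo stop god control it manually.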

When calling a shell command via ruby, what context does the command run on?

In a rails application (or sinatra), if I make a call to a shell command, under what context does this command run?
I'm not sure if I'm asking my question correctly, but does it run in the same thread as the rails process?
When you shell out, is it possible to make it an asynchronous call? If yes, does that mean the operating system starts a new thread? Can it run in a pool of threads instead of a new thread?
If you are using system('cmd') or simply backticks:
`cmd`
Then the command will be executed in the context of a subshell.
If you wish to run several of these at a time, you can use Ruby's fork functionality:
fork { system('cmd') }
fork { system('cmd') }
This will create multiple subprocesses, each running its command in its own subshell.
Read up on forking here: http://www.ruby-doc.org/core-2.0/Process.html#method-c-fork
It's more than just a new thread; it's a completely separate process. The call is synchronous, and control will not return to Ruby until the command has completed. If you want a fire-and-forget solution, you can simply background the task:
$ irb
irb(main):001:0> system("sleep 30 &")
=> true
irb(main):002:0>
$ ps ax | grep sleep
3409 pts/4 S 0:00 sleep 30
You can start as many processes as you want via system("foo &") or `foo &`.
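If you want fire-and-forget behaviour without routing the "&" through a subshell, Ruby's Process.spawn plus Process.detach does the same thing more explicitly (the sleep command here is just a stand-in for your real task):

```ruby
# Launch a command asynchronously: control returns to Ruby immediately,
# and detach reaps the child in a background thread so no zombie is left.
pid = Process.spawn('sleep', '2', out: '/dev/null', err: '/dev/null')
Process.detach(pid)
puts "launched pid #{pid}"
```

Passing the command and arguments separately also avoids shell interpolation, which matters if any argument comes from user input.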
If you want more control over launching background processes from Ruby, including properly detaching ttys and a host of other things, check out the daemons gem. That's more suitable for long-running processes that you want to manage, with PID files, etc., but it's also possible to just launch tasks with it.
There are alternative solutions for managing background processes depending on your needs. The resque gem is popular for queuing and managing background jobs. It requires Redis and some setup, but it's good if you need that level of control.
