I have some containers running in AWS ECS (EC2-based). All containers run a Ruby on Rails application, and we use ECS Exec to connect to them. I need to run the migration task and keep the shell open after running it. Any idea what I need to add after the migration command to do this?
The command I am using to run the migration task is rake db:migrate.
Will something like rake db:migrate exec $SHELL work?
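A minimal sketch of one way to do this, assuming bash is available inside the container (the cluster, task, and container values are placeholders): chain the two commands and exec the shell so it replaces the wrapper process.

aws ecs execute-command \
  --cluster my-cluster \
  --task <task-id> \
  --container rails-app \
  --interactive \
  --command "/bin/bash -c 'rake db:migrate && exec bash'"

If the image only ships /bin/sh, swap bash for sh.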
Right now I have a Docker container running a Rails 6 app on Heroku, and I am wondering if there is a way to run some rake tasks. I have tried:
heroku run bash
heroku run rake -T
heroku run rails console
But all of these commands start the rails server for some reason. Is it possible to run rake tasks if I am using Docker?
I am new to Ruby on Rails.
My Rails application uses two servers: the Rails server and a simple Ruby server.rb file. I need to start both servers with a start.sh script when I deploy.
I have tried the following start.sh file, but the issue is that the Rails server does not start until I stop ruby server.rb.
start.sh:
rake ts:stop
rake ts:start
rake ts:index
ruby server.rb
rails server
I want to run both servers through a single script.
If you are on a Unix-based system, appending an & starts a command in the background. What you need is:
rake ts:stop
rake ts:start
rake ts:index
ruby server.rb &
rails server
For a Rails project, a better way to start multiple processes is a Procfile. You would then start your application with a Procfile manager like foreman (https://github.com/ddollar/foreman).
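As a sketch, and assuming the rake ts: tasks above run as one-off setup before boot, a Procfile for the two long-running processes might look like this (process names are illustrative):

web: bundle exec rails server
worker: bundle exec ruby server.rb

With that in place, foreman start launches both processes and streams their output together.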
I've successfully installed Harry's Prelaunchr on Heroku (https://github.com/harrystech/prelaunchr),
and to export my collected emails into a CSV I need to run this command: bundle exec rake prelaunchr:create_winner_csvs.
Is there any way to run that command through pgAdmin or some other program?
Or is the only way to download my Heroku database and run the command locally? If so, how would I do that, and what would I need?
I'm pretty new to Rails and PostgreSQL and would really appreciate it if someone could help me out!
Because the rake task creates files locally, you can't just run it on Heroku via heroku run rake. You can, however, set up your local database.yml to connect to your Heroku PostgreSQL instance and run the rake task locally:
1. Run heroku pg:credentials to get the required database values.
2. Fill in the production environment of config/database.yml with the values you obtained in step 1 (for the database key in the yml file, use the dbname from step 1); see the example below.
3. Test your connection with RAILS_ENV=production rails db. This should drop you into a psql console.
4. Run the rake task: RAILS_ENV=production bundle exec rake prelaunchr:create_winner_csvs
The files will be saved locally in lib/assets, as indicated by the documentation.
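A sketch of what the production entry in config/database.yml might look like; every value below is a placeholder standing in for the output of heroku pg:credentials:

production:
  adapter: postgresql
  encoding: unicode
  host: <host from pg:credentials>
  port: 5432
  database: <dbname from pg:credentials>
  username: <user from pg:credentials>
  password: <password from pg:credentials>
  sslmode: require   # Heroku Postgres requires SSL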
From within the project directory you can use:
heroku run rake prelaunchr:create_winner_csvs
You should probably add a UI form to your application.
On clicking Export CSV, it should run a background job on Heroku (using Delayed Job):
heroku run rake prelaunchr:create_winner_csvs
Use the send_data Ruby method to send the generated data file to the browser (see the sketch below).
Download the file to your local system from the running Heroku instance.
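A minimal controller sketch of the send_data step, assuming the rake task has already written the CSV under lib/assets; the controller, action, and file names here are hypothetical:

# app/controllers/exports_controller.rb (hypothetical)
class ExportsController < ApplicationController
  def winners
    # Path where the Prelaunchr task writes its CSVs (file name is an assumption)
    path = Rails.root.join("lib", "assets", "winners.csv")
    send_data File.read(path), filename: "winners.csv", type: "text/csv"
  end
end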
Hope this will resolve your problem.
Cheers!!!
I ran into this problem recently while developing the Prelaunchr campaign for a client. Assuming you have a local version of your app and are using Heroku Postgres (attached here as COPPER), you can "pull" your Heroku database down to your local machine, set that as your development database in database.yml, and run the rake task from your local app, which should now have the same database as your Heroku version. Here is the command to pull the db (subbing out name_for_database and heroku_app_name with your own):
heroku pg:pull HEROKU_POSTGRESQL_COPPER_URL name_for_database --app heroku_app_name
Make sure to restart your local server to see the new database info populated.
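A sketch of the matching development entry in config/database.yml, assuming the pulled copy lives on your local Postgres under the name you chose above:

development:
  adapter: postgresql
  encoding: unicode
  database: name_for_database   # the name you passed to heroku pg:pull
  host: localhost
  pool: 5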
I've noticed I've been having to do:
bundle exec script/console
<wait for console to load>
require migration
generate some data
a lot, and I was wondering if there is a way to put this all in a bash script or something, so I could just run ./generatedata and have it execute the above commands.
I've found that custom rake tasks are an awesome tool when you have work that requires running code in the Rails environment. Check out this RailsCast: http://railscasts.com/episodes/66-custom-rake-tasks
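As a sketch, a custom rake task for the data generation from the question might look like this (the namespace, task name, and generation code are hypothetical):

# lib/tasks/generate_data.rake (hypothetical)
namespace :data do
  desc "Generate some sample data"
  task generate: :environment do
    # Depending on :environment boots Rails, so models and the database are available
    puts "Generating data..."
    # ... your data-generation code goes here ...
  end
end

Run it with bundle exec rake data:generate.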
If you want to run a one-off command in the console, you can use the rails runner command. If you have a ./generatedata.rb script that performs the Ruby commands you would otherwise type into the console, you can just call rails runner ./generatedata.rb and it will run your script in the Rails environment, against the database. Alternatively, you can add a shebang line to ./generatedata.rb: #!/usr/bin/env rails runner (on some systems you may need #!/usr/bin/env -S rails runner so the two words are split). Then you only need to execute ./generatedata.rb and it will use rails runner automatically.
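And a minimal generatedata.rb sketch to run via rails runner; the User model and its attributes are placeholders for whatever data you actually generate:

#!/usr/bin/env rails runner
# Runs inside the Rails environment, so models and the database are available
10.times do |i|
  User.create!(name: "Test user #{i}", email: "test#{i}@example.com")
end
puts "Generated #{User.count} users"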
I have a remote Ubuntu Linux machine for testing a Ruby on Rails application, and I use the delayed_job gem in the application.
On my local machine, I use rake jobs:work to start the worker process, which runs all delayed jobs automatically.
I would like to start this worker process on the Linux machine and then quit the SSH connection.
What's the best practice for setting up delayed_job on Linux? Thanks.
Try adding & at the end of your command to start rake as a background process:
rake jobs:work &
Now you can quit SSH.
Using nohup rake jobs:work & fixed the problem. Solved :)
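A slightly fuller sketch, with the log file name as an assumption: nohup keeps the worker alive when the SSH session's hangup signal is sent, and the redirect preserves its output.

nohup rake jobs:work > delayed_job.log 2>&1 &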