I am working on a project with Ruby on Rails. There's a page with a list of items where you can select multiple items and change their information at once. But when the dataset is huge and you select too many items to process at a time, the request runs for several minutes and then the page shows an error message caused by a server timeout. How can I let it run, even for ten minutes, without timing out and returning an error message?
There is no single way to do this.
First, check your logs for long SQL queries; if you see some very long ones, try to optimize them.
You can also use gems like rails-footnotes to see the SQL queries at the bottom of each page in the development environment.
If you are doing some expensive tasks, process them in the background with tools such as Resque or delayed_job.
The most user-friendly solution is to use background tasks...
With delayed_job, for example:
def my_action
  do_some_stuff
end
handle_asynchronously :my_action, :priority => 1
Then your action will be performed asynchronously and your server will continue to respond normally.
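Note that the job only runs if a delayed_job worker process is up; with the standard setup you start one with:
rake jobs:work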
Is there a way to automatically delete posts/products/articles or anything created on a Rails 6 app? I'm trying to build an online image repository where users can upload images that appear for only 24 hours and are then deleted. I have the posting and saving of the pictures/text working through Active Storage and Postgres, but I can't get them to be deleted automatically. I've read about whenever, Sidekiq, and resque-scheduler, but I find them difficult to understand and make work. I've tried some tutorials and read the documentation, but I'm still having trouble. Can anyone point me in the right direction or try to help me?
You have many options:
1. The simplest one is creating a rake task and setting up a cron job to call it every minute or so. If you only call it every 24 hours, you may end up with posts staying around for close to 48 hours.
2. You can use Delayed Job in two ways:
2.1 In an after_create callback, schedule a job that deletes the post after 24 hours. Something like this: handle_asynchronously :in_the_future, run_at: Proc.new { 24.hours.from_now }
2.2 Use the delayed_job_recurring gem to do the same thing as option 1, but without the need for cron.
Edit: I would use option 2.1 (sketched below) since it's the simplest and easiest to maintain. The only downside is that it will create a job for every post, but that shouldn't be a problem even with a million posts a day.
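A minimal sketch of option 2.1, assuming delayed_job is already set up and the model is called Post (all names here are illustrative, not from the question):
class Post < ApplicationRecord
  after_create :schedule_deletion

  # Runs in a Delayed Job worker ~24 hours after the post is created.
  def delete_in_the_future
    destroy
  end
  handle_asynchronously :delete_in_the_future, run_at: Proc.new { 24.hours.from_now }

  private

  def schedule_deletion
    delete_in_the_future # enqueues the job instead of running it inline
  end
end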
I'm transferring a Ruby app I once made into Rails.
Now the app does some calculations that take a while (up to infinity (in theory) if you like :p).
To show a user the status of everything, I previously used the console. Now, obviously, I want my browser to show this.
Does anyone have any pointers on where to start reading / examples / gems / ideas?
I'm pretty new to web development, but I've heard of jQuery, which could possibly do the trick?
If your computations take a long time, you will want to pass them to a background job processor. There are several gems that can help you do this. Here are a few, with tutorials on how to use them with Rails.
Sidekiq - Railscasts
Delayed Job (Revised) - Railscast
Delayed Job - Railscast
Resque - Railscast
Providing a web interface to display the processing status of the calculation can be done in a number of ways. One way might be with polling (a minimal status-endpoint sketch follows the links below).
Polling for Changes - Railscast
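A minimal sketch of the polling approach, assuming the background job records its progress on a Calculation model (all names here are made up): the browser hits this endpoint on a timer and updates the page.
class CalculationsController < ApplicationController
  def status
    calculation = Calculation.find(params[:id])
    render :json => { :progress => calculation.progress, :done => calculation.done? }
  end
end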
As per my understanding, you have two options to do this:
1 - Use some kind of server-push method. You may use the following components:
juggernaut (http://juggernaut.rubyforge.org/)
http://www.ape-project.org/
2 - Use a PeriodicalUpdater with jQuery. This will send a request to the server at a given time interval.
You can populate a db table, memcache, or any other datastore with your status, and write a method that reads and returns the value; that method can be called via Ajax.PeriodicalUpdater (see the sketch below).
I have done this, but it can kill performance, since it hits the server repeatedly (in my case it was every 5 seconds).
Even though I personally haven't done it, I believe the server-push option is the more methodical way to go.
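A minimal sketch of that status method, assuming the worker reports progress through Rails.cache (the key and fields are made up):
# Worker side: record progress as the calculation advances.
Rails.cache.write("calc_status_#{job_id}", { :state => "running", :percent => percent })

# Controller side: the action that Ajax.PeriodicalUpdater polls.
def status
  render :json => (Rails.cache.read("calc_status_#{params[:id]}") || { :state => "pending" })
end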
Rails 4 now has live streaming (ActionController::Live). You may use background task processing as Jason R recommended, and then at the end of the task put the results on the open live stream, for example using Redis pub/sub to return async results from the workers to the live-stream controller.
It's better than polling the server with a PeriodicalUpdater because it removes unneeded requests from the client, but it requires a free socket for every connected client.
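A minimal sketch of that combination, assuming the redis gem and a Rails 4 controller (the channel name and message format are assumptions):
class ProgressController < ApplicationController
  include ActionController::Live

  def stream
    response.headers["Content-Type"] = "text/event-stream"
    redis = Redis.new
    # Workers publish to this channel when a task finishes.
    redis.subscribe("job_progress") do |on|
      on.message do |_channel, message|
        response.stream.write("data: #{message}\n\n")
      end
    end
  rescue IOError
    # Client disconnected.
  ensure
    redis.quit rescue nil
    response.stream.close
  end
end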
I just found a super-tool :) Add this script to your project:
<script src='https://gist.githubusercontent.com/vitalyp/9441352/raw/5be994fbc78bd2bcc7ad31192f095c888d02f819/myconsole.js'></script>
and somewhere in document.ready (or from the browser console), invoke the function:
pop_console();
It displays a window with console.log(...) strings.
I have a page of a long list of items. Each has a check box next to it. There's a jQuery check-all function, but when I submit all of them at once, the request times out because it's doing a bunch of queries and inserting a bunch of records in the MySQL database for each item. If it were to not timeout, it'd probably take about 20 minutes. Instead, I just submit like 30 at a time.
I want to be able to just check all and submit and then just go on doing other work. My coworker (1) said I should just write a rake task. I did that, but I ended up duplicating code, and I prefer the user interface because what if I want to un-check a few? The rake task just submits them all.
Another coworker (2) recommended I use fork. He said that would spawn a new process that would run on the server but allow the server to respond before it's done. Then, since an item disappears after it's been submitted, I could just refresh the page to check if they're done.
I tried this on my local machine; however, it still seems that Rails waits for the process to finish before it responds to the POST request sent by the HTML form. The code I used looks like this:
def bulk_apply
  pid = fork do
    params[:ids].each do |id|
      Item.find(id).apply # takes a LONG time, esp. x 100
    end
  end
  Process.detach(pid) # reap child process automatically; don't leave it running
  flash[:notice] = "Applying... Please wait... Then, refresh page. Only submit once. PID: #{pid}"
  redirect_to :back
end
Coworker 1 said that generally you don't want to fork Rails because fork creates a child process that is basically a copy of the Rails process. He said if you want to do it through the web GUI, use BackgroundJob (Bj) (because we're already using that in our Rails app). So, I'm looking into BackgroundJob now, but what do you recommend?
I've had good success using BackgroundJob. If you need Rails, you will be using script/runner, which still starts up a new process with Rails. The good thing is that BackgroundJob will make sure there is never more than one job running at a time.
You can also use script runner directly, or even run a rake task in the background like so:
system " RAILS_ENV=#{RAILS_ENV} ruby #{RAILS_ROOT}/script/runner 'CompositeGrid.calculate_values(#{self.id})' & " unless RAILS_ENV == "test"
The ampersand tells it to start a new process. Be careful because you probably don't want a bunch of these running at the same time. I would definitely take advantage of background job if it is already available.
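For reference, a minimal sketch of queueing the same work through BackgroundJob's Bj.submit interface (an assumption based on its README; Item.apply_all is a made-up wrapper around the loop in your action):
# Queue a script/runner command; Bj runs queued commands one at a time.
Bj.submit "./script/runner 'Item.apply_all(#{params[:ids].inspect})'"

# In the model:
def self.apply_all(ids)
  ids.each { |id| find(id).apply }
end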
You should check out IronWorker. It would be super easy to do what you want, and it doesn't matter how long it takes.
In your action you'd just instantiate a worker which has the code that's doing all your database queries. Example worker:
Item.find(params["id"]).apply # takes a LONG time, esp. x 100; the queued payload reaches the worker as params
And here's how you'd queue up those jobs to run in parallel:
client = IronWorkerNG::Client.new
ids.each do |id|
  client.tasks.create("MyWorker", "id" => id)
end
That's all you'd need to do and IronWorker takes care of the rest.
Try the delayed_job gem. It is a database-backed background job gem. We used it on an e-commerce website; for example, sending the order confirmation email to the user is an ideal candidate for a delayed job.
Additionally, you can try multi-threading, which is supported by Ruby and could make things run faster; forking an entire process tends to be expensive due to memory usage. A small sketch of the threading idea follows.
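A minimal sketch of that multi-threading idea, reusing the params from the question; note that joining the threads still blocks the request, so this speeds the work up rather than backgrounding it:
threads = params[:ids].map do |id|
  Thread.new do
    # Each thread needs its own database connection from the pool.
    ActiveRecord::Base.connection_pool.with_connection do
      Item.find(id).apply
    end
  end
end
threads.each(&:join) # the request still waits for all threads here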
I have a rails app deployed on Heroku. I want to add a feature that enables users of the app to set a reminder. I need some way for the app to schedule sending an email at the time specified by the user.
I have found numerous posts referring to using delayed_job for this, but none of the write-ups / tutorials / etc. that I have found directly address what I am trying to accomplish (the descriptions I have found seem more geared towards managing long-running jobs that are to be run "whenever").
Am I on the right track looking at delayed_job for this? If so, can somebody point me towards a tutorial that might help me?
If delayed_job is not quite right for the job, does anybody have a suggestion for how I might approach this?
The most typical way of handling this is to use a cron job. You schedule a job to run every 15 minutes or so and deliver any reminders that come up in that time. Unfortunately, heroku only allows cron jobs to run every hour, which usually isn't often enough.
In this case, I'd use delayed_job and trick it into setting up a recurring task that delivers the notifications as often as necessary. For example, you could create a method that begins by rescheduling itself to run in 10 minutes and then goes on to send any reminders that popped up in the previous 10 minutes.
To view delayed_job's send_at syntax for scheduling future jobs, check here: https://github.com/tobi/delayed_job/wiki
ADDED after comments:
To send the reminders, you would need to create a method that searches for pending reminders and sends them. For example, let's say you have a model called Reminder (Rails 3 syntax because I like it better):
def self.find_and_send_reminders
  reminders = Reminder.where("send_at < ? AND sent = ?", Time.now, false).all
  reminders.each do |r|
    # The following delayed_job syntax is apparently new, and I haven't tried
    # it; it came from the collective_idea fork of delayed_job on GitHub.
    Notifier.delay.deliver_reminder_email(r)
    # Not checking whether anything actually sent successfully here, just
    # assuming it did; you may want to handle this better in a real app.
    r.update_attributes!(:sent => true)
  end
  # Again using the new syntax, untested; Heroku may require the old
  # "send_at" and "send_later" syntax.
  Reminder.delay(:run_at => 15.minutes.from_now).find_and_send_reminders
end
This syntax assumes you decided to use a single Reminder entry for every occurrence. If you decide to use a single entry for all recurring reminders, you could create a field like "last_sent" instead of a boolean "sent" and use that. Keep in mind these are all just ideas; I haven't actually taken the time to implement anything like this yet, so I probably haven't considered all the options/problems.
Check out the runt gem, may be useful for you: http://runt.rubyforge.org/
You can use delayed_job's run_at to schedule at a specific time instead of whenever.
If your application allows the users to change the time of a reminder, you need to keep a reference to the delayed job so you can update or delete it when required.
Here are more details.
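A minimal sketch of that bookkeeping; it relies on the delay proxy returning the enqueued Delayed::Job, and assumes a hypothetical delayed_job_id column on the reminders table:
# When the reminder is created: schedule the email and remember the job.
job = Notifier.delay(:run_at => reminder.remind_at).deliver_reminder_email(reminder)
reminder.update_attributes!(:delayed_job_id => job.id)

# If the user later changes the reminder time:
Delayed::Job.find(reminder.delayed_job_id).update_attributes!(:run_at => new_time)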
It's good to avoid polling if you can. The worker thread will poll at the database level, you don't want to add polling on top of polling.
A certain action in my controller takes a lot of time to process (heavy db work). So when my user clicks "submit" on the form, he has to wait quite a long time for the process to complete. Is there any way that, on submitting, the user is redirected to the next view without any delay while the processing continues in the back-end, without making the user wait?
When the user's request is made, queue up the job and then redirect the request where you want it.
There are two popular Ruby Gems for job processing:
Delayed Job
Resque
Delayed Job is probably the easier of the two to set up, since it does not require Redis.
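A minimal sketch of that flow with delayed_job (the model and method names are made up):
def create
  # Enqueue the heavy db work and return to the user immediately.
  Report.delay.generate(params[:report_id])
  redirect_to reports_path, :notice => "Your report is being generated."
end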
For things like this, I usually dump the work into a database queue and then use a cron job to actually run it.
For instance, say I had to send out an email to all the clients using the software. I'd put the message into a database table, along with some information about who should get it, and then a cron job would actually do the sending.
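A minimal sketch of that pattern (the table and mailer names are made up); a cron job would invoke deliver_pending! every few minutes:
class OutgoingEmail < ActiveRecord::Base
  # Called from cron, e.g. via script/runner or a rake task.
  def self.deliver_pending!
    where(:sent_at => nil).find_each do |email|
      ClientMailer.broadcast_message(email).deliver # hypothetical mailer
      email.update_attributes!(:sent_at => Time.now)
    end
  end
end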
It sounds to me like you need to fork the process that takes so long.
For example:
fork { "this code is being ran in background" }
The problem is that this code won't play nicely with SQL, since the database connection is not persistent across the fork. To handle this problem I've been using the spawn plugin for a while, with excellent results.
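A minimal sketch of the spawn plugin's block form (the model call is illustrative); spawn forks the request and re-establishes the database connection in the child for you:
def submit
  spawn do
    Report.generate(params[:id]) # hypothetical long-running db work
  end
  redirect_to next_step_path
end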