Sidekiq query for job completion - ruby-on-rails

I have an admin dashboard action in a React front end using a Ruby on Rails API back end. I have a Sidekiq job that runs to import users from a third party. I would like to somehow trigger a refresh on the admin panel when the job is complete. What is the best way to go about this? I am using graphql-ruby. Maybe I could use startPolling on the front end? Can I poll for completion of a specific job on Sidekiq somehow? Any help is appreciated!

Hi Ilovebathroomlights
There are several approaches you can use; I am describing one of them below, assuming you are okay with writing a state to Redis, the database, or whatever storage you are using, which can then be polled.
You can start by assigning a unique id to your Sidekiq job and saving this id with a status of "pending". Then, on the last line inside the Sidekiq job code, update the status for that unique id to "completed".
In the meantime, you can poll by the unique id you generated for that particular job, and refresh the data or refresh the page once it reports "completed"; a rough sketch follows.
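A minimal sketch of that approach, assuming you are willing to persist the status in the database; the ImportStatus model and ImportUsersWorker names are illustrative, not from your app. The front end can then poll a GraphQL query (for example via Apollo's startPolling) that reads this record, and refresh once the state flips to "completed".

class ImportUsersWorker
  include Sidekiq::Worker

  def perform(import_status_id)
    status = ImportStatus.find(import_status_id)
    # ... import users from the third-party service ...
    status.update!(state: "completed")   # last line: mark the job as done
  rescue StandardError
    status&.update!(state: "failed")
    raise
  end
end

# When enqueuing, create the status record first and pass its id along:
# status = ImportStatus.create!(state: "pending")
# ImportUsersWorker.perform_async(status.id)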

Related

Prevent duplicate ActiveJob being scheduled

I have a Rails app that queries an external webservice to update records.
I want to continue polling the web service until the user session expires.
Currently I create an ActiveJob in the show action for the record I want updated.
In the ActiveJob I reschedule it using
self.class.set(wait: 60.seconds).perform_later(record_id)
The problem is that if the user goes to the show action again, it will create another ActiveJob.
Is there any way to prevent duplicate jobs from being created?
Ideally, there would be a way to search the ActiveJobs to see if one already exists. Something like ActiveJob.find_by(job: JobType, params: record_id). That would allow you to manipulate the jobs before creating a duplicate. I'm going to dig in further and see if that could be created...
The activejob-uniqueness gem is designed to de-dupe jobs. The gem is still relatively new, but it's sponsored by an engineering company (Veeqo) and seems well supported. See also: https://devs.veeqo.com/job-uniqueness-for-activejob/
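As a rough sketch of how the gem is wired in (going by its README; the job name here is illustrative and the lock strategies can vary by version):

class PollWebserviceJob < ApplicationJob
  # Skips enqueueing when an identical job (same class and arguments) is already queued
  unique :until_executed

  def perform(record_id)
    # ... poll the external web service and update the record ...
  end
end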
First set a cookie value when the user visits the show action for the first time.
self.class.set(wait: 60.seconds).perform_later(record_id) if cookies[:_aj].nil?
cookies[:_aj] = true
Also, maybe create a column in your Record model, call it pending_update, and set it to true whenever you schedule a job to run and to false at the end of the scheduled job. That way, even if the user clears the cookies, your program will not create duplicate jobs.
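A rough sketch of that flag-based guard, assuming a boolean pending_update column on Record; the controller and job names are illustrative:

class RecordsController < ApplicationController
  def show
    @record = Record.find(params[:id])
    unless @record.pending_update?
      @record.update!(pending_update: true)
      UpdateRecordJob.set(wait: 60.seconds).perform_later(@record.id)
    end
  end
end

class UpdateRecordJob < ApplicationJob
  def perform(record_id)
    record = Record.find(record_id)
    # ... query the external web service and update the record ...
  ensure
    record&.update!(pending_update: false)   # clear the flag even if the update fails
  end
end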

How do I have an action happen 24 hours after a specific date in Rails?

I'm on Rails 4. I'm creating a listing/rental site where people can list things and then other people can rent them. I'm using Stripe to handle all my payments, and I have a form set up that gets the user's credit card and makes them a customer when they request to book a rental. After that, the owner of the rental can view the request and confirm or deny it. If they confirm it, the renting user gets their card charged and their money goes into holding.
When a user requests a booking, they choose a pick-up and drop-off date. I would like to have an action that calls a payout from stripe to the listings owner 24 hours after the pick up date. I am not sure how to go about this, so any suggestions are great! Of course if anyone knows of any tutorials implementing such a thing that would be awesome :).
Thanks.
A couple of things you can do:
delayed_job: requires a database and a running process to run scheduled jobs; you can use it on Heroku as shown here
resque-scheduler: requires Redis, Resque, and a running process to run scheduled jobs. You can use it on Heroku as shown here. Use resque-web and resque-cleaner to check and handle failed jobs.
whenever: requires access to cron jobs and your own script set up to run every hour or every few minutes to pick up listings that need to be processed and then process them. You'll need to work out a good error reporting system. Doesn't run on Heroku.
heroku scheduler: this is all managed through Heroku but essentially gives you the same capabilities as whenever.
Resque would probably be my choice, but you know your domain best.
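If you do go with Resque, resque-scheduler lets you enqueue a job for a specific time; a rough sketch (PayoutJob and Booking are illustrative names, not from your app):

class PayoutJob
  @queue = :payouts

  def self.perform(booking_id)
    booking = Booking.find(booking_id)
    # ... call Stripe to release the held funds to the listing's owner ...
  end
end

# When the owner confirms the booking:
Resque.enqueue_at(booking.pickup_date + 24.hours, PayoutJob, booking.id)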
Install the whenever gem; documentation is available here: https://github.com/javan/whenever
Then, in the config/schedule.rb file, reference the method you have defined in your model (or wherever fits your requirements). It behaves like a cron job, but the major difference is that it can also run functions internal to your application.
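For example, a config/schedule.rb along these lines (the runner call is illustrative) has cron invoke a method inside your application every hour:

# config/schedule.rb
every 1.hour do
  runner "Listing.process_due_payouts"   # calls a class method defined in your app
end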
There are many ways of doing this, all of which involve some sort of data store that is checked every X minutes. Within this data store you can usually set a "run at" time.
Check out:
https://github.com/collectiveidea/delayed_job
or
https://github.com/mperham/sidekiq

Background processing in Rails

A certain function in my controller takes a lot of time to process (heavy db work). So when my user clicks "submit" on the form, they have to wait for the process to complete, which is quite long. Is there any way that on submitting, the user is redirected to the next view without any delay, while the processing continues in the back end without making the user wait?
Thanks & Cheers !
When the user's request is made, queue up the job and then redirect the request where you want it.
There are two popular Ruby Gems for job processing:
Delayed Job
Resque
Delayed Job is probably the easier of the two to set up, since it does not require Redis.
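A rough sketch of the enqueue-and-redirect flow using Delayed Job's delay proxy; the controller, model, and method names are illustrative:

class ReportsController < ApplicationController
  def create
    report = Report.create!(report_params)
    report.delay.crunch_numbers   # the heavy db work runs later in the worker process
    redirect_to report, notice: "Your report is being generated."
  end
end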
For things like this, I usually dump things into a database queue, and then use a cron job to actually run it.
For instance, say I had to send out an email to all the clients using the software. I'd put the message into a database table, along with some information about who should get it, and then a cron job would actually do the sending.
It sounds to me that you need to fork the process that takes so long.
For example:
fork { "this code is being ran in background" }
The problem is that this code won't work nice with sql since the connection is not persistent. To handle this problem I've been using the spawn plugin for a while with excelent results.
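To illustrate the caveat, the forked child needs its own database connection; a minimal sketch (HeavyTask is a made-up name):

pid = fork do
  ActiveRecord::Base.establish_connection   # the child must not reuse the parent's connection
  HeavyTask.run
end
Process.detach(pid)   # don't leave a zombie process behind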

Monitor database table for external changes from within Rails application

I'm integrating some non-rails-model tables in my Rails application. Everything works out very nicely, the way I set up the model is:
class Change < ActiveRecord::Base
  establish_connection(ActiveRecord::Base.configurations["otherdb_#{RAILS_ENV}"])
  set_table_name "change"
end
This way I can use the Change model for all existing records with find etc.
Now I'd like to run some sort of notification when a record is added to the table. Since the model never gets created via Change.new and Change.save, using ActiveRecord::Observer is not an option.
Is there any way I can get some of my Rails code to be executed whenever a new record is added? I looked at delayed_job but can't quite get my head around how to set that up. I imagine it revolves around a cron job that selects all rows that were created since the job last ran and then calls the respective Rails code for each row.
Update: Currently looking at Javan's Whenever; it looks like it can solve the "run Rails code from cron" part.
Yeah, you'll either want some sort of background task processor (Delayed::Job is one of the popular ones, or you can fake your own with the Daemon library or similar) or to set up a cron job that runs on some sort of schedule. If you want to check frequently (every minute, say) I'd recommend the Delayed::Job route; if it's longer (every hour or so) a cron job will do it just fine.
Going the DJ route, you'd need to create a job that checks for new records, processes them if there are any, and then re-queues itself, as each job is marked "completed" when it's finished; see the sketch below.
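Here is one way that could look, using a custom Delayed Job payload object; the class name and interval are illustrative:

class CheckForChangesJob < Struct.new(:last_seen_id)
  def perform
    changes = Change.where("id > ?", last_seen_id).order(:id)
    changes.each do |change|
      # ... run your notification code for each new record ...
    end
    next_id = changes.last ? changes.last.id : last_seen_id
    # Re-queue ourselves so the polling continues after this job completes
    Delayed::Job.enqueue(CheckForChangesJob.new(next_id), run_at: 1.minute.from_now)
  end
end

# Kick it off once:
Delayed::Job.enqueue(CheckForChangesJob.new(0))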
This is what I finally did: use Whenever, because it integrates nicely with Capistrano and showed me how to run Rails code from within cron. My missing piece was basically
script/runner -e production 'ChangeObserver.recentchanges'
which is now run every 5 minutes. The recentchanges method reads the last looked-at ID from a tmp file, pulls all new Change records that have a higher ID than that, and runs the normal observer code for each record (and saves the highest looked-at ID back to the tmp file, of course).
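Roughly, the recentchanges method could look like this (written with current ActiveRecord query syntax; the file path is illustrative):

class ChangeObserver
  LAST_ID_FILE = "tmp/last_change_id".freeze

  def self.recentchanges
    last_id = File.exist?(LAST_ID_FILE) ? File.read(LAST_ID_FILE).to_i : 0
    changes = Change.where("id > ?", last_id).order(:id)
    changes.each do |change|
      # ... run the normal observer code for this record ...
    end
    File.write(LAST_ID_FILE, changes.last.id.to_s) if changes.any?
  end
end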
As usual with monitoring state changes, there are two approaches: polling and notification. You seem to have chosen to go the polling way for now (having a cron job look at the state of the database on a regular basis and execute some code if it has changed).
You can do the same thing using one of the Rails schedulers; there are a few out there (Google will find them readily, they have various feature sets, and I'll let you choose the one that suits your needs if you go that way).
You could also try to go the notification way, depending on your database. Some databases support both triggers and external process execution or specific notification protocols.
In this case you are notified by the database itself that the table changed. There are many such options for various DBMSs in Getting events from a database.

How to go about sending email x hours after a user signs up in Ruby on Rails?

How would I go about sending an email to a user, say, 48 hours after they sign up, in Ruby on Rails? Thanks!
As Joseph Daigle mentioned, you obviously need to record the exact date and time the user registered. After that, you need a cron job running every certain number of minutes (every hour, for example) that checks whether there are any new users whose registration time is more than 48 hours ago, sends a mail to each such user, and marks that user as already emailed so you don't email them again.
As per the actual mail sending, check out the following documentation page:
http://wiki.rubyonrails.org/rails/pages/HowToSendEmailsWithActionMailer
It has all you need to know to send mails with RoR.
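A rough sketch of the hourly check in current Rails syntax, assuming a boolean welcome_email_sent flag on User and an existing mailer; the names are illustrative:

# Run from cron every hour, e.g. via a rake task or a runner script
User.where(welcome_email_sent: false)
    .where("created_at <= ?", 48.hours.ago)
    .find_each do |user|
  UserMailer.welcome_email(user).deliver_now
  user.update!(welcome_email_sent: true)   # mark the user so we don't email them again
end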
I recommend that you use the latest version of BackgrounDRb to handle this. You can read about BackgrounDRb here: http://backgroundrb.rubyforge.org/
In order to queue a message for later delivery, the BackgrounDRb client code (in your application model's after_create callback, maybe) could look something like this:
MiddleMan(:email_worker).enq_send_email_task(:message => message,
                                             :job_key => "notify1",
                                             :scheduled_at => Time.now + 48.hours)
You'd have to build a BackgrounDRb worker to handle sending the email:
# RAILS_ROOT/lib/workers/email_worker.rb
class EmailWorker < BackgrounDRb::MetaWorker
  set_worker_name :email_worker

  def send_email_task(message)
    # ... code to send the email message
  end
end
Note that in order to use BackgrounDRb in this way, you have to use persistent job queues, so make sure you run the migration included with BackgrounDRb to set up the persistence table in your application.
BackgrounDRb is started separately from Rails (mongrel, apache, etc) using 'script/backgroundrb start', so make sure that you add the daemon to whatever process monitoring you're using (god, monit, etc) or that you create an /etc/init.d script for it.
First you're going to need a running daemon or background service which can poll your queue (probably backed by a database) every few minutes.
The algorithm is pretty simple. Record the time of the user event in the queue. When the daemon checks that item in the queue, and the time difference is greater than 48 hours, prepare the e-mail to send.
You can queue jobs with a delay using async observer. Ideally, anything you have that isn't known to be instant (or very close to it) all the time should pass through something like that.
I wrote a plugin called acts_as_scheduled that may help you out.
acts_as_scheduled allows you to manage scheduled events for your models. A good example of this is scheduling the update of RSS feeds in a background process using Cron or BackgroundRB.
With acts_as_scheduled your schedule manager can simply call "Model.find_next_scheduled()" to grab the next item from the database.
How I would approach this is by creating a scheduling controller that queries the database for the next_scheduled item and then uses a mailer to send the message. Then you set up a cron job to call the controller periodically using wget or curl. The advantage of the cron/controller approach is that no further infrastructure or configuration is required on the server, and you avoid complicated threading code.
I think I'd be inclined to store the need for the email and the earliest time after which it should be sent, somewhere separate, then have my things-to-do task look at that. That way I only have to process as many records as there are emails to be sent, rather than examine every user every time, which would either get tedious or require an otherwise probably unnecessary index. As a bonus, if I had other tasks to be performed on some sort of a diarised basis, the same construct would be useful with little modification.
