I use Rails with ActiveJob and Sidekiq as the backend. When a user visits a page, Sidekiq creates a long-running background task. How can I notify the user (by rendering a partial on the page) when the task completes?
Rails and Sidekiq run as separate processes. This fact confuses me: I don't understand how to detect the completed status of a background job.
ActiveJob provides an after_perform callback which, according to the docs, works like this:
class VideoProcessJob < ActiveJob::Base
  queue_as :default

  after_perform do |job|
    UserMailer.notify_video_processed(job.arguments.first)
  end

  def perform(video_id)
    Video.find(video_id).process
  end
end
So you don't have to worry about integrating directly with Sidekiq or any other queuing backend; talk to ActiveJob. :)
My approach in this situation is:
Add sidekiq-status so that background jobs can be tracked by ID.
In the call that creates the background job, return the newly created job's ID to the client.
class MyController < ApplicationController
  def create
    # sidekiq-status lets us retrieve a unique job ID when
    # creating a job
    job_id = Workers::MyJob.perform_async(...)
    # tell the client where to find the progress of this job
    render :json => {
      :next => "/my/progress?job_id=#{job_id}"
    }
  end
end
Have the client poll a 'progress' endpoint on the server with that job ID. The endpoint fetches progress information for the job and returns it to the client.
class MyController < ApplicationController
  def progress
    # fetch job status from sidekiq-status
    status = Sidekiq::Status::get_all(params[:job_id])
    # in practice, status can be nil if the info has expired from
    # Redis; I'm ignoring that for the purpose of this example
    if status["status"] == "complete"
      # job is complete; notify the client in some way,
      # perhaps by sending it a rendered partial
      payload = {
        :html => render_to_string({
          :partial => "my/job_finished",
          :layout => nil
        })
      }
    else
      # tell the client to check back again later
      payload = {:next => "/my/progress?job_id=#{params[:job_id]}"}
    end
    render :json => payload
  end
end
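As an aside on the nil case flagged in the comments above, the branching can be pulled into a small pure helper. This is a hypothetical sketch: it assumes sidekiq-status's get_all returns a hash whose "status" key holds values such as "working" or "complete", and nil once the info has expired from Redis; the route is the illustrative one used above.

```ruby
# Hypothetical helper: turn a sidekiq-status hash into the JSON
# payload the progress endpoint renders. Handles the nil
# (expired-from-Redis) case that the controller example ignores.
def progress_payload(status, job_id)
  if status.nil?
    # job info has expired from Redis (or the ID was never valid)
    { :error => "unknown or expired job" }
  elsif status["status"] == "complete"
    { :complete => true }
  else
    # tell the client to check back again later
    { :next => "/my/progress?job_id=#{job_id}" }
  end
end

progress_payload(nil, "abc")                        # => {:error=>"unknown or expired job"}
progress_payload({ "status" => "working" }, "abc")  # => {:next=>"/my/progress?job_id=abc"}
```

In the controller, the returned hash would be handed straight to render :json.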
If the client sees that the job has completed, it can then display a message or take whatever next step is required.
var getProgress = function(progress_url, poll_interval) {
  $.get(progress_url).done(function(progress) {
    if (progress.html) {
      // job is complete; show HTML returned by the server
      $('#my-container').html(progress.html);
    } else {
      // job is not yet complete; try again later at the URL
      // provided by the server
      setTimeout(function() {
        getProgress(progress.next, poll_interval);
      }, poll_interval);
    }
  });
};

$("#my-button").on('click', function(e) {
  $.post("/my").done(function(data) {
    getProgress(data.next, 5000);
  });
  e.preventDefault();
});
Caveat emptor: that code is meant to be illustrative and omits things you should take care of, such as error handling, preventing duplicate submissions, and so forth.
Related
In my Rails 6 API-only app I've got a FetchAllProductsWorker background job which takes around 1h30m.
module Imports
  class FetchAllProductsWorker
    include Sidekiq::Worker

    sidekiq_options queue: 'imports_fetch_all'

    def perform
      # do some things
    end
  end
end
During this time the frontend app sends requests to an endpoint on the BE which checks whether the job is still running, and I need to return true/false for that process. According to the docs there is a scan method - https://github.com/mperham/sidekiq/wiki/API#scan - but none of these calls works for me, even when the worker is up and running:
# endpoint method to check sidekiq status
def status
  ss = Sidekiq::ScheduledSet.new
  render json: ss.scan('FetchAllProductsWorker') { |job| job.present? }
end
The console shows me:
> ss.scan("\"class\":\"FetchAllProductsWorker\"") {|job| job }
=> nil
> ss.scan("FetchAllProductsWorker") { |job| job }
=> nil
How can I check whether a particular Sidekiq job has not finished yet?
Maybe this will be useful for someone. Sidekiq provides programmatic access to the currently active workers via Sidekiq::Workers: https://github.com/mperham/sidekiq/wiki/API#workers
So based on that we could do something like:
active_workers = Sidekiq::Workers.new.map do |_process_id, _thread_id, work|
  work
end

active_workers.select do |worker|
  worker['queue'] == 'imports_fetch_all'
end.present?
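The same check can be phrased as a pure helper over the work hashes that Sidekiq::Workers yields, which makes it easy to exercise without a live Sidekiq process. The helper name and the sample data below are illustrative; each work hash does carry a 'queue' key.

```ruby
# Given the `work` hashes yielded by Sidekiq::Workers.new
# (each includes a 'queue' key), report whether any job on
# the named queue is currently being worked on.
def queue_busy?(works, queue_name)
  works.any? { |work| work['queue'] == queue_name }
end

# Sample data in the shape Sidekiq yields:
works = [
  { 'queue' => 'default',           'run_at' => 1_700_000_000 },
  { 'queue' => 'imports_fetch_all', 'run_at' => 1_700_000_100 }
]

queue_busy?(works, 'imports_fetch_all')  # => true
queue_busy?(works, 'mailers')            # => false
```

In the endpoint, the active_workers array from the snippet above would be passed in as works.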
I am building a web service that accepts a string, parses it, and returns the result as JSON. In my controller, I call an ActiveJob to run a service that parses the data. I would like to return the results of the ActiveJob to my controller so I can render them as JSON, but I am not sure how. What is the simplest way to do this?
require 'ruby_postal/parser'

class GeocoderService
  def initialize(address)
    @address = address
  end

  def parse_address
    address_group = {}
    result = Postal::Parser.parse_address(@address)
    result.each do |r|
      address_group[r.values[0]] = r.values[1]
    end
    address_group
  end
end
class ParseAddressJob < ApplicationJob
  queue_as :default

  def perform(address)
    geo = GeocoderService.new(address)
    result = geo.parse_address
  end
end
class LocationsController < ApplicationController
  def create
    if geo_params[:address]
      ParseAddressJob.perform_later(geo_params[:address])
      render json: result
    else
      render json: { error: "Invalid address" }, status: 400
    end
  end

  private

  def geo_params
    params.require(:geo).permit(:address)
  end
end
ActiveJob is used to create tasks that run in the background, asynchronously from the HTTP request/response flow. The controller hands the task off to ActiveJob and returns immediately, while the job runs at some later point. If you need the output of the job right away because the user is waiting for it, then you shouldn't be using a job at all: call the code the job would call directly, block until it finishes, and return the output.
def create
  if geo_params[:address]
    geo = GeocoderService.new(geo_params[:address])
    result = geo.parse_address
    render json: result
  else
    render json: { error: "Invalid address" }, status: 400
  end
end
If you're really concerned about the blocking your controller action does while it waits on the response from the geolocation API, you can build a queueing system of your own for your API consumers. The flow looks something like this:
A user makes a request to your endpoint
Your endpoint inserts a record into a database table called GeoResults with a status of 'processing' and empty response text, and gets the ID of this record.
Your endpoint fires off the job as you're doing now, but now you also pass in the ID of the GeoResults record you created.
Your endpoint gives the user a URL to check this record in GeoResults.
Your consumer starts to poll this endpoint until they see the status of 'complete'
When your background job is completed, it updates its record in GeoResults (since it has the ID) with a status of 'complete', and assigns the geolocation response text.
Your consumer sees the update, and grabs the response.
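The numbered flow above can be sketched in plain Ruby. This is a toy in-memory stand-in for the GeoResults table, not real code from the question; in a Rails app GeoResult would be an ActiveRecord model and the update would run inside the background job:

```ruby
# In-memory stand-in for the GeoResults table described above.
class GeoResult
  attr_accessor :id, :status, :response_text

  @@records = {}
  @@next_id = 0

  # Step 2: insert a 'processing' record and hand back its ID.
  def self.create_processing
    record = new
    record.id = (@@next_id += 1)
    record.status = 'processing'
    record.response_text = ''
    @@records[record.id] = record
    record
  end

  def self.find(id)
    @@records[id]
  end
end

record = GeoResult.create_processing

# Step 5: the consumer polls and still sees 'processing'.
GeoResult.find(record.id).status  # => "processing"

# Step 6: the background job finishes and updates its record.
finished = GeoResult.find(record.id)
finished.status = 'complete'
finished.response_text = '{"lat": 1.23, "lng": 4.56}'

# Step 7: the consumer's next poll grabs the response.
GeoResult.find(record.id).status  # => "complete"
```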
A user can upload multiple files in my system via a web interface that is handled by a controller action.
def import
  paths = params[:files].map(&:path) if params[:files].present?
  if paths
    @results = MyRecord.create_from paths
  end
  redirect_to confirmation_my_records_path
end
class MyRecord < ActiveRecord::Base
  def self.create_from(paths)
    paths.each do |path|
      MyRecordWorker.perform_async path
    end
  end
end
workers/my_record_worker.rb
class MyRecordWorker
  include Sidekiq::Worker

  def perform(path)
    # each run does no I/O, just expensive math calculations
    # that take minutes
    collection = ExpensiveOperation.new(path).run
    if collection && collection.any?
      save_records(collection)
    else
      []
    end
  end
end
Because the Sidekiq jobs run in the background, how do I notify the user through the confirmation page that the jobs are done? We don't know when a job will finish, so the Rails request/response cycle cannot determine anything at the moment the confirmation page loads. Should the confirmation page just say "Processing..." and then have Faye update the page when the jobs finish?
Add a RecordJob model to your application that stores who uploaded the records, the number of records, the upload time, and so on.
Create a new record_job each time someone uploads records, and pass its id to the worker. The worker can then update that instance to indicate its progress.
Meanwhile, the application can query the existing record_jobs and show their progress. That can happen whenever the user reloads the page, by active polling, or over web sockets (depending on your needs).
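A bare-bones sketch of that progress tracking, with a plain-Ruby class standing in for the suggested RecordJob model (the method names here are hypothetical):

```ruby
# Plain-Ruby stand-in for the suggested RecordJob model.
class RecordJob
  attr_reader :total, :processed

  def initialize(total)
    @total = total
    @processed = 0
  end

  # The worker calls this after finishing each record.
  def increment!
    @processed += 1
  end

  def progress_percent
    (100.0 * @processed / @total).round
  end

  def done?
    @processed >= @total
  end
end

job = RecordJob.new(4)
2.times { job.increment! }
job.progress_percent  # => 50
2.times { job.increment! }
job.done?             # => true
```

The confirmation page (or a Faye/ActionCable push) would read progress_percent and done? to decide what to show.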
Use ActionCable. Here is an example with a Report channel.
In routes.rb, add the following line:
mount ActionCable.server => '/cable'
Create report_channel.rb inside app/channels with the following code:
class ReportChannel < ApplicationCable::Channel
  def subscribed
    stream_from "report_#{current_user.id}"
  end
end
Create reports.coffee inside app/assets/javascripts/channels with the following code:
window.App.Channels ||= {}
window.App.Channels.Report ||= {}

window.App.Channels.Report.subscribe = ->
  App.report = App.cable.subscriptions.create "ReportChannel",
    connected: ->
      console.log('connected')
    disconnected: ->
      console.log('disconnected')
    received: (data) ->
      console.log('received')
      data = $.parseJSON(data)
      # write your code here to update or notify the user
Then add the following line at the end of your job:
ActionCable.server.broadcast("report_#{current_user_id}", data_in_json_format)
Replace report with your specific model.
Seems simple, right? :)
I use Rails 4.2.5 and Sidekiq for background processing.
There is an API which an application can call.
I now have this code:
def start_item(name, init_query)
  job_id = AzureBufferBase.delay.execute_in_transaction(name, init_query)
  job_id
end
I get a job_id back like this: ef95bdd9cf5da0ef1273db6c
Now I want to expose this status through the API:
module Api
  class BackgroundJobsController < BaseApiController
    def show
      result = Sidekiq::Status(params[:id])
      render json: { 'status' => result.to_json }, status: 200
    end
  end
end
Sidekiq::Status: this doesn't work, but my question is: how can I get a job's status (queued, in progress, completed, ...) out of Active Job?
It seems like you're looking for the Active Job Status gem.
Given the following Delayed Job worker:
class UserCommentsListWorker
  attr_accessor :opts

  def initialize opts = {}
    @opts = opts
  end

  def perform
    UserCommentsList.new(@opts)
  end

  def before job
    p 'before hook', job
  end

  def after job
    p 'after hook', job
  end

  def success job
    p 'success hook', job
  end

  def error job, exception
    p '4', exception
  end

  def failure job
    p '5', job
  end

  def enqueue job
    p '-1', job
  end
end
When I run Delayed::Job.enqueue UserCommentsListWorker.new(client: client) from a Rails console, I see the expected sequence of print statements and the proper Delayed Job lifecycle hooks fire, including the feedback from the worker that the job was a success.
But when I trigger the same worker via a standard Rails controller endpoint like this:
include OctoHelper
include QueryHelper
include ObjHelper
include StructuralHelper

class CommentsController < ApplicationController
  before_filter :authenticate_user!

  def index
    if params['updateCache'] == 'true'
      client = build_octoclient current_user.octo_token
      Delayed::Job.enqueue UserCommentsListWorker.new(client: client)
    end
  end
end
I'm noticing that the worker runs and creates the delayed job, but none of the hooks get called and the worker never logs the job as completed.
Notice the screenshot,
Jobs 73, 75, and 76 were all triggered via round trips to the endpoint referenced above, while job 74 was triggered via the Rails console. What is wrong with my setup, and/or what am I failing to notice here? I will stress that the first time the web server hits the controller endpoint, the job queues and runs properly, but in all subsequent instances where the job should run it appears to do nothing and gives me no feedback.
I would also highlight that I'm never seeing the failure, error, or enqueue hooks run.
Thanks :)
The long and the short of it: notice that I was attempting to store a client object in the delayed job, which was causing the problems. Don't store complex objects in the job; work only with basic data: ids (1), strings ('foo'), booleans (true), and so on. Capisce?
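To make that concrete: Delayed::Job stores the worker object serialized (as YAML) in its handler column, so anything the worker holds must survive a dump/load round trip. Plain ids do; a live API client does not. The sketch below runs outside Rails, and the commented-out client rebuild uses the question's hypothetical helpers:

```ruby
require 'yaml'

class UserCommentsListWorker
  def initialize(opts = {})
    @opts = opts
  end

  def perform
    # Rebuild anything heavy (API clients, sockets) *inside* the job
    # from the ids passed in, e.g. (hypothetical helpers):
    #   client = build_octoclient(User.find(@opts[:user_id]).octo_token)
    @opts[:user_id]
  end
end

worker  = UserCommentsListWorker.new(user_id: 42)
payload = YAML.dump(worker)  # roughly what Delayed::Job persists

# Psych 4 (Ruby 3.1+) needs unsafe_load to revive arbitrary classes
restored = YAML.respond_to?(:unsafe_load) ? YAML.unsafe_load(payload) : YAML.load(payload)
restored.perform  # => 42
```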