Trigger the reloading of a page after a background job - ruby-on-rails

I have a Rails 4 application that does a long-running calculation in the background. The user presses a button, which launches a background job for the long calculation (using delayed_job), and lands on a waiting page. Since the calculation runs asynchronously, I would like to find a way to notify the user and automatically reload the page once the calculation is finished.
I have looked at several solutions:
Faye or private_pub => I can't afford to run another server for this very specific job
ActionController::Live => I am not sure how to trigger the action once the job is finished
Polling => the best solution for the moment, but I hope to find something less greedy than polling
I would appreciate any smart solution to this problem, or any advice on the limitations of the ideas listed above.

I'll explain how this works for you
HTTP
When you send a request to Rails, your browser has no way to "listen" for anything other than the direct response. The HTTP request is sent, and there is no mechanism to wait until the asynchronous process is complete.
Your job is to give your app a "listener" so that your front end can receive updates as they are generated (outside the scope of HTTP), hence the WebSocket or SSE options.
We've set up what you're looking for before, using Pusher
"Live"
Achieving "live" functionality is what you need
This basically means keeping a request open "perpetually" to the server and listening for any "events" the server sends back. This is done with JS on the front end, where the client "subscribes" to a channel (typically a user-centric one) and then has all the data sent to it.
We used a third-party system called Pusher to achieve this recently:
# Gemfile
gem "pusher", "~> 0.12.0"

# app/controllers/messages_controller.rb
def send_message
  public_key = self.user.public_key
  Pusher['private-user-' + public_key].trigger('message_sent', {
    message: "Message Sent"
  })
end

# app/assets/javascripts/application.js.coffee
pusher = new Pusher("***********",
  cluster: 'eu'
)
channel = pusher.subscribe("private-user-#{gon.user}")
channel.bind "message_sent", (data) ->
  alert data.message
Hope this gives another option for you
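For the original question (reloading a waiting page once a delayed_job finishes), the same idea can be wired into the job itself. Below is a minimal sketch, assuming a hypothetical CalculationJob class, channel naming scheme and payload; it only illustrates where the Pusher trigger would go:

# app/jobs/calculation_job.rb (hypothetical delayed_job payload object)
class CalculationJob
  def initialize(user_id)
    @user_id = user_id
  end

  def perform
    result = LongCalculation.run(@user_id)   # placeholder for the long calculation
    # Tell the waiting page the work is done; the JS subscriber can then
    # reload the page or fetch the result.
    Pusher["private-user-#{@user_id}"].trigger('calculation_finished', {
      result_id: result.id
    })
  end
end

# Enqueued from the controller with delayed_job:
# Delayed::Job.enqueue(CalculationJob.new(current_user.id))

On the CoffeeScript side you would bind to 'calculation_finished' and call window.location.reload().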

Related

Twilio REST + Rails task without waiting for a response

I have a rake task that runs on a cron job and sends out a batch of messages via Twilio (using their Twilio REST client):
def self.twilio_client
  Twilio::REST::Client.new(ENV['TWILIO_ACCOUNT_ID'], ENV['TWILIO_TOKEN'])
end

scheduled_messages.each do |scheduled_message|
  MessagesSender.twilio_client.messages.create(
    :to => scheduled_message.phone_number,
    :messaging_service_sid => ENV['TWILIO_MSID'],
    :body => scheduled_message.body
  )
end
It works totally fine. However, my task waits for each response from Twilio and does not move on to the next message until the previous one has finished. Since I use Twilio's messaging service, I think I'm not taking advantage of simultaneous sending across my multiple numbers. As a result, when I have to send 10,000 messages, it takes a very long time to send them all out.
How can I speed up the process? Is it possible at all not to wait for the response? Do I need to do it in threads to utilise the messaging service?
Thanks
Twilio developer evangelist here.
You are being rate limited, even if you're using multiple numbers to send the messages through the messaging service. When you've got that many messages to send, it's always going to take a long time.
I recommend setting up a background job to process all the messages. That has a couple of benefits: you'd be able to tweak the number of workers you use to get the best possible performance and if a message fails in the middle, the remaining batch of messages will still be sent (which isn't the case with your current code).
I wrote a blog post a while back on how to set up Rails' ActiveJob for sending messages in the background. You could also use a gem like Textris which handles that for you too.
Let me know if this helps.
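A minimal sketch of that idea, assuming an ActiveJob class that sends one message per job so several queue workers can run in parallel (the class name, queue name and enqueue loop are made up for illustration):

# app/jobs/send_sms_job.rb (hypothetical)
class SendSmsJob < ActiveJob::Base
  queue_as :messages

  def perform(phone_number, body)
    client = Twilio::REST::Client.new(ENV['TWILIO_ACCOUNT_ID'], ENV['TWILIO_TOKEN'])
    client.messages.create(
      :to => phone_number,
      :messaging_service_sid => ENV['TWILIO_MSID'],
      :body => body
    )
  end
end

# The rake task then only enqueues jobs instead of waiting on each API call:
# scheduled_messages.each { |m| SendSmsJob.perform_later(m.phone_number, m.body) }

If a single send fails, only that job is retried; the rest of the batch is unaffected.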
You are rate limited based on the plan that you have with Twilio.
https://www.twilio.com/help/faq/twilio-basics/how-many-calls-and-sms-messages-per-second-can-my-twilio-account-make?baseUrl=%2Fhelp%2F
Your best bet is to set up delayed processing, for example with delayed_job, and then delay each send so that it is processed by your queue.
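With delayed_job that can be as simple as wrapping the send in .delay so each message becomes its own queued job. A rough sketch based on the code in the question (the send_single_message helper and ScheduledMessage model are assumptions):

# Each iteration now only enqueues a job; delayed_job workers make the
# actual Twilio API calls in the background.
scheduled_messages.each do |scheduled_message|
  MessagesSender.delay.send_single_message(scheduled_message.id)
end

# Hypothetical class method performed by the worker:
# def self.send_single_message(scheduled_message_id)
#   scheduled_message = ScheduledMessage.find(scheduled_message_id)
#   twilio_client.messages.create(
#     :to                    => scheduled_message.phone_number,
#     :messaging_service_sid => ENV['TWILIO_MSID'],
#     :body                  => scheduled_message.body
#   )
# end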

How to create a Rails/Ruby method similar to javascript throttle/debounce function

In our application we expose a callback route for an external service to hit. When we receive the callback, we publish an update to client-side subscribers using EventSource on the client/browser side and Cramp on the server side. Sometimes, however, we get bombarded with callback requests from this external service, which results in us publishing a crap ton of updates to the client. Is there a way on the Rails side, similar to a JavaScript debounce function, to wait a set time between received callbacks before publishing the message?
We're using sidekiq + threads already, so open to suggestions using those tools.
There is a Sidekiq-debounce gem available.
Another approach (without such a gem) is to use Rails.cache to trigger your execution only once per time window:
delay = 1.minute

Rails.cache.fetch('unique-identifier-of-the-job', expires_in: delay) do
  YourActiveJobHere.set(wait: delay).perform_later('your-action')
end

How to get a Facebook like notification system in rails

Hey, how do I make a notification system like Facebook or Diaspora in Rails?
I have tried making an activity feed, but that was not what I wanted; I want exactly the same kind of feature these websites have.
I have a simple app where there are two types of users: buyers and sellers.
I want to notify the seller whenever a buyer comments on their products.
What you are looking at here is a server push implementation. That means when some notification/action happens on the server, it should push a notification to your Rails app. The difference from #manju's answer is that it proposes a solution in which your client's browser calls the server periodically for new notifications.
There are two main ways to do this.
1 - Using third-party SaaS solutions (the easy way, but it costs money ;))
Firebase allows you to send push notifications to clients.
Pusher is another provider that offers the same kind of functionality.
Read their documentation; each of them normally has a gem you can easily integrate into your Rails app.
2 - Implement your own push server
You can implement your own push server with Rails and integrate it into your app.
Faye is one option.
But more exciting is that Rails 5 will have Action Cable, which tries to solve the same issue (see the action cable gem).
There are also articles showing Action Cable with Rails 4 apps (you don't have to wait until Rails 5 comes out), but I haven't used it personally yet.
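For reference, a bare-bones Action Cable sketch of the buyer-comment/seller-notification flow might look like the following. The channel, stream names and models are assumptions, and current_user is assumed to be set up in the cable connection:

# app/channels/notifications_channel.rb (hypothetical)
class NotificationsChannel < ApplicationCable::Channel
  def subscribed
    # Each seller listens on their own stream.
    stream_from "notifications_#{current_user.id}"
  end
end

# Wherever a buyer's comment is created (e.g. an after_create callback or
# the comments controller), broadcast to the product's seller:
ActionCable.server.broadcast(
  "notifications_#{comment.product.seller_id}",
  message: "New comment on #{comment.product.name}"
)

The JS client subscribes to NotificationsChannel and appends incoming messages to the notification area.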
HTH
Facebook does it using comet techniques.
Here are some of the helpful links
Link1
Link2
Link3
Here is the theory of how Facebook does it.
Facebook works by polling the server for any changes.
The Facebook page makes an AJAX request to the server, and that AJAX request has a long timeout.
On the server side, the API constantly polls the DB to see whether anything has changed, by repeatedly checking the activity log table in the database. If a change is found it returns the result; until then it keeps polling the DB.
Once the AJAX request completes, the client recursively tries again.
Here is a client-side code snippet:
function doPoll() {
  $.get("events.php", {}, function(result) {
    $.each(result.events, function(event) { // iterate over the events
      // do something with your event
    });
    doPoll(); // this effectively causes the poll to run again as
              // soon as the response comes back
  }, 'json');
}

$(document).ready(function() {
  $.ajaxSetup({
    timeout: 1000 * 60 // set a global AJAX timeout of a minute
  });
  doPoll(); // do the first poll
});
Here is a server-side code snippet:

while (!has_event_happened()) {
  sleep(5);
}

echo json_encode(get_events());
You can find this explained in much more detail here.
You can adapt this approach to your needs.
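A rough Rails translation of the server-side loop above would be a long-polling controller action. The Activity model, the since parameter and the timings below are assumptions; note that this ties up one server worker per waiting client, which is why the push-based answer above scales better:

# app/controllers/events_controller.rb (hypothetical long-polling endpoint)
class EventsController < ApplicationController
  def index
    since = Time.at(params[:since].to_i)
    30.times do                                          # hold the request open for up to ~2.5 minutes
      events = Activity.where('created_at > ?', since)   # assumed activity-log model
      return render json: { events: events } if events.any?
      sleep 5                                            # then check the DB again
    end
    render json: { events: [] }                          # time out with an empty result
  end
end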

How to dynamically and efficiently pull information from database (notifications) in Rails

I am working in a Rails application and below is the scenario requiring a solution.
I'm doing some time-consuming processes in the background using Sidekiq and saving the related information in the database. Now, when each of the processes completes, we would like to show a notification in a separate area saying that the process has been completed.
So, the notifications area really needs to pull things from the back end (this notification area will be available on every page) and show them dynamically. So, I thought Ajax must be an option. But I don't know how to trigger it for a particular area only. Or is there any other option by which the client can fetch dynamic content from the server efficiently without creating much traffic?
I know it would be a broad topic to say about. But any relevant info would be greatly appreciated. Thanks :)
You're looking at a perpetual connection (either using SSEs or WebSockets), something Rails has started to address with ActionController::Live
Live
You're looking for "live" connectivity:
"Live" functionality works by keeping a connection open
between your app and the server. Rails is an HTTP request-based
framework, meaning it only sends responses to requests. The way to
send live data is to keep the response open (using a perpetual connection), which allows you to send updated data to your page on its
own timescale
The way to do this is to use a front-end method to keep the connection "live", and a back-end stack to serve the updates. The front end will need either SSEs or a WebSocket, which you'll connect to with JS.
SSEs and WebSockets basically give you access to the server outside the scope of "normal" requests (SSEs use the text/event-stream MIME type).
Recommendation
We use a service called Pusher.
This basically provides a third-party WebSocket service to which you can push updates. Once the service receives an update, it sends it to any channels that are connected to it. You can split the channels it broadcasts to using the pub/sub pattern.
I'd recommend using this service directly (they have a Rails gem and a super simple API; I'm not affiliated with them).
Other than that, you should look at the ActionController::Live functionality of Rails
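If you go the ActionController::Live route instead, a minimal SSE sketch looks roughly like the following. The Notification model, the unread scope and the polling interval are assumptions, and you need a threaded/concurrent server such as Puma to keep the streams open:

# app/controllers/notifications_controller.rb (hypothetical SSE endpoint)
class NotificationsController < ApplicationController
  include ActionController::Live

  def stream
    response.headers['Content-Type'] = 'text/event-stream'
    sse = SSE.new(response.stream, event: 'notification')
    loop do
      notifications = Notification.where(user: current_user, unread: true)
      sse.write(notifications.to_json) if notifications.any?
      sleep 2   # check for new notifications every couple of seconds
    end
  rescue IOError
    # client closed the page
  ensure
    sse.close if sse
  end
end

The notifications area then uses an EventSource in JS to listen for 'notification' events and update itself.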
The answer suggested in the comment by #h0lyalg0rithm is one option to go with.
However, more primitive options are:
Use setInterval in JavaScript to perform a task every x seconds, i.e. polling.
Use jQuery or plain AJAX to poll a controller/action via a route and have the controller return the data as JSON (a minimal controller sketch follows this list).
Use document.getElementById or jQuery to update data on the page.
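On the Rails side, the polling endpoint from the second point can be a very small controller action that renders JSON. The route, model and unread scope below are assumptions:

# config/routes.rb
# get 'notifications/poll', to: 'notifications#poll'

# app/controllers/notifications_controller.rb (hypothetical polling variant)
class NotificationsController < ApplicationController
  def poll
    notifications = Notification.where(user: current_user, unread: true)
    render json: { count: notifications.count, notifications: notifications.as_json }
  end
end

The setInterval callback calls this endpoint every x seconds and updates the notifications area from the JSON response.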

WebSockets server that will complete the job after the connection is made - Ruby, Rails

I want to use something like EventMachine websockets to push status updates to the client as they happen.
My application crawls around a section of a website, screen-scraping the relevant details of a user's search. I want to push any screen-scraping captures to the client as they happen. I also want to persist these changes to the database, and I want the job to complete even if the user closes the browser.
At the moment, the job is initiated from the client (browser) and the job is placed on a resque queue that completes the job. The client polls the database and displays the results.
I want to have a play around with websockets but I don't think I can get the same behaviour. It is more important that the results are persisted and the job completes than the real time pushes.
Am I wrong in the assumption that this cannot be done?
Have you looked at Faye? See Messaging with Faye (RailsCasts). You can keep using the Resque queue to get the job completed, and push a message to the subscriber (your web client) as and when you find the results.
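Following the RailsCasts approach, the Resque job can publish to a Faye server over plain HTTP each time a capture is scraped and saved, so persistence and job completion do not depend on the browser staying open. A minimal sketch, assuming a Faye server mounted at http://localhost:9292/faye and a per-search channel name:

require 'net/http'
require 'json'

# Hypothetical helper called from inside the Resque job after each capture
# is persisted; the browser subscribes to the same channel with Faye's JS client.
def publish_to_faye(channel, data)
  message = { channel: channel, data: data }
  uri     = URI.parse("http://localhost:9292/faye")
  Net::HTTP.post_form(uri, message: message.to_json)
end

# e.g. publish_to_faye("/searches/#{search.id}", capture.as_json)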
