Hi, I have developed a Rails app.
When one controller receives a request, it renders a client view AND sends another action to an LED ticker display via TCP/IP. But sending to an LED ticker display takes about 3 seconds, and I might have 5-10 tickers to send to.
This blocks the client view rendering. (I can use multiple threads to send to each LED ticker display, but I still have to wait 3-5 seconds when the threads are joined.)
Question:
The client view doesn't depend on whether sending to the LEDs succeeds or fails.
Can I make the sending an async job? How?
Should I make a Sinatra background process that listens for these requests and sends them to the LEDs?
Thanks!
The spawn plugin from https://github.com/tra/spawn should do nicely; it can use forking (the default), threads, or yields.
I use spawn with fork for long-running, fairly heavy tasks and it works like a charm. A simple example would be:
spawn(:method => :fork) do
  do_led_stuff()
end
And since you don't require any feedback from the LED ticker, you won't have to wait() for the spawned process either.
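If you'd rather not pull in a plugin, the core of what spawn(:method => :fork) does can be sketched with plain Ruby. This is only a sketch; send_to_ticker is a hypothetical stand-in for your TCP write:

```ruby
# Hypothetical stand-in for the slow TCP write to one ticker.
def send_to_ticker(host)
  sleep 0.1  # simulates the ~3 second network delay
end

# Fork a child to talk to the tickers; the parent (the Rails request)
# continues immediately instead of waiting 3-5 seconds.
pid = Process.fork do
  %w[ticker-1 ticker-2].each { |host| send_to_ticker(host) }
end

watcher = Process.detach(pid)  # reaps the child so it never becomes a zombie
```

Process.detach means you never wait on the child; it just gets cleaned up when it finishes.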
Have you tried delayed_job (http://rubygems.org/gems/delayed_job)? I don't know whether it's compatible with Sinatra, but it may be worth a look.
You can use Resque (https://github.com/defunkt/resque)
Related
I am creating a simulator in rails for an upcoming product. We want to model how a device that transmits data will behave, so I need to simulate the creation of multiple objects at different times. Basically, I want a method that does this:
def simulate_scenario_a
  create_data_packet_a  # instantly
  create_data_packet_b  # after a 5 minute delay
  create_data_packet_c  # after 10 minutes
end
These all need to be sent to the front-end API as soon as they are created, but I am not sure of a suitable way to delay them. All my delays so far block the main method simulate_scenario_a from completing, so the packets all end up fired at the same time. Should I use the delayed_job gem? Advice needed.
Consider one of the background job gems; I prefer sidekiq.
Using Sidekiq's API (delay_for comes from Sidekiq's delayed extensions), you can:
def simulate_scenario_a
  DataPacket.create!(params)
  DataPacket.delay_for(5.minutes).create!(params)
  DataPacket.delay_for(10.minutes).create!(params)
end
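If the simulator doesn't need durability, the same shape can be mocked with plain threads. The key point is that simulate_scenario_a returns immediately while the delayed packets fire on their own later; the delays and packet names below are shortened stand-ins for illustration:

```ruby
# Run a block after a delay without blocking the caller.
def schedule(delay_seconds, &work)
  Thread.new do
    sleep delay_seconds
    work.call
  end
end

packets = Queue.new  # thread-safe stand-in for "send to the front-end API"

def simulate_scenario_a(packets)
  packets << :packet_a  # instantly
  timers = []
  timers << schedule(0.1) { packets << :packet_b }  # would be 5.minutes
  timers << schedule(0.2) { packets << :packet_c }  # would be 10.minutes
  timers  # the method returns right away; the timers fire on their own
end
```

A real background-job gem gives you the same non-blocking behavior plus persistence across restarts, which bare threads don't.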
I want to create an API endpoint in my app that takes data and performs a complex, time-consuming operation with it, but only after returning that the data has been received.
As in, I'd love to be able to do something like this:
def action
  render json: { data_received: params[:whatever].present? }
  perform_calculations(params[:whatever])
end
Unfortunately, as I understand it, Ruby/Rails is synchronous, and requires that a controller action end in a render/redirect/head statement of some sort.
Ordinarily, I'd think of accomplishing this with a background worker, like so:
def action
  DelayedJobActionPerformer.perform_later(params[:whatever])
  render json: { data_received: params[:whatever].present? }
end
But a worker costs (on Heroku) a fair amount of monthly money for a beginning app like this, and I'm looking for alternatives. Is there any alternative to background workers you can think of to return from the action and then perform the behavior?
I'm thinking of maybe creating a separate Node app or something that can start an action and then respond, but that's feeling ridiculous. I guess the architecture in my mind would involve a main Rails app which performs most of the behavior, and a lightweight Node app that acts as the API endpoint, which can receive a request, respond that it's been received, and then send on the data to be performed by that first Rails app, or another. But it feels excessive, and also like just kicking the problem down the road.
At any rate, whether or not I end up having to buy a worker or few, I'd love to know if this sort of thing is feasible, and whether using an external API as a quasi-worker makes sense (particularly given the general movement towards breaking up application concerns).
Not really...
Well you can spawn a new thread:
thread = Thread.new { perform_calculations( params[:whatever] ) }
And not call thread.join, but that is highly unreliable, because the thread will be killed if the main thread terminates first.
I don't know how things with cron jobs are in Heroku, but another option is to have a table with pending jobs where you save params[:whatever] and have a rake task that is triggered with cron periodically to check and perform any pending tasks. This solution is a (really) basic worker implementation.
Have you heard about sucker_punch? You could give it a try. It runs jobs inside the single web process; the downside is that if the web process is restarted while jobs are still queued, those jobs are lost. So it's not recommended for critical background tasks.
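The idea behind sucker_punch (jobs run on an in-process thread pool, no extra worker dyno) can be stripped down to stdlib. This is a sketch, not sucker_punch's actual API; heavy_calculation is a hypothetical stand-in for the expensive work:

```ruby
JOB_QUEUE = Queue.new
CALC_RESULTS = Queue.new

# One in-process worker thread draining the queue: a one-thread "pool".
WORKER = Thread.new do
  while (job = JOB_QUEUE.pop)
    job.call
  end
end

# Hypothetical stand-in for the expensive work.
def heavy_calculation(data)
  CALC_RESULTS << data.upcase
end

# The controller action: enqueue the job, then respond immediately.
def action(data)
  JOB_QUEUE << -> { heavy_calculation(data) }
  { data_received: !data.nil? }  # returned without waiting for the job
end

response = action("whatever")
```

As with sucker_punch itself, anything still sitting in JOB_QUEUE when the process restarts is gone.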
I have a Rails 3 application that lets a user perform a search against a 3rd party database via an API. It could potentially bring back quite a bit of XML data. Also, the API could be busy serving requests for other users and have a nontrivial delay in its response time.
I only run 2 webservers so I can't afford to have a delayed request obviously. I use Sidekiq to process long-running jobs, but in all the cases I've needed that I haven't had to return a value to the screen.
I also use Pusher to communicate back to the user when a background job is finished. I am checking it out, but I don't know if it can be used for the kind of data I want to push to the screen. Right now it just pops up dialog boxes with messages that I send it.
I have thought of some pretty kooky stuff, like running the request via Sidekiq, sending the results to a session object or file, then using Pusher to kick off some kind of event to grab the data and populate the screen with it. Seems kind of Rube Goldberg-ish.
I appreciate any help or insight anyone can offer into the problem!
I had a similar situation not long ago, and the way I fixed it was with memcache and threads.
I also thought about using Sidekiq, but Sidekiq is ideal when you don't expect to use the data right away; memcache and threads worked pretty well and gave us a good amount of control.
Instead of calling the API directly, I would hand the API request to a thread, and that thread, once done, would write to memcache. In my case this can happen incrementally, with the same endpoint able to return more data until the result is complete.
From the UI I would have a basic AJAX polling mechanism that hits a controller and checks memcache for the data and for a status flag; the status tells the UI whether it needs to keep polling for more data.
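A stripped-down sketch of that pattern, with a mutex-guarded Hash standing in for memcached and the API delay shortened for illustration:

```ruby
CACHE = {}  # stand-in for memcached
CACHE_LOCK = Mutex.new

# Kick off the slow API call in a thread; it writes results incrementally.
def start_search(key)
  Thread.new do
    3.times do |i|
      sleep 0.05  # simulates waiting on the slow third-party API
      CACHE_LOCK.synchronize { (CACHE[key] ||= []) << "chunk-#{i}" }
    end
    CACHE_LOCK.synchronize { CACHE["#{key}:status"] = "complete" }
  end
end

# What the polling controller action returns to the AJAX caller.
def poll(key)
  CACHE_LOCK.synchronize do
    { data: (CACHE[key] || []).dup,
      status: CACHE["#{key}:status"] || "pending" }
  end
end
```

The UI keeps calling poll until status flips to "complete", rendering whatever partial data has arrived so far.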
I'm building a web game in Ruby on Rails that relies on a choose-your-own-adventure mechanic coupled with a time-waiting system (a la Zynga and the come back in 15m, 30m, 1hr, etc concept).
However, I need a game loop running in the background that constantly checks whether the "quests" players are waiting on are ready and, if so, pings the user (email/smartphone push notification/whatever they want). I obviously need it to do more than just this, but this is the core functionality.
I don't want to throw this into a Rails controller because I don't need the game logic running on every single page view or for it to be hammered when tons of users are on, rather I just need a loop to run continuously (at a set interval) and handle all of the small tasks that will be necessary to run the backend of a multiplayer game.
What language/technique is best for this, or do I even need to leave my Ruby/Rails foundation at all?
EDIT: This game does not feature a "persistent" world and has no real need of persistent connections with clients. The game is spread out over many pages and it will feature some asynchronous functionality (a news 'ticker' at the top that has updates pushed to it, etc).
Sounds like you're looking for a background worker of some sort. Heroku supports a scheduler that you can set to run every ten minutes; documentation is here: http://devcenter.heroku.com/articles/scheduler?preview=1
Otherwise, a system like Resque ( https://github.com/defunkt/resque ) or DelayedJob ( https://github.com/tobi/delayed_job ) would be good plugins for handling periodic introspection without tying up your controllers.
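Whichever scheduler drives it, each tick of the loop can be a small piece of Ruby. This hypothetical sketch checks quests and records notifications (the quest Hashes and the notifications list are stand-ins for your models and email/push calls):

```ruby
# One tick of the game loop: notify players whose quests have become ready.
def tick(quests, notifications, now: Time.now)
  quests.each do |quest|
    next if quest[:notified] || quest[:ready_at] > now
    notifications << quest[:player_id]  # stand-in for email/push/etc.
    quest[:notified] = true
  end
end

# A scheduler (cron, Heroku Scheduler, a clock process) just calls tick
# at a fixed interval, e.g.:
#   loop { tick(quests, notifications); sleep 600 }
```

Because each tick is cheap (one query for due quests), running it every few minutes stays far lighter than putting the logic in a controller hit on every page view.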
Check out either PusherApp or PrivatePub
My idea is that the player will conduct a movement, and then broadcast out (via pusher or private pub) to the other players that it is their turn.
Check on page load, or via a JS request, whether the event has already finished. For example, when you show a page that tells the user how long is left, update the countdown with JS; when it reaches 0.0s, refresh the page or use AJAX to show that the process is finished. If the user leaves the page and returns later, you will know that the process has finished.
Hi,
I'm going to set up a Rails website where, after some initial user input, some heavy calculations are done (via a C extension to Ruby; it will use multithreading). As these calculations will consume almost all of the CPU time (and memory too), there should never be more than one calculation running at a time. Also, I can't use (asynchronous) background jobs (as with delayed_job), because Rails has to show the results of the calculation and the site should work without JavaScript.
So I suppose I need a separate process where all Rails instances queue their calculation requests and wait for the answer (maybe an error message if the queue is full), a kind of synchronous job manager.
Does anyone know if there is a gem/plugin with such functionality?
(Nanite seemed pretty cool to me, but it appears to be asynchronous only, so the Rails instances would not know when the calculation is finished. Is that correct?)
Another idea is to write my own using Distributed Ruby (DRb), but why reinvent the wheel if it already exists?
Any help would be appreciated!
EDIT:
Thanks to zaius's tips, I think I will be able to do this asynchronously, so I'm going to try Resque.
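For reference, the DRb idea from the question can be sketched entirely with the stdlib: a single mutex-guarded manager object serializes the calculations, and each Rails instance blocks synchronously for its answer. The calculate body is a hypothetical stand-in for the C extension call:

```ruby
require 'drb/drb'

# A tiny synchronous job manager: one calculation at a time, callers block.
class CalculationManager
  def initialize
    @lock = Mutex.new
  end

  def calculate(n)
    @lock.synchronize do
      n * n  # stand-in for the CPU-heavy C extension call
    end
  end
end

# The manager runs as its own process; port 0 picks a free port.
DRb.start_service('druby://localhost:0', CalculationManager.new)

# Each Rails instance connects and waits synchronously for the result:
manager = DRbObject.new_with_uri(DRb.uri)
answer  = manager.calculate(12)
DRb.stop_service
```

In a real deployment the manager would be a standalone daemon with a fixed URI, and you would add a bounded queue so overloaded requests get the "queue is full" error instead of piling up.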
Ruby has mutexes / semaphores.
http://www.ruby-doc.org/core/classes/Mutex.html
You can use a semaphore to make sure only one resource intensive process is happening at the same time.
http://en.wikipedia.org/wiki/Mutex
http://en.wikipedia.org/wiki/Semaphore_(programming)
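In Ruby that looks like this: the shared mutex guarantees at most one heavy calculation runs at a time, and other request threads block until it is free (heavy_calculation is a hypothetical stand-in):

```ruby
CALC_LOCK = Mutex.new  # shared across all request threads in the process

def heavy_calculation(n)
  CALC_LOCK.synchronize do
    # Only one thread is ever inside this block at a time.
    sleep 0.01  # stand-in for the CPU-heavy work
    n * n
  end
end

results = 4.times.map { |i| Thread.new { heavy_calculation(i) } }.map(&:value)
```

Note that a Mutex only serializes threads within a single process; separate Rails worker processes would still need an external manager to coordinate.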
However, the idea of blocking a front end process while other tasks finish doesn't seem right to me. If I was doing this, I would use a background worker, and then use a page (or an iframe) with the refresh meta tag to continuously check on the progress.
http://en.wikipedia.org/wiki/Meta_refresh
That way, you can use the same code for both javascript enabled and disabled clients. And your web app threads aren't blocking.
If you have a separate process, then you have a background job... so either you can have it or you can't...
What I have done is have the website write the request params to a database. Then a separate process looks for pending requests in the database - using the daemons gem. It does the work and writes the results back to the database.
The website then polls the database until the results are ready and then displays them.
Although I use javascript to make it do the polling.
If you really can't use JavaScript, then it seems you need to either do the work in the web request thread or make that thread wait for the background thread to finish.
To make the web request thread wait, just loop inside it, checking the database until the reply has been saved back. Once it's there, you can complete the request.
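A minimal sketch of that wait loop, with a mutex-guarded Hash standing in for the results table and a timeout so a lost job can't hang the request forever (names and delays are illustrative):

```ruby
RESULTS = {}  # stand-in for the results table
RESULTS_LOCK = Mutex.new

# The background daemon writes the answer when the work is done.
worker = Thread.new do
  sleep 0.1  # simulates the heavy calculation
  RESULTS_LOCK.synchronize { RESULTS[42] = "the answer" }
end

# The web request thread polls until the reply shows up (or times out).
def wait_for_result(id, timeout: 5, interval: 0.02)
  deadline = Time.now + timeout
  loop do
    value = RESULTS_LOCK.synchronize { RESULTS[id] }
    return value if value
    raise "timed out waiting for job #{id}" if Time.now > deadline
    sleep interval
  end
end

result = wait_for_result(42)
```

Keep the interval small enough for responsiveness but large enough that the polling itself doesn't hammer the database.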
HTH, chris