Access a value across all threads in rails application - ruby-on-rails

I have a user model and I am setting a value in a thread:
Thread.current[:partner_domain] = "example.com"
I am able to access this in the model, but not in the delayed job worker, since it runs separately. I can't save this domain in my database due to a business requirement.
To be more clear, I am using Thread.current[:partner_domain] in a dynamically created method that is invoked by the delayed job worker.
Please help me with this.

Multithreading has nothing to do with this. A DelayedJob worker runs in a separate process and, as such, doesn't share anything with your Rails server process: not threads, not memory, nothing.
The right thing to do would be to bundle all the data the job needs into its arguments. Something like this:
MyClass.delay.do_action(primary_data, options)
Where options contain your partner domain name and all the other info. Then the job just accesses the info from the arguments.
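For example, a minimal sketch of that approach (NotificationService and the option keys are illustrative names, not from the question):

# Capture the value in the web request and pass it along as a job argument;
# delayed_job serializes the arguments together with the job.
NotificationService.delay.notify_partner(
  user.id,
  :partner_domain => Thread.current[:partner_domain]
)

class NotificationService
  def self.notify_partner(user_id, options)
    domain = options[:partner_domain] # read from the arguments, not Thread.current
    # ... do whatever the job needs with `domain`
  end
end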

If the delayed job worker needs this value for processing jobs, I think you could pass the value as a job's argument.

Related

Does a Sidekiq worker know what time it was posted?

Assuming a Sidekiq worker wasn't posted at the same time it started (i.e. via perform_at or perform_in), is the post time stored anywhere in the worker object? In other words, can a Sidekiq worker know what time it was posted? I could pass that value to perform_at but I'd rather not if it's already there.
The created_at and enqueued_at times are stored in the job payload but not available from the worker instance. They are accessible to middleware.
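For example, a server middleware can read those timestamps from the raw payload; a minimal sketch (EnqueueTimeMiddleware is just an illustrative name):

class EnqueueTimeMiddleware
  # `job` is the raw payload hash; 'enqueued_at' and 'created_at' are epoch floats.
  def call(worker, job, queue)
    latency = Time.now.to_f - job["enqueued_at"].to_f
    Sidekiq.logger.info "#{worker.class} waited #{latency.round(2)}s in queue #{queue}"
    yield
  end
end

Sidekiq.configure_server do |config|
  config.server_middleware do |chain|
    chain.add EnqueueTimeMiddleware
  end
end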

Request Store variable cannot be accessed in delayed job

We are using the request_store gem in our app. It is used for storing global data. But the problem is that if I try to access a request store variable in the delayed job, it is not accessible. Is there anything extra that needs to be done in order for the request store data to be available in a delayed job?
Delayed Job Code
class CustomersCreateJob < Struct.new()
  def perform
    puts "Request Data =====> #{RequestStore.store[:current_user]}"
  end
end
In general, current_user is only available in controllers by default, for a reason.
You did not mention your method of running jobs, but either way, by the time the job starts, even if it happens to be in the same process and thread, the request has already finished and there is no current_user. So pass the user's id to the job explicitly (how depends on how you run your jobs).
delayed_job workers won't get the request_store normally, because they are outside of the request/response cycle.
However this frequently isn't the desired behaviour, given the typical uses of request_store.
You can always extend ApplicationJob yourself with such functionality (e.g. around_enqueue and around_perform); I do recall having to do something similar at a previous role.
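A minimal sketch of that idea, assuming ActiveJob and that only a serializable value such as the user id is carried over (the key name and the User lookup are illustrative):

class ApplicationJob < ActiveJob::Base
  # Snapshot the value when the job is serialized for the queue...
  def serialize
    super.merge("current_user_id" => RequestStore.store[:current_user].try(:id))
  end

  # ...and restore it when the worker deserializes the job.
  def deserialize(job_data)
    super
    RequestStore.store[:current_user] = User.find_by(id: job_data["current_user_id"])
  end

  # Make sure nothing leaks into the next job processed by the same worker.
  around_perform do |_job, block|
    begin
      block.call
    ensure
      RequestStore.clear!
    end
  end
end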

parallel asynchronous processing with callbacks in rails controller

I am making a rails app and I am wondering whether it is possible to setup an asynchronous/callback architecture in the controller layer. I am trying to do the following:
When a HTTP request is made to /my_app/foo, I want to asynchronously dish out two jobs - a naive ranking job and a complicated ranking job both of which rank 1000 posts - to several worker machines. I want to setup a callback method in the controller for each job which is called when the respective job is finished. If the complicated job does not return within X milliseconds, I want to return the output from the naive job. Otherwise, I want to return the output from the complicated job.
It is important to note that I want these jobs to be performed in parallel. What is the best way to implement such a system in Rails? I am using Apache Phusion Passenger as my Rails server, if that helps.
Thanks.
Sounds like you should be using background jobs. In that case, when a request comes in, you would start / queue two jobs which would be picked up and processed by a worker, which acts independently of your Rails app.
Here are a few links that could be of help:
https://www.ruby-toolbox.com/categories/Background_Jobs
http://railscasts.com/episodes/171-delayed-job
http://railscasts.com/episodes/243-beanstalkd-and-stalker
http://railscasts.com/episodes/271-resque
http://rubyrogues.com/queues-and-background-processing/
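Building on the background-jobs answer above, a rough sketch of the enqueue-then-poll-with-a-deadline idea (NaiveRankJob, ComplicatedRankJob and RankResult are hypothetical; each job writes its output to a rank_results table keyed by a token):

class RankingsController < ApplicationController
  COMPLICATED_DEADLINE = 0.2 # seconds; the "X milliseconds" from the question

  def foo
    token = SecureRandom.uuid
    Delayed::Job.enqueue(NaiveRankJob.new(token))
    Delayed::Job.enqueue(ComplicatedRankJob.new(token))

    deadline = Time.now + COMPLICATED_DEADLINE
    result = nil
    until result || Time.now > deadline
      result = RankResult.where(:token => token, :kind => "complicated").first
      sleep 0.01
    end
    # Fall back to the naive ranking if the complicated one missed the deadline
    # (a real implementation would also wait for the naive job to finish).
    result ||= RankResult.where(:token => token, :kind => "naive").first

    render :json => (result && result.payload)
  end
end

Note that polling like this ties up a Passenger worker for up to the deadline, which is why the event-driven and messaging suggestions below may scale better.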
It's possible to issue several HTTP requests asynchronously from Rails. However, it's impossible to make Rails itself event-driven.
You can send several HTTP requests asynchronously with libraries such as Typhoeus. However, you might run into concurrency issues if your timeout is too long.
Otherwise, you can try an event-driven web framework such as Cramp or Goliath. They are both based on EventMachine, so you can use em-http-request with them.
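For example, a small sketch using Typhoeus::Hydra to fire both ranking requests in parallel (the worker URLs and the 200 ms budget are made up for illustration):

require "typhoeus"

hydra       = Typhoeus::Hydra.new
naive       = Typhoeus::Request.new("http://workers.internal/rank/naive")
complicated = Typhoeus::Request.new("http://workers.internal/rank/complicated",
                                    :timeout_ms => 200) # give the slow ranker X ms at most
hydra.queue(naive)
hydra.queue(complicated)
hydra.run # blocks until both requests complete or time out

response     = complicated.response.success? ? complicated.response : naive.response
ranked_posts = response.body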
Try using RabbitMQ, where you can post a message on a queue and expect the response on a reply queue. The queue consumer can even be implemented in Scala for speed. The amqp gem would cover what I am describing. A Rails controller with an AMQP binding would be even nicer if possible (I am exploring that option, having endpoints with an AMQP binding instead of HTTP). That would solve a good number of problems.
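A rough sketch of that request/reply pattern, here using the synchronous Bunny client rather than the EventMachine-based amqp gem (queue names and the payload are illustrative):

require "bunny"
require "json"
require "securerandom"

conn = Bunny.new
conn.start
channel = conn.create_channel

post_ids       = [1, 2, 3] # whatever you need ranked
reply_queue    = channel.queue("", :exclusive => true) # server-named reply queue
correlation_id = SecureRandom.uuid

# Publish the ranking request; the consumer (Ruby, Scala, ...) replies to reply_to.
channel.default_exchange.publish(
  { :post_ids => post_ids }.to_json,
  :routing_key    => "ranking_requests",
  :reply_to       => reply_queue.name,
  :correlation_id => correlation_id
)

reply_queue.subscribe(:block => true) do |delivery_info, properties, body|
  if properties.correlation_id == correlation_id
    ranked = JSON.parse(body)            # use the ranked result here
    delivery_info.consumer.cancel        # stop blocking once our reply arrives
  end
end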

Suggestions for how to write a service in Rails 3

I am building an application which will send status requests to users (via email & sms) on a regular basis. I want to execute the service each hour which will:
Query the database for all requests that need to be sent (based on some logic)
Send the requests through Amazon's Simple Email Service (this is already working)
Write a record of the status request notification back to the data store
I am considering wrapping up this series of operations into a single controller with an end point that can be called remotely to kick off the process within the rails app.
Longer term, I will break this process out into an app that can be run independently of my rails app, but for now I'm just trying to keep it simple.
My first inclination is to build the following:
Controller with the following elements:
A method which will orchestrate the steps outlined above (and can be called externally)
A call to the status_request model which will bring back a collection of requests needing to be sent
A loop to iterate through the pending requests, which will:
Make a call to my AWS Simple Email Service module to actually send the email, and
Make a call to the status_request model to log the request back to the database
Model:
A method on my status_request model which will bring back a collection of requests that need to be sent
A method in my status_request model which will log that a notification was sent
Since this will behave as a service that gets called periodically from an outside scheduler I don't think I'll need a view for this operation. (Will, of course, need views to show users and admins what requests have been sent, but that's later...).
As someone new to Rails, I'm asking for review of this approach and any suggestions you may have.
Thanks!
Instead of a controller, which (as Jeff pointed out) exposes a security risk, you may just want to expose a rake task and use cron to invoke it on an hourly basis.
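For example, a minimal sketch of that setup (the task name and the model/mailer methods are placeholders for your own code):

# lib/tasks/status_requests.rake
namespace :status_requests do
  desc "Send pending status request notifications"
  task :send_pending => :environment do
    StatusRequest.pending_delivery.each do |request| # your "requests to send" query
      SesMailer.deliver_status_request(request)      # your existing SES sending code
      request.log_notification!                      # write the record back to the store
    end
  end
end

And the hourly crontab entry:

0 * * * * cd /path/to/app && RAILS_ENV=production bundle exec rake status_requests:send_pending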
If you are still interested in building a controller, look at the devise gem and its single-access token (token_authenticatable) for securing the methods you are exposing.
You may also want to look at delayed_job or resque to offload the status_request call and the AWS Simple Email Service loop to a background worker process.
You may want a separate controller and view for the log file so you can review progress on demand.
And if you want to get real fancy use Amazon SNS to send you alerts when the service reaches some unacceptable level of failures, backlog, etc.
Since you are trying to invoke this from an outside process, your approach should work. You could also have a worker process that processes tasks as they come in.
You will need routes to expose your service, and you may want to also make security decisions. How will the service that invokes your application authenticate so all others can't hit it at will?
Another consideration is how many emails you are sending. If there are enough, bear in mind that this sort of loop is going to be top-heavy and may affect users of the current system if it's a web application.
In the end, there are many ways to do this. I would focus on the performance/usage you expect as well as security. There's never one perfect way to solve a problem like this, and your way should just be aware of the variables it will need to be operating within.
Resque and Redis might be helpful to you in scheduling and performing these operations. They are simple and super fast; [here](http://railscasts.com/episodes/271-resque) is a simple tutorial on the same.

Fork a process and send data to it inside Rails

I'm making a Rails application.
In one action I need to spawn a long-running process.
This is not a problem. I can fork a new process using the spawn gem or something similar.
But some time after the process has been spawned,
the user must be able to pass additional data to that process.
Sure, I can fork a process which listens on a UNIX socket,
store the socket address in the HTTP session and
communicate with that process over DRb when the user needs to pass it new data. But I think this is not the best solution, and it will be a problem to deploy the application to a hosting provider.
What is the easy way to do that?
Can the additional data go into the database, and the process checks there for the "new data"?
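For example, a sketch where the spawned process polls a (hypothetical) job_messages table for data the controller adds later:

# In the long-running process: poll for new rows addressed to this job.
# JobMessage is a made-up model with job_token, payload and consumed columns.
loop do
  message = JobMessage.where(:job_token => token, :consumed => false).order("id").first
  if message
    handle_payload(message.payload) # your processing logic
    message.update_attributes(:consumed => true)
  else
    sleep 5
  end
end

# In the controller action, when the user supplies more data:
JobMessage.create!(:job_token => token, :payload => params[:extra_data])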
I suggest going up-level. Use delayed_job to handle the spawning and initial setting of parameters for your job.
You can modify delayed_job's dbms model to add fields for the additional information that you want to later send to the job.
Then your flow would be:
Submit using delayed_job and store the id of the row in the jobs table. If stock delayed_job doesn't give you the job id, then modify delayed_job as necessary.
Add the additional data to the field in the jobs table
If your main job needs data back from the forked job, then you could also have db fields for that.
You should try to design your application to minimize the amount of message passing needed. Otherwise you'll need to design a message protocol. It can be done, but it is work that you should try to avoid.
Can't you use a thread and communicate with it via two queues, one for input to the thread, and one for the thread's responses?
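A minimal sketch of that idea using Ruby's built-in Queue (note the thread only lives as long as the Rails process that spawned it, which matters under multi-process servers like Passenger):

require "thread"

input  = Queue.new
output = Queue.new

worker = Thread.new do
  while (data = input.pop) # blocks until something is pushed
    output.push(do_long_work(data)) # do_long_work stands in for your processing
  end
end

# Later, e.g. when the user supplies additional data:
input.push(:more => "data")
result = output.pop # blocks until the thread responds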
