Assuming a Sidekiq worker wasn't posted at the same time it started (i.e. via perform_at or perform_in), is the post time stored anywhere in the worker object? In other words, can a Sidekiq worker know what time it was posted? I could pass that value to perform_at but I'd rather not if it's already there.
The created_at and enqueued_at timestamps are stored in the job payload, but they are not exposed on the worker instance. They are accessible to middleware, though.
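As an illustration, here is a minimal sketch of Sidekiq server middleware that copies the payload timestamp onto the worker instance; the EnqueuedAtMiddleware name and the enqueued_at accessor it defines are made up here, not Sidekiq API:

class EnqueuedAtMiddleware
  def call(worker, job, queue)
    # created_at and enqueued_at are epoch floats in the job payload
    enqueued = Time.at(job["enqueued_at"])
    worker.define_singleton_method(:enqueued_at) { enqueued }
    yield
  end
end

Sidekiq.configure_server do |config|
  config.server_middleware do |chain|
    chain.add EnqueuedAtMiddleware
  end
end

With that in place, a worker can call enqueued_at inside perform to see when it was posted.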
I have a user model, and I am setting a value in a thread:
Thread.current[:partner_domain] = "example.com"
I am able to access this in the model, but not in the Delayed Job worker, since the worker runs separately. I can't save this domain in my database due to a business requirement.
To be more clear, I am using Thread.current[:partner_domain] in a dynamically created method that is invoked by the Delayed Job worker.
Please help me with this.
Multithreading has nothing to do with this. A Delayed Job worker runs in a separate process and, as such, doesn't share anything with your Rails server process: not threads, not memory, nothing.
The right thing to do would be to bundle all the data the job needs into its arguments. Something like this:
MyClass.delay.do_action(primary_data, options)
Where options contains your partner domain name and all the other info the job needs. The job then just reads that info from its arguments.
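For illustration, a sketch under that assumption (MyClass and do_action are from the example above; the partner_domain key is just a suggestion):

class MyClass
  def self.do_action(primary_data, options = {})
    partner_domain = options[:partner_domain] # read from the job's arguments,
    # ... do the work with partner_domain ... # not from Thread.current
  end
end

# At enqueue time, capture the thread-local value into the arguments:
MyClass.delay.do_action(primary_data, partner_domain: Thread.current[:partner_domain])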
If the Delayed Job worker needs this value for processing jobs, I think you could pass the value as an argument to the job.
We are using the request_store gem in our app for storing global data. The problem is that if I try to access a RequestStore variable inside a Delayed Job, it is not accessible. Is there anything extra that needs to be done for the RequestStore data to be available in the delayed job?
Delayed Job Code
class CustomersCreateJob
  def perform
    # RequestStore is empty here: the job runs outside the request cycle
    puts "Request Data =====> #{RequestStore.store[:current_user]}"
  end
end
In general, current_user is by default only available in controllers, for a reason.
You did not mention your method of running jobs, but either way, by the time the job starts, even if it happens to be in the same process and thread, the request is already finished and there is no current_user. So pass the user's id to the job explicitly (how depends on how you run your jobs).
delayed_job workers won't see the RequestStore data normally, because they run outside of the request/response cycle.
However, this frequently isn't the desired behaviour, given the typical uses of request_store.
You can always extend ApplicationJob yourself with such functionality (e.g. around_enqueue and around_perform), as in the sketch below; I do recall having to do something similar at a previous role.
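Something along these lines, assuming ActiveJob and the request_store gem; the request_data attribute and the whitelist of keys are made up for the sketch, and only serializable values should be snapshotted (depending on the queue adapter, keys may come back as strings):

class ApplicationJob < ActiveJob::Base
  attr_accessor :request_data

  around_enqueue do |job, block|
    # Snapshot the (serializable) request-scoped data at enqueue time
    job.request_data = RequestStore.store.slice(:partner_domain)
    block.call
  end

  around_perform do |job, block|
    RequestStore.store.merge!(job.request_data || {})
    block.call
  ensure
    RequestStore.clear!
  end

  # Carry the snapshot through serialization, since the job runs in another process
  def serialize
    super.merge("request_data" => request_data)
  end

  def deserialize(job_data)
    super
    self.request_data = job_data["request_data"]
  end
end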
I'm sending a Sidekiq job with a parameter that is confidential, and I would like that parameter to be filtered so that no log or user is ever able to see it. Sidekiq itself does NOT output this parameter to its log, and the job runs in milliseconds, so it would be tough to see it in the Sidekiq web monitor.
But, if someone were to launch the redis-cli from a command line and run MONITOR then they would be able to see the parameters of the job passed in plain text.
Is there a way to filter this, so the parameters are blocked from the redis monitor?
The easiest thing you can do is encrypt your job arguments.
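For example, something like this sketch using ActiveSupport::MessageEncryptor (SecretJob and the env var are made up for illustration):

require "active_support/message_encryptor"

# Derive a key and build an encryptor; Redis then only ever stores ciphertext.
key = ActiveSupport::KeyGenerator.new(ENV.fetch("SECRET_KEY_BASE"))
        .generate_key("sidekiq-args", ActiveSupport::MessageEncryptor.key_len)
CRYPT = ActiveSupport::MessageEncryptor.new(key)

class SecretJob
  include Sidekiq::Worker

  def perform(encrypted_value)
    secret = CRYPT.decrypt_and_verify(encrypted_value)
    # ... use the confidential value; MONITOR only ever sees the ciphertext
  end
end

SecretJob.perform_async(CRYPT.encrypt_and_sign("confidential-value"))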
Sorry for the basic question about Sidekiq delaying ActionMailer. As per this article, Sidekiq can delay sending out emails by just saying UserMailer.delay_for(1.hour)...
Does this mean the email is handled in the background now, or does it simply delay sending the email for an hour, so that once the hour comes the email is sent like a regular ActionMailer call, which slows down response time?
Or, if I truly want to do this in the background, would I have to do the other Sidekiq stuff, like putting it in a specific Worker and firing it up that way?
Also, separately, if I just do it via UserMailer.delay..., I presume I won't need a worker dyno on Heroku, to save some money, correct?
Thanks for the help!
Yes, for emails you don't need to do anything else; it's like calling the mailer one hour later. You just need to make sure you don't pass any complex objects into the mailer. For example, instead of a user object you should pass only the user_id, because the arguments are stored in Redis. In the mailer, fetch the user object with the given id.
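To illustrate (UserMailer and welcome_email are assumed names):

# Enqueue with just the id; the email is rendered and sent by the Sidekiq
# process an hour later, not during the web request.
UserMailer.delay_for(1.hour).welcome_email(user.id)

class UserMailer < ActionMailer::Base
  def welcome_email(user_id)
    @user = User.find(user_id) # re-fetch the record at send time
    mail(to: @user.email, subject: "Welcome!")
  end
end

Note that the delayed email is still performed by a Sidekiq process, so something has to be running Sidekiq (on Heroku, typically a worker dyno) when the hour is up.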
I'm making a Rails application.
In one action I need to spawn a long-running process. This is not a problem: I can fork a new process using the spawn gem or some other.
But some time after the process has been spawned, the user must be able to pass additional data to that process.
Sure, I could fork a process that listens on a UNIX socket, store the socket address in the HTTP session, and communicate with that process over DRb whenever the user needs to pass new data to it. But I don't think that is the best solution, and it would be a problem to deploy the application to a hosting provider.
What is the easiest way to do that?
Can the additional data go into the database, and the process checks there for the "new data"?
I suggest going up-level. Use delayed_job to handle the spawning and initial setting of parameters for your job.
You can modify delayed_job's table schema to add fields for the additional information that you want to send to the job later.
Then your flow would be:
Submit using delayed_job and store the id from the jobs table. If stock delayed_job doesn't give you the job id, then modify delayed_job as necessary.
Add the additional data to the field in the jobs table
If your main job needs data back from the forked job, then you could also have a db field for that.
You should try to design your application to minimize the amount of message passing needed. Otherwise you'll need to design a message protocol. It can be done, but it is work that you should try to avoid.
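A rough sketch of that flow, assuming an extra_data column has been added to the delayed_jobs table via a migration (LongTask and its internals are hypothetical):

# Enqueue and remember the job row so we can write to it later.
job = Delayed::Job.enqueue(LongTask.new(initial_params))
session[:long_task_job_id] = job.id

# Later, when the user supplies additional data:
Delayed::Job.find(session[:long_task_job_id])
            .update!(extra_data: { domain: "example.com" }.to_json)

class LongTask < Struct.new(:initial_params)
  def before(job)
    @job_id = job.id # delayed_job lifecycle hook: remember our own row
  end

  def perform
    loop do
      row   = Delayed::Job.find(@job_id)
      extra = row.extra_data && JSON.parse(row.extra_data)
      break if work_step(extra) # hypothetical: returns true when finished
      sleep 5                   # poll the table for new data
    end
  end
end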
Can't you use a thread and communicate with it via two queues, one for input to the thread, and one for the thread's responses?
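Roughly like this, using Ruby's built-in thread-safe Queue (a sketch; it only works while the spawning process stays alive, which is the usual objection in a multi-process Rails deployment):

input  = Queue.new # requests into the worker thread
output = Queue.new # responses back out

worker = Thread.new do
  while (msg = input.pop) != :stop
    output << "processed #{msg}" # stand-in for the real work
  end
end

input << "first chunk of data"
puts output.pop # => "processed first chunk of data"
input << :stop
worker.join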