Query SQS based on a custom MessageAttribute - amazon-sqs

So SQS has MessageAttributes that you can attach to messages, letting you store metadata on each job.
https://aws.amazon.com/blogs/aws/simple-queue-service-message-attributes/
Is there a way to query a queue based on one of these attributes?
Example: I want to set a "Tag" MessageAttribute, and then query the queue to see how many jobs with a "Tag" of "something" exist.

No. The best option would be to read all the messages and delete only those that match your criteria. All the other messages you have read will reappear in the queue after the visibility timeout expires.
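If it helps, here is a rough sketch of that read-and-filter approach using the aws-sdk-sqs gem. The queue URL is a placeholder and the matching logic on the "Tag" attribute is illustrative; in practice you would loop until the queue is drained.
require "aws-sdk-sqs"

sqs = Aws::SQS::Client.new
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue" # placeholder

resp = sqs.receive_message(
  queue_url: queue_url,
  message_attribute_names: ["Tag"], # ask SQS to return this attribute
  max_number_of_messages: 10,
  wait_time_seconds: 5
)

matching = resp.messages.select do |msg|
  msg.message_attributes["Tag"]&.string_value == "something"
end

# Process and delete the matches; every other received message becomes
# visible again once its visibility timeout expires.
matching.each do |msg|
  # ... handle the job ...
  sqs.delete_message(queue_url: queue_url, receipt_handle: msg.receipt_handle)
end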

Related

Can the flipper gem return a count of enabled users?

We are exploring using the flipper gem (https://github.com/jnunemaker/flipper) to gate who sees new features. In one of our first tests, we want to show a specific feature to only the first X users that see a banner promoting it.
We looked at using a percentage, but the business is very specific about the number, and also wants to reach that number right away, then disable the feature for all other users without disabling it for those who saw it first. Using a percentage, we couldn't see a way to ensure that exactly the right number would see it, and that every one of the first X would see it.
Inside the gates/actor.rb, there is this:
enabled_actor_ids = value
which implies we could get the list of enabled ids, and perform a count on that, but we couldn't find whether or where that list may be exposed.
Since we are using the AR adapter as a trial, we instead created a scope on an actor object that joins to the flipper_gates table, but this feels extremely fragile and getting very much into the inner workings of the gem.
Any advice is greatly appreciated.
Nowadays you can do Flipper[:some_feature].actors_value.size, assuming you've configured your default flipper instance with Flipper.configure.
https://github.com/jnunemaker/flipper/blob/196946c63aee1eaa09fa25e945cdbff896fe71e5/lib/flipper/feature.rb#L258-L260
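A minimal sketch of that, assuming the ActiveRecord adapter and a default instance configured via Flipper.configure (the feature name is just an example):
require "flipper"
require "flipper/adapters/active_record"

Flipper.configure do |config|
  config.default { Flipper.new(Flipper::Adapters::ActiveRecord.new) }
end

Flipper[:some_feature].actors_value.size # => number of individually enabled actors
actors_value returns the set of flipper_ids enabled through the individual-actor gate, so its size is exactly the count you were after.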
You should be able to accomplish this by programmatically turning the feature on for Individual Actors until an upper limit is reached.
IMPORTANT NOTE: according to the documentation:
The individual actor gate is typically not designed for hundreds or thousands of actors to be enabled. This is an explicit choice to make it easier to batch load data from the adapters instead of performing individual checks for actors over and over. If you need to enable something for more than 20 individual people, I would recommend using a group.
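For reference, the group route the docs recommend would look roughly like this; the group name and the early_access? predicate are made up for illustration:
Flipper.register(:early_access) do |actor|
  actor.respond_to?(:early_access?) && actor.early_access?
end

Flipper[:stats].enable_group :early_access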
Now that we've agreed we want to move forward with this anyway, let's talk about implementation.
Enabling the feature for an actor
The first thing you need to do is to ensure that the actor (probably a User) responds to flipper_id and that the flipper_id is unique for every actor. Once that is set up, you should be able to simply enable the feature for a user when they see the banner, like this:
flipper[:stats].enable_actor user
Counting actors enrolled in a feature
Now, in order to determine if we should enable the feature for a user, we need to determine how many users have been enrolled in the feature.
To do this we can query the Gate directly:
Flipper::Adapters::ActiveRecord::Gate.where(
  feature_key: "stats",
  key: "actors"
).count
This will return a count of the number of actors enrolled in a feature.
How do we know that works?
Well, let's take a look at the gem.
flipper[:stats].enable_actor actually calls Feature#enable_actor with the user we passed in earlier (that responds to flipper_id) being passed in as the actor.
Next, Feature#enable_actor passes the actor into Types::Actor.wrap, which creates a new instance of Types::Actor. That instance checks that the actor isn't nil and that it has a flipper_id, then sets two instance variables: thing, which is set to the actor, and value, which is set to the actor's flipper_id.
Now that we have an instance of Types::Actor, we pass it into Feature#enable, which looks up the gate (in our case a Gates::Actor instance). Finally we call enable on the adapter (which in your case is ActiveRecord).
In Adapters::ActiveRecord#enable we first look at gate.data_type, which in our case is :set. From there we do:
gate_class.create! do |g|
  g.feature_key = feature.key
  g.key = gate.key
  g.value = thing.value.to_s
end
Where, as mentioned earlier, thing.value is the flipper_id. Bingo! #gate_class is the active record class responsible for the gates table and the default table name is "flipper_gates".
Now we know exactly what to query to get a count of the actors enrolled in the feature!
number_of_actors_enrolled_in_stats_feature = Flipper::Adapters::ActiveRecord::Gate.where(
  feature_key: "stats",
  key: "actors"
).count
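Putting the pieces together, a sketch of the "first X users" gate might look like this (the cap of 100 and the method name are made up; it also assumes the Flipper default instance is configured):
ENROLLMENT_CAP = 100

def maybe_enable_stats_for(user)
  enrolled = Flipper::Adapters::ActiveRecord::Gate
               .where(feature_key: "stats", key: "actors")
               .count

  Flipper[:stats].enable_actor(user) if enrolled < ENROLLMENT_CAP
end
Note that there is a small race window between the count and the enable under concurrent requests, so the cap can overshoot by a few users.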

Record progress for long running ActiveJob

Based on this question, "How to reference active delayed_job within the actual job", I'm using Delayed::Job with an additional progress text column to record the progress of a long-running task.
I'm now trying to update my code to use ActiveJob, so I've replaced def before with before_perform, but the job object passed to before_perform is not the same as the one passed to before. And quite rightly, because the queue adapter is configurable and may not always be :delayed_job.
So, given that the queue adapter is configurable, is there a correct way to access (read and write) the progress column in table delayed_jobs?
Thanks.
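One adapter-agnostic sketch: keep the progress in your own table keyed by ActiveJob's job_id instead of reaching into the delayed_jobs table. The JobProgress model and its columns are made up for illustration:
class JobProgress < ApplicationRecord
  # columns: job_id:string, progress:string
end

class LongRunningJob < ApplicationJob
  queue_as :default

  def perform(*args)
    record = JobProgress.find_or_create_by!(job_id: job_id) # job_id is provided by ActiveJob
    # ... do the first chunk of work ...
    record.update!(progress: "halfway there")
    # ... finish ...
    record.update!(progress: "done")
  end
end
Because job_id comes from ActiveJob itself, this works the same under :delayed_job, :sidekiq, or any other adapter.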

How to create unique delayed jobs

I have a method like this one:
def abc
  # some stuff here
end
handle_asynchronously :abc, queue: :xyz
I want to create a delayed job for this only if there isn't one already in the queue.
I really feel like this should have an easy solution
Thanks!
I know this post is old, but it hasn't been answered.
Delayed Job does not provide a way to identify jobs: https://github.com/collectiveidea/delayed_job/issues/192
My suggestion is that the job itself check whether it still needs to run when it executes, for example by comparing against a database value. Inserting jobs into the table should be quick, and you lose that speed if you start scanning the queue for a particular job on every enqueue.
If you still want to look for duplicates when enqueuing, this might help you.
https://gist.github.com/landovsky/8c505ecab41eb38fa1c2cd23058a6ae3
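A rough sketch of an enqueue-time check in the spirit of that gist, assuming the ActiveRecord backend; the LIKE match against the serialized handler is deliberately loose and brittle, which is why checking inside the job is usually the safer option:
def abc_unless_queued
  already_queued = Delayed::Job
                     .where(queue: "xyz", locked_at: nil, failed_at: nil)
                     .where("handler LIKE ?", "%abc%")
                     .exists?

  abc unless already_queued # handle_asynchronously makes this call enqueue the job
end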

How do I run delayed job inserts in the backgroud without affecting page load - Rails

I have a Rails application for posting answers to questions. When a user answers a question, notification messages are sent to all the users who watch-listed the question, to the users who track the question, and to the owner of the question. I am using delayed jobs to create the notification messages, so while an answer is being created there are many inserts into the delayed_jobs table, which slows down the page load. It takes more time to redirect to the question show page after the answer is created.
Currently I am inserting into answers table using AJAX request. Is there any way to insert into delayed jobs table in background after the AJAX request completes?
As we have been trying to say in comments:
It sounds like you have something like:
User.all.each do |user|
  user.delay.some_long_operation
end
This ends up inserting a lot of rows into delayed_jobs. What we are suggesting is to refactor that code into the delayed job itself, roughly:
def delayed_operation
  User.all.each do |user|
    user.some_long_operation
  end
end
self.delay.delayed_operation
Obviously, you'll have to adapt that, and probably put the delayed_operation into a model library somewhere, maybe as a class method... but the point is to put the delay call outside the big query and loop.
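For instance, as a class method on a small service object (a sketch; NotificationFanout, Answer#watchers and the notification call are placeholders for whatever your app actually does):
class NotificationFanout
  def self.deliver_for(answer_id)
    answer = Answer.find(answer_id)
    answer.watchers.find_each do |user|
      # ... build and deliver the notification for `user` ...
    end
  end
end

# One row in delayed_jobs per answer instead of one per user:
NotificationFanout.delay.deliver_for(answer.id)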
I really advise doing this in a separate process. Why should the user have to wait for these meta-actions? Stick to delivering a result page and only notify your server that something has to be done.
Create a separate model, PostponedAction, to build a list of 'to-do' actions. When an answer is posted, add one PostponedAction to the database with the answer id as a parameter, then return the results to the user.
Use a separate process (a cron job) to read the PostponedAction items and handle them, marking them as 'handled' or deleting them on successful handling. This way, the user is not bugged by slow server processes.
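A minimal sketch of that PostponedAction idea; the model, columns, and task are illustrative, not from the original answer:
class PostponedAction < ApplicationRecord
  # columns: action_type:string, answer_id:integer, handled_at:datetime
  scope :pending, -> { where(handled_at: nil) }
end

# In the controller: one cheap insert, then render the response.
PostponedAction.create!(action_type: "notify_watchers", answer_id: answer.id)

# In a cron-driven rake task:
PostponedAction.pending.find_each do |action|
  answer = Answer.find(action.answer_id)
  # ... send the notifications for this answer ...
  action.update!(handled_at: Time.current)
end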
Besides the email jobs you currently have, add another type of job that handles creating those jobs:
def email_all
  User.all.each do |user|
    user.delay.email_one
  end
end

def email_one
  # do the emailing
end
self.delay.email_all()
This way the user action only triggers one insert before they see the response. You can also track individual jobs.

Rails - capturing transaction id when saving a collection of child records

When I save/update a project and its tasks, I would like to be able to tell which tasks were saved in the transaction. For example, I might create the project with a few tasks initially, and several hours later I might add more tasks. I would like to know which tasks were created/updated in each transaction. I could determine this by comparing created_at fields, but that seems like a bit of a hack.
Any ideas? Can I access the transaction object within the transaction and get a unique id from it? That would be ideal.
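One workaround (a sketch, assuming you add a batch_id string column to tasks): stamp each save with an application-generated id instead of relying on created_at.
require "securerandom"

batch_id = SecureRandom.uuid

Project.transaction do
  project.tasks.each do |task|
    task.batch_id = batch_id if task.new_record? || task.changed?
  end
  project.save!
end

# Later, to see exactly which tasks came in with that save:
project.tasks.where(batch_id: batch_id)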
