The SimpleMessageListenerContainerFactory in spring-cloud-aws provides a way to configure the MaxNumberOfMessages to pull from an SQS queue. See the Spring Cloud docs.
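For reference, a minimal sketch of the global configuration (class and bean names are just placeholders, assuming spring-cloud-aws-messaging is on the classpath):

import com.amazonaws.services.sqs.AmazonSQSAsync;
import org.springframework.cloud.aws.messaging.config.SimpleMessageListenerContainerFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class SqsListenerConfig {

    // One MaxNumberOfMessages value, applied to every queue the container listens to.
    @Bean
    public SimpleMessageListenerContainerFactory simpleMessageListenerContainerFactory(AmazonSQSAsync amazonSqs) {
        SimpleMessageListenerContainerFactory factory = new SimpleMessageListenerContainerFactory();
        factory.setAmazonSqs(amazonSqs);
        factory.setMaxNumberOfMessages(10); // fetched per poll, for all queues
        return factory;
    }
}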
If an application is listening to more than one queue, how can MaxNumberOfMessages be set differently for each queue?
Related
We have more than 100 SQS queues, and the set of queues is dynamic as well. Hence, creating an alert for each metric name would be challenging. Is there a different solution to monitor SQS queues?
I configured a CloudWatch event rule to direct all SQS events to a CloudWatch log group, but no logs are recorded in the log group. Can someone help me out with a solution to monitor these queues?
One possible solution is to read the SQS queue information and then use it as a variable to create CloudWatch alarms via Terraform.
see:
AWS CLI SQS List Queues
Terraform alarm resource
Terraform apply variables
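If you would rather do the same thing programmatically instead of through Terraform, a rough sketch with the AWS SDK for Java v2 (the alarm naming scheme and threshold are only placeholders) could look like this:

import software.amazon.awssdk.services.cloudwatch.CloudWatchClient;
import software.amazon.awssdk.services.cloudwatch.model.ComparisonOperator;
import software.amazon.awssdk.services.cloudwatch.model.Dimension;
import software.amazon.awssdk.services.cloudwatch.model.PutMetricAlarmRequest;
import software.amazon.awssdk.services.cloudwatch.model.Statistic;
import software.amazon.awssdk.services.sqs.SqsClient;

// Sketch: list every queue, then create one "queue depth" alarm per queue.
public class SqsAlarmCreator {
    public static void main(String[] args) {
        try (SqsClient sqs = SqsClient.create();
             CloudWatchClient cloudWatch = CloudWatchClient.create()) {

            for (String queueUrl : sqs.listQueues().queueUrls()) {
                String queueName = queueUrl.substring(queueUrl.lastIndexOf('/') + 1);

                cloudWatch.putMetricAlarm(PutMetricAlarmRequest.builder()
                        .alarmName("sqs-depth-" + queueName)   // placeholder naming scheme
                        .namespace("AWS/SQS")
                        .metricName("ApproximateNumberOfMessagesVisible")
                        .dimensions(Dimension.builder().name("QueueName").value(queueName).build())
                        .statistic(Statistic.AVERAGE)
                        .period(300)
                        .evaluationPeriods(1)
                        .threshold(1000.0)                     // placeholder threshold
                        .comparisonOperator(ComparisonOperator.GREATER_THAN_THRESHOLD)
                        .build());
            }
        }
    }
}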
I have a use case where I publish to a topic and listen via queues in Solace. Due to the increased number of queues, we have decided to create temporary queues. When I tried with a temporary queue, I was able to publish and subscribe directly, but I'm unable to attach a topic endpoint to the queue. Is it possible to attach a topic endpoint to a temporary queue in Solace, and if so, how do I do it?
Yes, it is possible. Which API are you using? Assuming it is the Solace Java API, to add a topic subscription to a queue you can use the JCSMPSession.addSubscription(...) method. For other Solace APIs, see the documentation here: https://docs.solace.com/Solace-PubSub-Messaging-APIs/API-Developer-Guide/Adding-Topic-Subscriptio.htm
The process is the same whether the queue is temporary or durable.
Queue queue = JCSMPFactory.onlyInstance().createQueue("Q/tutorial/topicToQueueMapping");
Topic tutorialTopic = JCSMPFactory.onlyInstance().createTopic("T/mapped/topic/sample");
session.addSubscription(queue, tutorialTopic, JCSMPSession.WAIT_FOR_CONFIRM);
In the addSubscription method above, pass in your temporary queue object reference.
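For the temporary-queue case, a small sketch (assuming an already-connected JCSMPSession named session; the topic name is just an example):

// Same call as above, but with a temporary queue created on the session.
TemporaryQueue tempQueue = session.createTemporaryQueue();
Topic mappedTopic = JCSMPFactory.onlyInstance().createTopic("T/mapped/topic/sample");
session.addSubscription(tempQueue, mappedTopic, JCSMPSession.WAIT_FOR_CONFIRM);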
I'm not clear on how to create a pull queue in GCP from an outside application. I've found documentation about pulling messages, but not about creating queues.
Can somebody point me to some information about it?
Best Regards
Creating queues from outside of an App Engine app is currently not available.
Queue management features are coming in the new Cloud Tasks API, which is available in Alpha. You can request to join the Alpha here.
I would like to know the advantages and disadvantages of using SimpleMessageListenerContainer over receiving a message manually using Spring AMQP. Another question: when we create a SimpleMessageListenerContainer and set a queue on it, does RabbitMQ call the listener adapter, or does the SimpleMessageListenerContainer keep polling the queue to check for messages and call the registered adapter when there is a message?
It depends on your requirements; the listener container gives you an async (message-driven) approach. Otherwise, if you use the RabbitTemplate receive methods, you are polling the queue. The container does not poll the queue; the broker pushes messages to the container according to the prefetch settings (default 1) when using ack mode AUTO.
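As a rough sketch of the two styles (host, queue name and handler are placeholders):

import org.springframework.amqp.core.AcknowledgeMode;
import org.springframework.amqp.core.Message;
import org.springframework.amqp.rabbit.connection.CachingConnectionFactory;
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer;
import org.springframework.amqp.rabbit.listener.adapter.MessageListenerAdapter;

public class ListenerVsPolling {

    public static void main(String[] args) {
        ConnectionFactory cf = new CachingConnectionFactory("localhost");

        // 1) Message-driven: the broker pushes messages to the container,
        //    which invokes the registered adapter/handler.
        SimpleMessageListenerContainer container = new SimpleMessageListenerContainer(cf);
        container.setQueueNames("myQueue");
        container.setAcknowledgeMode(AcknowledgeMode.AUTO);
        container.setMessageListener(new MessageListenerAdapter(new MyHandler(), "handle"));
        container.start();

        // 2) Polling: each receive() explicitly asks the queue for a message.
        RabbitTemplate template = new RabbitTemplate(cf);
        Message message = template.receive("myQueue");
        if (message != null) {
            System.out.println(new String(message.getBody()));
        }
    }

    public static class MyHandler {
        public void handle(String body) {
            System.out.println("Received: " + body);
        }
    }
}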
I am sending messages to a queue using the Amazon SQS queuing system in a Rails application. Since the queue follows a FIFO process, it will deliver the next items in the same fashion. Suppose I have 100 items in a queue; how can I retrieve the 35th item from the queue and process it? As far as I know, Amazon SQS provides no such method for doing this. Is there any other method/workaround where I can achieve this functionality?
There is no method to do that; SQS does not guarantee order of items in the queue due to its geographically redundant nature; it can't even guarantee FIFO. If you absolutely must process things in order, and need the ability to 'look ahead' in the queue, SQS may not be your best choice. Perhaps a custom-made queue in something like DynamoDB may work better.
SQS is designed to guarantee at-least-once delivery and does not take into account the order of messages. So the simple answer to your question of whether you can do that is no.
A workaround would depend on your use case:
To split work among different processes handling queue messages while making sure they don't process the same item - using different queues is one approach, or prefixing every message with an identifier denoting which process is supposed to work on it. For example, if I have 4 daemons running, I could prefix every message in the queue with the ID of the process which should work on it - 1, 2, 3 or 4. Every process would only handle messages with the number corresponding to its ID (a rough sketch follows below).
Order of arrival is critical - In this case, you're better off not using SQS because it wasn't meant to be used this way. CloudAMQP is a cloud-based service based on RabbitMQ, which is a true FIFO queue and would suit this case better than SQS.
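A rough sketch of the prefix approach from the first point, using the AWS SDK for Java v2 (the queue URL and worker ID are placeholders):

import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.ChangeMessageVisibilityRequest;
import software.amazon.awssdk.services.sqs.model.DeleteMessageRequest;
import software.amazon.awssdk.services.sqs.model.Message;
import software.amazon.awssdk.services.sqs.model.ReceiveMessageRequest;
import software.amazon.awssdk.services.sqs.model.ReceiveMessageResponse;

// Each worker only handles messages whose body starts with its own ID
// and immediately releases the rest back to the queue.
public class PrefixedWorker {
    public static void main(String[] args) {
        String myId = "2";  // this worker's ID (1..4)
        String queueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/work-queue"; // placeholder

        try (SqsClient sqs = SqsClient.create()) {
            ReceiveMessageResponse resp = sqs.receiveMessage(ReceiveMessageRequest.builder()
                    .queueUrl(queueUrl)
                    .maxNumberOfMessages(10)
                    .build());

            for (Message msg : resp.messages()) {
                if (msg.body().startsWith(myId + ":")) {
                    process(msg.body());
                    sqs.deleteMessage(DeleteMessageRequest.builder()
                            .queueUrl(queueUrl)
                            .receiptHandle(msg.receiptHandle())
                            .build());
                } else {
                    // Not ours: make it visible again right away for another worker.
                    sqs.changeMessageVisibility(ChangeMessageVisibilityRequest.builder()
                            .queueUrl(queueUrl)
                            .receiptHandle(msg.receiptHandle())
                            .visibilityTimeout(0)
                            .build());
                }
            }
        }
    }

    private static void process(String body) {
        System.out.println("Processing " + body);
    }
}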