I am using boto3's receive_message() to read messages in batches. This returns a dict, as specified in the boto3 documentation.
After receiving a batch of messages, I process them one by one and want to batch-delete them once they are all processed; I do not want to delete each message individually right after processing it.
The delete_message_batch() function expects a slightly different structure than receive_message() returns, so I need to reformat the response dict from receive_message() into suitable entries (keeping only Id and ReceiptHandle).
Is there any easier way to directly use the response from receive_message() for batch deletion?
Here is how I handle the messages. I build a list with the ids and the receipt handles. The Id values only need to be unique within the batch, so instead of the message ids you could just as well use the output of range(10). Also, there is no error checking, so you need to take care of that yourself.
response = SQS.receive_message(
    QueueUrl=queue_url,
    AttributeNames=[],
    MaxNumberOfMessages=10,
    MessageAttributeNames=['All'],
)

delete_batch = []
for message in response.get('Messages') or []:
    # do stuff with each message
    delete_batch.append(
        {'Id': message['MessageId'], 'ReceiptHandle': message['ReceiptHandle']}
    )

if delete_batch:
    SQS.delete_message_batch(QueueUrl=queue_url, Entries=delete_batch)
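For the missing error checking: delete_message_batch can report per-entry failures even when the call itself succeeds, so a minimal sketch (assuming the same SQS client and queue_url as above; delete_batch_checked is just a hypothetical helper name) might look like this:

def delete_batch_checked(sqs_client, queue_url, entries):
    """Batch-delete messages and report entries SQS failed to delete."""
    response = sqs_client.delete_message_batch(QueueUrl=queue_url, Entries=entries)
    # Even a successful call can list per-entry failures under 'Failed'.
    for failure in response.get('Failed', []):
        print("Could not delete entry %s: %s - %s"
              % (failure['Id'], failure.get('Code'), failure.get('Message')))
    return response

You would call it in place of the bare delete_message_batch() above, as delete_batch_checked(SQS, queue_url, delete_batch), and then retry or log any entries that failed (for example, because of an expired receipt handle).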
In Boto3 you can use the Queue resource to delete batches of messages in one call. Here's some code (from a larger example on GitHub) that shows how:
import logging
import boto3
from botocore.exceptions import ClientError

logger = logging.getLogger(__name__)
sqs = boto3.resource('sqs')

def delete_messages(queue, messages):
    """
    Delete a batch of messages from a queue in a single request.

    :param queue: The queue from which to delete the messages.
    :param messages: The list of messages to delete.
    :return: The response from SQS that contains the list of successful and failed
             deletions.
    """
    try:
        entries = [{
            'Id': str(ind),
            'ReceiptHandle': msg.receipt_handle
        } for ind, msg in enumerate(messages)]
        response = queue.delete_messages(Entries=entries)
        if 'Successful' in response:
            for msg_meta in response['Successful']:
                logger.info("Deleted %s", messages[int(msg_meta['Id'])].receipt_handle)
        if 'Failed' in response:
            for msg_meta in response['Failed']:
                logger.warning(
                    "Could not delete %s",
                    messages[int(msg_meta['Id'])].receipt_handle
                )
    except ClientError:
        logger.exception("Couldn't delete messages from queue %s", queue)
    else:
        return response
queue = sqs.get_queue_by_name(QueueName=name)
messages = queue.receive_messages(
    MessageAttributeNames=['All'],
    MaxNumberOfMessages=max_number,
    WaitTimeSeconds=wait_time
)
delete_messages(queue, messages)
There is also an option to purge the queue. It may not fit your use case, since purging removes every message in the queue rather than just the batch you have processed, but here is the reference:
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sqs.html#SQS.Queue.purge
and the code example:
import boto3

REGION_NAME = 'us-east-1'
QUEUE_NAME = 'my-queue01-example'

sqs = boto3.resource('sqs', region_name=REGION_NAME)
queue = sqs.get_queue_by_name(QueueName=QUEUE_NAME)
response = queue.purge()
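Be aware that purge completes asynchronously (deletion may take up to 60 seconds) and SQS allows only one purge per queue every 60 seconds. A minimal sketch of guarding against the repeat-purge error, reusing the queue object from the example above:

from botocore.exceptions import ClientError

try:
    queue.purge()  # asynchronous; deletion may take up to 60 seconds
except ClientError as error:
    # SQS rejects a second purge issued within 60 seconds of the previous one.
    if error.response['Error']['Code'] == 'AWS.SimpleQueueService.PurgeQueueInProgress':
        print("A purge is already in progress; retry after 60 seconds.")
    else:
        raise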
Using Express and Node.js, I'm pulling tweets related to keywords from the Twitter streaming API via the needle npm package. The streaming is functional and I am successfully pulling tweets using the following (simplified) code:
const needle = require('needle');

const TOKEN = // My Token
const streamURL = 'https://api.twitter.com/2/tweets/search/stream';

function streamTweets() {
    const stream = needle.get(streamURL, {
        headers: {
            Authorization: `Bearer ${TOKEN}`
        }
    });

    stream.on('data', (data) => {
        try {
            const json = JSON.parse(data); // This line appears to be causing my error
            const text = json.data.text;
        } catch (error) {
            console.log("error");
        }
    });
}
However, no matter which search term I use (and the subsequent large or small volume of tweets coming through), the catch block will consistently log 1-3 errors per minute, which look like this:
SyntaxError: Unexpected end of JSON input
at JSON.parse (<anonymous>)
at PassThrough.<anonymous> (C:\Users\danie\OneDrive\Documents\Personal-Projects\twitter-program\server.js:56:31)
at PassThrough.emit (events.js:315:20)
at addChunk (internal/streams/readable.js:309:12)
at readableAddChunk (internal/streams/readable.js:284:9)
at PassThrough.Readable.push (internal/streams/readable.js:223:10)
at PassThrough.Transform.push (internal/streams/transform.js:166:32)
at PassThrough.afterTransform (internal/streams/transform.js:101:10)
at PassThrough._transform (internal/streams/passthrough.js:46:3)
at PassThrough.Transform._read (internal/streams/transform.js:205:10)
I've seen previous advice which says that data can be fired in multiple chunks, and to push the chunks to an array i.e. something like the following:
let chunks = [];
stream.on('data', (dataChunk) => {
    chunks.push(dataChunk);
}).on('end', () => {
    // combine chunks to create a JSON object
});
But this didn't work either (it may have been my implementation, but I don't think so), and now I'm wondering if it's perhaps an error with the Twitter API, because most of the tweet objects do come through correctly. I should note that the streamTweets() function above is called from an async function, and I am also wondering if that has something to do with it.
Has anyone else encountered this error? Or does anyone have any idea how I might fix it? Ideally I'd like 100% of the tweets to stream correctly.
Thanks in advance!
For future readers, this error is triggered by Twitter's heartbeat message that is sent every 20 seconds. Per the documentation:
The endpoint provides a 20-second keep alive heartbeat (it will look like a new line character).
Adding a guard that skips these whitespace-only chunks before parsing will prevent the JSON parsing error. The chunk may arrive as a Buffer, so convert it to a string first:
const text = data.toString();
if (text.trim() === "") return; // keep-alive heartbeat, not JSON
A bare newline (or an empty string) is not valid JSON, hence the emitted error.
Now, acknowledging that the heartbeat exists, it may be beneficial to add read_timeout: 20 * 1000 to the needle request options, to avoid a stalled program when no data arrives at all, be that due to a local network outage, a DNS miss, etc.
How can I test a server-sent events endpoint with REST-Assured?
I have tried the following, but the test doesn't terminate:
val stream = RestAssured.given()
    .contentType(MediaType.SERVER_SENT_EVENTS)
    .get("/status/stream").asInputStream()

myService.publish(SomeEvent())

val line = stream.reader().readText()
stream.close()
assertEquals("my-event", line)
myService.publish() pushes an object as the data payload to the SSE stream, which is validated by another test.
The assertion here is only exemplary. The problem is how to receive data from the stream.
You can read SSE URLs with javax.ws.rs.sse.SseEventSource. Note that readText() blocks until the server closes the stream, which never happens on an open SSE connection; that is why the test doesn't terminate. SseEventSource instead delivers each event to a callback as it arrives.
We have a Jenkins pipeline that sends a sh 'curl' request to an API/application to run specific tests for us, and we fail/pass the build depending on the results.
What I want to do is parse the information we get back from the curl call (an XML document) and send these notifications via Slack.
What I've done so far is exactly that: I parse the XML document and am able to print the results locally, but when I try to send the result to Slack I get an error. I am assuming this is because I saved my results as an array, and when I try to send the information to Slack it is unable to reference the variable.
My question is: how can we properly parse the XML file and send the results to Slack?
My code snippet:
List<String> someString = new ArrayList<String>()
def parsed = new XmlSlurper().parse("${workspace}/tmp/TESTS-results.xml")
parsed.testsuite.testcase.each { device ->
    someString.add(device.@name)  // attribute access returns an Attributes object, not a String
    someString.add(device.@time)
    someString.add(device)
    println someString
    println someString.getClass()
}
parsed = XmlUtil.serialize(parsed)
return someString
in field groovy.lang.GString.values
in object org.codehaus.groovy.runtime.GStringImpl@40198fcc
in field groovy.lang.Closure.delegate
in object org.jenkinsci.plugins.workflow.cps.CpsClosure2@4a4ae500
in field org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.closures
in object org.jenkinsci.plugins.workflow.cps.CpsThreadGroup@5dd0e25c
Caused: java.io.NotSerializableException: groovy.util.slurpersupport.Attributes
I am using Spray-can to host a REST service to which a user will be able to upload a file. The block of code that listens for incoming requests is given below:
def receive: Receive = {
  case _: Http.Connected => sender ! Http.Register(self)
  case req @ HttpRequest(HttpMethods.POST, Uri.Path("/upload"), headers, entity, _) =>
    logger.info("Received file upload request.")
    // Process the uploaded data using the 'entity' object
I upload the file using this curl command:
curl --data-binary @inputFile.csv 'devserver:8998/upload?tenant=DressShop&facility=CityCenter&customer=Jimmy'
The challenge I am facing is that I'm not able to pick up the filename "inputFile.csv" from the request, though I am getting the data from the "entity" object. I tried poring through the API docs but couldn't find any way to get the filename.
My objective is to ensure that I allow upload of only csv files.
You need to process the entity as multipart form data, using
as[MultipartFormData]
Note that the client then has to upload the file as multipart form data (for example curl -F 'file=@inputFile.csv' ...) rather than with --data-binary, since a raw binary POST body carries no filename at all. Then you can get the file name from the header fields:
def processFormData(data: MultipartFormData) = {
  data.fields foreach { bodyPart =>
    // the Content-Disposition header of each body part carries the original filename
    println(bodyPart.headers.find(h => h.is("content-disposition")).get.value)
  }
}
This might help: the filename can be found in the parameters of the content-disposition header (its "filename" parameter).
We have two channels called channelA and channelB.
In channelA we have two destinations:
a. The first destination invokes channelB with XML data as input and gets the response from channelB in XML format.
b. The second destination retrieves the response of the first destination in XML format and processes it.
var dest1 = responseMap.get("destination1");
var resMessage = dest1.getMessage();
I am getting the channelB response as "Message routed successfully".
How can I get the actual XML from channelB instead of the "Message routed successfully" message?
We are doing the above steps to define generic channels so that we can reuse them in different scenarios in the Mirth application.
We are using Mirth version 2.2.1.5861.
We are doing something very similar to what you described. In our case, destination1 is a SOAP sender (SOAP uses XML for its send and receive envelopes). Here's the syntax we are successfully using in destination2 JavaScript Writer:
var dest1 = responseMap.get("destination1");
var resMessage = dest1.getStatus().toString();
if (resMessage == "SUCCESS")
{
    var stringResponse = dest1.getMessage();
    channelMap.put('stringResponse', stringResponse);
    var xmlResponse = new XML(stringResponse);
    // use e4x notation to parse xmlResponse
}
If your destination1 is not a SOAP sender, then the XML response from channelB might be getting packaged up in some way that you need to extract from "stringResponse." You can see the contents of the channelMap variable "stringResponse" after running the message through the channel. Go to the dashboard, double-click the channel, find a message that has been sent, and then look at the mappings tab. What does the content of "stringResponse" actually look like? Is it just "Message routed successfully?" Or is that text followed by the XML that you're after?
Create channelB with its source data type set to XML and its source connector set to a Channel Reader.
Then give channelA a single destination of type Channel Writer, and select channelB in its details.
This way, whatever XML message arrives in channelA will be routed to channelB.