Cannot parse message published in Pub/Sub job update notification for Google Transcoder - google-cloud-transcoder

Business Impact: We are unable to use the Pub/Sub job update notifications feature.
Issue Summary: We (Recall) are trying to use the Pub/Sub notifications for job updates (https://cloud.google.com/transcoder/docs/how-to/create-pub-sub).
However, since the PubsubMessage data is serialized into bytes [1], we cannot deserialize it without knowing the proto type [2]. (Unless the intention is to treat the data as a string and parse from there?)
Based on the message we see published in the Pub/Sub topic:
data: "{\"job\":{\"name\":\"projects/PROJECT_NUMBER/locations/us-central1/jobs/JOB_ID\",\"state\":\"SUCCEEDED\",\"failureReason\":null}}"
message_id: "2356400951506061"
publish_time {
  seconds: 1620063162
  nanos: 430000000
}
We were guessing that the type is google.cloud.video.transcoder.v1beta1.Job [3], but parsing the message into that type throws InvalidProtocolBufferException.
Is this the type we should be using to deserialize the message? Any tips to help parse the published message would be helpful.
Thanks!

The Pub/Sub message in v1beta1 doesn't refer to any protos. The data is in JSON form, so I think you could "treat the data as a string and parse from there".
The message in Transcoder v1 will be in proto format. The docs have not been updated yet; please check back later.
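For the v1beta1 payload shown above, a minimal Java sketch of the "parse it as a string" approach might look like this. This is my own illustration, not official client behavior: it reads the data as UTF-8 JSON with Jackson (assumed to be on the classpath), and the field names ("job", "name", "state") are taken from the sample message above.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.pubsub.v1.PubsubMessage;

public class TranscoderNotificationParser {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Treat the Pub/Sub data as plain JSON text and pull out the job fields we care about.
    public static void handle(PubsubMessage message) throws Exception {
        String json = message.getData().toStringUtf8();   // data is a UTF-8 JSON string, not a proto
        JsonNode job = MAPPER.readTree(json).path("job");
        String name = job.path("name").asText();          // projects/.../jobs/JOB_ID
        String state = job.path("state").asText();        // e.g. "SUCCEEDED"
        System.out.println(name + " -> " + state);
    }
}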

Here is the documentation: https://cloud.google.com/pubsub/docs/pull
It mentions that we need to convert the buffer to a string, and then we can parse the data.
// Create an event handler to handle messages
let messageCount = 0;
const messageHandler = message => {
  console.log(`Received message ${message.id}:`);
  console.log(`\tData: ${message.data}`);
  console.log(`\tAttributes: ${message.attributes}`);
  messageCount += 1;
  // "Ack" (acknowledge receipt of) the message
  message.ack();
};

// Listen for new messages until timeout is hit
subscription.on('message', messageHandler);
I have tried the same approach: converting the buffer to a string and then parsing the string.
let stringData = message.data.toString();
console.log(stringData);
let jsonData = JSON.parse(stringData);
console.log(jsonData);
This worked for me in JavaScript.

Related

How to use twilio bi-directional stream feature to play raw audio data

I'm using Twilio Programmable Voice to process phone calls.
I want to use the bi-directional stream feature to send some raw audio data for Twilio to play. The initialization code looks like this:
from twilio.twiml.voice_response import Connect, VoiceResponse, Stream
response = VoiceResponse()
connect = Connect()
connect.stream(url='wss://mystream.ngrok.io/audiostream')
response.append(connect)
Then, when I get the wss connection from Twilio, I start sending raw audio data to Twilio, like this:
async def send_raw_audio(self, ws, stream_sid):
    print('send raw audio')
    import base64
    import json
    with open('test.wav', 'rb') as wav:
        while True:
            frame_data = wav.read(1024)
            if len(frame_data) == 0:
                print('no more data')
                break
            base64_data = base64.b64encode(frame_data).decode('utf-8')
            print('send base64 data')
            media_data = {
                "event": "media",
                "streamSid": stream_sid,
                "media": {
                    "playload": base64_data
                }
            }
            media = json.dumps(media_data)
            print(f"media: {media}")
            await ws.send(media)
    print('finished sending')
test.wav is a WAV file encoded as audio/x-mulaw with a sample rate of 8000.
But when I run it, I can't hear anything, and the Twilio console says:
31951 - Stream - Protocol - Invalid Message
Possible Causes
- Message does not have JSON format
- Unknown message type
- Missing or extra field in message
- Wrong Stream SID used in message
I have no idea which part is wrong. Does anyone know what my problem is? I can't find an example for this scenario; I just followed the instructions here. I'd really appreciate it if someone could point me to an example. Thanks.
Not sure if this will fix it, but I use .decode("ascii"), not "utf-8".
The question is probably not relevant anymore, but I came across this while debugging my own bi-directional stream, so it might be useful for someone.
The main reason you were receiving this error is a typo in the JSON content: you are sending "playload" instead of "payload".
Another issue when sending data to a Twilio stream is that you should send a mark message at the end of the data stream to notify Twilio that the complete payload was sent. https://www.twilio.com/docs/voice/twiml/stream#message-mark-to-twilio
When sending data back to the Twilio stream, be aware that the payload should not contain the audio file's header bytes, so make sure you remove them from your recording, or skip them while sending data to Twilio.

Multiple immutable IDs for the same mail message?

I am writing an Office Outlook add-in that has a React frontend and a dotnet core backend. I have set up a subscription using the Graph API to receive notifications when a new email appears on the SentItems folder. I want to correlate the email from the notification with information I have stored in a database.
Unfortunately the item ID changes when the email is sent and moves from the Drafts folder into SentItems, so it isn't useful for matching.
There is a new ImmutableId that doesn't change when the email is moved between folders. I've been unable to get the Office.js lib to generate an ImmutableId, but there is a translateExchangeIds method that, when given an email item ID, will return an immutable ID.
// convert to immutable
var translateRequest = new {
    inputIds = new string[] { mailMessage.ItemId },
    targetIdType = "restImmutableEntryId",
    sourceIdType = "restId"
};
var immutableResponse = await graphClient.PostAsJsonAsync("me/translateExchangeIds", translateRequest);
var immutableId = await immutableResponse.Content.ReadAsStringAsync();
I can use that immutable id to retrieve the email message using the Graph request:
await graphClient.GetAsync($"Users/cccccccc-dddd-eeee-ffff-ba0c52e56d99/Messages/AAkALgAAAAAAHYQDEapmEc2byACqAC-EWg0AQ-irLc2NFESKcGAhz1k_GBBDB5JMOwAA/
But the immutable ID returned with the subscription notification is a different immutable ID for the same message, so it's not possible to match the notification mail message with the message info stored in my database. I still have to attach a custom property to the message for the sole purpose of matching the database entry with the SentItems notification.
Is there a better way to deal with this issue?
Update: my theory is that the difference occurs because the immutable ID is derived while the item is in different folders. When translating the item ID to an immutable ID, the item is still in the Drafts folder; when the subscription notification occurs, the item is in the Sent Items folder. The following responses came from queries using the different immutable IDs but identify the same message - the myId GUID is a custom property attached to the message and used to correlate the notification with the message info stored in a local database.
\"id\":\"AAkALgAAAAAAHYQDEapmEc2byACqAC-EWg0AQ-irLc2NFESKcGAhz1k_GAADB4INPAAA\",...,\"myId\":\"8baa904f-cf64-437c-878c-be4f71714aee\"
\"id\":\"AAkALgAAAAAAHYQDEapmEc2byACqAC-EWg0AQ-irLc2NFESKcGAhz1k_GAADB4INLwAA\",...,\"myId\":\"8baa904f-cf64-437c-878c-be4f71714aee\"
We fixed this and now you should see the same ImmutableId for draft and sent messages. Can you try and let us know if this is working as expected?

How to disable Google Cloud Speech-to-Text in Node.js

I'm not able to stop streamingRecognize by any method. I tried to unpipe the stream and tried to unset the client, but I still get an 'Audio Timeout Error'. Is there any method to stop recognizing?
When we call streamingRecognize() what is returned to us is an input stream. We then push data records through the stream for as long as we have input. When we have sent all the data we wish to send, we must instruct the speech to text processor that there is no more data to process. We do this by calling the end() method of the stream.
In your example fragment:
let recognizeStream = client
  .streamingRecognize(request)
  .on('error', console.error)
  .on('data', data => {
    console.log(data.results[0].alternatives[0].transcript);
    recognizeStream.end();
  });
In case someone has the same problem: I was able to stop it without errors this way:
let recognizeStream = client
  .streamingRecognize(request)
  .on('error', console.error)
  .on('data', data => {
    console.log(data.results[0].alternatives[0].transcript);
    // first stop the microphone stream
    micInstance.stop();
    // call client.streamingRecognize again with request set to null
    client.streamingRecognize(null);
  });
But now I get an Error Response: [4] DEADLINE_EXCEEDED at node_modules/grpc/src/common.js:91:15)

Getting null response from receiveAndConvert() - spring-amqp

I have created one replyQ and bound it to a direct exchange.
I created the message, setting the replyTo property to "replyQ", and sent the message over RabbitMQ to the other service.
The service at the other end gets the message and sends the reply to the given replyTo queue.
Now I am trying to read from the replyQ queue using
template.receiveAndConvert(replyQueue);
But I am getting a null response, even though I can see the message in the replyQ.
That is, the other service is able to send the reply, but I am not able to read it from the given queue.
Please help; what is going wrong?
template.receiveAndConvert() is a synchronous, one-time call that blocks for a limited time; the default timeout is:
private static final long DEFAULT_REPLY_TIMEOUT = 5000;
Maybe this is your problem.
Consider switching to a ListenerContainer for continuous queue polling.
Another option is RabbitTemplate.sendAndReceive(), but with a fixed reply queue you still have to deal with a ListenerContainer. See the Spring AMQP Reference Manual for more info.
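For example, a continuously polling container could be wired up roughly like this. This is a minimal sketch rather than code from the thread; the queue name "replyQ" and the ConnectionFactory are assumptions based on the question.

import org.springframework.amqp.core.MessageListener;
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer;

public class ReplyQueueListener {

    // Keeps polling the reply queue instead of relying on a one-shot receiveAndConvert().
    public SimpleMessageListenerContainer replyContainer(ConnectionFactory connectionFactory) {
        SimpleMessageListenerContainer container = new SimpleMessageListenerContainer(connectionFactory);
        container.setQueueNames("replyQ");  // the fixed reply queue from the question (assumed name)
        container.setMessageListener((MessageListener) message ->
                System.out.println("Reply received: " + new String(message.getBody())));
        return container;
    }
}

You would then call container.start() (or declare it as a bean so Spring starts it), and every reply lands in the listener callback instead of depending on a timed receive.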
I don't know if this will help anyone, but I found that declaring the expected object as a parameter of the listener method did the trick:
@RabbitListener(queues = QUEUE_PRODUCT_NEW)
public void onNewProductListener(ProductDTO productDTO) {
    // messagingTemplate.receiveAndConvert(QUEUE_PRODUCT_NEW) returns null here
    log.info("A new product was created {}", productDTO);
}

Not able to process response received from template.convertSendAndReceive()

I am trying to send a message and receive a response using the following code:
MessageProperties props = MessagePropertiesBuilder.newInstance()
        .setContentType(MessageProperties.CONTENT_TYPE_TEXT_PLAIN)
        .setMessageId("MSG12345")
        .setHeader("type", "type1")
        .setCorrelationId(UUID.randomUUID().toString().getBytes())
        .build();
Message message = MessageBuilder.withBody(input.getBytes()).andProperties(props).build();
Message response = (Message) template.convertSendAndReceive("key", message);
But it is throwing java.lang.ClassCastException: java.lang.String cannot be cast to org.springframework.amqp.core.Message.
Maybe it's because I am sending the request from a Java (spring-amqp) program and the receiver is a Python (pika) program.
The receiver is sending me a JSON object dumped as a string, but I am not able to handle it.
Your problem is that you use RabbitTemplate.convertSendAndReceive():
/**
 * Basic RPC pattern with conversion. Send a Java object converted to a message to a default exchange with a
 * specific routing key and attempt to receive a response, converting that to a Java object. Implementations will
 * normally set the reply-to header to an exclusive queue and wait up for some time limited by a timeout.
 *
 * @param routingKey the routing key
 * @param message a message to send
 * @return the response if there is one
 * @throws AmqpException if there is a problem
 */
Object convertSendAndReceive(String routingKey, Object message) throws AmqpException;
Even though your payload is a Message and we have:
protected Message convertMessageIfNecessary(final Object object) {
    if (object instanceof Message) {
        return (Message) object;
    }
    return getRequiredMessageConverter().toMessage(object, new MessageProperties());
}
it converts the reply into a target object from the body:
return this.getRequiredMessageConverter().fromMessage(replyMessage);
and doesn't return a Message as you expected.
So you really have to cast the reply to String and deal with the JSON on your own.
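As a sketch of my own (not code from the answer), and assuming the pika side replies with a JSON string and Jackson is available, the reply from the code above could be handled like this; the "status" field is purely hypothetical:

Object reply = template.convertSendAndReceive("key", message);
if (reply != null) {
    String json = (String) reply;  // the Python (pika) receiver replies with a JSON string
    com.fasterxml.jackson.databind.JsonNode node =
            new com.fasterxml.jackson.databind.ObjectMapper().readTree(json);  // parse with any JSON library
    System.out.println(node.path("status").asText());  // "status" is a hypothetical field name
}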
