I have been using DocuSign SOAP and REST API calls to create envelopes, and I am also using their Connect feature to receive recipient and envelope status updates for my clients.
I am getting a strange error parsing the DocuSign Connect update for one client.
The error says "There is an error in XML document (1, 16174)".
Here is my code...
Dim sr As New StreamReader(Request.InputStream)
Dim xml As String = sr.ReadToEnd() ' read the Connect payload into a string
Dim reader As XmlReader = New XmlTextReader(New StringReader(xml))
Dim serializer As New XmlSerializer(GetType(DocuSignEnvelopeInformation), "http://www.docusign.net/API/3.0")
If Not serializer Is Nothing Then
    envelopeInfo = TryCast(serializer.Deserialize(reader), DocuSignEnvelopeInformation)
    Dim envid As String = envelopeInfo.EnvelopeStatus.EnvelopeID.ToString()
End If
I have tried a bunch of things, such as removing the XML declaration from the document, but that did not work. The strange thing is that the same code works for all of my other clients; this is the only client having issues. They have added close to 65 tags in the document to be signed, but I don't think the tags are causing the issue, since I also tried removing them.
Please advise.
Minal
I have run into this issue before when there are unsupported characters in the tab values, or in the PDF byte stream itself once it is decoded. I suspect that copying and pasting values into tabs from external programs like Word introduces invisible characters such as carriage returns and other control characters. You should validate your XML in its entirety.
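As a rough illustration of that kind of validation step, here is a minimal Java sketch of a generic filter that strips characters the XML 1.0 spec does not allow (the class and method names are made up; the same regex can be used from VB.NET via Regex.Replace):

import java.util.regex.Pattern;

public class XmlCharFilter {
    // Characters permitted by XML 1.0: #x9, #xA, #xD, #x20-#xD7FF, #xE000-#xFFFD.
    private static final Pattern INVALID =
            Pattern.compile("[^\\x09\\x0A\\x0D\\x20-\\uD7FF\\uE000-\\uFFFD]");

    // Remove anything the XML parser would reject before deserializing.
    public static String stripInvalid(String xml) {
        return INVALID.matcher(xml).replaceAll("");
    }
}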
We have an API that acts as a proxy between clients and Google Pub/Sub: it receives a JSON body and publishes it to a topic. The messages are then processed by Dataflow, which stores them in BigQuery. We also use a transform UDF to, for instance, convert a field value to upper case; it parses the incoming JSON and produces a new one.
The problem is the following: the number of bytes sent to the destination table is much smaller than the number sent to the dead-letter table, and 99% of the error messages say that the JSON sent is invalid. And that's true: the payloadstring column contains distorted JSONs, which can be truncated, concatenated with other messages, or even both. I've added logs on the API side to see where the messages get corrupted, but neither the JSON bodies received by the API nor those it sends are invalid.
How can I debug this problem? Is there any chance that Pub/Sub or Dataflow corrupts messages? If so, what can I do to fix it?
UPD: By the way, we use the Google-provided template called "Pub/Sub Topic to BigQuery".
UPD2: The API is written in Go, and the way we send the message is simply by calling

res := p.topic.Publish(ctx, &pubsub.Message{Data: msg})

The res variable is then used for error logging; p here is a custom struct.
The message we send is a JSON object with 15 fields; to be concise, I'll mock both it and the UDF.
Message:
{"MessageName":"Name","MessageTimestamp":123123123",...}
UDF:
function transform(inJson) {
  var obj;
  try {
    obj = JSON.parse(inJson);
  } catch (error) {
    throw 'parse JSON error: ' + error;
  }
  if (Object.keys(obj).length !== 15) {
    throw "Message is invalid";
  }
  if (!(obj.hasOwnProperty('MessageName') && typeof obj.MessageName === 'string' && obj.MessageName.length > 0)) {
    throw "MessageName is absent or invalid";
  }
  /*
    other fields check
  */
  obj.MessageName = obj.MessageName.toUpperCase();
  /*
    other fields transform
  */
  return JSON.stringify(obj);
}
UPD3: Besides being corrupted, I've noticed that every single message is duplicated at least once, and the duplicates are often truncated.
The problem started several days ago, when there was a massive increase in the number of messages; traffic is back to normal now, but the error is still there. The problem had been seen before, but it was a much rarer case.
The behavior you describe suggests that the data is corrupt before it gets to Pub/Sub or Dataflow.
I have performed a test, sending JSON messages containing 15 fields. Your UDF function as well as the Dataflow template work fine, since I was able to insert the data into BigQuery.
Based on that, it seems your messages are already corrupted before getting to Pub/Sub, so I suggest you check your messages once they arrive in Pub/Sub and see whether they have the correct format at that point.
Please note that the message schema is required to match the BigQuery table schema.
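One way to check that is to attach a separate debug subscription to the same topic and dump the payloads exactly as Pub/Sub delivers them. Below is a minimal Java sketch of such an inspector; the project and subscription names are placeholders:

import com.google.cloud.pubsub.v1.AckReplyConsumer;
import com.google.cloud.pubsub.v1.MessageReceiver;
import com.google.cloud.pubsub.v1.Subscriber;
import com.google.pubsub.v1.ProjectSubscriptionName;
import com.google.pubsub.v1.PubsubMessage;

public class InspectMessages {
    public static void main(String[] args) throws Exception {
        // Placeholders: attach "debug-sub" to the same topic the API
        // publishes to, so the production pipeline is unaffected.
        ProjectSubscriptionName sub =
                ProjectSubscriptionName.of("my-project", "debug-sub");

        MessageReceiver receiver = (PubsubMessage message, AckReplyConsumer consumer) -> {
            // Print the payload exactly as Pub/Sub delivered it.
            System.out.println(message.getData().toStringUtf8());
            consumer.ack();
        };

        Subscriber subscriber = Subscriber.newBuilder(sub, receiver).build();
        subscriber.startAsync().awaitRunning();
        Thread.sleep(30_000); // listen for 30 seconds, then shut down
        subscriber.stopAsync().awaitTerminated();
    }
}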
I am trying to write a test case for my Spring Cloud Stream application. I am using the Confluent Schema Registry with Avro, so I need to decode the message after polling it from the channel. Here is my code:

processor.input().send(MessageBuilder.withPayload(inputData).build());
Message<?> message = messageCollector.forChannel(processor.output()).poll();
BinaryMessageDecoder<OutputData> decoder = OutputData.getDecoder();
OutputData outputObject = decoder.decode((byte[]) message.getPayload());
For some reason this code throws:

org.apache.avro.message.BadHeaderException: Unrecognized header bytes: 0x00 0x08

I am not sure whether this is some sort of bug I am facing or I am simply not following the proper way to decode a received Avro message. I suspect I need to set a header with something, but I am not quite sure how and with what exactly. I would appreciate it if someone could help me with this.
P.S.: I am using spring-cloud-stream-test-support for the purposes of this test.
The data won't be Avro-encoded when using the test binder; the test binder is very limited.
To properly test end-to-end with Avro, you should remove the test binder and use the real Kafka binder with an embedded Kafka broker.
One of the sample apps shows how to do it.
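As a rough skeleton of that setup (not the sample itself; it assumes spring-cloud-stream-test-support has been removed, spring-kafka-test is on the classpath, and the topic name is a placeholder):

import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.test.context.EmbeddedKafka;

// Point the real Kafka binder at the embedded broker that
// spring-kafka-test starts for the duration of the test.
@SpringBootTest(properties = {
        "spring.cloud.stream.kafka.binder.brokers=${spring.embedded.kafka.brokers}"
})
@EmbeddedKafka(partitions = 1, topics = "output-topic")
public class AvroRoundTripTest {

    @Test
    public void messageIsAvroEncodedOnTheWire() {
        // Send through the input binding, then consume "output-topic" with a
        // plain KafkaConsumer and the schema-registry Avro deserializer to
        // verify what is actually written to the broker.
    }
}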
It turns out that the issue was related to how I was trying to decode the Avro message. Using the official Avro libraries, the following code worked for me:

Decoder decoder = DecoderFactory.get().binaryDecoder((byte[]) message.getPayload(), null);
DatumReader<OutputData> reader = new SpecificDatumReader<>(OutputData.getClassSchema());
OutputData outputData = reader.read(null, decoder);
The echo Twimlet gives this error in the developer console when trying to save:
TypeError: $(...) is null
base.js:2:20968
TypeError: t is not a function
Ajax.Request<.initialize()
ext.js:1
t()
ext.js:1
Ext.lib.Ajax.request()
ext.js:4
.request()
ext.js:6
.save()
twimlets.js:1
Ext.Button<.onClick()
ext.js:13
E/a()
ext.js:5
n/a()
ext.js:3
ext.js:1:18861
Any ideas on what might be the problem? I've tried even a very simple echo twimlet, as well as trying to edit an existing twimlet.
There is currently an issue with editing saved Twimlets: an error occurs when you attempt to edit a saved Twimlet, but not when generating a new one.
We are moving towards TwiML Bins in the Twilio Console, where you can create request URLs containing TwiML instructions and save and edit them at any time.
It can be a bit of extra work to redo this from scratch, so if you are really hoping to stick with the Twimlet, alternatively, be sure to check your URL encoding.
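For example, when you build an echo Twimlet URL by hand, the TwiML passed in the Twiml query parameter must be fully URL-encoded. A quick Java sketch of that step:

import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class EchoTwimletUrl {
    public static void main(String[] args) {
        String twiml = "<Response><Say>Hello world</Say></Response>";
        // Every reserved character in the TwiML must be percent-encoded.
        String url = "http://twimlets.com/echo?Twiml="
                + URLEncoder.encode(twiml, StandardCharsets.UTF_8);
        System.out.println(url);
    }
}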
I have created an FMX Windows app that connects to a web server to obtain REST data. I have been using the RESTClient, RESTResponse, RESTRequest and RESTResponseDataSetAdapter components, connected to a ClientDataSet, and I have connected the datasets to string grids through LiveBindings. I have done this for two different string grids with no problems at all. Then I came to the very last request I want to make, and I am getting some very strange behaviour. I set everything up in a data module, executed the RESTRequest in the IDE, and got the content I expected in the RESTResponse. I then activated the RESTResponseDataSetAdapter and ClientDataSet; the ClientDataSet was populated, and I was able to add the field defs through the IDE by just going to Add Fields.
I have a timer set up on the app to update the string grids. It works fine for two of the string grids; however, on the last one, all I ever get in the string grid is the data that I originally fetched while in the IDE. I assumed this could be due to some caching on the ClientDataSet, so I put a memo on the form and, after each request execute, posted the response content to the memo. The bizarre thing is that I occasionally get the response the server is currently sending back (verified by going to the web server through Chrome), but sometimes the response content is the data that I originally requested when I set it up in the IDE. So I went back to the IDE and cleared the response data from the RESTResponse. I tried again and got the same behaviour: sometimes I get the expected result, and other times I get the response that I originally got in the IDE yesterday. I then thought perhaps the web server was sending it back, so I have run the same REST request against the web server directly and never get back the data that the RESTResponse is showing.
The code below fires on my timer. The top two blocks are working fine; the last one is the buggy one.
restDataModule.adapterOperators.ClearDataSet;
restDataModule.cdsOperators.Close;
restDataModule.responseOperators.Content.Empty;
restDataModule.reqOnlineOperators.ClearBody;
restDataModule.reqOnlineOperators.Execute;
restDataModule.cdsOperators.Open;

restDataModule.adapterStats.ClearDataSet;
restDataModule.cdsStats.Close;
restDataModule.responseOperatorStats.Content.Empty;
restDataModule.reqOperatorStats.ClearBody;
restDataModule.reqOperatorStats.Execute;
restDataModule.cdsStats.Open;

try
  restDataModule.adapterChats.ClearDataSet;
  restDataModule.cdsChats.Close;
  restDataModule.responseChats.Content.Empty;
  restDataModule.reqChats.ClearBody;
  restDataModule.reqChats.Execute;
  restDataModule.cdsChats.Open;
except
  on E: Exception do
    memo1.Lines.Add('Failed!'); // ignore
end;

memo1.Lines.Add(restDataModule.responseChats.Content);
Any suggestions welcome.
OK, the solution was to add a parameter to the RESTClient with the following settings:

Kind = pkHTTPHEADER
Name = Cache-Control
Value = no-cache

Simple but elusive.
I'm receiving messages from a JMS MQ queue which are supposedly UTF-8 encoded. However, on reading them out using

msgText = ((TextMessage) msg).getText();

I get question marks where non-standard characters were present. It seems possible to specify the encoding when using a BytesMessage, but I can't find a way to specify the encoding while reading out a TextMessage. Is there a way to solve this, or should I press for BytesMessages?
We tried adding -Dfile.encoding=UTF-8 to WebSphere's JVM, and we added

source = new StreamSource(new ByteArrayInputStream(
        ((TextMessage) msg).getText().getBytes("UTF-8")));

in our MessageListener. This worked for us, so we then took the -Dfile.encoding bit out again, and it still works.
Since we prefer a minimal configuration for WebSphere, we decided to leave it this way, also taking into account that we can more easily switch the UTF-8 string to a setting from a file or database.
If the text is not decoded correctly, then the client is probably not sending the message with the UTF-8 codec; this should work:

byte[] by = ((TextMessage) msg).getText().getBytes("ISO-8859-1");
String text = new String(by, "UTF-8");
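And if you do press for a BytesMessage instead, the decoding is then entirely under your control. A minimal sketch (the helper class is made up):

import java.nio.charset.StandardCharsets;
import javax.jms.BytesMessage;
import javax.jms.JMSException;
import javax.jms.Message;

public class Utf8MessageReader {
    // Read the whole body of a BytesMessage and decode it explicitly as UTF-8.
    public static String readUtf8(Message msg) throws JMSException {
        BytesMessage bytesMsg = (BytesMessage) msg;
        byte[] buf = new byte[(int) bytesMsg.getBodyLength()];
        bytesMsg.readBytes(buf);
        return new String(buf, StandardCharsets.UTF_8);
    }
}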