Is it possible to convert an Avro message to a POJO, similar to application/json to POJO conversion?

In my PoC, I am using Spring Cloud Stream connected to the Confluent schema registry. I can see schemas registered in the schema registry, so I don't want to create my own Avro schema. I was able to get the payload converted to a POJO using application/json. Now I am trying to do the same with Avro-to-POJO conversion, but it doesn't seem to be working.
I am setting the content type as "contentType: application/*+avro".
Any sample application would be helpful.

Related

Avro fingerprint schema

I am using Confluent Kafka in C# to consume messages. These messages are formatted as hexadecimal strings, from which the schema fingerprint is extracted. How do I get the schema from the schema fingerprint in C#? Am I missing something?
I'm a bit late, but wanted to answer your question because I also could not find a definitive answer on this topic. This is what I found out:
The Confluent Schema Registry uses IDs to identify schemas.
When an event is serialized with the Confluent Schema Registry (see e.g. here for the Java implementation for Avro), the schema ID is embedded into the data: <Magic Byte><Schema ID><Payload>
Other serialization mechanisms that embed schema fingerprints into the data, such as Avro single-object encoding, are not supported by the schema registry. See this discussion on GitHub.
If you can only use encodings that carry a schema fingerprint (i.e. where the schema registry is not available at serialization time), the best option is to maintain a cache on the consumer side that maps fingerprint -> schema. This way the consumer can retrieve the fingerprint from the encoded event, look up the schema in the cache, and then decode the event using the retrieved schema.
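The consumer-side cache described above can be sketched in Java. This is an illustrative sketch, not real registry-client code: the class name and the choice of a long fingerprint key are assumptions, and the schema is held as raw JSON text rather than a parsed org.apache.avro.Schema. The helper at the bottom shows, for contrast, how the Confluent wire format carries a registry ID instead of a fingerprint.

```java
import java.nio.ByteBuffer;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch: maps a schema fingerprint to its schema text so a
// consumer can decode events when no schema registry is reachable.
public class FingerprintSchemaCache {

    // fingerprint -> schema definition (a real implementation would hold
    // parsed org.apache.avro.Schema instances instead of raw JSON strings)
    private final Map<Long, String> cache = new ConcurrentHashMap<>();

    public void register(long fingerprint, String schemaJson) {
        cache.put(fingerprint, schemaJson);
    }

    public String lookup(long fingerprint) {
        String schema = cache.get(fingerprint);
        if (schema == null) {
            throw new IllegalStateException("Unknown schema fingerprint: " + fingerprint);
        }
        return schema;
    }

    // For contrast: the Confluent wire format embeds a registry ID, not a
    // fingerprint: <magic byte 0x00><4-byte big-endian schema ID><payload>.
    public static int confluentSchemaId(byte[] frame) {
        ByteBuffer buf = ByteBuffer.wrap(frame);
        if (buf.get() != 0x00) {
            throw new IllegalArgumentException("Not a Confluent-framed message");
        }
        return buf.getInt(); // the ID used to fetch the schema from the registry
    }
}
```

With the Confluent framing, the consumer resolves the ID against the registry; with fingerprint-based encodings it would resolve against a cache like this instead.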
The short answer: you can't recover the schema from the fingerprint.
The long one: a fingerprint is a hash of your schema (see the Avro fingerprint documentation). It's not designed to store the schema.
A fingerprint lets you identify which schema was used to encode a message. Some schema registries use it this way, storing all managed schemas in a map <Fingerprint, Schema>.

Is there a way to use Swagger just for validation without using the whole framework?

Suppose I have an existing Java service implementing a JSON HTTP API, and I want to add a Swagger schema and automatically validate requests and responses against it without retooling the service to use the Swagger framework / code generation. Is there anything providing a Java API that I can tie into and pass info about the requests / responses to validate?
(Just using a JSON schema validator would mean manually implementing a lot of the additional features in Swagger.)
I don't think there's anything ready-made to do this alone, but you can do it fairly easily as follows:
Grab the SchemaValidator from the Swagger Inflector project. You can use this to validate inbound and outbound payloads.
Assign a schema portion to your request/response definitions. That means you'll need to assign a specific section of the JSON schema to your operations.
Create a filter for your API to grab the payloads and run them through the schema validator.
That will let you easily see whether the payloads match the expected structure.
Of course, this is all done for you automatically with Inflector, but there should be enough of the raw components to help you do this inside your own implementation.
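As a rough sketch of the filter step: the class below is hypothetical, and the actual validator is abstracted behind a BiPredicate so that Inflector's SchemaValidator (or any JSON-schema validator) can be wired in without depending on its exact API here.

```java
import java.util.function.BiPredicate;

// Hypothetical skeleton of the filter idea: intercept a payload, validate it
// against the schema fragment assigned to the operation, reject on failure.
public class PayloadValidationGate {

    // Stand-in for the real validator (e.g. Inflector's SchemaValidator):
    // receives the payload and the JSON-schema fragment for the operation.
    private final BiPredicate<String, String> validator;

    public PayloadValidationGate(BiPredicate<String, String> validator) {
        this.validator = validator;
    }

    // Returns the payload unchanged if it validates; a real servlet filter
    // would do this around the request/response and answer with a 400.
    public String check(String payload, String schemaFragment) {
        if (!validator.test(payload, schemaFragment)) {
            throw new IllegalArgumentException("Payload does not match schema");
        }
        return payload;
    }
}
```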

Posting form-data and binary data through AWS API Gateway

I'm trying to POST "multipart/form-data" to my EC2 instance through AWS API Gateway, but I couldn't find a way to do this. There is a way to post data using "application/x-www-form-urlencoded" and a Mapping Template to convert it to JSON, but posting binary data like an image file still seems to be missing. Is there anything I'm missing?
EDIT:
I have found another way:
I convert the image to a base64 string, then POST it with content type "application/x-www-form-urlencoded". This way I'm sending the whole image as a string. After I receive the message I can convert it back to an image in PHP. The only downside I could find is that the base64 encoding makes the payload about a third larger. Other than that, I couldn't find any downsides. If there are any, could you please share them with me?
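The round trip described above can be sketched with the JDK's built-in Base64 codec (the PHP side would use base64_decode); the class and method names here are just for illustration. Base64 emits 4 output characters for every 3 input bytes, which is where the roughly one-third size increase comes from.

```java
import java.util.Base64;

// Sketch of the workaround: send binary data through the gateway as a
// base64 string, then decode it back to bytes on the receiving side.
public class Base64Bridge {

    // Encode image bytes to a text payload safe for form-urlencoded bodies.
    public static String encode(byte[] imageBytes) {
        return Base64.getEncoder().encodeToString(imageBytes);
    }

    // Reverse step (PHP's base64_decode does the same on the server).
    public static byte[] decode(String base64Payload) {
        return Base64.getDecoder().decode(base64Payload);
    }
}
```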
API Gateway team here.
Binary data isn't supported at the moment, but it's on our backlog. Several customers have requested this.
Some customers have had success using the base64 util in the mapping templates which may get it working for you: http://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-mapping-template-reference.html#util-template-reference
Other than that you'll have to wait for official support.
Edit
Binary support is finally here!!

Export Breeze entities on the server side to JSON

I am looking for a way to export breeze entities on the server side to a json string which a breezejs manager can import from the client side. I looked all over the breeze APIs (both public and internal source code) but I couldn't find an obvious way of achieving this. There is a possibility of getting the desired results by using BreezeSharp (a .NET breeze client) on the server side but I would like to see if this is achievable with using the breeze server APIs only.
First you need to determine the shape of the bundle to be imported, i.e. something that manager.importEntities will understand. I don't think the format is documented, but you can reverse-engineer it by using:
var exported = manager.exportEntities(['Customer', 'Product'], {asString:true, includeMetadata:false});
Then pretty-print the value of exported to see the data format. See EntityManager.exportEntities for more info.
Once you have that, you can re-create it on the server. In C#, you can build it up using Dictionary and List objects, then serialize it using Json.NET.
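As a sketch of that approach (in Java rather than C#, and with placeholder key and property names throughout; inspect your own exportEntities output for the real bundle shape):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch: assemble a bundle of nested maps/lists mirroring
// whatever shape manager.exportEntities produced, then serialize it.
// All key and property names below are placeholders.
public class BundleBuilder {

    public static Map<String, Object> sampleBundle() {
        Map<String, Object> entity = new LinkedHashMap<>();
        entity.put("CustomerID", 1);        // placeholder property
        entity.put("CompanyName", "Acme");  // placeholder property

        Map<String, Object> bundle = new LinkedHashMap<>();
        bundle.put("entityGroupMap",        // placeholder key
                Map.of("Customer", List.of(entity)));
        return bundle;
    }

    // Toy serializer for the sketch; use Jackson (or Json.NET in C#) in practice.
    public static String toJson(Object value) {
        if (value instanceof Map<?, ?> map) {
            StringBuilder sb = new StringBuilder("{");
            for (Map.Entry<?, ?> e : map.entrySet()) {
                if (sb.length() > 1) sb.append(",");
                sb.append("\"").append(e.getKey()).append("\":").append(toJson(e.getValue()));
            }
            return sb.append("}").toString();
        }
        if (value instanceof List<?> list) {
            StringBuilder sb = new StringBuilder("[");
            for (Object item : list) {
                if (sb.length() > 1) sb.append(",");
                sb.append(toJson(item));
            }
            return sb.append("]").toString();
        }
        if (value instanceof String s) return "\"" + s + "\"";
        return String.valueOf(value); // numbers, booleans, null
    }
}
```

The point is only the overall pattern: build the nested dictionary/list structure matching the reverse-engineered export format, then hand it to a real JSON serializer.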
An alternative approach would be to have your webhook just tell the client to initiate a query to retrieve the data from the server.

MQTT source Spring XD

Using the MQTT source module for Spring XD, I am getting wrong values in the payload. I have the stream subscribed to a certain topic, and a normal Paho client in Eclipse also subscribed to the same topic. The payload is supposed to be an array of bytes. For the same message I am getting
xxxxx...0000073F on Spring XD
xxxxx...000007F9 on the Eclipse Paho client
In reality this value is supposed to be a counter. The Eclipse Paho client behaves perfectly, but Spring XD exhibits weird behavior whenever any of the hexadecimal digits reaches F: it gets stuck on the same number until the counter has increased enough to get rid of any F in the sequence.
My question is whether there is any pre-processing happening in the MQTT client provided with Spring XD that explains why I am getting different values in the payload. I am sure the second value is the correct one, since I am the one sending the values.
Thanks.
Spring XD uses Spring Integration which uses the Paho client under the covers.
Unfortunately, it converts the payload to String (UTF-8 by default), which produces results like this with data that is not valid UTF-8.
The adapter can be configured to pass the payload as binary but, unfortunately, that option is not currently exposed in XD.
The workaround is to create a subclass of DefaultPahoMessageConverter and override mqttBytesToPayload (the class name here is your choice):

public class BinaryPahoMessageConverter extends DefaultPahoMessageConverter {

    @Override
    protected Object mqttBytesToPayload(MqttMessage mqttMessage) throws Exception {
        return mqttMessage.getPayload();
    }
}
Put the converter in a jar in the module's lib directory, and update the mqtt.xml to pass an instance of the converter in the converter attribute.
I will open a JIRA issue to make binary a standard option of the module.
