Save Avro Schema to Confluent Schema Registry

I am trying to write a very simple schema to the registry using Postman, and have been having a very hard time getting it to register. Is it really this complicated just to register a simple schema, which is only the first step in this whole process, or am I missing something here? The schema I am using is below:
{
  "schema" : {
    "type" : "record",
    "name" : "User",
    "namespace" : "com.temp.avro.model",
    "fields" : [
      {
        "name" : "_id",
        "type" : "string"
      },
      {
        "name" : "updatedDate",
        "type" : "long",
        "logicalType" : "timestamp-millis"
      },
      {
        "name" : "createdDate",
        "type" : "long",
        "logicalType" : "timestamp-millis"
      },
      {
        "name" : "applicationId",
        "type" : [ "null", "string" ],
        "default" : null
      },
      {
        "name" : "country",
        "type" : "string"
      },
      {
        "name" : "bank",
        "type" : "string"
      }
    ]
  }
}
I am getting the following error:
Internal Server Error
com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of java.lang.String out of START_OBJECT token
 at [Source: org.glassfish.jersey.message.internal.ReaderInterceptorExecutor$UnCloseableInputStream@774da834; line: 2, column: 1] (through reference chain: io.confluent.kafka.schemaregistry.client.rest.entities.requests.RegisterSchemaRequest["schema"])
	at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
	at com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:835)
	at com.fasterxml.jackson.databind.deser.std.StringDeserializer.deserialize(StringDeserializer.java:59)
	at com.fasterxml.jackson.databind.deser.std.StringDeserializer.deserialize(StringDeserializer.java:12)
	at com.fasterxml.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:523)
	at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:95)
	at com.fasterxml.jackson.databind.deser.impl.BeanPropertyMap.findDeserializeAndSet(BeanPropertyMap.java:285)
	at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:248)
	at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:136)
	at com.fasterxml.jackson.databind.ObjectReader._bind(ObjectReader.java:1410)
	at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:860)
	at com.fasterxml.jackson.jaxrs.base.ProviderBase.readFrom(ProviderBase.java:810)
	at io.confluent.rest.validation.JacksonMessageBodyProvider.readFrom(JacksonMessageBodyProvider.java:65)
	... (remaining Jersey/Jetty frames elided)

The value of the schema field must be a string, not a JSON object. Stringify your Avro schema (JSON.stringify in JavaScript, or escape it by hand), so that the JSON payload you send to the Schema Registry is the following:
{
"schema": "{\"type\":\"record\",\"name\":\"User\",\"namespace\":\"com.temp.avro.model\",\"fields\":[{\"name\":\"_id\",\"type\":\"string\"},{\"name\":\"updatedDate\",\"type\":\"long\",\"logicalType\":\"timestamp-millis\"},{\"name\":\"createdDate\",\"type\":\"long\",\"logicalType\":\"timestamp-millis\"},{\"name\":\"applicationId\",\"type\":[\"null\",\"string\"],\"default\":null},{\"name\":\"country\",\"type\":\"string\"},{\"name\":\"bank\",\"type\":\"string\"}]}"
}
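When posting this from Postman, send it to POST /subjects/<subject>/versions on the registry, with the Content-Type header set to application/vnd.schemaregistry.v1+json.
Alternatively, you can avoid hand-escaping entirely by registering through the Java client. A minimal sketch, assuming the pre-5.5 schema-registry-client API, a registry at http://localhost:8081, and the subject name User-value (all three are assumptions, not from the question):
import java.io.File;
import org.apache.avro.Schema;
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;

public class RegisterUserSchema {
    public static void main(String[] args) throws Exception {
        // Parse the plain (un-escaped) .avsc file from disk.
        Schema schema = new Schema.Parser().parse(new File("User.avsc"));
        // 100 is the capacity of the client's local schema cache.
        CachedSchemaRegistryClient client =
                new CachedSchemaRegistryClient("http://localhost:8081", 100);
        // register() escapes and posts the schema, returning its global id.
        int id = client.register("User-value", schema);
        System.out.println("Registered schema id: " + id);
    }
}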

Related

Avro Schema Evolution with Enum – Deserialization Crashes

I defined two versions of a record in two separate .avsc schema files, using the namespace to distinguish the versions.
SimpleV1.avsc
{
  "type" : "record",
  "name" : "Simple",
  "namespace" : "test.simple.v1",
  "fields" : [
    {
      "name" : "name",
      "type" : "string"
    },
    {
      "name" : "status",
      "type" : {
        "type" : "enum",
        "name" : "Status",
        "symbols" : [ "ON", "OFF" ]
      },
      "default" : "ON"
    }
  ]
}
Example JSON
{"name":"A","status":"ON"}
Version 2 just adds a description field with a default value.
SimpleV2.avsc
{
  "type" : "record",
  "name" : "Simple",
  "namespace" : "test.simple.v2",
  "fields" : [
    {
      "name" : "name",
      "type" : "string"
    },
    {
      "name" : "description",
      "type" : "string",
      "default" : ""
    },
    {
      "name" : "status",
      "type" : {
        "type" : "enum",
        "name" : "Status",
        "symbols" : [ "ON", "OFF" ]
      },
      "default" : "ON"
    }
  ]
}
Example JSON
{"name":"B","description":"b","status":"ON"}
Both schemas were compiled to Java classes.
In my example I wanted to test backward compatibility: a record written with V1 should be readable by a reader using V2, and default values should be filled in for missing fields. This works as long as I do not use enums.
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.SchemaCompatibility;
import org.apache.avro.SchemaCompatibility.SchemaCompatibilityType;
import org.apache.avro.SchemaCompatibility.SchemaPairCompatibility;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.Encoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.avro.specific.SpecificDatumWriter;
import org.apache.avro.specific.SpecificRecord;
import org.junit.Assert;

public class EnumEvolutionExample {

    public static void main(String[] args) throws IOException {
        Schema schemaV1 = new org.apache.avro.Schema.Parser().parse(new File("./src/main/resources/SimpleV1.avsc"));
        // works as well:
        // Schema schemaV1 = test.simple.v1.Simple.getClassSchema();
        Schema schemaV2 = new org.apache.avro.Schema.Parser().parse(new File("./src/main/resources/SimpleV2.avsc"));

        test.simple.v1.Simple simpleV1 = test.simple.v1.Simple.newBuilder()
                .setName("A")
                .setStatus(test.simple.v1.Status.ON)
                .build();

        SchemaPairCompatibility schemaCompatibility = SchemaCompatibility.checkReaderWriterCompatibility(
                schemaV2,
                schemaV1);
        // Checks that writing with the v1 schema and reading with v2 is compatible
        Assert.assertEquals(SchemaCompatibilityType.COMPATIBLE, schemaCompatibility.getType());

        byte[] binaryV1 = serealizeBinary(simpleV1);
        // Crashes with: AvroTypeException: Found test.simple.v1.Status, expecting test.simple.v2.Status
        test.simple.v2.Simple v2 = deSerealizeBinary(binaryV1, new test.simple.v2.Simple(), schemaV1);
    }

    public static byte[] serealizeBinary(SpecificRecord record) {
        DatumWriter<SpecificRecord> writer = new SpecificDatumWriter<>(record.getSchema());
        byte[] data = new byte[0];
        ByteArrayOutputStream stream = new ByteArrayOutputStream();
        Encoder binaryEncoder = EncoderFactory.get().binaryEncoder(stream, null);
        try {
            writer.write(record, binaryEncoder);
            binaryEncoder.flush();
            data = stream.toByteArray();
        } catch (IOException e) {
            System.out.println("Serialization error: " + e.getMessage());
        }
        return data;
    }

    public static <T extends SpecificRecord> T deSerealizeBinary(byte[] data, T reuse, Schema writer) {
        Decoder decoder = DecoderFactory.get().binaryDecoder(data, null);
        // Writer schema first, reader schema (taken from the reuse instance) second
        DatumReader<T> datumReader = new SpecificDatumReader<>(writer, reuse.getSchema());
        try {
            return datumReader.read(null, decoder);
        } catch (IOException e) {
            System.out.println("Deserialization error: " + e.getMessage());
        }
        return null;
    }
}
The checkReaderWriterCompatibility method confirms that schemas are compatible.
But when I deserialize, I get the following exception:
Exception in thread "main" org.apache.avro.AvroTypeException: Found test.simple.v1.Status, expecting test.simple.v2.Status
at org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:309)
at org.apache.avro.io.parsing.Parser.advance(Parser.java:86)
at org.apache.avro.io.ResolvingDecoder.readEnum(ResolvingDecoder.java:260)
at org.apache.avro.generic.GenericDatumReader.readEnum(GenericDatumReader.java:267)
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:181)
at org.apache.avro.specific.SpecificDatumReader.readField(SpecificDatumReader.java:136)
at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:247)
at org.apache.avro.specific.SpecificDatumReader.readRecord(SpecificDatumReader.java:123)
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:179)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:160)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:153)
at test.EnumEvolutionExample.deSerealizeBinary(EnumEvolutionExample.java:70)
at test.EnumEvolutionExample.main(EnumEvolutionExample.java:45)
I don't understand why Avro thinks it got a v1.Status; namespaces are not part of the binary encoding.
Is this a bug, or does anyone have an idea how to get this running?
Try adding an aliases attribute. During schema resolution Avro matches named types (records, enums, fixed) by full name, so the reader's test.simple.v2.Status does not match the writer's test.simple.v1.Status; an alias on the reader's enum tells Avro to accept the old name.
For example:
v1
{
  "type" : "record",
  "name" : "Simple",
  "namespace" : "test.simple.v1",
  "fields" : [
    {
      "name" : "name",
      "type" : "string"
    },
    {
      "name" : "status",
      "type" : {
        "type" : "enum",
        "name" : "Status",
        "symbols" : [ "ON", "OFF" ]
      },
      "default" : "ON"
    }
  ]
}
v2
{
  "type" : "record",
  "name" : "Simple",
  "namespace" : "test.simple.v2",
  "fields" : [
    {
      "name" : "name",
      "type" : "string"
    },
    {
      "name" : "description",
      "type" : "string",
      "default" : ""
    },
    {
      "name" : "status",
      "type" : {
        "type" : "enum",
        "name" : "Status",
        "aliases" : [ "test.simple.v1.Status" ],
        "symbols" : [ "ON", "OFF" ]
      },
      "default" : "ON"
    }
  ]
}
Found a workaround: I moved the enum to an "unversioned" namespace, so it is the same in both versions.
It still looks like a bug to me, though: converting a record is not an issue, but an enum is, even though both are named complex types in Avro.
{
  "type" : "record",
  "name" : "Simple",
  "namespace" : "test.simple.v1",
  "fields" : [
    {
      "name" : "name",
      "type" : "string"
    },
    {
      "name" : "status",
      "type" : {
        "type" : "enum",
        "name" : "Status",
        "namespace" : "test.model.unversioned",
        "symbols" : [ "ON", "OFF" ]
      },
      "default" : "ON"
    }
  ]
}

Using aws-api-gateway models, how can I require an object contains AT LEAST 1 valid key

I am using API gateway's request validation. Here is my model so far:
{
  "type" : "object",
  "required" : [ "dc", "uid", "data" ],
  "properties" : {
    "dc" : {
      "type" : "string"
    },
    "uid" : {
      "type" : "string"
    },
    "data" : {
      "type" : "object"
    }
  },
  "title" : "MyApi"
}
Similar to the required keyword, I want to ensure that the data object has at least one key from a list I define ([a, b, c, whatever]).
If this is not possible, is there a way to at least prevent a null value from being sent? I tried "nullable": false but AWS said that was an invalid model schema.
Setting minProperties to 1 may work. See https://swagger.io/docs/specification/data-models/data-types/
Can (should) data be strongly typed? If so, you could use composition, inheritance, and polymorphism to indicate that data conforms to one of a set of defined schemas. See https://swagger.io/docs/specification/data-models/inheritance-and-polymorphism/
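API Gateway models use JSON Schema draft-04, which has keywords for both ideas. A sketch combining them, using the keys a, b, c from the question: minProperties rejects an empty data object, and anyOf with required demands at least one key from the allowed list (worth verifying against API Gateway's validator, which implements a subset of draft-04):
{
  "type" : "object",
  "required" : [ "dc", "uid", "data" ],
  "properties" : {
    "dc" : { "type" : "string" },
    "uid" : { "type" : "string" },
    "data" : {
      "type" : "object",
      "minProperties" : 1,
      "anyOf" : [
        { "required" : [ "a" ] },
        { "required" : [ "b" ] },
        { "required" : [ "c" ] }
      ]
    }
  },
  "title" : "MyApi"
}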

Extracting a specific part from JSON

Hello, I am trying to extract only the id value from a JSON response. However, if I
print(response!["id"])
the output is result = "not created". response is already parsed as JSON.
Is there something I am doing wrong?
Update 1
[
  {
    "user" : {
      "last_name" : "test",
      "email" : "test@test.com"
    },
    "id" : 902,
    "scale" : 7,
    "created_at" : "2018-02-24 06:45:33"
  },
  {
    "user" : {
      "last_name" : "test",
      "email" : "test@test.com"
    },
    "id" : 903,
    "scale" : 7,
    "created_at" : "2018-02-24 06:45:33"
  },
  {
    "user" : {
      "last_name" : "test",
      "email" : "test@test.com"
    },
    "id" : 904,
    "scale" : 7,
    "created_at" : "2018-02-24 06:45:33"
  }
]
The response is an array, so the way to access the id of any object is by index:
let id = response[index]["id"]
// Here index is the position of the object you want to access.
I am assuming you want to extract the ids from this JSON array. You can do that with
response.map({ $0["id"] })
which will give you another array containing only the ids.
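If response comes straight from JSONSerialization, it first needs a cast to an array of dictionaries before either approach compiles. A minimal Swift sketch, assuming Swift 4.1+ and that data holds the raw response body (both assumptions, not from the question):
import Foundation

// Parse the raw body and cast it to an array of JSON objects.
if let array = (try? JSONSerialization.jsonObject(with: data)) as? [[String: Any]] {
    // compactMap drops any entry whose "id" is missing or not an Int.
    let ids = array.compactMap { $0["id"] as? Int }
    print(ids) // [902, 903, 904]
}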

Ruby on Rails MongoDB existing dataset

I have a RoR application which will read info from a MongoDB collection.
I have my model set as:
class Vulnerability
  include Mongoid::Document
  field :id, type: String
  field :description, type: String
  field :type, type: String
end
The first record in the collection is formatted like this:
> db.vulnerabilities.findOne()
{
  "_id" : "NGINX:CVE-2009-3896",
  "_index" : "bulletins",
  "_type" : "bulletin",
  "_score" : null,
  "_source" : {
    "lastseen" : "2016-09-26T17:22:32",
    "references" : [ ],
    "edition" : 1,
    "description" : "Null pointer dereference vulnerability\nSeverity: major\nCVE-2009-3896\nNot vulnerable: 0.8.14+, 0.7.62+, 0.6.39+, 0.5.38+\nVulnerable: 0.1.0-0.8.13",
    "reporter" : "Nginx",
    "published" : "2009-11-24T12:30:00",
    "type" : "nginx",
    "title" : "Null pointer dereference vulnerability",
    "bulletinFamily" : "software",
    "affectedSoftware" : [
      {
        "name" : "nginx",
        "version" : "0.8.13",
        "operator" : "le"
      }
    ],
    "cvelist" : [
      "CVE-2009-3896"
    ],
    "modified" : "2009-11-24T12:30:00",
    "id" : "NGINX:CVE-2009-3896",
    "href" : "http://nginx.org/en/security_advisories.html",
    "cvss" : {
      "score" : 5,
      "vector" : "AV:NETWORK/AC:LOW/Au:NONE/C:NONE/I:NONE/A:PARTIAL/"
    }
  },
  "sort" : [
    38985
  ]
}
I want to pull the id, the type, and the description. When trying to view the index view I get:
NameError at /vulnerabilities: uninitialized constant Bulletin
I believe the issue is with the type in the document: there is a top-level _type key, but I want the type from inside _source, so in the example above it should display nginx, not bulletin.
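For what it's worth, that NameError matches how Mongoid treats _type: it reserves the key as its single-collection-inheritance discriminator and tries to resolve the stored value to a model class (here Bulletin), which doesn't exist. A sketch of one way to map the document, assuming Mongoid 7.3+ for discriminator_key; the reader method names vuln_id and vuln_type are made up for illustration:
class Vulnerability
  include Mongoid::Document
  store_in collection: "vulnerabilities"

  # Mongoid reserves _type for inheritance; point the discriminator at a
  # key that does not occur in the data (requires Mongoid 7.3+).
  self.discriminator_key = "kind"

  field :_source, type: Hash

  # _id is exposed by Mongoid automatically as the document id.
  def vuln_id
    _source && _source["id"]
  end

  def vuln_type
    _source && _source["type"]
  end

  def description
    _source && _source["description"]
  end
end
On older Mongoid versions, the usual workaround is to rename or strip the _type key in the stored documents.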

Spring Data Neo4j corrupted json

I'm using Neo4j with Spring Data. When I store an object with a relation inside it, findAll returns corrupted JSON; I never get this error when I query the objects one at a time.
Even stranger, the first object in the list is correct, but the second has the error, at the edge field. Any idea?
[{
  "uuid" : "e5c90af5-6259-4ddf-ae1f-c0cff5a41296",
  "name" : "test",
  "createdBy" : {
    "uuid" : "319535cc-288f-4a23-bc02-a3b01bf6e93f",
    "createdAt" : "2017-03-10T02:06:55.925+0000",
    "user" : {
      "uuid" : "9e91032e-a54d-4297-8a6a-1506589b7529",
    },
    "edge" : {
      "id" : 6514
    },
    "graphId" : 664
  }
},
{
  "uuid" : "e5c90af5-6259-4ddf-ae1f-c0cff5a41296",
  "name" : "test",
  "createdBy" : {
    "uuid" : "319535cc-288f-4a23-bc02-a3b01bf6e93f",
    "createdAt" : "2017-03-10T02:06:55.925+0000",
    "user" : {
      "uuid" : "9e91032e-a54d-4297-8a6a-1506589b7529",
    },
    "edge" : {
      : 6514
    },
    "graphId" : 664
  }
}]
