json schema validation not enforcing type - ruby-on-rails

I have a column that stores JSON. I am trying to make sure that only an array of objects can be stored in this column, as described in the JSON schema below. The schema mostly works, except that I am able to save the attribute show as a string when it should be forced to be a boolean. For example, [{"name"=>"primary_phone", "show"=>"some text"}] saves successfully, but it shouldn't. How do I enforce that show must be a boolean?
{
  "type": "array",
  "items": {
    "definitions": {
      "name": { "type": "string" },
      "show": { "type": "boolean" }
    },
    "required": ["name", "show"]
  }
}

Your schema under "items" is invalid: "definitions" only declares reusable subschemas and does not validate anything on its own. Perhaps you meant "properties" instead of "definitions"?
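A corrected version of the schema might look like this ("type": "object" is added to the items subschema, and "additionalProperties": false is an optional extra guard against stray keys):

```json
{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "name": { "type": "string" },
      "show": { "type": "boolean" }
    },
    "required": ["name", "show"],
    "additionalProperties": false
  }
}
```

With this schema, [{"name"=>"primary_phone", "show"=>"some text"}] is rejected because "some text" is not a boolean.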

Related

Schema Validation of Rest API Serenity BDD

I am facing a strange issue. I extracted a schema from an API response and added the JSON file to my Serenity project. While validating the schema, whatever schema I provided, the test passed; however, if I changed the data type of a key in an otherwise correct schema (e.g. changed the name field's type from string to integer), the test failed.
Scenario:
My API response:
{
  "name": "Alex",
  "age": 20,
  "city": "New York"
}
My schema for this API (the test passed, which is expected):
{
  "type": "object",
  "properties": {
    "name": { "type": "string" },
    "age": { "type": "integer" },
    "city": { "type": "string" }
  },
  "required": ["name", "age", "city"]
}
If I change the schema from correct to wrong, that is, remove a key/value pair, the test still passes, which is not correct:
{
  "type": "object",
  "properties": {
    "name": { "type": "string" },
    "city": { "type": "string" }
  },
  "required": ["name", "city"]
}
Moreover, if I write only {} in the schema file, the test passes.
The method I am using for validation is matchesJsonSchemaInClasspath.
Schema validation only checks that the data types of the values coming in the JSON response match the schema. For validating the actual data there is another method in Serenity BDD, VerifyResponseData.ofTheresponse(jsonobj).
This works for me
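For completeness: the "passes with almost any schema" behavior is standard JSON Schema semantics rather than a Serenity bug. An empty schema {} places no constraints at all, and properties not listed in the schema are ignored by default, so removing a key from the schema cannot make validation fail. To make the schema reject responses with undeclared keys, "additionalProperties": false can be added (a sketch of the stricter schema):

```json
{
  "type": "object",
  "properties": {
    "name": { "type": "string" },
    "age": { "type": "integer" },
    "city": { "type": "string" }
  },
  "required": ["name", "age", "city"],
  "additionalProperties": false
}
```

With this version, a response containing an extra or renamed key fails validation instead of silently passing.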

Recursive avro schema type in schema registry

I want to create an Avro schema for schema registry for the following Typescript code:
export type Value = {
  [key: string]: Value | Value[] | string | number;
};
It's a recursive map type. I know it is possible to create a recursive record like below, but it's a different use case.
export type Node = {
  value: number;
  leafs: Node[];
}
I tried different approaches, including named types and schema references, but all resulted in validation errors when publishing a schema.
A simplified schema (excluding the recursive array) that is desired but invalid looks like this:
{
  "type": "record",
  "name": "Value",
  "namespace": "com.namespace",
  "fields": [
    { "name": "itemValues", "type": { "type": "map", "values": ["string", "int", "itemValues"] } }
  ]
}
Most variations of this schema result in an error: org.apache.avro.SchemaParseException: Undefined name: "itemValues"
I could not find examples of similar scenarios and wondered whether it is even possible to create one like this. The most likely limitation is the lack of named union and map types in Avro.
Update
An example JSON that I want to achieve:
{
  "itemValues": {
    "validA": "sth",
    "validB": [],
    "validC": 8,
    "recursiveProperty": {
      "anyMap": { "sth": "else" }
    }
  }
}
The problem with your schema is that your values list is ["string", "int", "itemValues"], but the parser is complaining because you have told it there should be some type itemValues and you haven't defined one. The only type you have defined is Value.
Here's the fixed schema (including the array of Value as one of the potential types):
{
  "type": "record",
  "name": "Value",
  "namespace": "com.namespace",
  "fields": [
    { "name": "itemValues", "type": { "type": "map", "values": ["string", "int", "Value", { "type": "array", "items": "Value" }] } }
  ]
}

Avro schema - map type as optional field

How do I make the array-of-map field in an Avro schema optional? The schema below works; however, if this field is missing from the data, parsing fails with org.apache.avro.AvroTypeException: Error converting field - quantities, caused by org.apache.avro.AvroTypeException: Expected array-start. Got VALUE_NULL.
I just want to make sure the deserialization of the data goes through whether the field is present in the data or not.
{
  "name": "quantities",
  "type": {
    "type": "array",
    "items": {
      "type": "map",
      "values": "string"
    }
  },
  "default": null
}
Just found a solution myself. This will make the array-of-map field optional in the Avro schema:
{
  "name": "quantities",
  "type": [
    "null",
    {
      "type": "array",
      "items": {
        "type": "map",
        "values": "string"
      }
    }
  ],
  "default": null
}
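One point worth noting if you test with Avro's JSON encoding: values of a union are wrapped in an object keyed by the branch name, while null is written plainly. As a sketch (the map keys "sku" and "qty" are hypothetical example data), a record with and without the field would look like:

```json
{ "quantities": null }
```

```json
{ "quantities": { "array": [{ "sku": "A1", "qty": "3" }] } }
```

In the binary encoding no such wrapping exists; the union branch is selected by index.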

Writing an array of multiple different Records to Avro format, into the same file

We have some legacy file format, which I would need to migrate to Avro storage. The tricky part is that the records basically have
- some common fields,
- a discriminator field, and
- some unique fields, specific to the type selected by the discriminator field,
all of them stored in the same file, without any order, fully mixed with each other. (It's legacy...)
In Java/object-oriented programming, one could represent our records concept as the following:
abstract class RecordWithCommonFields {
    private Long commonField1;
    private String commonField2;
    ...
}

class RecordTypeA extends RecordWithCommonFields {
    private Integer integerSpecificToA1;
    private String stringSpecificToA1;
    ...
}

class RecordTypeB extends RecordWithCommonFields {
    private Boolean booleanSpecificToB1;
    private String stringSpecificToB1;
    ...
}
Imagine the data being something like this:
commonField1Value;commonField2Value,TYPE_IS_A,specificToA1Value,specificToA1Value
commonField1Value;commonField2Value,TYPE_IS_B,specificToB1Value,specificToB1Value
So I would like to process an incoming file and write its content to Avro format, somehow representing the different types of the records.
Can someone give me some ideas on how to achieve this?
Nandor from the Avro users mailing list was kind enough to help me out with this answer; credits go to him. This answer is here for the record, in case someone else hits the same issue.
His solution is simple, basically using composition rather than inheritance, by introducing a common container class and a field referencing a specific subclass.
With this approach the mapping looks like this:
{
  "namespace": "com.foobar",
  "name": "UnionRecords",
  "type": "array",
  "items": {
    "type": "record",
    "name": "RecordWithCommonFields",
    "fields": [
      { "name": "commonField1", "type": "string" },
      { "name": "commonField2", "type": "string" },
      { "name": "subtype", "type": [
        {
          "type": "record",
          "name": "RecordTypeA",
          "fields": [
            { "name": "integerSpecificToA1", "type": ["null", "long"] },
            { "name": "stringSpecificToA1", "type": ["null", "string"] }
          ]
        },
        {
          "type": "record",
          "name": "RecordTypeB",
          "fields": [
            { "name": "booleanSpecificToB1", "type": ["null", "boolean"] },
            { "name": "stringSpecificToB1", "type": ["null", "string"] }
          ]
        }
      ]}
    ]
  }
}
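To make the shape concrete, a single record under that schema, written in Avro's JSON encoding (where a union value is wrapped in an object keyed by the branch's full name, and nullable fields are wrapped by their type name), might look roughly like this sketch:

```json
{
  "commonField1": "commonField1Value",
  "commonField2": "commonField2Value",
  "subtype": {
    "com.foobar.RecordTypeA": {
      "integerSpecificToA1": { "long": 1 },
      "stringSpecificToA1": { "string": "specificToA1Value" }
    }
  }
}
```

The union on the subtype field plays the role of the discriminator: whichever branch is present identifies the record type, so no separate discriminator field is needed in the Avro model.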

Can I define nested array objects in Swagger 2.0

We are using Swagger 2.0 for our documentation. We are programmatically creating the Swagger 2.0 spec straight out of our data design documents.
Our model is very complex and nested. I would like to understand whether we can define nested array objects inline.
For example:
{
  "definitions": {
    "user": {
      "type": "object",
      "required": ["name"],
      "properties": {
        "name": {
          "type": "string"
        },
        "address": {
          "type": "array",
          "items": {
            "type": "object",
            "properties": {
              "type": {
                "type": "string",
                "enum": ["home", "office"]
              },
              "line1": {
                "type": "string"
              }
            },
            "Person": {
              "type": "object",
              "properties": {
                "name": {
                  "type": "string"
                }
              }
            }
          }
        }
      }
    }
  }
}
We have many cases like this in our model, and defining a $ref is not an option we want to consider at this time. We need this to be handled inline.
As per the following post, https://github.com/swagger-api/swagger-editor/issues/603#event-391465196, it looks like nested array objects defined inline are not supported.
Since a lot of big enterprises have very complex data models, we would like this feature to be supported in the Swagger 2.0 spec.
Is there any plan to add this feature?
Your document is simply invalid, and this is not about nested arrays: the property Person is not allowed inside items in a Swagger 2.0 schema.
The only allowed properties in a schema are: $ref, format, title, description, default, multipleOf, maximum, exclusiveMaximum, minimum, exclusiveMinimum, maxLength, minLength, pattern, maxItems, minItems, uniqueItems, maxProperties, minProperties, required, enum, additionalProperties, type, items, allOf, properties, discriminator, readOnly, xml, externalDocs, example.
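Nested inline arrays as such are fine in Swagger 2.0. A valid version of your definition, with Person moved under properties of the array's item schema, might look like this:

```json
{
  "definitions": {
    "user": {
      "type": "object",
      "required": ["name"],
      "properties": {
        "name": { "type": "string" },
        "address": {
          "type": "array",
          "items": {
            "type": "object",
            "properties": {
              "type": { "type": "string", "enum": ["home", "office"] },
              "line1": { "type": "string" },
              "Person": {
                "type": "object",
                "properties": {
                  "name": { "type": "string" }
                }
              }
            }
          }
        }
      }
    }
  }
}
```

Every object in the array can now carry an inline nested Person object, without any $ref.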