Using aws-api-gateway models, how can I require that an object contains AT LEAST 1 valid key? - swagger

I am using API gateway's request validation. Here is my model so far:
{
"type" : "object",
"required" : [ "dc", "uid", "data" ],
"properties" : {
"dc" : {
"type" : "string"
},
"uid" : {
"type" : "string"
},
"data" : {
"type" : "object"
}
},
"title" : "MyApi"
}
So, similar to the required keyword, I want to ensure that the data object contains at least one key from a list I define ([a, b, c, whatever]).
If this is not possible, is there a way to at least prevent a null value from being sent? I tried "nullable": false but AWS said that was an invalid model schema.

Setting minProperties to 1 may help, but it only guarantees that some key is present, not one from your list; combining it with anyOf over required clauses covers both, as sketched below. See https://swagger.io/docs/specification/data-models/data-types/
As for the null value: "nullable" : false is an OpenAPI 3.0 keyword, not part of the JSON Schema draft that API Gateway validates against, which is why the model was rejected. Declaring "type" : "object" already rejects a null data value.
Can (should) data be strongly typed? If so, you could use composition, inheritance, and polymorphism to indicate that data conforms to one of a set of defined schemas. See https://swagger.io/docs/specification/data-models/inheritance-and-polymorphism/
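A minimal sketch of the data property (the key names a, b, c stand in for whatever list you define; API Gateway models are JSON Schema draft-04, where anyOf and required are supported):
"data" : {
  "type" : "object",
  "minProperties" : 1,
  "anyOf" : [
    { "required" : [ "a" ] },
    { "required" : [ "b" ] },
    { "required" : [ "c" ] }
  ]
}
An empty object then fails minProperties, and a non-empty object containing none of the listed keys fails every required alternative.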

Related

Multiple labels in Prometheus query

I am trying to get an array of labels and values in a single JSON response from Prometheus.
I have a metric
http_server_requests_seconds_count{method="POST", service="application", status="200", uri="/v1/rest/clients/ids"}
Using the query:
count(sum(rate(http_server_requests_seconds_count[5m])) by (method,uri)) by (uri)
I get:
"result" : [
{
"metric" : {
"uri" : "/v1/rest/clients/ids"
},
"value" : [
1.662458065998E9,
"1"
]
},
However, I would like to get more labels in the metric field, such as service, status, uri.
For example:
"metric" : {
"uri" : "/v1/rest/clients/ids",
"service" : "application",
"status" : "200",
},
Alternatively, value aggregation over each unique label combination would also work.
I have adjusted my query as follows, listing every label I want to keep in the by clause (aggregation operators drop any label not named there):
round(sum(delta(http_server_requests_seconds_count[5m])) by (service, status, uri) >0 )
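If you still need the original count-style query, the same rule applies at each step: a label can only appear in the output if every aggregation stage lists it in its by clause (a sketch, assuming the label names from the example metric):
count(sum(rate(http_server_requests_seconds_count[5m])) by (method, service, status, uri)) by (service, status, uri)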

What should an iOS JSON payload submitting to an API look like?

We have an iOS application that grabs model data from an API. The user of the iOS device performs actions that link the model data together, creating relationships between objects. The app then submits back to the API to save the relationships and make them viewable in the web application.
The JSON payload can potentially be large if all the objects associated with the new relationship are submitted with their attributes:
"peoples_vehicles": [
{
"owner" : {
"id" : 8282, <----------this would be a UUID
"name" : "John Smith",
"address": "123 Fake Street",
},
"vehicles" : [
{
"id" : 1234, <----------this would be a UUID
"name" : "FORD F150",
"make" : "F150",
"model" : "FORD",
"year" : "2017",
},
{
"id" : 5678, <----------this would be a UUID
"name" : "FORD ESCAPE",
"make" : "ESCAPE",
"model" : "FORD",
"year" : "2013",
}
]
},
{
... another person with their vehicles
}
]
Since the data being sent back is all data that originally came from the API, should the iOS application even bother sending all the attributes back? Should it simply send back all the relationships with only the objects' IDs? (We use UUIDs.)
"peoples_vehicles": [
{
"owner" : {
"id" : 8282, <----------this would be a UUID
},
"vehicles" : [
{
"id" : 1234, <----------this would be a UUID
},
{
"id" : 5678, <----------this would be a UUID
}
]
},
{
... another person with their vehicles
}
]
We seem to be leaning more towards just sending IDs for pre-existing data. It keeps the submit payload from iOS to the API down to very few attributes that are actually strings (none in this example).
There are cases where we would have to create new custom objects. In that case, all of the attributes would be sent over, and the ID of the new object would be null to indicate that it has to be created:
"peoples_vehicles": [
{
"owner" : {
"id" : 8282, <----this would be a pre-existing object
},
"vehicles" : [
{
"id" : 1234, <----this would be a pre-existing object
},
{
"id": null <----new object that needs to be saved
"name" : "FORD F150",
"make" : "F150",
"model" : "FORD",
"year" : "2017",
}
]
},
{
... another person with their vehicles
}
]
Would this be a decent approach? Looking at the Stripe and Shopify APIs as examples, this seems to work nicely, but I wanted to make sure I wasn't missing anything and whether I should be including the attributes for pre-existing objects.

Save Avro Schema to Confluent Schema-Registry

I am trying to write a very simple schema to the registry using Postman and have been having a very hard time getting it to register. Is it really this complicated just to register a simple schema, which is only the first step in this whole process, or am I missing something here? The schema I am using is below:
{
"schema":{
"type" : "record",
"name" : "User",
"namespace" : "com.temp.avro.model",
"fields" : [ {
"name" : "_id",
"type" : "string"
}, {
"name" : "updatedDate",
"type":"long",
"logicalType":"timestamp-millis"
}, {
"name" : "createdDate",
"type":"long",
"logicalType":"timestamp-millis"
}, {
"name" : "applicationId",
"type": ["null", "string"],
"default": null
},{
"name" : "country",
"type" : "string"
}, {
"name" : "bank",
"type" : "string"
}]
}
}
I am getting the following error:
Internal Server Error
com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of java.lang.String out of START_OBJECT token
 at [Source: org.glassfish.jersey.message.internal.ReaderInterceptorExecutor$UnCloseableInputStream@774da834; line: 2, column: 1] (through reference chain: io.confluent.kafka.schemaregistry.client.rest.entities.requests.RegisterSchemaRequest["schema"])
	at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
	at com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:835)
	at com.fasterxml.jackson.databind.deser.std.StringDeserializer.deserialize(StringDeserializer.java:59)
	at com.fasterxml.jackson.databind.deser.std.StringDeserializer.deserialize(StringDeserializer.java:12)
	at com.fasterxml.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:523)
	at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:95)
	at com.fasterxml.jackson.databind.deser.impl.BeanPropertyMap.findDeserializeAndSet(BeanPropertyMap.java:285)
	at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:248)
	at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:136)
	at com.fasterxml.jackson.databind.ObjectReader._bind(ObjectReader.java:1410)
	at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:860)
	at com.fasterxml.jackson.jaxrs.base.ProviderBase.readFrom(ProviderBase.java:810)
	at io.confluent.rest.validation.JacksonMessageBodyProvider.readFrom(JacksonMessageBodyProvider.java:65)
	... (remaining Jersey/Jetty request-handling frames)
The value of the schema field must be a string, not a nested JSON object; that is exactly what the Can not deserialize instance of java.lang.String out of START_OBJECT token error is complaining about. JSON.stringify your Avro schema, so that the JSON payload you send to the Schema Registry is the following:
{
"schema": "{\"type\":\"record\",\"name\":\"User\",\"namespace\":\"com.temp.avro.model\",\"fields\":[{\"name\":\"_id\",\"type\":\"string\"},{\"name\":\"updatedDate\",\"type\":\"long\",\"logicalType\":\"timestamp-millis\"},{\"name\":\"createdDate\",\"type\":\"long\",\"logicalType\":\"timestamp-millis\"},{\"name\":\"applicationId\",\"type\":[\"null\",\"string\"],\"default\":null},{\"name\":\"country\",\"type\":\"string\"},{\"name\":\"bank\",\"type\":\"string\"}]}"
}
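For reference, a request sketch (the localhost:8081 address and the User-value subject name are assumptions to adjust for your setup, and register-user.json stands in for a file holding the payload above):
curl -X POST \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data @register-user.json \
  http://localhost:8081/subjects/User-value/versions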

Neo4j cypher with parameters returns success but doesn't create anything?

If I send this:
{
"query" : "MATCH (p) WHERE p.id={id} CREATE (c {props}) CREATE UNIQUE p-[r:CHILD]->c",
"params" : {
"id" : ["{0000-0000-0000-0000}","{0000-0000-0000-0000}","{0000-0000-0000-0004}"],
"props" : [ {
"id" : "{0000-0000-0000-0004}",
"type": 48,
"title" : "TestNode"
},{
"id" : "{0000-0000-0000-0005}",
"type": 49,
"title" : "TestNode"
},{
"id" : "{0000-0000-0000-0006}",
"type": 49,
"title" : "TestNode"
}]
}
}
Via the RESTful Cypher API, I get back "success", but nothing is created. If I instead send this:
{
"query" : "MATCH (p) WHERE p.id={id} CREATE (c {props}) CREATE UNIQUE p-[r:CHILD]->c",
"params" : {
"id" : "{0000-0000-0000-0000}",
"props" : [ {
"id" : "{0000-0000-0000-0001}",
"type": 48,
"title" : "TestNode"
},{
"id" : "{0000-0000-0000-0002}",
"type": 49,
"title" : "TestNode"
} ]
}
}
It creates two children of 0000-0000-0000-0000 as expected. So something about the way I'm specifying the two parallel arrays isn't working.
I was hoping to be able to create large tree structures by essentially passing parent-ID/child-to-create parameter pairs. The other option is to use the latter style of Cypher with the transactional endpoint... but I'm just not sure what I'm doing wrong in the first request. Any advice is much appreciated.
You probably need to do:
MATCH (p) WHERE p.id IN {id}
CREATE (c {props})
CREATE UNIQUE (p)-[r:CHILD]->(c)
The = operator is an exact comparison: a single property value compared against a whole collection never matches, so the MATCH returns zero rows, the CREATE clauses never run, and you get "success" with nothing created. IN tests membership in the collection instead.
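Applied to the first request, the fixed payload would look like this (a sketch: the params are copied from the question, and it is worth verifying against your Neo4j version that CREATE (c {props}) with a collection of maps expands into one node per map):
{
  "query" : "MATCH (p) WHERE p.id IN {id} CREATE (c {props}) CREATE UNIQUE (p)-[r:CHILD]->(c)",
  "params" : {
    "id" : ["{0000-0000-0000-0000}","{0000-0000-0000-0000}","{0000-0000-0000-0004}"],
    "props" : [ {
      "id" : "{0000-0000-0000-0004}",
      "type" : 48,
      "title" : "TestNode"
    }, {
      "id" : "{0000-0000-0000-0005}",
      "type" : 49,
      "title" : "TestNode"
    }, {
      "id" : "{0000-0000-0000-0006}",
      "type" : 49,
      "title" : "TestNode"
    } ]
  }
}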

Ruby on Rails polymorphic model: dynamic fields

Suppose I have a base model class Item:
class Item
include Mongoid::Document
field :category
end
Each category determines which fields an item contains. For example, items in "category1" contain an additional string field named text, while items in "category2" contain the fields weight and color. All the fields are of basic types: strings, integers, and so on.
These additional values are stored in MongoDB as fields of the item document:
> db.items.find()
{ "_id" : ObjectId("4d891f5178146536877e1e99"), "category" : "category1", "text" : "blah-blah" }
{ "_id" : ObjectId("4d891f7878146536877e1e9a"), "category" : "category2", "weight" : 20, "color" : "red" }
Categories are stored in MongoDB too; each category's field configuration is defined at runtime by an administrator.
> db.categories.find()
{ "_id" : "category1", "fields" : [ { "name" : "text", "type" : "String" } ] }
{ "_id" : "category2", "fields" : [
{
"name" : "weight",
"type" : "Integer"
},
{
"name" : "color",
"type" : "String"
}
] }
Users need to edit Items through HTML forms, entering values for all the additional fields defined for the particular item's category.
The question is: what approaches could I take to implement this polymorphism in Rails?
Please ask for any required details in the comments.
Just subclass Item, and Mongoid will take care of the rest, e.g. storing the document's type.
class TextItem < Item; end
Rails will like it, but you'll probably want to use the #becomes method, as it makes the form builder happy and path generation easier:
https://github.com/romanbsd/mongoid/commit/9c2fbf4b7b1bc0f602da4b059323565ab1df3cd6
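A minimal sketch of that approach for the two example categories (the subclass names are my own invention; the field declarations mirror the type strings stored in the categories collection):
class TextItem < Item
  field :text, type: String
end

class MeasuredItem < Item
  field :weight, type: Integer
  field :color, type: String
end

# Categories are configured at runtime, so the same declarations can be
# made dynamically from a category document instead of being hardcoded:
#   category["fields"].each do |f|
#     klass.field f["name"], type: Object.const_get(f["type"])
#   end

# In a controller, #becomes converts a generic Item into its subclass,
# which keeps the form builder and URL helpers pointed at the right class:
item = Item.find(params[:id]).becomes(TextItem)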
