Swagger: Dereferencing $ref from body parameter

Taking the petstore example, I am trying to dereference the $ref in the /pet -> put operation, which is currently:
schema:
  $ref: '#/definitions/Pet'
I am trying to resolve this but am unable to get this text out of the JSON file.
This is what I have:
BodyParameter bp = (BodyParameter) param;
System.out.println(((RefModel) bp.getSchema()).get$ref());
I thought this would give me the above text out which I could later map with a definition Map and resolve it but got the following error:
Exception in thread "main" java.lang.ClassCastException: io.swagger.models.ModelImpl cannot be cast to io.swagger.models.RefModel
Would anyone know of a way to extract this string from a body parameter, and in general, given that getSchema() returns a Model?
I cannot find proper documentation for the swagger-parser and swagger-inflector projects, so I am hunting around in the source code itself.

You would do the following:
Model model = bp.getSchema();
if (model instanceof RefModel) {
    RefModel ref = (RefModel) model;
    // e.g. "Pet" for "#/definitions/Pet"
    String simpleRef = ref.getSimpleRef();
    // Look the definition up in the spec's definitions map
    Model concreteModel = swagger.getDefinitions().get(simpleRef);
}
You should confirm that concreteModel is a ModelImpl, but in the petstore case it will be.
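Putting it together, a minimal sketch assuming swagger-parser 1.x and the public petstore spec URL (everything else follows the snippet above):
import io.swagger.models.Model;
import io.swagger.models.RefModel;
import io.swagger.models.Swagger;
import io.swagger.models.parameters.BodyParameter;
import io.swagger.models.parameters.Parameter;
import io.swagger.parser.SwaggerParser;

// Parse the spec (a local file path works too) and resolve the body
// parameter of PUT /pet against the definitions map.
Swagger swagger = new SwaggerParser().read("http://petstore.swagger.io/v2/swagger.json");
for (Parameter param : swagger.getPath("/pet").getPut().getParameters()) {
    if (param instanceof BodyParameter) {
        Model model = ((BodyParameter) param).getSchema();
        if (model instanceof RefModel) {
            // getSimpleRef() turns "#/definitions/Pet" into "Pet"
            String simpleRef = ((RefModel) model).getSimpleRef();
            System.out.println(simpleRef + " -> " + swagger.getDefinitions().get(simpleRef));
        }
    }
}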


Failure serializing optional date type field to Avro regardless of null or non-null value

We are using Avro 1.8.2 to serialize data with an optional date-type field, to be published to a Kafka topic.
record aRecord {
    /** Variable: lastUpdate
     * lastUpdate indicates the latest date and time the reference asset was updated
     */
    union {null, date} lastUpdate = null;
    /** Variable: businessDate
     * businessDate indicates the business date of the reference asset price
     */
    union {null, date} businessDate = null;
}
Ran into the following exception while using the Java class generated by the Avro tool to serialize the data:
Error serializing avro message
Caused by: org.apache.avro.AvroRuntimeException: Unknown datum type org.joda.time.LocalDate: 2021-09-17
at org.apache.avro.generic.GenericData.getSchemaName(GenericData.java:772)
at org.apache.avro.specific.SpecificData.getSchemaName(SpecificData.java:302)
at org.apache.avro.generic.GenericData.resolveUnion
Please note that this happens regardless of whether the value is null or non-null (as shown, the value 2021-09-17 also caused the exception).
We did the following investigation and experiments but could not figure out why:
Making the date field mandatory resolves the issue. This is because DATE_CONVERSION is added to the corresponding field in the Java class generated by the Avro tool. If the field is defined as optional with a default of null, DATE_CONVERSION is not added to the generated Java file.
Using Avro 1.9.1 resolves the issue; unfortunately we must use Avro 1.8.2.
We also tried a few other versions of kafka-avro-serializer and the Spring Boot Kafka framework. Nothing worked for us.
Other projects that depend on Avro 1.8.2 seem to be able to handle this. We checked all the places we considered relevant and all the code is the same, except that somehow they have DATE_CONVERSION in place in the Java file generated by the Avro tool (although the fields are defined in the .avdl file in exactly the same way).
Debugging into GenericData.java, we found that if DATE_CONVERSION is in place for the optional date field, getSchemaName is not called at all. getSchemaName basically checks the type of the object: whether it is an int, a record, a string, etc. The date is a Joda logical type; its underlying type is int, as far as we understand.
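For reference, once compiled, the optional field from the IDL above corresponds to the following schema fragment; the date logical type merely annotates the int branch of the union:
{"name": "lastUpdate", "type": ["null", {"type": "int", "logicalType": "date"}], "default": null}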
So our questions are:
How do we make the Avro tool enable DATE_CONVERSION for an optional date field using Avro 1.8.2?
If DATE_CONVERSION is not the key to resolving the issue, what is the best practice for serializing a date field with Avro 1.8.2 when the field can be null (the default) or non-null?
Thanks.
// Register the Joda date conversion on the SpecificData used by the writer;
// this is what the missing DATE_CONVERSION field would normally provide.
SpecificData specificData = SpecificData.get();
specificData.addLogicalTypeConversion(new DateConversion());
DatumWriter<MessageClass> dw = new SpecificDatumWriter<>(message.getSchema(), specificData);
DataFileWriter<MessageClass> dfw = new DataFileWriter<>(dw);
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
dfw.create(message.getSchema(), outputStream);
dfw.append(message);
dfw.close();
ProducerRecord<String, byte[]> record = new ProducerRecord<>(topic, key, outputStream.toByteArray());
return kafkaProducer.send(record, new Callback());
The above code fixed the issue. MessageClass is the Java class generated by the Avro tool, and the writer is constructed with a SpecificData on which new DateConversion() has been registered. That DATE_CONVERSION is exactly what is needed for the optional date field during serialization.
Note that this solution is only needed as a workaround for Avro 1.8.
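Not part of the original fix, but for completeness, a sketch of the matching read path under the same assumptions (MessageClass generated by the Avro tool, bytes holding the serialized payload, classes from org.apache.avro.*). The conversion has to be registered on the reader's SpecificData as well, otherwise the generated LocalDate field cannot be populated from the underlying int:
SpecificData readerData = new SpecificData();
readerData.addLogicalTypeConversion(new DateConversion());
DatumReader<MessageClass> reader = new SpecificDatumReader<>(
        MessageClass.getClassSchema(), MessageClass.getClassSchema(), readerData);
try (DataFileStream<MessageClass> stream =
        new DataFileStream<>(new ByteArrayInputStream(bytes), reader)) {
    while (stream.hasNext()) {
        MessageClass m = stream.next();
        // m.getLastUpdate() is an org.joda.time.LocalDate or null
    }
}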

Unable to read messages from messages file in unit test

I want to read error messages from the messages file but I am unable to. What mistake am I making?
The code where I want to read the string from the messages file is:
Future { Ok(Json.toJson(JsonResultError(messagesApi("error.incorrectBodyType")(langs.availables(0))))) }
The messages file contains: error.incorrectBodyType=Incorrect body type. Body type must be JSON
messagesApi("error.incorrectBodyType") should return Incorrect body type. Body type must be JSON, but it returns error.incorrectBodyType instead.
If I remove the double quotes, as in messagesApi(error.incorrectBodyType), the code doesn't compile.
Update
I added a couple of debug prints and noticed that the keys I am using in MessagesApi are not defined. I don't know why, though, as I have created them in the messages file.
println("langs array"+langs.availables)
println("app.title"+messagesApi.isDefinedAt("app.title")(langs.availables(0)))
println("error"+messagesApi.isDefinedAt("error.incorrectBodyType")(langs.availables(0)))
prints
langs arrayList(Lang(en_GB))
app.titlefalse
errorfalse
Update 2
I might have found the issue but I don't know how to resolve it. Basically, I am running my test case without an instance of the Application. I am mocking messagesApi by calling stubMessagesApi() defined in Helpers.stubControllerComponents. If I run the same code using an Application, e.g. class UserControllerFunctionalSpec extends PlaySpec with OneAppPerSuiteWithComponents, then app.title and error are defined. It seems that without an instance of Application, MessagesApi does not use the messages file.
I was able to solve the issue by creating a new instance of MessagesApi using DefaultMessagesApi:
val messagesApi = new DefaultMessagesApi(
  // Map of maps: the outer key is the language, the inner map goes from
  // message key to message text.
  Map("en" ->
    Map("error.incorrectBodyType" -> "Incorrect body type. Body type must be JSON")
  )
)
val controller = new UserController(mockUserRepository, mockControllerComponents, mockSilhouette, messagesApi, stubLangs())

Swagger: Add description with ref

I want to add a description to an object property whose definition is referenced. Something like this:
newCreditCard:
  type: object
  properties:
    billingPhone:
      description: Phone number of the card holder
      $ref: "#/definitions/PhoneNumber"
But the editor warns that the description property will be skipped:
Extra JSON Reference properties will be ignored: description
I have found a less elegant workaround that works for the editor, but not for the Swagger UI (not sure whether that may be due to the recent update to version 3.0.2 of the Swagger UI):
newCreditCard:
  type: object
  properties:
    billingPhone:
      description: Phone number of the card holder
      allOf:
        - $ref: "#/definitions/PhoneNumber"
How do you do this in your Swagger specifications?
Thanks for the help!
If you add anything at the same level as $ref, it will be ignored.
See the JSON Reference draft: https://datatracker.ietf.org/doc/html/draft-pbryan-zyp-json-ref-03#section-3
The correct way is to provide the description in the referenced object.
You could simply move the description property to the definition of PhoneNumber. Your original post does not show how you have defined PhoneNumber, but this snippet validates without warnings:
definitions:
  phoneNumber:
    type: string
    description: Phone number of the card holder
  newCreditCard:
    type: object
    properties:
      billingPhone:
        $ref: "#/definitions/phoneNumber"
If this answer is not what you are looking for, please restate the question. We need to know what you are trying to accomplish.
Although it is not according to the JSON standards: if you are using Swashbuckle to generate your Swagger, you can take advantage of the Extensions property of the schema and produce a Swagger JSON with $ref plus the extended properties:
var refSchema = new OpenApiSchema
{
    // Reference = new OpenApiReference { ExternalResource = referenceLink, Type = ReferenceType.Link },
    // -- this won't work and omits all your other properties
    Extensions = new Dictionary<string, IOpenApiExtension>
    {
        // Add $ref as an extension, because according to the JSON standards
        // $ref shouldn't have any sibling properties.
        { "$ref", new OpenApiString(referenceLink) }
    },
    Description = prop.Value.Description,
    ReadOnly = prop.Value.ReadOnly,
    Required = prop.Value.Required,
    Type = prop.Value.Type,
    Example = prop.Value.Example,
};
For anyone using Swashbuckle with ASP.NET, you can use the following code to have the $ref construct put under allOf (just like the workaround in the question):
// do this wherever you are calling AddSwaggerGen()
ArgBuilder.Services.AddSwaggerGen(opts => {
    opts.UseAllOfToExtendReferenceSchemas(); // add this line
});
Now if you have a model with two properties of the same type, the individual descriptions for each field will show up in Swagger UI (e.g. if both FooHeader and BarHeader are properties of type HttpHeader, both descriptions show up).
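With that option enabled, the generated document wraps the reference so sibling keywords survive; the emitted schema looks roughly like this (the description text here is just illustrative):
"FooHeader": {
  "description": "description taken from the FooHeader property",
  "allOf": [
    { "$ref": "#/components/schemas/HttpHeader" }
  ]
}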

Dataflow output parameterized type to avro file

I have a pipeline that successfully outputs an Avro file as follows:
@DefaultCoder(AvroCoder.class)
class MyOutput_T_S {
    T foo;
    S bar;
    Boolean baz;
    public MyOutput_T_S() {}
}
@DefaultCoder(AvroCoder.class)
class T {
    String id;
    public T() {}
}
@DefaultCoder(AvroCoder.class)
class S {
    String id;
    public S() {}
}
...
PCollection<MyOutput_T_S> output = input.apply(myTransform);
output.apply(AvroIO.Write.to("/out").withSchema(MyOutput_T_S.class));
How can I reproduce this exact behavior, except with a parameterized output MyOutput<T, S> (where T and S are both Avro codeable using reflection)?
The main issue is that Avro reflection doesn't work for parameterized types. So based on these responses:
Setting Custom Coders & Handling Parameterized types
Using Avrocoder for Custom Types with Generics
1) I think I need to write a custom CoderFactory, but I am having difficulty figuring out exactly how this works (I'm having trouble finding examples). Oddly enough, a completely naive coder factory appears to let me run the pipeline and inspect proper output using DataflowAssert:
cr.registerCoder(MyOutput.class, new CoderFactory() {
    @Override
    public Coder<?> create(List<? extends Coder<?>> componentCoders) {
        Schema schema = new Schema.Parser().parse("{\"type\":\"record\","
            + "\"name\":\"MyOutput\","
            + "\"namespace\":\"mypackage\","
            + "\"fields\":[]}");
        return AvroCoder.of(MyOutput.class, schema);
    }
    @Override
    public List<Object> getInstanceComponents(Object value) {
        MyOutput<Object, Object> myOutput = (MyOutput<Object, Object>) value;
        List components = new ArrayList();
        return components;
    }
});
While I can successfully assert against the output now, I expect this will not cut it for writing to a file. I haven't figured out how I'm supposed to use the provided componentCoders to generate the correct schema, and if I try to just shove the schema of T or S into fields I get:
java.lang.IllegalArgumentException: Unable to get field id from class null
2) Assuming I figure out how to encode MyOutput: what do I pass to AvroIO.Write.withSchema? If I pass either MyOutput.class or the schema, I get type mismatch errors.
I think there are two questions (correct me if I am wrong):
How do I enable the coder registry to provide coders for various parameterizations of MyOutput<T, S>?
How do I write values of MyOutput<T, S> to a file using AvroIO.Write?
The first question is to be solved by registering a CoderFactory as in the linked question you found.
Your naive coder is probably allowing you to run the pipeline without issues because serialization is being optimized away. Certainly an Avro schema with no fields will result in those fields being dropped in a serialization+deserialization round trip.
But assuming you fill in the schema with the fields, your approach to CoderFactory#create looks right. I don't know the exact cause of the message java.lang.IllegalArgumentException: Unable to get field id from class null, but the call to AvroCoder.of(MyOutput.class, schema) should work for an appropriately assembled schema. If there is an issue with this, more details (such as the rest of the stack trace) would be helpful.
However, your override of CoderFactory#getInstanceComponents should return a list of values, one per type parameter of MyOutput. Like so:
@Override
public List<Object> getInstanceComponents(Object value) {
    MyOutput<Object, Object> myOutput = (MyOutput<Object, Object>) value;
    return ImmutableList.of(myOutput.foo, myOutput.bar);
}
The second question can be answered using some of the same support code as the first, but otherwise is independent. AvroIO.Write.withSchema always explicitly uses the provided schema. It does use AvroCoder under the hood, but this is actually an implementation detail. Providing a compatible schema is all that is necessary - such a schema will have to be composed for each value of T and S for which you want to output MyOutput<T, S>.
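For illustration, a hedged sketch of assembling such a schema with Avro's SchemaBuilder (org.apache.avro.Schema, SchemaBuilder, and ReflectData), for hypothetical concrete parameters Foo and Bar; the field names follow MyOutput_T_S above, and ReflectData is used since T and S are stated to be Avro codeable via reflection:
// Foo and Bar are placeholders for the concrete type parameters.
Schema fooSchema = ReflectData.get().getSchema(Foo.class);
Schema barSchema = ReflectData.get().getSchema(Bar.class);
Schema outputSchema = SchemaBuilder.record("MyOutput").namespace("mypackage")
    .fields()
    .name("foo").type(fooSchema).noDefault()
    .name("bar").type(barSchema).noDefault()
    .name("baz").type().booleanType().noDefault()
    .endRecord();
// The same schema can then back both the coder from the first question...
Coder<MyOutput> coder = AvroCoder.of(MyOutput.class, outputSchema);
// ...and the schema passed to AvroIO.Write.withSchema.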

JsonProvider "This is not a constant expression or valid custom attribute value"

Given the code:
#if INTERACTIVE
#r "bin\Debug\FSharp.Data.dll"
#endif
open System
open FSharp.Data
open FSharp.Data.Json
let testJson = """{ "workingDir":"hello", "exportDir":"hi there", "items":[{ "source":"", "dest":"", "args": {"name":"that"} }] }"""
// here is where I get the error
let Schema = JsonProvider<testJson>
The last line keeps giving me the error "This is not a constant expression or valid custom attribute value". What does that mean? How can I get it to read this JSON?
The string has to be marked as a constant. To do that, use the [<Literal>] attribute. Also, the type provider creates a type, not a value, so you need to use type instead of let:
open FSharp.Data
[<Literal>]
let testJson = """{ "workingDir":"hello", "exportDir":"hi there", "items":[{ "source":"", "dest":"", "args": {"name":"that"} }] }"""
type Schema = JsonProvider<testJson>
The JsonProvider can be viewed as a parameterized JSON parser (plus the data type that the parser produces) that is specialized at compile time.
The parameter you give it (a string literal or a path to a JSON file) defines the structure of the JSON data: a schema, if you wish. This allows the provider to create a type that statically has all the properties your JSON data should have; the set of those properties (along with their respective types) is inferred from the JSON sample you give to the provider.
So the correct way to use the JsonProvider is shown in one of the examples from the documentation:
// generate the type with a static Parse method with help of the type provider
type Simple = JsonProvider<""" { "name":"John", "age":94 } """>
// call the Parse method to parse a string and create an instance of your data
let simple = Simple.Parse(""" { "name":"Tomas", "age":4 } """)
simple.Age
simple.Name
The example above is taken from the FSharp.Data documentation.
