Suppose we have some YAML files like the following:
Article.yml

GenericArticleDto:
  type: object
  properties:
    description:
      type: string
[...]
Order.yml

EnglishArticleDto:
  type: object
  allOf:
    - $ref: '../Article.yml#/components/schemas/GenericArticleDto'
[...]
I generate the Java classes from Order.yml with swagger gen and specify the destination package, let's say com.test.order. After this operation I have two classes in com.test.order: EnglishArticleDto.java and GenericArticleDto.java.
If I have another YAML file with a DTO that points to the same GenericArticleDto object:
Proposal.yml

CustomerArticleDto:
  type: object
  allOf:
    - $ref: '../Article.yml#/components/schemas/GenericArticleDto'
I generate the Java classes from Proposal.yml and specify the destination package, let's say com.test.proposal. If I run the swagger script I get two classes in com.test.proposal: CustomerArticleDto.java and GenericArticleDto.java.
My question is:
Is there a way with swagger gen to avoid generating another Java class for GenericArticleDto and to REUSE the one already present in the package com.test.order?
Swagger gen version: 2.2
Thanks a lot!
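One feature worth trying, as a sketch only and not verified against swagger-codegen 2.2, is the import-mappings option, which tells the generator to treat a named schema as an already existing class and import it instead of generating a duplicate (the jar name and package options below are assumptions):

```shell
# Hypothetical invocation: map GenericArticleDto to the class already
# generated under com.test.order so the generator imports it rather
# than emitting a second GenericArticleDto.java.
java -jar swagger-codegen-cli.jar generate \
  -i Proposal.yml \
  -l java \
  --model-package com.test.proposal \
  --import-mappings GenericArticleDto=com.test.order.GenericArticleDto
```

Whether the mapping is honored can depend on the generator version, so check the generated output.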
I'm trying to create a code generator that takes a JSON file as input and generates multiple classes in multiple files.
My question is: is it possible to create multiple files for one input using build from Dart?
Yes, it is possible. There are currently many tools available on pub.dev that do code generation. For creating a simple custom code generator, check out the code_builder package provided by the core Dart team.
You can use dart_style as well to format the output of the code_builder results.
Here is a simple example of the package in use (from the package's example):
import 'package:code_builder/code_builder.dart';
import 'package:dart_style/dart_style.dart';

final _dartfmt = DartFormatter();

// The string of the generated code for the Animal class
String animalClass() {
  final animal = Class((b) => b
    ..name = 'Animal'
    ..extend = refer('Organism')
    ..methods.add(Method.returnsVoid((b) => b
      ..name = 'eat'
      ..body = refer('print').call([literalString('Yum!')]).code)));
  return _dartfmt.format('${animal.accept(DartEmitter())}');
}
In this example you can use the dart:io API to create a File and write the output of animalClass() (from the example) to the file:
final animalDart = File('animal.dart');

// create the new file on disk
animalDart.createSync();

// write the contents of the class to the file
animalDart.writeAsStringSync(animalClass());
You can use the File API to read a .json file from its path, then use jsonDecode on the contents of the file to access the contents of the JSON config.
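A minimal sketch of that reading step (the file name and key are hypothetical):

```dart
import 'dart:convert';
import 'dart:io';

void main() {
  // Read the JSON config from disk (hypothetical file name).
  final raw = File('config.json').readAsStringSync();
  // Decode the contents into a Map and look up a (hypothetical) key.
  final config = jsonDecode(raw) as Map<String, dynamic>;
  print(config['className']);
}
```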
I'm going to put a CSV file into a bucket using InfluxDB v2.1.
Attempting to insert a simple example file results in the following error:
error in csv.from(): failed to read metadata: failed to read annotations: expected annotation datatype
The CSV file that I was going to write is as follows:
#datatype measurement,tag,double,dateTime:RFC3339
m,host,used_percent,time
mem,host1,64.23,2020-01-01T00:00:00Z
mem,host2,72.01,2020-01-01T00:00:00Z
mem,host1,62.61,2020-01-01T00:00:10Z
mem,host2,72.98,2020-01-01T00:00:10Z
mem,host1,63.40,2020-01-01T00:00:20Z
mem,host2,73.77,2020-01-01T00:00:20Z
This is the example data in the official InfluxData documentation.
If you look at the first line of the example, you can see that the datatype is annotated, so why does the error occur?
How should I modify it?
This looks like invalid annotated CSV.
In the csv.from function documentation, you can find examples (as string literals) of both the annotated and raw CSV that csv.from supports.
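For what it's worth, the annotation line in the question matches the write-oriented annotation format rather than the one csv.from (Flux) expects, so writing the file with the influx CLI may be the more direct route. A hedged sketch, with a hypothetical bucket name:

```shell
# The CSV keeps its existing annotation line:
#   #datatype measurement,tag,double,dateTime:RFC3339
# "example-bucket" and the file name are assumptions.
influx write --bucket example-bucket --file example.csv
```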
I'm trying to convert a .mif file to GeoJSON. I've got .mif, .mid and .dbf. But when I convert it, the properties included in the .dbf are not rendered in the GeoJSON file. The command I use is: ogr2ogr -f GeoJSON file.json file.mif.
Is there something wrong with my command or do I need to add an option?
More info:
With ogrinfo for the .dbf file:
ogrinfo car_m.dbf -so car_m
INFO: Open of `car_m.dbf'
using driver `ESRI Shapefile' successful.
Layer name: car_m
Metadata:
DBF_DATE_LAST_UPDATE=1913-10-18
Geometry: None
Feature Count: 2278213
Layer SRS WKT:
(unknown)
id: String (21.0)
idINSPIRE: String (30.0)
idk: String (25.0)
ind_c: Real (16.4)
nbcar: Real (16.4)
And from the .mid:
Geometry: Unknown (any)
Feature Count: 2278213
Extent: (48385.790000, 1620790.500000) - (1197778.210000, 2676806.350000)
Layer SRS WKT:
PROJCS["unnamed",
GEOGCS["unnamed",...]
idINSPIRE: String (30.0)
id: String (21.0)
A solution: I converted the .mif in two steps, going through a .shp and merging the generated DBF file with the original one before converting to GeoJSON, since the pairings are .mif/.mid and .shx/.shp/.dbf.
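A sketch of that two-step route (file names are hypothetical, and the DBF merge itself is done separately with another tool or script):

```shell
# Step 1: convert the MIF/MID pair to a shapefile.
ogr2ogr -f "ESRI Shapefile" car_m_shp car_m.mif

# Step 2: merge the attributes from the original car_m.dbf into the
# generated car_m_shp/car_m.dbf, then convert the shapefile to GeoJSON.
ogr2ogr -f GeoJSON car_m.json car_m_shp/car_m.shp
```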
I have a .js or .json file where all my parameters are defined. I want Jenkins to read these parameter names from the file during the build, so that the parameters for Suite1 are displayed in one job and the parameters for Suite2 in another.
For Suite1, Jenkins should show smoke and default, while for Suite2 it should show default and Testing in the drop-down. Can anyone please suggest the right way to do this?
module.exports = {
  Suite1: {
    smoke: ['file1.spec.js', 'file2.spec.js'],
    default: ['file3.spec.js']
  },
  Suite2: {
    default: ['file2.spec.js'],
    Testing: ['file2.spec.js']
  }
}
I tried the Extended Choice Parameter plugin but am not getting the desired results, as there is no way to import a JSON file as a parameter there.
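One hedged sketch, assuming the module.exports file is first converted to plain JSON (the suites.json path is hypothetical) and that a parameter plugin that accepts a Groovy script (such as Active Choices) is installed:

```groovy
// Active Choices-style parameter script (a sketch, not a verified Jenkins setup).
import groovy.json.JsonSlurper

def suites = new JsonSlurper().parse(new File('/path/to/suites.json'))
// For the Suite1 job this returns the drop-down entries, e.g. smoke and default;
// the Suite2 job would use suites['Suite2'] instead.
return suites['Suite1'].keySet() as List
```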
I am working with OWLAPI v3.5.2, iterating over all classes via getClassesInSignature(true) to include the imports closure of the current ontology, and I am wondering if there's a similar way to include the imports closure for getAnnotations() as well.
A very basic example would be the following:
for (OWLClass klass : ontology.getClassesInSignature(true)) {
  for (OWLAnnotation annotation : klass.getAnnotations(ontology, datafactory.getRDFSLabel())) {
    ...
  }
}
Currently, only rdfs:label annotations contained in the root ontology are found, while classes originating from owl:imports are not.
It's available in 4.x, in OWLOntology and EntitySearcher. It is not supported in 3.5.2: it would be an interface change, so it's not going to be backported.
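For reference, a sketch of what the loop could look like after upgrading to OWLAPI 4.x (the variable names are carried over from the question, and the exact overloads should be checked against the 4.x javadoc):

```java
// Sketch against OWLAPI 4.x: EntitySearcher can take a collection of
// ontologies, so the imports closure can be searched for annotations.
for (OWLClass klass : ontology.getClassesInSignature(Imports.INCLUDED)) {
  for (OWLAnnotation annotation :
      EntitySearcher.getAnnotations(klass, ontology.getImportsClosure(),
                                    datafactory.getRDFSLabel())) {
    // ... rdfs:label from the root ontology and its imports
  }
}
```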