How to generate JSON-Schema from Swagger API Declaration - swagger

I have a Swagger API declaration for services using Swagger v1.2.
My original feeling about Swagger was that it is very close to JSON Schema (Draft 3 and later Draft 4), and that it should be relatively easy to generate JSON Schema for request and response objects.
However, while part of Swagger reuses JSON Schema structures, it turns out that it uses only a subset of the features, and it also introduces its own inheritance for models (using subTypes and discriminator).
Question: Is there any existing project or piece of code which can generate a usable JSON Schema from a Swagger API declaration?
Ideally JSON Schema Draft 4 and Python (but I will be happy to find anything).

After a longer fight with using Swagger to specify a REST API and reuse it in related test suites, I will share my own experience with it (answering my own question).
Swagger supports only a subset of JSON Schema Draft 4
The specifications of Swagger 1.2 and 2.0 state that they support only a subset of JSON Schema Draft 4 (see here). This means that:
one cannot rely on every valid JSON Schema being completely supported by Swagger.
thinking of XML, Swagger supports only a canonical representation of the subset of JSON structures provided by JSON Schema Draft 4.
In other words:
Swagger (1.2 and 2.0) does not support many JSON structures which are valid in terms of JSON Schema Draft 4 (the same applies to Draft 3).
Swagger does not support general XML data structures; only very restricted structures are allowed.
In practice, you cannot start by designing your data in JSON or XML; with Swagger you have to start and end with Swagger.
Getting JSON Schema is theoretically possible, but not easy
I spent some time coding a library which would take a Swagger API specification and create JSON Schema Draft 4. I gave up for a couple of reasons:
it was not easy at all.
I was disappointed to find that I could use only a subset of what JSON Schema provides. We already had a JSON payload proposed and had to start modifying it just to fit what the Swagger specification framework allows.
Apart from having a really nice-looking UI for showing and testing the API (yes, everybody agrees it is visually very pleasing), I found it weird that a specification framework would not allow us to use what we want, but instead adds unexpected restrictions to our design.
If you want full JSON or XML Schema support, use RAML
Researching other API specification frameworks, I found RAML. As it is built from the ground up to support any JSON Schema Draft 3/4 or W3C XML Schema 1.0 data structures, the experience was excellent: with the structure of my payload already designed, I was able to author the API specification very quickly, and the subsequent validation of real requests and responses against the defined schemas was very easy, as the schemas are essential components of the specification and no restrictions are placed on them.
RAML was at version 0.8 that time (version 1.0 was not released yet).
Correcting the question leads to the real solution
A good question makes half of the solution. My question was wrong, as it did not express my real expectations. The corrected question would be:
Which specification framework and technique should be used to specify a REST API whose payload is defined by arbitrary JSON Schema Draft 4 or W3C XML Schema 1.0?
My answer to such a question would be:
Design your payload in JSON Schema Draft 4 or W3C XML Schema.
Describe your REST API by means of RAML (v0.8 at the moment).
There might be other usable specification frameworks, but Swagger (neither v1.2 nor v2.0) is definitely not one of them. Despite providing a lot of features (code generation, very nice-looking API documentation, and much more), it simply fails to provide a solution to the updated question stated above.
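The workflow recommended above (design the payload contract first, then validate real traffic against it) is easy to sketch in Python with the `jsonschema` package. This is my own minimal illustration, not part of the answer; the schema and payloads are made up:

```python
# Validate request/response payloads against an arbitrary Draft 4 schema.
# Requires the third-party "jsonschema" package (pip install jsonschema).
from jsonschema import Draft4Validator

schema = {
    "$schema": "http://json-schema.org/draft-04/schema#",
    "type": "object",
    "required": ["id", "name"],
    "properties": {
        "id": {"type": "integer"},
        "name": {"type": "string"},
    },
}

validator = Draft4Validator(schema)

good = {"id": 1, "name": "Rex"}
bad = {"name": "Rex"}  # missing the required "id"

assert not list(validator.iter_errors(good))  # no validation errors
assert list(validator.iter_errors(bad))       # at least one error: "id" is required
```

Because the schema is an ordinary Draft 4 document, the same file can drive documentation, test suites, and runtime validation without being restricted to what Swagger happens to support.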

There is a Python tool that does exactly this, named openapi2jsonschema.
You can simply install it using pip.
The readme for openapi2jsonschema shows the simplest way to use it:
openapi2jsonschema https://raw.githubusercontent.com/kubernetes/kubernetes/master/api/openapi-spec/swagger.json
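The core idea behind such a converter can be sketched by hand with the standard library. This is my own illustration of the principle (not how openapi2jsonschema is actually implemented): lift one entry out of the spec's `definitions` map and carry the whole map along so internal `#/definitions/...` references keep resolving. The tiny inline spec is hypothetical:

```python
import json

# A tiny hypothetical Swagger 2.0 spec standing in for a real swagger.json.
swagger = {
    "swagger": "2.0",
    "definitions": {
        "Pet": {
            "type": "object",
            "required": ["id", "name"],
            "properties": {
                "id": {"type": "integer", "format": "int64"},
                "name": {"type": "string"},
                "tag": {"type": "string"},
            },
        }
    },
}

def definition_to_schema(spec, name):
    """Turn one Swagger definition into a standalone Draft 4 schema document."""
    schema = dict(spec["definitions"][name])
    schema["$schema"] = "http://json-schema.org/draft-04/schema#"
    # Embed all definitions so "#/definitions/..." refs inside the model still work.
    schema["definitions"] = spec["definitions"]
    return schema

pet_schema = definition_to_schema(swagger, "Pet")
print(json.dumps(pet_schema, indent=2))
```

This only covers the easy case; the tool also handles OpenAPI-specific keywords that have no Draft 4 equivalent, which is exactly the subset problem discussed in the accepted answer.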
Hope this helps.

I just wrote a tool, pyswagger, that seems to fit your need.
It loads a Swagger API declaration and is able to convert Python objects to/from Swagger primitives. It also provides a set of client implementations (including requests and tornado.httpclient.AsyncHTTPClient) that are able to make requests to a Swagger-enabled service directly.
This tool aims to solve the first problem I encountered when adopting Swagger in Python, and it is still pretty new. Any suggestions are welcome.

I've had some success doing it like this:
swagger.yaml -> proto -> jsonschema
I used openapi2proto to generate proto files from the Swagger YAML, then protoc-gen-jsonschema to generate the JSON Schemas from those. It's good enough to get a typed JSON Schema, but proto3 doesn't support "required" fields, so you miss out on those.

2023 Answer Update:
Open https://editor.swagger.io/
Paste your swagger.yaml file
Click Edit -> Convert and save as JSON.
Enjoy!

Install the OpenAPI-to-JSON-Schema extractor:
Open a terminal and run the following commands:
sudo yum install python-pip
pip install openapi2jsonschema
Download the OpenAPI YAML file to a folder.
cd to the downloaded folder and then run this command:
openapi2jsonschema --strict <openapi yaml filename>

I've written a recursive function for creating a JSON schema from a Swagger definition.
For example, consider the Swagger specification petstore-minimal.
The definition Pet is converted into the JSON schema below:
{
  "description": null,
  "type": "object",
  "properties": {
    "name": {
      "description": null,
      "type": "string"
    },
    "id": {
      "format": "int64",
      "description": null,
      "type": "integer"
    },
    "tag": {
      "description": null,
      "type": "string"
    }
  }
}
Of course, this is a very minimal JSON schema, but much more can be achieved with the function I wrote. Let me explain how I did it. I used the following Maven dependency to fetch the definitions from the Swagger specification:
<dependency>
    <groupId>io.swagger.parser.v3</groupId>
    <artifactId>swagger-parser</artifactId>
    <version>2.1.2</version>
</dependency>
To create the JSON schema, I used this Maven dependency:
<dependency>
    <groupId>net.minidev</groupId>
    <artifactId>json-smart</artifactId>
    <version>2.4.7</version>
</dependency>
Now to the coding part. The inputs are the location of the Swagger spec and the name of the definition which we need to convert to a JSON schema:
public static void main(String[] args) {
    String jsonSchema = SwaggerUtil.generateJsonSchemaFromSwaggerSpec("path to swagger spec", "Pet");
    System.out.println(jsonSchema);
}
Now, we need to process the swagger definition passed as the input and recursively process the properties of the definition
public static String generateJsonSchemaFromSwaggerSpec(String swaggerPath, String fieldName) {
    Swagger swagger = new SwaggerParser().read(swaggerPath);
    Map<String, Model> definitions = swagger.getDefinitions();
    Model schemaGenerationDefinition = definitions.get(fieldName);
    Map<String, Property> propertyMap = schemaGenerationDefinition.getProperties();
    Map<String, JsonProperty> customJsonPropertyMap = new HashMap<>();
    propertyMap.forEach((propertyName, property) -> {
        JsonProperty jsonProperty = processSwaggerProperties(propertyName, property, definitions);
        customJsonPropertyMap.put(propertyName, jsonProperty);
    });
    JsonObjectProperty objectProperty = new JsonObjectProperty(customJsonPropertyMap, schemaGenerationDefinition.getDescription());
    JSONObject generatedObject = objectProperty.toJsonObject();
    String jsonSchema = generatedObject.toJSONString();
    return jsonSchema;
}
private static JsonProperty processReference(String referenceName, String type, Map<String, Model> definitions) {
    Model model = definitions.get(referenceName);
    Map<String, Property> propertyMap = model.getProperties();
    Map<String, JsonProperty> jsonPropertyMap = new HashMap<>();
    propertyMap.forEach((propertyName, property) -> {
        JsonProperty jsonProperty = processSwaggerProperties(propertyName, property, definitions);
        jsonPropertyMap.put(propertyName, jsonProperty);
    });
    if (type.equalsIgnoreCase("array")) {
        JsonArrayProperty jsonProperty = new JsonArrayProperty(model.getDescription());
        jsonProperty.loadPropertiesFromMap(jsonPropertyMap);
        return jsonProperty;
    } else {
        JsonObjectProperty objectProperty = new JsonObjectProperty(jsonPropertyMap, model.getDescription());
        return objectProperty;
    }
}
private static JsonProperty processSwaggerProperties(String propertyName, Property property, Map<String, Model> propertyDefinitions) {
    String definitionRefPath = "";
    String type = "";
    JsonProperty jsonProperty = null;
    if (property.getType().equalsIgnoreCase("ref")) {
        definitionRefPath = ((RefProperty) property).getOriginalRef();
        type = "object";
    } else if (property.getType().equalsIgnoreCase("array")) {
        type = "array";
        Property childProperty = ((ArrayProperty) property).getItems();
        if (childProperty instanceof RefProperty) {
            RefProperty refProperty = (RefProperty) ((ArrayProperty) property).getItems();
            definitionRefPath = refProperty.getOriginalRef();
        } else {
            JsonArrayProperty arrayProperty = new JsonArrayProperty(property.getDescription());
            arrayProperty.loadChildProperty(childProperty);
            return arrayProperty;
        }
    } else {
        jsonProperty = PropertyFactory.createJsonProperty(property);
        return jsonProperty;
    }
    String[] splitResult = definitionRefPath.split("/");
    if (splitResult.length == 3) {
        String propertyPath = splitResult[2];
        System.out.println(propertyPath);
        jsonProperty = processReference(propertyPath, type, propertyDefinitions);
    }
    return jsonProperty;
}
To create the JSON schema, I created my own custom JSON schema classes, one for each of the JSON Schema data types. I also wrote a factory class to create the required JSON type:
public class PropertyFactory {
    public static JsonProperty createJsonProperty(Property property) {
        JsonProperty jsonProperty = null;
        switch (property.getType()) {
            case "number":
                jsonProperty = new JsonNumberProperty(property.getFormat(), property.getDescription());
                break;
            case "string":
                jsonProperty = new JsonStringProperty(property.getDescription());
                break;
            case "boolean":
                jsonProperty = new JsonBooleanProperty(property.getDescription());
                break;
            case "integer":
                jsonProperty = new JsonIntegerProperty(property.getFormat(), property.getDescription());
                if (property instanceof IntegerProperty) {
                    IntegerProperty integerProperty = (IntegerProperty) property;
                    if (integerProperty.getMinimum() != null)
                        ((JsonIntegerProperty) jsonProperty).setMinimum(integerProperty.getMinimum());
                    if (integerProperty.getMaximum() != null)
                        ((JsonIntegerProperty) jsonProperty).setMaximum(integerProperty.getMaximum());
                } else if (property instanceof LongProperty) {
                    LongProperty longProperty = (LongProperty) property;
                    if (longProperty.getMinimum() != null)
                        ((JsonIntegerProperty) jsonProperty).setMinimum(longProperty.getMinimum());
                    if (longProperty.getMaximum() != null)
                        ((JsonIntegerProperty) jsonProperty).setMaximum(longProperty.getMaximum());
                }
                break;
            default:
                System.out.println("Unhandled type");
        }
        return jsonProperty;
    }
}
Below are the abstractions I created for each of the JSON data types:
public abstract class JsonProperty {
    protected String type;
    protected JSONArray required;
    protected String description;

    protected JsonProperty(String type, String description) {
        this.type = type;
        this.description = description;
    }

    protected abstract JSONObject toJsonObject();
}
public class JsonArrayProperty extends JsonProperty {
    private JsonProperty items;

    public JsonArrayProperty(String description) {
        super("array", description);
    }

    @Override
    protected JSONObject toJsonObject() {
        JSONObject jsonObject = new JSONObject();
        jsonObject.put("type", this.type);
        jsonObject.put("description", this.description);
        jsonObject.put("items", this.items.toJsonObject());
        return jsonObject;
    }

    public void loadPropertiesFromMap(Map<String, JsonProperty> propertyMap) {
        this.items = new JsonObjectProperty(propertyMap, this.description);
    }

    public void loadChildProperty(Property childProperty) {
        this.items = PropertyFactory.createJsonProperty(childProperty);
    }
}
public class JsonObjectProperty extends JsonProperty {
    private Map<String, JsonProperty> properties;

    public JsonObjectProperty(Map<String, JsonProperty> properties, String description) {
        super("object", description);
        this.properties = properties;
    }

    @Override
    public JSONObject toJsonObject() {
        JSONObject jsonObject = new JSONObject();
        jsonObject.put("type", this.type);
        jsonObject.put("description", this.description);
        JSONObject propertyObject = new JSONObject();
        this.properties.forEach((propertyName, jsonProperty) -> {
            if (jsonProperty != null) {
                JSONObject object = jsonProperty.toJsonObject();
                propertyObject.put(propertyName, object);
            }
        });
        jsonObject.put("properties", propertyObject);
        return jsonObject;
    }

    public Map<String, JsonProperty> getProperties() {
        return properties;
    }

    public void setProperties(Map<String, JsonProperty> properties) {
        this.properties = properties;
    }
}
public class JsonNumberProperty extends JsonProperty {
    protected String format;

    public JsonNumberProperty(String format, String description) {
        super("number", description);
        this.format = format;
    }

    public JsonNumberProperty(String type, String format, String description) {
        super(type, description);
        this.format = format;
    }

    @Override
    protected JSONObject toJsonObject() {
        JSONObject jsonObject = new JSONObject();
        jsonObject.put("type", this.type);
        jsonObject.put("description", this.description);
        if (this.format != null)
            jsonObject.put("format", this.format);
        return jsonObject;
    }
}
public class JsonIntegerProperty extends JsonNumberProperty {
    private String pattern;
    private BigDecimal minimum;
    private BigDecimal maximum;

    public JsonIntegerProperty(String format, String description) {
        super("integer", format, description);
    }

    @Override
    protected JSONObject toJsonObject() {
        JSONObject jsonObject = new JSONObject();
        jsonObject.put("type", this.type);
        jsonObject.put("description", this.description);
        if (this.format != null)
            jsonObject.put("format", this.format);
        if (this.minimum != null)
            jsonObject.put("minimum", this.minimum);
        if (this.maximum != null)
            jsonObject.put("maximum", this.maximum);
        return jsonObject;
    }

    public void setPattern(String pattern) {
        this.pattern = pattern;
    }

    public void setMinimum(BigDecimal minimum) {
        this.minimum = minimum;
    }

    public void setMaximum(BigDecimal maximum) {
        this.maximum = maximum;
    }
}
public class JsonStringProperty extends JsonProperty {
    public JsonStringProperty(String description) {
        super("string", description);
    }

    @Override
    protected JSONObject toJsonObject() {
        JSONObject jsonObject = new JSONObject();
        jsonObject.put("type", this.type);
        jsonObject.put("description", this.description);
        return jsonObject;
    }
}
NOTE
This is a custom implementation that I did for my needs. If you come across any additional data types, you can simply create the type by extending the JsonProperty class and providing the toJsonObject implementation.
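Since the original question asked for Python, the same recursion reads quite compactly there. This is my own stdlib-only sketch of the approach (function names and the tiny spec are mine, not from the answer), assuming the Swagger spec has already been parsed into a dict:

```python
def to_json_schema(prop, definitions):
    """Recursively convert one Swagger definition/property to a JSON schema fragment."""
    if "$ref" in prop:
        # "#/definitions/Tag" -> "Tag"; note: mutually recursive models would
        # need a seen-set guard here, since references are inlined.
        name = prop["$ref"].split("/")[-1]
        return to_json_schema(definitions[name], definitions)
    if prop.get("type") == "array":
        return {"type": "array",
                "items": to_json_schema(prop["items"], definitions)}
    if prop.get("type") == "object" or "properties" in prop:
        return {"type": "object",
                "description": prop.get("description"),
                "properties": {k: to_json_schema(v, definitions)
                               for k, v in prop.get("properties", {}).items()}}
    # Primitive: copy the type plus the constraints the Java factory handles.
    out = {"type": prop.get("type"), "description": prop.get("description")}
    for key in ("format", "minimum", "maximum"):
        if key in prop:
            out[key] = prop[key]
    return out

# A hypothetical definitions map in the shape swagger-parser would produce.
definitions = {
    "Pet": {"type": "object", "properties": {
        "id": {"type": "integer", "format": "int64"},
        "name": {"type": "string"},
        "tags": {"type": "array", "items": {"$ref": "#/definitions/Tag"}}}},
    "Tag": {"type": "object", "properties": {"label": {"type": "string"}}},
}

schema = to_json_schema(definitions["Pet"], definitions)
```

Like the Java version, this inlines referenced definitions rather than emitting `$ref` targets, which keeps the output self-contained at the cost of duplication.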
Happy Coding

If you are using Swagger UI for documentation, the link to the spec shown at the top of the page returns the JSON object for your API schema.

Related

Passing body in restassured using Generics doesn't work for File and FileInputStream

In REST Assured we can pass the request payload to the body method in different ways, like:
String
POJO Object
Map Object
JsonObject (from GSON library)
File and
FileInputStream
So I created the following method using generics to accommodate all these types:
public <T> Response postAMember(T body) {
    return given().spec(this.spec).body(body).when().post(EndPoints.GET_ALL_POST_A_MEMBER).andReturn();
}
Now, this is how I'm consuming it for each respective type (not all in one go, one at a time):
@Test
public void postMember() throws IOException {
    // Using HashMap
    Map<String, String> body = new HashMap<>();
    body.put("name", "Rocky");
    body.put("gender", "Male");

    // Using Model and GSON
    Member imember = new Member("Rocky", "Male");
    Gson gson = new GsonBuilder().excludeFieldsWithoutExposeAnnotation().create();
    String body = gson.toJson(imember);

    // Using JsonObject (GSON)
    JsonObject body = new JsonObject();
    body.addProperty("name", "Rocky");
    body.addProperty("gender", "Male");

    // Using Payload JSON File
    File body = new File("src/test/resources/Payloads/postmemberpayload.json");

    // Using Raw String
    String body = "{\r\n" +
            " \"name\": \"Rocky\",\r\n" +
            " \"gender\": \"Male\"\r\n" +
            "}";

    // Using FileInputStream
    FileInputStream fis = new FileInputStream(body); // in this case I would pass fis to the body method

    Response resp = MemberService.getMemberServiceInstance().postAMember(body);
    Assert.assertEquals(resp.getStatusCode(), StatusCode.CREATED_201);
    Member omember = resp.getBody().as(Member.class);
    System.out.println(omember.toString());
}
The postAMember method works fine only with:
String
POJO Object
Map Object
JsonObject (from GSON library)
But it fails with the remaining two:
File - the output is a 400 Bad Request
FileInputStream - the output is java.lang.IllegalArgumentException: jdk.internal.ref.PhantomCleanable<?> declares multiple JSON fields named next
And for now I've had to make the following two additional overloaded versions of postAMember:
public Response postAMember(File body) {
    return given().spec(this.spec).body(body).when().post(EndPoints.GET_ALL_POST_A_MEMBER).andReturn();
}

public Response postAMember(FileInputStream body) {
    return given().spec(this.spec).body(body).when().post(EndPoints.GET_ALL_POST_A_MEMBER).andReturn();
}
The above two methods generate the response correctly. Any clue what's wrong here? Why is the generic method not able to take File and FileInputStream?
I've fetched the latest REST Assured libraries from the Maven central repo.
As far as I understand, your generic method maps to body(Object object) of RequestSpecification, and then this object is serialized.
class RequestSpecificationImpl {
    ...
    RequestSpecification body(Object object) {
        ...
        this.requestBody = ObjectMapping.serialize(object, requestContentType,
                findEncoderCharsetOrReturnDefault(requestContentType), null,
                objectMappingConfig(), restAssuredConfig().getEncoderConfig());
        ...
    }
    ...
}
All of the below kinds of objects have no problem with serialization:
String
POJO Object
Map Object
JsonObject (from GSON library)
But:
File ---serialize---> the full path of the file
FileInputStream ---serialize---> exception (from Gson/Jackson)
When you add the two methods, REST Assured correctly maps to body(File body) and body(InputStream body) --> no serialization for them --> no issue.

Neo4j - Custom converter for field of type List

I am trying to write a custom converter for a nested object so that this object gets saved as string in Neo4j database.
I am using the @Convert annotation on my field and passing ImageConverter.class, which is my AttributeConverter class.
Everything works fine as expected and I am able to save string representation of Image class in Neo4j db.
However, now instead of a single image I want to have List<Image> as my nested field. In this case, putting @Convert(ImageConverter.class) doesn't work.
I see that there is a class called ConverterBasedCollectionConverter which gets used when I have a field of type List<LocalDateTime>.
However, I couldn't find any examples of how to use this class with custom converters.
Please, can anyone help me with this, or suggest any other approach to using a custom converter on a field of type List?
I am using Neo4j (version 3.4.1) and Spring Data Neo4j (5.0.10.RELEASE) in my application. I am also using OGM.
PS: I am aware that it is advised to store nested objects as separate node establishing a relationship with parent object. However, my use case demands that the object be stored as string property and not as separate node.
Regards,
V
It is not as difficult as I assumed it would be.
Given a class (snippet)
@NodeEntity
public class Actor {
    @Id @GeneratedValue
    private Long id;

    @Convert(MyImageListConverter.class)
    public List<MyImage> images = new ArrayList<>();
    // ....
}
with MyImage as simple as can be
public class MyImage {
    public String blob;

    public MyImage(String blob) {
        this.blob = blob;
    }

    public static MyImage of(String value) {
        return new MyImage(value);
    }
}
and a converter
public class MyImageListConverter implements AttributeConverter<List<MyImage>, String[]> {

    @Override
    public String[] toGraphProperty(List<MyImage> value) {
        if (value == null) {
            return null;
        }
        String[] values = new String[value.size()];
        int i = 0;
        for (MyImage image : value) {
            values[i++] = image.blob;
        }
        return values;
    }

    @Override
    public List<MyImage> toEntityAttribute(String[] values) {
        List<MyImage> images = new ArrayList<>(values.length);
        for (String value : values) {
            images.add(MyImage.of(value));
        }
        return images;
    }
}
will print the following debug output on save, which I think is what you want:
UNWIND {rows} as row CREATE (n:Actor) SET n=row.props RETURN row.nodeRef as ref, ID(n) as id, {type} as type with params {type=node, rows=[{nodeRef=-1, props={images=[blobb], name=Jeff}}]}
especially the images part.
The test method for this looks like:
@Test
public void test() {
    Actor jeff = new Actor("Jeff");
    String blobValue = "blobb";
    jeff.images.add(new MyImage(blobValue));
    session.save(jeff);
    session.clear();

    Actor loadedActor = session.load(Actor.class, jeff.getId());
    assertThat(loadedActor.images.get(0).blob).isEqualTo(blobValue);
}
I came up with a solution to my problem. So, in case you want another solution along with the one provided by @meistermeier, you can use the code below.
public class ListImageConverter extends ConverterBasedCollectionConverter<Image, String> {

    public ListImageConverter() {
        super(List.class, new ImageConverter());
    }

    @Override
    public String[] toGraphProperty(Collection<Image> values) {
        Object[] graphProperties = super.toGraphProperty(values);
        String[] stringArray = Arrays.stream(graphProperties).toArray(String[]::new);
        return stringArray;
    }

    @Override
    public Collection<Image> toEntityAttribute(String[] values) {
        return super.toEntityAttribute(values);
    }
}
The ImageConverter class just implements AttributeConverter<Image, String>, where I serialize and deserialize my Image object to/from JSON.
I chose this approach because I had an Image field in one object and List<Image> in another. So just by changing @Convert(ListImageConverter.class) to @Convert(ImageConverter.class), I was able to save a list as well as a single object in the Neo4j database.
Note: You can skip overriding the toEntityAttribute method if you want; it doesn't add much value.
However, you have to override toGraphProperty, as the Neo4j code checks for the presence of a declared method named toGraphProperty.
Hope this helps someone!
Regards,
V

Handle Double.NaN from Java to C# using snakeyaml (Java) and YamlDotNet (C#)

I'm using YAML to communicate between a C# GUI and server-side Java, which works fine in general. However, if I pass a field that is a Double and the value is Double.NaN on the Java side, the YAML comes across as ".NaN", and when I come to deserialize on the C# side, a System.FormatException is thrown, as C# expects the string "NaN" [not ".NaN"].
Does anyone know if there is a way to intercept the deserializer, or add formatting, so that on the C# side ".NaN" can be parsed into a double?
(One workaround I can think of is changing all NaNs to a special value before serializing to YAML, and then on the C# side recognizing the special value and converting it back to NaN, but this seems like a big hack.)
It seems that this is a bug in the way YamlDotNet handles floats. Until it is fixed, you can work around it by registering a custom INodeDeserializer that will handle these special cases.
Here is a quick-and-dirty implementation of such a deserializer:
public class FloatNodeDeserializer : INodeDeserializer
{
    private static readonly Dictionary<Tuple<Type, string>, object> SpecialFloats =
        new Dictionary<Tuple<Type, string>, object>
        {
            { Tuple.Create(typeof(float), ".nan"), float.NaN },
            { Tuple.Create(typeof(float), ".inf"), float.PositiveInfinity },
            { Tuple.Create(typeof(float), "-.inf"), float.NegativeInfinity },
            { Tuple.Create(typeof(double), ".nan"), double.NaN },
            { Tuple.Create(typeof(double), ".inf"), double.PositiveInfinity },
            { Tuple.Create(typeof(double), "-.inf"), double.NegativeInfinity },
        };

    bool INodeDeserializer.Deserialize(
        EventReader reader,
        Type expectedType,
        Func<EventReader, Type, object> nestedObjectDeserializer,
        out object value
    ) {
        var scalar = reader.Peek<Scalar>();
        if (scalar == null) {
            value = null;
            return false;
        }

        var found = SpecialFloats.TryGetValue(
            Tuple.Create(expectedType, scalar.Value),
            out value);

        if (found) {
            reader.Allow<Scalar>();
        }
        return found;
    }
}
The way to register it is:
var deserializer = new Deserializer();
deserializer.NodeDeserializers.Insert(0, new FloatNodeDeserializer());
See a fully working fiddle here

NewtonSoft json Contract Resolver with MVC 4.0 Web Api not producing the output as expected

I am trying to create a conditional ContractResolver so that I can control the serialization differently depending on the web request/controller action.
For example, in my user controller I want to serialize all properties of my user, but for some of the related objects I might only serialize the primitive types. And if I went to my company controller, I would want to serialize all the properties of the company but maybe only the primitive ones of the user. (Because of this I don't want to use data annotations or ShouldSerialize functions.)
So, looking at the custom ContractResolver page, I created my own.
http://james.newtonking.com/projects/json/help/index.html?topic=html/ContractResolver.htm
It looks like this
public class IgnoreListContractResolver : DefaultContractResolver
{
    private readonly Dictionary<string, List<string>> IgnoreList;

    public IgnoreListContractResolver(Dictionary<string, List<string>> i)
    {
        IgnoreList = i;
    }

    protected override IList<JsonProperty> CreateProperties(Type type, MemberSerialization memberSerialization)
    {
        List<JsonProperty> properties = base.CreateProperties(type, memberSerialization).ToList();
        if (IgnoreList.ContainsKey(type.Name))
        {
            properties.RemoveAll(x => IgnoreList[type.Name].Contains(x.PropertyName));
        }
        return properties;
    }
}
And then in my Web API controller action for GetUsers I do this:
public dynamic GetUsers()
{
    List<User> Users = db.Users.ToList();

    List<string> RoleList = new List<string>();
    RoleList.Add("UsersInRole");

    List<string> CompanyList = new List<string>();
    CompanyList.Add("CompanyAccesses");
    CompanyList.Add("ArchivedMemberships");
    CompanyList.Add("AddCodes");

    Dictionary<string, List<string>> IgnoreList = new Dictionary<string, List<string>>();
    IgnoreList.Add("Role", RoleList);
    IgnoreList.Add("Company", CompanyList);

    GlobalConfiguration
        .Configuration
        .Formatters.JsonFormatter
        .SerializerSettings
        .ContractResolver = new IgnoreListContractResolver(IgnoreList);

    return new { List = Users, Status = "Success" };
}
So when debugging this I see my contract resolver run, and it returns the correct properties, but the JSON returned to the browser still contains entries for the properties I removed from the list.
Any ideas what I am missing, or how I can step into the JSON serialization step in Web API controllers?
**UPDATE**
I should add that this is in an MVC4 project that has both MVC controllers and Web API controllers. The User, Company, and Role objects are objects (created code-first) that get loaded from EF5. The controller in question is a Web API controller. Not sure why this matters, but I tried this in a clean Web API project (and without EF5) instead of an MVC project and it worked as expected. Does that help identify where the problem might be?
Thanks
**UPDATE 2**
In the same MVC4 project I created an extension method for the Object class called ToJson. It uses Newtonsoft.Json.JsonSerializer to serialize my entities. It's this simple:
public static string ToJson(this object o, Dictionary<string, List<string>> IgnoreList)
{
    JsonSerializer js = JsonSerializer.Create(new Newtonsoft.Json.JsonSerializerSettings()
    {
        Formatting = Formatting.Indented,
        DateTimeZoneHandling = DateTimeZoneHandling.Utc,
        ContractResolver = new IgnoreListContractResolver(IgnoreList),
        ReferenceLoopHandling = ReferenceLoopHandling.Ignore
    });
    js.Converters.Add(new Newtonsoft.Json.Converters.StringEnumConverter());
    var jw = new StringWriter();
    js.Serialize(jw, o);
    return jw.ToString();
}
And then in an MVC action i create a json string like this.
model.jsonUserList = db.Users.ToList().ToJson(IgnoreList);
Where the ignore list is created exactly as in my previous post. Again, I see the contract resolver run and correctly limit the properties list, but the output JSON string still contains everything (including the properties I removed from the list). Does this help? I must be doing something wrong, and now it seems like it isn't the MVC or Web API framework. Could this have anything to do with EF interactions / proxies / etc.? Any ideas would be much appreciated.
Thanks
**UPDATE 3**
Process of elimination and a little more thorough debugging made me realize that EF5 dynamic proxies were messing up my serialization and the ContractResolver's type-name match. So here is my updated IgnoreListContractResolver. At this point I am just looking for opinions on better ways, or whether I am doing something terrible. I know this is jumping through a lot of hoops just to use my EF objects directly instead of DTOs, but in the end I am finding this solution really flexible.
public class IgnoreListContractResolver : CamelCasePropertyNamesContractResolver
{
    private readonly Dictionary<string, List<string>> IgnoreList;

    public IgnoreListContractResolver(Dictionary<string, List<string>> i)
    {
        IgnoreList = i;
    }

    protected override IList<JsonProperty> CreateProperties(Type type, MemberSerialization memberSerialization)
    {
        List<JsonProperty> properties = base.CreateProperties(type, memberSerialization).ToList();
        string typename = type.Name;
        if (type.FullName.Contains("System.Data.Entity.DynamicProxies."))
        {
            typename = type.FullName.Replace("System.Data.Entity.DynamicProxies.", "");
            typename = typename.Remove(typename.IndexOf('_'));
        }
        if (IgnoreList.ContainsKey(typename))
        {
            // remove anything in the ignore list and ignore case because we are using camel case for json
            properties.RemoveAll(x => IgnoreList[typename].Contains(x.PropertyName, StringComparer.CurrentCultureIgnoreCase));
        }
        return properties;
    }
}
I think it might help if you used Type instead of string as the ignore list's key type. That way you avoid naming issues (multiple types with the same name in different namespaces) and you can make use of inheritance. I'm not familiar with EF5 and the proxies, but I guess that the proxy classes derive from your entity classes, so you can check Type.IsAssignableFrom() instead of just checking whether the type name is a key in the ignore list.
private readonly Dictionary<Type, List<string>> IgnoreList;

protected override IList<JsonProperty> CreateProperties(Type type, MemberSerialization memberSerialization)
{
    List<JsonProperty> properties = base.CreateProperties(type, memberSerialization).ToList();
    // look for the first dictionary entry whose key is a superclass of "type"
    Type key = IgnoreList.Keys.FirstOrDefault(k => k.IsAssignableFrom(type));
    if (key != null)
    {
        // remove anything in the ignore list and ignore case because we are using camel case for json
        properties.RemoveAll(x => IgnoreList[key].Contains(x.PropertyName, StringComparer.CurrentCultureIgnoreCase));
    }
    return properties;
}
Then the ignore list must be created like this (I also used the short syntax for creating the list and dictionary):
var CompanyList = new List<string> {
    "CompanyAccesses",
    "ArchivedMemberships",
    "AddCodes"
};

var IgnoreList = new Dictionary<Type, List<string>> {
    // I just replaced "Company" with typeof(Company) here:
    { typeof(Company), CompanyList }
};
Be aware that, if you use my code above, adding typeof(object) as the first key to the ignore list will cause this entry to be matched every time, and none of your other entries will ever be used! This happens because a variable of type object is assignable from every other type.

JSON serialization, returning keys that have dashes in them?

I would like to return JSON from my controller which was generated from an anonymous type and contains dashes in the key names. Is this possible?
So if I have this:
public ActionResult GetJSONData() {
    var data = new { DataModifiedDate = myDate.ToShortDateString() };
    return Json(data);
}
On the client side I would like it to arrive serialized like this:
{ "data-modified-date" : "3/17/2011" }
My reason for wanting this is that the JSON data will ultimately become attributes on a DOM node, and I want to play nice and use the new HTML5 data attributes. I can just return { modifieddate: "3/17/2011" } and use it that way, but if I can be that little bit more conforming to standards, I'd like to be.
I understand that if I write my own JsonResult class that uses the WCF JSON serializer on a non-anonymous type, I can use the DataMemberAttribute to accomplish this. But that's a lot of overhead for such a simple desire.
I could also have the client massage the keys for me once it receives the data, but I'm hoping to avoid that too. All in all, I'd rather just not follow standards than use either of these workarounds.
You could use Json.NET and have full control over property names:
public ActionResult GetJSONData()
{
    var obj = new JObject();
    obj["data-modified-date"] = myDate.ToShortDateString();
    var result = JsonConvert.SerializeObject(obj);
    return Content(result, "application/json");
}
Obviously this code is screaming to be improved by introducing a custom action result:
public class JsonNetResult : ActionResult
{
    private readonly JObject _jObject;

    public JsonNetResult(JObject jObject)
    {
        _jObject = jObject;
    }

    public override void ExecuteResult(ControllerContext context)
    {
        var response = context.HttpContext.Response;
        response.ContentType = "application/json";
        response.Write(JsonConvert.SerializeObject(_jObject));
    }
}
and then:
public ActionResult GetJSONData()
{
    var obj = new JObject();
    obj["data-modified-date"] = myDate.ToShortDateString();
    return new JsonNetResult(obj);
}
I found that the JavaScriptSerializer that JsonResult uses has a special case for dictionaries. So if you just do:
var data = new Dictionary<string, string>
{
    { "data-modified-date", myDate.ToShortDateString() }
};
Then the resulting JSON is in the desired format.
