Proper way to handle Nullable References and RequiredAttribute on model binding

In my Web API controllers I post a query model (for paging, filtering, and sorting) for my entities.
I have nullable reference types enabled solution-wide (in every project's csproj).
The base classes are:
public class SearchFilter
{
    public string? q { get; set; } // <-- This should indicate that the "q" property is optional
    public bool? fullResults { get; set; } // <-- This should indicate that the "fullResults" property is optional
}
public class Query<TFilter> where TFilter : SearchFilter, new()
{
    public int skip { get; set; }
    public int take { get; set; }
    public List<SortDescriptor> sorting { get; set; } = new List<SortDescriptor>();
    public TFilter? filter { get; set; } // <-- this indicates that the "filter" property is optional
}
So, for instance, a Query model for users would be:
class UserFilter : SearchFilter
{
    public bool? is_activated { get; set; } // etc.
}
class UserQuery : Query<UserFilter>
{
}
When my client posts this request json
{
"skip":0,
"take":25,
"sorting":[{"field":"id","desc":true}]
}
The request completes successfully, as expected: the "filter" property is optional (as indicated by the nullable annotation), and indeed "filter" is missing from the JSON above.
When I try to filter my data by also providing a filter value
{
"skip":0,
"take":25,
"sorting":[{"field":"id","desc":true}],
"filter": {
"is_avtivated": true
}
}
I get a bad request response
{
"errors": {
"filter.q": ["The q field is required."]
},
"type": "https://tools.ietf.org/html/rfc7231#section-6.5.1",
"title": "One or more validation errors occurred.",
"status": 400,
"traceId": "|fb9d87a2-47080df6ddd0a9dc."
}
What is going on here? Isn't "q" supposed to be optional? I read that enabling nullable reference types in C# 8 automatically puts a [Required] attribute on non-nullable properties, but here it seems to treat nullable properties as required too!
Is this a bug in .NET Core?
Or is this expected?
And how should I make my request work?
The request is clearly correct (it worked perfectly before nullable references were enabled) and very clear. The code is also very clear:
"A filter is optional. If a filter exists, the q property of the filter is also optional."
That is what the code reads as.
As a note, I feel very frustrated that I don't have control over which properties are actually required and which are not (nullable reference annotations are not the way to declare that intention). I use nullable references only for the warnings, to spot potentially problematic pieces of code. I would expect that enabling this feature wouldn't alter the application's behavior at any level.
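For reference, ASP.NET Core 3.x is the version that started inferring [Required] for non-nullable reference-type properties when the nullable context is enabled, and that inference can be switched off globally. A minimal sketch, assuming ASP.NET Core 3.x with the usual Startup.ConfigureServices (the option itself exists; whether it resolves this particular case with generic filter types is something to verify):

```csharp
// Startup.ConfigureServices: stop MVC from treating non-nullable
// reference-type properties as implicitly [Required]. Requiredness
// is then declared explicitly with [Required] where intended.
services.AddControllers(options =>
{
    options.SuppressImplicitRequiredAttributeForNonNullableReferenceTypes = true;
});
```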

Related

How to use JSON.Net CamelCasePropertyNamesContractResolver with dot separated names?

Our application uses validation attributes to make use of the ASP.NET model validation; however, this gives dot-separated names for validation errors. When passed through the CamelCasePropertyNamesContractResolver, camel casing is only applied to the portion before the first dot, whereas we would like it applied to each section of the name.
For example, we currently get this JSON response:
{
"body.State": [
"The state field is required."
],
"body.LatestVersion": [
"The latestVersion field is required."
]
}
but would like to get:
{
"body.state": [
"The state field is required."
],
"body.latestVersion": [
"The latestVersion field is required."
]
}
In our MVC setup we do have a line similar to
services.AddJsonOptions(options => options.ContractResolver = new CamelCasePropertyNamesContractResolver());
We'd appreciate any solution, be that modifications to how we set up the resolver, or how we could modify the validation.
Edit: Just for reference, the model structure for the request that is generating this response is as follows:
public sealed class RequestModel
{
    [FromRoute, DisplayName("entity"), Required, MaxLength(255)]
    public string Entity { get; set; }

    [FromBody, DisplayName("body"), Required]
    public BodyModel Body { get; set; }
}
public sealed class BodyModel
{
    [DisplayName("latestVersion"), Required, MaxLength(255)]
    public string LatestVersion { get; set; }

    [DisplayName("state"), Required]
    public ModelState State { get; set; }
}
and the request body being sent is:
{
}
Assuming that the validation errors are serialized as some sort of IDictionary<string, T> for some T, then the JSON property names corresponding to the dictionary keys can be piecewise camel-cased between each . character by creating a custom naming strategy.
Json.NET encapsulates the logic to algorithmically remap property names and dictionary keys (e.g. to camel case) in the NamingStrategy type, specifically CamelCaseNamingStrategy. To modify the logic of a naming strategy to apply to each portion of a property name between . characters, you can adopt the decorator pattern and create a decorator naming strategy that applies some inner strategy to each portion of the name like so:
public class PiecewiseNamingStrategy : NamingStrategy
{
    readonly NamingStrategy baseStrategy;

    public PiecewiseNamingStrategy(NamingStrategy baseStrategy)
    {
        if (baseStrategy == null)
            throw new ArgumentNullException(nameof(baseStrategy));
        this.baseStrategy = baseStrategy;
    }

    protected override string ResolvePropertyName(string name)
    {
        // Apply the inner strategy to each dot-separated segment of the name.
        return String.Join(".", name.Split('.').Select(n => baseStrategy.GetPropertyName(n, false)));
    }
}
Then, configure MVC as follows:
options.ContractResolver = new DefaultContractResolver
{
    NamingStrategy = new PiecewiseNamingStrategy(new CamelCaseNamingStrategy())
    {
        OverrideSpecifiedNames = true,
        ProcessDictionaryKeys = true
    },
};
This takes advantage of the fact that, as shown in the reference source, CamelCasePropertyNamesContractResolver is basically just a subclass of DefaultContractResolver that uses a CamelCaseNamingStrategy with ProcessDictionaryKeys = true and OverrideSpecifiedNames = true.
Notes:
Naming strategies were introduced in Json.NET 9.0.1, so this answer does not apply to earlier versions.
You may want to cache the contract resolver statically for best performance.
By setting NamingStrategy.ProcessDictionaryKeys to true, the naming strategy will be applied to all dictionary keys.
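As a quick sanity check, the decorator can be exercised directly against a dictionary with dotted keys, outside of MVC. A minimal sketch (assumes Json.NET 9.0.1+ and the PiecewiseNamingStrategy class from above):

```csharp
using System;
using System.Collections.Generic;
using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;

class Demo
{
    static void Main()
    {
        var settings = new JsonSerializerSettings
        {
            ContractResolver = new DefaultContractResolver
            {
                NamingStrategy = new PiecewiseNamingStrategy(new CamelCaseNamingStrategy())
                {
                    ProcessDictionaryKeys = true,  // apply the strategy to dictionary keys
                    OverrideSpecifiedNames = true
                }
            }
        };
        var errors = new Dictionary<string, string[]>
        {
            ["body.LatestVersion"] = new[] { "The latestVersion field is required." }
        };
        // Each dot-separated segment is camel-cased independently,
        // so the key should come out as "body.latestVersion".
        Console.WriteLine(JsonConvert.SerializeObject(errors, settings));
    }
}
```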

What can cause automapping to behave differently in a different MVC app with the same code?

I have a test app that works perfectly with the following classes in one app, but not in another:
public class ValueChange
{
    public int GroupId { get; set; }
    public List<ItemValueChange> Changes { get; set; }
}
public class ItemValueChange
{
    public int ItemId { get; set; }
    public string Value { get; set; }
    public string Key { get; set; }
}
My plugin posts a JS structure that matches this structure (changes is a jQuery array).
The raw post data (from Fiddler2) looks like:
GroupId 1000
Changes[0][Value]
Changes[0][Key]
Changes[0][ItemId] 1
In the test app this works and maps the data sent to a ValueChange object correctly.
[HttpPost]
public JsonResult Validate(ValueChange change)
{
    // The Changes property has the required array of objects/properties
}
In our main application, to which I just ported the plugin and classes, the post data sent looks like:
GroupId 3705
Changes[0][Value]
Changes[0][Key]
Changes[0][ItemId] 81866
and the validate method called looks identical:
[HttpPost]
public JsonResult Validate(ValueChange changes)
{
    // changes contains a null list and no GroupId
}
If I breakpoint this method, changes is a non-null object with a GroupId of 0 and no child elements in Changes. I can, however, see these values available from Request.Form in the debugger:
Request.Form["GroupId"] "3705" string
Request.Form["Changes[0][Key]"] "" string
Request.Form["Changes[0][ItemId]"] "81866" string
Request.Form["Changes[0][Value]"] "" string
Q. What would cause the automapping to not work in a different MVC project with this type of data?
If I simplify ValueChange to this (below) it starts working and receives GroupId values:
public class ValueChange
{
    public int GroupId { get; set; }
}
If I send JS object data without a changes property it works e.g.
{ GroupId: 123 }
Something about the list called Changes is causing the mapping to fail. I have tried it as an array and also sending a single hard-wired entry from JS like this (still fails):
{ GroupId: 123, Changes: [{ ItemId: 456, Value: "V", Key: "K" }] }
OMG. The model binder will ignore properties if a property name matches the parameter name!
It was caused simply by the parameter being called changes (vs. change in the test app) when a property of the received data was also called Changes.
Solution: I changed the parameter name e.g.
[HttpPost]
public JsonResult Validate(ValueChange valueChange)
{
}
To clarify, this problem occurs if a first-level property of the posted data matches a parameter name. If it were a nested property, it would not attempt to match the parameter name.
This little detail needs to be stapled to everyone's desk/hand/head. :)

OData vs. API controllers: expanded ("flattened") request results

This question deals with the different "size" of the JSON result returned from these two different controllers (API vs. OData).
Some example entities (this is a bad composition, made only to make a point; please don't judge the relation between these entities):
public class Customer
{
    public string Name { get; set; }
    public Category Category { get; set; }
}
public class Category
{
    public string CategoryName { get; set; }
    public List<Customer> CustomersInCategory { get; set; }
}
When making a GET request to an OData controller, say:
GET http://localhost:81/Customers
The result will not contain the Customers' Category object unless I explicitly add "$expand=Category" to the URL.
However,
the same request to an API controller will return the Customers' Category object (even if the result is IQueryable<Customer>).
The problem with this is that in case of cyclic relations between entities, the result is recursively expanded and becomes enormous (potentially infinite).
I've been looking for a solution to this problem all over, and found things like MaxDepth that don't work, and many other suggestions that led nowhere.
What I really want is a way to "tell" the API controller, or its methods, "do not expand the result" - or, better yet, to ignore cyclic references (which I've also tried, and it didn't work).
UPDATED:
Here is the GET method on the API controller:
[HttpGet]
[ActionName("DefaultAction")]
public IQueryable<Customer> Get()
{
    return _unitOfWork.Repository<Customer>().Query().Get();
}
Thanks.
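For context (and hedged, since the question reports that ignoring cyclic references alone did not help in this case): with ASP.NET Web API 2 plus Entity Framework, the two settings usually involved in taming this are Json.NET's reference-loop handling and EF's proxy creation. Turning proxy creation off disables lazy loading, so navigation properties stay null and are simply not serialized - roughly the "do not expand" behavior being asked for:

```csharp
// WebApiConfig.Register(HttpConfiguration config):
// skip members that would re-enter an object already being serialized.
config.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling =
    Newtonsoft.Json.ReferenceLoopHandling.Ignore;

// In the DbContext constructor (EF6): without proxies, lazy loading is off
// and navigation properties stay null unless explicitly loaded/expanded.
this.Configuration.ProxyCreationEnabled = false;
```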

IgnoreDataMember attribute is skipping json object properties during **DE**serialization

According to the docs, the IgnoreDataMember attribute is only supposed to be considered during serialization.
From what I'm seeing, however, MVC model binding is using it during *de*serialization of json as well.
Consider the following class:
public class Tax
{
    public Tax() { }
    public int ID { get; set; }

    [Required]
    [Range(1, int.MaxValue)]
    [IgnoreDataMember]
    public int PropertyId { get; set; }
}
If I POST/PUT the following JSON string to an action method:
{"Position":0,"Description":"State sales tax","Rate":5,"RateIsPercent":true,"PropertyId":1912}
I get the following validation error:
{
"Message": "The request is invalid.",
"ModelState": {
"newTax.PropertyId": [
"The field PropertyId must be between 1 and 2147483647."
]
}
}
Both the [Range(1, int.MaxValue)] and [Required] attributes report the property as invalid.
If I remove the [IgnoreDataMember] attribute, everything works fine.
Is there a different attribute that can be used which will tell MVC binding not to ignore the property during deserialization?
This only happens when posting a JSON string. If I post a name/value string, everything works fine.
The answer has to do with the behavior of Json.NET. That's what the model binding is using, and it checks IgnoreDataMember for both serialization and deserialization, making it useless for me (since I want to use it only for serialization).
The JsonIgnore attribute works exactly the same way.
Given that, I pulled all the ignore attributes off my properties and switched to using json.net's conditional serialization methods.
So basically add this for the above PropertyId field:
public bool ShouldSerializePropertyId() { return false; }
That allows deserialization to come in but blocks serialization from going out.
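Put together, the Tax class from the question would look roughly like this (a sketch of Json.NET's ShouldSerialize* convention, where the method is matched to the property purely by name):

```csharp
using System.ComponentModel.DataAnnotations;

public class Tax
{
    public Tax() { }
    public int ID { get; set; }

    [Required]
    [Range(1, int.MaxValue)]
    public int PropertyId { get; set; }

    // Json.NET convention: a public bool ShouldSerialize<PropertyName>()
    // method controls whether that property is written on serialization.
    // Returning false blocks it going out, but incoming JSON still binds.
    public bool ShouldSerializePropertyId() { return false; }
}
```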

Does MVC not json-serialize other properties on a model that implements IEnumerable<T>?

I don't know why I'm just now noticing this behavior for the first time. Looking to confirm whether this is designed behavior, or whether I'm missing something.
Say I have a viewmodel that implements IEnumerable<T>, and provides additional properties. For example:
public class MyResultsViewModel : IEnumerable<MyResultViewModel>
{
    public IEnumerable<MyResultViewModel> Results { get; set; }
    public string SomeAdditionalProperty { get; set; }

    public IEnumerator<MyResultViewModel> GetEnumerator()
    {
        return Results.GetEnumerator();
    }

    IEnumerator IEnumerable.GetEnumerator() { return GetEnumerator(); }
}
Say I also have a controller which returns this model as json. For example:
public ActionResult MyResults()
{
    var entities = PrivateMethodToGetEntities();
    var models = Mapper.Map<MyResultsViewModel>(entities);
    models.SomeAdditionalProperty = "I want this in the JSON too";
    // models now contains a populated Results as well as the add'l string prop
    return Json(models, JsonRequestBehavior.AllowGet);
}
When I get the results back from a JSON request on the client, it always comes back as an array. For example:
$.get('/Path/To/MyResults')
    .success(function (results) {
        alert(results.SomeAdditionalProperty); // this alerts 'undefined'
        alert(results.length); // this alerts the size / count of Results
        alert(results[0]); // this alerts object
        // inspecting results here shows that it is a pure array, with no add'l props
    });
Before I refactor to make the viewmodel not implement IEnumerable<T>, I want to get some confirmation that this is by design and should be expected. I guess it makes sense, since the javascript array object's prototype would have to be extended to accommodate the additional properties.
Update:
I've made the following changes to the viewmodel, to avoid naming the internal enumerable Results:
public class MyResultsViewModel : IEnumerable<MyResultViewModel>
{
    public IEnumerable<MyResultViewModel> NotNamedResults { get; set; }
    public string SomeAdditionalProperty { get; set; }

    public IEnumerator<MyResultViewModel> GetEnumerator()
    {
        return NotNamedResults.GetEnumerator();
    }

    IEnumerator IEnumerable.GetEnumerator() { return GetEnumerator(); }
}
With this change, the behavior remains. alert(JSON.stringify(results)) yields normal array syntax for my enumerated collection of MyResultViewModels:
[{"ResultProp1":"A","ResultProp2":"AA","ResultProp3":"AAA"},{"ResultProp1":"B","ResultProp2":"BB","ResultProp3":"BBB"},{"ResultProp1":"C","ResultProp2":"CC","ResultProp3":"CCC"},{"ResultProp1":"D","ResultProp2":"DD","ResultProp3":"DDD"},{"ResultProp1":"E","ResultProp2":"EE","ResultProp3":"EEE"}]
It still seems that the additional property is being lost between when the controller action returns the JsonResult and when the jQuery success function is invoked.
For classes implementing IEnumerable, the JavaScriptSerializer only serializes the enumerated items. It calls GetEnumerator() to get the data to be serialized; any other properties are ignored. That's why the Count property of a List<T> is not serialized, and neither is any other property.
The reason is that this type of construction cannot be represented in JSON. To include the other properties, the serializer would have to create a hash object instead of an array, but a hash object is not a collection, strictly speaking. (The exception is key/value-pair classes like Dictionary, which are serialized as hash objects. But the rule stands: only the enumerated dictionary entries are serialized.)
But why are you creating a class that implements IEnumerable directly to serialize as JSON, instead of doing this?
public class MyResultsViewModel
{
    public IEnumerable<MyModel> Models { get; set; }
    public String SomeAdditionalData { get; set; }
}
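With the wrapper, the serialized payload becomes a JSON object that carries both the collection and the extra data, instead of a bare array (illustrative values):

```json
{
  "Models": [ { "ResultProp1": "A" }, { "ResultProp1": "B" } ],
  "SomeAdditionalData": "I want this in the JSON too"
}
```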
