F#: Serialize dynamically generated objects with Web API

I am attempting to create a Web API controller in F# which returns objects from Entity Framework. SharpObject and SharpContext are my entity and DbContext respectively, defined in a C# project.
/// Retrieves values.
[<RoutePrefix("api2/values")>]
type ValuesController() =
    inherit ApiController()

    let values = [| "value1"; "value2" |]

    /// Gets all values.
    [<Route("")>]
    member x.Get() : IEnumerable<SharpObject> =
        use context = new SharpContext()
        context.SharpObjects.ToList() :> IEnumerable<SharpObject>
Here is SharpObject with the SerializableAttribute.
[Serializable]
public class SharpObject
{
    [Key]
    public virtual int Id { get; set; }

    public virtual string Description { get; set; }
}
The error that I am getting is this:
The type System.Data.Entity.DynamicProxies.SharpObject_3A697B5C46C0BF76858FEAFC93BFED36DD8D4CA2CEACBB178D2D3C38BB2D2052 was not expected. Use the XmlInclude or SoapInclude attribute to specify types that are not known statically.
When I de-compile this using ILSpy, it looks like this:
[Route("")]
public IEnumerable<SharpObject> Get()
{
SharpContext context = new SharpContext();
IEnumerable<SharpObject> result;
try
{
result = (IEnumerable<SharpObject>)context.SharpObjects.ToList<SharpObject>();
}
finally
{
IDisposable disposable = context as IDisposable;
if (disposable != null)
{
disposable.Dispose();
}
}
return result;
}
What is the best way to get my list to show through in F#?

This happens because the object that you get from EF is not, in fact, of type SharpObject, but rather of that scarily named type, which inherits from SharpObject. This type is called a "proxy" and is dynamically generated by EF in order to provide certain services (such as lazy loading; see below).
Because your action is declared as returning IEnumerable<SharpObject>, Web API's default XML serializer expects to find objects of exactly that type, and so rightly complains upon finding an object of a different type.
One temporary, band-aid-style fix that you can try is to remove the virtual keywords from your entity (why do you have them there, anyway?). It is the presence of the virtual keywords that causes EF to generate the proxy type; absent virtual, no proxy will be generated, which keeps the XML serializer happy.
This, however, will stop working once you extend your model to include navigation properties with lazy loading. Those properties must be virtual, otherwise lazy loading won't work.
So the correct fix is not to use the same type as both the DB-facing entity and the client-facing DTO. Use different types.
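For illustration, here is a minimal sketch of that separation (SharpObjectDto and GetDtos are illustrative names, not part of the original code); the equivalent Select projection works just as well from the F# action:

using System.Collections.Generic;
using System.Linq;

public class SharpObjectDto
{
    public int Id { get; set; }
    public string Description { get; set; }
}

// Projecting the EF entities into the DTO means the serializer never sees a proxy type.
public IEnumerable<SharpObjectDto> GetDtos(SharpContext context)
{
    return context.SharpObjects
                  .Select(o => new SharpObjectDto { Id = o.Id, Description = o.Description })
                  .ToList();
}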
Using the same type for these two purposes may seem "convenient" at first, but this road quickly leads to numerous problems. You have already discovered one of the small technical ones. But even absent those, conceptually you almost never, ever want to serve up your DB records directly to an untrusted user. Possible consequences include security holes, badly factored UI code, badly factored database structure, performance problems, and so on.
Bad idea. Don't do it.
P.S. This doesn't actually have anything to do with F#.

Related

OData swallowing objects in navigation property

I have a problem with an OData controller that is a little unusual compared to the others I have. It is the first one working completely from memory - no database involved.
The returned entity is:
public class TrdRun {
    [Key]
    public Guid Identity { get; set; }

    public TrdTrade[] Trades { get; set; }
}
TrdTrade is also an entity set (which, if queried, goes against a database). But in this particular case I want to return all the trades marked as active for a run, and I can do so WITHOUT going to the database.
My problem? The following code:
[ODataRoute]
public IEnumerable<Reflexo.Api.TrdRun> Get(ODataQueryOptions options) {
    var instances = Repository.TrdInstance.AsEnumerable();
    var runs = new List<Reflexo.Api.TrdRun>();
    foreach (var instance in instances) {
        runs.Add(Get(instance.Identifier));
    }
    return runs;
}
correctly configures runs to have the trades initialized - but WebApi decides to swallow them.
What is a way to configure it to return the data "as given" without further filtering? I know about the AutoExpandAttribute (which I would love to avoid - I do not want the API classes marked with OData attributes), but I have not enabled Query, so I would expect the returned data to come back exactly as I set it up.
The value of the Trades property is not being serialized because the default behavior of ODataMediaTypeFormatter is to not follow navigation properties, regardless of what is in memory. You could override this behavior by using $expand in the query string of the request, or AutoExpandAttribute on the Trades property in the class definition, but both approaches require decorating your controller method with EnableQueryAttribute.
If you don't want to do any of that, you can still programmatically specify auto-expansion of Trades in your service configuration as follows:
// Let builder be an instance of ODataModelBuilder or a derived class.
builder.EntityType<TrdRun>().CollectionProperty(r => r.Trades).AutoExpand = true;
Minor issue: With the programmatic approach, if the client requests full metadata (e.g., odata.metadata=full in the Accept header), the OData serializer will not include full metadata in the auto-expanded objects.
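For context, a rough sketch of where that programmatic line fits in the service configuration (the entity set names and route prefix are assumptions, and the exact namespaces depend on which Web API OData package version is in use):

using System.Web.Http;
using System.Web.OData.Builder;      // or Microsoft.AspNet.OData.Builder, depending on the package
using System.Web.OData.Extensions;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        var builder = new ODataConventionModelBuilder();
        builder.EntitySet<Reflexo.Api.TrdRun>("TrdRuns");
        builder.EntitySet<Reflexo.Api.TrdTrade>("TrdTrades");

        // Auto-expand Trades without putting OData attributes on the API classes.
        builder.EntityType<Reflexo.Api.TrdRun>().CollectionProperty(r => r.Trades).AutoExpand = true;

        config.MapODataServiceRoute("odata", "odata", builder.GetEdmModel());
    }
}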

Dapper.NET mapping with Data Annotations

So I have a class with a property like this:
public class Foo
{
[Column("GBBRSH")
public static string Gibberish { get; set;}
....
}
For saving data, I have it configured so that the update/insert statements use a custom function:
public static string GetTableColumnName(PropertyInfo property)
{
    var type = typeof(ColumnAttribute);
    var prop = property.GetCustomAttributes(type, false);

    if (prop.Count() > 0)
        return ((ColumnAttribute)prop.First()).Name;

    return property.Name;
}
This handles saving fine, but I noticed that when I go to retrieve the data, Dapper isn't actually pulling the data back through that mapping for this particular column. The other data present was pulled, but the column in question was the only field with data that didn't come back.
1) Is there a way to perhaps use the GetTableColumnName function for the retrieval part of Dapper?
2) Is there a way to force Dapper.NET to throw an exception if a scenario like this happens? I really don't want to have a false sense of security that everything is working as expected when it actually isn't (I get that I'm using mapping that Dapper.NET doesn't use by default, but I do want to set it up in that manner).
edit:
I was looking in the SqlMapper source of Dapper and found:
private static IEnumerable<T> QueryInternal<T>(params) // my knowledge of generics is limited, but how does this work without a where T : object?
{
    ...
    while (reader.Read())
    {
        yield return (T)func(reader);
    }
    ...
}
So I learned about two things after finding this: read up on Func and read up on yield (I'd never used either before). My guess is that I need to pass reader.Read() to another function (one that checks against column headers and populates objects appropriately) and yield return that?
You could change your select statement to alias the column to the property name, e.g. "SELECT GBBRSH AS Gibberish", and thereby provide the mapping between the attribute name and the POCO property name yourself.
That way, Dapper will fill the matching properties, since it only requires your POCOs' property names to match the column names exactly.
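If you would rather not alias every query, here is a rough sketch of addressing question 1 with Dapper's CustomPropertyTypeMap, so that reads honour the same ColumnAttribute the insert/update side already uses (this assumes the ColumnAttribute from System.ComponentModel.DataAnnotations.Schema):

using System.ComponentModel.DataAnnotations.Schema;
using System.Linq;
using Dapper;

public static class DapperColumnMapping
{
    // Call once at startup for each mapped type, e.g. RegisterColumnAttributeMap<Foo>();
    public static void RegisterColumnAttributeMap<T>()
    {
        var map = new CustomPropertyTypeMap(typeof(T), (type, columnName) =>
            type.GetProperties().FirstOrDefault(prop =>
                prop.GetCustomAttributes(typeof(ColumnAttribute), false)
                    .Cast<ColumnAttribute>()
                    .Any(attr => attr.Name == columnName)
                || prop.Name == columnName));   // fall back to matching by property name

        SqlMapper.SetTypeMap(typeof(T), map);
    }
}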

Returning specifically shaped POCOs to ASP.NET MVC actions

In my ASP.NET MVC project, my actions typically call a Service layer to get data. I use the same dozen or so POCOs for all my models. I also plan on using the Service layer in console applications and maybe expose a web api at some point.
To make my database operations more efficient, my service layer only hydrates the properties in the model that are relevant to the particular method (which at this point is mostly driven by the needs of my controller actions).
So for example I might have a class Order with properties Id, Name, Description, Amount, Items. For a given service call I might only need to populate Id, Name, Items. A consumer of that service won't necessarily know that Amount is 0 only because it didn't populate the property.
Similarly, the consumer won't know whether Items is empty b/c there actually aren't any items, or whether this particular service method just doesn't populate that property.
And for a third example, say one of my views displays an ItemCount. I don't want to fully populate my Items collection, I just need an additional property on my "model". I don't want to add this property to my POCO that other service methods will be using because it's not going to be populated anywhere else.
So the natural solution is to make a POCO designed specifically for that method with only those 3 properties. That way the consumer knows that all properties are populated with their real values. The downside is that I'll end up writing tons of similarly shaped models.
Any advice on which method works best?
You could use Nullable Types to indicate the missing properties with a null.
For example:
class Order {
    public int Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public decimal? Amount { get; set; }
    public List<Item> Items { get; set; }
}
And then if Items == null, it wasn't set. If it's an empty new List<Item>(), it's set but empty. The same goes for Amount: if Amount.HasValue == false, it wasn't set; if Amount.Value is 0.0m, it's set and the item is free.
Why don't you use LINQ projection?
One service method does something like:
return DbContext.Orders.Select(o => new { Id = o.Id, Name = o.Name, Description = o.Description });
while the other service method does something like:
return DbContext.Orders.Select(o => o);
I'm not sure how your application is architected, but this may be a way around creating hundreds of POCOs.
Hope this helps! Good luck.
You could pass in a selector Func that returns dynamic:
public IEnumerable<dynamic> GetOrders(Func<Order, dynamic> selector) { ... }
I'm not sure how you are accessing data, but the following shows how this would work using a List<T>:
class Program
{
    static void Main(string[] args)
    {
        var service = new Service();
        var orderNames = service.GetOrders(o => new { o.Name });
        foreach (var name in orderNames)
            Console.WriteLine(name.Name);
        Console.ReadLine();
    }
}

public class Service
{
    private List<Order> _orders = new List<Order>
    {
        new Order { Id = 1, Name = "foo", Description = "test order 1", Amount = 1.23m },
        new Order { Id = 2, Name = "bar", Description = "test order 1", Amount = 3.45m },
        new Order { Id = 3, Name = "baz", Description = "test order 1", Amount = 5.67m }
    };

    public IEnumerable<dynamic> GetOrders(Func<Order, dynamic> selector)
    {
        return _orders.Select(selector);
    }
}

public class Order
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public decimal Amount { get; set; }
}
The use of nullable values is a good solution; however, it has the downside that you have no way to mark required fields. That is, you cannot use a Required attribute on any property, so if a field is obligatory in some views you have no way to represent it.
If you don't need required-field validation this is OK. Otherwise, you need a way to represent which fields are actually used, and then write a custom validation provider.
A simple way to do this is to use a "Mask" class with the same property names as the original class, but with all properties boolean: a true value means the field is in use.
I used a similar solution in a system where the properties to be shown are configured in a configuration file... so it was the only option for me, since I had no way to represent every combination of properties. HOWEVER, I also used the "Mask" class in the View, so I was able to do the whole job with just one View... with a lot of ifs.
Now, if your 150 service methods (and probably about 150 Views) are all different, then maybe it is simpler to use several classes as well - in the worst case 150 classes. The extra work to write them is negligible compared to the effort of preparing 150 different Views.
However, this doesn't mean you need 150 POCO classes. You might use a single POCO class that is copied into an adequate class just in the presentation layer. The advantage of this approach is that you can put different validation attributes on the various classes and you don't need to write a custom validation provider.
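For what it's worth, a minimal sketch of the "Mask" idea described above (property names mirror the Order example from the question):

// Each flag marks whether the corresponding Order property is actually in use.
class OrderMask {
    public bool Id { get; set; }
    public bool Name { get; set; }
    public bool Description { get; set; }
    public bool Amount { get; set; }
    public bool Items { get; set; }
}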
Return the entire POCO with nullable types, as mentioned by @sbolm. You can then create a ViewModel per MVC view that receives a model with only the specific properties it needs. This takes a little more performance (insignificant) and code, but it keeps your service layer clean, and keeps your views "dumb" in that they are only given what they need and have no direct relation to the service layer.
I.e. (example class from @sbolm):
class Order {
    public int Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public decimal? Amount { get; set; }
    public List<Item> Items { get; set; }
}

// The MVC view only needs to know the name and description; manually "map" the POCO properties into this view model and send it to the view.
class OrderViewModel {
    public string Name { get; set; }
    public string Description { get; set; }
}
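A sketch of that manual mapping inside a controller action (the service and action names here are assumptions):

public ActionResult Details(int id)
{
    Order order = _orderService.GetOrder(id);   // service returns the POCO
    var viewModel = new OrderViewModel
    {
        Name = order.Name,
        Description = order.Description
    };
    return View(viewModel);
}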
I would suggest that instead of modifying the models or creating wrapper models, you name the service methods so that they are self-explanatory and tell the consumer what they return.
The problem with the nullable approach is that it makes consumers feel the property is optional, and they may try inserting instances of those types without setting those properties. And wouldn't it be bad having nullables everywhere?
It is not a good approach to change the domain models when all you want is to populate only some of the properties; instead, create services with names and descriptions that are self-explanatory.
Take the Order class itself as the example: say one service method returns the Order with all its items, and the other returns only the details of the Order but not the items. Then obviously you may have to create two service methods, GetOrderItems and GetOrderDetail. This sounds simple - and it is! - but notice that the service method names themselves tell the client what is going to be returned. In GetOrderDetail you can return empty items or null (here I would suggest null); that doesn't matter much.
So for new cases you don't need to change the models frequently; all you have to do is add or remove service methods, and that's fine. Since you are creating a service, you can write strong documentation that says which method does what.
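A small sketch of that naming approach (the exact signatures are illustrative, not prescribed by the answer):

public interface IOrderService
{
    // Returns the order header only; Items is left null.
    Order GetOrderDetail(int orderId);

    // Returns the order with its Items collection populated.
    Order GetOrderItems(int orderId);
}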
I would not performance-optimize this too much unless you really run into performance problems.
I would only distinguish between returning a flat object and an object with a more complete object graph.
I would have methods returning flat objects called something like GetOrder, GetProduct.
If more complete object graphs are requested, they would be called something like GetOrderWithDetails.
Do you use the POCO classes for your typed views? If so, try making new classes that serve as dedicated ViewModels. These ViewModels would contain the POCO classes. This will help you keep the POCO classes clean.
To expand on the nullable idea, you could use the FluentValidation library to still have validation on the types depending on whether they are null or not. This would allow you to make a field required as long as it is not null, or apply any other validation scheme you can think of. Example from my own code, as I had a similar requirement:
Imports FluentValidation

Public Class ParamViewModelValidator
    Inherits AbstractValidator(Of ParamViewModel)

    Public Sub New()
        RuleFor(Function(x) x.TextBoxInput).NotEmpty.[When](Function(x) Not (IsNothing(x.TextBoxInput)))
        RuleFor(Function(x) x.DropdownListInput).NotEmpty.[When](Function(x) Not (IsNothing(x.DropdownListInput)))
    End Sub
End Class

ASP.NET MVC Issue with Using Reflection Created Objects with the Default Model Binder

I am having a weird issue in ASP.NET MVC with objects not being updated with UpdateModel when passed a formCollection. UpdateModel does not appear to be working properly when the object being updated is created through reflection.
Scenario: I have an application which has approximately 50 lookup tables, each of which has exactly the same schema, including typical fields like id, title, description, isactive, and createdon. Rather than build 50 views, I wanted to have a single view which could display the data from all of the lookup tables. I created an interface called IReferenceEntity and implemented it in each of the POCOs representing my lookup tables.
Using this interface, I am able to easily populate a view with a record from the lookup table. (I pass the items to the view via the following.)
System.Web.Mvc.ViewPage<MyNamespece.IReferenceEntity>
From the database to the view, everything works perfectly.
However, when I attempt to update the model on post, I am running into some problems.
If I explicitly declare an object reference like the following, everything works perfectly and the values of my object are updated with the values from my form. Hence, I can then update the database.
AccountStatus a = new AccountStatus();
UpdateModel(a, formCollection.ToValueProvider());
Unfortunately, hard coding the object type would completely defeat the reason for using an interface.
(A primary objective of the application is to be able to dynamically add new tables such as lookup tables without having to do anything "special". This is accomplished by reflecting on the loaded assemblies and locating any classes which implement a specific interface or base class)
My strategy is to determine the concrete type of the object at postback and then create an instance of the type through reflection. (The mechanism I use to determine type is somewhat primitive. I include it as a hidden field within the form. Better ideas are welcome.)
When I create an instance of the object using reflection through any of the following methods, none of the objects are being updated by UpdateModel.
Type t = {Magically Determined Type}
object b = Activator.CreateInstance(t);
UpdateModel(b, formCollection.ToValueProvider());

Type t = {Magically Determined Type}
var c = Activator.CreateInstance(t);
UpdateModel(c, formCollection.ToValueProvider());

Type t = {Magically Determined Type}
IReferenceEntity d = Activator.CreateInstance(t);
UpdateModel(d, formCollection.ToValueProvider());
Note: I have verified that the objects which are being created through reflection are all of the proper type.
Does anyone have any idea why this might be happening? I am somewhat stumped.
If I was really "hard up", I could create a factory object which would instantiate any one of these reference entity/lookup objects. However, this would break the application's ability to allow new lookup tables to be added and discovered transparently, and it is just not quite as clean.
Also, I could try deriving from an actual ReferenceEntity base class as opposed to an interface, but I am doubtful whether this would make any difference. The issue appears to be with using reflection-created objects in the model binder.
Any help is appreciated.
Anthony
Augi answered this on ASP.NET forums. It worked with only a couple of minor modifications. Thank you Augi.
The problem is that the [Try]UpdateModel methods allow specifying the model type only via a generic parameter, so they don't allow dynamic model type specification. I have created an issue ticket for this.
You can see the TryUpdateModel method implementation here. So it's not difficult to write your own overload:
public virtual bool TryUpdateModelDynamic<TModel>(TModel model, string prefix, string[] includeProperties, string[] excludeProperties, IDictionary<string, ValueProviderResult> valueProvider) where TModel : class
{
    if (model == null)
    {
        throw new ArgumentNullException("model");
    }
    if (valueProvider == null)
    {
        throw new ArgumentNullException("valueProvider");
    }

    //Predicate<string> propertyFilter = propertyName => BindAttribute.IsPropertyAllowed(propertyName, includeProperties, excludeProperties);
    IModelBinder binder = Binders.GetBinder( /*typeof(TModel)*/ model.GetType());

    ModelBindingContext bindingContext = new ModelBindingContext()
    {
        Model = model,
        ModelName = prefix,
        ModelState = ModelState,
        //ModelType = typeof(TModel), // old
        ModelType = model.GetType(), // new
        //PropertyFilter = propertyFilter,
        ValueProvider = valueProvider
    };

    binder.BindModel(ControllerContext, bindingContext);
    return ModelState.IsValid;
}
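A hypothetical usage from the controller action, combining the hidden-field type discovery described in the question with this overload (the "ModelType" field name and action shape are assumptions):

[HttpPost]
public ActionResult Edit(FormCollection formCollection)
{
    Type t = Type.GetType(formCollection["ModelType"]);   // hidden field carrying the concrete type name
    var model = (IReferenceEntity)Activator.CreateInstance(t);
    TryUpdateModelDynamic(model, null, null, null, formCollection.ToValueProvider());
    // ... check ModelState.IsValid and persist the model ...
    return RedirectToAction("Index");
}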
Does your IReferenceEntity contain setters on the properties as well as getters? I would think that the last sample would work if the interface had property setters, though you'd have to cast it to get it to compile.
Type t = {Magically Determined Type}
IReferenceEntity d = Activator.CreateInstance(t) as IReferenceEntity;
UpdateModel(d, formCollection.ToValueProvider());
Normally the reason that it won't set a property on a class is because it can't find a public setter method available to use via reflection.
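For reference, this is the shape the answer is asking about: an interface whose properties expose setters (member names here are assumed from the lookup-table fields mentioned in the question):

public interface IReferenceEntity
{
    int Id { get; set; }
    string Title { get; set; }
    string Description { get; set; }
    bool IsActive { get; set; }
    DateTime CreatedOn { get; set; }
}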
Just a quick "another thing to try":
UpdateModel(d as IReferenceEntity, formCollection.ToValueProvider());
Not sure if that will work, and I haven't tried it myself, but it's the first thing that came to mind.
If I get a chance later I'll peek at the Default Model Binder code and see if there's anything in there that is obvious...

In TDD and DDD, how do you handle read-only properties in fakes?

Question
How do you handle read-only fields when creating fakes?
Background
I'm in the beginner stages of using ASP.NET MVC and am using Steven Sanderson's Sports Store and Scott Gu's Nerd Dinner as examples. One small problem that I've just hit is how to work with read-only properties when doing fakes. I'm using LINQ to SQL.
My interface is:
public interface IPersonRespository
{
    Person GetPerson(int id);
}
and my fake becomes
public class FakePersonRepository
{
    public Person GetPerson(int id)
    {
        return new Person { id = "EMP12345", name = "John Doe", age = 47, ssn = 123-45-6789, totalDrWhoEpisodesWatched = 42 };
    }
}
Here's my problem. The fields id, ssn and totalDrWhoEpisodesWatched are read-only, so the above code won't actually work. However, I don't recognize how to create a fake new person and set a read-only property. I'm sure there is a solution, but I haven't come across it yet in my searches.
Update: Inheritance + Property Hiding as a Potential Solution?
I haven't yet decided upon a firm solution to the problem. I dislike the notion of modifying my Domain classes for the purposes of creating fakes. To me, adding markup to the domain classes in order to do testing is a form of added coupling -- coupling to the implementation of your test. I'm now investigating another possibility, which is to create a FakePerson class, which inherits from Person, but hides the properties with new read-write properties.
public class FakePerson : Person
{
    public new int age { get; set; }
    public new string ssn { get; set; }
    public new int totalDrWhoEpisodesWatched { get; set; }
}
So far, this is the solution I am leaning toward. It does break the Liskov Substitution Principle, but that doesn't bother me as much in a test project. I'd be glad to hear any criticism and/or feedback on this as a solution.
Winner: Mock Frameworks
Moq appears to do the job. My last solution of hiding the property through inheritance does, in fact, work, but by using Moq I get a standardized set of functionality that is more maintainable. I assume that other mock frameworks have this functionality, but I haven't checked. Moq is said to be more straightforward for someone just starting to write mocks, which I definitely am right now.
Consider mocking the Person type in your test. Example using Moq:
var mock = new Mock<Person>();
mock.SetupGet(p => p.id).Returns("EMP12345");
mock.SetupGet(p => p.ssn).Returns("123-45-6789");
mock.SetupGet(p => p.totalDrWhoEpisodesWatched).Returns(42);
return mock.Object;
Otherwise, try finding out how LINQ to SQL sets those read only properties.
EDIT: If you attempt the above and Moq throws an ArgumentException in the SetupGet call with the message "Invalid setup on a non-overridable member: p => p.id", then you need to mark the property as virtual. This will need to be done for each property whose getter you wish to override.
In LINQ to SQL, this can be done in the OR designer by selecting the property, then in the Properties window set Inheritance Modifier to virtual.
You can only set read-only properties in the constructor of the class. The Person object should have a constructor that accepts id, ssn, and totalDrWhoEpisodesWatched. Of course, if this is a LINQ to SQL-generated object, you might have issues modifying it, as the code is auto-generated.
You could consider using a mapped object to expose in your repository ... so you'd never actually have to use your LINQ to SQL object as your model.
In .NET, you could mark your setters as "internal" and use the InternalsVisibleTo assembly attribute to make internals visible to your test assembly. That way your setters won't be public, but you can still access them.
note: even though the question isn't tagged .NET, I assumed it was based on your usage of object initializer syntax. If my assumption was wrong, this suggestion does not apply (unless the language you're using has an equivalent feature, of course).
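A rough sketch of that idea (the assembly name and class members are placeholders; the property names mirror the question):

// In the domain assembly (e.g. AssemblyInfo.cs):
using System.Runtime.CompilerServices;
[assembly: InternalsVisibleTo("MyApp.Tests")]

public class Person
{
    public string ssn { get; internal set; }
    public int totalDrWhoEpisodesWatched { get; internal set; }
}

// In the test assembly the internal setters are now accessible:
// var person = new Person { ssn = "123-45-6789", totalDrWhoEpisodesWatched = 42 };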
If it's for tests, consider using reflection. That wouldn't involve messing with your domain model.
For example, I have a FactoryBase class which uses reflection to set the needed property via a lambda expression passed through the parameters (like this). Works like a charm: creating a new factory is as simple as defining the repository type and the default entity data.
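A minimal sketch of the reflection approach for a fake or factory (this assumes the property has some setter, even a non-public one, for reflection to invoke):

using System.Reflection;

public static class FakeHelper
{
    // Sets a property through its (possibly non-public) setter via reflection.
    public static void SetProperty(object target, string propertyName, object value)
    {
        PropertyInfo prop = target.GetType().GetProperty(
            propertyName, BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.Instance);
        prop.SetValue(target, value, null);
    }
}

// Usage in a fake:
// var person = new Person();
// FakeHelper.SetProperty(person, "ssn", "123-45-6789");
// FakeHelper.SetProperty(person, "totalDrWhoEpisodesWatched", 42);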
I also use Moq. I love it and it works great. But, before I started using Moq, I wrote many fakes. Here's how I would have solved the problem using fakes.
Since a fake can have additional methods that the "production" implementation doesn't have, I would add a few extra methods to my fake implementation to handle setting the read-only portion.
Like this:
public class FakePersonRepository : IPersonRespository
{
    private IDictionary<int, Person> _people = new Dictionary<int, Person>();

    public Person GetPerson(int id) // Interface implementation
    {
        return _people[id];
    }

    public void SetPerson(int id, Person person) // Not part of the interface
    {
        _people.Add(id, person);
    }
}
