Entity Framework Core 7: apply a custom ValueConverter to inserts and updates

I am struggling to understand how to apply a custom ValueConverter to all the properties of all the models before every insert and update.
According to this resource it can be done:
https://learn.microsoft.com/en-us/ef/core/modeling/value-conversions?tabs=data-annotations
"Value converters allow property values to be converted when reading from or **writing ** to the database"
I was able to accomplish this, but only when reading from the database, overriding ConfigureConventions in the Context:
protected override void ConfigureConventions(ModelConfigurationBuilder configurationBuilder)
{
    configurationBuilder.Properties<string>().HaveConversion<ToUpperConverter>();
}
Converter
public class ToUpperConverter : ValueConverter<string, string>
{
    public ToUpperConverter() : base(v => v, v => v.Trim().ToUpper())
    {
    }
}
I also tried overriding SaveChangesAsync in the context. This works, but obviously I don't want to go through every property of every model manually, considering they can change at any time.
Using reflection here could be an option, but I am not really a big fan of it.
public override async Task<int> SaveChangesAsync(CancellationToken cancellationToken = new CancellationToken())
{
    foreach (var entry in ChangeTracker.Entries<Player>())
    {
        if (entry.State == EntityState.Modified || entry.State == EntityState.Added)
        {
            entry.Entity.Name = entry.Entity.Name.ToUpper();
            entry.Entity.MiddleName = entry.Entity.MiddleName?.ToUpper();
            entry.Entity.Surname = entry.Entity.Surname.ToUpper();
        }
    }
    return await base.SaveChangesAsync(cancellationToken);
}

Solved, after spending an entire day on this.
The first argument of the converter is the conversion applied when writing to the database; the second argument is the conversion applied when reading from the database.
public class ToUpperConverter : ValueConverter<string, string>
{
    public ToUpperConverter() : base(
        // writing to the database
        v => v.Trim().ToUpper(),
        // reading from the database
        v => v.Trim().ToUpper())
    {
    }
}
Source: https://learn.microsoft.com/en-us/dotnet/api/microsoft.entityframeworkcore.metadata.builders.propertiesconfigurationbuilder.haveconversion?view=efcore-6.0
"The type to convert to and from or a type that inherits from ValueConverter."

Related

Neo4j - Custom converter for field of type List

I am trying to write a custom converter for a nested object so that this object gets saved as a string in the Neo4j database.
I am using the @Convert annotation on my field and passing ImageConverter.class, which is my AttributeConverter class.
Everything works fine as expected and I am able to save a string representation of the Image class in the Neo4j db.
However, now instead of a single image I want to have List<Image> as my nested field. In this case, putting @Convert(ImageConverter.class) doesn't work.
I see that there is a class called ConverterBasedCollectionConverter which gets used when I have a field of type List<LocalDateTime>.
However, I couldn't find any examples of how to use this class in the case of custom converters.
Can anyone please help me with this, or suggest another approach to use a custom converter on a field of type List?
I am using Neo4j (version 3.4.1) and Spring-data-neo4j (5.0.10.RELEASE) in my application. I am also using OGM.
PS: I am aware that it is advised to store nested objects as separate nodes, establishing a relationship with the parent object. However, my use case demands that the object be stored as a string property and not as a separate node.
Regards,
V
It is not as difficult as I assumed it would be.
Given a class (snippet)
@NodeEntity
public class Actor {
    @Id @GeneratedValue
    private Long id;

    @Convert(MyImageListConverter.class)
    public List<MyImage> images = new ArrayList<>();
    // ....
}
with MyImage as simple as can be
public class MyImage {
    public String blob;

    public MyImage(String blob) {
        this.blob = blob;
    }

    public static MyImage of(String value) {
        return new MyImage(value);
    }
}
and a converter
public class MyImageListConverter implements AttributeConverter<List<MyImage>, String[]> {

    @Override
    public String[] toGraphProperty(List<MyImage> value) {
        if (value == null) {
            return null;
        }
        String[] values = new String[value.size()];
        int i = 0;
        for (MyImage image : value) {
            values[i++] = image.blob;
        }
        return values;
    }

    @Override
    public List<MyImage> toEntityAttribute(String[] values) {
        List<MyImage> images = new ArrayList<>(values.length);
        for (String value : values) {
            images.add(MyImage.of(value));
        }
        return images;
    }
}
will print the following debug output on save, which I think is what you want:
UNWIND {rows} as row CREATE (n:Actor) SET n=row.props RETURN row.nodeRef as ref, ID(n) as id, {type} as type with params {type=node, rows=[{nodeRef=-1, props={images=[blobb], name=Jeff}}]}
especially the images part.
The test method for this looks like:
@Test
public void test() {
    Actor jeff = new Actor("Jeff");
    String blobValue = "blobb";
    jeff.images.add(new MyImage(blobValue));
    session.save(jeff);
    session.clear();

    Actor loadedActor = session.load(Actor.class, jeff.getId());
    assertThat(loadedActor.images.get(0).blob).isEqualTo(blobValue);
}
I came up with a solution to my problem. So, in case you want another solution along with the one provided by @meistermeier, you can use the code below.
public class ListImageConverter extends ConverterBasedCollectionConverter<Image, String> {

    public ListImageConverter() {
        super(List.class, new ImageConverter());
    }

    @Override
    public String[] toGraphProperty(Collection<Image> values) {
        Object[] graphProperties = super.toGraphProperty(values);
        String[] stringArray = Arrays.stream(graphProperties).toArray(String[]::new);
        return stringArray;
    }

    @Override
    public Collection<Image> toEntityAttribute(String[] values) {
        return super.toEntityAttribute(values);
    }
}
The ImageConverter class just implements AttributeConverter<Image, String>, where I serialize and deserialize my Image object to/from JSON.
I chose this approach because I had an Image field in one object and a List<Image> in another. So just by switching between @Convert(ListImageConverter.class) and @Convert(ImageConverter.class) I was able to save a list as well as a single object in the Neo4j database.
Note: you can skip overriding the toEntityAttribute method if you want; it doesn't add much value.
However, you have to override toGraphProperty, because the Neo4j OGM code checks for the presence of a declared method named toGraphProperty.
Hope this helps someone!
Regards,
V

MVC Full Calendar Error [duplicate]

I am trying to do a simple JSON return, but I am having issues. I have the following:
public JsonResult GetEventData()
{
var data = Event.Find(x => x.ID != 0);
return Json(data);
}
I get an HTTP 500 with the exception shown in the title of this question. I also tried
var data = Event.All().ToList()
That gave the same problem.
Is this a bug or my implementation?
It seems that there are circular references in your object hierarchy, which is not supported by the JSON serializer. Do you need all the columns? You could pick only the properties you need in the view:
return Json(new
{
PropertyINeed1 = data.PropertyINeed1,
PropertyINeed2 = data.PropertyINeed2
});
This will make your JSON object lighter and easier to understand. If you have many properties, AutoMapper could be used to automatically map between DTO objects and View objects.
I had the same problem and solved it by using Newtonsoft.Json:
var list = JsonConvert.SerializeObject(model,
Formatting.None,
new JsonSerializerSettings() {
ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore
});
return Content(list, "application/json");
This actually happens because complex objects are what makes the resulting JSON object fail.
It fails because when the object is mapped, it maps its children, which map their parents, causing a circular reference. JSON serialization would take infinite time, so the serializer prevents the problem by throwing this exception.
Entity Framework mapping also produces the same behavior, and the solution is to discard all unwanted properties.
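To picture the cycle, here is an illustrative (made-up) pair of entities where the serializer would bounce between parent and child forever:
using System.Collections.Generic;

// Illustrative entities: serializing an Order walks its Customer, which walks
// its Orders, which walk back to the Customer, and so on.
public class Customer
{
    public int Id { get; set; }
    public virtual ICollection<Order> Orders { get; set; }
}

public class Order
{
    public int Id { get; set; }
    public virtual Customer Customer { get; set; } // back-reference closes the loop
}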
To make the final answer explicit, the whole code would be:
public JsonResult getJson()
{
DataContext db = new DataContext ();
return this.Json(
new {
Result = (from obj in db.Things select new {Id = obj.Id, Name = obj.Name})
}
, JsonRequestBehavior.AllowGet
);
}
It could also be the following in case you don't want the objects inside a Result property:
public JsonResult getJson()
{
DataContext db = new DataContext ();
return this.Json(
(from obj in db.Things select new {Id = obj.Id, Name = obj.Name})
, JsonRequestBehavior.AllowGet
);
}
To sum things up, there are 4 solutions to this:
Solution 1: turn off ProxyCreation for the DbContext and restore it at the end.
private DBEntities db = new DBEntities();//dbcontext
public ActionResult Index()
{
bool proxyCreation = db.Configuration.ProxyCreationEnabled;
try
{
//set ProxyCreation to false
db.Configuration.ProxyCreationEnabled = false;
var data = db.Products.ToList();
return Json(data, JsonRequestBehavior.AllowGet);
}
catch (Exception ex)
{
Response.StatusCode = (int)HttpStatusCode.BadRequest;
return Json(ex.Message);
}
finally
{
//restore ProxyCreation to its original state
db.Configuration.ProxyCreationEnabled = proxyCreation;
}
}
Solution 2: use JsonConvert, setting ReferenceLoopHandling to Ignore on the serializer settings.
// using Newtonsoft.Json;
private DBEntities db = new DBEntities();//dbcontext
public ActionResult Index()
{
try
{
var data = db.Products.ToList();
JsonSerializerSettings jss = new JsonSerializerSettings { ReferenceLoopHandling = ReferenceLoopHandling.Ignore };
var result = JsonConvert.SerializeObject(data, Formatting.Indented, jss);
return Json(result, JsonRequestBehavior.AllowGet);
}
catch (Exception ex)
{
Response.StatusCode = (int)HttpStatusCode.BadRequest;
return Json(ex.Message);
}
}
The following two solutions are the same, but using a model is better because it's strongly typed.
Solution 3: return a Model which includes the needed properties only.
private DBEntities db = new DBEntities();//dbcontext
public class ProductModel
{
public int Product_ID { get; set;}
public string Product_Name { get; set;}
public double Product_Price { get; set;}
}
public ActionResult Index()
{
try
{
var data = db.Products.Select(p => new ProductModel
{
Product_ID = p.Product_ID,
Product_Name = p.Product_Name,
Product_Price = p.Product_Price
}).ToList();
return Json(data, JsonRequestBehavior.AllowGet);
}
catch (Exception ex)
{
Response.StatusCode = (int)HttpStatusCode.BadRequest;
return Json(ex.Message);
}
}
Solution 4: return a new dynamic object which includes the needed properties only.
private DBEntities db = new DBEntities();//dbcontext
public ActionResult Index()
{
try
{
var data = db.Products.Select(p => new
{
Product_ID = p.Product_ID,
Product_Name = p.Product_Name,
Product_Price = p.Product_Price
}).ToList();
return Json(data, JsonRequestBehavior.AllowGet);
}
catch (Exception ex)
{
Response.StatusCode = (int)HttpStatusCode.BadRequest;
return Json(ex.Message);
}
}
JSON, like xml and various other formats, is a tree-based serialization format. It won't love you if you have circular references in your objects, as the "tree" would be:
root B => child A => parent B => child A => parent B => ...
There are often ways of disabling navigation along a certain path; for example, with XmlSerializer you might mark the parent property as XmlIgnore. I don't know if this is possible with the json serializer in question, nor whether DatabaseColumn has suitable markers (very unlikely, as it would need to reference every serialization API)
Add [JsonIgnore] to the virtual properties in your model.
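For example, assuming Newtonsoft.Json is doing the serialization (the model here is illustrative):
using Newtonsoft.Json;

public class Product
{
    public int Product_ID { get; set; }
    public string Product_Name { get; set; }

    [JsonIgnore] // navigation property is skipped during serialization, breaking the loop
    public virtual Category Category { get; set; }
}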
Using Newtonsoft.Json: In your Global.asax Application_Start method add this line:
GlobalConfiguration.Configuration.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore;
It's because of the new DbContext T4 template that is used for generating the Entity Framework entities. In order to be able to perform change tracking, this template uses the proxy pattern, wrapping your nice POCOs in proxies. This then causes the issues when serializing with the JavaScriptSerializer.
So the two solutions are:
Either you just serialize and return the properties you need on the client
You may switch off the automatic generation of proxies by setting it on the context's configuration
context.Configuration.ProxyCreationEnabled = false;
This is very well explained in the article below.
http://juristr.com/blog/2011/08/javascriptserializer-circular-reference/
Provided answers are good, but I think they can be improved by adding an "architectural" perspective.
Investigation
MVC's Controller.Json function is doing the job, but it is very poor at providing a relevant error in this case. By using Newtonsoft.Json.JsonConvert.SerializeObject, the error specifies exactly what is the property that is triggering the circular reference. This is particularly useful when serializing more complex object hierarchies.
Proper architecture
One should never try to serialize data models (e.g. EF models), as ORM navigation properties are the road to perdition when it comes to serialization. Data flow should be the following:
Database -> data models -> service models -> JSON string
Service models can be obtained from data models using auto-mappers (e.g. AutoMapper). While this does not guarantee the absence of circular references, proper design should take care of it: service models should contain exactly what the service consumer requires (i.e. only those properties).
In those rare cases, when the client requests a hierarchy involving the same object type on different levels, the service can create a linear structure with parent->child relationship (using just identifiers, not references).
Modern applications tend to avoid loading complex data structures at once, and service models should be slim. E.g.:
access an event - only header data (identifier, name, date etc.) is loaded -> service model (JSON) containing only header data
manage the attendees list - access a popup and lazy-load the list -> service model (JSON) containing only the list of attendees
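A minimal sketch of that flow, assuming AutoMapper and made-up Event/EventHeaderModel types:
// Slim service model: exactly what the client needs, no navigation properties.
public class EventHeaderModel
{
    public int Id { get; set; }
    public string Name { get; set; }
    public DateTime Date { get; set; }
}

public class EventsController : Controller
{
    private DBEntities db = new DBEntities(); // dbcontext (illustrative)

    public ActionResult Header(int id)
    {
        Mapper.CreateMap<Event, EventHeaderModel>();             // normally configured once at startup
        var entity = db.Events.Find(id);                         // data model
        var model = Mapper.Map<Event, EventHeaderModel>(entity); // service model
        return Json(model, JsonRequestBehavior.AllowGet);        // only the service model is serialized
    }
}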
Avoid converting the table object directly. If relations exist with other tables, it might throw this error.
Instead, you can create a model class, assign values to the class object, and then serialize it.
I'm using this fix because I'm using Knockout in MVC 5 views.
In the action:
return Json(ModelHelper.GetJsonModel<Core_User>(viewModel));
The helper function:
public static TEntity GetJsonModel<TEntity>(TEntity Entity) where TEntity : class
{
    TEntity Entity_ = Activator.CreateInstance(typeof(TEntity)) as TEntity;
    foreach (var item in Entity.GetType().GetProperties())
    {
        if (item.PropertyType.ToString().IndexOf("Generic.ICollection") == -1 && item.PropertyType.ToString().IndexOf("SaymenCore.DAL.") == -1)
            item.SetValue(Entity_, Entity.GetPropValue(item.Name));
    }
    return Entity_;
}
You can notice the properties that cause the circular reference. Then you can do something like:
private Object DeCircular(Object obj)
{
    // Set the properties that cause the circular reference to null
    return obj;
}
// first: create a class as your view model
public class EventViewModel
{
    public int Id { get; set; }
    public string Property1 { get; set; }
    public string Property2 { get; set; }
}
// then, from your method
[HttpGet]
public async Task<ActionResult> GetEvent()
{
    var events = await db.Event.Find(x => x.ID != 0);
    List<EventViewModel> model = events.Select(e => new EventViewModel()
    {
        Id = e.Id,
        Property1 = e.Property1,
        Property2 = e.Property2
    }).ToList();
    return Json(new { data = model }, JsonRequestBehavior.AllowGet);
}
An easier alternative to solve this problem is to return a string and format that string as JSON with JavaScriptSerializer.
public string GetEntityInJson()
{
JavaScriptSerializer j = new JavaScriptSerializer();
var entityList = dataContext.Entitites.Select(x => new { ID = x.ID, AnotherAttribute = x.AnotherAttribute });
return j.Serialize(entityList );
}
The "Select" part is important: it chooses the properties you want in your view. Some objects have a reference to their parent. If you do not choose the attributes and just take the tables as a whole, a circular reference may appear.
Do not do this:
public string GetEntityInJson()
{
JavaScriptSerializer j = new JavaScriptSerializer();
var entityList = dataContext.Entitites.toList();
return j.Serialize(entityList );
}
Do this instead if you don't want the whole table:
public string GetEntityInJson()
{
JavaScriptSerializer j = new JavaScriptSerializer();
var entityList = dataContext.Entitites.Select(x => new { ID = x.ID, AnotherAttribute = x.AnotherAttribute });
return j.Serialize(entityList );
}
This helps render a view with less data, just the attributes you need, and makes your web app run faster.

Orchard CMS ContentManager.New<>() Specified Cast Was Invalid

I am in the early stages of developing a new module.
Much of it is laid out in terms of the models etc. I also have the migrations all set up, and my database now has the tables for my module.
I am encountering the following error when calling ContentManager.New<myPart>() and would like some help, please.
Error is this:
An unhandled exception has occurred and the request was terminated. Please refresh the page. If the error persists, go back
Specified cast is not valid.
System.InvalidCastException: Specified cast is not valid.
at Orchard.ContentManagement.ContentCreateExtensions.New[T]
(IContentManager manager, String contentType)
The chunk of code that fires the exception is this:
public static T New<T>(this IContentManager manager, string contentType) where T : class, IContent {
var contentItem = manager.New(contentType);
if (contentItem == null)
return null;
var part = contentItem.Get<T>();
if (part == null)
throw new InvalidCastException();
return part;
}
Here are the various parts of my module that are related to the operation I am struggling with:
ContentPart
public class GoogleMapsSettingsPart : ContentPart<GoogleMapsSettingsPartRecord>
{
public string ApiKey {
get { return Record.ApiKey; }
set { Record.ApiKey = value; }
}
}
ContentPartRecord
public class GoogleMapsSettingsPartRecord : ContentPartRecord
{
public virtual string ApiKey { get; set; }
}
Handler
public GoogleMapsSettingsPartHandler(IRepository<GoogleMapsSettingsPartRecord> repository)
{
Filters.Add(StorageFilter.For(repository));
}
Migration for this table
// Settings Table
SchemaBuilder.CreateTable("GoogleMapsSettingsPartRecord", table => table
.ContentPartRecord()
.Column("ApiKey", DbType.String, c => c.WithLength(60))
);
Some of the code from the controller for this model etc
public AdminController(IContentManager contentManager, IShapeFactory shapeFactory, IServiceLocatorService serviceLocatorService, INotifier notifier)
{
_contentManager = contentManager;
_serviceLocatorService = serviceLocatorService;
_notifier = notifier;
Shape = shapeFactory;
T = NullLocalizer.Instance;
}
/// <summary>
/// Display Settings
/// </summary>
/// <returns></returns>
public ActionResult Settings()
{
var settings = _serviceLocatorService.GoogleMapsSettings;
var editor = CreateSettingsEditor(settings);
var model = _services.ContentManager.BuildEditor(settings);
return View((object)model);
}
Finally - the Services where my call throws this exception
private GoogleMapsSettingsPart _settings;
public GoogleMapsSettingsPart GoogleMapsSettings
{
get {
if (_settings == null)
{
_settings = _contentManager.Query<GoogleMapsSettingsPart, GoogleMapsSettingsPartRecord>().List().FirstOrDefault();
if (_settings == null)
{
_settings = _contentManager.New<GoogleMapsSettingsPart>("GoogleMapsSettings");
}
}
return _settings;
}
}
The actual line where the exception happens is _settings = _contentManager.New<GoogleMapsSettingsPart>("GoogleMapsSettings");
I have tried all sorts of stuff in place of "GoogleMapsSettings", though nothing is working.
I'm pretty sure at this point it's something simple, though it's eluding me. My limited knowledge of Orchard is stumping me.
Any help would be appreciated :)
The exception is thrown because your content type does not have the part you specified to get.
_contentManager.New<GoogleMapsSettingsPart>("GoogleMapsSettings");
This method call creates a new content item of the GoogleMapsSettings type and gets the content item as a GoogleMapsSettingsPart. However, it seems that the GoogleMapsSettings content type does not have a GoogleMapsSettingsPart attached. That's why the exception gets thrown here.
var part = contentItem.Get<T>();
if (part == null)
throw new InvalidCastException();
You must either attach the part dynamically to your content type or do it in a migration (or manually in the admin, but that's not a good idea). Your migration should look like this.
this.ContentDefinitionManager.AlterTypeDefinition("GoogleMapsSettings",
    alt => alt
        .WithPart("GoogleMapsSettingsPart"));
Ok, so I fixed it...
My understanding of how Orchard works is still very much in the learning stages.
For this particular operation I didn't want to have a content type in the admin - though I'm not sure why, after adding the content type, it still didn't work...
Anyway, adding the lines below to my handler took care of the rest. I believe it's actually creating a temporary type, so one isn't needed in the system.
public GoogleMapsSettingsPartHandler(IRepository<GoogleMapsSettingsPartRecord> repository)
{
Filters.Add(new ActivatingFilter<GoogleMapsSettingsPart>("GoogleMapsSettings"));
Filters.Add(StorageFilter.For(repository));
Filters.Add(new TemplateFilterForRecord<GoogleMapsSettingsPartRecord>("GoogleMapsSettings", "Parts/GoogleMapsSettings"));
}
I've got the same error, but in my case everything was OK with the migration class.
The reason was an unlucky merge, which deleted the driver class of my part.
Just look at this code from the Activating method of the ContentPartDriverCoordinator class. In my case there was no partInfo for my content part, so the resulting part became a plain ContentPart and the cast threw an exception:
var partInfos = _drivers.SelectMany(cpp => cpp.GetPartInfo()).ToList();
foreach (var typePartDefinition in contentTypeDefinition.Parts) {
var partName = typePartDefinition.PartDefinition.Name;
var partInfo = partInfos.FirstOrDefault(pi => pi.PartName == partName);
var part = partInfo != null
? partInfo.Factory(typePartDefinition)
: new ContentPart { TypePartDefinition = typePartDefinition };
context.Builder.Weld(part);
}

Automapper + EF4 + ASP.NET MVC - getting 'context disposed' error (I know why, but how to fix it?)

I have this really basic code in an MVC controller action. It maps an Operation model class to a very basic OperationVM view-model class.
public class OperationVM: Operation
{
public CategoryVM CategoryVM { get; set; }
}
I need to load the complete list of categories in order to create a CategoryVM instance.
Here's how I (try to) create a List<OperationVM> to show in the view.
public class OperationsController : Controller {
private SomeContext context = new SomeContext ();
public ViewResult Index()
{
var ops = context.Operations.Include("blah...").ToList();
Mapper.CreateMap<Operation, OperationVM>()
.ForMember(
dest => dest.CategoryVM,
opt => opt.MapFrom(
src => CreateCatVM(src.Category, context.Categories)
// trouble here ----------------^^^^^^^
)
);
var opVMs = ops.Select(op => Mapper.Map<Operation, OperationVM>(op))
.ToList();
return View(opVMs);
}
}
All works great the first time I hit the page. The problem is that the mapper object is static. So when calling Mapper.CreateMap(), the instance of the current DbContext is saved in the closure given to CreateMap().
The 2nd time I hit the page, the static map is already in place, still using the reference to the initial, now disposed, DbContext.
The exact error is:
The operation cannot be completed because the DbContext has been disposed.
The question is: How can I make AutoMapper always use the current context instead of the initial one?
Is there a way to use an "instance" of automapper instead of the static Mapper class?
If this is possible, is it recommended to re-create the mapping every time? I'm worried about reflection slow-downs.
I read a bit about custom resolvers, but I get a similar problem - How do I get the custom resolver to use the current context?
It is possible, but the setup is a bit complicated. I use this in my projects with the help of Ninject for dependency injection.
AutoMapper has the concept of TypeConverters. Converters provide a way to implement the complex operations required to convert certain types in a separate class. If converting Category to CategoryVM requires a database lookup, you can implement that logic in a custom TypeConverter class similar to this:
using System;
using AutoMapper;
public class CategoryToCategoryVMConverter :
TypeConverter<Category, CategoryVM>
{
public CategoryToCategoryVMConverter(DbContext context)
{
this.Context = context;
}
private DbContext Context { get; set; }
protected override CategoryVM ConvertCore(Category source)
{
// use this.Context to lookup whatever you need
return CreateCatVM(source, this.Context.Categories);
}
}
You then configure AutoMapper to use your converter:
Mapper.CreateMap<Category, CategoryVM>().ConvertUsing<CategoryToCategoryVMConverter>();
Here comes the tricky part. AutoMapper will need to create a new instance of our converter every time you map values, and it will need to provide a DbContext instance to the constructor. In my projects I use Ninject for dependency injection, and it is configured to use the same instance of DbContext while processing a request. This way the same instance of DbContext is injected both into your controller and into your AutoMapper converter. The trivial Ninject configuration would look like this:
Bind<DbContext>().To<SomeContext>().InRequestScope();
You can of course use some sort of factory pattern to get an instance of DbContext instead of injecting it in constructors.
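For instance, an illustrative factory-based variant of the same converter, resolving the current context lazily through an injected Func<DbContext> rather than taking it in the constructor (a sketch, assuming your container can supply such a factory):
public class CategoryToCategoryVMConverter : TypeConverter<Category, CategoryVM>
{
    private readonly Func<DbContext> contextFactory;

    public CategoryToCategoryVMConverter(Func<DbContext> contextFactory)
    {
        this.contextFactory = contextFactory;
    }

    protected override CategoryVM ConvertCore(Category source)
    {
        // Resolve the request-scoped context at mapping time, not at construction time.
        var context = this.contextFactory();
        return CreateCatVM(source, context.Categories);
    }
}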
Let me know if you have any questions.
I've found a workaround that's not completely hacky.
Basically, I tell AutoMapper to ignore the tricky field and I update it myself.
The updated controller looks like this:
public class OperationsController : Controller {
private SomeContext context = new SomeContext ();
public ViewResult Index()
{
var ops = context.Operations.Include("blah...").ToList();
Mapper.CreateMap<Operation, OperationVM>()
.ForMember(dest => dest.CategoryVM, opt => opt.Ignore());
var opVMs = ops.Select(
op => {
var opVM = Mapper.Map<Operation, OperationVM>(op);
opVM.CategoryVM = CreateCatVM(op.Category, context.Categories);
return opVM;
})
.ToList();
return View(opVMs);
}
}
Still curious how this could be done from within AutoMapper...
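(For what it's worth, newer AutoMapper versions expose an instance API — MapperConfiguration/CreateMapper — which would let the map be built per request with the current context captured in the closure. A rough sketch using the types from the question, at the cost of rebuilding the configuration on every call:)
public ViewResult Index()
{
    var ops = context.Operations.Include("blah...").ToList();

    // The closure captures *this request's* context, so no disposed DbContext is reused.
    var config = new MapperConfiguration(cfg =>
        cfg.CreateMap<Operation, OperationVM>()
           .ForMember(dest => dest.CategoryVM,
                      opt => opt.MapFrom(src => CreateCatVM(src.Category, context.Categories))));

    var mapper = config.CreateMapper();
    var opVMs = ops.Select(op => mapper.Map<Operation, OperationVM>(op)).ToList();
    return View(opVMs);
}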
The answer from @LeffeBrune is perfect. However, I want to have the same behavior, but I don't want to map every property myself. Basically I just wanted to override ConstructUsing.
Here is what I came up with.
public static class AutoMapperExtension
{
    public static void ConstructUsingService<TSource, TDestination>(this IMappingExpression<TSource, TDestination> mappingExpression, Type typeConverterType)
    {
        mappingExpression.ConstructUsing((ResolutionContext ctx) =>
        {
            var constructor = (IConstructorWithService<TSource, TDestination>)ctx.Options.ServiceCtor.Invoke(typeConverterType);
            return constructor.Construct((TSource)ctx.SourceValue);
        });
    }
}
public class CategoryToCategoryVMConstructor : IConstructorWithService<Category, CategoryVM>
{
    private DbContext dbContext;

    public CategoryToCategoryVMConstructor(DbContext dbContext)
    {
        this.dbContext = dbContext;
    }

    public CategoryVM Construct(Category category)
    {
        // Some commands here
        if (category.Id > 0)
        {
            var vmCategory = dbContext.Categories.FirstOrDefault(m => m.Id == category.Id);
            if (vmCategory == null)
            {
                throw new NotAllowedException();
            }
            return vmCategory;
        }
        return new CategoryVM();
    }
}
// Initialization
Mapper.Initialize(cfg =>
{
    cfg.ConstructServicesUsing(type => nInjectKernelForInstance.Get(type));
    cfg.CreateMap<Category, CategoryVM>().ConstructUsingService(typeof(CategoryToCategoryVMConstructor));
});

Linq to SQL using Repository Pattern: Object has no supported translation to SQL

I have been scratching my head all morning over this but still haven't been able to figure out what might be causing it.
I have a composite repository object that references two other repositories. I'm trying to instantiate a Model type in my LINQ query (see first code snippet).
public class SqlCommunityRepository : ICommunityRepository
{
private WebDataContext _ctx;
private IMarketRepository _marketRepository;
private IStateRepository _stateRepository;
public SqlCommunityRepository(WebDataContext ctx, IStateRepository stateRepository, IMarketRepository marketRepository)
{
_ctx = ctx;
_stateRepository = stateRepository;
_marketRepository = marketRepository;
}
public IQueryable<Model.Community> Communities
{
get
{
return (from comm in _ctx.Communities
select new Model.Community
{
CommunityId = comm.CommunityId,
CommunityName = comm.CommunityName,
City = comm.City,
PostalCode = comm.PostalCode,
Market = _marketRepository.GetMarket(comm.MarketId),
State = _stateRepository.GetState(comm.State)
}
);
}
}
}
The repository objects that I'm passing in look like this
public class SqlStateRepository : IStateRepository
{
private WebDataContext _ctx;
public SqlStateRepository(WebDataContext ctx)
{
_ctx = ctx;
}
public IQueryable<Model.State> States
{
get
{
return from state in _ctx.States
select new Model.State()
{
StateId = state.StateId,
StateName = state.StateName
};
}
}
public Model.State GetState(string stateName)
{
var s = (from state in States
where state.StateName.ToLower() == stateName
select state).FirstOrDefault();
return new Model.State()
{
StateId = s.StateId,
StateName = s.StateName
};
}
}
AND
public class SqlMarketRepository : IMarketRepository
{
private WebDataContext _ctx;
public SqlMarketRepository(WebDataContext ctx)
{
_ctx = ctx;
}
public IQueryable<Model.Market> Markets
{
get
{
return from market in _ctx.Markets
select new Model.Market()
{
MarketId = market.MarketId,
MarketName = market.MarketName,
StateId = market.StateId
};
}
}
public Model.Market GetMarket(int marketId)
{
return (from market in Markets
where market.MarketId == marketId
select market).FirstOrDefault();
}
}
This is how I'm wiring it all up:
WebDataContext ctx = new WebDataContext();
IMarketRepository mr = new SqlMarketRepository(ctx);
IStateRepository sr = new SqlStateRepository(ctx);
ICommunityRepository cr = new SqlCommunityRepository(ctx, sr, mr);
int commCount = cr.Communities.Count();
The last line in the above snippet is where it fails. When I debug through the instantiation (new Model.Community), it never goes into any of the other repository methods. I do not have a relationship between the underlying tables behind these three objects. Would this be the reason that LINQ to SQL is not able to build the expression tree right?
These are non-hydrated queries, not fully-hydrated collections.
The Communities query differs from the other two because it calls methods as objects are hydrated. These method calls are not translatable to SQL.
Normally this isn't a problem. For example: if you say Communities.ToList(), it will work and the methods will be called from the objects as they are hydrated.
If you modify the query so that the objects aren't hydrated, for example when you say Communities.Count(), LINQ to SQL attempts to send the method calls to the database and throws since it cannot. It does this even though those method calls ultimately would not affect the resulting count.
The simplest fix (if you truly expect fully hydrated collections) is to add ToList to the community query, hydrating it.
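In other words, a quick sketch of the difference:
// Hydrate first, then count in memory: GetMarket/GetState run as objects materialize.
int commCount = cr.Communities.ToList().Count;   // works, but pulls every row

// Count directly: LINQ to SQL tries to translate the whole projection,
// including the repository method calls, into SQL and throws.
int commCountBroken = cr.Communities.Count();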
Try adding another repository method that looks like this:
public int CommunitiesCount()
{
    return _ctx.Communities.Count();
}
This will allow you to return a count without exposing the entire object tree to the user, which is what I think you're trying to do anyway.
As you may have already guessed, I suspect that what you are calling the anonymous types are at fault (they're not really anonymous types; they are actual objects, which you are apparently partially populating in an effort to hide some of the fields from the end user).
