I'm trying to declare a DbSet in my database context class.
Because I don't know which class (Entity Framework class) the user will call, I want to declare the DbSet as dynamic.
I would try to have everything explicitly defined in your context. There's a lot of functionality baked into that property declaration, not the least of which is the design-time benefit of having types.
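For illustration, a conventionally declared context might look something like this (the entity names here are just placeholders, not from the question):

public class MyAppContext : DbContext
{
    // Explicit, typed sets give you IntelliSense, compile-time checking,
    // and a model EF can discover at design time.
    public DbSet<Customer> Customers { get; set; }
    public DbSet<Order> Orders { get; set; }
    public DbSet<LookupDefinition> LookupDefinitions { get; set; }
}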
There are many techniques for dynamic querying, but I'll post one that I've used in the past for dynamically querying lookup tables.
public List<LookupItemDTO> LoadItems(int lookupId)
{
    // Nothing special here, just map the lookup id to a name.
    var lookupRepo = new LookupRepository<LookupDefinition>(uow);
    var lookup = lookupRepo.Find(lookupId);

    // Map the lookup's name from the table to a type. This is important: DO NOT ACCEPT A TABLE NAME FROM THE USER.
    var targetType = Type.GetType("MyApp.DomainClasses.Lookup." + lookup.Name + ",MyApp.DomainClasses");

    // This is an optional extra step, but it allows us to instantiate an instance of our repository
    // to ensure that business rules are enforced.
    var lookupType = typeof(LookupRepository<>).MakeGenericType(targetType);
    dynamic lookupTable = Activator.CreateInstance(lookupType);

    // Just some pattern nonsense to wire up the unit of work.
    lookupTable.SetupUOW(uow);

    // Make sure to cast it into something you can work with ASAP.
    var data = ((IQueryable<LookupTableBase>)lookupTable.All)
        .OrderByDescending(l => l.IsVisible)
        .ThenBy(l => l.SortOrder);

    var list = from li in data
               select new LookupItemDTO
               {
                   Id = li.Id,
                   Description = li.Description,
                   Display = li.Display,
                   SortOrder = li.SortOrder,
                   IsVisible = li.IsVisible
               };

    return list.ToList();
}
The key here is that you can dynamically query tables, but it's better to do it at a higher level. Also, keep some kind of indirection between you and your user's input: let them select a table from a list, then use the ID from that selection to look up the name. My solution here is to have a lookup table definition table. Any dynamic query starts there first, and values from that definition table are used to build the necessary types.
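As a rough sketch (the property names are illustrative, not taken from the code above), the definition table behind that indirection can be as simple as:

// Hypothetical definition entity: one row per lookup table that may be queried dynamically.
public class LookupDefinition
{
    public int Id { get; set; }        // the only value accepted from the user
    public string Name { get; set; }   // resolved to a CLR type, e.g. "Colors" -> MyApp.DomainClasses.Lookup.Colors
}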
I have a web app using MVC and EF. I am using the Repository and Unit of Work patterns from the Microsoft online docs.
I am trying to insert multiple rows into multiple tables.
The code looks something like this:
unitOfWork.Table1.Insert(table1Row);
unitOfWork.Save(); // recId is the primary key; it is auto-generated on save.
table2Row.someId = table1Row.recId;
unitOfWork.Table2.Insert(table2Row);
unitOfWork.Save();
If anything goes wrong when inserting the second row, I need to roll back both rows.
How do I implement BeginTransaction/Commit/Rollback with UnitOfWork pattern?
Thanks.
To avoid these issues it is better to use EF as an ORM rather than as a simple data-translation service: rely on reference (navigation) properties instead of setting FK values directly.
Your example doesn't really appear to be providing anything more than a thin abstraction of the DbContext.
Take the example of an Order for a new Customer, where you want a CustomerId to be present on the Order.
Problem: the new Customer's ID is generated by the DB, so it will only be available after SaveChanges.
Solution: Use EF navigation properties and let EF manage the FKs.
var customer = new Customer
{
    Name = customerName,
    // etc.
};

var order = new Order
{
    OrderNumber = orderNumber,
    // etc.
    Customer = customer,
};

dbContext.Orders.Add(order);
dbContext.SaveChanges();
Note that we didn't have to add the Customer to the dbContext explicitly; it will be added via the order's Customer reference, and the Order table's CustomerId will be set automatically. If there is a Customers DbSet on the context you can add the customer explicitly as well, but only one SaveChanges call is needed.
To set the navigation properties up see: https://stackoverflow.com/a/50539429/423497
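For reference, a minimal sketch of entities wired up with a navigation property (property names assumed for illustration) could look like:

public class Customer
{
    public int CustomerId { get; set; }
    public string Name { get; set; }
}

public class Order
{
    public int OrderId { get; set; }
    public string OrderNumber { get; set; }

    // EF populates CustomerId from the Customer reference when SaveChanges runs.
    public int CustomerId { get; set; }
    public virtual Customer Customer { get; set; }
}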
** Edit (based on comments on 1-many relationships) **
For collections, a common trap to avoid is setting the collection reference directly on entities that have already been associated with the DbContext. You should also declare the references as virtual so that EF can best manage the tracking of the instances in the collection.
If an order has multiple customers then you would have a customers collection in the order:
public virtual List<Customer> Customers { get; set; } = new List<Customer>();
and optionally an Order reference in your customer:
public virtual Order Order { get; set; }
Mapping these would look like: (from the Order perspective)
HasMany(x => x.Customers)
.WithRequired(x => x.Order)
.Map(x => x.MapKey("OrderId"));
or use the parameterless .WithRequired() if the customer does not have an Order reference.
This is based on relationships where the entities do not have FK fields declared. If you declare FKs then .Map(...) becomes .HasForeignKey(x => x.FkProperty)
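For example, if Customer declared an OrderId property, the mapping above would become something like this (a sketch, assuming the EF6 fluent API):

// Assumes Customer exposes: public int OrderId { get; set; }
HasMany(x => x.Customers)
    .WithRequired(x => x.Order)
    .HasForeignKey(x => x.OrderId);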
From there if you are creating a new order:
var order = new Order
{
    OrderNumber = "12345",
    Customers = new []
    {
        new Customer { Name = "Fred" },
        new Customer { Name = "Ginger" }
    }.ToList()
};

using (var context = new MyDbContext())
{
    context.Orders.Add(order);
    context.SaveChanges();
}
This should all work as expected. It will save the new order and both associated customers.
However, if you load the order from the DbContext and want to manipulate the customers associated with it, there are a couple of caveats.
1. You should eager-load the Customers collection so that EF knows about these entities.
2. You need to manipulate the collection with Add/Remove rather than setting the reference, to avoid a mismatch between what the code looks like it does and what EF actually does.
Something like:
using (var context = new MyDbContext())
{
    var order = context.Orders.Find(1);
    order.Customers = new []
    {
        new Customer { Name = "Roy" }
    }.ToList();
    context.SaveChanges();
}
Will result in "Roy" being added to the Customers, rather than replacing them.
To replace them you need to remove them first, then add the new one.
using (var context = new MyDbContext())
{
    var order = context.Orders.Find(1);
    context.Customers.RemoveRange(order.Customers); // Assuming customers cannot exist without orders. If OrderId is nullable, this line can be omitted.
    order.Customers.Clear();
    order.Customers.Add(new Customer { Name = "Roy" });
    context.SaveChanges();
}
This starts to fall apart if the collections are not virtual. For instance:
using (var context = new MyDbContext())
{
    var order = context.Orders.Find(1);
    order.Customers = new []
    {
        new Customer { Name = "Roy" }
    }.ToList();
    context.SaveChanges();
}
If the customers collection is virtual, after the SaveChanges order.Customers will report a collection size of 3 elements. If it is not virtual, it will report the size as 1 element, even though there are now 3 records associated with the order in the DB. This leads to all kinds of issues where projects get caught out with invalid data state, duplicate records and the like.
Cases where you're seeing some records getting missed will likely be due to missing virtual on the references, manipulating the collections outside of what EF is tracking, or other manipulations of the tracking state (a common issue when projects are set up to detach and re-attach entities from contexts).
I'm trying to limit the amount of data coming across when implementing lookup lists in Breeze.JS. The Lookup Lists sample uses queries that result in full objects, but I would like to project the entities to fewer properties (e.g. primary key, foreign keys, and a descriptor property) and still have Breeze.JS recognize the entity type on the client. I know how to do the projections from the client to get partials, but how would I do that with lookup lists (either from the client or the server Web API)?
You might satisfy your intentions with a custom JsonResultsAdapter.
You're probably wondering "What is a JsonResultsAdapter?"
That's what Breeze uses to interpret the JSON arriving from the server. You can read about it here and here.
Perhaps more helpful is to look at the adapter in the Web API dataservice and at the example adapter from the "Edmunds" sample.
The Edmunds sample demonstrates translating a JSON source that you don't control into breeze entities.
In this case, your JsonResultsAdapter would look at each node of JSON and say "this is a Foo, this is a Bar, and that one is a Baz". Accordingly, for each of these nodes it would return { entityType: "Foo" }, { entityType: "Bar" }, or { entityType: "Baz" }.
Now breeze knows what to do and creates corresponding entities out of the Lookups payload.
Remember to mark these entities as partial, in the same way you would if you had made a projection query that targeted a single entity type.
Fortunately, the Lookups query returns the container object that holds the Foo, Bar, and Baz collections. So you can iterate over these and mark them partial right there in the query success callback.
Once you wrap your head around THAT ... you'll want to know how to put your custom JsonResultsAdapter to work in the Lookups query ... and ONLY in the Lookups query.
You can enlist that JsonResultsAdapter exclusively for your Lookups query with a using clause.
Here's an example:
var jsa = new breeze.JsonResultsAdapter({
    name: 'myLookupsJsa',
    visitNode: function() {...}
});

query = query.using(jsa);
Is this overkill? Would you be better off making three trips?
Only you will know. I would like to hear from you when you try it ... and give us your suggestions on how we might make this easier in a general way.
In the lookup lists example the controller action looks like this:
[HttpGet]
public object Lookups() // returns an object, not an IQueryable
{
    var regions = _contextProvider.Context.Regions;
    var territories = _contextProvider.Context.Territories;
    var categories = _contextProvider.Context.Categories;
    return new { regions, territories, categories };
}
You could reduce the footprint using a server-side projection like this:
[HttpGet]
public object Lookups() // returns an object, not an IQueryable
{
    var regions = _contextProvider.Context.Regions
        .Select(x => new { id = x.RegionId, name = x.RegionName });
    var territories = _contextProvider.Context.Territories
        .Select(x => new { id = x.TerritoryId, name = x.TerritoryName });
    var categories = _contextProvider.Context.Categories
        .Select(x => new { id = x.CategoryId, name = x.CategoryName });
    return new { regions, territories, categories };
}
This approach does not answer this part of your question:
and still have Breeze.JS recognize the entity type on the client
Not sure what the solution or use case is for this piece.
We have controllers that read Entities with certain criteria and return a set of view models containing the data to the view. The view uses a Kendo Grid, but I don't think that makes a particular difference.
In each case, we have a LINQ query that gets the overall collection of entity rows and then a foreach loop that creates a model from each row.
Each entity has certain lookups, as follows:
those with a 1:1 relationship, e.g. Assigned to (via a foreign key to a single person)
those with a 1:many relationship e.g. copy parties (to 0:many people - there are not many of these)
counts of other relationships (e.g. the number of linked orders)
any (e.g. whether any history exists)
If we do these in the model creation, the performance is not good as the queries must be run separately for each and every row.
We have also tried using Includes to eager-load the related entities, but once you get more than two, performance starts to deteriorate too.
I have seen that compiled queries and LoadProperty may be an option and I am particularly interested in the latter.
It would be great to understand best practice in these situations, so I can direct my investigations.
Thanks,
Chris.
Edit - sample code added. However, I'm looking for best practice.
public JsonResult ReadUsersEvents([DataSourceRequest]DataSourceRequest request, Guid userID)
{
    var diaryEventModels = new List<DiaryEventModel>();
    var events = UnitOfWork.EventRepository.All().Where(e => e.UserID == userID);

    foreach (var eventItem in events)
    {
        var diaryModel = new DiaryEventModel(eventItem);
        diaryEventModels.Add(diaryModel);
    }

    var result = diaryEventModels.ToDataSourceResult(request);
    return Json(result, JsonRequestBehavior.AllowGet);
}
public DiaryEventModel(Event eventItem)
{
    // Regular properties from the entity - no issue here, as this data is retrieved simply in the original query.
    ID = eventItem.ID;
    Start = eventItem.StartDateTime;
    End = eventItem.EndDateTime;
    EventDescription = eventItem.Description;

    // One-to-one looked-up property example.
    Creator = eventItem.Location.FullName;

    // Calculation example based on 0-to-many looked-up properties; we also use .Any in some cases.
    // This is a simplified example.
    AttendeeCount = eventItem.EventAttendees.Count();

    // 0-to-many looked-up properties.
    EventAttendees = eventItem.EventAttendees.Select(e => new SelectListItem
    {
        Text = e.Person.FullName,
        Value = e.Person.ID.ToString()
    }).ToList();
}
I am trying to find the best way to generate entities; this is what I am doing at the moment.
I create an entity through a mapper and a hydrator like this:
namespace Event\Model\Mapper;

use ZfcBase\Mapper\AbstractDbMapper;

class Event extends AbstractDbMapper
{
    protected $tableName = 'events';

    public function findEventById($id)
    {
        $id = (int) $id;
        $select = $this->getSelect($this->tableName)
                       ->where(array('event_index' => $id));
        $eventEntity = $this->select($select)->current();

        if ($eventEntity) {
            // Set Location object
            $locationMapper = $this->getServiceLocator()->get('location_mapper');
            $locationEntity = $locationMapper->findlocationById($eventEntity->getLocationIndex());
            $eventEntity->setLocationIndex($locationEntity);

            // Set User object
            $userMapper = $this->getServiceLocator()->get('user_mapper');
            $userEntity = $userMapper->findUserById($eventEntity->getEnteredBy());
            $eventEntity->setEnteredBy($userEntity);

            // Set Catalog object
            $catalogMapper = $this->getServiceLocator()->get('catalog_mapper');
            $catalogEntity = $catalogMapper->findCatalogById($eventEntity->getCatalogIndex());
            $eventEntity->setCatalogIndex($catalogEntity);
        }

        return $eventEntity;
    }
}
Now the problem with this approach is that when I load, say, the User entity, that entity has other entities attached to it, so when I generate the Event entity by inserting the User entity, my Event entity becomes very large and bulky. I don't want that; I just want the first layer of the "genealogy tree".
So I was thinking of creating an EventEntityFactory where I can bind together the child entities of the Event entity.
Is there a better way of doing this?
Thanks
One approach would be to use Virtual Proxies (with lazy loading):
http://phpmaster.com/intro-to-virtual-proxies-1/
http://phpmaster.com/intro-to-virtual-proxies-2/
Basically you would generate your entity and replace any related entities with a lightweight proxy object. This object would only load the related entity when required, via lazy loading.
I've used this approach many times along with the Datamapper design pattern and it works very well.
I am refactoring an MVC project to make it testable. Currently the controller uses the Entity Framework context objects directly to ask for the required data. I started to abstract this, and it just doesn't work. Eventually I have an IService and an IRepository abstraction, but to describe the problem let's just look at the IRepository. Many people advise an interface with functions which return some of these: IQueryable<...>, IEnumerable<...>, IList<...>, SomeEntityObject, SomeDTO. Then, when one wants to test the service layer, they can implement the interface with a class which doesn't go to the database to return these.
Problem: using LINQ to Entities I have lazy (deferred) loading in my toolset. This is actually very useful, because my controller action functions know which data they need for the view, and I don't ask for more than required. However, LINQ to anything else doesn't have lazy loading, so when my IRepository functions return any of the above-mentioned things I lose lazy loading. I extended my interface with functions like "GetAnything" and "GetAnythingDeep", but it's not enough: it has to be much more fine-grained. That would result in about 5-6 functions for the same type of object, depending on the properties I want in the result. Maybe it could be a general function with some "include properties" parameter, but I don't like that either.
At the moment I think that if I want to make it testable, the result will be either much less efficient or much more complicated code. That doesn't sound right.
By the way, I was thinking about changing the data source behind the entity model to either XML or some object data source so I could keep LINQ to Entities. I found that it's not supported out of the box... which is also sad: it means that Entity Framework implies a database source, which is not a really useful abstraction.
Specific example:
Entity objects:
Article, Language, Person. Relations: Article can have 1-N languages, and one Person (publisher).
ViewModel object:
ArticleDeepViewModel: Contains all the properties of the article, including the languages and the Name of the Person (it's for viewing the article, so there's no need for the other properties of the person).
The controller action which returns this view should get the data from somewhere.
Code before modifications:
using (var context = new Entities.Articles())
{
    var article = (from a in context.Articles.Include("Languages")
                   where a.ID == ID
                   select new ViewArticleViewModel()
                   {
                       ID = a.ID,
                       Headline = a.Headline,
                       Summary = a.Summary,
                       Body = a.Body,
                       CreatedBy = a.CreatedByEntity.Name,
                       CreatedDate = a.CreatedDate,
                       Languages = (from l in context.Languages
                                    select new ViewLanguagesViewModel()
                                    {
                                        ID = l.ID,
                                        Name = l.Name,
                                        Selected = a.Languages.Contains(l)
                                    })
                   }).Single();
    this.ViewData.Model = article;
}
return View();
Code after modifications could be something like:
var article = ArticleService.GetArticleDeep(ID);
var viewModel = /* mapping */
this.ViewData.Model = viewModel;
return View();
The problem is that GetArticleDeep should return an Article object with the Languages included and the entire Person object included (it shouldn't know that the viewmodel needs just the Name of the Person). Also, I have 3 different viewmodels for an article so far. For example, if someone wants to see the list of articles, it's unnecessary to get the languages, the body and some other properties, though it might be useful to get the Name of the publisher (which sits deeper in the graph). Before the "testable" code, the controller actions could just contain the LINQ to Entities query and get whichever data they need using lazy loading, the Include function, subqueries, or references to foreign properties (Publisher.Name)... so there was no unnecessary query to the database and no unnecessary data transferred from the database.
What should be the IService or IRepository interface provide to get the 3-4 different level of Article objects or sometimes list of these objects?
Not sure if you are planning to stick with lazy loading, but if you want a flexible way to integrate eager loading into your repository and service layers, first check out this article:
http://blogs.msdn.com/b/alexj/archive/2009/07/25/tip-28-how-to-implement-include-strategies.aspx
He basically gives you a way to build a strongly-typed include strategy like this:
var strategy = new IncludeStrategy<Article>();
strategy.Include(a => a.Author);
Which can then be passed into a general method on your repository or service layers. This way you don't have to have a separate method for each circumstance (i.e. your GetArticleDeep method).
Here is an example repository method using the above include strategy:
public IQueryable<Article> Find(Expression<Func<Article, bool>> criteria, IncludeStrategy<Article> includes)
{
    var query = includes.ApplyTo(context.Articles).Where(criteria);
    return query;
}