Entity Framework Model from two databases - entity-framework-4

Is it possible to have an entity model created out of two databases, with all the relationships between tables reflected in the model? How do you do that?
I am referring to ADO.NET Entity Framework version 4.

Maybe. It depends on several factors.
If you are using SQL Server 2005 or later, there is a feature called Synonyms, which allows you to "map" a table from one database into another.
Unfortunately, the EF Data designer doesn't recognize or understand Synonyms. There is a way to manually merge two .EDMX files into one, but this is a huge pain.
When using Synonyms, I prefer to use the Code First approach. I use the Entity Framework Power Tools extension to reverse engineer the tables to a code first model, and this works fine with synonyms.
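For illustration, here is a rough sketch of what the Code First side of that can look like (this assumes the EF 4.1+ DbContext API; the synonym, table, and class names are made up). The synonym itself is created in T-SQL, for example CREATE SYNONYM dbo.RemoteOrders FOR OtherDatabase.dbo.Orders, and the model simply maps onto it:

    using System.Data.Entity;

    public class Order
    {
        public int OrderId { get; set; }
        public string CustomerName { get; set; }
    }

    public class ShopContext : DbContext
    {
        public DbSet<Order> Orders { get; set; }

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            // EF treats the synonym like any other table in the local database.
            modelBuilder.Entity<Order>().ToTable("RemoteOrders", "dbo");
        }
    }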
If you want to use .EDMX, then you can try the approach here:
http://rachel53461.wordpress.com/2011/05/22/tricking-ef-to-span-multiple-databases/

Related

Should a database table exist in more than one edmx file?

Let's say I have an existing database with about 90 tables. I've seen comments that state including them all into one big edmx file is not considered good practice. Suppose I have logical groupings like HR, Legal, and Accounting that I can use to create multiple edmx files. That makes sense. However, what I don't know is what to do if each of these logical groupings would contain a foreign key to commonly used tables (like employee, address, etc). Should each edmx file contain these tables as well, or is there a better way to handle this?
On a side note, when creating an edmx file, how small is too small? Is a context with 5 entities too small? 2? Is there a general rule of thumb?
Any guidance is appreciated!
From the runtime perspective it should not really matter whether you split your model into multiple edmx files or not. 90 entities should be fine, but you may start seeing some delay when your app starts. If you experience this, you may want to pre-generate views (see the command sketch below), which should address the issue. The EF designer is known to be slow when you have many entities. The EF Designer in VS 2012, however, allows you to have multiple diagrams per model to visualize subsections of your overall model.
If you think you will be able to manage the model easily without splitting it, then you can try going with just one model. If it becomes unmanageable, then you can think about splitting it.
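For reference, a rough sketch of pre-generating views with EdmGen.exe (the file names are placeholders for your own model's artifacts):

    EdmGen.exe /mode:ViewGeneration /language:CSharp /inssdl:MyModel.ssdl /incsdl:MyModel.csdl /inmsl:MyModel.msl /outviews:MyModel.Views.cs

Compiling the generated file into the assembly that contains your model lets EF load the pre-generated views instead of generating them on first use. The EF Power Tools extension also has a view generation option that achieves the same thing.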

Any MVC 3 tutorials on *Real World* developement situations?

Total MVC noob here, but a long-time web developer (10+ years). The tutorials for MVC 3 (and earlier versions) are great, but as usual they lack a ton of real, on-the-job type scenarios.
For example, how often do you find yourself in a situation where you are going to create a new database from scratch, with no stored procs, so that you could actually use EF Code-First? I don't know about you, but in my career it has been NEVER.
The usual story is that you are creating a new app or enhancing an existing app with new functionality that will connect to an existing very mature database with tons of stored procs, user defined functions and views and you are required either by management or time restrictions to use it all. And of course you may get to create some new tables but they usually will have joins to existing tables or in the least your app will have to query existing tables for some of the data.
To see a tutorial based on that scenario would be WORTH ITS WEIGHT IN GOLD. Especially the stored procedure scenario.
Thank you for any advice
Most of the earlier examples (e.g. the original NerdDinner) were based on either Linq to Sql or Entity Framework (without CodeFirst). Since CodeFirst is the 'new hotness' most of the latest examples use it.
The interesting part of the question, though, is that it highlights an important point: "it doesn't matter". Your data access strategy (EF, EF code first, NHibernate, L2S, raw SQL) is totally irrelevant to MVC. By that I don't mean it's unimportant, I mean that MVC doesn't place any constraints on you at all in that regard.
You will generally (in a well-designed MVC app) pass your controllers interfaces which let them call data access methods of various sorts (or perhaps another layer of indirection, with services that do other things before hitting the storage layer). The implementation of the data access, if it is using an ORM like EF or NHibernate, will then have mechanisms for you to either use a query syntax of some sort (e.g. LINQ), call stored procedures (possible in all major ORMs that I've used), or push raw SQL if the situation calls for it.
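To make that concrete, here is a minimal sketch of the kind of indirection described above (the type names are hypothetical, it assumes the EF 4.1+ DbContext API, and the controller would be wired up through whatever DI container you use):

    using System.Collections.Generic;
    using System.Data.Entity;
    using System.Data.SqlClient;
    using System.Linq;
    using System.Web.Mvc;

    public class Order
    {
        public int OrderId { get; set; }
        public decimal Total { get; set; }
    }

    public class MyDbContext : DbContext
    {
        public DbSet<Order> Orders { get; set; }
    }

    public interface IOrderRepository
    {
        IEnumerable<Order> GetOrdersForCustomer(int customerId);
    }

    // EF-backed implementation: free to use LINQ or an existing stored proc.
    public class EfOrderRepository : IOrderRepository
    {
        private readonly MyDbContext _context;
        public EfOrderRepository(MyDbContext context) { _context = context; }

        public IEnumerable<Order> GetOrdersForCustomer(int customerId)
        {
            return _context.Database.SqlQuery<Order>(
                "EXEC dbo.GetOrdersForCustomer @CustomerId",
                new SqlParameter("@CustomerId", customerId)).ToList();
        }
    }

    // The controller only knows the interface, not the data access technology.
    public class OrdersController : Controller
    {
        private readonly IOrderRepository _orders;
        public OrdersController(IOrderRepository orders) { _orders = orders; }

        public ActionResult Index(int customerId)
        {
            return View(_orders.GetOrdersForCustomer(customerId));
        }
    }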

Entity Framework 4: Does it make sense to create a single diagram for all entities?

I wrote a few assumptions regarding Entity Framework, then a few questions (so please correct where I am wrong). I am trying to use POCOs with EF 4.
My assumptions:
Only one data context can exist for an EF diagram.
Data Contexts can refer to more than one entity.
If you have two data sources, say MS SQL server and Oracle, EF requires two different diagrams to access the data.
The EF diagram data context is the "Unit of Work", having a single Save() for anything on the diagram. (Sure you could wrap it in a UnitOfWork class, but it essentially has the same duties).
Assuming that's correct, here are my questions:
If you don't keep all entities on the same EF diagram, how do you maintain data integrity, like "Orders" cannot exist without a "Customer"? Is this solely a function of the repository to load data just to verify integrity, or do we "try/catch" on database referential integrity errors?
Wouldn't you create an EF diagram for each Entity? For example, I wouldn't expect changes to a customer and changes to a product to be written together as they have nothing to do with each other (having them on the same diagram would cause them to be written together). Or is the scope of an EF diagram to encompass all similar entities stored in the same storage medium?
Is it the norm to divide up the entities like that, or just have a single diagram holding all the entities? I would think the latter, but the thinking is getting the better of me.
Having one big EDM containing all the entities generally is NOT a good practice and is not recommended.
Using one large EDM will cause several issues such as:
Performance Issue in Metadata Load Times:
As the size of the schema files increase, the time it takes to parse and create an in-memory model for this metadata would also increase.
Performance Issue in View Generation:
View generation is a process that compiles the declarative mapping provided by the user into client-side Entity SQL views that will be used to query and store entities to the database. The process runs the first time either a query or SaveChanges happens. The performance of the view generation step depends not only on the size of your model but also on how interconnected the model is. If two entities are connected via an inheritance chain or an association, they are said to be connected. Similarly, if two tables are connected via a foreign key, they are connected. As the number of connected entities and tables in your schemas increases, the view generation cost increases.
Cluttered Designer Surface:
When you generate an EDM model from a big database schema, the designer surface is cluttered with a lot of entities, and it is hard to make sense of what your Entity Model looks like as a whole. If you don't have a good overview of the Entity Model, how are you going to customize it?
Intellisense experience is not great:
When you generate an Edm model from a database with say 1000 tables, you will end up with 1000 different entity sets. Imagine how your intellisense experience would be when you type “context.” in the VS code window.
Cluttered CLR Namespaces:
Since a model schema will have a single EDM namespace, the generated code will place the classes in a single namespace.
For a more detailed discussion, have a look at Working With Large Models In Entity Framework – Part 1
Solution:
While there is no out-of-the-box solution for this, the suggestion is instead to find Naturally Disconnected Subsets in your model: based on your domain model, come up with different sets of domain models, each containing related objects, where each set is unrelated to and disconnected from the others. Having no foreign keys in between is a good sign for separation. This makes sense because, in a large model, your application usually does not require all the tables in a database to be mapped to one Entity Model in order to work.
Even if this kind of separation is not 100% possible - meaning that there are subsets of tables that have outgoing foreign keys to other tables in the database - it still encourages you to separate them. When you do this, you take on the responsibility of setting the foreign key appropriately. There would be no navigation property that allows you to get the Entity that represents this foreign key. Of course you could manually query for this Entity in the other container if needed.
Also, for some tips and tricks on how you can split one large entity model into smaller ones while reusing types, take a look at: Working With Large Models In Entity Framework – Part 2
About your question: Order and Customer belong to the same natural domain and should be kept in the same EDM. Like I said, you can scatter them over two different entity data models, but then you have to take responsibility for setting the appropriate foreign keys or you'll get runtime exceptions. By the same token, Customer and Product should be kept in separate entity data models. Following these rules, you can come up with a well-defined domain set design in your data access layer.
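As a rough illustration of what that manual foreign-key management looks like (hypothetical class and context names, written in the code-first style for brevity; the same idea applies to two edmx-based containers):

    using System.Data.Entity;
    using System.Linq;

    // "Customers" sub-model
    public class Customer
    {
        public int CustomerId { get; set; }
        public string Name { get; set; }
    }

    public class CustomersContext : DbContext
    {
        public DbSet<Customer> Customers { get; set; }
    }

    // "Sales" sub-model: no navigation property to Customer, only the raw FK value.
    public class Order
    {
        public int OrderId { get; set; }
        public int CustomerId { get; set; }
    }

    public class SalesContext : DbContext
    {
        public DbSet<Order> Orders { get; set; }
    }

    public static class Example
    {
        public static void PlaceOrder(string customerName)
        {
            using (var customers = new CustomersContext())
            using (var sales = new SalesContext())
            {
                // Look the customer up in its own container, then set the FK by hand.
                var customer = customers.Customers.First(c => c.Name == customerName);
                sales.Orders.Add(new Order { CustomerId = customer.CustomerId });
                sales.SaveChanges();
            }
        }
    }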
I realize that this question was about EF4, but I am sure that many people who are just now "making the switch" will end up here via Google, read this and the approved answer, and make decisions based on it even though they are using EF5 (or EF 4.4 if you are stuck on .NET 4.0).
EF5 allows multiple diagrams per edmx. This is a big deal, at least to my team, because it allows us to visually separate entities without requiring separate edmx files. Dr. Zim's points are all still valid except (obviously) the "cluttered designer surface".
There are drawbacks to having multiple edmx files; the biggest one is that even if you create separate namespaces for each, you cannot duplicate entity names. Yes, if you truly are designing your system "code first" then this should not be a problem. However, many (most) of us are adding EF to existing systems that are already built on top of relational databases which have normalization.
"But normalization is a good thing, right?" Well, if you are using a relational database yes. "But why does that matter if I am using EF?" A common "normalized" table is Address. Possible scenario: Company (location of business/office) and Contact (might be "remote" worker so they are not at the business location) and they both have a FK that points to Address. Using one edmx file for Company and one for Contact (even with different namespaces) that both include the Address table, the code will compile but at run time you will get this beauty:
Multiple types with the name 'Address' exist in the EdmItemCollection
in different namespaces. Convention based mapping requires unique names
without regard to namespace in the EdmItemCollection
You can change the mapping that EF uses, but then you run into other "issues" when working through the implementation, and since most people use the default mapping, forums like this won't have many pertinent questions and answers.
You could also rename the Model name for the Address table to "ContactAddress" and "CompanyAddress" respectively, but that gives the illusion that they are different types when they really aren't. OK, so they are different types in EF but not in the database and, as I said, most of us "live" in the world of tacking on EF to an existing system with an existing data store that is a relational database.
This is already a long-winded "answer" so I will stop here. I just wanted to make sure that people who landed here because they searched for "multiple edmx" and did not realize that there are significant differences between EF4 and EF5 were made aware, and realize they may need to do some more investigating.

Which ORM supports mapping existing databases?

So I have a layered ASP.NET MVC proof-of-concept application with good separation between presentation concerns, business logic, and infrastructure concerns. Right now it is operating off of a fake repository (i.e. LINQ queries against static IQueryable objects). I would like to create a functional SQL repository now.
That said, I don't want to simply tie it into a database that has a 1-1 mapping between tables and entities. That wouldn't meet the business need I am hoping to solve (partial integration with existing database - no hope for convention over configuration).
Do you have suggestions for which ORM / mapping tools I should consider and/or avoid?
Do you have suggestions for articles/books I could look at to help me approach this topic?
Would it be better to simply use parameterized queries in this scenario?
Entity Framework in version 4 would definitely allow you to:
have a mapping between the physical database schema and your conceptual schema, e.g. having an entity mapped to several tables, or several tables joined together forming a single business entity
grab data from views (instead of tables directly)
use stored procedures (where needed and appropriate) for INSERT, UPDATE, DELETE on every entity
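To illustrate the second point (reading from views), here is a rough sketch assuming the EF 4.1+ DbContext API and a hypothetical view name; the EDMX designer achieves the same result by importing the view as an entity:

    using System.Data.Entity;

    // Read-only entity backed by a database view rather than a table.
    public class CustomerSummary
    {
        public int CustomerId { get; set; }
        public string Name { get; set; }
        public decimal TotalSales { get; set; }
    }

    public class ReportingContext : DbContext
    {
        public DbSet<CustomerSummary> CustomerSummaries { get; set; }

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            // EF queries the view exactly as it would a table.
            modelBuilder.Entity<CustomerSummary>()
                        .HasKey(c => c.CustomerId)
                        .ToTable("vw_CustomerSummary");
        }
    }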
NHibernate sounds like a good fit for what you are looking for. You will be able to make your repositories call queries in either HQL or using the API; either way you can get to your database and shape the data to fit the way your repository is being used. It will always be hard to make a square peg fit into a round hole, though. SO has lots of nice support when you get into using NHibernate. Good luck.
As you mentioned in the question, it is very debatable to choose an ORM. Different people will have different project needs. I am not exactly sure what will take priority for you. Here is what I have tried myself.
NHibernate seems to be the most commonly used ORM in DotNet projects. I feel it suffers from a typical open source problem. It offers so many features but the documentation really sucks. If you have lots of time at your disposal you can give it a shot.
Another option is to go for something like Entity Framework. It's very easy to set up and get up and running. With version 4.0 and the CTP there is provision for code first as well as fluent mapping and configuration. Since you have said you would want to keep the domain model separated, EF 4 will help you because it has the notion of a conceptual model, which is an abstraction over the mapping layer.
You can refer to the links below for blog posts I have written based on my experience:
http://nileshgule.blogspot.com/2010/08/entity-framework-hello-world.html
http://nileshgule.blogspot.com/2010/09/nhibernate-code-first-approach-with.html
http://nileshgule.blogspot.com/2010/09/entity-framework-first-query-using.html
http://nileshgule.blogspot.com/2010/09/entity-framework-learning-series.html

Model for ASP.NET MVC

I just started with ASP.NET MVC 1.0. I've read through some tutorials but I don't have any good ideas on how to build my model.
I've been experimenting with LINQ to SQL. Can't say I like it that much, but I'll give it a try. I prefer using SQL stored procedures, but that doesn't seem to work very well with the optional parameters in my stored procedures.
The project is a prototype, but I would like to start with a good design right away.
The database will probably contain around 50 tables, and stored procedures will be used for all queries and updates. Some procedures will also return multiple results. What would be the best way to design this model?
Any and all ideas are welcome. Should I even use LINQ to SQL? Should I design my stored procedures to only return one result? I feel a little bit lost.
You can use stored procedures with LINQ. You simply need to replace the autogenerated statements on your entities with your stored procs. Look at the properties for the entity (table) in the designer to change these. Your stored procedures won't be able to return multiple results, as they need to map onto a collection of either one of your entities or an autogenerated entity, i.e., onto a collection of a single class.
Having said all that, I'd give it a go without using stored procedures. The LINQ code is fully parameterized, so you already have this benefit. You'll find that LINQ allows you to easily build up queries, alleviating your need for the optional parameters in your stored procs and likely resulting in more efficient queries, as you don't need to build extra logic into your query to handle the optional parameters. Once you start using LINQ, I think you'll find that the power and expressiveness of the code will make you forget about using your stored procedures.
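For example, a sketch of the kind of composable query this enables (it assumes a designer-generated MyDataContext with a Products table, both hypothetical); only the filters that were actually applied end up in the generated SQL:

    using System.Collections.Generic;
    using System.Linq;

    public static class ProductSearch
    {
        public static IList<Product> Search(MyDataContext db, string nameFilter, int? categoryId)
        {
            IQueryable<Product> query = db.Products;

            // Add filters only when the caller supplied them.
            if (!string.IsNullOrEmpty(nameFilter))
                query = query.Where(p => p.Name.Contains(nameFilter));

            if (categoryId.HasValue)
                query = query.Where(p => p.CategoryId == categoryId.Value);

            return query.OrderBy(p => p.Name).ToList();
        }
    }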
You may still have a need for stored procs or table-valued functions for complex queries -- that's ok. If they map onto an existing entity, you can simply drag the proc/function onto the entity in the designer and you get a nice method on your data context that runs your SQL and returns collections of that entity. If not, I often create a view for the proc/function schema, use that as an entity, and attach the proc/function to it.
May I ask why you need to use stored procedures for this project? They have almost no value today if you are working with a modern database. Parameterized SQL using a good ORM will give you the same cached execution plans and security benefits that you are looking for with stored procedures.
I would recommend NHibernate because it's a far better way to achieve persistence ignorance. And IMHO this is the goal of anyone designing a model. You should be thinking object first instead of data first.
The best resource around is the screencast series by Steve Bohlen, "Summer of NHibernate".
Also LINQ to NHibernate is available and works great for any dynamic query that you might need to write.
The only pain you will notice with NHibernate is the XML mapping files, but if you really enjoy the ORM you can write fluent mappings instead using Fluent NHibernate (FNH).
I found NHibernate to be the best tool around for modeling a domain because your objects are 100% infrastructure free and this allows you to do anything you want with them without having to worry about the database. Also if you are into unit testing of any kind this will make your life much easier :P
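For what it's worth, a minimal Fluent NHibernate sketch (hypothetical Product class and table name); this replaces the equivalent hbm.xml mapping file:

    using FluentNHibernate.Mapping;

    public class Product
    {
        public virtual int Id { get; set; }
        public virtual string Name { get; set; }
        public virtual decimal Price { get; set; }
    }

    public class ProductMap : ClassMap<Product>
    {
        public ProductMap()
        {
            Table("Products");   // map onto the existing table
            Id(x => x.Id);
            Map(x => x.Name);
            Map(x => x.Price);
        }
    }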
Edit
Part of the reason I replied to your question is that I have built many a system using stored procedures and have come to regret that decision later. But in my situation I had a requirement from a DBA, so I couldn't step away from the norm.
If you are already looking at LINQ to SQL I would simply encourage you to look at NHibernate as another great option to achieve the goal of persistence ignorance.
When I got away from thinking about the problem data-first, it allowed me to work at a much higher level and take advantage of the .NET platform for solving problems. T-SQL has its place, but if you are writing an application and want to maintain it, try to think about how your domain objects will work together in memory first.