Using EF4 code-first you can create and compile classes and a DbContext. What happens when you want to add some classes/tables and relationships to an already-compiled DLL of a model set?
So far the solutions I have come up with are using "partial" classes that would be complemented later on, and writing a whole new DbContext that includes or extends the first one in some way; but the latter would mean an additional DB connection per module (per DbContext). Any ideas about this? What's the best practice? Also, I need to be able to work with migrations.
More explicitly, a possible scenario is as follows:
A) You create a .dll with some DbContextBase class and tables (classes) inside it.
B) You create other .dlls that depend on/extend DbContextBase in their own way.
C) You reference said .dlls in a project and extend them.
So basically you can have a core DbContext, then add a menu module to it, then add a blog module (which the menu module can see, in order to create latest-blog-posts menus, etc.). On top of that, if you want a specific one-time feature for the blog you can quickly integrate it, while still keeping your blog module updateable.
As I begin to see it, the best way to do that is NuGet packages containing the source code for the models (and the like) per module, instead of compiled DLLs.
You can build some infrastructure in your core assemblies which will discover entities in your modules and register them with a single context. Each entity must have a class derived from EntityTypeConfiguration<> (or ComplexTypeConfiguration<> for complex types) which describes the mapping.
Once you have mapping classes, you can either use some module interface to collect all of them for every module, or use reflection to browse the assemblies and create instances of the mapping classes. These instances are then registered with the DbModelBuilder, typically in OnModelCreating.
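As a sketch of that discovery step (assuming the EF 4.1+ DbContext API; the context name and the decision to scan all loaded assemblies are illustrative), OnModelCreating can look for non-abstract EntityTypeConfiguration<> subclasses and register an instance of each:

```csharp
using System;
using System.Linq;
using System.Data.Entity;
using System.Data.Entity.ModelConfiguration;

public class ModularContext : DbContext
{
    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Find every non-abstract class deriving directly from
        // EntityTypeConfiguration<TEntity> in the loaded module assemblies.
        var mappingTypes = AppDomain.CurrentDomain.GetAssemblies()
            .SelectMany(a => a.GetTypes())
            .Where(t => !t.IsAbstract
                     && t.BaseType != null
                     && t.BaseType.IsGenericType
                     && t.BaseType.GetGenericTypeDefinition()
                            == typeof(EntityTypeConfiguration<>));

        foreach (var type in mappingTypes)
        {
            // 'dynamic' dispatches to the correct generic Add<T> overload
            // at runtime, since T is only known via reflection here.
            dynamic configuration = Activator.CreateInstance(type);
            modelBuilder.Configurations.Add(configuration);
        }
    }
}
```

A module-interface approach works the same way, except the modules hand you the configuration instances instead of reflection finding them.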
Also I need to be able to work with migrations.
I'm not sure if migrations are ready for this because it has some preconditions:
All shared tables must be handled by the core assemblies, with their own DbMigration-derived class (or classes, for new versions).
Every module must handle its own tables, with its own DbMigration-derived class (or classes, for new versions).
Modules mustn't alter shared tables.
Modules mustn't alter or access the tables of other modules.
It means that you have a special migration set for the core and one migration set for every module. Every migration set is defined in a separate assembly, which can be a potential problem. I didn't try it myself, so I don't know if EF migrations can handle this. I'm especially targeting scenarios where you really want modular systems, where modules can be added or removed over time, so you need both installation (the Up method) and uninstallation (the Down method).
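A module's migration set would pair every Up with a matching Down so the module can be cleanly uninstalled. A hypothetical blog-module migration might look like this (table and column names are illustrative):

```csharp
using System.Data.Entity.Migrations;

// Hypothetical blog-module migration: it creates only the module's own
// tables in Up and drops them again in Down, so the module can be both
// installed and uninstalled without touching core or other modules.
public class AddBlogModule : DbMigration
{
    public override void Up()
    {
        CreateTable(
            "BlogPosts",
            c => new
            {
                Id = c.Int(nullable: false, identity: true),
                Title = c.String(maxLength: 200),
                Body = c.String(),
            })
            .PrimaryKey(t => t.Id);
    }

    public override void Down()
    {
        DropTable("BlogPosts");
    }
}
```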
The problem with migrations is that you cannot enforce those musts and mustn'ts, so if you develop a platform where people can add custom modules, you never know whether they will break your core.
Since there is no answer that focuses on the problem the way I put it, I am posting an answer with what seems to be the best workaround at this moment.
For full support of migrations, even custom migrations, and full support for code-first design in general, the best method is to import the source code and compile it directly.
We are using a local NuGet feed in order to sync multiple sub-modules freely and swiftly. This also gives a good update experience, since migrations can easily be created or imported/integrated when needed.
What about this scenario: one DbContext with some entities which, in OnModelCreating, looks up additional classes in external assemblies that inherit from base classes in the assembly where the DbContext lives. I want to be able to update the already-created database according to these classes, assuming they don't change base tables, only possibly add new ones. Is this possible? So far, in my experience with MigrateDatabaseToLatestVersion, it simply ignores the new entities; that is, it does not generate any new tables.
Related
I'm using the Entity Framework Power Tools "Reverse Engineer Code First" feature to generate my POCO classes, mapping files, and context from the database. I would also like this process to create my base partial Validation classes for each entity. I am fine with writing the T4 template, but is there a way I can shoehorn that into the process when I run Reverse Engineer Code First?
I may be misunderstanding your question, but if not, one way to do it is to add the code to Entity.tt and have your class files include both the POCO class and your validation class. It's generated code that you shouldn't have to reference much, if at all, outside of IntelliSense in the calling code.
Alternatively, maybe you could add a new validation-class-generating .tt file to the ReverseEngineerCodeFirst folder, but I haven't tried it, and it wouldn't surprise me if running Reverse Engineer Code First did not actually run it.
I recently customized the Entity Framework PowerTools to produce an interface for the model context. For this purpose, I added an Interface.tt template. You can review my changes at https://entityframework.codeplex.com/SourceControl/network/forks/khawajaumarfarooq/PowerToolEnhancements.
The source code does have to be modified to add extra templates to be processed when reverse engineering POCO classes.
The source code would also have to be modified if you want to generate additional files and have them be included in the project programmatically, as opposed to adding them in yourself after code generation is complete.
We are currently looking at the newest version (2.60) of NopCommerce in MVC, and we will be integrating it pretty soon. We've downloaded the source code and paid the $20 for the User Guide documentation. The documentation is great, in the sense that it explains how to deploy, install, and work with the UI front end and back end. This is great for an overall overview, but what it lacks is an explanation of how to work with NopCommerce as a team. What are the best practices, etc.?
As an example (or parallel), if you decide to work with Dotnetnuke as a team, you usually work in the following fashion:
Each developer downloads/installs Dotnetnuke locally on their machine.
You also download/install Dotnetnuke on a dedicated server (let's say dev-server).
As a developer, you work and create modules which you test locally within your Dotnetnuke installation.
Once a module is done, you package it (and any SQL scripts that come with it) into a zip file.
Once the package is ready, you upload/install that package on the dedicated server (dev-server).
This approach works great for Dotnetnuke and more importantly if you have a team of developers creating modules.
My question is how does a team work with NopCommerce MVC?
I'm assuming it is a bad idea to work directly within the source code, in case your team decides to modify core elements, which would make any upgrade to newer versions impossible (or introduce breaking changes).
I'm not sure if my parallel to Dotnetnuke is a correct one, but would anyone have any idea (or help me clarify) how a team works with NopCommerce MVC?
In addition, should the team only rely on creating plugins for NopCommerce and stay away from modifying the core or should this be irrelevant?
What about adding new objects in SQL (or modifying existing ones): should we prefix our objects in case an eventual NopCommerce MVC upgrade creates similar objects and/or overwrites them?
Thank you for helping me shed some light on this.
Sincerely
Vince
Plugins in NopCommerce are almost like modules in DNN. Depending on what you need to do, it sometimes is necessary to modify the core code.
What I've been doing for the services is to create a new class that inherits from the existing service, then override the functions I want to change. Create a new DependencyRegistrar class and set your new service classes as the implementation for the particular interfaces. Also make sure the Order property is 1, so that your DependencyRegistrar class is loaded after the stock one. Since you're inheriting from the core class, any functions you didn't override will be handled by the parent class. If I need to add a new function, I just modify the interface, put a stub in the stock class, and implement it in my own.
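A rough sketch of that pattern (class names are illustrative, and the interfaces shown here should be checked against your NopCommerce version):

```csharp
// Sketch only: the real ProductService constructor takes several
// dependencies, which a derived class must pass through to base(...).
public class MyProductService : ProductService
{
    // ... constructor omitted for brevity ...

    public override Product GetProductById(int productId)
    {
        // custom behaviour here, or fall back to the stock implementation:
        return base.GetProductById(productId);
    }
}

// Registers the derived service as the implementation of IProductService.
public class MyDependencyRegistrar : IDependencyRegistrar
{
    public void Register(ContainerBuilder builder, ITypeFinder typeFinder)
    {
        builder.RegisterType<MyProductService>().As<IProductService>();
    }

    // Load after the stock registrar (Order 0) so this registration wins.
    public int Order { get { return 1; } }
}
```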
Views in the Nop.Web project can be overridden by Themes. The Admin stuff and the Web Controllers get trickier. I'm just modifying those files directly.
The Core and Data classes can be done using partial classes to add your new fields.
In any case you will still need to merge changes with your solution when an update is released. My opinion is that you are better off writing clean, readable code now and bite the merge bullet when it comes.
I don't really worry about SQL scripts right now because I'm a single developer, but maybe you could add a folder for ALTER scripts and name them after the day they were created. Then each dev knows which scripts to run when they get latest.
I know that this information is available somewhere but I obviously don't know how to search for it using the right keywords.
I have downloaded the NerdDinner code and also have the e-book. I have followed the example in the book as well though I have not completed it yet. But my question is really very simple.
I want to follow a repository design pattern, and I keep seeing "Visual Studio automatically generates .NET classes that represent the models and database relationships defined using the Entity Framework Designer. An ObjectContext class is also generated for each Entity Framework Designer file added to the solution." in some phrase or another. But when I create an Entity Framework project, a .designer.cs file is created that basically has all the class entities contained in it, which confirms the second portion of the statement. However, I don't automagically get separate class files generated for those entities.
How do I get that? I know I could comb through the designer file and pull out the class declarations for each entity, creating a separate file for each of them, but that seems like a tedious way to do it. So what is the right way?
Is there a tool or some documentation that I can refer for the proper way to create separate Entity Class files?
It's been a while since I last used Entity Framework, but:
If you use the designer to create your model, you can use partial classes in separate files to extend or build upon it.
There is/was an extension for Visual Studio that will generate POCO classes based on your EF designer model. You can use it as a one-off tool and continue working with the resulting classes afterwards, though you may need to keep fixing your mappings from that point on. Not sure if this POCO template is still current; search for "Entity Framework" and "POCO".
MS has been working on better code-first support, my preferred way of working. I haven't looked into it yet; I assume it will try to auto-generate the mapping and database based on your classes/entity definitions.
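To illustrate the partial-class point above: the designer emits its entities as partial classes, so your hand-written members can live in a separate file that regeneration never touches (Customer, FirstName, and LastName are illustrative names):

```csharp
// Generated by the designer (do not edit):
// public partial class Customer
// {
//     public string FirstName { get; set; }
//     public string LastName { get; set; }
// }

// Your own file, e.g. Customer.Extensions.cs, which survives regeneration:
public partial class Customer
{
    public string FullName
    {
        get { return FirstName + " " + LastName; }
    }
}
```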
My application has at least two projects, OrganizationManagement and ContactManagement. Previously, I referenced ContactManagement because I needed to work with a class located in OrganizationManagement. Now I need to do the reverse, but I'm getting the following error: "Cannot reference OrganizationManagement ... to avoid circular reference."
I guess the error makes sense. But how can I avoid it? I mean, I only need those classes to transfer data from one project to another.
To solve the problem, I copied the class to a folder in the other project and tried this:
var contact = (Contact)TempData["contact"];
Now I'm getting this error: "Cannot implicitly convert ContactManagement.Contact to OrganizationManagement.Contact... an explicit conversion exists..."
Thanks for helping
It may not be as bad as you think - a project can always refer to its own classes without needing an explicit reference.
However, when this kind of structural problem crops up, it's usually telling you there's a flaw in the overall design.
I'd guess that there's an implicit third project that you haven't defined yet, that your Organization and Contact projects will each need a reference to. Once you've moved the class in question into the new project, create a reference to it in each of your existing ones, and you'll be all set.
Of course, this may necessitate other structural changes - sorting out this kind of problem can turn out to be a real can of worms.
Bottom line: circular references usually indicate that there's a bit more thought needed to work out what the dependencies in your object model really are.
Maybe you should refactor all the common domain-related classes to their own new project that can then be referenced by the other projects.
On the other hand, I could suggest a very bad workaround for the error you encounter: defining an explicit conversion between the structurally identical classes that only differ in name. But please don't do that. Put all the domain stuff (i.e., entity-related classes like Contact, Person, Organization, Customer, etc.) in its own project and reference that project from the projects that need the classes.
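Sketched out, that layout could look like this (project and class names are hypothetical): the entity lives once in a shared Domain project, and both management projects reference Domain rather than each other, so no conversion is ever needed:

```csharp
// Domain project: owns the shared entities and references nothing else.
namespace Domain
{
    public class Contact
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }
}

// Both OrganizationManagement and ContactManagement reference Domain and
// pass Domain.Contact around directly:
// var contact = (Domain.Contact)TempData["contact"];
```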
Compile both projects and copy the assemblies to a "libs" folder. Don't reference the projects, but the compiled assemblies.
This works for me. I have one project, Shop, with a back end, ShopBackend. Then I have a project MarketPlace with a back end, MarketPlaceBackend. The two main projects don't have a lot in common; MarketPlace is a very small application.
ShopBackend has a class Order that accesses the Shop database. In MarketPlace I use the ShopBackend assembly to get a list of orders.
On the other hand, in Shop I need a list of the participants of MarketPlace. That's why I call the MarketPlace assembly there.
Yes, this design hurts sometimes: it is hard to get the versions right if the signatures of methods change. But in my case the applications are really separate, and they have separate databases.
What I want to say is: a circular reference can easily be a design issue, but it doesn't have to be. Imagine that both applications came from different companies. I guess it would be OK if microsoft.dll used some methods of google.dll and google.dll used methods of microsoft.dll.
When building ASP.NET projects there is a certain amount of boilerplate, or plumbing that needs to be done, which is often identical across projects. This is especially the case with MVC and ALT.NET approaches. [I'm thinking of things such as: IoC, ORM, Solution structure (projects), Session Management, User Management, I18n etc.]
I would like to know what approach you find best for 'reusing' this plumbing across projects?
Have a 'master solution' which you duplicate and rename somehow? (I'm using this to a degree at the moment, but it's fairly messy. I'd be interested in how people do this better.)
Mainly rely on Shared Library projects? (I find this appropriate for some things, but too restrictive for things that have to be customised)
Code generation tools, such as T4? (Similar to the approach used by SharpArchitecture - have not tried this myself)
Something else?
Visual Studio supports Custom Templates.
I definitely (mostly!) go for T4 templates in conjunction with a modified version of SubSonic 3. I kind of use the database to model my domain, and then use the T4 templates to generate the model and associated controllers and views. It takes about 50-60% of the effort out and keeps things consistent.
I then work on overrides (partials) of the classes along with filters and extension methods to 'make the app'. Now that I'm familiar with the environment and what I'm doing, I can have a basic model with good plumbing in place in a very short space of time. More importantly, because I create a set of partial class files, I can regenerate all I want without losing any of my 'custom' coding.
It works for me anyway :)
You could do it the bearded, t-shirted, agile style and create a nice template and put it in sourcecontrol. So when you need a new project, you just checkout the template?
For insanely fast MVC site setup, I use modified T4 templates (created with T4 Editor), with a lot of help from Oleg Sych's blogs for page generation (for your typical add/edit/index pages), combined with an awesome implementation of automated create-update-delete called MVCCrud (if LINQ-to-SQL is your preferred data access method).
Using modified T4 templates and MVCCrud, you can create fully functional entities (Create/Edit/List/Delete) with error handling and intuitive error messages in about four minutes each.
I create a new project using the new project wizard so that I get unique project GUIDs assigned. Then I would use "Add Existing Item" to copy items from similar projects if it made sense to do so.
I sometimes use a file diff tool to copy references from one project to another, otherwise I just add the references by hand. A file diff tool can also be used to include similar source files, but the underlying files have to be copied anyway, so I prefer "Add Existing Item".
I've used T4 to generate solution and project files, but that definitely seems like an edge case and not something that would normally be necessary. In that case, I'd probably wrap the T4 in a PowerShell like script to create and populate the rest of the directory structure.
I use "shared libraries" pretty aggressively in general, but not specifically due to this scenario.
In general, I don't find myself reusing plumbing between projects much. It's probably more often that I hack away in one "prototype" project, then abandon it, and rebuild the project from scratch following the above approach and only bring over the "non-hacky" code.
I'm creating a MVC2 application template at http://erictopia.com. It will contain all the basic items I think should be in a MVC project. These include BDD specifications, an ORM (NHibernate and possibly Lightspeed), T4 templates, custom providers, ELMAH support, CSS/Javascript minifier, etc.