How to change schema name on runtime for EntityManager - entity-framework-6

Our application connects to Oracle.
It needs to change the schema name at runtime; otherwise connecting to a different schema throws an exception.
Our current workaround is to remove Schema="XXX" from the .edmx file before release.
Do you have any better idea?

DevForce itself doesn't provide anything to change the schema at runtime. You might be able to implement an EF IDbConnectionInterceptor to do this, although I haven't tried it. Take a look at the docs for that interface, as well as some older tricks others have used with EF to change the Oracle schema at runtime.
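A minimal sketch of that interceptor idea, assuming EF6's interception API and the classic Oracle ALTER SESSION SET CURRENT_SCHEMA trick; only Opened() does any work, the rest of the large interface is no-ops, and the schema name is a placeholder:

using System.Data;
using System.Data.Common;
using System.Data.Entity.Infrastructure.Interception;

// Sketch: switch the Oracle schema on every connection EF opens.
public class OracleSchemaInterceptor : IDbConnectionInterceptor
{
    private readonly string _schema;
    public OracleSchemaInterceptor(string schema) { _schema = schema; }

    public void Opened(DbConnection connection, DbConnectionInterceptionContext interceptionContext)
    {
        using (var cmd = connection.CreateCommand())
        {
            // Assumes the schema name is trusted; never concatenate user input here.
            cmd.CommandText = "ALTER SESSION SET CURRENT_SCHEMA = " + _schema;
            cmd.ExecuteNonQuery();
        }
    }

    // The remaining interface members are intentionally no-ops.
    public void Opening(DbConnection c, DbConnectionInterceptionContext ic) { }
    public void Closing(DbConnection c, DbConnectionInterceptionContext ic) { }
    public void Closed(DbConnection c, DbConnectionInterceptionContext ic) { }
    public void BeginningTransaction(DbConnection c, BeginTransactionInterceptionContext ic) { }
    public void BeganTransaction(DbConnection c, BeginTransactionInterceptionContext ic) { }
    public void ConnectionStringGetting(DbConnection c, DbConnectionInterceptionContext<string> ic) { }
    public void ConnectionStringGot(DbConnection c, DbConnectionInterceptionContext<string> ic) { }
    public void ConnectionStringSetting(DbConnection c, DbConnectionPropertyInterceptionContext<string> ic) { }
    public void ConnectionStringSet(DbConnection c, DbConnectionPropertyInterceptionContext<string> ic) { }
    public void ConnectionTimeoutGetting(DbConnection c, DbConnectionInterceptionContext<int> ic) { }
    public void ConnectionTimeoutGot(DbConnection c, DbConnectionInterceptionContext<int> ic) { }
    public void DatabaseGetting(DbConnection c, DbConnectionInterceptionContext<string> ic) { }
    public void DatabaseGot(DbConnection c, DbConnectionInterceptionContext<string> ic) { }
    public void DataSourceGetting(DbConnection c, DbConnectionInterceptionContext<string> ic) { }
    public void DataSourceGot(DbConnection c, DbConnectionInterceptionContext<string> ic) { }
    public void Disposing(DbConnection c, DbConnectionInterceptionContext ic) { }
    public void Disposed(DbConnection c, DbConnectionInterceptionContext ic) { }
    public void EnlistingTransaction(DbConnection c, EnlistTransactionInterceptionContext ic) { }
    public void EnlistedTransaction(DbConnection c, EnlistTransactionInterceptionContext ic) { }
    public void ServerVersionGetting(DbConnection c, DbConnectionInterceptionContext<string> ic) { }
    public void ServerVersionGot(DbConnection c, DbConnectionInterceptionContext<string> ic) { }
    public void StateGetting(DbConnection c, DbConnectionInterceptionContext<ConnectionState> ic) { }
    public void StateGot(DbConnection c, DbConnectionInterceptionContext<ConnectionState> ic) { }
}

// Registration, e.g. at application start-up:
// DbInterception.Add(new OracleSchemaInterceptor("XXX"));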

Related

VisualStudio 2013 persisting to mystery db during project run

Environment:
Windows 8.1
Visual Studio 2013
Project type: ASP.NET MVC
Debugging in IIS Express.
DotNet: 4.5
Database: SQLExpress 2012
EntityFramework 5
When I run my solution (F5), in Debug or Release configuration, I can manipulate data through EF with no issues; data changes persist between views. If I query the database in Management Studio however, none of the updates are reflected in it. If I update a record in Management Studio, the changes aren't reflected in my running solution either.
If I then stop and restart, or even just stop and do a build (Ctrl+Shift+B) in VS, my data in the web application all reverts back to the state matching that of my database in Management Studio.
If I add a trace to the database, I can see reads, but no writes coming through to the db. Additionally, if I stop the SQLExpress service, my pages throw "SQL Server service has been paused/stopped" exceptions. So bizarrely enough, it looks like it's reading from the correct database, but may be writing to a development cache somewhere?
This leads me to think that on every build, a copy of the db is being used for that debug/run session's state.
So the question then becomes, where is this being set, and where is the temp db living? I have scoured my web.config, web.debug.config, web.release.config, but there is no reference to an alternate database.
I have looked in the /App_Data and /bin folders, but there's no extra database there either. I even resorted to watching the filesystem using procmon for any file operations performed by VS during a build, but I couldn't find anything of note (there is tons of data, so I may have missed something).
I have a couple of debug statements spitting out the connection string being used by EF, and can confirm that it's pointing to the correct SQLExpress instance.
System.Diagnostics.Debug.WriteLine("Conn String: " + ctx.Database.Connection.ConnectionString);
The only other possibility is that EF is suddenly holding a large cache. I doubt this though as I trace the DB frequently and updates generally happen immediately.
This behaviour is relatively new, but I don't know exactly when it started. The only significant change was the VS upgrade from 2012 to 2013, but I can't be sure it correlates with the upgrade.
Anyway, I'm now at the end of my tether, and would love any suggestions that I could follow.
OK, I figured it out. So for anyone else having similar issues, it relates to synchronising EF contexts.
I was declaring my classes with a static context reference to save having to declare it in every method thus:
public class MyClass : Controller
{
private static MyContext db = new MyContext();
...
}
Being static, as you would expect, it's evaluated at start-up, and held in memory.
Add to that the fact that I was changing properties on objects retrieved from the static context but saving them through a different context (helper function confusion), and you get the confused state I was seeing.
So the moral of the story:
Don't use static context references. Declare them as you need them.
Double check that you're retrieving and updating to the same context.
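For example, a minimal sketch of the per-instance pattern (MyContext as above; the entity set and action names are made up):

public class MyClass : Controller
{
    // Not static: MVC creates a new controller instance per request, so this
    // context lives for exactly one request.
    private readonly MyContext db = new MyContext();

    public ActionResult Edit(int id, string newName)
    {
        // Retrieve and update through the SAME context instance.
        var entity = db.Widgets.Find(id);
        entity.Name = newName;
        db.SaveChanges();
        return RedirectToAction("Index");
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing) db.Dispose();
        base.Dispose(disposing);
    }
}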
Visual Studio installs SQL Express, and normally this is what Code First uses. You can use Management Studio to connect to the Express instance. It's also possible that it's using LocalDB, which is the default for VS 2013 and was still optional in 2012.

How to work with NopCommerce MVC as a team

We are currently looking at the newest version (2.60) of NopCommerce in MVC and we will be integrating it pretty soon. We've downloaded the source code and paid the $20 for the User Guide documentation. The documentation is great! I mean, it is great in the sense that it explains how to deploy, install, and work with the UI frontend and backend. That gives a good overall overview, but what it lacks is an explanation of how to work with NopCommerce as a team, and what the best practices are.
As an example (or parallel), if you decide to work with Dotnetnuke as a team, you usually work in the following fashion:
Each developer downloads/installs Dotnetnuke locally on their machine.
You also download/install Dotnetnuke on a dedicated server (let's say dev-server).
As a developer, you work and create modules which you test locally within your Dotnetnuke installation.
Once a module is done, you package it (and any SQL scripts that come with it) into a zip file.
Once the package is ready, you upload/install that package on the dedicated server (dev-server).
This approach works great for Dotnetnuke and more importantly if you have a team of developers creating modules.
My question is how does a team work with NopCommerce MVC?
I'm assuming it is a bad idea to work directly within the source code, in case your team decides to modify core elements/source, which would make any upgrade to newer versions impossible (or introduce breaking changes).
I'm not sure if my parallel to Dotnetnuke is a correct one, but would anyone have any idea (or help me clarify) how a team works with NopCommerce MVC?
In addition, should the team only rely on creating plugins for NopCommerce and stay away from modifying the core or should this be irrelevant?
What about adding new objects in SQL (or modifying existing ones)? Should we prefix our objects in case an eventual NopCommerce MVC upgrade creates similar objects and/or overwrites them?
Thank you for helping me shed some light on this.
Sincerely
Vince
Plugins in NopCommerce are almost like modules in DNN. Depending on what you need to do, it sometimes is necessary to modify the core code.
What I've been doing for the services is to create a new class that inherits from the existing service, then override the function I want to change. Create a new DependencyRegistrar class and set your new service classes as the implementation for that particular interface. Also make sure the Order property is 1 so that your registrar is loaded after the stock one. Since you're inheriting from the core class, any functions you didn't override are handled by the parent class. If I need to add a new function, I just modify the interface, put a stub in the stock class, and implement it in my own class (see the sketch below).
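As a sketch of that service-override pattern, assuming NopCommerce 2.x's Autofac-based IDependencyRegistrar (Register(ContainerBuilder, ITypeFinder) plus an Order property); the Greeting types below are made-up stand-ins for a real Nop service pair such as IProductService/ProductService:

using Autofac;
using Nop.Core.Infrastructure;
using Nop.Core.Infrastructure.DependencyManagement;

// Made-up stand-in for a stock Nop service pair.
public interface IGreetingService { string Greet(string name); }
public class GreetingService : IGreetingService
{
    public virtual string Greet(string name) { return "Hello " + name; }
}

// Inherit from the stock class and override only what changes; anything
// not overridden falls through to the parent implementation.
public class CustomGreetingService : GreetingService
{
    public override string Greet(string name) { return "Hi " + name; }
}

public class CustomDependencyRegistrar : IDependencyRegistrar
{
    public void Register(ContainerBuilder builder, ITypeFinder typeFinder)
    {
        // Re-bind the interface to the derived class; because this registrar
        // runs after the stock one, the last registration wins in Autofac.
        builder.RegisterType<CustomGreetingService>()
               .As<IGreetingService>()
               .InstancePerLifetimeScope();
    }

    // Order 1 loads this registrar after the stock registrar (Order 0).
    public int Order { get { return 1; } }
}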
Views in the Nop.Web project can be overridden by Themes. The Admin stuff and the Web Controllers get trickier. I'm just modifying those files directly.
The Core and Data classes can be extended using partial classes to add your new fields.
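A sketch of that partial-class approach; the new file must live inside the same project (partial classes only combine within one assembly), and Product is used purely as an illustrative Nop entity:

// New file added to the Nop.Core project, in the stock entity's namespace.
namespace Nop.Core.Domain.Catalog
{
    public partial class Product
    {
        // Hypothetical extra field; EF Code First will map it as a new column.
        public string InternalNotes { get; set; }
    }
}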
In any case you will still need to merge changes into your solution when an update is released. My opinion is that you are better off writing clean, readable code now and biting the merge bullet when the time comes.
I don't really worry about SQL scripts right now because I'm a single developer, but you could add a folder for ALTER scripts and name them after the date they were created. Then each dev knows which scripts they need to run when they get latest.

Entity Framework not working when in separate project from MVC3 web application project

I have an Entity Framework project and a repository class in a separate project from my MVC3 web application. I have established a reference in my MVC project to the Entity Framework data project so I can instantiate an instance of the repository and call its methods. However I get the error:
The specified named connection is either not found in the configuration, not intended to be used with the EntityClient provider, or not valid.
I've run into this before and I believe the solution is to include the connection string from the entity framework app.config file in the MVC web.config file.
This doesn't rest well with me. It feels like there should be another way that would make the projects less tightly coupled. Am I dreaming, or is there a better practice that would allow me to just make calls to the referenced DLL and be done with it?
Thanks
The app.config file that is included in the DLL of your Entity Framework project contains a Connection String that is used by the EDMX designer to find the target database when running an 'Update Model from Database' command.
When deploying your application, the only configuration file that is known is the web.config. The app.config file from your EF dll is not used in production.
So in your web.config you include the connection string that is used when you are running your MVC application. When using transformations you can also specify different connection strings for different deployment scenarios (test and production for example).
So it's not like you are introducing some sort of coupling. You are just using the configuration methods that .NET offers you.
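For reference, a sketch of what that named EF connection string actually encodes, built with EntityConnectionStringBuilder (namespace System.Data.EntityClient in EF 4/5); MyModel and MyDb are placeholders:

using System.Data.EntityClient;

static class ConnectionStringDemo
{
    // Produces the same value you would put in Web.config under
    // <connectionStrings> with providerName="System.Data.EntityClient".
    public static string Build()
    {
        var builder = new EntityConnectionStringBuilder
        {
            // The csdl/ssdl/msl model resources embedded in the EF project's DLL.
            Metadata = "res://*/MyModel.csdl|res://*/MyModel.ssdl|res://*/MyModel.msl",
            Provider = "System.Data.SqlClient",
            ProviderConnectionString = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True"
        };
        return builder.ToString();
    }
}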
There are ways (hard-coding the connection string in your repository and using it when you create the context comes to mind), but you most certainly don't want to use them. The right way to handle it is through the configuration file. You really don't want it to use the configuration file from the DLL, since that would give you less control over which connection string you're using. It would make it harder, rather than easier, to have different connection strings for integration testing, staging, and production. While it's possible to combine the approaches (a fixed connection string that can be overridden by a configuration setting), having used both, my preference is for a completely configuration-driven approach. I like the single convention, and the one-time step of updating the Web.config (and any transforms) with the correct configuration setting seems a small price to pay for the simple convention of always using the configuration.
I don't understand how putting a connection string in the MVC project's config file makes it "tightly coupled". The config files themselves are a source for loose coupling. You can always change connection strings using config transforms, meaning you can switch the connection string just by choosing a different solution configuration.

EF code first modular design

Using EF4 code first you can create and compile classes and a DbContext. What happens when you want to add some classes/tables and relationships to an already compiled DLL of a model set?
So far the solutions I have come up with are using "partial" classes that would be complemented later on, or writing a whole new DbContext that includes or extends the first one; but the latter would mean an additional db connection per module (per DbContext). Any ideas about this? What's the best practice? Also, I need to be able to work with migrations.
More explicitly, a possible scenario is as follows:
A) You create a .dll with some dbContextBase class and tables (classes) inside it.
B) You create other .dlls that depend on/extend dbContextBase in their own way.
C) You reference said .dlls in a project and extend them.
So basically you can have a core dbContext, then add a menu module to it, then add a blog module to it (which the menu module can see, in order to create latest-blog-posts menus etc.). On top of that, if you want a specific one-time feature for the blog, you can quickly integrate that, but also keep your blog module updateable.
As I begin to see it, the best way to do that is NuGet packages with the source code for the models (and the like) per module, instead of a compiled DLL.
You can build some infrastructure in your core assemblies which will discover entities in your modules and register them into a single context. Each entity must have a class derived from EntityTypeConfiguration<> (or ComplexTypeConfiguration<> for complex types) which describes the mapping.
Once you have the mapping classes you can either use some module interface to collect them for every module, or use reflection to browse the assemblies and create instances of the mapping classes. These classes can then be passed to DbModelBuilder in OnModelCreating, as sketched below.
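A minimal sketch of the reflection approach, assuming EF 4.1+ code first; how module assemblies are found (here: everything already loaded in the AppDomain) is a placeholder for your own discovery mechanism:

using System;
using System.Data.Entity;
using System.Data.Entity.ModelConfiguration;
using System.Linq;

public class ModularContext : DbContext
{
    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Placeholder discovery: scan every loaded assembly for mapping classes.
        var configTypes = AppDomain.CurrentDomain.GetAssemblies()
            .SelectMany(a => a.GetTypes())
            .Where(t => !t.IsAbstract
                        && t.BaseType != null
                        && t.BaseType.IsGenericType
                        && t.BaseType.GetGenericTypeDefinition()
                           == typeof(EntityTypeConfiguration<>));

        foreach (var type in configTypes)
        {
            // dynamic lets the runtime pick the right generic Add overload.
            dynamic configuration = Activator.CreateInstance(type);
            modelBuilder.Configurations.Add(configuration);
        }
    }
}

Note that EF6 later added modelBuilder.Configurations.AddFromAssembly(assembly), which does this scan for you.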
Also I need to be able to work with migrations.
I'm not sure if migrations are ready for this, because it comes with some preconditions:
All shared tables must be handled by the core assemblies, in their own DbMigration-derived class (or classes, for new versions)
Every module must handle its own tables, in its own DbMigration-derived class (or classes, for new versions)
Modules mustn't alter shared tables
Modules mustn't alter or access tables of other modules
It means that you have a special migration set for the core and one migration set for every module. Every migration set is defined in a separate assembly, which can be a potential problem. I didn't try it myself, so I don't know if EF migrations can handle this. I especially mean scenarios where you really want modular systems whose modules can be added or removed over time, so you need both installation (the Up method) and uninstallation (the Down method); see the sketch below.
The problem with migrations is that you cannot enforce those musts and mustn'ts, so if you develop a platform where people can add custom modules, you never know whether they will break your core.
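To illustrate the per-module Up/Down idea, a hand-written sketch (table and class names are illustrative; note that Add-Migration normally also scaffolds IMigrationMetadata in a designer file, which is omitted here, so this compiles but isn't wired into a migrator as-is):

using System.Data.Entity.Migrations;

// A module-owned migration that touches only the module's own tables.
public class AddBlogModuleTables : DbMigration
{
    public override void Up()
    {
        CreateTable(
            "blog.Posts",
            c => new
            {
                Id = c.Int(nullable: false, identity: true),
                Title = c.String(maxLength: 200),
            })
            .PrimaryKey(t => t.Id);
    }

    // Down() enables clean uninstallation of the module.
    public override void Down()
    {
        DropTable("blog.Posts");
    }
}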
Since there is no answer that focuses on the problem the way I put it, I am posting an answer with what seems to be the best workaround at this moment.
For full support of migrations, even custom migrations, and full support of code-first design in general, the best method is to import the source code and compile it directly.
We are using a local NuGet feed in order to be able to sync multiple sub-modules freely and swiftly. This also leads to a good update experience, since the migrations can easily be created or imported/integrated when needed.
What about this scenario: one DbContext with some entities whose OnModelCreating looks up additional classes in external assemblies which inherit from base classes in the assembly where this DbContext lives. I want to be able to update the already created database according to these classes, assuming they don't change base tables and only possibly add new ones. Is this possible? So far, in my experience with MigrateDatabaseToLatestVersion, it simply ignores the new entities; that is, it does not generate any new tables.

no sql profiling for Entity Framework Code First

I've been struggling with this for a while now, and I see that I'm not the only one with the problem (see this and that).
I've managed to debug for a bit, and found a solution, though I'm pretty sure that this is not the 'right' way.
The first debug session (before the dev server was enabled) showed that the ProfiledDbConnectionFactory and ProfiledDbConnection classes provide the required data, but then AFTER the connection is created, the static Instance property on ProfiledDbProviderFactory is initialized (by calling the default constructor) and apparently CreateConnection() is run on that instance, resulting in a null reference exception (tail is null).
I've managed to solve this by running
ProfiledDbProviderFactory.Instance.InitProfiledDbProviderFactory(_profiler, ripInnerProvider(_conn));
at the end of ProfiledDbConnection(DbConnection connection, IDbProfiler profiler).
This allows me to view the sql profiling, but as I wrote, I have a feeling that this isn't the correct fix.
Here's the sample code I used.
Not sure if there is something wrong with my environment, or my code, as I have a feeling that this should work out of the box. Any comments/suggestions?
Sam?
There is nothing wrong with using ProfiledDbConnectionFactory; Entity Framework is designed to support persistence ignorance. I generally just use the debugger tools and create a breakpoint to view my generated SQL statements, as the Code First framework readily provides the SQL statement there.
I would stick to using profiled connections the way you described in your unit testing, as opposed to using them in production code, since the profiling may be a performance hit. You may also want to consider using the SQL Profiler included in SQL Server; there is also a SQL profiler available as a Visual Studio extension, though I am not sure if it supports Code First yet.
This problem has been resolved in version 1.9.1 of MiniProfiler.EF.
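For completeness, a sketch of the intended setup once you're on MiniProfiler.EF 1.9.1 or later; the Initialize entry point is taken from that package's documentation of the era, so verify it against the version you install:

using StackExchange.Profiling; // from the MiniProfiler.EF package

public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        // Hook EF Code First into MiniProfiler before any DbContext is created,
        // replacing the manual ProfiledDbProviderFactory workaround above.
        MiniProfilerEF.Initialize();
    }
}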
