Dependency Injection: How to inject when using a multi-project solution

I hope this question isn't too stupid; I'm trying to get a grip on more advanced programming principles and have therefore been getting used to Dependency Injection using Ninject.
So, my model is split up into several different .dll projects. One project defines the model specifications (interfaces) and a couple of others implement these interfaces. All model projects need to use some sort of database system, so they all need access to yet another .dll which implements all my database logic. It is important that all of them access the same instance of my database object, though, so it wouldn't suffice to just create one instance for each model.
I'm not quite sure how to achieve this using dependency injection, though. My first thought was to create a separate DI project and bind all interfaces to their respective implementations, so the DI project needed references to all the other projects (model interfaces & implementations, database system, etc.). Then again, the models would need access to the DI project since, for example, they'd need to request the database system from the DI system (Ninject). Of course this would create a circular reference (binding the DI project to the model and the model to the DI project), so it's not possible.
Long story short, I'd need a programming pattern that allows me to bind model interfaces to their implementations but that also allows the model implementations to request other dependencies from Ninject, e.g.
IModel1 -> Model1
IModel2 -> Model2 (different project)
IDatabase -> Database (different project)
Model1 -> request IDatabase -> get Database instance
Model2 -> request IDatabase -> get the same Database instance
I'd be glad to get a couple of suggestions; at the moment I'm stuck and out of ideas ;)
Thanks!

the models would need access to the DI project since, for example, they'd need to request the database system from the DI system (Ninject)
When you use dependency injection, the model shouldn't need to access the DI framework at all, since it is the DI framework that injects the dependencies. Model objects should not ask the DI container for anything. When your objects ask the container for their dependencies, that is not called Dependency Injection but Service Locator, and Service Locator is an anti-pattern.
My first thought was to create a separate DI project
When you have a single application (e.g. a web app), the usual thing to do is completely configure the DI container in the startup project, as close as possible to the application’s entry point. This entry point where all object graphs are composed is called the Composition Root.
All model projects need to use some sort of database system, so they all need access to yet another .dll which implements all my database logic
Try making your model/entity objects POCOs (plain old CLR objects), or at least ensure that those objects don't need to reference any other project; this makes your architecture (and testing) much easier.
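A minimal sketch of what that Composition Root could look like with Ninject, reusing the interface and class names from the question (the module and Program names are just for illustration): only the startup project references Ninject, and a singleton binding ensures every model implementation receives the same database instance.

using Ninject;
using Ninject.Modules;

// Composition Root: lives in the startup project, which is the only
// project that references Ninject and the implementation assemblies.
public class CompositionRoot : NinjectModule
{
    public override void Load()
    {
        // One shared Database instance for every consumer of IDatabase.
        Bind<IDatabase>().To<Database>().InSingletonScope();

        Bind<IModel1>().To<Model1>();
        Bind<IModel2>().To<Model2>();
    }
}

// A model implementation (in its own project) simply declares what it
// needs in its constructor; it never references Ninject itself.
public class Model1 : IModel1
{
    private readonly IDatabase database;

    public Model1(IDatabase database)
    {
        this.database = database;
    }
}

// Only the application's entry point ever touches the container.
public static class Program
{
    public static void Main()
    {
        var kernel = new StandardKernel(new CompositionRoot());
        var model1 = kernel.Get<IModel1>(); // gets the shared Database injected
        var model2 = kernel.Get<IModel2>(); // receives the same Database instance as model1
    }
}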

Use Ninject with the Register Resolve Release pattern from within a Composition Root.

The client application would use Ninject to inject the actual database and model implementations.
The client application therefore needs to reference the Database, IDatabase, Model and IModel projects.
The IDatabase and Database projects need to reference the Model project, as their methods will return model objects or collections of model objects. Have a look at the repository pattern (sketched below).
Your model doesn't need to reference any of your other projects.
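To make those reference directions concrete, here is a rough sketch; Customer, ICustomerRepository and the in-memory implementation are invented names, not taken from the question.

using System.Collections.Generic;
using System.Linq;

// Model project: a plain model object (POCO) with no project references.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// IDatabase project: references only the model project, because its
// methods return model objects or collections of them.
public interface ICustomerRepository
{
    Customer GetById(int id);
    IEnumerable<Customer> GetAll();
}

// Database project: the concrete implementation; only the client
// application's composition root needs a reference to it.
// (In-memory here purely to keep the sketch self-contained.)
public class InMemoryCustomerRepository : ICustomerRepository
{
    private readonly List<Customer> customers = new List<Customer>();

    public Customer GetById(int id)
    {
        return customers.SingleOrDefault(c => c.Id == id);
    }

    public IEnumerable<Customer> GetAll()
    {
        return customers;
    }
}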

Related

Autofac Container Independence and Relationship Types

We are currently using Autofac as our chosen IoC container. All application code in our reusable assemblies must be kept as clean as possible, so we do not want any direct dependencies on Autofac in our core application. The only place where Autofac is permitted is in the composition root / bootstrappers, where components are registered and wired up. Applications rely on dependency injection to create the required object graphs.
As we are keeping our core application container agnostic, it means that we cannot use the Autofac relationship types, such as Owned, in our core application.
I would like to create a factory that returns components that implement IDisposable. As Autofac tracks disposable objects, I believe I have to use a lifetime scope to create a defined unit of work in which components will be disposed once they go out of scope.
According to the Autofac documentation, this can be achieved by taking a dependency on Func<Owned<T>>; however, as stated above, I cannot take a dependency on Owned as it is an Autofac type. At the bottom of this page, it says:
The custom relationship types in Autofac don’t force you to bind your application more tightly to Autofac. They give you a programming model for container configuration that is consistent with the way you write other components (vs. having to know a lot of specific container extension points and APIs that also potentially centralise your configuration.)
For example, you can still create a custom ITaskFactory in your core model, but provide an AutofacTaskFactory implementation based on Func<Owned<T>> if that is desirable.
It is this implementation of ITaskFactory that I believe I need to implement, but I cannot find any examples.
I would be very grateful if someone could provide such an example.
Probably the best "real-world" example of this is the Autofac MVC integration mechanism. While it doesn't use Func<Owned<T>> under the covers, it does show how you can implement a non-Autofac-specific abstraction that talks to Autofac behind the scenes.
In the MVC case, the System.Web.Mvc.IDependencyResolver is the interface and the Autofac.Integration.Mvc.AutofacDependencyResolver is the implementation. When ASP.NET MVC requests a service, it gets it from System.Web.Mvc.DependencyResolver.Current, which returns an IDependencyResolver. At app startup, that singleton gets set to the Autofac implementation.
The same principle could hold for your custom factory. While IDependencyResolver is not specific to the type it returns (it's just GetService<T>()), you could write a type-specific factory interface just as easily.
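For what it's worth, here is a sketch of the ITaskFactory / AutofacTaskFactory pair that the documentation hints at. ITask, its Execute method and the class names are assumptions made for the example; Func<Owned<T>> and Owned<T> are the actual Autofac relationship types. In a real solution the first two interfaces would live in the core project and the factory class in the composition root.

using System;
using Autofac.Features.OwnedInstances;

// Core project: container-agnostic abstractions, no Autofac reference.
public interface ITask
{
    void Execute();
}

public interface ITaskFactory
{
    // Runs a task inside its own unit of work; disposable dependencies
    // are released as soon as the task has finished.
    void ExecuteTask();
}

// Composition root / bootstrapper project: the only place that knows
// about Autofac and its Owned<T> relationship type.
public class AutofacTaskFactory : ITaskFactory
{
    private readonly Func<Owned<ITask>> taskFactory;

    public AutofacTaskFactory(Func<Owned<ITask>> taskFactory)
    {
        this.taskFactory = taskFactory;
    }

    public void ExecuteTask()
    {
        // Each call creates a nested lifetime scope; disposing the
        // Owned<ITask> disposes the task and everything it owns.
        using (Owned<ITask> ownedTask = taskFactory())
        {
            ownedTask.Value.Execute();
        }
    }
}

You would then register AutofacTaskFactory as the ITaskFactory implementation in the composition root, so the core code only ever sees ITaskFactory.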

Dependency Injection - Passing dependencies all the way down

I have an MVC application with a typical architecture...
ASP.NET MVC Controller -> Person Service -> Person Repository -> Entity Framework DB Context
I am using Castle Windsor and I can see the benefit of using this along with a ControllerFactory to create controllers with the right dependencies. Using this approach the Controller gets a Service injected, which in turn knows how to construct the right Repository, which in turn knows the correct DbContext to use.
The Windsor config is something like this...
dicontainer = new WindsorContainer();
dicontainer.Register(Component.For<IPersonService>().ImplementedBy<PersonService>());
dicontainer.Register(
    Component.For<IPersonRepository>().UsingFactoryMethod(
        () => new PersonRepository(new HrContext("connectionString"))));
Is this the right way to do it? I don't like the UsingFactoryMethod, but can't think of another way.
Also, what if the Repository needed a dependency (say ILogger) that was not needed by the service layer? Does this mean I have to pass the ILogger into the service layer and not use it? This seems like a poor design. I'd appreciate some pointers here. I have read loads of articles, but not found a concrete example to verify whether I am doing this right. Thanks.
I try to avoid using factory methods (as you mentioned, you felt this smelled funny). To avoid them, you could create a database session object that creates a new DbContext. Your repositories then just need to get an instance of IDbSession and use its DbContext property. You can also easily control the scope of the IDbSession object (don't make it a singleton, because DbContext is not thread safe).
I wanted to make that point so that I could make this more important one: make your constructors take in only objects that are registered in the DI container (no options or configuration values in constructors). Options and configuration should be read/written by classes whose sole purpose is to read/write those values. If all classes follow this model, your DI registration becomes easy and classes can just add whatever dependencies they need to their constructors.
If you are trying to use a third-party library that has options in its constructors, wrap that class in your own class that has an easy-to-consume constructor and uses a configuration class to read the values needed to pass to the third-party library. This design also introduces a layer of abstraction between your code and the third-party library, which can then be used to more easily swap (or stub) third-party libraries when necessary.
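Here is a rough sketch of that idea; IDbSession, DbSession, IConnectionStringProvider, the installer and the "Hr" connection string name are invented for the example, while HrContext and the repository/service types come from the question.

using System;
using System.Configuration;
using Castle.MicroKernel.Registration;
using Castle.MicroKernel.SubSystems.Configuration;
using Castle.Windsor;

// Wraps the EF context so repositories never create it themselves.
public interface IDbSession : IDisposable
{
    HrContext DbContext { get; }
}

// Configuration is read by a class whose only job is to read it.
public interface IConnectionStringProvider
{
    string Hr { get; }
}

public class ConfigConnectionStringProvider : IConnectionStringProvider
{
    public string Hr
    {
        // "Hr" is an assumed key in web.config's <connectionStrings>.
        get { return ConfigurationManager.ConnectionStrings["Hr"].ConnectionString; }
    }
}

public class DbSession : IDbSession
{
    private readonly HrContext context;

    public DbSession(IConnectionStringProvider connectionStrings)
    {
        context = new HrContext(connectionStrings.Hr);
    }

    public HrContext DbContext
    {
        get { return context; }
    }

    public void Dispose()
    {
        context.Dispose();
    }
}

// Composition root: every constructor argument is itself a registered
// component, so no UsingFactoryMethod is needed.
public class PersistenceInstaller : IWindsorInstaller
{
    public void Install(IWindsorContainer container, IConfigurationStore store)
    {
        container.Register(
            Component.For<IConnectionStringProvider>().ImplementedBy<ConfigConnectionStringProvider>(),
            // One DbContext per request (requires Windsor's PerWebRequestLifestyleModule).
            Component.For<IDbSession>().ImplementedBy<DbSession>().LifestylePerWebRequest(),
            Component.For<IPersonRepository>().ImplementedBy<PersonRepository>(),
            Component.For<IPersonService>().ImplementedBy<PersonService>());
    }
}

With this in place the repositories take IDbSession in their constructors and use its DbContext property, and an ILogger needed only by a repository is simply declared in that repository's constructor; the service layer never has to know about it.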

Dependency Injection for models

I'm sure someone has asked this before, but I'm struggling to find where.
I'm using Ninject to remove dependencies from my controllers, along with a repository design pattern.
As I understand it, one of the benefits of this approach is that I can easily swap out my repositories and domain entities and use another assembly should I so wish. Consequently I've kept my domain entities and repositories in external assemblies and can mock all my dependencies from interfaces.
It seems that while I can use interfaces to refer to my domain entities in most places, I must use references to my concrete classes when it comes to model binding. I've read that this is to do with serialization, which I understand, but is creating separate models the only way to avoid referring to domain entities?
Anything I can do with Custom Model Binding?
A bit of background: I'm an experienced ASP.net developer, but new to MVC.
View models should be plain data containers with no logic and therefore shouldn't have any dependencies at all. Instead, inject the repositories into your controller and have it assign the required data from the repository to the appropriate properties of your view model.
The major advantage of using a dependency injection framework is IoC (Inversion of Control):
loose coupling
more flexibility
easier testing
So what one usually does is inject repositories through their interfaces, like this:
public class MyController : Controller
{
    private IPersonRepository personRepo;

    public MyController(IPersonRepository personRepo)
    {
        this.personRepo = personRepo;
    }

    ...
}
During testing this allows me to easily inject a mock repository which returns exactly the values I want to test.
Injecting domain entities doesn't make that much sense, as they are more tightly linked to the functionality of the specific class/controller, and abstracting them further would just be overhead rather than a benefit. Instead, if you want to decouple your actual entity model from the controller, you might take a look at the MVVM pattern and create specialized "ViewModels".
Just think in terms of testability of your controller: "What would I want to mock out to unit test it?"
DB accesses -> the repository
External dependencies -> other BL classes, WS calls etc.
I wouldn't include domain entities here, as they're normally just data containers.
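A small sketch of the view model approach described above; PersonViewModel, the GetById method and the Name/Age properties are made-up names for the example, while IPersonRepository mirrors the code above.

using System.Web.Mvc;

// The repository abstraction from above, shown with an assumed GetById.
public interface IPersonRepository
{
    Person GetById(int id);
}

// Domain entity (kept server-side).
public class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
}

// Plain data container for the view: no logic, no dependencies.
public class PersonViewModel
{
    public string Name { get; set; }
    public int Age { get; set; }
}

public class PersonController : Controller
{
    private readonly IPersonRepository personRepo;

    public PersonController(IPersonRepository personRepo)
    {
        this.personRepo = personRepo;
    }

    public ActionResult Details(int id)
    {
        // The domain entity never reaches the view; only the view model does.
        var person = personRepo.GetById(id);
        var model = new PersonViewModel { Name = person.Name, Age = person.Age };
        return View(model);
    }
}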
Some more details would help. A bit of code perhaps?
To start with, you should avoid injecting dependencies into domain entities, but rather use domain services.
Some more info here.
Edit 001:
I think we should clarify our terminology.
There is the domain layer with all your domain entities, e.g. product, category, etc.
Then there's the Data Layer with your repositories that hydrate your domain entities, and then you have a Service Layer with your application services that talk to the data layer.
Finally you have a presentation layer with your views and controllers. The Controllers talk to your Application Service Layer. So a Product Controller talks to a Catalogue Service (e.g. GetProductBySku). The CatalogueService will have one or more repositories injected into its constructor (IProductRepository, ICategoryRepository, etc.).
It's quite common in asp.net mvc to have ViewModels too. Put the ViewModels in your Application Service Layer.
So I'm not sure what you mean when you say "models" and "domain entities", but I hope that clears up the terminology.
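To put that layering into code, here is a rough sketch. CatalogueService and GetProductBySku mirror the names above; Product, ProductViewModel, IProductRepository and GetBySku are made-up details, and only a single repository is shown to keep it short.

using System.Web.Mvc;

// Domain layer: a domain entity.
public class Product
{
    public string Sku { get; set; }
    public string Name { get; set; }
}

// Data layer: a repository that hydrates domain entities.
public interface IProductRepository
{
    Product GetBySku(string sku);
}

// Application service layer: owns the view models and talks to the data layer.
public class ProductViewModel
{
    public string Sku { get; set; }
    public string Name { get; set; }
}

public interface ICatalogueService
{
    ProductViewModel GetProductBySku(string sku);
}

public class CatalogueService : ICatalogueService
{
    private readonly IProductRepository products;

    public CatalogueService(IProductRepository products)
    {
        this.products = products;
    }

    public ProductViewModel GetProductBySku(string sku)
    {
        var product = products.GetBySku(sku);                          // data layer hydrates the entity
        return new ProductViewModel { Sku = product.Sku, Name = product.Name }; // view model for the view
    }
}

// Presentation layer: the controller depends only on the service.
public class ProductController : Controller
{
    private readonly ICatalogueService catalogue;

    public ProductController(ICatalogueService catalogue)
    {
        this.catalogue = catalogue;
    }

    public ActionResult Details(string sku)
    {
        return View(catalogue.GetProductBySku(sku));
    }
}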

Advice on isolating my nhibernate layer such that I could swap it out with EF potentially

OK, it seems my project setup could use some improvements.
I currently have:
1. ASP.NET MVC3 Web project
2. NHibernate project with Repositories/Mappings and some session code.
3. Entities (models used in nhibernate like User.cs)
4. Interfaces (like IUser, IRepository<IUser>, IUserRepository...)
5. Common (UserService, ..)
Now the issue is that my NHibernate models need to implement IUser, which I don't like, but I was forced to do this since my IRepository is generic and I couldn't use IRepository<User>, because User is in another project; so I had to create an interface and use IRepository<IUser>.
I will never need another implementation of User, so this is bugging me.
How can I fix this while keeping things separate so I can swap out my ORM?
The IUser interface must be defined in the Entities layer if your entities implement it, not in the Interfaces layer. I would also probably rename this generic Interfaces layer to Repositories or AbstractRepositories or something, and rename the Common layer to Services if it contains services aggregating your repositories.
So the picture could be:
ASP.NET MVC3 Web project
NHibernate project with Repositories/Mappings and some session code.
Domain Entities (models used in nhibernate like User.cs and implementing domain interfaces like IUser)
Repositories (like IRepository<IUser>, IUserRepository...)
Services (UserService, ..)
I think you should approach this problem from a Domain-Driven Design perspective. The domain should be persistence-ignorant. A proper implementation of the DDD repository pattern is key here. The repository interface is specific and business-focused, not generic. The repository implementation encapsulates all the data access technicalities (ORM). Please take a look at this answer and these two articles:
How to write a repository
DDD: The Generic Repository
Your entities should be concrete types, not interfaces. Although you may never need to swap your ORM (as Ladislav says in the comments), you should design it as if you will need to. This mindset will really help you achieve persistence ignorance.
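As a sketch of that advice with NHibernate (the repository method names and the Email property are assumptions for the example), the repository interface talks in terms of the concrete User entity while the implementation keeps all ORM details to itself:

using System;
using NHibernate;

// Repositories project: a specific, business-focused contract working
// with the concrete User entity, not an IUser interface.
public interface IUserRepository
{
    User GetById(Guid id);
    User FindByEmail(string email);
    void Add(User user);
}

// NHibernate project: the only place that knows about ISession.
// Swapping the ORM later means re-implementing this class, nothing else.
public class NHibernateUserRepository : IUserRepository
{
    private readonly ISession session;

    public NHibernateUserRepository(ISession session)
    {
        this.session = session;
    }

    public User GetById(Guid id)
    {
        return session.Get<User>(id);
    }

    public User FindByEmail(string email)
    {
        return session.QueryOver<User>()
                      .Where(u => u.Email == email)
                      .SingleOrDefault();
    }

    public void Add(User user)
    {
        session.Save(user);
    }
}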

Keeping the dependency injection component out of the main code base

I have an MVC app that uses Ninject to inject service dependencies into controllers, and it works well. However, I also have some domain objects that require these services in their constructors, and I want to resolve these dependencies using Ninject but don't want to reference Ninject directly in my domain objects assembly. I have read lots of questions and answers here, but it's still not clear to me what the best way is to go about this. For example, I have a ShoppingCart domain object that needs an instance of IProductCatalogService passed to its constructor. What is the best pattern to create an instance of a shopping cart? I could keep a reference to the root kernel and call out to that, but that would mean having references to Ninject throughout my domain assembly. Should I wrap access to the kernel in a factory class?
Any thoughts or suggestions welcome!
It is usually considered bad practice to have services in domain objects. I think you need to rethink exactly what you are attempting to achieve. Why does a ShoppingCart need to consume Product Catalog Services?
From a domain perspective I would assume that a ShoppingCart would consist of many 'items', have properties like a total, etc., and potentially be passed to an ordering service. Your controller actions would update the shopping cart by adding items, removing items, and so on.
If you really do need to resolve dependencies from inside the domain assembly, one option to consider is CommonServiceLocator. This will at least separate out your direct dependency on Ninject.
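A sketch of what that might look like; every class and member name below is invented for the example. The cart stays a plain domain object, and anything that needs a service, such as placing the order, lives in the controller or an application service that Ninject already wires up.

using System.Collections.Generic;
using System.Linq;
using System.Web.Mvc;

// Plain domain object: no injected services, just state and behaviour.
public class ShoppingCart
{
    private readonly List<CartItem> items = new List<CartItem>();

    public IEnumerable<CartItem> Items
    {
        get { return items; }
    }

    public decimal Total
    {
        get { return items.Sum(i => i.UnitPrice * i.Quantity); }
    }

    public void AddItem(int productId, decimal unitPrice, int quantity)
    {
        items.Add(new CartItem { ProductId = productId, UnitPrice = unitPrice, Quantity = quantity });
    }
}

public class CartItem
{
    public int ProductId { get; set; }
    public decimal UnitPrice { get; set; }
    public int Quantity { get; set; }
}

public interface IOrderService
{
    void PlaceOrder(ShoppingCart cart);
}

// The controller, which Ninject already builds, owns the service
// dependencies and hands the finished cart to an ordering service.
public class OrderController : Controller
{
    private readonly IOrderService orderService;

    public OrderController(IOrderService orderService)
    {
        this.orderService = orderService;
    }

    public ActionResult Checkout()
    {
        // Where the cart comes from (session here) is an application detail.
        var cart = Session["Cart"] as ShoppingCart ?? new ShoppingCart();
        orderService.PlaceOrder(cart);
        return RedirectToAction("Confirmation");
    }
}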
