Unit test wiring of MVC, FluentValidation, and Ninject

I have set up an MVC web project with FluentValidation and Ninject using the Ninject.Extensions.Mvc.FluentValidation package. I followed the documentation of that package directly.
I'd like to write an automated test that verifies this wiring, and checks that the model validation will use the FluentValidation validators as expected. I'm having a hard time spinning up the right pieces of the MVC application in my test in order to do this.
Basically, I want to spin up the MVC app enough so that the Ninject kernel is created, the model binders are wired, and validators are created. Then I want to try to validate an entity through the MVC model validation and make sure that expected messages from the validator are showing up.
What is the best way to automate testing of the interaction between Ninject, MVC, and FluentValidation?

I have a similar setup, but I use Unity instead of Ninject.
My IoC container inherits from UnityContainer and is where I register all my repos, validators, etc. I benefit from this because I have validators with repository dependencies (e.g. UserRegistrationValidator checks for unique user names).
My IoC container also implements IValidatorFactory. I use it to register a global ModelValidatorProvider like this:
var ioc = new IoCContainer();
ModelValidatorProviders.Providers.Add(new FluentValidationModelValidatorProvider(ioc));
DataAnnotationsModelValidatorProvider.AddImplicitRequiredAttributeForValueTypes = false;
With this setup I know that my validators will intercept model validation in controller actions, but only if the validators are resolved through the validator factory.
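For reference, a minimal sketch of what such a container might look like; the MyModel/MyModelValidator registration is hypothetical, and the two GetValidator members are the FluentValidation IValidatorFactory contract:

public class IoCContainer : UnityContainer, IValidatorFactory
{
    public IoCContainer()
    {
        // Hypothetical registration; the real container registers all repos and validators.
        this.RegisterType<IValidator<MyModel>, MyModelValidator>();
    }

    public IValidator<T> GetValidator<T>()
    {
        return (IValidator<T>)GetValidator(typeof(T));
    }

    public IValidator GetValidator(Type type)
    {
        // Resolve IValidator<T> for the model type if registered; otherwise let MVC fall back.
        var validatorType = typeof(IValidator<>).MakeGenericType(type);
        return this.IsRegistered(validatorType) ? (IValidator)this.Resolve(validatorType) : null;
    }
}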
So basically I have two groups of tests:
- IoC tests
With these I test whether my IoC container can resolve a validator for a given model. With Unity it looks something like this:
// Arrange
var ioc = new IoCContainer();

// Assert
Assert.IsTrue(ioc.IsRegistered(typeof(IValidator<MyModel>)));
If you have all your models under one namespace, you could even cover them with a single test that gets all types from that namespace and checks in a loop that an IValidator is registered for each, as sketched below.
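A sketch of that loop test; the MyApp.Models namespace is an assumption:

[TestMethod]
public void AllModels_HaveARegisteredValidator()
{
    var ioc = new IoCContainer();
    var modelTypes = typeof(MyModel).Assembly.GetTypes()
        .Where(t => t.IsClass && t.Namespace == "MyApp.Models"); // assumed namespace

    foreach (var modelType in modelTypes)
    {
        // Each model must have an IValidator<TModel> registration.
        var validatorType = typeof(IValidator<>).MakeGenericType(modelType);
        Assert.IsTrue(ioc.IsRegistered(validatorType), "No validator registered for " + modelType.Name);
    }
}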
- Validator tests
These are the usual tests for my (view) models:
// Arrange
var validator = new MyModelValidator();
var model = new MyModel { Name = null };

// Act & Assert
validator.ShouldHaveValidationErrorFor(x => x.Name, model);
With this there is no need to spin up the application; you just test what you create.
Hope this helps.

What we use here for integration tests is Selenium WebDriver; you can get it with NuGet. We write the code using the page object pattern so it's easier to maintain.
When I want to check a required field or any other custom validation, I do the following:
[TestFixture]
public class CenterTests : TestBase
{
    [Test]
    public void CreateViewAndEditShouldWork()
    {
        S.OpenWithCI();
        var loginPage = new LoginPage(S);
        var homePage = loginPage.LoginValidUser("email", "Password");
        var centerListPage = homePage.ClickCenterAndRoomLink();
        var centerPage = centerListPage.ClickCreateLink();

        // Create
        centerPage.CreateInvalidCenter();
        Assert.That(S.FindElement(By.CssSelector("span[for=Name]")).Text, Is.StringContaining(Strings.Error_Required));
        Assert.That(S.FindElement(By.CssSelector("span[for=EnglishName]")).Text, Is.StringContaining(Strings.Error_Required));

        centerListPage = centerPage.CreateValidCenter("Saguenay", "Sag", "2089 blv Talbot");
        Thread.Sleep(2000);
        S.ExpectSuccessNotice(Strings.CenterCreatedSuccessfully);
    }
}
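For context, one of the page objects the test relies on might look roughly like this; the element locators here are assumptions, not the actual app's markup:

public class LoginPage
{
    private readonly IWebDriver _driver;

    public LoginPage(IWebDriver driver)
    {
        _driver = driver;
    }

    public HomePage LoginValidUser(string email, string password)
    {
        // Hypothetical locators; the real page defines its own.
        _driver.FindElement(By.Id("Email")).SendKeys(email);
        _driver.FindElement(By.Id("Password")).SendKeys(password);
        _driver.FindElement(By.CssSelector("input[type=submit]")).Click();
        return new HomePage(_driver);
    }
}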
And here's the code of my TestBase helper class:
namespace Afi.AutomatedTests.Helpers
{
    [TestFixture]
    public class TestBase
    {
        protected IWebDriver S;

        [SetUp]
        public virtual void TestSetup()
        {
            S = new ChromeDriver();
            S.Manage().Window.Size = new Size(1024, 768);
            S.Manage().Timeouts().ImplicitlyWait(TimeSpan.FromSeconds(10));
        }

        [TearDown]
        public void TearDown()
        {
            S.Quit();
        }

        public string GetUrl(string relativePath)
        {
            if (!relativePath.StartsWith("/"))
                throw new ArgumentException("Relative URL must begin with /");

            return "http://afi.local" + relativePath;
        }
    }
}
All those tests are in another project called AutomatedTests, and I can run them the same way I run unit tests (ReSharper/NUnit). It uses ChromeDriver to drive the browser.
Let me know if you need more information.

Related

How to implement UnitOfWork with Onion Architecture without introducing dependencies?

I am setting up an ASP.NET MVC 4 app and looking to configure it using the Onion Architecture pattern.
In the past I have used the Unit of Work pattern like this:
public class UnitOfWork : IUnitOfWork, IDisposable
{
    private IRepository<CallModel> _callRepo;
    private IRepository<UserModel> _userRepo;

    public IRepository<CallModel> CallRepo
    {
        get
        {
            if (_callRepo == null)
            {
                _callRepo = new Repository<CallModel>();
            }
            return _callRepo;
        }
    }

    public IRepository<UserModel> UserRepo
    {
        get
        {
            if (_userRepo == null)
            {
                _userRepo = new Repository<UserModel>();
            }
            return _userRepo;
        }
    }
}
I would then pass the instance of the UnitOfWork Class to the Controller to do simple CRUD stuff like this.
public class QuestionsController : Controller
{
    private IUnitOfWork _unitOfWork;

    [Inject]
    public QuestionsController(IUnitOfWork unitOfWork)
    {
        _unitOfWork = unitOfWork;
    }
}
I have separated the app into three projects:
- Core
- Infrastructure
- Web
I have my Interfaces all in the Core project and the implementation of the IRepository interface in the Infrastructure project.
If I put the UnitOfWork Class in the Core Project then since it calls for a new Repository in the Infrastructure project I am creating a dependency from the Core to the Infrastructure.
If I include it in the Infrastructure then the Web project (which has the controllers) will have a dependency on the Infrastructure and the whole Solution ends up looking less like an Onion and more like spaghetti.
> I have my Interfaces all in the Core project and the implementation of the IRepository interface in the Infrastructure project. If I put the UnitOfWork Class in the Core Project then since it calls for a new Repository in the Infrastructure project I am creating a dependency from the Core to the Infrastructure.
Hmm, not really. Your unit of work class should have a dependency on IRepository, not on the Repository implementation itself. If you are using Dependency Injection, this should not pose a problem, as the container will find the right type and provide it at runtime. I'm not sure whether the Onion architecture is even possible without using DI.
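A minimal sketch of that shape, with the repositories supplied through the constructor so Core only ever sees its own interfaces:

public class UnitOfWork : IUnitOfWork
{
    public UnitOfWork(IRepository<CallModel> callRepo,
                      IRepository<UserModel> userRepo)
    {
        // The container supplies the Infrastructure implementations at runtime.
        CallRepo = callRepo;
        UserRepo = userRepo;
    }

    public IRepository<CallModel> CallRepo { get; private set; }
    public IRepository<UserModel> UserRepo { get; private set; }
}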
See david.s's answer as well, as this is exactly how I set things up: have a project for the sole purpose of wiring up dependencies.
What I do is have another project named DependencyResolution, which has references to Core and Infrastructure and where I configure my IoC container. Then I can reference only DependencyResolution from the Web project.
I would, like david.s, create a project named DependencyResolution, but let it reference Web, Core and Infrastructure.
In that project you could do:
[assembly: PreApplicationStartMethod(typeof(Start), "Register")]

namespace DependencyResolution
{
    public static class Start
    {
        public static void Register()
        {
            UnityConfig.Register();
        }
    }
}
and to register the DI resolver:
namespace DependencyResolution
{
    public static class UnityConfig
    {
        public static void Register()
        {
            DependencyResolver.SetResolver(new UnityDependencyResolver());
        }
    }
}
So no reference between Web and Infrastructure is needed.
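For illustration, the Register method might map the Core interfaces to their Infrastructure implementations along these lines; a UnityDependencyResolver that wraps the container is an assumption here:

public static void Register()
{
    var container = new UnityContainer();

    // Core interfaces mapped to Infrastructure implementations:
    container.RegisterType<IUnitOfWork, UnitOfWork>();
    container.RegisterType(typeof(IRepository<>), typeof(Repository<>));

    DependencyResolver.SetResolver(new UnityDependencyResolver(container));
}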
For what it's still worth, I have implemented my own library that applies the UnitOfWork-pattern a little differently than I've seen in any code sample before, but I have found it to work very well in practice. In short: I kinda copied the way .NET Transactions work by creating a scope and then enlisting resources in the ambient unitofwork(-manager) where necessary. What basically happens is that when a new message/request is being handled, this code is executed:
public void Handle<TMessage>(TMessage message)
{
    using (var scope = CreateMessageProcessorContextScope())
    {
        HandleMessage(message);
        scope.Complete();
    }
}
Now just as with transactions, as long as the thread is inside the scope, an ambient UnitOfWork-controller is present, in which all resources that are used and changed during the request can enlist dynamically. They do this by implementing the IUnitOfWork interface, which has two methods:
public interface IUnitOfWork
{
    bool RequiresFlush();
    void Flush();
}
Instances that implement this interface can then enlist themselves as follows:
MessageProcessorContext.Current.Enlist(this);
Typically, a Repository class will implement this interface, and when it detects that its managed aggregates are changed/added/removed, it can enlist itself (double enlistments are ignored).
In my case, the framework assumes that you are using an IOC-framework that will resolve all message-handlers and repositories for you, so I made enlistment to the ambient unit of work controller easier by letting it inject an instance of the current IUnitOfWorkManager into the constructor where required. This way the dependencies of the unit of work manager and the actual pieces that require to be flushed (repositories, services, etc) are reversed:
internal sealed class OrderRepository : IOrderRepository, IUnitOfWork
{
    private readonly IUnitOfWorkManager _manager;
    private readonly Dictionary<Guid, Order> _orders = new Dictionary<Guid, Order>();

    public OrderRepository(IUnitOfWorkManager manager)
    {
        if (manager == null)
        {
            throw new ArgumentNullException("manager");
        }
        _manager = manager;
    }

    bool IUnitOfWork.RequiresFlush()
    {
        return _orders.Values.Any(order => order.HasChanges());
    }

    void IUnitOfWork.Flush()
    {
        // Flush here...
    }

    public void Add(Order order)
    {
        _orders.Add(order.Id, order);
        _manager.Enlist(this);
    }
}
As soon as a request has been handled successfully (no exceptions thrown), scope.Complete() will be called, which triggers the controller to check with all enlisted items whether they (still) need to be flushed (by calling RequiresFlush()) and, if so, flushes them (by calling Flush()).
All in all, this allows for a very maintainable solution (in my view) in which new repositories and other dependencies can be added on the fly without changing any master unit-of-work class, just like the TransactionManager doesn't need to know upfront which items may take part in any given Transaction.
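For readers who want to picture the manager side, a simplified sketch of what the ambient controller might do; the class and member names are assumptions, since the framework is the author's own:

internal sealed class UnitOfWorkController : IUnitOfWorkManager
{
    private readonly List<IUnitOfWork> _enlistedUnits = new List<IUnitOfWork>();

    public void Enlist(IUnitOfWork unitOfWork)
    {
        // Double enlistments are ignored.
        if (!_enlistedUnits.Contains(unitOfWork))
        {
            _enlistedUnits.Add(unitOfWork);
        }
    }

    // Called when scope.Complete() succeeds.
    public void Flush()
    {
        foreach (var unit in _enlistedUnits.Where(u => u.RequiresFlush()))
        {
            unit.Flush();
        }
    }
}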

Multiple dependencies in an asp.net mvc controller constructor injected

In my ASP.NET MVC controller's constructor I have multiple (5) interfaces which communicate with my database in this way:
[HttpGet]
public ActionResult Create()
{
    var releases = _releaseDataProvider.GetReleases();
    var templates = _templateDataProvider.GetTemplates();
    var createTestplanViewModel = new CreateTestplanViewModel(templates, releases);
    return PartialView(createTestplanViewModel);
}
Above I use two different interfaces to get data from the database.
Business case: to create a testplan I need to show the user the available releases + templates he can choose from.
How can I decrease the dependency/over-injection of these two interfaces?
In the MVC project:
public class MyController : Controller
{
    private readonly IQueryProcessor _queryProcessor;

    public MyController(IQueryProcessor queryProcessor)
    {
        _queryProcessor = queryProcessor;
    }

    [HttpGet]
    public ActionResult Create()
    {
        var releases = _queryProcessor.Execute(new ProvideReleaseData());
        var templates = _queryProcessor.Execute(new ProvideTemplateData());
        var createTestplanViewModel = AutoMapper.Mapper
            .Map<CreateTestplanViewModel>(releases);
        AutoMapper.Mapper.Map(templates, createTestplanViewModel);
        return PartialView(createTestplanViewModel);
    }
}
You can then constructor-inject your current provider implementations into your IHandleQueries<,> implementations. The IQueryProcessor is just infrastructure. See this for more info: https://cuttingedge.it/blogs/steven/pivot/entry.php?id=92
Reply to comments:
It's at the site I linked to. Here's mine:
using System.Diagnostics;
using SimpleInjector;

namespace MyApp.Infrastructure
{
    sealed class SimpleQueryProcessor : IQueryProcessor
    {
        private readonly Container _container;

        public SimpleQueryProcessor(Container container)
        {
            _container = container;
        }

        [DebuggerStepThrough]
        public TResult Execute<TResult>(IDefineQuery<TResult> query)
        {
            var handlerType = typeof(IHandleQueries<,>)
                .MakeGenericType(query.GetType(), typeof(TResult));

            dynamic handler = _container.GetInstance(handlerType);
            return handler.Handle((dynamic)query);
        }
    }
}
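To complete the picture, here is a hypothetical query/handler pair; the point is that the existing provider gets constructor-injected into the handler rather than into the controller:

public class ProvideReleaseData : IDefineQuery<IEnumerable<Release>>
{
}

public class ProvideReleaseDataHandler : IHandleQueries<ProvideReleaseData, IEnumerable<Release>>
{
    private readonly IReleaseDataProvider _provider;

    public ProvideReleaseDataHandler(IReleaseDataProvider provider)
    {
        _provider = provider;
    }

    public IEnumerable<Release> Handle(ProvideReleaseData query)
    {
        return _provider.GetReleases();
    }
}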
A good general way to decouple your database access would be a unit of work. There's a great article on this from asp.net, as well as another article on MSDN.
In summary, you create a single unit where all of your database/service calls reside and which handles the database logic. This reduces the dependency on your multiple interfaces to a single point, so you only need to inject one class into your controller.
A quote from the MSDN article:
According to Martin Fowler, the Unit of Work pattern "maintains a list of objects affected by a business transaction and coordinates the writing out of changes and the resolution of concurrency problems."
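A minimal sketch of that single injection point, reusing the asker's provider interfaces (the IUnitOfWork members here are assumptions):

public interface IUnitOfWork
{
    IReleaseDataProvider Releases { get; }
    ITemplateDataProvider Templates { get; }
}

public class TestplanController : Controller
{
    private readonly IUnitOfWork _unitOfWork;

    public TestplanController(IUnitOfWork unitOfWork) // a single dependency
    {
        _unitOfWork = unitOfWork;
    }

    [HttpGet]
    public ActionResult Create()
    {
        var releases = _unitOfWork.Releases.GetReleases();
        var templates = _unitOfWork.Templates.GetTemplates();
        return PartialView(new CreateTestplanViewModel(templates, releases));
    }
}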
EDIT
It seems to me you basically have these options to reduce the constructor dependency count here:
1. Split the controller
2. Add a layer in front of the two interfaces
3. Switch to property injection
4. Use a service locator
#3 and #4 are included for good measure, but they obviously don't actually decrease the dependency count, they only hide them from the constructor. They also have several disadvantages, and I consider service locator especially evil most of the time.
For #1, if you feel your controller is actually doing two or more jobs, and there is a clean separation where you could split it, I would do so. I assume from your responses that you have already considered this, however, and don't want to do it.
That leaves #2 - adding another layer. In this case that would mean introducing a factory interface for that particular view model. Naively, I'll name it ICreateTestplanViewModelFactory, but you can name it something more sensible for your app if you wish. A single method on it would construct a CreateTestplanViewModel.
This makes the fact that the data for this view comes from two sources merely an implementation detail. You would wire up an implementation which takes IReleaseDataProvider and ITemplateDataProvider as constructor dependencies.
This is along the lines of what I was suggesting:
public interface IProvideTestPlanSetupModel
{
    CreateTestplanViewModel GetModel();
}

public class TestPlanSetupProvider : IProvideTestPlanSetupModel
{
    private readonly IReleaseDataProvider _releaseDataProvider;
    private readonly ITemplateDataProvider _templateDataProvider;

    public TestPlanSetupProvider(IReleaseDataProvider releaseDataProvider, ITemplateDataProvider templateDataProvider)
    {
        _releaseDataProvider = releaseDataProvider;
        _templateDataProvider = templateDataProvider;
    }

    public CreateTestplanViewModel GetModel()
    {
        var releases = _releaseDataProvider.GetReleases();
        var templates = _templateDataProvider.GetTemplates();
        return new CreateTestplanViewModel(releases, templates);
    }
}

public class TestPlanController : Controller
{
    private readonly IProvideTestPlanSetupModel _testPlanSetupProvider;

    public TestPlanController(IProvideTestPlanSetupModel testPlanSetupProvider)
    {
        _testPlanSetupProvider = testPlanSetupProvider;
    }

    [HttpGet]
    public ActionResult Create()
    {
        var createTestplanViewModel = _testPlanSetupProvider.GetModel();
        return PartialView(createTestplanViewModel);
    }
}
If you don't like constructing a view model anywhere outside the controller, the interface could provide an intermediate object with the same properties that you would copy to the view model. But that is silly, as this combination of data is only relevant for that particular view, which is precisely what the view model is supposed to represent.
On a side note, it seems you are running into pretty common annoyances doing read/write through the same model. Since these issues bother you so, you might investigate CQRS, which perhaps would make you feel less dirty about talking to the database directly for these types of queries and would help you get around the layering labyrinth we all enjoy so much. It seems promising, though I have not yet had the pleasure of test driving it in a production application.

How to test Ninject ConstructorArguments using MOQ objects?

I have been doing my first Test Driven Development project recently and have been learning Ninject and MOQ. This is my first attempt at all this. I've found the TDD approach thought-provoking, and Ninject and MOQ have been great. The project I am working on has not been a particularly good fit for Ninject, as it is a highly configurable C# program designed to test the use of a web service interface.
I have broken it up into modules and have interfaces all over the shop, but I am still finding that I am having to use lots of constructor arguments when getting an implementation of a service from the Ninject kernel. For example:
In my Ninject module:
Bind<IDirEnum>().To<DirEnum>();
My DirEnum class:
public class DirEnum : IDirEnum
{
    public DirEnum(string filePath, string fileFilter,
        bool includeSubDirs)
    {
        ....
In my Configurator class (this is the main entry point), which hooks all the services together:
class Configurator
{
    public void ConfigureServices(string[] args)
    {
        ArgParser argParser = new ArgParser(args);
        IDirEnum dirEnum = kernel.Get<IDirEnum>(
            new ConstructorArgument("filePath", argParser.filePath),
            new ConstructorArgument("fileFilter", argParser.fileFilter),
            new ConstructorArgument("includeSubDirs", argParser.subDirs)
        );
    }
}
filePath, fileFilter and includeSubDirs are command line options to the program. So far so good. However, being a conscientious kind of guy, I have a test covering this bit of code. I'd like to use a MOQ object, so I have created a Ninject module for my tests:
public class TestNinjectModule : NinjectModule
{
    internal IDirEnum mockDirEnum { get; set; }

    public override void Load()
    {
        Bind<IDirEnum>().ToConstant(mockDirEnum);
    }
}
And in my test I use it like this:
[TestMethod]
public void Test()
{
    // Arrange
    TestNinjectModule testModule = new TestNinjectModule();
    Mock<IDirEnum> mockDirEnum = new Mock<IDirEnum>();
    testModule.mockDirEnum = mockDirEnum.Object;

    // Act
    Configurator configurator = new Configurator();
    configurator.ConfigureServices(new string[0]);

    // Assert
    // here lies my problem! How do I test what values were passed to the
    // constructor arguments???
}
So the above shows my problem. How can I test what arguments were passed to the ConstructorArguments of the mock object? My guess is that Ninject is dispensing with the ConstructorArguments in this case, as the Bind does not require them? Can I test this with a MOQ object, or do I need to hand-code a mock that implements IDirEnum and accepts and 'records' the constructor arguments?
n.b. this code is 'example' code, i.e. I have not reproduced my code verbatim, but I think I have expressed enough to hopefully convey the issues? If you need more context, please ask!
Thanks for looking. Be gentle, this is my first time ;-)
Jim
There are a few problems with the way you designed your application. First of all, you are calling the Ninject kernel directly from within your code. This is called the Service Locator pattern and it is considered an anti-pattern. It makes testing your application much harder and you are already experiencing this. You are trying to mock the Ninject container in your unit test, which complicates things tremendously.
Next, you are injecting primitive types (string, bool) into the constructor of your DirEnum type. I like how MNrydengren states it in the comments:
> take "compile-time" dependencies through constructor parameters and "run-time" dependencies through method parameters
It's hard for me to guess what that class should do, but since you are injecting these variables that change at run-time into the DirEnum constructor, you end up with a hard-to-test application.
There are multiple ways to fix this. Two that come to mind are the use of method injection and the use of a factory. Which one is feasible is up to you.
Using method injection, your Configurator class will look like this:
class Configurator
{
    private readonly IDirEnum dirEnum;

    // Injecting IDirEnum through the constructor
    public Configurator(IDirEnum dirEnum)
    {
        this.dirEnum = dirEnum;
    }

    public void ConfigureServices(string[] args)
    {
        var parser = new ArgParser(args);

        // Inject the arguments into a method
        this.dirEnum.SomeOperation(
            parser.filePath,
            parser.fileFilter,
            parser.subDirs);
    }
}
Using a factory, you would need to define a factory that knows how to create new IDirEnum types:
interface IDirEnumFactory
{
    IDirEnum CreateDirEnum(string filePath, string fileFilter,
        bool includeSubDirs);
}
Your Configurator class can now depend on the IDirEnumFactory interface:
class Configurator
{
    private readonly IDirEnumFactory dirFactory;

    // Injecting the factory through the constructor
    public Configurator(IDirEnumFactory dirFactory)
    {
        this.dirFactory = dirFactory;
    }

    public void ConfigureServices(string[] args)
    {
        var parser = new ArgParser(args);

        // Creating a new IDirEnum using the factory
        var dirEnum = this.dirFactory.CreateDirEnum(
            parser.filePath,
            parser.fileFilter,
            parser.subDirs);
    }
}
See how in both examples the dependencies get injected into the Configurator class. This is called the Dependency Injection pattern, opposed to the Service Locator pattern, where the Configurator asks for its dependencies by calling into the Ninject kernel.
Now, since your Configurator is completely free from any IoC container what so ever, you can now easily test this class, by injecting a mocked version of the dependency it expects.
What is left is to configure the Ninject container at the top of your application (in DI terminology: the composition root). With the method injection example, your container configuration would stay the same; with the factory example, you will need to replace the Bind<IDirEnum>().To<DirEnum>() line with something like the following:
public static void Bootstrap()
{
    kernel.Bind<IDirEnumFactory>().To<DirEnumFactory>();
}
Of course, you will need to create the DirEnumFactory:
class DirEnumFactory : IDirEnumFactory
{
    public IDirEnum CreateDirEnum(string filePath, string fileFilter,
        bool includeSubDirs)
    {
        return new DirEnum(filePath, fileFilter, includeSubDirs);
    }
}
WARNING: Do note that factory abstractions are in most cases not the best design, as explained here.
The last thing you need to do is to create a new Configurator instance. You can simply do this as follows:
public static Configurator CreateConfigurator()
{
    return kernel.Get<Configurator>();
}

public static void Main(string[] args)
{
    Bootstrap();

    var configurator = CreateConfigurator();
    configurator.ConfigureServices(args);
}
Here we call the kernel. Although calling the container directly should be prevented, there will always be at least one place in your application where you call the container, simply because it must wire everything up. However, we try to minimize the number of times the container is called directly, because doing so improves, among other things, the testability of our code.
See how I didn't really answer your question, but showed a way to work around the problem very effectively.
You might still want to test your DI configuration. That's very valid IMO. I do this in my applications. But for this, you often don't need the DI container, or even if your do, this doesn't mean that all your tests should have a dependency on the container. This relationship should only exist for the tests that test the DI configuration itself. Here is a test:
[TestMethod]
public void DependencyConfiguration_IsConfiguredCorrectly()
{
    // Arrange
    Program.Bootstrap();

    // Act
    var configurator = Program.CreateConfigurator();

    // Assert
    Assert.IsNotNull(configurator);
}
This test indirectly depends on Ninject and it will fail when Ninject is not able to construct a new Configurator instance. When you keep your constructors free of any logic and only use them for storing the supplied dependencies in private fields, you can run this test without the risk of calling out to a database, web service or whatever.
I hope this helps.

TDD Process Outside-In?

Let's assume I've created the following test for my sample blog app:
[TestClass]
public class when_the_blog_controller_index_action_executes : BlogControllerTests
{
    //...

    [TestMethod]
    [TestCategory("BlogController")]
    public void it_should_pass_the_latest_blogentries_to_the_view()
    {
        blogs = new List<BlogEntry>() { new BlogEntry("title1", "b1"), new BlogEntry("title2", "b2"), new BlogEntry("title3", "b3") };
        blogServiceMock = new Mock<IBlogService>();
        blogServiceMock.Setup(s => s.GetLatestBlogEntries())
            .Returns(blogs);

        var controller = new BlogController(blogServiceMock.Object);
        var model = ((ViewResult)controller.Index()).Model as IEnumerable<BlogEntry>;

        Assert.IsTrue(blogs.SequenceEqual(model));
        blogServiceMock.VerifyAll();
    }
}
After the BlogController implementation, I have a passing test:
public class BlogController : Controller
{
    IBlogService blogService;

    public BlogController(IBlogService blogService)
    {
        this.blogService = blogService;
    }

    public ActionResult Index()
    {
        var model = blogService.GetLatestBlogEntries();
        return View(model);
    }
}
So what is the next step? Should I create the implementation of IBlogService (with a test), and afterwards create the repository (with a test)?
If I had a service like this, I basically wouldn't be testing anything, because I'd just be mocking the repository...
public class BlogService : IBlogService
{
    IBlogRepository blogRepository;

    public BlogService(IBlogRepository blogRepository)
    {
        this.blogRepository = blogRepository;
    }

    public IEnumerable<BlogEntry> GetLatestBlogEntries()
    {
        return blogRepository.GetLatestBlogEntries();
    }
}
The next step is to unit test the BlogService implementation you have written. In this test you should ensure that the proper methods are called on a mocked repository. If later your service logic evolves and you have methods that do more than just CRUD repository access, your tests will start to make more sense.
Your current implementation of this service simply delegates the calls to the repository, and that's what you should unit test, as sketched below.
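A sketch of such a delegation test, in the same Moq style the question already uses:

[TestMethod]
public void it_should_return_the_entries_from_the_repository()
{
    // Arrange
    var blogs = new List<BlogEntry>() { new BlogEntry("title1", "b1") };
    var blogRepositoryMock = new Mock<IBlogRepository>();
    blogRepositoryMock.Setup(r => r.GetLatestBlogEntries()).Returns(blogs);
    var service = new BlogService(blogRepositoryMock.Object);

    // Act
    var result = service.GetLatestBlogEntries();

    // Assert
    Assert.IsTrue(blogs.SequenceEqual(result));
    blogRepositoryMock.VerifyAll();
}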
And if you find that your service layer consists of simple CRUD methods that do nothing more than delegate the call to some repository, maybe you should ask yourself how useful this service layer is and whether it wouldn't also be possible to use the repository directly from the controller.
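In that case the controller would take the repository directly; a sketch of that alternative, not a recommendation for every scenario:

public class BlogController : Controller
{
    IBlogRepository blogRepository;

    public BlogController(IBlogRepository blogRepository)
    {
        this.blogRepository = blogRepository;
    }

    public ActionResult Index()
    {
        // No service layer: the controller queries the repository itself.
        return View(blogRepository.GetLatestBlogEntries());
    }
}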
> TDD Process Outside-In?
I would use a mix of outside-in (BDD) and inside-out (TDD) in two nested loops, as described in Behavior-Driven Development with SpecFlow and WatiN:
* writing a failing (outside-in) integration test
* writing a failing (inside-out) unit test as part of the solution to the integration test
* making the unit test pass
* refactoring
* writing the next failing unit test as part of the integration test
* ...until the integration test passes
* writing the next failing integration test

EF4 Repository Pattern problems injecting repository into service. Cannot seem to get it right

I am finding it difficult to test Entity Framework 4. I am using the database-first approach; it is too late now to move to POCOs. I needed to deliver pretty quickly, with no time to learn it properly, as usual.
I have implemented the repository pattern with unit of work, but I am finding it difficult to inject a repository into my service layer so that I can test the behaviour of my business layer service (validation etc.) without hitting the db.
I am running into many little problems:
1. In order to inject the repository into the service (constructor), the calling layer needs a reference to the DAL (EF entities). I don't want this.
2. If I have many repositories, e.g. CustomerRepository and EmployeeRepository, then I need as many constructor arguments as repositories so that I can inject them.
3. Not sure where to go from here. I have not found any example on the net where they inject the repository into the service using EF4. All the examples I have seen mock the repository on its own, which is no good to me.
I need to test my service layer/BizLayer without hitting the database.
The whole thing is just not testable and adds so many dependencies and problems.
Here is a noddy example I have put together:
public class DepartmentServiceLibrary
{
    private readonly IDepartmentRepository _departmentRepository;

    public DepartmentServiceLibrary(IDepartmentRepository departmentRepository)
    {
        _departmentRepository = departmentRepository;
    }

    public List<DepartmentDto> GetDepartments()
    {
        return DepartmentBiz.GetDepartments();
    }

    private DepartmentBL _departmentBiz;

    private DepartmentBL DepartmentBiz
    {
        get
        {
            // Cache the lazily created instance so it is only built once.
            return _departmentBiz ?? (_departmentBiz = new DepartmentBL(_departmentRepository));
        }
    }
}
// internal class
internal class DepartmentBL
{
    private readonly IDepartmentRepository _departmentRepository;

    public DepartmentBL(IDepartmentRepository departmentRepository)
    {
        _departmentRepository = departmentRepository;
    }

    public List<DepartmentDto> GetDepartments()
    {
        using (var ctx = new AdventureWorksContext())
        {
            var uow = new UnitOfWork(ctx);
            _departmentRepository.UnitOfWork = uow;
            var query = _departmentRepository.GetAll();
            return query.Select(dpt => new DepartmentDto
            {
                DepartmentId = dpt.DepartmentID,
                Name = dpt.Name,
                GroupName = dpt.GroupName
            }).ToList();
        }
    }
}
The following TestMethod requires me to add a reference to the DAL, which defeats the point:
[TestMethod]
public void Should_be_able_to_call_get_departments()
{
    var mock = new Mock<IDepartmentRepository>();
    var expectedResult = new List<Department>(); // dependency on the DAL, as Department is an entity generated by EF
    mock.Setup(x => x.GetAll()).Returns(expectedResult);

    var companyService = new DepartmentBL(mock.Object); // InternalsVisibleTo
    var departments = companyService.GetDepartments();

    // assert removed for brevity
}

Any suggestions or examples out there that show how to do it?
Thanks
The short answer is - since you're not using POCOs, all your layers will have a reference to your DAL.
Without POCOs, you use code generation, which means EF creates the model classes in the Model.edmx.designer.cs file.
An option (haven't tried this - off the top of my head) is to manually project the EF entities into DTOs.
So your Repository might do this:
public List<OrderDTO> GetOrdersForCustomer(int customerId)
{
    return _ctx.Orders
        .Where(x => x.CustomerId == customerId)
        .Select(x => new OrderDTO { /* left-to-right copy */ })
        .ToList();
}
The OrderDTO class could live in a separate assembly which the repository references, as well as your other projects. The other projects would then work off the DTO assembly and wouldn't need a reference to the repository.
But here you're projecting into classes everywhere (basically doing POCO, but manually, and with more work), left-to-right copying properties - very painful.
However, that is an option.
Honestly - it does not take long to move to POCOs.
There is a T4 template which will generate the POCOs for you - you could be up and running in a matter of minutes.
And since you're already using dependency injection and repositories, you should either bite the bullet and change to POCOs, or keep the reference to the DAL.
Something similar in terms of code can be seen here on GitHub, and a detailed explanation can be found on TechNet.
