ASP.NET MVC: Mocking Database on Data-driven Environment?

I am building an MSTest project for my app, which was created with a data-driven strategy.
Now I would like to mock the database behind an interface. But the DB entities are generated automatically, so any modifications I make for DI will be wiped out whenever the entities are regenerated.
Should I give up testing with a mock and hit the actual DB every time?
(Update 2022/01/28) I am considering a compromise: rather than accessing the DB entities directly from the model, I would introduce service facades behind one interface, where one class talks to the DB (for production use) while the other works entirely in memory.
Short crude examples here:
public interface IMemberDatabaseService
{
    Member Search(string name);
    void Create(MemberModel model);
}
public class MemberDatabaseService : IMemberDatabaseService, IDisposable
{
    private AutomaticallyGeneratedDBContext Con = new();

    public Member Search(string name)
    {
        return Con.Member.SingleOrDefault(mb => mb.Name == name);
    }

    public void Create(MemberModel model)
    {
        Member member = Convert(model);
        Con.Member.Add(member);
        Con.SaveChanges();
    }

    private static Member Convert(MemberModel model)
    {
        // convert model to Member
    }

    // Dispose pattern here...
}
public class MemberTestService : IMemberDatabaseService, IDisposable
{
    private static List<Member> MemberList = new();

    public Member Search(string name)
    {
        return name == "John Doe" ? new Member { Name = name, ... } : null;
    }

    public void Create(MemberModel model)
    {
        Member member = Convert(model); // convert model to Member
        MemberList.Add(member);
    }

    private static Member Convert(MemberModel model)
    {
        // convert model to Member
    }

    // Dispose pattern here...
}
The drawback is that I cannot test the LINQ portion or conflict handling without connecting to the DB.
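A minimal MSTest sketch consuming the facade might look like this (the test class and assertions are illustrative only, not taken from the real project):
[TestClass]
public class MemberServiceTests
{
    [TestMethod]
    public void Search_ReturnsMember_WhenNameMatches()
    {
        // MemberTestService stands in for the database-backed implementation.
        IMemberDatabaseService service = new MemberTestService();

        Member result = service.Search("John Doe");

        Assert.IsNotNull(result);
        Assert.AreEqual("John Doe", result.Name);
    }
}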

You will need to add specific detail about what your implementation looks like.
Mocking DbContexts/DbSets is possible, but arguably a fair bit of work and ugly to work with. Unit testing is one good argument for implementing a Repository pattern to serve as a boundary for the mocks, but if you're past the point where something like that can be implemented, a simpler solution can be to point your DbContext at an in-memory database that is seeded with suitable test data.
The downside is that most test frameworks run tests in parallel, and not necessarily in order, so you need to ensure each test's data dependencies are safely isolated from one another; otherwise you can get intermittent failures because one test tampers with data that another test relies on.
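To illustrate that approach, here is a rough sketch assuming an EF Core context can be used (AppDbContext and its DbSet are hypothetical stand-ins for the generated context; a generated EF6 database-first context would need adapting or a different in-memory provider):
using System.Linq;
using Microsoft.EntityFrameworkCore;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical EF Core context standing in for the generated one.
public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }
    public DbSet<Member> Member { get; set; }
}

[TestClass]
public class MemberQueryTests
{
    private static DbContextOptions<AppDbContext> BuildOptions(string dbName)
    {
        // A uniquely named in-memory database per test keeps test data isolated.
        return new DbContextOptionsBuilder<AppDbContext>()
            .UseInMemoryDatabase(databaseName: dbName)
            .Options;
    }

    [TestMethod]
    public void Search_FindsSeededMember()
    {
        var options = BuildOptions(nameof(Search_FindsSeededMember));

        // Seed the test data.
        using (var ctx = new AppDbContext(options))
        {
            ctx.Member.Add(new Member { Name = "John Doe" });
            ctx.SaveChanges();
        }

        // Exercise the same LINQ the production service would run.
        using (var ctx = new AppDbContext(options))
        {
            Assert.IsNotNull(ctx.Member.SingleOrDefault(m => m.Name == "John Doe"));
        }
    }
}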

Related

How to implement UnitOfWork with Onion Architecture without introducing dependencies?

I am setting up an asp.Net Mvc 4 app and looking to configure it using the Onion Architecture Pattern.
In the past I have used the Unit of Work Pattern like this
public class UnitOfWork : IUnitOfWork, IDisposable
{
    private IRepository<CallModel> _callRepo;
    private IRepository<UserModel> _userRepo;

    public IRepository<CallModel> CallRepo
    {
        get
        {
            if (_callRepo == null)
            {
                _callRepo = new Repository<CallModel>();
            }
            return _callRepo;
        }
    }

    public IRepository<UserModel> UserRepo
    {
        get
        {
            if (_userRepo == null)
            {
                _userRepo = new Repository<UserModel>();
            }
            return _userRepo;
        }
    }
}
I would then pass the instance of the UnitOfWork Class to the Controller to do simple CRUD stuff like this.
public class QuestionsController : Controller
{
    private IUnitOfWork _unitOfWork;

    [Inject]
    public QuestionsController(IUnitOfWork unitOfWork)
    {
        _unitOfWork = unitOfWork;
    }
I have separated the app into three projects:
Core
Infrastructure
Web
I have my Interfaces all in the Core project and the implementation of the IRepository interface in the Infrastructure project.
If I put the UnitOfWork Class in the Core Project then since it calls for a new Repository in the Infrastructure project I am creating a dependency from the Core to the Infrastructure.
If I include it in the Infrastructure then the Web project (which has the controllers) will have a dependency on the Infrastructure and the whole Solution ends up looking less like an Onion and more like spaghetti.
I have my Interfaces all in the Core project and the implementation of the IRepository interface in the Infrastructure project. If I put the UnitOfWork Class in the Core Project then since it calls for a new Repository in the Infrastructure project I am creating a dependency from the Core to the Infrastructure.
Hmm, not really. Your unit of work class should have a dependency on IRepository, not the Repository implementation itself. If you are using Dependency Injection, this should not pose a problem, as it should find the right type and provide it at runtime. I'm not sure whether the Onion architecture is even possible without using DI.
See david.s's answer as well, as this is exactly how I set things up--have a project for the sole purpose of wiring up dependencies.
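A minimal sketch of that suggestion, with the unit of work depending only on the Core interfaces and the container supplying the Infrastructure implementations:
public class UnitOfWork : IUnitOfWork
{
    private readonly IRepository<CallModel> _callRepo;
    private readonly IRepository<UserModel> _userRepo;

    // The DI container injects whatever IRepository<T> implementations are registered.
    public UnitOfWork(IRepository<CallModel> callRepo, IRepository<UserModel> userRepo)
    {
        _callRepo = callRepo;
        _userRepo = userRepo;
    }

    public IRepository<CallModel> CallRepo { get { return _callRepo; } }
    public IRepository<UserModel> UserRepo { get { return _userRepo; } }
}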
What I do is have another project named DependencyResolution which has references to Core and Infrastructure and where I configure my IoC container. Then I can reference only DependencyResolution from the Web project.
I would do like david.s and create a project named DependencyResolution, but let it reference Web, Core and Infrastructure.
In that project you could do:
[assembly: PreApplicationStartMethod(typeof(Start), "Register")]
namespace DependencyResolution
{
    public static class Start
    {
        public static void Register()
        {
            UnityConfig.Register();
        }
    }
}
and to register DI.
namespace DependencyResolution
{
    public static class UnityConfig
    {
        public static void Register()
        {
            DependencyResolver.SetResolver(new UnityDependencyResolver());
        }
    }
}
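Filling in the container setup, the Register method might look roughly like this (a sketch; namespaces and the resolver's constructor depend on which Unity/MVC integration package is used):
using System.Web.Mvc;
using Microsoft.Practices.Unity; // namespace varies by Unity version

namespace DependencyResolution
{
    public static class UnityConfig
    {
        public static void Register()
        {
            var container = new UnityContainer();

            // Map Core interfaces to Infrastructure implementations here,
            // so the Web project never needs a reference to Infrastructure.
            container.RegisterType<IUnitOfWork, UnitOfWork>();
            container.RegisterType(typeof(IRepository<>), typeof(Repository<>));

            DependencyResolver.SetResolver(new UnityDependencyResolver(container));
        }
    }
}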
So no reference between Web and Infrastructure is needed.
Best regards
For what it's still worth, I have implemented my own library that applies the UnitOfWork-pattern a little differently than I've seen in any code sample before, but I have found it to work very well in practice. In short: I kinda copied the way .NET Transactions work by creating a scope and then enlisting resources in the ambient unitofwork(-manager) where necessary. What basically happens is that when a new message/request is being handled, this code is executed:
public void Handle<TMessage>(TMessage message)
{
    using (var scope = CreateMessageProcessorContextScope())
    {
        HandleMessage(message);
        scope.Complete();
    }
}
Now, just as with transactions, as long as the Thread is inside the scope, an ambient UnitOfWork-controller is present in which all resources that are used and changed during the request can enlist dynamically. They do this by implementing the IUnitOfWork interface, which has two methods:
public interface IUnitOfWork
{
    bool RequiresFlush();
    void Flush();
}
Instances that implement this interface can then enlist themselves as follows:
MessageProcessorContext.Current.Enlist(this);
Typically, a Repository class will implement this interface, and when it detects that its managed aggregates are changed/added/removed, it can enlist itself (double enlistments are ignored).
In my case, the framework assumes that you are using an IOC-framework that will resolve all message-handlers and repositories for you, so I made enlistment to the ambient unit of work controller easier by letting it inject an instance of the current IUnitOfWorkManager into the constructor where required. This way the dependencies of the unit of work manager and the actual pieces that require to be flushed (repositories, services, etc) are reversed:
internal sealed class OrderRepository : IOrderRepository, IUnitOfWork
{
    private readonly IUnitOfWorkManager _manager;
    private readonly Dictionary<Guid, Order> _orders = new Dictionary<Guid, Order>();

    public OrderRepository(IUnitOfWorkManager manager)
    {
        if (manager == null)
        {
            throw new ArgumentNullException("manager");
        }
        _manager = manager;
    }

    bool IUnitOfWork.RequiresFlush()
    {
        return _orders.Values.Any(order => order.HasChanges());
    }

    void IUnitOfWork.Flush()
    {
        // Flush here...
    }

    public void Add(Order order)
    {
        _orders.Add(order.Id, order);
        _manager.Enlist(this);
    }
}
As soon as a request has been handled successfully (no exceptions thrown), scope.Complete() will be called, which triggers the controller to check with all enlisted items whether they (still) need to be flushed (by calling RequiresFlush()) and, if so, flushes them (by calling Flush()).
All in all, this allows for a very maintainable solution (in my perspective) in which new repositories and other dependencies can be added on the fly without changing any master unitofwork class, just like the TransactionManager doesn't need to know upfront which items may take part in any given Transaction.
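A bare-bones sketch of what that ambient manager could look like (my own illustration; only the Enlist call and the RequiresFlush/Flush contract come from the answer, the rest is assumed):
using System.Collections.Generic;
using System.Linq;

// The interface the repositories receive; Enlist is the method used above.
public interface IUnitOfWorkManager
{
    void Enlist(IUnitOfWork unitOfWork);
}

// Bare-bones ambient controller: collects enlisted units and flushes the
// ones that still report pending changes once the scope completes.
public sealed class UnitOfWorkManager : IUnitOfWorkManager
{
    // A set makes repeated enlistments of the same instance a no-op.
    private readonly HashSet<IUnitOfWork> _enlisted = new HashSet<IUnitOfWork>();

    public void Enlist(IUnitOfWork unitOfWork)
    {
        _enlisted.Add(unitOfWork);
    }

    // Called by the scope when Complete() succeeds.
    internal void Flush()
    {
        foreach (var unit in _enlisted.Where(u => u.RequiresFlush()))
        {
            unit.Flush();
        }
    }
}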

Multiple dependencies in an asp.net mvc controller constructor injected

In my asp.net mvc controller's constructor I have multiple (5) interfaces which communicate with my database in this way:
[HttpGet]
public ActionResult Create()
{
    var releases = _releaseDataProvider.GetReleases();
    var templates = _templateDataProvider.GetTemplates();
    var createTestplanViewModel = new CreateTestplanViewModel(templates, releases);
    return PartialView(createTestplanViewModel);
}
Above I use 2 different interfaces to get data from the database.
Business case: to create a testplan I need to show the user the available releases + templates he can choose from.
How can I decrease the dependency/over-injection of these 2 interfaces?
In the MVC project:
public class MyController : Controller
{
    private readonly IQueryProcessor _queryProcessor;

    public MyController(IQueryProcessor queryProcessor)
    {
        _queryProcessor = queryProcessor;
    }

    [HttpGet]
    public ActionResult Create()
    {
        var releases = _queryProcessor.Execute(new ProvideReleaseData());
        var templates = _queryProcessor.Execute(new ProvideTemplateData());
        var createTestplanViewModel = AutoMapper.Mapper
            .Map<CreateTestplanViewModel>(releases);
        AutoMapper.Mapper.Map(templates, createTestplanViewModel);
        return PartialView(createTestplanViewModel);
    }
}
You can then constructor inject your current provider implementations into IQueryHandler implementations. The IQueryProcessor is just infrastructure. See this for more info: https://cuttingedge.it/blogs/steven/pivot/entry.php?id=92
Reply to comments:
It's at the site I linked to. Here's mine:
using System.Diagnostics;
using SimpleInjector;

namespace MyApp.Infrastructure
{
    sealed class SimpleQueryProcessor : IQueryProcessor
    {
        private readonly Container _container;

        public SimpleQueryProcessor(Container container)
        {
            _container = container;
        }

        [DebuggerStepThrough]
        public TResult Execute<TResult>(IDefineQuery<TResult> query)
        {
            var handlerType = typeof(IHandleQueries<,>)
                .MakeGenericType(query.GetType(), typeof(TResult));
            dynamic handler = _container.GetInstance(handlerType);
            return handler.Handle((dynamic)query);
        }
    }
}
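The processor relies on a query marker interface and a handler interface that the snippet doesn't show; a plausible sketch of their shapes (based on how they are used above, so treat the exact definitions as assumptions) is:
// A query and the result type it produces.
public interface IDefineQuery<TResult>
{
}

// One handler per concrete query type.
public interface IHandleQueries<TQuery, TResult>
    where TQuery : IDefineQuery<TResult>
{
    TResult Handle(TQuery query);
}

public interface IQueryProcessor
{
    TResult Execute<TResult>(IDefineQuery<TResult> query);
}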
A good general way to decouple your database would be using a unit of work. Here's a great article on it from asp.net, as well as another article on MSDN.
In summary, you create a single unit where all of your database/service calls reside and it can handle the database logic. This would reduce the dependency of your multiple interfaces to a single point, so you would only need to inject one class into your controller.
A quote from the MSDN article:
According to Martin Fowler, the Unit of Work pattern "maintains a list of objects affected by a business transaction and coordinates the writing out of changes and the resolution of concurrency problems."
EDIT
It seems to me you basically have these options to reduce constructor dependency count here:
1. Split the controller
2. Add a layer in front of the two interfaces
3. Switch to property injection
4. Service locator
#3 and #4 are included for good measure, but they obviously don't actually decrease the dependency count; they only hide the dependencies from the constructor. They also have several disadvantages, and I consider service locator especially evil most of the time.
For #1, if you feel your controller is actually doing two+ jobs, and there is a clean separation where you could split, I would do so. I assume from your responses that you have already considered this, however, and don't want to do it.
That leaves #2 - adding another layer. In this case that would be introducing a factory interface for that particular view model. Naively, I'll name this ICreateTestplanViewModelFactory, but you can name it something more sensical for your app if you wish. A single method on it would construct a CreateTestplanViewModel.
This makes the fact that the data for this view is coming from 2 sources merely an implementation detail. You would wire up an implementation which takes IReleaseDataProvider and ITemplateDataProvider as constructor dependencies.
This is along the lines of what I was suggesting:
public interface IProvideTestPlanSetupModel
{
    CreateTestplanViewModel GetModel();
}

public class TestPlanSetupProvider : IProvideTestPlanSetupModel
{
    private readonly IReleaseDataProvider _releaseDataProvider;
    private readonly ITemplateDataProvider _templateDataProvider;

    public TestPlanSetupProvider(IReleaseDataProvider releaseDataProvider, ITemplateDataProvider templateDataProvider)
    {
        _releaseDataProvider = releaseDataProvider;
        _templateDataProvider = templateDataProvider;
    }

    public CreateTestplanViewModel GetModel()
    {
        var releases = _releaseDataProvider.GetReleases();
        var templates = _templateDataProvider.GetTemplates();
        return new CreateTestplanViewModel(releases, templates);
    }
}

public class TestPlanController : Controller
{
    private readonly IProvideTestPlanSetupModel _testPlanSetupProvider;

    public TestPlanController(IProvideTestPlanSetupModel testPlanSetupProvider)
    {
        _testPlanSetupProvider = testPlanSetupProvider;
    }

    [HttpGet]
    public ActionResult Create()
    {
        var createTestplanViewModel = _testPlanSetupProvider.GetModel();
        return PartialView(createTestplanViewModel);
    }
}
If you don't like constructing a view model anywhere outside the controller, the interface could provide an intermediate object with the same properties that you would copy to the view model. But that is silly, as this combination of data is only relevant for that particular view, which is precisely what the view model is supposed to represent.
On a side note, it seems you are running into pretty common annoyances doing read/write through the same model. Since these issues bother you so, you might investigate CQRS, which perhaps would make you feel less dirty about talking to the database directly for these types of queries and would help you get around the layering labyrinth we all enjoy so much. It seems promising, though I have not yet had the pleasure of test driving it in a production application.

Default values for constructor arguments in a library project

I am writing a library that will provide a collection of public types to its consumers.
I want to make types from this library dependency injection friendly. This means that every class needs to have a constructor through which it is possible to specify every single dependency of the object being initialized. I also want the library to adhere to the convention over configuration principle. This means that if a consumer wants the default behavior, he may use a parameterless constructor and the object will somehow construct the dependencies for itself.
In example (C#):
public class Samurai {
    private readonly IWeapon _weapon;

    // consumers will use this constructor most of the time
    public Samurai() {
        _weapon = ??? // get an instance of the default weapon somehow
    }

    // consumers will use this constructor if they want to explicitly
    // configure dependencies for this instance
    public Samurai(IWeapon weapon) {
        _weapon = weapon;
    }
}
My first solution would be to use the service locator pattern.
The code would look like this:
...
public Samurai() {
    _weapon = ServiceLocator.Instance.Get<IWeapon>();
}
...
I have a problem with this, though. Service locator has been flagged as an anti-pattern (link), and I completely agree with those arguments. On the other hand, Martin Fowler advocates use of the service locator pattern in exactly this situation (library projects) (link). I want to be careful and eliminate the possible need to rewrite the library if it turns out that the service locator really was a bad idea.
So, in conclusion: do you think a service locator is fine in this scenario? Should I solve my problem in a completely different way? Any thoughts are welcome...
If you want to make life easier for users who are not using a DI container, you can provide default instances via a dedicated Defaults class which has methods like this:
public virtual Samurai CreateDefaultSamurai()
{
    return new Samurai(CreateDefaultWeapon());
}

public virtual IWeapon CreateDefaultWeapon()
{
    return new Shuriken();
}
This way you don't need to pollute the classes themselves with default constructors, and your users aren't at risk of using those default constructors unintentionally.
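Call sites then opt in to the defaults explicitly; a hypothetical usage of such a Defaults class:
var defaults = new Defaults();
Samurai conventionSamurai = defaults.CreateDefaultSamurai(); // armed with the default weapon
Samurai configuredSamurai = new Samurai(new Sword());        // dependency supplied explicitly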
There is an alternative: injecting a specific provider, let's say a WeaponProvider in your case, into your class so it can do the lookup for you:
public interface IWeaponProvider
{
    IWeapon GetWeapon();
}

public class Samurai
{
    private readonly IWeapon _weapon;

    public Samurai(IWeaponProvider provider)
    {
        _weapon = provider.GetWeapon();
    }
}
Now you can provide a local default provider for a weapon:
public class DefaultWeaponProvider : IWeaponProvider
{
    public IWeapon GetWeapon()
    {
        return new Sword();
    }
}
And since this is a local default (as opposed to one from a different assembly, so it's not a "bastard injection"), you can use it as part of your Samurai class as well:
public class Samurai
{
    private readonly IWeapon _weapon;

    public Samurai() : this(new DefaultWeaponProvider())
    {
    }

    public Samurai(IWeaponProvider provider)
    {
        _weapon = provider.GetWeapon();
    }
}
I have used the following approach in my C# project. The goal was to achieve dependency injection (for unit/mock testing) whilst not crippling the code for the "normal use case" (i.e. a large number of new()'s cascading through the execution flow).
public sealed class QueueProcessor : IQueueProcessor
{
    private IVbfInventory vbfInventory;
    private IVbfRetryList vbfRetryList;

    public QueueProcessor(IVbfInventory vbfInventory = null, IVbfRetryList vbfRetryList = null)
    {
        this.vbfInventory = vbfInventory ?? new VbfInventory();
        this.vbfRetryList = vbfRetryList ?? new VbfRetryList();
    }
}
This allows DI but also means any consumer doesn't have to worry about what the "default instance flow" should be.

EF4 Repository Pattern problems injecting repository into service .Cannot seem to get it right

I am finding it difficult to test Entity Framework 4. I am using the database-first approach; it's too late now to move to POCO. I needed to deliver pretty quickly, with no time to learn it properly, as usual.
I have implemented the repository pattern with unit of work, but I am finding it difficult to inject a repository into my service layer so that I can test the behaviour of my business-layer service, validation etc. without hitting the DB. I keep running into many little problems:
1. In order to inject the repository into the service (constructor), the calling layer needs to have a reference to the DAL (EF entities). I don't want this.
2. If I have many repositories, e.g. CustomerRepository, EmployeeRepository, then I need to have as many constructors as repositories so that I can inject each repository.
3. Not sure where to go from here. I have not found any example on the net where they inject the repository into the service using EF4. All the examples I have seen mock the repository on its own, which is not good for me.
I need to test my service layer/BizLayer without hitting the database.
The whole thing is just not testable and adds so many dependencies and problems.
Here is a noddy example I have put together:
public class DepartmentServiceLibrary
{
    private readonly IDepartmentRepository _departmentRepository;

    public DepartmentServiceLibrary(IDepartmentRepository departmentRepository)
    {
        _departmentRepository = departmentRepository;
    }

    public List<DepartmentDto> GetDepartments()
    {
        return DeparmentBiz.GetDepartments();
    }

    private DeparmentBL _departmentBiz;
    private DeparmentBL DeparmentBiz
    {
        get
        {
            // cache the instance so a new one is not created on every access
            return _departmentBiz ?? (_departmentBiz = new DeparmentBL(_departmentRepository));
        }
    }
}
// internal class
internal class DeparmentBL
{
    private readonly IDepartmentRepository _departmentRepository;

    public DeparmentBL(IDepartmentRepository departmentRepository)
    {
        _departmentRepository = departmentRepository;
    }

    public List<DepartmentDto> GetDepartments()
    {
        using (var ctx = new AdventureWorksContext())
        {
            var uow = new UnitOfWork(ctx);
            _departmentRepository.UnitOfWork = uow;
            var query = _departmentRepository.GetAll();
            return query.Select(dpt => new DepartmentDto
            {
                DepartmentId = dpt.DepartmentID,
                Name = dpt.Name,
                GroupName = dpt.GroupName
            }).ToList();
        }
    }
}
The following TestMethod requires me to add a reference to the DAL, which defeats the point:
[TestMethod]
public void Should_be_able_to_call_get_departments()
{
    var mock = new Mock<IDepartmentRepository>();
    var expectedResult = new List<Department>(); // dependency on the DAL, as Department is an EF entity generated by EF
    mock.Setup(x => x.GetAll()).Returns(expectedResult);
    var companyService = new MyCompanyBL(mock.Object); // InternalsVisibleTo
    var departments = companyService.GetAll();
    // assert removed for brevity
}
Any suggestions or examples out there that show how to do it?
Thanks
The short answer is - since you're not using POCOs, all your layers will have a reference to your DAL.
Without POCOs, you use code generation, which means EF creates the model classes in the Model.edmx.designer.cs file.
An option (haven't tried this - off the top of my head) is to manually project the EF entities into DTOs.
So your Repository might do this:
public List<OrderDTO> GetOrdersForCustomer(int customerId)
{
    return _ctx.Orders
        .Where(x => x.CustomerId == customerId)
        .ToList()
        .Select(x => new OrderDTO { /* left-to-right copy of properties */ })
        .ToList();
}
The OrderDTO class could be in a separate assembly, which the repository references, as well as your other projects. So the other projects would work off the DTO assembly, and wouldn't require a reference to the Repository.
But here you're projecting into classes everywhere (basically doing POCO, but manually and with more work), copying properties left to right - very painful.
However, that is an option.
Honestly - it does not take long to move to POCOs.
There is a T4 template which will generate the POCOs for you - you could be up and running in a matter of minutes.
And since you're already using dependency injection and repository, you should either bite the bullet and change to POCOs, or keep the reference to the DAL.
Something similar in terms of code can be seen here on GitHub, and a detailed explanation can be found on TechNet.

Handling dependencies with IoC that change within a single function call

We are trying to figure out how to set up dependency injection for situations where service classes can have different dependencies based on how they are used. In our specific case, we have a web app where 95% of the time the connection string is the same for the entire request, but sometimes it can change.
For example, we might have 2 classes with the following dependencies (simplified version - service actually has 4 dependencies):
public LoginService(IUserRepository userRep)
{
}

public UserRepository(IContext dbContext)
{
}
In our IoC container, most of our dependencies are auto-wired except the Context for which I have something like this (not actual code, it's from memory ... this is StructureMap):
x.ForRequestedType<IContext>().Use<Context>()
    .WithCtorArg("connectionString").EqualTo(Session["ConnString"]);
For 95% of our web application, this works perfectly. However, we have some admin-type functions that must operate across thousands of databases (one per client). Basically, we'd want to do this:
public void CreateUserList(IList<string> connStrings)
{
    foreach (var connString in connStrings)
    {
        // first create dependency graph using new connection string
        ????

        // then call service method on new database
        _loginService.GetReportDataForAllUsers();
    }
}
My question is: How do we create that new dependency graph for each time through the loop, while maintaining something that can easily be tested?
To defer the creation of an object until runtime, you can use a factory:
public interface ILoginServiceFactory
{
    ILoginService CreateLoginService(string connectionString);
}
Usage:
public void CreateUserList(IList<string> connStrings)
{
    foreach (var connString in connStrings)
    {
        var loginService = _loginServiceFactory.CreateLoginService(connString);
        loginService.GetReportDataForAllUsers();
    }
}
Within the loop, do:
container.With("connectionString").EqualTo(connString).GetInstance<ILoginService>()
where "connectionString" is the name of a string constructor parameter on the concrete implementation of ILoginService.
So most UserRepository methods use a single connection string obtained from session, but several methods need to operate against a list of connection strings?
You can solve this problem by promoting the connection string dependency from IContext to the repository and adding two additional dependencies - a context factory and a list of all the possible connection strings the repository might need to do its work:
public UserRepository(IContextFactory contextFactory,
                      string defaultConnectionString,
                      List<string> allConnectionStrings)
Then each of its methods can build as many IContext instances as they need:
// In UserRepository
public void CreateUserList() {
    foreach (string connString in allConnectionStrings) {
        IContext context = contextFactory.CreateInstance(connString);
        // Build the rest of the dependency graph, etc.
        _loginService.GetReportDataForAllUsers();
    }
}

public void LoginUser() {
    IContext context = contextFactory.CreateInstance(defaultConnectionString);
    // Build the rest of the dependency graph, etc.
}
We ended up just creating a concrete context and injecting that, then creating a wrapper class that changed the context's connection string. Seemed to work fine.
