This is a general question about design. What's the best way to communicate between your business layer and your presentation layer? We currently have an object that gets passed into our business layer; the services read info from the object and set results into it. When the services are finished, we have an object populated with results from the business layer, and the UI can display according to those results.
Is this the best approach? What other approaches are out there?
The Domain-Driven Design books (the Quickly version is freely available here) can give you insight into this.
In a nutshell, they suggest the following approach: model objects traverse from the model tier to the view tier seamlessly (this can be tricky if you are using statically typed languages or different languages on client and server, but it is trivial in dynamic ones). Also, services should only be used to perform actions that do not belong to the model objects themselves (or when you have an action that involves many model objects).
Also, business logic should live in the model tier (entities, services, value objects), in order to prevent the famous anemic domain model anti-pattern.
This is another approach. Whether it suits you depends a lot on the team, how much code has already been written, how much test coverage you have, how long the project will run, whether your team is agile, and so on. Domain Driven Design Quickly discusses this even further, and any decision would be far less risky if you at least skim it first (getting the original book by Eric Evans will help if you choose to delve further).
We use a listener pattern, and have events in the business layer send information to the presentation layer.
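For illustration, a minimal C# sketch of that listener approach (all type, event, and member names here are my own assumptions, not from the original post): the business layer raises an event, and the presentation layer subscribes and renders the result.

```csharp
using System;

// Hypothetical event payload carrying results from the business layer.
public class OrderProcessedEventArgs : EventArgs
{
    public string OrderId { get; }
    public bool Succeeded { get; }

    public OrderProcessedEventArgs(string orderId, bool succeeded)
    {
        OrderId = orderId;
        Succeeded = succeeded;
    }
}

// Business layer: knows nothing about the UI; it just raises events.
public class OrderService
{
    public event EventHandler<OrderProcessedEventArgs> OrderProcessed;

    public void Process(string orderId)
    {
        bool succeeded = true; // ...real business logic would run here...
        OrderProcessed?.Invoke(this, new OrderProcessedEventArgs(orderId, succeeded));
    }
}

// Presentation layer: listens and displays according to the result.
public class OrderView
{
    public OrderView(OrderService service)
    {
        service.OrderProcessed += (sender, e) =>
            Console.WriteLine($"Order {e.OrderId}: {(e.Succeeded ? "processed" : "failed")}");
    }
}
```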
It depends on your architecture.
Some people structure their code all in the same exe or dll and follow a standard n-tier architecture.
Others might split it out so that their services are all web services instead of just standard classes. The benefit of this is reusable business logic installed in one place within your physical infrastructure, so a single change applies across all applications.
Software as a service and cloud computing are becoming the platforms things are moving towards. Amazon's Elastic Compute Cloud, Microsoft's Azure, and other cloud providers all offer numerous services that may affect your architecture decisions.
One stack I'm about to use is:
Silverlight UI
WCF Services - business logic here
NHibernate data access
SQL Server database
We're only going to allow the layers of the application to talk via interfaces, so that we can move up to Azure cloud services once they become more mature.
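As a rough sketch of what "talk via interfaces" means here (the contract and entity names below are made up for illustration), each layer compiles against a contract rather than a concrete implementation, so the WCF and NHibernate pieces can later be swapped for Azure-hosted equivalents:

```csharp
// Hypothetical entity shared across the layers.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Contract the Silverlight UI talks to; implemented by the WCF service
// today, potentially by an Azure-hosted service later.
public interface ICustomerService
{
    Customer GetCustomer(int id);
    void SaveCustomer(Customer customer);
}

// Contract the business layer uses for persistence; implemented with
// NHibernate today, replaceable without touching the layers above.
public interface ICustomerRepository
{
    Customer GetById(int id);
    void Save(Customer customer);
}
```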
In an ASP.NET MVC world, could controllers act as the application layer, calling into my domain objects and services to get the work done (assuming the controllers strictly call the domain and do nothing more)? In the particular case I am dealing with, there is very minimal application flow logic to model, so I am thinking about doing away with the application layer and calling the domain directly from within the controller.
Is this a fair approach?
I guess what you mean here is that your business logic is implemented using the Domain Model pattern. In that case, your application layer should be very simple, and by definition it shouldn't host any business logic. All business logic should reside in the domain layer; methods in your application layer should resemble the following steps:
Load instance of an aggregate
Execute action
Persist the updated aggregate
Return response to the user
If that's all you do in your application layer, I see no reason not to put it in the controller.
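For illustration, a minimal sketch of such a "thin" controller action (the aggregate, repository, and action names are assumptions, not from the question):

```csharp
using System.Web.Mvc;

// Hypothetical aggregate; the business rule lives on the domain object.
public class Order
{
    public int Id { get; set; }
    public bool Shipped { get; private set; }
    public void Ship() { Shipped = true; }
}

// Hypothetical repository abstraction over persistence.
public interface IOrderRepository
{
    Order GetById(int id);
    void Save(Order order);
}

public class OrdersController : Controller
{
    private readonly IOrderRepository _orders;

    public OrdersController(IOrderRepository orders)
    {
        _orders = orders;
    }

    [HttpPost]
    public ActionResult Ship(int id)
    {
        var order = _orders.GetById(id); // 1. Load the aggregate
        order.Ship();                    // 2. Execute the action
        _orders.Save(order);             // 3. Persist the updated aggregate
        return RedirectToAction("Details", new { id }); // 4. Respond to the user
    }
}
```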
If, on the other hand, your domain model is anemic, and you do have business logic in the application layer, then I'd prefer to introduce a separate layer.
Treating your MVC controllers as your DDD application layer presents a few drawbacks.
The biggest one is that your application layer (in DDD terms) is one of the key places where you model your domain/business processes. For example, an application service might have a method such as:
PayrollService.CalculatePayslips();
This might involve checking a domain service or external system for the currently active employees; it might then need to query another domain service or external system for absences, pension deductions, and so on. It will then need to call a domain service (or entity) to do the calculations.
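A sketch of what that application service might look like (every interface and type name here is an assumption based on the description above, not an actual API):

```csharp
using System.Collections.Generic;

// Hypothetical domain types and collaborators.
public class Employee { public int Id { get; set; } }
public class Absence { }
public class Payslip { }

public interface IEmployeeDirectory { IEnumerable<Employee> GetActiveEmployees(); }
public interface IAbsenceService { IEnumerable<Absence> GetAbsences(int employeeId); }
public interface IPayslipCalculator { Payslip Calculate(Employee employee, IEnumerable<Absence> absences); }

// The application service coordinates the process; the actual
// calculation stays in the domain service (IPayslipCalculator).
public class PayrollService
{
    private readonly IEmployeeDirectory _employees;
    private readonly IAbsenceService _absences;
    private readonly IPayslipCalculator _calculator;

    public PayrollService(IEmployeeDirectory employees,
                          IAbsenceService absences,
                          IPayslipCalculator calculator)
    {
        _employees = employees;
        _absences = absences;
        _calculator = calculator;
    }

    public IList<Payslip> CalculatePayslips()
    {
        var payslips = new List<Payslip>();
        foreach (var employee in _employees.GetActiveEmployees())
        {
            var absences = _absences.GetAbsences(employee.Id);
            payslips.Add(_calculator.Calculate(employee, absences));
        }
        return payslips;
    }
}
```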
Capturing domain/business logic like this in the MVC controllers would cause an architectural issue should you want to trigger the task of calculating payslips from elsewhere in the system. Also, if you ever wanted to expose this process as a web service, or even from another MVC controller, you would have references going across or up into your presentation layer. Solutions like these "might work", but they will contribute towards your application becoming Spaghetti Code or a Big Ball of Mud (both code smells in an architectural sense). You risk circular references and highly coupled code, which increases maintenance costs (time, money, developer sanity, etc.) in the future.
At the end of the day, software development is a game of trade-offs. Architectural concerns become bigger issues the bigger your application grows. For a lot of very small CRUD apps, architectural concerns are probably negligible. One of the key goals of DDD is to tackle complexity, so it could be argued that most DDD projects should be of sufficient complexity to be concerned about good enterprise architecture principles.
I would not do it.
The effort of creating a separate class for the application service and then invoking it from the controller is minimal. The cost is also only incurred while you build the application.
However, the price you have to pay once you start to maintain the application is much higher due to the high coupling between the business layer and the presentation layer.
The cost of testing, reuse, refactoring, etc. is much lower when you make sure to separate the different concerns in an application.
I have to integrate various legacy applications, which are silos of information built at different times with varying architectures, with some newly introduced parts. At times these applications may need to get data from other systems, if it exists, and display it to the user within their own screens, based on business needs.
I was looking to see if it's possible to implement a generic federation engine that abstracts the aggregation of data from various OData endpoints and provides a single version of the truth.
A simplistic example could be as below.
I am not really looking to do an ETL here, as that may introduce data-related side effects in terms of staleness, etc.
Can someone share some ideas as to how this can be achieved, or point me to any article on the net that demonstrates such a concept?
Regards
Kiran
Officially, the answer is to use either the reflection provider or a custom provider.
Support for multiple data sources (OData)
Allow me to expose entities from multiple sources
To decide between the two approaches, take a look at this article.
If you decide that you need to build a custom provider, the referenced article also contains links to a series of other articles that will help you through the learning process.
Your project seems non-trivial, so I additionally recommend looking at other resources, like the WCF Data Services Toolkit, to help you along.
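To give a feel for the reflection provider route (the context, entities, and placeholder data below are mine for illustration): you expose a class whose IQueryable properties aggregate your sources, and WCF Data Services turns it into a single OData endpoint:

```csharp
using System.Data.Services;
using System.Data.Services.Common;
using System.Linq;

// Hypothetical entities; DataServiceKey tells the reflection provider
// which property is the key.
[DataServiceKey("Id")]
public class Customer { public int Id { get; set; } public string Name { get; set; } }

[DataServiceKey("Id")]
public class Invoice { public int Id { get; set; } public decimal Amount { get; set; } }

// Each IQueryable property becomes an OData entity set. In a real
// federation engine these would be backed by queries against the legacy
// OData endpoints rather than in-memory placeholder data.
public class FederatedContext
{
    public IQueryable<Customer> Customers
    {
        get { return new[] { new Customer { Id = 1, Name = "Acme" } }.AsQueryable(); }
    }

    public IQueryable<Invoice> Invoices
    {
        get { return new[] { new Invoice { Id = 1, Amount = 99.50m } }.AsQueryable(); }
    }
}

public class FederationService : DataService<FederatedContext>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }
}
```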
By the way, from an architecture standpoint, I believe your idea is sound. Yes, you may have some domain logic behind OData endpoints, but I've always believed this logic should be thin, as OData is primarily used as part of data access layers, much like SQL (as opposed to service layers, which encapsulate more behavior in the traditional sense). Even if that thin logic requires your aggregator to get a little smart, it's likely that you'll always be able to get away with it using a custom provider.
That being said, if the aggregator itself encapsulates a lot of behavior (as opposed to simply aggregating and re-exposing raw data), you should consider using another protocol that is less data-oriented (but keep using the OData backends in that service). Since domain logic is normally highly specific, there's very rarely a one-size-fits-all protocol, so you'd naturally have to design it yourself.
However, if the aggregated data is exposed mostly as-is or with essentially structural changes (little to no behavior besides assembling the raw data), I think using OData again for that central component is very appropriate.
Obviously, and as you can see in the comments to your question, not everybody would agree with all of this -- so as always, take it with a grain of salt.
I am developing an application using MVC3 and Entity Framework. It's a three-layered approach, with the presentation layer hosted on a web server and the business layer and data access layer on an application server. We are not exposing the ObjectContext to the presentation layer or the business layer; the ObjectContext is wrapped in the data access layer only, which exposes data access and persistence through its methods (i.e. data access logic is separated out and implemented in the data access layer only). The business layer calls data access layer methods and returns data to the presentation layer.
My concern is that most of the business layer methods exist just to access data, simply forwarding the call to the data access layer without any additional operation, so we are repeating code in both layers. Is there a better approach to avoid this duplication?
Is it good practice to implement the data access logic in the business layer in a layered approach?
Can someone suggest a good implementation approach for a layered application with a 3-tier architecture?
If you find that all your business layer is doing is simply delegating calls to the data access layer, then this is a strong indication that the business layer is probably not necessary for your application, as it brings no additional value. You can get rid of it and have the controllers talk directly to the data access layer (via an abstraction, of course; see the sketch below) - that's what most of the tutorials out there show with the EF data context.
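For example (all names here are hypothetical), the controller can depend on a small abstraction over the EF context instead of on a pass-through business layer. This sketch uses the DbContext API:

```csharp
using System.Data.Entity;
using System.Linq;
using System.Web.Mvc;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Hypothetical EF context, kept inside the data access layer.
public class ShopContext : DbContext
{
    public DbSet<Product> Products { get; set; }
}

// The abstraction the controller depends on.
public interface IProductStore
{
    IQueryable<Product> Products { get; }
}

public class EfProductStore : IProductStore
{
    private readonly ShopContext _context = new ShopContext();
    public IQueryable<Product> Products { get { return _context.Products; } }
}

public class ProductsController : Controller
{
    private readonly IProductStore _store;

    public ProductsController(IProductStore store)
    {
        _store = store;
    }

    public ActionResult Index()
    {
        // No business layer in the middle: it added no value here.
        return View(_store.Products.ToList());
    }
}
```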
In simple applications consisting mainly of CRUD actions, a business layer might not be necessary. As your application grows in complexity you might decide to introduce one later, but don't build too many abstractions from the beginning, especially if those abstractions don't bring you any additional value.
As previously mentioned, there is no "right" way to set things up. I have found a few things over the years that help me decide which approach to take.
Two-Tiered
If your shop is stored-procedure heavy, with a lot of database programmers, and tends to put business logic in the database, then go with a two-tiered approach. In this situation you will find that your business layer is generally just calling the data layer. Also, if your code base and pages tend to be small and you never repeat functionality, go with a two-tiered approach. You will save a lot of time.
Three-Tiered
If your shop likes to have a lot of business logic in code and that logic is repeated everywhere, then go with three tiers. Even if you notice a lot of your business layer just calling the data layer, you will find over time, as you add functionality, that it is a whole lot easier to just add code to the business layer. Also, if you have a large code base with a ton of pages and a lot of repeated logic, then I would definitely go with this approach.
I have found a mixture of both in the enterprise-level application at my work. The problematic areas are the ones where dynamic SQL (gag) was used and business logic was repeated over and over. I'm finding that as I refactor, pulling this code into a three-tiered architecture saves me a ton of time in the future, because I can just reuse it.
I don't think it's necessarily bad to duplicate the code for logical separation. A time will come when this pays off. Say you replace SQL Server with Oracle, or Microsoft comes up with Linq 2.0 and some implementations change. You will be thanking yourself for keeping it separate, while those who called the database right from the business layer will have to make modifications in both places - the DAL and the BLL.
For what it's worth. But again, there's no right answer; it comes down to utility, usability, convenience, and most importantly, matching the purpose.
Hope this is of help.
I will be designing a couple of web applications shortly. They will probably be done in ASP.NET MVC.
In my existing web apps, done in Delphi, the data access layer is separated out into a completely separate application, sometimes running on a different server. This is done more for code reuse than for architectural reasons. This won't be a factor in the next app, as it will be all new.
Is having a separate data access application overkill in an MVC app? I will already be separating out the business classes by virtue of using MVC, and I will be using an ORM for DB persistence.
Edit: Just to clarify, I use the term tier to refer to separate physical applications - something more than just a logical separation or layer.
The term "Tier" in my experience is generally referring to physical application seperations e.g. Client Tier & Server Tier.
MVC refers to three "layers", the concern being separation around the three concerns it details: Model (data), View (UI), and Controller (app logic).
Now that I have made that distinction regarding my terminology...
Is having a separate data access application overkill in a mvc app?
I would say no (again, depending on what you mean by application): it is not overkill, as it may in fact result in a more maintainable system. Your ORM will possibly allow new data access options to be plugged in, but what if you wish to add a new ORM? Having a clearly separated data access layer (DAL) allows much greater future flexibility in this aspect of your application.
On the other hand, depending on the scale and vision of the application, creating a completely stand-alone data access application may be overkill; but in a nutshell, separating the DAL into different assemblies is very much a recommended practice in the composition of an application implementing the MVC pattern.
Hope this helps. If you need more depth, do comment.
Well, I guess it depends a little on whether you're talking about tiers (physical) or layers (logical/projects).
Regarding the layering - you could take a look at something like S#arp Architecture (code.google.com/p/sharp-architecture/) for an example of how they do it (they have taken a pretty maximal approach to layering).
For an example of more minimalistic views here, take a look at Ayende's blog: ayende.com/Blog/
Regarding tiers - I think needlessly adding extra tiers and putting everything over the wire will just hurt your performance, unless you need to do this for capacity reasons. Get the layers right, and then separate them into tiers as you need to adjust for capacity (it should not take too much refactoring if you've separated your concerns well).
Great comment Tobias.
I say add enough layers so that it makes sense to you and makes the application easier to maintain, and to keep a separation of concerns.
Does anyone have advice or tips on using a web service as the model in an ASP.NET MVC application? I haven't seen anyone writing about doing this. I'd like to build an MVC app, but not tie it to a specific database, nor limit the database to the single MVC app. I feel a web service (RESTful, most likely ADO.NET Data Services) is the way to go.
How likely, or useful, is it for your MVC app to be decoupled from your database? How often have you seen, in an application's lifetime, a change from SQL Server to Oracle? In the last 10 years of projects I've delivered, it's never happened.
Architectures are like onions; they have layers of abstraction above the things they depend on. And if you're going to use an RDBMS for storage, that's at the core of your architecture. Abstracting yourself from the DB so you can swap it out is very much a fallacy.
Now, you can decouple your database access from your domain, and the repository pattern is one of the ways to do that (see the sketch below). Most mature solutions use an ORM these days, so you may want to have a look at NHibernate if you want a mature technology, or ActiveRecord / LINQ to SQL for a simpler active record pattern on top of your data.
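A minimal sketch of that repository idea (the entity and repository names are illustrative; ISession is NHibernate's actual unit-of-work interface):

```csharp
using System.Collections.Generic;
using NHibernate;

public class Customer
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

// The domain depends on this contract and never sees the ORM.
public interface ICustomerRepository
{
    Customer GetById(int id);
    IList<Customer> FindByName(string name);
    void Save(Customer customer);
}

// NHibernate-backed implementation, kept behind the interface.
public class NHibernateCustomerRepository : ICustomerRepository
{
    private readonly ISession _session;

    public NHibernateCustomerRepository(ISession session)
    {
        _session = session;
    }

    public Customer GetById(int id)
    {
        return _session.Get<Customer>(id);
    }

    public IList<Customer> FindByName(string name)
    {
        return _session.QueryOver<Customer>()
                       .Where(c => c.Name == name)
                       .List();
    }

    public void Save(Customer customer)
    {
        _session.SaveOrUpdate(customer);
    }
}
```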
Now that you have your data strategy in place, you have a domain of some sort. When you expose data to your client, you can choose to do so through an MVC pattern, where you'll usually send DTOs generated from your domain for rendering, or you can decide to leverage an architectural style like REST to provide more loosely coupled systems, by providing links and custom representations.
You go from tight coupling to looser coupling as you go towards the external layers of your solution.
If your question, however, was about building an MVC app on top of a REST architecture or web services and using that as a model... why bother? If you're going to have a domain model, why not reuse it in your system and your services where it makes sense?
Generating a UI from an MVC app and generating the documents needed for a RESTful architecture are two completely different contexts; basing one on top of the other is just going to cause much more pain than needed. And you're sacrificing performance.
It depends on your exact scenario, but a remote XML-based service as the model in MVC is, from experience, not a good idea. It's probably over-engineering, and it disregards the need for a domain to start with.
Edit 2010-11-27; clarified my thoughts, which was really needed.
Most often, a web service exposes functionality across different types of applications, not as an abstraction within one single application. You are probably thinking more of a way of encapsulating commands and reads that doesn't interfere with your controller/view programming.
Use a service from a service bus if you're after the decoupling, and use an async pattern in your async pages. You can look at Rhino.ServiceBus, NServiceBus and MassTransit for .NET native implementations, and RabbitMQ for something different: http://blogs.digitar.com/jjww/2009/01/rabbits-and-warrens/.
Edit: I've had some time to try Rabbit out, in a way that pushed messages to my service, which in turn pushed updates to the bookkeeping app. RabbitMQ is a message broker, a.k.a. a MOM (message-oriented middleware), and you could use it to send messages to your application server.
You can also simply provide service interfaces. Read Eric Evans' Domain Driven Design for a more detailed description.
RESTful service interfaces deal a lot with data, and more specifically with addressable resources. They can greatly simplify your programming model and allow great control over output through the HTTP protocol. WCF's upcoming programming model uses true REST as defined in the original thesis, where each document should to some extent provide URIs for continued navigation. Have a look at this.
(In my first version of this post, I lamented REST for being "slow", whatever that means.) REST-based APIs are also pretty much what CouchDB and Riak use.
ADO.NET is rather crap (!) in comparison to, for example, LightSpeed (commercial) or NHibernate: N+1 problems with lazy collections because of coding to implementations, data-access leakage (you always need your DB context where your query code is), etc. Spring.NET also allows you to wrap service interfaces in their container with a web service facade, but (not having browsed it for a while) I think it's a bit too XML-y in its configuration.
Edit 1: By ADO.NET here I mean the default "best practice" with DataSets, DataAdapters and iterating over lots of rows from a DataReader; it breeds rather ugly and hard-to-debug code. The N+1 stuff, yes, that is about the Entity Framework.
(Edit 2: Entity Framework doesn't impress me either!)
Edit 1: Create your domain layer in a separate assembly [a.k.a. Core] and provide all domain and application services there, then reference this assembly from your specific MVC application. Wrap data access in a DAO/repository, behind an interface in your Core assembly, which your Data assembly then references and implements. Wire up interface and implementation with IoC. You can even program something for dynamic service discovery with the above-mentioned service buses, to resolve the interfaces. WCF uses interfaces like this, and so do most of the above service buses; you can provide a sub-component resolver in your IoC container to do this automatically.
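A sketch of that layout, using Castle Windsor as an example container (the assembly split and type names are my assumptions):

```csharp
using Castle.MicroKernel.Registration;
using Castle.Windsor;

// --- Core assembly: domain model and contracts, no data access references ---
public class Order { public int Id { get; set; } }

public interface IOrderRepository
{
    Order GetById(int id);
}

// --- Data assembly: references Core and implements its contracts ---
public class NHibernateOrderRepository : IOrderRepository
{
    public Order GetById(int id)
    {
        // NHibernate session work would go here.
        return new Order { Id = id };
    }
}

// --- MVC application: wires interfaces to implementations at startup ---
public static class Bootstrapper
{
    public static IWindsorContainer Configure()
    {
        var container = new WindsorContainer();
        container.Register(
            Component.For<IOrderRepository>()
                     .ImplementedBy<NHibernateOrderRepository>());
        return container;
    }
}
```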
Edit 2:
A great combo for the above would be CQRS + event sourcing + Reactive Extensions. Your write model would take commands, your domain model would decide whether to accept them, and it would push events to the Reactive Extensions pipeline, perhaps also over RabbitMQ, which your read model would consume.
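A toy sketch of that pipeline, using Rx's Subject as the event stream (the command, event, and projection types are all made up; persistence of the event log is omitted):

```csharp
using System;
using System.Collections.Generic;
using System.Reactive.Subjects;

public class ShipOrderCommand { public int OrderId { get; set; } }
public class OrderShippedEvent { public int OrderId { get; set; } }

// Write model: accepts commands, decides, and publishes events.
public class OrderWriteModel
{
    private readonly Subject<OrderShippedEvent> _events = new Subject<OrderShippedEvent>();
    public IObservable<OrderShippedEvent> Events { get { return _events; } }

    public void Handle(ShipOrderCommand command)
    {
        // The domain model would decide here whether to accept the command.
        _events.OnNext(new OrderShippedEvent { OrderId = command.OrderId });
    }
}

// Read model: consumes the event stream and maintains a projection.
public class ShippedOrdersProjection
{
    private readonly HashSet<int> _shipped = new HashSet<int>();

    public ShippedOrdersProjection(IObservable<OrderShippedEvent> events)
    {
        events.Subscribe(e => _shipped.Add(e.OrderId));
    }

    public bool IsShipped(int orderId) { return _shipped.Contains(orderId); }
}
```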
Update 2010-01-02 (edit 1)
The gist of my idea has been codified by something called MindTouch Dream. They have made a screencast in which they treat almost all parts of a web application as a (web) service, which is also exposed with REST.
They have created a highly parallel framework using co-routines to handle this, including their own elastic thread pool.
To all the naysayers in this question: in ur face :p! Listen to this screencast, especially at 12 minutes.
The actual framework is here.
If you are into this sort of programming, have a look at how monads work and their implementations in C#. You can also read up on coroutines.
Happy new year!
Update 2010-11-27 (edit 2)
It turned out coroutines got productized with the Task Parallel Library from Microsoft. Your Task now implements the same features, as it implements IAsyncResult. Caliburn is a cool framework that uses them.
Reactive Extensions took the monad comprehensions to the next level of asynchronicity.
The ALT.NET world seems to be moving in the direction I talked about when I first wrote this answer, albeit with new types of architectures I knew little of.
You should define your models in a data-access-agnostic way, e.g. using the repository pattern. Then you can create concrete implementations backed by specific data access technologies (web service, SQL, etc.), as sketched below.
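For instance (all names hypothetical), one repository contract could have both a SQL-backed and a web-service-backed implementation, and the MVC app would only ever see the interface:

```csharp
using System.Net;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Data-access-agnostic contract used by the MVC app.
public interface IProductRepository
{
    Product GetById(int id);
}

// Implementation backed by a (hypothetical) RESTful web service.
public class WebServiceProductRepository : IProductRepository
{
    private readonly string _baseUrl;

    public WebServiceProductRepository(string baseUrl)
    {
        _baseUrl = baseUrl;
    }

    public Product GetById(int id)
    {
        using (var client = new WebClient())
        {
            string json = client.DownloadString(_baseUrl + "/products/" + id);
            return ParseProduct(json);
        }
    }

    private static Product ParseProduct(string json)
    {
        // Deserialize with your JSON library of choice; stubbed for brevity.
        return new Product();
    }
}

// A SQL-backed implementation (e.g. via an ORM) would implement
// the same IProductRepository interface.
```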
It really depends on the size of this MVC project. I would say keep the UI and domain in the same running environment if the website is going to be used by a small number of users (< 5,000).
On the other side, if you are planning a site that is going to be accessed by millions, you have to think distributed, and that means you need to build your website in a way that can scale up/out. That means you might need extra servers (web, application and database).
For this to work nicely, you need to decouple your MVC UI site from the application. The application layer would usually contain your domain model and might be exposed through WCF or a service bus. I would prefer a service bus because it is more reliable and can use persistent queues like MSMQ.
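A skeletal example of exposing such an application layer over WCF (the contract, operation, and DTO names are assumptions for illustration):

```csharp
using System.Runtime.Serialization;
using System.ServiceModel;

// Contract for the application layer, hosted on the application servers;
// the MVC web servers call this instead of referencing the domain directly.
[ServiceContract]
public interface IOrderApplicationService
{
    [OperationContract]
    OrderDto GetOrder(int id);

    [OperationContract]
    void ShipOrder(int id);
}

// DTO crossing the wire between the tiers.
[DataContract]
public class OrderDto
{
    [DataMember]
    public int Id { get; set; }

    [DataMember]
    public bool Shipped { get; set; }
}
```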
I hope this helps