I have a requirement to design a web application that acts as a facade for reports based on a specific schema. However, I have been given free rein to design and develop the application with any methodology and technology; regardless of that, I have chosen to use RDLC (client-side) reports.
My question is: in this case, is it better to use classic ADO.NET as my data access layer (DataSets, DataTables, etc.) with stored procedures, or is it better to use an ORM (let's say NHibernate) with an ObjectDataSource in the UI?
The major consideration is that I'm expecting very significant changes to the reports and/or the schema itself.
To answer my own question: I ended up using a mixed solution (ORM + hand-coded stored procedures). I used LINQ to SQL as my ORM while developing, and it calls the stored procedures.
The main reason for this mix was to follow the client's development standards, which are based on stored procedures; LINQ to SQL ended up serving only as a helper for calling those procedures (I was effectively forced to use it that way).
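For illustration, here is a minimal sketch of what that "helper" role looks like: a LINQ to SQL DataContext whose only job is to surface a stored procedure as a strongly typed method. The procedure name, the ReportRow shape and the parameter are all hypothetical, and the method body mirrors what the designer/SqlMetal normally generates.

```csharp
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Reflection;

// Shape of the stored-procedure result; columns are mapped by name.
public class ReportRow
{
    [Column] public int Id { get; set; }
    [Column] public string Title { get; set; }
}

public class ReportingDataContext : DataContext
{
    public ReportingDataContext(string connectionString) : base(connectionString) { }

    // One wrapper method per procedure; this is the pattern the designer emits.
    [Function(Name = "dbo.GetReportData")]
    public ISingleResult<ReportRow> GetReportData([Parameter(DbType = "Int")] int? reportId)
    {
        IExecuteResult result = ExecuteMethodCall(
            this, (MethodInfo)MethodInfo.GetCurrentMethod(), reportId);
        return (ISingleResult<ReportRow>)result.ReturnValue;
    }
}
```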
Related
We are starting a new project. For compatibility with all devices and browsers we have decided to use ASP.NET MVC 4, HTML5 and CSS 3, and we want to use Entity Framework for communicating with the database.
The senior members of our team (the manager and the DBA - they are also new to MVC 4 and EF) are asking us to put everything that touches the database into stored procedures so that maintenance becomes easy.
Is MVC 4 + EF + stored procedures the right combination if we go that way? Will I lose maintainability and performance if I go with Code First reverse engineering instead (the database tables already exist, so that is what I would like to do)? Please reply.
Below is the flow we plan to follow; please correct me:
Since the database already exists, we will first write the stored procedures for communicating with it.
Create a new MVC 4 project, add an .edmx file (EF), and select the tables and stored procedures.
In the MVC controllers or Web API we write the code that consumes the stored procedures (a sketch of this step follows).
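To make that last step concrete, here is a hedged sketch of consuming an imported procedure from a controller. MyDbEntities and GetCustomersByRegion stand in for whatever names the .edmx designer generates for your context and function import.

```csharp
using System.Linq;
using System.Web.Mvc;

public class CustomersController : Controller
{
    public ActionResult ByRegion(string region)
    {
        // The function import added through the .edmx shows up as a
        // strongly typed method on the generated context.
        using (var db = new MyDbEntities())
        {
            var customers = db.GetCustomersByRegion(region).ToList();
            return View(customers);
        }
    }
}
```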
At first sight there is nothing technically wrong with the ASP.NET MVC + EF + stored procedures approach.
But my experience shows that it is typically huge overkill. The common problem I see is conflicting interests between developers and DBAs. In the worst scenarios everything DB-related is controlled by the DBA, so if a developer wants to add or change a feature he has to wait for the DBA to implement it (or to approve it, which can also take a long time).
So I personally see that as a more bureaucratic way of developing.
My own perspective is to be more agile in development, and tools like Code First match that. Stored procedures can still play a major role when you optimize code or performance, but they are not something to start with.
I agree that using stored procedures in the database is a good approach. Centralizing data validation and calculations in the database ensures data integrity. Client-side validation is important for the user experience but you must also ensure that you test the data validity in the database.
Using Entity Framework, you can generate entities which relate directly to tables in your database, or else you can design entities which use procedures for insert/update/delete operations rather than simple table updates.
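As a hedged sketch of the second option (assuming EF 6 Code First; the entity and procedure names are made up), mapping an entity's insert/update/delete to stored procedures looks roughly like this:

```csharp
using System.Data.Entity;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Route Customer inserts/updates/deletes through procedures
        // instead of EF-generated SQL.
        modelBuilder.Entity<Customer>()
            .MapToStoredProcedures(s => s
                .Insert(i => i.HasName("usp_InsertCustomer"))
                .Update(u => u.HasName("usp_UpdateCustomer"))
                .Delete(d => d.HasName("usp_DeleteCustomer")));
    }
}
```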
In MVC you will use the entities as models to manage your data interactions.
Good luck
This is my personal view; I am sure others will have different ones. Since you are asking this question I am hoping you are open to discussion - otherwise I wouldn't have bothered, as this topic is almost a religious debate: lots of people have very strong opinions and are not likely to change them.
Personally, I don't think stored procedures are meant for writing business logic. They should be used for data access logic only. I would only use a stored procedure to optimize an expensive query, such as a dynamic search, but nothing else. You will get slightly less performance with your logic in the domain model, but in most situations it's not even noticeable.
One of the strong arguments for writing business logic in stored procedures is that you can easily change some logic by changing your stored procedure. But should we really go and change the business logic of a deployed application without proper testing? What will happen if you accidentally make a mistake? Doing a deployment is not such a big deal now with continuous builds, and I don't think a professional developer should take that risk.
When you decide to write your logic in stored procedures, you give up all the object-oriented concepts and end up writing the kind of procedural code we wrote maybe 10 years ago. The C# language has come a long way, and you will not be able to use those new language features in the heart of your application, which is the business logic. You also lose the Visual Studio features for refactoring code, its advanced and easy debugging, etc.
I also don't like the idea of having triggers, as they are not visible in the source code. Imagine someone new on your team trying to add a feature some time later: if he doesn't know a trigger exists, he might write some incorrect logic.
If your application contains some complex business logic (I am sure most applications do), you should have a domain model that contains not just the properties of your entities but also your logic. Otherwise you will fall into the anti-pattern called the anemic domain model.
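To illustrate the difference, here is a tiny, hypothetical entity that carries its own rules instead of being an anemic bag of properties:

```csharp
using System;

public enum OrderStatus { Open, Shipped, Cancelled }

// Behaviour lives on the entity itself, not in a procedure.
public class Order
{
    public int Id { get; set; }
    public decimal Total { get; private set; }
    public OrderStatus Status { get; private set; }

    public void AddLine(decimal unitPrice, int quantity)
    {
        if (quantity <= 0)
            throw new ArgumentOutOfRangeException(nameof(quantity));
        Total += unitPrice * quantity;
    }

    public void Cancel()
    {
        if (Status == OrderStatus.Shipped)
            throw new InvalidOperationException("Shipped orders cannot be cancelled.");
        Status = OrderStatus.Cancelled;
    }
}
```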
You will not be able to test your business logic with unit tests if it lives in stored procedures.
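By contrast, logic that lives in the domain model can be unit tested directly, with no database involved; for example, against the hypothetical Order above (xUnit syntax):

```csharp
using System;
using Xunit;

public class OrderTests
{
    [Fact]
    public void Rejects_non_positive_quantities()
    {
        var order = new Order();

        // The rule is enforced by the entity itself, so the test needs no DB.
        Assert.Throws<ArgumentOutOfRangeException>(() => order.AddLine(10m, 0));
    }
}
```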
You will also not be able to scale your business logic out across multiple servers if it is in stored procedures and your site becomes really successful.
Nor will you be using the powerful capabilities of Entity Framework and LINQ if all your logic is in stored procedures. You actually don't need an ORM at all if that is the approach you are going to take.
This is what I would recommend for your project.
Even though you already have the database, you can still use the Code First approach of Entity Framework. You can download the EF Code First reverse-engineer power tool and have the Code First code generated for you. This is a one-off step; after that, if you have more changes, you can make them directly in the database and update the Code First code accordingly. The Fluent API is a bit confusing at first, but you can easily learn it from the generated code (an example of which is sketched below).
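For reference, the generated mapping code looks roughly like this hedged sketch (hypothetical Product table, EF 6 style EntityTypeConfiguration):

```csharp
using System.Data.Entity.ModelConfiguration;

public class Product
{
    public int ProductId { get; set; }
    public string Name { get; set; }
}

// The reverse-engineer tool emits one mapping class like this per table;
// the Fluent API calls read fairly literally next to the table they describe.
public class ProductMap : EntityTypeConfiguration<Product>
{
    public ProductMap()
    {
        ToTable("Product", "dbo");
        HasKey(p => p.ProductId);
        Property(p => p.Name).HasColumnName("Name").HasMaxLength(100).IsRequired();
    }
}
```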
Do not access your data context from the controller. Have a repository layer that contains all your data access logic, and access the repository from your controller. (This allows you to unit test your code by mocking the repository.) There are lots of video tutorials on how to use the repository pattern on the ASP.NET site.
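A minimal sketch of that shape, reusing the hypothetical Product entity from the mapping sketch above (StoreContext stands for whatever DbContext you end up with):

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Web.Mvc;

public interface IProductRepository
{
    Product Find(int id);
    IEnumerable<Product> All();
}

// EF-backed implementation; StoreContext is a hypothetical DbContext
// exposing DbSet<Product> Products.
public class EfProductRepository : IProductRepository
{
    private readonly StoreContext _db = new StoreContext();

    public Product Find(int id) { return _db.Products.Find(id); }
    public IEnumerable<Product> All() { return _db.Products.ToList(); }
}

// The controller depends only on the interface, so a test can hand it a mock.
public class ProductsController : Controller
{
    private readonly IProductRepository _repository;

    public ProductsController(IProductRepository repository)
    {
        _repository = repository;
    }

    public ActionResult Details(int id)
    {
        return View(_repository.Find(id));
    }
}
```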
Your domain model is going to be the entities generated by Entity Framework. Try to keep your business logic in those models. It takes a little while to get used to the domain model pattern, but once you do, you will start to appreciate its benefits.
Hope this helps.
I'm not a big fan of direct entity mappers, because I still think SQL queries are fastest and best optimized when written by hand, directly on and for the database (using the correct joins, groupings, indexes, etc.).
On my current project I decided to give BLToolkit a try, and I'm very pleased with its wrapper around ADO.NET and its speed: I query the database and get strongly typed C# objects back. I've also written a T4 template that generates stored-procedure helpers, so I don't have to use magic strings when calling stored procedures and all my calls use strongly typed parameters.
Basically all my CRUD calls go through stored procedures, because many of my queries are not simple select statements, and my creates and updates in particular also return results, which a stored procedure handles easily in a single call. Anyway...
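(For anyone who hasn't seen the pattern: stripped of BLToolkit's helpers and written as plain ADO.NET, a create that also returns a result is just one procedure round trip. The procedure and parameter names below are made up.)

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

public static class OrderProcedures
{
    // Insert through a procedure and get the new key back in the same call;
    // the procedure is assumed to end with SELECT SCOPE_IDENTITY().
    public static int InsertOrder(string connectionString, int customerId, decimal total)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.Order_Insert", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@CustomerId", customerId);
            command.Parameters.AddWithValue("@Total", total);

            connection.Open();
            return Convert.ToInt32(command.ExecuteScalar());
        }
    }
}
```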
Downside
The biggest drawback of BLToolkit (I'd like everyone evaluating it to know this) isn't its capabilities or speed but its very scarce documentation and support, or lack thereof. So the biggest problem with this library is the trial and error needed to get it working. That's also why I don't want to use too many different parts of it: the more I use, the more problems I have to solve on my own.
Question
What alternatives do I have to BLToolkit that:
support stored procedures that return whatever entities I provide, which are not necessarily the same as the DB tables
provide a nice object mapper from data readers to objects
support relations (all of them)
optionally (but desirably) support multiple result sets
don't need any special configuration (I only use a data connection string and nothing else)
Basically it should be very lightweight: essentially just a thin ADO.NET wrapper and object mapper.
And the most important requirement: it should be easy to use, well supported and actually used by the community.
Alternatives (May 2011)
I can see that the big guns have converted their access strategies to micro-ORM tools. I was toying with the same idea when I evaluated BLToolkit, because it felt bulky (1.5 MB) for the functionality I'd use. In the end I decided to write the aforementioned T4 (link in question) to make my life easier when calling stored procedures. But there are still many possibilities inside BLToolkit that I don't use at all, or even understand (reasons also pointed out in the question).
The best alternatives are micro-ORM tools. Maybe it would be better to call them micro object mappers. They all share the same goals: simplicity and extreme speed. They don't follow the "no SQL" approach of their big ORM cousins, so most of the time you write (almost) everyday T-SQL to power their queries. They fetch data and map it to objects (and sometimes provide a bit more - see below).
I would like to point out 3 of them. They're all provided in a single code file and not as a compiled DLL:
Dapper - used by Stack Overflow itself; all it actually does is provide generic extension methods over IDbConnection, which means it supports any backing data store as long as there's a connection class that implements the IDbConnection interface;
uses parametrised SQL
maps to static types as well as dynamic (.net 4+)
supports mapping to multiple objects per result record (as in 1-1 relationships ie. Person+Address)
supports multi-resultset object mapping
supports stored procedures
mappings are generated, compiled (to MSIL) and cached - this can also be a downside if you use a huge number of types
Massive - written by Rob Connery;
only supports dynamic type mapping (so no support on .NET 3.5 or older, baby)
is extremely small (a few hundred lines of code)
provides a DynamicModel class that your entities inherit from, giving you CRUD functionality or mapping from arbitrary bare-metal T-SQL
implicit paging support
supports column name mapping (but you have to do it every time you access data as opposed to declarative attributes)
supports stored procedures by writing direct parametrised TSQL
PetaPoco - inspired by Massive but with a requirement to support older framework versions
supports strong types as well as dynamic
provides a T4 template to generate your POCOs - you'll end up with classes similar to those of the big ORMs (which means code-first is not supported), but you don't have to use them; you can of course still write your own POCO classes to keep your model lightweight and leave out DB-only information (i.e. timestamps, etc.)
similar to Dapper it also compiles mappings for speed and reuse
supports CRUD operations + IsNew
implicit paging support that returns a special type with a page full of data plus all the metadata (current page, total number of pages/records)
has extensibility points for various scenarios (logging, type converters, etc.)
supports declarative metadata (column/table mappings, etc.)
supports multi-object mapping per result record, with some automatic relation wiring (unlike Dapper, where you have to connect related objects manually)
supports stored procedures
has a helper SqlBuilder class for building T-SQL statements more easily
Of the three, PetaPoco seems to be the liveliest in terms of development, and it supports most of these features by taking the best of the other two (and then some).
Of the three, Dapper has the best real-world usage reference, because it's used by one of the highest-traffic sites in the world: Stack Overflow.
They all suffer from the magic-string problem, because most of the time you write SQL queries directly into them. Some of this can be mitigated with T4, so you can have strongly typed calls with IntelliSense, compile-time checking and on-the-fly regeneration within Visual Studio.
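To give a feel for the style, here is a minimal Dapper sketch (table, columns and procedure are hypothetical):

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Dapper;

public class Dog
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class DogQueries
{
    public static IEnumerable<Dog> OlderThan(string connectionString, int age)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            // Plain parametrised SQL mapped straight onto a POCO.
            return connection.Query<Dog>(
                "select Id, Name from Dog where Age > @Age", new { Age = age });
        }
    }

    public static IEnumerable<Dog> ByOwner(string connectionString, int ownerId)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            // A stored procedure uses the same call shape.
            return connection.Query<Dog>(
                "dbo.Dog_GetByOwner", new { OwnerId = ownerId },
                commandType: CommandType.StoredProcedure);
        }
    }
}
```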
Downside of dynamic type
I think the biggest downside of dynamic types is maintenance. Imagine your application built on dynamic types: looking at your own code after a while becomes rather problematic, because you don't have any concrete classes to look at or hang on to. As much as dynamic types are a blessing, they're also a curse in the long run.
Total MVC noob here, but long-time web developer - 10+ years. The tutorials for MVC 3 (and earlier versions) are great, but as usual they lack a ton of real, on-the-job scenarios.
For example, how often do you find yourself in a situation where you get to create a new database from scratch, with no stored procs, so that you could actually use EF Code First? I don't know about you, but in my career it has been NEVER.
The usual story is that you are creating a new app, or enhancing an existing app with new functionality, that will connect to an existing, very mature database with tons of stored procs, user-defined functions and views, and you are required either by management or by time constraints to use it all. And of course you may get to create some new tables, but they will usually join to existing tables, or at the very least your app will have to query existing tables for some of the data.
To see a tutorial based on that scenario would be WORTH ITS WEIGHT IN GOLD. Especially the stored procedure scenario.
Thank you for any advice
Most of the earlier examples (e.g. the original NerdDinner) were based on either LINQ to SQL or Entity Framework (without Code First). Since Code First is the "new hotness", most of the latest examples use it.
The interesting part of the question, though, is that it highlights an important point: "it doesn't matter". Your data access strategy (EF, EF code first, NHibernate, L2S, raw SQL) is totally irrelevant to MVC. By that I don't mean it's unimportant, I mean that MVC doesn't place any constraints on you at all in that regard.
You will generally (in a well-designed MVC app) pass your controllers interfaces that let them call data access methods of various sorts (or perhaps another layer of indirection, with services that do other things before hitting the storage layer). The implementation of the data access, if it uses an ORM like EF or NHibernate, will then give you mechanisms to use a query syntax of some sort (e.g. LINQ), to call stored procedures (possible in all the major ORMs I've used), or to push raw SQL if the situation calls for it.
Does anyone have advice or tips on using a web service as the model in an ASP.Net MVC application? I haven't seen anyone writing about doing this. I'd like to build an MVC app, but not tie it to using a specific database, nor limit the database to the single MVC app. I feel a web service (RESTful, most likely ADO.Net Data Services) is the way to go.
How likely, or useful, is it for your MVC app to be decoupled from your database? How often have you seen, in an application's lifetime, a change from SQL Server to Oracle? In the last 10 years of projects I've delivered, it has never happened.
Architectures are like onions: they have layers of abstraction above the things they depend on. If you're going to use an RDBMS for storage, that sits at the core of your architecture. Abstracting yourself from the DB so you can swap it around is very much a fallacy.
Now you can decouple your database access from your domain, and the repository pattern is one of the ways to do that. Most mature solutions use an ORM these days, so you may want to have a look at NHibernate if you want a mature technology, or ActiveRecord / linq2sql for a simpler active record pattern on top of your data.
Now that you have your data strategy in place, you have a domain of some sort. When you expose data to your client, you can choose to do so through an MVC pattern, where you'll usually send DTOs generated from your domain for rendering, or you can decide to leverage an architecture style like REST to provide more loosely coupled systems, by providing links and custom representations.
You go from tight coupling to looser coupling as you go towards the external layers of your solution.
If, however, your question was about building an MVC app on top of a REST architecture or web services and using that as the model... why bother? If you're going to have a domain model, why not reuse it in your system and your services where it makes sense?
Generating a UI from an MVC app and generating the documents needed for a RESTful architecture are two completely different contexts; basing one on top of the other is just going to cause far more pain than necessary. And you're sacrificing performance.
It depends on your exact scenario, but from experience, a remote XML-based service as the model in MVC is not a good idea; it's probably over-engineering and disregards the need for a domain in the first place.
Edit 2010-11-27; clarified my thoughts, which was really needed.
Most often, a web service exposes functionality across different types of applications; it is not there for abstraction within one single application. You are probably thinking more of a way of encapsulating commands and reads so that they don't interfere with your controller/view programming.
Use a service via a service bus if you're after decoupling, and use an async pattern in your async pages. Have a look at Rhino.ServiceBus, NServiceBus and MassTransit for native .NET implementations, and at RabbitMQ for something different: http://blogs.digitar.com/jjww/2009/01/rabbits-and-warrens/.
Edit: I've since had time to try Rabbit out, pushing messages to my service, which in turn pushed updates to the bookkeeping app. RabbitMQ is a message broker, a.k.a. a MOM (message-oriented middleware), and you could use it to send messages to your application server.
You can also simply provide service interfaces. Read Eric Evans's Domain-Driven Design for a more detailed description.
RESTful service interfaces deal a lot with data, and more specifically with addressable resources. REST can greatly simplify your programming model and gives you fine control over the output through the HTTP protocol. WCF's upcoming programming model uses true REST as defined in the original thesis, where each document should, to some extent, provide URIs for continued navigation. Have a look at this.
(In the first version of this post I lamented REST for being "slow", whatever that means.) REST-based APIs are also pretty much what CouchDB and Riak use.
ADO.NET is rather crap (!) [N+1 problems with lazy collections because you code to an implementation, data-access leakage - you always need your DB context where your query code is, etc.] in comparison to, for example, LightSpeed (commercial) or NHibernate. Spring.NET also allows you to wrap service interfaces in its container with a web service facade, but (not having browsed it for a while) I think its configuration is a bit too XML-heavy.
Edit 1: By ADO.NET here I mean the default "best practice" with DataSets, DataAdapters and iterating over lots of rows from a DataReader; it breeds rather ugly and hard-to-debug code. The N+1 stuff, yes, that is about Entity Framework.
(Edit 2: EntityFramework doesn't impress me either!)
Edit 1: Create your domain layer in a separate assembly [a.k.a. Core] and provide all domain and application services there, then reference this assembly from your specific MVC application. Wrap data access in a DAO/repository behind an interface in your core assembly, which your Data assembly then references and implements. Wire up the interface and implementation with IoC. You can even build something for dynamic service discovery with the above-mentioned service buses to resolve the interfaces. WCF uses interfaces like this, and so do most of the above service buses; you can provide a sub-component resolver in your IoC container to do this automatically.
Edit 2:
A great combo for the above would be CQRS + Event Sourcing + Reactive Extensions. Your write model would take commands, your domain model would decide whether to accept them, and it would push events into the Reactive Extensions pipeline (perhaps also over RabbitMQ), which your read model would consume.
Update 2010-01-02 (edit 1)
The gist of my idea has been codified by something called MindTouch Dream. They have made a screencast in which they treat almost every part of a web application as a (web) service, which is also exposed via REST.
They have created a highly parallel framework using co-routines to handle this, including their own elastic thread pool.
To all the naysayers in this question: in your face :p! Listen to this screencast, especially at around 12 minutes.
The actual framework is here.
If you are into this sort of programming, have a look at how monads work and their implementations in C#. You can also read up on CoRoutines.
Happy new year!
Update 2010-11-27 (edit 2)
It turned out coroutines got productized in Microsoft's Task Parallel Library. Task now implements the same features, since it implements IAsyncResult. Caliburn is a cool framework that uses them.
Reactive Extensions took monad comprehensions to the next level of asynchronicity.
The ALT.NET world seems to be moving in the direction I talked about when I first wrote this answer, albeit with new kinds of architectures I knew little about.
You should define your models in a data-access-agnostic way, e.g. using the repository pattern. Then you can create concrete implementations backed by specific data access technologies (a web service, SQL, etc.).
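A hedged sketch of what that can look like: a web-service-backed implementation behind the same interface that an EF/SQL implementation could also satisfy. The interface, type, URL and endpoint are all made up; JSON handling here uses Json.NET.

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class CatalogItem
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface ICatalog
{
    Task<IList<CatalogItem>> GetItemsAsync();
}

// One possible implementation calls a remote service; another could wrap
// EF and SQL Server behind the same interface.
public class HttpCatalog : ICatalog
{
    private static readonly HttpClient Client =
        new HttpClient { BaseAddress = new Uri("https://example.com/api/") };

    public async Task<IList<CatalogItem>> GetItemsAsync()
    {
        var json = await Client.GetStringAsync("items");
        return JsonConvert.DeserializeObject<List<CatalogItem>>(json);
    }
}
```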
It really depends on the size of this MVC project. I would say keep the UI and the domain in the same running environment if the website is going to be used by a small number of users (< 5,000).
On the other hand, if you are planning a site that will be accessed by millions, you have to think distributed, and that means building your website so that it can scale up/out. That in turn means you might need extra servers (web, application and database).
For this to work nicely, you need to decouple your MVC UI site from the application. The application layer would usually contain your domain model and might be exposed through WCF or a service bus. I would prefer a service bus because it is more reliable and can use persistent queues such as MSMQ.
I hope this helps
If I use stored procedures, can I use an ORM?
EDIT:
If I can use an ORM, doesn't that defeat part of the database-agnosticism reason for using an ORM? In other words, why else would I want to use an ORM if I am binding myself to a particular database with stored procedures (or is that assumption wrong)?
Using ORM to access stored procedures is one of the best uses of ORM. It'll give you strongly typed objects, while you still have full control over the SQL.
In my experience, I would let the ORM handle the CRUD operations and leave the specialty work to stored procedures. Generally, using a stored procedure for CRUD operations is overkill, and letting the ORM handle them can drastically improve your productivity.
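One hedged way to get that split, assuming EF 6 (all names hypothetical): let the context handle the plain CRUD, and hand the expensive search to a procedure projected into a result class.

```csharp
using System.Collections.Generic;
using System.Data.Entity;
using System.Data.SqlClient;
using System.Linq;

public class Invoice
{
    public int Id { get; set; }
    public string Customer { get; set; }
    public decimal Amount { get; set; }
}

public class InvoiceSearchResult
{
    public int Id { get; set; }
    public decimal Amount { get; set; }
}

public class BillingContext : DbContext
{
    public DbSet<Invoice> Invoices { get; set; }
}

public static class InvoiceData
{
    public static List<InvoiceSearchResult> Search(BillingContext db, string term)
    {
        // Ordinary CRUD stays with the ORM, e.g.:
        //   db.Invoices.Add(new Invoice { Customer = "ACME", Amount = 10m });
        //   db.SaveChanges();
        // The expensive dynamic search goes to a procedure instead:
        return db.Database.SqlQuery<InvoiceSearchResult>(
            "EXEC dbo.SearchInvoices @term",
            new SqlParameter("@term", term)).ToList();
    }
}
```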
Yes, you can; all the main ORMs support stored procedures.
As for your assumption, you are partially right: when you use stored procedures with an ORM you are coupling your project to a particular database. But in practice there is a 99% chance you will never need to change your database provider, so in that case you use the ORM not to abstract away the concrete DB provider but to help with the object-relational mapping task itself - which is the ORM's main job and what it was originally made for.
It raises an interesting point.
Once you have an ORM and relatively simple queries, why do you need stored procedures? SPs are intimately bound to the database, while an ORM frees you from maintaining a lot of DB-specific code; what is DB-specific can be isolated and managed.
I suggest that an ORM is a golden opportunity to cut complexity and put all the processing in the code, where it belongs.
Use the database for what it does best -- store data.
Use your application for what it does best -- process data.
You can use both ORM features and stored procedure functionality at once. In particular, use the ORM for as long as it fits; if you run into performance trouble or need some low-level tuning, bring stored procedures into your business logic.
Yes, you can, but you will want to spend some time investigating what capabilities the ORM provides around stored procedures.
Most will allow you to run a stored procedure that returns a strongly typed object/entity. More advanced ORMs will let you plug stored procedures in for CRUD actions as well (so your generic querying, deleting, etc. goes via a stored procedure rather than a dynamic query).
Generally, ORMs are great for generating ad hoc queries and getting strongly typed entities back, but strong stored procedure support has the benefit of (sometimes) giving you easier access to native capabilities of your RDBMS that may not be exposed as first-class citizens in the ORM - especially if the ORM supports many database engines.
Following up from your edit:
Often you will want to use the ad hoc querying engine provided by the ORM; however, as I alluded to earlier, sometimes you want to query using a capability not exposed by the ORM.
The benefit of strongly typed entities is invaluable, as it means you usually work with domain objects rather than data readers, data tables, etc. You can cleanly encapsulate behaviours and logic within the entities you have retrieved.
The list of additional benefits is very long indeed - for example, with the LightSpeed ORM (and most others) your entities will support standard binding interfaces, error reporting interfaces, validation etc. On the querying side you will lose out on lazy loading etc unless you write it yourself.
Database "agnosticity" (?) is not the only reason to use an ORM. However, you could take advantage of being DB independent on 99% of your interactions with the DB and in 1% (or 2% or 10% or whatever) you might need stored procedures for speed/clarity/complexity. If you changed DBs, you would need to rewrite those.
I use netTiers a lot at work and we let it generate our stored procedures for us. These only handle the basic CRUD operations, but they are very fast and save me a TON of time. netTiers will also let us create custom stored procedures and generate our data access code with these procedures.
You can, but many of the more advanced ORM features tend to become more cumbersome to use. Something like iBatis is very easy to integrate with stored procedures, while the more sophisticated features of more complex engines such as (N)Hibernate - dynamic SQL generation and lazy loading of large fields - can become more of a hassle than they're worth.
I believe that any tool that frees you from redoing work and lets you concentrate on solving problems is valid. ORMs appear to be that tool when it comes to basic CRUD operations - even if you use SPs to better implement a particular requirement (like using a hammer on a nail, it's just the right tool for the task).
The point is: there's no black and white, just shades of gray. Very inefficient and badly coded applications use the excuse of being "DB agnostic" to justify an exaggerated use of DB resources. In many cases, being tightly tied to a database is not good either. The objective is to get maximum "DB agnosticism" while not needlessly wasting the customer's IT resources.
There's no "old vs. new", just people saying that extreme "pure" approaches are better. I don't really believe so. I believe that, as with any tool, the "best" (note the quotes) approach is to use the ORM for as long as it is still the right tool for your data access, and to use SPs inside your ORM when you reach a point where you're wasting resources and reducing the scalability and useful life (the English equivalent of the Portuguese "vida útil") of IT resources. Or, in other words, use an SP when it is to the processing at hand what a hammer is to the nail.