BLToolkit alternative object mapper that supports stored procedures

I'm not a big fan of direct entity mappers, because I still think that SQL queries are fastest and best optimized when written by hand directly against and for the database (using the correct joins, groupings, indexes, etc.).
On my current project I decided to give BLToolkit a try, and I'm very pleased with its wrapper around ADO.NET and its speed: I query the database and get strongly typed C# objects back. I've also written a T4 template that generates stored procedure helpers, so I don't have to use magic strings when calling stored procedures and all my calls use strongly typed parameters.
Basically all my CRUD calls are done via stored procedures, because many of my queries are not simple select statements, and my creates and updates in particular also return results, which a stored procedure handles in a single call. Anyway...
Downside
The biggest drawback of BLToolkit (and I'd like everyone evaluating BLToolkit to know this) isn't its capabilities or speed, but its very scarce documentation and support, or the lack thereof. So the biggest problem with this library is the trial and error needed to get it working. That's also why I don't want to use too many different parts of it: the more I use, the more problems I have to solve on my own.
Question
What alternatives do I have to BLToolkit that:
supports stored procedures that return whatever entities I provide, which are not necessarily the same as the DB tables
provides a nice object mapper from data reader to objects
supports relations (all of them)
optionally (but desirably) supports multiple result sets
doesn't need any special configuration (I only use a data connection string and nothing else)
Basically it should be very lightweight: essentially just a simple ADO.NET wrapper and object mapper.
And the most important requirement: it should be easy to use, well supported and widely used by the community.

Alternatives (May 2011)
I can see that the big guns have converted their data access strategies to micro ORM tools. I was toying with the same idea when I evaluated BLToolkit, because it felt bulky (1.5 MB) for the functionality I'd use. In the end I decided to write the aforementioned T4 template (link in the question) to make my life easier when calling stored procedures. But there are still many parts of BLToolkit that I don't use at all or even understand (for the reasons also pointed out in the question).
The best alternatives are micro ORM tools. Maybe it would be better to call them micro object mappers. They all share the same goals: simplicity and extreme speed. They don't hide SQL from you the way their big fellow ORMs do, so most of the time you write (almost) everyday TSQL to power their requests. They fetch data and map it to objects (and sometimes provide a bit more - see below).
I would like to point out three of them. They're each provided as a single code file, not as a compiled DLL:
Dapper - used by Stack Overflow itself; all it actually does is provide generic extension methods over IDbConnection, which means it supports any backing data store as long as there's a connection class that implements the IDbConnection interface;
uses parametrised SQL
maps to static types as well as dynamic ones (.NET 4+)
supports mapping to multiple objects per result record (as in 1-1 relationships, i.e. Person + Address)
supports multi-resultset object mapping
supports stored procedures (see the sketch after this list)
mappings are generated, compiled (to MSIL) and cached (this can also be a downside if you use a huge number of types)
Massive - written by Rob Connery;
only supports dynamic type mapping (no support in .NET 3.5 or older, baby)
is extremely small (a few hundred lines of code)
provides a DynamicModel class that your entities inherit from, which provides CRUD functionality, or maps results from arbitrary bare-metal TSQL
implicit paging support
supports column name mapping (but you have to do it every time you access data as opposed to declarative attributes)
supports stored procedures by writing direct parametrised TSQL
PetaPoco - inspired by Massive but with a requirement to support older framework versions
supports strong types as well as dynamic
provides a T4 template to generate your POCOs - you'll end up with classes similar to those of the big fellow ORMs (which means code-first is not supported), but you don't have to use them; you can of course still write your own POCO classes to keep your model lightweight and free of DB-only information (e.g. timestamps)
similar to Dapper it also compiles mappings for speed and reuse
supports CRUD operations + IsNew
implicit paging support that returns a special type with page-full of data + all metadata (current page, number of all pages/records)
has extensibility point for various scenarios (logging, type converters etc)
supports declarative metadata (column/table mappings etc)
supports multi object mapping per result record with some automatic relation setting (unlike Dapper where you have to manually connect related objects)
supports stored procedures
has a helper SqlBuilder class for building TSQL statements more easily
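To illustrate the Dapper bullet points above (the sketch referred to in that list), here is a minimal hedged example. The connection string, the Person class and the usp_GetPerson procedure are assumptions of mine; Query and its commandType parameter are Dapper's actual API.

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Linq;
    using Dapper;

    public class Person
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public static class DapperSketch
    {
        public static void Run(string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                // Parametrised SQL mapped to a static type.
                var people = conn.Query<Person>(
                    "SELECT Id, Name FROM Person WHERE Name LIKE @name",
                    new { name = "A%" }).ToList();

                // Stored procedure call with strongly typed parameters.
                var one = conn.Query<Person>(
                    "usp_GetPerson",
                    new { Id = 42 },
                    commandType: CommandType.StoredProcedure).SingleOrDefault();

                Console.WriteLine("{0} people, single = {1}", people.Count, one == null ? "(none)" : one.Name);
            }
        }
    }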
Of the three, PetaPoco seems to be the liveliest in terms of development and supports most of these features by taking the best of the other two (and then some).
Of the three, Dapper has the best real-world usage reference, because it's used by one of the highest-traffic sites in the world: Stack Overflow.
They all suffer from the magic string problem, because most of the time you write SQL queries directly into them. But some of this can be mitigated by T4, so you can have strongly typed calls that provide IntelliSense, compile-time checking and on-the-fly regeneration within Visual Studio.
Downside of dynamic type
I think the biggest downside of dynamic types is maintenance. Imagine your application using dynamic types. Looking at your own code after a while becomes rather problematic, because you don't have any concrete classes to look at or hang on to. As much as dynamic types are a blessing, they're just as much a curse in the long run.
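A tiny self-contained sketch of that trade-off (the Person class and its values are made up; the ExpandoObject stands in for what Massive or Dapper's dynamic API would hand back):

    using System;
    using System.Dynamic;

    public class Person
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public static class DynamicDownsideDemo
    {
        public static void Main()
        {
            // Strongly typed: a typo in a property name is a compile-time error.
            var typed = new Person { Id = 1, Name = "Ana" };
            Console.WriteLine(typed.Name);   // typed.Nmae would not even compile

            // Dynamic: the same typo compiles fine and only blows up at run time.
            dynamic row = new ExpandoObject();
            row.Id = 1;
            row.Name = "Ana";
            Console.WriteLine(row.Nmae);     // throws RuntimeBinderException when executed
        }
    }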

Related

Using ORM Vs. SP when developing a Solely Reporting Oriented Application

I have a requirement to design a web application which acts as a facade for reports based on a specific schema. However, I have been given complete freedom to design and develop the application with any methodology and technology. Regardless of the arguments, I have chosen to use RDLC (client-side) for the reports.
My question is: in this case, is it better to use classic ADO.NET as my data access layer (DataSets, DataTables, etc.) with stored procedures, or is it better to use an ORM (let's say NHibernate is the choice) with ObjectDataSource in the UI?
The major argument is that I'm expecting very major changes in the reports and/or the schema itself.
To answer my own question, I ended up using a mixed solution (ORM + hand-coded SPs): I used LINQ to SQL (as my ORM) while developing and utilizing stored procedures.
The major reason for this mix was to follow the client's standards in solution development, which are based on SPs; also, LINQ to SQL served merely as a helper to call the stored procedures (and I was effectively forced to use it that way).

Any MVC 3 tutorials on *Real World* developement situations?

Total MVC noob here, but a long-time web developer - 10+ years. The tutorials for MVC 3 (and earlier versions) are great, but as usual they lack a ton of real, on-the-job scenarios.
For example, how often do you find yourself in a situation where you are going to create a new database from scratch with no stored procs, so that you could actually use EF Code First? I don't know about you, but in my career it has been NEVER.
The usual story is that you are creating a new app, or enhancing an existing app with new functionality, that will connect to an existing, very mature database with tons of stored procs, user-defined functions and views, and you are required either by management or by time constraints to use it all. And of course you may get to create some new tables, but they will usually have joins to existing tables, or at the least your app will have to query existing tables for some of the data.
A tutorial based on that scenario would be WORTH ITS WEIGHT IN GOLD. Especially the stored procedure scenario.
Thank you for any advice
Most of the earlier examples (e.g. the original NerdDinner) were based on either LINQ to SQL or Entity Framework (without Code First). Since Code First is the 'new hotness', most of the latest examples use it.
The interesting part of the question, though, is that it highlights an important point: "it doesn't matter". Your data access strategy (EF, EF code first, NHibernate, L2S, raw SQL) is totally irrelevant to MVC. By that I don't mean it's unimportant, I mean that MVC doesn't place any constraints on you at all in that regard.
You will generally (in a well-designed MVC app) pass your controllers interfaces which let them call data access methods of various sorts (or perhaps another layer of indirection, with services that do other things before hitting the storage layer). The implementation of the data access, if it is using an ORM like EF or NHibernate, will then have mechanisms for you to either use a query syntax of some sort (e.g. LINQ), call stored procedures (possible in all major ORMs that I've used), or push raw SQL if the situation calls for it.
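A minimal sketch of that shape, assuming ASP.NET MVC and entirely hypothetical names (IPersonRepository, PersonController); the stand-in repository shows where an ORM, stored procedure calls or raw SQL would plug in:

    using System.Collections.Generic;
    using System.Web.Mvc;

    public class Person
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // The controller depends only on this interface, not on EF/NHibernate/raw SQL.
    public interface IPersonRepository
    {
        IEnumerable<Person> GetAll();
    }

    public class PersonController : Controller
    {
        private readonly IPersonRepository _repository;

        // The interface is injected, e.g. by your IoC container of choice.
        public PersonController(IPersonRepository repository)
        {
            _repository = repository;
        }

        public ActionResult Index()
        {
            return View(_repository.GetAll());
        }
    }

    // A stand-in implementation; a real one might use EF, NHibernate, L2S or raw ADO.NET,
    // calling stored procedures or LINQ as the situation requires.
    public class InMemoryPersonRepository : IPersonRepository
    {
        public IEnumerable<Person> GetAll()
        {
            yield return new Person { Id = 1, Name = "Ana" };
        }
    }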

ADO.NET: do you have lots of stored procedure in your own systems?

hi all
We do know that the CommandType property of a SqlCommand object has three options: TableDirect, Text and StoredProcedure (or "SP").
Knowing that "SP" has benefits over the other two options, my question is: do you create lots of SPs in your own systems?
Or what solution do you use instead of creating SPs?
Thank you
Aside from creating stored procedures, you can use object-relational mapping, such as:
LINQ to SQL
NHibernate
Entity Framework
Data Access: SPs vs ORMs
Choose the best way that suits you.
In all production systems I have used SPs and pure core ADO.NET to access the data. The systems range from 100-300 tables and about 500-1000 stored procedures.
Most of the data access code is generated using a tool. I've posted the source code and a sample application on my blog if you're interested in using/modifying it. The tool can generate over 100,000 lines of code in about 20-25 seconds against a database with about 750 stored procedures.
Data Access Layer - Code Gen
Of course, if you're not familiar with databases, data modeling/design and stored procedures, you're probably better off using LINQ to SQL or EF4 (Entity Framework version 4) or similar. If you need brute-force performance, then core ADO.NET along with stored procedures is the way to go.
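For concreteness, a hedged sketch of that raw ADO.NET route, contrasting the CommandType.Text and CommandType.StoredProcedure options the question mentions (the table, procedure and parameter names are mine, not from the answer):

    using System.Data;
    using System.Data.SqlClient;

    public static class AdoNetSketch
    {
        // Inline, parametrised TSQL (CommandType.Text).
        public static int CountPeopleWithText(string connectionString, string namePrefix)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("SELECT COUNT(*) FROM dbo.Person WHERE Name LIKE @prefix + '%'", conn))
            {
                cmd.CommandType = CommandType.Text;
                cmd.Parameters.AddWithValue("@prefix", namePrefix);
                conn.Open();
                return (int)cmd.ExecuteScalar();
            }
        }

        // The same query wrapped in a stored procedure (the "SP" option).
        public static int CountPeopleWithProc(string connectionString, string namePrefix)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("dbo.usp_CountPeople", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@prefix", namePrefix);
                conn.Open();
                return (int)cmd.ExecuteScalar();
            }
        }
    }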
Re: your first question
When you go down the path of stored procedures, their number grows continually for the life of the project. Outside of the basic CRUD operations, each stored procedure tends to be tightly bound to a particular problem and not very reusable. A rule of thumb is to expect 8-12 stored procedures for each data table (excluding reference or code tables, such as the list of states or countries).
The very large number of procs makes naming conventions very important, so that you can find anything without constantly re-scanning the whole list of 400-500 procs by eye.
Re: your second question
There are a lot of ugly things that happen with SQL written inside strings inside C# or VB.NET -- it's error prone, ugly, etc.
LINQ, NHibernate and many others exist, but the "concept count" (the number of things you need to learn before you become productive) is much higher than learning how to write a good stored procedure executor in C#.
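Such a stored procedure executor can indeed stay small. A hedged sketch (the helper's name and shape are my own, not something from the answer) that runs a procedure and maps each row through a delegate:

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    public static class ProcRunner
    {
        // Executes a stored procedure and maps each result row with the supplied delegate.
        public static List<T> Execute<T>(
            string connectionString,
            string procName,
            Func<IDataRecord, T> map,
            params SqlParameter[] parameters)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(procName, conn) { CommandType = CommandType.StoredProcedure })
            {
                cmd.Parameters.AddRange(parameters);
                conn.Open();

                var result = new List<T>();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        result.Add(map(reader));
                    }
                }
                return result;
            }
        }
    }

    // Usage (the procedure and its columns are hypothetical):
    // var people = ProcRunner.Execute(connStr, "dbo.usp_GetAllPeople",
    //     r => new { Id = r.GetInt32(0), Name = r.GetString(1) },
    //     new SqlParameter("@prefix", "A"));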
I try to make sure that stored procedures are only created for database functionality - not business logic.
It's database functionality when I have some database architecture that's a bit obscure and I want to hide it from callers.
It's business logic when it's simply about how my application adds or updates data, how much validation it does, etc.

How would the 'Model' in a Rails-type webapp be implemented in a functional programming language?

In MVC web development frameworks such as Ruby on Rails, Django, and CakePHP, HTTP requests are routed to controllers, which fetch objects which are usually persisted to a backend database store. These objects represent things like users, blog posts, etc., and often contain logic within their methods for permissions, fetching and/or mutating other objects, validation, etc.
These frameworks are all very much object-oriented. I've been reading up recently on functional programming, and it seems to tout tremendous benefits such as testability, conciseness, modularity, etc. However, most of the examples I've seen of functional programming implement trivial functionality like quicksort or the Fibonacci sequence, not complex web apps. I've looked at a few 'functional' web frameworks, and they all seem to implement the view and controller just fine, but largely skip over the whole 'model' and 'persistence' part. (I'm talking more about frameworks like Compojure, which are supposed to be purely functional, versus something like Lift, which conveniently seems to use the OO part of Scala for the model -- but correct me if I'm wrong here.)
I haven't seen a good explanation of how functional programming can be used to provide the metaphor that OO programming provides, i.e. tables map to objects, and objects can have methods which provide powerful, encapsulated logic such as permissioning and validation. Also the whole concept of using SQL queries to persist data seems to violate the whole 'side effects' concept. Could someone provide an explanation of how the 'model' layer would be implemented in a functionally programmed web framework?
Without wanting to bash object-oriented MVC frameworks -- I don't know Rails, but Django is an excellent piece of software to my eye -- I'm not sure that object-relational mapping is a particularly good metaphor. [1]
Of course, in an OO language it may seem natural to think of tables in terms of objects, but in a functional language it is perfectly natural to think of tables in terms of tables. A single row can be represented easily using an algebraic data type (in Haskell and other statically typed functional languages) or a map (a.k.a. a dictionary; an associative structure mapping keys to values); a table then becomes a sequence of rows, which after all is what it is even at the DB level. Thus there is no special mapping from the DB construct of a table to some other construct available in the programming language; you can simply use tables on both sides. [2]
Now this does not in any way mean that it is necessary to use raw SQL queries to manipulate the data in the DB, foregoing the benefits of abstraction over various RDBMSs' quirks. Since you're using the Clojure tag, perhaps you might be interested in ClojureQL, an embedded DSL for communicating with various DBs in a generic way. (Note that it's being reworked just now.) You can use such a DSL for extracting data; manipulate the data thus obtained using pure functions; then display some results and maybe persist some data back to the DB (using the same DSL).
[1] If you think comparing a technology to the Vietnam war is a bit extreme, I guess I agree, but that doesn't mean the article doesn't do a very good job of describing why one might not want to sink into the ORM quagmire.
[2] Note that you could use the same approach in an OO language and abstract over DB backends in the same way it's done in FP languages (see the next paragraph). Of course, your MVC framework would then no longer look quite like Rails.
Have a look at the Conjure web application framework for an example of how one might implement an MVC framework in a functional programming language. Conjure uses clj-record for the model layer, which has support for associations and validations.

Would I use an ORM if I am using Stored Procedures?

If I use stored procedures, can I use an ORM?
EDIT:
If I can use an ORM, doesn't that defeat part of the database-agnosticity reason for using an ORM? In other words, why else would I want to use an ORM if I am binding myself to a particular database with stored procedures (or is that assumption wrong)?
Using ORM to access stored procedures is one of the best uses of ORM. It'll give you strongly typed objects, while you still have full control over the SQL.
In my experience I would let the ORM handle the CRUD operations and leave the specialty work to stored procedures. Generally, using a stored procedure for CRUD operations is overkill, and letting the ORM handle them can drastically improve your productivity.
Yes, you can; all the main ORMs support stored procedures.
As for your assumption, you are partly right: when you use stored procedures with an ORM, you are coupling your project to a particular database. But in practice there's a 99% chance you will never need to change your database provider, so in that case you use the ORM not to abstract away the concrete DB provider, but to help yourself with the object-relational mapping task - which is an ORM's main job and what it was originally made for.
It raises an interesting point.
Once you have an ORM and relatively simple queries, why do you need stored procedures? SPs are intimately bound to the database. An ORM frees you from having to maintain a lot of DB-specific code. What is DB-specific can be isolated and managed.
I suggest that an ORM is a golden chance to cut the complexity and put all the processing in the code where it belongs.
Use the database for what it does best -- store data.
Use your application for what it does best -- process data.
You can use both ORM features and stored procedure functionality at once. In particular, use the ORM as long as it fits; but if you run into performance trouble or need some low-level tuning, bring stored procedures into your business logic.
Yes, you can, but you will want to spend some time investigating what capabilities the ORM provides around stored procedures.
Most will allow you to run a stored procedure that returns a strongly typed object/entity (see the sketch at the end of this answer). More advanced ORMs will let you plug stored procedures in for the CRUD actions as well (so your generic querying, deleting etc. goes via a stored procedure rather than a dynamic query).
Generally, ORMs are great for generating ad-hoc queries and getting strongly typed entities, but strong stored procedure support has the benefit of (sometimes) giving you easier access to native capabilities of your RDBMS that may not be exposed as first-class citizens in the ORM - especially if the ORM supports many database engines.
Following up from your edit:
Often you will want to use the ad-hoc querying engine provided by the ORM; however, as I alluded to earlier, sometimes you want to query using a capability not exposed by the ORM.
The benefit of strongly typed entities is invaluable, because it means you usually work with domain objects rather than data readers, data tables, etc. You can cleanly encapsulate behaviors and logic within the entities you have retrieved.
The list of additional benefits is very long indeed - for example, with the LightSpeed ORM (and most others) your entities will support the standard binding interfaces, error reporting interfaces, validation, etc. On the querying side you will lose out on lazy loading etc. unless you write it yourself.
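To ground the point about stored procedures returning strongly typed entities (the sketch referred to above), here is one way it can look with Entity Framework's DbContext API; the context, entity and procedure names are hypothetical:

    using System.Collections.Generic;
    using System.Data.Entity;
    using System.Data.SqlClient;
    using System.Linq;

    public class Person
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class ShopContext : DbContext
    {
        public DbSet<Person> People { get; set; }
    }

    public static class OrmWithProcsSketch
    {
        public static List<Person> GetPeopleByPrefix(string prefix)
        {
            using (var db = new ShopContext())
            {
                // Let the ORM handle ad-hoc/CRUD querying...
                var viaLinq = db.People.Where(p => p.Name.StartsWith(prefix)).ToList();

                // ...and call an existing stored procedure when you need it,
                // still getting strongly typed entities back.
                var viaProc = db.Database.SqlQuery<Person>(
                    "EXEC dbo.usp_GetPeopleByPrefix @prefix",
                    new SqlParameter("@prefix", prefix)).ToList();

                return viaProc.Count > 0 ? viaProc : viaLinq;
            }
        }
    }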
Database "agnosticity" (?) is not the only reason to use an ORM. However, you could take advantage of being DB independent on 99% of your interactions with the DB and in 1% (or 2% or 10% or whatever) you might need stored procedures for speed/clarity/complexity. If you changed DBs, you would need to rewrite those.
I use netTiers a lot at work and we let it generate our stored procedures for us. These only handle the basic CRUD operations, but they are very fast and save me a TON of time. netTiers will also let us create custom stored procedures and generate our data access code with these procedures.
You can, but many of the more advanced ORM features tend to become more cumbersome to use. Something like iBATIS is very easy to integrate with stored procedures, while the more sophisticated features of more complex engines like (N)Hibernate, such as generation of dynamic SQL and lazy loading of large fields, can become more of a hassle than they're worth.
I believe that any tool that frees you from redoing work and lets you concentrate on solving the problem is valid. ORMs appear to be that tool when it comes to basic CRUD operations - even if you use SPs where they implement a requirement better (like using a hammer on a nail: it's just the right tool for the task).
The point is: there's no black or white, just a scale of gray. Very inefficient and badly coded applications use the excuse of being 'DB agnostic' to explain their exaggerated use of DB resources. In many cases, being very tied to a database is not good either. The objective is: get maximum 'DB agnosticism' while not needlessly wasting the customer's IT resources.
There's no 'old vs new', just people saying that extreme 'pure' approaches are better. I don't really believe so. I believe that, as with any tool, the 'best' (notice the quotes) approach is to use the ORM for as long as it is still the right tool for your data access, and to use SPs inside your ORM when you reach the point where you're wasting resources and reducing the scalability and useful life (the Portuguese 'vida Ăștil') of your IT resources. Or, in other words, use an SP when it is, for the processing at hand, what a hammer is for a nail.

Resources