Generic repositories can be used for Insert, Update, Delete, etc., but they are limited to simple CRUD operations. What if I want to use my generic repositories for more complex insertion and searching, such as joining tables, retrieving data through stored procedures, or inserting data into multiple tables using stored procedures? How should I handle that from a generic repository?
1. Can I call stored procedures from my generic repository?
2. Is it a good idea to use generic repositories + stored procedures?
3. Will it be compatible with criteria-based complex searches?
Your question would need to be more specific to get a precise answer, but here are my two cents:
Yes, it is possible to call stored procedures from your repository. If you are using Entity Framework, you can map a stored procedure to a CLR object, which you can then call from your repository. Depending on the framework (or lack of one) you use, you'll need to figure out the right syntax to do so.
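For illustration, here is a rough sketch (assuming Entity Framework 6; the proc name, parameters and result type are invented) of a generic repository with an escape hatch for stored procedures via DbContext.Database.SqlQuery<T>:

```csharp
using System.Collections.Generic;
using System.Data.Entity;
using System.Data.SqlClient;
using System.Linq;

// Sketch only: assumes EF6; the proc, parameters and result type below are hypothetical.
public class GenericRepository<T> where T : class
{
    private readonly DbContext _context;

    public GenericRepository(DbContext context)
    {
        _context = context;
    }

    // Standard CRUD goes through the DbSet as usual.
    public void Insert(T entity)
    {
        _context.Set<T>().Add(entity);
    }

    // Escape hatch: map a stored procedure's result set onto any CLR type.
    public List<TResult> ExecuteStoredProcedure<TResult>(string sql, params object[] parameters)
    {
        return _context.Database.SqlQuery<TResult>(sql, parameters).ToList();
    }
}

// Usage (hypothetical proc and result type):
// var repo = new GenericRepository<Customer>(context);
// var hits = repo.ExecuteStoredProcedure<CustomerSearchResult>(
//     "EXEC usp_SearchCustomers @Name, @City",
//     new SqlParameter("@Name", "Smith"),
//     new SqlParameter("@City", "London"));
```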
Personally I wouldn't use this approach, as I'd rather put business logic in the business layer. However, in some cases it will definitely be useful, for instance when performance is critical, when you want complete control over your queries, or for other complex operations.
Do keep in mind that you can use LINQ to build your complex queries (joining, ordering, filtering, aggregating, selecting, ...). Some parts can even be automated, like the WHERE clause using PredicateBuilder, enabling you to dynamically create AND/OR queries based on user input, search criteria, and so on.
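As a rough sketch of the PredicateBuilder idea (using the LINQKit-style PredicateBuilder; the Product entity and criteria are made up):

```csharp
using System.Linq;
using LinqKit; // PredicateBuilder + AsExpandable()

// Hypothetical entity used for the example.
public class Product
{
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public static class ProductSearch
{
    // Composes a WHERE clause from optional criteria supplied by the caller.
    public static IQueryable<Product> Search(IQueryable<Product> products,
                                             string name, decimal? maxPrice)
    {
        var predicate = PredicateBuilder.True<Product>(); // start by matching everything

        if (!string.IsNullOrEmpty(name))
            predicate = predicate.And(p => p.Name.Contains(name));

        if (maxPrice.HasValue)
            predicate = predicate.And(p => p.Price <= maxPrice.Value);

        // AsExpandable() lets EF / LINQ to SQL translate the composed expression tree.
        return products.AsExpandable().Where(predicate);
    }
}
```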
In general, YES, it is possible, but it all depends on the context you're working in.
Related
I'm not too big a fan of direct entity mappers, because I still think that SQL queries are fastest and most optimized when written by hand directly against, and for, the database (using the correct joins, groupings, indexes, etc.).
On my current project I decided to give BLToolkit a try, and I'm very pleased with its wrapper around ADO.NET and its speed, so I query the database and get strongly typed C# objects back. I've also written a T4 template that generates stored procedure helpers, so I don't have to use magic strings when calling stored procedures and all my calls use strongly typed parameters.
Basically all my CRUD calls are done via stored procedures, because many of my queries are not simple select statements, and my creates and updates in particular also return results, which is easily done with a stored procedure in a single call. Anyway...
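To give an idea of what such a strongly typed stored procedure helper can look like, here is a hand-written sketch over plain ADO.NET (not actual BLToolkit or T4 output; the procedure and parameter names are invented):

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch of a strongly typed stored procedure wrapper: no magic strings at call sites.
public static class OrderProcedures
{
    // Wraps a hypothetical dbo.CreateOrder proc that inserts a row and returns the new id.
    public static int CreateOrder(SqlConnection connection, int customerId, decimal total)
    {
        using (var cmd = new SqlCommand("dbo.CreateOrder", connection))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@CustomerId", SqlDbType.Int).Value = customerId;
            cmd.Parameters.Add("@Total", SqlDbType.Decimal).Value = total;

            // The proc SELECTs the newly created order id as its single result.
            return (int)cmd.ExecuteScalar();
        }
    }
}

// Call site: OrderProcedures.CreateOrder(conn, customerId: 42, total: 99.90m);
```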
Downside
The biggest drawback of BLToolkit (and I'd like everyone evaluating BLToolkit to know this) isn't its capabilities or speed, but its very scarce documentation and support, or lack thereof. So the biggest problem with this library is getting it working by trial and error. That's why I also don't want to use too many different parts of it: the more I use, the more problems I have to solve on my own.
Question
What alternatives do I have to BLToolkit that:
support stored procedures that return whatever entities I provide, not necessarily matching DB tables
provide a nice object mapper from data reader to objects
support relations (all of them)
optionally (but desirably) support multiple result-set results
don't need any special configuration (I only use a data connection string and nothing else)
Basically it should be very lightweight: essentially just a simple ADO.NET wrapper and object mapper.
And the most important requirement: it should be easy to use, well supported, and widely used by the community.
Alternatives (May 2011)
I can see that the big guns have converted their data access strategies to micro ORM tools. I was toying with the same idea when I evaluated BLToolkit, because it felt bulky (1.5 MB) for the functionality I'd use. In the end I decided to write the aforementioned T4 template (link in the question) to make my life easier when calling stored procedures. But there are still many parts of BLToolkit that I don't use at all or even understand (for the reasons pointed out in the question).
The best alternatives are micro ORM tools. Maybe it would be better to call them micro object mappers. They all have the same goals: simplicity and extreme speed. They don't follow the "no SQL needed" approach of their big ORM cousins, so most of the time you have to write (almost) everyday TSQL to power their requests. They fetch data and map it to objects (and sometimes provide something more; see below).
I would like to point out three of them. They're each provided as a single code file rather than a compiled DLL:
Dapper - used by Stack Overflow itself; all it actually does is provide generic extension methods over IDbConnection, which means it supports any backing data store as long as there's a connection class that implements the IDbConnection interface;
uses parametrised SQL
maps to static types as well as dynamic (.NET 4+)
supports mapping to multiple objects per result record (as in 1-1 relationships, i.e. Person + Address)
supports multi-resultset object mapping
supports stored procedures
mappings are generated, compiled (to MSIL) and cached (this can also be a downside if you use a huge number of types)
Massive - written by Rob Conery;
only supports dynamic type mapping (so no support on .NET 3.5 or older)
is extremely small (a few hundred lines of code)
provides a DynamicModel class that your entities inherit from, which provides CRUD functionality or maps from arbitrary bare-metal TSQL
implicit paging support
supports column name mapping (but you have to do it every time you access data as opposed to declarative attributes)
supports stored procedures by writing direct parametrised TSQL
PetaPoco - inspired by Massive but with a requirement to support older framework versions (a short usage sketch follows after this list);
supports strong types as well as dynamic
provides a T4 template to generate your POCOs - you'll end up with classes similar to those of the big ORMs (which means code-first is not supported), but you don't have to use them; you can still write your own POCO classes to keep your model lightweight and free of DB-only information (i.e. timestamps etc.)
similar to Dapper it also compiles mappings for speed and reuse
supports CRUD operations + IsNew
implicit paging support that returns a special type with page-full of data + all metadata (current page, number of all pages/records)
has extensibility point for various scenarios (logging, type converters etc)
supports declarative metadata (column/table mappings etc)
supports multi object mapping per result record with some automatic relation setting (unlike Dapper where you have to manually connect related objects)
supports stored procedures
has a helper SqlBuilder class for easier building TSQL statements
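As a rough usage sketch of PetaPoco (based on its documented Database/Fetch/Page/Insert API; the Article class and table are made up):

```csharp
using PetaPoco;

// Hypothetical POCO; PetaPoco's T4 template could also generate it.
public class Article
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public class ArticleQueries
{
    public void Demo()
    {
        // "MainDb" is assumed to be a connection string name in the config file.
        var db = new Database("MainDb");

        // Hand-written, parametrised TSQL mapped onto strongly typed POCOs.
        var recent = db.Fetch<Article>("SELECT Id, Title FROM Articles WHERE Id > @0", 100);

        // Implicit paging: page 2, 20 items per page; returns items plus paging metadata.
        var page = db.Page<Article>(2, 20, "SELECT Id, Title FROM Articles ORDER BY Id");
        long totalPages = page.TotalPages;

        // CRUD helper: insert into the Articles table using Id as the primary key column.
        db.Insert("Articles", "Id", new Article { Title = "Hello" });
    }
}
```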
Of all three, PetaPoco seems to be the liveliest in terms of development, and it supports most things by taking the best of the other two (and of some others).
Of all three, Dapper has the best real-world usage reference, because it's used by one of the highest-traffic sites in the world: Stack Overflow.
They all suffer from the magic string problem, because you write SQL queries directly into them most of the time. But some of this can be mitigated by T4, so you can have strongly typed calls that provide IntelliSense, compile-time checking and on-the-fly regeneration within Visual Studio.
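To make the magic-string point concrete, a typical Dapper call looks roughly like this (a sketch; the Person/Address classes, table names and SQL are invented): hand-written but parametrised SQL mapped straight onto typed objects, including the 1-1 multi-mapping mentioned above.

```csharp
using System.Data.SqlClient;
using System.Linq;
using Dapper; // extension methods over IDbConnection

public class Person
{
    public int Id { get; set; }
    public string Name { get; set; }
    public Address Address { get; set; }
}

public class Address
{
    public int Id { get; set; }
    public string City { get; set; }
}

public class PersonQueries
{
    public Person GetWithAddress(string connectionString, int id)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // Parametrised, hand-written SQL; Dapper maps each row onto two objects
            // (Person + Address), split at the Id column of the joined address.
            return connection.Query<Person, Address, Person>(
                @"SELECT p.Id, p.Name, a.Id, a.City
                  FROM People p
                  JOIN Addresses a ON a.PersonId = p.Id
                  WHERE p.Id = @Id",
                (person, address) => { person.Address = address; return person; },
                new { Id = id },
                splitOn: "Id").SingleOrDefault();
        }
    }
}
```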
Downside of dynamic type
I think the biggest downside of dynamic types is maintenance. Imagine your application using dynamic types everywhere. Looking at your own code after a while becomes rather problematic, because you don't have any concrete classes to look at or hang on to. As much as dynamic types are a blessing, they are also a curse in the long run.
We know that the CommandType property of a SqlCommand object has three options: TableDirect, Text and StoredProcedure ("SP").
Knowing that "SP" has benefits over the other two options, my question is: do you create lots of SPs in your own systems?
Or what solution do you use instead of creating SPs?
Thank you
Aside from creating stored procedures, you can use Object-Relational Mapping (ORM) tools,
Such as:
LINQ to SQL
NHibernate
Entity Framework
Data Access: SPs vs ORMs
Choose the best way that suits you.
In all production systems I used SPs and plain core ADO.NET to access the data. Systems ranged from 100-300 tables with about 500-1000 stored procedures.
Most of the data access code is generated using a tool. I've posted the source code and a sample application on my blog if you're interested in using/modifying it. The tool can generate over 100,000 lines of code in about 20-25 seconds against a database with about 750 stored procedures.
Data Access Layer - Code Gen
Of course, if you're not familiar with databases, data modeling/design and stored procedures, you're probably better off using LINQ to SQL or EF4 (Entity Framework version 4) or similar. If you need brute-force performance, then core ADO.NET along with stored procedures is the way to go.
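For reference, the brute-force option described here (core ADO.NET plus a stored procedure) looks roughly like this; the procedure, columns and Customer class are hypothetical:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Hypothetical entity for the example.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class CustomerData
{
    // Calls a hypothetical dbo.GetCustomersByRegion proc and maps the reader by hand.
    public static List<Customer> GetByRegion(string connectionString, string region)
    {
        var customers = new List<Customer>();

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.GetCustomersByRegion", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.Add("@Region", SqlDbType.NVarChar, 50).Value = region;

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    customers.Add(new Customer
                    {
                        Id = reader.GetInt32(reader.GetOrdinal("Id")),
                        Name = reader.GetString(reader.GetOrdinal("Name"))
                    });
                }
            }
        }

        return customers;
    }
}
```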
Re: your first question
When you go down the path of stored procedures, their number grows continually for the life of the project. Outside of the basic CRUD operations, each stored procedure tends to be tightly bound to a particular problem and not very reusable. A rule of thumb is to expect 8-12 stored procedures for each data table (excluding reference or code tables, such as the list of states or countries).
The very large number of procs makes naming conventions very important so that you can find anything without constantly visually re-scanning the whole list of 400-500 procs.
Re: your second question
There are a lot of ugly things that happen with SQL written inside strings in C# or VB.NET -- it's error-prone, ugly, etc.
LINQ, NHibernate and many others exist, but the "concept count" (the number of things you need to learn to start being productive) is much higher than learning how to write a good stored procedure executor in C#.
I try to make sure that stored procedures are only created for database functionality - not business logic.
It's database functionality when I have some database architecture that's a bit obscure and I want to hide it from callers.
It's business logic when it is simply about the way my application adds or updates data, or how much validation it does, etc.
I'm developing a Java EE application (JSF + RichFaces + Oracle 10g), and I wanted to use JPA.
But in the end, I didn't see any advantage in using it, because it's going to complicate my code.
I found that calling my procedures (stored procedures in my Oracle DB) is better than using JPA, because I can, for example, change some lines in those procedures without having to recompile my WAR project every time there's an error, and I can use PL/SQL, which helps me a lot.
So, I wanted to ask you: when should I use JPA?
Isn't writing your own queries better? You can choose the right ordering for your selects and select only the columns you want, not all of them; with an ORM, your entity attributes are mapped to your table's columns, which obliges you to select all the attributes present in your entity. Isn't the method I used (stored functions) better?
But in the end, I didn't see any advantage in using it, because it's going to complicate my code.
This might be subjective but, personally, I find JDBC typically more verbose, harder to maintain and thus somehow more complex. With an ORM like JPA, you don't have to write all the CRUD queries, you don't have to handle the mapping of query results to objects, you don't have to handle the low level stuff yourself, etc.
I found that calling my procedures (stored procedures in my Oracle DB) is better than using JPA, because I can, for example, change some lines in those procedures without having to recompile my WAR project every time there's an error, and I can use PL/SQL, which helps me a lot.
This totally depends on your definition of "better". In your case, you might prefer SPs because of your development workflow (you don't have to run your code in a container to set up the persistence part) and because you are comfortable with PL/SQL. But again, personally, I don't find SPs "better":
I don't really enjoy hand-writing SQL queries for everything
I find that SPs are harder to test, debug and maintain
I find that SPs are bad for code reuse
I find that SPs lock you in (I consider this a disadvantage)
I tend to find systems built around SPs harder to scale
Read also Who Needs Stored Procedures, Anyways? (amongst other resources) for more opinions.
So, I wanted to ask you: when should I use JPA?
When you want to speed up development and focus on implementing business code instead of plumbing. And, of course, when appropriate.
Isn't writing your own queries better (...)
Did you identify a particular performance problem, or are you just assuming there will be one? In my experience, retrieving more columns than required is most of the time not an issue. And if it becomes an issue, there are solutions.
You can define what fields are retrieved in a query, using JPA.
Why does the complexity of your code go up with the use of JPA? You give no example. Before, you'd have to write messy JDBC, and now you just call em.persist(obj)... is that really more complex?
You ought to be thinking in terms of what relations there are between objects (if any), and let that determine whether you need an ORM.
I just started with ASP.NET MVC 1.0. I've read through some tutorials but I don't have any good ideas on how to build my model.
I've been experimenting with LINQ to SQL. I can't say I like it that much, but I'll give it a try. I prefer using SQL stored procedures, but LINQ to SQL doesn't seem to work very well with the optional parameters in my stored procedures.
The project is a prototype, but I would like to start with a good design right away.
The database will probably contain around 50 tables, and stored procedures will be used for all queries and updates. Some procedures will also return multiple result sets. What would be the best way to design this model?
Any and all ideas are welcome. Should I even use LINQ to SQL? Should I design my stored procedures to return only one result set? I feel a little bit lost.
You can use stored procedures with LINQ to SQL. You simply need to replace the autogenerated statements on your entities with your stored procs; look at the properties of the entity (table) in the designer to change these. Your stored procedures won't be able to return multiple result sets, as they need to map onto a collection of either one of your entities or an autogenerated entity, i.e. onto a collection of a single class.
Having said all that, I'd give it a go without using stored procedures. The LINQ code is fully parameterized, so you already have that benefit. You'll find that LINQ allows you to easily build up queries, removing the need for the optional parameters in your stored procs and likely resulting in more efficient queries, since you don't need to build extra logic into your query to handle the optional parameters. Once you start using LINQ, I think you'll find that the power and expressiveness of the code will make you forget about your stored procedures.
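For example, optional criteria can be composed in LINQ instead of being passed as optional proc parameters; this is a sketch, and the Order entity and filters are made up:

```csharp
using System;
using System.Linq;

// Hypothetical entity (would normally be generated by the LINQ to SQL designer).
public class Order
{
    public int Id { get; set; }
    public int CustomerId { get; set; }
    public DateTime OrderDate { get; set; }
}

public static class OrderQueries
{
    // Builds the query from optional criteria; only the branches that were actually
    // added get translated, so no "WHERE @p IS NULL OR ..." tricks are needed.
    public static IQueryable<Order> Search(IQueryable<Order> orders,
                                           int? customerId, DateTime? fromDate)
    {
        var query = orders;

        if (customerId.HasValue)
            query = query.Where(o => o.CustomerId == customerId.Value);

        if (fromDate.HasValue)
            query = query.Where(o => o.OrderDate >= fromDate.Value);

        return query.OrderByDescending(o => o.OrderDate);
    }
}
```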
You may still have a need for stored procs or table-valued functions for complex queries -- that's ok. If they map onto an existing entity, you can simply drag the proc/function onto the entity in the designer and you get a nice method on your data context that runs your SQL and returns collections of that entity. If not, I often create a view for the proc/function schema, use that as an entity, and attach the proc/function to it.
May I ask why you need to use stored procedures for this project? They have almost no value today if you are working with a modern database. Parameterized SQL using a good ORM will give you the same cached execution plans and security benefits that you are looking for with stored procedures.
I would recommend NHibernate because it's a far better way to achieve persistence ignorance. And IMHO this is the goal of anyone designing a model. You should be thinking object first instead of data first.
The best resource around is a screencast series by Steve Bohlen, "Summer of NHibernate".
Also LINQ to NHibernate is available and works great for any dynamic query that you might need to write.
The only pain you will notice with NHibernate is the XML mapping, but if you really enjoy the ORM you can write a fluent mapping instead using FNH (Fluent NHibernate).
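For example, a fluent mapping with FNH looks roughly like this (a sketch; the Product class and column names are made up):

```csharp
using FluentNHibernate.Mapping;

// Hypothetical domain class: plain C#, no NHibernate references at all.
public class Product
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
    public virtual decimal Price { get; set; }
}

// The mapping lives in code instead of an .hbm.xml file.
public class ProductMap : ClassMap<Product>
{
    public ProductMap()
    {
        Table("Products");
        Id(x => x.Id);
        Map(x => x.Name).Column("ProductName");
        Map(x => x.Price);
    }
}
```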
I found NHibernate to be the best tool around for modeling a domain because your objects are 100% infrastructure free and this allows you to do anything you want with them without having to worry about the database. Also if you are into unit testing of any kind this will make your life much easier :P
Edit
Part of the reason I replied to your question is that I have built many a system using stored procedures and came to regret that decision later. But in my situation I had a requirement from a DBA, so I couldn't step away from the norm.
If you are already looking at LINQ to SQL I would simply encourage you to look at NHibernate as another great option to achieve the goal of persistence ignorance.
When I got away from thinking about the problem data-first, it allowed me to work at a much higher level and take advantage of the .NET platform for solving problems. TSQL has its place, but if you are writing an application and want to maintain it, try to think about how your domain objects will work together in memory first.
If I use stored procedures, can I use an ORM?
EDIT:
If I can use an ORM, doesn't that defeat part of the database-agnosticism reason for using an ORM? In other words, why else would I want to use an ORM if I am binding myself to a particular database with stored procedures (or is that assumption wrong)?
Using ORM to access stored procedures is one of the best uses of ORM. It'll give you strongly typed objects, while you still have full control over the SQL.
In my experience, I would let the ORM handle the CRUD operations and leave the specialty work to the stored procedures. Generally, using a stored procedure for CRUD operations is overkill, and letting the ORM handle them can drastically improve your productivity.
Yes, you can, all main ORMs support stored procedures.
As for your assumption, you are partially right: when you use stored procedures with an ORM, you are coupling your project to a particular database. But in practice it is 99% certain that you will not need to change your database provider, so in this case you use the ORM not to abstract away the concrete DB provider, but to help yourself with the object-relational mapping task, which is an ORM's main task and what ORMs were originally made for.
It raises an interesting point.
Once you have an ORM and relatively simple queries, why do you need stored procedures? SPs are intimately bound to the database. An ORM frees you from having to maintain a lot of DB-specific code; what is DB-specific can be isolated and managed.
I suggest that an ORM is a golden opportunity to cut the complexity and put all the processing in the code, where it belongs.
Use the database for what it does best -- store data.
Use your application for what it does best -- process data.
You can use both ORM features and stored procedure functionality at once. Use the ORM as long as it serves you, but if you run into performance trouble or need some low-level tuning, bring stored procedures into your business logic.
Yes you can but you will want to spend some time investigating what capabilities the ORM provides around stored procedures.
Most will allow you to run a stored procedure that returns a strongly typed object/entity. More advanced ORMs will also allow you to plug in stored procedures for the CRUD actions (so your generic querying, deleting, etc. goes via a stored procedure rather than a dynamic query).
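For instance, with Dapper (as a sketch; the proc and entity are hypothetical), running a stored procedure and getting strongly typed entities back can be as simple as:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using Dapper;

// Hypothetical entity mapped from the proc's result set.
public class Invoice
{
    public int Id { get; set; }
    public decimal Amount { get; set; }
}

public static class InvoiceQueries
{
    // Runs a hypothetical dbo.GetOpenInvoices proc and maps rows to Invoice objects.
    public static List<Invoice> GetOpenInvoices(string connectionString, int customerId)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            return connection.Query<Invoice>(
                "dbo.GetOpenInvoices",
                new { CustomerId = customerId },
                commandType: CommandType.StoredProcedure).ToList();
        }
    }
}
```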
Generally, ORMs are great for generating ad-hoc queries and getting strongly typed entities back, but strong stored procedure support has the benefit of allowing you to (sometimes) more easily access native capabilities of your RDBMS that may not be exposed as first-class citizens in the ORM, especially if the ORM supports many database engines.
Following up on your edit:
Often you will want to use the ad-hoc querying engine provided by the ORM; however, as I alluded to earlier, sometimes you want to query using a capability not exposed by the ORM.
The benefit of strongly typed entities is invaluable, as it means you usually have domain objects rather than data readers, data tables, etc. You can cleanly encapsulate behaviors and logic within the entities you have retrieved.
The list of additional benefits is very long indeed: for example, with the LightSpeed ORM (and most others) your entities will support standard binding interfaces, error reporting interfaces, validation, etc. On the querying side you will lose out on lazy loading and the like unless you write it yourself.
Database "agnosticity" (?) is not the only reason to use an ORM. However, you could take advantage of being DB independent on 99% of your interactions with the DB and in 1% (or 2% or 10% or whatever) you might need stored procedures for speed/clarity/complexity. If you changed DBs, you would need to rewrite those.
I use netTiers a lot at work and we let it generate our stored procedures for us. These only handle the basic CRUD operations, but they are very fast and save me a TON of time. netTiers will also let us create custom stored procedures and generate our data access code with these procedures.
You can, but many of the more advanced ORM features tend to become more cumbersome to use. Something like iBATIS is very easy to integrate with stored procedures, while the more sophisticated features of more complex engines like (N)Hibernate, such as generation of dynamic SQL and lazy loading of large fields, can become more of a hassle than they're worth.
I believe that any tool that frees you from redoing work and lets you concentrate on solving the problem is valid. ORMs appear to be that tool when it comes to basic CRUD operations, even if you use SPs where they better implement a requirement (like using a hammer on a nail: it's just the right tool for the task).
The point is: there's no black or white, just a scale of gray. Very inefficient and badly coded applications use the excuse of being 'DB agnostic' to justify exaggerated use of DB resources. In many cases, being very tied to a database is not good either. The objective is to get maximum 'DB agnosticism' while not wasting customer IT resources needlessly.
There's no 'old vs new', just people saying that extreme 'pure' approaches are better. I don't really believe that. I believe that, as with any tool, the 'best' (notice the quotes) approach is to use the ORM for as long as it is still the right tool for your data access, and to use SPs inside your ORM when you reach a point where you're wasting resources and reducing the scalability and useful life of your IT resources. Or, in other words, use an SP when it is to the processing at hand what a hammer is to the nail.