We're working on an ASP.NET MVC 4 project with an Oracle 11g database. The customer has asked us to add ad-hoc (OLAP) reports to our system, so we're looking at possible options. The user interface should be integrated into the existing ASP.NET MVC web site and the data source should be the Oracle DB. What are the best available options for such a configuration?
I guess this is rather late (being well over a year after the post!) but I would strongly recommend the OLAP option that is embedded into the 11g database.
Oracle have wrapped it in all sorts of crud and provided some not very good client tools (e.g. the OLAP worksheet) but the underlying engine, based on a tool originally called Express, has an extremely good pedigree and remains one of the best on the market. Performance is great, it has an excellent and fully-featured language and costs a fraction of Oracle's own Hyperion offering.
Best of all, it is embedded in the Oracle database allowing (relatively) easy transfer of data from one to the other (although they are still surprisingly much at arm's length, given how long the technology has been owned by Oracle).
Having had relatively limited success with the client tools provided by Oracle, we have tended to go back to basics and define/populate objects manually in the OLAP cube. Most of our applications tend to involve modelling/forecasting and so require write-back, which is a strength of Oracle OLAP but not well supported by the client tools, as Oracle would rather you used the more expensive Hyperion.
We built an ASP.NET MVC project (with EF) which runs on the private servers of our customers, some big enterprises (let's call it the e edition). Now we want to build an edition which can run in the cloud for other, smaller companies (the c edition).
The e edition is quite straightforward: an ASP.NET MVC website with some .mdf files (SQL Server Express is just enough) in App_Data.
For the c edition, there are some solutions:
For the website application itself:
1.1 Create each application as a standalone website --- which is costly and unmaintainable, so it's out of consideration.
1.2 A single website hosting the service for all the enterprises - sounds good. (Am I right?)
For the database, it's quite complex.
2.1 Add "Enterprise Id" in all the tables. Sounds terrible because a. performance could be low. b. security is difficult to maintain. c. The code of e edition and the c edition would be different, because the new added Id.
2.2 Create a different App_Data folder for each enterprise; the same website accesses the different folders via a programmatic connection string (sketched below). Both the website and the tables stay exactly the same in the e and c editions.
2.2 sounds great, but there is a big problem:
In a cloud like Azure (with which I'm not familiar), I don't think SQL Server Express is available. Instead, those "different folders" would become "different database instances", which is very costly, considering that there might be hundreds of trial enterprises. One workaround is to rent an "old style" host server, which is basically a virtual machine with Windows Server, so I can get SQL Server Express installed. But that really doesn't sound very modern.
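To make 2.2 concrete, here is a rough sketch of what I mean by a programmatic connection string; the names and the web.config convention ("Tenant_<EnterpriseId>") are just illustrative:

```csharp
using System.Configuration;
using System.Data.Entity;

// Same context class as the e edition; only the connection string changes per request.
public class AppDbContext : DbContext
{
    public AppDbContext(string connectionString) : base(connectionString) { }
    // DbSets exactly as in the e edition...
}

public static class TenantDatabases
{
    // "enterpriseId" would be resolved per request, e.g. from the sub-domain or the login.
    public static AppDbContext Open(string enterpriseId)
    {
        // One connection string per enterprise, e.g. "Tenant_Acme", "Tenant_Contoso",
        // each pointing at its own .mdf file / database.
        var settings = ConfigurationManager.ConnectionStrings["Tenant_" + enterpriseId];
        return new AppDbContext(settings.ConnectionString);
    }
}
```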
2.3 Create tables with a programmatic prefix in their names, so there is one DB but many tables for the different enterprises.
But there is another problem: I'm using EF Code First, so all the table names are fixed in [Table("TableName")] attributes at compile time.
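One workaround I can think of, assuming EF 6, would be to drop the [Table] attributes and map the names in OnModelCreating instead, using IDbModelCacheKeyProvider so EF builds and caches one model per prefix. A rough, untested sketch:

```csharp
using System.Data.Entity;
using System.Data.Entity.Infrastructure;

// Illustrative entity; the [Table] attribute is removed.
public class Order
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

public class TenantDbContext : DbContext, IDbModelCacheKeyProvider
{
    private readonly string _prefix;

    public TenantDbContext(string connectionString, string tenantPrefix)
        : base(connectionString)
    {
        _prefix = tenantPrefix;
    }

    // EF normally builds the model once per context type; returning the prefix here
    // makes it build and cache one model per tenant prefix instead.
    public string CacheKey { get { return _prefix; } }

    public DbSet<Order> Orders { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // e.g. "Acme_Orders" and "Contoso_Orders" living in the same database.
        modelBuilder.Entity<Order>().ToTable(_prefix + "_Orders");
    }
}
```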
After reading "Migrating MVC application to AZure Appservice and Cloud Service", I would say the Azure app service would be my first choice.
So, which one would be the best strategy, or am I missing some better solution? Thanks.
This is a matter of opinion and is likely to be closed. But a few things that you might not be aware of:
The Elastic Database Client Library - helps in managing N different customer databases. I have not used this library personally. Take a look here.
Elastic pools - if you do end up managing multiple databases, you don't have to pay rack rates per database. If load spikes vary across clients, you can define and pay for an elastic pool of DTUs, and all of your databases in the pool draw from that same pool. For lots and lots of databases, this can greatly reduce cost. The docs are here.
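As a rough, untested sketch (again, I haven't used the library myself), routing a connection through the Elastic Database client looks roughly like this, assuming a list shard map named "EnterpriseShardMap" keyed by an integer enterprise id has already been set up:

```csharp
using System.Data.SqlClient;
using Microsoft.Azure.SqlDatabase.ElasticScale.ShardManagement;

public static class EnterpriseShards
{
    // Opens a connection to whichever database holds the given enterprise's data.
    public static SqlConnection OpenForEnterprise(
        int enterpriseId, string shardMapManagerConnStr, string shardUserConnStr)
    {
        // The shard map manager database records which enterprise lives where.
        ShardMapManager manager = ShardMapManagerFactory.GetSqlShardMapManager(
            shardMapManagerConnStr, ShardMapManagerLoadPolicy.Lazy);

        ListShardMap<int> map = manager.GetListShardMap<int>("EnterpriseShardMap");

        // Routes to the correct shard and validates the mapping on open.
        return map.OpenConnectionForKey(
            enterpriseId, shardUserConnStr, ConnectionOptions.Validate);
    }
}
```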
What is the best way to write a client-server application in Delphi? I know there's the DataSnap technology, but it's not in the Professional version. Do you have any experience you can share?
This is a fairly wide-open question, as it can depend on your database decision.
DataSnap really allows for N-Tier solutions; if you're looking for client-server, you have most everything you need in the Professional version, depending on the database choice.
For Client Server:
Client Server Architecture is when the Client communicates directly with the server.
There are several frameworks available; they all follow the same pattern.
DB Connection -> Query -> (Optional Provider -> TClientDataset) -> TDataSource -> Visual Control
DBX
TSqlConnection - Connects to the Database
TSqlQuery - Query against DB producing uni-directional Dataset
TSqlStoredProc - Executes stored procedures against the DB
ADO
TAdoConnection - Connects to Database
TAdoQuery - Query against DB producing Bi-Directional Dataset
Common Components
TClientDataSet - In Memory dataset that is bi-directional
TDatasetProvider - Takes other datasets and ties the data to TClientDataset
TDataSource - Ties a Dataset to a data-aware visual control
There are several other options available depending on Database Choice.
However, you seem to be asking about N-Tier (Middle-Tier) type solutions
For N-Tier
N-Tier architecture is when the Client communicates with a Middle Tier that then communicates with the Server. It's referred to as N-Tier because you have the option of multiple Middle Tiers or Application Servers.
Commercial Options (require additional $$ to be spent)
DataSnap
DataAbstract
RemObjects SDK (Part of DataAbstract but can be used by itself)
kbmMW
Midware
I personally don't know of any free or open source options, although I suspect some exist.
Two options:
DIY (Do It Yourself). Write a communications layer and protocol yourself using Indy and/or ICS internet components. A lot of hard work and needs a lot of testing to get right.
Use a ready-made framework such as kbmMW: http://components4developers.com/ or RemObjects: http://www.remobjects.com/ Both are not free, but well worth the money you pay, even if only measured by the development time/costs that you save.
You can use WST, a free and open source toolkit for web service consumption and creation with support for SOAP, XML-RPC and JSON-RPC (the JSON-RPC support is available only for FPC). It is compatible with Delphi. Better to check it out from SVN, as the 0.5 release is actually outdated.
With Delphi Professional it is possible to write simple (no WS-* standards, no SOAP 1.2 servers) SOAP client and server applications.
In many cases, SOAP offers advantages regarding cross-platform / cross-language integration, standardization, design-by-contract and mature implementation guidelines, best practices and patterns.
For SOAP there are great (and free) tools like SoapUI, and IDE editors for Web Service Description Language (WSDL) documents like NetBeans.
Take a look at our Open Source Client/Server ORM.
It's multi-tier compatible, and you can have ORM at both Client and Server level.
ORM is used everywhere, and JSON is the format chosen for the Client/Server transmission.
You can start your application as a local application; then, just by changing the class type used to access the data, it will become a Client/Server application communicating via named pipes, HTTP/1.1 or GDI messages.
It was designed to work with SQLite3 as a small but efficient database engine on the server side, but you can use the ORM without SQLite3. There is a pure Delphi in-memory engine provided, if you prefer.
This framework tries to implement an N-Tier architecture from the bottom up.
The upcoming 1.13 version will have a powerful filtering and validation mechanism, perfect for an N-Tier architecture. There are some user-interface units, with full reporting (and PDF generation), able to create most of the user interface from code, using the ORM layout of the data.
It's based on the RESTful paradigm to access the data from the Client, via JSON. And there is an easy way of implementing Client/Server services if the RESTful approach is not enough, just like DataSnap.
It's Unicode ready (uses UTF-8 at all internal level), and works with every version of the IDE, from Delphi 6 up to XE (even the Starter edition).
A few months ago I stopped implementing new projects with this kind of architecture (n-tier, 2-tier) based on Delphi and specific DB technologies. I believe these architectures are not future-proof. The architecture I'm using now is a 2-tier one. The server is a normal HTTP server. It works as an app server* and optionally provides a web client. Developing clients in Delphi is harder, but worth it. Since specific tools are not available like the ones offered for DB connections, I use Indy to send and receive data from the HTTP server. I do a GET request to fetch data and then parse it to show it on the GUI. Then a POST request to update or insert new data. The HTTP server handles all the business logic :-)
Apart from being future-proof, this architecture is cheaper and platform independent. And if you analyze it, this is the same architecture used by most mobile apps. So, if you plan to write a mobile client in the future, consider developing the app server with a scripting language (Python, PHP, Ruby, etc.).
That's my recommendation. Don't forget: Great things require great commitments!
* An app server is a service which provides your application (thin client) with an interface to get and send data. It also controls the business logic. Your application doesn't care about DBs or about controlling record relations and data constraints. That is all done transparently by the app server.
For general-purpose client-server communication you can use our lightweight MsgConnect product. This is a cross-platform MOM (message-oriented-middleware).
At the moment I have a fairly big website with about 10k visitors a day.
This is a community website with news/blogs/videos and a big forum.
This all runs on a self-made PHP 5 application which performs fairly well. The database is a MySQL 5.1 database.
Now I am getting fed up with PHP and its loose typing, lack of namespacing and lack of a proper MVC setup, so I am thinking of rewriting the site in ASP.NET MVC 3.
Now, I have experience with this, but not with the MVC framework yet, and I have a few questions about performance, especially around the Entity Framework:
Is it even worth using the Entity Framework? Will it cost a lot of overhead and performance degradation? I am not sure yet whether I should switch to MSSQL.
Entity Framework is not optimized for performance, as far as I have worked with it; that is not its primary goal by design. Performance is important of course, but EF is above all easy to use and easy to start with. It provides a fast way to integrate with an existing database, or to create a database really, really fast. Performance is not the primary goal, but it is indeed improving.
Here is a good chart with some benchmarks from the Microsoft team that can serve as a basis for making the decision.
As you can see, there is a dramatic performance improvement from .NET 4.0 to 4.5 as far as "LINQ to Entities" is concerned. It still seems about twice as slow as direct SQL, though. LINQ to SQL is the slowest option, of course.
ASP.NET MVC is a good direction, especially if you want to try something different. EF is not the best you can do if you are after performance. Writing the business layer on your own, including the SQL stored procedures, would be better, but will require a lot more time.
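If you do end up with EF anyway, a middle ground is to keep LINQ for the simple screens and drop down to raw SQL or a stored procedure for the hot paths. A rough sketch (the procedure and type names are made up) using EF's SqlQuery:

```csharp
using System.Collections.Generic;
using System.Data.Entity;
using System.Data.SqlClient;
using System.Linq;

public class ThreadSummary
{
    public int Id { get; set; }
    public string Title { get; set; }
    public int ReplyCount { get; set; }
}

public class SiteContext : DbContext
{
    // Hot path: bypasses LINQ-to-Entities translation and change tracking entirely.
    public List<ThreadSummary> TopForumThreads(int take)
    {
        return Database.SqlQuery<ThreadSummary>(
                "EXEC dbo.GetTopForumThreads @take",
                new SqlParameter("@take", take))
            .ToList();
    }
}
```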
When using an ORM like EF or NHibernate you always have to live with the trade-off between performance and convenience. If you can live with relatively poor performance (I think it should be possible to run your site with an ORM), NHibernate should be your first choice; from my point of view it is more mature, while EF is still lacking provider support and has some shortcomings in the development workflow (which you might expect when using NHibernate, but not when using an MS tool).
If you switch from PHP to .NET you should also consider switching from MySQL to MSSQL, simply because it fits perfectly into the MS ecosystem (and performance/scalability should improve too, which could possibly outweigh the performance degradation you experience when using an ORM).
You could also take a look at LINQ, which can sit between a classic ORM and hand-coded SQL commands (also with respect to performance: LINQ to SQL is pretty fast, and you can use LINQ to Entities when using EF, though that's pretty slow). LINQ would fit your needs if you want some level of abstraction without endless configuration and if you like RAD (who doesn't?).
In general the performance of ASP.NET MVC 3 is quite good, but you should know that you need some experience to tweak your application and avoid (common) pitfalls. Cutting a long story short: you should easily be able to write an ASP.NET application that performs better than a scripted PHP page (by design).
But you should also know that if you decide to commit yourself to the MS ecosystem (.NET, MSSQL with LINQ/EF), it's hard to break out, and LINQ and EF providers for non-MS RDBMSs might cost some bucks (check www.devart.com).
Hope this gives you some guidance
Further reading:
ORM wars: Comparing nHibernate, LINQ To SQL & the Entity Framework
NHibernate vs Entity Framework: a performance test
NHibernate vs. Entity Framework 4.0
This is far too general a question, and there's no good answer to it. I highly recommend that you perform some tests using the Entity Framework as well as MVC 3 and see whether it meets your needs.
Also, and not to be condescending, but 10K visitors a day is not that much compared to other sites out there that are successfully running ASP.NET MVC and Entity Framework.
In the end, I'd say it most definitely will meet your needs, but as with any project that scales, you will have to be aware of bottlenecks in your particular app and come up with solutions to address those bottlenecks.
This is 100% dependent on the queries.
I've worked on sites with 22 million a month with NHibernate and about 10 million a month with LINQ to SQL.
The queries that performed the slowest were always those weird aggregate quintuple-join monsters. ORMs get you 95% of the way there; the rest you'll have to optimize. No ORM gets SELECT * FROM table wrong. It's the outliers that matter.
This will probably get closed as subjective/argumentative in the next five minutes, but I'll give it a shot:
You'll hear a lot of differing views regarding EF and other ORMs, but my personal view is that they cause more problems than they solve. Queries should be kept in the database, where they belong. Proper separation of concerns (SQL code in the database, app code in the project) saves you a lot of problems long-term.
Think of it this way: if your queries were all in stored procedures, you wouldn't even be thinking about migrating/rewriting them at this stage. You could focus on implementing the web side of your app instead of worrying about how to transition your SQL code.
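For illustration, keeping the queries in the database roughly looks like this from the web app side; the stored procedure name and columns here are made up:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public static class PostRepository
{
    // All of the query logic lives in dbo.GetLatestPosts; the C# side only maps the rows.
    public static List<string> GetLatestPostTitles(string connectionString, int count)
    {
        var titles = new List<string>();

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.GetLatestPosts", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@count", count);

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    titles.Add(reader.GetString(reader.GetOrdinal("Title")));
            }
        }

        return titles;
    }
}
```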
So, skip EF for now, focus on what you know.
I'm project managing an intranet application being developed at work. We're in the early planning stages. I've previously done all my development in Python using Django, but as we're a windows shop we're probably going to go with ASP.NET MVC.
We won't really be able to afford a SQL Server license though, so we are perhaps looking into using PostgreSQL. However, I can't seem to find many examples or guides for people who want to utilise a third-party ORM - or at least an ORM with similar usage to Django's - that works with PostgreSQL.
Ultimately we'd like to handle authentication via Active Directory [including groups], but store actual content within the db.
There have been previous questions of a similar nature, but most of them are over a year old, from when MVC was still in beta.
Any ideas?
NHibernate by a country mile.
It also supports MySQL, and should you want to change, it has the main commercial ones too. I haven't switched between DB vendors, but if you don't do much bespoke T-SQL and, say, use Fluent NHibernate, you could almost plug and play between database platforms (a rough sketch follows below).
The support and community behind NHibernate when it comes to MVC is second to none. It is categorically the ORM of choice.
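A rough sketch of the Fluent NHibernate setup mentioned above; the entity and mapping are purely illustrative, and swapping the single PostgreSQLConfiguration line is most of the "plug and play" (assuming your mappings avoid vendor-specific SQL):

```csharp
using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;
using FluentNHibernate.Mapping;
using NHibernate;

// Illustrative entity and mapping.
public class Article
{
    public virtual int Id { get; set; }
    public virtual string Title { get; set; }
}

public class ArticleMap : ClassMap<Article>
{
    public ArticleMap()
    {
        Id(x => x.Id);
        Map(x => x.Title);
    }
}

public static class SessionFactoryBuilder
{
    public static ISessionFactory Build(string connectionString)
    {
        return Fluently.Configure()
            // Swap this one line to target a different database platform,
            // e.g. MsSqlConfiguration.MsSql2008 or MySQLConfiguration.Standard.
            .Database(PostgreSQLConfiguration.Standard.ConnectionString(connectionString))
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<ArticleMap>())
            .BuildSessionFactory();
    }
}
```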
You can try DataObjects.Net - an open source ORM with GPLv3 or commercial licenses. It also supports Postgres.
NHibernate supports PostgreSQL. See http://vampirebasic.blogspot.com/2009/02/nhibernate-with-postgresql.html for some advice on how to integrate the two.
Also worth mentioning is that NHibernate now supports LINQ syntax. See http://ayende.com/Blog/archive/2009/07/26/nhibernate-linq-1.0-released.aspx for more details.
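As a small illustration of the LINQ support, a query would look something like this (with a minimal Article entity again, just for the example); note that the older NHibernate.Linq 1.0 add-on exposed session.Linq<T>(), while NHibernate 3 onwards uses session.Query<T>():

```csharp
using System.Linq;
using NHibernate;
using NHibernate.Linq; // provides the LINQ extension methods on ISession

// Illustrative mapped entity.
public class Article
{
    public virtual int Id { get; set; }
    public virtual string Title { get; set; }
}

public static class ArticleQueries
{
    public static Article FindByTitle(ISession session, string title)
    {
        // NHibernate 3+: session.Query<T>(); NHibernate.Linq 1.0 used session.Linq<T>().
        return session.Query<Article>()
                      .FirstOrDefault(a => a.Title == title);
    }
}
```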
I'd recommend Mindscape LightSpeed. It supports PostgreSQL and has a visual designer baked into Visual Studio with full database round tripping to PostgreSQL.
When there was no Visual Studio add-in to support PostgreSQL from the Server Explorer, the guys wrote a free add-in for it themselves.
It is a solid O/R mapper with LINQ, combined with first-class visual model development against PostgreSQL. It is a commercial product; however, there is a free edition for small databases.
Mindscape LightSpeed O/R Mapper
Try Devart LinqConnect - http://www.devart.com/linqconnect/. This framework supports PostgreSQL, Oracle, MySQL and SQLite.
Unfortunately, most answers you get on a question like this are going to be based on the responder's opinion and experience and not based on yours.
Most of the suggestions here are good... however... if you are looking for a lightweight/fast ORM that is similar to Django, JackD has the right solution (LinqConnect)...
I've used most of the solutions listed including Django, and find that I usually pick LinqConnect if I'm looking for fast, lightweight and easy. For heavier (read larger) projects I would use something more robust like NHibernate.
But to answer your question correctly... the closest match and least learning curve for someone using Django would definitely be LinqConnect.
I am thinking about using SQLite.
It is a self-contained, serverless, zero-configuration, transactional SQL database engine and is open source.
Will I gain anything by using Blackfish instead of SQLite?
Why not Firebird?
http://www.firebirdsql.org
http://www.firebirdfaq.org
"...Firebird is a relational database offering many ANSI SQL standard features that runs on Linux, Windows, and a variety of Unix platforms. Firebird offers excellent concurrency, high performance, and powerful language support for stored procedures and triggers. It has been used in production systems, under a variety of names, since 1981...."
Stick with FOSS (Free Open Source Software).
Both SQLite and Firebird are excellent choices. Both fill your requirements. Both are very reliable, zero-configuration and support transactions.
Without knowing enough about your intended use, Firebird would be my first choice because it makes possible to migrate to a Client Server deployment with close to zero effort and it has a very robust set of features. It is all about the options.
IMHO Blackfish is not a good choice - "Blackfish SQL runs on both the .NET framework and on the Java platform." - enough said.
Use SQLite.
Much smaller, less overhead, no licensing issues etc.
Lastly, only you can decide which one will do everything you need a DB system to do. Which one has all of the features you need it to support?
John
I will vote for SQLite first, because it's compact, light and fast, but depending on your application you may go with other choices like Firebird or PostgreSQL.
For example, SQLite is limited to one writer at a time. That may not be a problem for most desktop applications used by a single user, but it will not scale to more users in the future.
You can go with Firebird Embedded as a solution that will act like SQLite, then move to Firebird Server when you need more users.
On the other hand, the Blackfish database will force you to install the .NET Framework on your clients' machines, which isn't good IMO, especially if you need to distribute your application over the Internet. Besides, it will require a license when it scales beyond the developer edition.
Another thing to consider is how you are going to access your data. If you are using DBX4, for instance, it is very easy to change which database you are looking at, in which case I would suggest trying both (plus any others that people highly recommend you try). On the other hand, if you are using the InterBase or ADO data access components, then your ability to change is somewhat more limited.