Suggestions for caching a dataset - Delphi

I'd like to do the following:
1) Open a dataset (using TMSQuery, an SDAC component from DevArt)
2) Cache its contents to disk (imagine a list of customers)
3) The next time I need to open the dataset, first populate it with the cached data, then just refresh it by calling the TMSQuery.RefreshQuick method.
This way I expect a substantial speed improvement, because I don't need to retrieve records that I already retrieved in previous runs of the application.
How can I implement this caching? I have many datamodules with TMSQuery components, so I would like a global routine that runs every time I try to Open a TMSQuery: if that query is somehow tagged, try to restore it from the cache and call RefreshQuick; if this fails, call Open instead.
Can you please suggest an approach?
(I use Delphi 2009 and SDAC 4.80)

You can use the TClientDataSet and TDataSetProvider components for this, connecting them in this way:
TMSQuery -> TDataSetProvider -> TClientDataSet
The TClientDataSet is a very good option for persisting data to disk and retrieving it again; a sketch follows the links below.
See these links for more info about the ClientDataSet:
Using the MIDAS ClientDataset as a replacement for cached updates
A Guide to Using the TClientDataSet in Delphi Database Applications
Effective ClientDataSets and the BriefCase Model
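For example, a minimal sketch of that chain, assuming a datamodule with components named MSQueryCustomers, ProviderCustomers and CDSCustomers (all names illustrative; SaveToFile/LoadFromFile are standard TClientDataSet methods):

procedure TCustomerModule.OpenCustomers;
const
  CacheFile = 'customers.cds';
begin
  // chain: MSQueryCustomers -> ProviderCustomers -> CDSCustomers
  if FileExists(CacheFile) then
    CDSCustomers.LoadFromFile(CacheFile)   // restore the rows cached last run
  else
    CDSCustomers.Open;                     // first run: fetch through the provider
end;

procedure TCustomerModule.SaveCustomerCache;
begin
  // call this on shutdown to persist the current contents for the next run
  CDSCustomers.SaveToFile('customers.cds', dfBinary);
end;

(Remember that TClientDataSet needs midas.dll at runtime, or add the MidasLib unit to link it in.)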

You can do two things:
Make a descendant of the TMSQuery component and reintroduce the Open method (then search all your datamodule .dfm and .pas files for TMSQuery and replace it with TCachedTMSQuery), as sketched below.
Detour/hook TMSQuery.Open at runtime (runtime patching).
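A sketch of the first option, assuming a hypothetical RestoreFromCache helper that loads the persisted rows. Note that TDataSet.Open is not virtual, so the reintroduced method only takes effect when Open is called through a TCachedTMSQuery reference:

type
  TCachedTMSQuery = class(TMSQuery)
  public
    procedure Open; reintroduce;  // hides TDataSet.Open, which is not virtual
  end;

procedure TCachedTMSQuery.Open;
begin
  if Tag = 1 then  // only "tagged" queries get the cache treatment
  try
    RestoreFromCache(Self);  // hypothetical helper: load the persisted rows
    RefreshQuick(False);     // SDAC quick refresh; check the exact signature in your SDAC version
    Exit;
  except
    // fall through to a normal Open if the restore or refresh fails
  end;
  inherited Open;
end;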

Related

What should I use between FDQuery, FDMemTable, and ClientDataSet in Delphi FireDAC?

I'm new to Delphi. I'm developing an application that needs to access an MSSQL database. To do this I've used FDConnection, FDQuery and DataSource components connected to a grid. With these I can access/modify/delete data just fine. Now if, for example, I want to filter the grid, I can do this by changing the FDQuery component at run time, but I'm not sure if this is the right approach.
I've thought about using something that stores tables in memory, like ClientDataSets, because I'm not sure if FDQuery does this, so that I can manage data I've already retrieved without accessing the database more than needed. My problem is that I don't have a fundamental understanding of any of these components, so my question is:
Do I need to use anything else other than FDQuery?
A little more context on what I'm building: a UniGUI web application, with the MSSQL server in the same LAN as the web server, and multiple users accessing the DB.
Now that I understand these components better, I found this FAQ in Embarcadero's documentation that explains what I wanted to know.
Q1: Can I use TFDQuery and connect it to a dataset provider and retrieve the data in an Embarcadero client dataset?
A: TFDQuery is a mix of TFDMemTable, TFDTableAdapter and several TFDCommands. So TFDQuery has everything inside to execute SQL commands, send parameter data, receive and store result sets, browse result sets and post changes back to a database. There is no reason to use TFDQuery + DSP + CDS.
You can use TFDMemTable, TFDTableAdapter and TFDCommand directly, instead of TFDQuery. They give more flexibility, but also require more coding. Take for example synchronized cached updates across datasets. In other words, TFDQuery is an optimal "shortcut" for everyday data application programming.
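For instance, because TFDQuery buffers its fetched result set in memory, you can filter rows you've already retrieved without another round trip to the server (a sketch; component and table names are illustrative):

FDQueryCustomers.SQL.Text := 'SELECT * FROM Customers';
FDQueryCustomers.Open;            // rows are fetched and buffered client-side
// local filter on the in-memory result set; no SQL re-execution needed
FDQueryCustomers.Filter := 'Country = ''Italy''';
FDQueryCustomers.Filtered := True;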

Import Delphi data to Access [duplicate]

I need to insert 800000 records into an MS Access table. I am using Delphi 2007 and the TAdoXxxx components. The table contains some integer fields, one float field and one text field with only one character. There is a primary key on one of the integer fields (which is not autoinc) and two indexes on another integer and the float field.
Inserting the data using AdoTable.AppendRecord(...) takes more than 10 minutes, which is not acceptable since this is done every time the user starts using a new database with the program. I cannot prefill the table because the data comes from another database (which is not accessible through ADO).
I managed to get the time down to around 1 minute by writing the records to a tab-separated text file and using a TADOCommand object to execute
insert into table (...) select * from [filename.txt] in "c:\somedir" "Text;HDR=Yes"
But I don't like the overhead of this.
There must be a better way, I think.
EDIT:
Some additional information:
MS Access was chosen because it does not need any additional installation on the target machine(s) and the whole database is contained in one file which can be easily copied.
This is a single user application.
The data will be inserted only once and will not change for the lifetime of the database. However, the table contains one additional field that is used as a flag to indicate that the corresponding record in another database has been processed by the user.
One minute is acceptable (up to 3 minutes would be too) and my solution works, but it seems too complicated to me, so I thought there should be an easier way to do this.
Once the data has been inserted, the performance of the table is quite good.
When I started planning/implementing the feature of the program working with the Access database the table was not required. It only became necessary later on, when another feature was requested by the customer. (Isn't that always the case?)
EDIT:
From all the answers I got so far, it seems that I already got the fastest method for inserting that much data into an Access table. Thanks to everybody, I appreciate your help.
Since you've said that the 800K records won't change for the life of the database, I'd suggest linking to the text file as a table and skipping the insert altogether.
If you insist on pulling it into the database, then 800,000 records in 1 minute is over 13,000 / second. I don't think you're gonna beat that in MS Access.
If you want it to be more responsive for the user, then you might want to consider loading some minimal set of data, and setting up a background thread to load the rest while they work.
It would be quicker without the indexes. Can you add them after the import?
There are a number of suggestions that may be of interest in this thread: Slow MSAccess disk writing
What about skipping the text file and using ODBC or OLEDB to import directly from the source table? That would mean altering your FROM clause to use the source table name and an appropriate connect string as the IN '' part of the FROM clause.
EDIT:
Actually I see you say the original format is xBase, so it should be possible to use the xBase ISAM that is part of Jet instead of needing ODBC or OLEDB. That would look something like this:
INSERT INTO table (...)
SELECT *
FROM tablename IN 'c:\somedir\'[dBase 5.0;HDR=NO;IMEX=2;];
You might have to tweak that -- I just grabbed the connect string for a linked table pointing at a DBF file, so the parameters might be slightly different.
Your text-based solution seems the fastest, but you can make it quicker if you can start from a preallocated MS Access file that is already close to its final size. You can create one by filling a typical user database, closing the application (so the buffers are flushed) and manually deleting all records of that big table - but without shrinking/compacting the file.
Then use that file to start the real filling - Access will not need to request any (or very little) additional disk space. I don't remember if MS Access has a way to automate this, but it could help a lot...
How about an alternate arrangement...
Would it be an option to make a copy of an existing Access database file that already has this table, and then just delete all the other data in there besides this one large table? (I don't know if Access has an equivalent of something like "truncate table" in SQL Server.)
I would replace MS Access with another database; for your situation I see SQLite as the best choice: it doesn't require any installation on the client machine, and it's a very fast database and one of the best embedded database solutions.
You can use it in Delphi in two ways:
Download the database engine DLL from the SQLite website and use a free Delphi component to access it, such as the Delphi SQLite components or SQLite4Delphi.
Use DISQLite3, which has the engine built in, so you don't have to distribute the DLL with your application; they have a free version ;-)
If you still need to use MS Access, try to use TADOCommand with an SQL INSERT statement directly instead of TADOTable; that should be faster than TADOTable.Append.
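For example, a parameterized TADOCommand loop might look like this (a sketch; table and field names are illustrative):

var
  i: Integer;
begin
  ADOCommand1.Connection := ADOConnection1;
  ADOCommand1.CommandText :=
    'INSERT INTO TheTable (IntKey, FloatVal, CharVal) VALUES (:k, :f, :c)';
  for i := 1 to 800000 do
  begin
    ADOCommand1.Parameters.ParamByName('k').Value := i;
    ADOCommand1.Parameters.ParamByName('f').Value := i * 0.5;
    ADOCommand1.Parameters.ParamByName('c').Value := 'x';
    ADOCommand1.Execute;  // one round trip per row, but no dataset buffering
  end;
end;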
You won't be importing 800,000 records in less than a minute, as someone mentioned; that's really fast already.
You can skip the annoying translate-to-text-file step, however, if you use the right method (DAO recordsets) for doing the inserts. See a previous question I asked and had answered on Stack Overflow: MS Access: Why is ADODB.Recordset.BatchUpdate so much slower than Application.ImportXML?
Don't use INSERT INTO even with DAO; it's slow. Don't use ADO either; it's slow. But DAO + Delphi + Recordsets + instantiating the DbEngine COM object directly (instead of via the Access.Application object) will give you lots of speed.
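Roughly, that DAO route looks like this in Delphi, with a late-bound DBEngine COM object (a sketch; table and field names are illustrative, and the ProgID depends on the installed Jet/ACE version):

uses ComObj;

procedure BulkInsertViaDAO(const MdbFile: string);
var
  Engine, Db, Rs: OleVariant;
  i: Integer;
begin
  Engine := CreateOleObject('DAO.DBEngine.36');  // Jet 4; use 'DAO.DBEngine.120' for ACE
  Db := Engine.OpenDatabase(MdbFile);
  Rs := Db.OpenRecordset('TheTable', 1 { dbOpenTable });
  for i := 1 to 800000 do
  begin
    Rs.AddNew;  // one row per AddNew/Update pair
    Rs.Fields['IntKey'].Value := i;
    Rs.Fields['FloatVal'].Value := i * 0.5;
    Rs.Fields['CharVal'].Value := 'x';
    Rs.Update;
  end;
  Rs.Close;
  Db.Close;
end;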
You're looking in the right direction in one way: using a single statement to bulk insert will be faster than trying to iterate through the data and insert it row by row. Access, being a file-based database, will be exceedingly slow at iterative writes.
The problem is that Access is handling how it optimizes writes internally and there's not really any way to control it. You've probably reached the maximum efficiency of an INSERT statement. For additional speed, you should probably evaluate if there's any way around writing 800,000 records to the database every time you start the application.
Get SQL Server Express (free) and connect to it from Access as an external table. SQL Server Express is much faster than MS Access.
I would prefill the database, and hand them the file itself, rather than filling an existing (but empty) database.
If the data you have to fill changes, then keep an Access database (MDB file) synchronized on the server via ODBC, using a bit of code that watches for changes in the main database and copies them to the Access database.
When the user requests a new database zip up the MDB, transfer it to them, and open it.
Alternatively, you may be able to find code that opens databases and inserts data into them directly.
Or you may be able to find another format (other than CSV) that Access can import more quickly.
-Adam
Also check how long it takes to copy the file. That is the lower bound on how fast you can write data. In databases like SQL Server, it usually takes a bulk-load utility to get close to that speed. As far as I know, MS never created a tool that writes directly to MS Access tables the way bcp does. Specialized ETL tools will also optimize some of the steps surrounding the insert; for example, SSIS does transformations in memory, and DTS likewise had some optimizations.
Perhaps you could open an ADO Recordset on the table with lock mode adLockBatchOptimistic and CursorLocation adUseClient, write all the data to the recordset, then do a batch update (rs.UpdateBatch).
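In Delphi's dbGo terms that idea looks roughly like this (component and table names illustrative):

ADODataSet1.Connection := ADOConnection1;
ADODataSet1.CommandText := 'SELECT * FROM TheTable';
ADODataSet1.LockType := ltBatchOptimistic;   // adLockBatchOptimistic
ADODataSet1.CursorLocation := clUseClient;   // adUseClient
ADODataSet1.Open;
// ... Append / set field values / Post for each row ...
ADODataSet1.UpdateBatch(arAll);              // send all pending rows in one batch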
If it's coming from dBase, can you just copy the data and index files and attach them directly without loading? That should be pretty efficient (from the people who bring you FoxPro). I imagine it would use the existing indexes too.
At the very least, it should be a pretty efficient single-command import.
How much do the 800,000 records change from one creation to the next? Would it be possible to pre-populate the records and then just update the ones that have changed in the external database when creating the new database?
This might allow you to create the new database file more quickly.
How fast is your disk turning? If it's 7200RPM, then 800,000 rows in 3 minutes is still 37 rows per disk revolution. I don't think you're going to do much better than that.
Meanwhile, if the goal is to streamline the process, how about a table link?
You say you can't access the source database via ADO. Can you set up a table link in MS Access to a table or view in the source database? Then a simple append query from the table link would copy the data over from the source database to the target database for you. I'm not sure, but I think this would be pretty fast.
If you can't set up a table link until runtime, maybe you could build the table link programmatically via ADO, then build the append query programmatically, and then invoke it.
The best way is a bulk insert from a text file, as others have said: write your records to a text file, then bulk insert the text file into the table. That should take less than 3 seconds.

dbgrid without client dataset

I have a form with a DBGrid and a SQLQuery component, and I am trying to fill the DBGrid from the SQLQuery. When I do, I get the message "Operation not allowed on unidirectional dataset." I do NOT want to use a client dataset, as I do not want a 'local' copy of the data; I would like to read and display the data directly. How can this be done?
The documentation clearly states (emphasis added):
TSQLQuery is a unidirectional dataset. Unlike other datasets, unidirectional datasets do not buffer multiple records in memory. Because of this, you can only navigate using the First and Next methods. There is no built-in editing support: you can only edit the data in an SQL query by explicitly creating an SQL UPDATE command or by connecting the dataset to a client dataset using a provider.
Because there's no buffering of multiple records, you can't move in any direction except forward, which means that the DBGrid could not display multiple rows or support scrolling.
(In fact, all of the DBExpress components are unidirectional, according to the documentation on Types of DBExpress DataSets.)
You'll have to either use a TClientDataSet, switch from DBExpress to some other data access method such as ADO, or display the data using something other than TDBGrid (like a TStringGrid) and implement your own internal storage. However, a TClientDataSet doesn't have to mean a disk file: if the amount of data you're retrieving is manageable, it can all stay in memory without ever being a "local copy" on disk (an "in-memory dataset").
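Wired up in code, the in-memory chain is roughly this (component names illustrative):

// SQLQuery1 -> DataSetProvider1 -> ClientDataSet1 -> DataSource1 -> DBGrid1
DataSetProvider1.DataSet := SQLQuery1;
ClientDataSet1.ProviderName := 'DataSetProvider1';
DataSource1.DataSet := ClientDataSet1;  // the grid reads the buffered copy
ClientDataSet1.Open;  // fetches through the provider and keeps the rows in RAM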

Is it possible to have a detached TpFIBDataset for FIBPlus?

I remember that when I was working with ADO for Delphi (dbGo) there was a possibility of creating a detached dataset. The idea was that I could read all the data I wanted from the database and then set the Connection property to nil. That caused TADOQuery to work as a memory table. I could then use and pass TADOQuery as a TDataSet parameter to my other methods without worrying that I was keeping an unnecessary connection or transaction open.
I would like to have the same functionality when using the FIBPlus library. Currently I need to copy the data from TpFIBDataSet to another structure and then close the dataset. Otherwise, to access the rows of the dataset, the transaction must stay open, even if I have already fetched all the data.
I could not achieve detached-dataset functionality on my own; is this possible?
No, TpFIBDataSet cannot work as a standalone dataset. You should use TpFIBClientDataSet (if you want to apply updates to the database later) or any in-memory dataset (just for local reading).
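If you go the in-memory route, a rough sketch of the copy-then-close workaround with a standalone TClientDataSet (component names illustrative):

procedure CopyToMemory;
var
  i: Integer;
begin
  ClientDataSet1.FieldDefs.Assign(pFIBDataSet1.FieldDefs);
  ClientDataSet1.CreateDataSet;  // standalone CDS, no provider needed
  pFIBDataSet1.First;
  while not pFIBDataSet1.Eof do
  begin
    ClientDataSet1.Append;
    for i := 0 to pFIBDataSet1.FieldCount - 1 do
      ClientDataSet1.Fields[i].Value := pFIBDataSet1.Fields[i].Value;
    ClientDataSet1.Post;
    pFIBDataSet1.Next;
  end;
  pFIBDataSet1.Close;  // the transaction can now be ended
end;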

Is it possible to take a snapshot of a dataset?

In my application I use DB-aware components almost exclusively (except in a few places).
I have a scenario in which I create a master dataset (e.g. customers), a detail dataset (e.g. orders) and a subdetail dataset (e.g. order items). Typically I allow users to make changes (the datasets are in Browse mode) and then I post. Simple.
Anyway, when editing the subdetail dataset I want to add a kind of simple undo feature: a form opens to edit the dataset (with DB-aware components, so changes in the form change the dataset); if the user cancels the operation I would like to restore the dataset as it was before the form was opened.
To implement this I can think of making a copy of the dataset in a TClientDataSet or similar component, but are there other techniques? For example, is there an easy way in Delphi to create a "snapshot" of the data? In pseudocode:
MySubDetailDataSet.SaveSnapShot;
SubDetailForm.ShowModal;
if ModalResult = mrCancel then MySubDetailDataSet.RestoreSnapShot;
Is something like that possible "off the shelf" with Delphi components?
By the way, I use SDAC components from DevArt, so if you know a technique that is available only with those components and not with the standard Delphi ones, it is welcome!
In a client dataset changes are stored in a delta - you can call CancelUpdates to clear the delta and revert to the original dataset. There are other more granular approaches. See "Undoing changes" in the help.
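With a TClientDataSet that could look like this (a sketch; component names are illustrative):

// edits accumulate in the Delta while LogChanges is True (the default)
SubDetailForm.ShowModal;
if SubDetailForm.ModalResult = mrCancel then
  MySubDetailCDS.CancelUpdates     // clear the delta, reverting to the original rows
else
  MySubDetailCDS.ApplyUpdates(0);  // push the changes back through the provider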
If you're using an RDBMS and you're properly inside a transaction, you can roll back the transaction. Some databases offer savepoints so you can roll back to a given savepoint instead of rolling back the whole transaction, but that's database-specific. Transactions are usually per session, not per single table or query, so you have to be sure that only the changes you may need to roll back are performed in a given transaction.
Client datasets may be a "lighter" approach because they manage data client-side only and don't require database resources. While you're in a transaction inside the database, some resources are needed to keep track of it and of the changed data. A transaction should be as long as required, but no longer.
Be aware also that transactions may imply locks. Lock management can be very different from database to database, and some databases may escalate locks, blocking more users than needed. Always test with a realistic number of concurrent users to ensure transactions are used properly.
In AnyDAC you can do:
var
iPrevSP: Integer;
...
iPrevSP := MySubDetailDataSet.SavePoint;
SubDetailForm.ShowModal;
if ModalResult = mrCancel then
  MySubDetailDataSet.SavePoint := iPrevSP;
A similar technique is available with TClientDataSet and kbmMemTable. Probably not the answer, though, as you are using a DevArt product.
With DevArt I managed it by copying the data to a TVirtualTable (the DevArt version of a TClientDataSet); however, a SavePoint feature as easy as AnyDAC's is not there.
You can use a TClientDataSet, load it from a file or stream while keeping the original data saved aside, and every time you want to roll back, reload it from the original data.
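As a sketch, that round-trip with TClientDataSet's standard SaveToStream/LoadFromStream could look like this (names illustrative):

var
  Snapshot: TMemoryStream;
begin
  Snapshot := TMemoryStream.Create;
  try
    MySubDetailCDS.SaveToStream(Snapshot);  // take the snapshot
    SubDetailForm.ShowModal;
    if SubDetailForm.ModalResult = mrCancel then
    begin
      Snapshot.Position := 0;
      MySubDetailCDS.LoadFromStream(Snapshot);  // restore the pre-edit state
    end;
  finally
    Snapshot.Free;
  end;
end;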
