Delphi: CSV import into Firebird - delphi

Let me explain my problem.
I have one ID and two other fields in a CSV file. The ID refers to a record in a database table.
I have to show the corresponding entries from the database together with the fields from the CSV, and I need to sort on those fields too.
My idea was to load the CSV into a TClientDataSet, use lookup fields against a query on the table, then sort and display.
My CSV has 85 K records, and it takes 120 seconds to load and sort, which is not acceptable. Can you tell me whether I can use BatchMove for this, so I can pick the fields with a simple query? If I can use BatchMove, please give me some guidelines.
Also, is there any other technique for this?
Thanks and Regards,
Vijesh V.Nair

Maybe you can take a look at a global temporary table, but I don't think it will be quicker.
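For reference, Firebird 2.1 and later support these natively; a minimal sketch of a staging table for the three CSV fields (the table and column names are placeholders):

CREATE GLOBAL TEMPORARY TABLE CSV_STAGE
(
    ID INTEGER,
    FIELD1 VARCHAR(50),
    FIELD2 VARCHAR(50)
)
ON COMMIT PRESERVE ROWS;

PRESERVE ROWS keeps the staged rows for the lifetime of the connection rather than deleting them at every commit.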

I often convert data from CSV files to a Firebird server too.
Usually I do it like this:
- Use MS Excel to open the CSV file first, to check that the file is not corrupt.
- Still in Excel, save the CSV file in XLS format, then close Excel.
- Using the Axolot XLSReadWrite component (www.axolot.com), read the file cell by cell and insert the values into a memory table.
- Using the FIBPlus components (www.devrace.com), insert them into the Firebird server.
- Job done, go home.
In this case, of course, you must prepare your Firebird server too. Once the conversion is finished, everything is actually easy. You can also use the UIB components (www.progdigy.com) to connect to the Firebird server. Do not use the InterBase Express (IBX) components, because their author has EXPLICITLY stated that IBX was never intended for Firebird.
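If the file is clean you can also skip the Excel detour and insert straight from the text file. A minimal sketch of that parse-and-insert step with FIBPlus (pFIBQuery1, pFIBDatabase1, MY_TABLE and the column names are placeholders; wrapping all inserts in a single transaction matters a lot for speed on Firebird):

procedure ImportCsv(const FileName: string);
var
  Lines, Cols: TStringList;
  I: Integer;
begin
  Lines := TStringList.Create;
  Cols := TStringList.Create;
  try
    Cols.Delimiter := ',';
    Cols.StrictDelimiter := True; // split on commas only (Delphi 2006+)
    Lines.LoadFromFile(FileName);
    pFIBQuery1.SQL.Text :=
      'INSERT INTO MY_TABLE (ID, FIELD1, FIELD2) VALUES (:ID, :F1, :F2)';
    pFIBDatabase1.DefaultTransaction.StartTransaction;
    for I := 0 to Lines.Count - 1 do
    begin
      Cols.DelimitedText := Lines[I]; // naive: assumes no quoted/embedded commas
      pFIBQuery1.ParamByName('ID').AsInteger := StrToInt(Cols[0]);
      pFIBQuery1.ParamByName('F1').AsString  := Cols[1];
      pFIBQuery1.ParamByName('F2').AsString  := Cols[2];
      pFIBQuery1.ExecQuery;
    end;
    pFIBDatabase1.DefaultTransaction.Commit; // one transaction for all rows
  finally
    Cols.Free;
    Lines.Free;
  end;
end;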

Related

Assign, memcpy or other method required for TClientDataset or TDataSetProvider (dbexpress)

I'm using the dbExpress components within the Embarcadero C++Builder XE environment.
I have a relatively large table with somewhere between 20k and 100k records, which I display in a DBGrid.
I am using a DataSetProvider, which is connected to a SQLQuery, and a ClientDataSet, which is connected to the DataSetProvider.
I also need to analyze the data, and for that I need to run through the whole table. For smaller tables I have always used code which is basically something like this:
Form1->ClientDataSet1->First();
while (!Form1->ClientDataSet1->Eof) {
    temp = Form1->ClientDataSet1->FieldByName("FailReason")->AsLargeInt;
    // do something with temp
    Form1->ClientDataSet1->Next();
}
Of course this works, but it is very slow when I need to run through the whole table; for some 50,000 records it can take up to several minutes. My suspicion is that most of the performance is lost because the DBGrid needs to be repainted every time the dataset moves to the next record.
Therefore I am looking for a way to read the data without manipulating the visible ClientDataSet: maybe a method which copies the data of a column into a variable, or another, more efficient way to run through the records. I am sure that if I had a copy in a variable, the operation would take less than a few seconds...
I have googled for hours now, but haven't found anything useful so far.
Best regards,
Bodo
If your CDS is connected to one or more DB-aware controls (via a TDataSource), then first of all consider calling DisableControls() before the loop (and EnableControls() afterwards).
Another option is to avoid calling FieldByName inside the loop: look the TField up once beforehand and reuse it.
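A rough sketch of both suggestions applied to the loop from the question (keeping the question's names; DisableControls stops the grid from repainting on every Next):

__int64 temp;
TDataSet *cds = Form1->ClientDataSet1;
TField *failReason = cds->FieldByName("FailReason"); // look the field up once
cds->DisableControls(); // detach DB-aware controls while iterating
try {
    for (cds->First(); !cds->Eof; cds->Next()) {
        temp = failReason->AsLargeInt;
        // do something with temp
    }
}
__finally {
    cds->EnableControls(); // always reattach, even on exceptions
}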

How to execute SQL statements on a dataset which didn't come from a database?

Suppose I have an application which fetches a custom XML packet from the server, representing a dataset. Then suppose I wish to execute a SQL statement on that data via a dataset. What can I use to do this? I don't necessarily need the code, just what to use to make this possible and a general explanation of how it works.
For example, I may fetch a list of customers in XML format from the server. Then, I can use any third-party parser to dump that XML data into some client dataset. Then, execute a query on that dataset, for example select * from customers where ZipCode = '12345', without fetching this data from the server again.
XML is not the only scenario; that's just an example. I might want to do the same with some application settings loaded from an INI file. Either way, the concept is that the original source of the data is unknown.
Whether the dataset stores its temporary data in memory or on disk doesn't matter, but it would be excellent if it could keep it on disk.
TXQuery (http://code.google.com/p/txquery/) is a component that provides a local SQL engine for executing SQL queries against one or more TDataSets. The only issue I have had with it is updating data via a TDBGrid when the query joins multiple tables (TDataSets), specifically determining which table is being updated.
AnyDAC v6 (now FireDAC) also has a local SQL engine: http://www.da-soft.com/anydac/docu/frames.html?frmname=topic&frmfile=Local_SQL.html
Edit: For the example SQL in your question, because it only involves a single table, you can do this with just a Filter on the dataset. For example:
ADataSet.Filtered := False;
ADataSet.Filter := 'ZipCode=' + QuotedStr('12345');
ADataSet.Filtered := True;
Such a feature can be achieved with a local database. You just insert the TDataSet results into a local in-memory (or file-based) stand-alone database, and then you can use regular SQL queries on it, including JOINs.
You can, for instance, use SQLite3, or the free edition of NexusDB.
NexusDB embedded has the benefit of being a native Delphi database, so it sticks to the DB.pas TDataSet paradigm.
Another option is to use the so-called virtual table mechanism of SQLite3, which allows you to expose any data (even from a TDataSet, XML, JSON, or in-memory objects) to the SQLite3 engine, just like regular tables. Then you can run SQL statements on those "virtual" tables, including JOINs. With this approach you do not need to INSERT the data into regular tables; the data remain in their original form. Of course, you will miss some performance features like indexes, which have to be handled on the virtual table provider side. We use this feature as the database core of our mORMot ORM/SOA framework, and it is pretty powerful.
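As a rough sketch of the first approach, here is what copying an arbitrary source dataset into an in-memory SQLite database could look like with FireDAC (mentioned in another answer); SourceDataSet, the table name, and the column names are placeholders:

var
  Conn: TFDConnection;
begin
  // uses FireDAC.Comp.Client, FireDAC.Phys.SQLite
  Conn := TFDConnection.Create(nil);
  try
    Conn.DriverName := 'SQLite';
    Conn.Params.Database := ':memory:'; // use a file name to keep it on disk
    Conn.Connected := True;
    Conn.ExecSQL('CREATE TABLE customers (Name TEXT, ZipCode TEXT)');
    // copy the rows of the source dataset, wherever they came from
    SourceDataSet.First;
    while not SourceDataSet.Eof do
    begin
      Conn.ExecSQL('INSERT INTO customers (Name, ZipCode) VALUES (:n, :z)',
        [SourceDataSet.FieldByName('Name').AsString,
         SourceDataSet.FieldByName('ZipCode').AsString]);
      SourceDataSet.Next;
    end;
    // from here on, regular SQL (including JOINs) works against the copy,
    // e.g. SELECT * FROM customers WHERE ZipCode = '12345' via a TFDQuery
  finally
    Conn.Free;
  end;
end;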
The general process that you want to perform is complicated by the difference in data representation. SQL data is stored in tables made up of distinguishable records. XML is a structured representation of data, but in tree form rather than table/row form.
Each of these data forms may be qualified by a schema that provides a context for the data.
You have two general paths that you can follow:
Take the XML and, based on the schema, insert it into a set of interlinked tables, then perform the SQL query. If you have the schema, you can use code generators to make a parser, and then, based on the parse tree, insert into a local database with tables constructed on the fly. You can set up MySQL pretty easily (see https://dev.mysql.com/doc/refman/5.7/en/installing.html), then make a connection to the database from your version of Delphi, fill it in first, and query afterwards. This would satisfy your desire to have the data stored on disk; unless you purge the tables when done, the data remain available in the local machine's database.
This seems like more work than:
Use XPath or XQuery and work directly on the XML. For this, a library like Saxon in your favorite environment, or expat in Python, would work nicely.
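If you stay in Delphi, a minimal sketch of the XPath route using MSXML via late binding (the XML layout and element names are assumptions, mirroring the customers example above):

uses ComObj;

procedure FindCustomers(const Xml: string);
var
  Doc, Nodes: OleVariant;
  I: Integer;
begin
  Doc := CreateOleObject('MSXML2.DOMDocument.6.0');
  Doc.async := False;
  Doc.loadXML(Xml);
  // select only the customers with the wanted zip code
  Nodes := Doc.selectNodes('/customers/customer[zipCode="12345"]');
  for I := 0 to Nodes.length - 1 do
    Writeln(Nodes.item(I).text);
end;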
Let me know if either of these paths seems as if it may be fruitful.

Delphi's GetTableNames sometimes returns tables with owner names, and sometimes not

I have some apps that connect to ODBC databases using ADO and the BDE. I call GetTableNames to return a list of tables. Sometimes I find that the table names are qualified with owner names, and sometimes not. The permutations are mysterious to me. Can anyone shed light on this?
Most DBMSs these days provide a way to retrieve a list of table names in a result set. I would suggest this approach rather than using the built-in GetTableNames function.
For example, on MySQL it is
SHOW TABLES
on MS SQL Server it is:
SELECT name FROM <database name>..sysobjects where xtype = 'U';
Hope that helps
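A minimal sketch of that approach with the ADO components the question already uses (the procedure is just an illustration; swap the SELECT for your DBMS's equivalent, e.g. SHOW TABLES on MySQL):

procedure ListUserTables(Query: TADOQuery; Dest: TStrings);
begin
  Query.SQL.Text := 'SELECT name FROM sysobjects WHERE xtype = ''U''';
  Query.Open;
  try
    Dest.Clear;
    while not Query.Eof do
    begin
      Dest.Add(Query.Fields[0].AsString); // plain names, no owner prefix
      Query.Next;
    end;
  finally
    Query.Close;
  end;
end;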

Querying a TClientDataSet using a TADOQuery

My question is very simple. I have a TClientDataSet that is linked to a TADOQuery via a TDataSetProvider. I can put data into the TClientDataSet from the TADOQuery, but how do I get data from the TClientDataSet back into the TADOQuery?
Data is automatically transferred from the TADOQuery to the TClientDataSet when I run a query and then set the TClientDataSet's Active property to True, but if I deactivate the TADOQuery and then activate it again, how can I get the data back from the TClientDataSet?
I am running the same query on several databases and using the TClientDataSet to concatenate the results. This is working fine. My problem now is that I need to get the concatenated result set back from the TClientDataSet into the TADOQuery so that I can use the TADOQuery's SaveToFile procedure (for compatibility reasons). How can I do this?
I don't use TADOQuery, as I use dbExpress, but I imagine one needs the same technique. After you have posted your changes to the TClientDataSet, call ApplyUpdates(0), which transfers the data from the ClientDataSet back through its provider.
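In other words, assuming the usual TADOQuery → TDataSetProvider → TClientDataSet chain from the question, a minimal sketch is:

if ClientDataSet1.State in [dsEdit, dsInsert] then
  ClientDataSet1.Post;          // post any pending edit first
ClientDataSet1.ApplyUpdates(0); // 0 = stop at the first resolving error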
You could always write the dataset back out to a temp table and then query it. Ouch!!
I've just about finished looking into this. My application allows the user to generate reports by querying their databases. I can get this to work, and it is very efficient for small result sets. However, as this is a reporting application, it's entirely possible that hundreds of thousands of records can be returned, and using a ClientDataSet gives massive performance problems. Once you get above around 50,000 records (reasonable, given the customer base), processing time starts to increase exponentially, so this is now basically moot.

MSSQL2000: Using a stored procedure results as a table in sql

Let's say I have 'myStoredProcedure' that takes in an Id as a parameter, and returns a table of information.
Is it possible to write a SQL statement similar to this?
SELECT
MyColumn
FROM
Table-ify('myStoredProcedure ' + #MyId) AS [MyTable]
I get the feeling that it's not, but it would be very beneficial in a scenario I have with legacy code and linked-server tables.
Thanks!
You can use a table-valued function in this way.
Here are a few tricks...
No, it is not - at least not in any official or documented way - unless you change your stored procedure to a TVF.
However, there are ways (read: hacks) to do it. All of them basically involve a linked server and using OPENQUERY - for example, see here. Do note that this is quite fragile, as you need to hardcode the name of the server, so it can be problematic if you have multiple SQL Server instances with different names.
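A sketch of the linked-server hack, assuming a loopback linked server named LOCALSRV pointing back at this instance (the database, procedure, and Id value are placeholders; note that OPENQUERY only accepts a literal string, so the Id cannot be passed as a variable):

SELECT MyColumn
FROM OPENQUERY(LOCALSRV,
    'SET FMTONLY OFF; EXEC MyDatabase.dbo.myStoredProcedure 42') AS MyTable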
Here is a pretty good summary of the ways of sharing data between stored procedures http://www.sommarskog.se/share_data.html.
Basically it depends on what you want to do. The most common ways are creating a temporary table prior to calling the stored procedure and having the procedure fill it, or having one permanent table, which also contains the process ID, that the stored procedure dumps its data into.
Table-valued functions have been mentioned, but there are a number of restrictions when you create a function as opposed to a stored procedure, so they may or may not be right for you. The link provides a good guide to what is available.
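For comparison, a minimal inline table-valued function (all names are placeholders), which can be used in a FROM clause exactly the way the question wants:

CREATE FUNCTION dbo.MyTableify (@MyId INT)
RETURNS TABLE
AS
RETURN
(
    SELECT MyColumn
    FROM dbo.SomeLegacyTable
    WHERE Id = @MyId
)
GO

-- then, usable exactly like a table:
SELECT MyColumn FROM dbo.MyTableify(42) AS MyTable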
SQL Server 2005 and SQL Server 2008 change the options a bit. SQL Server 2005+ makes working with XML much easier, so XML can be passed as an output variable and fairly easily "shredded" into a table using the XML methods nodes and value. I believe SQL Server 2008 allows table variables to be passed into stored procedures (although read-only). Since you cited SQL 2000, the 2005+ enhancements don't apply to you, but I mention them for completeness.
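For completeness, the 2005+ shredding looks roughly like this (not available on SQL 2000; the XML layout is a made-up example):

DECLARE @x XML
SET @x = N'<rows><row id="1"/><row id="2"/></rows>'

SELECT r.c.value('@id', 'int') AS Id
FROM @x.nodes('/rows/row') AS r(c)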
Most likely you'll go with a table valued function, or creating the temporary table prior to calling the stored procedure and then having it populate that.
While working on a project, I used the following to insert the results of xp_readerrorlog (which, afaik, returns a table) into a temporary table created ahead of time.
INSERT INTO [tempdb].[dbo].[ErrorLogsTMP]
EXEC master.dbo.xp_readerrorlog
From the temporary table, select the columns you want.
