Dataset is not connected. Does anyone agree with me? (ADO)

I think the dataset is not connected. You can close the DB connection, open another connection, and update the dataset's data to a different database. Does anyone agree with me?

A dataset is essentially a structure of your database held in memory. But you are right: the dataset itself is not directly connected to a database.
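A minimal sketch of what that looks like in practice with dbGo (ADO); the connection, table, and procedure names here are illustrative, not from the question. With a client-side cursor the fetched rows live in memory, so the dataset stays usable after it is detached from the connection:

uses
  Data.Win.ADODB;

procedure DetachedQueryDemo(Conn: TADOConnection);
var
  Qry: TADOQuery;
begin
  Qry := TADOQuery.Create(nil);
  Qry.Connection := Conn;
  Qry.CursorLocation := clUseClient;   // keep the fetched rows in client memory
  Qry.LockType := ltBatchOptimistic;   // allow edits while detached
  Qry.SQL.Text := 'SELECT * FROM customers';   // hypothetical table
  Qry.Open;
  Qry.Connection := nil;               // detach: the rows remain browsable
  Conn.Close;                          // Qry now behaves like a memory table
end;

From here one could attach the query to a different connection and call UpdateBatch to post the cached changes there, which is the scenario the question describes.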

Related

What should I use between FDQuery, FDMemTable, and ClientDataSet in Delphi FireDAC?

New to Delphi. I'm developing an application that needs to access an MSSQL database. To do this I've used an FDConnection, an FDQuery, and a DataSource component connected to a grid. With these I can access/modify/delete data just fine. Now if, for example, I want to filter the grid, I can do this by changing the FDQuery component at run time, but I'm not sure if this is the right approach.
I've thought about using something that stores tables in memory, like ClientDataSets, because I'm not sure if FDQuery does this, so that I can manage data I've already retrieved without accessing the database more than needed. My problem is that I don't have a fundamental understanding of any of these components, so my question is:
Do I need to use anything else other than FDQuery?
A little more context on what I'm building: a UniGUI web application, with the MSSQL server on the same LAN as the web server, and multiple users accessing the DB.
Now that I understand these components better, I've found the following FAQ in Embarcadero's documentation, which explains what I wanted to know.
Q1: Can I use TFDQuery and connect it to a dataset provider and retrieve the data in an Embarcadero client dataset?
A: TFDQuery is a mix of TFDMemTable, TFDTableAdapter and several TFDCommands. So, TFDQuery has everything inside to execute SQL commands, send parameter data, receive and store result sets, browse result sets and post changes back to a database. There is no reason to use TFDQuery + DSP + CDS.
You can use TFDMemTable, TFDTableAdapter and TFDCommand directly, instead of TFDQuery. They give more flexibility, but also require more coding; take, for example, synchronized cached updates across datasets.
In other words, TFDQuery is an optimal "shortcut" for everyday data application programming.
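Given that, here is a small sketch of the filtering scenario from the question; the component and column names are assumed. Rows that FireDAC has already fetched can be filtered in memory through the standard TDataSet filter mechanism, with no further round trips to MSSQL:

// qryCustomers is an assumed TFDQuery already attached to an FDConnection.
qryCustomers.FetchOptions.Mode := fmAll;      // fetch the whole result set locally
qryCustomers.Open('SELECT id, name, city FROM customers');   // hypothetical table
qryCustomers.Filter := 'city = ''London''';   // evaluated in memory by FireDAC
qryCustomers.Filtered := True;                // the attached grid shows the subset

Setting Filtered back to False restores the full result set, again without touching the server.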

Storing data coming from CupCarbon into a database

I am creating an IoT simulation using the CupCarbon IoT simulator. I want to store the simulation data in a database. Can anyone please help me with how to do that?
A simple solution could be to use the printfile command.
With this command you will be able to store, in a file named results, whatever data you choose. Every sensor node can have its own results file.
Hope this works for you.
I would love to read about it if you find another solution.
BR
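To get that file into an actual database, a rough Delphi/FireDAC sketch follows (matching the other examples in this digest); the line layout (one "sensorId;value" pair per line) and the table and column names are assumptions for illustration, not CupCarbon specifics:

uses
  System.SysUtils, System.IOUtils, FireDAC.Comp.Client;

procedure ImportResults(Qry: TFDQuery);
var
  Line: string;
  Parts: TArray<string>;
begin
  Qry.SQL.Text :=
    'INSERT INTO sensor_data (sensor_id, value) VALUES (:id, :val)';
  for Line in TFile.ReadAllLines('results.txt') do   // file written by printfile
  begin
    Parts := Line.Split([';']);
    Qry.ParamByName('id').AsString := Parts[0];
    Qry.ParamByName('val').AsString := Parts[1];
    Qry.ExecSQL;                                     // one insert per sample
  end;
end;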

Auto refresh a TDataSet / DBGrid

I'm developing an application that displays information in a DBGrid via a TSimpleDataSet (dbExpress components).
The software in question is used on 2 different computers by 2 different people.
They both view and edit the same information at different times.
I'm trying to figure out a way to automatically update the DBGrid (or rather, the DataSet, right?) on Computer B once Computer A makes a change to a row (edits something/whatever) and vice-versa.
Currently I've set up a TButton named Refresh that, once clicked, executes the following code:
procedure TForm2.actRefreshDataExecute(Sender: TObject);
begin
  // For each dataset: merge the local change log, push pending edits to the
  // server, then re-read so the grid reflects other users' changes.
  dbmodule.somenameDataSet.MergeChangeLog;
  dbmodule.somenameDataSet.ApplyUpdates(-1);
  dbmodule.somenameDataSet.Refresh;

  dbmodule.somename1DataSet.MergeChangeLog;
  dbmodule.somename1DataSet.ApplyUpdates(-1);
  dbmodule.somename1DataSet.Refresh;

  dbmodule.somename2DataSet.MergeChangeLog;
  dbmodule.somename2DataSet.ApplyUpdates(-1);
  dbmodule.somename2DataSet.Refresh;

  dbmodule.somename3DataSet.MergeChangeLog;
  dbmodule.somename3DataSet.ApplyUpdates(-1);
  dbmodule.somename3DataSet.Refresh;
end;
This is fine and works as intended, once clicked.
I'd like an auto-update feature for this: for example, when Computer A edits information in a row, Computer B's DBGrid should update its display accordingly, without the need to click the refresh button.
I figured I would use a TTimer set to a specific interval, in both programs on both PCs.
My actual question is:
Is there a better way than a TTimer for this? If so, please elaborate.
Also, if the TTimer route is the way to go, any further info you might find useful to state would be appreciated (pros and cons and so on).
I'm using RAD Studio 10 Seattle and dbExpress components; the datasets connect to a MySQL database on the hosting where my website is.
Thanks!
Well, Ken White and Sertac Akyuz are certainly correct that using a server-originated notification to determine when to refresh your local dataset is preferable to continually re-reading all the data you are using from the server.
The problem, AFAIK, is that there is no Embarcadero-supplied notification system that works with MySQL. See this list of databases supported by FireDAC's Database Alerts:
http://docwiki.embarcadero.com/RADStudio/XE8/en/Database_Alerts_(FireDAC)
and note that it does not list MySQL.
Luckily, I think there is a workaround which should be viable for a very small system like yours currently is. As I understand it, your and your colleague's PCs are on a LAN, and the MySQL server is outside your LAN, on the internet. In that situation, it doesn't take a round trip to the server for one of you to get a notification that the other has changed something in the database. Using an analogy akin to Ken's, you can, as it were, lean over the desk and say to your colleague, "Hey, I've changed something, so you need to refresh your data."
A very low-tech way of implementing that would be to have, somewhere on your LAN, a resource that both of you can easily get at, and which you update whenever you make a change to the DB that means the other of you should refresh your data from the server. One way to do that is a small, shared data file with one record per server DB table, each record carrying some sort of timestamp or version number that gets updated whenever you update the corresponding server table. You can then periodically check (poll) this file to see whether a given table has changed since you last looked; if it has, you re-read the data you want from the server and update your local record of the version info from the shared file.
You can update the shared file using handlers for the events of your Delphi client-side datasets.
There are a number of variations on this theme that I'm sure will be apparent to you; the implementational details really don't matter.
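For concreteness, a rough sketch of the polling side, with every name invented for illustration; the shared resource here is an INI file on a LAN share, and the file locking discussed next is omitted for brevity:

uses
  System.IniFiles;

procedure TForm2.tmrPollTimer(Sender: TObject);
var
  Ini: TIniFile;
  ServerVersion: Integer;
begin
  Ini := TIniFile.Create('\\lanshare\dbversions.ini');  // hypothetical shared file
  try
    ServerVersion := Ini.ReadInteger('tables', 'somename', 0);
    if ServerVersion <> FLastVersion then   // FLastVersion: assumed form field
    begin
      FLastVersion := ServerVersion;
      dbmodule.somenameDataSet.Refresh;     // re-read only the table that changed
    end;
  finally
    Ini.Free;
  end;
end;

The writing side would bump the 'somename' value in, say, an AfterPost handler, as mentioned above.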
To update the shared file I'm talking about, you will need to lock it while writing to it. This answer:
How do I get the handle for locking a file in Delphi?
will show you how to do that.
Of course, the shared local resource doesn't have to be a data file. One alternative would be to use a Microsoft Message Queue service, which is sometimes used for this kind of thing, but has a steeper learning curve than a shared data file.
By the way, this kind of thing is far easier to do (at least on a small scale like yours) if you use three-tier database access (e.g. using DataSnap).
In a three-tier system, only the middle tier (a Delphi DataSnap server which you write yourself, though it's not that hard) talks to the database server, and the clients talk only to the middle tier. This makes it easy for the middle-tier server to notify the other client(s) when one of them changes the DB data.
The three-tier arrangement also helps minimise the security problems with accessing a database server via the internet, because you only need one secure connection to the server, not one per client. But that's straying a bit far from your immediate problem.
I hope all this is clear, if not, ask.
Just use a timer and make it refresh the dataset every 5 minutes. No big deal.
If the usage is not frequent, then you can set it to fire every 10 or 15 minutes.
There is nothing wrong with a timer if it is set to longer intervals.
Today's broadband connections can easily handle the traffic, and so can Access,
if the table is not huge, of course.
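A minimal version of that suggestion, reusing the refresh action from the question; tmrRefresh is an assumed TTimer dropped on the form:

procedure TForm2.FormCreate(Sender: TObject);
begin
  tmrRefresh.Interval := 5 * 60 * 1000;   // fire every 5 minutes (interval is in ms)
  tmrRefresh.Enabled := True;
end;

procedure TForm2.tmrRefreshTimer(Sender: TObject);
begin
  actRefreshDataExecute(Self);   // same code the manual Refresh button runs
end;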

Is it possible to have a detached TpFIBDataset for FIBPlus?

I remember that when I was working with ADO for Delphi (dbGo) there was a possibility of creating a detached dataset. The idea was that I could read all the data I wanted from the database and then set the Connection property to nil. That caused the TADOQuery to work as a memory table. I could then use and pass the TADOQuery as a TDataSet parameter to my other methods without worrying that I was keeping an unnecessary connection or transaction open.
I would like to have the same functionality when using the FIBPlus library. Currently I need to copy data from the TpFIBDataSet to another structure and then close the dataset. Otherwise, to access the rows of the dataset, the transaction must stay open, even if I have all the data fetched.
I could not achieve detached-dataset functionality on my own; is this possible?
No, TpFIBDataSet cannot work as a standalone dataset. You should use TpFIBClientDataSet (if you want to apply updates back to the DB later) or any in-memory dataset (just for local reading).
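For the local-reading case, here is a generic sketch of copying any TDataSet (which includes TpFIBDataSet) into a TClientDataSet so the source and its transaction can be closed afterwards; this is a plain VCL illustration, not FIBPlus-specific API:

uses
  Data.DB, Datasnap.DBClient;

function CopyToMemory(Source: TDataSet): TClientDataSet;
var
  I: Integer;
begin
  Result := TClientDataSet.Create(nil);
  Result.FieldDefs.Assign(Source.FieldDefs);   // clone the structure
  Result.CreateDataSet;
  Source.First;
  while not Source.Eof do
  begin
    Result.Append;
    for I := 0 to Source.FieldCount - 1 do     // copy the current row field by field
      Result.Fields[I].Value := Source.Fields[I].Value;
    Result.Post;
    Source.Next;
  end;
end;

After the copy, the FIBPlus dataset and its transaction can be closed; the TClientDataSet keeps the rows in memory.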

Temporary table resource limit

I have two applications (server and client) that use a TQuery connected to a TClientDataSet through a TDCOMConnection,
and in some cases the ClientDataSet opens about 300,000 records and then the application throws the exception "Temporary table resource limit".
Is there any workaround to fix this (except "do not open such a huge dataset")?
Update: oops, I'm sorry, there are 300K records, not 3 million.
The error might be from the TQuery rather than the TClientDataSet. A TQuery creates a temporary table, and it might be this limit that you are hitting. Having said that, loading 3,000,000 records into a TClientDataSet is also a bad idea, as it will try to load every record into memory; that may be possible if they are only a few bytes each, but it is probably still going to kill your machine (at 1 KB each you are going to need 3 GB of RAM minimum).
You should try to break your data into smaller chunks. If it is the TQuery failing, this will mean adjusting the SQL (fewer fields / fewer records) or moving to a better database (the BDE is getting a little tired, after all).
You have the answer already. Don't open such a huge dataset in a ClientDataSet (CDS).
Three million rows in a CDS is a huge memory load (depending on the size of each row, it can be gigantic).
The whole purpose of using a CDS is to work quickly with small datasets that can be manipulated in memory. Adding that many rows is ridiculous; use a real dataset instead, or redesign things so you don't need to retrieve so many rows at a time.
Over 3 million records is way too much to handle at once. My guess is that you are performing an export or something similar that requires that many records to be sent down the wire. One way to reduce this issue would be to have the middle tier generate an export file and then deliver that file to the client (preferably compressing it first, using ZLIB or something similar).
If you are pulling data back to the client for viewing purposes, then consider sending summary information only and allowing the client to dig their way through the data a portion at a time. The users will thank you, because performance will go way up and they won't have to dig through records they don't care about.
EDIT
Even 300,000 records is way too much to handle at once. If you had that many pennies, would you be able to carry them all? But if you converted them into larger denominations, you could. If you're sending data to the client for a report, then I strongly suggest a summary method: give them the large picture and let them drill slowly into the data. Send grouped data and then let them open it up slowly.
If this is a search-results screen, then set a limit on the number of records to be returned, plus one. For example, to display 100 records, set the limit to 101. Still display only 100; the presence of that last record means there were MORE than 100 matches, so the customer needs to adjust their search criteria to return a smaller subset.
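A tiny sketch of that "limit + 1" trick; the query component, table name, and page size are made up, and the LIMIT syntax assumes a backend that supports it:

procedure TForm1.RunSearch;
const
  PageSize = 100;
begin
  qryResults.SQL.Text :=
    Format('SELECT * FROM orders LIMIT %d', [PageSize + 1]);  // fetch one extra row
  qryResults.Open;
  // An extra row back means the user should narrow the search criteria.
  lblMore.Visible := qryResults.RecordCount > PageSize;
end;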
The temporary table resource limit is not a limit on one single query; it is the limit for all open queries together. So it may be a solution for you to close all the other queries at the time.
If it is not possible for you to use an ADO connection, you can also design a paging mechanism to query the data page by page.
Good luck!
