MyGeneration autonumber field bug with MS Access database

I am using the MyGeneration tools to create the abstract classes responsible for talking to the database and performing CRUD operations, along with the other dooDads operations. The problem is that I can't retrieve the autonumber field (which is also the primary key) of the table using this code:
Employees newObj = new Employees();
newObj.ConnectionString = connectionString;
newObj.AddNew();
// Your Properties will be different here
newObj.FirstName = "Joe";
newObj.LastName = "Plank Plank";
newObj.Save();
int staffid=newObj.StaffID;
The same code works fine with MS SQL Server and other databases. It looks as though the autonumber is not available immediately after I add the entry, yet when I check the database later, the autonumber has been generated there. I'm not sure why this is happening. If anyone has expertise with dooDads, please help.
Edited:
The main problem is that I can't access the autonumber field immediately after I create the fresh row entry. It looks as though the MS Access autonumber takes some time to show up; you can even see this behaviour in Visual Studio. How can I fix this problem?

I have built many applications using dooDads with MS Access. You only have to make the field an autonumber and then generate the stored procedures and the other classes.
I.e. your code should work.
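If the StaffID still comes back as 0 immediately after Save(), a possible workaround is to read the last autonumber straight from Access with SELECT @@IDENTITY, which the Jet 4.0 OLE DB provider tracks per connection. The following is only a minimal sketch of that technique, bypassing the dooDads layer; the connection string, table and column names are assumptions based on the question:
using System;
using System.Data.OleDb;

class AutonumberDemo
{
    static void Main()
    {
        // Assumed connection string; point it at your .mdb file.
        string connectionString =
            @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\staff.mdb;";

        using (var connection = new OleDbConnection(connectionString))
        {
            connection.Open();

            // Insert the row (table/column names assumed to match the Employees entity).
            using (var insert = new OleDbCommand(
                "INSERT INTO Employees (FirstName, LastName) VALUES (?, ?)", connection))
            {
                insert.Parameters.AddWithValue("@FirstName", "Joe");
                insert.Parameters.AddWithValue("@LastName", "Plank Plank");
                insert.ExecuteNonQuery();
            }

            // @@IDENTITY is scoped to the connection that performed the insert,
            // so it must be read before this connection is closed.
            using (var identity = new OleDbCommand("SELECT @@IDENTITY", connection))
            {
                int staffId = Convert.ToInt32(identity.ExecuteScalar());
                Console.WriteLine("New StaffID: " + staffId);
            }
        }
    }
}
Inside dooDads itself, the equivalent fix would presumably be to issue that SELECT @@IDENTITY on the entity's own connection right after the insert and copy the result back into StaffID.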
I also made a modification to dooDads to return a list of objects:
How to get list of objects from BusinessEntity using myGeneration?

Related

Looping through list of objects created by createEntry() and editing their properties before doing submitChange()

Good afternoon fellow developers,
I have come across a scenario where I found myself needing to retrieve the list of pending changes from my model and edit a specific property of those entries before sending them to my back-end.
These are new entities I created using the createEntry() method of the OData model v2, but at the time of creating those entities I do not yet possess the value I need to add to them. I retrieve the list of entities by using the getPendingChanges() method on my model.
What I need to do is loop through each of these newly created entities and set a specific property on them before actually sending them to my back-end with the submitChanges() method. Bear in mind that these are entry objects created by the createEntry() method and they exist only in my front-end until I am able to submit them successfully.
Any ideas that might point me in the right direction? I look forward to reading from you!
I was able to solve this issue in the following way:
var oPendingChanges = this.model.getPendingChanges();
// Build an array with the path to each pending (newly created) entry
var aPathsPendingChanges = $.map(oPendingChanges, function (value, index) { return [index]; });
// Set the missing property on every pending entry before calling submitChanges()
aPathsPendingChanges.forEach(sPath => this.model.setProperty("/" + sPath + "/PropertyX", "valueFGO"));
The first two instructions retrieve the entire list of pending-change objects and then build an array of paths to each individual entry. I then use that array of paths to loop through my list of pending changes and set the property I want on each iteration of the loop. Special thanks to the folks at answers.sap for the guidance!

DataSnap using AutoInc key and refresh current record only after insert

I've not been able to find an answer on this anywhere. I'm using Delphi XE7 with TClientDataSet, DataSnap & SQL Server. I need to insert a record, apply updates and then refresh that record so I can get the Id and assign it to my object. It seems a pretty basic requirement, but it is proving to be a royal pain.
I've found the obvious stuff on EDN, SO and Dr Bob:
http://edn.embarcadero.com/article/20847
DataSnap and the autoinc field
http://www.drbob42.com/examines/examinC0.htm
However, these seem to focus on a "Refresh" of the TClientDataSet which re-fetches the entire table/query. Whilst this does actually resolve the Id field itself (good!), it also moves the cursor off the current record which was just inserted, so I'm not able to get the Id and assign it to my object. Also, for performance over HTTP I don't really want to refetch the entire table every time a record is inserted; if there are 10,000 records this will consume too much bandwidth and be ridiculously slow!
Consider the following code:
function TRepository<I>.Insert(const AEntity: I): I;
begin
  FDataSet.DisableControls;
  try
    FDataSet.Insert;
    AssignEntityToDataSet(AEntity); // SETS ALL THE RELEVANT FIELDS
    FDataSet.Post;
    FDataSet.ApplyUpdates(-1);
    FDataSet.Refresh; // <--- I tried RefreshRecord here but it cannot resolve the record
    AEntity.Id := FDataSet.FieldByName('Id').AsInteger; // <----- THIS NOW POINTS TO THE WRONG ROW
  finally
    FDataSet.EnableControls;
  end;
end;
Does anyone know how to achieve this? I need to be able to refresh and stay on the current record otherwise I do not know the Id of the record just created and the GUI cannot stay focused on the current record.
Hopefully something obvious I'm missing.
Cheers.
Rick.
Assuming you can get your hands on the new ID inside the AfterUpdateRecord event of your DataProvider, your event handler may then look like this (the current record of DeltaDS is the one just inserted into SourceDS):
if (UpdateKind = ukInsert) then begin
  DeltaDS.FindField('Id').NewValue := <TheNewID>;
end;
Make sure to have the poPropogateChanges option set in the provider. This will transfer the changed Id field back to the ClientDataSet.
Now you can get rid of the FDataSet.Refresh call.
SQL Server does allow you to get the last identity it generated in several ways - there's no need to "refresh" the record/query, which means re-issuing a SELECT and can generate undesirable side effects. You can use SELECT SCOPE_IDENTITY() or use an OUTPUT clause. If the Delphi database driver supports it, TField.AutoGenerateValue should accomplish that task automatically (see http://docwiki.embarcadero.com/Libraries/XE7/en/Data.DB.TField.AutoGenerateValue).
Otherwise you have to put that new data into your delta after reading it (see Raabe's answer - this has to be done on the DataSnap server, which actually talks to the database), so it's sent back to the client. You also need to set TField.ProviderFlags properly to ensure the data are applied correctly (see http://docwiki.embarcadero.com/RADStudio/XE7/en/Influencing_How_Updates_Are_Applied); usually you don't want those fields to appear in an UPDATE.

my sqlite3 DB doesn't show column values in device but it does in simulator

My DB is not getting copied over to my device, but it does to the simulator.
Here is what I am doing:
Create a new sqlite3 db from the terminal:
sqlite> create table myTable (id integer primary key, name text);
sqlite> insert into myTable (name) values ('john');
sqlite> select * from myTable;
1|john
This creates a db in this path: users/John/iosApp.db
Then I close the terminal, copy that db to my Xamarin project, and set its build action to 'Content'.
Here is my model:
[Table("myTable")]
public class MyTable
{
[PrimaryKey, AutoIncrementAttribute, Column("id")]
public int ID {get; set;}
[Column("name")]
public string Name { get; set; }
}
And I do this to copy the db to the Document folder:
string pathToDatabase = "iosApp.db"; // the db file shipped with the app (build action: Content)
var userPath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments), pathToDatabase);
File.Delete (userPath); // delete first and copy next
File.Copy (pathToDatabase, userPath);
var myDB = new SQLiteConnection (userPath);
MyTable myTable = myDB.Get<MyTable> (1); // fetch the row with primary key 1
Then I run the app, set a breakpoint after the last line in the code above, and hover over myTable:
If I am using the simulator, I see the schema, a value of 1 for ID and 'john' for Name.
If I am using the device, I see the schema but a value of 0 for ID and null for Name!
Looking at the path when I am using the device, points to this:
"/private/var/mobile/Applications/277749D4-C5CC-4BF4-8EF0-23B23833FCB1/Documents/iosApp.db"
I looked at the files using iFunBox and the db file is there with the exact size.
I have tried all the following:
Clean All in the project
Rebuild All
removed the 'debug' folder from the project
restarted Xamarin
and even restarted the machine.
But I still get the same behavior. What else should I try to be able to see the values of ID and Name?
my sdk version is attached
UPDATE:
After a lot of changes and cleaning up, I managed to display the values of all columns except the identity column, which still shows 0. Puzzled, I went back to the Xamarin sample project: http://developer.xamarin.com/recipes/ios/data/sqlite/create_a_database_with_sqlitenet/
It displayed the identity value correctly.
I tried to bring similar code into my project, but with no success.
To rule out the possibility of a version issue, I downloaded the latest sqlite-net from this link:
http://components.xamarin.com/gettingstarted/sqlite-net/true
The same behavior... I created a whole new page in my project, used the references the sample used, and it only has the code to create a sample table. Same behavior: the identity value is displayed in the other project but not in mine. This leads me to conclude that there is something completely wacky in my project. Now I am considering creating a whole new project and moving my files to the new one, after first making sure that I can see the value of my id in my model. Stay tuned, I will make sure to update this thread.
If you have any pointers, please share them.
I couldn't find a solution to my problem, but I found an alternate method to create the DB that turns out to be even nicer than the original one.
One important thing to note is that in the original problem (details above), the DB code had been working for months since I started developing the application. I don't know when it started behaving badly, but I suspect it was due to downloading the new Xamarin 3.0. I can't think of any other reason.
So, to solve my issue, there are two main things I did:
I followed this link on how to create DB and tables and do CRUD operations: http://components.xamarin.com/gettingstarted/sqlite-net/true
This method seems to be the newest way to create a DB; it was published on June 24, 2014. It uses an SQLite.dll, whereas my previous solution was using a SQLite.cs file. So now I am creating my DB at runtime.
Something still didn't work with the new method: it was giving me a null object exception. I didn't spend much time investigating it. When I provided values for my primary key and identity columns, the error went away. Actually, this could have been the solution to my previous problem; I would have tried providing the identity values with the old code if I weren't already happier with the new method.
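For reference, here is a minimal sketch of the runtime-creation approach described above, using the sqlite-net API from the getting-started recipe; the model is reproduced from earlier in the question and the Documents path is an assumption:
using System;
using System.IO;
using SQLite;

[Table("myTable")]
public class MyTable
{
    [PrimaryKey, AutoIncrement, Column("id")]
    public int ID { get; set; }

    [Column("name")]
    public string Name { get; set; }
}

public static class DatabaseSetup
{
    public static void CreateAndSeed()
    {
        // Create (or open) the database inside the app's Documents folder.
        var dbPath = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments), "iosApp.db");

        using (var db = new SQLiteConnection(dbPath))
        {
            // Builds the table from the attributes on MyTable if it does not exist yet.
            db.CreateTable<MyTable>();

            // With AutoIncrement, sqlite-net fills in ID on the object after the insert.
            var row = new MyTable { Name = "john" };
            db.Insert(row);
            Console.WriteLine("Generated ID: " + row.ID);
        }
    }
}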
I hope this helps someone.

Why is Entity framework loading data from the db when I set a property?

I have two tables (there are more in the database, but only two are involved here):
Account and AccountStatus. An account can have an AccountStatus (active, inactive, etc.).
I create a new Account and set a couple of properties but when I reach this code:
1. var status = db.AccountStatuses.SingleOrDefault(s => s.ID == (long)AccountStatusEnum.Active);
2. account.AccountStatus = status;
3. db.Accounts.AddObject(account);
The first line executes fine, but when I reach the second line it takes a REALLY long time, and when I step into the code it seems that every single account is loaded from the database.
I don't see why it should even want to load all the accounts?
We use Entity Framework 4 and Poco and we have lazy loading enabled.
Any suggestions?
Cheers
/Jimmy
You have to be careful which constructs you use to fetch data, as some will pull in the whole set and filter afterward. (Aside: the long delay may be the database being created and seeded; if there isn't one already, that will occur the first time you touch it, likely with a query of some sort. Also remember that when you retrieve a whole dataset, you may in actuality only have what amounts to a compiled query that won't be evaluated until you interact with it.)
Try this form instead and see if you have the same issue:
var status = db.AccountStatuses.Where(s => s.ID == (long)AccountStatusEnum.Active);
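To make the deferred-evaluation point above concrete, here is a small illustrative fragment reusing the names from the question (it needs using System.Linq and is not the original poster's code):
// 'activeStatuses' is only a query definition at this point; nothing has been sent to the database.
IQueryable<AccountStatus> activeStatuses =
    db.AccountStatuses.Where(s => s.ID == (long)AccountStatusEnum.Active);

// The SQL is issued only when the query is enumerated or materialized,
// for example by SingleOrDefault(), ToList() or a foreach loop.
AccountStatus status = activeStatuses.SingleOrDefault();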

How to prevent Delphi ADO from loading the entire table into memory?

I am not a Delphi programmer, but I have an old Delphi 7 application that I need to fix, and it is using ADO.
The database table (MS Access) contains over 100,000 rows, and when I set ADOTable.Active to true it starts to load the entire table into RAM, which takes a lot of memory and time.
How can I prevent ADO from loading the entire table? I tried setting MaxRecords but it does not help.
Basically, all we do at program startup is:
// Connect to database
DataModule.MyADOConnection.Connected:=true;
DataModule.MeasurementsADOTable.MaxRecords:=1;
// Open datatables
DataModule.MeasurementsADOTable.Active:=true;
After setting Active to true it starts to load all the measurements into RAM, and that takes TIME!
We are using the MSDASQL.1 provider. Perhaps it does not support the MaxRecords property?
How do I add a limiting query to this data object to only "load TOP 1 * from Measurements"?
You could use TADOQuery to limit the result set with an SQL query. Or you could use TADOTable and set CursorLocation to a server-side cursor to prevent the client from loading the complete result set into memory.
You could use that ADOTable with a server-side OpenForwardOnly cursor and a TClientDataSet with PacketRecords set to a nonzero value. This worked wonderfully when I had to write an app to pump data from MSSQL to Oracle in a customized way, with tables with millions of records.
EDIT: It would be something along the lines of this:
procedure ConfigCDSFromAdoQuery(p_ADOQ: TADOQuery; p_CDS: TClientDataset; p_Prov: TDatasetProvider);
begin
  if p_ADOQ.Active then p_ADOQ.Close;
  p_ADOQ.CursorLocation := clServer;
  p_ADOQ.CursorType := ctOpenForwardOnly;
  p_Prov.Dataset := p_ADOQ;
  p_CDS.SetProvider(p_Prov);
  p_CDS.PacketRecords := 100;
  p_CDS.Open;
end;
I've done all of this in code, but most of it you can do at design time.
This article is BDE specific, but applies to ADO or most client data access libraries.
http://dn.codegear.com/article/28160
I would recommend using TADODataSet (it's "closer" to the ADO layer than TADOQuery) and selecting only the data the client needs by providing a custom search form (date range, list of specific items, etc)
Good luck
On your datamodule where "MeasurementsADOTable" currently resides, drop a TADOQuery and name it "MeasurementsADOQuery"
Set the Connection property of MeasurementsADOQuery to MyADOConnection (assuming this is the case based on the little code snippet provided.)
I'm also assuming that you are displaying a grid or otherwise using a DataSource - change the DataSource component's "DataSet" property from MeasurementsADOTable to MeasurementsADOQuery
Edit the actual query to be executed by setting the SQL property of MeasurementsADOQuery. (At runtime, before opening: MeasurementsADOQuery.SQL.Text := 'select top 10 * from measurements order by whatever')
Analyze/change all references in code from MeasurementsADOTable to MeasurementsADOQuery
Not making the ADOTable active on startup and turning it to true later is one way, but it's still not really going to help. Use a TADODataSet instead and populate it as needed during runtime with your command text; only the relevant data will be retrieved, making it much faster.
Use a TADOQuery.
If you do not need any rows and just want to insert a new row, use an SQL command like this:
'select * from myTable where id=-1'
Since Id is an autonumber, no rows will be returned.
Or:
'select * from myTable where 1=-1'
But I don't think this is a good way of inserting data; using TADOCommand is surely much better.
If you want X rows:
'select top X * from myTable'
Furthering Fabrico's answer above: I have a legacy application which has a table with 177,000 rows and 212 columns. Upon trying to open this table I get the error 'table already open' and no records are available for update. Setting Table.CursorLocation := clUseServer;
fixed this issue for me.
I have found ADO + Access with Delphi to be painfully slow for lots of things (big table reads like you're describing, but also inserts, etc.). My answer became "Quit using ADO and Access altogether." I never did understand why it performed so poorly, especially when earlier technologies seemed not to.
