TClientDataSet and processing records with StatusFilter - Delphi

I'm using a TClientDataSet as a local dataset, without the provider concept. After working with it, a method is called that should resolve the changes by generating the corresponding SQL statements, using StatusFilter to find out what changed.
This looked easy initially after reading the documentation (set StatusFilter to [usInserted], generate all the insert SQL, set StatusFilter to [usModified], generate all the updates, and the same with deletes), but after a few tests it now looks far from trivial. For example:
If I add a record, then edit it: setting StatusFilter to [usInserted] displays it, but with the original data, not the edited values.
If I add a record, then edit it, then delete it: the record appears both with StatusFilter set to [usInserted] and with [usModified].
And other similar situations.
1) I know that if I first process all inserts, then all updates, then all deletes, the database will end up in the correct state, but this approach looks far from right (it generates useless SQL statements); a sketch of this approach is below.
2) I've tried to access the PRecInfo(ClientDataSet.ActiveBuffer + ClientDataSet.RecordSize).Attribute information (dsRecNew, dsRecOrg, etc.), but I still haven't managed to resolve the logic.
3) I can program the logic to resolve it myself: for example, before processing an insert, set StatusFilter to [usDeleted] and locate the record by primary key to see whether it was deleted afterwards; the same with edits: before generating the insert, check whether the record was updated later so the insert SQL uses the updated values, and so on. But it should be easier than this.
Has someone solved this in an elegant and straightforward way? Am I missing something? Thanks
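For reference, a minimal sketch of the brute-force ordering from point 1 (routines like GenerateInsertSQL are placeholders for whatever SQL-building code is used; TClientDataSet lives in Datasnap.DBClient and the us* constants in Data.DB):

procedure ProcessChanges(CDS: TClientDataSet);
begin
  // Inserts first: StatusFilter restricts the visible records to one change type.
  CDS.StatusFilter := [usInserted];
  CDS.First;
  while not CDS.Eof do
  begin
    GenerateInsertSQL(CDS);  // placeholder: build an INSERT from the current record
    CDS.Next;
  end;

  // Then updates.
  CDS.StatusFilter := [usModified];
  CDS.First;
  while not CDS.Eof do
  begin
    GenerateUpdateSQL(CDS);  // placeholder: build an UPDATE from the current record
    CDS.Next;
  end;

  // Finally deletes; the fields show the old (deleted) values here.
  CDS.StatusFilter := [usDeleted];
  CDS.First;
  while not CDS.Eof do
  begin
    GenerateDeleteSQL(CDS);  // placeholder: build a DELETE from the current record
    CDS.Next;
  end;

  CDS.StatusFilter := [];  // back to the normal view
end;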

Related

Delphi FireDAC: how to refresh data in cache

I need to refresh data in a TFDQuery which is in cached updates mode.
To simplify my problem, let's suppose my MS Access database is composed of 2 tables that I have to join.
LABTEST(id_test, dat_test, id_client, sample_typ)
SAMPLEType(id, SampleName)
In the Delphi application, I am using a TFDConnection and one TFDQuery (in cached updates mode) in which I join the 2 tables; its SQL is:
SELECT T.id_test, T.dat_test, T.id_client, T.sample_typ, S.SampleName
FROM LABTEST T
LEFT JOIN SAMPLEType S ON T.sample_typ = S.id
In my application, I also use a DBGrid to show the result of the query,
and a button to edit the field 'sample_typ', like this:
qr.Edit;
qr.FieldByName('sample_typ').AsString:=ce2.text;
qr.Post;
The editing of the 'sample_typ' field works fine, but the corresponding 'SampleName' field is not changing (in the grid) after the update.
In fact it is not refreshed!
The problem is here: if I do a refresh of the query, an exception is raised: "cannot refresh dataset. cached updates must be commited or canceled and batch mode terminated before refreshing".
If I commit the updates, the data will be sent to the database and I don't want that; I need to keep the data in the cache till the end of the operation.
Also, if I turn cached updates off, the data will be refreshed in the grid but will be sent to the database after qr.Post, and I don't want that either.
I need to refresh the data in the cache. What is the solution?
Thanks in advance.
The issue comes down to the fact that you haven't told your UI that there is any dependency between the two fields. It clearly can't redo the join itself without resubmitting the query, so if you don't want to send the updates and reload, you will have a problem.
It's not clear exactly what you are trying to do, but these two ideas may help you.
If you are not going to edit the fields in the SAMPLEType table (S), then load the values from that table into a lookup table. You can load this into a TFDMemTable, using an adapter which loads from a query. Your UI controls can then show the value based on the values looked up in your local TFDMemTable. Depending on the UI control, this might be a lookup field or some such.
You may also be able to store your main data in a TFDMemTable with an adapter: you can specify different TFDCommands to read the whole recordset, refresh a record, and update, insert and delete a record. The TFDCommands can act on multiple tables for joined recordsets like this. That would automatically refresh the individual record for you when you post it.
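A rough sketch of the first idea (component names qrySampleTypes and mtSampleType are illustrative, qr is the query from the question; in practice you would normally set the lookup field up at design time as a persistent field, the code below just makes the wiring explicit):

// Load the lookup data once into a local in-memory table.
qrySampleTypes.SQL.Text := 'SELECT id, SampleName FROM SAMPLEType';
qrySampleTypes.Open;
mtSampleType.CopyDataSet(qrySampleTypes, [coStructure, coRestart, coAppend]);

// Lookup field on the main query: shows SampleName for the current sample_typ
// without re-running the join, so it follows edits made in the cache.
with TStringField.Create(qr) do
begin
  FieldName := 'SampleNameLookup';
  FieldKind := fkLookup;
  KeyFields := 'sample_typ';
  LookupDataSet := mtSampleType;
  LookupKeyFields := 'id';
  LookupResultField := 'SampleName';
  Size := 50;
  DataSet := qr;  // attach before qr is opened
end;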

How to find when a "Function" (date) was added to a responsibility in Oracle EBS 12.1.3

I was adding new menu functions to a responsibility and then I updated the wrong one. But I am not sure whether it was updated just now or it was already there before.
So I am searching for a query to find when a certain menu function was added to a certain responsibility.
Please help me.
If you have SQL query access, the who columns (audit columns) are a good place to start looking: check LAST_UPDATE_DATE for the date the modification was made, and LAST_UPDATED_BY will tell you who updated the table. All the tables should have them. The tables involved are:
fnd_menu
fnd_menu_entry
fnd_nodes
fnd_user_resp_groups
fnd_resp_function
fnd_responsibility
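As a starting point, something like this (a sketch only: the table is taken from the list above and exact table and column names can differ between EBS versions, so adjust as needed):

-- who touched menu entries in the last week
SELECT last_updated_by, last_update_date
FROM fnd_menu_entry
WHERE last_update_date > SYSDATE - 7
ORDER BY last_update_date DESC;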

AnyDAC aka FireDAC cannot generate update query

I was using UniDAC for a long time and decided to move to FireDAC, as it has good asynchronous methods. After moving, I saw that none of my data editing works anymore; it gives me an error:
[FireDAC][Phys]-330. Cannot generate update query. Update table undefined.
What I am trying to do here: I have a TFDStoredProc component which gets all the data from the database and lets me edit it. With UniDAC I could easily edit the data without any problem, like this:
StoredProc.Edit;
StoredProcCreatedID.Value := SomeValue;
StoredProc.Post;
and it worked, but with AnyDAC it doesn't. I tried specifying the UpdateTable manually, which leads to another problem:
[FireDAC][Phys][ODBC][Microsoft][SQL Server Native Client 11.0][SQL Server]Invalid column name 'CreatedID'.
I am using Microsoft SQL Server 2012, FireDAC 8.0 and stored procedures for getting back results. Any ideas?
P.S.
The query looks like this:
SELECT
CreatedBy as CreatedID,
usr.UserName as CreatedBy
FROM
Sales
LEFT JOIN
Users usr ON usr.ID = Sales.CreatedBy
It looks like the FireDAC update builder doesn't recognize the aliases on the fields; any help would be appreciated.
Well, I figured out what the issue was: it seems that if you specify an alias for a field in the query, the Origin property gets set to the alias and not the real field. I downloaded CnPack (a must-have for a Delphi developer, and it's free), ran the component selector, and changed the Origin of all my aliased fields to their real field names, and it works. But this is still a big issue in the FireDAC components, because they don't recognize aliased fields. Let's hope it gets fixed in the future, because specifying for every query which table it should update and which fields is just a lot of work if you are migrating a big project; in my case, 220+ stored procedures.
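For anyone hitting the same problem, the runtime equivalent of that CnPack fix is to tell FireDAC which table to update and map the aliased field back to its real column through Origin (a sketch based on the query above, using the StoredProc component from the question):

StoredProc.UpdateOptions.UpdateTable := 'Sales';
// The result-set field is aliased as CreatedID but really lives in Sales.CreatedBy.
StoredProc.FieldByName('CreatedID').Origin := 'CreatedBy';
// usr.UserName is aliased as CreatedBy and comes from the joined Users table,
// so keep it out of the generated UPDATE/INSERT statements.
StoredProc.FieldByName('CreatedBy').ProviderFlags := [];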

Recover Redmine data from production log

I had a project in Redmine with more than 600 issues. I moved all the issues to a different project. I had no idea that the move deletes all the data for the custom fields!
So all the custom field values are now lost. I did not back up the database before this action, as I really did not think I was going to do any harm by moving issues, since moving is a native function in the UI.
What I noticed, though, is that production.log contains events for all creations and updates. All my 600 issues are in order in the production log. How can I use these log statements to repeat the actions? If I can replay all the logged actions against the original Redmine instance, I can restore the custom field values they contain.
Entries look like this:
Processing IssuesController#update (for XX.XX.XX.X at 2013-02-07 11:19:54) [PUT]
Parameters: {"_method"=>"put", "authenticity_token"=>"nWNSSRYjHhN0BGb+Ya8M4pYWPPgsfdM=", "issue"=>{"assigned_to_id"=>"", "custom_field_values"=>{"10"=>"", "5"=>"Not translated", "1"=>"fi", "8"=>"http://screencast.com/t/ODknR8K", "9"=>"", "3"=>"", "4"=>""}, "done_ratio"=>"0", "due_date"=>"", "priority_id"=>"4", "estimated_hours"=>"", "start_date"=>"2013-02-07", "subject"=>"1\tInstallation in English", "tracker_id"=>"1", "lock_version"=>"0", "description"=>"Steps:\r\nOpen Nitro\r\n\r\nProblem:\r\nNot localized"}, "controller"=>"issues", "time_entry"=>{"hours"=>"", "activity_id"=>"", "comments"=>""}, "attachments"=>{"1"=>{"description"=>""}}, "id"=>"3876", "action"=>"update", "commit"=>"Submit", "notes"=>""}
I am really hoping that there is a way; any help will be greatly appreciated.
You could use a decent text editor and/or spreadsheet application, do a massive find and replace to construct a series of UPDATE SQL commands, and run them directly on the database (TEST FIRST!!):
Extract the entries from the log
Remove the unnecessary information
Copy into a spreadsheet
Split the text into columns
Add columns with the necessary SQL fragments ("UPDATE ... SET ..." etc.) and copy them into all rows of those columns (see the sketch after this list)
Join the columns to make one text command per row
Export the joined data to a text file
Run it against a test database as SQL
If all goes well, run it against the production database as SQL
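Each generated command would then look something like this (a sketch only: it assumes Redmine keeps the values in a custom_values table keyed by issue id and custom field id, so verify the actual schema first; the id/value pair comes from the log entry above, where issue 3876 has custom field 5 = "Not translated"):

UPDATE custom_values
SET value = 'Not translated'
WHERE customized_type = 'Issue'
AND customized_id = 3876
AND custom_field_id = 5;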
The log entry, following "Parameters:", looks like a regular Ruby hash definition. I'd parse that out and eval it back into a hash variable.
From there you will need to peel off elements and insert them into a database. I'd do that using Sequel, but use what works for you.
Talk to the Redmine support people and get the schema for their tables so you can figure out what data goes where, and which database driver is needed.

Entity Framework Stored Procedure Mapping HELP!

Seems my appreciation for the Entity Framework is taking a serious hit. The "MS almost got it right, but they just missed it because of something L-A-M-E" thought is coming up. Until today everything had been fine. For some unknown reason, it won't compile anymore, giving Error 2048. I've read up on this and I've seen how you need to map all three operations. Why is this even necessary? What if I don't need a delete function and only need insert and update? I tried mapping a dummy SP to my delete function; if that fixed my problem, however cheesy, fine. The only problem is, it just created more problems.
Here's what I have. I'm writing a simple newsletter app in MVC. I have entities for a publication, an issue and an article, all generated from my DB (SQL 08). I set up the relationships in my DB and they translated fine to my EDMX. I made some SPs to insert and update my issues and articles. I added them to the EDMX and mapped them accordingly. I don't need a delete function for any of them, and I don't need anything for the publication entity. Why is the compiler forcing me to map all functions? IMO, this is a MAJOR, MAJOR PROBLEM with EF4 and I can't believe MS would release it with this kind of crap coming up.
The other strange issue is that I've mapped SPs to entities in another project, configured with only insert and update, and they compile fine. Why is the compiler inconsistent?
I would rather not resort to using the imported functions. Is that my only option? If that's the case, it eliminates the ability to use the SaveChanges method. Come on MS!!! Fix this!!!!!!!
After much digging and side-by-side text comparison, I think I have found the solution.
The SP in question was set up like this:
CREATE PROCEDURE updateArticle
(
@ArticleID INT,
@Content TEXT
)
where it should have been something like
CREATE PROCEDURE updateArticle
(
@ArticleID INT,
@IssueID INT,
@Content TEXT
)
Now, I still don't know why EF4 even requires this, since I'm not updating the IssueID, and the error message lends little help in diagnosing the problem. My SP doesn't even use the IssueID, but EF needs it regardless. Hopefully this will help someone down the road. MS could still do a better job regarding the need for this.
