MDS Staging with automatic code creation

Hi everyone!
We're using MS SQL Master Data Services to organize our enterprise master data, and some of the entities we keep consist of data that we load from external sources almost as-is. We regularly update them using jobs or SSIS packages, placing data into staging tables ([stg].[<name>_Leaf]) and starting the staging process using procedures named [stg].[udp_<name>_Leaf], as described in THIS and THIS topics about the staging process in MDS.
Sometimes the data we import from an external source is a flat table: just a set of rows that we might want to reference from our other tables. We load it and enjoy (in fact we place the data into staging tables, call the SP, and let MDS process it at whatever moment is convenient for the server, since the main workload of the staging process runs asynchronously, via Service Broker).
But there are plenty of other, ugly, but real-life cases where the data we load is a tree containing references to members that we have not loaded yet and are only about to place into the staging tables.
The problem is that in most cases we use the automatic code creation feature (and we cannot use a non-surrogate code), so we are unable to set the referencing member's field value (where the referenced member's code must be placed) until the referenced member has been created, inserted into the base table, and had its code generated and assigned.
As I see it, we could resolve this problem if we could reference a staging member by the staging table's ID, which is an IDENTITY assigned right after insert.
-OR-
If we could receive a callback from the staging process once the data has been placed into our base tables and codes have been assigned, then we'd calculate all the references and update them (using the same staging mechanism).
Currently we use a not-very-elegant workaround: generating GUIDs and using them as the Code value whenever this scenario comes up.
Can anyone offer anything more enterprise? (:

When loading hierarchical data, load the parent records into the staging table first, then run the associated stored procedure to apply them to the entity table, where they will be assigned the automatically generated codes.
Next, when loading the child records into the leaf staging table, look up the parent code using a subscription view of the parent entity.
When using automatically generated codes, I recommend you start at 10000 or 100000, because Excel sorts the codes as strings.
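The two-pass pattern above is language-agnostic; here is a minimal in-memory sketch (written in Ruby for brevity, with hypothetical entity names and data). In reality MDS assigns the codes; here a counter starting at 10000 stands in for the automatic code generator, and a hash stands in for the subscription view used to look parents up:

```ruby
# Pass 1: stage the parents; applying them assigns each a generated code.
next_code = 10_000 # start high so Excel's string sort stays in order

parents = [{ name: "Europe" }, { name: "Asia" }]
parent_entity = parents.map do |p|
  row = p.merge(code: next_code.to_s) # "automatic code creation"
  next_code += 1
  row
end

# "Subscription view": resolve a parent's generated code by its name.
code_by_name = parent_entity.to_h { |r| [r[:name], r[:code]] }

# Pass 2: stage the children, filling the reference field with the
# parent code looked up from the view.
children = [{ name: "France", parent: "Europe" }, { name: "Japan", parent: "Asia" }]
staged_children = children.map do |c|
  { name: c[:name], parent_code: code_by_name.fetch(c[:parent]) }
end

staged_children.each { |c| puts "#{c[:name]} -> #{c[:parent_code]}" }
```

The point is simply that the children are never staged until every code they reference can be resolved, which is what makes the GUID workaround unnecessary.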


Dynamics365 Operations: Created/Updated timestamps with Data Entities

I am new to Dynamics FnO, and recently followed the articles to access data through OData, which worked.
What I find missing in the data objects, compared to what I normally receive in integrations outside the Microsoft world, is the created/updated timestamps.
I am trying to set up a synchronous data flow from FnO to my Node.js application, so that my app keeps polling FnO for data whenever there is a change. This would be easy to achieve if timestamps came with the data.
Is there a way to set up those timestamps somewhere?
You have to make sure that the underlying table you are querying has the fields added to it, and also that the data entity you are accessing through OData has the fields set up on it as well.
Make sure this is set up on the table (the CreatedDateTime and ModifiedDateTime properties):
Then you have to drag and drop the field(s) from the data source field list to the exposed field list in the data entity:
After this, you will have these fields.

Stored Procedure Failing to Insert to a Table, No Errors Given

I'm working on an Azure database, just adding a couple of stored procedures and making sure the .NET program I'm building against it is all aligned properly.
I'm not going to go into the stored procedure itself, nor the program I'm developing, because I don't believe the problem is there: I have a development program and database using the exact same code, and they work fine. I'm using Microsoft SQL Server Management Studio to handle everything on the server side.
The only difference in the current setup is that I scripted a bunch of the stored procedures myself, plus a single view over a table that I did not create (the view presents that table in a slightly different format).
The person who created most of these databases and tables is, I guess, one of the database administrators (not Microsoft, but an employee of the company using their services). I, on the other hand, am a freelance programmer, and I'm guessing I have somewhat limited access to the server (limited credentials), although it allows me to do more or less anything I need, like creating SPs.
My current (and only) problem is a single stored procedure that runs through without an error but does not update the table (the table I did not create). The stored procedure just inserts a couple of records and then deletes a record from the same table.
It deletes the record just fine, but for some reason the INSERT doesn't insert anything.
Again, this works fine on the other development database, and the programs are sending the exact same strings, but this new database just doesn't want to play along.
Could this be a permissions problem between my stored procedure and the table I did not create?
I would love to dump this onto the admin guy (and already did, but he dumped it back on me, haha), so I just want to be sure I'm not wasting his time and can give him something solid to go on.
Thanks for your help Paul S.

Rails communication with web services

I am developing a Rails web shop application and I have the following system set up:
2 separate web services (very simple Rails apps with the same code but different databases)
Main Rails application which stores information from both web services.
The main application gets some information from both web services (in JSON format) and has to choose items based on price. For testing purposes I currently take all items from both and add them to the main application's database. However, when the items are being stored in the main database (with a simple .create and a hash of all parameters), it seems as if an item is added multiple times, and thus it takes a very long time.
First, what is generally a good strategy for this type of thing: getting data from the web services and storing it? Also, at what point should I ask for an update of the main database? Every time a user connects seems too often.
I assume there is a key value for id in the data; if not, you should define one. Most likely an auto-incrementing integer ID, since this is tagged as Rails, although you'll probably want a UUID (perhaps SecureRandom.uuid) since the two data sources are independent of each other, which adds significant complexity in a Rails app.
In that case you could use @model = Model.find_or_create_by(key_value: value) to avoid duplicates being created, and @model.update_attributes (essentially an update action) to modify only what has changed.
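In plain Ruby, the find-or-create pattern from the answer reduces to an upsert keyed on the UUID. A minimal self-contained sketch, where a Hash indexed by uuid stands in for Model.find_or_create_by(uuid: ...) plus update_attributes, and the record shapes are hypothetical:

```ruby
store = {} # uuid => attributes; stands in for the model's table

def sync!(store, incoming)
  incoming.each do |item|
    if (existing = store[item[:uuid]])
      existing.merge!(item)         # hit: update only what changed
    else
      store[item[:uuid]] = item.dup # miss: create the record once
    end
  end
end

# Two pulls from the web services; the second repeats "a1" with a new price.
sync!(store, [{ uuid: "a1", price: 10 }, { uuid: "b2", price: 20 }])
sync!(store, [{ uuid: "a1", price: 12 }])

puts store.size          # 2 -- no duplicate row for "a1"
puts store["a1"][:price] # 12 -- updated in place, not re-created
```

This is why the repeated-item slowdown disappears: each pull touches existing rows instead of creating new ones.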

OData Expand across multiple datasource

I have created a few LightSwitch HTML apps. Each app has an ApplicationData data source; however, in addition they relate to each other via one or more OData data sources.
For example:
A HumanResource app contains
an Employees table in its ApplicationData.
A Project Management app contains:
a Tasks table and a Projects table in its ApplicationData data source
an Employees table in a related OData data source (HumanResourceData) which loads data from the Employees table in the HumanResource app's ApplicationData data source
A Task has an Employee related to it as 1..0 or 1
The problem is
myapp.activeDataWorkspace.ApplicationData.Tasks.expand("Project, Employees").execute();
DOES NOT WORK, which is very understandable.
Tasks can be loaded because they reside in ApplicationData; Employees cannot, because they reside in HumanResourceData. OData cannot fetch data from two sources and magically relate them, and apparently neither can LightSwitch, at least when you're manually loading data.
Two solutions I can think of are:
use task.getEmployees().then(function (results) { ... });
This causes a round trip to the server for every task. NOT GOOD ENOUGH.
myapp.activeDataWorkspace.ApplicationData.Tasks.load()
myapp.activeDataWorkspace.HumanResourceData.Employees.load()
//The mystery method that I'm looking for
Resolve_My_Datas_Relationships_Please_Lightswitch();
While reviewing trace.axd I noticed that when I drag and drop the collection into the view model and have all the data loaded automagically, related entities and all, I get one call to the HumanResourceData data source for each related entity, which looks exactly like solution #1.
This leads me to believe that solution #2 does not exist, since it is not the approach the LightSwitch team has taken, for which they may have valid reasons (probably because it's very hard to optimize for any particular set of queries if you do not know beforehand what they are).
Can anyone shed some light?
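For what it's worth, once both collections are loaded (solution #2's two load() calls), the relationship can be resolved client-side with a single lookup table instead of per-task round trips. A sketch of that join with hypothetical data, written in Ruby for brevity (in LightSwitch it would be the same idea in JavaScript over the two loaded result sets):

```ruby
# Index the foreign collection once, then resolve each task's reference
# in memory -- one pass, no per-row server calls.

employees = [{ id: 1, name: "Ann" }, { id: 2, name: "Bob" }]                        # HumanResourceData.Employees.load()
tasks     = [{ title: "Spec", employee_id: 2 }, { title: "Build", employee_id: 1 }] # ApplicationData.Tasks.load()

employees_by_id = employees.to_h { |e| [e[:id], e] }

tasks_with_employee = tasks.map do |t|
  t.merge(employee: employees_by_id[t[:employee_id]])
end

tasks_with_employee.each { |t| puts "#{t[:title]}: #{t[:employee][:name]}" }
```

This trades the framework's N round trips for two loads plus an O(n) in-memory join, which is usually the right trade when the foreign collection is small.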

How to handle multiple database accesses?

In my program I have multiple databases. One is fixed and cannot be changed, but there are also some others, the so-called user databases.
I thought I would have to open one connection per database and connect to each data dictionary. Is it possible to connect to more than one database over a single connection by handing over the data dictionary filename? By the way, I am using a local server.
thank you very much,
André
P.S.: Okay, I might have found the answer to my problem.
The keyword is CreateDDLink. The procedure connects to another data dictionary, but a master dictionary has to be set first.
Links may be what you are looking for, as you indicated in the question. You can use the API or SQL to create a permanent link alias, or you can create links dynamically on the fly.
I would recommend reviewing this specific help file page: Using Tables from Multiple Data Dictionaries.
For a permanent alias (using SQL), look at sp_createlink. You can either create the link to authenticate the current user or set up the link to authenticate as a specific user. Then use the link name in your SQL statements:
select * from linkname.tablename
Or, dynamically, you can use the following, which authenticates the current user:
select * from "..\dir\otherdd.add".table1
However, links are only available from SQL. If you want to use the table directly (i.e., via a TAdsTable component), you will need to create views. See KB 080519-2034. The KB mentions that you can't post updates if the SQL statement for the view results in a static cursor, but you can get around that by creating triggers on the view.
