Databricks: move data from a Databricks temp view or table to a data warehouse directly

How do we move data from a Databricks temp view or table to a data warehouse directly? Please let me know, as I am new to Databricks.
The documentation only shows direct connectivity to the data warehouse, but not how to actually move the data.

You move data by writing a DataFrame to an external location, e.g.:
df.write
  .format("com.databricks.spark.sqldw")
  .option("url", "jdbc:sqlserver://<the-rest-of-the-connection-string>")
  .option("forwardSparkAzureStorageCredentials", "true")
  .option("dbTable", "my_table_in_dw_copy")
  .option("tempDir", "wasbs://<your-container-name>@<your-storage-account-name>.blob.core.windows.net/<your-directory-name>")
  .save()
https://docs.databricks.com/data/data-sources/azure/synapse-analytics.html

Related

How to assign data from a database to a dxLookupTreeView?

Saving data from the component to the database works great:
dm.tabtier.FieldByName('clefamtiers').AsInteger := dm.tabfamtiers.FieldByName('clefamtiers').AsInteger;
tabfamtiers is the datasource of the dxLookupTreeView.
I need to show data from the database in the dxLookupTreeView, but I can't make it work. If a matching record is found, I locate and show it:
dxlookuptreeview1.Text := dm.tabfamtiers.FieldByName('famtiers').AsString;
dxlookuptreeview1.RootValue := dm.tabtier.FieldByName('clefamtiers').AsInteger;
but it did not work.

Unique field in CloudKit while syncing with a local Core Data cache

I am trying to implement CloudKit sync with a local Core Data stack, and I have a few doubts:
I need a table to have a unique field other than the CKRecordID field. How can I achieve this? The manual process of checking before inserting into CloudKit is too cumbersome.
Prevent deletion of a parent record if a child record exists in CloudKit.
Explanation: Say I have two devices, both synced with the cloud, with a single child for a parent record. Device 1 inserts a new child record for that parent and syncs with CloudKit, while Device 2 deletes that particular parent. When Device 2 syncs with the cloud, it automatically deletes the parent and the newly inserted child with it. What I want to achieve is: if a new child exists on a parent in the cloud, the parent record must not get deleted; instead, when Device 2 syncs, it gets the parent back along with the new child record.
Note: I am using the private database with a custom zone.
Any suggestions are most welcome. Thanks in advance.
When you start syncing between CloudKit and Core Data, start with the parent entities and then sync the child entities.
When Device 1 syncs with CloudKit, the new child record will be synced.
When Device 2 syncs, before propagating the deletion from Core Data to CloudKit, you have to check the following scenario:
If a parent was deleted locally and that deletion is being synced to CloudKit, first check whether any child entity of that parent still exists on CloudKit. If one is available, don't delete the parent from the local Core Data store either.
(Note: before deleting any parent from CloudKit or Core Data, you first have to check whether any new child is available on the opposite side.)
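The check-before-delete rule described above can be sketched language-neutrally. Below is a minimal Python sketch, not CloudKit code: `server_children_of` stands in for a server-side child query (a CKQuery in a real app), and `delete_on_server` / `restore_locally` are hypothetical callbacks.

```python
# Toy sketch of the delete-check rule above: before propagating a local
# parent deletion to the server, ask the server whether any child records
# still reference that parent. All names here are hypothetical stand-ins.

def sync_parent_deletion(parent_id, server_children_of, delete_on_server, restore_locally):
    """Propagate a local deletion only if the parent has no children on the server."""
    remaining = server_children_of(parent_id)
    if remaining:
        # Another device added a child since the last sync: keep the parent
        # and pull it (plus the new children) back into the local cache.
        restore_locally(parent_id, remaining)
        return False
    delete_on_server(parent_id)
    return True
```

In a real implementation the child query and the restore step would run in the same sync pass, so Device 2 ends up with the parent and the new child instead of an orphaned deletion.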
You said private database; you should be using CloudKit with Core Data, since that is exactly the scenario it was designed for. The public database is a different story.
You can use a call to this method to generate your own UUID, which you can then use as a record ID:
let uuid = CFUUIDCreateString(nil, CFUUIDCreate(nil))
You need to create references that point backwards. So you wouldn't put this in a reference field of the parent record and then create a child with it; you would create the parent with this ID and point the children to it.
That way, if the parent gets deleted, yes, the child is deleted too (assuming you choose that option); but if the child gets deleted, the parent remains in place.
Here is some Swift 2.0 code to bring it all together; newRecord here is a CKRecord, and theLink is a reference field. You would use this code when you create the child record:
let uniqReference = NSUUID().UUIDString
let singleLink2LinkthemALL = CKRecordID(recordName: uniqReference)
let theLinkRef = CKReference(recordID: singleLink2LinkthemALL, action: .DeleteSelf)
newRecord.setObject(theLinkRef, forKey: "theLink")
I strongly suspect, although I have not tested it, that if you had a child record with two backward references to two distinct parent records, it would remain in place even if only one of the parents was deleted.
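The suspicion above can be checked against a toy in-memory model (plain Python, not CloudKit): each record holds the set of parent IDs it references, and a DeleteSelf-style cascade removes a child only once every parent it references is gone.

```python
# Toy model of CKReference(.DeleteSelf)-style semantics with multiple
# backward references. Not CloudKit: just an in-memory sketch to reason
# about the cascade behaviour described above.

class ToyStore:
    def __init__(self):
        # record_id -> set of parent record IDs this record references
        self.records = {}

    def create(self, record_id, parent_refs=()):
        self.records[record_id] = set(parent_refs)

    def delete(self, record_id):
        self.records.pop(record_id, None)
        # Cascade: a child disappears only when its last referenced
        # parent is gone.
        for child, parents in list(self.records.items()):
            if record_id in parents:
                parents.discard(record_id)
                if not parents:
                    self.delete(child)
```

In this model, a child created with two parent references survives the deletion of one parent and is cascaded away only when the second one goes, which matches the behaviour suspected above.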

UITableView handling JSON and Core Data

What would be the best practice, and best for the user experience, to achieve the following?
1) Retrieve data from JSON
2) Store it in Core Data
3) Display it in a UITableViewController
Do I store the JSON first, then populate the table using the stored data? Or do I store it in Core Data (as a background process) and populate the table using the JSON the first time?
I want the user to be presented with a UITableView with minimum load time.
Thanks
This is what I would do:
Create your Core Data database and model.
Create a data access layer that contains the read and write methods for each of your objects.
In the read functions, query Core Data; if there is data, return it, then call the web server in the background and update Core Data with the new JSON.
If there is no data, request it from the web server, populate your Core Data tables using the JSON, and then return the data from Core Data so it is always consistent.
You can also keep a last-update date on your data so that you only request data from the web server that isn't already in your local Core Data DB. This reduces the amount of data coming down to your iOS device.
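The data access layer described above boils down to a read-through cache. Here is a minimal Python sketch of the idea (a dict stands in for Core Data, a callable for the web request, and `schedule_background` for whatever background mechanism the app uses; all names are made up for illustration):

```python
# Read-through cache sketch of the answer above: serve local data when it
# exists and refresh it off the main path; otherwise fetch once, store,
# then always serve from the store so reads stay consistent.

def read_objects(key, local_store, fetch_remote, schedule_background):
    cached = local_store.get(key)
    if cached is not None:
        # Return immediately; refresh the cache in the background.
        schedule_background(lambda: local_store.update({key: fetch_remote()}))
        return cached
    # First launch: block once, populate the store, serve from it.
    local_store[key] = fetch_remote()
    return local_store[key]
```

On the first call the user waits for one network round trip; every later call returns cached data instantly while the refresh happens behind the scenes, which is what keeps the table's load time minimal.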
If you want minimum load time then I'd serve from the JSON and save to Core Data afterwards. That way the user can see content straight away without first having to wait for all the data to be saved (and parsed).
The course of action here depends heavily on:
A. The amount of JSON data you are downloading.
B. How effective your backend is at sending only the necessary JSON results (rather than sending everything in bulk).
C. How you are attaching Core Data to your UITableViewController.
I recently built a pretty big project that does exactly this: fetching a large chunk of JSON, parsing it, and inserting it into Core Data. The only time there is any delay is during the initial load. This is how I accomplished it:
Download the JSON and cast it as [[String: AnyObject]]: if let result = rawJSON as? [[String: AnyObject]] {}
Check whether the IDs for the objects already exist in Core Data. If an object already exists, check whether it needs updating; if it doesn't exist, create it. Also check whether any IDs have been removed from the JSON, and if so, delete them from Core Data as well.
Use NSFetchedResultsController to manage the data from Core Data and populate the UITableView. I use NSFetchedResultsController rather than managedObjectContext.executeFetchRequest() because NSFetchedResultsController has delegate methods that are called every time the managedObjectContext is updated.
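The ID-matching step above amounts to a three-way diff between the downloaded objects and the local cache. A minimal Python sketch, with plain dicts standing in for managed objects and `id` assumed (for illustration) to be the server-side primary key:

```python
# Three-way diff between downloaded JSON objects and a local cache, as in
# the ID-checking step above: create missing IDs, update existing ones
# that changed, delete local IDs that no longer appear in the JSON.

def diff_sync(json_objects, local_by_id):
    remote_by_id = {obj["id"]: obj for obj in json_objects}
    to_create = [obj for oid, obj in remote_by_id.items()
                 if oid not in local_by_id]
    to_update = [obj for oid, obj in remote_by_id.items()
                 if oid in local_by_id and local_by_id[oid] != obj]
    to_delete = [oid for oid in local_by_id if oid not in remote_by_id]
    return to_create, to_update, to_delete
```

In the real app the "changed?" comparison would look at a modification date or a hash rather than comparing whole objects, but the create/update/delete partition is the same.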

How to avoid changing the OleDbConnection data source

When transferring the file to another location, I always need to change the source directory:
Dim cnn = New OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Users\Renz\Desktop\FINAL\Database\AuditDB.mdb")
Is there a way I can avoid that?
You could use a path relative to your application's location, e.g.:
Dim cnn = New OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=Database\AuditDB.mdb")
In this example I am presuming that your database is stored in a folder called Database in the same folder as your program.
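The same idea sketched in Python, purely for illustration: resolve the database path against the running program's directory at run time instead of hard-coding it, so moving the application folder never breaks the connection string (the `Database` folder name is the presumed layout from the answer above):

```python
# Build the OleDb connection string from a path resolved relative to the
# running program, so the app can be moved without editing the data source.
import os
import sys

def audit_db_connection_string():
    app_dir = os.path.dirname(os.path.abspath(sys.argv[0]))
    db_path = os.path.join(app_dir, "Database", "AuditDB.mdb")
    return "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + db_path
```

In VB.NET the equivalent starting point would be the application's base directory rather than a hard-coded absolute path.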

How to load Breeze entities from local storage (WebSQL)

I don't have access to an OData provider, just a simple REST API with JSON, and I need to store the data locally (in WebSQL on the mobile device) in different tables reflecting the backend model. Following the Edmunds example, I have got the entities and the relationships working from the REST API. How can I make it work the same way from the data stored locally? I would like to fetch the data from the local DB and recreate my entities; any advice would be appreciated, thanks.
After you have queried the data through the REST API, just export the EntityManager to local storage. Something like this:
var changesExport = myEntityManager.exportEntities();
ok(window.localStorage, "this browser supports local storage");
var stashName = "arbitrary name for storage area";
window.localStorage.setItem(stashName, changesExport);
This data can later be reimported into any existing EntityManager and then queried locally:
importedData = window.localStorage.getItem(stashName);
anotherEntityManager.importEntities(importedData);
