AzCopy DateTime

Having an issue downloading a SQL backup file from Azure storage based on the share snapshot datetime details.
I'm using the following command:
AzCopy.exe /Source:https://ascentibackup.file.core.windows.net/sqlbackups/TICCSAZR-WSFC1%24TICCS%20Prod%20AG01/Postcode/FULL_COPY_ONLY/TICCSAZR-WSFC1%24TICCS%20Prod%20AG01_Postcode_FULL_COPY_ONLY_20200227_223325.bak?sharesnapshot=2020-02-29T05%3A02%3A24.0000000Z /Dest:R:\Restore\gcfs /SourceKey: Key /Pattern:TICCSAZR-WSFC1%24TICCS%20Prod%20AG01_Postcode_FULL_COPY_ONLY_20200227_223325.bak
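If it helps as a starting point, two things stand out in that command: the space after /SourceKey: and the fact that /Pattern is percent-encoded, whereas AzCopy v7.x/v8.x (the version this /Source:/Dest: syntax belongs to) matches /Pattern against the literal file name. A hedged sketch of a corrected command, pointing /Source at the parent directory plus the share snapshot query string, and keeping Key as the placeholder for the real account key:

AzCopy.exe /Source:"https://ascentibackup.file.core.windows.net/sqlbackups/TICCSAZR-WSFC1%24TICCS%20Prod%20AG01/Postcode/FULL_COPY_ONLY?sharesnapshot=2020-02-29T05%3A02%3A24.0000000Z" /Dest:R:\Restore\gcfs /SourceKey:Key /Pattern:"TICCSAZR-WSFC1$TICCS Prod AG01_Postcode_FULL_COPY_ONLY_20200227_223325.bak"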

Get Resources\Raw Path in .NET MAUI

I have an application which uses a SQLite database to store data locally. I get the path to and open the database using:
string dbPath = System.IO.Path.Combine(FileSystem.AppDataDirectory, "mydatabase");
db = new SQLiteConnection(dbPath);
I have created an example database (example.db3) and copied this to the Resources\Raw folder with the Build Action MauiAsset.
My question is how do I get the path to this database file so I can use
string dbPath = System.IO.Path.Combine("*path to Resources\Raw folder*", "example.db3");
to get the full path to open the example database.
I have tried searching, but the best I can find is FileSystem.Current.OpenAppPackageFileAsync; however, this opens the file as a stream for reading, whereas I need the path.
Ideally the method would work on all platforms.
This is a known issue.
You can follow it here: https://github.com/dotnet/maui/issues/3270.
Thanks for your feedback and support.
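Until that issue is addressed, a common workaround is to copy the bundled asset out of the app package into FileSystem.AppDataDirectory once, and then open the copy by its file path. A minimal sketch, assuming the sqlite-net SQLiteConnection from the question and the MAUI implicit usings (System.IO, Microsoft.Maui.Storage); the method name GetExampleDbPathAsync is made up:

// Copies Resources\Raw\example.db3 (a MauiAsset) into AppDataDirectory once,
// then returns a file-system path that SQLiteConnection can open.
async Task<string> GetExampleDbPathAsync()
{
    string dbPath = Path.Combine(FileSystem.AppDataDirectory, "example.db3");

    if (!File.Exists(dbPath))
    {
        // OpenAppPackageFileAsync reads the packaged asset as a stream on every platform.
        using Stream source = await FileSystem.OpenAppPackageFileAsync("example.db3");
        using FileStream destination = File.Create(dbPath);
        await source.CopyToAsync(destination);
    }

    return dbPath;
}

// Usage:
// var db = new SQLiteConnection(await GetExampleDbPathAsync());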

Azure Data Factory Get Metadata to get blob filenames and transfer them to Azure SQL database table

I am trying to use the Get Metadata activity in Azure Data Factory to get blob filenames and copy them to an Azure SQL database table.
I am following this tutorial: https://www.mssqltips.com/sqlservertip/6246/azure-data-factory-get-metadata-example/
Here is my pipeline. Copy Data > Source is the source dataset for the blob files in my Blob storage. I need to specify the source files as binary because they are *.jpeg files.
For my Copy Data > Sink, it is the Azure SQL database, and I enable the option "Auto create table".
In my Sink dataset config, I had to choose one table because validation won't pass if I don't select a table in my SQL database, even though this table is not related at all to the blob filenames that I want to get.
Question 1: Am I supposed to create a new table in the SQL DB beforehand, with columns matching the blob filenames that I want to extract?
Then I tried to validate the pipeline and got this error:
Copy_Data_1
Sink must be binary when source is binary dataset.
Question 2: How can I resolve this error? I had to select binary as the source file type, as that is one of the steps when creating the source dataset. However, when I chose the sink dataset (the Azure SQL table), I didn't have to select a dataset type, so the two don't seem to match.
Thank you very much in advance.
Here is a screenshot of the new pipeline; I can now get the itemName of the filenames in the JSON output.
Now I have added a Copy Data activity right after the Get_File_Name2 activity and connected them together, to try to use the JSON output as the source dataset.
However, I need to choose the source dataset location before I can specify the type as JSON. As far as I understand, these JSON outputs come from the Get_File_Name2 activity and are not yet stored in Blob storage. How do I make the Copy Data activity read this JSON output as its source dataset?
Update 10/14/2020
Here is my new Stored Procedure activity. I added the parameter as suggested; however, I changed the name to JsonData because my stored procedure requires this parameter.
This is my stored procedure.
I get this error at the stored procedure:
{
  "errorCode": "2402",
  "message": "Execution fail against sql server. Sql error number: 13609. Error Message: JSON text is not properly formatted. Unexpected character 'S' is found at position 0.",
  "failureType": "UserError",
  "target": "Stored procedure1",
  "details": []
}
But when I check the input, it looks like it has already successfully read the JSON string itemName.
But when I check the output, it's not there.
Actually, you could use the Get Metadata output JSON as the parameter and then call the stored procedure: Get Metadata --> Stored Procedure.
You just need to focus on the coding of the stored procedure.
Get Metadata output childItems:
{
  "childItems": [
    {
      "name": "DeploymentFiles.zip",
      "type": "File"
    },
    {
      "name": "geodatalake.pdf",
      "type": "File"
    },
    {
      "name": "test2.xlsx",
      "type": "File"
    },
    {
      "name": "word.csv",
      "type": "File"
    }
  ]
}
Stored Procedure:
@activity('Get Metadata1').output.childItems
For how to create the stored procedure (getting the data out of the JSON object), you could refer to this blog: Retrieve JSON Data from SQL Server using a Stored Procedure.
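For reference, a minimal sketch of such a stored procedure, assuming SQL Server 2016+ / Azure SQL (for OPENJSON), the @JsonData NVARCHAR(MAX) parameter mentioned in the update above, and a hypothetical dbo.BlobFileNames target table; in the pipeline the parameter value would typically be wrapped so the array arrives as text, e.g. @string(activity('Get Metadata1').output.childItems):

CREATE OR ALTER PROCEDURE dbo.usp_InsertBlobFileNames
    @JsonData NVARCHAR(MAX)   -- the childItems array passed in as a JSON string
AS
BEGIN
    SET NOCOUNT ON;

    -- Shred the JSON array into rows and keep the file name and type.
    INSERT INTO dbo.BlobFileNames (FileName, FileType)
    SELECT [name], [type]
    FROM OPENJSON(@JsonData)
         WITH ([name] NVARCHAR(400) '$.name',
               [type] NVARCHAR(50)  '$.type');
END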

Can I insert data into InfluxDB from a .log file?

I have data in a .log file. How can I get it inserted into the database, like the NOAA_water_database they suggest in the documentation?
The link to the doc is below:
https://docs.influxdata.com/influxdb/v1.7/query_language/data_download/
The CLI command influx has an option called -import to import data into InfluxDB. Refer to the details here.
In the same document you mentioned in the question there is a command to import data:
influx -import -path=NOAA_data.txt -precision=s -database=NOAA_water_database
Inserting the data from a log file into InfluxDB is possible if the lines of the log file are in line protocol.
Please refer to the URL below:
https://community.influxdata.com/t/writing-logs-manually-into-influxdb/6247/4
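For context, influx -import expects a file with an optional # DDL section and a # DML section whose lines are in line protocol, so a log file usually has to be rewritten into that shape first. A small illustrative example with made-up measurement, tag, and field names (timestamps in seconds, matching -precision=s):

# DDL
CREATE DATABASE mylogs

# DML
# CONTEXT-DATABASE: mylogs
applog,host=web01,level=error message_count=3i 1587632400
applog,host=web01,level=info message_count=42i 1587632460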

How to save RegIniFile as IniFile in Delphi?

My Win32 application uses a registry key utilizing TRegIniFile like this:
MyData := TRegIniFile.Create('\Software\MyApp');
The registry key has subkeys and values.
How can I export all the data as an INI file utilizing TIniFile?
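A rough sketch of one approach, assuming each subkey under \Software\MyApp should become an INI section and that every value can be read back as a string; the procedure name ExportToIni and the output file name are made up:

uses
  System.Classes, System.IniFiles, System.Win.Registry;

procedure ExportToIni(const IniFileName: string);
var
  Reg: TRegIniFile;
  Ini: TIniFile;
  Sections, Values: TStringList;
  i, j: Integer;
begin
  Reg := TRegIniFile.Create('\Software\MyApp');
  Ini := TIniFile.Create(IniFileName);
  Sections := TStringList.Create;
  Values := TStringList.Create;
  try
    Reg.ReadSections(Sections);                    // each subkey -> one INI section
    for i := 0 to Sections.Count - 1 do
    begin
      Reg.ReadSectionValues(Sections[i], Values);  // Name=Value pairs of that subkey
      for j := 0 to Values.Count - 1 do
        Ini.WriteString(Sections[i], Values.Names[j], Values.ValueFromIndex[j]);
    end;
  finally
    Values.Free;
    Sections.Free;
    Ini.Free;
    Reg.Free;
  end;
end;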

Upload all files from local storage to Azure Blob Storage

I am currently struggling to upload multiple files from local storage to Azure Blob Storage. I was wondering if anyone could help me; below is the code I was previously using to upload a single zip file.
private void SaveZip(string id, string fileName, string contentType, byte[] data)
{
    // Create a blob in container and upload image bytes to it
    var blob = this.GetContainer().GetBlobReference(fileName);
    blob.Properties.ContentType = contentType;
    // Create some metadata for this image
    var metadata = new NameValueCollection();
    metadata["Id"] = id;
    metadata["Filename"] = fileName;
}

SaveZip(
    Guid.NewGuid().ToString(),
    zipFile.FileName,
    zipFile.PostedFile.ContentType,
    zipFile.FileBytes);
Thanks, Sami.
It's quite straightforward with Set-AzureStorageBlobContent from Azure Storage PowerShell:
ls -File -Recurse | Set-AzureStorageBlobContent -Container upload
MSDN documentation: http://msdn.microsoft.com/en-us/library/dn408487.aspx
I don't think there are any built-in methods you can use to upload multiple files to blob storage. What you can do is upload them one by one, or in parallel.
If you're just starting to work with Blob Storage, I'd encourage you to take a look at the "How to" article we've published. Specifically, the section on "How to Upload a Blob into a Container" should be helpful. Beyond that, Shaun is correct - there is no built-in support in the StorageClient library for uploading multiple files at once, but you can certainly upload them one-by-one.
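As a sketch of the one-by-one approach, assuming the same classic Microsoft.WindowsAzure.StorageClient library as the snippet in the question (with GetContainer() returning a CloudBlobContainer); the method name UploadFolder and the folderPath parameter are hypothetical:

// Uploads every file in a local folder as its own blob, one at a time.
private void UploadFolder(string folderPath)
{
    var container = this.GetContainer();
    foreach (string filePath in Directory.GetFiles(folderPath))
    {
        // Blob name = local file name; UploadFile streams the file content up.
        var blob = container.GetBlobReference(Path.GetFileName(filePath));
        blob.UploadFile(filePath);
    }
}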
If your need is just to get it done, and not to make an app out of it, you should consider checking out Cloud Storage Studio.
Like CodeThug said, "You never do anything with the byte array".
You have to upload the data stream to the blob and you are done.
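Putting those two comments together, a minimal sketch of what the completed method might look like, again assuming the classic StorageClient CloudBlob API used in the question:

private void SaveZip(string id, string fileName, string contentType, byte[] data)
{
    var blob = this.GetContainer().GetBlobReference(fileName);
    blob.Properties.ContentType = contentType;

    // Attach the metadata to the blob itself instead of a local collection.
    blob.Metadata["Id"] = id;
    blob.Metadata["Filename"] = fileName;

    // This is the step the original snippet was missing: actually push the bytes up.
    blob.UploadByteArray(data);
}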
