How to load Avro files from Blob Storage in Azure Data Factory Mapping Data Flow?

I'm trying to load Avro files, but I'm unable to import the schema or preview the data.
My Avro files in blob storage are the output of the Event Hubs Capture feature.
I have to move data from Azure Blob Storage to Azure SQL DB using Azure Data Factory's Mapping Data Flow.

Data Factory supports Avro files.
In the Data Flow, create a new source from Azure Blob Storage.
Choose the Avro format for your files.
Choose the file path.
Then you can add a sink that points to your Azure SQL dataset.
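If schema import or data preview still fails on the source, it can help to inspect one of the captured files directly. Below is a minimal sketch, assuming the azure-storage-blob and fastavro Python packages; the connection string, container, and blob path are placeholders you would replace with your own Capture output.

```python
import io

from azure.storage.blob import BlobClient
from fastavro import reader

# Placeholders - point these at one of the Event Hubs Capture output files.
conn_str = "<storage-account-connection-string>"
blob = BlobClient.from_connection_string(
    conn_str,
    container_name="capture-container",
    blob_name="myhub/0/2019/01/01/00/00/00.avro",
)

# Download the Avro file and print its schema plus a few records.
data = io.BytesIO(blob.download_blob().readall())
avro_reader = reader(data)

print(avro_reader.writer_schema)  # the schema the Data Flow source should infer
for i, record in enumerate(avro_reader):
    print(record["Body"])         # Capture wraps each event payload in a "Body" field
    if i >= 4:
        break
```

Note that Capture wraps every event in fields such as Body and EnqueuedTimeUtc, so after the source reads the file you typically still need to parse the Body column in the data flow before sinking to SQL.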
Here is another tool that may be helpful for you: Load AVRO Files Data to Azure SQL Database in Real Time.
Hope this helps.

Related

Synchronize blob files from cloud to IoT Edge Blob (local)

Consider the following hypothetical setup for an IoT Edge device implementation; we want to know whether there is an automated mechanism for it using the Azure IoT infrastructure.
An admin application will write several JSON configuration files associated with a specific device. Each device has a different config, and the config files are large (about 1 MB), so using twins is not a good solution.
We want those files, stored in the cloud, to be sent automatically to the target device, which then stores them in its local blob storage. The local files should always reflect what is in the cloud, almost like OneDrive.
Is there any facility for this in Azure / IoT Edge? How can we isolate the information for each device without exposing the other configurations stored in the cloud blob?
Upload the blob to Azure Storage (or anywhere, really) and set a properties.desired property containing the link plus a SAS token (or, if you want to keep the URL the same, a hash of the contents). Your edge module gets a callback (during startup and at runtime) that the property value has changed and can connect to the cloud to download the configuration. There is no need to use the LocalBlobStorage module; the config can be cached in the edge module's /tmp directory.
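A rough sketch of the module side of this approach, assuming the azure-iot-device Python SDK; the desired-property name (configUrl) and the local cache path are made up for illustration.

```python
import threading

import requests
from azure.iot.device import IoTHubModuleClient

CONFIG_PATH = "/tmp/device-config.json"  # local cache, per the suggestion above

client = IoTHubModuleClient.create_from_edge_environment()


def apply_config(url):
    # The desired property carries the blob URL + SAS token, so a plain GET is enough.
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    with open(CONFIG_PATH, "wb") as f:
        f.write(response.content)


def on_patch(patch):
    # Fired at runtime whenever the desired properties change.
    if "configUrl" in patch:
        apply_config(patch["configUrl"])


client.on_twin_desired_properties_patch_received = on_patch
client.connect()

# On startup, read the full twin so a config published while the module was offline is applied.
desired = client.get_twin().get("desired", {})
if "configUrl" in desired:
    apply_config(desired["configUrl"])

threading.Event().wait()  # keep the module running
```

Storing a hash of the contents instead of a fresh SAS URL works the same way; the handler would just compare the hash against the cached file before re-downloading.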

Calling MS Graph API from Azure Data Factory Pipeline

Is it possible to use an ADF pipeline to call the MS Graph API (get user by email) by passing a set of userPrincipalName values from a source (e.g. Azure Table Storage)? Please provide an example.
This is what I have tried: I added an Office 365 linked service. I have a Copy Data activity that copies data to Table Storage. Then I added a Web activity. What dataset should I include in its settings? Also, how would I pass values from Azure Table Storage to the Web activity?
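For reference, the Graph request the Web activity would need to issue looks roughly like the sketch below, written in Python with msal and requests; the tenant ID, client ID, and secret are placeholders, and the app registration needs an application permission such as User.Read.All.

```python
import msal
import requests

# Placeholders - replace with your own app registration details.
tenant_id = "<tenant-id>"
app = msal.ConfidentialClientApplication(
    client_id="<client-id>",
    authority=f"https://login.microsoftonline.com/{tenant_id}",
    client_credential="<client-secret>",
)

# Client-credentials token for Microsoft Graph.
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])


def get_user(upn):
    # "Get user by email": GET /users/{userPrincipalName}
    resp = requests.get(
        f"https://graph.microsoft.com/v1.0/users/{upn}",
        headers={"Authorization": f"Bearer {token['access_token']}"},
    )
    resp.raise_for_status()
    return resp.json()


print(get_user("someone@contoso.com"))
```

In ADF, the same call maps onto the Web activity's URL, method, and Authorization header; the userPrincipalName values read from Table Storage would be injected into the URL via pipeline expressions.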

Querying device's telemetry/status/event from Azure IoT Central

I've created an Azure IoT Central solution and successfully registered an IoT device. I'm able to save telemetry/status/event data for the device; however, I could not find an option to query the existing data using .NET code. For example, I would like to query the telemetry data for the last month using C#.
Is this not supported by the SDK?
In order to query the telemetry data for the last month, you will need to export it to Blob Storage or route it to another database (such as Cosmos DB). The supported way to export your data from Azure IoT Central is described here.
You can then use the REST API in your C# code to extract the data from your devices; a rough sketch follows the links below.
Read Avro files by using C#
Azure Storage samples using .NET
Azure Cosmos DB: .NET examples for the SQL API
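Once continuous data export is writing to a storage container, pulling the last month of telemetry back out is mostly a matter of listing the exported blobs and reading the records. A rough Python sketch, assuming the export lands as Avro files (as the links above discuss) and using azure-storage-blob plus fastavro; the connection string and container name are placeholders.

```python
import io
from datetime import datetime, timedelta, timezone

from azure.storage.blob import ContainerClient
from fastavro import reader

conn_str = "<storage-account-connection-string>"  # placeholder
container = ContainerClient.from_connection_string(conn_str, container_name="iotc-export")

cutoff = datetime.now(timezone.utc) - timedelta(days=30)
records = []

# The export writes one blob per batch; filter on each blob's last-modified time first.
for blob in container.list_blobs():
    if blob.last_modified < cutoff:
        continue
    data = io.BytesIO(container.download_blob(blob.name).readall())
    records.extend(reader(data))

print(f"{len(records)} telemetry records exported in the last 30 days")
```

The same loop can be translated to C# with the Azure Storage and Avro libraries linked above; only the filtering on record timestamps depends on how your export is configured.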

How can I change bucket or prefix when writing to Google Cloud Storage from Dataflow?

In a streaming Dataflow pipeline, how can I dynamically change the bucket or the prefix of the data I write to Cloud Storage?
For example, I would like to store data in text or Avro files on GCS, but with a prefix that includes the processing hour.
Update: the question is invalid because there simply is no sink you can use in a streaming Dataflow pipeline that writes to Google Cloud Storage.
Google Cloud Dataflow currently does not allow GCS sinks in streaming mode.

Azure blob storage as ROR's Dragonfly data store

With Engine Yard recently adding Azure IaaS to its offering, one of our clients has decided to migrate to Azure Blob Storage.
In the application we are using Dragonfly's predefined S3 data store settings, and settings for various other cloud services are available there as well.
Has anyone implemented a Dragonfly data store for Azure Blob Storage?
