Synchronize blob files from cloud to IoT Edge Blob (local) - azure-iot-edge

Please refer to the following hypothetical diagram for an IoT Edge device implementation. We want to know if there is an automated mechanism for this using the Azure IoT infrastructure.
An admin application will write several JSON configuration files associated with a specific device. Each device has a different config, and the config files are large (1 MB), so using twins is not a good solution.
We want those files, stored in the cloud, to be sent automatically to the target device, for it to store them in its local blob storage. The local files shall always reflect what is in the cloud, almost like OneDrive.
Is there any facility for this in Azure/Edge? How can we isolate the information for each device without exposing the other configurations stored in the cloud blob?

Upload the blob to Azure Storage (or anywhere, really), and set a properties.desired property containing the link plus a SAS token (or a hash of the contents, if you want to keep the URL always the same). Your edge module will get a callback (during startup and during runtime) that the property value has changed, and can connect to the cloud to download the configuration. There is no need to use the LocalBlobStorage module; the config can be cached in the edge module's /tmp directory.
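A minimal sketch of that flow, assuming the azure-iot-device Python SDK and a hypothetical desired property named configUrl that carries the blob URL with its SAS token:

```python
# Sketch only: "configUrl" and CONFIG_PATH are assumed names, not part
# of the original answer.
import threading
import urllib.request
from azure.iot.device import IoTHubModuleClient

CONFIG_PATH = "/tmp/device-config.json"  # local cache, per the answer

def apply_config(patch):
    url = patch.get("configUrl")  # hypothetical desired property name
    if url:
        urllib.request.urlretrieve(url, CONFIG_PATH)

client = IoTHubModuleClient.create_from_edge_environment()
client.connect()

# Re-download whenever the desired properties change at runtime...
client.on_twin_desired_properties_patch_received = apply_config
# ...and once at startup, using the full twin.
apply_config(client.get_twin()["desired"])

threading.Event().wait()  # keep the module running
```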

Related

Is all WSO2 API Manager's configuration saved in the database?

Say one implements a WSO2 API Manager Docker instance connecting to a separate database (like MySQL) which is not dockerized. Say some API configuration is made within the API Manager (like referencing a Swagger file in a GitHub repository).
Say someone rebuilds the WSO2 API Manager Docker image (to modify CSS files, for example); will the past configuration still be available from the separate database? Or does one have to reconfigure everything in the new Docker instance?
To put it another way, if one needs to reconfigure everything, is there an easy way to do it? Something automatic?
All the configurations are stored in the database. (Some are stored in the internal registry, but the registry persists its data in the database in the end.)
API artifacts (Synapse files) are saved in the file system [1]. You can use API Manager's API import/export tool to migrate API artifacts (and all other related files such as Swagger definitions, images, sequences, etc.) from one server to another.
[1] <APIM_HOME>/repository/deployment/server/synapse-configs/default/api/

How to get public URL of Azure Files file?

I am using Azure Files to store files for my Web Application, which I have previously mentioned here.
I am currently processing the files/sub-directories within a directory, and outputting a navigation table so the user can navigate into sub-directories, and in the end, obtain said files. I'm doing this by using the methods described in the 'Access the file share programmatically' section of this Azure Documentation article.
My question is very simple: how can I, from my Web App, which is running in Azure App Service, provide a public URL where the user can download/view the file?
Please note, I would prefer that the file is not automatically downloaded, since most of the files would be PDFs, and therefore previewable in the browser.
One possible solution would be to create a Shared Access Signature (SAS) on the file with at least Read permission and use that SAS URL. Depending on the file's content type, the file's contents will either be displayed inline in the browser or the user will be prompted to download the file. If you want to force the download, you can always override the Content-Disposition response header in the SAS.
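A sketch of generating such a read-only SAS URL with the azure-storage-file-share Python package (v12, which postdates this question); the account name, key, share, and file path below are placeholders:

```python
# Hedged sketch: all names and the key are placeholders.
from datetime import datetime, timedelta, timezone
from azure.storage.fileshare import generate_file_sas, FileSasPermissions

account_name = "mystorageaccount"   # placeholder
account_key = "<account-key>"       # placeholder
share_name = "myshare"
path = "docs/report.pdf"

sas = generate_file_sas(
    account_name=account_name,
    share_name=share_name,
    file_path=path.split("/"),          # path as a list of segments
    account_key=account_key,
    permission=FileSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
    # To force a download instead of an inline preview:
    # content_disposition="attachment; filename=report.pdf",
)

url = f"https://{account_name}.file.core.windows.net/{share_name}/{path}?{sas}"
print(url)
```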
Using a Shared Access Signature (SAS) could be a solution, but it is probably overkill in the given scenario.
In the provided scenario, Blob Storage with public access is the most practical way to store the files. From the documentation:
... You can specify that a container and its blobs, or a specific blob, are available for public access. When you indicate that a container or blob is public, anyone can read it anonymously; no authentication is required. Public containers and blobs are useful for exposing resources such as media and documents that are hosted on websites. To decrease network latency for a global audience, you can cache blob data used by websites with the Azure CDN.
https://azure.microsoft.com/en-us/documentation/articles/storage-introduction/
To set container permissions from the Azure Portal, follow these steps:
Navigate to the dashboard for your storage account.
Select the container name from the list. Clicking the name exposes the blobs in the chosen container.
Select Access policy from the toolbar.
In the Access type field, select "Blob".
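The same setting can be applied programmatically; here is a sketch with the azure-storage-blob (v12) Python SDK, using a placeholder connection string and container name:

```python
# Programmatic equivalent of the portal steps above (a sketch; the
# connection string and container name are placeholders).
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("public-docs")

# "blob" allows anonymous reads of individual blobs, but not
# anonymous listing of the container's contents.
container.set_container_access_policy(signed_identifiers={},
                                      public_access="blob")
```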

Access MP3 files on server from iOS

I'm building a streaming app similar to Pandora. However, right now I'm storing all my files over HTTP and accessing them with URLs. Is there an alternative to this, because all the files are in the public html folder? For example, how do apps like Pandora or Spotify pull files off their servers? I'm new to web servers and not sure where to ask this question. I have a CentOS server on VPS hosting with Apache, MySQL, HTTP, FTP.
You just need to provide the content as a bit stream rather than a file download. The source data to send as a stream can be stored as binary data in a BLOB column in a database or as a regular file in a non-public part of the file system. It really does not matter which one you use.
Storing them in the database gives your app somewhat easier access and makes the app more portable, since it is not restricted by file-system-level permissions.
The fact that you currently have the files in a public folder is not really that critical an issue, since you are making them available for download anyway. You would just need to add an authentication requirement if you want to restrict who can access them.
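A minimal sketch of the idea in Python with Flask (an assumption; the original stack is Apache on CentOS), streaming files that live outside the public web root to authenticated users only:

```python
# Sketch only: MEDIA_DIR, the route, and the session check are
# illustrative assumptions, not the asker's actual setup.
from flask import Flask, abort, send_file, session

app = Flask(__name__)
app.secret_key = "<secret>"   # placeholder

MEDIA_DIR = "/srv/media"      # non-public directory, outside the web root

@app.route("/stream/<name>")
def stream(name):
    if not session.get("user"):          # stand-in authentication check
        abort(401)
    if "/" in name or name.startswith("."):
        abort(400)                       # avoid path traversal
    # send_file with conditional=True honors HTTP Range requests,
    # so audio clients can seek within the track.
    return send_file(f"{MEDIA_DIR}/{name}", mimetype="audio/mpeg",
                     conditional=True)
```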

Storing and displaying image files in grails application in cloud foundry

I have a Grails application that allows users to upload image files, which can then be displayed. The application actually only stores the file path in the database and saves the image files in the file system.
I've read some posts which say Cloud Foundry doesn't support local file system access. So my question is: what modifications should I make if I want to deploy my application to Cloud Foundry? I hope images can still be displayed directly on the webpage, so users don't have to download them to their own computers just to view them.
The images stored on the file system can disappear when your application stops, crashes, or moves, so it should not be used for content that you want to persist. Further, file system storage is not scalable: if more than one instance of your app is running, the local storage is only visible to a specific instance of the app, and is not visible to or shared across all instances.
To meet your requirements, a local service such as MongoDB GridFS or MySQL with a blob data type, or an external blob store such as Box.net or Amazon S3, can be used.
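For the external-store option, a hedged Python sketch with boto3 (the app itself is Grails, so this only illustrates the pattern; the bucket name and helper are placeholders):

```python
# Sketch of the Amazon S3 approach; bucket name is a placeholder and
# save_image is a hypothetical helper, not part of the original answer.
import boto3

s3 = boto3.client("s3")
BUCKET = "my-app-images"  # placeholder

def save_image(file_obj, key):
    # Store the upload in S3 instead of the local file system...
    s3.upload_fileobj(file_obj, BUCKET, key)
    # ...and persist only the key (or resulting URL) in the database.
    return f"https://{BUCKET}.s3.amazonaws.com/{key}"
```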

Azure - uploading files to blob storage via shared hosting

I'm struggling to find an answer to this. I have a website that is deployed in a shared hosting environment. I want to allow people to upload files to my Azure blob storage account.
I have this working locally, using the storage emulator; however, when I publish the site I get a Security Exception.
Is this actually possible in a shared hosting environment?
Cheers
A bit more detail would help in understanding how these uploads are taking place. That said, I'll assume that people are uploading directly to blob storage, and not through your website (or web service).
To allow direct uploads, you need to provide either a public blob or container (which everyone in the world can see), or create a temporary Shared Access Signature (SAS) on a specific blob or container that grants access for a short time window.
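As an illustration of the SAS approach, a sketch with the azure-storage-blob (v12) Python SDK (which postdates this question); the account name, key, container, and blob name are placeholders:

```python
# Hedged sketch: issue a short-lived, write-only SAS so clients can
# upload directly to one blob. All names and the key are placeholders.
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

sas = generate_blob_sas(
    account_name="myaccount",          # placeholder
    container_name="uploads",
    blob_name="user123/photo.jpg",
    account_key="<account-key>",       # placeholder
    permission=BlobSasPermissions(create=True, write=True),
    expiry=datetime.now(timezone.utc) + timedelta(minutes=15),
)
upload_url = ("https://myaccount.blob.core.windows.net/"
              f"uploads/user123/photo.jpg?{sas}")
# The client PUTs the file body directly to upload_url.
```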
If your app is Silverlight, then you are probably running into a cross-domain issue (and you'll need to correct that with an access policy).
If you provide more details around the way uploads are being sent, as well as the client and server technology, I can edit my answer to be more specific.
