I have a Grails application which allows users to upload image files so they can be displayed. The application actually only stores the file paths in the database and saves the image files in the file system.
I've read some posts saying that Cloud Foundry doesn't support local file system access. So my question is: what modifications should I make if I want to deploy my application to Cloud Foundry? I'd like the images to still be displayed directly on the web page, so users don't have to download them to their own computers just to view them.
Images stored on the local file system can disappear when your application stops, crashes, or moves, so the file system should not be used for content that you want to persist. Further, file system storage is not scalable: if more than one instance of your app is running, the local storage is only visible to that specific instance and is not visible to or shared across the other instances.
To meet your requirements, you can use a data service such as MongoDB GridFS or MySQL with a BLOB data type, or an external blob store such as Box.net or Amazon S3.
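For illustration, here is a minimal sketch of the MySQL BLOB approach, written in plain Java/JDBC rather than Grails/GORM and with a made-up images table: the image bytes go into the database instead of onto the local disk, so they survive restarts and are visible to every instance of the app. In Grails you would more naturally model this as a domain class with a byte[] property and stream the bytes back from a controller so the images still render directly in the page.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Hypothetical sketch: keep image bytes in a MySQL BLOB column ("images.data")
// instead of writing them to the local file system.
public class ImageBlobStore {

    private final String jdbcUrl, user, password;

    public ImageBlobStore(String jdbcUrl, String user, String password) {
        this.jdbcUrl = jdbcUrl;
        this.user = user;
        this.password = password;
    }

    public void save(long imageId, Path imageFile) throws Exception {
        try (Connection conn = DriverManager.getConnection(jdbcUrl, user, password);
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO images (id, data) VALUES (?, ?)")) {
            ps.setLong(1, imageId);
            ps.setBytes(2, Files.readAllBytes(imageFile));
            ps.executeUpdate();
        }
    }

    public byte[] load(long imageId) throws Exception {
        try (Connection conn = DriverManager.getConnection(jdbcUrl, user, password);
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT data FROM images WHERE id = ?")) {
            ps.setLong(1, imageId);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getBytes("data") : null;
            }
        }
    }
}
```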
Please refer to the following hypothetical diagram for an IoT Edge device implementation. We want to know whether the Azure IoT infrastructure provides an automated mechanism for this.
An admin application will write several JSON configuration files associated with a specific device. Each device has a different config, and the config files are large (around 1 MB), so using twins is not a good solution.
We want those files, stored in the cloud, to be sent automatically to the target device, which stores them in its local blob storage. The local files should always reflect what is in the cloud, almost like OneDrive.
Is there any facility for this in Azure/Edge? How can we isolate the information for each device without exposing the other configurations stored in the cloud blob?
Upload the BLOB to Azure Storage (or anywhere, really), and set a properties.desired property containing the link plus a SAS token (and, if you want to keep the URL the same, a hash of the contents). Your edge module will get a callback (during startup, and at runtime) that the property value has changed, and it can connect to the cloud to download the configuration. There is no need to use the LocalBlobStorage module; the config can be cached in the edge module's /tmp directory.
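As a rough device-side sketch of that flow (Java; the twin-callback registration itself is omitted because it depends on which IoT Edge module SDK you use), assume the desired property hands you a configUrl (the blob URL including its SAS token) and some configVersion marker such as a content hash:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

// Hypothetical handler, invoked whenever the module twin reports a changed desired
// property. configUrl and configVersion are assumed to come from that property.
public class ConfigSync {

    private static final Path CACHE_DIR = Paths.get("/tmp/device-config");

    public static void onDesiredPropertyChanged(String configUrl, String configVersion) throws Exception {
        Files.createDirectories(CACHE_DIR);
        Path target = CACHE_DIR.resolve("config-" + configVersion + ".json");
        if (Files.exists(target)) {
            return; // already in sync with the cloud copy
        }

        // Download the blob directly over HTTPS; the SAS token embedded in the URL
        // grants read access to this one blob only, so other devices' configs stay hidden.
        HttpClient http = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(configUrl)).GET().build();
        HttpResponse<Path> response = http.send(request,
                HttpResponse.BodyHandlers.ofFile(Files.createTempFile("config", ".json")));

        if (response.statusCode() == 200) {
            Files.move(response.body(), target, StandardCopyOption.REPLACE_EXISTING);
        }
    }
}
```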
I've built an app where users can upload their avatars. I used the Paperclip gem and everything works fine on my local machine. On Heroku everything works fine until the server restarts: then every uploaded image disappears. Is it possible to keep them on the server?
Note: I probably should use a service such as Amazon S3 or Google Cloud. However, each of those services requires credit card or bank account information, even for the free tier. This is a small app just for my portfolio and I would rather avoid sending that information.
No, this isn't possible. Heroku's filesystem is ephemeral and there is no way to make it persistent. You will lose your uploads every time your dyno restarts.
You must use an off-site file storage service like Amazon S3 if you want to store files long-term.
(Technically you could store your images directly in your database, e.g. as a bytea in Postgres, but I strongly advise against that. It's not very efficient and then you have to worry about how to provide the saved files to the browser. Go with S3 or something similar.)
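In a Rails app the usual route is simply to switch Paperclip's storage to S3 (or use Active Storage with an S3 service). Purely to illustrate the underlying "store it off-site" idea, here is a hypothetical sketch using the AWS SDK for Java v2, with made-up bucket, key, and file names:

```java
import java.nio.file.Paths;

import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

// Hypothetical sketch: push an uploaded avatar to S3 instead of the dyno's
// ephemeral file system. Bucket, key, and region are placeholders.
public class AvatarUploader {

    public static void main(String[] args) {
        try (S3Client s3 = S3Client.builder().region(Region.EU_WEST_1).build()) {
            PutObjectRequest request = PutObjectRequest.builder()
                    .bucket("my-portfolio-avatars")
                    .key("avatars/user-42.png")
                    .contentType("image/png")
                    .build();
            // Credentials come from the default provider chain (env vars, profile, etc.).
            s3.putObject(request, RequestBody.fromFile(Paths.get("/tmp/user-42.png")));
        }
    }
}
```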
I have a Rails application running on Heroku. As the Heroku file system is read-only, we cannot store any files or images on Heroku itself.
A lot of people have suggested using Amazon S3, but can I use my own external storage to save user files and images, and retrieve them from there, with Paperclip or CarrierWave or anything similar?
Currently I am using Dropbox for image and file storage, but it's too slow. I have a shared hosting account with a lot of disk space, and I want to use that to store files.
Any idea on how to do that?
I'm developing a web application using Grails.
I'm wondering where I should store the uploaded files (pictures, PDFs, ...): on the application server, on a remote FTP server, or somewhere else?
It depends on:
- what you want to do with these files later
- whether you want to allow users to download them
- how many clients you have and how much traffic those files generate
- how many files you expect to have
The easiest way is usually the best, so you can start by simply storing files in the local file system.
Good practice is to create a dedicated class/service for file storage; if in the future you want to store files elsewhere, you only have to change the implementation in one place (see the sketch below).
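A hypothetical sketch of that idea (plain Java here; in a Grails app it would naturally be a service class), with a small interface and a trivial local-filesystem implementation that can later be swapped for S3, a database BLOB, and so on:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// The rest of the application only talks to FileStorage, so moving the files
// elsewhere later means writing one new implementation, not touching the callers.
interface FileStorage {
    void store(String key, byte[] content) throws IOException;
    byte[] retrieve(String key) throws IOException;
}

class LocalFileStorage implements FileStorage {

    private final Path baseDir;

    LocalFileStorage(String baseDir) throws IOException {
        this.baseDir = Files.createDirectories(Paths.get(baseDir));
    }

    @Override
    public void store(String key, byte[] content) throws IOException {
        Files.write(baseDir.resolve(key), content);
    }

    @Override
    public byte[] retrieve(String key) throws IOException {
        return Files.readAllBytes(baseDir.resolve(key));
    }
}
```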
I'm struggling to find an answer to this. I have a website that is deployed in a shared hosting environment. I want to allow people to upload files to my Azure Blob Storage account.
I have this working locally, using the storage emulator, however when I publish the site I get a Security Exception.
Is this actually possible in a shared hosting environment?
Cheers
A bit more detail would help in understanding how these uploads are taking place. That said, I'll make the assumption that people are uploading directly to Blob Storage, and not through your website (or web service).
To allow direct uploads, you need to provide either a public blob or container (which everyone in the world can see), or create a temporary Shared Access Signature (SAS) on a specific blob or container, that grants access for a short time window.
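As a hypothetical sketch of the SAS route (shown with the current azure-storage-blob Java client rather than whatever server stack your site uses; the connection string, container, and blob names are placeholders), the server issues a short-lived, write-only SAS URL and hands it to the client, which then uploads directly to Blob Storage:

```java
import java.time.OffsetDateTime;

import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobServiceClientBuilder;
import com.azure.storage.blob.sas.BlobSasPermission;
import com.azure.storage.blob.sas.BlobServiceSasSignatureValues;

// Hypothetical sketch: generate a SAS URL that lets a client write one specific blob
// for the next 15 minutes, without exposing the storage account key to the browser.
public class SasIssuer {

    public static String createUploadUrl(String connectionString, String blobName) {
        BlobContainerClient container = new BlobServiceClientBuilder()
                .connectionString(connectionString)
                .buildClient()
                .getBlobContainerClient("uploads");
        BlobClient blob = container.getBlobClient(blobName);

        BlobSasPermission permission = new BlobSasPermission()
                .setCreatePermission(true)
                .setWritePermission(true);
        BlobServiceSasSignatureValues sas = new BlobServiceSasSignatureValues(
                OffsetDateTime.now().plusMinutes(15), permission);

        return blob.getBlobUrl() + "?" + blob.generateSas(sas);
    }
}
```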
If your app is Silverlight, then you are probably running into a cross-domain issue (and you'll need to correct that with an access policy).
If you provide more details around the way uploads are being sent, as well as the client and server technology, I can edit my answer to be more specific.