I'm developing a web service using Spring Boot. I have two APIs. In the first API I allow the user to upload a file, which I save on my system (a Mac). Where should I store those files (is there a specific folder/directory?) so that I can return a URL that the client (iOS) can later use to access the file?
You can store all the files in a local directory and run a simple HTTP server locally, serving on any port.
The client can then access a file using a local URL (localhost:port/filename). Note that this whole setup has to be local.
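A minimal sketch of that local-directory approach, in Python for illustration (the same logic carries over to a Spring Boot upload controller; the directory layout, base URL, and function name here are hypothetical):

```python
import os
import uuid


def save_upload(data: bytes, original_name: str, upload_dir: str, base_url: str) -> str:
    """Save uploaded bytes under upload_dir with a collision-free name
    and return the URL the client can later fetch the file from."""
    os.makedirs(upload_dir, exist_ok=True)
    # Keep the extension, but replace the name with a UUID so two users
    # uploading "photo.jpg" never overwrite each other's files.
    ext = os.path.splitext(original_name)[1]
    stored_name = f"{uuid.uuid4().hex}{ext}"
    with open(os.path.join(upload_dir, stored_name), "wb") as f:
        f.write(data)
    return f"{base_url}/{stored_name}"
```

Whatever directory you choose, the only requirement is that the HTTP server serving the files uses the same directory as its document root, so the returned URL actually resolves.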
But the best way to do this is to store the files in an S3 bucket or another CDN (e.g. Uploadcare),
which can share each file via a simple URL.
So, I have set up my project and I'm planning to serve files from my own server. Currently there are two folders from which I want users to access files: a media folder and a static folder. I want to set up signed URLs, just like S3 provides, so that nobody can brute-force their way to all the files in either folder. I also want nginx to handle file uploads directly, rather than routing them through my application.
For the signed URLs, you need to generate them with your application. Then you have two options:
Serve the files with your application after checking that the URL is valid
Serve the files with nginx, but do the validation/authorization with your app. This is what you need:
http://nginx.org/en/docs/http/ngx_http_auth_request_module.html#auth_request
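For example, an nginx configuration that asks your app to authorize each download before serving the file straight from disk might look like this (the paths and the /auth endpoint are assumptions, not your actual setup; your app returns 200 to allow and 401/403 to deny):

```nginx
location /media/ {
    auth_request /auth;          # ask the app before serving anything
    alias /var/www/media/;       # files served directly from disk by nginx
}

location = /auth {
    internal;                    # not reachable from outside
    proxy_pass http://127.0.0.1:8000/auth;   # your app checks the signature
    proxy_pass_request_body off;
    proxy_set_header Content-Length "";
    proxy_set_header X-Original-URI $request_uri;
}
```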
For the upload part the situation is similar, you can either:
Handle uploads with your app
Use nginx-upload-module and handle the authorization only with your app
I would definitely recommend handling all uploads within your app; it is just simpler.
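The signed URLs themselves can be generated and checked with an HMAC over the path plus an expiry timestamp. A minimal sketch in Python (the secret and query-parameter names are made up for illustration; the same scheme works in any language):

```python
import hashlib
import hmac
import time

SECRET = b"change-me"  # hypothetical shared secret, kept only on the server


def sign_url(path: str, ttl_seconds: int = 300) -> str:
    """Return path with an expiry timestamp and an HMAC signature appended."""
    expires = int(time.time()) + ttl_seconds
    msg = f"{path}:{expires}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{path}?expires={expires}&sig={sig}"


def is_valid(path: str, expires: int, sig: str) -> bool:
    """Reject expired links and any signature that doesn't match."""
    if time.time() > expires:
        return False
    expected = hmac.new(SECRET, f"{path}:{expires}".encode(), hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(expected, sig)
```

Because the expiry is part of the signed message, tampering with either the path or the timestamp invalidates the signature, which is what makes brute-forcing file names pointless.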
I am currently using code based on this tutorial http://sweettutos.com/2015/11/06/networking-in-swift-how-to-download-a-file-with-nsurlsession/ to download a remote file using URLSession.downloadTask. This was surprisingly simple. However, I would now like to download the entire contents of a remote directory.
Can I use URLSessionDownloadTask, or is this only for single files? If not, how can I obtain a list of the files contained in the remote directory so that I can use downloadTask on each of them individually?
First of all, you are thinking about this the wrong way.
From a remote server, you can only download a file (not a folder) and save it inside the app. The file has to be made available with a concrete file extension on the server side; the client side can then use the Sweettutos tutorial.
The first thing to do:
Talk to the server-side developer and have them zip the remote directory (.zip or .rar) so you can download it as a single file.
Then, in your code, download the URL the server side gives you, save the archive in the documents directory, and extract and read the files you want.
From the URLSession documentation:
Download tasks retrieve data in the form of a file, and support
background downloads and uploads while the app is not running.
So there is no way to download a remote directory (it has no file extension) until the server side makes it available as a file with a known extension.
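The extract step after the download is mechanical; here it is sketched in Python for brevity (on iOS you would do the same with a Swift zip library once URLSessionDownloadTask hands you the temporary file; the directory names below are made up):

```python
import os
import zipfile


def extract_download(zip_path: str, dest_dir: str) -> list[str]:
    """Extract a downloaded archive into dest_dir and return the file names inside it."""
    os.makedirs(dest_dir, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest_dir)   # unpack the whole "remote directory"
        return zf.namelist()      # the list of files the server zipped up
```

The returned name list gives you exactly what the original question asked for: the contents of the remote directory, available after a single download.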
I want to use RoxyFileMan to manage uploaded images and files, but I need to save them on the server. As you know, RoxyFileMan uploads files and images into a folder named Uploads inside the fileman directory. We can change FILES_ROOT to another local path to change the directory files get uploaded to.
But I want to upload files to the server and then read them back from the server after they've been uploaded, so that they can be edited in CKEditor.
Can anyone please provide advice/guidance on how to achieve this?
Your question isn't entirely clear, but if I understand you correctly, you are running RoxyFileMan on a local server and want to upload files to a remote online server, right?
Roxy uploads files to a directory on the server it is run from. That means if it is run from localhost, it will use a directory on your localhost; if run from a server, it will use a directory on that server.
As far as I know, you cannot upload from localhost directly to an online server.
You could maybe achieve that with a custom script that opens an FTP connection, but then you would also have to rework all the code that loads images from there... which would be rather silly.
I'm trying to upload a file to my rails app by sending the user's local location of the file via a parameter, so the input URL would look like this:
http://rails-app.herokuapp.com/element?file=C:\temp\data.txt
It's easy if I'm working on my local machine, since I can just use File.read(filename); however, this doesn't work on Heroku. Thanks for any help!
First of all, Heroku has a read-only file system, so you can't upload anything directly to Heroku.
Use something like Amazon S3 to store files.
The second issue is your approach.
When the app runs locally, it has access to your C:\ drive.
But an app located on a remote server has no access to your computer's C:\ drive, so it can't fetch the file.
You should upload the file either through a browser file field or by passing an HTTP link that anyone can access.
I am trying to create a dashboard using CSV files, Highcharts.js, and HTML5. In a local development environment I can render the charts using CSVs both on my file system and hosted on the web. The current goal is to deploy the dashboard live on Heroku.
The CSVs will be updated manually - for now - once per day in a consistent format as required by Highcharts. The web application should be able to render the charts with these new, "standardized" CSVs whenever the dashboard page is requested. My question is: where do I host these CSVs? Do I use S3? Do I keep them on my local file system and manually push the updates to heroku daily? If the CSVs are hosted on another machine, is there a way for my application (and only my application) to access them securely?
Thanks!
Use the carrierwave_direct gem to upload the file directly from the client to an Amazon S3 bucket.
https://github.com/dwilkie/carrierwave_direct
You basically give the trusted, logged-in client a temporary key that allows uploading the file and nothing else, and the client then returns information about the uploaded file to your web app. Make sure the upload is set to private, to prevent third parties from brute-forcing their way to the CSV. You will then need a background worker to do the actual work on the CSV file. The gem has good docs on how to do this.
https://github.com/dwilkie/carrierwave_direct#processing-and-referencing-files-in-a-background-process
In short, in the background process you download the file temporarily to Heroku, parse it, extract the data you need, and then discard the Heroku copy (and, if you want, the copy on S3). This way you get around Heroku's lack of permanent file storage, and you avoid tying up dynos with direct uploads, because there is nothing like nginx handling file uploads on Heroku.
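The worker step described above, download to a temp file, parse, discard, could look roughly like this (sketched in Python rather than Ruby for brevity; the fetch callback and column names are assumptions, and in the real worker the callback would be an S3 GET):

```python
import csv
import os
import tempfile


def process_csv(fetch_bytes) -> list[dict]:
    """Download the CSV to a temp file, parse it, then delete the local copy."""
    fd, path = tempfile.mkstemp(suffix=".csv")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(fetch_bytes())          # e.g. an S3 download in the real worker
        with open(path, newline="") as f:
            rows = list(csv.DictReader(f))  # keep only the parsed data in memory
    finally:
        os.remove(path)                     # nothing persists on the dyno
    return rows
```

The key point is that only the parsed rows survive the job; the file itself never relies on the dyno's ephemeral disk beyond the lifetime of one worker run.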
Also make sure that the file size does not exceed the available memory of your worker dyno, otherwise it will crash. Since you don't seem to need to worry about concurrency, I would suggest https://github.com/resque/resque.