This question already has answers here:
Heroku file upload problem
(5 answers)
Closed 8 years ago.
I've got a Ruby on Rails application deployed on Heroku, but file upload doesn't seem to be working at all.
I do have multipart => true set up on the forms.
This is definitely working on my localhost environment.
Is there something that I need to enable to get this working?
Heroku doesn't persist uploaded files, the last I checked. You'll need Amazon S3 or something similar to store the files. If you upload a file to Heroku, you can access it for the duration of the request via the Tempfile class, but it will not be saved.
https://devcenter.heroku.com/articles/paperclip-s3
https://devcenter.heroku.com/articles/s3
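As a rough sketch of what the linked articles walk through (the model name and bucket are placeholders; you need the paperclip and aws-sdk gems plus AWS credentials in your config vars), an S3-backed Paperclip attachment looks something like this:

```ruby
# app/models/photo.rb -- hypothetical model name
class Photo < ActiveRecord::Base
  # Store the attachment on S3 instead of the dyno's local disk
  has_attached_file :image,
    storage: :s3,
    s3_credentials: {
      bucket:            ENV["S3_BUCKET_NAME"],
      access_key_id:     ENV["AWS_ACCESS_KEY_ID"],
      secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"]
    },
    path: "/:class/:attachment/:id/:style/:filename"

  validates_attachment_content_type :image, content_type: /\Aimage\/.*\z/
end
```

The form with multipart => true stays exactly as it is; only the storage backend changes.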
This error trips up a lot of people when they start using Heroku, me included!
The reason that you can't store files locally on Heroku is that this just isn't how Heroku works. Heroku takes a copy of your git repository and bundles it up into a "slug" which then gets run on their servers. Anything outside your slug (i.e. that's not stored within your git repo) will be lost when the dyno (virtual UNIX instance) restarts.
You can see this by firing up a console with heroku run rails c and creating and saving a new file using Ruby's File object. The new file will save correctly, and you can do things like require it or read from it, but if you close and reopen the console window, the file will have disappeared.
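The console experiment above can be sketched in plain Ruby (the filename is arbitrary):

```ruby
# Inside `heroku run rails c` (or any Ruby process on the dyno),
# writing to the local filesystem works for the life of that dyno:
File.write("scratch.txt", "saved on this dyno")

File.exist?("scratch.txt")   # => true
File.read("scratch.txt")     # => "saved on this dyno"

# But `heroku run` boots a fresh one-off dyno each time, so after
# closing and reopening the console the file is simply gone:
# File.exist?("scratch.txt")  # => false in the new console
```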
So if you want to store files that are being uploaded through a form, you need to use an external storage service. I like Amazon S3 as it's very simple to integrate with Heroku using Paperclip, as the links in the other answer mention.
Related
I have an issue which is strangely not addressed anywhere.
I am using Paperclip to upload attachments to S3 in a Heroku app. Since the upload takes time, I started using delayed_paperclip. But the Sidekiq worker fails with an error message along the lines of 'unable to open the file or file not found'. This makes perfect sense, as the Heroku worker and web processes run on different dynos.
Is there any solution to this, other than having the web dyno upload the file to S3 itself, which defeats the whole purpose?
The bottom-line problem for me is that I am unable to share files in the tmp folder between the web and worker dynos.
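One workaround, sketched below with hypothetical helper names (this is not delayed_paperclip's own API), is to stop passing a tmp/ path to the worker at all and instead put something in the job payload that any dyno can reach: for small files, the raw bytes themselves; for anything bigger, an S3 key after a quick synchronous upload of the original.

```ruby
require "base64"

# Web dyno: instead of handing the worker a tmp/ path (which only
# exists on the web dyno), serialize the bytes into the job arguments.
# Only sensible for small files, since the payload travels through Redis.
def build_job_payload(filename, raw_bytes)
  { "filename" => filename, "data" => Base64.strict_encode64(raw_bytes) }
end

# Worker dyno: decode and process; no shared disk required.
def process_job_payload(payload)
  bytes = Base64.strict_decode64(payload["data"])
  # ... real work would go here (resize, push styles to S3, etc.)
  [payload["filename"], bytes.bytesize]
end

payload = build_job_payload("logo.png", "fake-image-bytes")
process_job_payload(payload)  # => ["logo.png", 16]
```

As far as I know, delayed_paperclip only delays style post-processing: with S3 storage the original is uploaded during the web request, so the worker can fetch it back from S3 rather than from tmp/, and the cross-dyno problem goes away.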
The upload itself happens before your Rails app ever sees the file, so there is nothing in the app to speed up there. Moving work to a background job helps when processing the file takes too long, but in your case the problem sounds more like a large file on a slow network.
I see everywhere that heroku deletes changes you made to the filesystem once you deploy or scale your application but in my case it seems files disappear immediately.
My rails application uploads files to public/uploads and then a delayed job tries to read those files, when it tries to read them they are not found.
If I do everything in the same thread it works, but when I try to use the delayed job or check the filesystem using heroku run bash the files are gone.
Why does this happen?
Heroku's filesystem is effectively ephemeral: you can write files, but they exist only on the dyno that wrote them and only for as long as it runs, so your delayed job, running on a different dyno, never sees them.
If you want a free storage option, I recommend Google Drive. You'll need to do some research on how to use it, since not long ago they changed their login policy: it's OAuth only now, no more username/password login.
I am using Heroku to try to deploy a personal Ruby on Rails project and everything was going great until today.
I am very new to Ruby on Rails and Heroku, so please bear that in mind. I am not sure what is causing my issue, and therefore not sure what code or information is best to supply, so please ask for whatever you think you need to help resolve the issue and I will provide it.
My Ruby on Rails app worked fine both locally and on Heroku until I followed the information here to try to serve static images from an Amazon S3 bucket. Note I only went as far as the static assets section.
This appeared to stop my Ruby on Rails application from recognising changes in my code. I would make a change to an HTML file in my editor, but the server kept serving the older version of the file; even restarting the server didn't fix it.
I have been searching the web for hours trying to figure out what has gone wrong.
I deleted everything under public/assets and ran the precompile command:
rake assets:precompile
And this seems to have improved things locally: when I edit an HTML file, the changes are reflected on localhost. However, when I push to Heroku and go to my application hosted there, it still shows the older HTML file, no matter how many changes I make and pushes I do.
The HTML files that are not updating are located here:
app/assets/templates
I'm not sure what I changed that caused the HTML files to stop updating on Heroku. What should I look at and try? What other information would be useful in tracking down the issue?
The answer marked as correct in this StackOverflow question worked for me - Updated CSS Stylesheet not loaded following deployment to Heroku? - It looks like I accidentally added precompiled assets to my git repo somewhere along my development, and that caused the issue.
I have a scheduled job running every 12 hours that unzips image files from an FTP server into my tmp folder. The problem is that, due to Heroku's ephemeral filesystem, whenever I deploy new code the tmp folder is cleared when the dynos restart and the files are no longer there. I'd like to be able to deploy code at will, without this concern.
I have thought of creating a second app that runs this task and connects to the same database, as per this SO answer. This way I can deploy code updates unrelated to this task to my production server, and can choose more selectively when to deploy to the second server.
Does anyone have experience with two apps running against the same database? Or is there a better way to solve my issue? I've read that Heroku may change database URLs at any time, so the second app could lose its connection. How common is this? Thanks!
I would create a folder under public, e.g. public/storage, and save the unzipped files there. I believe this is possible in an app on Heroku.
Check this out: https://devcenter.heroku.com/articles/s3
I've done two Rails apps involving image uploads with Paperclip. The first one, done a while ago, worked fine locally, but I had issues when I deployed to Heroku; I realized I needed to use AWS to enable image uploads.
I recently started another project and tried to incorporate similar functionality. This time, before enabling AWS with Paperclip, I wanted to test what would happen if I tried to upload an image. To my surprise, it worked without AWS! I'd like to understand why it didn't work the first time and why it does now. How does Paperclip work with Heroku and Rails behind the scenes?
It's likely the first application was hosted on the legacy Bamboo stack, which had a read-only file system.
On the newer Cedar stack, the file system is no longer read-only, so the upload will not fail outright. However, you should keep using AWS or another external storage service, because Heroku distributes your compiled application across several machines, and it's not guaranteed that the uploaded image will be visible from another request. They call it the ephemeral filesystem.
In other words, with the Bamboo stack it was easy to understand that you could not store files on the file system, because you got an immediate crash. With Cedar the upload succeeds, but the uploaded image may become unavailable at any time in the future.
The moral of the story is: you should keep using AWS or any other storage outside Heroku's file system.