Using delayed_paperclip with Sidekiq on Heroku - ruby-on-rails

I have an issue that is strangely not addressed anywhere.
I am using Paperclip to upload attachments to S3 in a Heroku app, but since the upload takes time, I started using delayed_paperclip. The problem is that the Sidekiq worker fails with an error saying the file cannot be opened or was not found. This makes perfect sense, as the Heroku worker and web processes run on different dynos.
Is there any solution to this, other than the web dyno doing the upload to S3 itself, which defeats the whole purpose?
The bottom-line problem for me is that I am unable to share files in the tmp folder between the web and the worker.

The upload itself is not handled by your Rails app at all. Backgrounding gives you an improvement when processing the file takes too long; in your case, the problem sounds more like a large file on a slow network.
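That said, one arrangement consistent with dynos not sharing a filesystem is to let Paperclip store the original on S3 during the web request and use delayed_paperclip only for the style processing; the Sidekiq worker then re-downloads the original from S3 instead of looking in the web dyno's tmp. A minimal sketch, assuming Paperclip's S3 storage (the model, style, and env var names are hypothetical):

    # app/models/document.rb -- a sketch, not the asker's actual model
    class Document < ActiveRecord::Base
      has_attached_file :attachment,
        styles: { thumb: "100x100>" },        # hypothetical style
        storage: :s3,
        s3_credentials: {
          bucket: ENV["S3_BUCKET"],
          access_key_id: ENV["AWS_ACCESS_KEY_ID"],
          secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"]
        }
      validates_attachment_content_type :attachment, content_type: /\Aimage/

      # The original is uploaded to S3 in the web request; only the style
      # processing is queued, and the worker fetches the original from S3.
      process_in_background :attachment
    end

The web request still pays for the original upload, but the expensive processing moves to the worker, which is the part backgrounding can actually help with.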

Related

Heroku active storage S3 Seahorse::Client::NetworkingError (Net::OpenTimeout)

I am having a really weird issue. We are using the following combination:
Rails 7.0.0 master branch
Heroku
Active Storage
Bucketeer addon
Staging app
Production app
I have two environments, staging and production. On staging everything works fine; on production I keep running into Seahorse::Client::NetworkingError (Net::OpenTimeout). Heroku support was, unfortunately, less than helpful (it was worth a shot), so I am asking here.
If I use S3 directly from a Rails console, everything works fine. I can upload and download objects from my bucket, so I know for a fact that the environment variables are valid.
If, on the other hand, I try to upload a user avatar using Active Storage, I get this error message: Seahorse::Client::NetworkingError (Net::OpenTimeout), which, to me, indicates a complete failure to connect to S3.
I have experimented with different timeouts, both in Active Storage's storage.yml and in the global Amazon configuration, with no change in the result. The error also seems to be returned faster than the timeout should allow (an open timeout of 15 seconds should wait 15 seconds, but it does not).
Any pointers in the right direction greatly appreciated.
After deleting config/credentials/production.yml, everything works; presumably the per-environment credentials file was supplying AWS settings that overrode the (valid) environment variables tested in the console.
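The direct-from-console check described above can be reproduced roughly like this (a sketch using the aws-sdk-s3 gem; the env var names and object key are illustrative), and it is a useful way to separate credential problems from Active Storage configuration problems:

    # rails console -- exercises the raw credentials, bypassing Active Storage
    require "aws-sdk-s3"

    client = Aws::S3::Client.new(
      region: ENV.fetch("AWS_REGION"),
      access_key_id: ENV.fetch("AWS_ACCESS_KEY_ID"),
      secret_access_key: ENV.fetch("AWS_SECRET_ACCESS_KEY")
    )
    client.put_object(bucket: ENV.fetch("S3_BUCKET"), key: "connectivity-check", body: "ok")
    puts client.get_object(bucket: ENV.fetch("S3_BUCKET"), key: "connectivity-check").body.read

    # The global timeout experiment mentioned in the question looks like:
    Aws.config.update(http_open_timeout: 15, http_read_timeout: 30)

If this succeeds while Active Storage fails, the credentials the console sees and the ones Active Storage resolves (e.g. from config/credentials/production.yml) are not the same set.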

Are files uploaded to Heroku immediately deleted?

I see everywhere that Heroku deletes changes you make to the filesystem once you deploy or scale your application, but in my case files seem to disappear immediately.
My Rails application uploads files to public/uploads, and then a delayed job tries to read those files; when it does, they are not found.
If I do everything in the same thread it works, but when I use the delayed job or check the filesystem with heroku run bash, the files are gone.
Why does this happen?
Heroku's filesystem is ephemeral and per-dyno: the web process does write the files, but a delayed job runs in a separate worker dyno with its own copy of the filesystem, and heroku run bash starts yet another one-off dyno, so neither of them can see files written by the web dyno. That is why everything works within the same thread but not across processes.
If you want to use a free storage system, I recommend Google Drive. You'll need to do some research on how to use it, since not too long ago they changed their login policy to OAuth only; no more username/password login.
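A common workaround, consistent with the other answers on this page (a sketch; the job class, bucket, and key are hypothetical), is to put the upload somewhere shared before enqueueing, and hand the job a key instead of a local path:

    require "securerandom"
    require "tempfile"
    require "aws-sdk-s3"

    # Classic delayed_job custom job: a Struct with a perform method.
    class ProcessUploadJob < Struct.new(:key)
      def perform
        s3 = Aws::S3::Resource.new(region: ENV.fetch("AWS_REGION"))
        Tempfile.create("upload") do |tmp|
          # The worker dyno downloads its own copy rather than relying on
          # a file the web dyno wrote to its separate filesystem.
          s3.bucket(ENV.fetch("S3_BUCKET")).object(key).get(response_target: tmp.path)
          # ... process tmp.path here ...
        end
      end
    end

    # In the controller: push to S3 first, then enqueue only the key.
    key = "uploads/#{SecureRandom.uuid}"
    Aws::S3::Resource.new(region: ENV.fetch("AWS_REGION"))
      .bucket(ENV.fetch("S3_BUCKET")).object(key)
      .upload_file(params[:file].tempfile.path)
    Delayed::Job.enqueue(ProcessUploadJob.new(key))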

How to deploy to Heroku without losing tmp files?

I have a scheduled job running every 12 hours that unzips image files from an FTP server into my tmp folder. The problem is that, due to Heroku's ephemeral filesystem, whenever I deploy new code the tmp folder is cleared when the dynos restart, and the files are no longer there. I'd like to be able to deploy code at will, without this concern.
I have thought of creating a second app that runs this task and connects to the same database, as per this SO answer. This way I can deploy code updates unrelated to this task to my production server, and can choose more selectively when to deploy to the second server.
Does anyone have any experience with having two apps running on the same database? Or is there a better way to solve my issue? I've read that Heroku may change database URLs at any time, so the second app may lose its connection. How common is this? Thanks!
I would create a folder under public, e.g. public/storage, and save the unzipped files there.
I believe that is possible in an app on Heroku, though note that public sits on the same ephemeral filesystem, so it is cleared on restart just like tmp.
For files that must survive a deploy, check this out: https://devcenter.heroku.com/articles/s3
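For the scheduled task itself, one way to stop caring about tmp (or a second app) entirely is to unzip straight through to S3. A sketch assuming the rubyzip and aws-sdk-s3 gems; the FTP host, remote path, and bucket names are hypothetical:

    require "net/ftp"
    require "zip"            # rubyzip gem
    require "aws-sdk-s3"
    require "tempfile"

    # Runs every 12 hours: fetch the archive, then push each image to S3
    # so nothing depends on tmp surviving a deploy.
    Tempfile.create(["images", ".zip"]) do |archive|
      Net::FTP.open(ENV.fetch("FTP_HOST"), ENV.fetch("FTP_USER"), ENV.fetch("FTP_PASSWORD")) do |ftp|
        ftp.getbinaryfile("images.zip", archive.path)  # hypothetical remote path
      end

      bucket = Aws::S3::Resource.new(region: ENV.fetch("AWS_REGION"))
                                .bucket(ENV.fetch("S3_BUCKET"))
      Zip::File.open(archive.path) do |zip|
        zip.each do |entry|
          next if entry.directory?
          bucket.object("images/#{entry.name}").put(body: entry.get_input_stream.read)
        end
      end
    end

Because every artifact lands in S3 immediately, deploys and dyno restarts can happen at any point without losing work.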

How to use paperclip with rails and how does it work in deployment?

I've done two Rails apps involving image uploads with Paperclip. The first one, done a while ago, worked fine locally, but I had issues when I deployed to Heroku; I realized I needed to use AWS to enable image uploads.
I recently started another project and tried to incorporate similar functionality. Before enabling AWS with Paperclip, I wanted to test what would happen if I tried to upload an image after deploying. To my surprise, it worked without AWS! I was hoping to understand why it didn't work the first time and why it does work now. How does Paperclip work with Heroku and Rails behind the scenes?
It's likely the first application was hosted on the legacy Bamboo stack, which had a read-only filesystem.
On the newer Cedar stack the filesystem is no longer read-only, so the upload will not fail. However, you should keep using AWS or another external storage service, because Heroku distributes your compiled application across several machines, and it's not guaranteed that the image will be visible from another request. They call it the ephemeral filesystem.
In other words, with the Bamboo stack it was easy to understand that you could not store files on the filesystem, because you got an immediate crash. With Cedar the upload succeeds, but the uploaded image may become unavailable at any time in the future.
The moral of the story: keep using AWS or any other storage outside Heroku's filesystem.
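In practice, "keep using AWS" with Paperclip means switching the storage backend, for example app-wide via Paperclip's documented defaults hook (a sketch; the env var names are illustrative):

    # config/environments/production.rb
    Rails.application.configure do
      config.paperclip_defaults = {
        storage: :s3,
        s3_credentials: {
          bucket: ENV["S3_BUCKET"],
          access_key_id: ENV["AWS_ACCESS_KEY_ID"],
          secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"]
        },
        s3_region: ENV["AWS_REGION"]
      }
    end

Locally, the default filesystem storage keeps working untouched, which is exactly why the difference only shows up after deploying.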

Rails - creating "huge" XML file on Heroku

Once a month I need to create an almost 40 MB XML file in my Rails app and save it; as storage I use Amazon S3. When I run this task on my localhost (WEBrick server), it takes about 5 minutes, the file is saved in the Amazon bucket, and everyone is happy.
But when I run this task on Heroku, the app stops responding for about 45 minutes and the file is never saved to Amazon.
I know Heroku allows a request to run for just 30 seconds; after that an error message is displayed, but the task keeps running in the background. While this operation runs, the app is "idle".
But how is it possible that the file is never created and saved? Is there some limit on Heroku for file transfers or something like that?
I spent the whole afternoon searching for the problem, so far without success.
Thanks in advance
I'm pretty sure the process is killed after 30 seconds, not left running in the background as you suggest. Use delayed_job to get backgrounding, as detailed here: https://devcenter.heroku.com/articles/delayed-job. If you have a single method that creates and transfers your file, you can just call delayed_job's delay method on it. Note that a delayed_job worker spins up its own little environment. So in my case, I write the file to tmp and then transfer it to S3, and all of that has to happen inside the same job, because once the job finishes, that tmp directory evaporates.
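A sketch of that shape of job, assuming delayed_job and the aws-sdk-s3 gem (the XML builder and bucket name are hypothetical); the write and the upload live in one method precisely so they run inside the same job on the same dyno:

    require "aws-sdk-s3"

    class MonthlyExport
      # Build the file and ship it in one step: tmp on the worker dyno is
      # only trusted for the lifetime of this single job.
      def generate_and_upload
        path = Rails.root.join("tmp", "export-#{Date.today}.xml")
        File.write(path, build_xml)   # build_xml is a hypothetical XML builder
        Aws::S3::Resource.new(region: ENV.fetch("AWS_REGION"))
          .bucket(ENV.fetch("S3_BUCKET"))
          .object(File.basename(path))
          .upload_file(path.to_s)
      ensure
        File.delete(path) if File.exist?(path)
      end
    end

    # delayed_job's delay proxy pushes the whole call into the background:
    MonthlyExport.new.delay.generate_and_upload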
