Rails: CarrierWave and Heroku without S3 - ruby-on-rails

I am using Rails, CarrierWave and Heroku, but right now I don't have an S3 account, so I used the configuration from:
How to: Make Carrierwave work on Heroku
It worked very well for files uploaded by the user, but it didn't work for files uploaded through seeds.
I am using this syntax
book.cover = File.open(File.join(Rails.root, 'photo.jpg'))
book.save!

Try doing this instead:
file = File.open(File.join(Rails.root, 'photo.jpg'))
book.cover = file
file.close
book.save!
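For context, a minimal db/seeds.rb sketch of the fixed pattern (the Book model, the cover uploader and photo.jpg come from the question; the title attribute is a hypothetical placeholder):

# db/seeds.rb — sketch, assuming a CarrierWave uploader mounted as `cover`
file = File.open(File.join(Rails.root, 'photo.jpg'))
book = Book.new(title: 'Sample book') # `title` is a hypothetical attribute
book.cover = file                     # CarrierWave caches the upload here
file.close                            # close the handle before saving
book.save!                            # saving stores the cached file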

Related

Rails - Resave All Models for S3 Migration

rails 6.1.3.2
aws-sdk-s3 gem
I currently have a rails app in production that uses ActiveStorage to attach image data to a wrapper Image model. It's currently using the local strategy to save images to disk and I am migrating it to S3. I am not using paperclip or anything similar.
I succeeded in setting it up. Currently it is set to use local storage as the primary service with S3 as a mirror, so that I can write to both places during the migration. However, the documentation says that it will only save new images to S3 upon create or update of a record. I would like to "re-save" all models in production to force the migration to happen. Does anyone know how to do this?
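For reference, the mirrored setup described above lives in config/storage.yml; a sketch (service names, bucket, and credential keys are assumptions, not taken from the question):

local:
  service: Disk
  root: <%= Rails.root.join("storage") %>

amazon:
  service: S3
  access_key_id: <%= ENV["AWS_ACCESS_KEY_ID"] %>
  secret_access_key: <%= ENV["AWS_SECRET_ACCESS_KEY"] %>
  region: us-east-1
  bucket: your-bucket-name

mirrored:
  service: Mirror
  primary: local
  mirrors: [ amazon ]

It is then selected per environment, e.g. in config/environments/production.rb:

config.active_storage.service = :mirrored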
Looks like it was already answered!
If you happen to be stuck with only access to the Rails Console like I was, this solution worked perfectly. If you copy-paste this code into the console, it will begin to produce output of the S3 uploads. After 5k of those, I was done. An immense thank you to Tayden for the solution.
all_services = [ActiveStorage::Blob.service.primary, *ActiveStorage::Blob.service.mirrors]

# Iterate through each blob
ActiveStorage::Blob.all.each do |blob|
  # Select services where the file exists
  services = all_services.select { |service| service.exist? blob.key }

  # Skip the blob if the file doesn't exist anywhere
  next unless services.present?

  # Select services where the file doesn't exist
  mirrors = all_services - services

  # Open the local file (if one exists)
  local_file = File.open(services.find { |service| service.is_a? ActiveStorage::Service::DiskService }.path_for(blob.key)) if services.any? { |service| service.is_a? ActiveStorage::Service::DiskService }

  # Upload the local file to the mirrors (if one exists)
  mirrors.each do |mirror|
    mirror.upload blob.key, local_file, checksum: blob.checksum
  end if local_file.present?

  # If no local file exists, download a remote file and upload it to the mirrors (thanks #Rystraum)
  services.first.open(blob.key, checksum: blob.checksum) do |temp_file|
    mirrors.each do |mirror|
      mirror.upload blob.key, temp_file, checksum: blob.checksum
    end
  end unless local_file.present?
end
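If, like the poster above, you only have access to a production console, heroku run rails console (a standard Heroku CLI command) gets you one to paste the script into. On a very large table, a small variation of the script (a sketch, not part of the original answer) batches the query instead of loading every blob at once:

# Sketch: identical loop body, but batched to keep memory usage flat
ActiveStorage::Blob.find_each do |blob|
  # ... same body as in the script above ...
end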

Rails 4, Fog, Amazon S3 - retrieving all the images as an array from a specific folder in a bucket

I am using Amazon S3, Rails 4, and the Fog gem. I have an Amazon bucket called uipstudy with 100 folders, each containing about 20 images. I use the following to get all the images in a specific folder (in my application_helper.rb, which is included in application_controller.rb):
def get_files(image_folder)
  connection = Fog::Storage.new(
    provider: 'AWS',
    aws_access_key_id: '######',
    aws_secret_access_key: '#######'
  )
  connection.directories.get('uipimages', prefix: image_folder).files.map do |file|
    file.key
  end
end
In my controller I have this (in this example I am looking in the folder "1" in the uipstudy bucket):
# Amazon solution:
@images = get_files('1')
@images.each do |image|
  image = "https://s3.amazonaws.com/uipstudy/#{image}"
  @image_array << image
end
The problem is that it's returning the files inside the folder labelled "1", but also those inside 10, 11, 12, 13, etc. I assumed that the prefix was an absolute match, but it appears not. Is there a way to enforce that the prefix matches exactly the folder specified?
I think you should be able to make a small change in your script to get the behavior you want. Simply append a forward slash to the prefix so that it clearly shows you want things that are like a directory instead of any/all things that begin with a particular character.
So, that would get you something like:
directory = connection.directories.get('uipimages', prefix: image_folder + '/')
directory.files.map do |file|
  file.key
end
(I just split it into two statements to make it easier to read.)
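Folding that fix back into the helper from the question gives something like this sketch (credentials moved into environment variables; those variable names are assumptions):

def get_files(image_folder)
  connection = Fog::Storage.new(
    provider: 'AWS',
    aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],        # assumed env var name
    aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'] # assumed env var name
  )
  # The trailing slash makes '1/' match only keys under folder 1,
  # not '10/...', '11/...', and so on.
  connection.directories.get('uipimages', prefix: "#{image_folder}/").files.map(&:key)
end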
Below is my solution using the aws-sdk gem.
# initialize S3 client
s3 = AWS::S3.new
bucket = s3.buckets[ENV['AWS_BUCKET']]

# regex for .ipa files in the _inbox folder
regex = %r{_inbox/(?:[^/]+/)*[^/]+\.ipa}i

# get and process the .ipa files
bucket.objects.select { |o| o.key.match(regex) }.each do |ipa|
  # ... processing body truncated in the original ...
end

Unknown Constants AssetsController::S3

I'm attempting to download files from my S3 file server via a Rails app that I've written. However, I'm having a difficult time figuring out how to accomplish this. I've been attempting to use this reference from Amazon's blog to get it working.
In the get method in my controller, I have the following:
asset = current_user.assets.find_by_id(params[:id])
File.open('filename', 'wb') do |file|
  reap = s3.get_object({ bucket: 'bucket-name', key: URI.encode(asset.uploaded_file.url) }, target: file)
end
However, I'm getting the following error:
uninitialized constant AssetsController::s3
I'm using the gem aws-sdk. Any suggestions would be much appreciated.
uninitialized constant AssetsController::s3
You need to define s3; the below should work:
asset = current_user.assets.find_by_id(params[:id])
File.open('filename', 'wb') do |file|
  s3 = Aws::S3::Client.new
  reap = s3.get_object({ bucket: 'bucket-name', key: URI.encode(asset.uploaded_file.url) }, target: file)
end
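As a small refinement (a sketch, not part of the original answer), the client could be built once per controller instead of inside every download; the region value and env var name here are assumptions:

# Sketch: memoize the S3 client so it isn't rebuilt on each request
def s3
  @s3 ||= Aws::S3::Client.new(region: ENV.fetch('AWS_REGION', 'us-east-1'))
end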

Refinerycms on Heroku not working with Amazon S3 bucket

I am trying to set up Amazon S3 support to store images in the cloud with Refinery CMS.
I created the bucket at https://console.aws.amazon.com/s3/
I named it like the app, 'bee-barcelona', and it says it is in region US Standard.
In ~/config/initializers/refinery/images.rb I entered all the data (where 'xxx' stands for the actual keys I entered):
# Configure S3 (you can also use ENV for this)
# The s3_backend setting by default defers to the core setting for this but can be set just for images.
config.s3_backend = Refinery::Core.s3_backend
config.s3_bucket_name = ENV['bee-barcelona']
config.s3_access_key_id = ENV['xxx']
config.s3_secret_access_key = ENV['xxx']
config.s3_region = ENV['xxx']
Then I applied the changes to Heroku with:
heroku config:add S3_KEY=xxx S3_SECRET=xxx S3_BUCKET=bee-barcelona S3_REGION=us-standard
But still, in the app I only get "Sorry, something went wrong" when I try to upload.
What did I miss?
What a sad error. I didn't think about that option till I went for a 10 km run…
I had the app set up to be "beekeeping".
My bucket on Amazon was named "bee-barcelona".
I did register the correct bucket in the app. Still, Refinery kept going to another person's bucket, named "beekeeping". With my secret key there was no way my files would end up there.
I created a new app and a new bucket, all with crazy names, BUT they are the same on Amazon S3 and Git!
Now it works like a charm.
What a very rare situation...
The way I did it was as follows:
Create a bucket in region US Standard!
Did you see that? US Standard, not Oregon, not anywhere else.
Add gems to Gemfile
gem "fog"
gem "unf"
gem "dragonfly-s3_data_store"
In config/application.rb
config.assets.initialize_on_precompile = true
In config/environments/production.rb
Refinery::Core.config.s3_backend = true
In config/environments/development.rb
Refinery::Core.config.s3_backend = false
Configure S3 for heroku (production) and local storage for development. In config/initializers/refinery/core.rb
if Rails.env.production?
  config.s3_backend = true
else
  config.s3_backend = false
end

config.s3_bucket_name = ENV['S3_BUCKET']
config.s3_region = ENV['S3_REGION']
config.s3_access_key_id = ENV['S3_ACCESS_KEY']
config.s3_secret_access_key = ENV['S3_SECRET_KEY']
Add variables to heroku:
heroku config:add S3_ACCESS_KEY=xxxxxx S3_SECRET_KEY=xxxxxx S3_BUCKET=bucket-name-here S3_REGION=us-east-1
I had a lot of issues because I previously had S3_REGION=us-standard. This is wrong. Set your US Standard bucket as shown:
S3_REGION=us-east-1
This worked flawlessly for me on Rails 4.2.1 and Refinery 3.0.0. Also, make sure you are using exactly the same names for the variables everywhere. Sometimes it says S3_KEY instead of S3_ACCESS_KEY, or S3_SECRET instead of S3_SECRET_KEY. Just make sure you have the same ones in your files and in your Heroku variables.
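As a quick sanity check that the initializer and the Heroku variables agree, heroku config:get (a standard Heroku CLI command) prints each value; the names below match the initializer above:

heroku config:get S3_BUCKET
heroku config:get S3_REGION
heroku config:get S3_ACCESS_KEY
heroku config:get S3_SECRET_KEY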

Heroku - how to write into "tmp" directory?

I need to use the tmp folder on Heroku (Cedar) for writing some temporary data. I am trying to do it this way:
open("#{Rails.root}/tmp/#{result['filename']}", 'wb') do |file|
file.write open(image_url).read
end
But this produces the error:
Errno::ENOENT: No such file or directory - /app/tmp/image-2.png
I am trying this code and it's running properly on localhost, but I cannot make it work on Heroku.
What is the proper way to save some files to the tmp directory on Heroku (Cedar stack)?
Thank you
EDIT:
I am running a method via Delayed Job that needs to have access to the tmp file.
EDIT2:
What I am doing:
files.each_with_index do |f, index|
  unless f.nil?
    result = JSON.parse(buffer)
    filename = "#{Time.now.to_i.to_s}_#{result['filename']}" # thumbnail name
    thumb_filename = "#{Rails.root}/tmp/#{filename}"
    image_url = f.file_url + "/convert?rotate=exif"
    open("#{Rails.root}/tmp/#{result['filename']}", 'wb') do |file|
      file.write open(image_url).read
    end
    img = Magick::Image.read(image_url).first
    target = Magick::Image.new(150, 150) do
      self.background_color = 'white'
    end
    img.resize_to_fit!(150, 150)
    target.composite(img, Magick::CenterGravity, Magick::CopyCompositeOp).write(thumb_filename)
    key = File.basename(filename)
    s3.buckets[bucket_name].objects[key].write(:file => thumb_filename)
    # save the path to the new thumbnail in the database
    f.update_attributes(:file_url_thumb => "https://s3-us-west-1.amazonaws.com/bucket/#{filename}")
  end
end
I have information about the images in the database; the images themselves are stored in an Amazon S3 bucket. I need to create thumbnails for these images, so I go through them one by one: load the image, save it temporarily, resize it, and then upload the thumbnail to the S3 bucket.
But this procedure doesn't seem to work on Heroku, so how could I do that (my app is running on Heroku)?
Is /tmp included in your git repo? Removed in your .slugignore? The directory may just not exist out on Heroku.
Try tossing in a quick mkdir before the write:
Dir.mkdir(File.join(Rails.root, 'tmp'))
Or even in an initializer or something...
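Note that Dir.mkdir raises Errno::EEXIST if the directory is already there; a sketch of an idempotent variant for an initializer (the file name is hypothetical):

# config/initializers/tmp_dir.rb — create tmp/ only if it is missing
require 'fileutils'
FileUtils.mkdir_p(Rails.root.join('tmp'))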
Here's an elegant way:
f = File.new("tmp/filename.txt", 'w')
f << "hi there"
f.close
Dir.entries(Dir.pwd + "/tmp") # see your newly created file in /tmp
Don't forget that whenever your app restarts (for any reason, including reasons outside your control), your files will be deleted, as the filesystem is only ephemeral storage.
Try it with heroku restart; you will see that the new file you created is no longer there.
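Given that ephemerality, a Tempfile suits the thumbnail flow from the question well, since the file only needs to exist while the job runs. A sketch (image_url as in the question; the resize and upload steps are elided):

require 'open-uri'
require 'tempfile'

# Sketch: write the downloaded image to a self-cleaning temp file
Tempfile.create(['image', '.png'], "#{Rails.root}/tmp") do |file|
  file.binmode
  file.write open(image_url).read
  file.flush
  # ... resize with RMagick and upload to S3 here ...
end # the temp file is deleted automatically when the block exits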
