Rails AWS S3 delete file

I upload to Amazon S3 with CarrierWave, and that works fine. Now I want to add a delete function, so I tried this:
AWS::S3::S3Object.delete(@vid.video, 'bucket')
I got this error:
uninitialized constant MoviesController::AWS
The reason is clear, but how and where do I set up the AWS constant correctly?
config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => '----',
    :aws_secret_access_key => '----',
    :region                => 'eu-central-1',
  }
  config.fog_use_ssl_for_aws = false
  config.fog_directory = 'bucket'
  config.storage = :fog
end

You must first configure the AWS SDK gem (e.g. aws-sdk-s3, which provides Aws::S3::Client). Add this code to config/initializers/aws.rb:
Aws.config.update({
  region: '<default-region>',
  credentials: Aws::Credentials.new('<access-key-id>', '<secret-access-key>')
})
You can also set the environment variables AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_REGION on your server and the SDK will pick them up automatically.
Then, anywhere in your app or a controller action, you can call the S3 API like this:
def some_action
  # You can simply call Aws::S3::Client.new if you are already
  # configuring via the above methods, or configure by passing
  # parameters explicitly.
  s3_client = Aws::S3::Client.new(
    credentials: Aws::Credentials.new('<aws_access_key_id>', '<aws_secret_key>'),
    region: '<aws_region>'
  )
  # Delete the object by passing the bucket and object key.
  s3_response = s3_client.delete_object({
    bucket: '<bucket-name>', # required
    key: '<object-key>',     # required
  })
end
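Since the file was uploaded through CarrierWave, you can also let the uploader delete it rather than calling S3 directly, which keeps the bucket name and credentials in one place (the CarrierWave initializer). A minimal sketch, assuming the model mounts the uploader as mount_uploader :video, VideoUploader (the mount name is inferred from the question's @vid.video):
def destroy_video
  @vid = Video.find(params[:id])
  @vid.remove_video!  # CarrierWave deletes the stored file via fog
  @vid.save           # persists the cleared column
end
Destroying the record itself (@vid.destroy) also removes the stored file automatically via CarrierWave's callbacks.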

Related

Fog/Carrierwave config for Rails app on AWS Elastic Beanstalk

I'm trying to set up CarrierWave and Fog to handle image and file uploads on a Rails app that I have hosted on AWS Elastic Beanstalk.
I'm a little confused about how to properly set up the Fog config.
I tried using my AWS access and secret keys (commented out in the example below). That threw an error in my EB CLI (ERROR: NotAuthorizedError - Operation Denied. The security token included in the request is invalid.)
I'm trying to use IAM instead of having my access/secret keys in my Ruby code. Can anyone tell me how to set this up properly?
Here's my config file:
CarrierWave.configure do |config|
  # Use local storage if in development or test
  if Rails.env.development? || Rails.env.test?
    CarrierWave.configure do |config|
      config.storage = :file
    end
  end
  # Use AWS storage if in production
  if Rails.env.production?
    CarrierWave.configure do |config|
      config.storage = :fog
    end
  end
  config.fog_credentials = {
    :provider => 'AWS', # required
    # :aws_access_key_id     => 'My Access', # required
    # :aws_secret_access_key => 'My Secret', # required
    :use_iam_profile => true,
    :region => 'eu-west-2' # optional, defaults to 'us-east-1'
  }
  config.fog_directory = 'elasticbeanstalk-us-west-2-XXXXXXXXXX' # required
  # config.fog_host = 'https://assets.example.com' # optional, defaults to nil
  config.fog_public = false # optional, defaults to true
  config.fog_attributes = { 'Cache-Control' => 'max-age=315576000' } # optional, defaults to {}
end
This is a setup that works for me:
config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_provider = 'fog/aws' # required
  config.fog_credentials = {
    provider: 'AWS',                                     # required
    aws_access_key_id: ENV['aws_access_key_id'],         # required
    aws_secret_access_key: ENV['aws_secret_access_key'], # required
    # region: 'Singapore', # optional, defaults to 'us-east-1'
    # host: 's3.example.com', # optional, defaults to nil
    # endpoint: 'olucube-images.s3-website-ap-southeast-1.amazonaws.com', # optional, defaults to nil
  }
  config.fog_directory = ENV['fog_directory'] # required
  # config.fog_public = false # optional, defaults to true
  # config.fog_attributes = { 'Cache-Control' => "max-age=#{365.day.to_i}" } # optional, defaults to {}
end
and I used the figaro gem to hold my credentials as follows:
config/application.yml
aws_access_key_id: 'XXXXXXXXXXXXXXXXXXXX'
aws_secret_access_key: 'XXXXXXXXXXXXXXXXXX'
fog_directory: 'myAppName'
This was a bit of a wild ride. I had a hard time figuring out the Figaro gem. It's probably simple, but I didn't really understand it. So as a test, I put my keys directly in the code. It still didn't work.
I pushed my code to GitHub (publicly) and didn't think much of it. I was going to change the keys just in case. Before I was able to do that, someone found my code on GitHub and gained access to my AWS account. They started a bunch of EC2 instances and racked up $3000 worth of usage in a few hours!
My AWS account got suspended and I'm still dealing with having the charges reversed.
Anyway, I found out that you can actually set environment variables in the Elastic Beanstalk web interface, under Configuration → Software Configuration. So I did that instead of using Figaro (much safer, IMO). Now it works great. I simplified my CarrierWave config file to use only AWS, reading the environment variables from EB. Here's the file:
# config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_provider = 'fog/aws'
  config.fog_credentials = {
    provider: 'AWS',
    aws_access_key_id: ENV['S3_KEY'],
    aws_secret_access_key: ENV['S3_SECRET'],
    region: ENV['S3_REGION']
  }
  config.fog_directory = ENV['S3_BUCKET']
  config.fog_public = false
  config.storage = :fog
end
I changed my uploader files to use fog too. Here's an example:
# app/uploaders/image_uploader.rb
class ImageUploader < CarrierWave::Uploader::Base
  storage :fog

  # Store files under uploads/<model>/<mounted column>/<id>
  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end

  # Restrict uploads to common image formats
  # (newer CarrierWave versions rename this to extension_allowlist)
  def extension_white_list
    %w(jpg jpeg gif png)
  end
end
Everything works great now. I hope this helps someone else.
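As an aside, the same variables can also be set from the command line with the EB CLI instead of the web console; a short sketch using the names from the config above (values are placeholders):
eb setenv S3_KEY=AKIA... S3_SECRET=... S3_REGION=us-west-2 S3_BUCKET=elasticbeanstalk-us-west-2-XXXXXXXXXX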

Carrierwave and S3 not working

I have a problem with authentication. I try to upload, but I get this:
The request signature we calculated does not match the signature you provided. Check your key and signing method.
initializer carrierwave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    provider: 'AWS',                                     # required
    aws_access_key_id: ENV["AWS_ACCESS_KEY_ID"],         # required
    aws_secret_access_key: ENV["AWS_ACCESS_SECRET_KEY"], # required
  }
  config.fog_directory = ENV["AWS_BUCKET_NAME"] # required
  config.fog_attributes = { 'Cache-Control' => "max-age=#{365.day.to_i}" } # optional, defaults to {}
end
gem "carrierwave"
gem "fog"
gem "figaro"
I set the environment variables and also created a bucket in S3.
Is there anything I am missing here? It works locally, though.
I also changed image_uploader.rb to storage :fog.
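One thing worth checking (an assumption on my part, since the config above doesn't set one): fog signs requests against a region, and a mismatch between the signing region and the bucket's actual region can produce exactly this SignatureDoesNotMatch error. Adding the bucket's region explicitly may help:
config.fog_credentials = {
  provider: 'AWS',
  aws_access_key_id: ENV["AWS_ACCESS_KEY_ID"],
  aws_secret_access_key: ENV["AWS_ACCESS_SECRET_KEY"],
  region: 'eu-west-1' # hypothetical; use the region the bucket was created in
}
It is also worth confirming that AWS_ACCESS_SECRET_KEY holds the exact secret with no stray whitespace, since a corrupted value yields the same signature error.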

How can I get my CarrierWave model to return the full URL to me in a test?

Trying to test a CarrierWave model is really difficult. I configured my test environment like this:
if Rails.env.test?
  CarrierWave.configure do |config|
    config.storage = :file
    config.enable_processing = false
    config.fog_directory = BUCKET # required
    config.fog_public = false     # optional, defaults to true
    config.fog_credentials = {
      :provider   => 'Local',                # required
      :local_root => LOCAL_ROOT,             # required
      :endpoint   => "http://localhost:3000" # required
    }
  end
else
  CarrierWave.configure do |config|
    config.storage = :fog
    config.max_file_size = 1.gigabytes # defaults to 5.megabytes
    config.fog_directory = BUCKET # required
    config.fog_public = false     # optional, defaults to true
    config.fog_credentials = {
      :provider              => PROVIDER,         # required
      :aws_access_key_id     => access_key_id,    # required
      :aws_secret_access_key => secret_access_key # required
    }
  end
end
and it works great for testing uploads. It makes testing downloads difficult, though.
Here is a simple test:
require "test_helper"
class UploadTests < ActiveSupport::TestCase
let(:user) { User.me }
let(:repo) { Repository.first }
let(:sub) { user.subscriptions.where(repository_id: repo).first}
it "uploads a CSV file and lets me read it" do
filename = Rails.root.join("test/testfiles/product_upload_test.csv").to_s
upload = sub.uploads.new
File.open(filename) do |f|
upload.text_file_name = f
end
upload.save!
end
end
All very simple. But what I want to do is read the file back from the model; in other words, call some CarrierWave API that lets me grab the file and read it.
In production, I store everything on S3. In test, everything is in a local file, and I set local_root to my app's public directory.
CarrierWave only reports the url as the path, but doesn't include the local_root. I don't feel it's my job to manually construct the path to a file that I want to read from CarrierWave. How and where the file is stored should be hidden from my test; I shouldn't have to construct that path.
But I don't know what else to do. All I want to do is read that file.
I changed my test configuration to the following:
CarrierWave.configure do |config|
  config.storage = :file
  config.enable_processing = false
  config.fog_directory = BUCKET # required
  config.fog_public = true      # optional, defaults to true
  config.fog_credentials = {
    :provider   => 'Local',    # required
    :local_root => LOCAL_ROOT, # required
    :endpoint   => LOCAL_ROOT  # required
  }
end
and now the full url is returned.
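If you only need the file's contents rather than a URL, the uploader itself can read them back, and this works the same against :file and :fog storage. A minimal sketch, assuming the uploader is mounted as text_file_name (inferred from the assignment in the test above):
require "csv"

contents = upload.text_file_name.read     # the uploader proxies read to the stored file
rows = CSV.parse(contents, headers: true) # hypothetical: treat the contents as the uploaded CSV
This keeps the test independent of local_root or any storage paths.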

Issue with CarrierWave and AWS S3

I have deployed my app on Ninefold, but it crashes when I try uploading pictures. The logs suggest that I'm missing my AWS credentials:
ArgumentError (Missing required arguments: aws_access_key_id, aws_secret_access_key)
But I'm fairly sure I've set them up correctly using Fog and my .env file. My CarrierWave initializer looks like this at the moment:
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',            # required
    :aws_access_key_id     => ENV['S3_KEY'],    # required
    :aws_secret_access_key => ENV['S3_SECRET'], # required
    :region                => 'us-east-1'       # optional, defaults to 'us-east-1'
  }
  config.fog_directory = ENV['S3_BUCKET'] # required
end
Any suggestions on how to get this working correctly? I'm not sure what other info to give, but let me know if you need more to help me solve this.
Have you set the environment variables in your app?
You need to add variables with the relevant names in the Environment Variables section under the app deploy:
S3_KEY
S3_SECRET
S3_BUCKET
Your Amazon account should have the relevant details.
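For local development, the same three names can live in the .env file the question mentions; a minimal sketch with placeholder values, assuming the dotenv-rails gem loads it before the initializers run:
# .env (keep this file out of version control)
S3_KEY=AKIAXXXXXXXXXXXXXXXX
S3_SECRET=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
S3_BUCKET=my-app-uploads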

Carrierwave Upload with Amazon S3 - 403 Forbidden Error

I am attempting to use CarrierWave with Amazon S3 in my Rails app, and I keep getting the error
"Excon::Errors::Forbidden (Expected(200) <=> Actual(403 Forbidden))"
<Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.
I also receive the warning
"[WARNING] fog: the specified s3 bucket name() is not a valid dns name, which will negatively impact performance. For details see: http://docs.amazonwebservices.com/AmazonS3/latest/dev/BucketRestrictions.html"
config/initializers/carrierwave.rb:
CarrierWave.configure do |config|
  config.fog_credentials = {
    provider: 'AWS',
    aws_access_key_id: ENV["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key: ENV["AWS_ACCESS_KEY"]
  }
  config.fog_directory = ENV["AWS_BUCKET"]
end
My bucket name is "buildinprogress".
I've double-checked that my access key ID and access key are correct.
How can I fix this error?
It is a problem with Fog/Excon, which kept throwing random errors for me too.
My fix was to remove gem 'fog' and replace it with gem 'carrierwave-aws' instead.
Then, in your *_uploader.rb, change
storage :fog ---> storage :aws
and update your carrierwave.rb file, e.g.:
CarrierWave.configure do |config|
  config.storage = :aws # required
  config.aws_bucket = ENV['S3_BUCKET'] # required
  config.aws_acl = :public_read
  config.aws_credentials = {
    access_key_id: ENV['S3_KEY'],       # required
    secret_access_key: ENV['S3_SECRET'] # required
  }
  config.aws_attributes = {
    'Cache-Control' => "max-age=#{365.day.to_i}",
    'Expires'       => 'Tue, 29 Dec 2015 23:23:23 GMT'
  }
end
For more info, check out the carrierwave-aws GitHub page.
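For completeness, the gem swap described above amounts to this Gemfile change (a sketch; carrierwave-aws is built on the official AWS SDK rather than fog):
# Gemfile
# gem 'fog'           # removed
gem 'carrierwave'
gem 'carrierwave-aws'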
