I'm trying to figure out how to incorporate S3 with Dragonfly. The gem works locally, but because I'm using Heroku the uploaded files get deleted, so I need to store them on S3.
I'm using the dragonfly-s3_data_store gem, and my dragonfly.rb looks like this:
/config/initializers/dragonfly.rb
require 'dragonfly'
require 'dragonfly/s3_data_store'

# Configure
Dragonfly.app.configure do
  plugin :imagemagick
  url_format "/media/:job/:name"

  datastore :s3,
    bucket_name: 'thelab',
    access_key_id: '{my access key}',
    secret_access_key: '{my secret key}',
    region: "us-west-1"
end
... #more code
This is my error when I'm trying to upload an image:
hostname "thelab.thelab.s3-us-west-1.amazonaws.com" does not match the server certificate (OpenSSL::SSL::SSLError)
I'm not sure what's causing the issue, but also on my console, I see this popping up:
[fog][WARNING] fog: followed redirect to thelab.s3-us-west-2.amazonaws.com, connecting to the matching region will be more performant
There's an inconsistency here: the error mentions us-west-1 while the warning redirects to us-west-2. I've also tried switching the region to us-west-2, but I still get the same error and warning.
form:
= simple_form_for(resource, as: resource_name, url: registration_path(resource_name), html: { method: :put }) do |f|
  = f.file_field :image
  = f.button :submit, "Save changes"
And of course, the model, in this case user.rb, has this:
dragonfly_accessor :image
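For context (this detail is not in the original question), dragonfly_accessor :image expects a matching image_uid string column on the users table; a minimal migration would look something like:

class AddImageToUsers < ActiveRecord::Migration
  def change
    # Dragonfly stores the datastore uid for the attachment here
    add_column :users, :image_uid, :string
    # Optional "magic attribute": keeps the original filename if the column exists
    add_column :users, :image_name, :string
  end
end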
Am I missing anything else that needs to be configured?
Also, here is my CORS configuration for the thelab bucket:
<CORSConfiguration>
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>Authorization</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
Related
I am trying to upload a photo to S3 using Rails Active Storage.
I can attach a photo:
user.photo.attach(io: File.open('spec/images/filename.png'), filename: 'filename.png')
I can save the user, fetch the photo's service URL, view it in the browser, and see the object in my bucket:
user.photo.service_url
However, if I restart my console and try to get the service url, I receive the following error:
Module::DelegationError (service_url delegated to attachment, but attachment is nil)
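For what it's worth, one way to check whether the attachment actually persisted across the console restart (this uses standard Active Storage API and is not from the original post; user_id stands in for the id saved earlier) is:

user = User.find(user_id)
user.photo.attached?  # => false here would explain why the attachment is nil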
Here are my bucket settings:
storage.yml:
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-east-2
  bucket: <%= Rails.application.credentials.dig(:aws, :bucket) %>
application.rb:
config.active_storage.service = :amazon
user.rb:
has_one_attached :photo
I am also having trouble using public: true in the storage.yml file.
I receive the following error if I try to set the config:
ArgumentError (Cannot load `Rails.config.active_storage.service`:)
invalid configuration option `:public'
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-east-2
  bucket: <%= Rails.application.credentials.dig(:aws, :bucket) %>
  public: true
I also wanted to upload my files to AWS S3 and have them publicly available.
I ran into this issue as well and found the following:
ArgumentError (Cannot load `Rails.config.active_storage.service`:)
invalid configuration option `:public'
comes from this file in the aws-sdk-ruby gem. As per the error message, the aws-sdk-ruby gem does not support the public: true option.
I used the following work-around (special thanks to this article):
I updated my storage.yml to:
public_amazon:
  service: S3
  access_key_id: some_key
  secret_access_key: some_secret
  bucket: some-bucket-name
  region: some-region
  upload:
    acl: "public-read"
The above sets the uploaded file permissions to be public.
Retrieve the public URL like this:
user.photo.attach(params[:file])
url = user.photo.service.send(:object_for, user.photo.key).public_url
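If you need this in more than one place, it may be worth wrapping the call in a small model helper (my own sketch built around the same call; object_for is a private method on the S3 service, hence the send):

# app/models/user.rb
def public_photo_url
  return unless photo.attached?
  photo.service.send(:object_for, photo.key).public_url
end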
A Transfer Acceleration enabled S3 bucket allows you to upload and download faster. The setup should be simple in theory:
The endpoint changes from:
mybucket.s3.us-east-1.amazonaws.com
to:
mybucket.s3-accelerate.amazonaws.com
In Ruby on Rails config/storage.yml the environment variables look like this:
amazon:
  service: S3
  access_key_id: <%= ENV['AWS_ACCESS_KEY_ID'] %>
  secret_access_key: <%= ENV['AWS_SECRET_ACCESS_KEY'] %>
  region: <%= ENV['AWS_REGION'] %>
  bucket: <%= ENV['AWS_BUCKET'] %>
The problem is that the hostname pattern is different (the region segment disappears and .s3 becomes .s3-accelerate), so I can't just change AWS_REGION. How do I implement S3 Transfer Acceleration with Rails and Active Storage?
I had to add this below the bucket option:
use_accelerate_endpoint: true
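Assuming the storage.yml from the question, the resulting entry would look something like this (a sketch; use_accelerate_endpoint is an Aws::S3::Client option that Active Storage passes through to the client):

amazon:
  service: S3
  access_key_id: <%= ENV['AWS_ACCESS_KEY_ID'] %>
  secret_access_key: <%= ENV['AWS_SECRET_ACCESS_KEY'] %>
  region: <%= ENV['AWS_REGION'] %>
  bucket: <%= ENV['AWS_BUCKET'] %>
  use_accelerate_endpoint: true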
I've just added ActiveStorage to a Rails app using S3. My first use of AS on a staging env raises
Aws::S3::Errors::InvalidBucketName: The specified bucket is not valid.
The bucket already exists on S3 and ActiveStorage has the correct bucket name, region, etc. defined in storage.yml.
ActiveStorage has been given AWS IAM credentials that have full S3 access.
The bucket has been configured for CORS:
<CORSConfiguration>
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>Authorization</AllowedHeader>
  </CORSRule>
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
Why am I getting an InvalidBucketName error? What other things should I look at?
#storage.yml
test:
  service: Disk
  root: <%= Rails.root.join("tmp/storage") %>

local:
  service: Disk
  root: <%= Rails.root.join("storage") %>

amazon:
  service: S3
  access_key_id: <%= ENV.fetch('AWS_ACCESS_KEY_ID') %>
  secret_access_key: <%= ENV.fetch('AWS_SECRET_ACCESS_KEY') %>
  region: us-east-1
  bucket: <%= ENV.fetch('AWS_S3_BUCKET') %>
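One thing worth checking (my suggestion, not part of the original post): InvalidBucketName often means the bucket name resolved to nil or an empty string, so it can help to confirm what Active Storage actually sees on the staging box:

# In a Rails console on staging
ENV.fetch('AWS_S3_BUCKET')               # raises KeyError if the variable is unset
ActiveStorage::Blob.service.bucket.name  # the bucket the S3 service was configured with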
I'm trying to replace the deprecated (as of Rails 5.2) Paperclip with image uploads using Active Storage and AWS S3. I'm using this guide and the GoRails episode on it, but I've hit a console error I can't parse. Everything looks okay on my form, but when I submit I get this in my console:
activestorage.self-6e8d967adecc8b2a7259df0f51ef5b6f171c33267c7d149a474beccd90c68697.js?body=1:1 POST http://localhost:3000/rails/active_storage/direct_uploads 500 (Internal Server Error)
I have this on my blogs#new _form.html.erb:
<div class="form-group">
  <%= f.label "Picture" %>
  <%= f.file_field :image, direct_upload: true %>
</div>
I put this in my model:
has_one_attached :image
I have :image in my permitted params:
def blog_params
  params.require(:blog).permit(:title, :teaser, :body, :user_id, :tag_list, :link_text, :link_filename, :pillars_id, :image)
end
I added this to my storage.yml:
# Use rails credentials:edit to set the AWS secrets (as aws:access_key_id|secret_access_key)
amazon:
  service: S3
  access_key_id: <%= ENV["AWS_ACCESS_KEY_ID"] %>
  secret_access_key: <%= ENV["AWS_SECRET_ACCESS_KEY"] %>
  region: us-west-1
  bucket: nameofbucket
And this in my development environment:
config.active_storage.service = :amazon
I included the JavaScript/CSS from the Rails Guides linked above. Does anyone see where this error is coming from? I'm not very experienced with debugging console errors, so any info or wisdom is appreciated.
ADDITIONAL INFORMATION
Here's the error that happens in my Rails Console:
LoadError - Unable to autoload constant ActiveStorage::Blob::Analyzable, expected /Users/lizbayardelle/.rvm/gems/ruby-2.3.3/gems/activestorage-5.2.0/app/models/active_storage/blob/analyzable.rb to define it:
::1 - - [12/Jul/2018:13:56:05 PDT] "POST /rails/active_storage/direct_uploads HTTP/1.1" 500 7784
http://localhost:3000/blogs/new -> /rails/active_storage/direct_uploads
Other Note
When I try to create a blog without an image I get this error:
undefined method `[]=' for nil:NilClass
In Rails 4.0.2, I am using the s3_direct_upload and aws-sdk gems to upload files directly to an S3 bucket. In the development environment it works fine, but in production it throws the error below:
ActionView::Template::Error (no implicit conversion of nil into String)
In the view:
<%= s3_uploader_form :callback_url=>create_cv_url, :id=> "s3_uploader", :key=> "cv_uploads/{unique_id}/${filename}",
:key_starts_with=> "cv_uploads/", :callback_param=> "cv[direct_upload_url]", :max_file_size=> 1.megabytes,
:expiration=> 24.hours.from_now.utc.iso8601 do %>
<%= file_field_tag :file, multiple: true, :max_file_size => 1.megabytes, accept: 'application/pdf application/msword application/rtf application/doc application/docx' %>
<% end %>
<script id="template-upload" type="text/x-tmpl">
  <div id="upload_{%=o.unique_id%}" class="upload">
    <h5 class="mt1">Please Wait. <span style="color: #5f6fa0;"> {%=o.name%} </span>is processing...</h5>
    <div class="progress"><div class="progress-bar progress-bar-striped active" style="width: 0%;"></div></div>
  </div>
</script>
The error mainly points to the s3_uploader_form line in the view.
This feature is based entirely on http://blog.littleblimp.com/post/53942611764/direct-uploads-to-s3-with-rails-paperclip-and
In paperclip.rb
Paperclip::Attachment.default_options.merge!(
  url: :s3_domain_url,
  path: ':class/:attachment/:id/:style/:filename',
  storage: :s3,
  s3_credentials: Rails.configuration.aws,
  s3_permissions: :private,
  s3_protocol: 'http'
)
require 'paperclip/media_type_spoof_detector'

module Paperclip
  class MediaTypeSpoofDetector
    def spoofed?
      false
    end
  end
end
In aws.rb
require 'aws-sdk'
# Rails.configuration.aws is used by AWS, Paperclip, and S3DirectUpload
Rails.configuration.aws = YAML.load(ERB.new(File.read("#{Rails.root}/config/aws.yml")).result)[Rails.env].symbolize_keys!
AWS.config(:logger=> Rails.logger)
AWS.config(Rails.configuration.aws)
In s3_direct_upload.rb
S3DirectUpload.config do |c|
  c.access_key_id = Rails.configuration.aws[:access_key_id]
  c.secret_access_key = Rails.configuration.aws[:secret_access_key]
  c.bucket = Rails.configuration.aws[:bucket]
  c.region = "s3"
end
Is this a configuration issue in the production environment? Please help me solve this.
I had the same problem and solved it by adding a config.yml file with my S3 credentials:
RailsApp/config.yml
# Fill in your AWS Access Key ID and Secret Access Key
# http://aws.amazon.com/security-credentials
access_key_id: xxxxxx
secret_access_key: xxxxxxx
More info: https://docs.aws.amazon.com/AWSSdkDocsRuby/latest/DeveloperGuide/ruby-dg-setup.html
The code seems OK. As skozz mentioned, one possible issue is that your configuration keys are not assigned properly. Please check the AWS production keys in config/aws.yml.
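Since aws.rb loads config/aws.yml per environment and symbolizes the keys, the production block needs at least the keys the initializers above read; a sketch of what that file might look like (key names taken from the code above, values are placeholders):

development:
  access_key_id: YOUR_DEV_KEY
  secret_access_key: YOUR_DEV_SECRET
  bucket: your-dev-bucket

production:
  access_key_id: YOUR_PROD_KEY
  secret_access_key: YOUR_PROD_SECRET
  bucket: your-prod-bucket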
I was suffering from the same problem while following this link. I had added the initializers, but I needed to restart Rails (the server was running but had not picked up the changes).
Executing this command worked for me: figaro heroku:set -e production
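For context (my addition, assuming the Figaro gem's usual setup), figaro heroku:set -e production pushes the values defined in config/application.yml up to Heroku as environment variables, so that file would hold the AWS credentials, for example:

# config/application.yml (kept out of version control); the variable names
# here are hypothetical and would be referenced from aws.yml via ERB
AWS_ACCESS_KEY_ID: "xxxxxx"
AWS_SECRET_ACCESS_KEY: "xxxxxxx"
AWS_BUCKET: "your-bucket-name"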