Debugging Help: Ruby on Rails File Saving Failed with S3 and Carrierwave - ruby-on-rails

I can't figure out for the life of me why this isn't working. I'm new to Ruby on Rails and need debugging help.
I want to store a simple file in my Amazon S3 bucket. That's it. I don't need the file attached to any database record like all the examples show. I just need to put it in the bucket. I think I might need to override store_dir in the avatar uploader, but I don't know how.
Below is what I have
The View:
= form_tag import_orders_path, :class => 'order-uploads', :multipart => true do
= file_field_tag 'upload[file]'
/ :file for just getting param[:file] from server
%br
= submit_tag "Import CSV", :class => 'submit-file'
The Controller:
uploader = AvatarUploader.new
puts YAML::dump(params[:upload][:file].path)
uploader.store!(params[:upload][:file])
File Path name:
/tmp/RackMultipart20150115-9225-o2c5hp
avatar.rb uploader
# encoding: utf-8
class AvatarUploader < CarrierWave::Uploader::Base
# Choose what kind of storage to use for this uploader:
storage :fog
# Override the directory where uploaded files will be stored.
# This is a sensible default for uploaders that are meant to be mounted:
def store_dir
"uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
end
end
carrierwave.rb initializer
require 'fog'
require 'carrierwave'
CarrierWave.configure do |config|
config.fog_credentials = {
:provider => 'AWS', # required
:aws_access_key_id => 'my_key_id', # required
:aws_secret_access_key => 'my_secret_key', # required
:region => 'us-east-1', # optional, defaults to 'us-east-1'
}
config.fog_directory = "my_bucket"
end
Error:
NoMethodError (undefined method `id' for nil:NilClass):
app/uploaders/avatar_uploader.rb:16:in `store_dir'
app/controllers/orders_controller.rb:18:in `import'

Your store_dir references model, but the uploader is not mounted on a model, so model is nil — that is what the NoMethodError is telling you. Your uploader file, app/uploaders/avatar_uploader.rb, should be:
class AvatarUploader < CarrierWave::Uploader::Base
# Choose what kind of storage to use for this uploader:
storage :fog
end
Then
uploader = AvatarUploader.new
uploader.store!(my_file)
uploader.retrieve_from_store!('my_file.png')
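If you still want the file to land in a specific folder of the bucket, you can keep a store_dir override as long as it does not reference model (which is nil when the uploader is not mounted). A minimal sketch — the uploads/csv prefix is just an example, not from the question:

```ruby
# app/uploaders/avatar_uploader.rb
class AvatarUploader < CarrierWave::Uploader::Base
  storage :fog

  # No model is available here, so build the path from constants only.
  def store_dir
    "uploads/csv"
  end
end
```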

Related

Using carrierwave to upload images to google cloud storage, the file name ends up being saved and not the public link to the image in the bucket

I'm trying to implement image upload to Google Cloud Storage from my Rails 4.2 app using the CarrierWave gem. Whenever I upload an image, it is uploaded to the bucket fine, but it's saved in the db as the original image name (e.g. image.png), not as the Google Cloud Storage public link for the image (e.g. https://storage.googleapis.com/project/bucket/image.png).
I'm not sure what needs to be done from here to get it saving the public link from the bucket and not just the file name.
carrierwave.rb file
CarrierWave.configure do |config|
config.fog_credentials = {
provider: 'Google',
google_storage_access_key_id: 'key',
google_storage_secret_access_key: 'secret key'
}
config.fog_directory = 'bucket-name'
end
uploaders/check_item_value_image_uploader.rb
class CheckItemValueImageUploader < CarrierWave::Uploader::Base
# Include RMagick or MiniMagick support:
# include CarrierWave::RMagick
# include CarrierWave::MiniMagick
# Choose what kind of storage to use for this uploader:
#storage :file
storage :fog
# Override the directory where uploaded files will be stored.
# This is a sensible default for uploaders that are meant to be mounted:
def store_dir
"check-item-value-images/#{model.id}"
end
# Add a white list of extensions which are allowed to be uploaded.
# For images you might use something like this:
def extension_white_list
%w(jpg jpeg gif png)
end
end
related gems
gem 'gcloud'
gem "fog"
gem 'google-api-client', '~> 0.8.6'
gem "mime-types"
check_category_item_value model
mount_uploader :value, CheckItemValueImageUploader
check_category_item_value update method
if @check_category_item_value.save
flash[:success] = "Successfully updated"
redirect_to category_items_edit_path(@guide, @category, @category_item)
else
render 'category_items/edit'
end
edit form
<%= form_for(@check_category_item_value) do |f| %>
<%= f.file_field :value, :value => item_key.value, accept: "image/jpeg, image/jpg, image/gif, image/png" %>
<%= f.submit "Submit" %><hr>
<% end %>
The form works fine, but the value saved is the original image name, not the Google Cloud Storage public link for the image.
I used the carrierwave docs, this post, and this video by google cloud platform to get what I have now. What am I missing?
Update:
Adding config.fog_public = true does nothing:
CarrierWave.configure do |config|
config.fog_credentials = {
provider: 'Google',
google_storage_access_key_id: 'key',
google_storage_secret_access_key: 'secret key'
}
config.fog_public = true
config.fog_directory = 'bucket-name'
end
To make the link public, check this setting in your config file:
# Must be true for public URLs
config.fog_public = true
To change the stored filename, you can override filename in your CheckItemValueImageUploader. Here is an example:
class CheckItemValueImageUploader < CarrierWave::Uploader::Base
def filename
# original_filename already includes the extension
"#{model.id}-#{original_filename}" if original_filename.present?
end
end
end
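Note that CarrierWave stores only the identifier (the filename) in the database column by design; the public link is generated at read time by the mounted uploader. A minimal sketch, assuming the mounted column is value as in the question:

```ruby
# The DB column holds just "image.png"; the full storage URL is built
# on demand from fog_directory and store_dir when you call #url.
item = CheckCategoryItemValue.find(params[:id])
item.value          # the uploader object
item.value.url      # the Google Cloud Storage link, based on the bucket config
```

So rather than trying to persist the public link, keep storing the filename and call .url wherever the link is needed (views, serializers, etc.).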

carrierwave image upload to s3 "hostname does not match certificate error"

I first got carrierwave working by following the directions from this railscast:
http://railscasts.com/episodes/253-carrierwave-file-uploads
Then I hooked up s3 by following the directions here:
http://railgaadi.wordpress.com/2012/06/03/saving-files-in-amazon-s3-using-carrierwave-and-fog-gem/
My image_uploader.rb file:
class ImageUploader < CarrierWave::Uploader::Base
include CarrierWave::RMagick
storage :fog
def store_dir
"development/uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
end
version :iphone do
process :resize_to_limit => [320, 160]
end
end
And my fog.rb file:
CarrierWave.configure do |config|
config.fog_credentials = {
:provider => 'AWS', # required
:aws_access_key_id => 'xxx', # required
:aws_secret_access_key => 'xxx', # required
}
config.fog_directory = 'goodlife.carrierwave' # required
end
This is the error I'm getting:
hostname "goodlife.carrierwave.s3-us-west-1.amazonaws.com" does not match the server certificate
Any advice? Thanks!
Adding :path_style => true to config.fog_credentials worked for me. I learned it from an answer to
Amazon S3 - hostname does not match the server certificate (OpenSSL::SSL::SSLError) + rails.
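For reference, a sketch of where :path_style goes, assuming the rest of the initializer from the question:

```ruby
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => 'xxx',
    :aws_secret_access_key => 'xxx',
    # Use path-style requests (https://s3.amazonaws.com/bucket/key) so the
    # wildcard SSL certificate is not defeated by the dot in the bucket name.
    :path_style            => true
  }
  config.fog_directory = 'goodlife.carrierwave'
end
```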
Is goodlife.carrierwave the name of your bucket?
Edit:
Remove the period from your bucket name. That should fix it.
From Amazon:
If you want to access a bucket by using a virtual hosted-style
request, for example, http://mybucket.s3.amazonaws.com over SSL, the
bucket name cannot include a period (.).

Rails Engine not uploading to s3 with Carrierwave

I am creating a Rails plugin with rails new plugin my_plugin --mountable.
This took quite some work to figure out. It is supposed to upload files to S3 with CarrierWave; it reports success, but nothing is uploaded.
Carrierwave is used to generate an uploader with rails g uploader photo
the file looks like this
# my_engine/app/uploaders/my_engine/photo_uploader.rb
# encoding: utf-8
module MyEngine
class PhotoUploader < CarrierWave::Uploader::Base
# Choose what kind of storage to use for this uploader:
storage :file
# storage :fog
# Override the directory where uploaded files will be stored.
# This is a sensible default for uploaders that are meant to be mounted:
def store_dir
"uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
end
end
end
The model has mount_uploader :photo, PhotoUploader:
module PdfGeneratorEngine
class Assemble < ActiveRecord::Base
attr_accessible :color, :photo, :qr_code_url, :text
mount_uploader :photo, PhotoUploader
end
end
my CarrierWave config file is this
CarrierWave.configure do |config|
config.fog_credentials = {
:provider => 'AWS',
:aws_access_key_id => 'MY_ACCESS_KEY',
:aws_secret_access_key => 'MY_SECRET_KEY',
:region => 'eu-west-1'
}
config.fog_directory = 'my.bucket.com'
config.fog_host = 'https://s3-eu-west-1.amazonaws.com/my.bucket.com'
config.storage = :fog
config.s3_use_ssl = true
config.fog_public = true
end
First of all it complains about fog_host, which is fine if renamed to asset_host.
Next, the problem lies with s3_use_ssl, which corresponds to a merged issue on CarrierWave's GitHub. But the host is already defined as https://, so I don't see why that line is necessary.
After that it reports success, but when I check (with a daemon) for the file, there's nothing there.
What did I miss? Or is there something of an issue with CarrierWave and Rails mountable engines?
In your photo_uploader.rb
comment out storage :file and uncomment storage :fog:
# storage :file
storage :fog
--
Look at your fog.rb. It is inconsistent with what is given here:
carrierwave#using-amazon-s3
CarrierWave.configure do |config|
config.fog_credentials = {
:provider => 'AWS', # required
:aws_access_key_id => 'xxx', # required
:aws_secret_access_key => 'yyy', # required
:region => 'eu-west-1', # optional, defaults to 'us-east-1'
:hosts => 's3.example.com', # optional, defaults to nil
:endpoint => 'https://s3.example.com:8080' # optional, defaults to nil
}
config.fog_directory = 'name_of_directory' # required
config.fog_public = false # optional, defaults to true
config.fog_attributes = {'Cache-Control'=>'max-age=315576000'} # optional, defaults to {}
end
Okay, so there is a bit of a problem with CarrierWave.
I quickly set up RightAws instead, and now it uploads to S3 and I can find the file from my daemon.
in my uploader I added
@s3 = RightAws::S3Interface.new('MY KEY', 'MY SECRET KEY')
@s3.put('my.bucket.com', assemble.photo.identifier, params[:assemble][:photo])
Thanks for your help, Nishant. CarrierWave would be a lot slicker and nicer, but it currently does not work for me; there is an issue on their GitHub regarding its use in Rails engines.

Public URL with Fog and Amazon S3

I am using Ruby on Rails 3.1.3, Ruby 1.9.2, CarrierWave 0.5.8, and Fog 1.1.2.
I use the CarrierWave gem for image uploading and the Fog gem for Amazon S3 storage.
In my CarrierWave initializer file I have:
CarrierWave.configure do |config|
config.fog_credentials = {
provider: 'AWS',
aws_access_key_id: 'xxx',
aws_secret_access_key: 'xxx'
}
if Rails.env.production?
config.fog_directory = 'bucket1'
elsif Rails.env.development?
config.fog_directory = 'bucket2'
else
config.fog_directory = 'bucket3'
end
config.fog_public = false
config.fog_authenticated_url_expiration = 60
end
I have an uploader file:
class PageAttachmentUploader < CarrierWave::Uploader::Base
CarrierWave.configure do |config|
if Rails.env.development? || Rails.env.development? || Rails.env.production?
config.fog_public = true
end
end
storage :fog
end
I have two uploader files. I want one set to private and one to public.
I am trying to override the CarrierWave configuration when PageAttachmentUploader is invoked and set the URL to public. This works like a charm on my local machine, but it does not work in staging, sandbox, or production.
I changed config.fog_public = true in the CarrierWave initializer; even that does not work in sandbox. How do I fix this problem?
No, you should not call CarrierWave.configure inside your uploaders, as it changes the default configuration for all uploaders, not just that one.
I don't know if it's the best solution, but you can override the fog configuration per uploader by defining class methods, like this:
class ImageUploader < CarrierWave::Uploader::Base
storage :fog
def self.fog_public
true # or false
end
end
Actually, the best way (I've found) is to do the following:
class ImageUploader < CarrierWave::Uploader::Base
storage :fog
configure do |c|
c.fog_public = true # or false
end
end
It feels more in line with CarrierWave's style to do it this way.

Carrierwave/Fog - Argument error, provider not recognized

I'm using Carrierwave 0.5.3 and Fog to upload images to Amazon-S3.
The setup works smoothly when running locally, no errors.
But when running on Heroku, uploads fail with this message:
2011-03-31T12:53:46-07:00 app[web.1]: ArgumentError ( is not a recognized storage provider):
2011-03-31T12:53:46-07:00 app[web.1]: app/controllers/useditems_controller.rb:36:in `create'
I've got an initializer:
# /config/initializers/fog.rb
CarrierWave.configure do |config|
config.fog_credentials = {
:provider => 'AWS',
:aws_access_key_id => 'secret',
:aws_secret_access_key => 'also secret',
:region => 'eu-west-1'
}
config.fog_directory = 'jabberwocky'
end
And an uploader:
# /app/uploaders/image_uploader.rb
# encoding: utf-8
class ImageUploader < CarrierWave::Uploader::Base
# Include RMagick or ImageScience support:
include CarrierWave::RMagick
# Choose what kind of storage to use for this uploader:
storage :fog
# Override the directory where uploaded files will be stored.
# This is a sensible default for uploaders that are meant to be mounted:
def store_dir
"useditems"
end
def cache_dir
"#{Rails.root}/tmp/uploads"
end
# Create different versions of your uploaded files:
version :thumb do
process :resize_to_limit => [220, 2000]
end
# Add a white list of extensions which are allowed to be uploaded.
# For images you might use something like this:
def extension_white_list
%w(jpg jpeg gif png)
end
end
I've traced the error message to Fog, and it seems that Fog, under Heroku, isn't getting the config info from the initializer. :provider is somehow empty. But I'm stumped as to how to fix it.
Any help would be much appreciated.
I'm using:
rails 3.0.4
heroku 1.19.1
fog 0.7.1
ruby 1.9.2 under rvm
The error was due to the fact that I had mistakenly added the initializer to the .gitignore-file. Thus, it was never uploaded to Heroku.
Adding this for completeness...
After smashing my head against the wall for hours with this error message, I found out that I had this line at the beginning of the carrierwave initializer:
if Rails.env.test?
...
So the initializer only ran in the test environment. After removing the guard, everything worked as expected.
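If you do need per-environment configuration, a safer pattern is to branch inside the configure block rather than guarding the whole initializer, so the credentials are always set. A sketch with hypothetical bucket names:

```ruby
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => 'xxx',
    :aws_secret_access_key => 'xxx'
  }
  # The initializer always runs; only the bucket differs per environment.
  config.fog_directory = Rails.env.production? ? 'bucket-prod' : 'bucket-dev'
end
```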
