Error When Uploading to S3 via Carrierwave and Fog - ruby-on-rails

So I'm trying to use Carrierwave and Fog to upload files to Amazon S3. I first set up Carrierwave with the local file store, and it worked golden. Then I added the fog gem, switched the storage to fog, and added the carrier_wave.rb initializer file, and now I get the error "no implicit conversion of Array into String" whenever I try to upload anything.
My initializer code is:
CarrierWave.configure do |config|
  config.fog_credentials = {
    # Configuration for Amazon S3
    :provider => 'AWS',
    :aws_access_key_id => ['XXXX'],
    :aws_secret_access_key => ['XXXX']
  }
  config.fog_directory = ['XXXX']
end
And my uploader code is:
class MediaUploader < CarrierWave::Uploader::Base
  include CarrierWave::MiniMagick
  include CarrierWave::MimeTypes

  process :set_content_type
  process :save_content_type_and_size_in_model

  def save_content_type_and_size_in_model
    model.content_type = file.content_type if file.content_type
    model.file_size = file.size
  end

  storage :fog
end
The error seems to stem from the fact that CarrierWave and Fog are trying to pass an array into the string "media" field on my model (Project). See the parameters being passed in:
{"utf8"=>"✓",
 "authenticity_token"=>"XXXXXX",
 "project"=>{"name"=>"Cat",
   "description"=>"Meow meow",
   "media"=>#<ActionDispatch::Http::UploadedFile:0x007fd15513f3e8
     @tempfile=#<Tempfile:/var/folders/8_/77fwkkc56r71w67p2jj5x9200000gn/T/RackMultipart20141004-77510-1k6nxyw>,
     @original_filename="SG Square.jpg",
     @content_type="image/jpeg",
     @headers="Content-Disposition: form-data; name=\"project[media]\"; filename=\"SG Square.jpg\"\r\nContent-Type: image/jpeg\r\n">},
 "commit"=>"Create Project"}
Help!
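Given the exact message, "no implicit conversion of Array into String", the likely culprit is the initializer itself: the access key, secret key, and bucket name are wrapped in array literals, but fog expects plain strings and raises when it tries to use an array where a string belongs. A sketch of the corrected initializer (credentials and bucket name are placeholders, as in the question):

```ruby
CarrierWave.configure do |config|
  config.fog_credentials = {
    # Configuration for Amazon S3 -- values must be strings, not arrays
    :provider => 'AWS',
    :aws_access_key_id => 'XXXX',
    :aws_secret_access_key => 'XXXX'
  }
  config.fog_directory = 'XXXX'
end
```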

Related

Rails carrierwave link generated different from s3 storage link

I created a Rails API, but I have a problem with image upload.
I'm using carrierwave; the picture upload works, but I get a wrong link.
Example :
This is the link I find in the RESTful api :
https://s3.eu-west-2.amazonaws.com/gpsql/uploads/driver/picture/35/imagename.png
But when I check S3 storage I find a different link :
https://s3.eu-west-2.amazonaws.com/gpsql/gpsql/gpsql/uploads/driver/picture/35/imagename.png
This is initializer for s3 carrierwave :
CarrierWave.configure do |config|
  config.fog_provider = 'fog/aws' # required
  config.fog_credentials = {
    provider: 'AWS', # required
    aws_access_key_id: '...', # required
    aws_secret_access_key: '...', # required
    region: 'us-west-2',
    path_style: true,
  }
  config.fog_directory = 'gpsql' # required
  config.asset_host = 'https://s3.eu-west-2.amazonaws.com/gpsql'
  config.fog_attributes = {'Cache-Control' => "max-age=#{365.day.to_i}"} # optional, defaults to {}
end
In the picture uploader:
def store_dir
  "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
end
How can I fix the link that is shown in the RESTful API? And why does the bucket name appear so many times in the Amazon link? Why not something straightforward like link/bucketname/image.png?
The first link, the one in the RESTful API, doesn't work at all: I get Access Denied or key not found. The second one, in Amazon S3, works without any problem.
One of the problems is this:
config.asset_host = 'https://s3.eu-west-2.amazonaws.com/gpsql'
It should be:
config.asset_host = 'https://s3.eu-west-2.amazonaws.com'
Anyway, I don't know why it's repeated twice...
So, if you can, you should fix it in the configuration and move the folder in S3 to the proper place.
If you can't move it, I would try changing the store dir to "gpsql/gpsql/uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
I'm not sure whether that works, but that would be my first step.

Carrierwave upload to Amazon s3 with wrong url

I am using Carrierwave with a Rails API to upload images to Amazon S3.
Media is uploading correctly, into the correct folder and bucket.
PROBLEM:
Carrierwave is saving the url to the image and the thumb in the wrong format.
The correct url is:
https://region.amazonaws.com/bucket/folder/filename.jpeg
Carrierwave saves
https://bucket.s3.amazonaws.com/folder/filename.jpeg
My configurations follow:
CarrierWave.configure do |config|
  config.fog_provider = 'fog/aws'
  config.fog_credentials = {
    :provider => 'AWS',
    :aws_access_key_id => "AWS_KEY",
    :aws_secret_access_key => "SECRET_KEY",
    :region => 'us-west-2'
  }
  config.fog_directory = "bucket"
end
class ImageUploader < CarrierWave::Uploader::Base
  include CarrierWave::MiniMagick

  storage :fog

  def store_dir
    "folder/"
  end

  def default_url
    "/images/fallback/" + [version_name, "default.png"].compact.join('_')
  end

  version :thumb do
    process :resize_to_fill => [150, 150]
  end

  def extension_white_list
    %w(jpg jpeg gif png)
  end

  def filename
    DateTime.now.strftime('%Q') + ".jpeg"
  end
end
Help Appreciated!!
Both forms of the URL are valid. From the Amazon docs:
Amazon S3 supports both virtual-hosted–style and path-style URLs to
access a bucket.
In a virtual-hosted–style URL, the bucket name is part of the domain name in the URL. For example:
http://bucket.s3.amazonaws.com
http://bucket.s3-aws-region.amazonaws.com
In a virtual-hosted–style URL, you can use either of these endpoints. If you make a request to the http://bucket.s3.amazonaws.com
endpoint, the DNS has sufficient information to route your request
directly to the region where your bucket resides.
In a path-style URL, the bucket name is not part of the domain (unless you use a region-specific endpoint). For example:
US East (N. Virginia) region endpoint, http://s3.amazonaws.com/bucket
Region-specific endpoint, http://s3-aws-region.amazonaws.com/bucket
In a path-style URL, the endpoint you use must match the region in which the bucket resides. For example, if your bucket is in the South
America (Sao Paulo) region, you must use the
http://s3-sa-east-1.amazonaws.com/bucket endpoint. If your bucket is
in the US East (N. Virginia) region, you must use the
http://s3.amazonaws.com/bucket endpoint.
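The two styles in the quoted docs can be contrasted with a small, hypothetical helper (plain Ruby, no AWS SDK; `bucket` and `key` are illustrative and no request is made):

```ruby
# Build an S3 object URL in either of the two styles Amazon describes.
def s3_url(bucket, key, region: nil, path_style: false)
  host = region ? "s3-#{region}.amazonaws.com" : "s3.amazonaws.com"
  if path_style
    # Path-style: the bucket is part of the path; the endpoint must match
    # the region where the bucket resides.
    "http://#{host}/#{bucket}/#{key}"
  else
    # Virtual-hosted-style: the bucket is part of the domain name.
    "http://#{bucket}.#{host}/#{key}"
  end
end

puts s3_url("bucket", "folder/filename.jpeg")
# => http://bucket.s3.amazonaws.com/folder/filename.jpeg
```

Both calls name the same object; the virtual-hosted form above is exactly the shape Carrierwave saved in this question, which is why both URLs resolve.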

Rails 3 carrierwave-azure Azure::Core::Http::HTTPError OutOfRangeInput (400): One of the request inputs is out of range

I'm trying to use the carrierwave and carrierwave-azure gems to save files uploaded in my app to Azure blob storage. The documentation for the carrierwave-azure gem seems a bit light on GitHub, but I believe I have followed all the setup directions correctly (https://github.com/unosk/carrierwave-azure). However, it still doesn't work.
When I attempt to upload a file I get the following error:
Azure::Core::Http::HTTPError in DownstreamsController#create
OutOfRangeInput (400): One of the request inputs is out of range.
My carrierwave.rb looks like this:
CarrierWave.configure do |config|
  config.azure_credentials = {
    storage_account_name: 'myaccountname',
    storage_access_key: 'reallylongstringwithcapandlowercaselettersand+signs=='
  }
  config.azure_container = 'http://myapp.blob.core.windows.net/shipmentdocs'
end
My uploader:
include CarrierWave::RMagick

storage :azure

def store_dir
  "http://myapp.blob.core.windows.net/shipmentdocs/uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
end

version :thumb do
  process :convert => 'jpg'
  process :resize_to_limit => [100, 100]
end

def extension_white_list
  %w(jpg jpeg gif png pdf doc)
end
I've included both gems in my Gemfile. If I switch the storage from :azure to :file, the upload works, but it stores in the app directory; I want to store the file as a blob in Azure.
Not sure that it matters, but I am trying to do this from my local dev environment via localhost. Any help would be greatly appreciated.
UPDATE:
Here are the request parameters being submitted:
{"utf8"=>"✓",
 "authenticity_token"=>"X/Q1ADOLgdaLJVsdrerdHdK9S/kJtSfjkjiutEuYsTYRY=",
 "downstream"=>{"bols_attributes"=>{"0"=>{"file_name"=>#<ActionDispatch::Http::UploadedFile:0x00000107c58340
   @original_filename="test-azure-storage-bol-file.png",
   @content_type="image/png",
   @headers="Content-Disposition: form-data; name=\"downstream[bols_attributes][0][file_name]\"; filename=\"test-azure-storage-bol-file.png\"\r\nContent-Type: image/png\r\n",
   @tempfile=#<Tempfile:/var/folders/dc/y286vygx1jq5wjw3f6b6bcww0000gn/T/RackMultipart20140514-8268-ngdw0m>>}},
  "sales_order"=>"RTEWW423", "to_company_id"=>"2",
  "ship_date"=>"2014-05-14", "ship_total_pallets"=>"1",
  "downstreamDetails_attributes"=>{"0"=>{"tb_product_type_id"=>"17",
    "ship_total_net_weight"=>"3000", "ship_total_gross_weight"=>"3000",
    "_destroy"=>"false"}}}, "commit"=>"Submit"}
The answer turned out to be that in my carrierwave.rb file I was using the full URL for the config.azure_container setting (i.e. 'http://myapp.blob.core.windows.net/shipmentdocs'). It turns out you should provide just the container name, as in:
config.azure_container = 'shipmentdocs'
It's also helpful to note that making this change requires a restart of the server to take effect, because carrierwave.rb lives in your config/initializers folder.
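Putting the answer's fix into the full initializer, it would look like this (account name and key are placeholders, as in the question):

```ruby
CarrierWave.configure do |config|
  config.azure_credentials = {
    storage_account_name: 'myaccountname',
    storage_access_key: 'reallylongstringwithcapandlowercaselettersand+signs=='
  }
  # Just the container name -- not the full blob URL
  config.azure_container = 'shipmentdocs'
end
```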

Carrierwave AWS Heroku : current_path and File.read can't find the file

I have (what I thought was) a perfectly working Heroku-Carrierwave-AWS.
I can upload images like a charm.
I now need to send the respective images, via a JSON request, to an app. This has been working on my test server, but for some reason I'm getting the following in my Heroku logs from my Rails call:
Started POST "/downloadUserPhotos" for ?? at 2014-05-03 03:27:38 +0000
Errno::ENOENT (No such file or directory # rb_sysopen - uploads/photo/mainphoto/1/largeimage.jpg):
app/controllers/stats_controller.rb:22:in `read'
app/controllers/stats_controller.rb:22:in `downloadPhotos'
I'm pretty sure this has something to do with the following Ruby/Rails code:
def downloadPhotos
  @photos = Photo.find_by_user_id(current_user.id)
  @mainphoto = Base64.strict_encode64(File.read(@photos.mainphoto.current_path))
end
When I use my console on Heroku and type the following:
@photos = Photo.find(1)
It works and I get the correct record shown. When I ask for current_path for mainphoto, I get:
irb(main):002:0> @photos.mainphoto.current_path
=> "uploads/photo/mainphoto/1/largeimage.jpg"
So, it knows it exists. And it's in the right place.
Can anyone enlighten me (or point me in the right direction) as to why I can't use File.read? And, more importantly, how do I get it to read the image file and encode it?
This has perplexed me somewhat.
I've tried to use @photos.mainphoto.url, but other than giving me the whole URL, it still doesn't find the file using File.read.
My carrier wave config is:
CarrierWave.configure do |config|
  config.fog_credentials = {
    # Configuration for Amazon S3
    :provider => 'AWS',
    :aws_access_key_id => ENV['AWS_ACCESS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY'],
    :region => ENV['S3_REGION']
  }
  config.cache_dir = "#{Rails.root}/tmp/uploads"
  config.fog_directory = ENV['S3_BUCKET_NAME']
end
And I have the following in my Uploader:
def store_dir
  "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.user.id}"
end
Thanks in advance.
In the line:
@mainphoto = Base64.strict_encode64(File.read(@photos.mainphoto.current_path))
change current_path to url.
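The underlying issue is that with fog storage the file lives on S3, not on the dyno's local disk, so File.read on a local path raises Errno::ENOENT. CarrierWave uploaders respond to read, which fetches the stored file's bytes through the configured storage, so encoding via the uploader avoids the local filesystem entirely. A minimal sketch; the stub class stands in for the mounted @photos.mainphoto uploader, for illustration only:

```ruby
require "base64"

# Encode any uploader-like object that responds to #read.
# CarrierWave uploaders do; with fog storage, #read pulls the bytes from S3.
def encode_photo(uploader)
  Base64.strict_encode64(uploader.read)
end

# Stub standing in for a mounted CarrierWave uploader.
StubUploader = Struct.new(:bytes) do
  def read
    bytes
  end
end

puts encode_photo(StubUploader.new("hi"))  # => "aGk="
```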

Carrierwave & Amazon S3 file downloading/uploading

I have a Rails 3 app with an UploadUploader and a Resource model on which it is mounted. I recently switched to S3 storage, and this has broken my ability to download files using send_file. I can enable downloading using redirect_to, which just forwards the user to an authenticated S3 URL. I need to authenticate file downloads, and I want the URL to be http://mydomainname.com/the_file_path or http://mydomainname.com/controller_action_name/id_of_resource, so I am assuming I need to use send_file. But is there a way of doing that using the redirect_to method? My current code follows. resources_controller.rb:
def download
  resource = Resource.find(params[:id])
  if resource.shared_items.find_by_shared_with_id(current_user) or resource.user_id == current_user.id
    filename = resource.upload_identifier
    send_file "#{Rails.root}/my_bucket_name_here/uploads/#(unknown)"
  else
    flash[:notice] = "You don't have permission to access this file."
    redirect_to resources_path
  end
end
carrierwave.rb initializer:
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS', # required
    :aws_access_key_id => 'xxxx', # copied off the aws site
    :aws_secret_access_key => 'xxxx', #
  }
  config.fog_directory = 'my_bucket_name_here' # required
  config.fog_host = 'https://localhost:3000' # optional, defaults to nil
  config.fog_public = false # optional, defaults to true
  config.fog_attributes = {'Cache-Control'=>'max-age=315576000'} # optional, defaults to {}
end
upload_uploader.rb
class UploadUploader < CarrierWave::Uploader::Base
  storage :fog

  def store_dir
    "uploads"
  end
end
All of this throws the error:
Cannot read file
/home/tom/Documents/ruby/rails/circlshare/My_bucket_name_here/uploads/Picture0024.jpg
I have tried reading up on carrierwave, fog, send_file, and all of that, but nothing I have tried has been fruitful yet. Uploading works fine and I can see the files in the S3 bucket. Using redirect_to would be great, as the file wouldn't pass through my server. Any help appreciated. Thanks.
Looks like you want to upload to S3 but have non-public URLs. Instead of downloading the file from S3 and using send_file, you can redirect the user to the S3 authenticated URL. This URL will expire and only be valid for a little while (long enough for the user to download).
Check out this thread: http://groups.google.com/group/carrierwave/browse_thread/thread/2f727c77864ac923
Since you're already setting fog_public to false, do you get an authenticated (i.e. signed) URL when calling resource.upload_url?
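The redirect approach the answer describes can be sketched with illustrative stand-ins (not the real fog API; with fog_public = false, the mounted uploader's url returns a time-limited signed URL, which the permission check then redirects to):

```ruby
# Stand-in for a mounted CarrierWave uploader whose #url is signed, the way
# fog signs URLs when fog_public = false. Query params here are illustrative.
SignedUpload = Struct.new(:base) do
  def url
    "#{base}?Signature=abc123&Expires=1700000000"
  end
end

Resource = Struct.new(:upload)

# The download action's branch logic, extracted: permitted users get the
# expiring signed URL to redirect to; everyone else goes back to the index.
def download_redirect(resource, permitted)
  permitted ? resource.upload.url : "/resources"
end

r = Resource.new(SignedUpload.new("https://s3.amazonaws.com/my_bucket_name_here/uploads/Picture0024.jpg"))
puts download_redirect(r, true)
```

In the real controller, the `permitted` flag corresponds to the question's shared_items / user_id check, and the file never passes through the Rails server.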