Amazon access key showing in URL for Carrierwave and Fog - ruby-on-rails

I just switched from storing my images uploaded via Carrierwave locally to storing them on Amazon S3 via the fog gem in my Rails 3.1 app. Images are being added fine, but when I click on an image in my application, the URL exposes my access key and a signature. Here is a sample URL (XXX replaces the actual values):
https://s3.amazonaws.com/bucketname/uploads/photo/image/2/IMG_4842.jpg?AWSAccessKeyId=XXX&Signature=XXX%3D&Expires=1332093418
This is happening both in development (localhost:3000) and on Heroku in production. Here is my uploader:
class ImageUploader < CarrierWave::Uploader::Base
  include CarrierWave::RMagick

  storage :fog

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end

  process :convert => :jpg
  process :resize_to_limit => [640, 640]

  version :thumb do
    process :convert => :jpg
    process :resize_to_fill => [280, 205]
  end

  version :avatar do
    process :convert => :jpg
    process :resize_to_fill => [120, 120]
  end
end
And my config/initializers/fog.rb :
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => 'XXX',
    :aws_secret_access_key => 'XXX',
  }
  config.fog_directory = 'bucketname'
  config.fog_public = false
end
Anyone know how to make sure this information isn't available?
UPDATE: Adding view and controller code:
from a partial in users/show.html.erb:
<% if @user.photos.any? %>
  <% for photo in @user.photos %>
    <li class="span4 hidey">
      <div class="thumb_box">
        <%= link_to(image_tag(photo.image_url(:thumb).to_s), photo.image_url.to_s,
                    :class => "lb_test") %>
        ...
      </div>
    </li>
  <% end %>
<% end %>
users_controller.rb:
def show
  @user = User.find(params[:id])
end
UPDATE: Adding an error page I get when removing the access key information from the url:
This XML file does not appear to have any style information associated with it. The document tree is shown below.
<Error>
  <Code>AccessDenied</Code>
  <Message>Access Denied</Message>
  <RequestId>47077D6EC13AD1D8</RequestId>
  <HostId>+HTeODcWTqv3gbRIAwf+lI6sPzfNTegDXjT9SnMdqrYr7gLD1TD0qN+OgMLwA1JO</HostId>
</Error>

Remove
config.fog_public = false
That's a non-default value :)
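For reference, a minimal sketch of what the initializer looks like with that line removed (keys and bucket name are placeholders); with the default fog_public = true, CarrierWave serves plain public S3 URLs with no signature appended:

CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => 'XXX',
    :aws_secret_access_key => 'XXX',
  }
  config.fog_directory = 'bucketname'
  # no config.fog_public = false here; the default (true) gives plain public URLs
end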

What you are seeing is a signed URL. Without the full URL (including the key, signature, and expiry), you'll get an Access Denied. It is working exactly as it should. And the access key is just a public key, which is useless without your private key (which only you and AWS have).
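If you would rather keep fog_public = false so the objects stay private, CarrierWave also lets you control how long those signed URLs remain valid; a minimal sketch, assuming the stock fog_authenticated_url_expiration option:

CarrierWave.configure do |config|
  config.fog_public = false
  # signed (authenticated) URLs stay valid for 10 minutes
  config.fog_authenticated_url_expiration = 600
end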

Try photo.image.url instead of photo.image_url. That's what I'm using.

Related

signed url configuration for google cloud

I am using the carrierwave fog-google configuration for file download and upload to a GCS bucket. However, I want the response from GCS to include a signed URL with an expiration time.
Is there any configuration I need to set so that the URL I get back from GCS is signed and expires after, say, an hour?
class TestUploader < CarrierWave::Uploader::Base
  storage :fog

  def fog_credentials
    {
      :provider                 => 'google',
      :google_project           => 'my project',
      :google_json_key_location => 'myCredentialFile.json'
    }
  end

  def fog_provider
    'fog/google'
  end

  def fog_directory
    '{#bucket-name}'
  end

  def store_dir
    case mounted_as # assumption: the original case subject was lost; branch on whatever distinguishes file kinds
    when :File
      "#{file.getpath}/file"
    when :audio
      "#{file.getpath}/audio"
    else
      p " Invalid file "
    end
  end
end
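For reference, the expiring-URL behaviour being asked about is usually driven by the global CarrierWave configuration rather than the uploader itself; a minimal sketch, assuming carrierwave's fog_public and fog_authenticated_url_expiration options work with fog-google the same way they do with S3:

CarrierWave.configure do |config|
  config.fog_provider = 'fog/google'
  config.fog_credentials = {
    :provider                 => 'google',
    :google_project           => 'my project',
    :google_json_key_location => 'myCredentialFile.json'
  }
  config.fog_directory = 'my-bucket'
  # private objects, so CarrierWave asks fog for signed (authenticated) URLs
  config.fog_public = false
  # signed URLs expire after one hour
  config.fog_authenticated_url_expiration = 3600
end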

Rails 3: How to configure Zencoder to work with Amazon S3?

I'm following this tutorial to integrate Zencoder in my Rails 3 app: http://www.nickdesteffen.com/blog/video-encoding-with-uploadify-carrierwave-and-zencoder
The tutorial uses Rackspace for storage, but I'd like to adapt the code so that I can use Amazon S3 instead. I replaced all the Rackspace info with my Amazon S3 info, but whenever I try to upload a video in my form, I get this HTTP error: "There was an error with the file you tried uploading. Please verify that it is the correct type."
What do I need to fix here to make this work?
carrierwave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => 'xxx',
    :aws_secret_access_key => 'xxx',
  }
  config.fog_directory = 'mybucket'
  config.fog_public = true
  config.fog_attributes = {'Cache-Control' => 'max-age=315576000'}
end
video_uploader.rb
class VideoUploader < CarrierWave::Uploader::Base
  include Rails.application.routes.url_helpers
  Rails.application.routes.default_url_options = ActionMailer::Base.default_url_options

  after :store, :zencode

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end

  def extension_white_list
    %w(mov avi mp4 mkv wmv mpg)
  end

  def filename
    "video.mp4" if original_filename
  end

  private

  def zencode(args)
    zencoder_response = Zencoder::Job.create({:input => 's3://mybucket/key.mp4',
                                              :outputs => [{:label => 'vp8 for the web',
                                                            :url => 's3://mybucket/key_output.webm'}]})
    zencoder_response.body["outputs"].each do |output|
      if output["label"] == "web"
        @model.zencoder_output_id = output["id"]
        @model.processed = false
        @model.save(:validate => false)
      end
    end
  end
end
I've been working on the same issue.
Using Fog for my credentials I created my urls like so:
bucket   = AttachmentUploader.fog_directory
input    = "s3://#{bucket}/#{self.path}"
base_url = "s3://#{bucket}/#{store_dir}"
Take a look at my gist for more detail: https://gist.github.com/4002368
Don't forget to allow Zencoder to access your bucket via a security policy: https://app.zencoder.com/docs/guides/getting-started/working-with-s3
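Putting that together, the zencode callback from the question could build its Zencoder input and output URLs from the uploader's own fog settings instead of hard-coded strings; a rough sketch under those assumptions (the output key name and the 'web' label are made up to match the check in the question):

def zencode(args)
  bucket   = self.class.fog_directory
  input    = "s3://#{bucket}/#{path}"          # the file that was just stored
  base_url = "s3://#{bucket}/#{store_dir}"

  zencoder_response = Zencoder::Job.create(
    :input   => input,
    :outputs => [{ :label => 'web',
                   :url   => "#{base_url}/#{model.id}_output.webm" }]
  )

  zencoder_response.body["outputs"].each do |output|
    if output["label"] == "web"
      model.zencoder_output_id = output["id"]
      model.processed = false
      model.save(:validate => false)
    end
  end
end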

Carrierwave with MongoMapper returning nothing for non-uploaded users

We have a user model that implements carrierwave for avatar uploading.
For users that have uploaded a photo, everything is fine. But for those that haven't, when you call the #photo method you get nothing back: not nil, not a blank string, absolutely nothing, so we can't index or do a number of other things. It seems like it should return nil.
Any thoughts on how to make that happen?
I'm using http://github.com/brandonhilkert/carrierwave-mongomapper
class User
  mount_uploader :photo, PhotoUploader
  ...
end

class PhotoUploader < CarrierWave::Uploader::Base
  include CarrierWave::MiniMagick

  storage :fog

  process :resize_to_fit => [200, 200]

  version :normal do
    process :resize_to_fill => [100, 100]
  end

  version :thumb do
    process :resize_to_fill => [50, 50]
  end

  def store_dir
    "#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end
end
irb(main):024:0> User.brandon.photo
=>
irb(main):025:0> User.brandon.photo.class
=> PhotoUploader
irb(main):026:0> User.chris.photo
=> https://[redacted]/IMG_1160_2_bigger.jpg
irb(main):027:0> User.chris.photo.class
=> PhotoUploader
According to the documentation, you can use blank?:
User.brandon.photo.blank? # should be true
User.chris.photo.blank? # should be false
Just implement default_url in your uploader to provide a default avatar:
def default_url
  "/assets/avatars/#{version_name}_default.png"
end
Then display your image as follows
image_tag User.brandon.photo_url(:normal)

Carrierwave uploading with s3 and fog

I've been trying to find the reason for this error for a long time and can't seem to find anything...
So I have a Rails app, and I use carrierwave for picture uploads. I also want to use Amazon S3 for file upload storage in my app.
Initially, while developing the app, I set file uploads to use :file storage, i.e.
image_uploader.rb
# Choose what kind of storage to use for this uploader:
storage :file
# storage :fog
Now upon finishing up development and placing it live (I use heroku), I decided to change the carrierwave storage to S3 to test it locally.
image_uploader.rb
# Choose what kind of storage to use for this uploader:
# storage :file
storage :fog
However, now when I try to upload a picture (be it for user avatar, etc) I get this error:
Excon::Errors::Forbidden in UsersController#update
Expected(200) <=> Actual(403 Forbidden)
request => {:connect_timeout=>60, :headers=>{"Content-Length"=>74577, "x-amz- acl"=>"private", "Content-Type"=>"image/png", "Date"=>"Sun, 26 Feb 2012 10:00:43 +0000", "Authorization"=>"AWS AKIAJOCDPFOU7UTT4HOQ:8ZnOy7X71nQAM87yraSI24Y5bSw=", "Host"=>"s3.amazonaws.com:443"}, :instrumentor_name=>"excon", :mock=>false, :read_timeout=>60, :retry_limit=>4, :ssl_verify_peer=>true, :write_timeout=>60, :host=>"s3.amazonaws.com", :path=>"/uploads//uploads%2Fuser%2Favatar%2F1%2Fjeffportraitmedium.png", :port=>"443", :query=>nil, :scheme=>"https", :body=>"\x89PNG\r\n\x1A\n\x00\x00\x00\rIHDR\x00\x00\x00\xC2\x00\x00\x00\xC3\b\x06\x00\x00\x00\xD0\xBD\xCE\x94\x00\x00\nCiCCPICC Profile\x00\x00x\x01\x9D\x96wTSY\x13\xC0\xEF{/\xBD\xD0\x12B\x91\x12z\rMJ\x00\x91\x12z\x91^E%$\
...
# The code you see above to the far right repeats itself a LOT
...
1#\x85\xB5\t\xFC_y~\xA6=:\xB2\xD0^\xBB~i\xBB\x82\x8F\x9B\xAF\xE7\x04m\xB2i\xFF\x17O\x94S\xF7l\x87\xA8&\x00\x00\x00\x00IEND\xAEB`\x82", :expects=>200, :idempotent=>true, :method=>"PUT"}
response => #<Excon::Response:0x007fc88ca9f3d8 #body="<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>8EFA56C0DDDC8878</RequestId><HostId>1OxWXppSSUq1MFjQwvnFptuCM3gKOuKdlQQyVSEgvzzv4Aj+r2hSFM2UUw2NYyrR</HostId></Error>", #headers={"x-amz-request-id"=>"8EFA56C0DDDC8878", "x-amz-id-2"=>"1OxWXppSSUq1MFjQwvnFptuCM3gKOuKdlQQyVSEgvzzv4Aj+r2hSFM2UUw2NYyrR", "Content-Type"=>"application/xml", "Transfer-Encoding"=>"chunked", "Date"=>"Sun, 26 Feb 2012 10:00:47 GMT", "Connection"=>"close", "Server"=>"AmazonS3"}, #status=403>
And then it says this as well for my application trace:
app/controllers/users_controller.rb:39:in `update'
And my REQUEST parameters:
{"utf8"=>"✓",
"_method"=>"put",
"authenticity_token"=>"DvADD1vYpCLcghq+EIOwVSjsfmAWCHhtA3VI5VGD/q8=",
"user"=>{"avatar"=>#<ActionDispatch::Http::UploadedFile:0x007fc88cde76f8
#original_filename="JeffPortraitMedium.png",
#content_type="image/png",
#headers="Content-Disposition: form-data; name=\"user[avatar]\";
filename=\"JeffPortraitMedium.png\"\r\nContent-Type: image/png\r\n",
#tempfile=#<File:/var/folders/vg/98nv58ss4v7gcbf8px_8dyqc0000gq/T/RackMultipart20120226- 19096-1ppu2sr>>,
"remote_avatar_url"=>"",
"name"=>"Jeff Lam ",
"email"=>"email#gmail.com",
"user_bio"=>"Tester Hello",
"shop"=>"1"},
"commit"=>"Update Changes",
"id"=>"1"}
Here's my users_controller.rb partial code:
def update
  @user = User.find(params[:id])
  if @user.update_attributes(params[:user])
    redirect_back_or root_path
    flash[:success] = "You have updated your settings successfully."
  else
    flash.now[:error] = "Sorry! We are unable to update your settings. Please check your fields and try again."
    render 'edit'
  end
end
My image_uploader.rb code
# encoding: utf-8
class ImageUploader < CarrierWave::Uploader::Base
  # Include RMagick or MiniMagick support:
  # include CarrierWave::RMagick
  include CarrierWave::MiniMagick

  # Choose what kind of storage to use for this uploader:
  # storage :file
  storage :fog

  # Override the directory where uploaded files will be stored.
  # This is a sensible default for uploaders that are meant to be mounted:
  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end

  # Provide a default URL as a default if there hasn't been a file uploaded:
  # def default_url
  #   "/images/fallback/" + [version_name, "default.png"].compact.join('_')
  # end

  # Process files as they are uploaded:
  # process :scale => [200, 300]
  #
  # def scale(width, height)
  #   # do something
  # end

  # Create different versions of your uploaded files:
  version :thumb do
    process resize_to_fill: [360, 250]
  end
  version :cover_photo_thumb do
    process resize_to_fill: [1170, 400]
  end
  version :event do
    process resize_to_fill: [550, 382]
  end
  version :product do
    process resize_to_fit: [226, 316]
  end

  # Add a white list of extensions which are allowed to be uploaded.
  # For images you might use something like this:
  def extension_white_list
    %w(jpg jpeg gif png)
  end

  # Override the filename of the uploaded files:
  # Avoid using model.id or version_name here, see uploader/store.rb for details.
  # def filename
  #   "something.jpg" if original_filename
  # end

  # Fix for Heroku; unfortunately it disables caching,
  # see: https://github.com/jnicklas/carrierwave/wiki/How-to%3A-Make-Carrierwave-work-on-Heroku
  def cache_dir
    "#{Rails.root}/tmp/uploads"
  end
end
Finally, my fog.rb file in the config/initializers
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',                                    # required
    :aws_access_key_id     => 'ACCESS_KEY',                             # required
    :aws_secret_access_key => 'SECRET_ACCESS_KEY/ZN5SkOUtOEHd61/Cglq9', # required
    :region                => 'Singapore'                               # optional, defaults to 'us-east-1'
  }
  config.fog_directory = 'ruuva/'  # required
  config.fog_public = false        # optional, defaults to true
end
I'm actually quite confused on some of the things in my fog.rb. Firstly, should I change my region to Singapore if I created a bucket called "ruuva", with region "Singapore" on my amazon s3 account?
Thank you to anyone that can help in advance!
First, make sure you are using the right credentials by not setting a custom region or custom directory (you can create a throwaway bucket for free in the default region).
Then, I think you are not using the right name for the region. Try setting your region like this:
:region => 'ap-southeast-1'
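A minimal sketch of how the corrected initializer might look (keys are placeholders); note that fog_directory is normally the bare bucket name, without a trailing slash:

CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => 'ACCESS_KEY',
    :aws_secret_access_key => 'SECRET_ACCESS_KEY',
    :region                => 'ap-southeast-1'  # the S3 region name for Singapore
  }
  config.fog_directory = 'ruuva'  # bucket name only, no trailing slash
  config.fog_public = false
end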
We were facing the same problem and fixed it by changing the permissions of the user associated with the access key to "Power User". Check whether you really need your user to be a Power User before putting it into production.

Saving files using Paperclip without upload

I had a quick question. Is it possible to save a file without actually uploading it through a form?
For example, let's say I'm looking at attachments from emails, and I want to save them using Paperclip. How do I do this? Do I have to manually call save_file (or something similar) somewhere?
Any help would be much appreciated!
I have a rake task that loads images (client logos) from a directory directly into Paperclip. You can probably adapt it to your needs.
This is my simplified Client model:
class Client < ActiveRecord::Base
  LOGO_STYLES = {
    :original => ['1024x768>', :jpg],
    :medium   => ['256x192#', :jpg],
    :small    => ['128x96#', :jpg]
  }

  has_attached_file :logo,
                    :styles => Client::LOGO_STYLES,
                    :url => "/clients/logo/:id.jpg?style=:style"

  attr_protected :logo_file_name, :logo_content_type, :logo_size
end
Then on my rake task I do this:
# The logos are in a folder with path logos_dir
Dir.glob(File.join(logos_dir, '*')).each do |logo_path|
  if File.basename(logo_path)[0] != '.' and !File.directory?(logo_path)
    client_code = File.basename(logo_path, '.*')  # filename without extension
    client = Client.find_by_code(client_code)     # you could use the ids, too
    raise "could not find client for client_code #{client_code}" if client.nil?
    File.open(logo_path) do |f|
      client.logo = f  # just assign the logo attribute to a file
      client.save
    end # file gets closed automatically here
  end
end
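For context, a hedged sketch of how that loop could be wrapped up as an actual rake task (the task name, argument, and default directory are made up):

# lib/tasks/import_logos.rake
namespace :clients do
  desc "Import client logos from a directory into Paperclip"
  task :import_logos, [:logos_dir] => :environment do |_t, args|
    logos_dir = args[:logos_dir] || Rails.root.join('tmp', 'logos')

    Dir.glob(File.join(logos_dir, '*')).each do |logo_path|
      next if File.basename(logo_path).start_with?('.') || File.directory?(logo_path)

      client = Client.find_by_code(File.basename(logo_path, '.*'))
      raise "could not find client for #{logo_path}" if client.nil?

      File.open(logo_path) do |f|
        client.logo = f
        client.save!
      end
    end
  end
end

# run with: rake clients:import_logos[/path/to/logos]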
Regards!
The file saved in Paperclip doesn't have to be uploaded directly through a form.
I'm using Paperclip in a project to save files from URLs from webcrawler results. I'm not sure how you'd get email attachments (are they on the local file system of the server? Is your app an email app like GMail?) but as long as you can get a file stream (via something like open(URI.parse(crawl_result)) in my case...) you can attach that file to your model field that's marked has_attached_file.
This blog post about Easy Upload via URL with Paperclip helped me figure this out.
Since it now appears the original blog post is no longer available - here's the gist of it pulled from wayback machine:
This example shows a Photo model that has an Image attachment.
The technique we're using requires adding a *_remote_url (string) column for your attachment, which is used to store the original URL. So, in this case, we need to add a column named image_remote_url to the photos table.
# db/migrate/20081210200032_add_image_remote_url_to_photos.rb
class AddImageRemoteUrlToPhotos < ActiveRecord::Migration
  def self.up
    add_column :photos, :image_remote_url, :string
  end

  def self.down
    remove_column :photos, :image_remote_url
  end
end
Nothing special is required for the controller...
# app/controllers/photos_controller.rb
class PhotosController < ApplicationController
  def create
    @photo = Photo.new(params[:photo])
    if @photo.save
      redirect_to photos_path
    else
      render :action => 'new'
    end
  end
end
In the form, we add a text_field called :image_url, so people can upload a file or provide a URL...
# app/views/photos/new.html.erb
<%= error_messages_for :photo %>
<% form_for :photo, :html => { :multipart => true } do |f| %>
  Upload a photo: <%= f.file_field :image %><br>
  ...or provide a URL: <%= f.text_field :image_url %><br>
  <%= f.submit 'Submit' %>
<% end %>
The meaty stuff is in the Photo model. We need to require open-uri, add an attr_accessor :image_url, and do the normal has_attached_file stuff. Then, we add a before_validation callback to download the file in the image_url attribute (if provided) and save the original URL as image_remote_url. Finally, we do a validates_presence_of :image_remote_url, which allows us to rescue from the many exceptions that can be raised when attempting to download the file.
# app/models/photo.rb
require 'open-uri'
class Photo < ActiveRecord::Base
  attr_accessor :image_url

  has_attached_file :image # etc...

  before_validation :download_remote_image, :if => :image_url_provided?

  validates_presence_of :image_remote_url, :if => :image_url_provided?, :message => 'is invalid or inaccessible'

  private

  def image_url_provided?
    !self.image_url.blank?
  end

  def download_remote_image
    self.image = do_download_remote_image
    self.image_remote_url = image_url
  end

  def do_download_remote_image
    io = open(URI.parse(image_url))
    def io.original_filename; base_uri.path.split('/').last; end
    io.original_filename.blank? ? nil : io
  rescue # catch URL errors with validations instead of exceptions (Errno::ENOENT, OpenURI::HTTPError, etc...)
  end
end
Everything will work as normal, including the creation of thumbnails, etc. Plus, since we're doing all of the hard stuff in the model, "uploading" a file via URL works from within script/console as well:
$ script/console
Loading development environment (Rails 2.2.2)
>> Photo.new(:image_url => 'http://www.google.com/intl/en_ALL/images/logo.gif')
=> #<Photo image_file_name: "logo.gif", image_remote_url: "http://www.google.com/intl/en_ALL/images/logo.gif">
