Paperclip multiple storage - ruby-on-rails

I want to move my assets folder to Amazon S3. Since it is quite large, during the transition I need Paperclip to upload files both to my local storage and to Amazon S3.
Is there a way to configure Paperclip to store uploaded files both on the filesystem and on Amazon S3?

Maybe you'd benefit from this:
http://airbladesoftware.com/notes/asynchronous-s3/
What you'll have to do is first upload to your local storage, and then asynchronously upload to S3.
This is typically done with a background-job library such as Resque or DelayedJob (as the tutorial demonstrates). Note that Resque needs a Redis server running alongside your app, while DelayedJob queues jobs in your database, so the extra infrastructure depends on which you choose.
From the tutorial:
### Models ###
class Person < ActiveRecord::Base
  has_attached_file :local_image,
    path: ":rails_root/public/system/:attachment/:id/:style/:basename.:extension",
    url:  "/system/:attachment/:id/:style/:basename.:extension"

  has_attached_file :image,
    styles:          {large: '500x500#', medium: '200x200#', small: '70x70#'},
    convert_options: {all: '-strip'},
    storage:         :s3,
    s3_credentials:  "#{Rails.root}/config/s3.yml",
    s3_permissions:  :private,
    s3_host_name:    's3-eu-west-1.amazonaws.com',
    s3_headers:      {'Expires'             => 1.year.from_now.httpdate,
                      'Content-Disposition' => 'attachment'},
    path:            "images/:id/:style/:filename"

  after_save :queue_upload_to_s3

  def queue_upload_to_s3
    Delayed::Job.enqueue ImageJob.new(id) if local_image? && local_image_updated_at_changed?
  end

  def upload_to_s3
    self.image = local_image.to_file
    save!
  end
end

class ImageJob < Struct.new(:image_id)
  def perform
    # the queued id belongs to a Person record in this example
    person = Person.find(image_id)
    person.upload_to_s3
    person.local_image.destroy
  end
end
### Views ###
# app/views/people/edit.html.haml
# ...
= f.file_field :local_image

# app/views/people/show.html.haml
- if @person.image?
  = image_tag @person.image.expiring_url(20, :small)
- else
  = image_tag @person.local_image.url, size: '70x70'

Related

Paperclip: Choose between saving files local or S3 at runtime

I'm using Paperclip for saving files. I have successfully configured Paperclip to save files directly to Amazon S3, but in some situations I need files to be saved locally only. How can I do this?
Here is my example Paperclip configuration:
Paperclip.interpolates(:upload_url) do |attachment, style|
  "#{ENV.fetch('UPLOAD_PROTOCOL', 'http')}://#{ENV.fetch('UPLOAD_DOMAIN', 'localhost:3000')}/uploads/:class/:attachment/:id_:style.:extension"
end

Paperclip::Attachment.default_options.merge!(
  storage: :s3,
  s3_region: ENV['CEPH_REGION'],
  s3_protocol: 'http',
  s3_host_name: ENV['CEPH_HOST_NAME'],
  s3_credentials: {
    access_key_id: ENV['CEPH_ACCESS_KEY_ID'],
    secret_access_key: ENV['CEPH_SECRET_KEY'],
  },
  s3_options: {
    endpoint: ENV['CEPH_END_POINT'],
    force_path_style: true
  },
  s3_permissions: 'public-read',
  bucket: ENV['CEPH_BUCKET'],
  url: ':s3_path_url',
  path: ':class/:id/:basename.:extension',
  use_timestamp: false
)

module Paperclip
  def self.string_to_io(options)
    data = StringIO.new(options[:data])
    data.class.class_eval { attr_accessor :original_filename }
    data.original_filename = options[:original_file_name]
    data
  end
end
You could use a lambda, as shown in the dynamic configuration section of the Paperclip README: https://github.com/thoughtbot/paperclip#dynamic-configuration
It would look something like this:
class YOURMODEL < ActiveRecord::Base
  has_attached_file :FILE, storage: lambda { |attachment| attachment.instance.use_s3? ? :s3 : :filesystem }
end
Then you would have to add a use_s3? method to your model that decides whether the file should be stored locally or on S3, as in the sketch below.
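For example, a minimal sketch of such a method, where store_remotely is a hypothetical boolean column standing in for whatever condition fits your application:
class YOURMODEL < ActiveRecord::Base
  has_attached_file :FILE, storage: lambda { |attachment| attachment.instance.use_s3? ? :s3 : :filesystem }

  # Hypothetical predicate: decide per record where the file should live.
  # Replace `store_remotely?` with an environment check, a user setting, etc.
  def use_s3?
    store_remotely?
  end
end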

Getting URL for public assets in Rails model

I have a Rails application that uses Paperclip and saves images to S3. When the user uploads an asset without an image, it gets the default image set in the Paperclip setup.
My API serves those assets and includes links to the images in the JSON response (using jbuilder). However, I can't seem to return the default image URL: it only returns "missing.png", and I want it to return the full URL to the server with the missing-image path appended.
I'm setting the default URL in the model, and I've tried using ActionView::Helpers::AssetUrlHelper to get the image_url, but it never works, even though it works inside the Rails console. Any idea what I can do to solve this?
The JBuilder file:
json.profile_picture_smallest asset.profile_picture.url(:smallest)
json.profile_picture_small asset.profile_picture.url(:small)
json.profile_picture_medium asset.profile_picture.url(:medium)
json.profile_picture_large asset.profile_picture.url(:large)
json.profile_picture_original asset.profile_picture.url(:original)
The part of paperclip that is included in the Model
module Picturable
  extend ActiveSupport::Concern

  included do
    has_attached_file :profile_picture,
      path: '/images/' + name.downcase.pluralize + '/:style/:basename',
      default_url: "missing.png",
      styles: {
        smallest: '50x50>',
        small: '100x100>',
        medium: '200x200>',
        large: '400x400>',
        png: ['400x400>', :png]
      },
      :convert_options => {
        smallest: '-trim',
        small: '-trim',
        medium: '-trim',
        large: '-trim',
        png: '-trim'
      }

    # Validate the attached image is image/jpg, image/png, etc.
    validates_attachment_content_type :profile_picture, :content_type => /\Aimage\/.*\Z/
  end

  def set_uuid_name
    begin
      self.profile_picture_file_name = SecureRandom.uuid
    end while self.class.find_by(:profile_picture_file_name => self.profile_picture_file_name)
  end
end
Paperclip config:
Paperclip::Attachment.default_options[:s3_host_name] = 's3hostname'
Development config:
config.paperclip_defaults = {
  :storage => :s3,
  :s3_credentials => {
    :bucket => 'paperclipdev',
    :access_key_id => 'accesskey',
    :secret_access_key => 'secretaccesskey'
  }
}
I think the way to do this is to use the asset helpers in your jbuilder file:
json.profile_picture_smallest asset_url(asset.profile_picture.url(:smallest))
It's worth mentioning here that you can also pass a symbol method name (or a lambda) to Paperclip for the default_url parameter if you want the default URL to be dynamic based on the model.
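For example, a minimal sketch of the dynamic form using a lambda (which Paperclip evaluates for default_url); the missing_image_url helper and the ASSET_HOST environment variable are assumptions for illustration:
has_attached_file :profile_picture,
  default_url: lambda { |attachment| attachment.instance.missing_image_url }

# Hypothetical helper: build an absolute URL so API consumers get a full link
# instead of the bare "missing.png" path.
def missing_image_url
  "#{ENV['ASSET_HOST']}/images/missing.png"
end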

Paperclip Upload to S3 with Delayed Job

I am trying to upload a file to s3 in background using Rails (4.2) with delayed_job gem.
My code is basically what is shown in this post:http://airbladesoftware.com/notes/asynchronous-s3/ (with small changes).
The Photo Model
class Photo < ActiveRecord::Base
  belongs_to :gallery

  has_attached_file :local_photo,
    path: ":rails_root/public/system/:attachment/:id/:style/:basename.:extension",
    url:  "/system/:attachment/:id/:style/:basename.:extension"

  has_attached_file :image,
    styles: {large: '500x500#', medium: '180x180#'},
    storage: :s3,
    s3_credentials: lambda { |attachment| attachment.instance.s3_keys },
    s3_permissions: :private,
    s3_host_name: 's3-sa-east-1.amazonaws.com',
    s3_headers: {'Expires' => 1.year.from_now.httpdate,
                 'Content-Disposition' => 'attachment'},
    path: "images/:id/:style/:filename"

  validates_attachment_content_type :image,
    content_type: [
      "image/jpg",
      "image/jpeg",
      "image/png"]

  def s3_keys
    {
      access_key_id: SECRET["key_id"],
      secret_access_key: SECRET["access_key"],
      bucket: SECRET["bucket"]
    }
  end

  after_save :queue_upload_to_s3

  def queue_upload_to_s3
    Delayed::Job.enqueue ImageJob.new(id) if local_photo? && local_photo_updated_at_changed?
  end

  def upload_to_s3
    self.image = Paperclip.io_adapters.for(local_photo)
    save!
  end
end

class ImageJob < Struct.new(:image_id)
  def perform
    image = Photo.find(image_id)
    image.upload_to_s3
    image.local_photo.destroy
  end
end
With this code, the job (ImageJob) runs in the background (I can see it in delayed_job_web) with no errors, but the file is not uploaded.
If I "disable background" and run it inline instead:
ImageJob.new(id).perform if local_photo? && local_photo_updated_at_changed?
the file is uploaded to Amazon and the local file is deleted too.
Any suggestions?
Thanks in advance.
UPDATE #1 Now I can see the error:
Job failed to load: undefined class/module ImageJob. Handler: "--- !ruby/struct:ImageJob\nimage_id: 412\n"
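That error usually means the delayed_job worker can't deserialize the job because the ImageJob constant isn't loaded in the worker process (for example, when the struct is defined inside the model file). A common remedy, sketched here under that assumption, is to move the job into its own autoloaded file and restart the workers:
# app/jobs/image_job.rb (conventional location; not from the original post)
# Defining the job in its own file under app/ lets the worker resolve the
# ImageJob constant when it loads the serialized handler.
class ImageJob < Struct.new(:image_id)
  def perform
    photo = Photo.find(image_id)
    photo.upload_to_s3
    photo.local_photo.destroy
  end
end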
UPDATE #2 I made this work using the delay method, something like this:
def perform
  if local_photo? && local_photo_updated_at_changed?
    self.delay.move_to_s3
  end
end
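The move_to_s3 method itself isn't shown in the update; a plausible sketch, assuming it mirrors the earlier upload_to_s3 plus the local cleanup (the body is an assumption, not from the original post):
# Hypothetical body for move_to_s3: push the local file to the S3-backed
# attachment, save, then drop the local copy.
def move_to_s3
  self.image = Paperclip.io_adapters.for(local_photo)
  save!
  local_photo.destroy
end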

How can I set paperclip's storage mechanism based on the current Rails environment?

I have a rails application that has multiple models with paperclip attachments that are all uploaded to S3. This app also has a large test suite that is run quite often. The downside with this is that a ton of files are uploaded to our S3 account on every test run, making the test suite run slowly. It also slows down development a bit, and requires you to have an internet connection in order to work on the code.
Is there a reasonable way to set the paperclip storage mechanism based on the Rails environment? Ideally, our test and development environments would use the local filesystem storage, and the production environment would use S3 storage.
I'd also like to extract this logic into a shared module of some kind, since we have several models that will need this behavior. I'd like to avoid a solution like this inside of every model:
### We don't want to do this in our models...
if Rails.env.production?
  has_attached_file :image, :styles => {...},
    :path => "images/:uuid_partition/:uuid/:style.:extension",
    :storage => :s3,
    :url => ':s3_authenticated_url', # generates an expiring url
    :s3_credentials => File.join(Rails.root, 'config', 's3.yml'),
    :s3_permissions => 'private',
    :s3_protocol => 'https'
else
  has_attached_file :image, :styles => {...},
    :storage => :filesystem
  # Default :path and :url should be used for dev/test envs.
end
Update: The sticky part is that the attachment's :path and :url options need to differ depending on which storage system is being used.
Any advice or suggestions would be greatly appreciated! :-)
I like Barry's suggestion better, and there's nothing keeping you from setting the variable to a hash that can then be merged with the Paperclip options.
In config/environments/development.rb and test.rb set something like
PAPERCLIP_STORAGE_OPTIONS = {}
And in config/environments/production.rb
PAPERCLIP_STORAGE_OPTIONS = {:storage => :s3,
                             :s3_credentials => "#{Rails.root}/config/s3.yml",
                             :path => "/:style/:filename"}
Finally in your paperclip model:
has_attached_file :image, {
  :styles => {:thumb => '50x50#', :original => '800x800>'}
}.merge(PAPERCLIP_STORAGE_OPTIONS)
Update: A similar approach was recently implemented in Paperclip for Rails 3.x apps. Environment specific settings can now be set with config.paperclip_defaults = {:storage => :s3, ...}.
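For instance, a minimal sketch of that built-in approach in config/environments/production.rb (the bucket name and credential variables below are placeholders):
config.paperclip_defaults = {
  :storage => :s3,
  :bucket => 'your-production-bucket',
  :s3_credentials => {
    :access_key_id => ENV['S3_ACCESS_KEY_ID'],
    :secret_access_key => ENV['S3_SECRET_ACCESS_KEY']
  }
}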
You can set global default configuration data in the environment-specific configuration files. For example, in config/environments/production.rb:
Paperclip::Attachment.default_options.merge!({
  :storage => :s3,
  :bucket => 'wheresmahbucket',
  :s3_credentials => {
    :access_key_id => ENV['S3_ACCESS_KEY_ID'],
    :secret_access_key => ENV['S3_SECRET_ACCESS_KEY']
  }
})
After playing around with it for a while, I came up with a module that does what I want.
Inside app/models/shared/attachment_helper.rb:
module Shared
  module AttachmentHelper
    def self.included(base)
      base.extend ClassMethods
    end

    module ClassMethods
      def has_attachment(name, options = {})
        # generates a string containing the singular model name and the pluralized attachment name.
        # Examples: "user_avatars" or "asset_uploads" or "message_previews"
        attachment_owner  = self.table_name.singularize
        attachment_folder = "#{attachment_owner}_#{name.to_s.pluralize}"

        # we want to create a path for the upload that looks like:
        # message_previews/00/11/22/001122deadbeef/thumbnail.png
        attachment_path = "#{attachment_folder}/:uuid_partition/:uuid/:style.:extension"

        if Rails.env.production?
          options[:path]           ||= attachment_path
          options[:storage]        ||= :s3
          options[:url]            ||= ':s3_authenticated_url'
          options[:s3_credentials] ||= File.join(Rails.root, 'config', 's3.yml')
          options[:s3_permissions] ||= 'private'
          options[:s3_protocol]    ||= 'https'
        else
          # For local Dev/Test envs, use the default filesystem, but separate the environments
          # into different folders, so you can delete test files without breaking dev files.
          options[:path] ||= ":rails_root/public/system/attachments/#{Rails.env}/#{attachment_path}"
          options[:url]  ||= "/system/attachments/#{Rails.env}/#{attachment_path}"
        end

        # pass things off to paperclip.
        has_attached_file name, options
      end
    end
  end
end
(Note: I'm using some custom Paperclip interpolations above, like :uuid_partition, :uuid and :s3_authenticated_url. You'll need to modify things as needed for your particular application.)
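As an illustration, here is a minimal sketch of how interpolations like :uuid and :uuid_partition could be registered in an initializer, assuming the model exposes a uuid attribute (the attribute name and partitioning scheme are assumptions, not from the original answer):
# config/initializers/paperclip_interpolations.rb (sketch)
# Assumes the model has a `uuid` attribute such as "001122deadbeef".
Paperclip.interpolates :uuid do |attachment, style|
  attachment.instance.uuid
end

# Splits the first six hex characters into "00/11/22" so uploads spread
# across directories: user_avatars/00/11/22/001122deadbeef/...
Paperclip.interpolates :uuid_partition do |attachment, style|
  attachment.instance.uuid[0, 6].scan(/../).join('/')
end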
Now, for every model that has paperclip attachments, you just have to include this shared module, and call the has_attachment method (instead of paperclip's has_attached_file)
An example model file: app/models/user.rb:
class User < ActiveRecord::Base
  include Shared::AttachmentHelper
  has_attachment :avatar, :styles => { :thumbnail => "100x100>" }
end
With this in place, you'll have files saved to the following locations, depending on your environment:
Development:
RAILS_ROOT + public/system/attachments/development/user_avatars/aa/bb/cc/aabbccddeeff/thumbnail.jpg
Test:
RAILS_ROOT + public/system/attachments/test/user_avatars/aa/bb/cc/aabbccddeeff/thumbnail.jpg
Production:
https://s3.amazonaws.com/your-bucket-name/user_avatars/aa/bb/cc/aabbccddeeff/thumbnail.jpg
This does exactly what I'm looking for, hopefully it'll prove useful to someone else too. :)
-John
How about this:
Defaults are established in application.rb. The default storage of :filesystem is used, but the configuration for S3 is initialized.
Production.rb enables :s3 storage and changes the default path.
Application.rb
config.paperclip_defaults = {
  :hash_secret => "LongSecretString",
  :s3_protocol => "https",
  :s3_credentials => "#{Rails.root}/config/aws_config.yml",
  :styles => {
    :original => "1024x1024>",
    :large => "600x600>",
    :medium => "300x300>",
    :thumb => "100x100>"
  }
}
Development.rb (uncomment this to try with s3 in development mode)
# config.paperclip_defaults.merge!({
#   :storage => :s3,
#   :bucket => "mydevelopmentbucket",
#   :path => ":hash.:extension"
# })
Production.rb:
config.paperclip_defaults.merge!({
  :storage => :s3,
  :bucket => "myproductionbucket",
  :path => ":hash.:extension"
})
In your model:
has_attached_file :avatar
Couldn't you just set an environment variable in production/test/development.rb?
PAPERCLIP_STORAGE_MECHANISM = :s3
Then:
has_attached_file :image, :styles => {...},
  :storage => PAPERCLIP_STORAGE_MECHANISM,
  # ...etc...
My solution is the same as @runesoerensen's answer:
I create a module PaperclipStorageOption in config/initializers/paperclip_storage_option.rb
The code is very simple:
module PaperclipStorageOption
  module ClassMethods
    def options
      Rails.env.production? ? production_options : default_options
    end

    private

    def production_options
      {
        storage: :dropbox,
        dropbox_credentials: Rails.root.join("config/dropbox.yml")
      }
    end

    def default_options
      {}
    end
  end

  extend ClassMethods
end
and use it in our model
has_attached_file :avatar, { :styles => { :medium => "1200x800>" } }.merge(PaperclipStorageOption.options)
That's it, hope this helps.
Use the :rails_env interpolation when you define the attachment path:
has_attached_file :attachment, :path => ":rails_root/storage/:rails_env/attachments/:id/:style/:basename.:extension"

Paperclip S3 download remote images

How can I download a remote image (over HTTP; the URL is in the image_remote_url attribute) and save it as an attachment to S3 via Paperclip?
class Product < ActiveRecord::Base
  require 'open-uri'

  attr_accessor :image_remote_url

  has_attached_file :photo,
    :storage => :s3,
    :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
    :path => ":class/:id/:style.:extension",
    :bucket => "my_bucket",
    :styles => {
      :icon => "32x32#",
    }

  def fetch_image
    # how should this method look?
  end
end
How should the method "fetch_image" look ?
Here's a link to a page that explains exactly what you need.
http://trevorturk.wordpress.com/2008/12/11/easy-upload-via-url-with-paperclip/
I've implemented it successfully on my own site.
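In rough terms, that approach downloads the file with open-uri and assigns the resulting IO to the attachment. A hedged sketch of what fetch_image might look like along those lines (the filename handling and error rescue are assumptions, not copied from the linked post):
require 'open-uri'

def fetch_image
  io = open(URI.parse(image_remote_url))
  # Paperclip expects an original_filename on the IO it receives;
  # derive one from the last segment of the URL.
  def io.original_filename; base_uri.path.split('/').last; end
  self.photo = io unless io.original_filename.blank?
rescue OpenURI::HTTPError, SocketError
  # leave photo unset if the remote image can't be fetched
end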
I'm not sure whether this is still useful for you, but in a pull request to Paperclip just a few hours ago, I managed to make this super easy:
def set_photo
  self.photo = URI.parse(self.image_remote_url)
end
This should do the job on Paperclip versions after 3.1.3 (not 3.1.3 itself, but whatever comes after).
