Rails + Paperclip + Rackspace CloudFiles with the Private CDN - ruby-on-rails

I have a Rails application that uses Paperclip to handle uploaded files and we are currently hosted by Rackspace.
The application is currently hosted on a single server and I am building out a more scalable solution with load balancers, application servers, and a separate database server. The last thing I need is a solution for the uploaded assets. I have tried to use Rackspace's CloudFiles, but it seems the only way to use Paperclip with CloudFiles is to put the files on the public CDN, which I can't use because a user needs to be authenticated to access the files. Before I turn to Amazon S3, since it offers temporary URLs, does anyone know how to use CloudFiles with Paperclip while still requiring authentication to access the files?
Any help, tips, google searches, links, or solutions would be greatly appreciated.

As it happens, Cloud Files also supports the generation of temporary URLs, and it appears that Paperclip does allow you to make use of it. Just generate the URL from your Attachment with #expiring_url instead of #url in your views:
= image_tag @organization.logo.expiring_url(Time.now.to_i + 100, :original).gsub(/^http:/, "https:")
Paperclip will only generate http URLs, but since Rackspace's temporary URLs don't include the scheme in their checksums, you can use a gsub call to turn it into an https URL. Also, notice that the first argument to #expiring_url is an absolute timestamp (in seconds since the epoch).
Expiring URLs for Rackspace only made it into fog somewhat recently -- v1.18.0 -- so if you're using an older version, you may need to upgrade fog to take advantage of them:
bundle update fog
Paperclip also supports generating obfuscated URLs, which look interesting, but they would be less secure, since the server never expires them.
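For completeness, expiring URLs assume the attachment is stored via fog against a private (non-CDN-enabled) container. A minimal sketch of that model setup, assuming the credentials live in ENV; the container name here is hypothetical:
class Organization < ActiveRecord::Base
  # Store the logo in a private Cloud Files container through fog,
  # so it is only reachable through expiring URLs.
  has_attached_file :logo,
    :storage => :fog,
    :fog_credentials => {
      :provider           => 'Rackspace',
      :rackspace_username => ENV['FOG_USERNAME'],
      :rackspace_api_key  => ENV['FOG_API_KEY'],
      :rackspace_region   => ENV['RACKSPACE_REGION'].to_sym
    },
    :fog_directory => 'private-assets',  # hypothetical container name
    :fog_public    => false
end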

Expiring URLs also require an account-level temp URL key (X-Account-Meta-Temp-Url-Key) to be set on your Cloud Files account. You can add the key like this:
class Rackspace
  def self.add_temp_url_key
    require 'fog'
    puts "Creating Storage Service"
    begin
      service = Fog::Storage.new(
        :provider           => 'rackspace',
        :rackspace_username => ENV['FOG_USERNAME'],
        :rackspace_api_key  => ENV['FOG_API_KEY'],
        :rackspace_region   => ENV['RACKSPACE_REGION'].to_sym
      )
      service.post_set_meta_temp_url_key(ENV['RACKSPACE_TEMP_URL_KEY'])
      puts "X-Account-Meta-Temp-Url-Key successfully set to #{ENV['RACKSPACE_TEMP_URL_KEY']}"
    rescue => e
      puts "Unable to set X-Account-Meta-Temp-Url-Key - #{e.inspect}"
      puts e.backtrace
    end
  end
end
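A sketch of how you might invoke this once, for example from a rake task (the task name and file location are assumptions, not from the original post):
# lib/tasks/rackspace.rake
namespace :rackspace do
  desc "Set the account-level X-Account-Meta-Temp-Url-Key used for expiring URLs"
  task :set_temp_url_key => :environment do
    Rackspace.add_temp_url_key
  end
end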

Related

Upload image via the Facebook Marketing API through the browser

I am trying to use the Facebook marketing API SDK to upload images to Facebook.
This is the sdk
I want the user to be able to click to select a file from the browser and run the upload via Rails and the SDK.
Basically, here is the flow I am trying to implement.
The user selects a file.
The user clicks upload.
The backend controller processes the request and uploads the image to Facebook via the API.
However, the issue I am running into is that, for security reasons, browsers do not have access to the file path, which the Facebook SDK asks for:
ad_account.adimages.create({
  'logo1.png' => File.open('./assets/logo1.jpg')
})
If I use the ActionDispatch::Http::UploadedFile that is built into Rails, or CarrierWave, I get access to the tempfile, which has a name similar to RackMultipart20170803-89798-1e9hr.
If I try to upload that to Facebook, I get an error saying
API does not accept files of this type
Does anyone have an idea what the best option is? The only thing I can think of is to upload the file to a host like Cloudinary, get the URL from that, and then upload to Facebook via the Cloudinary URL.
You are right, a possible solution for your case is using Cloudinary.
Cloudinary's Ruby integration library is available as an open-source Ruby GEM.
To install the Cloudinary Ruby GEM, run:
gem install cloudinary
If you use Rails 3.x or higher, edit your Gemfile, add the following line and run bundle.
gem 'cloudinary'
Your cloud_name account parameter is required to build URLs for your media assets. api_key and api_secret are further needed to perform secure API calls to Cloudinary.
Setting the configuration parameters can be done either programmatically in each call to a Cloudinary method or globally using a cloudinary.yml configuration file, located under the config directory of your Rails project.
Here's an example of a cloudinary.yml file:
production:
  cloud_name: "sample"
  api_key: "874837483274837"
  api_secret: "a676b67565c6767a6767d6767f676fe1"
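Alternatively, a sketch of the programmatic equivalent (same sample credentials as in the YAML above), for example in an initializer:
# config/initializers/cloudinary.rb
Cloudinary.config do |config|
  config.cloud_name = "sample"
  config.api_key    = "874837483274837"
  config.api_secret = "a676b67565c6767a6767d6767f676fe1"
end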
Uploading directly from the browser is done using Cloudinary's jQuery plugin
http://cloudinary.com/documentation/jquery_integration
To ensure that all uploads were authorized by your application, a secure signature must first be generated in your server-side Rails code.
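As a rough sketch of what the signing step can look like with the Ruby GEM (the exact parameters you sign depend on the upload options you use; the gem's view helpers such as cl_image_upload_tag can also generate the signed form fields for you):
# Server side: build signed parameters that the browser-side jQuery plugin
# can POST directly to Cloudinary.
timestamp      = Time.now.to_i
params_to_sign = { :timestamp => timestamp }  # plus any other upload options
signature      = Cloudinary::Utils.api_sign_request(params_to_sign, Cloudinary.config.api_secret)

upload_params = params_to_sign.merge(
  :signature => signature,
  :api_key   => Cloudinary.config.api_key
)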
Full disclosure: I work as a software engineer at Cloudinary.
A solution I found is to create a duplicate copy of the uploaded file in the public folder and then process it from there:
uploaded_file = params["file"]
file_name     = uploaded_file.original_filename
file_path     = File.expand_path(uploaded_file.tempfile.path)

# FILE_PATH is the destination directory, e.g. somewhere under public/
file = File.open("#{FILE_PATH}/#{file_name}", "wb")
file.write uploaded_file.tempfile.read
file.close
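Putting it together, a rough sketch of a controller action doing this (the upload path here is an assumption, and ad_account comes from the Facebook SDK exactly as in the question):
def upload_to_facebook
  uploaded_file = params["file"]
  # Copy the Rack tempfile to a path that keeps the original filename and
  # extension, since the SDK rejects the extension-less RackMultipart file.
  local_path = Rails.root.join("public", "uploads", uploaded_file.original_filename)
  File.binwrite(local_path, uploaded_file.tempfile.read)

  ad_account.adimages.create({
    uploaded_file.original_filename => File.open(local_path)
  })
ensure
  File.delete(local_path) if local_path && File.exist?(local_path)
end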

Managing access to a file with a direct link

I have a Rails app where I upload files. I'm using the cancan gem for managing access to files. The files are stored on disk. Is it possible to manage access to a file, or restrict/allow it, even when a user has a direct link to it? Note that I'm not using nginx or apache; it's a local application, so at best it's unicorn or simply the standard Rails web server.
Yes, it's possible. Move your files out of the public folder, which is served directly by your web server, to some other folder (private, for example) and use send_file in a controller to transmit the data. Some pseudo-code for the controller:
def get_file
  absolute_path = Rails.root + 'private' + params[:filepath]
  if current_user.allowed_to_download?(params[:filepath]) && File.exist?(absolute_path)
    send_file absolute_path, status: 200 # ok
  else
    render status: 403 # forbidden
  end
end
You will have to set up a route for this action (a sketch follows below), but basically that's all.
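A minimal sketch of such a route (the controller and path names are assumptions); the wildcard segment lets the filepath contain slashes, and format: false stops Rails from splitting off the file extension:
# config/routes.rb
get 'private_files/*filepath', :to => 'files#get_file', :format => false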

How can I migrate CarrierWave files to a new storage mechanism?

I have a Ruby on Rails site with models using CarrierWave for file handling, currently using local storage. I want to start using cloud storage and I need to migrate existing local files to the cloud. I am wondering if anyone can point out a method for doing this?
Bonus points for using a model attribute that would allow me to do this row-by-row in the background without interrupting my site for extended downtime (in other words, some model rows would still have local storage while others used cloud storage).
My first instinct is to create a new uploader for each model that uses cloud storage, so I have two uploaders on each model, then transferring the files from one to the other, setting an attribute to indicate which file should be used until they are all transferred, then removing the old uploader. That seems a little excessive.
Minimal to Possibly Zero Downtime Procedure
In my opinion, the easiest and fastest way to accomplish what you want with almost no downtime is this (I will assume you will use the AWS cloud, but a similar procedure is applicable to any cloud service):
1. Figure out and set up your assets bucket, bucket policies, etc. for making the assets publicly accessible.
2. Using s3cmd (a command line tool for interacting with S3) or a GUI app, copy the entire assets folder from the file system to the appropriate folder in S3.
3. In your app, set up CarrierWave and update your models/uploaders for :fog storage.
4. Do not restart your application yet. Instead, bring up a rails console and, for your models, check that the new asset URL is correct and accessible as planned. For example, for a video model with a picture asset, you can check this way:
   Video.first.picture.url
   This will give you a full cloud URL based on the updated settings. Copy the URL and paste it in a browser to make sure that you can get to it fine.
5. If this works for at least one instance of each model that has assets, you are good to restart your application. Upon restart, all your assets are served from the cloud, and you didn't need any migrations or multiple uploaders in your models.
6. (Based on a comment by @Frederick Cheung): Using s3cmd (or something similar), rsync or sync the assets folder from the filesystem to S3 to account for assets that were uploaded between steps 2 and 5, if any.
PS: If you need help setting up carrierwave for cloud storage, let me know.
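For reference, a minimal sketch of the step-3 CarrierWave :fog setup (the bucket name and ENV variable names are assumptions):
# config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.storage = :fog
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY'],
    :region                => ENV['AWS_REGION']
  }
  config.fog_directory = 'my-assets-bucket'   # hypothetical bucket name
  config.fog_public    = true                 # assets are publicly accessible
end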
I'd try the following steps:
Change the storage in the uploaders to :fog or whatever you want to use.
Write a migration like rails g migration MigrateFiles to let carrierwave get the current files, process them and upload them to the cloud.
If your model looks like this:
class Video
  mount_uploader :attachment, VideoUploader
end
The migration would look like this:
@videos = Video.all
@videos.each do |video|
  video.remote_attachment_url = video.attachment_url
  video.save
end
If you execute this migration the following should happen:
Carrierwave downloads each image because you specified a remote URL for the attachment (the current location, like http://test.com/images/1.jpg) and saves it to the cloud because you changed that in the uploader.
Edit:
As San pointed out, this will not work directly, so you should maybe create an extra column first, run a migration to copy the current attachment URLs from all the videos into that column, change the uploader after that, and run the above migration using the copied URLs in that new column (see the sketch below). With another migration, just delete the column again. Not that clean and easy, but done in a few minutes.
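A rough sketch of that two-step approach (the column name is an assumption, and the snapshotted value may need the host prepended to be a downloadable URL):
# Step 1: while the uploader still uses local storage, snapshot the old URLs.
class AddLegacyAttachmentUrlToVideos < ActiveRecord::Migration
  def up
    add_column :videos, :legacy_attachment_url, :string
    Video.reset_column_information
    Video.find_each do |video|
      video.update_column(:legacy_attachment_url, video.attachment_url)
    end
  end
end

# Step 2: after switching the uploader to :fog, re-upload from the snapshot.
Video.find_each do |video|
  video.remote_attachment_url = video.legacy_attachment_url
  video.save
end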
When we use Heroku, most people suggest using Cloudinary: free and with a simple setup.
My case was that we were using the Cloudinary service and needed to move to AWS S3 for some reasons.
This is what I did with the uploader:
class AvatarUploader < CarrierWave::Uploader::Base
  def self.set_storage
    if ENV['UPLOADER_SERVICE'] == 'aws'
      :fog
    else
      nil
    end
  end

  if ENV['UPLOADER_SERVICE'] == 'aws'
    include CarrierWave::MiniMagick
  else
    include Cloudinary::CarrierWave
  end

  storage set_storage
end
Also, set up the rake task:
# Depends on :environment so the Rails models are loaded.
task :migrate_cloudinary_to_aws => :environment do
  profile_image_old_url = []
  Profile.where("picture IS NOT NULL").each do |profile_image|
    profile_image_old_url << profile_image
  end

  ENV['UPLOADER_SERVICE'] = 'aws'
  load("#{Rails.root}/app/uploaders/avatar_uploader.rb")

  Profile.where("picture IS NOT NULL OR cover IS NOT NULL").each do |profile_image|
    old_profile_image = profile_image_old_url.detect { |image| image.id == profile_image.id }
    profile_image.remote_picture_url = old_profile_image.picture.url
    profile_image.save
  end
end
The trick is switching the uploader provider via an environment variable. Good luck!
I have migrated CarrierWave files to Amazon S3 with s3cmd and it works.
Here are the steps to follow:
Change the storage kind of the uploader to fog.
Create a bucket on Amazon S3 if you don't already have one.
Install s3cmd on the remote server: sudo apt-get install s3cmd
Configure s3cmd: s3cmd --configure.
You will need to enter the access key and secret key here, provided by Amazon.
Sync the files with this command: s3cmd sync /path_to_your_files s3://bucket_name/
Set the --acl-public flag to upload the files as public and avoid permission issues.
Restart your server
Notes:
sync will not duplicate your files; it first checks whether each file is already present on the remote server.

Rails serving large files

I'm developing an application serving large videos only to logged-in users.
To keep these videos private, I put them in a private folder inside the Rails project and let Rails serve them, instead of using the public folder and filtering requests in Apache (to avoid direct linking to them).
My action in the controller looks like this:
def video
  respond_to do |format|
    format.mp4 {
      send_file File.join([Rails.root, "private/videos", @lesson.link_video1 + ".mp4"]),
        :disposition => :inline, :stream => true
    }
  end
end
Everything works perfectly, but only with small files; as soon as I try with real files I receive the error:
NoMemoryError (failed to allocate memory)
I read somewhere that it is not good practice to use send_file for large files, but with the other approach, letting Apache serve the files, I had an issue serving files to mobile Apple devices, as they do not send the HTTP_REFERER.
Do you have any idea how small this memory limit is?
My videos are from 400 MB to 2 GB (I am trying to reduce them).
The only question I found here is without an answer: serving large media files from the assets folder in rails
I managed to activate X-Sendfile on Apache instead of letting Rails serve the large files. Working with Capistrano, I found a good solution; how to do it is explained here: Capistrano & X-Sendfile
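For reference, the Rails side of that setup is a one-line sketch; with Apache's mod_xsendfile enabled, send_file then only emits an X-Sendfile header and Apache streams the file itself:
# config/environments/production.rb
# Hand file delivery off to Apache (mod_xsendfile) so large videos
# never have to be buffered by the Rails process.
config.action_dispatch.x_sendfile_header = "X-Sendfile"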

Change permissions for files in S3 using Dragonfly

In my Rails project I use Dragonfly for uploading files and store them in S3.
Initially I pass {'x-amz-acl' => 'private'} for uploaded files and use private URLs with an expiration date.
Is there an easy way to change it to 'public-read' after the file has been uploaded to S3?
I use the aws/s3 gem. Handling permissions can be done with something like this:
S3Object.store(
  'kiss.jpg',
  data,
  'marcel',
  :access => :public_read
)
In your case, you would use S3Object.find and then change the policy. The gem is documented here.
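A hedged sketch of that, based on the aws/s3 gem's documented ACL interface (bucket and object names taken from the example above):
# Fetch the object's current ACL policy, add a public-read grant,
# then write the policy back.
policy = S3Object.acl('kiss.jpg', 'marcel')
policy.grants << ACL::Grant.grant(:public_read)
S3Object.acl('kiss.jpg', 'marcel', policy)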
