How to upload using CarrierWave an http_basic_authentication protected file - ruby-on-rails

I'm trying to upload a file that is protected under http_basic_authentication with CarrierWave. Here is the tested code:
hs = House.new
hs.remote_house_url = "http://username:password@127.0.0.1:3000/houses/export.csv"
hs.save!
I'm expecting the file to be uploaded, but I get the following:
(13.2ms) BEGIN
(0.8ms) ROLLBACK
ActiveRecord::RecordInvalid: Validation failed: House could not download file: userinfo not supported. [RFC3986]
from /Users/htaidirt/.rvm/gems/ruby-2.1.1/gems/activerecord-4.0.0/lib/active_record/validations.rb:57:in `save!'
I know it's a problem with passing the http_basic_authentication credentials (username & password), given the userinfo not supported message. But what is the right way to do it? Thanks.

I just encountered a similar problem. OpenURI does not allow you to supply basic auth credentials as part of the URL; instead, they should be passed as an option:
open("http://www.your-website.net",
     http_basic_authentication: ["user", "password"])
(which I found here: http://blog.andreamostosi.name/2013/04/open-uri-and-basic-authentication/)
CarrierWave does not seem to support this by default. For now, I have monkey-patched the CarrierWave::Uploader::Download::RemoteFile class to add the required basic auth. I will try to submit a better version of this as a pull request so hopefully it can be added to the gem, but for now I created config/initializers/overrides.rb with the contents:
# add basic auth to carrierwave
module CarrierWave
  module Uploader
    module Download
      class RemoteFile
        private

        def file
          if @file.blank?
            @file = Kernel.open(@uri.to_s, http_basic_authentication: ["USERNAME", "PASSWORD"])
            @file = @file.is_a?(String) ? StringIO.new(@file) : @file
          end
          @file
        rescue Exception => e
          raise CarrierWave::DownloadError, "could not download file: #{e.message}"
        end
      end
    end
  end
end
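With this initializer in place, the download from the question works without putting userinfo in the URL; a minimal sketch, assuming the USERNAME/PASSWORD placeholders above are replaced with real credentials (e.g. read from ENV):
hs = House.new
hs.remote_house_url = "http://127.0.0.1:3000/houses/export.csv"
hs.save! # the patched RemoteFile#file supplies the basic auth credentials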

Related

Testing ActiveStorage attachments (FileNotFound)

I'm getting an error testing the ActiveStorage attachment. The code is something like this:
class AssemblyTest < ActiveSupport::TestCase
  test 'Updating svg attachment should upload the updated file' do
    @assembly = Assembly.create(name: assemblies(:head_gasket).name,
                                image: fixture_file_upload('files/track-bar.svg', 'image/svg+xml'))
    assert @assembly.image.attached?
    assert_not_empty @assembly.image.download
  end
end
I'm getting the following error:
Minitest::UnexpectedError: ActiveStorage::FileNotFoundError: ActiveStorage::FileNotFoundError when @assembly.image.download is called. The attached? assertion passes, but I can't figure out why the download of the file fails. Also, nothing shows up in the tmp/storage directory, where ActiveStorage is configured to store files.
While digging in the ActiveStorage code I found this snippet, which relies on an actual database commit to perform the document upload (or save to disk):
after_commit(on: %i[ create update ]) { attachment_changes.delete(name.to_s).try(:upload) }
Because tests usually run inside a database transaction, the commit never happens and the document is never stored.
To solve this you can trigger the commit callbacks manually:
run_callbacks(:commit)
So in your case this might work:
class AssemblyTest < ActiveSupport::TestCase
  test 'Updating svg attachment should upload the updated file' do
    @assembly = Assembly.create(name: assemblies(:head_gasket).name,
                                image: fixture_file_upload('files/track-bar.svg', 'image/svg+xml'))
    @assembly.run_callbacks(:commit) # Run commit callbacks to store the file on disk
    assert @assembly.image.attached?
    assert_not_empty @assembly.image.download
  end
end
Try this:
@assembly = Assembly.create(name: assemblies(:head_gasket).name)
@assembly.image.attach(io: File.open('/path/to/file'), filename: 'file.name', content_type: 'mime/type')
You can create the blob directly (which is how the direct upload process works) and then attach it so the blob is guaranteed to already be uploaded.
blob = ActiveStorage::Blob.create_and_upload!(
  io: File.open(Rails.root.join("test/fixtures/files/test.csv")),
  filename: "test.csv",
  content_type: "text/csv",
  identify: false
)
@model.file.attach(blob)

Ruby on Rails AWS S3 Download URL

How can I form a url link for a user so that when the user clicks on the link, it forces them to download the AWS S3 object?
I've seen these two solutions: Using send_file to download a file from Amazon S3? and Using send_file to download a file from Amazon S3? However, they seem to reference the old AWS S3 v1 SDK, and there does not seem to be a url_for in the v2 AWS S3 SDK.
Thanks.
Ended up using the following code snippet to solve it. Hope this helps others.
presigner = Aws::S3::Presigner.new
url = presigner.presigned_url(:get_object, # method
  bucket: ENV['S3_BUCKET'],                # name of the bucket
  key: s3_key,                             # key name
  expires_in: 7.days.to_i,                 # time should be in seconds
  response_content_disposition: "attachment; filename=\"#{filename}\""
).to_s
Here's what I got:
def user_download_url(s3_filename, download_filename = nil)
  s3_filename = s3_filename.to_s # converts pathnames to string
  download_filename ||= s3_filename.split('/').last
  url_options = {
    expires_in: 60.minutes.to_i, # the presigner expects seconds as an Integer
    response_content_disposition: "attachment; filename=\"#{download_filename}\""
  }
  object = bucket.object(s3_filename)
  object.exists? ? object.presigned_url(:get, url_options).to_s : nil
end

def bucket
  @bucket ||= Aws::S3::Resource.new(region: ENV['AWS_REGION']).bucket(ENV['AWS_S3_BUCKET'])
end
To create a link for downloading, simply put redirect_to user_download_url(s3_file_path) in a controller action, and create a link to that controller action.
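For example, a minimal controller sketch (the DownloadsController name and the s3_path parameter are illustrative, not from the original answer):
class DownloadsController < ApplicationController
  def show
    url = user_download_url(params[:s3_path])
    if url
      redirect_to url # the browser follows the presigned URL and downloads the file
    else
      head :not_found # the object does not exist in the bucket
    end
  end
end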

Ruby on Rails - How to Save Remote File over HTTPS and Basic Authentication

I am using a supplier's API, and the response they send to our server includes a URL to a file; when I try to save this file locally, I fail miserably.
def self.create_file_new(filename, ext, url)
  require 'open-uri'
  file = Tempfile.new(filename + ext)
  file.binmode
  # data = open(url).read
  # data = open(url, :http_basic_authentication => [username, password])
  file << open(url, :http_basic_authentication => [username, password]).read
  # file.write CGI::unescape(data)
  file.close
  file = File.open(file.path)
  return file
end
I was originally getting an OpenURI::HTTPError (401 Unauthorized), but I have since created a file named bypass_ssl_verification_for_open_uri in app/initializers containing the following:
# Make open-uri work with https
OpenSSL::SSL::VERIFY_PEER = OpenSSL::SSL::VERIFY_NONE
which I found whilst Googling on how to fix it.
I then started to get this error message: NoMethodError (undefined method 'tr' for #<StringIO:0xb5b728c4>). I tried creating another file (cgi_escape_fix.rb in app/initializers) containing this:
require 'cgi'

class << CGI
  alias_method :orig_escape, :escape
  def escape(str)
    orig_escape(str.to_str)
  end
end
I also found this on my Google travels, but it doesn't seem to have solved anything, so I commented out the file.write CGI::unescape(data) line to try a different way; still no joy.
Now in the log I am just getting a plain 500 Internal Server Error with no useful information.
The file I'm attempting to save will always be a PDF.
Ruby 1.8.7
Rails 2.3.14
Got it to work with the following (two new initializer scripts removed):
file = Tempfile.new(filename + ext)
file.binmode
file << open(url, :http_basic_authentication => [username, password]).read
file.close
file = File.open(file.path)
return file
I should also mention that this is being passed to the attachment_fu plugin, in case anyone else has problems with it.
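Put back into the original helper, the working version would look something like this (a sketch; passing username and password in as extra arguments is my assumption, since the original method signature doesn't include them):
def self.create_file_new(filename, ext, url, username, password)
  require 'open-uri'
  file = Tempfile.new(filename + ext)
  file.binmode
  file << open(url, :http_basic_authentication => [username, password]).read
  file.close
  File.open(file.path) # reopen so attachment_fu receives a plain File
end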

Migrating paperclip S3 images to new url/path format

Is there a recommended technique for migrating a large set of paperclip S3 images to a new :url and :path format?
The reason for this is that after upgrading to Rails 3.1, new versions of thumbs are not shown after cropping (the previously cached version is shown instead). This is because the filename no longer changes (since asset_timestamp was removed in Rails 3.1). I'm using :fingerprint in the url/path format, but this is generated from the original, which doesn't change when cropping.
I was intending to insert :updated_at in the url/path format and update attachment.updated_at during cropping, but after implementing that change all existing images would need to be moved to their new location. That's around half a million images to rename on S3.
At this point I'm considering copying them to their new location first, then deploying the code change, then moving any images which were missed (i.e. uploaded after the copy), but I'm hoping there's an easier way... any suggestions?
I had to change my paperclip path in order to support image cropping, I ended up creating a rake task to help out.
namespace :paperclip_migration do
  desc 'Migrate data'
  task :migrate_s3 => :environment do
    # Make sure that all of the models have been loaded so any attachments are registered
    puts 'Loading models...'
    Dir[Rails.root.join('app', 'models', '**/*')].each { |file| File.basename(file, '.rb').camelize.constantize }

    # Iterate through all of the registered attachments
    puts 'Migrating attachments...'
    attachment_registry.each_definition do |klass, name, options|
      puts "Migrating #{klass}: #{name}"
      klass.find_each(batch_size: 100) do |instance|
        attachment = instance.send(name)
        unless attachment.blank?
          attachment.styles.each do |style_name, style|
            old_path = interpolator.interpolate(old_path_option, attachment, style_name)
            new_path = interpolator.interpolate(new_path_option, attachment, style_name)
            # puts "#{style_name}:\n\told: #{old_path}\n\tnew: #{new_path}"
            s3_copy(s3_bucket, old_path, new_path)
          end
        end
      end
    end

    puts 'Completed migration.'
  end

  #############################################################################

  private

  # Paperclip Configuration
  def attachment_registry
    Paperclip::AttachmentRegistry
  end

  def s3_bucket
    ENV['S3_BUCKET']
  end

  def old_path_option
    ':class/:id_partition/:attachment/:hash.:extension'
  end

  def new_path_option
    ':class/:attachment/:id_partition/:style/:filename'
  end

  def interpolator
    Paperclip::Interpolations
  end

  # S3
  def s3
    AWS::S3.new(access_key_id: ENV['S3_KEY'], secret_access_key: ENV['S3_SECRET'])
  end

  def s3_copy(bucket, source, destination)
    source_object = s3.buckets[bucket].objects[source]
    destination_object = source_object.copy_to(destination, {metadata: source_object.metadata.to_h})
    destination_object.acl = source_object.acl
    puts "Copied #{source}"
  rescue Exception => e
    puts "*Unable to copy #{source} - #{e.message}"
  end
end
I didn't find a feasible method for migrating to a new URL format. I ended up overriding Paperclip::Attachment#generate_fingerprint so it appends :updated_at.
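A minimal sketch of that override, e.g. in a config/initializers file (the exact body is my assumption; the answer only names the approach):
module Paperclip
  class Attachment
    alias_method :orig_generate_fingerprint, :generate_fingerprint

    # Append updated_at so cropping (which touches updated_at) changes the
    # fingerprint, giving the style a new URL and busting stale caches
    def generate_fingerprint(source)
      "#{orig_generate_fingerprint(source)}#{instance_read(:updated_at).to_i}"
    end
  end
end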

Uploading files from local machine to S3 using CarrierWave in Rails

I am trying to upload files from my local machine to Amazon S3 using CarrierWave. Actually, I want to write a migration for the above operations: I need to move the images that are stored locally to Amazon. Can anybody tell me how I should perform these operations using CarrierWave's methods?
By the way, I am also using carrierwave_direct on top of CarrierWave, but I don't think that affects my storage methods.
I executed uploader.store!('/local/path/to/file') but it fails with the following error:
You tried to assign a String or a Pathname to an uploader, for
security reasons, this is not allowed.
Is there any other way I can pass the path info to the method?
I also tried executing:
new_file.asset = File.open('full/path') # asset is where my uploader is mounted
In this case, when I try new_file.save!, it saves successfully, but when I try to get the URL via new_file.asset.url it comes back empty. I don't know why.
Here's my uploader:
module DirectUploader
  extend ActiveSupport::Concern

  included do
    include CarrierWave::MimeTypes
    include CarrierWave::MiniMagick
    include CarrierWaveDirect::Uploader
    include ActiveModel::Conversion
    extend ActiveModel::Naming

    process :set_content_type
  end

  module InstanceMethods
    def store_dir
      "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
    end

    # override the url to return absolute url if available and
    # revert back to standard functionality if it is not available
    def url
      if model.absolute_url.nil?
        super
      else
        model.absolute_url
      end
    end

    def filename
      @random = Digest::MD5.hexdigest(model.latest_time.to_s)
      "#{@random}.#{File.extname(original_filename)}" if original_filename
    end

    def policy_doc(options = {})
      options[:expiration] ||= self.class.upload_expiration
      options[:max_file_size] ||= self.class.max_file_size
      doc = {
        'expiration' => Time.now.utc + options[:expiration],
        'conditions' => [
          ["starts-with", "$utf8", ""],
          ["starts-with", "$authenticity_token", ""],
          ["starts-with", "$key", store_dir],
          {"bucket" => fog_directory},
          {"acl" => acl},
          ["content-length-range", 1, options[:max_file_size]]
        ]
      }
      doc['conditions'] << {"success_action_redirect" => success_action_redirect} if success_action_redirect
      doc
    end

    def policy(options = {})
      Base64.encode64(policy_doc(options).to_json).gsub("\n", "")
    end
  end
end
And there is no problem with the CarrierWave configuration, because I can upload files using an HTML form. It's just that I am running into problems during the migration.
Have you tried:
uploader.store!(File.new('/local/path/to/file'))
When I'm running tests, I use:
uploader.store! File.open(Rails.root.join("spec/support/file.png"))
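Putting the answers together, a sketch of a data migration that re-uploads locally stored files through the mounted uploader (the NewFile model name, the asset mount point, and the public/uploads layout are assumptions based on the store_dir above, not details from the original question):
class MoveLocalAssetsToS3 < ActiveRecord::Migration
  def up
    NewFile.find_each do |new_file|
      # Rebuild the old local path from the uploader's store_dir and the
      # stored identifier column
      local_path = Rails.root.join('public', new_file.asset.store_dir, new_file[:asset].to_s)
      next unless File.exist?(local_path)
      File.open(local_path) do |f|
        new_file.asset = f # assign a File object, not a String/Pathname
        new_file.save!     # the uploader stores the file to S3 on save
      end
    end
  end
end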
