Defining where Paperclip stores the attachment locally when running spec tests

I'm using Amazon S3 to store Paperclip attachments in all non-test environments.
For tests specifically, I use a local path/url setup to avoid interacting with S3 remotely:
Paperclip::Attachment.default_options[:path] =
  ":rails_root/public/system/:rails_env/:class/:attachment/:id_partition/:filename"
Paperclip::Attachment.default_options[:url] =
  "/system/:rails_env/:class/:attachment/:id_partition/:filename"
I define my attachment as follows in the model:
has_attached_file :source_file, use_timestamp: false
In my production code I need to access the file using Model.source_file.url, because .url returns the fully qualified remote Amazon S3 path to the file. This works fine in non-test environments.
However, in my test environment I can't use .url, because Paperclip creates and stores the file under the location defined by :path above, so I need to use .path instead. If I use .url I get this error:
Errno::ENOENT:
No such file or directory # rb_sysopen - /system/test/imports/source_files/000/000/030/sample.txt
which makes sense, because Paperclip didn't store the file there...
How do I get Paperclip in my test environment to store/create the file under the :url path so I can use .url correctly?
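For context, with the configuration above the two methods resolve roughly like this in the test environment (values illustrative, extrapolated from the error above):
import.source_file.url   # => "/system/test/imports/source_files/000/000/030/sample.txt"
import.source_file.path  # => "#{Rails.root}/public/system/test/imports/source_files/000/000/030/sample.txt"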
Edit: If it helps, in test I create the attachment from a locally stored fixture file:
factory :import do
  source_file { File.new(Rails.root + "spec/fixtures/files/sample.tsv") }
end
Edit 2: Setting :path and :url to the same path in the initializer might seem like a quick fix, but I'm working on a larger app with several contributors, so I don't have the luxury of doing that or breaking anyone else's specs. Plus, it looks like Thoughtbot themselves recommend this setup, so there should be a "proper" way to get it working as is.
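For reference, that quick fix would amount to something like this in the Paperclip initializer (shown only for illustration; it's the global change I'm trying to avoid):
Paperclip::Attachment.default_options[:url] =
  Paperclip::Attachment.default_options[:path]
With that, .url returns the same filesystem location as .path, which is exactly the kind of change that would surprise other contributors' specs.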
Thanks!

Have you tried using s3proxy in your test environment to simulate S3, instead of having Paperclip write directly to local files?

Related

How to add a host to a Carrierwave url?

I'm using Carrierwave on a Document model.
class Document
  mount_uploader :file, DocumentUploader
end
and am trying to send an email with the document as an attachment:
class DocumentMailer
  def distribute(recipient, document)
    filename = document.file.file.original_filename
    attachments[filename] = File.read(document.file.url)
    mail(
      to: recipient.email,
      subject: "Document attached"
    )
  end
end
In tests, the Mailer raises an error:
Errno::ENOENT:
No such file or directory # rb_sysopen - /uploads/document/file/2/my_attachment.jpg
I can resolve this error in the test suite by calling path instead of url in DocumentMailer, which returns the full filesystem path:
attachments[ filename ] = File.read(document.file.path)
# /Users/AHH/code/myapp/tmp/uploads/document/file/2/my_attachment.jpg
However, this causes the method to fail in production. Carrierwave uses fog to store files on S3, so I need the full url to assign an attachment in DocumentMailer.
Why do the tests fail when using file.url? I assume it's because the url has no host. So how do I ensure that Carrierwave applies a host to file.url in the test environment?
This is a side effect of how files are stored in development/test versus production: in production, url is actually a URL, probably pointing at an S3 host, while locally it's a path to a file, relative to wherever you've told CarrierWave to store uploads. Unfortunately, CarrierWave doesn't seem to have a graceful way of handling this, at least as far as I've seen, so I ended up doing something like this in our spec helper:
config.before do
  allow_any_instance_of(DocumentUploader).to receive(:url) do |uploader|
    uploader.path
  end
end

Stubbing Paperclip downloads from S3 in RSpec

I am using Paperclip/RSpec, and Stack Overflow has helped me successfully stub file uploads to S3 using this code:
spec/rails_helper.rb
config.before(:each) do
  allow_any_instance_of(Paperclip::Attachment).to receive(:save).and_return(true)
end
This is working great.
On my model I have two Paperclip fields:
class MyModel < ActiveRecord::Base
  has_attached_file :pdf
  has_attached_file :resource
end
My code uses the #copy_to_local_file method to retrieve a file from S3.
#copy_to_local_file takes two params: the style (:original, :thumbnail, etc.) and the local file path to copy to.
Example:
MyModel.resource.copy_to_local_file(:original, local_file.path)
When the system under test tries to access MyModel#pdf#copy_to_local_file or MyModel#resource#copy_to_local_file, I originally got errors like the following:
No Such Key - cannot copy /email_receipts/pdfs/000/000/001/original/email_receipt.eml.pdf to local file /var/folders/4p/1mm86g0n58x7d9rvpy88_s9h0000gn/T/receipt20150917-4906-13evk95.pdf
No Such Key - cannot copy /email_receipts/resources/000/000/001/original/email_receipt.eml to local file /var/folders/4p/1mm86g0n58x7d9rvpy88_s9h0000gn/T/resource20150917-4906-1ysbwr3.eml
I realize these errors were happening because uploads to S3 are stubbed, so when it encounters MyModel#pdf#copy_to_local_file or MyModel#resource#copy_to_local_file it tries to grab a file in S3 that isn't there.
Current Solution:
I've managed to quash the errors above, but I feel it's not a complete solution and gives my tests a false sense of security. My half-solution is to stub this method in the following way:
spec/rails_helper.rb
before(:each) do
allow_any_instance_of(Paperclip::Storage::S3).to receive(:copy_to_local_file)
end
While this does stub out the #copy_to_local_file method and removes the errors, it doesn't actually write any content to the local file that is provided as the second argument to #copy_to_local_file, so it doesn't quite simulate the file being downloaded from S3.
Question:
Is there a way to stub #copy_to_local_file AND have it write the contents of a canned file in my spec/factories/files directory to the local file (its second argument)?
Or am I overthinking this? Is this something I shouldn't be worrying about?
You don't need to worry about whether the 'downloaded' files actually exist in your tests. You've decided to stub out Paperclip, so do it completely, by stubbing out both #save and #copy_to_local_file. You may also need to stub out reads of the downloaded files from the filesystem.
All this stubbing raises the possibility of integration errors, so you should probably also write a feature spec (using a headless browser driver like Poltergeist) that actually uploads and downloads something and reads it from the filesystem.
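A minimal sketch of such a feature spec, assuming Capybara with a JS driver; the route, form label, and fixture path are all hypothetical:
RSpec.feature "Attachment round-trip", js: true do
  scenario "uploads a file and reads it back" do
    visit new_my_model_path
    attach_file "Pdf", "#{Rails.root}/spec/fixtures/files/sample.pdf"
    click_button "Save"
    # assert on whatever the app renders for the stored attachment
    expect(page).to have_content "sample.pdf"
  end
end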
That said, you can do anything you want in an RSpec stub by passing it a block:
allow_any_instance_of(Paperclip::Storage::S3).to receive(:copy_to_local_file) do |s3, style, local_dest_path|
  # note: with allow_any_instance_of, the first block argument is the receiving instance
  # write a file here, or do anything you like
end
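To answer the question directly: the block can copy a canned fixture into the destination path the caller passed in, which simulates the download. A sketch, assuming a fixture in the spec/factories/files directory mentioned in the question (the filename is hypothetical):
allow_any_instance_of(Paperclip::Storage::S3).to receive(:copy_to_local_file) do |s3, style, local_dest_path|
  FileUtils.cp(Rails.root.join("spec/factories/files/email_receipt.eml.pdf"), local_dest_path)
end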

Carrierwave can't find file when testing with Capybara

I'm trying to write an integration test that involves a file uploaded with Carrierwave. I have the following configuration:
CarrierWave.configure do |config|
  if Rails.env.test?
    config.storage = :file
    config.enable_processing = false
  else
    # other configs
  end
end
And my uploader has the store path set to:
def store_dir
  "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
end
The outcome of this is that the file gets stored in public/uploads. But when I try to access this file via the doc.url method on my model, it returns a path like /uploads/..., and when running my integration specs I get the following error:
unable to open file `/uploads/doc/paper/1/img_1.png'
I can confirm that the file does exist in the public/uploads directory. How do I get Carrierwave to return the correct path to the uploaded image? I could patch it together with Rails.root.join('public', doc.url), but that would break in production, where we're uploading to S3.
After investigating some more, I realized that the problem is that we are trying to access the file server side. Client side everything works fine, because the relative path is resolved by Rails as an asset path. Server side, Rails doesn't know where to look. In dev and production we use S3, so the url is the same either way. It feels like a bit of a hack, but where we need to access the image on the server, we did this:
path = Rails.env.test? ? doc.img.path : doc.url
file = File.open(path)
I wasn't able to find an environment agnostic way to handle this.
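If you go this route, one small tidy-up is to hide the branch behind a single method so callers don't repeat the environment check. A sketch, on the model, with a hypothetical method name:
def readable_path
  # hypothetical helper: filesystem path in test, remote url elsewhere
  Rails.env.test? ? img.path : url
end
file = File.open(doc.readable_path)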

Rails Functional Test Case and Uploading Files to ActionDispatch::Http::UploadedFile

I am adding tests to a Rails app that stores files remotely. I'm using the default Rails functional tests. How can I add file uploads to them? I have:
test "create valid person" do
post(:create, :person => { :avatar => fixture_file_upload('avatar.jpeg') })
end
This for some reason uploads a Tempfile and causes the AWS/S3 gem to fail with:
NoMethodError: undefined method `bytesize' for Tempfile
Is there any way I can get the test to use an ActionDispatch::Http::UploadedFile and behave more like it does when testing with the web browser? Is fixture_file_upload the right way to test uploading files to a controller? If so, why doesn't it work like the browser?
As a note, I really don't want to switch testing frameworks. Thanks!
I use the s3 gem instead of the aws/s3 gem. The main reasons are that aws/s3 has no support for European buckets and its development seems to have stopped.
If you want to test file uploads, using the fixture_file_upload method is correct; it maps directly to Rack::Test::UploadedFile.new (which you can use directly if the test file isn't in the fixtures folder).
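For example, constructing one directly (file path illustrative):
upload = Rack::Test::UploadedFile.new("#{Rails.root}/test/files/avatar.jpeg", 'image/jpeg')
post :create, :person => { :avatar => upload }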
But I've also noticed that the behavior of Rack::Test::UploadedFile objects isn't exactly the same as that of ActionDispatch::Http::UploadedFile (the class of files uploaded through a browser). The basic methods (original_filename, read, size, ...) all work, but there are some differences when working with the file method. So limit your controller to these methods and all will be fine.
Another possible solution is to create an ActionDispatch::Http::UploadedFile object yourself and use that:
upload = ActionDispatch::Http::UploadedFile.new({
  :filename => 'avatar.jpeg',
  :type => 'image/jpeg',
  :tempfile => File.new("#{Rails.root}/test/fixtures/avatar.jpeg")
})
post :create, :person => { :avatar => upload }
I'd recommend using mocks.
A quick Google search reveals:
http://www.ibm.com/developerworks/web/library/wa-mockrails/index.html
You should be able to create an object that responds with the behaviors you want. Mocks are mostly used in a unit-test environment, so you can test your code in isolation, while integration tests are supposed to exercise the entire stack. However, I can see that in this case it'd be useful to mock out the S3 service, because it costs money.
I'm not familiar with the AWS/S3 gem, but it seems you probably aren't using the :avatar param properly. bytesize is defined on String in Ruby 1.9. What happens if you call read on the uploaded file before you pass it into AWS/S3?
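For example, a sketch with the aws/s3 gem (the bucket name is hypothetical):
avatar = params[:person][:avatar]
# avatar.read returns a String, which responds to #bytesize, unlike the Tempfile itself
AWS::S3::S3Object.store(avatar.original_filename, avatar.read, 'my-bucket')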

Amazon S3 virtual hosting of bucket

I have uploaded a file to S3 using Paperclip, and the upload process works fine.
Now I want to download it. In my model I have set :s3_host_alias. Because the file is private, fetching it with Paperclip's url method gives me an access denied error.
If I use the S3Object.url_for method instead, the returned url is s3.amazonaws.com/mybucket/path_of_file.
I don't want s3.amazonaws.com to appear in the url, which is why I used :s3_host_alias in my model and created a CNAME in my DNS server. If I use #object.url directly, it gives the correct url but throws an access denied error, I guess because the access key and signature are not passed.
Is there a way to fetch a private file from S3 using Paperclip via the canonical url?
I don't use Paperclip, but yes, you can sign an S3 request using a virtual hostname.
I had this problem using Paperclip and the AWS::S3 gem. Paperclip set everything up fine for non-authenticated requests, but falling back to AWS::S3 to generate an authenticated URL didn't use the S3 host alias.
You can pass AWS::S3 a server option on connect, but I didn't need or want a connection just to get the URL. I also couldn't see a way to set it via configuration (so it would apply outside of a connection). Even glancing at the source, it looks like it's non-configurable.
So, I created a monkey patch. My Ruby-fu (and maybe my OO-fu) aren't super high, so there may be a better way to do this, but it works for what I need. Basically, I pass url_for an :s3_host_alias param in the options hash, and the monkey patch uses that if it's passed. If it is, the patch also has to remove the bucket from the path that's generated.
So....
You can create this one-line file, RAILS_ROOT/config/initializers/load_patches.rb, to load all patches in RAILS_ROOT/lib:
Dir[File.join(Rails.root, 'lib', 'patches', '**', '*.rb')].sort.each { |patch| require(patch) }
Then create the file RAILS_ROOT/lib/patches/aws.rb with this code:
http://pastie.org/1622881
And you can call for an authenticated url with something along these lines (Configuration is a custom class for storing, natch, configuration values):
AWS::S3::S3Object.url_for(media.path(style || media.default_style), media.bucket_name, :expires_in => expires_in, :use_ssl => false, :s3_host_alias => Configuration.s3_host_alias)
