How do you access files stored in the tmp directory? - ruby-on-rails

I'm using CarrierWave to upload images to my web page. Currently, I have it working with Amazon S3 and Heroku. However, I would like to be able to test it on my machine using localhost. Again, I have this working. However, I'm storing the uploaded photos in my app's tmp directory located at:
Users/.../app/tmp/uploads.
When trying to display an image I get a broken link. I've been using:
<img src='<%= bucket.path %>'/>
to display images, and it has been working on Heroku. On localhost I get this error:
ActionController::RoutingError
(No route matches [GET] "/Users/.../app/tmp/uploads/pic.jpeg")
I'm not sure what to do really, I thought providing the path would be enough. Thanks for the help!

Are you using fog? This reply assumes you are.
In /config/environments/development.rb you should be able to set the following:
config.uploadsURL = "http://localhost:3000"
config.serve_static_assets = true
In /config/initializers/carrierwave.rb:
if Rails.env.production?
  # your aws config stuff
else
  config.storage = :file
end
In a view (unless I am forgetting something) you should then be able to just:
<%= image_tag(image.imgUpload.mini) if image.imgUpload? %>
where 'mini' is a CarrierWave version and imgUpload is the mount you defined in your model:
mount_uploader :imgUpload, ImageUploader
Of course, you should be able to test it on localhost while also using AWS storage. It might be simpler to just change your /config/environments/development.rb from:
config.action_controller.asset_host = "//#{ENV['FOG_DIRECTORY']}.s3.amazonaws.com"
to
config.action_controller.asset_host = "//your-test-bucket.s3.amazonaws.com"
and keep using AWS while running on localhost.
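Pulling those pieces together, a full initializer might look something like this. This is a sketch: the ENV variable names and fog options are assumptions based on a typical CarrierWave + fog setup, not your actual config.

```ruby
# config/initializers/carrierwave.rb (sketch; adjust names to your setup)
CarrierWave.configure do |config|
  if Rails.env.production?
    config.storage = :fog
    config.fog_credentials = {
      provider:              'AWS',
      aws_access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
      aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
    }
    config.fog_directory = ENV['FOG_DIRECTORY']
  else
    # store files on the local filesystem under public/uploads
    config.storage = :file
    config.enable_processing = !Rails.env.test?
  end
end
```

Storing under public/ (rather than tmp/) matters here, because then the development server can serve the uploaded files as static assets.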

Related

How to add host to a Carrierwave url?

I'm using Carrierwave on a Document model.
class Document
  mount_uploader :file, DocumentUploader
end
and am trying to send an email with the document as an attachment:
class DocumentMailer
  def distribute(recipient, document)
    filename = document.file.file.original_filename
    attachments[filename] = File.read(document.file.url)
    mail(
      to: recipient.email,
      subject: "Document attached"
    )
  end
end
In tests, the Mailer is raising an error
Errno::ENOENT:
No such file or directory # rb_sysopen - /uploads/document/file/2/my_attachment.jpg
I can resolve this error in the test suite by calling path instead of url in DocumentMailer, which returns the full filesystem path
attachments[ filename ] = File.read(document.file.path)
# /Users/AHH/code/myapp/tmp/uploads/document/file/2/my_attachment.jpg
However, this causes the method to fail in production. Carrierwave is using fog to store files on S3, and so I need the full url to assign an attachment to DocumentMailer.
Why do the tests fail when using file.url? I assume it is because the url has no host. So how do I ensure that CarrierWave applies a host to file.url in the test environment?
This is a side effect of the way you store files in development/test versus production: url is actually a URL in production (probably pointing to an S3 host), but locally it's a path to a file, relative to wherever you've specified uploads to be stored. Unfortunately, CarrierWave doesn't seem to have a graceful way of handling this, at least as far as I've seen, so I ended up doing something like this in our spec helper:
config.before do
  allow_any_instance_of(DocumentUploader).to receive(:url) do |uploader|
    uploader.path
  end
end
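As an alternative to stubbing, CarrierWave has an asset_host config option that gets prepended to url. Pointing it at the public directory on disk in the test environment should make file.url something File.read can open. A sketch, untested against this exact setup:

```ruby
# config/initializers/carrierwave.rb (sketch)
CarrierWave.configure do |config|
  if Rails.env.test?
    config.storage = :file
    # asset_host is prepended to #url, so this turns the relative
    # /uploads/... path into an absolute filesystem path
    config.asset_host = Rails.root.join('public').to_s
  end
end
```

This keeps the mailer code identical across environments, at the cost of a test-only URL that only makes sense on the local machine.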

Turning off storage in Paperclip

How do I turn remote storage off in Paperclip for use on Heroku? I realize that storing an uploaded file is the whole point of this gem, but I want to turn it off and still use the gem's other features (inspecting the file, etc.) without storing anything. I want to keep all the functionality in place in the model, but just not store the file anywhere.
This is close but it doesn't work on Heroku:
Paperclip::Attachment.default_options[:storage] = 'filesystem'
This doesn't work unfortunately:
Paperclip::Attachment.default_options[:storage] = :none
Judging from this file:
https://github.com/thoughtbot/paperclip/blob/master/lib/paperclip/railtie.rb
I haven't tested this, but inside config/initializers/ or the production.rb environment you should be able to set
Paperclip::Attachment.default_options = {}
or
Paperclip::Attachment.default_options[:storage] = ""
Hope it helps.
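Paperclip doesn't ship a :none storage, but its storage backends are just modules implementing a small contract (flush_writes, flush_deletes, exists?), so another option is to define a no-op backend yourself. This is a hedged sketch; the module name and the :none option are hypothetical, not part of Paperclip itself:

```ruby
# e.g. lib/paperclip/storage/none.rb (hypothetical no-op storage backend)
module Paperclip
  module Storage
    module None
      def exists?(_style_name = nil)
        false
      end

      def flush_writes
        # discard queued uploads instead of persisting them
        @queued_for_write = {}
      end

      def flush_deletes
        @queued_for_delete = []
      end
    end
  end
end
```

With this loaded from an initializer, has_attached_file :doc, storage: :none should let validations and file inspection run while nothing is written to disk or S3.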

Why do some assets give 403 forbidden?

I am using Ruby on Rails, nginx, and Passenger.
If there is no picture in the DB then my app will serve the 'default_avatar.png'. I noticed I was unable to save new pictures. So, I updated my dragonfly initializer to point at my db server:
...
# I have obscured the host IP for this example.
config.url_host = Rails.env.production? ? 'http://IP_OF_HOST' : 'http://localhost:3000'
...
Now I can save pictures and view them through my app, but the 'default_avatar.png' does not resolve. Oddly enough, other image assets do seem to come through. Why do I get 403? At first guess I thought it was a permissions error. But then why would it serve the other images?
UPDATE:
I have just noticed a very important clue. When the assets do not work, they have the url:
/media/jakbfAGasgAgADSGANJGFDgbnadglnalgbakljbgkjabg/default_avatar.png
And when they do work:
/assets/avatar.png
I should mention that I have 2 app servers and 1 db server. I do not believe it to be a permissions error.
I've encountered the same problem.
You need to specify the file extension when using the HTML helpers provided by the AssetTagHelper lib.
This will work:
<%= image_tag('avatar.png') %>
This won't work:
<%= image_tag('avatar') %>
Not easy to debug.

Carrierwave can't find file when testing with Capybara

I'm trying to write an integration test that involves a file uploaded with Carrierwave. I have the following configuration:
CarrierWave.configure do |config|
  if Rails.env.test?
    config.storage = :file
    config.enable_processing = false
  else
    # other configs
  end
end
And my uploader has the store path set to:
def store_dir
  "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
end
The outcome of this is that the file gets stored in public/uploads. But when I try to access this file via the doc.url method on my model, it returns a path like /uploads/..., and when running my integration specs I get the following error:
unable to open file `/uploads/doc/paper/1/img_1.png'
I can confirm that the file does exist in the /public/uploads directory. How do I get CarrierWave to return the correct path to the uploaded image? I could patch it together with Rails.root.join('public', doc.url), but that would break in production where we're uploading to S3.
After investigating it some more, I realized that the problem here is that we are trying to access the file server-side. Client-side everything works fine, because the relative path is resolved by Rails as an asset path. Server-side, there is nothing to tell Ruby where to look. In dev and production we use S3, so the URL is the same either way. It feels like a bit of a hack, but where we need to access the image on the server, we did this:
path = Rails.env.test? ? doc.img.path : doc.url
file = File.open(path)
I wasn't able to find an environment agnostic way to handle this.
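One way to make that check less environment-specific is a small helper that prefers the local path when the file actually exists on disk and falls back to fetching the URL otherwise. The helper name is hypothetical; you'd call it as open_upload(doc.img.path, doc.img.url):

```ruby
require 'open-uri'

# Prefer the file on disk (file storage in test/dev); otherwise
# fetch the remote URL (fog/S3 in production).
def open_upload(path, url)
  if path && File.exist?(path)
    File.open(path)
  else
    URI.open(url)
  end
end
```

This avoids branching on Rails.env, so the same call works whether the uploader is backed by the filesystem or by S3.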

Rails: Images on one server, CSS and Javascript on another

I am working on a Rails app that has a bunch (hundreds) of images that are hosted on an S3 server. To have helpers like image_tag point there, I had to add this to my config/environments/development.rb, test.rb, and production.rb:
config.action_controller.asset_host = "http://mybucket.s3.amazonaws.com"
However, this also means that it looks there for CSS and Javascript. This is a huge pain because each time I change the CSS I have to re-upload it to Amazon.
So.. Is there an easy way I can make my app look to Amazon for images, but locally for CSS/Javascript?
(I'm using Rails 3.0)
You can pass a Proc object to config.action_controller.asset_host and have it determine the result programmatically at runtime.
config.action_controller.asset_host = Proc.new do |source|
  case source
  when /^\/(images|videos|audios)/
    "http://mybucket.s3.amazonaws.com"
  else
    "http://mydomain.com"
  end
end
But left as it is, this would give you http://mybucket.s3.amazonaws.com/images/whatever.png when you use image_tag :whatever.
If you want to modify the path as well, you can do something very similar with config.action_controller.asset_path
config.action_controller.asset_path = Proc.new do |path|
  path.sub(/^\/(images|videos|audios)/, "")
end
which would give you http://mybucket.s3.amazonaws.com/whatever.png combined with the former.
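The dispatch logic in that Proc is plain Ruby, so you can sanity-check it outside Rails before wiring it into the config (the hosts here are the example values from above):

```ruby
asset_host = Proc.new do |source|
  case source
  when %r{^/(images|videos|audios)}
    "http://mybucket.s3.amazonaws.com"
  else
    "http://mydomain.com"
  end
end

asset_host.call("/images/logo.png")     # => "http://mybucket.s3.amazonaws.com"
asset_host.call("/stylesheets/app.css") # => "http://mydomain.com"
```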
There's nothing stopping you from passing full url to image_tag: image_tag("#{IMAGE_ROOT}/icon.png").
But to me, moving static images (icons, backgrounds, etc.) to S3 while leaving stylesheet/JS files on Rails sounds kinda inconsistent. You could either move them all to S3 or set up Apache for caching (if you're afraid users pulling big images will create too much overhead for Rails).
BTW, you don't have to put config.action_controller... into config files for all three environments: placing that line just in config/environment.rb will have the same effect.