I am using a Rails app to upload images (or files) to Cloudinary, with CarrierWave as the uploader. Everything works fine locally, but I see that the images are uploaded with http:// URLs.
This does not work in production, since my domain is served over HTTPS; when I request the image over HTTP it fails, because I need HTTPS.
Any help? I checked the documentation and they say they support HTTPS, which is why I am confused. I also wrote to them, but it would be nice if someone has had the same issue and knows how to solve it.
The upload response includes "secure_url", or you can just use "https://res.cloudinary.com..."
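If you are using the Ruby gem, you can also flip the whole integration to HTTPS in the configuration; roughly like this (a sketch, double-check the secure option against the docs):

# config/initializers/cloudinary.rb
# Sketch: have the gem build https://res.cloudinary.com/... URLs everywhere,
# so CarrierWave's *_url helpers return secure links too.
Cloudinary.config do |config|
  config.cloud_name = ENV['CLOUDINARY_CLOUD_NAME']
  config.api_key    = ENV['CLOUDINARY_API_KEY']
  config.api_secret = ENV['CLOUDINARY_API_SECRET']
  config.secure     = true
end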
--Yakir
Related
I'm using this gem to integrate CKEditor into my Rails app, with Paperclip to handle image uploading.
I no longer want Heroku to serve my CKEditor-uploaded images, so I'm switching to CloudFront and found this tutorial.
As I understand it (from the tutorial), I can use CloudFront without an S3 bucket, since CloudFront will automatically fetch my precompiled static assets from Heroku. But I'm confused:
Will images uploaded via CKEditor be included in the asset pipeline, so that once precompiled they will be served by CloudFront?
The tutorial says I need to change all image links to <%= image_tag('...') %> so that they point to CloudFront and work. But that is only possible when I hard-code the image, not when a user uploads one inside their text and it is stored in the database. Am I wrong, and how do I solve this?
Will this method (without an S3 bucket) work for other "dynamic" images such as users' avatars, posts' cover images, etc.?
I don't want to use an S3 bucket, as it would involve asset_sync. Any help is appreciated, thanks!
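For reference, the asset host change from the tutorial is basically a one-liner in production.rb (the domain below is a placeholder), and as far as I can tell it only affects URLs generated by asset helpers like image_tag, not raw HTML stored in the database:

# config/environments/production.rb
# Placeholder CloudFront domain; prefixes asset-pipeline URLs only.
config.action_controller.asset_host = "//d1234abcdef8.cloudfront.net"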
I'm using CloudFront as my CDN in my Rails app. I created my distribution and changed the environment file to enable the asset host.
Everything was working fine until I made a new deploy that included 3 new images. After restarting, everything looks fine except those 3 images. If I take the CloudFront URL and swap its domain for my Rails app's domain, the images load just fine, but via the CloudFront domain they look like they weren't found.
Any ideas why this is happening? If I understand correctly, CloudFront doesn't have a delay; it fetches the image from the origin as soon as the first request comes in.
Run rake assets:precompile, then upload it.
Or check the image path.
It should begin with "/", like
<img src="/images/img.jpg"/> if your images are in the public assets folder.
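If an asset host (the CloudFront domain) is configured, prefer the helper over a hard-coded path so Rails prepends the CDN domain and the fingerprint for you, roughly:

<%= image_tag("img.jpg") %>
<%# with an asset_host set this renders something like
    //dxxxx.cloudfront.net/assets/img-<fingerprint>.jpg %>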
I found the problem. Sometime between the precompile and the restart of the server someone made a request, CloudFront couldn't find the image at the origin, and it kept serving that cached miss, which is why it wasn't showing. I changed the image name, re-deployed, and everything is fine now.
I know this is a broad question, and I'm biting off a little more than I can chew for a first stab at a Rails app, but here I am.
I tried to add image upload/crop to a basic status app. It was working just fine uploading the images and cropping them with CarrierWave, but as soon as I started using Fog to upload to S3, I ran into issues.
The image, and its different sizes, appear to be ending up on S3 just fine, but the app is still trying to access the image as "/assets/uploads/entry/image/65/large_IMG_0035.jpg".
Locally, it just shows a broken image, but on Heroku it breaks the whole thing because:
ActionView::Template::Error (uploads/entry/image/1/large_IMG_0035.jpg isn't precompiled
The Heroku error makes sense to me, because the image shouldn't be there in the first place. I've combed through the app but can't find what's forcing this. I'll post any code anybody thinks would help. Thanks in advance!
Clarification:
Just to clarify, the images are uploading to S3 fine; the problem is how the app is building the image URL.
The app is using a local path in the asset pipeline, not the S3 path that it's actually uploading to.
I was having the same issue. In my CarrierWave initializer I was setting the host to s3.amazonaws.com, but when I removed that line altogether, the URLs started working.
I hope this helps you resolve your issue; I fought this for several hours!
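For comparison, a minimal working shape of the initializer (bucket name and credential lookup are placeholders, not my exact values), with no host override so CarrierWave and fog build the S3 URLs themselves:

# config/initializers/carrierwave.rb -- minimal sketch, placeholders only
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
  }
  config.fog_directory = 'my-app-uploads'   # placeholder bucket name
  config.fog_public    = true
end

The uploader itself still needs storage :fog.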
I believe this issue is related to how you are accessing your image in your view.
If you have mounted an uploader on the field avatar in the following manner:
class User < ActiveRecord::Base
mount_uploader :avatar, AvatarUploader
end
You would access it in your ERB as follows:
<%= image_tag(@user.avatar_url) %>
I would also suggest watching the following Railscast on the topic.
http://railscasts.com/episodes/253-carrierwave-file-uploads
Re-reading the issue, I bet it has to do with CarrierWave running on Heroku.
Give this a glance and see if it helps.
https://github.com/jnicklas/carrierwave/wiki/How-to%3A-Make-Carrierwave-work-on-Heroku
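The gist of that page, if I remember it right, is that Heroku's filesystem is writable only under tmp/, so the uploader's cache directory has to live there; something like this (uploader name is made up, check the wiki for the exact recommendation):

class ImageUploader < CarrierWave::Uploader::Base
  storage :fog

  # Heroku only allows writes under tmp/, so cache uploads there.
  def cache_dir
    "#{Rails.root}/tmp/uploads"
  end
end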
I am not clear on what exactly you want to achieve.
But for now I have two ideas:
For an asset host on a CDN, you can take a look at this:
https://devcenter.heroku.com/articles/cdn-asset-host-rails31
If you want the images to be part of a model relation, here's my rough idea:
Put the image path in a table column (see the sketch below).
For further information you can browse the CarrierWave GitHub site (it has many docs and tutorials).
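A rough sketch of that idea with CarrierWave (model, column, and uploader names are made up):

class AddCoverImageToPosts < ActiveRecord::Migration
  def change
    add_column :posts, :cover_image, :string   # stores the uploaded file's identifier/path
  end
end

class Post < ActiveRecord::Base
  mount_uploader :cover_image, CoverImageUploader
end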
Like it says on the tin, I'm trying to upload an image from my Ember.js app to a Rails backend that's using Paperclip to manage file uploads. I had a look around and couldn't see any simple way to do this; does anyone know of a good solution here?
I faced something similar recently, and it turns out that there are lots of complications with file uploading: does the device support it, do you want to be able to style the input that triggers the upload, and so on.
We opted for jQuery File Upload: https://github.com/blueimp/jQuery-File-Upload
The approach I took was to upload directly to S3 from the browser, set the token that S3 returns as a property on a model, and then save that to the server. On the server, you kick off a background job to pull that file in from S3 and put it where it should be.
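Roughly, the server-side half looked like this (names are made up; I'm assuming the S3 key was saved on the model as s3_key and the Paperclip attachment is called image):

require 'fog'
require 'tempfile'

class PullFromS3Job
  def self.perform(entry_id)
    entry = Entry.find(entry_id)

    storage = Fog::Storage.new(
      :provider              => 'AWS',
      :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
      :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
    )

    # Fetch the browser-uploaded object from the temporary upload bucket.
    remote = storage.directories.get(ENV['S3_UPLOAD_BUCKET']).files.get(entry.s3_key)

    # Hand it to Paperclip via a tempfile so the normal processing runs.
    tmp = Tempfile.new(['upload', File.extname(entry.s3_key)])
    begin
      tmp.binmode
      tmp.write(remote.body)
      tmp.rewind
      entry.image = tmp
      entry.save!
    ensure
      tmp.close
      tmp.unlink
    end
  end
end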
I wrote a fairly simple Ember.js file upload example a few months back that shows how you can write a custom view plus a custom adapter that lets you post a multipart form back to the server. The example is built for Python/Django, but the concepts should apply.
https://github.com/toranb/ember-file-upload
I recently upgraded this to RC1 (like 5 minutes ago) and it appears to still work :D
There is now an Ember Uploader plugin for Ember. I'm just in the process of integrating it right now.
I have a couple of kinks I'm ironing out, but it seems pretty legit. Probably less configuration than using jQuery File Upload.
In short
I want to know if I can send additional headers through a CarrierWave and Fog connection to Amazon S3.
In depth
I recently found that Amazon supports client- and server-side encryption of files. More info: http://docs.amazonwebservices.com/AmazonS3/latest/dev/SSEUsingRESTAPI.html
I'm currently using CarrierWave in a Rails app to upload files to Amazon S3.
For server-side encryption, Amazon asks for a header of x-amz-server-side-encryption: AES256 to be added to the request.
So I'm trying to figure out how to send additional headers through with CarrierWave and Fog.
My thought was that I could use the fog_attributes config line, something like the following, and that might work, but I'm not sure whether fog_attributes is for particular attributes or is a blanket header section:
config.fog_attributes = { 'x-amz-server-side-encryption' => 'AES256', 'Cache-Control' => 'max-age=315576000' } # optional, defaults to {}
So I finally got my app in shape to test this, but unfortunately it didn't work.
I also found this commit in the fog repository, which makes me feel that the fog_attributes method is for a defined list of attributes: https://github.com/geemus/fog/commit/070e2565d3eb08d0daaa258ad340b6254a9c6ef2
There has got to be a way to make this work. Anyone?
I believe that should actually be correct. Note, however, that I don't believe the server-side encryption stuff has been released yet, so you would need to use edge fog to get this behavior. I hope to do a release soon, though, and then it should be good to go. If you find that you still can't get it working on edge, let me know and we'll see what can be done.
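Pointing the Gemfile at edge fog in the meantime would look something like:

gem 'fog', :git => 'git://github.com/geemus/fog.git'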
I cannot speak to CarrierWave, but this works for saving files with AES256 encryption using the (currently) standard Fog distribution:
file.attributes[:encryption] = "AES256"
result = file.save
However, that does not work for copying files. What works for copying is:
fogfile.copy(@bucket_archived, newfilename, {'x-amz-server-side-encryption' => 'AES256'})
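Put together, a minimal standalone sketch of both operations (bucket names and keys are placeholders):

require 'fog'

storage = Fog::Storage.new(
  :provider              => 'AWS',
  :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
  :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
)

bucket = storage.directories.get('my-bucket')

# Creating a new object with server-side encryption (fog maps the
# :encryption attribute to the x-amz-server-side-encryption header):
file = bucket.files.create(
  :key        => 'reports/report.pdf',
  :body       => File.open('report.pdf'),
  :encryption => 'AES256'
)

# As noted above, the copy call needs the raw header passed explicitly:
file.copy('my-archive-bucket', 'reports/report.pdf',
          'x-amz-server-side-encryption' => 'AES256')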