I have a Rails app in which employers can upload files for a freelancer to work on. I am using Amazon S3 to store the files. The problem is that Amazon S3 assigns each file a URL, and anyone who has that URL can access the file. Employers will often upload private files that only the freelancer should be able to see. How do I make it so that when an employer uploads a file, only the freelancer can see it?
Here is the file uploader code:
CarrierWave.configure do |config|
  config.storage = :fog
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_ACCESS'],
    :aws_secret_access_key => ENV['AWS_SECRET']
  }
  config.fog_directory = ENV['S_BUCKET']
end
Use the config.fog_public = false option to make the files private, and fog_authenticated_url_expiration (time in seconds) to add a TTL to each file URL. See the fog storage module for more info: https://github.com/carrierwaveuploader/carrierwave/blob/master/lib/carrierwave/storage/fog.rb
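For example, a minimal sketch of the initializer from the question with both options applied (env var names carried over from the question; the expiration value is just an illustration):
CarrierWave.configure do |config|
  config.storage = :fog
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_ACCESS'],
    :aws_secret_access_key => ENV['AWS_SECRET']
  }
  config.fog_directory = ENV['S_BUCKET']
  config.fog_public    = false # objects are private; the plain S3 URL returns Access Denied
  config.fog_authenticated_url_expiration = 600 # signed URLs expire after 10 minutes
end
With fog_public disabled, calling url on the uploaded file returns a time-limited signed URL, so you can hand it out only after your controller has checked that the current user is the assigned freelancer.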
Related
Carrierwave is returning a JSON response like this:
"url": "/mys3bucket/uploads/entrees/photo/32/4c312e9aed37a59319096a03_1.jpg",
I need the absolute url. Images are hosted on Amazon S3. How can I get the absolute url?
My temporary hack is to add the following to the CarrierWave initializer:
config.asset_host = "https://s3.#{ENV.fetch('AWS_REGION')}.amazonaws.com/mybucket"
CarrierWave uses the combination of the filename and the settings
specified in your uploader class to generate the proper URL. This
allows you to easily swap out the storage backend without making any
changes to your core application.
That said, you should not store the full URL. Instead, set CarrierWave's asset_host config setting based on the environment.
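A sketch of what that can look like (the bucket hosts are placeholders):
CarrierWave.configure do |config|
  if Rails.env.production?
    config.asset_host = "https://production-bucket.s3.amazonaws.com"
  else
    config.asset_host = "https://development-bucket.s3.amazonaws.com"
  end
end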
What storage are you using in production? Here is my configuration and it works very well. Hope it helps.
CarrierWave.configure do |config|
  config.root = Rails.root

  if Rails.env.production?
    config.storage = :fog
    config.fog_credentials = {
      provider: "AWS",
      aws_access_key_id: ENV["AWS_ACCESS_KEY_ID"],
      aws_secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"],
      region: ENV["S3_REGION"]
    }
    config.fog_directory = ENV["S3_BUCKET_NAME"]
    # config.asset_host = ENV["S3_ASSET_HOST"]
  else
    config.storage = :file
    # config.asset_host = ActionController::Base.asset_host
  end
end
I tried the solutions and answers that already exist on this site, but they do not solve my problem, so I am opening a new thread.
I set up a test server with these technologies on my own home server, and it stores files in Google Cloud Storage perfectly. When I set up a server on Google Cloud with the same configuration, it does not work.
I have ruled out problems with the Fog credentials, because they work in the pre-production environment. I have also tried file permissions 644 and 755.
When I upload a photo, rather than persisting to Google Cloud Storage, it ends up in the root project folder with a filename such as RackMultipart20160414-4440-ry1g0r.jpg, and in the public/uploads/tmp folder.
I can't think of anything else. My settings are:
config/initializers/carrier_wave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'Google',
    :google_storage_access_key_id => Figaro.env.google_storage_access_key_id,
    :google_storage_secret_access_key => Figaro.env.google_storage_secret_access_key
  }
  config.fog_directory = 'uploads-prod'
  config.fog_public = true
  config.storage = :fog
  config.root = Rails.root.join('public')
end
content_uploader.rb
def store_dir
  "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
end
Any ideas or solutions?
Thank you
I followed this tutorial:
http://lifesforlearning.com/uploading-images-with-carrierwave-to-s3-on-rails/
I had a working CarrierWave uploader that was storing files to disk.
What I did step by step:
1) Added the fog gem and ran bundle install and bundle update.
2) In config/initializers I created an r3.rb file with this:
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',
    :aws_access_key_id => 'mykey',
    :aws_secret_access_key => 'mysecretkey',
    :region => 'us-west-2' # Change this for a different AWS region.
  }
  config.fog_directory = "bucket-main"
end
I ran rails s and tried to save a photo, but my bucket stayed empty, so the files must still be going to my disk.
What do I do now?
Update: I changed the storage to fog.
Here is my photouploader class code:
# encoding: utf-8
class PhotoUploader < CarrierWave::Uploader::Base
  storage :fog

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end
end
And now I get this error:
hostname "bucket-main.bucket-main.s3-us-west-1.amazonaws.com" does not
match the server certificate (OpenSSL::SSL::SSLError)
I eventually solved my problem by running:
bundle update fog
bundle update carrierwave
Try adding :path_style to your fog_credentials, and set the fog_directory:
config.fog_credentials = {
  ...
  :path_style => true
}
config.fog_directory = 'bucket-main'
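For context: with :path_style => true, fog addresses the bucket as s3.amazonaws.com/bucket-main/... instead of bucket-main.s3.amazonaws.com/..., so the request hostname always matches Amazon's certificate even when the virtual-hosted-style name would not.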
I just spent a few hours tracking down the cause of this error, which I was also getting:
hostname "bucket-main.bucket-main.s3-us-west-1.amazonaws.com" does not match the server certificate (OpenSSL::SSL::SSLError)
The odd thing is how the bucket name appears twice in the hostname. It turned out I had configured the wrong region name. Notice that in your config.fog_credentials you have
:region => 'us-west-2'
...but the hostname in the exception has s3-us-west-1? If your bucket is in one AWS region, but you configure a different region in your Fog credentials, Fog will try to follow a redirect from AWS, and somehow the bucket name gets doubled up in this situation. Fog produces a warning about the redirect, but Carrierwave ends up hiding this from you.
Set :region in your Fog credentials to the region where the bucket actually lives in AWS, and the "does not match the server certificate" exception will stop happening.
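In this case that means matching the credentials to the bucket's real region (key values as in the question):
config.fog_credentials = {
  :provider => 'AWS',
  :aws_access_key_id => 'mykey',
  :aws_secret_access_key => 'mysecretkey',
  :region => 'us-west-1' # must match the region the bucket was created in
}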
I have a working Heroku app. But since Heroku doesn't provide persistent file storage, I'd like to use Amazon S3.
I found heroku tutorial https://devcenter.heroku.com/articles/s3
But it seems confusing to me and may be a little bit complicated.
Right now I use the carrierwave gem for storing files.
So maybe you can give me a small and simple example of the code you use to store files in Amazon's file storage?
UPDATE:
I found this code. Is it really that simple: just a few lines in the CarrierWave initializer, add a gem called fog, and that's all? Or will I need something else?
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',
    :aws_access_key_id => "YOUR AMAZON ACCESS KEY",
    :aws_secret_access_key => "YOUR AMAZON SECRET KEY",
    :region => 'us-west-1' # Change this for a different AWS region. Default is 'us-east-1'.
  }
  config.fog_directory = "bucket-name"
end
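Not quite all: as the SSL question above shows, the uploader class itself must also declare the fog storage backend, or files keep going to local disk. A minimal sketch (the uploader name is hypothetical):
class FileUploader < CarrierWave::Uploader::Base
  storage :fog # without this line CarrierWave falls back to :file storage

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end
end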
I would like to have distinct folders in my S3 bucket to keep the production database clear from the development environment.
I am not sure how to do this, here is the skeleton I've come up with in the carrierwave initializer:
if Rails.env.test? or Rails.env.development?
  CarrierWave.configure do |config|
    # configure dev storage path
  end
end

if Rails.env.production?
  CarrierWave.configure do |config|
    # configure prod storage path
  end
end
Two options:
Option1: You don't care about organizing the files by model ID
In your carrierwave.rb initializer:
primary_folder = Rails.env.production? ? "production" : "test"

CarrierWave.configure do |config|
  # stores in either "production/..." or "test/..." folders
  config.store_dir = "#{primary_folder}/uploads/images"
end
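With this in place, a production upload lands under production/uploads/images in the bucket, while every other environment's uploads land under test/uploads/images.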
Option 2: You DO care about organizing the files by model ID (e.g. user ID)
In your uploader file (e.g. image_uploader.rb within the uploaders directory):
class ImageUploader < CarrierWave::Uploader::Base
  ...
  # Override the directory where uploaded files will be stored.
  def store_dir
    primary_folder = Rails.env.production? ? "production" : "test"
    # stores in either "production/..." or "test/..." folders
    "#{primary_folder}/uploads/images/#{model.id}"
  end
  ...
end
Consider the following initializer:
# config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.enable_processing = true

  # For testing, upload files to the local `tmp` folder.
  if Rails.env.test?
    config.storage = :file
    config.root = "#{Rails.root}/tmp/"
  elsif Rails.env.development?
    config.storage = :file
    config.root = "#{Rails.root}/public/"
  else # staging, production
    config.fog_credentials = {
      :provider => 'AWS',
      :aws_access_key_id => ENV['S3_KEY'],
      :aws_secret_access_key => ENV['S3_SECRET']
    }
    config.cache_dir = "#{Rails.root}/tmp/uploads" # To make CarrierWave work on Heroku
    config.fog_directory = ENV['S3_BUCKET']
    config.fog_public = false
    config.storage = :fog
  end
end
In development, the uploads are sent to the local public directory.
In test mode, to the Rails tmp directory.
And finally, in the "else" environments (usually production or staging) we direct the files to S3, using environment variables to determine which bucket and AWS credentials to use.
Use different Amazon S3 buckets for your different environments. In your various environment .rb files, set an environment-specific asset_host. Then you can avoid detecting the Rails environment in your uploader.
For example, in production.rb:
config.action_controller.asset_host = "production_bucket_name.s3.amazonaws.com"
The asset_host in development.rb becomes:
config.action_controller.asset_host = "development_bucket_name.s3.amazonaws.com"
etc.
(Also consider using a CDN instead of hosting directly from S3).
Then your uploader becomes:
class ImageUploader < CarrierWave::Uploader::Base
  ...
  # Override the directory where uploaded files will be stored.
  def store_dir
    "uploads/images/#{model.id}"
  end
  ...
end
This is a better technique from the standpoint of replicating production in your various other environments.