I am using CarrierWave and Fog to upload and process images and store them on Amazon S3.
Below are my Fog settings.
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',
    :aws_access_key_id => 'AKIAJ23D1I25B2P2HX6A',
    :aws_secret_access_key => 'WV64nQAd111+ZelqKgffrzvViG0lEeTTnEOonXHkg' #,
    # :region => 'us-west-2'
  }
  config.fog_directory = "<TESTING>"
end
Is there a way to have two sets of settings, one used in production and one in development, so that we don't touch the production files and can freely delete the development ones?
Yes, you can use different settings by specifying them per environment. I do this in my project, but what I've done is use a different directory for each: in production.rb I use project_directory, and in development.rb I use project_dev_directory, with the same remaining settings in both environments. If you need other settings to differ, you can override those too. Hope this helps.
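As a minimal sketch of that idea (the directory names are the placeholders from above, and I'm assuming the credentials live in environment variables), you could pick the directory by environment in the CarrierWave initializer:

CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
  }
  # Separate buckets/directories per environment, so deleting files in
  # development never touches production uploads.
  config.fog_directory = Rails.env.production? ? 'project_directory' : 'project_dev_directory'
end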
I started playing with a test app, trying to upload sitemaps to Amazon S3. I've been following https://github.com/kjvarga/sitemap_generator, trying to figure out the gem, and have only been half successful: a sitemap is generated in the public folder, but it never uploads to the S3 bucket.
I've added the config/sitemap.rb found in the tutorial above.
require 'rubygems'
require 'sitemap_generator'
require 'aws-sdk'

SitemapGenerator::Sitemap.create_index = true
SitemapGenerator::Sitemap.default_host = 'https://www.myapp.herokuapp.com'

SitemapGenerator::Sitemap.create do
  add '/home', :changefreq => 'daily', :priority => 0.9
end

SitemapGenerator::Sitemap.adapter = SitemapGenerator::S3Adapter.new(
  fog_provider: 'AWS',
  aws_access_key_id: 'KEY',
  aws_secret_access_key: 'SECRET',
  fog_directory: 'DIR',
  fog_region: 'REGION'
)
I type in
rails sitemap:refresh
in my terminal, and it generates the maps. It just doesn't upload them. No errors, no clues as to what didn't happen, nothing. It even tells me that Google and Bing were successfully pinged.
Of course I can visit my AWS bucket and manually upload these files but that feels...wrong. I've used shrine for images in the past and am used to uploading to a cache. There must be something I missed.
Check your secrets: you may not have the AWS account env vars set where the rake task runs, in which case the S3 adapter can never authenticate. The policy on your bucket is also worth checking.
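One way to rule that out is to read the credentials from the environment and fail loudly before the adapter is built. This is only a sketch: the AWS_* variable names here are assumptions, not something from the question.

# config/sitemap.rb -- verify the credentials exist before creating the adapter
%w[AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_BUCKET AWS_REGION].each do |var|
  raise "Missing environment variable #{var}" if ENV[var].to_s.empty?
end

SitemapGenerator::Sitemap.adapter = SitemapGenerator::S3Adapter.new(
  fog_provider:          'AWS',
  aws_access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
  aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
  fog_directory:         ENV['AWS_BUCKET'],
  fog_region:            ENV['AWS_REGION']
)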
So I am trying to set up file uploads in my production environment. I am currently using CarrierWave along with Fog-Google. I have no issues storing files locally, as I do not use Fog in development. However, I am now trying to test the upload functionality in production, but I cannot even push my app up to Heroku.
Here's a snippet of the error I'm receiving when attempting to push to Heroku:
[fog][WARNING] Unrecognized arguments: google_storage_secret_access_key_id, google_storage_secret_access_key
rake aborted!
ArgumentError: Invalid keyfile or passphrase
Now, I am relatively new to setting up ENV secret IDs and all of that, so I'll just describe what I know and what I have done, to be sure that I did everything correctly.
As I am currently using the Cloud9 IDE, in my .bashrc file I have:
export GOOGLE_STORAGE_ACCESS_KEY_ID=XXXXX
export GOOGLE_STORAGE_SECRET_ACCESS_KEY=XXXXX
export GOOGLE_STORAGE_BUCKET_NAME=XXXXX
In my /config/initializers/carrierwave.rb
require 'carrierwave'

CarrierWave.configure do |config|
  config.fog_provider = 'fog/google' # required
  config.fog_credentials = {
    provider: 'Google',
    google_storage_access_key_id: ENV['GOOGLE_STORAGE_ACCESS_KEY_ID'],
    google_storage_secret_access_key: ENV['GOOGLE_STORAGE_SECRET_ACCESS_KEY']
  }
  config.fog_directory = ENV['GOOGLE_STORAGE_BUCKET_NAME']
end
and in my /config/initializers/fog.rb
GoogleStorage = Fog::Storage.new(
  provider: 'Google',
  google_project: 'XXXX',
  google_client_email: 'XXXXXX',
  google_key_location: Rails.root.join('private', 'google-cloud-service-key.p12'),
  google_storage_secret_access_key_id: ENV["GOOGLE_STORAGE_SECRET_ACCESS_KEY_ID"],
  google_storage_secret_access_key: ENV["GOOGLE_STORAGE_SECRET_ACCESS_KEY"]
)
As mentioned, I am quite new to all of this, so I've tried my best to follow the documentation on both Fog's and CarrierWave's GitHub pages.
As far as I know, I should use .bashrc to store my secret keys and then read them with ENV['SECRET_KEY_NAME']. I've set up both the carrierwave.rb and fog.rb files in the initializers folder, so I'm not quite sure what I'm missing.
Additionally, I have also tried running heroku config:set GOOGLE_STORAGE_SECRET_ACCESS_KEY_ID=XXXXXX, but that didn't seem to work either.
I'm not quite sure what to do now or what may be causing the error when attempting to push to Heroku, much less whether any of this even works in production.
EDIT:
I think the error comes largely from the fog.rb file, so I amended it to the following:
GoogleStorage = Fog::Storage::Google.new(
  google_project: 'XXX',
  google_client_email: 'XXX',
  google_json_key_location: '~/.fog/XXXX.json'
)
And now when I try pushing to Heroku, the error I get instead is:
Errno::ENOENT: No such file or directory # rb_sysopen - /app/.fog/XXX.json
Just to share: I created a .fog folder under the ~ directory, and inside the .fog folder I added the private JSON key.
All help and advice would be greatly appreciated. Thank you!
So I've solved the problem and managed to successfully push my code to Heroku.
[Kindly note that this does not mean it functions perfectly in production; it just means I am now able to push to Heroku without any errors.]
There were two main errors:
1) Not including my private-key JSON file in my app.
2) An error in the config/initializers/fog.rb file.
To solve the first issue, I created a .fog folder in my app and added the private-key JSON file from Google Cloud Platform.
Next, I amended the code in config/initializers/fog.rb to:
GoogleStorage = Fog::Storage::Google.new(
  :google_project                   => 'XXXX',
  :google_client_email              => 'XXXXX',
  :google_json_key_location         => Rails.root.join('.fog/XXXX'),
  :google_storage_access_key_id     => ENV['GOOGLE_STORAGE_ACCESS_KEY_ID'],
  :google_storage_secret_access_key => ENV['GOOGLE_STORAGE_SECRET_ACCESS_KEY']
)
I was then able to successfully push my code to Heroku.
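One caveat worth adding, as an assumption on my part rather than part of the fix above: committing the private-key JSON file to the repository exposes it to anyone with access to the source. A safer variant is to read the key's location from an environment variable (GOOGLE_JSON_KEY_PATH here is a hypothetical name):

# Sketch: keep the key file out of the repo and point to it via ENV.
# GOOGLE_JSON_KEY_PATH is a hypothetical variable name, e.g. set with
# `heroku config:set GOOGLE_JSON_KEY_PATH=...`
GoogleStorage = Fog::Storage::Google.new(
  :google_project           => 'XXXX',
  :google_client_email      => 'XXXXX',
  :google_json_key_location => ENV['GOOGLE_JSON_KEY_PATH']
)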
I deployed an application on Heroku which I wrote in Ruby on Rails. It is a movie-review app.
I am able to upload images from my computer to the web app online, and everything else works as I expect.
But the images disappear after a day. My requirement is for the images to keep rendering.
I am using the Paperclip gem with Rails. This only happens on the deployed version and not on localhost.
The default upload location on Heroku is the dyno's local filesystem, which is ephemeral: it is wiped on every deploy and whenever the dyno restarts (which happens at least daily), which is why your images survive for about a day and then vanish.
You need to use S3 or another external service to store your files. Luckily this is well documented for Paperclip on Heroku.
The main configuration difference is this: add gem 'aws-sdk' to your Gemfile, and then adjust your config in config/environments/production.rb:
config.paperclip_defaults = {
  :storage => :s3,
  :s3_credentials => {
    :bucket => ENV['S3_BUCKET_NAME'],
    :access_key_id => ENV['AWS_ACCESS_KEY_ID'],
    :secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
  }
}
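Since that config reads everything from the environment, the matching variables also have to be set on Heroku. Assuming the names used above, that would look something like this (values are placeholders):

heroku config:set S3_BUCKET_NAME=your-bucket-name
heroku config:set AWS_ACCESS_KEY_ID=your-access-key-id
heroku config:set AWS_SECRET_ACCESS_KEY=your-secret-key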
When running my Rails app in development mode from Nitrous.io, I cannot access the development bucket I set up on AWS S3. The upload button opens my personal computer's file browser, from which I don't want to load files. (Even when I try to load files from my computer, I get a long error message stating "The request signature we calculated does not match the signature you provided. Check your key and signing method".)
I think I don't have AWS S3 configured properly.
Currently, I have one IAM user, which I've assigned AdministratorAccess. Also, I am using the proper AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in my application.yml file. In fog.rb I read them from the environment.
I should add, too, that I am currently enrolled in a web development apprenticeship program.
Sorry for not showing my files.
Here's my application.yml with the sensitive data taken out:
SENDGRID_PASSWORD: alphanumeric
SENDGRID_USERNAME: -------@heroku.com
AWS_ACCESS_KEY_ID: alphanumeric
AWS_SECRET_ACCESS_KEY: alphanumeric
development:
  AWS_BUCKET: vmanamino-bloccit-development
production:
  AWS_BUCKET: vmanamino-bloccit-production
development:
  secret_key_base: alphanumeric
test:
  secret_key_base: alphanumeric
Here's my fog.rb file which reads the values from the environment
CarrierWave.configure do |config|
  config.fog_credentials = {
    provider: 'AWS',
    aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key: ENV['AWS_ACCESS_KEY_ID'],
  }
  config.fog_directory = ENV['AWS_BUCKET']
  config.fog_public = true
end
You're using the AWS_ACCESS_KEY_ID environment variable for both the access key and the secret access key, whereas the latter should of course be using ENV['AWS_SECRET_ACCESS_KEY'].
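Concretely, the credentials hash should read:

config.fog_credentials = {
  provider: 'AWS',
  aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],
  aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
}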
I didn't realize I needed to enclose the KEY ID and the SECRET KEY in quotes. Once I did that, I got it to work: I can upload images from my computer to S3. Also, I hadn't understood the assignment perfectly; I thought my app would load images from S3. Now the error raised earlier makes sense: I upload from my computer, sending the image to S3.
I deploy my Rails app to Heroku, and I've set the AWS access keys on the server as environment variables. However, to test my application in the development environment, I need to initialize them somewhere on my local machine, so I did the following.
/config/initializers/init_aws_locally.rb
ENV['AWS_ACCESS_KEY_ID'] = 'my key'
ENV['AWS_SECRET_ACCESS_KEY'] = 'my secret key'
This file is added to .gitignore.
However, when I upload in the development environment, I get this error message:
Missing required arguments: aws_access_key_id, aws_secret_access_key
I think I somehow overlooked a simple step to include my AWS keys in my development environment, but I'm not sure why the error occurs when I've already initialized the keys.
For your reference, I'm using CarrierWave, S3, and Fog.
config/initializers/fog.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',                                     # required
    :aws_access_key_id => ENV['AWS_ACCESS_KEY_ID'],         # required
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY'], # required
    :region => 'us-east-1',                                 # optional, defaults to 'us-east-1'
  }
  config.fog_directory = 'd'   # required
  config.fog_public = true     # optional, defaults to true
end
Thank you. I appreciate your help!
Your initializers will be run in alphabetical order. See the docs:
If you have any ordering dependency in your initializers, you can
control the load order by naming. For example, 01_critical.rb will be
loaded before 02_normal.rb.
The problem you're experiencing is that your fog.rb initializer is running before your init_aws_locally.rb one (because f comes before i). So ENV['AWS_ACCESS_KEY_ID'] has not been defined (yet) when you set fog_credentials.
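Given that ordering rule, one fix (the numeric prefixes below are just an illustration of the naming trick from the docs) is to rename the initializers so the one that sets the variables sorts first:

# config/initializers/01_init_aws_locally.rb  -- sets ENV['AWS_ACCESS_KEY_ID'], etc.
# config/initializers/02_fog.rb               -- calls CarrierWave.configure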
I'd avoid putting any credentials in code at all; it's a terrible idea, and Heroku has the right approach. So what I do is use RVM and put a .rvmrc file in my project folder, with .rvmrc added to .gitignore as well.
Then you edit .rvmrc to have
export AWS_ACCESS_KEY_ID="BLAH"
and so on and so forth. Any time I "cd" into this directory, RVM sets up my env for that project. If you're not using RVM, there are other alternatives out there. Then just run
rails s
and it will have all the environment variables you set in your .rvmrc script. No need for initializers or development-only YAML config files that you have to keep out of source control. In my experience this is the simplest solution.
I went to my shell and typed:
$ echo $AWS_SECRET_ACCESS_KEY
and it came back blank. It turns out I had recently moved to a new virtual machine and forgotten to add this to the .bashrc file. It's worth checking the shell environment just in case. Once I added those two lines to my .bashrc, everything was happy again.
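For completeness, the two lines in question are exports in the same style as the ones shown earlier (values redacted):

export AWS_ACCESS_KEY_ID=XXXXX
export AWS_SECRET_ACCESS_KEY=XXXXX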