Fog-Google, Unable to Push to Heroku

So I am trying to set up file uploads in my production environment. I am currently using CarrierWave along with Fog-Google. I have no issues storing files locally, as I do not use Fog in development. However, I am currently trying to test the file-upload functionality in production, and I cannot even push my app up to Heroku.
Here's a snippet of the error I'm receiving when attempting to push to Heroku:
[fog][WARNING] Unrecognized arguments: google_storage_secret_access_key_id, google_storage_secret_access_key
rake aborted!
ArgumentError: Invalid keyfile or passphrase
Now, I am relatively new to setting up ENV secret IDs and all of that, so I'll just lay out what I know and what I have done, to be sure that I did everything correctly.
Since I am currently using the Cloud9 IDE, in my .bashrc file I have:
export GOOGLE_STORAGE_ACCESS_KEY_ID=XXXXX
export GOOGLE_STORAGE_SECRET_ACCESS_KEY=XXXXX
export GOOGLE_STORAGE_BUCKET_NAME=XXXXX
In my /config/initializers/carrierwave.rb:
require 'carrierwave'

CarrierWave.configure do |config|
  config.fog_provider = 'fog/google' # required
  config.fog_credentials = {
    provider: 'Google',
    google_storage_access_key_id: ENV['GOOGLE_STORAGE_ACCESS_KEY_ID'],
    google_storage_secret_access_key: ENV['GOOGLE_STORAGE_SECRET_ACCESS_KEY']
  }
  config.fog_directory = ENV['GOOGLE_STORAGE_BUCKET_NAME']
end
And in my /config/initializers/fog.rb:
GoogleStorage = Fog::Storage.new(
  provider: 'Google',
  google_project: 'XXXX',
  google_client_email: 'XXXXXX',
  google_key_location: Rails.root.join('private', 'google-cloud-service-key.p12'),
  google_storage_secret_access_key_id: ENV["GOOGLE_STORAGE_SECRET_ACCESS_KEY_ID"],
  google_storage_secret_access_key: ENV["GOOGLE_STORAGE_SECRET_ACCESS_KEY"]
)
As mentioned, I am actually quite new to all of this, so I've tried my best to follow the documentation on both Fog's and CarrierWave's GitHub pages.
As far as I know, I should store my secret keys and the like in .bashrc and then read them with ENV['SECRET_KEY_NAME']. I've set up both the carrierwave.rb and fog.rb files in the initializers folder, so I'm not quite sure what I'm missing.
Additionally, I have also tried running heroku config:set GOOGLE_STORAGE_SECRET_ACCESS_KEY_ID=XXXXXX, but that didn't seem to work either.
I'm not quite sure what to do now, or what may be causing the error when attempting to push to Heroku, much less whether any of this even works in production.
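For what it's worth, one way to sanity-check that the variables are actually visible, both locally and on Heroku, before digging into the initializers (variable names as above):
echo $GOOGLE_STORAGE_ACCESS_KEY_ID               # local shell, from .bashrc
heroku config:get GOOGLE_STORAGE_ACCESS_KEY_ID   # what the Heroku dyno will see
If either comes back blank, the initializer receives nil for that key.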
EDIT:
I think the error largely comes from the fog.rb file, so I amended it to the following:
GoogleStorage = Fog::Storage::Google.new(
  google_project: 'XXX',
  google_client_email: 'XXX',
  google_json_key_location: '~/.fog/XXXX.json'
)
And now when I try pushing to Heroku, the error I get instead is:
Errno::ENOENT: No such file or directory # rb_sysopen - /app/.fog/XXX.json
Just to share, I created a .fog folder under the ~ directory, and inside the .fog folder I added the private JSON key.
All help and advice would be greatly appreciated. Thank you!

So I've solved the problem and managed to successfully push my code to Heroku.
[Kindly note that this does not mean it functions perfectly in production; it just means I am now able to push to Heroku without any errors.]
There were two main errors:
1) Not including my private key JSON file in my app.
2) An error in the config/initializers/fog.rb file.
To solve the first issue, I created a .fog folder in my app and added the private key JSON file from Google Cloud Platform.
Next, I amended the code in config/initializers/fog.rb to:
GoogleStorage = Fog::Storage::Google.new(
  :google_project => 'XXXX',
  :google_client_email => 'XXXXX',
  :google_json_key_location => Rails.root.join('.fog/XXXX'),
  :google_storage_access_key_id => ENV['GOOGLE_STORAGE_ACCESS_KEY_ID'],
  :google_storage_secret_access_key => ENV['GOOGLE_STORAGE_SECRET_ACCESS_KEY']
)
I was then able to successfully push my code to Heroku.
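If you'd rather not commit the key file to the repository, fog-google also accepts the raw key contents via google_json_key_string, so the key can live in a config var instead; a minimal sketch, assuming a var named GOOGLE_JSON_KEY (a hypothetical name) set via heroku config:set:
GoogleStorage = Fog::Storage::Google.new(
  :google_project => 'XXXX',
  :google_json_key_string => ENV['GOOGLE_JSON_KEY'] # full contents of the JSON key file
)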

Related

Rails 5.2 credentials.yml.enc and master.key not working on Heroku

I'm setting up Active Storage for a new app and haven't been able to get the app running in production after setting up my Amazon credentials.
I've included my S3 bucket credentials in my credentials.yml.enc file.
I've added my RAILS_MASTER_KEY env variable to Heroku.
I've set up my S3 bucket in the storage.yml file according to this.
I've added the config.active_storage.service = :amazon line to my production.rb.
I've added config.require_master_key = true to my production.rb.
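For reference, the storage.yml entry for this setup would look something like the following (bucket name and region are placeholders), matching the standard Rails example:
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-east-1
  bucket: your-bucket-name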
When I try running my app on Heroku, it doesn't load. Running $ heroku run rails console gives me the error:
"/app/vendor/bundle/ruby/2.3.0/gems/aws-sigv4-1.0.2/lib/aws-sigv4/signer.rb:517:in `extract_credentials_provider': Cannot load `Rails.config.active_storage.service`: (Aws::Sigv4::Errors::MissingCredentialsError)
missing credentials, provide credentials with one of the following options:
- :access_key_id and :secret_access_key
- :credentials
- :credentials_provider"
As far as I can tell, I've set up my credentials the way Rails 5.2 intended. I've tried all sorts of asset-precompilation fixes to no avail. When I add my Amazon credentials as env variables on Heroku, the app runs fine in production. Any idea what could be going wrong here?
Could it be that you forgot to add config.require_master_key = true to your production.rb?
I've had this problem before; it seems to be a bug on Heroku.
You should just set up your environment variables through the Heroku dashboard, on the Settings tab.
Then you can access them using ENV['NAME_OF_YOUR_VARIABLE'].
This fixed my problem.
Also check your Heroku logs very carefully, scrolling up to ensure all gems were installed.
Double-check that you have the correct key in your config/credentials.yml.enc file. I had one key inverted (secret_key_access instead of secret_access_key) and was getting the same error. Fixing the key name in config/credentials.yml.enc fixed it for me.
In your rails console (locally), run:
Rails.application.credentials.dig(:aws, :access_key_id)
and
Rails.application.credentials.dig(:aws, :secret_access_key)
to make sure they have values.
Welp, this was dumb. Mystery solved. My credentials were commented out in the credentials.yml.enc file: I had added them to the top of the file alongside the default aws example, which is commented out.
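In other words, after running rails credentials:edit, the aws block has to be uncommented for Rails.application.credentials.dig(:aws, ...) to return anything; the working layout is simply:
aws:
  access_key_id: YOUR_ACCESS_KEY_ID
  secret_access_key: YOUR_SECRET_ACCESS_KEY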

Rails Generate controller aws error missing bucket name

I am attempting to create a Users controller in my Ruby on Rails project, which I have also configured with Heroku and an AWS S3 bucket. I set my .env and my heroku local with S3_BUCKET, AWS_ACCESS_KEY_ID, and AWS_SECRET_ACCESS_KEY. I also set up my initializers/aws.rb file to look like this:
Aws.config.update({
  region: 'us-east-1',
  credentials: Aws::Credentials.new(ENV['AWS_ACCESS_KEY_ID'], ENV['AWS_SECRET_ACCESS_KEY']),
})

S3_BUCKET = Aws::S3::Resource.new.bucket(ENV['S3_BUCKET'])
I have bundle-installed the AWS gem like this:
gem 'aws-sdk', '~> 3'
However, when I run the command
rails g controller Users new
I get the following error in my terminal:
aws-sdk-s3/bucket.rb:658:in `extract_name': missing required option :name (ArgumentError)
I looked at that file, and it is trying to find the S3 bucket name, but I have already set that in .env and heroku local. Is there some other place where this needs to be set? None of the guides I have read mention this error.
Hi, please check whether you have specified the right credentials and bucket name. Also, make sure you have provided the right region. Try the code below:
s3 = Aws::S3::Resource.new(
  credentials: Aws::Credentials.new(ENV['AWS_ACCESS_KEY_ID'], ENV['AWS_SECRET_ACCESS_KEY']),
  region: 'us-west-1'
)

obj = s3.bucket(ENV['S3_BUCKET']).object('key')
If you want to upload a file or the like:
obj.upload_file(file, acl: 'public-read')
This might help you; I have used it like this in my project.
1. Create the file aws.rb in your /config/initializers folder.
2. Then copy in the code below:
S3Client = Aws::S3::Client.new(
  access_key_id: 'ACCESS_KEY_ID',
  secret_access_key: 'SECRET_ACCESS_KEY',
  region: 'REGION'
)
That's all; this works.
Happy coding :)
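As a quick usage sketch (the bucket name and key below are placeholders, not part of the answer above), uploading an object with that client could look like:
S3Client.put_object(
  bucket: 'your-bucket-name',    # placeholder
  key: 'uploads/example.txt',    # placeholder
  body: File.read('example.txt')
)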

AWS::S3::Errors::AccessDenied: Access Denied when trying to do copy_to

I have written a rake task that does a copy_to from one directory in a bucket to another directory within the same bucket. When I test it locally it works fine, but when I deploy it to an environment it returns AWS::S3::Errors::AccessDenied: Access Denied. I assume that it has something to do with the AWS credentials in the environment I am deploying to. I am also confident that the problem is with the copy_to, as I accessed the bucket from the rails console and had no issues.
My copy statement is as follows:
creds = YAML::load_file(Rails.root.join("config", "s3.yml"))

AWS.config(aws_access_key_id: creds[:access_key_id],
           aws_secret_access_key: creds[:secret_access_key])

s3.buckets['test-bucket'].objects['path to file'].copy_to('new_path')
The parameters to AWS.config are access_key_id and secret_access_key, without the aws_ prefix.
http://docs.aws.amazon.com/AWSRubySDK/latest/AWS.html#config-class_method
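Applied to the snippet above, the corrected call would look like this (same creds hash):
AWS.config(access_key_id: creds[:access_key_id],
           secret_access_key: creds[:secret_access_key])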
I found this question because I also received Access Denied when calling copy_to(). While older SDK versions were happy to accept a plain key path as the parameter to copy_to, newer versions require you to specify the bucket too.
In my case
s3_bucket.object(old_key).copy_to(new_key)
did not work and produced a rather unhelpful "Access Denied" error with the v3 SDK. Instead, this works:
s3_bucket.object(old_key).copy_to( s3_bucket.object(new_key) )
or
s3_bucket.object(old_key).copy_to(bucket_name+'/'+new_key)
With the older buckets/objects API, the target bucket can likewise be passed as an option:
s3.buckets['bucket-name'].objects['source-key'].copy_to('target-key', :bucket_name => 'target-bucket')
A simplified example using the aws-sdk gem:
AWS.config(:access_key_id => '...', :secret_access_key => '...')
s3 = AWS::S3.new
s3.buckets['bucket-name'].objects['source-key'].copy_to('target-key')

file duplication using asset_sync gem on rails 4

I'm having some problems with assets on Heroku (Rails) and am hoping someone can point me in the right direction. I've got the asset_sync gem installed, and after many hours of debugging I've finally got it working. However, when I first run "git push heroku master" (with an empty S3 bucket), I get about four copies of every file uploaded to S3, each with a different hash appended. Also, somehow a lot of files I previously deleted (which are no longer in my app/assets/images directory) are still getting uploaded. I've deleted the public/assets folder in my local copy and pushed to git, but perhaps that folder is still there on Heroku? How do I debug this? I want my assets to be properly synced, so that if I delete an image while developing locally, it will also be removed from S3 when I next deploy.
A possibly related problem: my static error pages (public/404.html) are not getting served on Heroku, yet they work fine in development. Are these static HTML files treated as assets and meant to be uploaded to S3 too?
Running heroku run rake assets:precompile does nothing. My asset_sync.rb initializer is:
if defined?(AssetSync)
  AssetSync.configure do |config|
    config.fog_provider = 'AWS'
    config.aws_access_key_id = 'key'
    config.aws_secret_access_key = 'key'
    config.fog_directory = 'bucketname'
    config.fog_region = 'us-east-1'
    config.existing_remote_files = "delete"
  end
end
I know I should be using environment variables, but hardcoding my access details shouldn't make any difference, at least while I'm testing.
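For reference, the environment-variable version of the same initializer would presumably look like this (the ENV names here are illustrative, not mandated by asset_sync):
if defined?(AssetSync)
  AssetSync.configure do |config|
    config.fog_provider = 'AWS'
    config.aws_access_key_id = ENV['AWS_ACCESS_KEY_ID']
    config.aws_secret_access_key = ENV['AWS_SECRET_ACCESS_KEY']
    config.fog_directory = ENV['FOG_DIRECTORY']       # bucket name
    config.fog_region = ENV['FOG_REGION']             # e.g. us-east-1
    config.existing_remote_files = "delete"
  end
end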
Thanks for any help.

[Rails] Initializing Fog in the development environment

I deploy my Rails app to Heroku, and I've set the AWS access keys on the server as environment variables. However, to test my application in the development environment, I need to initialize them somewhere on my local machine, so I decided to do the following.
/config/initializers/init_aws_locally.rb
ENV['AWS_ACCESS_KEY_ID'] = 'my key'
ENV['AWS_SECRET_ACCESS_KEY'] = 'my secret key'
This file is added to .gitignore.
However, when I upload in the development environment, I get this error message:
Missing required arguments: aws_access_key_id, aws_secret_access_key
I think I somehow overlooked a simple step needed to include my AWS keys in my development environment, but I'm not sure why I get the error when I've already initialized the keys.
For your reference, I'm using CarrierWave, S3, and Fog.
config/initializers/fog.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',                                     # required
    :aws_access_key_id => ENV['AWS_ACCESS_KEY_ID'],         # required
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY'], # required
    :region => 'us-east-1',                                 # optional, defaults to 'us-east-1'
  }
  config.fog_directory = 'd' # required
  config.fog_public = true   # optional, defaults to true
end
Thank you. I appreciate your help!
Your initializers will be run in alphabetical order. See the docs:
If you have any ordering dependency in your initializers, you can
control the load order by naming. For example, 01_critical.rb will be
loaded before 02_normal.rb.
The problem you're experiencing is that your fog.rb initializer is running before your init_aws_locally.rb one (because f comes before i). So ENV['AWS_ACCESS_KEY_ID'] has not been defined (yet) when you set fog_credentials.
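Given the naming rule quoted above, one minimal fix is to rename the initializer so it sorts before fog.rb, for example:
git mv config/initializers/init_aws_locally.rb config/initializers/00_init_aws_locally.rb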
I'd avoid putting any credentials in code; it's a terrible idea, and Heroku has the right approach here. So what I do is use RVM and put a .rvmrc file in my project folder. I put .rvmrc in .gitignore as well.
Then you edit .rvmrc to have:
export AWS_ACCESS_KEY_ID="BLAH"
and so on and so forth. Any time I "cd" into this directory, my env is set up for me for that project by RVM. (If you're not using RVM, there are other alternatives out there.) Then just run:
rails s
and it will have all the environment variables you put in your .rvmrc script. No need for an initializer or development-only YAML config files that you keep out of source control. In my experience this is the simplest solution.
I went to my shell and typed:
$ echo $AWS_SECRET_ACCESS_KEY
and it came back blank. It turns out I had recently moved to a new virtual machine and forgotten to add this to my .bashrc file. It's worth checking the shell environment just in case. Once I added those two lines to my .bashrc, everything was happy again.
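The two lines in question, matching the variable names used in the fog.rb initializer above:
export AWS_ACCESS_KEY_ID="my key"
export AWS_SECRET_ACCESS_KEY="my secret key"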
