Rails Generate controller aws error missing bucket name - ruby-on-rails

I am attempting to create a Users controller in my Ruby on Rails project, which I have also configured with Heroku and an AWS S3 bucket. I set S3_BUCKET, AWS_ACCESS_KEY_ID, and AWS_SECRET_ACCESS_KEY in my .env file and in my Heroku config. I also set my config/initializers/aws.rb file to look like this:
Aws.config.update({
  region: 'us-east-1',
  credentials: Aws::Credentials.new(ENV['AWS_ACCESS_KEY_ID'], ENV['AWS_SECRET_ACCESS_KEY']),
})
S3_BUCKET = Aws::S3::Resource.new.bucket(ENV['S3_BUCKET'])
I have bundle-installed the AWS gem like this:
gem 'aws-sdk', '~> 3'
However when I run the command
rails g controller Users new
I get the following error in my terminal:
aws-sdk-s3/bucket.rb:658:in `extract_name': missing required option :name (ArgumentError)
I looked at that file and it is trying to find the S3 bucket name, but I have set that already in .env and heroku local. Is there some other place where this needs to be set? None of the guides I have read mention this error.

Hi, please check whether you have specified the right credentials and bucket name. Also, make sure you have provided the right region. Try the code below:
s3 = Aws::S3::Resource.new(
  credentials: Aws::Credentials.new(ENV['AWS_ACCESS_KEY_ID'], ENV['AWS_SECRET_ACCESS_KEY']),
  region: 'us-west-1'
)
obj = s3.bucket(ENV['S3_BUCKET']).object('key')
If you want to upload a file or something similar:
obj.upload_file(file, acl:'public-read')
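One more thing worth checking (a hedged guess): bucket(nil) raises exactly that missing required option :name error, and Rails does not read a .env file on its own, so ENV['S3_BUCKET'] may simply be empty when the initializer runs unless something like the dotenv-rails gem loads it. A small fail-fast guard in the initializer makes that case obvious:
# config/initializers/aws.rb -- a sketch: stop with a clear message if the variable never reached the environment
raise 'S3_BUCKET is not set (check how .env is loaded, and heroku config)' if ENV['S3_BUCKET'].to_s.empty?

S3_BUCKET = Aws::S3::Resource.new.bucket(ENV['S3_BUCKET'])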

This should help you; I have used it like this in my project.
1. Create the file aws.rb in your /config/initializers folder.
2. Then copy the code below:
S3Client = Aws::S3::Client.new(
  access_key_id: 'ACCESS_KEY_ID',
  secret_access_key: 'SECRET_ACCESS_KEY',
  region: 'REGION'
)
That's all, this works.
Happy coding :)
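If it helps, here is a hedged sketch of using that client afterwards (the bucket name and key are placeholders, not part of the original answer):
# Upload a small object with the S3Client defined in the initializer above
S3Client.put_object(bucket: ENV['S3_BUCKET'], key: 'test/hello.txt', body: 'hello world')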

Related

Sitemap generator not uploading sitemap

I started playing with a test app trying to upload sitemaps using Amazon S3. I've been following https://github.com/kjvarga/sitemap_generator trying to figure out the gem and have only been half successful. A sitemap will generate in the public folder, but not upload to the S3 bucket.
I've added the config/sitemap.rb found in the tutorial above.
require 'rubygems'
require 'sitemap_generator'
require 'aws-sdk'
SitemapGenerator::Sitemap.create_index = true
SitemapGenerator::Sitemap.default_host = 'https://www.myapp.herokuapp.com'
SitemapGenerator::Sitemap.create do
  add '/home', :changefreq => 'daily', :priority => 0.9
end
SitemapGenerator::Sitemap.adapter = SitemapGenerator::S3Adapter.new(
  fog_provider: 'AWS',
  aws_access_key_id: 'KEY',
  aws_secret_access_key: 'SECRET',
  fog_directory: 'DIR',
  fog_region: 'REGION'
)
I type in
rails sitemap:refresh
in my terminal and it generates the maps. It just doesn't upload them. No errors, no clues as to what didn't happen, nothing. It even tells me that Google and Bing were successfully pinged.
Of course I can visit my AWS bucket and manually upload these files but that feels...wrong. I've used shrine for images in the past and am used to uploading to a cache. There must be something I missed.
Check your secrets: maybe you don't have the AWS account environment variables set, so the S3 adapter will never work. Also check the policy on your bucket.
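As a hedged sketch of that suggestion, the adapter can read the same environment variables instead of hard-coded placeholders (the variable names below are an assumption, matching the ones used elsewhere in the app):
SitemapGenerator::Sitemap.adapter = SitemapGenerator::S3Adapter.new(
  fog_provider: 'AWS',
  aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],
  aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
  fog_directory: ENV['S3_BUCKET'],
  fog_region: ENV['AWS_REGION'] || 'us-east-1'
)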

Rails Active Storage push to DigitalOcean Spaces

Hi, I'm trying to get Active Storage to push to a DigitalOcean Space. However, I'm finding that the push URL is being changed to amazonaws.com even though I've pointed the endpoint at DigitalOcean.
Here is what I have in storage.yml:
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: sfo2
  bucket: redacted_bucket_name
  endpoint: https://sfo2.digitaloceanspaces.com
When I try to upload a file, I get the following error:
Aws::Errors::NoSuchEndpointError (Encountered a `SocketError` while attempting to connect to:
https://redacted_bucket_name.s3.sfo2.amazonaws.com/a8278561714955c23ee99
In my Gemfile I have: gem 'aws-sdk-s3'
I've followed the directions found here, and I'm still getting the error. Is it possible that there's a new way to do this?
I just set something like this up myself a few days ago. The URL in the error, https://redacted_bucket_name.s3.sfo2.amazonaws.com/a8278561714955c23ee99, is different from the actual endpoint you set up, https://redacted_bucket_name.sfo2.amazonaws.com/a8278561714955c23ee99.
The error is being caused by an invalid endpoint you're hitting; the s3 right before the .sfo2 is offsetting the endpoint. Did you happen to add s3 to your Spaces config? Check your Spaces dashboard and try to get the endpoint set up properly.
I had this same challenge when working on a Rails 6 application in Ubuntu 20.04.
Here's how I fixed mine:
Firstly, create Spaces access keys in your DigitalOcean console. This link should help - DigitalOcean Spaces API
Secondly, add a new configuration for DigitalOcean Spaces in your config/storage.yml file. Just after the local storage definition:
# The SPACES_* values come from environment variables (set below via dotenv or Figaro)
digital_ocean:
  service: S3
  access_key_id: <%= ENV['SPACES_ACCESS_KEY_ID'] %>
  secret_access_key: <%= ENV['SPACES_SECRET_ACCESS_KEY'] %>
  region: <%= ENV['SPACES_REGION'] %>
  bucket: <%= ENV['SPACES_BUCKET_NAME'] %>
  endpoint: <%= ENV['SPACES_ENDPOINT'] %>
Note: You can give your entry any name, say digital_ocean_spaces or something else. I named mine digital_ocean.
Thirdly, modify the config.active_storage.service configuration in the config/environments/production.rb file from:
config.active_storage.service = :local
to
config.active_storage.service = :digital_ocean
Finally, specify these environment variables in your config/application.yml file (if you're using the Figaro gem) or your .env file (if you're using the dotenv gem). In my case I was using the dotenv gem, so my .env file looked like this:
SPACES_ACCESS_KEY_ID=E4TFWVPDBLRTLUNZEIFMR
SPACES_SECRET_ACCESS_KEY=BBefjTJTFHYVNThun7GUPCeT2rNDJ4UxGLiSTM70Ac3NR
SPACES_REGION=nyc3
SPACES_BUCKET_NAME=my-spaces
SPACES_ENDPOINT=https://nyc3.digitaloceanspaces.com
That's all.
I hope this helps
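For completeness, a minimal sketch of actually using the configured :digital_ocean service with Active Storage (the model and attachment names here are hypothetical):
# app/models/user.rb
class User < ApplicationRecord
  has_one_attached :avatar   # stored via the service set in config.active_storage.service
end

# e.g. from a controller or the console:
# user.avatar.attach(io: File.open('/tmp/photo.png'), filename: 'photo.png')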

Rails 5.2 credentials.yaml.enc and master.key not working on Heroku

I'm setting up Active Storage for a new app, and haven't been able to get the app running in production after setting up my Amazon credentials.
I've included my s3 bucket credentials in my credentials.yaml.enc file
I've added my RAILS_MASTER_KEY env variable to Heroku.
I've set up my s3 bucket in the storage.yml file according to this.
I've added the config.active_storage.service = :amazon line to my production.rb.
I've added config.require_master_key = true to my production.rb
When I try running my app on Heroku, it doesn't load. Doing $ heroku run rails console gives me the error:
"/app/vendor/bundle/ruby/2.3.0/gems/aws-sigv4-1.0.2/lib/aws-sigv4/signer.rb:517:in `extract_credentials_provider': Cannot load `Rails.config.active_storage.service`: (Aws::Sigv4::Errors::MissingCredentialsError)
missing credentials, provide credentials with one of the following options:
- :access_key_id and :secret_access_key
- :credentials
- :credentials_provider"
As far as I can tell I've set up my credentials the way Rails 5.2 intended. I've tried all sorts of asset precompiling stuff to no avail. When I try adding my amazon credentials as env. variables in Heroku, the app runs fine in production. Any idea what could be going wrong here?
Could it be that you forgot to add config.require_master_key = true to your production.rb?
I've had this problem before; it seems to be a bug on Heroku.
You should just set up your environment variables through the Heroku dashboard, on the Settings tab.
Then you can access them using ENV['NAME_OF_YOUR_VARIABLE'].
This fixed my problem.
Also check your Heroku logs carefully, scrolling up to ensure all gems were installed.
Double check that you have the correct key in your config/credentials.yml.enc file. I had one key inverted (secret_key_access instead of secret_access_key) and was getting the same error. Fixing the key name in config/credentials.yml.enc fixed it for me.
In your rails console (locally), run:
Rails.application.credentials.dig(:aws, :access_key_id)
and
Rails.application.credentials.dig(:aws, :secret_access_key)
to make sure they have values.
Welp, this was dumb. Mystery solved. My credentials were commented out in the credentials.yaml.enc file - I was adding them to the top of the file with the default aws example, which is commented out.
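For anyone hitting the same thing: the file opened by rails credentials:edit ships with the aws example commented out, so the keys only take effect once that block is uncommented, roughly like this (values are placeholders):
aws:
  access_key_id: YOUR_ACCESS_KEY_ID
  secret_access_key: YOUR_SECRET_ACCESS_KEY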

Fog-Google, Unable to Push to Heroku

So I am trying to set up uploading of files in my production environment. I am currently using CarrierWave along with Fog-Google. I have no issues storing files locally, as I do not use Fog for development. However, I am currently trying to test out the file uploading functionality in production, but I cannot even push my app up to Heroku.
Here's a snippet of the error I'm receiving when attempting to push to Heroku.
[fog][WARNING] Unrecognized arguments: google_storage_secret_access_key_id, google_storage_secret_access_key
rake aborted!
ArgumentError: Invalid keyfile or passphrase
Now I am relatively new to setting up ENV secret IDs and all of that. So instead I'll just say what I know and what I have done, just to be sure that I did everything correctly.
As I am currently using the Cloud9 IDE, in my .bashrc file I have
export GOOGLE_STORAGE_ACCESS_KEY_ID=XXXXX
export GOOGLE_STORAGE_SECRET_ACCESS_KEY=XXXXX
export GOOGLE_STORAGE_BUCKET_NAME=XXXXX
In my /config/initializers/carrierwave.rb
require 'carrierwave'
CarrierWave.configure do |config|
  config.fog_provider = 'fog/google' # required
  config.fog_credentials = {
    provider: 'Google',
    google_storage_access_key_id: ENV['GOOGLE_STORAGE_ACCESS_KEY_ID'],
    google_storage_secret_access_key: ENV['GOOGLE_STORAGE_SECRET_ACCESS_KEY']
  }
  config.fog_directory = ENV['GOOGLE_STORAGE_BUCKET_NAME']
end
and in my /config/initializers/fog.rb
GoogleStorage = Fog::Storage.new(
  provider: 'Google',
  google_project: 'XXXX',
  google_client_email: 'XXXXXX',
  google_key_location: Rails.root.join('private', 'google-cloud-service-key.p12'),
  google_storage_secret_access_key_id: ENV["GOOGLE_STORAGE_SECRET_ACCESS_KEY_ID"],
  google_storage_secret_access_key: ENV["GOOGLE_STORAGE_SECRET_ACCESS_KEY"]
)
As mentioned, I am actually quite new to all of this, so I've tried my best to follow the documentation on both Fog's and CarrierWave's GitHub pages.
As far as I know, I should use .bashrc to store my secret keys and so on, and then call on them using ENV['SECRET_KEY_NAME']. I've set up both the carrierwave.rb and fog.rb files in the initializers folder, so I'm not quite sure what I'm missing.
Additionally, I have also tried doing heroku config:set GOOGLE_STORAGE_SECRET_ACCESS_KEY_ID=XXXXXX but that didn't seem to work either.
I'm not quite sure what to do now and what may be causing the error when attempting to push to Heroku, much less whether any of this is even working in production.
EDIT:
I think the error is largely from the fog.rb file, so I amended it to the following:
GoogleStorage = Fog::Storage::Google.new(
  google_project: 'XXX',
  google_client_email: 'XXX',
  google_json_key_location: '~/.fog/XXXX.json'
)
And now when I try pushing to Heroku, the error I get instead is
Errno::ENOENT: No such file or directory # rb_sysopen - /app/.fog/XXX.json
Just to share, I created a .fog folder under the ~ directory. Inside the .fog folder I added the private JSON key.
All help and advice would be greatly appreciated. Thank you!
So I've solved the problem and managed to successfully push my code to Heroku.
[Kindly note that this does not mean it functions perfectly in production; it just means I am now able to push to Heroku without any errors.]
So there were 2 main errors.
1) Not including my Private Key JSON file in my app.
2) Error in the config/initializers/fog.rb file
To solve the first issue, I created a .fog folder in my app and added the private key JSON file from Google Cloud Platform.
Next, I amended the code in config/initializers/fog.rb to:
GoogleStorage = Fog::Storage::Google.new(
  :google_project => 'XXXX',
  :google_client_email => 'XXXXX',
  :google_json_key_location => Rails.root.join('.fog/XXXX'),
  :google_storage_access_key_id => ENV['GOOGLE_STORAGE_ACCESS_KEY_ID'],
  :google_storage_secret_access_key => ENV['GOOGLE_STORAGE_SECRET_ACCESS_KEY']
)
I was then able to successfully push my code to Heroku.
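To sanity-check the credentials after a change like that, one hedged option is to query the bucket from the Rails console (the environment variable is the bucket name exported in .bashrc above; this is a sketch, not part of the original answer):
# Returns a Fog directory object if the key file, project, and bucket are valid
GoogleStorage.directories.get(ENV['GOOGLE_STORAGE_BUCKET_NAME'])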

[Rails] Initializing for Fog in development environment

I deploy my Rails app to Heroku, and I've set the AWS access keys on the server as environment variables. However, to test my application in the development environment, I need to initialize them somewhere on my local machine. So I decided to do the following.
/config/initializers/init_aws_locally.rb
ENV['AWS_ACCESS_KEY_ID'] = 'my key'
ENV['AWS_SECRET_ACCESS_KEY'] = 'my secret key'
This file is added to .gitignore.
However, when I upload in the development environment, I get this error message:
Missing required arguments: aws_access_key_id, aws_secret_access_key
I think somehow I overlooked a simple step to include my AWS keys in my development environment, but I'm not sure of the reason for the error when I have already initialized the keys.
For your reference, I'm using carrierwave, S3, and Fog.
config/initializers/fog.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',                                      # required
    :aws_access_key_id => ENV['AWS_ACCESS_KEY_ID'],          # required
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY'],  # required
    :region => 'us-east-1',                                  # optional, defaults to 'us-east-1'
  }
  config.fog_directory = 'd'   # required
  config.fog_public = true     # optional, defaults to true
end
Thank you. I appreciate your help!
Your initializers will be run in alphabetical order. See the docs:
If you have any ordering dependency in your initializers, you can
control the load order by naming. For example, 01_critical.rb will be
loaded before 02_normal.rb.
The problem you're experiencing is that your fog.rb initializer is running before your init_aws_locally.rb one (because f comes before i). So ENV['AWS_ACCESS_KEY_ID'] has not been defined (yet) when you set fog_credentials.
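A minimal sketch of a fix along those lines, if you keep the local-only initializer approach: name the env-setting file so it sorts before fog.rb (the filename below is just an example):
# config/initializers/00_aws_local_env.rb
# Only set these locally; Heroku already provides them as real config vars.
if Rails.env.development?
  ENV['AWS_ACCESS_KEY_ID']     ||= 'my key'
  ENV['AWS_SECRET_ACCESS_KEY'] ||= 'my secret key'
end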
I'd avoid putting any credentials in code. It's such a terrible idea and Heroku has the right idea. So what I do is use RVM and put a file .rvmrc in my project folder. I put .rvmrc in .gitignore as well.
Then you edit .rvmrc to have
export AWS_ACCESS_KEY_ID="BLAH"
and so on and so forth. Any time I cd into this directory, my environment is set up for me for that project by RVM. If you're not using RVM, there are other alternatives out there. Then run
rails s
and it will have all the environment variables set up that you put in your .rvmrc script. No need for an initializer or development-only YAML config files that you keep out of source control. In my experience this is the simplest solution.
I went to my shell and typed:
$ echo $AWS_SECRET_ACCESS_KEY
and it came back blank. It turns out I recently moved to a new virtual machine and forgot to add this to the .bashrc file. It's worth checking the shell environment just in case. Once I added those two lines to my .bashrc, everything was happy again.
