Switch AWS Accounts in the Ruby Shell - ruby-on-rails

I'm trying to access an s3 bucket from within the interactive ruby shell, using different AWS credentials than the ones the application is configured with.
I've tried manually setting up a new S3 client using the other key/secret, but I get Access Denied because the call defaults to the application's preconfigured AWS account. Modifying the application's configured credentials is not an option, as they're needed to simultaneously access other AWS resources.
Here's what I'm trying in the ruby shell:
s3_2 = AWS::S3.new(:access_key_id => "<key>", :secret_access_key => "<secret>")
bucket = s3_2.buckets['<bucket_name>']
bucket.objects.each do |obj|
  puts obj.key
end
(The test just does a GET to confirm access. It works if I make the bucket public, since that allows any AWS user, but not when I restrict it and try to use the new temporary user that has S3 full access on the account.)

The Rails console runs as a separate instance of the app from the server process that uses the pre-configured credentials. The following should update the credentials for the Rails console session only:
Aws.config.update(credentials: Aws::Credentials.new('your_access_key_id', 'your_secret_access_key'))

Alternatively, a new S3 client can be initialized with its own credentials:
s3_2 = Aws::S3::Client.new(access_key_id: "<key>", secret_access_key: "<secret>")
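If the app is on v2/v3 of the aws-sdk gem (the Aws namespace shown above), listing the bucket through per-client credentials might look like this sketch; the region and names are placeholders:
require 'aws-sdk-s3'

s3_2 = Aws::S3::Resource.new(
  region: 'us-east-1',                                   # placeholder region
  credentials: Aws::Credentials.new('<key>', '<secret>')
)

# List the keys in the restricted bucket to confirm the new credentials work.
s3_2.bucket('<bucket_name>').objects.each do |obj|
  puts obj.key
end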

Related

Create correct access_key_id and secret_access_key in Amazon SES for aws-ses gem

How do I correctly create and set up an access_key_id and secret_access_key in Amazon Simple Email Service (SES) for the aws-ses gem? The gem's description says to provide exactly these in the credentials file, but I can't figure out how to create them.
My configuration for the aws-ses gem:
# config/initializers/amazon_ses.rb
ActionMailer::Base.add_delivery_method :ses, AWS::SES::Base,
  :server => 'email.us-west-2.amazonaws.com',
  :access_key_id => Rails.application.credentials.aws[:access_key_id],
  :secret_access_key => Rails.application.credentials.aws[:secret_access_key]
I configured the SES service itself by adding my personal domain to it and testing sending emails from the Amazon console. The service offers SMTP settings, but those generate a completely different type of key, which is not suitable for the aws-ses gem.
I also tried creating keys for a new user through Identity and Access Management (IAM), granting full access to Amazon SES.
But none of this helped: Amazon SES does not work, and when sending messages through Sidekiq I get errors of the form:
AWS::SES::ResponseError: InvalidClientTokenId - The security token included in the request is invalid.
There are several ways to specify credentials for the AWS SDK for Ruby. Hopefully the following topic helps you: https://docs.aws.amazon.com/sdk-for-ruby/v3/developer-guide/setup-config.html
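As one of those ways, a minimal sketch of the same initializer reading IAM API access keys (not the SES SMTP credentials) from the SDK's standard environment variables could look like this; the variable names are the usual AWS ones and the server value is taken from the question:
# config/initializers/amazon_ses.rb
# Sketch: use IAM *API* access keys from the environment, not SMTP credentials.
ActionMailer::Base.add_delivery_method :ses, AWS::SES::Base,
  :server => 'email.us-west-2.amazonaws.com',
  :access_key_id => ENV['AWS_ACCESS_KEY_ID'],
  :secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']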

Configuring ActiveStorage to use S3 with IAM role

I'm trying to configure ActiveStorage to use an S3 bucket as a storage backend; however, I don't want to pass any of access_key_id, secret_access_key, or region. Instead, I'd like to use a previously defined IAM role. Such a configuration is mentioned here. It reads (I've added bold):
If you want to use environment variables, standard SDK configuration files, profiles, IAM instance profiles or task roles, you can omit the access_key_id, secret_access_key, and region keys in the example above. The Amazon S3 Service supports all of the authentication options described in the AWS SDK documentation.
However I cannot get it working. My storage.yml looks similar to this:
amazon:
  service: S3
  bucket: bucket_name
  credentials:
    role_arn: "linked::account::arn"
    role_session_name: "session-name"
I've run rails active_storage:install, applied the generated migrations, and set config.active_storage.service = :amazon in my app's config.
The issue is that when I'm trying to save a file, I'm getting an unexpected error:
u = User.first
s = StringIO.new
s << 'hello,world'
s.seek 0
u.csv.attach(io: s, filename: 'filename.csv')
Traceback (most recent call last):
2: from (irb):3
1: from (irb):3:in `rescue in irb_binding'
LoadError (Unable to autoload constant ActiveStorage::Blob::Analyzable, expected /usr/local/bundle/gems/activestorage-5.2.2/app/models/active_storage/blob/analyzable.rb to define it)
I'm using Rails 5.2.2.
Are you trying this code inside an AWS EC2 instance or locally in your machine?
If you check the authentication methods in AWS: https://docs.aws.amazon.com/sdk-for-ruby/v3/developer-guide/setup-config.html#aws-ruby-sdk-credentials-iam
You'll see the following section:
Setting Credentials Using IAM
For an Amazon Elastic Compute Cloud instance, create an AWS Identity and Access Management role, and then give your Amazon EC2 instance access to that role. For more information, see IAM Roles for Amazon EC2 in the Amazon EC2 User Guide for Linux Instances or IAM Roles for Amazon EC2 in the Amazon EC2 User Guide for Windows Instances.
This means that for this authentication method to work, you must:
Create an EC2 instance on AWS
Create an EC2 IAM Role with permissions to write to an S3 Bucket
Configure your EC2 instance by attaching the new IAM Role to it
With the role attached to the instance, your config/storage.yml file will look like this:
amazon:
  service: S3
  bucket: test-stackoverflow-bucket-app
  region: "us-west-1"
Note that region is a required parameter, you'll get an error if you skip it: https://github.com/aws/aws-sdk-ruby/issues/1240#issuecomment-231866239
I'm afraid this won't work locally; to use Active Storage locally you must set the access_key_id and secret_access_key values.
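If you do need to exercise the role from a local machine, a hedged workaround outside of the storage.yml config is to assume the role explicitly via STS and hand the resulting credentials to an S3 client; the region, role ARN, session name, and bucket below are placeholders:
require 'aws-sdk-s3'  # also provides the STS client via aws-sdk-core

# Sketch: assume the role explicitly (all values are placeholders).
role_credentials = Aws::AssumeRoleCredentials.new(
  client: Aws::STS::Client.new(region: 'us-west-1'),
  role_arn: 'arn:aws:iam::123456789012:role/my-app-role',
  role_session_name: 'active-storage-test'
)

s3 = Aws::S3::Client.new(region: 'us-west-1', credentials: role_credentials)
s3.list_objects_v2(bucket: 'test-stackoverflow-bucket-app').contents.each do |obj|
  puts obj.key
end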

AWS::S3::Errors::AccessDenied: Access Denied when trying to do copy_to

I have written a rake task that does a copy_to from one directory in a bucket to another directory within the same bucket. When I test it locally it works fine, but when I deploy it to an environment it returns AWS::S3::Errors::AccessDenied: Access Denied. I assume that it has something to do with the AWS credentials on the environment I am deploying to. I am also confident that the problem is to do with the copy_to, as I accessed the bucket from the Rails console and had no issues.
My copy statement is as follows:
creds = YAML::load_file(Rails.root.join("config", "s3.yml"))
AWS.config(aws_access_key_id: creds[:access_key_id],
           aws_secret_access_key: creds[:secret_access_key])
s3.buckets['test-bucket'].objects['path to file'].copy_to('new_path')
The parameters to AWS.config are access_key_id and secret_access_key, without the aws_ prefix.
http://docs.aws.amazon.com/AWSRubySDK/latest/AWS.html#config-class_method
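A sketch of the corrected config call, using the parameter names from the docs above, would be:
# Same call, with the parameter names the v1 SDK expects.
creds = YAML::load_file(Rails.root.join("config", "s3.yml"))
AWS.config(access_key_id: creds[:access_key_id],
           secret_access_key: creds[:secret_access_key])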
Found this because I also received Access Denied when calling copy_to(). While older SDK versions were happy to accept a pure key path as a parameter to copy_to, newer versions require you to specify the bucket too.
In my case
s3_bucket.object(old_key).copy_to(new_key)
did not work and produced a rather unhelpful "Access Denied" error with the v3 SDK. Instead, this works:
s3_bucket.object(old_key).copy_to( s3_bucket.object(new_key) )
or
s3_bucket.object(old_key).copy_to(bucket_name+'/'+new_key)
s3.buckets['bucket-name'].objects['source-key'].copy_to('target-key', :bucket_name => 'target-bucket')
A simplified example using the aws-sdk gem:
AWS.config(:access_key_id => '...', :secret_access_key => '...')
s3 = AWS::S3.new
s3.buckets['bucket-name'].objects['source-key'].copy_to('target-key')
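For completeness, a hedged sketch of the same copy with v3 of the aws-sdk-s3 gem, passing the target bucket explicitly (region, keys, and bucket names are placeholders):
require 'aws-sdk-s3'

s3 = Aws::S3::Resource.new(
  region: 'us-east-1',  # placeholder region
  credentials: Aws::Credentials.new('...', '...')
)

source = s3.bucket('bucket-name').object('source-key')
# copy_to also accepts a hash naming the target bucket and key.
source.copy_to(bucket: 'target-bucket', key: 'target-key')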

How to pull down assets locally from S3 Bucket - via Heroku

The only tool I could find, I forked and tried to update to include the S3_REGION, because I was getting:
The bucket you are attempting to access must be addressed using the specified endpoint
These are all the variables I am passing to access the bucket.
opts[:s3_key] =======> AKIAJHXXG*********YA
opts[:s3_secret] =======> uYXxuA*******************pCcXuT61DI7po2
opts[:s3_bucket] =======> *****
opts[:output_path] =======> /Users/myname/Desktop/projects/my_project/public/system
opts[:s3_region] =======> s3-us-west-2.amazonaws.com
https://github.com/rounders/heroku-s3assets has not been updated in a while, so I'm assuming I just can't find where the actual error is breaking, either in the Heroku tools or the older aws-s3 gem.
Does anyone have a method to pull down production assets to the Heroku server from Amazon S3?
I think I misunderstood you, so editing now... maybe experiment with something simpler:
http://priyankapathak.wordpress.com/2012/12/28/download-assets-from-amazon-s3-via-ruby/
My search returned this info:
Bucket is in a different region
The Amazon S3 bucket specified in the COPY command must be in the same region as the cluster. If your Amazon S3 bucket and your cluster are in different regions, you will receive an error similar to the following:
ERROR: S3ServiceException:The bucket you are attempting to access must be addressed using the specified endpoint.
You can create an Amazon S3 bucket in a specific region either by selecting the region when you create the bucket by using the Amazon S3 Management Console, or by specifying an endpoint when you create the bucket using the Amazon S3 API or CLI. For more information, see Uploading files to Amazon S3.
For more information about Amazon S3 regions, see Buckets and Regions in the Amazon Simple Storage Service Developer Guide.
Alternatively, you can specify the region using the REGION option with the COPY command.
http://docs.aws.amazon.com/redshift/latest/dg/s3serviceexception-error.html
So it turns out that gem was all but useless. I've gotten further toward my goal of downloading all my S3 assets to public/system, but still cannot figure out how to download them to the correct local Rails directory using the AWS S3 docs - http://docs.aws.amazon.com/AWSRubySDK/latest/AWS/S3/S3Object.html
s3 = AWS::S3.new(access_key_id: 'AKIAJH*********PFYA', secret_access_key: 'uYXxuAMcnKODn***************uT61DI7po2', s3_endpoint: 's3-us-west-2.amazonaws.com')
s3.buckets['advlo'].objects.each do |obj|
  puts obj.inspect
end
I probably just need to read more unix commands and scp them over individually or something. Any ideas?
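A hedged sketch of downloading each object into a local directory with the same v1 SDK the snippet above uses (key, secret, endpoint, bucket, and output path are placeholders) could look like:
require 'aws-sdk'   # v1 of the SDK, as in the snippet above
require 'fileutils'

s3 = AWS::S3.new(access_key_id: '<key>',
                 secret_access_key: '<secret>',
                 s3_endpoint: 's3-us-west-2.amazonaws.com')

output_dir = Rails.root.join('public', 'system')

s3.buckets['advlo'].objects.each do |obj|
  next if obj.key.end_with?('/')                  # skip "directory" placeholder keys
  local_path = File.join(output_dir, obj.key)
  FileUtils.mkdir_p(File.dirname(local_path))     # mirror the bucket's folder structure
  File.open(local_path, 'wb') do |file|
    obj.read { |chunk| file.write(chunk) }        # stream each object to disk
  end
end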

Rails/ Heroku: Setting up code to read variables at run time

I made a website that uses the Twitter Ruby gem. On local host, I can get the Twitter gem to work fine, but when I deployed it to Heroku, I'm having trouble signing in via Twitter.
Heroku provides instructions (using Amazon S3 variables) about adding the CONSUMER_KEY and the CONSUMER_SECRET:
$ cd myapp
$ heroku config:add S3_KEY=some_key_here S3_SECRET=some_secret_here
I did that.
Then when I go to test sign-in, I get this in the URL. This URL is the same as when (on localhost) I forget to add CONSUMER_KEY etc., so I'm thinking that I didn't set up the CONSUMER_KEY and CONSUMER_SECRET properly on Heroku...
Heroku provides further details about setting up a file in config/initializers to read the variables at runtime, but I think the GitHub project I forked and then adapted already has this set up https://github.com/sferik/sign-in-with-twitter/blob/master/config/initializers/omniauth.rb so I'm not sure what's going on.
Set up your code to read the vars at runtime in config/initializers/s3.rb:
AWS::S3::Base.establish_connection!(
  :access_key_id => ENV['S3_KEY'],
  :secret_access_key => ENV['S3_SECRET']
)
UPDATE
Error message in the Heroku logs when I try to sign in via Twitter. Note, I can sign in via localhost.
2012-01-02T20:01:43+00:00 app[web.1]:
2012-01-02T20:01:43+00:00 app[web.1]: Started GET "/auth/twitter?utf8=%E2%9C%93" for 64.46.7.250 at 2012-01-02 20:01:43 +0000
2012-01-02T20:01:44+00:00 app[web.1]:
2012-01-02T20:01:44+00:00 app[web.1]: OAuth::Unauthorized (401 Unauthorized):
2012-01-02T20:01:44+00:00 app[web.1]:
This could be one of two things.
Either your keys are not set up correctly, or the route that the Twitter OAuth callback comes back to is not correct (in that it is not processing correctly).
The easiest to check is your Heroku config:
heroku config --app your_app_name
This should show CONSUMER_KEY and CONSUMER_SECRET (I'm assuming your use of S3_KEY and S3_SECRET above are placeholders)
Should this config be correct and it still doesn't work, you'll need to dig into the processing of the request that comes back to your application. It seems to be processing /auth/twitter as it should, but something is not correct within this step.
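For reference, a minimal sketch of an initializer that reads the Twitter keys from config vars at runtime, along the lines of the linked omniauth.rb (the ENV names here are assumptions and may differ in your fork):
# config/initializers/omniauth.rb
# Sketch: read the keys from Heroku config vars at runtime.
Rails.application.config.middleware.use OmniAuth::Builder do
  provider :twitter, ENV['CONSUMER_KEY'], ENV['CONSUMER_SECRET']
end
The corresponding vars would then be set with heroku config:add CONSUMER_KEY=... CONSUMER_SECRET=..., matching the S3_KEY/S3_SECRET example above.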
