Travis CI deploy to S3 bucket not working with secure keys - travis-ci

I have a static website and I'm trying to use Travis CI to migrate content to the S3 bucket where I'm hosting the website each time I commit changes to GitHub. To support this, I have the following .travis.yml file:
language: python
python: '2.7'
install: true
script: true
deploy:
  provider: s3
  access_key_id: XXXXX
  secret_access_key: YYYYY
  bucket: thug-r.life
  skip_cleanup: true
  region: us-east-1
  local_dir: public
which works fine, except that my secret key is now in plain text in a public GitHub repo. So... that's bad. Travis CI has a section on encrypting keys (https://docs.travis-ci.com/user/encryption-keys/), which I followed. Using the CLI tool
travis encrypt secret_access_key="YYYYY" --add
which inserts at the bottom of my file
env:
  global:
    secure: ZZZZZ
So I tried to modify my original file to look like
deploy:
  secret_access_key:
    secure: ZZZZZ
But then Travis CI complains: 'The request signature we calculated does not match the signature you provided.'
So I tried encrypting without quotes
travis encrypt secret_access_key=YYYYY --add
and using the output in the same way.
How am I supposed to include the encrypted key?

All of the examples in the Travis CI help on encrypting keys (https://docs.travis-ci.com/user/encryption-keys/) were of the form:
travis encrypt SOMEVAR="secretvalue"
which it states encrypts the key as well as the value. So, taking the output of the above encryption and using it like above
deploy:
  secret_access_key:
    secure: ZZZZZ
decrypts to be
deploy:
  secret_access_key: secret_access_key: YYYYY
which is what was causing the errors. Instead, what I ended up doing that worked was:
travis encrypt "YYYYY" --add
and used it in the .travis.yml file as
deploy:
  secret_access_key:
    secure: ZZZZZ
which ended up being accepted.
tl;dr Don't include the key name when encrypting the secret_access_key.
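Putting the pieces together, the working .travis.yml ends up looking like this (values are placeholders, with ZZZZZ standing for the output of travis encrypt "YYYYY"):

```yaml
language: python
python: '2.7'
install: true
script: true
deploy:
  provider: s3
  access_key_id: XXXXX
  secret_access_key:
    secure: ZZZZZ
  bucket: thug-r.life
  skip_cleanup: true
  region: us-east-1
  local_dir: public
```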

Related

Production Server Uploading to Staging S3

I have an app running on an Ubuntu server, with a production mode and a staging mode.
The problem is that uploads to and retrievals from the S3 bucket on the production site go to the same bucket as staging, even though my configurations are set up differently.
production.rb
config.s3_bucket = 'bucket-production'
config.s3_path = 'https://bucket-production.s3.us-east-2.amazonaws.com/'
staging.rb && development.rb
config.s3_bucket = 'bucket-staging'
config.s3_path = 'https://bucket-staging.s3.us-east-2.amazonaws.com/'
storage.yml
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-east-2
  bucket: bucket-staging
  endpoint: http://bucket-staging.us-east-2.amazonaws.com
I'm thinking it could be something with storage.yml, but I deleted that entire file and restarted the localhost server and it didn't change anything. Is storage.yml production-only?
Also, my production logs are being written to staging.
I would like to ask: is the prod/staging server (Ubuntu) running in AWS? If yes, the server should have an IAM role attached, fine-grained so that each environment's application can only access its own S3 bucket. Storing the access_key_id and secret_access_key directly is not best practice; an IAM role can take care of that. I would also add that if the server is in a private subnet, you need either a NAT gateway or a VPC S3 endpoint to reach the bucket.
Also try logging the S3 connection in prod mode to see how it acquires credentials to access the bucket. You might be granting access through ENV variables or an IAM role. The best way to see is to run
printenv
before and after the S3 connection to check which variables are set and which bucket access is provided.
Thanks
Ashish
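One more thing worth checking, based on the files above: the storage.yml shown hardcodes bucket: bucket-staging, which by itself would send production uploads to the staging bucket no matter what production.rb says. Since storage.yml already runs through ERB for the credentials, a sketch of an environment-aware version (bucket names taken from the config above) could be:

```yaml
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-east-2
  bucket: <%= Rails.env.production? ? "bucket-production" : "bucket-staging" %>
```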

travis-ci 401 - Bad credentials // See: https://developer.github.com/v3 (Octokit::Unauthorized)

I created a GitHub personal access token with repo, read:repo_hook, and user:email scopes, then encrypted the token with travis encrypt, then pasted that string into my .travis.yml file in the deploy block, but when I tag a release, my job fails with:
Installing deploy dependencies
/home/travis/.rvm/gems/ruby-2.4.5/gems/octokit-4.6.2/lib/octokit/response/raise_error.rb:16:in `on_complete': GET https://api.github.com/user: 401 - Bad credentials // See: https://developer.github.com/v3 (Octokit::Unauthorized)
I know the encrypted string works, because it does successfully upload the files to my release tag and the github logs show that the key was accessed this week, but I can't figure out why this error is happening.
Here is a link to the build: https://travis-ci.com/github/benkonz/gameboy_emulator/jobs/331211965
Here is a link to the repo: https://github.com/benkonz/gameboy_emulator
I had the same issue. The reason was that my $GITHUB_TOKEN is set in the Travis settings, but my .travis.yml referenced it under secure:
deploy:
  ...
  api_key:
    secure: $GITHUB_TOKEN
  ...
So I changed my .travis.yml to
deploy:
  ...
  api_key: $GITHUB_TOKEN
  ...
which solved my problem.

Failing Travis CI tests when using secret keys in rails

I am using environment variables in secrets.yml for the production environment in my Rails app. I am sending an HTTP POST request with an API key and password. I can pass my local tests in the test environment by using the password, but my password can't be exposed, so how do I pass the Travis CI tests on GitHub?
You can encrypt your secrets.yml and push the encrypted file to the repository:
travis encrypt-file secrets.yml
which will give you secrets.yml.enc; add that to the repository. Remember not to push secrets.yml itself.
You then need to decrypt that file in before_script:
before_script: openssl aes-256-cbc -K $encrypted_0a6446eb3ae3_key -iv $encrypted_0a6446eb3ae3_iv -in secrets.yml.enc -out secrets.yml -d
You can add the above command to .travis.yml directly by using the --add option:
travis encrypt-file secrets.yml --add
Refer to this documentation for more details - Encrypting Files in Travis

Cannot upload files from S3 Development Bucket in Rails App

When running my rails app in development mode from Nitrous.io, I cannot access the development bucket I set up on AWS S3. The upload button opens my personal computer, from where I don't want to load files. (Even when I try to load files from my computer, I get a long error message stating "The request signature we calculated does not match the signature you provided. Check your key and signing method.")
I think I don't have AWS S3 configured properly.
Currently, I have one IAM user, which I've assigned AdministratorAccess. Also, I am using the proper AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in my application.yml file. In fog.rb I have it read from the environment.
I should add too that I am currently enrolled in a web development apprenticeship program.
Sorry for not showing my files
Here's my application.yml with the sensitive data taken out:
SENDGRID_PASSWORD: alphanumeric
SENDGRID_USERNAME: -------#heroku.com
AWS_ACCESS_KEY_ID: alphanumeric
AWS_SECRET_ACCESS_KEY: alphanumeric
development:
  AWS_BUCKET: vmanamino-bloccit-development
production:
  AWS_BUCKET: vmanamino-bloccit-production
development:
  secret_key_base: alphanumeric
test:
  secret_key_base: alphanumeric
Here's my fog.rb file which reads the values from the environment
CarrierWave.configure do |config|
  config.fog_credentials = {
    provider: 'AWS',
    aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key: ENV['AWS_ACCESS_KEY_ID'],
  }
  config.fog_directory = ENV['AWS_BUCKET']
  config.fog_public = true
end
You're using the AWS_ACCESS_KEY_ID environment variable for both the access key and the secret access key whereas the latter should of course be using ENV['AWS_SECRET_ACCESS_KEY']
I didn't realize I needed to enclose the KEY ID and the SECRET KEY in quotes. Once I did that, I got it to work: I can upload images from my computer to S3. Also, I didn't understand the assignment perfectly; I thought my app would load images from S3. Now the error raised earlier makes sense: I upload from my computer, sending the image to S3.
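For completeness, a corrected fog.rb based on the snippet above would read the secret from its own variable, as the answer points out (a sketch, assuming the same ENV variable names):

```ruby
# config/initializers/fog.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    provider: 'AWS',
    aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'], # not AWS_ACCESS_KEY_ID
  }
  config.fog_directory = ENV['AWS_BUCKET']
  config.fog_public = true
end
```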

Where should server environment variables be stored in Ruby on Rails?

Where should I store keys specific to my development, test, and production servers in my Ruby project? For example, where should I store my development-specific Amazon S3 secret and key? My config/development.rb file? One issue I see with that: if the file were part of a public GitHub project, it would be visible to everyone.
Thanks!
You store separate environment configuration in config/environments/development.rb, config/environments/test.rb and config/environments/production.rb respectively.
However, if your files will be stored in a public git repo, you do not want to hardcode any sensitive information into them. The best method is to use either YAML files that are excluded via your .gitignore, or environment variables in your shell. I prefer the latter, like this:
PAPERCLIP_OPTIONS = { storage: :s3,
                      bucket: ENV['S3_BUCKET'],
                      s3_credentials: { access_key_id: ENV['S3_KEY'],
                                        secret_access_key: ENV['S3_SECRET'] } }
You then just set the environment variables on the system that is running the app.
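With the environment-variable approach, a missing variable otherwise only surfaces as a confusing signature error deep inside an S3 call. A small sketch (variable names matching the snippet above) uses ENV.fetch to fail fast instead:

```ruby
# Fail fast when a required credential is missing: ENV.fetch raises
# KeyError if the variable is unset, instead of silently returning nil.
def s3_credentials
  {
    access_key_id: ENV.fetch('S3_KEY'),
    secret_access_key: ENV.fetch('S3_SECRET'),
    bucket: ENV.fetch('S3_BUCKET', 'fallback-bucket') # optional default
  }
end
```

This way a misconfigured server aborts at boot with a clear KeyError naming the missing variable.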
If you use the yaml config file method, you must add the sensitive config files to your .gitignore file. Otherwise they will still be uploaded to your public repo.
If you look at the config directory, you will see a YAML file containing database credential information. You could do the same for your cloud environments.
development:
  server: xiy-234
  username: hello
  password: 1325abc
production:
  ...
You can put that information inside a .yml file in the config directory. For instance:
production:
  access_key_id: xxx
  secret_access_key: xx
  bucket: xxx
development:
  access_key_id: xxx
  secret_access_key: xxx
  bucket: xxx
staging:
  access_key_id: xxx
  secret_access_key: xxx
  bucket: xxx
Once this is done, you have to store those keys in a Hash by doing the following:
APIS_CONFIG = { 'amazons3' => YAML.load_file("#{Rails.root}/config/amazons3.yml")[Rails.env] }
(You can put the previous line of code inside a .rb file in the config/initializers directory.)
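The initializer pattern above can be sketched in a self-contained way (the file name amazons3.yml and the keys are illustrative):

```ruby
require 'yaml'

# Load per-environment S3 settings from a YAML file that is listed in
# .gitignore, selecting the section for the current environment.
def load_s3_config(path, env)
  config = YAML.load_file(path)
  config.fetch(env) # raises KeyError if the environment section is missing
end
```

In a Rails app this would typically be called with Rails.root.join('config/amazons3.yml') and Rails.env from an initializer.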
Note that you might find this Railscast interesting.
