Sitemap_generator fails to upload - ruby-on-rails

I've followed the instructions on a couple of pages for getting a sitemap to generate and upload to my S3 bucket. The sitemap is generating, but not uploading.
I'm using CarrierWave for the upload, which works fine for image uploads.
The key file seems to be config/sitemap.rb. Here's mine:
require 'rubygems'
require 'sitemap_generator'

# Set the host name for URL creation
SitemapGenerator::Sitemap.default_host = "https://www.driverhunt.com"

# Pick a safe place to write the files
SitemapGenerator::Sitemap.public_path = 'tmp/'

# Store on S3 using Fog / CarrierWave
SitemapGenerator::Sitemap.adapter = SitemapGenerator::WaveAdapter.new
# SitemapGenerator::Sitemap.adapter = SitemapGenerator::S3Adapter.new
# The line above is a different problem to the one in the question, but using
# this second adapter gives the error: "...lib/fog/storage.rb:27:in `new':
# is not a recognized storage provider (ArgumentError)"

# Inform the map cross-linking where to find the other maps
SitemapGenerator::Sitemap.sitemaps_host = "http://#{ENV['S3_BUCKET']}.s3.amazonaws.com/"

# Pick a namespace within your bucket to organize your maps
SitemapGenerator::Sitemap.sitemaps_path = 'sitemaps/'

SitemapGenerator::Sitemap.create do
  add '/home', :changefreq => 'daily', :priority => 0.9
  # add '/contact_us', :changefreq => 'weekly'
end

# SitemapGenerator::Sitemap.ping_search_engines # Not needed if you use the rake tasks
# SitemapGenerator::Sitemap.ping_search_engines # Not needed if you use the rake tasks
What's going on? How do I debug a CarrierWave upload?

I will answer this because your comment about the S3Adapter brought me to this topic while I was googling the "not a recognized provider" error. If you switch back to the commented-out S3Adapter and do the following, you will get it working.
If you do not specify any Fog environment variables for the fog-aws gem, you will get the error:
ArgumentError: is not a recognized provider
when using SitemapGenerator::S3Adapter.new as the adapter.
The setup you have above is perfectly fine; just use S3Adapter.new instead of the WaveAdapter!
The error you are getting (and I was getting as well) comes from the fact that SitemapGenerator::S3Adapter uses fog-aws, and for it to run with its defaults you need the following environment variables:
ENV['AWS_ACCESS_KEY_ID'] = 'XXX'
ENV['AWS_SECRET_ACCESS_KEY'] = 'XXX'
ENV['FOG_PROVIDER'] = 'AWS'
ENV['FOG_DIRECTORY'] = 'your-bucket-name'
ENV['FOG_REGION'] = 'your-bucket-region' # e.g. 'us-west-2'
If you are missing even one of the above, you will get the error:
ArgumentError: is not a recognized provider
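For local development, a minimal way to provide them (a sketch assuming a Unix shell; all values are placeholders) is to export them in the shell that runs the rake task:

export AWS_ACCESS_KEY_ID=your-access-key-id
export AWS_SECRET_ACCESS_KEY=your-secret-access-key
export FOG_PROVIDER=AWS
export FOG_DIRECTORY=your-bucket-name
export FOG_REGION=us-west-2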
Alternatively, if you want to avoid using environment variables for some reason, you can specify the values when you initialize your adapter as follows:
SitemapGenerator::Sitemap.adapter = SitemapGenerator::S3Adapter.new(
  fog_provider: 'AWS',
  aws_access_key_id: 'your-access-key-id',
  aws_secret_access_key: 'your-access-key',
  fog_directory: 'your-bucket',
  fog_region: 'your-aws-region'
)
However, using just the environment variables above you will be fine and get your sitemap up and running. This setup was tested with sitemap_generator version 5.1.0.
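Once the adapter is configured, generate and upload the sitemap with the gem's standard rake tasks:

rake sitemap:refresh
rake sitemap:refresh:no_ping   # same, but without pinging search engines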
For your question:
The image uploading works because it does not require the exact same configuration as the WaveAdapter. I am guessing that your carrierwave.rb file is missing the following:
config.cache_dir = "#{Rails.root}/tmp/"
config.permissions = 0666
The complete configuration for the CarrierWave initializer can be found here:
Generate Sitemaps on read-only filesystems like Heroku (check whether you are missing something, or use the other adapter)
However, I believe that your problem has to do with environment variables missing from the production environment.
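If your production environment happens to be Heroku (an assumption; adapt this to your host), you can set the same variables there with the Heroku CLI:

heroku config:set AWS_ACCESS_KEY_ID=your-access-key-id \
  AWS_SECRET_ACCESS_KEY=your-secret-access-key \
  FOG_PROVIDER=AWS \
  FOG_DIRECTORY=your-bucket-name \
  FOG_REGION=us-west-2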

Related

ENV variables when doing Rails commands (migration, others)

I'm using CarrierWave and Fog, and everything works fine with AWS, but when I try to run a migration and some other Rails commands I get:
lib/fog/core/service.rb:244:in `validate_options': Missing required arguments: aws_access_key_id, aws_secret_access_key (ArgumentError)
This also happens with the Rails console. For some reason Rails does not seem to be able to access my ENV variables, yet it works when running as part of a Rails server...
Any thoughts on why? The AWS keys are defined as below:
CarrierWave.configure do |config|
  config.fog_provider = 'fog/aws'
  config.fog_credentials = {
    provider: 'AWS',
    aws_access_key_id: ENV['AWS_ACCESS_KEY'],
    aws_secret_access_key: ENV['AWS_SECRET'],
    region: 'eu-west-2'
  }
  config.fog_directory = 'images' # bucket name
  config.cache_dir = "#{Rails.root}/tmp/uploads" # To let CarrierWave work on Heroku
end
Not an answer to the above question, but as the OP has asked again for any advice...
Stop using ENV variables in development. Create a secrets.yml file and you'll be able to access these values in your project. Make sure you add this file to your .gitignore, as committing it is obviously not a good idea.
A very brief, succinct run-through of how to use secrets:
https://richonrails.com/articles/the-rails-4-1-secrets-yml-file
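As a rough sketch of that approach (the key names below are placeholders I've made up), config/secrets.yml holds the values per environment, and Rails 4.1+ exposes them through Rails.application.secrets:

# config/secrets.yml (gitignored)
development:
  aws_access_key: your-access-key-id
  aws_secret: your-secret-access-key

Then in the CarrierWave initializer you can read them instead of ENV:

aws_access_key_id: Rails.application.secrets.aws_access_key,
aws_secret_access_key: Rails.application.secrets.aws_secret,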

Fog-google doesn't find credentials

I have a Rails 5 application with CarrierWave. I would like to use the fog-google gem, but I cannot set it up because fog cannot retrieve the credentials.
I created a .fog file in my application root, populated this way:
default:
  google_project: XXXX-website-cdn
  google_client_email: XXXX@XXXX-website-cdn.iam.gserviceaccount.com
  google_json_key_location: google-storage-cdn.json
I then tried to run pry as mentioned in the guidelines, but it doesn't pick up the credentials.
[3] pry(main)> connection = Fog::Compute::Google.new
ArgumentError: Missing required arguments: google_project
from /Users/ab/.rvm/gems/ruby-2.3.1/gems/fog-core-1.43.0/lib/fog/core/service.rb:244:in `validate_options'
In fact:
[4] pry(main)> Fog.credentials
=> {}
Where do I tell fog to get credentials from the .fog file?
I don't know if it might be useful to know that I'm using Figaro gem to manage my secrets.
QUICK SOLUTION
Put the .fog file in the root of the server (or of your computer), not in the root of the app.
This is pretty bad, but it's the first thing I found while quickly looking to solve the problem.
RIGHT SOLUTION
If you use google_json_key_location: google-storage-cdn.json, Rails will look in the / folder of the current server (or of your computer if you are working locally). In order to look inside the application folder, you need to use a Rails helper:
Rails.root.join('google-storage-cdn.json')
# returns /path/to/your/app/google-storage-cdn.json
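Putting it together, if you build the connection from Ruby rather than from the .fog file, the key location can be resolved relative to the app (a sketch reusing the placeholder names from the question):

connection = Fog::Compute::Google.new(
  google_project:           'XXXX-website-cdn',
  google_client_email:      'XXXX@XXXX-website-cdn.iam.gserviceaccount.com',
  google_json_key_location: Rails.root.join('google-storage-cdn.json').to_s
)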
I was looking for a while for a way to avoid putting this .fog file in my home directory, as that makes no sense to me. As of the time of writing this comment, the official GitHub documentation hasn't been updated. However, there is an open issue on the fog-google GitHub repo that demonstrates how to achieve it.
config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_provider = 'fog/google'
  config.fog_credentials = {
    provider: 'Google',
    google_project: Rails.application.secrets.google_cloud_storage_project_name,
    google_json_key_string: Rails.application.secrets.google_cloud_storage_credential_content
    # You can optionally use google_json_key_location if using an actual file;
    # however, I am using Heroku, where you can't store physical files unless you
    # check them into the repo (and you don't want to do that with service account credentials!)
  }
  config.fog_directory = Rails.application.secrets.google_cloud_storage_bucket_name
end
config/secrets.yml
development:
  google_cloud_storage_project_name: your-project-name
  google_cloud_storage_credential_content: '{
    "type": "service_account",
    "project_id": "your-project-name",
    "private_key_id": "REDACTED",
    "private_key": "-----BEGIN PRIVATE KEY-----REDACTED-----END PRIVATE KEY-----\n",
    "client_email": "REDACTED@your-project-name.iam.gserviceaccount.com",
    "client_id": "REDACTED",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://accounts.google.com/o/oauth2/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
    "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/REDACTED%40your-project-name.iam.gserviceaccount.com"
  }'
  google_cloud_storage_bucket_name: your-bucket-name
All credit goes to cireficc, the poster of the solution.

Authenticating with service account (email & key) in fog-google

I keep getting Missing required arguments: google_storage_access_key_id, google_storage_secret_access_key. I understand that I am supposed to put my credentials "in a /.fog" file, but I don't quite understand how that's supposed to work in the context of a Rails app. Can someone elaborate on how to configure this? I have tried passing the settings in an initializer (as suggested here), but they don't seem to get recognized in the validate_options method.
config/initializers/fog.rb
GoogleStorage = Fog::Storage.new(
  provider: 'Google',
  google_project: 'xxxxxxxxxxxxx',
  google_client_email: 'xxxxxxxxxxxxx-yyyyyyyyyyyyy@developer.gserviceaccount.com',
  google_key_location: 'private/google-cloud-service-key.p12'
)
Turns out this is not currently possible with the fog-google gem. See this Github issue. I will update this answer when the gem is updated to handle this authentication strategy.
Use the Figaro gem instead to handle any ENV vars you want to store and use throughout the app securely.
Add Figaro to your Gemfile and bundle install:
gem "figaro"
Figaro installation is easy:
$ bundle exec figaro install
This creates a commented config/application.yml file and adds it to your .gitignore. Add your own configuration to this file and you're done!
Example application.yml File
# config/application.yml
GOOGLE_ID: "ID"
GOOGLE_KEY: "KEY"
Then, in config/initializers/fog.rb:
GoogleStorage = Fog::Storage.new(
  provider: 'Google',
  google_project: 'xxxxxxxxxxxxx',
  google_client_email: 'xxxxxxxxxxxxx-yyyyyyyyyyyyy@developer.gserviceaccount.com',
  google_key_location: Rails.root.join('private', 'google-cloud-service-key.p12'),
  # The error message asks for google_storage_access_key_id (not
  # google_storage_secret_access_key_id), so the key names must match it:
  google_storage_access_key_id: ENV["GOOGLE_ID"],
  google_storage_secret_access_key: ENV["GOOGLE_KEY"]
)
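As a quick sanity check that the credentials are being picked up (a sketch; run it in a Rails console), list the buckets the connection can see:

GoogleStorage.directories.each { |dir| puts dir.key }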

Can't get Carrierwave to work with Amazon S3

I am trying to use Amazon S3 with CarrierWave. This is the first time I've used S3, so I am not sure what I am doing most of the time. I am using CarrierWave with Fog and uploading the files (just images) through ActiveAdmin, but I get a 'broken pipe' error when I try to upload anything.
This is the full trace of the error.
I set up Carrierwave with this configuration in the initializer:
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',
    :aws_access_key_id => 'myid',
    :aws_secret_access_key => 'mysecretkey',
  }
  config.fog_directory = 'bucketname'
  config.s3_region = 'EU'
end
And I changed this in my uploader class:
#storage :file
storage :fog
I am using Rails 3.1
Can anyone give me a clue about what's wrong? I've been searching through open issues of CarrierWave and Fog and can't find anything.
IMPORTANT EDIT: I just tried to upload a very small image and it worked, but for some reason images over 100 KB give me the "broken pipe" error.
The s3_region should be 'eu-west-1'.
In my case the 'broken pipe' was being caused by a RequestTimeTooSkewed error, which is explained here: http://www.bucketexplorer.com/documentation/amazon-s3--difference-between-requesttime-currenttime-too-large.html
So, because the default S3 bucket location is us-east-1 and I'm located in the West, I had to change my bucket's region to Oregon (us-west) and it worked!
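For reference, this is what the initializer from the question would look like with the region set inside fog_credentials (a sketch; 'eu-west-1' assumes an Ireland bucket, so use your bucket's actual region):

CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => 'myid',
    :aws_secret_access_key => 'mysecretkey',
    :region                => 'eu-west-1' # must match the region the bucket was created in
  }
  config.fog_directory = 'bucketname'
end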

Trouble Getting s3 set up in Rails 3 Refinery CMS App

I'm trying to move my Refinery CMS image storage to Amazon S3 and I'm following this guide:
http://refinerycms.com/guides/how-to-use-amazon-s3-for-storage
But I'm blocked here:
"There are a number of ways to set these with your credentials, including unix variables or setting them manually through Ruby using ENV."
How do I define these credentials? Do I put something like :S3_KEY => "my_key" in my environments.rb file? I tried that and it didn't work.
I also tried this:
AWS::S3::Base.establish_connection!(
  :access_key_id     => ENV['S3_KEY'] || 'key_goes_here',
  :secret_access_key => ENV['S3_SECRET'] || 's3_secret_key_here'
)
Can't figure out how to do this. Any ideas are greatly appreciated.
The safest way is to specify them as environment variables, so they aren't included in your source code. If you're the only one with access to the source, then specifying them as you describe should work.
You can specify them in your ~/.bashrc
export S3_KEY=mykey
export S3_SECRET=mysecret
Or if you're just testing locally you can prepend them to your rails command.
$ S3_KEY=mykey S3_SECRET=mysecret rails server
If you don't want to (or can't) use environment variables, another method is to use an initializer to load credentials from a YAML file:
config/initializers/s3_credentials.rb
# Load AWS::S3 configuration values
S3_CREDENTIALS = \
  YAML.load_file(File.join(Rails.root, 'config/s3_credentials.yml'))[Rails.env]

# Set the AWS::S3 configuration
AWS::S3::Base.establish_connection! S3_CREDENTIALS['connection']
config/s3_credentials.yml
development: &defaults
  connection:
    :access_key_id: AAAAAA_your-key-here
    :secret_access_key: 4rpsi235js_your-secret-here
    :use_ssl: true
  bucket: project-development
  acl: public-read

production:
  <<: *defaults
  bucket: project
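With that constant in place, the extra values from the YAML (bucket, acl) are available anywhere in the app. A small usage sketch with the old aws-s3 gem's API (the object key and data here are made up):

AWS::S3::S3Object.store(
  'example.txt',            # object key (hypothetical)
  'hello world',            # object data (hypothetical)
  S3_CREDENTIALS['bucket'], # bucket name from the YAML above
  :access => :public_read   # mirrors the public-read acl in the YAML
)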
