Sitemap generator not uploading sitemap - ruby-on-rails

I started playing with a test app trying to upload sitemaps using Amazon S3. I've been following https://github.com/kjvarga/sitemap_generator trying to figure out the gem and have only been half successful. A sitemap will generate in the public folder, but not upload to the S3 bucket.
I've added the config/sitemap.rb from the tutorial above:
require 'rubygems'
require 'sitemap_generator'
require 'aws-sdk'

SitemapGenerator::Sitemap.create_index = true
SitemapGenerator::Sitemap.default_host = 'https://www.myapp.herokuapp.com'

SitemapGenerator::Sitemap.create do
  add '/home', :changefreq => 'daily', :priority => 0.9
end

SitemapGenerator::Sitemap.adapter = SitemapGenerator::S3Adapter.new(
  fog_provider: 'AWS',
  aws_access_key_id: 'KEY',
  aws_secret_access_key: 'SECRET',
  fog_directory: 'DIR',
  fog_region: 'REGION'
)
I type in
rails sitemap:refresh
in my terminal and it generates the maps. It just doesn't upload them. No errors, no clues as to what didn't happen, nothing. It even tells me that Google and Bing were successfully pinged.
Of course I can visit my AWS bucket and manually upload these files, but that feels... wrong. I've used Shrine for images in the past and am used to uploading to a cache. There must be something I missed.

Check your secrets: maybe you don't have the AWS account environment variables set, in which case the S3 adapter will never work. Also check the policy on your bucket.
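For example, something like this reads the credentials from the environment instead of hardcoding them (a minimal sketch, assuming the standard AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY names plus hypothetical S3_BUCKET and AWS_REGION variables, and that the fog-aws gem backing S3Adapter is installed). Note that config/sitemap.rb runs top to bottom, so the adapter has to be assigned before create is called:

# config/sitemap.rb
require 'rubygems'
require 'sitemap_generator'

SitemapGenerator::Sitemap.default_host = 'https://www.myapp.herokuapp.com'

# Assign the adapter BEFORE create; assigning it afterwards has no effect
# because create has already written the files with the default file adapter.
SitemapGenerator::Sitemap.adapter = SitemapGenerator::S3Adapter.new(
  fog_provider: 'AWS',
  aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],
  aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
  fog_directory: ENV['S3_BUCKET'],
  fog_region: ENV['AWS_REGION']
)

SitemapGenerator::Sitemap.create do
  add '/home', :changefreq => 'daily', :priority => 0.9
end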

Related

Fog-Google, Unable to Push to Heroku

So I am trying to set up file uploading in my production environment. I am currently using CarrierWave along with Fog-Google. I have no issues storing files locally, as I do not use Fog in development. However, I am currently trying to test the file-uploading functionality in production, but I cannot even push my app up to Heroku.
Here's a snippet of the error I'm receiving when attempting to push to Heroku:
[fog][WARNING] Unrecognized arguments: google_storage_secret_access_key_id, google_storage_secret_access_key
rake aborted!
ArgumentError: Invalid keyfile or passphrase
Now, I am relatively new to setting up ENV secret IDs and all of that, so I'll just describe what I know and what I have done, to be sure that I did everything correctly.
As I am currently using the Cloud9 IDE, in my .bashrc file I have:
export GOOGLE_STORAGE_ACCESS_KEY_ID=XXXXX
export GOOGLE_STORAGE_SECRET_ACCESS_KEY=XXXXX
export GOOGLE_STORAGE_BUCKET_NAME=XXXXX
In my /config/initializers/carrierwave.rb
require 'carrierwave'

CarrierWave.configure do |config|
  config.fog_provider = 'fog/google' # required
  config.fog_credentials = {
    provider: 'Google',
    google_storage_access_key_id: ENV['GOOGLE_STORAGE_ACCESS_KEY_ID'],
    google_storage_secret_access_key: ENV['GOOGLE_STORAGE_SECRET_ACCESS_KEY']
  }
  config.fog_directory = ENV['GOOGLE_STORAGE_BUCKET_NAME']
end
and in my /config/initializers/fog.rb
GoogleStorage = Fog::Storage.new(
  provider: 'Google',
  google_project: 'XXXX',
  google_client_email: 'XXXXXX',
  google_key_location: Rails.root.join('private', 'google-cloud-service-key.p12'),
  google_storage_secret_access_key_id: ENV["GOOGLE_STORAGE_SECRET_ACCESS_KEY_ID"],
  google_storage_secret_access_key: ENV["GOOGLE_STORAGE_SECRET_ACCESS_KEY"]
)
As mentioned, I am actually quite new to all of this, so I've tried my best to follow the documentation on both Fog's and CarrierWave's GitHub pages.
As far as I know, I should use .bashrc to store my secret keys and then read them with ENV['SECRET_KEY_NAME']. I've set up both the carrierwave.rb and fog.rb files in the initializers folder, so I'm not quite sure what I'm missing.
Additionally, I have also tried running heroku config:set GOOGLE_STORAGE_SECRET_ACCESS_KEY_ID=XXXXXX, but that didn't seem to work either.
I'm not quite sure what to do now or what may be causing the error when attempting to push to Heroku, much less whether any of this even works in production.
EDIT:
I think the error comes largely from the fog.rb file, so I amended it to the following:
GoogleStorage = Fog::Storage::Google.new(
  google_project: 'XXX',
  google_client_email: 'XXX',
  google_json_key_location: '~/.fog/XXXX.json'
)
Now when I try pushing to Heroku, the error I get instead is:
Errno::ENOENT: No such file or directory # rb_sysopen - /app/.fog/XXX.json
Just to share: I created a .fog folder under the ~ directory, and inside it I added the private JSON key.
All help and advice would be greatly appreciated. Thank you!
So I've solved the problem and managed to successfully push my code to Heroku.
[Kindly note that this does not mean it functions perfectly in production; it just means I am now able to push to Heroku without any errors.]
There were two main errors:
1) Not including my private key JSON file in my app.
2) An error in the config/initializers/fog.rb file.
To solve the first issue, I created a .fog folder in my app and added the private key JSON file from Google Cloud Platform.
Next, I amended the code in config/initializers/fog.rb to:
GoogleStorage = Fog::Storage::Google.new(
  :google_project => 'XXXX',
  :google_client_email => 'XXXXX',
  :google_json_key_location => Rails.root.join('.fog/XXXX'),
  :google_storage_access_key_id => ENV['GOOGLE_STORAGE_ACCESS_KEY_ID'],
  :google_storage_secret_access_key => ENV['GOOGLE_STORAGE_SECRET_ACCESS_KEY']
)
I was then able to successfully push my code to Heroku.
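As a follow-up, a quick way to verify the connection actually works (a hypothetical smoke test, assuming the initializer above has loaded) is to list the buckets the service account can see from the Rails console:

# Prints the name of each Cloud Storage bucket visible to the credentials
GoogleStorage.directories.each { |dir| puts dir.key }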

AWS::S3::Errors::AccessDenied: Access Denied when trying to do copy_to

I have written a rake task that does a copy_to from one directory in a bucket to another directory within the same bucket. When I test it locally it works fine, but when I deploy it to an environment it returns AWS::S3::Errors::AccessDenied: Access Denied. I assume it has something to do with the AWS credentials on the environment I am deploying to. I am also confident that the problem is with the copy_to, as I accessed the bucket from the Rails console with no issues.
My copy statement is as follows:
creds = YAML::load_file(Rails.root.join("config", "s3.yml"))
AWS.config(aws_access_key_id: creds[:access_key_id],
           aws_secret_access_key: creds[:secret_access_key])
s3.buckets['test-bucket'].objects['path to file'].copy_to('new_path')
The parameters to AWS.config are access_key_id and secret_access_key, without the aws_ prefix.
http://docs.aws.amazon.com/AWSRubySDK/latest/AWS.html#config-class_method
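With the prefix dropped, the configuration from the question becomes:

creds = YAML::load_file(Rails.root.join("config", "s3.yml"))
AWS.config(access_key_id: creds[:access_key_id],
           secret_access_key: creds[:secret_access_key])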
Found this because I also received Access Denied when calling copy_to(). While older SDK versions were happy to accept a plain key path as the parameter to copy_to, newer versions require you to specify the bucket, too.
In my case
s3_bucket.object(old_key).copy_to(new_key)
did not work and produced a rather unhelpful "Access Denied" error with the v3 SDK. Instead, this works:
s3_bucket.object(old_key).copy_to(s3_bucket.object(new_key))
or
s3_bucket.object(old_key).copy_to(bucket_name + '/' + new_key)
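Putting it together, a minimal self-contained sketch with the v3 SDK (the region and bucket name are placeholders; assumes old_key and new_key are defined and credentials come from the default provider chain):

require 'aws-sdk-s3'

# Build a bucket resource, then copy an object within the same bucket
s3_bucket = Aws::S3::Resource.new(region: 'us-east-1').bucket('my-bucket')
s3_bucket.object(old_key).copy_to(s3_bucket.object(new_key))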
s3.buckets['bucket-name'].objects['source-key'].copy_to('target-key', :bucket_name => 'target-bucket')
A simplified example using the aws-sdk gem:
AWS.config(:access_key_id => '...', :secret_access_key => '...')
s3 = AWS::S3.new
s3.buckets['bucket-name'].objects['source-key'].copy_to('target-key')

Routing error for images - RoR

I run my app locally on port 3000 and all is fine, but when I push to Heroku it is substandard.
I am using the Paperclip gem, and my Heroku logs show this error:
ActionController::RoutingError (No route matches [GET] "/images/medium/missing.png"):
There seems to be no path to this instance in my pipeline. Since it runs locally but not on Heroku, I figure it's a Heroku config problem?
For Heroku you additionally have to use the aws-sdk gem, as briefly described in the excellent Paperclip intro.
Sample configuration:
# config/environments/production.rb
config.paperclip_defaults = {
  :storage => :s3,
  :s3_credentials => {
    :bucket => ENV['S3_BUCKET_NAME'],
    :access_key_id => ENV['AWS_ACCESS_KEY_ID'],
    :secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
  }
}
The best source of information is usually Heroku itself; see this example configuration guide.
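The environment variables referenced in the snippet above can be set with the Heroku CLI, for example (placeholder values):
heroku config:set S3_BUCKET_NAME=your-bucket AWS_ACCESS_KEY_ID=xxx AWS_SECRET_ACCESS_KEY=yyy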
Using the Paperclip gem means you are uploading images and want to access an image via that GET request, right?
That path looks like it points to the Heroku server itself. Heroku's filesystem is ephemeral, so it doesn't support persistent file uploads; set up Amazon S3 or some other storage facility.
It works locally because your local setup allows for file uploads and storage.

Accessing data stored in Amazon S3 through Rails

My goal is to make graphs from data within Excel files that users have uploaded to Amazon S3.
I've already implemented the functionality for users to upload the Excel files with CarrierWave; now I need to be able to access the data and make it presentable for use with a charting library (Highcharts).
The task I am stuck on is directly accessing the data in S3 through Rails. Once the data is pulled, it should be fairly straightforward to manipulate it with Highcharts.
Any suggestions would be much appreciated!
You can use the AWS SDK:
require 'aws-sdk'
# retrieve the access key and secret key
access_key_id = ENV["ACCESS_KEY_ID"]
secret_access_key = ENV["SECRET_ACCESS_KEY"]
# create an instance of the s3 client
s3 = AWS::S3.new(access_key_id: access_key_id, secret_access_key: secret_access_key)
# get the bucket
bucket = s3.buckets['your-bucket-name']
# retrieve the objects
bucket.objects.each do |object|
  puts object.key
  puts object.read
end
s3 = Aws::S3::Client.new
bucket = Aws::S3::Bucket.new('AWS_BUCKET NAME HERE')
bucket.objects.each do |obj|
  # Stream each object from S3 into a local file under the Rails root
  File.open("#{Rails.root}/#{obj.key}", 'wb') do |file|
    s3.get_object(bucket: 'AWS_BUCKET NAME HERE', key: obj.key, response_target: file)
  end
end
OR
s3 = Aws::S3::Client.new
s3.list_objects(bucket: 'AWS_BUCKET NAME HERE').each do |response|
  response.contents.each do |obj|
    File.open("#{Rails.root}/#{obj.key}", 'wb') do |file|
      s3.get_object(bucket: 'AWS_BUCKET NAME HERE', key: obj.key, response_target: file)
    end
  end
end
There is an official AWS SDK Ruby gem;
see the official AWS SDK for Ruby documentation for version 2.
For environment variable configuration you can use Figaro or dotenv (for the development environment), or set the variables in your ~/.bashrc file.
Note:
You need to create an S3 bucket and get your AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
Before development you can test and access your S3 bucket data using the Google Chrome S3 Browser extension.
Run source ~/.bashrc (or . ~/.bashrc) to reflect the changes if you store your ENV variables there.
Code Reference

Ruby On Rails uploading songs to Rackspace Cloud Files container

OK, so here's the deal: I'm creating a web app and adding the ability to upload music to user profiles. I'm using Rackspace Cloud Files for storage, and I'm having a bit of trouble completing two tasks. First, I'm having trouble writing the code to upload the file to my container. Second, I need to generate the URL of the file to store in the database. I'm very new to integrating APIs, so I don't have a lot of knowledge.
object = container.create_object 'filename', false
object.write file
Is this code correct for uploading the files?
Using fog directly
First of all, if you're not already using it, the officially supported library for interacting with Cloud Files from Ruby is fog. To start, add it to your Gemfile:
gem 'fog'
Then run bundle install to install it.
To upload a file and get its public url, using fog directly:
# Use your Rackspace API key, not your password; you can find your API key by logging
# in to the control panel, clicking on your name in the top-right, and choosing
# "account settings".
service = Fog::Storage.new(
  provider: 'rackspace',
  rackspace_username: ENV['RACKSPACE_USERNAME'],
  rackspace_api_key: ENV['RACKSPACE_API_KEY']
)

dir = service.directories.create key: 'directory-name', public: true

files_to_upload.each do |path|
  file = dir.files.create key: File.basename(path), body: File.open(path, 'r')
  puts "public URL for #{file} is #{file.public_url}"
end
Using CarrierWave
However! What you're doing is a pretty common use case in Rails, so there's a gem for it: CarrierWave. Add the following lines to your Gemfile:
gem 'fog'
gem 'carrierwave'
And run bundle install to install them. Now configure CarrierWave to use Cloud Files:
# config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'rackspace',
    :rackspace_username => 'xxxxxx',
    :rackspace_api_key => 'yyyyyy'
  }
  config.fog_directory = 'name_of_directory'
end
Next generate an uploader:
rails g uploader Song
Now you can use the SongUploader to store and retrieve Song data. See the generated code, or the CarrierWave docs, for more details.
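For example, wiring the generated uploader into a model looks like this (a hypothetical Song model; assumes a songs table with a string file column):

class Song < ActiveRecord::Base
  # Attaches the SongUploader to the `file` column
  mount_uploader :file, SongUploader
end

# After saving, CarrierWave stores the upload in the configured Cloud Files
# directory, and the public URL can be read back for the database or views:
song = Song.create!(file: File.open('path/to/track.mp3'))
song.file.url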
