I need to copy images from static image URLs which are stored in database tables,
like https://www.gravatar.com/avatar/b8c19609aaa9eb291f2a5974e369e2a4?s=328&d=identicon&r=PG&f=1,
to S3 using Ruby on Rails.
Try the following code:

AWS::S3::S3Object.store(path, content, bucket)

Here, path is the key under which the object will be stored in the bucket, content is the data you want to store in that file, and bucket is the name of the bucket.

Before this you have to establish a connection, so your final code might look like this:

require 'open-uri'

AWS::S3::Base.establish_connection!(
  :access_key_id     => <key>,
  :secret_access_key => <access_key>,
  :use_ssl           => true
)

AWS::S3::S3Object.store(
  path,
  open('https://www.gravatar.com/avatar/b8c19609aaa9eb291f2a5974e369e2a4?s=328&d=identicon&r=PG&f=1').read,
  bucket
)
I'm getting the following error when trying to upload a file to an S3 bucket:
AWS::S3::Errors::InvalidAccessKeyId: The AWS Access Key Id you provided does not exist in our records.
The file exists, the bucket exists, the bucket allows uploads, the credentials are correct, and using Cyberduck with the same credentials I can connect and upload files to that bucket just fine. Most answers around here point to the credentials being overridden by environment variables; that is not the case here. I've tried passing them directly as strings and outputting them just to make sure it's the right credentials.
v1
AWS.config(
  :access_key_id => 'key',
  :secret_access_key => 'secret'
)

s3 = AWS::S3.new
bucket = s3.buckets['bucket-name']
obj = bucket.objects['filename']
obj.write(file: 'path-to-file', acl: :private)
this is using the v1 version of the gem (aws-sdk-v1) but I've tried also using v3 and I get the same error.
v3
Aws.config.update({
  region: 'eu-west-1',
  credentials: Aws::Credentials.new('key_id', 'secret')
})

s3 = Aws::S3::Resource.new(region: 'eu-west-1')
bucket = s3.bucket('bucket-name')
obj = bucket.object('filename')
ok = obj.upload_file('path-to-file')
Note: the error is thrown on the obj.write line.
Note 2: This is a rake task from a Ruby on Rails 4 app.
Finally figured it out: the problem was that, because we are using a custom endpoint, the credentials were not found. I guess that works differently with custom endpoints.
To specify the custom endpoint you'll need to use a config option that for some reason is not documented (or at least I didn't find it anywhere); I actually had to go through Paperclip's code to see how they were handling this.
Anyway, here's what the v1 config looks like with the added option for the endpoint:
AWS.config(
  :access_key_id => 'key',
  :secret_access_key => 'secret',
  :s3_endpoint => 'custom.endpoint.com'
)
Hopefully that will save somebody some time.
I am using the Ruby on Rails AWS SDK to copy files from one bucket to another and store the URL in CarrierWave. The app uses CarrierWave, and I need to store the newly copied URL in the database field that CarrierWave uses. The problem is that the URL is private, so I tried to just generate the URL and store it in remote_file_url, but it can't access the file. The files are very large, so I can't use CarrierWave to re-upload the file through the SDK; I tried and had no success.
obj.public_url did not work for me. It worked when I copied the file, but not when I moved it from one bucket to another.
Here is what I have:
picture_name = "my_picture.jpg"
@picture = Picture.find(id)
upload_dir = "uploads/picture/file/#{@picture.id}/my_picture.jpg"

s3 = Fog::Storage.new(provider: 'AWS',
                      :aws_access_key_id => access_key,
                      :aws_secret_access_key => secret_access_key,
                      :region => region)

obj = s3.copy_object('my-temp-bucket', picture_name,
                     'my-target-bucket', upload_dir,
                     acl: 'public-read')

@picture.remote_image_url = obj.url_for(:read, :expires => 10*60)
@picture.save

I also tried the following, with no luck:

@picture.remote_image_url = obj.public_url
@picture.save
I get an error undefined method `bucket' for #
Thank you for your help!!!
I want to rename an item in S3 using the Ruby SDK. How do I do this?
I have tried:
require 'aws-sdk'
s3 = AWS.config(
  :region => 'region',
  :access_key_id => 'key',
  :secret_access_key => 'key'
)

b = AWS::S3::Bucket.new(client: s3, name: 'taxalli')
b.objects.each do |obj|
  obj.rename_to('imports/files/' + line.split(' ').last.split('/').last)
end
But I don't see anything in the new SDK for moves or renames.
In AWS SDK version 2 there is now a method called move_to (http://docs.aws.amazon.com/sdkforruby/api/Aws/S3/Object.html#move_to-instance_method) which you can use in this case. Technically it will still copy & delete the file on S3, but you don't need to do the copy & delete manually, and most importantly it will not delete the file if the copy action fails for some reason.
There is no such thing as renaming in the Amazon S3 SDK. Basically what you have to do is copy the object and then delete the old one.
require 'aws-sdk'

creds = Aws::SharedCredentials.new(profile_name: 'my_profile')
s3 = Aws::S3::Client.new(region: 'us-east-1',
                         credentials: creds)

s3.copy_object(bucket: "my_bucket",
               copy_source: "my_bucket/MyBookName.pdf", # URL-encode keys that contain special characters
               key: "my_new_book_name.pdf")

s3.delete_object(bucket: "my_bucket",
                 key: "MyBookName.pdf")
Have you seen "Rails Paperclip S3 rename thousands of files" or https://gist.github.com/ericskiff/769191 ?
In my Rails app I save customer RMA shipping labels to an S3 bucket on creation. I just updated to V2 of the aws-sdk gem, and now my code for setting the ACL doesn't work.
Code that worked in V1.X:
# Saves label to S3 bucket
s3 = AWS::S3.new
obj = s3.buckets[ENV['S3_BUCKET_NAME']].objects["#{shippinglabel_filename}"]
obj.write(open(label.label('pdf').postage_label.label_pdf_url, 'rb'), :acl => :public_read)
.write seems to have been deprecated, so I'm using .put now. Everything is working, except when I try to set the ACL.
New code for V2.0:
# Saves label to S3 bucket
s3 = Aws::S3::Resource.new
obj = s3.bucket(ENV['S3_BUCKET_NAME']).object("#{shippinglabel_filename}")
obj.put(Base64.decode64(label_base64), { :acl => :public_read })
I get an Aws::S3::Errors::InvalidArgument error, pointed at the ACL.
This code works for me:

photo_obj = bucket.object(object_name)
photo_obj.upload_file(path, acl: 'public-read')

So you need to use the string 'public-read' for the ACL, not the symbol :public_read. I found this by looking at an example in object.rb.
I am able to upload images using Paperclip, and can see them in my bucket on Amazon's S3 management console website, but the url provided by Paperclip (e.g., image.url(:thumb)) cannot be used to access the image. I get a url that looks something like this:
http://s3.amazonaws.com/xxx/xxx/images/000/000/012/thumb/image.jpg?1366900621
When I put that URL in my browser, I'm sent to an XML page that states: "The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint."
where the "endpoint" is a subdomain of the Paperclip path. But when I go to that "endpoint", I just get another error that says "Access Denied". According to the file information provided by the Amazon site, however, the image is publicly viewable. Can someone tell me what I'm doing wrong?
My development.rb file simply contains the following:
config.paperclip_defaults = {
  :storage => :s3,
  :s3_credentials => {
    :bucket => AWS_BUCKET,
    :access_key_id => AWS_ACCESS_KEY_ID,
    :secret_access_key => AWS_SECRET_ACCESS_KEY
  }
}
I got it to work by changing the default for :url
# config/initializers/paperclip.rb
Paperclip::Attachment.default_options[:url] = ':s3_domain_url'
Paperclip::Attachment.default_options[:path] = '/:class/:attachment/:id_partition/:style/:filename'
I'm in the domestic U.S., but it appears that this was still necessary for my code to work (cf. https://devcenter.heroku.com/articles/paperclip-s3)