I have an application and an OBS bucket in Telekom. It is an S3-compatible bucket.
Can I use the paperclip gem to upload attachments to the Telekom OBS?
I tried using aws-sdk directly, which felt like a clumsy way to achieve this, but it seemed worth a try.
I'm getting a key argument error:
ArgumentError (:key must not be blank):
Is there any other gem or workaround that supports an S3-compatible bucket and can be used with the paperclip gem to upload a file?
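For context, this is roughly how pointing Paperclip at an S3-compatible endpoint looks. It is only a sketch, assuming Paperclip 5+ backed by aws-sdk-s3; the model, attachment name, endpoint host, region, bucket and credential names are placeholders, not Telekom's actual values:

# Sketch only — model, endpoint, region, bucket and credentials are placeholders
class Document < ApplicationRecord
  has_attached_file :file,
    storage: :s3,
    s3_region: "eu-de",
    bucket: ENV["OBS_BUCKET"],
    s3_credentials: {
      access_key_id: ENV["OBS_ACCESS_KEY_ID"],
      secret_access_key: ENV["OBS_SECRET_ACCESS_KEY"]
    },
    s3_host_name: "obs.example-region.example-cloud.com",
    s3_options: {
      endpoint: "https://obs.example-region.example-cloud.com", # passed through to the aws-sdk-s3 client
      force_path_style: true
    },
    path: ":class/:id/:filename" # explicit key pattern for stored objects

  do_not_validate_attachment_file_type :file # kept minimal for the sketch
end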
I'm new to S3 and Shrine. I'm working with a Shrine uploader in a Ruby on Rails app that has been uploading files to Amazon S3 for a couple of years.
The goal is to have S3 generate a checksum when uploading files; according to these docs on adding a "trailing checksum", the ChecksumAlgorithm parameter needs to be used: https://docs.aws.amazon.com/AmazonS3/latest/userguide/checking-object-integrity.html
The Ruby SDK docs list checksum_algorithm as a param:
https://docs.aws.amazon.com/sdk-for-ruby/v3/api/Aws/S3/Object.html#put-instance_method
When I add the param in the Shrine uploader (plugin :upload_options, { checksum_algorithm: 'SHA256' }) and upload the file, I get the error ArgumentError: unexpected value at params[:checksum_algorithm] from aws-sdk-core/param_validator.rb:33:in 'validate!' https://github.com/aws/aws-sdk-ruby/blob/version-3/gems/aws-sdk-core/lib/aws-sdk-core/param_validator.rb#L14.
I've tried different cases, with and without the dash, and anything else I can think of syntax-wise, but no luck.
It turns out I was using an older version of aws-sdk-s3, and updating the gem solved the problem. Thanks, Janko.
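For anyone else who hits this, the working setup looks roughly like the following. This is a sketch, assuming Shrine 3 with the S3 storage and an aws-sdk-s3 recent enough to know the flexible-checksum parameters (added in early 2022); the bucket and region values are placeholders:

# Gemfile
gem "shrine", "~> 3.0"
gem "aws-sdk-s3", "~> 1.112" # older versions reject :checksum_algorithm in param validation

# config/initializers/shrine.rb — bucket/region are placeholders
require "shrine"
require "shrine/storage/s3"

Shrine.storages = {
  cache: Shrine::Storage::S3.new(prefix: "cache", bucket: ENV["S3_BUCKET"], region: ENV["AWS_REGION"]),
  store: Shrine::Storage::S3.new(bucket: ENV["S3_BUCKET"], region: ENV["AWS_REGION"])
}

# In the uploader, scope the option to the storage it applies to:
class AttachmentUploader < Shrine
  plugin :upload_options, store: { checksum_algorithm: "SHA256" } # forwarded to the S3 upload call
end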
I want to allow my user to download a bundle of files that are stored on s3 using the zipline gem. The files are already hosted on an s3 server but they aren't there as part of a paperclip or carrierwave attachment in my app. Will I need to create some records in my database to sort of trick zipline into thinking they are paperclip attachments, or is there a way I can send the zip file without bothering with an attachment gem? At the moment, trying to download the files with zipline doesn't throw an error message at all. It just seems to skip right over and nothing downloads.
See the part of the zipline README where an enumerator is used to include remote files in the ZIP. It works with absolute URLs; to generate those from your S3 objects you will need presigned URLs (which zipline then passes on to Curb):
Aws::S3::Bucket.new(your_bucket_name).object(your_key).presigned_url(:get)
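Putting that together in a controller might look something like this. It is only a sketch, assuming zipline's controller helper (include Zipline plus the zipline(files, name) call); the controller, bucket name and keys are placeholders for whatever you already track in your own tables, so no attachment-gem records are needed:

# Sketch only — bucket name and keys are placeholders
class BundlesController < ApplicationController
  include Zipline

  def download
    bucket = Aws::S3::Bucket.new("your-bucket-name")
    keys   = ["reports/a.pdf", "reports/b.pdf"] # keys you already store somewhere

    files = keys.lazy.map do |key|
      url = bucket.object(key).presigned_url(:get)
      [url, File.basename(key)] # [remote URL, name inside the zip]
    end

    zipline(files, "bundle.zip")
  end
end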
I have used CarrierWave for my users to upload files in my Rails app. When a user uploads multiple files totalling more than roughly 400 MB, they get a timeout error.
Note: I've hosted my Rails app on Heroku.
Uploading large files through Heroku is generally not recommended. Heroku limits each request to 30 seconds, which is not enough time to push 400 MB through your dyno.
If you are open to using S3, Heroku documents a direct-to-S3 upload approach for Rails, where the browser uploads straight to S3 and your app only receives the resulting object key or URL.
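A minimal sketch of that direct-to-S3 idea, assuming the aws-sdk-s3 gem; the controller, bucket and key layout are placeholders. The server only hands out a short-lived upload URL and the browser PUTs the file to S3 itself, so the request to Heroku stays fast:

# Sketch only — bucket and key layout are placeholders
class UploadsController < ApplicationController
  def presign
    object = Aws::S3::Resource.new
                              .bucket(ENV["S3_BUCKET"])
                              .object("uploads/#{SecureRandom.uuid}/#{params[:filename]}")

    render json: {
      upload_url: object.presigned_url(:put, expires_in: 15.minutes.to_i), # browser PUTs the file here
      key: object.key                                                      # store this in your DB once the upload finishes
    }
  end
end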
First solution: use the refile gem. Its selling point is "Effortless direct uploads, even to S3", and it comes from Jonas Nicklas, the author of the CarrierWave gem (see the refile repository and his blog post explaining why he built it).
Second solution: move the file upload into a background job, as sketched below.
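A rough sketch of that second option, assuming ActiveJob and CarrierWave; the Attachment model, :file column and job name are hypothetical, and the cached file has to live somewhere both the web and worker processes can reach. Note that this only moves the slow store-to-S3 step out of the request; the raw transfer to the dyno still has to fit inside Heroku's 30-second limit, which is why the direct-to-S3 approach above is usually preferred:

# Sketch only — Attachment, :file and the cached path handling are hypothetical
class StoreUploadJob < ApplicationJob
  queue_as :default

  def perform(attachment_id, cached_path)
    attachment = Attachment.find(attachment_id)
    File.open(cached_path) do |file|
      attachment.file = file # assign to the mounted CarrierWave uploader
      attachment.save!       # CarrierWave pushes the file to S3 here, outside the web request
    end
  ensure
    File.delete(cached_path) if File.exist?(cached_path)
  end
end

# In the controller: park the upload somewhere shared, then enqueue
# StoreUploadJob.perform_later(attachment.id, cached_path)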
I recently implemented file uploading using CarrierWave with Amazon S3 storage.
I want to generate, or make available, the S3 URL only for one hour. After that the link should expire.
How can I do this using carrier-wave?
The way to handle this is to use a presigned URL for the S3 file. Once you upload the file using CarrierWave, you take the actual S3 key and use AWS::S3 to presign it with an expiration time. For example, if the key (file name) in your S3 bucket is "my_file", you could do this:
# Your Model
def presigned_url
  s3     = AWS::S3.new                 # aws-sdk v1 client
  bucket = s3.buckets["MyBucket"]
  object = bucket.objects["my_file"]
  object.url_for(:read, secure: true, expires: 1.hour) # signed GET URL, valid for one hour
end
The URL returned will be valid for 1 hour and then will never work again.
To use this, you will need the aws-sdk gem in your Gemfile (the code above is its v1 API; the older aws-s3 gem has a different interface):
# Gemfile
gem "aws-sdk", "~> 1.0" # v1 API (AWS::S3)
I am using paperclip to upload photos to an AWS S3 bucket. The photos are being accessed through a separate feed using Mechanize. I recently updated my Gemfile, and paperclip now requires the aws-sdk gem instead of the aws-s3 gem. I am debugging my code to accommodate the aws-sdk gem and I am stuck at uploading a file. My working code using the aws-s3 gem is below:
image = agent.get(url).body
filename = "#{listing_id}:#{i}.jpg"
AWS::S3::S3Object.store(filename, image, S3_BUCKET_NAME)
"https://s3.amazonaws.com/#{S3_BUCKET_NAME}/#{filename}"
I get this error: undefined method `store' for AWS::S3::S3Object:Class
and from what I can gather, the new code is going to look something like this:
key = File.basename(filename)
s3.buckets[S3_BUCKET_NAME].objects[key].write(:file => filename)
Using aws-s3, the image was referenced in the store method. How do I reference the image using aws-sdk?
For now I am going back to an old paperclip version and reinstalling aws-s3, but I would like to keep my code up to date if possible. Any thoughts would be much appreciated.
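For what it's worth, with the aws-sdk (v1) API the in-memory image body can be passed straight to write instead of a file path. A sketch, keeping the same variable names as above:

s3  = AWS::S3.new
obj = s3.buckets[S3_BUCKET_NAME].objects[filename]
obj.write(image, content_type: "image/jpeg") # `image` is the response body string from Mechanize
obj.public_url.to_s                          # same "https://s3.amazonaws.com/..." style URL as before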