I am writing a Rails API, with the help of aws-sdk-ruby, which retrieves a file from AWS S3 and returns it in the API response. Can I somehow get a file stream from the response of object.get that I can return directly from the Rails API?
s3 = Aws::S3::Resource.new
bucket_name = "my_bucket"
bucket = s3.bucket(bucket_name)
object = bucket.object("a/b/my.pdf")
Rails.logger.info 'Downloading file from AWS'
downloaded_data = object.get({})
send_data(downloaded_data,
          :filename => "my.pdf",
          :type => "mime/type")
But it does not return the file.
One option I know of is to first save the file locally using this line:
object.get(response_target: '/tmp/my.pdf')
Then I can return this file, but is there a way to skip this step and directly return the response of object.get without saving it locally?
I cannot use that solution, as my URLs are not public and I am just creating a REST API.
I got a screen like the following when I tried that solution.
As of now what I am doing is getting a URL from the object like this:
url = object.presigned_url(:get, expires_in: 3600)
and using the following code to send the response:
require 'open-uri' # needed for Kernel#open on a URL
data = open(url)
send_data data.read, filename: file_name, type: "mime/type"
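If you only need to hand the bytes back, the object returned by object.get exposes the data through #body, an IO-like object, so it can go straight into send_data without touching the filesystem. A minimal sketch along those lines, reusing the same bucket and key as above:
object = Aws::S3::Resource.new.bucket("my_bucket").object("a/b/my.pdf")

# Without :response_target, object.get buffers the body in memory;
# #body responds to #read like an IO.
send_data object.get.body.read,
          filename: "my.pdf",
          type: "application/pdf",
          disposition: "attachment"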
Related
I'm integrating with a third-party ID verification provider. Once the ID verification checks have run, a report is generated, which I can access with a GET request.
The response is binary text (screenshot attached), which I want to save to a PDF file.
Function:
def generate_pdf
  resources = "applicants/61f84499b7b92f00014d5c6d/summary/report?report=applicantReport"
  response = RestClient.get(request_env_url(resources), signed_header(resources, nil, 'GET', 'application/pdf'))
  puts response
end
How do I take the binary response, create a new file and add the binary into that file in a friendly and readable format?
If you can access a filesystem, you might just want to write that data to a file:
File.open("thing.pdf", "wb") { |f| f.write response }
First I created a PDF with WickedPDF:
pdf_string = WickedPdf.new.pdf_from_string(
  ActionController::Base.new.render_to_string(template: 'v1/invoices/invoice_template', filename: 'test.pdf')
)
invoice.attachment.attach(
  io: StringIO.new(pdf_string),
  filename: 'test.pdf',
  content_type: 'application/pdf'
)
My app is set up to store the files on S3 in prod and locally in dev. For testing I also used S3 in dev to verify that my PDF is getting generated and saved correctly. So after it has been generated I am able to log into AWS and download my invoice. Everything displays just fine.
Now the problem is downloading my invoice: when I download it, the PDF is just blank.
I have a download method that looks like this:
response.headers['Content-Type'] = @invoice.attachment.content_type
response.headers['Content-Disposition'] = "inline; #{@invoice.attachment.filename}"
response.headers['filename'] = @invoice.filename
@invoice.attachment.download do |chunk|
  response.stream.write(chunk)
end
I also tried
send_data @invoice.attachment.download, filename: @invoice.filename
and my frontend (react) uses axios to download it:
const downloadInvoice = (id) => {
  axios.get(`/v1/invoices/${id}/download`)
    .then((response) => {
      const url = window.URL.createObjectURL(new Blob([response.data]));
      const link = document.createElement('a');
      link.href = url;
      link.setAttribute('download', response.headers.filename);
      document.body.appendChild(link);
      link.click();
    })
    .catch(() => {});
};
I am a little confused as to why my downloaded PDF is blank. If I open it from my storage folder it displays just fine, so there seems to be an issue with how I download it.
What works is if I create a presigned URL for S3 with:
s3 = Aws::S3::Resource.new(client: aws_client)
bucket = s3.bucket('bucket-name')
obj = bucket.object(@invoice.attachment.attachment.blob.key)
url = obj.presigned_url(:get)
I can send that url back to the frontend and open it in a new tab to view the pdf. But this is not what I want...
Thanks for any help!
In case anyone is interested in this or runs into the same issue, I hope this will save you some time!
The problem is with the axios request: by default axios treats the response body as text/JSON, which mangles the binary PDF data before it ever gets wrapped in the Blob.
Instead of:
axios.get(`/v1/invoices/${id}/download`)
use
axios.get(`/v1/invoices/${id}/download`, { responseType: 'arraybuffer' })
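For reference, once the responseType is fixed, the Rails side can stay as a plain send_data action; a minimal sketch using the attachment names from the question:
def download
  # The frontend reads the name from the custom "filename" header (see the axios code above).
  response.set_header('filename', @invoice.attachment.filename.to_s)

  # ActiveStorage's #download returns the blob's raw bytes.
  send_data @invoice.attachment.download,
            filename: @invoice.attachment.filename.to_s,
            type: @invoice.attachment.content_type,
            disposition: 'attachment'
end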
I am using Spray-can to host a REST service to which a user will be able to upload a file. The block of code that listens for incoming requests is given below:
def receive: Receive = {
  case _: Http.Connected => sender ! Http.Register(self)
  case req @ HttpRequest(HttpMethods.POST, Uri.Path("/upload"), headers, entity, _) =>
    logger.info("Received file upload request.")
    // Process the uploaded data using the 'entity' object
I upload the file using this curl command:
curl --data-binary @inputFile.csv 'devserver:8998/upload?tenant=DressShop&facility=CityCenter&customer=Jimmy'
The challenge I am facing is that I'm not able to pick up the filename "inputFile.csv" from the request, though I'm getting the data from the "entity" object. I tried poring through the API but couldn't find any way to get the filename.
My objective is to ensure that I allow uploads of only CSV files.
You need to process the entity as multipart form data, using as[MultipartFormData]. Note that a filename is only transmitted when the client actually sends multipart/form-data; curl's --data-binary @file posts just the raw bytes with no filename at all (see the curl example below). Then you can get the filename from the header fields:
def processFormData(data: MultipartFormData) = {
  data.fields foreach { bodyPart =>
    println(bodyPart.headers.find(h => h.is("content-disposition")).get.value)
  }
}
This might help: the filename can be found in the parameters of that content-disposition header.
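For comparison, a multipart upload from curl would look something like this (the form field name "file" is just an illustrative choice); with -F, curl includes the original filename in the part's Content-Disposition header, which is exactly what the code above reads:
curl -F 'file=@inputFile.csv' 'devserver:8998/upload?tenant=DressShop&facility=CityCenter&customer=Jimmy'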
I am currently using the Ruby aws-sdk version 2 gem with server-side encryption using customer-provided keys (SSE-C). I am able to upload the object from a Rails form to Amazon S3 with no issues.
def s3
  Aws::S3::Object.new(
    bucket_name: ENV['S3_BUCKET'],
    key: 'hello',
  )
end
def upload_object
  customer_key = OpenSSL::Cipher::AES.new(256, :CBC).random_key
  customer_key_md5 = Digest::MD5.new.digest(customer_key)
  object_key = 'hello'
  options = {}
  options[:key] = object_key
  options[:sse_customer_algorithm] = 'AES256'
  options[:sse_customer_key] = customer_key
  options[:sse_customer_key_md5] = customer_key_md5
  options[:body] = 'hello world'
  options[:bucket] = ENV['S3_BUCKET']
  s3.put(options)

  test_params = {
    object_key: object_key,
    customer_key: Base64.encode64(customer_key),
    md5_key: Base64.encode64(customer_key_md5),
  }
  Test.create(test_params)
end
But I'm having some issues with retrieving the object and generating a signed url link for the user to download it.
def retrieve_object(customer_key, md5)
  options = {}
  options[:key] = 'hello'
  options[:sse_customer_algorithm] = 'AES256'
  options[:sse_customer_key] = Base64.decode64(customer_key)
  options[:sse_customer_key_md5] = Base64.decode64(md5)
  options[:bucket] = ENV['S3_BUCKET']
  s3.get(options)
  url = s3.presigned_url(:get)
end
The link is generated, but when I click on it, it directs me to an Amazon error page saying:
<Error>
  <Code>InvalidRequest</Code>
  <Message>
    The object was stored using a form of Server Side Encryption. The correct parameters must be provided to retrieve the object.
  </Message>
  <RequestId>93684EEBA062B1C2</RequestId>
  <HostId>
    OCnn5EG7ydfoKzsmEDMbqK5kOhLFpNXxVRdekfhOfnBc6s+jtPYFsKi8IZsEPcd9ConbYUHgwC8=
  </HostId>
</Error>
The error message is not helping as I'm unsure what parameters I need to add. I think I might be missing some permissions parameters.
Get Method
http://docs.aws.amazon.com/sdkforruby/api/Aws/S3/Object.html#get-instance_method
Presigned_Url Method
http://docs.aws.amazon.com/sdkforruby/api/Aws/S3/Object.html#presigned_url-instance_method
When you generate a pre-signed GET object URL, you need to provide all of the same params that you would pass to Aws::S3::Object#get.
s3.get(sse_customer_algorithm: 'AES256', sse_customer_key: customer_key).body.read
This means you need to pass the same sse_customer_* options to #presigned_url:
url = obj.presigned_url(:get,
  sse_customer_algorithm: 'AES256',
  sse_customer_key: customer_key)
This will ensure that the SDK correctly signs the headers that Amazon S3 expects when you make the final GET request. The next problem is that you are now responsible for sending those values along with the GET request as headers. Amazon S3 will not accept the algorithm and key in the query string.
uri = URI.parse(url)
http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = true
request = Net::HTTP::Get.new(uri.request_uri, {
  "x-amz-server-side-encryption-customer-algorithm" => 'AES256',
  "x-amz-server-side-encryption-customer-key" => Base64.encode64(cpk),
  "x-amz-server-side-encryption-customer-key-MD5" => Base64.encode64(OpenSSL::Digest::MD5.digest(cpk))
})
response = http.request(request)
Please note - while testing this, I found a bug in the presigned URL implementation of the current v2.0.33 version of the aws-sdk gem. This has been fixed now and should be part of v2.0.34 once it releases.
See the following gist for a full example that patches the bug and demonstrates how to:
Upload an object using a customer-provided key
Get the object using the SDK
Generate a presigned GET URL
Download the object using just Net::HTTP and the presigned URL
You can view the sample script here:
https://gist.github.com/trevorrowe/49bfb9d59f83ad450a9e
Just replace the bucket_name and object_key variables at the top of the script.
In my app, I have a requirement that is stumping me.
I have a file stored in S3, and when a user clicks on a link in my app, I log in the DB that they've clicked the link, decrease their 'download credit' allowance by one, and then I want to prompt them to download the file.
I don't simply want to redirect the user to the file, because it's stored in S3 and I don't want them to have the link to the source file (so that I can maintain integrity and control access).
It looks like send_file() won't work with a remote source file; can anyone recommend a gem or suitable code which will do this?
You would need to stream the file content to the user while reading it from the S3 bucket/object.
If you use the AWS::S3 library something like this may work:
send_file_headers!(:length => S3Object.about(<s3 object>, <s3 bucket>)["content-length"],
                   :filename => <the filename>)
render :status => 200, :text => Proc.new { |response, output|
  S3Object.stream(<s3 object>, <s3 bucket>) do |chunk|
    output.write chunk
  end
}
This code is mostly copied from the send_file code, which by itself works only for local files or file-like objects.
N.B. I would advise against serving the file from the Rails process itself anyway. If possible/acceptable for your use case, I'd use an authenticated GET to serve the private data from the bucket.
Using an authenticated GET you can keep the bucket and its objects private, while allowing temporary permission to read a specific object's content by crafting a URL that includes an authentication signature token. The user is simply redirected to the authenticated URL, and the token can be made valid for just a few minutes.
Using the above-mentioned AWS::S3 you can obtain an authenticated GET URL this way:
time_of_expiry = Time.now + 2.minutes
S3Object.url_for(<s3 object>, <s3 bucket>,
  :expires => time_of_expiry)
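Tied back to the question, the controller action could then be roughly as follows (the model, attribute, and association names here are just placeholders):
def download
  file = DownloadableFile.find(params[:id])      # placeholder model
  current_user.decrement!(:download_credits)     # charge one credit
  DownloadLog.create!(user: current_user, downloadable_file: file)  # record the click

  # Redirect to a short-lived authenticated URL instead of exposing the raw S3 link.
  redirect_to S3Object.url_for(file.s3_key, file.s3_bucket,
                               :expires => Time.now + 2.minutes)
end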
Full image download method using a temp file (tested on Rails 3.2):
def download
  @image = Image.find(params[:image_id])
  open(@image.url) do |img|
    tmpfile = Tempfile.new("download.jpg")
    File.open(tmpfile.path, 'wb') do |f|
      f.write img.read
    end
    send_file tmpfile.path, :filename => "great-image.jpg"
  end
end
You can read the file from S3 and write it locally to a non-public directory, then use X-Sendfile (apache) or X-Accel-Redirect (nginx) to serve the content.
For nginx you would include something like the following in your config:
location /private {
  internal;
  alias /path/to/private/directory/;
}
Then in your rails controller, you do the following:
response.headers['Content-Type'] = your_content_type
response.headers['Content-Disposition'] = "attachment; filename=#{your_file_name}"
response.headers['Cache-Control'] = "private"
response.headers['X-Accel-Redirect'] = path_to_your_file
render :nothing=>true
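The "read the file from S3 and write it locally" step mentioned above could use the aws-sdk Resource API shown earlier in this thread; a rough sketch (the bucket, key, and paths are illustrative):
s3 = Aws::S3::Resource.new
obj = s3.bucket('my-bucket').object('reports/my.pdf')

# Stream the object straight into the nginx-internal directory.
obj.get(response_target: '/path/to/private/directory/my.pdf')

# The controller above would then set path_to_your_file to the matching
# internal location, e.g. '/private/my.pdf'.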