Rails direct upload to Amazon S3 - ruby-on-rails

I'm looking to add functionality to my Rails app to upload files directly to Amazon S3. From my research the general consensus seems to be to use the s3-swf-upload-plugin. I've setup a sample app using that gem but I can't get it to play nice with only allowing the selection of a single file. I'd also like to create a record post upload and use paperclip to create a thumbnail for which I can find little guidance.
So my questions are:
(1) Am I on the right track using that gem, or should I be taking another approach?
(2) Are there any samples out there that I could use for reference?
Any assistance would be much appreciated.
Chris

Try a newer gem called CarrierWaveDirect. It allows you to upload files directly to S3 using an HTML form, and it makes it easy to move the image processing into a background process.
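As a rough idea of what the setup looks like (a minimal sketch from memory of the carrierwave_direct README; the ImageUploader name and form fields are illustrative, so check the gem's docs for the current API):
app/uploaders/image_uploader.rb
class ImageUploader < CarrierWave::Uploader::Base
  # CarrierWaveDirect generates the signed S3 POST fields for you
  include CarrierWaveDirect::Uploader
end
In the view (assuming the controller sets something like @uploader = Image.new.image):
<%= direct_upload_form_for @uploader do |f| %>
  <%= f.file_field :image %>
  <%= f.submit "Upload" %>
<% end %>
After the upload lands in S3, you enqueue a background job that fetches the file and runs the processing.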

Not sure whether you can easily restrict it to uploading one file at a time, but this gem works very well for me. It is based on one of Ryan Bates' Railscasts:
https://github.com/waynehoover/s3_direct_upload
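For reference, the basic wiring looks roughly like this (a sketch based on my reading of the s3_direct_upload README; the env var names and callback route are assumptions):
config/initializers/s3_direct_upload.rb
S3DirectUpload.config do |c|
  c.access_key_id     = ENV['AWS_ACCESS_KEY_ID']
  c.secret_access_key = ENV['AWS_SECRET_ACCESS_KEY']
  c.bucket            = ENV['AWS_S3_BUCKET']
end
The gem then provides a form helper that uploads straight to S3 and posts back to a callback URL in your app when each file finishes:
<%= s3_uploader_form callback_url: photos_url, callback_param: "photo[image_url]" do %>
  <%= file_field_tag :file, multiple: true %>
<% end %>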

Try looking into CarrierWave https://github.com/jnicklas/carrierwave (it supports S3).
Multi-file uploads with CarrierWave and Uploadify: http://blog.assimov.net/post/4306595758/multi-file-upload-with-uploadify-and-carrierwave-on
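A minimal S3 setup for CarrierWave looks something like this (a sketch using the fog storage backend; the uploader name, env vars and region are assumptions):
config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    provider:              'AWS',
    aws_access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
    region:                'us-east-1'
  }
  config.fog_directory = ENV['AWS_S3_BUCKET'] # the bucket name
end
app/uploaders/image_uploader.rb
class ImageUploader < CarrierWave::Uploader::Base
  storage :fog # store files on S3 instead of the local filesystem
end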

If you are using Rails 3, please check out my sample projects:
Sample project using Rails 3, Flash and MooTools-based FancyUploader to upload directly to S3: https://github.com/iwasrobbed/Rails3-S3-Uploader-FancyUploader
Sample project using Rails 3, Flash/Silverlight/GoogleGears/BrowserPlus and jQuery-based Plupload to upload directly to S3: https://github.com/iwasrobbed/Rails3-S3-Uploader-Plupload
By the way, you can do post-processing with Paperclip using something like what this blog post describes:
http://www.railstoolkit.com/posts/fancyupload-amazon-s3-uploader-with-paperclip
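One common pattern for that kind of post-processing (a hedged sketch, not code from the linked post; the Photo model and process_from_s3! helper are illustrative) is to hand the S3 URL of the directly uploaded file back to Paperclip, which downloads it and generates the thumbnail styles:
app/models/photo.rb
class Photo < ActiveRecord::Base
  has_attached_file :image, styles: { thumb: "100x100#" }
  validates_attachment_content_type :image, content_type: /\Aimage\/.*\z/

  # Assign the S3 URL after the direct upload; Paperclip's URI adapter
  # downloads the original and runs the usual style processing.
  def process_from_s3!(remote_url)
    self.image = URI.parse(remote_url)
    save!
  end
end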

I have adapted Heroku's direct-to-S3 upload solution for Rails (which uses jQuery-File-Upload and the aws-sdk gem) so that uploads to S3 can be made remotely using Ajax. I hope this is useful:
posts_controller.rb
class PostsController < ApplicationController
  before_action :set_s3_direct_post, only: [:index, :create]
  before_action :delete_picture_from_s3, only: [:destroy]

  def index
    .
    .
  end

  def create
    @post = @user.posts.build(post_params)
    if @post.save
      respond_to do |format|
        format.html
        format.js
      end
    end
  end

  def destroy
    Post.find(params[:id]).destroy
  end

  private

  def set_s3_direct_post
    @s3_direct_post = S3_BUCKET.presigned_post(key: "uploads/#{SecureRandom.uuid}/${filename}",
                                               success_action_status: '201',
                                               acl: 'public-read')
  end

  def delete_picture_from_s3
    key = params[:picture_url].split('amazonaws.com/')[1]
    S3_BUCKET.object(key).delete
    return true
  rescue => e
    # If anyone knows a good way to deal with a defunct file sitting in the bucket, please speak up.
    return true
  end

  def post_params
    params.require(:post).permit(:content, :picture_url)
  end
end
posts.html.erb
<div class="info" data-url="<%= @s3_direct_post.url %>"
     data-formdata="<%= @s3_direct_post.fields.to_json %>"
     data-host="<%= URI.parse(@s3_direct_post.url).host %>">
</div>
The form
<%= form_for(:post, url: :posts, method: :post,
html: { class: "post_form", id: "post_form-#{post.id}" }
) do |f| %>
<%= f.text_area :content, id: "postfield-#{post.id}", class: "postText" %>
<%= f.button( :submit, name: "Post", title: "Post" ) do %>
<span class="glyphicon glyphicon-ok" aria-hidden="true"></span>
<% end %>
<span class="postuploadbutton" id="postUp-<%= post.id %>" title="Add file" >
<span class="glyphicon glyphicon-upload" aria-hidden="true"></span>
</span>
<span title="Cancel file" class="noticecancelupload" id="postCancel-<%= post.id %>" >
<span class="glyphicon glyphicon-remove-circle" aria-hidden="true"></span>
</span>
<%= f.file_field :picture_url, accept: 'image/jpeg,image/gif,image/png',
class: "notice_file_field", id: "postFile-#{post.id}" %>
<% end %>
_post.html.erb
<%= button_to post_path(
params: {
id: post.id,
picture_url: post.picture_url
}
),
class: 'btn btn-default btn-xs blurme',
data: { confirm: "Delete post: are you sure?" },
method: :delete do %>
<span class="glyphicon glyphicon-remove" aria-hidden="true"></span>
<% end %>
JavaScript in each _post.html.erb
$(document).off('click',"#postUp-<%= post.id %>");
$(document).on('click', '#postUp-<%= post.id %>', function(e) {
prepareUpload("#post_form-<%= post.id %>");
$('#postFile-<%= post.id %>').trigger("click");
});
$(document).off('click',"#postCancel-<%= post.id %>");
$(document).on('click', '#postCancel-<%= post.id %>', function(e) {
$(".appendedInput").remove(); // $('#postFile-<% post.id %>').val(""); doesn't work for me
$('.progBar').css('background','white').text("");
});
$(document).off('submit',"#post_form-<%= post.id %>"); // without this the form submitted multiple times in production
$(document).on('submit', '#post_form-<%= post.id %>', function(e) { // don't use $('#post_form-<%= post.id %>').submit(function() { so it doesn't bind to the #post_form (so it still works after ajax loading)
e.preventDefault(); // prevent normal form submission
if ( validatePostForm('<%= post.id %>') ) {
$.ajax({
type: 'POST',
url: $(this).attr('action'),
data: $(this).serialize(),
dataType: 'script'
});
$('#postCancel-<%= post.id %>').trigger("click");
}
});
function validatePostForm(postid) {
if ( jQuery.isBlank($('#postfield-' + postid).val()) && jQuery.isBlank($('#postFile-' + postid).val()) ) {
alert("Write something fascinating or add a picture.");
return false;
} else {
return true;
}
}
JavaScript in application.js
function prepareUpload(feckid) {
$(feckid).find("input:file").each(function(i, elem) {
var fileInput = $(elem);
var progressBar = $("<div class='progBar'></div>");
var barContainer = $("<div class='progress'></div>").append(progressBar);
fileInput.after(barContainer);
var maxFS = 10 * 1024 * 1024;
var info = $(".info");
var urlnumbnuts = info.attr("data-url");
var formdatanumbnuts = jQuery.parseJSON(info.attr("data-formdata"));
var hostnumbnuts = info.attr("data-host");
var form = $(fileInput.parents('form:first'));
fileInput.fileupload({
fileInput: fileInput,
maxFileSize: maxFS,
url: urlnumbnuts,
type: 'POST',
autoUpload: true,
formData: formdatanumbnuts,
paramName: 'file',
dataType: 'XML',
replaceFileInput: false,
add: function (e, data) {
$.each(data.files, function (index, file) {
if (file.size > maxFS) {
alert('Alas, the file exceeds the maximum file size of 10MB.');
form[0].reset();
return false;
} else {
data.submit();
return true;
}
});
},
progressall: function (e, data) {
var progress = parseInt(data.loaded / data.total * 100, 10);
progressBar.css('width', progress + '%')
},
start: function (e) {
progressBar.
css('background', 'orange').
css('display', 'block').
css('width', '0%').
text("Preparing...");
},
done: function(e, data) {
var key = $(data.jqXHR.responseXML).find("Key").text();
var url = '//' + hostnumbnuts + '/' + key;
var input = $('<input />', { type:'hidden', class:'appendedInput',
name: fileInput.attr('name'), value: url });
form.append(input);
progressBar.
css('background', 'green').
text("Ready");
},
fail: function(e, data) {
progressBar.
css("background", "red").
css("color", "black").
text("Failed");
}
});
});
} // function prepareUpload()
create.js.erb
$(".info").attr("data-formdata", '<%=raw #s3_direct_post.fields.to_json %>'); // don't use .data() to set attributes
$(".info").attr("data-url", "<%= #s3_direct_post.url %>");
$(".info").attr("data-host", "<%= URI.parse(#s3_direct_post.url).host %>");
$('.post_form')[0].reset();
$('.postText').val('');
application.js
//= require jquery-fileupload/basic
config/initializers/aws.rb
Aws.config.update({
region: 'us-east-1',
credentials: Aws::Credentials.new(ENV['AWS_ACCESS_KEY_ID'], ENV['AWS_SECRET_ACCESS_KEY']),
})
S3_BUCKET = Aws::S3::Resource.new.bucket(ENV['S3_BUCKET'])
Notes:
This solution is designed for multiple post forms on the index.html.erb page. This is why the @s3_direct_post information is placed inside a div of class info in index.html.erb, rather than in each post form. This means there is only one @s3_direct_post presented on the page at any one time, irrespective of the number of forms on the page. The data inside the @s3_direct_post is only grabbed (with a call to prepareUpload()) upon clicking the file upload button. Upon submission, a fresh @s3_direct_post is generated in the posts controller, and the information inside .info is updated by create.js.erb. Storing the @s3_direct_post data inside each form would mean many different instances of @s3_direct_post could exist at once, leading to errors with the file name generation.
You need to run :set_s3_direct_post in both the posts controller's index action (ready for the first upload) and its create action (ready for the second and subsequent uploads).
Normal form submission is prevented by e.preventDefault() so it can be done 'manually' with $.ajax(). Why not just use remote: true in the form? Because in Rails, file upload is done with an HTML request and page refresh even when you try to do it remotely.
Use info.attr() rather than info.data() to set and retrieve the @s3_direct_post attributes, because info.data() doesn't get updated
(for example, see this question). This means you also have to manually parse the attribute into an object using jQuery.parseJSON() (which .data() does automatically).
Don't use //= require jquery-fileupload in application.js. This bug was a real pain to identify (see here). The original Heroku solution didn't work until I changed this.

You can use Paperclip to upload to S3 (see the documentation) and to create thumbnails. Note that Paperclip uploads to a temporary folder first, so image processing can be applied before the file is pushed to S3.
As for examples of such a configuration, there are plenty of them throughout the blogosphere and on Stack Overflow, e.g. this one.
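For orientation, a typical Paperclip-to-S3 model looks roughly like this (a sketch; the Listing model, thumbnail geometry and env var names are assumptions):
app/models/listing.rb
class Listing < ActiveRecord::Base
  has_attached_file :photo,
    styles: { thumb: "100x100#" }, # thumbnail is generated locally before the push to S3
    storage: :s3,
    s3_credentials: {
      bucket:            ENV['AWS_S3_BUCKET'],
      access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
      secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
    }
  validates_attachment_content_type :photo, content_type: /\Aimage\/.*\z/
end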

Related

How to trigger the file upload on the client side and not on form submission?

I have a working version of the Active Storage example using S3 found here:
https://edgeguides.rubyonrails.org/active_storage_overview.html
Now I want to be able to perform the file upload not when I finish filling in the form but immediately after the user selects a file to upload.
Actually, in my case I have a WYSIWYG editor with an on-drop event that fires:
var myCodeMirror = CodeMirror.fromTextArea(post_body, {
lineNumbers: true,
dragDrop: true
});
myCodeMirror.on('drop', function(data, e) {
var file;
var files;
// Check if files were dropped
files = e.dataTransfer.files;
if (files.length > 0) {
e.preventDefault();
e.stopPropagation();
file = files[0];
console.log('File: ' + file.name);
console.log('File: ' + file.type);
return false;
}
});
So, since the file drop triggers this event, is there a way for me to send the file to Active Storage so it will start uploading to S3 right away?
Triggering uploads from the client-side
Active Storage exposes the DirectUpload JavaScript class which you can use to trigger a file upload directly from the client-side.
You can leverage this for integrations with third-party plugins (e.g. Uppy, Dropzone) or with your own custom JS code.
Using DirectUpload
The first thing you need to do is make sure that AWS S3 is set up to handle direct uploads. This requires ensuring your CORS configuration is set up properly.
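For example, a CORS rule that allows direct browser uploads can be set from Ruby with the aws-sdk-s3 gem (a sketch; the origin and bucket are assumptions, and the same rule can be entered in the S3 console instead):
require 'aws-sdk-s3'

Aws::S3::Client.new.put_bucket_cors(
  bucket: ENV['S3_BUCKET'],
  cors_configuration: {
    cors_rules: [{
      allowed_origins: ['https://www.example.com'], # your app's origin
      allowed_methods: ['PUT', 'POST'],
      allowed_headers: ['*'],
      expose_headers:  ['Origin', 'Content-Type', 'Content-MD5', 'Content-Disposition'],
      max_age_seconds: 3600
    }]
  }
)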
Next, you simply create an instance of the DirectUpload class, passing it the file to upload and the upload URL.
import { DirectUpload } from "activestorage"
// your form needs the file_field direct_upload: true, which
// provides data-direct-upload-url
const input = document.querySelector('input[type=file]')
const url = input.dataset.directUploadUrl
const upload = new DirectUpload(file, url)
upload.create((error, blob) => {
// handle errors OR persist to the model using 'blob.signed_id'
})
See full documentation here:
https://edgeguides.rubyonrails.org/active_storage_overview.html#integrating-with-libraries-or-frameworks
The DirectUpload#create method initiates the upload to S3 and returns with an error or the uploaded file blob.
Assuming there are no errors, the last step is to persist the uploaded file to the model. You can do this using blob.signed_id and putting it into a hidden field somewhere on the page OR with an AJAX request to update your model.
Uploading a file on drop
In the case above, to start the direct upload on the drop simply put the code above into the drop handler.
Something like this:
myCodeMirror.on('drop', function(data, e) {
// Get the file
var file = e.dataTransfer.files[0];
// You need a file input somewhere on the page...
const input = document.querySelector('input[type=file]')
const url = input.dataset.directUploadUrl
// Instantiate the DirectUploader object
const upload = new DirectUpload(file, url)
// Upload the file
upload.create((error, blob) => { ... })
});
Using the asset pipeline
If you are just using the asset pipeline and not a JavaScript bundler, then you create instances of the DirectUpload class like this:
const upload = new ActiveStorage.DirectUpload(file, url)
The main problem here is that you cannot import DirectUpload in the inline JavaScript section of the form. But we can create an ImmediateUploader object as follows:
Global JavaScript part
upload/uploader.js
import { DirectUpload } from "@rails/activestorage"
export default class Uploader {
constructor(file, url) {
this.file = file
this.url = url
this.directUpload = new DirectUpload(this.file, this.url, this)
}
upload() {
return new Promise((resolve, reject) => {
this.directUpload.create((error, blob) => {
if (error) {
// Handle the error
reject(error)
} else {
// Add an appropriately-named hidden input to the form
// with a value of blob.signed_id
resolve(blob)
}
})
})
}
}
upload/index.js
import Uploader from './uploader.js'
export default {
upload (file, url) {
const uploader = new Uploader(file, url)
return uploader.upload()
}
}
application.js
window.ImmediateUploader = require('./upload');
Form part
Now we can use ImmediateUploader to upload selected files directly to Active Storage and update the image immediately after the upload, without submitting the form:
<%= simple_form_for(resource, as: resource_name, url: registration_path(resource_name), html: { method: :put }) do |f| %>
<%= f.error_notification %>
<div class="form-inputs">
<div class="row">
<img id="avatar" class="centered-and-cropped" width="100" height="100" style="border-radius:50%" src="<%= url_for(user.photo) %>">
<button type="button" class="btn" onclick="event.preventDefault(); document.getElementById('user_photo').click()">Change avatar</button>
</div>
<%= f.file_field :photo, direct_upload: true, class: "hiddenfile" %>
</div>
<div class="form-actions">
<%= f.button :submit, t(".update"), class: 'btn btn-primary' %>
</div>
<% end %>
<% content_for :js do %>
<script>
const input = document.querySelector('input[type=file]')
input.addEventListener('change', (event) => {
Array.from(input.files).forEach(file => uploadFile(file))
// clear uploaded files from the input
input.value = null
})
const uploadFile = (file) => {
// your form needs the file_field direct_upload: true, which
// provides data-direct-upload-url
const url = input.dataset.directUploadUrl;
ImmediateUploader.default.upload (file, url)
.then(blob => {
// get blob.signed_id and add it to form values to submit form
const hiddenField = document.createElement('input')
hiddenField.setAttribute("type", "hidden");
hiddenField.setAttribute("value", blob.signed_id);
hiddenField.name = input.name
document.querySelector('form').appendChild(hiddenField)
// Update new avatar Immediately
document.getElementById('avatar').src = '/rails/active_storage/blobs/' + blob.signed_id + '/' + blob.filename;
// Update photo in Database
axios.post('/users/photo', { 'photo': blob.signed_id }).then(response => {});
});
}</script>
<% end %>
Controller:
class RegistrationController < Devise::RegistrationsController
  def update
    super
    @user = current_user
    @user.avatar = url_for(@user.photo.variant(resize_to_limit: [300, 300]).processed) if @user.photo.attached?
    @user.save
  end

  def updatephoto
    @photo = params[:photo]
    @user = current_user
    @user.photo = @photo
    @user.save
    @user = current_user
    @user.avatar = url_for(@user.photo.variant(resize_to_limit: [300, 300]).processed) if @user.photo.attached?
    @user.save
  end
end
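The axios call above posts to /users/photo, so you also need a matching route; a sketch (assuming the default Devise :user scope, adjust to your own routes):
config/routes.rb
devise_scope :user do
  post '/users/photo', to: 'registration#updatephoto'
end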

Dynamic update with method calling to Amazon API

I am a beginner in Rails, but I have done a lot of searching on this and can't seem to find something to help me, since I am having difficulty breaking down the problem. I have built a working method that requests information about a book from Amazon given its ISBN, and I would now like to use it to autofill information about the book after a user enters the ISBN into a form. Here is my method (which is in my listing.rb model file):
def self.isbn_lookup(val)
request = Vacuum.new('US')
request.configure(
aws_access_key_id: 'access_key_here',
aws_secret_access_key: 'secret_access_key_here',
associate_tag: 'associate_tag_here'
)
response = request.item_lookup(
query: {
'ItemId' => val,
'SearchIndex' => 'Books',
'IdType' => 'ISBN'
},
persistent: true
)
fr = response.to_h #returns complete hash
author = fr.dig("ItemLookupResponse","Items","Item","ItemAttributes","Author")
title = fr.dig("ItemLookupResponse","Items","Item","ItemAttributes","Title")
manufacturer = fr.dig("ItemLookupResponse","Items","Item","ItemAttributes","Manufacturer")
url = fr.dig("ItemLookupResponse","Items","Item","ItemLinks","ItemLink",6,"URL")
return {title: title, author: author, manufacturer: manufacturer, url: url}
end
Here is my controller for now. I am not sure how to make this generic so that the ISBN depends on what the user enters (it should take in a value given by the user instead of assuming the @isbn instance variable is always set):
def edit
  @isbn = Listing.isbn_lookup(1285741552)
end
Here is my _form.html.erb partial where I want to call this ISBN autofill:
<%= form_for(@listing, :html => {class: "form-horizontal", role: "form"}, method: :get) do |f| %>
<div class="form-group">
<div class="control-label col-sm-2">
<%= f.label :isbn, "ISBN" %>
</div>
<div class="col-sm-8">
<%= f.text_field :isbn, id: "auto-isbn", class: "form-control" , placeholder: "ISBN (10 or 13 digits)", autofocus: true %>
</div>
</div>
...
<% end %>
Finally, here is my JS for what I think should maybe be the start to the AJAX call:
$(document).ready(function() {
$(document).on('keyup','input#auto-isbn',function() {
$.get(this.action, $(this).serialize(), null, "script");
return false;
});
});
How do I make it so that when users put in an ISBN, my app will call the isbn_lookup method and then return the information gathered?
To begin, I would create a lookup path in your routes.rb file. That would look something like:
resources :listings do
collection do
get :lookup
end
end
Which will give you:
lookup_listings GET /listings/lookup(.:format) listings#lookup
Then create the lookup action in your listings_controller.rb, something like:
class ListingsController < ApplicationController
  ...
  def lookup
    @isbn_lookup_result = Listing.isbn_lookup(params[:isbn])
    render partial: 'isbn_lookup_result'
  end
  ...
end
Naturally, this requires that you have a _isbn_lookup_result.html.erb file that accesses/uses the values from @isbn_lookup_result.
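A minimal version of that partial might look like this (a sketch; the markup and results container are illustrative, and the hash keys are the ones returned by Listing.isbn_lookup):
app/views/listings/_isbn_lookup_result.html.erb
<div class="isbn-result">
  <p>Title: <%= @isbn_lookup_result[:title] %></p>
  <p>Author: <%= @isbn_lookup_result[:author] %></p>
  <p>Publisher: <%= @isbn_lookup_result[:manufacturer] %></p>
  <%= link_to "View on Amazon", @isbn_lookup_result[:url] if @isbn_lookup_result[:url] %>
</div>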
Then, to call this action from your JS, do something like the following (full disclosure: I use CoffeeScript, so my plain JS skills are a little rusty):
$(document).ready(function() {
  var TIMEOUT = null;
  $(document).on('keyup', 'input#auto-isbn', function() {
    clearTimeout(TIMEOUT);
    TIMEOUT = setTimeout(function() {
      var ajaxResponse = $.ajax({
        url: "listings/lookup",
        type: 'GET',
        data: { isbn: $('input#auto-isbn').val() }
      });
      ajaxResponse.success(function(data) {
        // do stuff with your data response
        // perhaps something like:
        $('#isbn-lookup-results-container').html(data);
      });
    }, 500);
  });
});
This bit:
clearTimeout(TIMEOUT);
TIMEOUT = setTimeout(function() {
  ...
}, 500);
creates a half-second delay between when your user stops typing and when the AJAX function is called. That way, you're not doing a lookup on every single keyup, only when the user pauses in their typing.
This bit:
var ajaxResponse = $.ajax({
url: "listings/lookup",
type: 'GET',
data: {isbn: $('input#auto-isbn').val()}
});
is the AJAX call. You can see the new listings/lookup path in use. The data: {isbn: $('input#auto-isbn').val()} bit gives you params[:isbn], which is used in the lookup action.
Then, upon success, you use this bit to do something with your response:
ajaxResponse.success(function(data) {
  // do stuff with your data response
  // perhaps something like:
  $('#isbn-lookup-results-container').html(data)
});
In this case, data is the HTML that resulted from the render partial: call, so you could load it into a div.

How to upload videos asynchronously using carrierwave

I'm using CarrierWave to upload movies to Amazon S3. I want to do it asynchronously so that I can upload multiple videos at the same time, and I want to use a progress bar.
How can I do this?
You can use Dropzone for this. This is sample code I have used in my app:
View:
<%= form_tag user_new_drag_drop_photo_path, method: :post, class: "dropzone form-horizontal", id: "media-dropzone", :authenticity_token => true do %>
<div class="fallback">
<%= file_field_tag "photo", multiple: true %>
</div>
<% end %>
JS code:
function create_dropzone(thumbnailUrls) {
  Dropzone.autoDiscover = false;
  var mediaDropzone;
  mediaDropzone = new Dropzone("#media-dropzone", {
    addRemoveLinks: true,
    success: function(file, response) {
      $(file.previewTemplate).find('.dz-remove').attr('id', response.id);
      $(file.previewElement).addClass("dz-success");
    },
    removedfile: function(file) {
      var id = $(file.previewTemplate).find('.dz-remove').attr('id');
      var parent = $(file.previewTemplate).find('.dz-remove').parent();
      parent.remove();
      // The above removes the file preview, but you can also send an AJAX request to delete the file from the server
    },
    init: function() {
      // This callback can be used to do some task on Dropzone initialization
    }
  });
}
This is purely my code (I have deleted some parts), so you will need to adapt it to your app. user_new_drag_drop_photo_path is my custom path; you can point the form at the photos controller's create action instead. The request is sent via AJAX and the photo is saved there.
You can find more documentation here:
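If you do route the Dropzone form to the photos controller, the create action might look roughly like this (a sketch assuming a Photo model with a CarrierWave uploader mounted as image; the success callback above reads response.id, so the action returns JSON containing the new record's id):
app/controllers/photos_controller.rb
class PhotosController < ApplicationController
  def create
    @photo = current_user.photos.new(image: params[:photo])
    if @photo.save
      render json: { id: @photo.id }, status: :created
    else
      render json: { errors: @photo.errors.full_messages }, status: :unprocessable_entity
    end
  end
end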
http://www.dropzonejs.com/
Hope this helps.

Rails Amazon S3 direct upload with JQuery File Uploader

I have spent days now trying to make this work. I am getting this error:
OPTIONS https://bucketname.s3.oregon.amazonaws.com/ net::ERR_NAME_RESOLUTION_FAILED
I am using Chrome Version 43.0.2357.130 on Ubuntu 14.04 (64-bit).
Gemfile:
gem "jquery-fileupload-rails"
gem 'aws-sdk'
application.js (after jquery):
//= require jquery-fileupload/basic
application.css:
*= require jquery.fileupload
*= require jquery.fileupload-ui
I have a model called Upload, for which I generated a scaffold like this:
rails generate scaffold Upload upload_url:string
uploads_controller.rb:
def new
  @s3_direct_post = Aws::S3::PresignedPost.new(
    Aws::Credentials.new(ENV['AWS_S3_ACCESS_KEY_ID'], ENV['AWS_S3_SECRET_ACCESS_KEY']),
    "Oregon", ENV['AWS_S3_BUCKET'],
    key: '/uploads/object/test.test',
    content_length_range: 0..999999999,
    acl: 'public-read',
    success_action_status: "201"
  )
  @upload = Upload.new
end
_form.html.erb (for uploads):
<%= form_for(@upload, html: { class: "directUpload" }) do |f| %>
......
<div class="field">
<%= f.label :upload_url %><br>
<%= f.file_field :upload_url %>
</div>
......
<%= content_tag "div", id: "upload_data", data: { url: @s3_direct_post.url, form_data: @s3_direct_post.fields } do %>
<% end %>
application.js (in the end):
$( document ).ready(function() {
$(function() {
$('.directUpload').find("input:file").each(function(i, elem) {
var fileInput = $(elem);
var form = $(fileInput.parents('form:first'));
var submitButton = form.find('input[type="submit"]');
var progressBar = $("<div class='bar'></div>");
var barContainer = $("<div class='progress'></div>").append(progressBar);
fileInput.after(barContainer);
fileInput.fileupload({
fileInput: fileInput,
url: $('#upload_data').data('url'),
type: 'POST',
autoUpload: true,
formData: $('#upload_data').data('form-data'),
paramName: 'file', // S3 does not like nested name fields i.e. name="user[avatar_url]"
dataType: 'XML', // S3 returns XML if success_action_status is set to 201
replaceFileInput: false,
progressall: function (e, data) {
var progress = parseInt(data.loaded / data.total * 100, 10);
progressBar.css('width', progress + '%')
},
start: function (e) {
submitButton.prop('disabled', true);
progressBar.
css('background', 'green').
css('display', 'block').
css('width', '0%').
text("Loading...");
},
done: function(e, data) {
submitButton.prop('disabled', false);
progressBar.text("Uploading done");
// extract key and generate URL from response
var key = $(data.jqXHR.responseXML).find("Key").text();
// create hidden field
var input = $("<input />", { type:'hidden', name: fileInput.attr('name'), value: url })
form.append(input);
},
fail: function(e, data) {
submitButton.prop('disabled', false);
progressBar.
css("background", "red").
text("Failed");
}
});
});
});
});
Seriously, what can I do to fix this?
My guess is that you have misconfigured your bucket name / region. The error is a DNS resolution failure: there is no host at https://bucketname.s3.oregon.amazonaws.com/.
It seems to me you need to set bucketname to your actual bucket name, and also drop oregon from the URL. Given that your bucket is named aymansalah, the URL will be: https://aymansalah.s3.amazonaws.com/
Review Aws::Credentials documentation and check your environment variables to achieve that URL.
I found the problem. Thanks a lot to felixbuenemann, a collaborator on jquery-fileupload-rails.
Although the bucket properties say Region: Oregon, I have to use "us-west-2" according to this Amazon region documentation.
uploads_controller.rb is now:
def new
  @s3_direct_post = Aws::S3::PresignedPost.new(
    Aws::Credentials.new(ENV['AWS_S3_ACCESS_KEY_ID'], ENV['AWS_S3_SECRET_ACCESS_KEY']),
    "us-west-2", ENV['AWS_S3_BUCKET'],
    key: '/uploads/object/test.test',
    content_length_range: 0..999999999,
    acl: 'public-read',
    success_action_status: "201"
  )
  @upload = Upload.new
end

Setting file content-type in S3 with jQuery file upload

I need help providing a content type to Amazon via a client-side jQuery upload form. I need to set the content type because I'm uploading audio files that will not play in jPlayer for IE10 unless the content type is properly set. I used the blog post by pjambet (http://pjambet.github.io/blog/direct-upload-to-s3/) to get up and running (excellent post, by the way). It seems, though, that the order of the fields is extremely important. I've been trying to insert a hidden input tag, either containing the relevant content type (audio/mpeg3, I think) or blank, to be populated by my upload script. No luck; the upload hangs when the extra fields are added.
direct-upload-form.html.erb
<form accept-charset="UTF-8" action="http://my_bucket.s3.amazonaws.com" class="direct-upload" enctype="multipart/form-data" method="post"><div style="margin:0;padding:0;display:inline"></div>
<%= hidden_field_tag :key, "${filename}" %>
<%= hidden_field_tag "AWSAccessKeyId", ENV['AWS_ACCESS_KEY_ID'] %>
<%= hidden_field_tag :acl, 'public-read' %>
<%= hidden_field_tag :policy %>
<%= hidden_field_tag :signature %>
<%= hidden_field_tag :success_action_status, "201" %>
<%= file_field_tag :file %>
<div class="row-fluid">
<div class="progress hide span8">
<div class="bar"></div>
</div>
</div>
</form>
audio-upload.js
$(function() {
$('input[type="submit"]').attr("disabled","true");
$('input[type="submit"]').val("Please upload audio first");
if($('#demo_audio').val() != ''){
var filename = $('#demo_audio').val().split('/').pop().split('%2F').pop();
$('#file_status').removeClass('label-info').addClass('label-success').html(filename + ' upload complete');
}
$('.direct-upload').each(function() {
var form = $(this)
$(this).fileupload({
url: form.attr('action'),
type: 'POST',
autoUpload: true,
dataType: 'xml', // This is really important as s3 gives us back the url of the file in a XML document
add: function (event, data) {
$.ajax({
url: "/signed_urls",
type: 'GET',
dataType: 'json',
data: {doc: {title: data.files[0].name}}, // send the file name to the server so it can generate the key param
async: false,
success: function(data) {
// Now that we have our data, we update the form so it contains all
// the needed data to sign the request
form.find('input[name=key]').val(data.key)
form.find('input[name=policy]').val(data.policy)
form.find('input[name=signature]').val(data.signature)
}
})
data.form.find('#content-type').val(file.type)
data.submit();
},
send: function(e, data) {
var filename = data.files[0].name;
$('input[type="submit"]').val("Please wait until audio uploaded is complete...");
$('#file_status').addClass('label-info').html('Uploading ' + filename);
$('.progress').fadeIn();
},
progress: function(e, data){
// This is what makes everything really cool, thanks to that callback
// you can now update the progress bar based on the upload progress
var percent = Math.round((e.loaded / e.total) * 100)
$('.bar').css('width', percent + '%')
},
fail: function(e, data) {
console.log('fail')
},
success: function(data) {
// Here we get the file url on s3 in an xml doc
var url = $(data).find('Location').text()
$('#demo_audio').val(url) // Update the real input in the other form
},
done: function (event, data) {
$('input[type="submit"]').val("Create Demo");
$('input[type="submit"]').removeAttr("disabled");
$('.progress').fadeOut(300, function() {
$('.bar').css('width', 0);
var filename = data.files[0].name;
$('span.filename').html(filename);
$('#file_status').removeClass('label-info').addClass('label-success').html(filename + ' upload complete');
$('#file').hide();
})
},
})
})
})
signed_urls_controller.rb
class SignedUrlsController < ApplicationController
def index
render json: {
policy: s3_upload_policy_document,
signature: s3_upload_signature,
key: "uploads/#{SecureRandom.uuid}/#{params[:doc][:title]}",
success_action_redirect: "/"
}
end
private
# generate the policy document that amazon is expecting.
def s3_upload_policy_document
Base64.encode64(
{
expiration: 30.minutes.from_now.utc.strftime('%Y-%m-%dT%H:%M:%S.000Z'),
conditions: [
{ bucket: ENV['AWS_S3_BUCKET'] },
{ acl: 'public-read' },
["starts-with", "$key", "uploads/"],
{ success_action_status: '201' }
]
}.to_json
).gsub(/\n|\r/, '')
end
# sign our request by Base64 encoding the policy document.
def s3_upload_signature
Base64.encode64(
OpenSSL::HMAC.digest(
OpenSSL::Digest::Digest.new('sha1'),
ENV['AWS_SECRET_ACCESS_KEY'],
s3_upload_policy_document
)
).gsub(/\n/, '')
end
end
As mentioned in the comments section for the above question, two changes are required to set the Content-Type for the uploaded content to audio/mpeg3.
The policy for the S3 POST API call must be changed to accept an additional "Content-Type" value. In the sample code, this can be achieved by adding the following condition to the conditions array in the s3_upload_policy_document method: ["eq", "$Content-Type", "audio/mpeg3"]
The "Content-Type" variable must be included with the POST request to S3. In the jQuery file uploader plugin this can be achieved by adding a hidden field to the form that is sent to S3, with the name "Content-Type" and the value "audio/mpeg3".
