I have spent days now trying to make this work, and I am getting this error:
OPTIONS https://bucketname.s3.oregon.amazonaws.com/ net::ERR_NAME_RESOLUTION_FAILED
I am using Chrome Version 43.0.2357.130 on Ubuntu 14.04 (64-bit).
Gemfile:
gem "jquery-fileupload-rails"
gem 'aws-sdk'
application.js (after jquery):
//= require jquery-fileupload/basic
application.css:
*= require jquery.fileupload
*= require jquery.fileupload-ui
I have a model called Upload that I generated a scaffold for like this:
rails generate scaffold Upload upload_url:string
uploads_controller.rb:
def new
@s3_direct_post = Aws::S3::PresignedPost.new(Aws::Credentials.new(ENV['AWS_S3_ACCESS_KEY_ID'], ENV['AWS_S3_SECRET_ACCESS_KEY']),
"Oregon", ENV['AWS_S3_BUCKET'], {
key: '/uploads/object/test.test',
content_length_range: 0..999999999,
acl: 'public-read',
success_action_status: "201",
})
@upload = Upload.new
end
_form.html.erb (for uploads):
<%= form_for(@upload, html: { class: "directUpload" }) do |f| %>
......
<div class="field">
<%= f.label :upload_url %><br>
<%= f.file_field :upload_url %>
</div>
......
<%= content_tag "div", id: "upload_data", data: {url: @s3_direct_post.url, form_data: @s3_direct_post.fields } do %>
<% end %>
application.js (at the end):
$(function() {
$('.directUpload').find("input:file").each(function(i, elem) {
var fileInput = $(elem);
var form = $(fileInput.parents('form:first'));
var submitButton = form.find('input[type="submit"]');
var progressBar = $("<div class='bar'></div>");
var barContainer = $("<div class='progress'></div>").append(progressBar);
fileInput.after(barContainer);
fileInput.fileupload({
fileInput: fileInput,
url: $('#upload_data').data('url'),
type: 'POST',
autoUpload: true,
formData: $('#upload_data').data('form-data'),
paramName: 'file', // S3 does not like nested name fields i.e. name="user[avatar_url]"
dataType: 'XML', // S3 returns XML if success_action_status is set to 201
replaceFileInput: false,
progressall: function (e, data) {
var progress = parseInt(data.loaded / data.total * 100, 10);
progressBar.css('width', progress + '%')
},
start: function (e) {
submitButton.prop('disabled', true);
progressBar.
css('background', 'green').
css('display', 'block').
css('width', '0%').
text("Loading...");
},
done: function(e, data) {
submitButton.prop('disabled', false);
progressBar.text("Uploading done");
// extract key from the response and build the file URL
var key = $(data.jqXHR.responseXML).find("Key").text();
var url = $('#upload_data').data('url') + key; // bucket endpoint plus the object key
// create hidden field carrying the uploaded file's URL
var input = $("<input />", { type:'hidden', name: fileInput.attr('name'), value: url });
form.append(input);
},
fail: function(e, data) {
submitButton.prop('disabled', false);
progressBar.
css("background", "red").
text("Failed");
}
});
});
});
Seriously, what can I do to fix this?
My guess is that you have misconfigured your bucket name / region. The error comes from your browser, not Amazon: name resolution failed because there is no DNS entry for https://bucketname.s3.oregon.amazonaws.com/.
It seems to me you need to put your actual bucket name in the URL, and also drop oregon from it. Given that your bucket is named aymansalah, the URL will be: https://aymansalah.s3.amazonaws.com/
Review the Aws::Credentials documentation and check your environment variables to achieve that URL.
I found the problem. Thanks a lot to felixbuenemann, a collaborator on jquery-fileupload-rails.
Although "Oregon" is what I see in the bucket's properties (it says Region: Oregon), I have to use "us-west-2" according to the Amazon region documentation.
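For reference, a few of the console's display names and their API region codes: Oregon is us-west-2, N. Virginia is us-east-1, Ireland is eu-west-1, and Tokyo is ap-northeast-1.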
uploads_controller.rb is now:
def new
@s3_direct_post = Aws::S3::PresignedPost.new(Aws::Credentials.new(ENV['AWS_S3_ACCESS_KEY_ID'], ENV['AWS_S3_SECRET_ACCESS_KEY']),
"us-west-2", ENV['AWS_S3_BUCKET'], {
key: '/uploads/object/test.test',
content_length_range: 0..999999999,
acl: 'public-read',
success_action_status: "201",
})
@upload = Upload.new
end
Related
I'm using CarrierWave on Rails to handle file upload.
It's working well, and I want to add a new feature to my app to display the uploaded file with jQuery (a kind of preview of the uploaded file).
Therefore, I have to pass the uploaded file's URL to the script.
Unfortunately, I have no idea how to do it.
File upload model
class OrderFile < ActiveRecord::Base
belongs_to :order
validates :file, presence: true
mount_uploader :file, StLuploaderUploader
end
Form to upload new file
<%= form_for OrderFile.new, html: { multipart: true }, :url => url_for(:controller => 'order_files', :action => 'create') do |f| %>
<%= f.file_field :file, class: "form-control" %>
<% end %>
jQuery File Upload call
jQuery(function() {
return $('#new_order_file').fileupload({
autoUpload: true,
add: function(e, data) {
var file, types;
types = /(\.|\/)(png|jpg)$/i;
file = data.files[0];
if (types.test(file.type) || types.test(file.name)) {
data.context = $(tmpl("template-upload", file));
$('.CO-file_upload_progress').append(data.context);
data.submit();
} else {
return alert("Not supporting");
}
},
progress: function(e, data) {
var progress;
if (data.context) {
progress = parseInt(data.loaded / data.total * 100, 10);
return data.context.find('.bar').css('width', progress + '%');
}
},
done: function(e, data) {
$.each(data.files, function(index, file) {
prepare_file(file); // passing file to the function
});
}
});
});
Function to preview file
function prepare_file(file) {
loaded(file.url, "model-preview"); // I need an absolute path to the uploaded file to be passed here - file.url is not working
};
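This one went unanswered above, so here is one possible approach (a sketch, not from the original thread; the controller body and JSON shape are my assumptions): have the create action respond with JSON containing the uploader's URL, since file.url is only known server-side.

class OrderFilesController < ApplicationController
  def create
    # A sketch: the real create action isn't shown in the question.
    @order_file = OrderFile.new(params[:order_file]) # use strong parameters on Rails 4+
    if @order_file.save
      # CarrierWave exposes the stored file's URL through the mounted uploader;
      # it is an absolute URL when stored on S3/fog, a relative path locally.
      render json: { url: @order_file.file.url }
    else
      render json: { errors: @order_file.errors.full_messages }, status: :unprocessable_entity
    end
  end
end

The done callback can then call prepare_file(data.result) instead of iterating data.files, assuming the plugin parses the response as JSON (its dataType option).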
I'm using a modified version of the jQuery multi-file uploader from Railscast #383 (http://railscasts.com/episodes/383-uploading-to-amazon-s3) in a Rails 3 app, and I need to tweak it so that it checks whether a file already exists on S3, and skips re-uploading it if so.
Some background: my users need to upload large chunks of data. For instance, one might select 500 4MB files to upload. Inevitably, their internet connection breaks, and rather than expecting the user to figure out which files uploaded and which didn't, I want them to be able to just select those same 500 files and have the app be smart enough not to start again at the very beginning.
The most preferable solution would be to include an option in the S3 POST that says not to overwrite an existing file. Next most preferable would be to fire off a GET to S3 to see if the file exists, and skip it if so.
Least preferably, I've implemented a solution that non-asynchronously fires off a GET to my Rails app (because I create a database entry upon completion of each upload), but I seem to be having problems with throttling those requests, and my user says her browser keeps crashing (I guess it fires all 500 at once).
Relevant application.js
//= require jquery
//= require jquery_ujs
//= require jquery.ui.all
//= require jquery-fileupload/basic
//= require jquery-fileupload/vendor/tmpl
My form:
<%= s3_uploader_form post: uploaded_photos_path, as: "uploaded_photo[image_url]", photo_shoot_id: @photo_shoot.id do %>
<%= file_field_tag :file, multiple: true %>
<%= button_tag 'Upload Photos', id: 'upload_photo_button', type: 'button' %>
<% end %>
My javascript:
$(function() {
$('#s3_uploader').fileupload({
limitConcurrentUploads: 5,
add: function(e, data) {
var file, record_exists, photo_check_url;
file = data.files[0];
photo_check_url = "/my_route/has_photo_been_uploaded/" + encodeURIComponent(file.name)
// THIS IS MY NON-THROTTLING HACK THAT NEEDS REPLACEMENT/IMPROVEMENT
// THE CONTROLLER THAT HANDLES THE REQUEST JUST RENDERS AN INLINE STRING OF 'true' OR 'false'
$.ajax( {
url: photo_check_url,
async: false,
success: function (result) {
record_exists = result;
}
});
if (record_exists == 'false') {
data.context = $(tmpl("template-upload", file));
$('#s3_uploader').append(data.context);
data.submit();
}
},
progress: function(e, data) { /* irrelevant */ },
done: function(e, data) { /* irrelevant: it posts the object to my database */ },
fail: function(e, data) { /* irrelevant */ }
});
});
My Helper:
module S3UploaderHelper
def s3_uploader_form(options = {}, &block)
uploader = S3Uploader.new(options)
form_tag(uploader.url, uploader.form_options) do
uploader.fields.map do |name, value|
hidden_field_tag(name, value)
end.join.html_safe + capture(&block)
end
end
class S3Uploader
def initialize(options)
@options = options.reverse_merge(
id: "s3_uploader",
aws_access_key_id: ENV["S3_ACCESS_KEY"],
aws_secret_access_key: ENV["S3_SECRET_ACCESS_KEY"],
bucket: S3_BUCKET_NAME,
acl: "private",
expiration: 10.hours.from_now.utc,
max_file_size: 20.megabytes,
as: "file"
)
end
def form_options
{
id: @options[:id],
method: "post",
authenticity_token: false,
multipart: true,
data: {
post: @options[:post],
as: @options[:as]
}
}
end
def fields
{
:key => key,
:acl => @options[:acl],
:policy => policy,
:signature => signature,
"AWSAccessKeyId" => @options[:aws_access_key_id],
}
end
def key
@key ||= "uploaded_photos/${filename}"
end
def url
"https://#{#options[:bucket]}.s3.amazonaws.com/"
end
def policy
Base64.encode64(policy_data.to_json).gsub("\n", "")
end
def policy_data
{
expiration: @options[:expiration],
conditions: [
["starts-with", "$utf8", ""],
["starts-with", "$key", ""],
["content-length-range", 0, #options[:max_file_size]],
{bucket: #options[:bucket]},
{acl: #options[:acl]}
]
}
end
def signature
Base64.encode64(
OpenSSL::HMAC.digest(
OpenSSL::Digest::Digest.new('sha1'),
@options[:aws_secret_access_key], policy
)
).gsub("\n", "")
end
end
end
After learning more about AJAX (after it occurred to me in my second comment), it looks like an acceptable solution was indeed to make the AJAX call asynchronous and place the S3 POST code inside its success callback. That solved my browser non-responsiveness issues.
$.ajax({
  url: my_route_to_ask_if_photo_was_already_uploaded,
  success: function (result) {
    if (result == 'false') {
      // ...other code
      data.submit();
    }
  }
});
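Putting that together with the add handler from the question, the asynchronous version looks something like this (assembled from the code above, nothing new except the rearrangement):

add: function(e, data) {
  var file = data.files[0];
  var photo_check_url = "/my_route/has_photo_been_uploaded/" + encodeURIComponent(file.name);
  $.ajax({
    url: photo_check_url, // async by default, so the browser stays responsive
    success: function (result) {
      if (result == 'false') {
        data.context = $(tmpl("template-upload", file));
        $('#s3_uploader').append(data.context);
        data.submit(); // upload only the files the server doesn't already know about
      }
    }
  });
}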
I need help providing a Content-Type to Amazon via a client-side jQuery upload form. I need to add the content type because I'm uploading audio files that will not play in jPlayer for IE10 unless the content type is properly set. I used the blog post by pjambet (http://pjambet.github.io/blog/direct-upload-to-s3/) to get up and running (excellent post, by the way). It seems, though, that the order of the fields is extremely important. I've been trying to insert a hidden input tag, either containing the relevant content type (audio/mpeg3, I think) or blank, to be populated by my upload script. No luck. The upload hangs when the extra fields are added.
direct-upload-form.html.erb
<form accept-charset="UTF-8" action="http://my_bucket.s3.amazonaws.com" class="direct-upload" enctype="multipart/form-data" method="post"><div style="margin:0;padding:0;display:inline"></div>
<%= hidden_field_tag :key, "${filename}" %>
<%= hidden_field_tag "AWSAccessKeyId", ENV['AWS_ACCESS_KEY_ID'] %>
<%= hidden_field_tag :acl, 'public-read' %>
<%= hidden_field_tag :policy %>
<%= hidden_field_tag :signature %>
<%= hidden_field_tag :success_action_status, "201" %>
<%= file_field_tag :file %>
<div class="row-fluid">
<div class="progress hide span8">
<div class="bar"></div>
</div>
</div>
</form>
audio-upload.js
$(function() {
$('input[type="submit"]').attr("disabled","true");
$('input[type="submit"]').val("Please upload audio first");
if($('#demo_audio').val() != ''){
var filename = $('#demo_audio').val().split('/').pop().split('%2F').pop();
$('#file_status').removeClass('label-info').addClass('label-success').html(filename + ' upload complete');
}
$('.direct-upload').each(function() {
var form = $(this)
$(this).fileupload({
url: form.attr('action'),
type: 'POST',
autoUpload: true,
dataType: 'xml', // This is really important as s3 gives us back the url of the file in a XML document
add: function (event, data) {
$.ajax({
url: "/signed_urls",
type: 'GET',
dataType: 'json',
data: {doc: {title: data.files[0].name}}, // send the file name to the server so it can generate the key param
async: false,
success: function(data) {
// Now that we have our data, we update the form so it contains all
// the needed data to sign the request
form.find('input[name=key]').val(data.key)
form.find('input[name=policy]').val(data.policy)
form.find('input[name=signature]').val(data.signature)
}
})
data.form.find('#content-type').val(data.files[0].type)
data.submit();
},
send: function(e, data) {
var filename = data.files[0].name;
$('input[type="submit"]').val("Please wait until audio uploaded is complete...");
$('#file_status').addClass('label-info').html('Uploading ' + filename);
$('.progress').fadeIn();
},
progress: function(e, data){
// This is what makes everything really cool, thanks to that callback
// you can now update the progress bar based on the upload progress
var percent = Math.round((e.loaded / e.total) * 100)
$('.bar').css('width', percent + '%')
},
fail: function(e, data) {
console.log('fail')
},
success: function(data) {
// Here we get the file url on s3 in an xml doc
var url = $(data).find('Location').text()
$('#demo_audio').val(url) // Update the real input in the other form
},
done: function (event, data) {
$('input[type="submit"]').val("Create Demo");
$('input[type="submit"]').removeAttr("disabled");
$('.progress').fadeOut(300, function() {
$('.bar').css('width', 0);
var filename = data.files[0].name;
$('span.filename').html(filename);
$('#file_status').removeClass('label-info').addClass('label-success').html(filename + ' upload complete');
$('#file').hide();
})
},
})
})
})
signed_urls_controller.rb
class SignedUrlsController < ApplicationController
def index
render json: {
policy: s3_upload_policy_document,
signature: s3_upload_signature,
key: "uploads/#{SecureRandom.uuid}/#{params[:doc][:title]}",
success_action_redirect: "/"
}
end
private
# generate the policy document that amazon is expecting.
def s3_upload_policy_document
Base64.encode64(
{
expiration: 30.minutes.from_now.utc.strftime('%Y-%m-%dT%H:%M:%S.000Z'),
conditions: [
{ bucket: ENV['AWS_S3_BUCKET'] },
{ acl: 'public-read' },
["starts-with", "$key", "uploads/"],
{ success_action_status: '201' }
]
}.to_json
).gsub(/\n|\r/, '')
end
# sign our request by Base64 encoding the policy document.
def s3_upload_signature
Base64.encode64(
OpenSSL::HMAC.digest(
OpenSSL::Digest::Digest.new('sha1'),
ENV['AWS_SECRET_ACCESS_KEY'],
s3_upload_policy_document
)
).gsub(/\n/, '')
end
end
As mentioned in the comments section for the above question, two changes are required to set the Content-Type for the uploaded content to audio/mpeg3; both are sketched below.
1. The policy for the S3 POST API call must be changed to accept an additional "Content-Type" value. In the sample code, this can be achieved by adding the following condition to the conditions array in the s3_upload_policy_document method: ["eq", "$Content-Type", "audio/mpeg3"]
2. The "Content-Type" variable must be included with the POST request to S3. In the jQuery file uploader plugin, this can be achieved by adding a hidden field to the form that is sent to S3, with the name "Content-Type" and the value "audio/mpeg3".
I want to make a simple form to upload an audio file, and I want to show a progress bar of the upload when the user submits the file. I only want to submit one file at a time.
My _upload.html.erb:
<%= form_for Sound.new do |f| %>
<%= f.file_field :fichier, name: 'sound[fichier]', :required => true %><br />
<%= f.text_field :title, :placeholder => 'Titre', :size => 10, :required => true %><br />
<%= f.submit 'Envoyer' %>
<% end %>
<div class="progress"><div class="bar" style="width: 0%;"></div></div>
My JS file:
$('#new_sound').submit(function() {
$('#new_sound').fileupload({
dataType: 'json',
progress: function (e, data){
var progress = parseInt(data.loaded / data.total * 100, 10);
$('.bar').css('width', progress + '%');
},
});
});
EDIT (I realize that I forgot my question): actually it's not really a question, it's just that I don't get why it doesn't work. With the following JS it works, but I want to wait until the user hits submit before the file is uploaded.
$(function () {
$('#new_sound').fileupload({
dataType: 'json',
add: function (e, data){
data.submit();
},
progress: function (e, data){
var progress = parseInt(data.loaded / data.total * 100, 10);
$('.bar').css('width', progress + '%');
},
});
});
In your first snippet you are binding the submit function on your form, which I'm assuming has an id of #new_sound.
Update your second snippet to reflect that. It should look something like this:
$('#new_sound').submit(function() {
$('#new_sound').fileupload({
dataType: 'json',
add: function (e, data){
data.submit();
},
progress: function (e, data){
var progress = parseInt(data.loaded / data.total * 100, 10);
$('.bar').css('width', progress + '%');
},
});
});
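Another common jQuery File Upload idiom for the "wait until the user hits submit" requirement (a sketch, not from the original answer; the pendingUpload variable is mine, and handling the successful upload is left to a done callback): initialize the plugin once on page load, hold the file in the add callback, and only call data.submit() when the form is submitted.

$(function () {
  var pendingUpload = null; // holds the selected file until the user hits submit
  $('#new_sound').fileupload({
    dataType: 'json',
    add: function (e, data) {
      pendingUpload = data; // don't upload yet
    },
    progress: function (e, data) {
      var progress = parseInt(data.loaded / data.total * 100, 10);
      $('.bar').css('width', progress + '%');
    }
  });
  $('#new_sound').submit(function (e) {
    if (pendingUpload) {
      e.preventDefault(); // stop the normal form submission
      pendingUpload.submit(); // start the deferred upload instead
      pendingUpload = null;
    }
  });
});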
I'm looking to add functionality to my Rails app to upload files directly to Amazon S3. From my research, the general consensus seems to be to use the s3-swf-upload-plugin. I've set up a sample app using that gem, but I can't get it to play nicely with only allowing the selection of a single file. I'd also like to create a record post-upload and use Paperclip to create a thumbnail, for which I can find little guidance.
So my questions are:
(1) Am I on the right track using that gem, or should I be taking another approach?
(2) Are there any samples out there that I could use for reference?
Any assistance would be much appreciated.
Chris
Try a new gem called CarrierWaveDirect. It allows you to upload files directly to S3 using an HTML form, and makes it easy to move the image processing into a background process.
I'm not sure whether you can modify it easily to upload only one file at a time, but this gem works very well for me. It is based on one of Ryan Bates' Railscasts:
https://github.com/waynehoover/s3_direct_upload
Try looking into CarrierWave: https://github.com/jnicklas/carrierwave (it supports S3).
Multi-file uploads with CarrierWave and Uploadify: http://blog.assimov.net/post/4306595758/multi-file-upload-with-uploadify-and-carrierwave-on
If you are using Rails 3, please check out my sample projects:
Sample project using Rails 3, Flash and MooTools-based FancyUploader to upload directly to S3: https://github.com/iwasrobbed/Rails3-S3-Uploader-FancyUploader
Sample project using Rails 3, Flash/Silverlight/GoogleGears/BrowserPlus and jQuery-based Plupload to upload directly to S3: https://github.com/iwasrobbed/Rails3-S3-Uploader-Plupload
By the way, you can do post-processing with Paperclip using something like this blog post describes:
http://www.railstoolkit.com/posts/fancyupload-amazon-s3-uploader-with-paperclip
I have adapted Heroku's direct-to-S3 upload solution in Rails (which uses jQuery-File-Upload and the aws-sdk gem) so uploads to S3 can be made remotely using AJAX. I hope this is useful:
posts_controller.rb
class PostsController < ApplicationController
before_action :set_s3_direct_post, only: [:index, :create]
before_action :delete_picture_from_s3, only: [:destroy]
def index
.
.
end
def create
@post = @user.posts.build(post_params)
if @post.save
respond_to do |format|
format.html
format.js
end
end
end
def destroy
Post.find(params[:id]).destroy
end
private
def set_s3_direct_post
@s3_direct_post = S3_BUCKET.presigned_post(key: "uploads/#{SecureRandom.uuid}/${filename}", success_action_status: '201', acl: 'public-read')
end
def delete_picture_from_s3
key = params[:picture_url].split('amazonaws.com/')[1]
S3_BUCKET.object(key).delete
return true
rescue => e
# If anyone knows a good way to deal with a defunct file sitting in the bucket, please speak up.
return true
end
def post_params
params.require(:post).permit(:content, :picture_url)
end
end
posts.html.erb
<div class="info" data-url="<%= #s3_direct_post.url %>"
data-formdata="<%= (#s3_direct_post.fields.to_json) %>"
data-host="<%= URI.parse(#s3_direct_post.url).host %>">
</div>
The form
<%= form_for(:post, url: :posts, method: :post,
html: { class: "post_form", id: "post_form-#{post.id}" }
) do |f| %>
<%= f.text_area :content, id: "postfield-#{post.id}", class: "postText" %>
<%= f.button( :submit, name: "Post", title: "Post" ) do %>
<span class="glyphicon glyphicon-ok" aria-hidden="true"></span>
<% end %>
<span class="postuploadbutton" id="postUp-<%= post.id %>" title="Add file" >
<span class="glyphicon glyphicon-upload" aria-hidden="true"></span>
</span>
<span title="Cancel file" class="noticecancelupload" id="postCancel-<%= post.id %>" >
<span class="glyphicon glyphicon-remove-circle" aria-hidden="true"></span>
</span>
<%= f.file_field :picture_url, accept: 'image/jpeg,image/gif,image/png',
class: "notice_file_field", id: "postFile-#{post.id}" %>
<% end %>
_post.html.erb
<%= button_to post_path(
params: {
id: post.id,
picture_url: post.picture_url
}
),
class: 'btn btn-default btn-xs blurme',
data: { confirm: "Delete post: are you sure?" },
method: :delete do %>
<span class="glyphicon glyphicon-remove" aria-hidden="true"></span>
<% end %>
Javascript in each _post.html.erb
$(document).off('click',"#postUp-<%= post.id %>");
$(document).on('click', '#postUp-<%= post.id %>', function(e) {
prepareUpload("#post_form-<%= post.id %>");
$('#postFile-<%= post.id %>').trigger("click");
});
$(document).off('click',"#postCancel-<%= post.id %>");
$(document).on('click', '#postCancel-<%= post.id %>', function(e) {
$(".appendedInput").remove(); // $('#postFile-<% post.id %>').val(""); doesn't work for me
$('.progBar').css('background','white').text("");
});
$(document).off('submit',"#post_form-<%= post.id %>"); // without this the form submitted multiple times in production
$(document).on('submit', '#post_form-<%= post.id %>', function(e) { // don't use $('#post_form-<%= post.id %>').submit(function() { so it doesn't bind to the #post_form (so it still works after ajax loading)
e.preventDefault(); // prevent normal form submission
if ( validatePostForm('<%= post.id %>') ) {
$.ajax({
type: 'POST',
url: $(this).attr('action'),
data: $(this).serialize(),
dataType: 'script'
});
$('#postCancel-<%= post.id %>').trigger("click");
}
});
function validatePostForm(postid) {
if ( jQuery.isBlank($('#postfield-' + postid).val()) && jQuery.isBlank($('#postFile-' + postid).val()) ) {
alert("Write something fascinating or add a picture.");
return false;
} else {
return true;
}
}
Javascript in application.js
function prepareUpload(feckid) {
$(feckid).find("input:file").each(function(i, elem) {
var fileInput = $(elem);
var progressBar = $("<div class='progBar'></div>");
var barContainer = $("<div class='progress'></div>").append(progressBar);
fileInput.after(barContainer);
var maxFS = 10 * 1024 * 1024;
var info = $(".info");
var urlnumbnuts = info.attr("data-url");
var formdatanumbnuts = jQuery.parseJSON(info.attr("data-formdata"));
var hostnumbnuts = info.attr("data-host");
var form = $(fileInput.parents('form:first'));
fileInput.fileupload({
fileInput: fileInput,
maxFileSize: maxFS,
url: urlnumbnuts,
type: 'POST',
autoUpload: true,
formData: formdatanumbnuts,
paramName: 'file',
dataType: 'XML',
replaceFileInput: false,
add: function (e, data) {
$.each(data.files, function (index, file) {
if (file.size > maxFS) {
alert('Alas, the file exceeds the maximum file size of 10MB.');
form[0].reset();
return false;
} else {
data.submit();
return true;
}
});
},
progressall: function (e, data) {
var progress = parseInt(data.loaded / data.total * 100, 10);
progressBar.css('width', progress + '%')
},
start: function (e) {
progressBar.
css('background', 'orange').
css('display', 'block').
css('width', '0%').
text("Preparing...");
},
done: function(e, data) {
var key = $(data.jqXHR.responseXML).find("Key").text();
var url = '//' + hostnumbnuts + '/' + key;
var input = $('<input />', { type:'hidden', class:'appendedInput',
name: fileInput.attr('name'), value: url });
form.append(input);
progressBar.
css('background', 'green').
text("Ready");
},
fail: function(e, data) {
progressBar.
css("background", "red").
css("color", "black").
text("Failed");
}
});
});
} // function prepareUpload()
create.js.erb
$(".info").attr("data-formdata", '<%=raw #s3_direct_post.fields.to_json %>'); // don't use .data() to set attributes
$(".info").attr("data-url", "<%= #s3_direct_post.url %>");
$(".info").attr("data-host", "<%= URI.parse(#s3_direct_post.url).host %>");
$('.post_form')[0].reset();
$('.postText').val('');
application.js
//= require jquery-fileupload/basic
config/initializers/aws.rb
Aws.config.update({
region: 'us-east-1',
credentials: Aws::Credentials.new(ENV['AWS_ACCESS_KEY_ID'], ENV['AWS_SECRET_ACCESS_KEY']),
})
S3_BUCKET = Aws::S3::Resource.new.bucket(ENV['S3_BUCKET'])
Notes:
This solution is designed for multiple post forms on the index.html.erb page. This is why the @s3_direct_post information is placed inside a div of class info inside index.html.erb, rather than in each post form. This means there is only one @s3_direct_post presented on the page at any one time, irrespective of the number of forms on the page. The data inside the @s3_direct_post is only grabbed (with a call to prepareUpload()) upon clicking the file upload button. Upon submission, a fresh @s3_direct_post is generated in the posts controller, and the information inside .info is updated by create.js.erb. Storing the @s3_direct_post data inside the form would mean many different instances of @s3_direct_post could exist at once, leading to errors with file name generation.
You need to call set_s3_direct_post in both the posts controller's index action (ready for the first upload) and its create action (ready for the second and subsequent uploads).
Normal form submission is prevented by e.preventDefault(), so it can be done 'manually' with $.ajax(). Why not just use remote: true in the form? Because in Rails, file upload is done with an HTML request and page refresh even when you try to do it remotely.
Use info.attr() rather than info.data() to set and retrieve the @s3_direct_post attributes, because info.data doesn't get updated (for example, see this question). This means you also have to parse the attribute into an object manually using jQuery.parseJSON() (which .data() normally does automatically).
Don't use //= require jquery-fileupload in application.js; use //= require jquery-fileupload/basic, as above. This bug was a real ballache to identify (see here). The original Heroku solution didn't work until I changed this.
You can use Paperclip to upload to S3 (see the documentation) and to create thumbnails. It uploads to a temporary folder first, so image processing can be applied before the file is uploaded to S3.
As for examples of such a configuration, there are plenty throughout the blogosphere and on StackOverflow, e.g. this one.
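A minimal sketch of such a model (the attachment name, thumbnail style, and environment variable names are assumptions; see the Paperclip docs for the full set of options):

class Photo < ActiveRecord::Base
  # Thumbnails are generated during local processing, then pushed to S3.
  has_attached_file :image,
    styles: { thumb: "100x100#" },
    storage: :s3,
    s3_credentials: {
      bucket: ENV['S3_BUCKET'],
      access_key_id: ENV['AWS_ACCESS_KEY_ID'],
      secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
    }
  validates_attachment_content_type :image, content_type: /\Aimage\/.*\z/
end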