I'm using CarrierWave on Rails to handle file uploads.
It's working well, and I want to add a feature to my app that displays the uploaded file with jQuery (a kind of preview of the uploaded file).
To do that, I have to pass the uploaded file's URL to the script.
Unfortunately, I have no idea how to do it.
File upload model
class OrderFile < ActiveRecord::Base
belongs_to :order
validates :file, presence: true
mount_uploader :file, StLuploaderUploader
end
Form to upload new file
<%= form_for OrderFile.new, html: { multipart: true }, :url => url_for(:controller => 'order_files', :action => 'create') do |f| %>
<%= f.file_field :file, class: "form-control" %>
<% end %>
jQuery File Upload call
jQuery(function() {
return $('#new_order_file').fileupload({
autoUpload: true,
add: function(e, data) {
var file, types;
types = /(\.|\/)(png|jpg)$/i;
file = data.files[0];
if (types.test(file.type) || types.test(file.name)) {
data.context = $(tmpl("template-upload", file));
$('.CO-file_upload_progress').append(data.context);
data.submit();
} else {
return alert("File type not supported");
}
},
progress: function(e, data) {
var progress;
if (data.context) {
progress = parseInt(data.loaded / data.total * 100, 10);
return data.context.find('.bar').css('width', progress + '%');
}
},
done: function(e, data) {
$.each(data.files, function(index, file) {
prepare_file(file); // passing file to the function
});
}
});
});
Function to preview file
function prepare_file(file) {
loaded(file.url, "model-preview"); // I need an absolute path to the uploaded file to be passed here - file.url is not working
};
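One way to get that URL (a sketch, not taken from the original post): have the create action respond with JSON that includes the CarrierWave URL, and read it from data.result in the done callback instead of from data.files. The { url: ... } response shape below is an assumption for illustration.
// Assumes OrderFilesController#create renders something like
//   render json: { url: @order_file.file.url }, status: :created
// (hypothetical response shape; adjust to your controller).
jQuery(function() {
  $('#new_order_file').fileupload({
    dataType: 'json',   // parse the JSON response so data.result is available
    autoUpload: true,
    // keep the add/progress callbacks from the snippet above
    done: function(e, data) {
      // data.result is the parsed server response, so the absolute
      // CarrierWave URL can be handed straight to the preview helper
      prepare_file({ name: data.files[0].name, url: data.result.url });
    }
  });
});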
I'm making an image library type thing in Rails and Vue, and I'm using DirectUpload to manage attachments.
# photo.rb
class Photo < ApplicationRecord
has_one_attached :file
# ...
end
# photos_controller.rb
class PhotosController < ApplicationController
load_and_authorize_resource
before_action :set_photo, only: %i[show update destroy]
protect_from_forgery unless: -> { request.format.json? }
def index
@photo = current_user.photos.new
render action: 'index'
end
def create
@photo = current_user.photos.create(photo_params)
render json: PhotoBlueprint.render(@photo, root: :photo)
end
# ...
def photo_params
params.require(:photo).permit(:id, :file)
end
end
# photos/index.html.erb
<%= simple_form_for(@photo, url: photos_path(@photo), html: { multipart: true }) do |f| %>
<%= f.file_field :file, multiple: true, direct_upload: true, style: 'display: none;' %>
<% end %>
<div id='photos-app'></div>
// UserFileLib.vue
<script>
import { mapState, mapActions } from 'pinia'
import { usePhotoStore } from '@stores/photo'
import { DirectUpload } from '@rails/activestorage'
export default {
name: 'UserFileLib',
computed: {
...mapState(usePhotoStore, [
'photos'
]),
url () {
return document.getElementById('photo_file').dataset.directUploadUrl
},
input () {
return document.getElementById('file-input')
},
},
mounted () {
this.getPhotos()
},
methods: {
...mapActions(usePhotoStore, [
'addPhoto',
'getPhotos',
]),
activestorageURL (blob) {
return `/rails/active_storage/blobs/redirect/${blob.signed_id}/${blob.filename}`
},
uploadToActiveStorage () {
const file = this.input.files[0]
const upload = new DirectUpload(file, this.url)
upload.create((error, blob) => {
if (error) {
console.log(error)
} else {
const url = this.activestorageURL(blob)
console.log(url)
this.getPhotos()
}
})
},
openFileBrowser () {
this.input.click()
},
formatSize (bytes) {
return Math.round(bytes / 1000)
}
}
}
</script>
<template>
<div
@click="openFileBrowser"
class="card p-3">
Click or drop files here
</div>
<input
type="file"
:multiple="true"
@change="uploadToActiveStorage"
id="file-input" />
<div class="grid is-inline-grid mt-2">
<div
class="image-container"
v-for="image in photos"
:key="image.id">
<img :src="image.url" :alt="image.label" />
<div class="filename">
<strong>{{ image.label }}</strong>
<br />
{{ formatSize(image.size) }} kb
</div>
<div class="close">
×
</div>
</div>
</div>
</template>
Now, the uploads work fine, the blob is stored correctly.
My issue is that a new Photo object is not created to wrap the attachment, meaning the uploads are lost in the system and have no parent record.
What am I doing wrong?
I've solved this for anyone else looking for help. The logic is to create or update the parent record after the upload is done. I missed this in the official documentation.
upload.create((error, blob) => {
if (error) {
// Handle the error
} else {
// ** This is the way **
// Add an appropriately-named hidden input to the form with a
// value of blob.signed_id so that the blob ids will be
// transmitted in the normal upload flow
// ** End of **
//
const hiddenField = document.createElement('input')
hiddenField.setAttribute("type", "hidden");
hiddenField.setAttribute("value", blob.signed_id);
hiddenField.name = input.name
document.querySelector('form').appendChild(hiddenField)
}
})
Since I'm using Vue and Pinia I made a solution in keeping with that logic:
// UserImageLib.vue
uploadToActiveStorage (input, file) {
const url = input.dataset.directUploadUrl
const upload = new DirectUpload(file, url)
upload.create((error, blob) => {
if (error) {
console.log(error)
} else {
const params = { [input.name]: blob.signed_id }
this.createPhoto(params)
}
})
},
// stores/photo.js
addPhoto (payload) {
this.photos.unshift(payload)
},
createPhoto (payload) {
http.post(`/photos`, payload).then(res => {
const photo = res.data.photo
this.addPhoto(photo)
})
},
I have a working version of the Active Storage example using S3 found here:
https://edgeguides.rubyonrails.org/active_storage_overview.html
Now I want to be able to perform the file upload not when I finish filling in the form, but immediately after the user selects a file to upload.
Actually, in my case I have a WYSIWYG editor with an on-drop event that fires:
var myCodeMirror = CodeMirror.fromTextArea(post_body, {
lineNumbers: true,
dragDrop: true
});
myCodeMirror.on('drop', function(data, e) {
var file;
var files;
// Check if files were dropped
files = e.dataTransfer.files;
if (files.length > 0) {
e.preventDefault();
e.stopPropagation();
file = files[0];
console.log('File: ' + file.name);
console.log('File: ' + file.type);
return false;
}
});
So, since the file drop triggers this event, is there a way for me to then hand this file to Active Storage somehow so it will start uploading to S3 right away?
Triggering uploads from the client-side
Active Storage exposes the DirectUpload JavaScript class which you can use to trigger a file upload directly from the client-side.
You can leverage this for integrations with third-party plugins (e.g. Uppy, Dropzone) or with your own custom JS code.
Using DirectUpload
The first thing you need to do is make sure that AWS S3 is set up to handle direct uploads. This requires ensuring your CORS configuration is set up properly.
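For reference, the bucket's CORS policy for direct uploads typically looks something like the following (a sketch along the lines of the Rails guides; the origin is a placeholder you must replace with your own domain):
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["PUT"],
    "AllowedOrigins": ["https://www.example.com"],
    "ExposeHeaders": ["Origin", "Content-Type", "Content-MD5", "Content-Disposition"],
    "MaxAgeSeconds": 3600
  }
]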
Next, instantiate the DirectUpload class, passing it the file to upload and the upload URL.
import { DirectUpload } from "activestorage"
// your form needs the file_field direct_upload: true, which
// provides data-direct-upload-url
const input = document.querySelector('input[type=file]')
const url = input.dataset.directUploadUrl
const upload = new DirectUpload(file, url)
upload.create((error, blob) => {
// handle errors OR persist to the model using 'blob.signed_id'
})
See full documentation here:
https://edgeguides.rubyonrails.org/active_storage_overview.html#integrating-with-libraries-or-frameworks
The DirectUpload#create method initiates the upload to S3 and returns with an error or the uploaded file blob.
Assuming there are no errors, the last step is to persist the uploaded file to the model. You can do this using blob.signed_id and putting it into a hidden field somewhere on the page OR with an AJAX request to update your model.
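For example, the AJAX variant might look roughly like this (a sketch; the PATCH /posts/42 endpoint, the post[attachment] parameter, and the csrf-token meta tag are assumptions about your own app, not part of Active Storage itself):
upload.create((error, blob) => {
  if (error) { return console.error(error) }
  // Send the signed id to your own controller; assigning it to a
  // has_one_attached attribute on the server attaches the blob.
  fetch('/posts/42', {
    method: 'PATCH',
    headers: {
      'Content-Type': 'application/json',
      // assumes csrf_meta_tags is rendered in the layout
      'X-CSRF-Token': document.querySelector('meta[name="csrf-token"]').content
    },
    body: JSON.stringify({ post: { attachment: blob.signed_id } })
  })
})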
Uploading a file on drop
In the case above, to start the direct upload on drop, simply put the code above into the drop handler.
Something like this:
myCodeMirror.on('drop', function(data, e) {
// Get the file
var file = e.dataTransfer.files[0];
// You need a file input somewhere on the page...
const input = document.querySelector('input[type=file]')
const url = input.dataset.directUploadUrl
// Instantiate the DirectUploader object
const upload = new DirectUpload(file, url)
// Upload the file
upload.create((error, blob) => { ... })
});
Using the asset pipeline
If you are just using the asset pipeline and not a JavaScript bundler, then you create instances of the DirectUpload class like this:
const upload = new ActiveStorage.DirectUpload(file, url)
The main problem in this topic is that you cannot import DirectUpload in the inline JavaScript section of the form. But we can create an ImmediateUploader object as follows:
Global JavaScript part
upload/uploader.js
import { DirectUpload } from "@rails/activestorage"
export default class Uploader {
constructor(file, url) {
this.file = file
this.url = url
this.directUpload = new DirectUpload(this.file, this.url, this)
}
upload() {
return new Promise((resolve, reject) => {
this.directUpload.create((error, blob) => {
if (error) {
// Handle the error
reject(error)
} else {
// Add an appropriately-named hidden input to the form
// with a value of blob.signed_id
resolve(blob)
}
})
})
}
}
upload/index.js
import Uploader from './uploader.js'
export default {
upload (file, url) {
const uploader = new Uploader(file, url)
return uploader.upload()
}
}
application.js
window.ImmediateUploader = require('./upload');
Form part
Now we can use ImmediateUploader to upload selected files directly to Active Storage and update the image right after the upload completes, without submitting the form:
<%= simple_form_for(resource, as: resource_name, url: registration_path(resource_name), html: { method: :put }) do |f| %>
<%= f.error_notification %>
<div class="form-inputs">
<div class="row">
<img id="avatar" class="centered-and-cropped" width="100" height="100" style="border-radius:50%" src="<%= url_for(user.photo) %>">
<button type="button" class="btn" onclick="event.preventDefault(); document.getElementById('user_photo').click()">Change avatar</button>
</div>
<%= f.file_field :photo, direct_upload: true, class: "hiddenfile" %>
</div>
<div class="form-actions">
<%= f.button :submit, t(".update"), class: 'btn btn-primary' %>
</div>
<% end %>
<% content_for :js do %>
<script>
const input = document.querySelector('input[type=file]')
input.addEventListener('change', (event) => {
Array.from(input.files).forEach(file => uploadFile(file))
// clear uploaded files from the input
input.value = null
})
const uploadFile = (file) => {
// your form needs the file_field direct_upload: true, which
// provides data-direct-upload-url
const url = input.dataset.directUploadUrl;
ImmediateUploader.default.upload(file, url)
.then(blob => {
// get blob.signed_id and add it to form values to submit form
const hiddenField = document.createElement('input')
hiddenField.setAttribute("type", "hidden");
hiddenField.setAttribute("value", blob.signed_id);
hiddenField.name = input.name
document.querySelector('form').appendChild(hiddenField)
// Update new avatar Immediately
document.getElementById('avatar').src = '/rails/active_storage/blobs/' + blob.signed_id + '/' + blob.filename;
// Update photo in Database
axios.post('/users/photo', { 'photo': blob.signed_id }).then(response => {});
});
}</script>
<% end %>
Controller:
class RegistrationController < Devise::RegistrationsController
def update
super
@user = current_user
@user.avatar = url_for(@user.photo.variant(resize_to_limit: [300, 300]).processed) if @user.photo.attached?
@user.save
end
def updatephoto
@photo = params[:photo]
@user = current_user
@user.photo = @photo
@user.save
@user = current_user
@user.avatar = url_for(@user.photo.variant(resize_to_limit: [300, 300]).processed) if @user.photo.attached?
@user.save
end
end
I have spent days now trying to make this work. I am getting this error
OPTIONS https://bucketname.s3.oregon.amazonaws.com/ net::ERR_NAME_RESOLUTION_FAILED
I am using Version 43.0.2357.130 Ubuntu 14.04 (64-bit)
Gemfile:
gem "jquery-fileupload-rails"
gem 'aws-sdk'
application.js (after jquery):
//= require jquery-fileupload/basic
application.css:
*= require jquery.fileupload
*= require jquery.fileupload-ui
I have a model called Upload that I generated a scaffold for like this:
rails generate scaffold Upload upload_url:string
uploads_controller.rb:
def new
@s3_direct_post = Aws::S3::PresignedPost.new(Aws::Credentials.new(ENV['AWS_S3_ACCESS_KEY_ID'], ENV['AWS_S3_SECRET_ACCESS_KEY']),
"Oregon", ENV['AWS_S3_BUCKET'], {
key: '/uploads/object/test.test',
content_length_range: 0..999999999,
acl: 'public-read',
success_action_status: "201",
})
@upload = Upload.new
end
_form.html.erb (for uploads):
<%= form_for(@upload, html: { class: "directUpload" }) do |f| %>
......
<div class="field">
<%= f.label :upload_url %><br>
<%= f.file_field :upload_url %>
</div>
......
<%= content_tag "div", id: "upload_data", data: {url: #s3_direct_post.url, form_data: #s3_direct_post.fields } do %>
<% end %>
application.js (in the end):
$( document ).ready(function() {
$(function() {
$('.directUpload').find("input:file").each(function(i, elem) {
var fileInput = $(elem);
var form = $(fileInput.parents('form:first'));
var submitButton = form.find('input[type="submit"]');
var progressBar = $("<div class='bar'></div>");
var barContainer = $("<div class='progress'></div>").append(progressBar);
fileInput.after(barContainer);
fileInput.fileupload({
fileInput: fileInput,
url: $('#upload_data').data('url'),
type: 'POST',
autoUpload: true,
formData: $('#upload_data').data('form-data'),
paramName: 'file', // S3 does not like nested name fields i.e. name="user[avatar_url]"
dataType: 'XML', // S3 returns XML if success_action_status is set to 201
replaceFileInput: false,
progressall: function (e, data) {
var progress = parseInt(data.loaded / data.total * 100, 10);
progressBar.css('width', progress + '%')
},
start: function (e) {
submitButton.prop('disabled', true);
progressBar.
css('background', 'green').
css('display', 'block').
css('width', '0%').
text("Loading...");
},
done: function(e, data) {
submitButton.prop('disabled', false);
progressBar.text("Uploading done");
// extract the uploaded file's URL from S3's XML response
var url = decodeURIComponent($(data.jqXHR.responseXML).find("Location").text());
// create a hidden field so the URL is submitted with the Rails form
var input = $("<input />", { type: 'hidden', name: fileInput.attr('name'), value: url });
form.append(input);
},
fail: function(e, data) {
submitButton.prop('disabled', false);
progressBar.
css("background", "red").
text("Failed");
}
});
});
});
});
Seriously, what can I do to fix this?
My guess is that you have misconfigured your bucket name / endpoint. The error means the browser cannot resolve a DNS name for https://bucketname.s3.oregon.amazonaws.com/.
It seems to me you need to replace bucketname with your actual bucket name and also drop oregon from the URL. Given that your bucket is named aymansalah, the URL will be: https://aymansalah.s3.amazonaws.com/
Review the Aws::Credentials documentation and check your environment variables to arrive at that URL.
I found the problem. Thanks a lot to felixbuenemann, a collaborator on jquery-fileupload-rails.
Although that is what I see in the bucket properties (it says Region: Oregon), I have to use "us-west-2" according to this Amazon region documentation.
uploads_controller.rb is now:
def new
@s3_direct_post = Aws::S3::PresignedPost.new(Aws::Credentials.new(ENV['AWS_S3_ACCESS_KEY_ID'], ENV['AWS_S3_SECRET_ACCESS_KEY']),
"us-west-2", ENV['AWS_S3_BUCKET'], {
key: '/uploads/object/test.test',
content_length_range: 0..999999999,
acl: 'public-read',
success_action_status: "201",
})
@upload = Upload.new
end
I need help providing a content type to Amazon via a client-side jQuery upload form. I need to add the content type because I'm uploading audio files that will not play in jPlayer for IE10 unless the content type is properly set. I used the blog post by pjambet - http://pjambet.github.io/blog/direct-upload-to-s3/ - to get up and running (excellent post, btw). It seems, though, that the order of the fields is extremely important. I've been trying to insert a hidden input tag either containing the relevant content type (audio/mpeg3, I think) or blank, to be populated by my upload script. No luck. The upload hangs when the extra fields are added.
direct-upload-form.html.erb
<form accept-charset="UTF-8" action="http://my_bucket.s3.amazonaws.com" class="direct-upload" enctype="multipart/form-data" method="post"><div style="margin:0;padding:0;display:inline"></div>
<%= hidden_field_tag :key, "${filename}" %>
<%= hidden_field_tag "AWSAccessKeyId", ENV['AWS_ACCESS_KEY_ID'] %>
<%= hidden_field_tag :acl, 'public-read' %>
<%= hidden_field_tag :policy %>
<%= hidden_field_tag :signature %>
<%= hidden_field_tag :success_action_status, "201" %>
<%= file_field_tag :file %>
<div class="row-fluid">
<div class="progress hide span8">
<div class="bar"></div>
</div>
</div>
</form>
audio-upload.js
$(function() {
$('input[type="submit"]').attr("disabled","true");
$('input[type="submit"]').val("Please upload audio first");
if($('#demo_audio').val() != ''){
var filename = $('#demo_audio').val().split('/').pop().split('%2F').pop();
$('#file_status').removeClass('label-info').addClass('label-success').html(filename + ' upload complete');
}
$('.direct-upload').each(function() {
var form = $(this)
$(this).fileupload({
url: form.attr('action'),
type: 'POST',
autoUpload: true,
dataType: 'xml', // This is really important as s3 gives us back the url of the file in a XML document
add: function (event, data) {
$.ajax({
url: "/signed_urls",
type: 'GET',
dataType: 'json',
data: {doc: {title: data.files[0].name}}, // send the file name to the server so it can generate the key param
async: false,
success: function(data) {
// Now that we have our data, we update the form so it contains all
// the needed data to sign the request
form.find('input[name=key]').val(data.key)
form.find('input[name=policy]').val(data.policy)
form.find('input[name=signature]').val(data.signature)
}
})
data.form.find('#content-type').val(data.files[0].type)
data.submit();
},
send: function(e, data) {
var filename = data.files[0].name;
$('input[type="submit"]').val("Please wait until audio uploaded is complete...");
$('#file_status').addClass('label-info').html('Uploading ' + filename);
$('.progress').fadeIn();
},
progress: function(e, data){
// This is what makes everything really cool, thanks to that callback
// you can now update the progress bar based on the upload progress
var percent = Math.round((data.loaded / data.total) * 100)
$('.bar').css('width', percent + '%')
},
fail: function(e, data) {
console.log('fail')
},
success: function(data) {
// Here we get the file url on s3 in an xml doc
var url = $(data).find('Location').text()
$('#demo_audio').val(url) // Update the real input in the other form
},
done: function (event, data) {
$('input[type="submit"]').val("Create Demo");
$('input[type="submit"]').removeAttr("disabled");
$('.progress').fadeOut(300, function() {
$('.bar').css('width', 0);
var filename = data.files[0].name;
$('span.filename').html(filename);
$('#file_status').removeClass('label-info').addClass('label-success').html(filename + ' upload complete');
$('#file').hide();
})
},
})
})
})
signed_urls_controller.rb
class SignedUrlsController < ApplicationController
def index
render json: {
policy: s3_upload_policy_document,
signature: s3_upload_signature,
key: "uploads/#{SecureRandom.uuid}/#{params[:doc][:title]}",
success_action_redirect: "/"
}
end
private
# generate the policy document that amazon is expecting.
def s3_upload_policy_document
Base64.encode64(
{
expiration: 30.minutes.from_now.utc.strftime('%Y-%m-%dT%H:%M:%S.000Z'),
conditions: [
{ bucket: ENV['AWS_S3_BUCKET'] },
{ acl: 'public-read' },
["starts-with", "$key", "uploads/"],
{ success_action_status: '201' }
]
}.to_json
).gsub(/\n|\r/, '')
end
# sign our request by Base64 encoding the policy document.
def s3_upload_signature
Base64.encode64(
OpenSSL::HMAC.digest(
OpenSSL::Digest::Digest.new('sha1'),
ENV['AWS_SECRET_ACCESS_KEY'],
s3_upload_policy_document
)
).gsub(/\n/, '')
end
end
As mentioned in the comments section for the above question, two changes are required to set the Content-Type for the uploaded content to audio/mpeg3.
The policy for the S3 POST API call must be changed to accept an additional "Content-Type" value. In the sample code, this can be achieved by adding the following condition to the conditions array in the s3_upload_policy_document method: ["eq", "$Content-Type", "audio/mpeg3"]
The "Content-Type" variable must be included with the POST request to S3. In the jQuery file uploader plugin this can be achieved by adding a hidden field to the form that is sent to S3, with the name "Content-Type" and the value "audio/mpeg3".
I want to make a simple form to upload an audio file, and I want to show a progress bar for the upload when the user submits the file. I only want to submit one file at a time.
My _upload.html.erb:
<%= form_for Sound.new do |f| %>
<%= f.file_field :fichier, name: 'sound[fichier]', :required => true %><br />
<%= f.text_field :title, :placeholder => 'Titre', :size => 10, :required => true %><br />
<%= f.submit 'Envoyer' %>
<%end%>
<div class="progress"><div class="bar" style="width: 0%;"></div></div>
My JS file:
$('#new_sound').submit(function() {
$('#new_sound').fileupload({
dataType: 'json',
progress: function (e, data){
var progress = parseInt(data.loaded / data.total * 100, 10);
$('.bar').css('width', progress + '%');
},
});
});
EDIT: (I realize that I forgot my question.) Actually it's not really a question; I just don't get why it doesn't work. With the JS below it works, but I want to wait until the user hits submit before the file is uploaded:
$(function () {
$('#new_sound').fileupload({
dataType: 'json',
add: function (e, data){
data.submit();
},
progress: function (e, data){
var progress = parseInt(data.loaded / data.total * 100, 10);
$('.bar').css('width', progress + '%');
},
});
});
In your first snippet you are binding the submit handler on your form, which I'm assuming has an id of #new_sound.
Update your second snippet to reflect that. It should look something like this:
$('#new_sound').submit(function() {
$('#new_sound').fileupload({
dataType: 'json',
add: function (e, data){
data.submit();
},
progress: function (e, data){
var progress = parseInt(data.loaded / data.total * 100, 10);
$('.bar').css('width', progress + '%');
},
});
});