Rails 5.2: Trix with Active Storage and AWS S3

I am trying to upload images via my Trix editor and also want the images stored in AWS S3.
The images are successfully uploaded to Active Storage, but they are not getting uploaded to S3.
I do, however, see something like this in the Rails console: Generated URL for file at key: Gsgdc7Jp84wYTQ1W4s (https://bucket.s3.amazonaws.com/Gsgdc7Jp84wYT2Ya3gxQ1W4s?X-Amz-Algorithm=AWS4redential=AKIAX6%2F20200414%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20241821Z&X-Amz-Expires=300&X-Amz-SignedHeaders=content-md5%3Bcontent-type%3Bhost&X-Amz-Signature=3613d41915e47baaa7c90421eee3f0ffc)
I see that the Trix documentation provides attachments.js, which uploads to a cloud provider: https://trix-editor.org/js/attachments.js.
Below is the relevant part of my code, which uploads to Active Storage:
document.addEventListener('trix-attachment-add', function (event) {
  var file = event.attachment.file;
  if (file) {
    // Post the blob metadata to Rails, which issues a presigned URL
    // for the configured service; the file is then uploaded to it.
    var upload = new window.ActiveStorage.DirectUpload(
      file, '/rails/active_storage/direct_uploads', window);
    upload.create((error, attributes) => {
      if (error) {
        return false;
      } else {
        // Point the attachment at the blob's redirect URL.
        return event.attachment.setAttributes({
          url: `/rails/active_storage/blobs/${attributes.signed_id}/${attributes.filename}`,
          href: `/rails/active_storage/blobs/${attributes.signed_id}/${attributes.filename}`,
        });
      }
    });
  }
});
Below are my questions:
1) If Active Storage is configured to upload to S3, do I still need attachments.js?
2) Active Storage is configured to upload to S3, and I see the above response in the Rails console, but I do not see the file in S3. Why not?
Any help in fixing this would be really great. Thanks.
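For what it's worth: with Active Storage direct uploads, attachments.js is not needed. The DirectUpload object itself sends the file from the browser to S3 via the presigned URL, and the "Generated URL for file at key" log line only shows that URL being created, not that the upload completed. When the blob record exists but the object never appears in the bucket, the browser's PUT to S3 is usually failing silently, most often because of the bucket's CORS configuration. Below is a sketch of the same handler with a delegate that surfaces the S3 request; the delegate hooks are part of @rails/activestorage, while the logging is purely illustrative:

document.addEventListener('trix-attachment-add', function (event) {
  var file = event.attachment.file;
  if (!file) return;

  var upload = new window.ActiveStorage.DirectUpload(
    file,
    '/rails/active_storage/direct_uploads',
    {
      // Fires just before the browser PUTs the file to the storage
      // service. If this request errors, the blob record exists but
      // the object never reaches S3.
      directUploadWillStoreFileWithXHR: function (xhr) {
        xhr.upload.addEventListener('progress', function (e) {
          console.log('S3 upload progress:', e.loaded, '/', e.total);
        });
        xhr.addEventListener('error', function () {
          console.error('S3 PUT failed; check the bucket CORS policy');
        });
      }
    }
  );

  upload.create(function (error, attributes) {
    if (error) return;
    event.attachment.setAttributes({
      url: `/rails/active_storage/blobs/${attributes.signed_id}/${attributes.filename}`,
      href: `/rails/active_storage/blobs/${attributes.signed_id}/${attributes.filename}`,
    });
  });
});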

Related

Multer fileFilter callback doesn't throw error

I have this fileFilter code for multer. My problem is that when I call back an error, my Express app gets stuck, and the page eventually gives ERR_CONNECTION_RESET. This happens if I try to upload anything other than a JPEG.
const upload = multer({
  storage: storage,
  fileFilter: function (req, file, cb) {
    // Always passes an error to the callback, regardless of the file.
    return cb(new ExpressError("Only images are allowed", 400));
  },
});
The storage is a Cloudinary storage, which looks like this:
const { CloudinaryStorage } = require("multer-storage-cloudinary");
const storage = new CloudinaryStorage({
  cloudinary,
  params: {
    folder: "BilbaoBarrios",
    allowedFormats: ["jpeg", "png", "jpg"],
  },
});
Also, strangely, if I declare the storage variable after fileFilter, it also works with PNGs, but still not with any other file format, so declaration order seems to matter here.
Thank you for your time.
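A note on multer's fileFilter contract: cb(null, true) accepts the file, cb(null, false) rejects it silently, and cb(err) aborts the request with that error, which then has to reach Express error-handling middleware; if nothing handles it, the client keeps streaming the multipart body into a dead connection, which matches the ERR_CONNECTION_RESET behavior. A sketch under those assumptions; app, the "image" field name, and the route are placeholders, while ExpressError and storage come from the snippets above:

const multer = require("multer");

const upload = multer({
  storage: storage,
  fileFilter: function (req, file, cb) {
    // cb(null, true) accepts, cb(null, false) skips the file,
    // cb(err) aborts the whole request with that error.
    if (["image/jpeg", "image/png"].includes(file.mimetype)) {
      return cb(null, true);
    }
    return cb(new ExpressError("Only images are allowed", 400));
  },
});

// Invoke the middleware manually so its error reaches a handler
// instead of stalling the request.
app.post("/upload", function (req, res, next) {
  upload.single("image")(req, res, function (err) {
    if (err) return next(err); // forwarded to Express error middleware
    res.send("uploaded");
  });
});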
Just to confirm, is Cloudinary being accessed within an in-office platform and behind a firewall (or VPN)? In case you are accessing it behind a firewall, delivering images & other Cloudinary assets sometimes requires whitelisting the Cloudinary domain (res.cloudinary.com) on your firewall.
Also, you can check out the sample I have on this link and see if it is blocked as well: https://codesandbox.io/s/upload-feature-demo-ckvgi?file=/src/CldUploaderForm.js

Deleting object in amazon s3 through trix

I'm using Action Text with Trix for my rich text content, and I'm able to upload images in my Trix editor, which are then uploaded to my Amazon S3 bucket successfully.
I'd like to be able to delete the corresponding object in S3 when a user deletes the image in the editor.
I'm using the AWS SDK for JS and I've set my parameters:
window.addEventListener("trix-attachment-remove", function (event) {
  var AWS = require('aws-sdk');
  AWS.config.credentials = {
    accessKeyId: gon.accessKey,
    secretAccessKey: gon.secretKey,
    region: 'us-west-1'
  };
  console.log(event.attachment.getAttributes());
  var s3 = new AWS.S3();
  var params = { Bucket: gon.bucketName, Key: '#object-name#' };
  s3.deleteObject(params, function (err, data) {
    if (err) console.log(err, err.stack); // error
    else console.log();                   // deleted
  });
});
So my only issue now is getting the key for the object, which I understand is the object name. Here's a sample of my objects in the S3 bucket with their names:
And I'm trying to get the attributes of the file I'm removing, from this code:
event.attachment.getAttributes()
And here's what I'm getting:
There's no way the sgid or the string at the end of the URL matches any of the object names; they're simply too long. How do I get the object name in S3 when I'm removing the object?
Also, as an additional note: if I use the object name taken directly from the S3 bucket as the key, the delete succeeds, so I know the call works; I just need to get the correct object name.
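One caveat worth noting: the S3 object name is the blob's key column, which Active Storage keeps server-side and never embeds in the attachment's URL or sgid, so it cannot be recovered in the browser. Routing the delete through Rails also avoids shipping AWS credentials to the client via gon: post the sgid to an endpoint that locates the blob (for example via ActionText::Attachable.from_attachable_sgid) and calls purge, which removes the S3 object along with the blob record. A sketch, with /trix_attachments as a hypothetical route:

window.addEventListener("trix-attachment-remove", function (event) {
  var attrs = event.attachment.getAttributes();

  // Let Rails resolve the blob from the sgid and purge it;
  // purge deletes the S3 object as well as the blob record.
  fetch("/trix_attachments", {   // hypothetical endpoint
    method: "DELETE",
    headers: {
      "Content-Type": "application/json",
      "X-CSRF-Token": document.querySelector('meta[name="csrf-token"]').content
    },
    body: JSON.stringify({ sgid: attrs.sgid })
  });
});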

admin on rest uploading images through rest

Everyone, I'm using admin-on-rest for front-end development; the back end is Ruby on Rails. I want to upload images to the cloud. I am able to upload to the cloud, but I need to send the image URL to my back end so I can store it. Here is my code:
<FlatButton style={styles.button} label="Upload Image" primary onClick={this.handleClick} />

this.handleClick = () => {
  window.cloudinary.openUploadWidget(
    { cloud_name: 'dsaf', upload_preset: 'dsafds', cropping: 'das', 'tags': 'asdf' },
    function (error, result) {
      return result;
    }
  );
};
I am not able to send the image URL to the back end. Can anyone help me?
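Returning result from the widget callback goes nowhere, since the callback runs asynchronously; the URL has to be sent to the back end from inside it. With the classic upload widget, result is an array of descriptors for the uploaded files, each carrying a secure_url. A sketch under that assumption, with /api/images as a placeholder for the Rails route:

this.handleClick = () => {
  window.cloudinary.openUploadWidget(
    { cloud_name: 'dsaf', upload_preset: 'dsafds', cropping: 'das', tags: 'asdf' },
    (error, result) => {
      if (error || !result || !result.length) return;
      // Persist the CDN URL of the uploaded image on the back end.
      fetch('/api/images', {   // placeholder endpoint
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ image: { url: result[0].secure_url } })
      });
    }
  );
};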

How to upload large files directly to S3 via Rails in an AngularJS App

In my project I have a requirement to allow users to upload very large files (up to 4GB) to S3.
My current setup includes Rails 4.2, S3, Paperclip, and Mongoid for the API, and AngularJS (with this upload library) for the front end. Currently I can upload regular-sized files (up to 20MB) perfectly fine.
So, I need to allow front-end users to upload zip files from an AngularJS app, through a Rails API, to S3.
There is no need for file processing on the server, so all the server needs to do with the file is keep track of its URL (the location on S3) and file name so these details can be referenced later on.
After some searching I found this tutorial and this gem, which suggest uploading the files directly to S3, bypassing the app altogether.
In my Angular controller I have this code to upload small files:
$scope.submitItem = function (file, file2, file3) {
  Upload.upload({
    url: API_PROVIDER.full_path + 'instant_downloads.json',
    data: {
      file: file, data: $scope.instant_download,
      categories: $scope.instant_download.categories, avatar: file2,
      subcategories: $scope.subcategories, zipfile: file3,
      options: $scope.instant_download.options
    }
  }).then(function (result) {
    // success
    if (result.data.success) {
      $scope.showAlert('Instant Download created successfully', 'success');
      $state.go('app.instant_downloads.index');
    }
  }, function (result) {
    // error
    angular.forEach(result.message, function (value, key) {
      $scope.showAlert(_.capitalize(key) + ': ' + value.join(', '), 'danger');
    });
  }, function (evt) {
    // upload progress
    $scope.progressPercentage = parseInt(100.0 * evt.loaded / evt.total);
  });
};
The above code works well for small files, but if I try it on a large file I get nginx timeout errors.
I then tried to use the aforementioned gem (while still using the same AngularJS code) to bypass the server, but I get an error on the S3 side (it seems it also times out).
Can someone point out how I can upload directly to S3 using the AngularJS controller above, with a callback to the server so I can get confirmation that the file has been uploaded successfully along with its URL?
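A common shape for this is a presigned PUT: the Rails API signs a URL for the object (a single PUT is fine up to S3's 5GB per-object limit), the browser streams the file straight to S3 so nginx never proxies the body, and a final request tells the API the key so it can record the location. A sketch, assuming a hypothetical presign endpoint and using ng-file-upload's Upload.http to send the raw binary body:

$scope.submitLargeFile = function (zipfile) {
  // Hypothetical endpoint returning { url: <presigned PUT URL>, key: <S3 key> }
  $http.get(API_PROVIDER.full_path + 'uploads/presign.json', {
    params: { filename: zipfile.name, content_type: zipfile.type }
  }).then(function (response) {
    var presign = response.data;
    // Upload.http sends the file as the raw request body, so the
    // bytes go from the browser to S3 and never through Rails/nginx.
    return Upload.http({
      url: presign.url,
      method: 'PUT',
      headers: { 'Content-Type': zipfile.type },
      data: zipfile
    }).progress(function (evt) {
      $scope.progressPercentage = parseInt(100.0 * evt.loaded / evt.total);
    }).then(function () {
      // Confirm with the API so it can record the S3 key and filename.
      return $http.post(API_PROVIDER.full_path + 'instant_downloads.json', {
        zipfile_key: presign.key, filename: zipfile.name
      });
    });
  });
};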

Carrierwave and Amazon S3: retrieve image

I have a problem with S3 and CarrierWave:
I have a pseudo-form that uploads data and files. I wrote "pseudo" because it's an AJAX form, so the data is sent to Rails with a jQuery POST request. Files cannot be uploaded this way, so I have a popup window that uploads files to Rails; I save a reference to the uploaded files in the session, and when the AJAX request uploads the rest of the form, I link the uploaded files to the rest of the data.
With storage :file it works without any problems; when I receive the file I do:
uploader = ImgObjUploader.new
uploader.store!(params[:image_form][:image])
session["image"] = uploader.url
and then, when I get the rest of the data:
if session[:image] != nil
  obj.image = File.open(session[:image])
end
And my model is:
mount_uploader :image, ImgObjUploader
This code works without any problems. For Amazon S3 I switched to:
uploader = ImgObjUploader.new
uploader.retrieve_from_store!(session[:image])
puts uploader
#obj.image = uploader
obj.image = uploader.url
but it doesn't work. I don't get an error, but the image is not saved inside the obj object. puts uploader prints the Amazon S3 URL.
Can anyone help me?
Thank you.
