admin-on-rest: uploading images through REST - ruby-on-rails

Everyone, I'm using admin-on-rest for the front end and Ruby on Rails for the back end. I want to upload images to the cloud. I am able to upload to the cloud, but I need to send the image URL to my backend so I can store it. Here is my code:
<FlatButton style={styles.button} label="Upload Image" primary onClick={this.handleClick} />

this.handleClick = () => {
  window.cloudinary.openUploadWidget(
    { cloud_name: 'dsaf', upload_preset: 'dsafds', cropping: 'das', tags: 'asdf' },
    function (error, result) {
      return result;
    }
  );
};
I am not able to send the image URL to the back end. Can anyone help me?
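The widget callback is where the hosted URL becomes available, so the URL can be forwarded to the backend from there. A minimal sketch, assuming a hypothetical /api/v1/images endpoint on the Rails side (the exact result shape depends on the widget version):

this.handleClick = () => {
  window.cloudinary.openUploadWidget(
    { cloud_name: 'dsaf', upload_preset: 'dsafds', cropping: 'das', tags: 'asdf' },
    (error, result) => {
      if (error || !result) return;
      // Depending on the widget version the upload info is either
      // result[0] or result.info; both expose secure_url.
      const info = result.info || result[0];
      const imageUrl = info && info.secure_url;
      if (!imageUrl) return;
      // POST the hosted URL to the Rails backend so it can be stored
      // ('/api/v1/images' is a placeholder endpoint, not from the original code).
      fetch('/api/v1/images', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ image: { url: imageUrl } }),
      });
    }
  );
};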

Related

Multer fileFilter callback doesn't throw error

I have this fileFilter code for multer. My problem is that when I call the callback with an error, my Express app gets stuck and eventually the page gives ERR_CONNECTION_RESET. This happens if I try to upload anything other than a JPEG.
const upload = multer({
  storage: storage,
  fileFilter: function (req, file, cb) {
    return cb(new ExpressError("Only images are allowed", 400));
  },
});
The storage is a Cloudinary storage, which looks like this:
const { CloudinaryStorage } = require("multer-storage-cloudinary");

const storage = new CloudinaryStorage({
  cloudinary,
  params: {
    folder: "BilbaoBarrios",
    allowedFormats: ["jpeg", "png", "jpg"],
  },
});
Also, strangely, if I put the storage variable after fileFilter, it will also work with PNGs, but still not with any other file format, so order seems to be in play here.
Thank you for your time.
Just to confirm, is Cloudinary being accessed from an in-office platform behind a firewall (or VPN)? If you are accessing it behind a firewall, delivering images and other Cloudinary assets sometimes requires whitelisting the Cloudinary domain (res.cloudinary.com) on your firewall.
Also, you can check out the sample I have on this link and see if it is blocked as well: https://codesandbox.io/s/upload-feature-demo-ckvgi?file=/src/CldUploaderForm.js
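Separately, regarding the fileFilter behaviour itself: multer expects the callback to either accept or reject each file, and an Error passed to it only turns into an HTTP response if something downstream handles it. A minimal sketch of that pattern, assuming the existing ExpressError class and a hypothetical /upload route:

const upload = multer({
  storage: storage,
  fileFilter: function (req, file, cb) {
    if (file.mimetype === "image/jpeg" || file.mimetype === "image/png") {
      return cb(null, true); // accept the file
    }
    // Reject with an error; this must be handled downstream or the request hangs.
    return cb(new ExpressError("Only images are allowed", 400));
  },
});

// Invoke the multer middleware manually so its error becomes a response
// instead of leaving the connection open ("/upload" and the field name are placeholders).
app.post("/upload", (req, res) => {
  upload.single("image")(req, res, (err) => {
    if (err) return res.status(err.status || 400).json({ error: err.message });
    res.json({ file: req.file });
  });
});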

POST data to Next.js page from external application

I have a Java application which sends JSON data to an API via POST. What I'm trying to do is collect this data from the Next.js application to display it and store it in a database later. I can't figure out how to fetch this data from the Next app. Currently I have the following code in pages/api/comment, and I'm calling http://localhost:3000/api/comment from the Java application:
export default function handler(req, res) {
  if (req.method === 'POST') {
    const comment = req.body.data
    const newCom = {
      id: Date.now(),
      text: comment,
    }
    comments.push(newCom)
    res.status(201).json(newCom)
  }
}
Can someone give me some directions, please? Thank you very much in advance.
Since Next.js works in a serverless architecture, you need to persist/save the data somewhere like a DB at the time of posting; only then can the data be retrieved via a GET API.
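Building on that, a minimal sketch of what the API route could look like while the data is still kept in memory (the comments array below resets between serverless invocations, so a database is needed for anything persistent; also note the Java client must send a Content-Type: application/json header for req.body to be parsed):

// pages/api/comment.js
const comments = []; // in-memory only; swap for a database call in practice

export default function handler(req, res) {
  if (req.method === 'POST') {
    const newCom = { id: Date.now(), text: req.body.data };
    comments.push(newCom);
    return res.status(201).json(newCom);
  }
  if (req.method === 'GET') {
    // Lets the Next.js front end read back whatever has been posted so far.
    return res.status(200).json(comments);
  }
  res.setHeader('Allow', ['GET', 'POST']);
  res.status(405).end();
}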

Rails 5.2: Trix with Active Storage and AWS S3

I am trying to have images uploaded via my Trix editor and also want to upload the images to AWS S3.
The images are getting successfully uploaded to Active Storage, but they are not getting uploaded to S3.
I do, however, see something like this in the Rails console:
Generated URL for file at key: Gsgdc7Jp84wYTQ1W4s (https://bucket.s3.amazonaws.com/Gsgdc7Jp84wYT2Ya3gxQ1W4s?X-Amz-Algorithm=AWS4redential=AKIAX6%2F20200414%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20241821Z&X-Amz-Expires=300&X-Amz-SignedHeaders=content-md5%3Bcontent-type%3Bhost&X-Amz-Signature=3613d41915e47baaa7c90421eee3f0ffc)
I see that the Trix documentation provides attachments.js, which uploads to a cloud provider: https://trix-editor.org/js/attachments.js.
Below is the relevant part of my code, which is used to upload to Active Storage:
document.addEventListener('trix-attachment-add', function (event) {
  var file = event.attachment.file;
  if (file) {
    var upload = new window.ActiveStorage.DirectUpload(file, '/rails/active_storage/direct_uploads', window);
    upload.create((error, attributes) => {
      if (error) {
        return false;
      } else {
        return event.attachment.setAttributes({
          url: `/rails/active_storage/blobs/${attributes.signed_id}/${attributes.filename}`,
          href: `/rails/active_storage/blobs/${attributes.signed_id}/${attributes.filename}`,
        });
      }
    });
  }
});
Below are my questions:
1) If my Active Storage is configured to upload to S3, do I still need attachments.js?
2) My Active Storage is configured to upload to S3 and I see the above response in the Rails console, but I do not see the file in S3.
Any help in fixing this would be really great. Thanks.
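For what it's worth, the third argument passed to DirectUpload above is a delegate object; a common pattern is to use it to report upload progress back to Trix, while the actual storage target (local disk vs. S3) is decided entirely by the Active Storage service configuration on the Rails side. A sketch of that delegate usage, assuming the same endpoint and attribute handling as the code above:

document.addEventListener('trix-attachment-add', function (event) {
  var attachment = event.attachment;
  var file = attachment.file;
  if (!file) return;

  // The delegate hooks into the XHR that DirectUpload creates so progress
  // can be fed back into the Trix attachment UI.
  var delegate = {
    directUploadWillStoreFileWithXHR: function (request) {
      request.upload.addEventListener('progress', function (progressEvent) {
        attachment.setUploadProgress((progressEvent.loaded / progressEvent.total) * 100);
      });
    },
  };

  var upload = new window.ActiveStorage.DirectUpload(
    file,
    '/rails/active_storage/direct_uploads',
    delegate
  );

  upload.create(function (error, attributes) {
    if (error) return;
    var blobUrl = '/rails/active_storage/blobs/' + attributes.signed_id + '/' + attributes.filename;
    attachment.setAttributes({ url: blobUrl, href: blobUrl });
  });
});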

How to upload large files directly to S3 via Rails in a AngularJS App

In my project I have a requirement to allow users to upload very large files (up to 4GB) to S3.
My current setup includes Rails 4.2, S3, Paperclip, Mongoid for the API, and AngularJS (with this upload library) for the front end. Currently I can upload regular-sized (up to 20MB) files perfectly fine.
So, I need to allow the front-end app users to upload zip files from an AngularJS app, through a Rails API, to S3.
There is no need for file processing on the server, so all the server needs to do with the file is keep track of its URL (the location on S3) and file name so these details can be referenced later on.
After some searching I've found this tutorial and this gem that suggest uploading the files directly to S3, bypassing the app altogether.
In my Angular controller I have this code to upload small sized files:
$scope.submitItem = function (file, file2, file3) {
  Upload.upload({
    url: API_PROVIDER.full_path + 'instant_downloads.json',
    data: {
      file: file,
      data: $scope.instant_download,
      categories: $scope.instant_download.categories,
      avatar: file2,
      subcategories: $scope.subcategories,
      zipfile: file3,
      options: $scope.instant_download.options
    }
  }).then(function (result) {
    if (result.data.success) {
      $scope.showAlert('Instant Download created successfully', 'success');
      $state.go('app.instant_downloads.index');
    }
  }, function (result) {
    angular.forEach(result.message, function (value, key) {
      $scope.showAlert(_.capitalize(key) + ': ' + value.join(', '), 'danger');
    });
  }, function (evt) {
    $scope.progressPercentage = parseInt(100.0 * evt.loaded / evt.total);
  });
};
The above code works well for regular-sized files. If I try it on a large file, I get nginx timeout errors.
Then I tried to use the aforementioned gem (while still using the same AngularJS code) to bypass the server, but I get an error on the S3 side (it seems it also times out).
Can someone point out how I can upload directly to S3, using the AngularJS controller above, with a callback to the server so I can get confirmation that the file has been uploaded successfully along with its URL?
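One shape this commonly takes (a sketch only; the presign endpoint, its parameters, and the confirmation payload are assumptions about what the Rails API would expose, not existing code) is to request a presigned S3 URL from the API, PUT the file straight to S3 from the browser, and then notify the Rails API once S3 has accepted it:

$scope.submitLargeFile = function (file) {
  // 1) Ask the Rails API for a presigned S3 upload URL (endpoint name is a placeholder).
  $http.get(API_PROVIDER.full_path + 'uploads/presign.json', {
    params: { filename: file.name, content_type: file.type }
  }).then(function (presign) {
    // 2) PUT the file directly to S3, bypassing the Rails app and nginx.
    return Upload.http({
      url: presign.data.url,
      method: 'PUT',
      headers: { 'Content-Type': file.type },
      data: file
    }).then(function () {
      // 3) Tell the Rails API the upload finished so it can store the S3 URL and filename.
      return $http.post(API_PROVIDER.full_path + 'instant_downloads.json', {
        zipfile_url: presign.data.public_url,
        zipfile_name: file.name
      });
    });
  }).then(function () {
    $scope.showAlert('File uploaded successfully', 'success');
  }, function () {
    $scope.showAlert('Upload failed', 'danger');
  });
};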

How to best serve a GridFS stored file (PDF) via my Express.js driven API to a connected client (iOS)?

I'm developing a REST HTTP API that has iOS clients connecting to it. The way it's currently set up (and tested with the POSTman Chrome extension), I make the request for the resource and have to wait for the whole thing to be read in and spit out before it shows up as a response.
Is this a good method for iOS and Mac client consumption or is there a better method for serving from GridFS?
I'm doing the following:
// Download a PDF
app.get('/api/download-pdf/:pdf_id', function (req, res) {
  var gfs = new mongodb.GridStore(mongoose.connection.db, ObjectID(req.params.pdf_id), "r");
  gfs.open(function (err, gs) {
    if (err) {
      res.send(500);
    } else {
      gs.read(function (err, data) {
        // Check the read error before responding with the data.
        if (err) {
          res.send(500);
        } else {
          res.header('Content-type', 'application/pdf');
          res.send(data);
        }
        gs.close(function () {});
      });
    }
  });
});
The node driver now supports streaming to/from GridFS: http://christiankvalheim.com/post/29753345741/new-features-in-the-driver-for-mongodb-2-2?8e43c3e0
gs.pipe(anotherStream)
See Streams.
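Applied to the route above, the streaming version would look roughly like this (a sketch following the gs.pipe(anotherStream) pattern mentioned here; the exact streaming API depends on the driver version):

// Stream the PDF from GridFS straight to the response instead of buffering it.
app.get('/api/download-pdf/:pdf_id', function (req, res) {
  var gfs = new mongodb.GridStore(mongoose.connection.db, ObjectID(req.params.pdf_id), 'r');
  gfs.open(function (err, gs) {
    if (err) return res.send(500);
    res.header('Content-type', 'application/pdf');
    // The opened GridStore is readable, so it can be piped directly;
    // the client starts receiving bytes without waiting for the full read.
    gs.pipe(res);
  });
});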
