Multer fileFilter callback doesn't throw error - multer

I have this fileFilter code for multer. My problem is that when I call back with an error, my Express app gets stuck, and eventually the page gives ERR_CONNECTION_RESET. This happens if I try to upload anything other than a JPEG.
const upload = multer({
    storage: storage,
    fileFilter: function (req, file, cb) {
        return cb(new ExpressError("Only images are allowed", 400));
    },
});
The storage is a Cloudinary storage, which looks like this:
const { CloudinaryStorage } = require("multer-storage-cloudinary");

const storage = new CloudinaryStorage({
    cloudinary,
    params: {
        folder: "BilbaoBarrios",
        allowedFormats: ["jpeg", "png", "jpg"],
    },
});
Also, strangely, if I put the storage variable after fileFilter, it will also work with PNGs, but still not with any other file format, so the order seems to matter here.
Thank you for your time.

Just to confirm, is Cloudinary being accessed within an in-office platform and behind a firewall (or VPN)? If you are accessing it behind a firewall, delivering images and other Cloudinary assets sometimes requires whitelisting the Cloudinary domain (res.cloudinary.com) on your firewall.
Also, you can check out the sample I have on this link and see if it is blocked as well: https://codesandbox.io/s/upload-feature-demo-ckvgi?file=/src/CldUploaderForm.js
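For what it's worth, a minimal sketch of a fileFilter that only rejects non-image files, plus an Express error-handling middleware that turns the error into a response, might look like the following. The mimetype check, the app.use error handler, and the statusCode property on ExpressError are assumptions for illustration, not code from the question:
const upload = multer({
    storage: storage,
    fileFilter: function (req, file, cb) {
        // Accept images, reject everything else (mimetype check is an assumption)
        if (file.mimetype.startsWith("image/")) {
            return cb(null, true);
        }
        return cb(new ExpressError("Only images are allowed", 400));
    },
});

// Hypothetical error handler registered after the routes, so multer's error
// becomes an HTTP response instead of a hung request
app.use((err, req, res, next) => {
    res.status(err.statusCode || 500).send(err.message);
});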

Related

iOS iPhone browsers do not accept video files via upload dialog

We are using the iOS file upload dialog to let users send video files to our service, using React.
All video files work on Android platforms and in all browsers on Linux and macOS. However, when we pick video files with the upload dialog on iOS iPhones, such as the iPhone 14 Pro Max, a compress process starts and then the dialog rejects the video file.
We have been debugging with BrowserStack on a real phone, but no luck up to this point.
When we select the file, it first runs a compression activity, then changes the name of the file to an intermediate file name (as below, the original file name is different), and then the upload procedure fails.
Below is the React part which triggers the upload mechanism; it works on every platform and operating system except iOS.
export const UploadVideo = async (file, signedurl, uploading) =>
{
    let resultState = { state: '', data: {} };
    if (SERVER_STATUS !== 'localhost')
    {
        await axios({
            method: 'put',
            url: signedurl,
            data: file,
            headers: { 'Content-Type': 'application/octet-stream' },
            onUploadProgress: uploading
        }).then(function (response)
        {
            resultState.state = 'success';
        }).catch(function (error)
        {
            resultState.state = 'error';
            resultState.data.message = error.message;
            window.toastr.error(error.message);
        })
    } else resultState.state = 'success';
    return resultState;
}
The error message I notice here, OSStatus error -9806, refers, according to osstatus.com, to a Secure Transport result code; more specifically, this one in Apple's documentation.
My take here is that the system is not trusting this URL. I would suggest adding your URL to the trusted domains under NSAppTransportSecurity in the Info.plist file. More info on how to do that here.
This is not a solution I would go for in a production app, though; you might want to have a valid certificate for your production URL and app.
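For reference, and only if the upload runs inside a native shell where Info.plist applies, a minimal sketch of such an exception might look like the following; example.com is a hypothetical placeholder for the signed-URL host, and which exception keys you actually need depends on why the handshake fails:
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSExceptionDomains</key>
    <dict>
        <!-- hypothetical domain; replace with the signed-URL host -->
        <key>example.com</key>
        <dict>
            <key>NSIncludesSubdomains</key>
            <true/>
            <key>NSExceptionAllowsInsecureHTTPLoads</key>
            <true/>
        </dict>
    </dict>
</dict>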
Hope this helps.

Autodesk Simple Viewer - "Could not list models. "

I'm trying to implement the code example in this repo:
https://github.com/autodesk-platform-services/aps-simple-viewer-dotnet
While launching in debugging mode, I get an error in AuthController.cs that says:
Could not list models. See the console for more details
I didn't make any significant changes to the original code; I only changed the env vars (client id, secret, etc.).
The error is thrown in the function below:
async function setupModelSelection(viewer, selectedUrn) {
    const dropdown = document.getElementById('models');
    dropdown.innerHTML = '';
    try {
        const resp = await fetch('/api/models');
        if (!resp.ok) {
            throw new Error(await resp.text());
        }
        const models = await resp.json();
        dropdown.innerHTML = models.map(model => `<option value=${model.urn} ${model.urn === selectedUrn ? 'selected' : ''}>${model.name}</option>`).join('\n');
        dropdown.onchange = () => onModelSelected(viewer, dropdown.value);
        if (dropdown.value) {
            onModelSelected(viewer, dropdown.value);
        }
    } catch (err) {
        alert('Could not list models. See the console for more details.');
        console.error(err);
    }
}
I get an access token, so my client id and secret are probably correct, and I also added the app to the cloud hub. What could be the problem? Why can't the app find the projects in the hub?
I can only repeat what AlexAR said: the given sample is not for accessing files from user hubs like ACC/BIM 360 Docs. For that, follow this: https://tutorials.autodesk.io/tutorials/hubs-browser/
To address the specific error: one way I can reproduce it is by setting the APS_BUCKET variable to something simple that has likely been used by someone else already, e.g. "mybucket", so I get an error when trying to access the files in it, since it's not my bucket. Bucket names need to be globally unique. If you don't want to come up with a unique name yourself, simply do not declare the APS_BUCKET environment variable and the sample will generate a bucket name for you based on the client id of your app.
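As an illustration only: the Node.js variants of these samples derive the fallback name roughly like the sketch below (the .NET sample's exact scheme may differ), which is why any explicit APS_BUCKET you set has to be globally unique:
// Illustrative sketch, not the sample's exact code
const APS_CLIENT_ID = process.env.APS_CLIENT_ID;
const APS_BUCKET = process.env.APS_BUCKET            // an explicit, globally unique bucket name...
    || `${APS_CLIENT_ID.toLowerCase()}-basic-app`;   // ...or a fallback derived from the client id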

Amplify Video - How to upload a video to the "Input" bucket with Swift?

I have an iOS project using Amplify as a backend. I have also incorporated Amplify Video in the hope of supporting video-on-demand. After adding Amplify Video to the project, an "Input" and an "Output" bucket are generated. These appear outside of my project environment when visualised via the Amplify Console and can only be accessed by navigating to the AWS S3 console. My question is: how do I upload my videos via Swift to the "Input" bucket via Amplify (or do I not)? The code I have below uploads the video to the S3 bucket within the project environment. There is next to no support for Amplify Video for iOS (Amplify Video Documentation).
if let vidData = self.convertVideoToData(from: srcURL) {
    let key = "myKey"
    //let options = StorageUploadDataRequest.Options.init(accessLevel: .protected)
    Amplify.Storage.uploadData(key: key, data: vidData) { (progress) in
        print(progress.fractionCompleted)
    } resultListener: { (result) in
        switch result {
        case .success(_ ):
            print("upload success!")
        case .failure(let error):
            print(error.errorDescription)
        }
    }
}
I'm facing the same issue. As far as I can tell, the iOS Amplify library's amplifyconfiguration.json is limited to using one storage spec under S3TransferUtility.
I'm in the process of solving this issue myself, but the quick solution is to modify the created AWS video resources to run off the same bucket (input and output). Now, be warned: I'm an iOS engineer, not backend, and only getting familiar with AWS.
Solution as follows:
The input bucket the Amplify Video plugin created has 4 event notifications under the Properties tab. These each kick off a VOD-inputWatcher lambda function. Copy these 4 notifications to your original bucket.
The output bucket has two event notifications; copy those to the original bucket as well.
Try the process now: drop a video into your bucket manually. It will fail, but we'll see progress - the MediaConvert job is kicked off, but it will tell you it failed because it didn't have permission to read the files in your bucket, something like Unable to open input file, Access Denied. Let's solve this:
Go to the input lambda function and add this function:
async function enableACL(eventObject) {
    console.log(eventObject);
    const objectKey = eventObject.object.key;
    const bucketName = eventObject.bucket.name;
    const params = {
        Bucket: bucketName,
        Key: objectKey,
        ACL: 'public-read',
    };
    console.log('params:', params);
    try {
        // Await the call so the lambda doesn't move on before the ACL is set
        const data = await s3.putObjectAcl(params).promise();
        console.log("successfully set acl");
        console.log(data);
    } catch (err) {
        console.log("failed to set ACL");
        console.log(err);
    }
}
Now call it from the event handler, and don't forget to add const s3 = new AWS.S3({}); at the top of the file:
exports.handler = async (event) => {
    // Set the region
    AWS.config.update({ region: event.awsRegion });
    console.log(event);
    if (event.Records[0].eventName.includes('ObjectCreated')) {
        await enableACL(event.Records[0].s3);
        await createJob(event.Records[0].s3);
        const response = {
            statusCode: 200,
            body: JSON.stringify(`Transcoding your file: ${event.Records[0].s3.object.key}`),
        };
        return response;
    }
};
Try the process again. The lambda will fail; you can see it in the lambda's CloudWatch logs: failed to set ACL. INFO AccessDenied: Access Denied at Request.extractError. To fix this we need to give S3 permissions to the input lambda function.
Do that by navigating to the lambda function's Configuration / Permissions and find the Role. Open it in IAM and add full S3 access. Not ideal, but again, I'm just trying to make this work. It would probably be better to specify the exact bucket and the correct actions only (a rough sketch of a tighter policy follows after the next step); any help regarding proper roles is greatly appreciated :)
Repeat the same for the output lambda function's role: give it the right S3 permissions too.
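For anyone who wants to avoid full S3 access, a rough sketch of a narrower policy scoped to a single bucket could look like this; YOUR_BUCKET is a placeholder and the action list is an assumption that may need adjusting for your setup:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:PutObjectAcl"],
            "Resource": "arn:aws:s3:::YOUR_BUCKET/*"
        }
    ]
}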
Try uploading a file again. At this point, if you run into this error:
failed to set ACL. INFO NoSuchKey: The specified key does not exist. at Request.extractError, it's because the bucket has objects in the protected folder. Try to use the public folder instead (in the iOS lib you'll have to use StorageAccessLevel.guest permissions to access it).
Now drop a file in the public folder. You should see the MediaConvert job kick off again. It will still fail (check in MediaConvert / Jobs), saying it doesn't have permission to write to the S3 bucket: Unable to write to output file .. . You can fix this by going to the input lambda function again; this is the part that hands the role to the MediaConvert job:
const jobParams = {
    JobTemplate: process.env.ARN_TEMPLATE,
    Queue: queueARN,
    UserMetadata: {},
    Role: process.env.MC_ROLE,
    Settings: jobSettings,
};
await mcClient.createJob(jobParams).promise();
Go to the input lambda function's Configuration / Environment Variables. The function uses the MC_ROLE variable to provide the role name to the MediaConvert job. Copy the role name and look it up in IAM. Modify its permissions by adding the right S3 access for your bucket to that role.
If you try it one more time, the output should appear right next to your input file.
In order to be able to read the s3://public/{userIdentityId}/{videoName}/{videoName}{quality}..m3u8 file using the current Amplify.Storage.downloadFile(key: {key}, ...) function in iOS, you'll probably have to attach the right path to the key and remove the .mp4 extension. Let me know if you're facing any problems; I'm sorting this out now as well.

Rails 5.2: Trix with Active Storage and AWS S3

I am trying to have images uploaded via my Trix editor and also want the images uploaded to AWS S3.
The images are getting successfully uploaded to ActiveStorage but they are not getting uploaded to S3.
I however see something like this in the rails console Generated URL for file at key: Gsgdc7Jp84wYTQ1W4s (https://bucket.s3.amazonaws.com/Gsgdc7Jp84wYT2Ya3gxQ1W4s?X-Amz-Algorithm=AWS4redential=AKIAX6%2F20200414%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20241821Z&X-Amz-Expires=300&X-Amz-SignedHeaders=content-md5%3Bcontent-type%3Bhost&X-Amz-Signature=3613d41915e47baaa7c90421eee3f0ffc)
I see that the Trix documentation provides attachments.js, which uploads to a cloud provider: https://trix-editor.org/js/attachments.js.
Also, below is the relevant part of my code which is used to upload to ActiveStorage:
document.addEventListener('trix-attachment-add', function (event) {
    var file = event.attachment.file;
    if (file) {
        var upload = new window.ActiveStorage.DirectUpload(file, '/rails/active_storage/direct_uploads', window);
        upload.create((error, attributes) => {
            if (error) {
                return false;
            } else {
                return event.attachment.setAttributes({
                    url: `/rails/active_storage/blobs/${attributes.signed_id}/${attributes.filename}`,
                    href: `/rails/active_storage/blobs/${attributes.signed_id}/${attributes.filename}`,
                });
            }
        });
    }
});
Below are my questions:
1) If my Active Storage is configured to upload to S3, do I still need attachments.js?
2) My Active Storage is configured to upload to S3 and I see the above response in the Rails console, but I do not see the file in S3.
Any help in fixing this would be really great. Thanks.

How to upload large files directly to S3 via Rails in an AngularJS App

In my project I have a requirement to allow users to upload very large files (up to 4GB) to S3.
My current setup includes Rails 4.2, S3, Paperclip, Mongoid for the API and AngularJS (with this upload library) for the front-end. Currently I can upload regular sized (up to 20MB) files perfectly fine.
So, I need to allow the front-end app users to upload zip files from an Angular JS app, through a Rails API to S3.
There is no need for file processing on the server, so all the server needs to do with the file is keep track of its URL (the location on S3) and file name so these details can be referenced later on.
After some searching I've found this tutorial and this gem, which suggest uploading the files directly to S3, bypassing the app altogether.
In my Angular controller I have this code to upload small sized files:
$scope.submitItem = function (file, file2, file3) {
    Upload.upload({
        url: API_PROVIDER.full_path + 'instant_downloads.json',
        data: {
            file: file, data: $scope.instant_download, categories: $scope.instant_download.categories,
            avatar: file2, subcategories: $scope.subcategories, zipfile: file3,
            options: $scope.instant_download.options
        }
    }).then(function (result) {
        if (result.data.success) {
            $scope.showAlert('Instant Download created successfully', 'success');
            $state.go('app.instant_downloads.index');
        }
    }, function (result) {
        angular.forEach(result.message, function (value, key) {
            $scope.showAlert(_.capitalize(key) + ': ' + value.join(', '), 'danger');
        });
    }, function (evt) {
        $scope.progressPercentage = parseInt(100.0 * evt.loaded / evt.total);
    });
};
The above code works well. However, if I try it on a large file I get nginx timeout errors.
I then tried to use the aforementioned gem (while still using the same AngularJS code) to bypass the server, but I get an error on the S3 side (it seems it also times out).
Can someone point out how I can upload directly to S3, using the AngularJS controller above, with a callback to the server so I can get confirmation that the file has been uploaded successfully along with its URL?
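Not an authoritative answer, but the usual shape of a direct-to-S3 upload with a server callback, sketched against the controller above, is: ask the Rails API for a presigned URL, PUT the file straight to S3, then notify the API. The endpoint paths, the response fields, and the injected $http service are assumptions for illustration:
// Sketch only: 'uploads/presign.json' and 'uploads/confirm.json' are hypothetical endpoints
$scope.uploadLargeFile = function (file) {
    // 1. Ask the Rails API for a presigned S3 URL for this file
    $http.get(API_PROVIDER.full_path + 'uploads/presign.json', { params: { filename: file.name } })
        .then(function (presign) {
            // 2. PUT the file straight to S3, bypassing Rails and nginx
            return $http.put(presign.data.url, file, { headers: { 'Content-Type': file.type } })
                .then(function () {
                    // 3. Tell the API the upload finished so it can store the S3 URL and file name
                    return $http.post(API_PROVIDER.full_path + 'uploads/confirm.json',
                        { url: presign.data.public_url, filename: file.name });
                });
        })
        .then(function () {
            $scope.showAlert('Upload completed', 'success');
        });
};
The bucket would also need a CORS rule allowing PUT requests from the app's origin; since the file no longer passes through Rails or nginx, the server-side timeouts should not apply to the transfer itself.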
