Uploading images to the public dir in React.js/Next.js - path

I've been trying to upload images to the /public directory. This code works fine locally (Windows):
import getConfig from "next/config";
import fs from "fs";
import path from "path"; // path was missing from the original imports

const address = path.join(
  getConfig().serverRuntimeConfig.PROJECT_ROOT,
  `/public/uploads/users/${username}`
);
if (!fs.existsSync(address)) {
  fs.mkdirSync(address, { recursive: true });
}
I'm using multer to handle the file upload from the client side, roughly as sketched below.
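The relevant multer storage is set up roughly like this (a simplified sketch; the filename handling shown here is an assumption):

import multer from "multer";

// Store each upload under the per-user directory built above.
const upload = multer({
  storage: multer.diskStorage({
    destination: (req, file, cb) => {
      if (!fs.existsSync(address)) {
        fs.mkdirSync(address, { recursive: true });
      }
      cb(null, address);
    },
    // Keep the client's original file name (assumption).
    filename: (req, file, cb) => cb(null, file.originalname),
  }),
});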
This works fine locally on Windows, but after deployment to Vercel it throws this error:
2022-03-21T16:05:16.872Z 693e7f44-12d9-4f4e-90cf-f030a299f918 ERROR Unhandled Promise Rejection
{
  "errorType": "Runtime.UnhandledPromiseRejection",
  "errorMessage": "Error: ENOENT: no such file or directory, mkdir '/vercel/path0/public/uploads/users/saif'",
  "reason": {
    "errorType": "Error",
    "errorMessage": "ENOENT: no such file or directory, mkdir '/vercel/path0/public/uploads/users/saif'",
    "code": "ENOENT",
    "errno": -2,
    "syscall": "mkdir",
    "path": "/vercel/path0/public/uploads/users/saif",
    "stack": [
      "Error: ENOENT: no such file or directory, mkdir '/vercel/path0/public/uploads/users/saif'",
      "    at Object.mkdirSync (fs.js:1013:3)",
      "    at DiskStorage.destination [as getDestination] (/var/task/.next/server/pages/api/User/index.js:155:55)",
      "    at processTicksAndRejections (internal/process/task_queues.js:95:5)",
      "    at runNextTicks (internal/process/task_queues.js:64:3)",
      "    at processImmediate (internal/timers.js:437:9)"
    ]
  },
  "promise": {},
  "stack": [
    "Runtime.UnhandledPromiseRejection: Error: ENOENT: no such file or directory, mkdir '/vercel/path0/public/uploads/users/saif'",
    "    at process.<anonymous> (/var/runtime/index.js:35:15)",
    "    at process.emit (events.js:412:35)",
    "    at processPromiseRejections (internal/process/promises.js:245:33)",
    "    at processTicksAndRejections (internal/process/task_queues.js:96:32)",
    "    at runNextTicks (internal/process/task_queues.js:64:3)",
    "    at processImmediate (internal/timers.js:437:9)"
  ]
}
Unknown application error occurred

Vercel as a platform does not allow persistent file storage, because deployments run as serverless functions; they encourage uploading to a bucket such as S3 instead:
https://vercel.com/docs/concepts/solutions/file-storage
1. Create a Serverless Function to return a presigned URL.
2. From the front-end, call your Serverless Function to get the presigned POST URL.
3. Allow the user to upload a file on the front-end, then forward the file to the POST URL.
Note: the presigned URL here is an S3 location that you create as the upload target.
The docs also include multiple examples using S3 or a Google Cloud Storage bucket.
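A minimal sketch of such a Serverless Function as a Next.js API route, assuming the @aws-sdk/client-s3 and @aws-sdk/s3-presigned-post packages; the route name, bucket environment variable, and key prefix below are illustrative, not from the original post:

// pages/api/presigned-upload.js (hypothetical route)
import { S3Client } from "@aws-sdk/client-s3";
import { createPresignedPost } from "@aws-sdk/s3-presigned-post";

const s3 = new S3Client({ region: process.env.AWS_REGION });

export default async function handler(req, res) {
  // e.g. GET /api/presigned-upload?filename=avatar.png
  const { filename } = req.query;

  // Generate a POST URL plus form fields the browser can upload to directly.
  const { url, fields } = await createPresignedPost(s3, {
    Bucket: process.env.UPLOAD_BUCKET, // assumed env var
    Key: `uploads/users/${filename}`,
    Expires: 60, // URL validity in seconds
  });

  res.status(200).json({ url, fields });
}

The front-end then sends a multipart/form-data POST to the returned url, with the fields appended before the file, and the object lands in the bucket rather than on the function's ephemeral filesystem.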

Related

How can I use ImageMagick to read SVG files in Elastic Beanstalk?

I'm trying to use ImageMagick to read SVG files. This works just fine locally, since I have librsvg (https://github.com/GNOME/librsvg) installed on my machine. However, the package doesn't seem to be available through yum, which is what Elastic Beanstalk uses.
So in my backend (Rails/Postgresql) I have something like:
def save_svg_as_image
  file = params[:uploaded_svg]
  @svg_image = MiniMagick::Image.read(file)
  self.svg = @svg_image
end
Again, this works fine locally thanks to librsvg, but when testing on my EB instance (64bit Amazon Linux/2.11.4), I get:
MiniMagick::Invalid (`identify /tmp/mini_magick20210205-24760-bygqeh` failed with error:
identify: no decode delegate for this image format `/tmp/mini_magick20210205-24760-bygqeh' # error/constitute.c/ReadImage/544.
):
I've attempted the following solution: taking the code from https://gist.github.com/whyvez/1e0212a35da97aa8f1b1 and using it in a packages.config file, but I keep getting this error:
Command failed on instance. Return code: 2 Output: wget: unrecognized option '--prefix=/usr'

Docker, ENOENT: no such file or directory

I have a Storage constant that is used in a file called listingController.js
const storage = Storage({
  keyFilename: "../key/keyname.json"
});
Everything works fine when I'm not using Docker, but after I create a Docker image and deploy it on the server I get the following error:
ENOENT: no such file or directory, open '/key/keyname.json'
at wrapError (/app/node_modules/gcs-resumable-upload/build/src/index.js:17:12)
at /app/node_modules/gcs-resumable-upload/build/src/index.js:235:19
at getToken (/app/node_modules/google-auto-auth/index.js:27:9)
at getAuthClient (/app/node_modules/google-auto-auth/index.js:233:9)
at <anonymous>
Here I see that the '..' in front of the path seems to be ignored, which is why I think the file is not found.
Here is my project structure:
src
--- key
----- keyname.json
----- firebasekeyfilename.json
--- controller
----- listingController.js
----- firebaseController.js
I have tried all different combinations of file names and paths but I cannot get it to find that file.
Does anyone have a clue why this is happening?
In my firebaseController I have the following reference to a similar file in the same folder and it works fine.
var serviceAccount = require("../key/firebasekeyfilename");
The only difference is that this path is inside require(), which I guess resolves paths differently.
I've been stuck on this for a couple of days now; any pointers would be appreciated, thanks!
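require() resolves ../key/... relative to the file that calls it, but keyFilename is resolved at runtime against the process working directory, which likely differs inside the container (an assumption, since the Dockerfile isn't shown). Anchoring the path to __dirname removes that dependency; a sketch using the current @google-cloud/storage API:

const path = require("path");
const { Storage } = require("@google-cloud/storage");

// Resolve the key file relative to this file (src/controller/), not the cwd.
const storage = new Storage({
  keyFilename: path.join(__dirname, "../key/keyname.json"),
});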

How To Require Lua Socket?

I'm very new to Lua development and file manipulation, and I'm now trying to import the LuaSocket package into my project according to this post, but I can't even run the code below.
I guess the error message indicates that I need to import not only socket.lua but also .\socket\core (probably a .dll, since there is no core.lua), while a reply on that post suggested importing only the one file.
I'm stuck at the very beginning... what do I have to do next?
local function main()
  local socket = require("socket")
end
main()
Exception in thread "main" com.naef.jnlua.LuaRuntimeException: ...n32.win32.x86_64\workspace\TestForCiv\src\socket.lua:13: module 'socket.core' not found:
no field package.preload['socket.core']
no file '.\socket\core.lua'
no file 'C:\Program Files\Java\jre1.8.0_151\bin\lua\socket\core.lua'
no file 'C:\Program Files\Java\jre1.8.0_151\bin\lua\socket\core\init.lua'
...(a bunch of no file errors continues)
Edit: I added the folder structure. Even if I add the .dll file, it returns the same error.
I don't know the details of your configuration, but try this:
require("src.socket")
You should require the module by its path from the root of the lib.

Yeoman - How to extract zipped files in generator?

I want to build a Yeoman generator that needs to unzip a file.
From their documentation, it seems this is done using this.registerTransformStream(...). It says it accepts any gulp plugin, so I tried gulp-unzip (link).
Here's my code:
// index.js
...
writing: function() {
  var source = this.templatePath('zip'); // the folder where the zipped file is
  var destination = this.destinationRoot();
  this.fs.copy(source, destination);
  this.registerTransformStream(unzip());
}
...
The result seems promising at first: it lists all the files, but then I get an Error: write after end.
Here's the dump:
create license.txt
create readme.html
create config.php
...
...
events.js:141
throw er; // Unhandled 'error' event
^
Error: write after end
at writeAfterEnd (C:\Users\myname\Documents\project\generator-test\node_modules\gulp-unzip\node_modules\readable-stream\lib\_stream_writable.js:144:12)
at Transform.Writable.write (C:\Users\myname\Documents\project\generator-test\node_modules\gulp-unzip\node_modules\readable-stream\lib\_stream_writable.js:192:5)
at DestroyableTransform.ondata (C:\Users\myname\Documents\project\generator-test\node_modules\through2\node_modules\readable-stream\lib\_stream_readable.js:531:20)
at emitOne (events.js:77:13)
at DestroyableTransform.emit (events.js:169:7)
at readableAddChunk (C:\Users\myname\Documents\project\generator-test\node_modules\through2\node_modules\readable-stream\lib\_stream_readable.js:198:18)
at DestroyableTransform.Readable.push (C:\Users\myname\Documents\project\generator-test\node_modules\through2\node_modules\readable-stream\lib\_stream_readable.js:157:10)
at DestroyableTransform.Transform.push (C:\Users\myname\Documents\project\generator-test\node_modules\through2\node_modules\readable-stream\lib\_stream_transform.js:123:32)
at DestroyableTransform._transform (C:\Users\myname\Documents\project\generator-test\node_modules\mem-fs-editor\lib\actions\commit.js:34:12)
at DestroyableTransform.Transform._read (C:\Users\myname\Documents\project\generator-test\node_modules\through2\node_modules\readable-stream\lib\_stream_transform.js:159:10)
The destination folder is empty after this. It seems the stream tried to write the unzipped files but failed.
Has anyone solved this problem before? Or is there an alternative way using just the built-in fs?
Thanks a lot
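Node's built-in fs has no zip support, so one alternative (an untested sketch; the archive file name is hypothetical) is to extract the archive directly in the writing phase with a small library such as adm-zip, instead of piping it through registerTransformStream:

var AdmZip = require("adm-zip");

writing: function () {
  // Extract the archive straight into the destination folder,
  // bypassing the transform-stream/commit cycle entirely.
  var zip = new AdmZip(this.templatePath("zip/archive.zip"));
  zip.extractAllTo(this.destinationRoot(), /* overwrite */ true);
}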

GridFS + CarrierWave + nginx unable to get file

In my project I upload audio files to GridFS using the CarrierWave gem. The file is saved to GridFS properly after uploading, but I am unable to get it back with the mongofiles tool or with the gridfs-nginx module.
mongofiles get audiotracks/4dfb70d6bcd73f3488000002/data
command leads to this error:
assertion: 13325 couldn't open file: audiotracks/4dfb70d6bcd73f3488000002/data
The only way I can get the file is through the Rails console, where it works fine:
cc = Mongo::GridFileSystem.new(Mongo::Connection.new.db("test")).open('audiotracks/4dfb70d6bcd73f3488000002/data', 'r')
cc.read
If you have encountered a problem like this or have any ideas, please let me know.
mongofiles get tries to write the file to disk with the same name and path it has in GridFS.
Assertion 13325 happens when GridFS can't write the file that way.
Check that the file path exists and that you have permission to write to it. Alternatively, you can just provide a local file name with the --local parameter:
mongofiles --local mytrack.mp3 get audiotracks/4dfb70d6bcd73f3488000002/data
