Testing sharp with memfs error: [Error: Input file is missing] - sharp

I had trouble testing code that uses sharp and memfs. I have code that downloads an image and crops it to a certain dimension; I use sharp to do the cropping. For testing, I use jest and memfs. My test code has 2 parts:
Download an image and save it to a mock volume/file system created with memfs.
Crop the downloaded image to the given dimensions with sharp.
Part 1 worked perfectly: I was able to download the image to the fake volume and assert with jest that it exists in the fake volume.
But part 2 gave me this error: [Error: Input file is missing].
const sizeToCrop = {
  width: 378,
  height: 538,
  left: 422,
  top: 0,
};
sharp(`./downloadedImage.jpg`)
  .extract(sizeToCrop)
  .toFile(`./CroppedImage.jpg`)
  .then(() => {
    console.log(`resolve!`);
  })
  .catch((err: Error) => {
    console.log(err);
    return Promise.reject();
  });
// Error message: [Error: Input file is missing]
But when I test it against the real file system, it works fine.
Does anyone have an idea how to solve this?
Thank you.

After some inspiration from here, here and here:
It is because sharp is not accessing memfs' faked file system; it is accessing the real fs file system, where ./downloadedImage.jpg does not exist. Hence, in order to make sharp use memfs, sharp has to be mocked, and the extract and toFile functions need to be mocked as well (to keep the chained function calls working):
// Inside code.test.ts
jest.mock('sharp', () => {
  const sharp = jest.requireActual('sharp');
  const { vol } = require('memfs');

  let inputFilePath: string;
  let sizeToCrop: any;
  let outputFilePath: string;

  // Mocked toFile(): remembers the output path, then runs the actual crop
  // against the memfs volume.
  const toFile = async (writePath: string): Promise<any> => {
    outputFilePath = writePath;
    try {
      return await startCropping();
    } catch (error) {
      console.log(`Error in mocked toFile()`, error);
    }
  };

  // Mocked extract(): remembers the crop dimensions and returns an object
  // exposing toFile so the chained call keeps working.
  const extract = (dimensionSize: any): any => {
    sizeToCrop = dimensionSize;
    return { toFile };
  };

  // Mocked sharp(): remembers the input path and returns an object
  // exposing extract so the chained call keeps working.
  const mockSharp = (readPath: string): any => {
    inputFilePath = readPath;
    return { extract };
  };

  // Read the input from memfs, crop it with the real sharp via a Buffer,
  // and write the result back into memfs.
  async function startCropping(): Promise<void> {
    try {
      const inputFile = vol.readFileSync(inputFilePath);
      const imageBuffer = await sharp(inputFile).extract(sizeToCrop).toBuffer();
      vol.writeFileSync(outputFilePath, imageBuffer);
    } catch (error) {
      console.log(`Error in mocked sharp module`, error);
      return Promise.reject();
    }
  }

  return mockSharp;
});
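With that mock in place, the test body stays close to the production code. Here is a rough sketch of the assertion part of my test; cropImage and the file paths are placeholders for the actual function and files:
// Inside code.test.ts -- a minimal sketch, assuming the image from part 1
// has already been written into the memfs volume and that cropImage() is
// the production function that calls sharp(...).extract(...).toFile(...).
import { vol } from 'memfs';
import { cropImage } from './code'; // placeholder for the module under test

test('crops the downloaded image into the fake volume', async () => {
  // Part 1 of the test has already stored a real JPEG here.
  expect(vol.existsSync('./downloadedImage.jpg')).toBe(true);

  await cropImage('./downloadedImage.jpg', './CroppedImage.jpg', {
    width: 378,
    height: 538,
    left: 422,
    top: 0,
  });

  // The mocked toFile() wrote the cropped buffer back into memfs,
  // so the output should now exist in the fake volume.
  expect(vol.existsSync('./CroppedImage.jpg')).toBe(true);
});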

Related

How do I check the size of the data file before I upload it (Flutter)

When I pick a file, I want its size information, because in the future the upload limit will be based on the file size. The following is the code I use:
void _openFileExplorer() async {
  setState(() => _loadingPath = true);
  try {
    _directoryPath = null;
    _paths = (await FilePicker.platform.pickFiles(
      type: _pickingType,
      allowMultiple: _multiPick,
      allowedExtensions: ['jpg', 'pdf', 'doc', 'docx', 'png', 'jpeg'],
    ))
        ?.files;
  } on PlatformException catch (e) {
    print("Unsupported operation" + e.toString());
  } catch (ex) {
    print(ex);
  }
  if (!mounted) return;
  setState(() {
    _loadingPath = false;
    _fileName = _paths != null ? _paths.map((e) => e.name).toString() : '...';
  });
}
You can get the file size using the function named lengthSync.
Just use this function like:
var size = file.lengthSync();
It will give the file size in bytes.

Data sharing between Safari and standalone iPhone 12 iOS 14.3

I tried to share data between the Safari browser and a standalone PWA on an iPhone 12 with iOS 14.3.
The information that this should work is here: https://firt.dev/ios-14/
I've tried this: https://www.netguru.com/codestories/how-to-share-session-cookie-or-state-between-pwa-in-standalone-mode-and-safari-on-ios
Without success.
Are there any suggestions for getting this running? Or is it not possible ...
This is the code:
const CACHE_NAME = "auth";
const TOKEN_KEY = "token";
const FAKE_TOKEN = "sRKWQu6hCJgR25lslcf5s12FFVau0ugi";
// Cache Storage was designed for caching
// network requests with service workers,
// mainly to make PWAs work offline.
// You can give it any value you want in this case.
const FAKE_ENDPOINT = "/fake-endpoint";

const saveToken = async (token: string) => {
  try {
    const cache = await caches.open(CACHE_NAME);
    const responseBody = JSON.stringify({
      [TOKEN_KEY]: token
    });
    const response = new Response(responseBody);
    await cache.put(FAKE_ENDPOINT, response);
    console.log("Token saved! 🎉");
  } catch (error) {
    // It's up to you how you resolve the error
    console.log("saveToken error:", { error });
  }
};

const getToken = async () => {
  try {
    const cache = await caches.open(CACHE_NAME);
    const response = await cache.match(FAKE_ENDPOINT);
    if (!response) {
      return null;
    }
    const responseBody = await response.json();
    return responseBody[TOKEN_KEY];
  } catch (error) {
    // Gotta catch 'em all
    console.log("getToken error:", { error });
  }
};

const displayCachedToken = async () => {
  const cachedToken = await getToken();
  console.log({ cachedToken });
};

// Uncomment the line below to save the fake token
// saveToken(FAKE_TOKEN);
displayCachedToken();
"Without success" means no result: I set the data in Safari but could not read it in the standalone PWA.
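For debugging, this is roughly how I check in each context (Safari tab and installed PWA) whether the cache entry is visible at all; just a sketch using the standard Cache Storage API:
// Debugging sketch: run in both the Safari tab and the standalone PWA
// to see whether the "auth" cache and the fake endpoint entry are visible.
const inspectAuthCache = async () => {
  const cacheNames = await caches.keys();
  console.log("Visible caches:", cacheNames);

  const cache = await caches.open("auth");
  const cachedResponse = await cache.match("/fake-endpoint");
  console.log("Cached auth entry found:", cachedResponse !== undefined);
};

inspectAuthCache();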

Express - how do I use sharp with multer?

I need help with using sharp. I want my images to be resized when they are uploaded, but I can't seem to get this right.
router.post("/", upload.single("image"), async (req, res) => {
  const { filename: image } = req.file;
  await sharp(req.file.path)
    .resize(300, 200)
    .jpeg({ quality: 50 })
    .toFile(path.resolve(req.file.destination, "resized", image));
  fs.unlinkSync(req.file.path);
  res.send("sent");
});
sharp accepts either a file path or a Buffer as input. Instead of resizing the image after it has been saved, resize it before saving: to implement this, use multer.memoryStorage() as the storage so the upload is available as a buffer in memory.
const multer = require('multer');
const sharp = require('sharp');

// Keep the upload in memory so req.file.buffer is available.
const storage = multer.memoryStorage();

const filter = (req, file, cb) => {
  if (file.mimetype.split("/")[0] === 'image') {
    cb(null, true);
  } else {
    cb(new Error("Only images are allowed!"));
  }
};

exports.imageUploader = multer({
  storage,
  fileFilter: filter
});

app.post('/', imageUploader.single('photo'), async (req, res, next) => {
  // req.file includes the buffer.
  // Note: with memoryStorage there is no req.file.filename,
  // so use req.file.originalname (or generate your own name).
  // path: where to store the resized photo
  const path = `./public/img/${req.file.originalname}`;
  // toFile() stores the resized image on disk
  await sharp(req.file.buffer).resize(300, 300).toFile(path);
  next();
});
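If you don't need the resized file on disk at all, the same idea also works buffer-in/buffer-out; here is a rough sketch (the /preview route and quality value are just examples):
// Sketch: resize entirely in memory and send the result back,
// without writing anything to disk.
app.post('/preview', imageUploader.single('photo'), async (req, res) => {
  const resizedBuffer = await sharp(req.file.buffer)
    .resize(300, 300)
    .jpeg({ quality: 80 })
    .toBuffer();

  res.type('image/jpeg').send(resizedBuffer);
});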

Content.once is not a function

I am trying to push a file to IPFS, and I have converted it to a Buffer. I got this error: "content.once is not a function".
I am using this library in Node.
var Buffer = require('buffer/').Buffer;
const doc = new jsPDF();
doc.fromHTML('test', 10, 10);
var convertedBuffer = Buffer.from(doc.output('arraybuffer'));
Then, I take the convertedBuffer and pass it to the IPFS api.
Any idea?
Updated test:
I have successfully pushed a file to IPFS via the API with the code below.
const filename = '/home/administrator/Downloads/5HP8LWKHLV.pdf';
this.ipfsApi = ipfsApi('localhost', '5001');
let readablestream = fs.createReadStream(filename);
readablestream.on('readable', () => {
  let result = readablestream.read();
  console.log(result);
  if (result) {
    this.ipfsApi.files.add(result, function(err, files) {
      if (err) {
        res.json('err');
        console.log(err);
      }
      res.json(files);
    });
  }
});
But when I take the ArrayBuffer from doc.output, convert it to a Buffer object, and push it to IPFS, it fails. Please see below:
var _buffer = Buffer.from(req.buffer);
console.log('Converted to buffer:' + _buffer);
this.ipfsApi = ipfsApi('localhost', '5001');
this.ipfsApi.files.add(_buffer, function(err, files) {
  if (err) {
    console.log(err);
    res.status(500).json('err');
  } else {
    res.json(files);
  }
});
Thank you
Wrapping your buffer in Buffer.from(your_buffer) before doing the IPFS push works:
ipfs.files.add(Buffer.from(put_your_buffer_here), (error, result) => {
  if (error) {
    console.error(error);
    return;
  }
  console.log("upload is successful");
});
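Applied to the jsPDF output from the question, it would look roughly like this (a sketch; api and doc are assumed to be set up as in the question):
// Sketch: take the jsPDF ArrayBuffer, wrap it with Buffer.from,
// and hand the resulting Buffer to the IPFS API.
const api = ipfsApi('localhost', '5001');
const pdfBuffer = Buffer.from(doc.output('arraybuffer'));

api.files.add(pdfBuffer, (error, files) => {
  if (error) {
    console.error(error);
    return;
  }
  console.log('Added to IPFS:', files);
});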

MediaRecorder Blob to file in an electron app

I have an electron app that has very simple desktop capturing functionality:
const {desktopCapturer} = require('electron')
const fs = require('fs');

var recorder;
var chunks = [];
var WINDOW_TITLE = "App Title";

function startRecording() {
  desktopCapturer.getSources({ types: ['window', 'screen'] }, function(error, sources) {
    if (error) throw error;
    for (let i = 0; i < sources.length; i++) {
      let src = sources[i];
      if (src.name === WINDOW_TITLE) {
        navigator.webkitGetUserMedia({
          audio: false,
          video: {
            mandatory: {
              chromeMediaSource: 'desktop',
              chromeMediaSourceId: src.id,
              minWidth: 800,
              maxWidth: 1280,
              minHeight: 600,
              maxHeight: 720
            }
          }
        }, handleStream, handleUserMediaError);
        return;
      }
    }
  });
}

function handleStream(stream) {
  recorder = new MediaRecorder(stream);
  chunks = [];
  recorder.ondataavailable = function(event) {
    chunks.push(event.data);
  };
  recorder.start();
}

function stopRecording() {
  recorder.stop();
  toArrayBuffer(new Blob(chunks, {type: 'video/webm'}), function(ab) {
    var buffer = toBuffer(ab);
    var file = `./test.webm`;
    fs.writeFile(file, buffer, function(err) {
      if (err) {
        console.error('Failed to save video ' + err);
      } else {
        console.log('Saved video: ' + file);
      }
    });
  });
}

function handleUserMediaError(e) {
  console.error('handleUserMediaError', e);
}

function toArrayBuffer(blob, cb) {
  let fileReader = new FileReader();
  fileReader.onload = function() {
    let arrayBuffer = this.result;
    cb(arrayBuffer);
  };
  fileReader.readAsArrayBuffer(blob);
}

function toBuffer(ab) {
  let buffer = new Buffer(ab.byteLength);
  let arr = new Uint8Array(ab);
  for (let i = 0; i < arr.byteLength; i++) {
    buffer[i] = arr[i];
  }
  return buffer;
}

// Record for 3.5 seconds and save to disk
startRecording();
setTimeout(function() { stopRecording() }, 3500);
I know that to save the MediaRecorder output, I need to read the Blob into an ArrayBuffer, then copy that into a normal Buffer for the file to be saved.
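(As an aside, I think the byte-copy loop in toBuffer could be replaced with Buffer.from, assuming the Node bundled with Electron supports it; rough sketch:)
// Sketch of a shorter toBuffer, assuming Buffer.from(arrayBuffer) is available.
// It returns a Buffer that views the same memory as the ArrayBuffer.
function toBuffer(ab) {
  return Buffer.from(ab);
}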
However, where this seems to be failing for me is combining the chunks into a single Blob. When the chunks are added into a single Blob, it's like they just disappear: the new Blob is empty, and every data structure they are copied into afterwards is also completely empty.
Before creating the Blob, I know I have valid Blobs in the chunks array.
(Screenshots of console.log(chunks) before the new Blob(chunks, { ... }) call, and of console.log(ab) for the result, omitted: chunks is populated, while the result is empty.)
I'm completely stumped. All the reference tutorials and other SO answers I can find basically follow this flow. What am I missing?
Electron version: 1.6.2
That can't work as written: you don't wait for the final data to arrive in stopRecording. recorder.stop() is asynchronous, and the last dataavailable event fires before the recorder's stop event, so the Blob has to be built in the onstop handler. You need to change your stopRecording function to the following:
function stopRecording() {
  var save = function() {
    console.log(chunks);
    toArrayBuffer(new Blob(chunks, {type: 'video/webm'}), function(ab) {
      console.log(ab);
      var buffer = toBuffer(ab);
      var file = `./videos/example.webm`;
      fs.writeFile(file, buffer, function(err) {
        if (err) {
          console.error('Failed to save video ' + err);
        } else {
          console.log('Saved video: ' + file);
        }
      });
    });
  };
  // Wait for the recorder to flush its final dataavailable event
  // before reading the chunks and building the Blob.
  recorder.onstop = save;
  recorder.stop();
}
This problem literally just fixed itself today without me changing anything. I'm not sure what about my system changed (other than a reboot) but it's now working exactly as it should.
