Error: read ECONNRESET at TLSWrap.onStreamRead when uploading files to S3 storage

When I try to upload a file and store it in an S3 location, I get the following error:
Error: read ECONNRESET
    at TLSWrap.onStreamRead (internal/stream_base_commons.js:205:27)
Is this a version problem or a bug? Here is my server code:
var express = require('express'),
    aws = require('aws-sdk'),
    bodyParser = require('body-parser'),
    multer = require('multer'),
    multerS3 = require('multer-s3');

aws.config.update({
    secretAccessKey: 'xxxxxxxxxxxxxxxxxxxxxxxxx',
    accessKeyId: 'xxxxxxxxxxxxxxxx',
    region: 'us-east-1'
});

var app = express(),
    s3 = new aws.S3();

app.use(bodyParser.json());

var upload = multer({
    storage: multerS3({
        s3: s3,
        bucket: 'xxxxx',
        key: function (req, file, cb) {
            console.log(file);
            cb(null, file.originalname); // use Date.now() for unique file keys
        }
    })
});

// open in browser to see the upload form
app.get('/', function (req, res) {
    res.sendFile(__dirname + '/index.html');
});

// used by the upload form
app.post('/upload', upload.array('upl', 1), function (req, res, next) {
    res.send("Uploaded!");
});

app.listen(3000, function () {
    console.log('Example app listening on port 3000!');
});
And here is the index.html file:
<form method="post" enctype="multipart/form-data" action="/upload">
    <input type="file" name="upl"/>
    <input type="submit"/>
</form>
And the package versions:
"dependencies": {
"aws-sdk": "^2.753.0",
"body-parser": "^1.19.0",
"express": "^4.17.1",
"multer": "^1.4.2",
"multer-s3": "^2.9.0"
}
And my Node and npm versions:
node: 12.16.1
npm: 6.13.4
Can anyone help me solve this problem?

I was also facing this issue, and after a lot of analysis I found that it was caused by the antivirus installed on my machine, specifically Sophos.
You can find this issue discussed at https://community.sophos.com/intercept-x-endpoint/f/discussions/134136/sophos-network-threat-detection-is-blocking-cypress-automation-tool
To solve it, try executing your test cases in the Electron browser, or contact your admin team for the Sophos password, log in to Sophos, go to Settings, and disable 'Network Threat Protection'.

This issue may be related to this bug in Node.js. One thing you can try is putting res.end(); after your res.sendFile(...) and res.send(...) calls. If that doesn't work, you may need to register a handler for the process's uncaughtException event, or wait and see how the Node community ends up working around the issue.
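For example, a minimal last-resort handler might look like the sketch below (an assumption on my part, not a confirmed fix; swallowing ECONNRESET globally can hide real failures, so at least log it):

process.on('uncaughtException', function (err) {
    if (err.code === 'ECONNRESET') {
        // Assumption: we only want to survive connection resets; log and carry on.
        console.error('Connection reset by peer:', err);
    } else {
        // Rethrow anything we did not explicitly expect, so the process still crashes loudly.
        throw err;
    }
});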

I had a similar issue using Node 16.7.0:
Error: read ECONNRESET
    at TLSWrap.onStreamRead (node:internal/stream_base_commons:220:20) {
  errno: -54,
  code: 'ECONNRESET',
  syscall: 'read'
}
I was able to solve it by adding req.end() after attaching the handler for the 'error' event.
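For reference, a minimal sketch of what that looks like (the host and path are placeholders, not from my actual code):

const https = require('https');

const req = https.request({ hostname: 'example.com', path: '/upload', method: 'POST' }, (res) => {
    res.on('data', (chunk) => console.log(chunk.toString()));
});

// Attach the error handler first, then close the request stream.
req.on('error', (err) => console.error('Request failed:', err));
req.end();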

I had the same issue when trying to run npm install in an Angular project using Node version 16.13.2.
When I switched the Node version to 12.18.1, I was able to run the command with no problems.

Do you have a Docker instance that is crashing and restarting in a loop? Sometimes that can cause random connection resets.

Related

Amplify Video - How to upload a video to the "Input" bucket with swift?

I have an iOS project using Amplify as a backend. I have also incorporated Amplify Video in the hope of supporting video-on-demand. After adding Amplify Video to the project, an "Input" and an "Output" bucket are generated. These appear outside of my project environment when visualised via the Amplify Console, and can only be accessed by navigating to the AWS S3 console. My question is: how do I upload my videos via Swift to the "Input" bucket via Amplify (or do I not)? The code I have below uploads the video to the S3 bucket within the project environment. There is next to no support for Amplify Video for iOS (Amplify Video Documentation).
if let vidData = self.convertVideoToData(from: srcURL) {
    let key = "myKey"
    //let options = StorageUploadDataRequest.Options.init(accessLevel: .protected)
    Amplify.Storage.uploadData(key: key, data: vidData) { (progress) in
        print(progress.fractionCompleted)
    } resultListener: { (result) in
        switch result {
        case .success(_):
            print("upload success!")
        case .failure(let error):
            print(error.errorDescription)
        }
    }
}
I'm facing the same issue. As far as I can tell, the iOS Amplify library's amplifyconfiguration.json is limited to using one storage spec under S3TransferUtility.
I'm in the process of solving this issue myself, but the quick solution is to modify the created AWS video resources to run off the same bucket (input and output). Be warned: I'm an iOS engineer, not backend, and am only getting familiar with AWS.
Solution as follows:
The input bucket the Amplify Video plugin created has 4 event notifications under the Properties tab. Each of these kicks off a VOD-inputWatcher Lambda function. Copy these 4 notifications to your original bucket.
The output bucket has two event notifications; copy those to the original bucket as well.
Try the process now: drop a video into your bucket manually. It will fail, but we'll see progress: the MediaConvert job is kicked off, but it will tell you it failed because it didn't have permissions to read the files in your bucket, something like Unable to open input file, Access Denied. Let's solve this:
Go to the input lambda function and add this function:
async function enableACL(eventObject) {
    console.log(eventObject);
    const objectKey = eventObject.object.key;
    const bucketName = eventObject.bucket.name;
    const params = {
        Bucket: bucketName,
        Key: objectKey,
        ACL: 'public-read',
    };
    console.log(`params: ${JSON.stringify(params)}`);
    s3.putObjectAcl(params, (err, data) => {
        if (err) {
            console.log("failed to set ACL");
            console.log(err);
        } else {
            console.log("successfully set acl");
            console.log(data);
        }
    });
}
Now call it from the event handler, and don't forget to add const s3 = new AWS.S3({}); (along with const AWS = require('aws-sdk'); if it isn't already there) at the top of the file:
exports.handler = async (event) => {
    // Set the region
    AWS.config.update({ region: event.awsRegion });
    console.log(event);
    if (event.Records[0].eventName.includes('ObjectCreated')) {
        await enableACL(event.Records[0].s3);
        await createJob(event.Records[0].s3);
        const response = {
            statusCode: 200,
            body: JSON.stringify(`Transcoding your file: ${event.Records[0].s3.object.key}`),
        };
        return response;
    }
};
Try the process again. The Lambda will fail; you can see it in the Lambda's CloudWatch logs: failed to set ACL. INFO AccessDenied: Access Denied at Request.extractError. To fix this we need to give the input Lambda function permissions on S3.
Do that by navigating to the Lambda function's Configuration / Permissions and finding the Role. Open it in IAM and add full S3 access. Not ideal, but again, I'm just trying to make this work; it would probably be better to specify the exact bucket and the correct actions only. Any help regarding proper roles is greatly appreciated :)
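For anyone trying to scope it down, a tighter policy might look something like this sketch (the bucket name is a placeholder, and the action list is my guess at what the function needs, not a verified minimum):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:PutObjectAcl"],
            "Resource": "arn:aws:s3:::your-bucket-name/*"
        }
    ]
}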
Repeat the same for the output Lambda function's role, giving it the right S3 permissions as well.
Try uploading a file again. At this point, if you run into this error:
failed to set ACL. INFO NoSuchKey: The specified key does not exist. at Request.extractError
it's because the objects in your bucket are in the protected folder. Try using the public folder instead (in the iOS lib you'll have to use StorageAccessLevel.guest permissions to access it).
Now drop a file in the public folder. You should see the MediaConvert job kick off again. It will still fail (check in MediaConvert / Jobs), saying it doesn't have permissions to write to the S3 bucket: Unable to write to output file .. . You can fix this by going back to the input Lambda function; this is the code that passes the role to the MediaConvert job:
const jobParams = {
    JobTemplate: process.env.ARN_TEMPLATE,
    Queue: queueARN,
    UserMetadata: {},
    Role: process.env.MC_ROLE,
    Settings: jobSettings,
};
await mcClient.createJob(jobParams).promise();
Go to the input Lambda function's Configuration / Environment Variables. The function uses the MC_ROLE field to provide the role name to the MediaConvert job. Copy the role name and look it up in IAM, then modify its permissions by adding the right S3 access to your bucket for that role.
If you try it one more time, the output should appear right next to your input file.
In order to read the s3://public/{userIdentityId}/{videoName}/{videoName}{quality}..m3u8 file using the current Amplify.Storage.downloadFile(key: {key}, ...) function in iOS, you'll probably have to attach the right path to the key and remove the .mp4 extension. Let me know if you're facing any problems; I'm sorting this out now as well.

Azure IoT Device: Type error in client.js

I am trying to get an ARM device connected to Azure IoT Hub. I chose Node.js and got some sample code to get the device connected. I added the required npm packages, such as azure-iot-device, azure-iot-common, and azure-iot-http-base.
Within the code, there is one line which causes an error.
The line: client.sendEvent(message, printResultFor('send'));
After this, I get the following message on the debugging console:
\NodejsWebApp1\node_modules\azure-iot-device\lib\client.js:596
return new Client(new transportCtor(authenticationProvider), null, new blob_upload_1.BlobUploadClient(authenticationProvider));
^
TypeError: transportCtor is not a function
at Function.Client.fromConnectionString
(C:\Users\InterestedGuy\source\repos\NodejsWebApp1\NodejsWebApp1\node_modules\azure-iot-device\lib\client.js:596:27)
at sendmsg (C:\Users\InterestedGuy\source\repos\NodejsWebApp1\NodejsWebApp1\server.js:123:32)
at Server. (C:\Users\InterestedGuy\source\repos\NodejsWebApp1\NodejsWebApp1\server.js:48:9)
at emitTwo (events.js:87:13)
at Server.emit (events.js:172:7)
at HTTPParser.parserOnIncoming [as onIncoming] (_http_server.js:529:12)
at HTTPParser.parserOnHeadersComplete (_http_common.js:88:23)
Press any key to continue...
My first guess was that I was missing a library, so I searched the web for where transportCtor should have been defined, but with no success.
So the simple question is: where should this function be defined? I would expect it to be part of the Azure IoT SDK, but I could not find it. Since the module client.js from azure-iot-device is reporting the error, I expect it to be somewhere within the SDK, but where?
Thanks for any advice.
You should install the azure-iot-device-http package to communicate with Azure IoT Hub from any device over HTTP 1.1. You can use this command to get the latest version:
npm install azure-iot-device-http@latest
The following code is a tutorial showing how to use this package:
var clientFromConnectionString = require('azure-iot-device-http').clientFromConnectionString;
var Message = require('azure-iot-device').Message;

var connectionString = '[IoT Hub device connection string]';
var client = clientFromConnectionString(connectionString);

var connectCallback = function (err) {
    if (err) {
        console.error('Could not connect: ' + err);
    } else {
        console.log('Client connected');
        var message = new Message('some data from my device');
        client.sendEvent(message, function (err) {
            if (err) console.log(err.toString());
        });
        client.on('message', function (msg) {
            console.log(msg);
            client.complete(msg, function () {
                console.log('completed');
            });
        });
    }
};

client.open(connectCallback);
By the way, for this tutorial you also need to install the azure-iot-device package (npm install azure-iot-device@latest).

Unable to replicate database in pouchDB

I'm building a React Native application using CouchDB 2.1.1. The PouchDB entries in package.json look like this:
"pouchdb": "^6.3.4",
"pouchdb-react-native": "^6.3.4",
Replication is as shown below:
const localDB = new PouchDB('employee');
const remoteDB = new PouchDB('http://username:password@localhost:5984/employee');

localDB.replicate.from(
    remoteDB,
    (err, response) => {
        if (err) {
            return console.log(err);
        }
    },
);
I get the following error:
{"code":"ETIMEDOUT","status":0,"result":{"ok":false,"start_time":"...","docs_read":0,"docs_written":0,"doc_write_failures":0,"errors":[],"status":"aborting","end_time":"...","last_seq":0}}
This almost always works fine when I run the app in debug mode. I tried the ajax timeout shown in PouchDB ETIMEDOUT error, but that didn't work. Is there something in my code that I'm supposed to look at? Please help.
Had the same issue; the following fixed it for me:
Use your PC's IP address instead of localhost.
Configure your firewall to allow connections on port 5984, or just disable it (not recommended).
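For example, the remote URL would change to something like this (a sketch; 192.168.1.10 is a placeholder for your PC's LAN address):

// The device or emulator cannot resolve your PC's "localhost", so point at its LAN IP instead.
const remoteDB = new PouchDB('http://username:password@192.168.1.10:5984/employee');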

Firebase Cloud function listener stop after 1-2 hours

var admin = require("firebase-admin");
var serviceAccount = require(__dirname + "/myserviceaccount.json");

admin.initializeApp({
    credential: admin.credential.cert(serviceAccount),
    databaseURL: "https://myproject.firebaseio.com"
});

// Note: db needs to be defined for the listener below:
var db = admin.database();

db.ref('myref').on("child_changed", function(snapshot) {
    ...
});
package.json
{
    "name": "listener",
    "version": "0.0.1",
    "dependencies": {
        "firebase-admin": "^5.2.1"
    }
}
It works fine until 1-2 hours later, then stops with no error logged. Can anyone solve this problem?
The code you shared doesn't start any Cloud Functions as far as I can see. I'm surprised that it deploys at all, but it definitely won't start a reliable listener in Cloud Functions for Firebase.
To write code that correctly functions in the Cloud Functions environment, be sure to follow the instructions here: https://firebase.google.com/docs/functions/get-started.
Specifically: the correct syntax to set up code in Cloud Functions that is triggered by updates to a database path is:
exports.listenToMyRef = functions.database.ref('/myref/{pushId}')
    .onUpdate(event => {
        // Log the current value that was written.
        console.log(event.data.val());
        return true;
    });
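Note that on firebase-functions v1.0 and later, the onUpdate handler receives a (change, context) pair instead of a single event object; a minimal sketch of the equivalent:

exports.listenToMyRef = functions.database.ref('/myref/{pushId}')
    .onUpdate((change, context) => {
        // change.after holds the snapshot after the write.
        console.log(change.after.val());
        return true;
    });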

Download ZIP file in react-native

I'm writing an app using React Native, and I will build it for both Android and iOS.
I have been trying to download a ZIP file using React Native, but I can't get it to work. After I have downloaded the file, my plan is to unzip it and then store it using AsyncStorage.
But I keep getting the error below:
[RCTNetworking.m:330] Received data was not a string, or was not a recognised encoding.
I have tried various settings for my request, but I guess I am simply missing something; the code currently looks like:
fetch('somewhere.path/file.zip', {
    method: 'GET',
    headers: {
        'Accept-Encoding': 'application/zip'
    },
})
    .then((response) => {
        console.log("Success");
    })
    .catch((error) => {
        console.log("Error");
    })
    .done();
Success gets printed, but the response data does not contain the ZIP file's data.
If it helps, I am debugging using Xcode and the simulator.
If anybody has any ideas, please help me out! :)
Thanks in advance,
Yon
I also wrote an app that downloads some ZIP files and unzips them. For the download function, I use a plugin called react-native-fetch-blob. Code example:
import RNFetchBlob from 'react-native-fetch-blob';
...
RNFetchBlob.config({
    fileCache: true,
    path: path + '/file.zip'
})
    .fetch('GET', 'http://domain/file.zip')
    .progress((received, total) => { console.log('progress', received / total) })
    .then((res) => {
        // the temp file path
        console.log('The file saved to ', res.path());
    });
...
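For the unzip step, here is a sketch using react-native-zip-archive (an assumption on my part; I haven't verified the exact API against the library's docs, and it assumes the archive was saved to path + '/file.zip' as above):

import { unzip } from 'react-native-zip-archive';

// Unpack the downloaded archive into a sibling folder.
unzip(path + '/file.zip', path + '/unzipped')
    .then((unzippedPath) => console.log('Unzipped to', unzippedPath))
    .catch((err) => console.log('Unzip failed', err));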
Thanks,
