Node.js - Program Terminates Before Receiving Callback

I'm using WebdriverIO v4 to create HTML reports of various tests passing/failing. I gather the results of the tests via multiple event listeners (e.g. this.on('test:start'), this.on('suite:end'), etc.). The final event is this.on('end'), which is called when all of the tests have completed execution. It is here where the test results are sorted by operating system, browser, etc.
The problem seems to be that my program terminates before the upload function completes or its callback is invoked.
Here is the gist of my code:
let aws = require('aws-sdk');

aws.config.update({
    // Censored keys for security
    accessKeyId: '*****',
    secretAccessKey: '*****',
    region: 'us-west-2'
});

let s3 = new aws.S3({
    apiVersion: "2006-03-01",
});

/*
 * Here is where the program generates HTML files
 */

// The key and data fields are not actually asterisks; this just shows that both are initialized at this point
let key = "****";
let data = "****";

s3.upload({
    Bucket: 'html',
    Key: key,
    Body: data
}, function (err, data) {
    if (err) {
        console.log("Error: ", err);
    }
    if (data) {
        console.log("Success: ", data.Location);
    }
}).on('httpUploadProgress', event => {
    console.log(`Uploaded ${event.loaded} out of ${event.total}`);
});
When the tests are run, the results are displayed, and then the program terminates. The program does not wait for the callback to return, so no error or success message is displayed. The files are not uploaded to S3.
I know that the s3.upload() works because if I run the code earlier in the project with dummy data, the upload succeeds. But at the end of my project, the program terminates before the upload finishes.
How can I fix this issue? How can I guarantee that the files are uploaded to S3 before the program terminates? How can I see the callback before termination? Thank you!
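For what it's worth, the usual pattern here is to hold the process open until the upload has settled, e.g. via the aws-sdk v2 .promise() helper. A minimal sketch, assuming the end handler can be made to wait (uploadReport and its arguments are illustrative, not the asker's actual code):

const aws = require('aws-sdk');
const s3 = new aws.S3({ apiVersion: '2006-03-01' });

// The open socket of the in-flight request keeps the event loop alive,
// so Node cannot exit until S3 responds - unless something calls
// process.exit() first, in which case the end hook itself must wait
// for this promise to settle.
async function uploadReport(key, data) {
    const result = await s3.upload({ Bucket: 'html', Key: key, Body: data }).promise();
    console.log('Success: ', result.Location);
}

uploadReport(key, data).catch(err => console.log('Error: ', err));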

Related

Is Transfer Acceleration working? The response Location and the object URL are different

I'm uploading a video to S3 using aws-sdk in a React environment, and I use an accelerated endpoint for faster data transfers. The endpoint is bucket-name.s3-accelerate.amazonaws.com, and I changed the option to 'Enabled' for accelerated transfer in the bucket properties.
Below is my code for uploading file objects to s3.
import AWS from "aws-sdk";
require("dotenv").config();

const AWS_ACCESS_KEY = process.env.REACT_APP_AWS_ACCESS_KEY;
const AWS_SECRET_KEY = process.env.REACT_APP_AWS_SECRET_KEY;
const BUCKET = process.env.REACT_APP_BUCKET_NAME;
const REGION = process.env.REACT_APP_REGION;

AWS.config.update({
    accessKeyId: AWS_ACCESS_KEY,
    secretAccessKey: AWS_SECRET_KEY,
    region: REGION,
    useAccelerateEndpoint: true, // ----> the option is here
});

async function uploadFileToS3(file) {
    const params = {
        Bucket: BUCKET,
        Key: file.name,
        ContentType: 'multipart/form-data',
        Body: file
    };
    const accelerateOption = {
        Bucket: BUCKET,
        AccelerateConfiguration: { Status: 'Enabled' },
        ExpectedBucketOwner: process.env.REACT_APP_BUCKET_OWNER_ID,
    };
    const s3 = new AWS.S3();
    try {
        s3.putBucketAccelerateConfiguration(accelerateOption, (err, data) => {
            if (err) console.log(err);
            else console.log(data, 'data put accelerate'); // ----> this is just an empty object {}
        });
        s3.upload(params)
            .on("httpUploadProgress", progress => {
                const { loaded, total } = progress;
                const progressPercentage = parseInt((loaded / total) * 100);
                console.log(progressPercentage);
            })
            .send((err, data) => {
                console.log(data, 'data data');
            });
    } catch (err) {
        console.log(err);
    }
}
The Location property of the data object definitely contains s3-accelerate (from console.log):
{
    Bucket: "newvideouploada2f16b1e1d6a4671947657067c79824b121029-dev"
    ETag: "\"d0b40e4afca23137bee89b54dd2ebcf3-8\""
    Key: "Raw Run __ Race Against the Storm.mp4"
    Location: "https://newvideouploada2f16b1e1d6a4671947657067c79824b121029-dev.s3-accelerate.amazonaws.com/Raw+Run+__+Race+Aga
}
However, the object URL shown in the uploaded video's properties is different. Is this how it's supposed to be? Am I using Transfer Acceleration the wrong way?
I saw in the documentation that AWS says to use putBucketAccelerateConfiguration, but when I console.log the result, nothing comes back.
Please let me know how to use Transfer Acceleration with the JavaScript aws-sdk.
If you are running this code on some AWS compute (EC2, ECS, EKS, Lambda) and the bucket is in the same region as your compute, consider using VPC Gateway endpoints for S3. More information here. If the compute and the bucket are in different regions, consider using VPC Endpoints for S3 with inter-region VPC peering. Note: VPC Gateway endpoints are free, while VPC Endpoints are not.
After you enable BucketAccelerate it takes at least half an hour to take effect. You don't need to call this every time you upload a file, unless you are also suspending bucket acceleration after you are done.
Bucket acceleration helps when you want to use the AWS backbone network to upload data faster (maybe the user is in region 'A' and the bucket is in region 'B', or you want to upload bigger files, in which case data goes to the nearest edge location and then travels over the AWS backbone network). You can use this tool to check the potential improvement in terms of speed for various regions.
Also, there is an additional cost when you use this feature. Check the Data Transfer section on the S3 pricing page.
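As a quick sanity check, you can ask S3 whether acceleration is actually active on the bucket. A minimal sketch, assuming aws-sdk v2 and credentials already configured (BUCKET as in the question's code):

const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Responds with { Status: 'Enabled' } or { Status: 'Suspended' }; an empty
// object means acceleration has never been configured on this bucket.
s3.getBucketAccelerateConfiguration({ Bucket: BUCKET }, (err, data) => {
    if (err) console.log(err);
    else console.log('Accelerate status:', data.Status);
});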

How to add and execute my own lambda function within the serverless-framework?

How do I add my own lambda without going to AWS console manually and, most importantly, how do I call it from my React app?
Currently, if I want to execute my own lambda function, I go to AWS console and create it there manually. Clearly, there is a way to do this locally in VS Code, since serverless-framework has created its functions already using the fullstack-app deployment.
This is my current Lambda function (send an email from contact form using Amazon SES) created in AWS using console.
var aws = require('aws-sdk');
var ses = new aws.SES({ region: 'us-east-1' });

exports.handler = (event, context, callback) => {
    var params = {
        Destination: {
            ToAddresses: [`${event.toEmail}`]
        },
        Message: {
            Body: {
                Html: { Data: `${event.body}` }
            },
            Subject: { Data: `${event.subject}` }
        },
        Source: `${event.fromEmail}`
    };
    ses.sendEmail(params, function (err, data) {
        callback(null, { err: err, data: data });
        if (err) {
            console.log(err);
            context.fail(err);
        } else {
            console.log(data);
            context.succeed(event);
        }
    });
};
I have created a REST API for it in AWS and I call it from my React app with axios:
axios.post(`https://xxxxxxxx.execute-api.us-east-1.amazonaws.com/default/contactFormSend`, email)
    .then(res => { console.log(res); });
My goal is to not create the lambda function manually in AWS console, but write it locally using serverless-framework architecture and find a way to call it.
I have looked everywhere, but I feel like I have missed something very important during my learning about serverless-framework architecture.
How do I add my own lambda without going to AWS console manually?
I hope you have a serverless.yml file with your function config. Here is a template with the possible configs: https://www.serverless.com/framework/docs/providers/aws/guide/serverless.yml/
If everything is set up, deploying is super easy: just run serverless deploy.
https://www.serverless.com/framework/docs/providers/aws/guide/deploying/
Here is a very simple example from the serverless examples repo: https://github.com/serverless/examples/tree/master/aws-node-rest-api
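For the SES function in the question, a minimal serverless.yml could look roughly like this (a sketch; the service name, handler path, and IAM statement are assumptions to adapt):

service: contact-form

provider:
  name: aws
  runtime: nodejs12.x
  region: us-east-1
  iamRoleStatements:
    - Effect: Allow
      Action:
        - ses:SendEmail
      Resource: "*"

functions:
  contactFormSend:
    handler: handler.handler   # handler.js exporting "handler"
    events:
      - http:
          path: contactFormSend
          method: post
          cors: true

After serverless deploy, the framework prints the generated API Gateway URL, which is what the axios call above would point at.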
How do I call it from my React app?
You need an exposed public endpoint: either take the one generated by API Gateway directly, or create a custom domain and map it to your existing domain.

Multer-s3 Uploading empty files on mobile

I'm using Angular and multer-s3 to upload files from an Angular app to a Node server. Everything works well on desktop, but for some reason the uploaded file is corrupt when I upload the photo from my iPhone 7. I'm using the same image and running through the same flow on both devices but getting different results, so I'm assuming it's because of mobile?
Here's the alert I get when trying to open the S3 file on the mobile
The file “1519398514215-test.png” could not be opened because it is empty.
Here's my code
var aws = require('aws-sdk');
var path = require('path');
var path3 = path.join(__dirname, "../config/config-aws.json");
var multer = require('multer');
var multerS3 = require('multer-s3');
var request = require('request');

aws.config.loadFromPath(path3);
var s3 = new aws.S3();

var fileName = '';
var uploadM = multer({
    storage: multerS3({
        s3: s3,
        bucket: 'XXXX',
        acl: 'public-read',
        metadata: function (req, file, cb) {
            cb(null, { fieldName: file.fieldname + '.png' });
        },
        key: function (req, file, cb) {
            fileName = Date.now().toString() + "-" + file.originalname + '.png';
            cb(null, fileName);
        }
    })
});

router.post('/', uploadM.array('photos', 3), function (req, res) {
    if (res.error) {
        console.log(res.error.stack); // was console.log(error.stack); `error` is undefined here
        return res.status(400).json({
            message: "Error",
            error: res.error
        });
    }
    const url = 'https://s3-us-west-2.amazonaws.com/XXXX/' + fileName;
    return res.status(200).json({
        fileName: url
    });
});
And here's my client-side
sendImage() {
    const formData: FormData = new FormData();
    this.removeObjectFromCanvas('polygon');
    if (!fabric.Canvas.supports('toDataURL')) {
        alert('This browser doesn\'t provide means to serialize canvas to an image');
    } else {
        // window.open(this.canvas.toDataURL('png'));
        const image = new Image();
        image.src = this.canvas.toDataURL('png');
        const blob = this.dataURItoBlob(image.src);
        const file = new File([blob], 'test.png');
        formData.append('photos', file, 'test');
        this.postFile(formData);
    }
}

postFile(file) {
    this.fileService.post(file)
        .subscribe(data => {
        }, error => {
            console.log(error);
        });
}
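For reference, dataURItoBlob isn't shown in the question; a common implementation looks like this (a sketch of the usual conversion, not necessarily the asker's actual helper):

// Typical dataURI -> Blob conversion: decode the base64 payload and copy
// it byte-by-byte into a typed array.
dataURItoBlob(dataURI) {
    const byteString = atob(dataURI.split(',')[1]);
    const mimeString = dataURI.split(',')[0].split(':')[1].split(';')[0];
    const ia = new Uint8Array(byteString.length);
    for (let i = 0; i < byteString.length; i++) {
        ia[i] = byteString.charCodeAt(i);
    }
    return new Blob([ia], { type: mimeString });
}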
UPDATE **********
So I found out you can debug on mobile. It looks like the buffer I am sending has data in it; my first thought was that the buffer was not being sent.
UPDATE **********
Still can't figure this out. I've done some research and it's possible it has something to do with formData and append? But as you can see by the image above, both seem to be fine. Will continue to research...
UPDATE **********
Definitely uploading empty files, but it's only on mobile? Also, I checked the formData prior to sending it to the node server, and it seems to have the correct data in it.
UPDATE **********
OK, even weirder experience. It seems multer-s3 is uploading empty files, but when I take the file on the server side and return it to the client side, then read that file and display it, the image is displayed perfectly. So the formData is not the issue; it's something with multer-s3, I'm assuming?
UPDATE **********
I forgot to mention I am using fabricjs and getting the image from the canvas. I read in some places there may be an issue there, but as I said above, when I send the file to the server and send it back to the client, reading the file displays the image perfectly.
UPDATE **********
I tried adding contentType to the multer method, and now I'm receiving a 503 Service Unavailable error when running on mobile only. On desktop it is fine.
aws.config.loadFromPath(path3);

var file1;
var s3 = new aws.S3();
var fileName = '';

var uploadM = multer({
    storage: multerS3({
        s3: s3,
        bucket: 'rent-z',
        acl: 'public-read',
        contentType: function (req, file, cb) {
            cb(null, 'image/png');
        },
        metadata: function (req, file, cb) {
            console.log(file);
            cb(null, { fieldName: file.fieldname });
        },
        key: function (req, file, cb) {
            fileName = Date.now().toString() + "-" + file.originalname;
            file1 = file;
            cb(null, fileName);
        }
    })
}).array('photos', 1);
router.post('/', function (req, res) {
    uploadM(req, res, function (err) {
        if (err) {
            console.log(err);
            return res.status(400).json({
                message: "Error uploading to multer",
                error: err
            });
        }
        console.log('worked');
        if (res.error) {
            console.log(res.error.stack); // was console.log(error.stack); `error` is undefined here
            return res.status(400).json({
                message: "Error",
                error: res.error
            });
        }
        // fs.readFile(req.body, function (err, data) {
        const url = 'https://s3-us-west-2.amazonaws.com/rent-z/' + fileName;
        return res.status(200).json({
            fileName: url
        });
        // });
    });
});
I even tried multer-s3's automatic mime-type detection and that gave the same result.
**** Day 4
It's been 96 hours since I started debugging this problem, and no progress has been made. Still trying to figure out why it works on desktop and not on mobile. For anyone looking for a quick summary of the behavior:
1. User uploads image on the desktop
2. User places image on canvas
3. User scales image
4. User presses sendImage
5. This converts the image to a dataUri, then to a blob
6. The blob is added to a File, which is appended to formData
7. The formData is sent to a Node.js server, where the multer-s3 middleware uploads the file to S3 successfully
On mobile, the same flow fails at step 7: the file is uploaded but is empty.
Let me know if anyone has any ideas on how to continue.
I'll make this an "official" answer since this may work for your needs. Anytime I have an intricate issue like this, my first thought is often "I wonder if there is an API/SaaS/service out there that can abstract this for me." As you've found, file uploads are tricky, particularly when you start throwing in the myriad devices we have to deal with these days.
I won't mention any particular services, but googling "file upload saas" will generally get you the top industry players. For $25 - $50/month you can abstract file uploads to a very simple api call. Not only do you get time savings now, but (assuming you choose a solid provider) you get no more headaches regarding file uploads in the future. It's the SaaS's job to make sure file uploads work on a million different devices; it's the SaaS's job to make sure S3 integration works, even when S3's api changes; it's the SaaS's job to make sure the user sees a nice friendly message if their upload fails for some reason, etc. I get to spend my time building features for our app instead of worrying about whether or not file uploads work on the iPhone 47.
"But then I'm tied to a SaaS, and live at the whim of their prices and feature set" Ah, but you can minimize that problem. For many services we use, I like to make a wrapper/interface/whatever you'd like to call it. In the case of file uploads, I made an ES6 module: fileUploads.js
In this module, I have a method upload. What does this method do? It simply implements and abstracts the API of [fileupload SaaS X]. If in the future we want, or need, to change from SaaS X to SaaS Y, I only have to change one thing in our entire app: my fileUpload.js module.
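As an illustration, such a wrapper can be as small as this (a sketch; SaasClient and its upload call stand in for whichever provider SDK you pick):

// fileUploads.js - the only place in the app that knows which SaaS we use.
// SaasClient is a hypothetical provider SDK, not a real package name.
import SaasClient from 'some-upload-saas';

const client = new SaasClient({ apiKey: process.env.UPLOAD_SAAS_KEY });

// Swapping providers later means rewriting only this function.
export async function upload(file, options = {}) {
    const result = await client.upload(file, options);
    return { url: result.url, id: result.id };
}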

Stream file from S3 in lambda does not work

I want to load a file from S3 with line-separated values and push it into an array.
The following code works on my local machine, but does not work when executed as a Lambda function; the Lambda function times out (even if I bump the timeout up to 15 seconds).
Are the SDKs different? What am I missing here, since I get no error message at all besides the timeout?
Lambda env: Node 6.10
Permission to access S3 is set like this
"Statement": [{
"Effect": "Allow",
"Action": [
"s3:*"
],
"Resource": [
"arn:aws:s3:::mybucket",
"arn:aws:s3:::mybucket/*"
]
}]
Code looks like this
var AWS = require('aws-sdk');
var s3 = new AWS.S3({ region: 'eu-central-1' });

exports.index = function (event, context, callback) {
    var params = {
        Bucket: 'mybucket',
        Key: 'file.txt'
    };
    var urls = [];
    var stream = s3.getObject(params);
    stream.on('httpError', function (err) {
        console.log(err);
        throw err;
    });
    stream.on('httpData', function (chunk) {
        urls.push(chunk.toString());
    });
    stream.on('httpDone', function () {
        var urls2 = urls.join('\n\r');
        callback(null, urls2); // the first callback argument is reserved for an error
    });
    stream.send();
};
I got the following error executing the lambda via the AWS console:
{
    "errorMessage": "2017-07-04T18:25:20.271Z 19ab7138-60e6-11e7-9e1e-c318d929bc39 Task timed out after 15.00 seconds"
}
Thanks for any help!
A handler is required to invoke the Lambda function, and you need to set the handler name in the Lambda function configuration.
exports.handler = (event, context, callback) => {
    const bucket = event.Records[0].s3.bucket.name;
    const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
    const params = {
        Bucket: bucket,
        Key: key,
    };
    var stream = s3.getObject(params);
    ....
    stream.send();
}
exports.handler is invoked when the Lambda function triggers. Make sure you define the handler name (filename.handler) in the Lambda function configuration.
If you trigger this code on an S3 file upload, it will read the uploaded S3 file; you can change the bucket and key names to read any file that exists.
Follow the documentation: http://docs.aws.amazon.com/lambda/latest/dg/nodejs-prog-model-handler.html
This code works as expected. Thanks to @anand for verifying it.
The issue was related to VPC settings.
Unfortunately, a proper error message would have helped, but at the end of the day, lesson learned:
If you are running on a VPC and your lambda code should run but you get a timeout, better check your security and network settings =)
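For reference, the fix in this kind of setup is usually to give the VPC a gateway endpoint for S3 so the Lambda can reach S3 without a NAT. A sketch using aws-sdk v2 (the VPC and route table IDs are placeholders for your own):

var AWS = require('aws-sdk');
var ec2 = new AWS.EC2({ region: 'eu-central-1' });

// Creates an S3 gateway endpoint and adds routes for it to the given
// route table; gateway endpoints for S3 are free of charge.
ec2.createVpcEndpoint({
    VpcId: 'vpc-xxxxxxxx',
    ServiceName: 'com.amazonaws.eu-central-1.s3',
    RouteTableIds: ['rtb-xxxxxxxx']
}, function (err, data) {
    if (err) console.log(err);
    else console.log('Endpoint created:', data.VpcEndpoint.VpcEndpointId);
});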

How to upload a file to S3 via NodeJS and track progress in the browser?

I'm having some trouble wrapping my head around a couple of concepts when it comes to file management within NodeJS, and I wondered if anyone can point me in the right direction.
I have a queue of uploads that will happen simultaneously as the user needs (allowing them to pause or delete an upload at will), and I need the process to hit the Node server to resize the image or manipulate the video as needed. From there I need Node to upload the video or image to S3 using AWS-SDK (I set up a fakeS3 server for general sandbox testing), and as it is being uploaded I need to track the progress of that upload within the browser.
I am using React and server-side rendering with Node, so I figured there has to be a very straightforward way of handling this situation, but I cannot find anyone who has done this previously. Here are a couple of concepts I was messing with:
Server.js (core input)
server.use(express.static(path.join(__dirname, 'public')));
server.use(multer({
    dest: './public/temp',
    limits: {
        fieldNameSize: 50,
        files: 1,
        fields: 5,
        fileSize: 1024 * 1024
    },
    rename: (fieldname, filename) => {
        return filename;
    },
    onFileUploadStart: (file) => {
        console.log('Starting file upload process.');
    },
    inMemory: true
}));
server.use(cookieParser());
server.use(bodyParser.urlencoded({ extended: true }));
server.use(bodyParser.json());
Intermediate route (/upload or something)
export function uploadContent(req, res) {
    const config = {
        s3ForcePathStyle: true,
        accessKeyId: 'ACCESS_KEY_ID',
        secretAccessKey: 'SECRET_ACCESS_KEY',
        endpoint: new AWS.Endpoint('http://localhost:4567'), // TODO make live
    };
    const client = new AWS.S3(config);
    const params = {
        Key: 'Key',
        Bucket: 'Bucket',
        Body: fs.createReadStream(req.body.files[0])
    };
    client.upload(params, function uploadCallback(err, data) {
        console.log(err, data);
    });
}
This does not work due to body-parser conflicting with the route, and multer is having a hard time as well (I'm open to other suggestions). Any information on how this could be accomplished would be awesome. I'm not looking for full code, just another idea to get me on the right path. I appreciate any help!
You can use a socket channel between your browser and your Node.js route, and emit start, end, and progress events from the httpUploadProgress event:
var s3obj = new AWS.S3({ params: { Bucket: 'myBucket', Key: 'myKey' } });
s3obj.upload({ Body: body })
    .on('httpUploadProgress', function (evt) {
        console.log(evt);
        // Emit your events here
    })
    .send(function (err, data) { console.log(err, data); });
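For example, with socket.io (an assumption; any websocket layer works), the handler above could relay progress to the browser like this (uploadId and the event names are illustrative):

// Sketch: relay S3 upload progress to the browser over socket.io.
// Assumes an existing socket.io server `io` and a client that has
// joined a room named after uploadId.
function uploadWithProgress(io, uploadId, body) {
    var s3obj = new AWS.S3({ params: { Bucket: 'myBucket', Key: 'myKey' } });
    s3obj.upload({ Body: body })
        .on('httpUploadProgress', function (evt) {
            io.to(uploadId).emit('upload:progress', { loaded: evt.loaded, total: evt.total });
        })
        .send(function (err, data) {
            if (err) io.to(uploadId).emit('upload:error', err.message);
            else io.to(uploadId).emit('upload:done', data.Location);
        });
}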
