Get AWS S3 Upload URL - Node.js AWS SDK - JavaScript

I'm pretty sure I'm missing something very obvious here, but:
I'm uploading a file to an s3 bucket using aws-sdk as follows:
const awsURL = await s3.upload(params, (err, data) => {
  if (err) {
    console.log(err);
    return null;
  }
  console.log(`File uploaded successfully. ${data.Location}`);
  return data.Location;
});
return awsURL;
I'm able to log the upload URL successfully; however, the awsURL that is returned is an array, not the data.Location value. Shouldn't data.Location be returned from the callback?

Convert s3.upload to return a promise:
const data = await s3.upload(params).promise(); // .promise() converts the AWS request into a promise you can await
console.log(`File uploaded successfully. ${data.Location}`);
return data.Location;
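For context, a minimal sketch of the surrounding async function (uploadFile is a hypothetical wrapper name; it assumes s3 is an AWS.S3 instance and params is built as in the question):
// Sketch only: assumes `s3` is an AWS SDK v2 S3 instance
const uploadFile = async (params) => {
  try {
    const data = await s3.upload(params).promise();
    console.log(`File uploaded successfully. ${data.Location}`);
    return data.Location;
  } catch (err) {
    console.log(err);
    return null;
  }
};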

Related

JavaScript function doesn't wait for file system operations

I'm trying to create a simple JS function that reads a directory, loops over all the files inside it, and pushes information into an array. The console.logs for the individual file data output the correct values, but the console.log({response}) at the end of newFunc doesn't output anything, and neither does the getFileData log from the awaited async function. However, if I put a timeout on the called async function, the correct response is returned (which tells me JS isn't waiting for my file reads to finish).
The functionality mentioned above:
const newFunc = async () => {
  let response: any = [];
  fs.readdir("files", function (err, files) {
    if (err) {
      console.error("Could not list the directory.", err);
      process.exit(1);
    }
    for (const file of files) {
      fs.stat(`files/${file}`, (err, stats) => {
        if (err) {
          throw err;
        }
        // print file information
        console.log(`File Data Last Modified: ${stats.mtime}`);
        console.log(`File Status Last Modified: ${stats.ctime}`);
        console.log(`File Size: ${stats.size}`);
        console.log({ file });
        response.push({
          fileName: file,
          updatedAt: stats.mtime,
          createdAt: stats.ctime,
          size: stats.size,
        });
      });
    }
  });
  console.log({ response });
  return response;
};
Node.js endpoint functionality:
const getFileData = await newFunc();
console.log({ getFileData });
res.status(200).json(getFileData);
You should use the readdirSync and statSync functions. That way the code will wait for the results of the function calls. Currently, due to the async nature of the fs.readdir and fs.stat methods, the code doesn't wait: it logs an empty response array and also returns an empty response.
Your newFunc method should look like this:
const newFunc = async () => {
  let response: any = [];
  try {
    const files = fs.readdirSync('files');
    for (const file of files) {
      const stats = fs.statSync(`files/${file}`);
      console.log(`File Data Last Modified: ${stats.mtime}`);
      console.log(`File Status Last Modified: ${stats.ctime}`);
      console.log(`File Size: ${stats.size}`);
      console.log({ file });
      response.push({
        fileName: file,
        updatedAt: stats.mtime,
        createdAt: stats.ctime,
        size: stats.size,
      });
    }
  } catch (err) {
    console.log(err);
    throw err;
  }
  return response;
}
readdir and stat are async operations, so in newFunc you are returning the value of response without waiting for those operations to complete. You could use readdirSync and statSync instead of readdir and stat, or you could switch to the promise-based fs.promises.readdir and fs.promises.stat and await those calls in your current implementation.
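A minimal sketch of that promise-based variant, using Node's built-in fs.promises API:
// Sketch: same logic as the question, but with the promise-based fs API
const fsp = require('fs').promises;

const newFunc = async () => {
  const response = [];
  const files = await fsp.readdir('files');
  for (const file of files) {
    const stats = await fsp.stat(`files/${file}`);
    response.push({
      fileName: file,
      updatedAt: stats.mtime,
      createdAt: stats.ctime,
      size: stats.size,
    });
  }
  return response;
};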

Promise does not return any value to caller

I have an Angular app with a simple call to a service to upload an image to AWS S3.
The upload service should return the AWS response to the caller.
Despite the upload service successfully uploading the file and logging the success message to the console, and despite a RETURN of the resolved promise, the caller's "then" is not kicking in.
There is no error message, and the following console.log statement never runs:
console.log('aws file returned: ', res);
Here is the caller:
this.aws.uploadDataFile(data).then(res => {
  if (res) {
    console.log('aws file returned: ', res);
  }
}, err => {
  console.log('error: ', err);
});
Here is the upload service called:
uploadDataFile(data: any) {
  const contentType = data.type;
  const bucket = new S3({
    accessKeyId: environment.awsAccessKey,
    secretAccessKey: environment.awsSecret,
    region: environment.awsRegion
  });
  const params = {
    Bucket: environment.awsBucket,
    Key: data.name, // manipulate filename here before uploading
    Body: data.value,
    ContentEncoding: 'base64',
    ContentType: contentType
  };
  return new Promise(function(resolve, reject) {
    bucket.putObject(params, function(err, res) {
      if (err) {
        console.log(err);
        console.log('Error uploading data: ', res);
        return Promise.resolve(err);
      }
      console.log('succesfully uploaded the image! ' + JSON.stringify(res));
      return Promise.resolve(res);
    });
  });
}
I do see the success message from the service in the console:
console.log('succesfully uploaded the image! ' + JSON.stringify(res));
But this message is not showing:
console.log('aws file returned: ', res);
I need the returned value in the caller to further perform tasks. What am I missing?
The Promise you are creating has an executor callback that provides the resolve and reject methods you should use to resolve the promise. Additionally, any return values from your executor are ignored. You can't chain onto the promise by returning a promise in the executor. This is documented on MDN for the Promise constructor.
The main issue is that you are not using the executor parameters to resolve or reject the promise. This leaves your promise in a pending state indefinitely. Invoking the appropriate resolve or reject will fix the issue.
function uploadDataFile(data) {
  //...
  return new Promise(function(resolve, reject) {
    bucket.putObject(params, function(err, res) {
      if (err) {
        console.log(err);
        console.log('Error uploading data: ', res);
        reject(err);
      } else {
        console.log('succesfully uploaded the image! ' + JSON.stringify(res));
        resolve(res);
      }
    });
  });
}
In your case you are using the AWS API, so you could use the SDK's built-in promise method instead. The promise method does the same conversion as above.
function uploadDataFile(data) {
  //...
  return bucket.putObject(params).promise().then(function(res) {
    console.log('succesfully uploaded the image! ' + JSON.stringify(res));
    return res;
  }).catch(function(err) {
    console.log('Error uploading data: ');
    console.log(err);
    throw err;
  });
}
If you do not need the extra logging you can just return the promise.
function uploadDataFile(data) {
  //...
  return bucket.putObject(params).promise();
}
You're resolving the wrong promise.
Promise.resolve is a function which creates a new promise and immediately resolves it. It is extremely rare that it is actually useful.
It is not the same as the resolve function that the Promise constructor passes to your executor callback as its first argument, which you have named resolve.
You need to resolve that promise and not a new one.
Note that the AWS API has built-in support for promises so you don't need to promisify the callback function yourself anyway.
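For instance, once uploadDataFile returns a real promise, the caller can consume it with async/await instead of .then. A minimal sketch (uploadAndLog is a hypothetical method name):
// Hypothetical caller method; this.aws is the injected upload service from the question
async uploadAndLog(data: any) {
  try {
    const res = await this.aws.uploadDataFile(data);
    console.log('aws file returned: ', res);
  } catch (err) {
    console.log('error: ', err);
  }
}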

Get all Amazon S3 files inside a bucket within Promise

I'm trying to grab thousands of files from Amazon S3 within a Promise, but I can't figure out how to include the ContinuationToken when the list is truncated, and then gather it all together within the promise. I'm a novice with JS and could use some help. Here's what I have so far:
getFiles()
  .then(filterFiles)
  .then(mapUrls);

function getFiles(token) {
  var params = {
    Bucket: bucket,
    MaxKeys: 5000,
    ContinuationToken: token
  };
  var allKeys = [];
  var p = new Promise(function(resolve, reject) {
    s3.listObjectsV2(params, function(err, data) {
      if (err) {
        return reject(err);
      }
      allKeys.push(data.Contents);
      if (data.IsTruncated) {
        s3.listObjectsV2({ Bucket: bucket, MaxKeys: 5000, ContinuationToken: data.NextContinuationToken });
        console.log('Getting more images...');
        allKeys.push(data.Contents);
      }
      resolve(data.Contents);
    });
  });
  return p;
}
I need the function to continue to run until I've created a list of all objects in the bucket to return.
You only need the ContinuationToken on the second and subsequent requests.
var params = {
  Bucket: bucket,
  MaxKeys: 5000,
};

if (data.IsTruncated) {
  s3.listObjectsV2({ ...params, ContinuationToken: data.NextContinuationToken });
}
IMO, this is just an s3 function called twice, more like a nested call. Recursion is when a function keeps calling itself until a specified condition is met.
Read more about recursion: https://medium.com/@vickdayaram/recursion-caad288bf621
I was able to list all objects in the bucket using async/await and the code below to populate an array.
const params = { Bucket: bucket, MaxKeys: 5000 };
const filelist = [];

async function getFiles() {
  const response = await s3.listObjectsV2(params).promise();
  response.Contents.forEach(obj => filelist.push(obj.Key));
  if (response.NextContinuationToken) {
    params.ContinuationToken = response.NextContinuationToken;
    await getFiles();
  }
  console.log(filelist.length);
  return filelist;
}
Thanks to all who helped!
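For comparison, the same pagination can also be written as a loop rather than recursion, which avoids mutating a shared params object between calls. A sketch (getAllKeys is a hypothetical name; s3 and bucket are assumed to be defined as in the question):
// Sketch: iterative pagination with listObjectsV2 (AWS SDK v2)
async function getAllKeys() {
  const filelist = [];
  let token; // undefined on the first request
  do {
    const response = await s3.listObjectsV2({
      Bucket: bucket,
      MaxKeys: 5000,
      ContinuationToken: token,
    }).promise();
    response.Contents.forEach(obj => filelist.push(obj.Key));
    token = response.NextContinuationToken;
  } while (token);
  return filelist;
}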

Piping got.stream to a file

I am refactoring some code that was using the http module in Node to use got instead. I tried the following:
function get(url, filePath) {
  return new Promise((resolve, reject) => {
    got.stream(url).on("response", response => {
      const newFile = fs.createWriteStream(filePath);
      response.pipe(newFile);
      newFile.on("finish", () => {
        newFile.close(resolve());
      });
      newFile.on("error", err => {
        reject(err);
      });
    }).on("error", err => {
      reject(err);
    });
  });
}
The finish event never fired. The file (filePath) is created with 0 bytes.
The block of code using newFile was something that worked when I was using the Node http module.
What is the proper way to pipe got.stream to a file?
Per the got() documentation, you want to pipe the stream directly to your file, and if you use pipeline() to do it, it will collect errors and report completion for you.
const stream = require('stream');
const { promisify } = require('util');
const fs = require('fs');
const got = require('got');

const pipeline = promisify(stream.pipeline);
const fsp = fs.promises;

function get(url, filePath) {
  return pipeline(
    got.stream(url),
    fs.createWriteStream(filePath)
  );
}
// usage
get(...).then(() => {
  console.log("all done");
}).catch(err => {
  console.log(err);
});
FYI, the point of got.stream() is to return a stream that you can directly use as a stream, and since you want it to go to a file, you can pipe that stream to that file. I use pipeline() instead of .pipe() because pipeline has much more complete error handling than .pipe(), though in non-error conditions .pipe() would also work.
Here's a version that cleans up the output file if there's an error:
function get(url, filePath) {
  return pipeline(
    got.stream(url),
    fs.createWriteStream(filePath)
  ).catch(err => {
    fsp.unlink(filePath).catch(err => {
      if (err.code !== 'ENOENT') {
        // trying to delete output file upon error
        console.log('error trying to delete output file', err);
      }
    });
    throw err;
  });
}
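As a side note, on Node 15 and later the promisify step is unnecessary, because the stream module ships a promise-based pipeline:
// Node 15+: promise-based pipeline, no promisify needed
const { pipeline } = require('stream/promises');

function get(url, filePath) {
  return pipeline(
    got.stream(url),
    fs.createWriteStream(filePath)
  );
}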

Write to S3 bucket using Async/Await in AWS Lambda

I have been using the code below (which I have now added await to) to send files to S3. It has worked fine with my Lambda code, but as I move to transferring larger files like MP4s, I feel I need async/await.
How can I fully convert this to async/await?
exports.handler = async (event, context, callback) => {
  ...
  // Copy data to a variable to enable write to S3 Bucket
  var result = response.audioContent;
  console.log('Result contents ', result);
  // Set S3 bucket details and put MP3 file into S3 bucket from tmp
  var s3 = new AWS.S3();
  await var params = {
    Bucket: 'bucketname',
    Key: filename + ".txt",
    ACL: 'public-read',
    Body: result
  };
  await s3.putObject(params, function (err, result) {
    if (err) console.log('TXT file not sent to S3 - FAILED'); // an error occurred
    else console.log('TXT file sent to S3 - SUCCESS'); // successful response
    context.succeed('TXT file has been sent to S3');
  });
};
You only await functions that return a promise. s3.putObject does not return a promise (similar to most functions that take a callback). It returns a Request object. If you want to use async/await, you need to chain the .promise() method onto the end of your s3.putObject call and remove the callback (https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/Request.html#promise-property)
try { // You should always catch your errors when using async/await
  const s3Response = await s3.putObject(params).promise();
  callback(null, s3Response);
} catch (e) {
  console.log(e);
  callback(e);
}
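Putting that together, a minimal sketch of the fully converted handler (assuming response, filename, and AWS are defined as in the original code) can drop the callback entirely and just return:
exports.handler = async (event) => {
  // Assumes `response` and `filename` come from earlier code, as in the question
  const result = response.audioContent;
  const s3 = new AWS.S3();
  const params = {
    Bucket: 'bucketname',
    Key: filename + '.txt',
    ACL: 'public-read',
    Body: result
  };
  try {
    await s3.putObject(params).promise();
    console.log('TXT file sent to S3 - SUCCESS');
    return 'TXT file has been sent to S3';
  } catch (err) {
    console.log('TXT file not sent to S3 - FAILED');
    throw err;
  }
};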
As @djheru said, async/await only works with functions that return promises.
I would recommend creating a simple wrapper function to assist with this problem.
const putObjectWrapper = (params) => {
  return new Promise((resolve, reject) => {
    s3.putObject(params, function (err, result) {
      if (err) reject(err);
      if (result) resolve(result);
    });
  });
};
Then you could use it like this:
const result = await putObjectWrapper(params);
Here is a really great resource on Promises and Async/Await:
https://javascript.info/async
