nextJS: async getInitialProps() with AWS S3? - javascript

I'm trying to get an s3.getObject() call running inside an async getInitialProps() function in a nextJS project, but I can't for the love of it figure out how to get the results prepped so they can be returned as an object (which is needed for getInitialProps() and nextJS' SSR to work properly).
Here is the code:
static async getInitialProps({ query }) {
  const AWS = require('aws-sdk');
  const s3 = new AWS.S3({
    credentials: {
      accessKeyId: KEY,
      secretAccessKey: KEY
    }
  });
  // The id from the route (e.g. /img/abc123987)
  let filename = query.id;
  const params = {
    Bucket: BUCKETNAME,
    Key: KEYDEFAULTS + '/' + filename
  };
  const res = await s3.getObject(params, (err, data) => {
    if (err) throw err;
    let imgData = 'data:image/jpeg;base64,' + data.Body.toString('base64');
    return imgData;
  });
  return ...
}
The idea is to fetch an image from S3 and return it as base64 code (just to clear things up).

From your code, s3.getObject works with a callback; you need to wait for the callback to be called.
You can achieve that by wrapping the callback in a promise.
static async getInitialProps({ query }) {
  const AWS = require('aws-sdk');
  const s3 = new AWS.S3({
    credentials: {
      accessKeyId: KEY,
      secretAccessKey: KEY
    }
  });
  // The id from the route (e.g. /img/abc123987)
  let filename = query.id;
  const params = {
    Bucket: BUCKETNAME,
    Key: KEYDEFAULTS + '/' + filename
  };
  const res = await new Promise((resolve, reject) => {
    s3.getObject(params, (err, data) => {
      if (err) return reject(err);
      let imgData = 'data:image/jpeg;base64,' + data.Body.toString('base64');
      resolve(imgData);
    });
  });
  return ...
}
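As a side note (not part of the original answer), the AWS SDK for JavaScript v2 also exposes a built-in promise API on its request objects, so the manual Promise wrapper can be dropped. A minimal sketch, reusing the params object from the answer above and assuming the base64 string should be returned under an imgData key:
// Same setup as above; s3 and params are already defined.
const data = await s3.getObject(params).promise(); // rejects (throws) on error
const imgData = 'data:image/jpeg;base64,' + data.Body.toString('base64');
return { imgData }; // getInitialProps must return a plain object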

Related

How do I pipe a file from S3 to the client in Uint8List format in Node.js?

I am currently working on an API just to get to know Node.js, as I am currently learning it.
I successfully created a route for uploading an mp3 file into an S3 bucket, but when I try to fetch a file from S3 in Uint8List format, I don't get the results I want. (Flutter requires me to send a Uint8List; if this is not a good solution I can also convert it into a Uint8List on the client side.)
I am able to create a Readable stream, and when the stream receives chunks it logs them to the console. But I am not quite sure how I can send the data back to the client in buffers; I am only able to send the data in one big list, but of course for efficiency this is not the best option.
Anyone able to help me? This is the code I currently have:
var AWS = require('aws-sdk');
AWS.config.update({
  accessKeyId: AWS_ACCESS_KEY,
  secretAccessKey: AWS_SECRET_ACCESS_KEY,
  region: AWS_REGION
});
var s3 = new AWS.S3();
router.get('/assets/:fileKey', auth, async function (req, res, next) {
  try {
    const fileKey = req.params.fileKey;
    const options = {
      Bucket: AWS_BUCKET_NAME,
      Key: fileKey,
    };
    const chunks = [];
    const getAsBytes = new Promise((resolve, reject) => {
      const readStream = s3.getObject(options).createReadStream();
      readStream.on('data', (chunk) => {
        // console.log('-------new data received--------')
        // console.log(chunk);
        chunks.push(chunk);
        // res.write(chunk);
      });
      readStream.on('error', reject);
      readStream.on('end', resolve);
    }).catch((err) => next(err));
    await getAsBytes;
    res.write(Uint8Array.from(chunks));
    res.end();
  } catch (error) {
    next(error);
  }
});
When I try to pipe the read stream I get a response full of question marks and weird symbols.
Try this: the chunk is actually a Buffer, so you need to convert it to actual data using .toString().
var AWS = require('aws-sdk');
AWS.config.update({
  accessKeyId: AWS_ACCESS_KEY,
  secretAccessKey: AWS_SECRET_ACCESS_KEY,
  region: AWS_REGION
});
var s3 = new AWS.S3();
router.get('/assets/:fileKey', auth, async function (req, res, next) {
  try {
    const fileKey = req.params.fileKey;
    const options = {
      Bucket: AWS_BUCKET_NAME,
      Key: fileKey,
    };
    const chunks = [];
    const getAsBytes = new Promise((resolve, reject) => {
      const readStream = s3.getObject(options).createReadStream();
      readStream.on('data', (chunk) => {
        // console.log('-------new data received--------')
        // console.log(chunk);
        chunks.push(chunk.toString());
        // res.write(chunk);
      });
      readStream.on('error', reject);
      readStream.on('end', resolve);
    }).catch((err) => next(err));
    await getAsBytes;
    res.write(Uint8Array.from(chunks));
    res.end();
  } catch (error) {
    next(error);
  }
});
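One caveat (an addition, not part of the answer above): .toString() is only safe for text payloads. For binary content such as an mp3, a common alternative is to keep the chunks as Buffers and concatenate them before writing the response, roughly like this (assuming the same Express res and chunks array):
// Push raw Buffer chunks (no .toString()), then:
const body = Buffer.concat(chunks);          // one Buffer holding the whole object
res.setHeader('Content-Type', 'audio/mpeg'); // or the object's real content type
res.end(body);                               // send raw bytes; the client can wrap them in a Uint8List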

AWS S3 - Retrieve multiple files and merge them afterwards

I am trying to extract multiple files from an AWS S3 bucket and want to merge the responses from all files afterwards.
E.g. I have the following files:
my-bucket/mainfile1.json.gz
my-bucket/mainfile2.json.gz
my-bucket/mainfile3.json.gz
Currently I am accessing a single file like this:
const unzipFromS3 = (key, bucket) => {
  return new Promise(async (resolve, reject) => {
    AWS.config.loadFromPath(process.env["PWD"] + '/private/awss3/s3_config.json');
    var s3 = new AWS.S3();
    let options = {
      'Bucket': "my-bucket",
      'Key': "mainfile1.json.gz",
    };
    s3.getObject(options, function (err, res) {
      if (err) return reject(err);
      resolve(zlib.unzipSync(res.Body).toString());
    });
  });
};
unzipFromS3().then(function (result) {
  console.dir(result);
});
This works perfectly for a single file, but how can I achieve it with multiple files when I want to merge the data from three separate files?
Here's an initial idea of how to read the gzipped JSON files from S3, unzip them, then merge the resulting JavaScript objects, and finally gzip and write the merged results back to S3.
const aws = require('aws-sdk');
const zlib = require('zlib');

const s3 = new aws.S3();

const BUCKET = 'mybucket';
const PREFIX = '';
const FILES = ['test1.json.gz', 'test2.json.gz', 'test3.json.gz'];

(async () => {
  const promises = [];
  try {
    for (let ii = 0; ii < FILES.length; ii++) {
      const params = {
        Bucket: BUCKET,
        Key: `${PREFIX}${FILES[ii]}`,
      };
      console.log('Get:', params.Key, 'from:', params.Bucket);
      promises.push(s3.getObject(params).promise());
    }

    const results = await Promise.all(promises);
    const buffers = results.map(result => result.Body);
    const content = buffers.map(buffer => JSON.parse(zlib.unzipSync(buffer).toString()));
    console.log('Read OK', JSON.stringify(content));

    const merged = Object.assign({}, ...content);
    console.log('Merged content', JSON.stringify(merged));

    const params = {
      Bucket: BUCKET,
      Key: `${PREFIX}result/test.json.gz`,
      Body: zlib.gzipSync(JSON.stringify(merged), 'utf8'),
    };
    console.log('Put:', params.Key, 'to:', params.Bucket);
    const rc = await s3.putObject(params).promise();
  } catch (err) {
    console.log(err, err.stack);
    throw err;
  }
})();
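One design note on the merge step (my addition, not part of the answer): Object.assign performs a shallow merge, so a top-level key that appears in more than one file is overwritten by the last file read. If the files each contain a JSON array rather than an object, you would concatenate instead, for example:
// Hypothetical variant: merge arrays instead of objects.
const merged = [].concat(...content);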

Unexpected Token s3 error on AWS Lambda Node 12.X

I'm using Node 12.x to write my Lambda function. Here is the parsing error that I am getting. What could be the reason?
Update
const im = require("imagemagick");
const fs = require("fs");
const os = require("os");
const uuidv4 = require("uuid/v4");
const {promisify} = require("util");
const AWS = require('aws-sdk');
const resizeAsync = promisify(im.resize)
const readFileAsync = promisify(fs.readFile)
const unlinkAsync = promisify(fs.unlink)
AWS.config.update({region: 'ap-south-1'})
const s3 = new AWS.S3();
exports.handler = async (event) => {
let filesProcessed = event.Records.map((record) => {
let bucket = record.s3.bucket.name;
let filename = record.s3.object.key;
//Fetch filename from S3
var params = {
Bucket: bucket,
Key: filename
};
//let inputData = await s3.getObject(params).promise()
let inputData = await s3.getObject(params).promise();
//Resize the file
let tempFile = os.tmpdir() + '/' + uuidv4() + '.jpg';
let resizeArgs = {
srcData: inputData.Body,
dstPath: tempFile,
width: 150
};
await resizeAsync(resizeArgs)
//Read the resized File
let resizedData = await readFileAsync(tempFile)
//Upload the resized file to S3
let targetFilename = filename.substring(0, filename.lastIndexOf('.') + '-small.jpg')
var params = {
Bucket: bucket + '-dest',
Key: targetFilename,
Body: new Buffer(resizedData),
ContentType: 'image/jpeg'
}
await s3.putObject(params).promise();
return await unlinkAsync(tempFile)
})
await Promise.all(filesProcessed)
return "done"
}
Here is the same code. I am getting the "Unexpected token s3" error when hovering over the red mark (shown in the image).
What you can do is declare inputData as below and initialize it with the response from getObject.
let inputData;
var params = {
  Bucket: "examplebucket",
  Key: "HappyFace.jpg"
};
s3.getObject(params, function (err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else inputData = data;                // successful response
});
For more, you can refer here
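For completeness, an assumption about the root cause that the answer above does not spell out: the parser chokes on await s3.getObject(...) because the .map() callback is not declared async, so await is not a keyword there and s3 becomes an unexpected token. A minimal sketch that keeps the promise style from the question (same event and s3 as above):
let filesProcessed = event.Records.map(async (record) => { // async callback makes await legal
  const params = { Bucket: record.s3.bucket.name, Key: record.s3.object.key };
  const inputData = await s3.getObject(params).promise();
  // ...resize, upload, and clean up as in the question...
  return inputData;
});
await Promise.all(filesProcessed);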

Upload Image from form-data to S3 using a Lambda

So I am writing a Lambda that will take in some form data via a straight POST through API Gateway (testing with Postman for now) and then send that image to S3 for storage. Every time I run it, the image uploaded to S3 is corrupted and won't open properly. I have seen people having to decode/encode the incoming data, but I feel like I have tried everything using Buffer.from. I am only looking to store either .png or .jpg. The code below does not reflect my attempts using Base64 encoding/decoding, seeing as they all failed. Here is what I have so far:
Sample request in Postman
{
  image: (uploaded .jpg/.png),
  metadata: { tag: 'iPhone' }
}
Lambda
const AWS = require('aws-sdk')
const multipart = require('aws-lambda-multipart-parser')
const s3 = new AWS.S3();

exports.handler = async (event) => {
  const form = multipart.parse(event, false)
  const s3_response = await upload_s3(form)
  return {
    statusCode: '200',
    body: JSON.stringify({ data: s3_response })
  }
};

const upload_s3 = async (form) => {
  const uniqueId = Math.random().toString(36).substr(2, 9);
  const key = `${uniqueId}_${form.image.filename}`
  const request = {
    Bucket: 'bucket-name',
    Key: key,
    Body: form.image.content,
    ContentType: form.image.contentType,
  }
  try {
    const data = await s3.putObject(request).promise()
    return data
  } catch (e) {
    console.log('Error uploading to S3: ', e)
    return e
  }
}
EDIT:
I am now attempting to save the image into the /tmp directory and then use a read stream to upload it to S3. Here is some code for that.
s3 upload function
const AWS = require('aws-sdk')
const fs = require('fs')

const s3 = new AWS.S3()

module.exports = {
  upload: (file) => {
    return new Promise((resolve, reject) => {
      const key = `${Date.now()}.${file.extension}`
      const bodyStream = fs.createReadStream(file.path)
      const params = {
        Bucket: process.env.S3_BucketName,
        Key: key,
        Body: bodyStream,
        ContentType: file.type
      }
      s3.upload(params, (err, data) => {
        if (err) {
          return reject(err)
        }
        return resolve(data)
      })
    })
  }
}
form parser function
const busboy = require('busboy')

module.exports = {
  parse: (req, temp) => {
    const ctype = req.headers['Content-Type'] || req.headers['content-type']
    let parsed_file = {}
    return new Promise((resolve) => {
      try {
        const bb = new busboy({
          headers: { 'content-type': ctype },
          limits: {
            fileSize: 31457280,
            files: 1,
          }
        })
        bb.on('file', function (fieldname, file, filename, encoding, mimetype) {
          const stream = temp.createWriteStream()
          const ext = filename.split('.')[1]
          console.log('parser -- ext ', ext)
          parsed_file = { name: filename, path: stream.path, f: file, type: mimetype, extension: ext }
          file.pipe(stream)
        }).on('finish', () => {
          resolve(parsed_file)
        }).on('error', err => {
          console.error(err)
          resolve({ err: 'Form data is invalid: parsing error' })
        })
        if (req.end) {
          req.pipe(bb)
        } else {
          bb.write(req.body, req.isBase64Encoded ? 'base64' : 'binary')
        }
        return bb.end()
      } catch (e) {
        console.error(e)
        return resolve({ err: 'Form data is invalid: parsing error' })
      }
    })
  }
}
handler
const form_parser = require('./form-parser').parse
const s3_upload = require('./s3-upload').upload
const temp = require('temp')

exports.handler = async (event, context) => {
  temp.track()
  const parsed_file = await form_parser(event, temp)
  console.log('index -- parsed form', parsed_file)
  const result = await s3_upload(parsed_file)
  console.log('index -- s3 result', result)
  temp.cleanup()
  return {
    statusCode: '200',
    body: JSON.stringify(result)
  }
}
The above edited code is a combination of my own code and a GitHub repo I found that is trying to achieve the same result. Even with this solution, the file is still corrupted.
Figured out this issue. The code works perfectly fine - it was an issue with API Gateway. You need to go into the API Gateway settings, set the Binary Media Type to multipart/form-data, and then re-deploy the API. Hope this helps someone else who is banging their head against the wall trying to send images via form data to a Lambda.

AWS S3 get corrupted image

I need to create a smaller version of an image from S3. When I call s3.getObject I receive a corrupted image file, even though it is saved correctly in S3.
(screenshot: corrupted image)
I tried to use s3.getObject and set await wherever I can. I also save the received image locally to check it before modifying it with sharp. It seems to be a problem with sending the image between S3 and my app.
export const s3Config: S3.ClientConfiguration = {
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  region: process.env.AWS_DEFAULT_REGION,
};

async downloader(url: string)
{
  const pathParts: string[] = url.split('/');
  const imageParts: string[] = pathParts[5].split('.');
  const path = pathParts[4].concat('/', pathParts[5]);
  const name: string = imageParts[0];
  const nameSmall = name + '_small.' + imageParts[imageParts.length - 1];
  const test = name + '_test.' + imageParts[imageParts.length - 1];
  this.logger.info(`Image path: ${path}`);
  AWS.config.update(s3Config);
  const s3 = new AWS.S3();
  await s3.getObject(
    {
      Bucket: mediaBucket, Key: path
    },
    function (error, data) {
      if (error != null) {
        console.log("Failed to retrieve an object: " + error);
      } else {
        console.log("Loaded " + data.ContentLength + " bytes");
      }
    }
  )
  .promise()
  .then(async data => {
    await fs.createWriteStream('./images/' + test).write(data.Body as any);
    // tslint:disable-next-line
    await sharp(data.Body as any).resize(100).toFile('./images/' + nameSmall, (err, info) => {
      this.logger.info('err: ', err);
      this.logger.info('info: ', info);
    });
  })
}
I expect to receive the image in its correct form, but the actual output is damaged.
