I need to create a smaller version of an image stored in S3. When I fetch it via s3.getObject I receive a corrupted image file, but in S3 itself it is saved correctly.
(screenshot: the corrupted image)
I tried to use s3.getObject and added await wherever I could. I also save the received image locally to check it before modifying it with sharp. The problem seems to be in transferring the image between S3 and my app.
export const s3Config: S3.ClientConfiguration = {
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: process.env.AWS_DEFAULT_REGION,
};
async downloader(url: string)
{
  const pathParts: string[] = url.split('/');
  const imageParts: string[] = pathParts[5].split('.');
  const path = pathParts[4].concat('/', pathParts[5]);
  const name: string = imageParts[0];
  const nameSmall = name + '_small.' + imageParts[imageParts.length - 1];
  const test = name + '_test.' + imageParts[imageParts.length - 1];

  this.logger.info(`Image path: ${path}`);

  AWS.config.update(s3Config);
  const s3 = new AWS.S3();

  await s3.getObject(
    {
      Bucket: mediaBucket, Key: path
    },
    function (error, data) {
      if (error != null) {
        console.log("Failed to retrieve an object: " + error);
      } else {
        console.log("Loaded " + data.ContentLength + " bytes");
      }
    }
  )
    .promise()
    .then(async data => {
      await fs.createWriteStream('./images/' + test).write(data.Body as any);
      // tslint:disable-next-line
      await sharp(data.Body as any).resize(100).toFile('./images/' + nameSmall, (err, info) => {
        this.logger.info('err: ', err);
        this.logger.info('info: ', info);
      });
    });
}
I expect to receive the image in its correct form, but the actual output is damaged.
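For reference, here is a minimal sketch of the flow the question describes, using only the promise form of getObject (no callback) and assuming the bucket, key, and a local ./images directory are valid; the function and file names are made up for illustration:

const AWS = require('aws-sdk');
const fs = require('fs').promises;
const sharp = require('sharp');

async function downloadAndResize(bucket, key) {
  const s3 = new AWS.S3();

  // Fetch the object once via the promise API; data.Body is a Buffer.
  const data = await s3.getObject({ Bucket: bucket, Key: key }).promise();

  // Write the untouched bytes to disk to verify the download itself.
  await fs.writeFile('./images/original_test.jpg', data.Body);

  // Resize the same buffer and write the thumbnail.
  await sharp(data.Body).resize(100).toFile('./images/original_small.jpg');
}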
Related
I want to make an API that takes a file or folder path from the user and uploads it to AWS S3. I have made progress, but when the user gives a file path it searches for that path on the server, not on the user's PC. I know I have made a mistake, but I don't know how to connect the API to the user's PC and get access to their system files.
Here is the code for the POST route:
router.post("/create/:id", auth, async (req, res) => {
try {
let form = new multiparty.Form();
form.parse(req, async (err, fields, files) => {
console.log(fields);
console.log(files);
//check if user has access to project
const user_id = req.userId;
const project_id = req.params.id;
const user_access = await check_user_access_project(user_id, project_id);
const user = await User.findById(user_id);
const project = await Project.findById(project_id);
if (user_access === 1) {
//create version
const version = new Version({
project_id: project_id,
user_id: user_id,
versionName: fields.versionName[0],
version_description: fields.versionDescription[0],
version_file: [],
});
const version_data = await version.save();
console.log(version_data);
let version_id = version_data._id;
//sync folders to s3
const version_folder_path = fields.files_path[0];
let key = `${user.firstName}_${user_id}/${project.projectName}/${fields.versionName[0]}`;
const version_folder_list = await sync_folders(
version_folder_path,
key
);
console.log("version folder list", version_folder_list);
//update version with version folders
await Version.findByIdAndUpdate(
version_id,
{
$set: {
version_file: version_folder_list,
},
},
{ new: true }
);
//wait for version update
await version.save();
//send response
res.json({
success: true,
version: version_data,
});
} else {
res.status(401).json({
success: false,
message: "User does not have access to project",
});
}
});
} catch (error) {
res.status(400).json({ message: error.message });
}
});
Here is the folder sync code:
const sync_folders = async (folder_path, key) => {
function getFiles(dir, files_) {
files_ = files_ || [];
var files = fs.readdirSync(dir);
for (var i in files) {
var name = dir + "/" + files[i];
if (fs.statSync(name).isDirectory()) {
getFiles(name, files_);
} else {
files_.push(name);
}
}
return files_;
}
const files = getFiles(folder_path);
console.log(files);
const fileData = [];
for (let i = 0; i < files.length; i++) {
const file = files[i];
console.log(file);
const fileName = file.split("/").pop();
const fileType = file.split(".").pop();
const fileSize = fs.statSync(file).size;
const filePath = file;
const fileBuffer = fs.readFileSync(filePath);
//folder is last part of folder path (e.g. /folder1/folder2/folder3)
const folder = folder_path.split("/").pop();
console.log("folder: " + folder);
//split filepath
const filePath_ = filePath.split(folder).pop();
let filekey = key + "/" + folder + filePath_;
console.log("filekey: " + filekey);
const params = {
Bucket: bucket,
Key: filekey,
Body: fileBuffer,
ContentType: fileType,
ContentLength: fileSize,
};
const data = await s3.upload(params).promise();
console.log(data);
fileData.push(data);
}
console.log("file data", fileData);
console.log("files uploaded");
return fileData;
};
If somebody can help me, please do - I need your help.
You need to post the file itself in a form, rather than just sending the user's directory path, and then upload the result to your S3 bucket.
This might be a good start if you're new to it:
https://www.w3schools.com/nodejs/nodejs_uploadfiles.asp
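As a rough sketch of that suggestion (the route, the field name file, and the S3_BUCKET variable are made up for the example): the browser submits the file itself as multipart/form-data, and the server uploads the temp file that multiparty writes, instead of a path typed by the user.

const multiparty = require('multiparty');
const fs = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();

router.post('/upload', (req, res) => {
  const form = new multiparty.Form();
  form.parse(req, async (err, fields, files) => {
    if (err) return res.status(400).json({ message: err.message });

    // files.file[0] is the uploaded file saved to a temp path on the server,
    // coming from an <input type="file" name="file"> in the client form.
    const uploaded = files.file[0];

    const data = await s3
      .upload({
        Bucket: process.env.S3_BUCKET,
        Key: uploaded.originalFilename,
        Body: fs.createReadStream(uploaded.path),
        ContentType: uploaded.headers['content-type'],
      })
      .promise();

    res.json({ success: true, location: data.Location });
  });
});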
I'm using Node 12.x to write my Lambda function. Here is the parsing error I am getting. What could be the reason?
Update
const im = require("imagemagick");
const fs = require("fs");
const os = require("os");
const uuidv4 = require("uuid/v4");
const {promisify} = require("util");
const AWS = require('aws-sdk');
const resizeAsync = promisify(im.resize)
const readFileAsync = promisify(fs.readFile)
const unlinkAsync = promisify(fs.unlink)
AWS.config.update({region: 'ap-south-1'})
const s3 = new AWS.S3();
exports.handler = async (event) => {
  let filesProcessed = event.Records.map((record) => {
    let bucket = record.s3.bucket.name;
    let filename = record.s3.object.key;

    //Fetch filename from S3
    var params = {
      Bucket: bucket,
      Key: filename
    };
    //let inputData = await s3.getObject(params).promise()
    let inputData = await s3.getObject(params).promise();

    //Resize the file
    let tempFile = os.tmpdir() + '/' + uuidv4() + '.jpg';
    let resizeArgs = {
      srcData: inputData.Body,
      dstPath: tempFile,
      width: 150
    };
    await resizeAsync(resizeArgs)

    //Read the resized File
    let resizedData = await readFileAsync(tempFile)

    //Upload the resized file to S3
    let targetFilename = filename.substring(0, filename.lastIndexOf('.') + '-small.jpg')
    var params = {
      Bucket: bucket + '-dest',
      Key: targetFilename,
      Body: new Buffer(resizedData),
      ContentType: 'image/jpeg'
    }
    await s3.putObject(params).promise();
    return await unlinkAsync(tempFile)
  })
  await Promise.all(filesProcessed)
  return "done"
}
Here is the same code. I am getting an "Unexpected token S3" error when hovering over the red mark (shown in the image).
What you can do is declare inputData as below and initialize it with the response from getObject.
let inputData;
var params = {
Bucket: "examplebucket",
Key: "HappyFace.jpg"
};
s3.getObject(params, function(err, data) {
if (err) console.log(err, err.stack); // an error occurred
else inputData = data; // successful response
});
For more, you can refer here
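As a side note, the parsing error itself comes from using await inside the map callback, which is not declared async; if you want to keep the await style instead of a callback, the callback has to be marked async. A minimal sketch:

let filesProcessed = event.Records.map(async (record) => {
  const params = { Bucket: record.s3.bucket.name, Key: record.s3.object.key };
  // await is valid here because the arrow function is async.
  const inputData = await s3.getObject(params).promise();
  // ...resize and upload as in the original handler...
});
await Promise.all(filesProcessed);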
I'm trying to get an s3.getObject() call running inside an async getInitialProps() function in a Next.js project, but I can't for the life of me figure out how to get the results prepped so they can be returned as an object (which is needed for getInitialProps() and Next.js SSR to work properly).
Here is the code:
static async getInitialProps({ query }) {
const AWS = require('aws-sdk');
const s3 = new AWS.S3({
credentials: {
accessKeyId: KEY,
secretAccessKey: KEY
}
});
// The id from the route (e.g. /img/abc123987)
let filename = query.id;
const params = {
Bucket: BUCKETNAME,
Key: KEYDEFAULTS + '/' + filename
};
const res = await s3.getObject(params, (err, data) => {
if (err) throw err;
let imgData = 'data:image/jpeg;base64,' + data.Body.toString('base64');
return imgData;
});
return ...
}
The idea is to fetch an image from S3 and return it as base64 code (just to clear things up).
In your code, s3.getObject works with a callback; you need to wait for the callback to be called.
You can achieve this by converting the callback into a promise.
static async getInitialProps({ query }) {
const AWS = require('aws-sdk');
const s3 = new AWS.S3({
credentials: {
accessKeyId: KEY,
secretAccessKey: KEY
}
});
// The id from the route (e.g. /img/abc123987)
let filename = query.id;
const params = {
Bucket: BUCKETNAME,
Key: KEYDEFAULTS + '/' + filename
};
const res = await new Promise((resolve, reject) => {
s3.getObject(params, (err, data) => {
if (err) return reject(err);
let imgData = 'data:image/jpeg;base64,' + data.Body.toString('base64');
resolve(imgData);
});
});
return ...
}
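Alternatively, the v2 SDK exposes a promise on every request, so the manual wrapper can be skipped - a minimal sketch of the same fetch:

// getObject(params).promise() resolves with the response object; Body is a Buffer.
const data = await s3.getObject(params).promise();
const imgData = 'data:image/jpeg;base64,' + data.Body.toString('base64');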
So I am writing a Lambda that takes in some form data via a straight POST through API Gateway (testing with Postman for now) and then sends that image to S3 for storage. Every time I run it, the image uploaded to S3 is corrupted and won't open properly. I have seen people having to decode/encode the incoming data, but I feel like I have tried everything with Buffer.from. I am only looking to store .png or .jpg files. The code below does not reflect my attempts at Base64 encoding/decoding, since they all failed. Here is what I have so far:
Sample request in Postman:
{
image: (uploaded .jpg/.png),
metadata: {tag: 'iPhone'}
}
Lambda
const AWS = require('aws-sdk')
const multipart = require('aws-lambda-multipart-parser')
const s3 = new AWS.S3();
exports.handler = async (event) => {
const form = multipart.parse(event, false)
const s3_response = await upload_s3(form)
return {
statusCode: '200',
body: JSON.stringify({ data: data })
}
};
const upload_s3 = async (form) => {
const uniqueId = Math.random().toString(36).substr(2, 9);
const key = `${uniqueId}_${form.image.filename}`
const request = {
Bucket: 'bucket-name',
Key: key,
Body: form.image.content,
ContentType: form.image.contentType,
}
try {
const data = await s3.putObject(request).promise()
return data
} catch (e) {
console.log('Error uploading to S3: ', e)
return e
}
}
EDIT:
I am now attempting to save the image into the /tmp directory and then use a read stream to upload to S3. Here is some code for that.
s3 upload function
const AWS = require('aws-sdk')
const fs = require('fs')
const s3 = new AWS.S3()
module.exports = {
upload: (file) => {
return new Promise((resolve, reject) => {
const key = `${Date.now()}.${file.extension}`
const bodyStream = fs.createReadStream(file.path)
const params = {
Bucket: process.env.S3_BucketName,
Key: key,
Body: bodyStream,
ContentType: file.type
}
s3.upload(params, (err, data) => {
if (err) {
return reject(err)
}
return resolve(data)
}
)
})
}
}
form parser function
const busboy = require('busboy')
module.exports = {
parse: (req, temp) => {
const ctype = req.headers['Content-Type'] || req.headers['content-type']
let parsed_file = {}
return new Promise((resolve) => {
try {
const bb = new busboy({
headers: { 'content-type': ctype },
limits: {
fileSize: 31457280,
files: 1,
}
})
bb.on('file', function (fieldname, file, filename, encoding, mimetype) {
const stream = temp.createWriteStream()
const ext = filename.split('.')[1]
console.log('parser -- ext ', ext)
parsed_file = { name: filename, path: stream.path, f: file, type: mimetype, extension: ext }
file.pipe(stream)
}).on('finish', () => {
resolve(parsed_file)
}).on('error', err => {
console.error(err)
resolve({ err: 'Form data is invalid: parsing error' })
})
if (req.end) {
req.pipe(bb)
} else {
bb.write(req.body, req.isBase64Encoded ? 'base64' : 'binary')
}
return bb.end()
} catch (e) {
console.error(e)
return resolve({ err: 'Form data is invalid: parsing error' })
}
})
}
}
handler
const form_parser = require('./form-parser').parse
const s3_upload = require('./s3-upload').upload
const temp = require('temp')
exports.handler = async (event, context) => {
temp.track()
const parsed_file = await form_parser(event, temp)
console.log('index -- parsed form', parsed_file)
const result = await s3_upload(parsed_file)
console.log('index -- s3 result', result)
temp.cleanup()
return {
statusCode: '200',
body: JSON.stringify(result)
}
}
The edited code above is a combination of other code and a GitHub repo I found that tries to achieve the same result. Even with this solution the file is still corrupted.
Figured out this issue. The code works perfectly fine - it was an issue with API Gateway. You need to go into the API Gateway settings, add multipart/form-data as a Binary Media Type, and then re-deploy the API. Hope this helps someone else who is banging their head against the wall trying to send images via form data to a Lambda.
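For completeness: once multipart/form-data is registered as a binary media type, API Gateway delivers the body base64-encoded and sets event.isBase64Encoded, so a handler that works on the raw event should decode it before parsing - a minimal sketch (the hand-off to the multipart parser is only indicated in a comment):

exports.handler = async (event) => {
  // With a matching binary media type, API Gateway base64-encodes the payload.
  const rawBody = event.isBase64Encoded
    ? Buffer.from(event.body, 'base64')
    : Buffer.from(event.body || '');

  // ...pass rawBody and event.headers['content-type'] to the multipart parser...
  return { statusCode: 200, body: JSON.stringify({ received: rawBody.length }) };
};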
I am trying to upload an image to Firebase and then produce two thumbnails. I am able to do this with no problems. My current roadblock is that when I write the URLs to the Realtime Database, I always get the same URL as the initial upload.
For example:
1st upload: I get my uploaded image with the two proper thumbnails for that image
2nd upload: I get my uploaded image with the two previous thumbnails (the first image's)
3rd upload: I get my uploaded image with the first image's thumbnails...
...and this continues to reproduce the URLs of the first upload.
In my Storage bucket the correct thumbnails are being generated, but the URLs always point to the first upload.
I don't know if this is a problem with getSignedUrl() or not; I'm really not sure what's going on here.
Here is my cloud function:
export const generateThumbs = functions.storage
.object()
.onFinalize(async object => {
const bucket = gcs.bucket(object.bucket); // The Storage object.
// console.log(object);
console.log(object.name);
const filePath = object.name; // File path in the bucket.
const fileName = filePath.split('/').pop();
const bucketDir = dirname(filePath);
const workingDir = join(tmpdir(), 'thumbs');
const tmpFilePath = join(workingDir, 'source.png');
if (fileName.includes('thumb#') || !object.contentType.includes('image')) {
console.log('exiting function');
return false;
}
// 1. ensure thumbnail dir exists
await fs.ensureDir(workingDir);
// 2. Download Sounrce fileName
await bucket.file(filePath).download({
destination: tmpFilePath
});
//3. resize the images and define an array of upload promises
const sizes = [64, 256];
const uploadPromises = sizes.map(async size => {
const thumbName = `thumb#${size}_${fileName}`;
const thumbPath = join(workingDir, thumbName);
//Resize source image
await sharp(tmpFilePath)
.resize(size, size)
.toFile(thumbPath);
//upload to gcs
return bucket.upload(thumbPath, {
destination: join(bucketDir, thumbName),
metadata: {
contentType: 'image/jpeg'
}
}).then((data) => {
const file = data[0]
// console.log(data)
file.getSignedUrl({
action: 'read',
expires: '03-17-2100'
}).then((response) => {
const url = response[0];
if (size === 64) {
// console.log('generated 64');
return admin.database().ref('profileThumbs').child(fileName).set({ thumb: url });
} else {
// console.log('generated 128');
return admin.database().ref('categories').child(fileName).child('thumb').set(url);
}
})
.catch(function (error) {
console.error(err);
return;
});
})
});
//4. Run the upload operations
await Promise.all(uploadPromises);
//5. Cleanup remove the tmp/thumbs from the filesystem
return fs.remove(workingDir);
})
I cleaned up my code and solved my problem. Here is how I generated the URLs and wrote them to the proper database paths by accessing the user's UID and post ID from the file path:
export const generateThumbs = functions.storage
.object()
.onFinalize(async object => {
const fileBucket = object.bucket; // The Storage bucket that contains the file.
const filePath = object.name; // File path in the bucket.
const fileName = filePath.split('/').pop();
const userUid = filePath.split('/')[2];
const sizes = [64, 256];
const bucketDir = dirname(filePath);
console.log(userUid);
if (fileName.includes('thumb#') || !object.contentType.includes('image')) {
console.log('exiting function');
return false;
}
const bucket = gcs.bucket(fileBucket);
const tempFilePath = path.join(tmpdir(), fileName);
return bucket.file(filePath).download({
destination: tempFilePath
}).then(() => {
sizes.map(size => {
const newFileName = `thumb#${size}_${fileName}.png`
const newFileTemp = path.join(tmpdir(), newFileName);
const newFilePath = `thumbs/${newFileName}`
return sharp(tempFilePath)
.resize(size, size)
.toFile(newFileTemp, () => {
return bucket.upload(newFileTemp, {
destination: join(bucketDir, newFilePath),
metadata: {
contentType: 'image/jpeg'
}
}).then((data) => {
const file = data[0]
console.log(data)
file.getSignedUrl({
action: 'read',
expires: '03-17-2100'
}, function(err, url) {
console.log(url);
if (err) {
console.error(err);
return;
}
if (size === 64) {
return admin.database().ref('profileThumbs').child(userUid).child(fileName).set({ thumb: url });
} else {
return admin.database().ref('categories').child(fileName).child('thumb').set(url);
}
})
})
})
})
}).catch(error =>{
console.log(error);
});
})