I am trying to upload/download an audio chunk file to/from S3 using the AWS Node SDK. I have tried the base64 approach and it works fine, but I am not able to get back the Metadata that I bundled as part of the upload params.
Below is the code snippet for the upload, along with the meta info:
var myMetaInfo = "AdditionalInfo",
    dataToUpload = {
        Bucket: bucketName,
        Key: storageFolderFullPath,
        Body: myAudioFile.toString('base64'),
        Metadata: {metaInfo: myMetaInfo}
    };

s3.client.putObject(dataToUpload, function(err, data) {
    if (!err) {
        console.log("Successfully uploaded the file to :: " + dataToUpload.Bucket);
    } else {
        console.log(" **** ERROR while uploading :: " + err);
    }
});
And this is the snippet for downloading the file. The Metadata is not part of the callback data.
I tried printing the callback data to the console and noticed that only the following params are available:
LastModified, ContentType, ContentLength, ETag, Body, RequestId
var dataToDownload = {Bucket: bucketName, Key: storageFolderFullPath},
    originalFile, myMetaInfo;

s3.client.getObject(dataToDownload, function(err, data) {
    if (!err) {
        originalFile = Buffer.from(data.Body, 'base64');
        myMetaInfo = data.Metadata.metaInfo;
        console.log(" Meta info :: " + myMetaInfo);
        fs.writeFile(fileStoragePath, originalFile, function(err) {
            if (!err) {
                console.log(" File written!! ");
            } else {
                console.log(" Error while writing the file !! " + err);
            }
        });
    } else {
        console.log(" **** ERROR while downloading :: " + err);
    }
});
Any pointers on what is wrong with my implementation? I have followed the documentation mentioned here.
Any help is appreciated.
Is your metaInfo value a string?
Referencing the SDK API docs, Metadata is a string map (i.e. Metadata: {metaInfo: "myMetaInfoString"}). I've tested your code using a string as the value for metaInfo, and it does come back correctly under the data.Metadata.metaInfo reference.
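For reference, a minimal round trip with a string metadata value might look like this, using the current aws-sdk v2 client style (the bucket and key names are placeholders; note that S3 stores user-defined metadata keys in lowercase, so an all-lowercase key avoids surprises when reading the value back):

var AWS = require('aws-sdk');
var s3 = new AWS.S3();

// Upload with a string metadata value (S3 user metadata must be string-to-string).
s3.putObject({
    Bucket: 'my-bucket',                        // placeholder bucket name
    Key: 'audio/chunk-001',                     // placeholder key
    Body: myAudioFile.toString('base64'),
    Metadata: { metainfo: 'AdditionalInfo' }    // values must be strings
}, function(err) {
    if (err) return console.log(err);
    // Read it back; user metadata is returned on getObject (and headObject).
    s3.getObject({ Bucket: 'my-bucket', Key: 'audio/chunk-001' }, function(err, data) {
        if (err) return console.log(err);
        console.log(data.Metadata.metainfo);    // => 'AdditionalInfo'
    });
});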
I am using Angular to create my application. I am trying to upload a picture to an S3 bucket, but every time I try uploading it, an error shows in my console.
Here's my upload.service.ts file code:
uploadBarcode(file: any) {
    const uniqueId: number = Math.floor(Math.random() * Date.now());
    const file_name: string = 'lab_consignments/' + uniqueId + "_" + file.name;
    const contentType = file.type;
    console.log(contentType);
    const bucket = new S3({
        accessKeyId: myAccessKey,
        secretAccessKey: secretAccessKey,
        region: 'Asia Pacific (Mumbai) ap-south-1'
    });
    const params = {
        Bucket: 'chc-uploads',
        Key: file_name,
        Body: file,
        ACL: 'public-read',
        ContentType: contentType
    };
    return bucket.upload(params, function (err, data) {
        if (err) {
            console.log('There was an error uploading your file: ', err);
            localStorage.setItem('docDataStatus', 'false');
            return false;
        } else {
            console.log('Successfully uploaded file from service');
            return true;
        }
    });
}
}
The access key and secret access key are statically typed in my code, so don't worry about that; I just changed them to variable names for security reasons while posting this question to Stack Overflow.
Any help would be appreciated.
Though this does not answer your initial question: as Marcin said, hard-coding your AWS credentials into your code is very bad practice and should be avoided at all costs. For frontend applications, this can be handled by having a simple API endpoint generate a signed upload URL to the S3 bucket:
https://aws.amazon.com/blogs/developer/generate-presigned-url-modular-aws-sdk-javascript/
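As a rough sketch of such an endpoint, assuming aws-sdk v2 on the backend (the bucket name, key prefix, and expiry here are placeholders):

const AWS = require('aws-sdk');
const s3 = new AWS.S3({ region: 'ap-south-1' });

// Returns a time-limited URL the browser can PUT the file to directly,
// so no credentials ever ship with the frontend bundle.
async function getUploadUrl(fileName, contentType) {
    return s3.getSignedUrlPromise('putObject', {
        Bucket: 'chc-uploads',                   // placeholder bucket name
        Key: 'lab_consignments/' + fileName,
        ContentType: contentType,
        Expires: 300                             // URL is valid for 5 minutes
    });
}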
To actually answer your question: you are likely seeing the error because the region you are passing is incorrectly formatted. It should be the region code alone, not the console's display name:
const bucket = new S3({
    region: 'ap-south-1'
});
https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/Config.html#region-property
Hi, I have a JSON response of roughly 150-200 MB. Because of its size, I want to save it on AWS S3 as a JSON file instead of returning it to the client.
Below is the code I am currently using:
async function uploadFileOnS3(fileData, s3Detail) {
    const params = {
        Bucket: s3Detail.Bucket,
        Key: s3Detail.Key_response,
        Body: JSON.stringify(fileData), // big fat JS object
    };
    try {
        const stored = await S3.upload(params).promise();
        console.log("file uploaded successfully ", stored);
    } catch (err) {
        console.log(err);
    }
    console.log("upload exit");
}
I am concerned about the JSON.stringify(fileData) operation. Assuming this function will be part of an AWS Lambda, won't it take huge resources to serialize the object to a string?
Is there any other efficient way to save a JavaScript object as JSON in an AWS S3 bucket?
You don't really have to stringify the data yourself. You can pass a stream as the body:
const fs = require("fs");
const AWS = require("aws-sdk");
const S3 = new AWS.S3();

async function uploadFileOnS3(fileData, s3Detail) {
    const params = {
        Bucket: s3Detail.Bucket,
        Key: s3Detail.Key_response,
        Body: fileData, // remove stringify from here
    };
    try {
        const stored = await S3.upload(params).promise();
        console.log("file uploaded successfully ", stored);
    } catch (err) {
        console.log(err);
    }
    console.log("upload exit");
}

exports.handler = async (event) => {
    // We create a stream
    const stream = fs.createReadStream("/tmp/upload.json");
    // Pass the stream to the upload function
    await uploadFileOnS3(stream, {
        Bucket: "bucket_name",
        Key_response: "upload.json"
    });
};
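If the data never exists as a file on disk, you can still avoid one giant JSON.stringify call by writing the object into the upload piece by piece through a PassThrough stream. A minimal sketch, assuming the payload is an array of records (the function and variable names here are made up for illustration):

const { PassThrough } = require("stream");
const AWS = require("aws-sdk");
const S3 = new AWS.S3();

async function uploadRecordsAsJson(records, bucket, key) {
    const body = new PassThrough();
    // Start the upload first; S3.upload() consumes the stream as we write to it.
    const pending = S3.upload({ Bucket: bucket, Key: key, Body: body }).promise();
    body.write("[");
    records.forEach((record, i) => {
        if (i > 0) body.write(",");
        body.write(JSON.stringify(record)); // serialize one small record at a time
    });
    body.write("]");
    body.end();
    return pending;
}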
I would like to create an app that allows me to easily modify the metadata of any given file.
Is there a way to modify the metadata of any given file type?
Are there any NPM packages that facilitate this?
You can use the ffmetadata npm package for reading and writing media file metadata (it drives ffmpeg under the hood, so ffmpeg must be installed):
var ffmetadata = require("ffmetadata");
// Read song.mp3 metadata
ffmetadata.read("song.mp3", function(err, data) {
if (err) console.error("Error reading metadata", err);
else console.log(data);
});
// Set the artist for song.mp3
var data = {
artist: "Me",
};
ffmetadata.write("song.mp3", data, function(err) {
if (err) console.error("Error writing metadata", err);
else console.log("Data written");
});
I am trying to save a file from my S3 bucket to a local directory. When I run the code everything seems to work fine, because no errors are printed to the console, but when I open the directory the file size is just 15 bytes, and it's the same story with any file I try to download.
I tried to download a text file and inside I found the text [object Object]. Can anyone help me? This is the function code:
var s3 = new AWS.S3();
s3.getObject(
    { Bucket: "chat-mp-files", Key: conf[1] },
    function (error, data) {
        if (error != null) {
            console.log(error);
        } else {
            fs.closeSync(fs.openSync(pathstr + '/r/' + conf[1], 'w'));
            fs.writeFile(pathstr + '/r/' + conf[1], data, function (err) {
                if (err) {
                    console.log(err);
                } else {
                    console.log("ok");
                }
            });
        }
    }
);
I have just solved my issue using the official docs section provided by Amazon here.
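For anyone hitting the same symptom: the 15-byte file containing [object Object] is the getObject response object being stringified, not the file contents. The fix is to write data.Body (a Buffer) rather than data itself; a minimal sketch:

var AWS = require('aws-sdk');
var fs = require('fs');

var s3 = new AWS.S3();
s3.getObject({ Bucket: "chat-mp-files", Key: conf[1] }, function (error, data) {
    if (error) {
        console.log(error);
    } else {
        // data.Body holds the file contents; writing `data` itself
        // stringifies the whole response object to "[object Object]".
        fs.writeFile(pathstr + '/r/' + conf[1], data.Body, function (err) {
            if (err) console.log(err);
            else console.log("ok");
        });
    }
});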
Yesterday I did a deep-night coding session and created a small node.js/JS app (well, actually CoffeeScript, but CoffeeScript is just JavaScript, so let's say JS).
What's the goal:
client sends a canvas datauri (png) to server (via socket.io)
server uploads image to amazon s3
Step 1 is done.
The server now has a string à la
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAADICAYAAACt...
My question is: what are my next steps to "stream"/upload this data to Amazon S3 and create an actual image there?
knox (https://github.com/LearnBoost/knox) seems like an awesome lib for PUTting something to S3, but what I'm missing is the glue between the base64-encoded image string and the actual upload action.
Any ideas, pointers and feedback welcome.
For people who are still struggling with this issue, here is the approach I used with the native aws-sdk:
var AWS = require('aws-sdk');
AWS.config.loadFromPath('./s3_config.json');
var s3Bucket = new AWS.S3( { params: {Bucket: 'myBucket'} } );
Inside your router method (ContentType should be set to the content type of the image file):
var buf = Buffer.from(req.body.imageBinary.replace(/^data:image\/\w+;base64,/, ""), 'base64');
var data = {
    Key: req.body.userId,
    Body: buf,
    ContentEncoding: 'base64',
    ContentType: 'image/jpeg'
};
s3Bucket.putObject(data, function(err, data){
    if (err) {
        console.log(err);
        console.log('Error uploading data: ', data);
    } else {
        console.log('successfully uploaded the image!');
    }
});
The s3_config.json file:
{
    "accessKeyId": "xxxxxxxxxxxxxxxx",
    "secretAccessKey": "xxxxxxxxxxxxxx",
    "region": "us-east-1"
}
Here's the code from one article I came across, posting below:
const imageUpload = async (base64) => {
    const AWS = require('aws-sdk');
    const { ACCESS_KEY_ID, SECRET_ACCESS_KEY, AWS_REGION, S3_BUCKET } = process.env;

    AWS.config.setPromisesDependency(require('bluebird'));
    AWS.config.update({ accessKeyId: ACCESS_KEY_ID, secretAccessKey: SECRET_ACCESS_KEY, region: AWS_REGION });

    const s3 = new AWS.S3();

    const base64Data = Buffer.from(base64.replace(/^data:image\/\w+;base64,/, ""), 'base64');
    const type = base64.split(';')[0].split('/')[1];

    const userId = 1;
    const params = {
        Bucket: S3_BUCKET,
        Key: `${userId}.${type}`, // type is not required
        Body: base64Data,
        ACL: 'public-read',
        ContentEncoding: 'base64', // required
        ContentType: `image/${type}` // required. Notice the back ticks
    };

    let location = '';
    let key = '';
    try {
        const { Location, Key } = await s3.upload(params).promise();
        location = Location;
        key = Key;
    } catch (error) {
        console.log(error); // don't swallow upload failures silently
    }

    console.log(location, key);
    return location;
};

module.exports = imageUpload;
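For completeness, a hypothetical caller would look something like this (the module path and the data-URI string are just placeholders):

const imageUpload = require('./imageUpload'); // path is an assumption

imageUpload('data:image/png;base64,iVBORw0KGgo...')
    .then((location) => console.log('Stored at:', location));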
Read more: http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#upload-property
Credits: https://medium.com/@mayneweb/upload-a-base64-image-data-from-nodejs-to-aws-s3-bucket-6c1bd945420f
OK, this one is the answer for how to save the canvas data to a file.
Basically it looks like this in my code:
buf = Buffer.from(data.dataurl.replace(/^data:image\/\w+;base64,/, ""), 'base64')
req = knoxClient.put('/images/'+filename, {
'Content-Length': buf.length,
'Content-Type':'image/png'
})
req.on('response', (res) ->
if res.statusCode is 200
console.log('saved to %s', req.url)
socket.emit('upload success', imgurl: req.url)
else
console.log('error %d', res.statusCode)
)
req.end(buf)
The accepted answer works great, but if you need to accept any file type instead of just images, this regexp does the job:
/^data:.+;base64,/
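For example, a small helper that uses this regexp to pull both the content type and the raw bytes out of an arbitrary data URI might look like this (the function name is made up for illustration):

// Split any base64 data URI into its MIME type and decoded contents.
function parseDataUri(dataUri) {
    const match = dataUri.match(/^data:(.+);base64,/);
    if (!match) throw new Error('Not a base64 data URI');
    return {
        contentType: match[1],   // e.g. "application/pdf"
        body: Buffer.from(dataUri.replace(/^data:.+;base64,/, ''), 'base64')
    };
}

// Usage sketch:
// const { contentType, body } = parseDataUri(req.body.fileBinary);
// s3Bucket.putObject({ Key: 'some-key', Body: body, ContentType: contentType }, callback);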
For Laravel developers, this should work:
/* upload the file */
$path = Storage::putFileAs($uploadfolder, $uploadFile, $fileName, "s3");
Make sure to set up your .env file properly before calling this method.