I am using Angular to create my application. I am trying to upload a picture to an S3 bucket, but every time I try uploading it, this error shows in my console.
Here's the code from my upload.service.ts file:
uploadBarcode(file: any) {
  const uniqueId: number = Math.floor(Math.random() * Date.now());
  const file_name: string = 'lab_consignments/' + uniqueId + "_" + file.name;
  const contentType = file.type;
  console.log(contentType);
  const bucket = new S3({
    accessKeyId: myAccessKey,
    secretAccessKey: secretAccessKey,
    region: 'Asia Pacific (Mumbai) ap-south-1'
  });
  const params = {
    Bucket: 'chc-uploads',
    Key: file_name,
    Body: file,
    ACL: 'public-read',
    ContentType: contentType
  };
  return bucket.upload(params, function (err, data) {
    if (err) {
      console.log('There was an error uploading your file: ', err);
      localStorage.setItem('docDataStatus', 'false');
      return false;
    }
    else {
      console.log('Successfully uploaded file from service');
      return true;
    }
  });
}
}
The access key and secret access key are statically typed in my code, so don't worry about that. I just changed them to variable names for security reasons while posting this question to Stack Overflow.
Any help would be appreciated.
Though this does not answer your initial question: as Marcin said, hard-coding your AWS credentials into your code is very bad practice and should be avoided at all costs. For frontend applications, uploads can instead be handled by having a simple API endpoint generate a signed upload URL for the S3 bucket:
https://aws.amazon.com/blogs/developer/generate-presigned-url-modular-aws-sdk-javascript/
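As a rough illustration, here is a minimal sketch of such an endpoint (an Express route; the route path, bucket name, key scheme, and expiry are placeholders, not from your code, and the linked post shows the equivalent with the modular v3 SDK):

// Sketch of a backend endpoint that hands the browser a presigned PUT URL,
// so no AWS credentials ever live in the Angular code. Route path, bucket
// name, key scheme and expiry below are placeholders.
const express = require('express');
const AWS = require('aws-sdk');

const s3 = new AWS.S3({ region: 'ap-south-1' });
const app = express();

app.get('/upload-url', (req, res) => {
  const params = {
    Bucket: 'chc-uploads',
    Key: `lab_consignments/${Date.now()}_${req.query.fileName}`,
    ContentType: req.query.contentType,
    Expires: 60 // URL is valid for 60 seconds
  };
  s3.getSignedUrl('putObject', params, (err, url) => {
    if (err) return res.status(500).json({ error: err.message });
    res.json({ url }); // the frontend then PUTs the file directly to this URL
  });
});

app.listen(3000);

The Angular side would then simply do an HttpClient PUT of the file against the returned URL.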
To actually answer your question: you are likely seeing the error because the region you are passing is incorrectly formatted. It should be just the region code:
const bucket = new S3({
  region: 'ap-south-1'
});
https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/Config.html#region-property
In my ReactJS Amplify website I would like to list my S3 bucket items and get their metadata. I tried the following code, and get blank metadata as a result:
s3.listObjects(params, function(err, data) {
  if (err) {
    console.log("There was an error getting bucket contents:", err);
  }
  else {
    var bucketContents = data.Contents;
    console.log(JSON.stringify(bucketContents));
    this.setState(state => {
      return { s3Data: bucketContents };
    });
    // get meta data
    for (var i = 0; i < bucketContents.length; i++) {
      var urlParams = { Bucket: bucketName, Key: bucketContents[i].Key };
      imageTitles.push(bucketContents[i].Key);
      s3.headObject(
        {
          Bucket: bucketName,
          Key: bucketContents[i].Key
        },
        function(err, data) {
          if (err) {
            console.log(err.message);
          }
          else {
            console.log("HEAD RESULTS:");
            console.log(JSON.stringify(data));
          }
        });
    }
  }
});
I have an S3 bucket with 2 custom tags. Inside the S3 bucket I have added the following CORS policy for those tags:
Using the Amplify CLI I ran amplify pull to update; however, I don't see it in my CORS policy locally in my code.
So, trying to be useful, I tried manually adding it in the file. It still doesn't work.
Why is this happening, and how can I populate the metadata?
I can retrieve the metadata after enabling CORS headers as explained here: https://github.com/aws/aws-sdk-js/issues/232
export const get = (filename, level) => {
  /* First enable CORS headers for the metadata you want to retrieve. Manual setup:
     S3 console > select bucket > Click on Permissions > CORS Configuration > Add x-amz-meta-* keys:
       <ExposeHeader>x-amz-meta-audience</ExposeHeader>
       <ExposeHeader>x-amz-meta-location</ExposeHeader>
  */
  Storage.get(filename, {
    level,
    download: true
  })
    .then(res => {
      console.log(res);
    })
    .catch(err => console.log(err));
};
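If you prefer to configure this from code rather than the console, a minimal sketch of the same ExposeHeader setup using the SDK's putBucketCors call follows (typically run from a one-off Node setup script with admin credentials; the bucket name and allowed origin are placeholders):

// Sketch only: set the same ExposeHeader entries programmatically with putBucketCors.
// The bucket name and allowed origin are placeholders, not values from the question.
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

s3.putBucketCors({
  Bucket: 'my-bucket',
  CORSConfiguration: {
    CORSRules: [
      {
        AllowedMethods: ['GET', 'HEAD'],
        AllowedOrigins: ['*'],
        AllowedHeaders: ['*'],
        ExposeHeaders: ['x-amz-meta-audience', 'x-amz-meta-location']
      }
    ]
  }
}, function (err) {
  if (err) console.log('Failed to set CORS:', err);
  else console.log('CORS rules updated');
});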
I'm trying to upload an image to my AWS S3 bucket after downloading the image from another URL using Node (using request-promise-native & aws-sdk):
'use strict';

const config = require('../../../configs');
const AWS = require('aws-sdk');
const request = require('request-promise-native');

AWS.config.update(config.aws);
let s3 = new AWS.S3();

function uploadFile(req, res) {

  function getContentTypeByFile(fileName) {
    var rc = 'application/octet-stream';
    var fn = fileName.toLowerCase();
    if (fn.indexOf('.png') >= 0) rc = 'image/png';
    else if (fn.indexOf('.jpg') >= 0) rc = 'image/jpg';
    return rc;
  }

  let body = req.body,
      params = {
        "ACL": "bucket-owner-full-control",
        "Bucket": 'testing-bucket',
        "Content-Type": null,
        "Key": null, // Name of the file
        "Body": null // File body
      };

  // Grabs the filename from a URL
  params.Key = body.url.substring(body.url.lastIndexOf('/') + 1);

  // Setting the content type
  params.ContentType = getContentTypeByFile(params.Key);

  request.get(body.url)
    .then(response => {
      params.Body = response;
      s3.putObject(params, (err, data) => {
        if (err) { console.log(`Error uploading to S3 - ${err}`); }
        if (data) { console.log("Success - Uploaded to S3: " + data.toString()); }
      });
    })
    .catch(err => { console.log(`Error encountered: ${err}`); });
}
The upload succeeds when I test it out; however, after re-downloading it from my bucket, the image cannot be displayed. Additionally, I notice that after uploading the file with my function, the file listed in the bucket has a much larger file size than the original image. I'm trying to figure out where I've gone wrong but cannot find it. Any help is appreciated.
Try opening the faulty file with a text editor; you will see some errors written in it.
You can also try using s3.upload instead of putObject; it works better with streams.
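For example, a minimal sketch of the streaming approach (not the poster's exact code; it reuses the same bucket/key idea and plain request so we get a readable stream instead of a string body):

// Sketch: stream the remote image straight into s3.upload instead of
// buffering it as a string first. Names below are illustrative.
const AWS = require('aws-sdk');
const request = require('request'); // plain request, so we get a readable stream

const s3 = new AWS.S3();

function mirrorToS3(url, bucket, key, contentType) {
  const params = {
    Bucket: bucket,
    Key: key,
    ContentType: contentType,
    Body: request.get(url) // pass the response stream as the object body
  };
  // s3.upload accepts streams and handles the chunked upload for you
  return s3.upload(params).promise();
}

// Example usage (hypothetical values):
// mirrorToS3('https://example.com/cat.png', 'testing-bucket', 'cat.png', 'image/png')
//   .then(data => console.log('Uploaded to', data.Location))
//   .catch(err => console.log('Error uploading to S3 -', err));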
I am facing a peculiar issue while trying to upload a video file (.mp4 and .mov) to S3, captured using cordova-plugin-media-capture#1.4.3 or picked from gallery using cordova-plugin-camera#2.4.1.
I am using javascript AWS SDK v2.3.11 and calling the .upload function of the SDK.
It only copies 15 bytes of data to S3, regardless of the actual size of the video file, and the result is non-playable.
Implementation -
Capture video:
navigator.device.capture.captureVideo(
  captureSuccess,
  captureError,
  {
    limit: 1,
    duration: 30,
    destinationType: 2,
    sourceType: 1,
    mediaType: 1
  }
);

var captureSuccess = function (mediaFiles) {
  var mediaFile = mediaFiles[0];
  var filedata = {
    Key: "videos/" + fileName,
    ContentType: mediaFile.type,
    Body: mediaFile
  };
  var aws = AWS;
  var creds = new aws.Credentials(
    AccessKeyId,
    SecretAccessKey,
    SessionToken
  );
  aws.config.credentials = creds;
  s3 = new aws.S3();
  s3.upload(
    filedata,
    {
      Bucket: bucketName
    },
    function(err, location){
      if(!err){
        //uploaded successfully
      } else {
        //upload failed
      }
    }
  );
}
When I convert the media file to its Base64 data and upload that, it does write the complete Base64 file to the S3 bucket. However, I then need to strip the prefixed file type and Base64 identifier text, decode the data back to a binary format, and save it to S3 again (from an EB Node.js service).
Another issue with this approach is that converting a video file to Base64 data and holding it in the phone's RAM is prone to application crashes due to memory management on both iOS and Android. I am unable to use this mechanism to convert a video file longer than 5 seconds on Android, or longer than 10 seconds on a 16GB iPhone 6; the application crashes beyond both these limits.
Changed Implementation with Base64:
var captureSuccess = function (mediaFiles) {
  var mediaFile = mediaFiles[0];
  var filedata = {
    Key: "videos/" + fileName,
    ContentType: mediaFile.type
  };
  var aws = AWS;
  var creds = new aws.Credentials(
    AccessKeyId,
    SecretAccessKey,
    SessionToken
  );
  aws.config.credentials = creds;
  s3 = new aws.S3();
  getBase64Data(
    mediaFile.fullPath, //tried with mediaFile.localURL as well
    function(data){
      filedata.Body = data;
      s3.upload(
        filedata,
        {
          Bucket: bucketName
        },
        function(err, location){
          if(!err){
            //uploaded successfully
          } else {
            //upload failed
          }
        }
      ); //ending s3.upload
    }
  ); //ending getBase64Data
}

function getBase64Data(filePath, cb){
  window.resolveLocalFileSystemURL(
    filePath,
    function(entry){
      entry.file(
        function(file) {
          var reader = new FileReader();
          reader.onloadend = function(event) {
            cb(event.target.result);
          };
          reader.readAsDataURL(file);
        },
        function(e){
          //error retrieving file object
        }
      ); //ending entry.file
    },
    function(e){
      //error getting entry object
    }
  ); //ending resolveLocalFileSystemURL
}
The AWS S3 JavaScript SDK allows a couple of different ways to provide the file data to the upload function. According to the documentation, the file data can be any of the following:
Body — (Buffer, Typed Array, Blob, String, ReadableStream) Object data.
You need to extract the data from your captured video into one of those formats before attempting to upload.
Using mediaFile.fullPath, you can read the data from the file into a Buffer or create a ReadableStream and then use that to upload the file. To extract the data from the file you can use cordova-plugin-file.
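For example, here is a minimal sketch (assuming cordova-plugin-file is installed; names like s3, bucketName and fileName are taken to exist in the surrounding code) that reads the captured file into a Blob and passes that to s3.upload:

// Sketch: read the captured video into a Blob via cordova-plugin-file,
// then hand the Blob to s3.upload. s3, bucketName and fileName are assumed
// to come from the surrounding code.
function uploadCapturedVideo(mediaFile) {
  window.resolveLocalFileSystemURL(mediaFile.fullPath, function (entry) {
    entry.file(function (file) {
      var reader = new FileReader();
      reader.onloadend = function (event) {
        // event.target.result is an ArrayBuffer; wrap it in a Blob for the SDK
        var blob = new Blob([event.target.result], { type: mediaFile.type });
        s3.upload(
          {
            Bucket: bucketName,
            Key: "videos/" + fileName,
            ContentType: mediaFile.type,
            Body: blob
          },
          function (err, data) {
            if (err) {
              console.log("Upload failed:", err);
            } else {
              console.log("Uploaded to", data.Location);
            }
          }
        );
      };
      reader.readAsArrayBuffer(file); // binary read, no Base64 blow-up in memory
    });
  });
}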
I am trying to upload/download an audio chunk file to/from S3 using the AWS Node SDK. I have tried the Base64 approach and it works fine, but I am not able to get the Metadata back that I bundled as part of the upload params.
Below is the code snippet for the upload along with the meta info:
var myMetaInfo = "AdditionalInfo",
    dataToUpload = {
      Bucket: bucketName,
      Key: storageFolderFullPath,
      Body: myAudioFile.toString('base64'),
      Metadata: {metaInfo: myMetaInfo}
    };

s3.client.putObject(dataToUpload, function(err, data) {
  if (!err) {
    console.log("Successfully uploaded the file to ::" + dataToUpload.Bucket);
  } else {
    console.log(" **** ERROR while uploading ::" + err);
  }
});
And this is the snippet for downloading the file. Metadata is not part of the callback data.
I tried printing the callback 'data' to the console and noticed that only the following params are available:
LastModified, ContentType, ContentLength, ETag, Body, RequestId
var dataToDownload = {Bucket: bucketName, Key: storageFolderFullPath},
    originalFile,
    myMetaInfo;

s3.client.getObject(dataToDownload, function(err, data) {
  if (!err) {
    originalFile = new Buffer(data.Body, 'base64');
    myMetaInfo = data.Metadata.metaInfo;
    console.log(" Meta info:: " + myMetaInfo);
    fs.writeFile(fileStoragePath, originalFile, function(err) {
      if (!err) {
        console.log(" File written!! ");
      } else {
        console.log(" Error while writing the file !!" + err);
      }
    });
  } else {
    console.log(" **** ERROR while downloading ::" + err);
  }
});
Any pointers on what is wrong with my implementation? I have followed the documentation mentioned here.
Any help is appreciated.
Is your metaInfo value a string?
Referencing the SDK API docs, Metadata is a string map (i.e. Metadata: {metaInfo: "myMetaInfoString"}). I've tested your code using a string as the value for metaInfo, and it does come back correctly under the data.Metadata.metaInfo reference.
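For reference, a minimal sketch of that round trip (using the current SDK's s3.putObject/s3.getObject call style rather than s3.client.*; the bucket, key and audioBuffer values are placeholders):

// Sketch of the round trip described above; bucket, key and audioBuffer are placeholders.
var params = {
  Bucket: 'my-bucket',
  Key: 'audio/chunk-001',
  Body: audioBuffer,
  Metadata: { metaInfo: 'AdditionalInfo' } // values must be strings
};

s3.putObject(params, function(err) {
  if (err) return console.log(err);
  s3.getObject({ Bucket: params.Bucket, Key: params.Key }, function(err, data) {
    if (err) return console.log(err);
    console.log(data.Metadata); // user-defined metadata comes back here
  });
});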
Yesterday I did a late-night coding session and created a small Node.js/JS (well, actually CoffeeScript, but CoffeeScript is just JavaScript, so let's say JS) app.
What's the goal:
client sends a canvas datauri (png) to server (via socket.io)
server uploads image to amazon s3
Step 1 is done.
The server now has a string like
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAADICAYAAACt...
My question is: what are my next steps to "stream"/upload this data to Amazon S3 and create an actual image there?
knox (https://github.com/LearnBoost/knox) seems like an awesome lib for PUTting something to S3, but what I'm missing is the glue between the base64-encoded image string and the actual upload action.
Any ideas, pointers and feedback welcome.
For people who are still struggling with this issue, here is the approach I used with the native aws-sdk:
var AWS = require('aws-sdk');
AWS.config.loadFromPath('./s3_config.json');
var s3Bucket = new AWS.S3( { params: {Bucket: 'myBucket'} } );
Inside your router method (ContentType should be set to the content type of the image file):
var buf = Buffer.from(req.body.imageBinary.replace(/^data:image\/\w+;base64,/, ""), 'base64');
var data = {
  Key: req.body.userId,
  Body: buf,
  ContentEncoding: 'base64',
  ContentType: 'image/jpeg'
};
s3Bucket.putObject(data, function(err, data){
  if (err) {
    console.log(err);
    console.log('Error uploading data: ', data);
  } else {
    console.log('successfully uploaded the image!');
  }
});
s3_config.json file:
{
  "accessKeyId": "xxxxxxxxxxxxxxxx",
  "secretAccessKey": "xxxxxxxxxxxxxx",
  "region": "us-east-1"
}
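To tie it together, a hypothetical Express route wiring the snippet above into a request handler could look like this (the route path, body shape and response are illustrative only, not from the original answer):

// Hypothetical Express route using the snippet above; route path and response are illustrative.
const express = require('express');
const AWS = require('aws-sdk');

AWS.config.loadFromPath('./s3_config.json');
const s3Bucket = new AWS.S3({ params: { Bucket: 'myBucket' } });

const app = express();
app.use(express.json({ limit: '10mb' })); // data URIs can be large

app.post('/upload', (req, res) => {
  const buf = Buffer.from(
    req.body.imageBinary.replace(/^data:image\/\w+;base64,/, ''),
    'base64'
  );
  const data = {
    Key: req.body.userId,
    Body: buf,
    ContentEncoding: 'base64',
    ContentType: 'image/jpeg'
  };
  s3Bucket.putObject(data, (err) => {
    if (err) return res.status(500).json({ error: err.message });
    res.json({ ok: true });
  });
});

app.listen(3000);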
Here's the code from an article I came across, posted below:
const imageUpload = async (base64) => {
  const AWS = require('aws-sdk');
  const { ACCESS_KEY_ID, SECRET_ACCESS_KEY, AWS_REGION, S3_BUCKET } = process.env;

  AWS.config.setPromisesDependency(require('bluebird'));
  AWS.config.update({ accessKeyId: ACCESS_KEY_ID, secretAccessKey: SECRET_ACCESS_KEY, region: AWS_REGION });

  const s3 = new AWS.S3();

  const base64Data = Buffer.from(base64.replace(/^data:image\/\w+;base64,/, ""), 'base64');
  const type = base64.split(';')[0].split('/')[1];

  const userId = 1;
  const params = {
    Bucket: S3_BUCKET,
    Key: `${userId}.${type}`, // type is not required
    Body: base64Data,
    ACL: 'public-read',
    ContentEncoding: 'base64', // required
    ContentType: `image/${type}` // required. Notice the back ticks
  }

  let location = '';
  let key = '';
  try {
    const { Location, Key } = await s3.upload(params).promise();
    location = Location;
    key = Key;
  } catch (error) {
    // log or rethrow the upload error here
  }

  console.log(location, key);
  return location;
}

module.exports = imageUpload;
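Usage is then just a matter of awaiting the exported helper with a data URI (the value below is a truncated stand-in):

// Hypothetical usage of the imageUpload helper above; the data URI is a stand-in.
const imageUpload = require('./imageUpload');

(async () => {
  const dataUri = 'data:image/png;base64,iVBORw0KGgoAAAANSUhEUg...'; // truncated example
  const location = await imageUpload(dataUri);
  console.log('Stored at:', location);
})();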
Read more: http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#upload-property
Credits: https://medium.com/@mayneweb/upload-a-base64-image-data-from-nodejs-to-aws-s3-bucket-6c1bd945420f
OK, this one is the answer for how to save canvas data to a file.
Basically it looks like this in my code:
buf = new Buffer(data.dataurl.replace(/^data:image\/\w+;base64,/, ""), 'base64')

req = knoxClient.put('/images/' + filename, {
  'Content-Length': buf.length,
  'Content-Type': 'image/png'
})

req.on('response', (res) ->
  if res.statusCode is 200
    console.log('saved to %s', req.url)
    socket.emit('upload success', imgurl: req.url)
  else
    console.log('error %d', req.statusCode)
)

req.end(buf)
The accepted answer works great, but if someone needs to accept any file instead of just images, this regexp works great:
/^data:.+;base64,/
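For instance, a minimal sketch using that pattern to both strip the prefix and recover the content type (the data URI below is only a truncated stand-in):

// Sketch using the generic regexp above; the data URI is a truncated stand-in.
const dataUri = 'data:application/pdf;base64,JVBERi0xLjQK...';

const match = dataUri.match(/^data:(.+);base64,/);
const contentType = match ? match[1] : 'application/octet-stream';
const body = Buffer.from(dataUri.replace(/^data:.+;base64,/, ''), 'base64');

// body and contentType can then be passed as Body and ContentType to putObject/upload
console.log(contentType, body.length);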
For Laravel developers, this should work:
/* upload the file */
$path = Storage::putFileAs($uploadfolder, $uploadFile, $fileName, "s3");
Make sure to set up your .env file properly before calling this method.