Yesterday I did a late-night coding session and created a small Node.js/JS app (well, actually CoffeeScript, but CoffeeScript is just JavaScript, so let's say JS).
What's the goal:
1. the client sends a canvas data URI (png) to the server (via socket.io)
2. the server uploads the image to Amazon S3
Step 1 is done.
The server now has a string à la
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAADICAYAAACt...
My question is: what are my next steps to "stream"/upload this data to Amazon S3 and create an actual image there?
knox (https://github.com/LearnBoost/knox) seems like an awesome lib to PUT something to S3, but what I'm missing is the glue between the base64-encoded image string and the actual upload action.
Any ideas, pointers and feedback welcome.
For people who are still struggling with this issue, here is the approach I used with the native aws-sdk:
var AWS = require('aws-sdk');
AWS.config.loadFromPath('./s3_config.json');
var s3Bucket = new AWS.S3( { params: {Bucket: 'myBucket'} } );
Inside your router method (ContentType should be set to the content type of the image file):
var buf = Buffer.from(req.body.imageBinary.replace(/^data:image\/\w+;base64,/, ""), 'base64');
var data = {
  Key: req.body.userId,
  Body: buf,
  ContentEncoding: 'base64',
  ContentType: 'image/jpeg'
};
s3Bucket.putObject(data, function(err, data){
  if (err) {
    console.log(err);
    console.log('Error uploading data: ', data);
  } else {
    console.log('successfully uploaded the image!');
  }
});
s3_config.json file:
{
  "accessKeyId": "xxxxxxxxxxxxxxxx",
  "secretAccessKey": "xxxxxxxxxxxxxx",
  "region": "us-east-1"
}
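If the incoming data URI is not always a JPEG, the ContentType can also be derived from the URI's prefix rather than hard-coded. A small sketch, assuming req.body.imageBinary is a standard data URI (field name taken from the snippet above):
// derive the MIME type from the data URI prefix, e.g. 'image/png'
// from 'data:image/png;base64,...'; fall back to a generic type
var match = req.body.imageBinary.match(/^data:(image\/\w+);base64,/);
var contentType = match ? match[1] : 'application/octet-stream';

var data = {
  Key: req.body.userId,
  Body: buf,
  ContentEncoding: 'base64',
  ContentType: contentType
};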
Here's the code from an article I came across, posted below:
const imageUpload = async (base64) => {
  const AWS = require('aws-sdk');
  const { ACCESS_KEY_ID, SECRET_ACCESS_KEY, AWS_REGION, S3_BUCKET } = process.env;
  AWS.config.setPromisesDependency(require('bluebird'));
  AWS.config.update({ accessKeyId: ACCESS_KEY_ID, secretAccessKey: SECRET_ACCESS_KEY, region: AWS_REGION });
  const s3 = new AWS.S3();
  // strip the data URI prefix and decode the base64 payload into a Buffer
  const base64Data = Buffer.from(base64.replace(/^data:image\/\w+;base64,/, ""), 'base64');
  // extract the file type from the data URI, e.g. 'png' from 'data:image/png;base64,...'
  const type = base64.split(';')[0].split('/')[1];
  const userId = 1;
  const params = {
    Bucket: S3_BUCKET,
    Key: `${userId}.${type}`, // type is not required
    Body: base64Data,
    ACL: 'public-read',
    ContentEncoding: 'base64', // required
    ContentType: `image/${type}` // required. Notice the back ticks
  };
  let location = '';
  let key = '';
  try {
    const { Location, Key } = await s3.upload(params).promise();
    location = Location;
    key = Key;
  } catch (error) {
    console.log(error);
  }
  console.log(location, key);
  return location;
};
module.exports = imageUpload;
Read more: http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#upload-property
Credits: https://medium.com/@mayneweb/upload-a-base64-image-data-from-nodejs-to-aws-s3-bucket-6c1bd945420f
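For context, here is a minimal sketch of how that module might be called from an Express route; the route path and the req.body.image field are assumptions for illustration:
// hypothetical Express route using the imageUpload helper above
const express = require('express');
const imageUpload = require('./imageUpload'); // the module exported above

const app = express();
app.use(express.json({ limit: '10mb' })); // data URIs can be large

app.post('/upload', async (req, res) => {
  // req.body.image is assumed to be a base64 data URI sent by the client
  const location = await imageUpload(req.body.image);
  res.json({ location });
});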
OK, this one is the answer for how to save the canvas data to a file.
Basically it looks like this in my code:
buf = Buffer.from(data.dataurl.replace(/^data:image\/\w+;base64,/, ""), 'base64')

req = knoxClient.put('/images/' + filename, {
  'Content-Length': buf.length,
  'Content-Type': 'image/png'
})

req.on('response', (res) ->
  if res.statusCode is 200
    console.log('saved to %s', req.url)
    socket.emit('upload success', imgurl: req.url)
  else
    console.log('error %d', res.statusCode)
)

req.end(buf)
The accepted answer works great, but if someone needs to accept any file instead of just images, this regexp does the job:
/^data:.+;base64,/
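As a rough sketch of how that broader pattern could be used (the req.body.fileData field name is an assumption for illustration):
// decode any base64 data URI, not just images, and pull the MIME type
// out of the prefix (e.g. 'application/pdf')
const dataUri = req.body.fileData; // assumed field name
const buf = Buffer.from(dataUri.replace(/^data:.+;base64,/, ''), 'base64');
const contentType = dataUri.slice(5, dataUri.indexOf(';')); // text between 'data:' and ';base64'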
For Laravel developers, this should work:
/* upload the file */
$path = Storage::putFileAs($uploadfolder, $uploadFile, $fileName, "s3");
Make sure to set your .env file properties before calling this method.
So I created my first big project: https://rate-n-write.herokuapp.com/
In brief, this is a blog app where the user can write reviews and publish them along with pictures.
I have used Firebase as the database to store the articles. The app works fine on localhost, but whenever I try to upload an image on Heroku, I get an error.
The error is showing up in line number 8 of the following code (editor.js):
uploadInput.addEventListener('change', () => {
  uploadImage(uploadInput, "image");
})

const uploadImage = (uploadFile, uploadType) => {
  const [file] = uploadFile.files;
  if (file && file.type.includes("image")) {
    const formdata = new FormData();
    formdata.append('image', file);
    // Error shows up here in the fetch line
    fetch('/upload', {
      method: 'post',
      body: formdata
    }).then(res => res.json())
      .then(data => {
        if (uploadType == "image") {
          addImage(data, file.name);
        } else {
          bannerPath = `${location.origin}/${data}`;
          banner.style.backgroundImage = `url("${bannerPath}")`;
        }
      })
    const change_text = document.getElementById("uploadban");
    change_text.innerHTML = " ";
  } else {
    alert("upload Image only");
  }
}
This is just a snippet of the whole editor.js file.
Is it because I am trying to upload the file to the project directory? (server.js snippet below):
app.post('/upload', (req, res) => {
  let file = req.files.image;
  let date = new Date();
  // image name
  let imagename = date.getDate() + date.getTime() + file.name;
  // image upload path
  let path = 'public/uploads/' + imagename;
  // create upload
  file.mv(path, (err, result) => {
    if (err) {
      throw err;
    } else {
      // our image upload path
      res.json(`uploads/${imagename}`);
    }
  });
})
Do I need to use an online storage service like AWS S3?
Heroku is not suitable for persistent storage of data; the uploaded pictures will be deleted after a while (when the dyno is restarted), read this.
I would suggest using a third-party object storage service like Cloudinary or AWS S3.
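As a rough sketch of what that change could look like for the server.js route above, here is the local file.mv() call swapped for an S3 upload. This assumes express-fileupload (so file.data is a Buffer), the aws-sdk v2 package, and a placeholder bucket name:
const AWS = require('aws-sdk');
const s3 = new AWS.S3(); // credentials via environment variables or IAM role

app.post('/upload', (req, res) => {
  let file = req.files.image;
  let date = new Date();
  let imagename = date.getDate() + date.getTime() + file.name;

  s3.upload({
    Bucket: 'my-blog-uploads',   // placeholder bucket name
    Key: 'uploads/' + imagename,
    Body: file.data,             // Buffer provided by express-fileupload
    ContentType: file.mimetype
  }, (err, data) => {
    if (err) return res.status(500).json(err);
    res.json(data.Location);     // URL of the uploaded object
  });
});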
I am using Angular to create my application. I am trying to upload a picture to an S3 bucket, but every time I try uploading it, this error shows up in my console.
Here's my upload.service.ts code:
uploadBarcode(file: any) {
  const uniqueId: number = Math.floor(Math.random() * Date.now());
  const file_name: string = 'lab_consignments/' + uniqueId + "_" + file.name;
  const contentType = file.type;
  console.log(contentType)
  const bucket = new S3(
    {
      accessKeyId: myAccessKey,
      secretAccessKey: secretAccessKey,
      region: 'Asia Pacific (Mumbai) ap-south-1'
    }
  );
  const params = {
    Bucket: 'chc-uploads',
    Key: file_name,
    Body: file,
    ACL: 'public-read',
    ContentType: contentType
  };
  return bucket.upload(params, function (err, data) {
    if (err) {
      console.log('There was an error uploading your file: ', err);
      localStorage.setItem('docDataStatus', 'false');
      return false;
    }
    else {
      console.log('Successfully uploaded file from service');
      return true;
    }
  });
}
}
The access key and secret access key are statically typed in my code, so don't worry about that; I just changed them to variable names for security reasons while posting this question to Stack Overflow.
Any help would be appreciated.
Though this does not answer your initial question: as Marcin said, hard-coding your AWS credentials into your code is very bad practice and should be avoided at all costs. For frontend applications, this can be done by having a simple API endpoint generate a signed upload URL for the S3 bucket:
https://aws.amazon.com/blogs/developer/generate-presigned-url-modular-aws-sdk-javascript/
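A minimal sketch of that approach with the modular AWS SDK v3 (the bucket name is taken from the question; everything else is illustrative); the browser then PUTs the file directly to the returned URL:
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');

const client = new S3Client({ region: 'ap-south-1' });

async function getUploadUrl(key, contentType) {
  const command = new PutObjectCommand({
    Bucket: 'chc-uploads',
    Key: key,
    ContentType: contentType
  });
  // the signed URL expires after 5 minutes
  return getSignedUrl(client, command, { expiresIn: 300 });
}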
To actually answer your question, you are likely seeing the error because the region you are passing is incorrectly formatted.
const bucket = new S3(
  {
    region: 'ap-south-1'
  }
);
https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/Config.html#region-property
I am trying to get my node.js backend to upload a file to AWS S3, which it got in a post request from my front-end. This is what my function looks like:
async function uploadFile(file) {
  var uploadParams = { Bucket: '<bucket-name>', Key: file.name, Body: file };
  s3.upload(uploadParams, function (err, data) {
    if (err) {
      console.log("Error", err);
    } if (data) {
      console.log("Upload Success", data.Location);
    }
  });
}
When I try uploading the file this way, I get an Unsupported Body Payload Error...
I used fileStream.createReadStream() in the past to upload files saved in a directory on the server, but creating a fileStream did not work for me here, since there is no path parameter to pass.
EDIT:
The file object is created in the Angular frontend of my web application. This is the relevant HTML code where the file is uploaded by a user:
<div class="form-group">
  <label for="file">Choose File</label>
  <input type="file" id="file" (change)="handleFileInput($event.target.files)">
</div>
If the event occurs, the handleFileInput(files: FileList) method in the corresponding component is called:
handleFileInput(files: FileList) {
  // should result in array in case multiple files are uploaded
  this.fileToUpload = files.item(0);
  // actually upload the file
  this.uploadFileToActivity();
  // used to check whether we really received the file
  console.log(this.fileToUpload);
  console.log(typeof this.fileToUpload)
}

uploadFileToActivity() {
  this.fileUploadService.postFile(this.fileToUpload).subscribe(data => {
    // do something, if upload success
  }, error => {
    console.log(error);
  });
}
The postFile(fileToUpload: File) method of the file-upload service is used to make the POST request:
postFile(fileToUpload: File): Observable<Boolean> {
  console.log(fileToUpload.name);
  const endpoint = '/api/fileupload/single';
  const formData: FormData = new FormData();
  formData.append('fileKey', fileToUpload, fileToUpload.name);
  return this.httpClient
    .post(endpoint, formData/*, { headers: yourHeadersConfig }*/)
    .pipe(
      map(() => { return true; }),
      catchError((e) => this.handleError(e)),
    );
}
Here is the server-side code that receives the file and then calls the uploadFile(file) function:
app.post('/api/fileupload/single', async (req, res) => {
  try {
    if (!req.files) {
      res.send({
        status: false,
        message: 'No file uploaded'
      });
    } else {
      let file = req.files.fileKey;
      uploadFile(file);
      // send response
      res.send({
        status: true,
        message: 'File is uploaded',
        data: {
          name: file.name,
          mimetype: file.mimetype,
          size: file.size
        }
      });
    }
  } catch (err) {
    res.status(500).send(err);
  }
});
Thank you very much for your help in solving this!
Best regards, Samuel
The best way is to stream the file. Assuming you are reading it from disk, you could do this:
const fs = require("fs");
const aws = require("aws-sdk");

const s3Client = new aws.S3();
const Bucket = 'somebucket';

(async () => {
  // stream the file from disk straight into the upload
  const stream = fs.createReadStream("file.pdf");
  const Key = stream.path;
  const response = await s3Client.upload({ Bucket, Key, Body: stream }).promise();
  console.log(response);
})();
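If the file is not on disk but arrives with the POST request (as in the question), and a middleware such as express-fileupload exposes it as a Buffer, that Buffer can be passed as the Body instead. A sketch under that assumption:
// hypothetical helper: upload a file received through express-fileupload
// (file.data is already a Buffer in its default in-memory mode)
async function uploadReceivedFile(file) {
  return s3Client.upload({
    Bucket,
    Key: file.name,
    Body: file.data,          // Buffer from express-fileupload
    ContentType: file.mimetype
  }).promise();
}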
I have the following setup, by which I send the image, from its URL, to be edited and sent back to be uploaded to S3. The problem I currently have is that the image ends up corrupted on S3, and I am wondering if there is something in my code causing the issue.
Server side:
function convertImage(inputStream) {
  return gm(inputStream)
    .contrast(-2)
    .stream();
}
app.get('/resize/:imgDetails', (req, res, next) => {
  let params = req.params.imgDetails.split('&');
  let fileName = params[0]; console.log(fileName);
  let tileType = params[1]; console.log(tileType);

  res.set('Content-Type', 'image/jpeg');

  let url = `https://${process.env.Bucket}.s3.amazonaws.com/images/${tileType}/${fileName}`;
  convertImage(request.get(url)).pipe(res);
})
Client side:
axios.get('/resize/' + fileName + '&' + tileType)
  .then(res => {
    /** PUT FILE ON AWS **/
    var img = res;
    axios.post("/sign_s3_sized", {
      fileName: fileName,
      tileType: tileType,
      ContentType: 'image/jpeg'
    })
    .then(response => {
      var returnData = response.data.data.returnData;
      var signedRequest = returnData.signedRequest;
      var url = returnData.url;
      this.setState({ url: url })
      // Put the fileType in the headers for the upload
      var options = {
        headers: {
          'Content-Type': 'image/jpeg'
        }
      };
      axios.put(signedRequest, img, options)
        .then(result => {
          this.setState({ success: true });
        }).bind(this)
        .catch(error => {
          console.log("ERROR: " + JSON.stringify(error));
        })
    })
    .catch(error => {
      console.log(JSON.stringify(error));
    })
  })
  .catch(error => console.log(error))
Before going any further, I can assure you that uploading images via this setup works as long as the convertImage() step is left out; with it, the image gets put on S3 corrupted.
Any pointers as to what is causing the image corruption?
Is my understanding of streams here lacking perhaps? If so, what should I change?
Thank you!
EDIT 1:
I tried not running the image through the graphicsmagick API at all (request.get(url).pipe(res);) and the image is still corrupted.
EDIT 2:
I gave up at the end and just uploaded the file from Node.js straight to S3; it turned out to be better practice anyway.
So if your end goal is to upload the image to the S3 bucket using Node.js, there is a simple way using the multer-s3 node module.
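For reference, a minimal sketch of that multer-s3 approach (the bucket name and the 'image' field name are placeholders; assumes aws-sdk v2 and multer-s3 v2):
const express = require('express');
const multer = require('multer');
const multerS3 = require('multer-s3');
const AWS = require('aws-sdk');

const app = express();
const s3 = new AWS.S3();

const upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: 'my-upload-bucket',              // placeholder bucket name
    contentType: multerS3.AUTO_CONTENT_TYPE, // detect Content-Type automatically
    key: (req, file, cb) => cb(null, Date.now() + '_' + file.originalname)
  })
});

// 'image' is the form field name; multer-s3 sets req.file.location to the S3 URL
app.post('/upload', upload.single('image'), (req, res) => {
  res.json({ location: req.file.location });
});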
I'm trying to upload an image to my AWS S3 bucket after downloading the image from another URL using Node (using request-promise-native & aws-sdk):
'use strict';
const config = require('../../../configs');
const AWS = require('aws-sdk');
const request = require('request-promise-native');
AWS.config.update(config.aws);
let s3 = new AWS.S3();
function uploadFile(req, res) {
  function getContentTypeByFile(fileName) {
    var rc = 'application/octet-stream';
    var fn = fileName.toLowerCase();
    if (fn.indexOf('.png') >= 0) rc = 'image/png';
    else if (fn.indexOf('.jpg') >= 0) rc = 'image/jpg';
    return rc;
  }

  let body = req.body,
    params = {
      "ACL": "bucket-owner-full-control",
      "Bucket": 'testing-bucket',
      "Content-Type": null,
      "Key": null, // Name of the file
      "Body": null // File body
    };

  // Grabs the filename from a URL
  params.Key = body.url.substring(body.url.lastIndexOf('/') + 1);

  // Setting the content type
  params.ContentType = getContentTypeByFile(params.Key);

  request.get(body.url)
    .then(response => {
      params.Body = response;
      s3.putObject(params, (err, data) => {
        if (err) { console.log(`Error uploading to S3 - ${err}`); }
        if (data) { console.log("Success - Uploaded to S3: " + data.toString()); }
      });
    })
    .catch(err => { console.log(`Error encountered: ${err}`); });
}
The upload succeeds when I test it out; however, after trying to re-download it from my bucket, the image does not display. Additionally, I notice that after uploading the file with my function, the file listed in the bucket is much larger in file size than the originally uploaded image. I'm trying to figure out where I've gone wrong but cannot find it. Any help is appreciated.
Try to open the faulty file with a text editor; you will see some errors written in it.
You can try using s3.upload instead of putObject; it works better with streams.
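A rough sketch of that suggestion, assuming the plain request library is used for the download so the image bytes stay a stream and are never decoded into a string (the likely cause of the size blow-up and corruption); uploadFromUrl is a hypothetical helper, while the bucket name and getContentTypeByFile come from the question above:
const request = require('request'); // plain request, not request-promise-native

function uploadFromUrl(url, key) {
  return s3.upload({
    ACL: 'bucket-owner-full-control',
    Bucket: 'testing-bucket',
    Key: key,
    Body: request.get(url),                 // pass the response stream directly as the Body
    ContentType: getContentTypeByFile(key)  // helper from the question above
  }).promise();
}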