Upload an image to an AWS S3 bucket using the express-fileupload library - javascript

Can anyone please help me? Using the express-fileupload library, I want to upload files and images to an AWS S3 bucket through a RESTful API.

In case you haven't found the answer yet, here is what I just found out:
const uploadSingleImage = async (file, s3, fileName) => {
  const bucketName = process.env.S3_BUCKET_NAME;
  if (!file.mimetype.startsWith('image')) {
    return { status: false, message: 'File uploaded is not an image' };
  }
  const params = {
    Bucket: bucketName,
    Key: fileName,
    Body: file.data,
    ACL: 'public-read',
    ContentType: file.mimetype,
  };
  return s3.upload(params).promise();
};
s3.upload(params).promise() resolves to an object that contains the Location (the URL) of the file you uploaded. You could construct that URL yourself, but doing so would not account for upload errors, so I think the approach above is the better solution.

Related

How to Upload file in a directory to minIO bucket

Hello everyone, I have a bucket named 'geoxing' on a MinIO server, and it contains a directory img/site. I want to upload a picture into the site directory using Node.js. Below is my code; I am getting the error Invalid bucket name: geoxing/img/site. How can I solve this error? Thanks.
savefile() {
  const filePath = 'D://repositories//uploads//geoxing//site//b57e46b4bcf879839b7074782sitePic.jpg';
  const bucketname = 'geoxing/img/site';
  var metaData = {
    'Content-Type': 'image/jpg',
    'Content-Language': 123,
    'X-Amz-Meta-Testing': 1234,
    example: 5678,
  };
  this.minioClient.fPutObject(
    bucketname,
    'b57e46b4bcf879839b7074782sitePic.jpg',
    filePath,
    metaData,
    function (err, objInfo) {
      if (err) {
        return console.log(err);
      }
      return console.log('Success', objInfo.etag);
    },
  );
}
In Amazon S3 and MinIO:
Bucket should be just the name of the bucket (eg geoxing)
Key should include the full path as well as the filename (eg img/site/b57e46b4bcf879839b7074782sitePic.jpg)
Amazon S3 and MinIO do not have 'folders' or 'directories', but they emulate directories by including the path in the Key. Folders do not need to be created prior to uploading -- they just magically appear when files are stored in that 'path'.
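The fix can be factored into a tiny helper (hypothetical, not part of the MinIO API) that splits a combined 'bucket/dir' string into the bucket name and object key that fPutObject actually expects:

```javascript
// Splits a combined "bucket/dir/subdir" string plus a file name into the
// bucket name and the object key (path + filename) that S3/MinIO expect.
function toBucketAndKey(combinedPath, fileName) {
  const [bucket, ...dirs] = combinedPath.split('/');
  return { bucket, key: [...dirs, fileName].join('/') };
}

// e.g. pass these to this.minioClient.fPutObject(bucket, key, filePath, metaData, cb)
const { bucket, key } = toBucketAndKey('geoxing/img/site', 'sitePic.jpg');
```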

AWS Image upload data to S3 from form is corrupt

I am using multer in a Lambda function to upload an image through an API POST request I am building; this is part of a form on my website.
This is the console log from cloud watch:
{
  fieldname: 'logo_image',
  originalname: '8824.png',
  encoding: '7bit',
  mimetype: 'image/png',
  destination: '/tmp/upload',
  filename: 'f7f44f5c39304937d10e90ceb7e9ddbb',
  path: '/tmp/upload/f7f44f5c39304937d10e90ceb7e9ddbb',
  size: 1376654
}
This is my js express function that accepts the image using multer:
routes.post('/', upload.single('logo_image'), async (req, res) => {
  const file = req.file
  // here I am just going into another function to check the form data is valid
  await checkTicketId(req, res)
});
I then upload the data to my S3 bucket:
const result = await uploadFile(req.file)

function uploadFile(file, policy_id) {
  const fileStream = fs.createReadStream(file.path)
  const uploadParams = {
    Bucket: bucketName,
    Body: fileStream,
    Key: file.filename,
  }
  return s3.putObject(uploadParams).promise()
}
The data is uploaded fine; however, it's just binary nonsense. If I return the object, I get data like this:
IHDR ... IDATx ... (the raw PNG bytes, rendered as several kilobytes of garbled text)
Ideally I want it stored as .webp. I have also tried using upload instead of putObject and changing the file extension. There doesn't seem to be any buffer data I can use on the req.file property either.
Any insight into this would be helpful; I've been stuck on it for a while now, and I've already enabled binary data through the API on AWS as well. Thanks!

How to use an S3 pre-signed POST url?

Actually, this is the first time I'm using S3 for uploading files. I have heard about pre-signed URLs, but apparently I can't set a limit on file size with them, so I found "pre-signed POST URLs" -- but they're a little bit weird! Surprisingly, I didn't find any example; maybe it's not what I want.
I'm getting pre-signed post url from the server:
const { S3 } = require("aws-sdk");

const s3 = new S3({
  accessKeyId: accessKey,
  secretAccessKey: secretKey,
  endpoint: api,
  s3ForcePathStyle: true,
  signatureVersion: "v4",
});

app.post("/get-url", (req, res) => {
  const key = `user/${uuri.v4()}.png`;
  const params = {
    Bucket: "bucketName",
    Fields: {
      Key: key,
      ContentType: "image/png",
    },
  };
  s3.createPresignedPost(params, function (err, data) {
    if (err) {
      console.error("Presigning post data encountered an error", err);
    } else {
      res.json({ url: data.url });
    }
  });
});
The weird thing is that the URL I get is not like a pre-signed URL: it's just the endpoint followed by the bucket name, with no query parameters and no options. As you might guess, I can't use this URL:
await axios.put(url, file, {
  headers: {
    "Content-Type": "image/png",
  },
});
I do not even know if I should use POST, or two requests. I tried both; nothing happens. Maybe a pre-signed POST URL is not like a pre-signed URL! At least show me an example -- I can't find any.
You are on the right track, but you need to change the method you are invoking. The AWS S3 API docs for the createPresignedPost() method that you are currently using state:
Get a pre-signed POST policy to support uploading to S3 directly from an HTML form.
Try changing this method to getSignedUrl():
Get a pre-signed URL for a given operation name.
const params = { Bucket: 'bucket', Key: 'key' };
s3.getSignedUrl('putObject', params, function (err, url) {
  if (err) {
    console.error("Presigning post data encountered an error", err);
  } else {
    res.json({ url });
  }
});
or synchronously:
const params = { Bucket: 'bucket', Key: 'key' };
const url = s3.getSignedUrl('putObject', params);
res.json({ url });
Alternatively, use a promise by executing getSignedUrlPromise():
Returns a 'thenable' promise that will be resolved with a pre-signed URL for a given operation name.
const params = { Bucket: 'bucket', Key: 'key' };
s3.getSignedUrlPromise('putObject', params)
  .then(url => {
    res.json({ url });
  }, err => {
    console.error("Presigning post data encountered an error", err);
  });
Please also read the notes parts of the API documentation to make sure that you understand the limitations of each method.
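For completeness, here is a sketch of how a client might consume the URL returned by getSignedUrl('putObject', ...). This is my own example, not from the answer; it assumes Node 18+ (for the global fetch) and a PNG upload:

```javascript
// A signed putObject URL is consumed with an HTTP PUT whose body is the raw
// file bytes.
async function uploadWithSignedUrl(signedUrl, fileBuffer) {
  const res = await fetch(signedUrl, {
    method: 'PUT', // PUT, not POST, for getSignedUrl('putObject', ...)
    headers: { 'Content-Type': 'image/png' },
    body: fileBuffer,
  });
  if (!res.ok) throw new Error(`Upload failed with status ${res.status}`);
  return objectUrl(signedUrl);
}

// The object's permanent URL is the signed URL with its query string removed.
function objectUrl(signedUrl) {
  return signedUrl.split('?')[0];
}
```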

upload file to s3 using nodejs

I have coded a function which uploads a file to S3 using aws-sdk, but I've got a problem.
When I replace Body with Body: "test", the function executes successfully and I can see the content test in S3.
But when I try to upload a document such as a PDF file, it fails and throws this error:
UnhandledPromiseRejectionWarning: TypeError [ERR_INVALID_ARG_TYPE]:
The "path" argument must be one of type string, Buffer, or URL.
Received type object
    at Object.open (fs.js:406:3)
    at ReadStream.open (internal/fs/streams.js:110:12)
    at new ReadStream (internal/fs/streams.js:99:10)
    at Object.createReadStream (fs.js:1725:10)
This is my code:
import { createWriteStream, createReadStream } from "fs";
import AWS from 'aws-sdk';
const fs = require('fs');

const uploadS3 = async ({ file }) => {
  const { stream, filename } = file;
  AWS.config.update({
    accessKeyId: '*******',
    secretAccessKey: '********',
    region: 'us-east-1'
  });
  let params = {
    Bucket: "test-files-staging",
    Key: 'documents/' + filename,
    Body: fs.createReadStream(file),
    ACL: 'public-read'
  };
  await new AWS.S3().putObject(params).promise().then(() => {
    console.log('Success!!!')
  }).catch((err) => { console.log(`Error: ${err}`) })
}
Thanks in advance.
Well, the error is pretty clear: the path argument to fs.createReadStream must be a string, Buffer, or URL, but you are passing it the whole file object.
Instead, use fs.readFileSync to read your file into a buffer (note that the promise-based fs.promises.readFile resolves to a promise, which is again neither a string, Buffer, nor URL and would fail the same way if passed on directly):
let params = {
  Bucket: "test-files-staging",
  Key: 'documents/' + filename,
  Body: fs.readFileSync(filename), // assuming filename is the full path to your object
  ACL: 'public-read'
};
More info on the fs functions can be found in the Node.js documentation.

How do I upload a base64 encoded image (string) directly to a Google Cloud Storage bucket using Node.js?

Currently, I am using the @google-cloud/storage NPM package to upload a file directly to a Google Cloud Storage bucket. This requires some trickery, as I only have the image's base64 encoded string. I have to:
1. Decode the string
2. Save it as a file
3. Send the file path to the below script to upload to Google Cloud Storage
4. Delete the local file
I'd like to avoid storing the file in the filesystem altogether since I am using Google App Engine and I don't want to overload the filesystem / leave junk files there if the delete operation doesn't work for whatever reason. This is what my upload script looks like right now:
// Convert the base64 string back to an image to upload into the Google Cloud Storage bucket
var base64Img = require('base64-img');
var filePath = base64Img.imgSync(req.body.base64Image, 'user-uploads', 'image-name');

// Instantiate the GCP Storage instance
var gcs = require('@google-cloud/storage')(),
    bucket = gcs.bucket('google-cloud-storage-bucket-name');

// Upload the image to the bucket
bucket.upload(__dirname.slice(0, -15) + filePath, {
  destination: 'profile-images/576dba00c1346abe12fb502a-original.jpg',
  public: true,
  validation: 'md5'
}, function(error, file) {
  if (error) {
    sails.log.error(error);
  }
  return res.ok('Image uploaded');
});
Is there any way to upload the base64 encoded string of the image directly, instead of having to convert it to a file and then upload using the path?
The solution, I believe, is to use the file.createWriteStream functionality that the bucket.upload function wraps in the Google Cloud Node SDK.
I've got very little experience with streams, so try to bear with me if this doesn't work right off.
First of all, we need to take the base64 data and drop it into a stream. For that, we include the stream library, create a buffer from the base64 data, and push the buffer onto the end of a PassThrough stream.
var stream = require('stream');
var bufferStream = new stream.PassThrough();
bufferStream.end(Buffer.from(req.body.base64Image, 'base64'));
We're then going to pipe the stream into a write stream created by the file.createWriteStream function.
var gcs = require('@google-cloud/storage')({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Define bucket.
var myBucket = gcs.bucket('my-bucket');
// Define file & file name.
var file = myBucket.file('my-file.jpg');

// Pipe the 'bufferStream' into a 'file.createWriteStream' method.
bufferStream.pipe(file.createWriteStream({
  metadata: {
    contentType: 'image/jpeg',
    metadata: {
      custom: 'metadata'
    }
  },
  public: true,
  validation: "md5"
}))
.on('error', function(err) {})
.on('finish', function() {
  // The file upload is complete.
});
Info on file.createWriteStream, File docs, bucket.upload, and the bucket.upload method code in the Node SDK.
So the way the above code works is: define the bucket you want to put the file in, then define the file and the file name. We don't set upload options here. We then pipe the bufferStream variable we just created into the file.createWriteStream method we discussed before; in its options we define the metadata and anything else you want to set. It was very helpful to look directly at the Node code on GitHub to figure out how they break down the bucket.upload function, and I recommend you do so as well. Finally, we attach a couple of events for when the upload finishes and when it errors out.
Posting my version of the answer in response to @krlozadan's request above:
// Convert the base64 string back to an image to upload into the Google Cloud Storage bucket
var mimeTypes = require('mimetypes');

var image = req.body.profile.image,
    mimeType = image.match(/data:([a-zA-Z0-9]+\/[a-zA-Z0-9-.+]+).*,.*/)[1],
    fileName = req.profile.id + '-original.' + mimeTypes.detectExtension(mimeType),
    base64EncodedImageString = image.replace(/^data:image\/\w+;base64,/, ''),
    imageBuffer = Buffer.from(base64EncodedImageString, 'base64'); // Buffer.from replaces the deprecated new Buffer()

// Instantiate the GCP Storage instance
var gcs = require('@google-cloud/storage')(),
    bucket = gcs.bucket('my-bucket');

// Upload the image to the bucket
var file = bucket.file('profile-images/' + fileName);
file.save(imageBuffer, {
  metadata: { contentType: mimeType },
  public: true,
  validation: 'md5'
}, function(error) {
  if (error) {
    return res.serverError('Unable to upload the image.');
  }
  return res.ok('Uploaded');
});
This worked just fine for me. Ignore some of the additional logic in the first few lines as they are only relevant to the application I am building.
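The data-URL handling in the first few lines can be isolated into a small pure helper. This is just a sketch built around the same regular-expression approach used in the answer above:

```javascript
// Splits a data URL like "data:image/png;base64,iVBOR..." into its MIME type
// and a Buffer of the decoded bytes.
function parseDataUrl(dataUrl) {
  const match = dataUrl.match(/^data:([a-zA-Z0-9]+\/[a-zA-Z0-9-.+]+);base64,(.*)$/);
  if (!match) throw new Error('Not a base64 data URL');
  return { mimeType: match[1], buffer: Buffer.from(match[2], 'base64') };
}
```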
If you want to save a string as a file in Google Cloud Storage, you can do it easily using the file.save method:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();
const myBucket = storage.bucket('my-bucket');
const file = myBucket.file('my-file.txt');
const contents = 'This is the contents of the file.';
file.save(contents).then(() => console.log('done'));
What an issue! I tried it and ran into the same problem: the image uploaded to Firebase Storage, but it would not download; the loader just kept spinning. After spending some time on it, I managed to upload the image to Firebase Storage with downloading working. The issue was the access token.
If you check the file location section at the bottom right, there is a "create access token" option and no access token shown there. If you create the access token manually and refresh the page, the image shows up. So the question is how to create it in code; just use the code below to create the access token:
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
// then pass it in the upload metadata:
metadata: { firebaseStorageDownloadTokens: uuid }
The full code for uploading an image to Firebase Storage is given below:
const functions = require('firebase-functions');
var firebase = require('firebase');
var express = require('express');
var bodyParser = require("body-parser");
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
const os = require('os');
const path = require('path');
const cors = require('cors')({ origin: true });
const Busboy = require('busboy');
const fs = require('fs');
var admin = require("firebase-admin");

var serviceAccount = {
  "type": "service_account",
  "project_id": "xxxxxx",
  "private_key_id": "xxxxxx",
  "private_key": "-----BEGIN PRIVATE KEY-----\jr5x+4AvctKLonBafg\nElTg3Cj7pAEbUfIO9I44zZ8=\n-----END PRIVATE KEY-----\n",
  "client_email": "xxxx@xxxx.iam.gserviceaccount.com",
  "client_id": "xxxxxxxx",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/firebase-adminsdk-5rmdm%40xxxxx.iam.gserviceaccount.com"
};

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  storageBucket: "xxxxx-xxxx" // use your storage bucket name
});

const app = express();
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());

app.post('/uploadFile', (req, response) => {
  response.set('Access-Control-Allow-Origin', '*');
  const busboy = new Busboy({ headers: req.headers });
  let uploadData = null;
  busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
    const filepath = path.join(os.tmpdir(), filename);
    uploadData = { file: filepath, type: mimetype };
    console.log("-------------->>", filepath);
    file.pipe(fs.createWriteStream(filepath));
  });
  busboy.on('finish', () => {
    const bucket = admin.storage().bucket();
    bucket.upload(uploadData.file, {
      uploadType: 'media',
      metadata: {
        metadata: {
          firebaseStorageDownloadTokens: uuid,
          contentType: uploadData.type,
        },
      },
    })
    .catch(err => {
      response.status(500).json({ error: err }); // was `res`, which is undefined in this handler
    });
  });
  busboy.end(req.rawBody);
});

exports.widgets = functions.https.onRequest(app);
You have to convert the base64 string to an image buffer and then upload it as below; the image_data_from_html variable is the data you extract from the HTML event.

const base64Text = image_data_from_html.split(';base64,').pop();
const imageBuffer = Buffer.from(base64Text, 'base64');
const contentType = image_data_from_html.split(';base64,')[0].split(':')[1];
const fileName = 'myimage.png';
const imageUrl = 'https://storage.googleapis.com/bucket-url/some_path/' + fileName;

await admin.storage().bucket().file('some_path/' + fileName).save(imageBuffer, {
  public: true,
  gzip: true,
  metadata: {
    contentType,
    cacheControl: 'public, max-age=31536000',
  }
});
console.log(imageUrl);
I was able to get the base64 string over to my Cloud Storage bucket with just a few lines of code:

var decodedImage = Buffer.from(poster64, 'base64'); // Buffer.from replaces the deprecated new Buffer()

// Store poster to storage
let posterFile = await client.file(decodedImage, `poster_${path}.jpeg`, { path: 'submissions/dev/', isBuffer: true, raw: true });
let posterUpload = await client.upload(posterFile, { metadata: { cacheControl: 'max-age=604800' }, public: true, overwrite: true });
let permalink = posterUpload.permalink;

Something to be aware of: if you are inside a Node.js environment, you won't be able to use atob().
The top answer of this post showed me the errors of my ways!
NodeJS base64 image encoding/decoding not quite working
