Uploading a file to S3 using a custom API and AWS Lambda - javascript

I am trying to send a file through Postman using the form-data type. The request is sent to an AWS Lambda through an API. The request content that arrives in the Lambda is corrupted: it contains a lot of question marks in place of the original bytes.
I would like to reconstruct the file from the request content and store it in S3.
Existing code -
const res = multipart.parse(event, false);
var file = res['File'];
var encodedFile = Buffer.from(file["content"], 'binary');
var encodedFilebs64 = Buffer.from(file["content"], 'binary').toString('base64');

const s3 = new AWS.S3();
const params = {
    Bucket: config.s3Bucket,
    Key: "asset_" + known_asset_id + '.bin',
    Body: encodedFile
};

await s3.upload(params).promise().then(function(data) {
    console.log(`File uploaded successfully. ${data.Location}`);
}, function(err) {
    console.error("Upload failed", err);
});
Response content from the CloudWatch logs -
https://i.stack.imgur.com/SvBfF.png
When I convert this back to binary and compare, the file is not the same as the original.
It would be helpful if someone could help me reconstruct the file from the request content and store it in S3.
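Question-mark corruption like this usually means the binary payload was run through a UTF-8 conversion before it reached the handler, so the original bytes are already lost by the time multipart.parse sees them. Below is a minimal sketch of one common workaround, assuming the API in front of the Lambda (for example API Gateway with binary media types enabled for the route) delivers the body base64-encoded and sets event.isBase64Encoded; multipart, config and known_asset_id are reused from the question's code:

// Sketch: decode the base64 body back to raw bytes before parsing,
// so the multipart parser used above sees the original content.
const decodedEvent = {
    ...event,
    body: event.isBase64Encoded
        ? Buffer.from(event.body, 'base64').toString('binary') // 'binary' (latin1) keeps the bytes intact
        : event.body,
};

const res = multipart.parse(decodedEvent, false);
const file = res['File'];
const fileBuffer = Buffer.from(file.content, 'binary');

await new AWS.S3().upload({
    Bucket: config.s3Bucket,
    Key: 'asset_' + known_asset_id + '.bin',
    Body: fileBuffer,
}).promise();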

Related

Forwarding an uploaded file stream to another server with Node.js

I'm building an Angular / Node.js application connected to a DMS (Alfresco). The server side Node.js / Express layer acts as a proxy to hide the complexity of Alfresco from the client:
Angular client <--> Node backend <--> Alfresco
This question is only about the Node.js backend.
When uploading a file I would like to forward the incoming file directly to Alfresco without temporarily storing it on the disk. With temporary disk storage this works as expected:
const fileUpload = require('express-fileupload');
const FormData = require('form-data');

// app is the express object
app.use(fileUpload({ createParentPath: true }));

app.post('/file', async (req, res) => {
    // filedata is the FormData field of the client containing the actual file
    let file = req.files.filedata;
    let tmpPath = __dirname + '/tmp/' + file.name;

    // save the file temporarily on the server
    file.mv(tmpPath, async function(err) {
        // create a new FormData object for the upload to the DMS
        const formData = new FormData();
        formData.append('name', name);
        // creates an fs.ReadStream object, which inherits from stream.Readable
        formData.append('filedata', fs.createReadStream(tmpPath));

        // upload the file to the DMS
        let response = await axios.post('/files/...', formData, { headers: formData.getHeaders() });

        // remove the temporary file
        fs.rm(tmpPath, () => {
            // return the answer of the DMS to the client
            res.send(response.data);
        });
    });
});
Now I would like to avoid the disk access and forward the file directly to the DMS. Taking into consideration "Converting a Buffer into a ReadableStream in Node.js", I tried the following three alternatives.
const { Readable } = require('stream');

app.post('/file', async (req, res) => {
    let file = req.files.fileData;

    // create a new FormData object for the upload to the DMS
    const formData = new FormData();
    formData.append('name', name);

    /* alternatives starting here */
    // Source: https://stackoverflow.com/questions/13230487/

    // #1
    const readable = new Readable();
    readable._read = () => {};
    readable.push(file.data);
    readable.push(null);

    // #2
    const readable = new Readable();
    readable._read = () => {};
    const buffer = new Buffer(file.data, 'utf-8');
    readable.push(buffer);
    readable.push(null);

    // #3
    const readable = Readable.from(file.data);

    /* alternatives ending here */

    // put the Readable into the FormData object
    formData.append('filedata', readable);

    // upload the file to the DMS
    let response = await axios.post('/files/...', formData, { headers: formData.getHeaders() });

    // return the answer of the DMS to the client
    res.send(response.data);
});
Whatever alternative I try, Alfresco always complains that required fields are missing. Nevertheless, all required fields are provided, since the example that stores the file temporarily works fine. I think Alfresco cannot handle the stream I provide, and that I don't completely understand how streams work in this situation. What should I do differently?
Please note that all error handling as well as the Alfresco request configuration / API URL is omitted for the sake of readability.
Try providing file-related information such as filename, contentType and knownLength:
let file = req.files.fileData;

const formData = new FormData();
// buffer with file-related info
formData.append(name, file.data, {
    filename: file.name,
    contentType: file.mimetype,
    knownLength: file.size
});

// upload the file to the DMS
let response = await axios.post('/files/...', formData, { headers: formData.getHeaders() });

// return the answer of the DMS to the client
res.send(response.data);
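The likely reason this matters: when form-data is handed a bare Readable or a raw Buffer, it cannot infer a filename, content type, or part length for the multipart section, and a server that validates those fields (as Alfresco appears to) rejects the request. A minimal sketch of the same fix applied to the stream variant from the question, assuming the express-fileupload properties file.name, file.mimetype and file.size used above:

const { Readable } = require('stream');

// variant #1 from the question: wrap the in-memory buffer in a Readable
const readable = new Readable();
readable._read = () => {};
readable.push(file.data);
readable.push(null);

// pass the part metadata that form-data cannot infer from a bare stream
formData.append('filedata', readable, {
    filename: file.name,
    contentType: file.mimetype,
    knownLength: file.size
});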

Google Cloud Bucket filepath

I am trying to make a connection to DialogFlow from a Cloud Function. For that, I need credentials that I received from Google.
When I run the code locally I can easily pass the credential.json file by specifying its path, but when I run the code from the Cloud Function I receive the error
Error: ENOENT: no such file or directory, open '/workspace/G:/Business/SlipSlop/functions/chatDialogFlow/functions/credential.json'
which is expected, as there is no such path in the Cloud Function. So I have uploaded the file to a Google Cloud Storage bucket so I can access it. Below is the code to access the bucket:
const storage = new Storage();
const bucket = storage.bucket("d-slslop-bucket-01");
const file = bucket.file("credential.json");
After passing the file to DialogFlow to make the connection, I received another error:
TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string. Received an instance of File
I don't know how I can pass the file path so that I can make the connection to DialogFlow.
DialogFlow connection code:
async function runSample(input, projectId = "****-****") {
    const storage = new Storage();
    const bucket = storage.bucket("d-slslop-bucket-01");
    const file = bucket.file("credential.json");

    console.log("File path: " + file);

    // A unique identifier for the given session
    const sessionId = uuid.v4();

    // Create a new session
    const sessionClient = new dialogflow.SessionsClient({
        keyFilename: "https://storage.googleapis.com/d-slslop-bucket-01/credential.json"
        // file
        // "G:/Business/SlipSlop/functions/chatDialogFlow/functions/credential.json"
        // "functions/chatDialogFlow/functions/credential.json"
    });

    const sessionPath = sessionClient.projectAgentSessionPath(projectId, sessionId);

    // The text query request.
    const request = {
        session: sessionPath,
        queryInput: {
            text: {
                // The query to send to the dialogflow agent
                text: input,
                // The language used by the client (en-US)
                languageCode: "en-US",
            },
        },
    };

    // Send request and log result
    const responses = await sessionClient.detectIntent(request);
    console.log("Detected intent");
    const result = responses[0].queryResult;
    console.log(" Query: " + result.queryText);
    console.log(" Response: " + result.fulfillmentText);
    if (result.intent) {
        console.log(" Intent: " + result.intent.displayName);
    } else {
        console.log(" No intent matched.");
    }
}
This is where I pass the credential.json file:
const sessionClient = new dialogflow.SessionsClient({
    keyFilename:
        // file
        // "G:/Business/SlipSlop/functions/chatDialogFlow/functions/credential.json"
        // "functions/chatDialogFlow/functions/credential.json"
});
What works locally: if I pass the path as
"G:/Business/SlipSlop/functions/chatDialogFlow/functions/credential.json"
I am able to make the connection.
But when I deploy my code to Firebase Cloud Functions, I don't know how to pass the path to the JSON file.
Passing only the name of the file solves the problem.
Instead of this
"https://storage.googleapis.com/d-slslop-bucket-01/credential.json"
I define only the file name, since the file is present in my Cloud Function directory:
"credential.json"

Convert blob to PDF and save to S3 using AWS SDK

I am trying to convert a document file to PDF and upload it to S3 using my browser.
The API I'm using to convert the file returns a blob. How can I convert the blob to a PDF file and save it to S3?
Currently my code looks like this:
function addFile(file) {
    console.log("File!", file);
    var fileName = Date.now().toString();
    var albumPhotosKey = encodeURIComponent("files") + '/';
    var photoKey = albumPhotosKey + fileName;
    s3.upload({
        Key: photoKey,
        Body: file + ".pdf",
        ACL: 'public-read'
    }, function(err, data) {
        if (err) {
            console.log(err);
            return alert('There was an error uploading your photo: ', err.message);
        }
        alert('Successfully uploaded photo.');
    });
}
I tried converting the blob to a file using this
var file = new File([blobData], "filename.pdf", {type: "application/pdf", lastModified: Date.now()});
and then passed the file to the addFile() function, but it creates an object whose content is the string [object File].pdf.
How can I create a PDF file with the blob contents?
To convert a base64 string to a PDF on the fly and write it to S3 as a file, I used this approach:
const AWS = require('aws-sdk');
const S3 = new AWS.S3();

const base64String = 'abcxyz';
let buffer = Buffer.from(base64String, 'base64');

let params = {
    Bucket: 'your-bucket-name',
    Key: 'user.pdf',
    Body: buffer
};

S3.upload(params, (err, data) => {
    if (err) {
        console.log(err);
    } else {
        console.log(data.Location);
    }
});
The other approach is to first create the PDF from the string on your local machine using the 'fs' module, then read that PDF back and upload it to S3. But that is a roundabout process and I personally don't recommend it.
Hope it helps.
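Since the question is about uploading from the browser, another option worth noting: the browser build of the AWS SDK accepts a Blob directly as the upload Body, so the blob returned by the conversion API can be passed through without wrapping it in a File or concatenating it into a string. A sketch reusing the s3 client and key layout from the question (blobData is assumed to be the blob returned by the conversion API):

function addBlobAsPdf(blobData) {
    var photoKey = encodeURIComponent("files") + '/' + Date.now().toString() + '.pdf';
    s3.upload({
        Key: photoKey,
        Body: blobData,                  // a Blob is a valid Body type in the browser SDK
        ContentType: 'application/pdf',
        ACL: 'public-read'
    }, function(err, data) {
        if (err) {
            return alert('There was an error uploading your file: ' + err.message);
        }
        alert('Successfully uploaded file.');
    });
}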

How do I upload a base64 encoded image (string) directly to a Google Cloud Storage bucket using Node.js?

Currently, I am using the #google-cloud/storage NPM package to upload a file directly to a Google Cloud Storage bucket. This requires some trickery as I only have the image's base64 encoded string. I have to:
1. Decode the string
2. Save it as a file
3. Send the file path to the below script to upload to Google Cloud Storage
4. Delete the local file
I'd like to avoid storing the file in the filesystem altogether since I am using Google App Engine and I don't want to overload the filesystem / leave junk files there if the delete operation doesn't work for whatever reason. This is what my upload script looks like right now:
// Convert the base64 string back to an image to upload into the Google Cloud Storage bucket
var base64Img = require('base64-img');
var filePath = base64Img.imgSync(req.body.base64Image, 'user-uploads', 'image-name');

// Instantiate the GCP Storage instance
var gcs = require('@google-cloud/storage')(),
    bucket = gcs.bucket('google-cloud-storage-bucket-name');

// Upload the image to the bucket
bucket.upload(__dirname.slice(0, -15) + filePath, {
    destination: 'profile-images/576dba00c1346abe12fb502a-original.jpg',
    public: true,
    validation: 'md5'
}, function(error, file) {
    if (error) {
        sails.log.error(error);
    }
    return res.ok('Image uploaded');
});
Is there any way to upload the base64-encoded string of the image directly, instead of having to convert it to a file and then upload using the path?
The solution, I believe, is to use the file.createWriteStream functionality that the bucket.upload function wraps in the Google Cloud Node SDK.
I've got very little experience with streams, so try to bear with me if this doesn't work right off.
First of all, we need to take the base64 data and drop it into a stream. For that, we're going to include the stream library, create a buffer from the base64 data, and add the buffer to the end of the stream.
var stream = require('stream');
var bufferStream = new stream.PassThrough();
bufferStream.end(Buffer.from(req.body.base64Image, 'base64'));
More on decoding base64 and creating the stream.
We're then going to pipe the stream into a write stream created by the file.createWriteStream function.
var gcs = require('@google-cloud/storage')({
    projectId: 'grape-spaceship-123',
    keyFilename: '/path/to/keyfile.json'
});

// Define bucket.
var myBucket = gcs.bucket('my-bucket');
// Define file & file name.
var file = myBucket.file('my-file.jpg');

// Pipe the 'bufferStream' into a 'file.createWriteStream' method.
bufferStream.pipe(file.createWriteStream({
    metadata: {
        contentType: 'image/jpeg',
        metadata: {
            custom: 'metadata'
        }
    },
    public: true,
    validation: "md5"
}))
.on('error', function(err) {})
.on('finish', function() {
    // The file upload is complete.
});
Info on file.createWriteStream, File docs, bucket.upload, and the bucket.upload method code in the Node SDK.
So the way the above code works is to define the bucket you want to put the file in, then define the file and the file name; we don't set upload options here. We then pipe the bufferStream variable we just created into the file.createWriteStream method we discussed before, and in its options we define the metadata and any other options you want to set. It was very helpful to look directly at the Node SDK code on GitHub to figure out how they break down the bucket.upload function, and I recommend you do so as well. Finally, we attach a couple of events for when the upload finishes and when it errors out.
Posting my version of the answer in response to @krlozadan's request above:
// Convert the base64 string back to an image to upload into the Google Cloud Storage bucket
var mimeTypes = require('mimetypes');

var image = req.body.profile.image,
    mimeType = image.match(/data:([a-zA-Z0-9]+\/[a-zA-Z0-9-.+]+).*,.*/)[1],
    fileName = req.profile.id + '-original.' + mimeTypes.detectExtension(mimeType),
    base64EncodedImageString = image.replace(/^data:image\/\w+;base64,/, ''),
    imageBuffer = Buffer.from(base64EncodedImageString, 'base64');

// Instantiate the GCP Storage instance
var gcs = require('@google-cloud/storage')(),
    bucket = gcs.bucket('my-bucket');

// Upload the image to the bucket
var file = bucket.file('profile-images/' + fileName);
file.save(imageBuffer, {
    metadata: { contentType: mimeType },
    public: true,
    validation: 'md5'
}, function(error) {
    if (error) {
        return res.serverError('Unable to upload the image.');
    }
    return res.ok('Uploaded');
});
This worked just fine for me. Ignore some of the additional logic in the first few lines, as it is only relevant to the application I am building.
If you want to save a string as a file in Google Cloud Storage, you can do it easily using the file.save method:
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
const myBucket = storage.bucket('my-bucket');
const file = myBucket.file('my-file.txt');
const contents = 'This is the contents of the file.';

file.save(contents).then(() => console.log('done'));
:) What an issue! I tried it and hit the same problem: the image uploaded to Firebase Storage, but it would not download and the loader just kept spinning. After spending some time on it, I managed to upload the image to Firebase Storage with a working download. The problem was a missing access token.
Check the screenshot: in the file location section at the bottom right there is a "create access token" option, and no access token is shown there. If you create one manually and refresh the page, the image shows up. So the question is how to create it in code.
Just use the code below to create the access token:
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
metadata: { firebaseStorageDownloadTokens: uuid }
The full code for uploading an image to Firebase Storage is given below:
const functions = require('firebase-functions');
var firebase = require('firebase');
var express = require('express');
var bodyParser = require("body-parser");
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
const os = require('os');
const path = require('path');
const cors = require('cors')({ origin: true });
const Busboy = require('busboy');
const fs = require('fs');
var admin = require("firebase-admin");

var serviceAccount = {
    "type": "service_account",
    "project_id": "xxxxxx",
    "private_key_id": "xxxxxx",
    "private_key": "-----BEGIN PRIVATE KEY-----\jr5x+4AvctKLonBafg\nElTg3Cj7pAEbUfIO9I44zZ8=\n-----END PRIVATE KEY-----\n",
    "client_email": "xxxx@xxxx.iam.gserviceaccount.com",
    "client_id": "xxxxxxxx",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://oauth2.googleapis.com/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
    "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/firebase-adminsdk-5rmdm%40xxxxx.iam.gserviceaccount.com"
};

admin.initializeApp({
    credential: admin.credential.cert(serviceAccount),
    storageBucket: "xxxxx-xxxx" // use your storage bucket name
});

const app = express();
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());

app.post('/uploadFile', (req, response) => {
    response.set('Access-Control-Allow-Origin', '*');
    const busboy = new Busboy({ headers: req.headers });
    let uploadData = null;

    busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
        const filepath = path.join(os.tmpdir(), filename);
        uploadData = { file: filepath, type: mimetype };
        console.log("-------------->>", filepath);
        file.pipe(fs.createWriteStream(filepath));
    });

    busboy.on('finish', () => {
        const bucket = admin.storage().bucket();
        bucket.upload(uploadData.file, {
            uploadType: 'media',
            metadata: {
                metadata: {
                    firebaseStorageDownloadTokens: uuid,
                    contentType: uploadData.type,
                },
            },
        })
        .then(() => {
            // respond on success so the request does not hang
            response.status(200).json({ uploaded: true });
        })
        .catch(err => {
            response.status(500).json({
                error: err,
            });
        });
    });

    busboy.end(req.rawBody);
});
exports.widgets = functions.https.onRequest(app);
You have to convert the base64 data to an image buffer and then upload it as below. You need to provide the image_data_from_html variable as the data you extract from the HTML event.
const base64Text = image_data_from_html.split(';base64,').pop();
const imageBuffer = Buffer.from(base64Text, 'base64');
const contentType = image_data_from_html.split(';base64,')[0].split(':')[1];
const fileName = 'myimage.png';
const imageUrl = 'https://storage.googleapis.com/bucket-url/some_path/' + fileName;

await admin.storage().bucket().file('some_path/' + fileName).save(imageBuffer, {
    public: true,
    gzip: true,
    metadata: {
        contentType,
        cacheControl: 'public, max-age=31536000',
    }
});

console.log(imageUrl);
I was able to get the base64 string over to my Cloud Storage bucket with just one line of code.
var decodedImage = Buffer.from(poster64, 'base64');
// Store Poster to storage
let posterFile = await client.file(decodedImage, `poster_${path}.jpeg`, { path: 'submissions/dev/', isBuffer: true, raw: true });
let posterUpload = await client.upload(posterFile, { metadata: { cacheControl: 'max-age=604800' }, public: true, overwrite: true });
let permalink = posterUpload.permalink
Something to be aware of is that if you are inside a Node.js environment you won't be able to use atob().
The top answer of this post showed me the error of my ways:
NodeJS base64 image encoding/decoding not quite working
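Following up on the atob() note above, a small sketch of the equivalent conversions in Node.js using only Buffer (no extra dependencies assumed):

// Older Node.js versions have no atob()/btoa(); Buffer covers both directions.
const fromBase64 = (b64) => Buffer.from(b64, 'base64').toString('binary');
const toBase64 = (bin) => Buffer.from(bin, 'binary').toString('base64');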

S3 file upload file object using node js

I'm using Sails.js 0.12.1 and Node.js 4.2.6.
I want to upload a file from the front end (Angular.js) through an API, and from the backend I want to upload the file to an AWS S3 bucket.
The front end sends the file to the API. In the backend I receive the file with its name, but while uploading the file to S3 I get the error
Cannot determine length of [object Object]
I googled the error and found many links, but no luck.
Back-end
uploadPersonAvtar: function(req, res) {
    var zlib = require('zlib');
    var file = req.file('image');
    var mime = require('mime');

    data = {
        Bucket: 'bucket',
        Key: 'my_key',
        Key: file.name,
        Body: file,
        ContentType: mime.lookup(file.name)
    };

    // Upload the stream
    var s3obj = new AWS.S3(S3options);
    s3obj.upload(data, function(err, data) {
        if (err) console.log("An error occurred", err);
        console.log("Uploaded the file at", data);
    });
}
Is my approach correct?
If yes, what am I doing wrong?
I want to know how to use the file object to upload the file.
I can create a read stream, but I don't have the file path; when I create the file object I get the error: path must be a string.
You could look at the following example about streaming data: Amazon S3: Uploading an arbitrarily sized stream (upload)
var fs = require('fs');
var zlib = require('zlib');

var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
var s3obj = new AWS.S3({ params: { Bucket: 'myBucket', Key: 'myKey' } });
s3obj.upload({ Body: body })
    .on('httpUploadProgress', function(evt) { console.log(evt); })
    .send(function(err, data) { console.log(err, data); });
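To relate that back to the Sails/Skipper code in the question: req.file('image') is an upstream object, not a plain file, which is why the SDK cannot determine its length. One option (a sketch, assuming Skipper's default behaviour of writing the upload to a temp directory and exposing its path as the .fd property; bucket name and S3options are reused from the question) is to let Skipper finish the upload first and then stream the file from disk:

uploadPersonAvtar: function(req, res) {
    var fs = require('fs');
    var mime = require('mime');

    req.file('image').upload(function(err, uploadedFiles) {
        if (err) return res.serverError(err);
        if (uploadedFiles.length === 0) return res.badRequest('No file was uploaded');

        var uploaded = uploadedFiles[0];             // .fd holds the temp file path
        var s3obj = new AWS.S3(S3options);
        s3obj.upload({
            Bucket: 'bucket',
            Key: uploaded.filename,
            Body: fs.createReadStream(uploaded.fd),  // stream the file from the temp path
            ContentType: mime.lookup(uploaded.filename)
        }, function(err, data) {
            if (err) return res.serverError(err);
            return res.ok(data.Location);
        });
    });
}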
