How to implement Google Drive file upload in a Vue.js client app - javascript

I've read this article about Google Drive integration in Node.js. I want to give the users of the app the ability to upload the processed files from the app to their Google Drive account. The article shows how to implement a Node.js solution, but since the server will run on localhost, how can I authorize the user on the client side using Vue.js?
I've found this question, but it's very old and I'm not sure if it can really help me at all.
At the moment my Node.js script saves the processed files on the user's machine using fs.writeFile.
// express endpoint
this.app.post('/processFiles', async (req, res) => {
    for (let file in req.files) {
        //console.log(req.files[file]);
        await this.composeData(req.files[file]);
    }
    res.setHeader('Content-Type', 'application/json');
    res.send({ processStatus: '', outputPath: this.tmpDir });
});

// processing file method (fs here is require('fs').promises so writeFile can be awaited)
async composeData(file) {
    // some file compression stuff
    this.output = path.format({ dir: this.tmpDir, base: file.name });
    await fs.writeFile(this.output, this.processedData);
}
Since I want to implement a client-side solution, I'm thinking of adding an endpoint to my Express server that sends the processed files back, so the client code can do the Google Drive upload.
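A rough sketch of what such an endpoint could look like (the route name and the use of res.download are assumptions for illustration, not part of the article):
// hypothetical sketch: return one processed file from tmpDir to the browser
this.app.get('/processedFile/:name', (req, res) => {
    const filePath = path.join(this.tmpDir, req.params.name);
    // res.download streams the file back with a Content-Disposition header
    res.download(filePath, (err) => {
        if (err && !res.headersSent) res.status(404).send({ error: 'file not found' });
    });
});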
//vue app code
processFiles() {
    const formData = new FormData();
    this.selectedFiles.forEach((file, i) => {
        formData.append(`file${i}`, file);
    });
    axios({
        method: 'POST',
        url: 'http://localhost:9000/processFiles',
        data: formData
    }).then((res) => {
        // here I want to implement the gdrive upload
        console.log(res);
    });
}
Can anyone provide me some help with this?
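For reference, a minimal sketch of what the client-side upload could look like with the Google Identity Services token client and the Drive v3 upload endpoint (CLIENT_ID is a hypothetical OAuth client ID created in the Google Cloud Console; this is an illustration of the approach, not code from the article):
// sketch: acquire an access token in the browser, then POST the file to Drive
const CLIENT_ID = 'YOUR_CLIENT_ID.apps.googleusercontent.com'; // hypothetical

function uploadToDrive(file) {
    // google.accounts.oauth2 is provided by the https://accounts.google.com/gsi/client script
    const tokenClient = google.accounts.oauth2.initTokenClient({
        client_id: CLIENT_ID,
        scope: 'https://www.googleapis.com/auth/drive.file',
        callback: async (tokenResponse) => {
            const metadata = { name: file.name, mimeType: file.type };
            const body = new FormData();
            body.append('metadata', new Blob([JSON.stringify(metadata)], { type: 'application/json' }));
            body.append('file', file);
            await fetch('https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart', {
                method: 'POST',
                headers: { Authorization: `Bearer ${tokenResponse.access_token}` },
                body
            });
        }
    });
    tokenClient.requestAccessToken();
}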

Related

Nodejs - Fetch file from url and send content to client

For some reason, I don't want to share the URL publicly (it's a secret URL).
My workflow is as below:
The client sends an API request to my server:
API: mywebsite.com/api/image_abc.jpg
I have a Node.js Express server that fetches the file from the URL:
E.g.: sercret_url.com/image_abc.jpg
Then, from the image content in the Node.js response, I send the image content back to the client and display it as image_abc.jpg.
I looked around on Stack Overflow, but only found answers about reading a file from disk and sending it to the client. What I want is to relay the image content to the client without saving the file to disk.
Thank you.
Assuming you want to return the contents of a file from a certain URL to the client as a buffer, here's the solution I suggest.
Get the file using axios and return the buffer to your client:
const axios = require('axios');

// run this inside an async Express route handler so that `await` and `res` are available
const URL = 'some-valid-url';
const response = await axios.get(
    URL,
    { responseType: 'arraybuffer' }
);
const buffer = Buffer.from(response.data);
res.status(200).send(buffer);
In case you want to save it to your server, you can use fs as follows to write the file to the folder of your choice:
const fs = require('fs');

fs.writeFile(fileName, buffer, (err) => {
    if (!err) console.log('Data written');
});
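Put together as a single relay route, it could look roughly like this (a sketch; the route path, port, and the SECRET_BASE_URL constant are assumptions, not part of the answer above):
const express = require('express');
const axios = require('axios');

const app = express();
// hypothetical secret origin that the client must never see
const SECRET_BASE_URL = 'https://sercret_url.com';

app.get('/api/:imageName', async (req, res) => {
    try {
        // fetch the image from the secret URL and relay the bytes without touching the disk
        const response = await axios.get(`${SECRET_BASE_URL}/${req.params.imageName}`, {
            responseType: 'arraybuffer'
        });
        res.set('Content-Type', response.headers['content-type']);
        res.status(200).send(Buffer.from(response.data));
    } catch (err) {
        res.status(502).send('Could not fetch image');
    }
});

app.listen(3000);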

Download file from FTP server to client using Node js

System architecture
The system consists of 3 components:
1: FTP server: Used to store files. Can only be accessed by the Node.js app. No direct access.
2: Node.js: Provides an API to interact with the FTP server and is the only way to interact with it. The Node.js app keeps no collection of the files stored on the FTP server; it only provides methods to work with them. The Node.js app should not store any file uploaded through it or downloaded from it.
3: Client: A user who will upload or download a file to the FTP server by using the Node.js app.
What I have done:
I am able to download the file stored on the FTP server by using the basic-ftp package. Here is the code for the download file function:
const ftp = require('basic-ftp');

async function downloadFile(folderPath, fileName, writeStream) {
    console.log(folderPath);
    console.log(fileName);
    const client = new ftp.Client();
    // client.ftp.verbose = true
    try {
        await client.access({
            'host': process.env.FTP_HOST,
            'user': process.env.FTP_USER,
            'password': process.env.FTP_PASSWORD,
        });
        await client.ensureDir(folderPath);
        await client.downloadTo(writeStream, fileName);
    }
    catch (err) {
        console.log(err);
    }
    client.close();
}
The file is downloaded to the directory named /downloads on the Node.js server. What I actually want is to download the file directly to the client's computer. To do that, I have tried streaming from the writeStream object passed to the download method. Here is the code for that:
app.post("/download/file", urlencodedParser, (req, res, next) => {
    var writeStream = fs.createWriteStream('./downloads/' + req.body.fileName);
    writeStream.on("data", (data) => {
        res.write(data);
    });
    writeStream.on("close", () => {
        res.end();
    });
    res.setHeader('Transfer-Encoding', 'chunked');
    downloadFile(req.body.folderName, req.body.fileName, writeStream);
});
This does not work. It always ends in an error without downloading the file completely.
Another approach I tried is to generate a URL for the file, which the client clicks to download it. The problem with this approach is that the file is not complete by the time the client starts downloading, which results in an incomplete download. For example, if the size of the file is 10 MB and only 2 MB had been downloaded by the time the client clicked the link, it will download only a 2 MB file, not 10 MB.
Goal:
Download the file to the client (browser) from an FTP server through Node.js.
Requirement:
Download the file stored on the FTP server directly to the client through Node.js.
Constraints:
The client does not have access to the FTP server.
The only way to access the server is through the Node.js app.
You can try passing res as the output stream directly. That way you simply redirect the stream from the FTP server to the client:
async function downloadFile(fileName, writeStream) {
    console.log(fileName);
    const client = new ftp.Client();
    // client.ftp.verbose = true
    try {
        await client.access({
            'host': process.env.FTP_HOST,
            'user': process.env.FTP_USER,
            'password': process.env.FTP_PASSWORD,
        });
        await client.downloadTo(writeStream, fileName);
    }
    catch (err) {
        console.log(err);
    }
    client.close();
}

app.post("/download/file", urlencodedParser, (req, res, next) => {
    downloadFile(req.body.fileName, res);
});
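On the browser side, a request like the following could consume that endpoint (a sketch; the endpoint and fileName field match the answer above, while the blob handling is an assumption):
// sketch: request the file and hand it to the browser as a download
async function downloadFromServer(fileName) {
    const response = await fetch('/download/file', {
        method: 'POST',
        headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
        body: new URLSearchParams({ fileName })
    });
    const blob = await response.blob();
    const link = document.createElement('a');
    link.href = URL.createObjectURL(blob);
    link.download = fileName;
    link.click();
    URL.revokeObjectURL(link.href);
}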

Node server to send image to cloud to be hosted

I send an image file to my Node server via my React app.
I want to host these images on google cloud or similar so they have an accessible URL.
I have tried using Cloudinary and Google Cloud, but to no avail thus far!
My react-side code (shortened):
const imageFile = this.state.files[0];
const formData = new FormData();
formData.append('file', imageFile);
sendImage(formData);

sendImage(image) {
    axios.post("https://137a6167.ngrok.io/image-upload", image, {
    })
    .then(res => { // then print response status
        console.log(res.statusText);
    });
}
The file is successfully sent to my server and logged to the console:
app.post('/image-upload', (req, res) => {
    console.log('consoling the req.body!!!!' + JSON.stringify(req.body));
});
THE CONSOLE: consoling the req.body!!!!{"1":"[object File]"}
I did try to use the following Cloudinary method, yet it threw errors:
cloudinary.config({
    cloud_name: process.env.CLOUD_NAME,
    api_key: process.env.API_KEY,
    api_secret: process.env.API_SECRET
});

app.use(formData.parse());

app.post('/image-upload', (req, res) => {
    const values = Object.values(req.files);
    const promises = values.map(image => cloudinary.uploader.upload(image.path));
    Promise
        .all(promises)
        .then(results => res.json(results));
});
This gave me an unhandled promise rejection error, and I got a bit lost with where to go beyond that.
I looked at Google Cloud Storage too but couldn't get it working. Any advice?
What I really want is to return the URL of the hosted image back to my React app, so it can be stored in the DB for the user.
If you can help at all that would be greatly appreciated, thank you.
There are a couple of things you need to fix on the front end before you try to upload to any cloud.
First you need to set the 'Content-Type': 'multipart/form-data' header in axios to send the file data properly.
Check this thread for more details: How do I set multipart in axios with react?
Then on the express side you need multer or some other similar library to receive the data. You can't access it from req.body. multer adds req.files for example.
https://github.com/expressjs/multer
Try these steps and then post the exact error message you are receiving from Google Cloud.
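A minimal sketch of the Express side with multer and Cloudinary (the 'file' field name matches the React code above; the temp folder, port, and route wiring are assumptions, not the answer's own code):
const express = require('express');
const multer = require('multer');
const cloudinary = require('cloudinary').v2;

cloudinary.config({
    cloud_name: process.env.CLOUD_NAME,
    api_key: process.env.API_KEY,
    api_secret: process.env.API_SECRET
});

const app = express();
const upload = multer({ dest: 'uploads/' }); // multer writes the incoming file to a temp folder

// 'file' must match the field name used in formData.append('file', imageFile)
app.post('/image-upload', upload.single('file'), async (req, res) => {
    try {
        const result = await cloudinary.uploader.upload(req.file.path);
        res.json({ url: result.secure_url }); // send the hosted URL back to the React app
    } catch (err) {
        res.status(500).json({ error: err.message });
    }
});

app.listen(3000);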

Export a file to Node server and then upload to S3

I'm generating an HTML webpage as a PDF and then exporting it locally. How can I save this file to my Node server and upload it to S3?
Please find the attached pseudocode:
const convertDataToPdf = (exportFlag, cb) => { // set to switch between export and save
    const doc = new jsPDF();
    // ... adding metadata and styling the pdf
    if (exportFlag) {
        doc.save('sample.pdf'); // export PDF locally
    } else {
        cb(doc.output()); // this converts the PDF to raw data to send to the server
    }
}
Based on this answer, I'm appending the raw PDF data to a new FormData object, and then making an ajax call to post the raw data to my Node.js server:
convertDataToPdf(false, pdfData => {
    let formData = new FormData();
    formData.append(`file-1`, pdfData);
    $.ajax({
        url: '/file-upload',
        data: formData,
        processData: false,
        contentType: false,
        type: 'POST',
    }).then(data => {
        console.log('PDF upload to s3 successful!', data);
    }).catch(err => {
        console.log('Error! PDF Upload to S3 failed', err);
    });
});
Now, how can I parse the raw PDF data on the server and upload it?
As an alternative, is it possible to save my file locally and then upload the file to s3?
First question: you can use multer on the Node server (https://www.npmjs.com/package/multer). This way you don't have to decode the PDF yourself. You just handle the request and pass the file to S3 (via the S3 Node API). You can check the mimetype to be sure someone is actually sending you a PDF.
Also, if you've got an application server such as Nginx in front, you can limit the size of transferred files.
For example, in Nginx: client_max_body_size 10M;. It's more secure to check the limit on the server, because naughty users can always cheat your client-side validations. Multer also has size validation if you would like to return a specific exception from your backend.
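A sketch of what that could look like with multer's memory storage and the AWS SDK v3 (the bucket name, region, and mimetype check are assumptions; the answer above only names the building blocks):
const express = require('express');
const multer = require('multer');
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');

const app = express();
const upload = multer({
    storage: multer.memoryStorage(),               // keep the PDF in memory as a Buffer
    limits: { fileSize: 10 * 1024 * 1024 },        // mirror the 10M Nginx limit
});
const s3 = new S3Client({ region: 'us-east-1' });  // hypothetical region

// 'file-1' matches the field name used in the FormData above
app.post('/file-upload', upload.single('file-1'), async (req, res) => {
    if (!req.file || req.file.mimetype !== 'application/pdf') {
        return res.status(400).send('Only PDF files are accepted');
    }
    await s3.send(new PutObjectCommand({
        Bucket: 'my-pdf-bucket',                   // hypothetical bucket name
        Key: req.file.originalname || 'sample.pdf',
        Body: req.file.buffer,
        ContentType: 'application/pdf',
    }));
    res.send('Uploaded');
});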

Sending form-data to Java server through a Node middleman

I have an application that uses axios to make requests to a Node server, which in turn makes requests to another Java server.
Call to the Node server from the client:
// here payload is FormData()
axios.post(url, payload).then((response) => {
    return callback(null, response);
}).catch((err) => {
    return callback(err, null);
});
In the node server, I listen to the request using busboy:
let rawData = '';
const busboy = new Busboy({ headers: req.headers });
busboy.on('file', function (fieldname, file, filename, encoding, mimetype) {
    file.on('data', function (chunk) {
        rawData += chunk;
    });
});
Now the Java server also expects FormData (just like the way I sent it to Node). How do I build the FormData in Node now? I have been googling hard and trying a lot of things, in vain. Any solution not involving busboy will help too.
I finally used the middleware busboy-body-parser, which adds support for getting files from the request object as req.files. Once the file is there, I send it as form-data to the Java web server using the form-data npm package. The req.files support used to be there by default in Express.js, but from 4.x it has been deprecated.
Multer is another really good middleware for handling multipart/form-data.
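A minimal sketch of that forwarding step (a sketch of the approach described above; JAVA_SERVER_URL and the 'file' field name are assumptions):
const express = require('express');
const axios = require('axios');
const busboyBodyParser = require('busboy-body-parser');
const FormData = require('form-data');

const JAVA_SERVER_URL = 'http://java-server.example.com/upload'; // hypothetical

const app = express();
app.use(busboyBodyParser());

app.post('/upload', async (req, res) => {
    // busboy-body-parser exposes the uploaded file as a Buffer on req.files
    const file = req.files.file;

    // rebuild multipart form-data and forward it to the Java server
    const form = new FormData();
    form.append('file', file.data, { filename: file.name, contentType: file.mimetype });

    const response = await axios.post(JAVA_SERVER_URL, form, { headers: form.getHeaders() });
    res.status(response.status).send(response.data);
});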
