I need to upload an image to Google Drive via Node.js after cropping it with an image cropping library. Previously I uploaded images with a file input field, so I could get a buffer of the image in the backend (Node.js, using the express-fileupload library). Now the problem is that after cropping I have the image in the form of
data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAA....
How can I send an image in this form to the backend so that I can get a buffer of it and upload it to Google Drive? Alternatively, could I upload it directly to Google Drive from the frontend (JavaScript)? I tried using FormData, but I only get a string, not a buffer.
It depends on how your backend and frontend are connected. A plain HTTP POST can carry the image in this form (HTTP itself imposes no 64KB limit), but body-parsing middleware usually enforces a size limit - for example, Express's express.json() defaults to 100kb - so you may need to raise it for large base64 payloads.
Alternatively, it is very possible to send the image as a base64 string to your backend over a WebSocket library like Socket.IO.
For example, let's take an Express backend and a traditional static HTML homepage with some JavaScript:
// My webpage, which has the blob - browser JavaScript loaded from an HTML file
// Hosted on http://localhost:3000
const socket = io("http://localhost:4000");

// I have a cake.png in my static directory
fetch("cake.png")
  .then((response) => response.blob())
  .then((blob) => {
    const fileReader = new FileReader();
    fileReader.onloadend = () => {
      // Send the base64 data URL to the server only once the FileReader
      // has finished; emitting on 'connect' would race against it
      socket.emit("blob", fileReader.result);
    };
    fileReader.readAsDataURL(blob);
  })
  .catch((error) => {
    console.error(error);
  });
My Node.js file, which runs the Socket.IO server and the Drive API.
Socket.IO server hosted on http://localhost:4000:
const { Server } = require("socket.io");
const { google } = require("googleapis");
const { Readable } = require("stream");

const io = new Server(4000, {
  cors: {
    origin: "*",
    methods: ["GET", "POST", "OPTIONS"],
  },
});

/* Initialize the drive
 * ......
 * ......
 */

io.on("connection", (socket) => {
  socket.on("blob", (dataUrl) => {
    // Strip the "data:image/...;base64," prefix, decode to a Buffer,
    // and wrap it in a stream for the Drive API
    const base64 = dataUrl.split(",")[1];
    const body = Readable.from(Buffer.from(base64, "base64"));
    drive.files.create({
      requestBody: {
        name: "image.png",
        mimeType: "image/png",
      },
      media: {
        mimeType: "image/png",
        body,
      },
    });
  });
});
Therefore, the decoded image can be used as the input for body when creating files.
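If you prefer to stay with a plain HTTP POST (with the body-size limit raised), the server-side decoding could be sketched as below; the helper, field, and route names are illustrative assumptions, not from the original post:

```javascript
// Sketch: turn a base64 data URL (the cropper's output) into a Buffer
// on the Node side.
function dataUrlToBuffer(dataUrl) {
  const match = /^data:([^;]+);base64,(.*)$/.exec(dataUrl);
  if (!match) throw new Error("not a base64 data URL");
  return { mimeType: match[1], buffer: Buffer.from(match[2], "base64") };
}

// In an Express app (with the JSON body limit raised):
//   app.use(express.json({ limit: "10mb" }));
//   app.post("/upload", (req, res) => {
//     const { mimeType, buffer } = dataUrlToBuffer(req.body.image);
//     // buffer can now go to the Drive API, e.g. as Readable.from(buffer)
//     res.sendStatus(200);
//   });
```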
For some reason, I don't want to share the URL publicly (it's a secret URL).
My workflow is as follows:
The client sends an API request to my server:
API: mywebsite.com/api/image_abc.jpg
My Node.js Express server fetches the file from the URL:
E.g.: sercret_url.com/image_abc.jpg
Then, from the image content in the Node.js response, I send the image content back to the client and display it as image_abc.jpg.
I looked around on Stack Overflow, but only found answers about reading a file from disk and sending it to the client. What I want is to just relay the image content to the client, without saving the file to disk.
Thank you.
Assuming you want to return the contents of a file at a certain URL to the client as a buffer, here's the solution I suggest.
Get the file using axios and return the buffer to your client:
const axios = require('axios');

app.get('/api/:image', async (req, res) => {
  const URL = 'some-valid-url';
  const response = await axios.get(URL, { responseType: 'arraybuffer' });
  // response.data is already binary here; no encoding argument is needed
  const buffer = Buffer.from(response.data);
  res.status(200).send(buffer);
});
In case you want to save it to your server instead, you can use fs to write the buffer to the folder of your choice:
const fs = require('fs');
fs.writeFile(fileName, buffer, (err) => {
  if (!err) console.log('Data written');
});
I've read this article about the Google Drive implementation in Node.js. I want to give the users of the app the ability to upload the processed files from the app to their Google Drive account. The article shows how to implement a Node.js solution, but since the server will run on localhost, how can I authorize the user on the client side using Vue.js?
I've found this question, but it's very old and I'm not sure if it can really help me.
At the moment my Node.js script saves the processed files on the user's machine using fs.writeFile.
// express endpoint
this.app.post('/processFiles', async (req, res) => {
for(let file in req.files){
//console.log(req.files[file]);
await this.composeData(req.files[file]);
}
res.setHeader('Content-Type', 'application/json');
res.send({processStatus: '', outputPath: this.tmpDir});
});
// processing file method
async composeData(file){
  //some file compression stuff
  this.output = path.format({dir: this.tmpDir, base: file.name});
  await fs.promises.writeFile(this.output, this.processedData);
}
Since I want to implement a client side solution, I'm thinking to add an endpoint to my express server that will send processed files back so the client code can do the gdrive upload.
//vue app code
processFiles(){
const formData = new FormData();
this.selectedFiles.forEach( (file, i) => {
formData.append(`file${i}`, file);
});
axios({
method: 'POST',
url: 'http://localhost:9000/processFiles',
data: formData
}).then( (res) => {
//here i want to implement gdrive upload
console.log(res);
});
}
Can anyone help me with this?
We have a Spring Boot Java backend and a frontend in Vue.js. The backend calls an external API to get an MP3, and the Vue.js frontend calls the backend API to get the MP3 file and play it.
Sometimes the browsers show erratic behavior and don't play the MP3 because of a CORS issue.
Because of this, I want to download the file in the backend itself and return, maybe, bytes to the frontend.
I am very new to the frontend and am wondering what would be the best way to download the MP3 from the backend, in which format the MP3 file should be sent to the frontend, and how to convert the bytes to an MP3 using JavaScript (Vue).
Please note the file size is extremely small. The file is a <5 sec MP3, so I don't want to store the file on the server.
From Spring you can return an array of bytes:
File f = new File("/path/to/the/file");
byte[] file = Files.readAllBytes(f.toPath());
HttpHeaders headers = new HttpHeaders();
headers.set("Content-Disposition", "attachment; filename=\"" + f.getName() + ".mp3\"");
ResponseEntity<byte[]> response = new ResponseEntity<>(file, headers, HttpStatus.OK);
return response;
and then get it in JavaScript like so:
import fileDownload from 'js-file-download';

const { data } = await axios.get('http://api.url', {
  responseType: 'arraybuffer',
  headers: { 'Accept': '*/*' }
});
const blob = new Blob([data], {
  type: 'audio/mpeg'
});
fileDownload(blob, 'your_filename.mp3');
I used React and axios, but I hope you'll manage ;)
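Since the goal here is playback rather than download, the same received bytes can also be fed to an Audio element. A minimal sketch - the stand-in bytes below are fabricated, and the commented lines are browser-only:

```javascript
// `data` stands in for the ArrayBuffer received from axios.
const data = new Uint8Array([0xff, 0xfb, 0x90, 0x00]);
const blob = new Blob([data], { type: "audio/mpeg" });

// Browser-only continuation: play the clip instead of saving it.
//   const url = URL.createObjectURL(blob);
//   new Audio(url).play();
//   // release it later with URL.revokeObjectURL(url);
```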
Using Azure function app, I want to be able to download images from different urls to a particular folder, zip them and send the zip file back in the response.
I'm able to achieve this by following the steps below:
Request the file
Save the file locally
Zip the directory using archiver
Read the zipped file and convert it to base64
Send the buffer in the response body
Download and save image:
const img = await request(url, { encoding: "binary" });
await new Promise((resolve, reject) => {
  fs.writeFile(fileName, img, "binary", (err) => {
    if (err) {
      reject(`Error while writing the file: ${err}`);
    } else {
      resolve(img);
    }
  });
});
Zip the directory, read the zipped file and send the response:
await zipDirectory(dirName, targetFile);
context.log('Target ' + targetFile);
// readFile already returns a Buffer; no base64 round trip is needed
const fileBuffer = await readFile(targetFile);
context.res = {
  body: fileBuffer,
  headers: {
    "Content-Disposition": `filename=target.zip`,
    "Content-Type": "application/zip"
  },
  status: 202
};
Is there a better way to do this?
Create a function with an HTTP trigger, where the input is the URI of the image, with an output binding to a blob container. Its logic saves the image to blob storage.
Create another function that is blob-triggered: it grabs the file, zips it, and places the result in its own output blob binding.
Your zipped file will then be in the output blob container.
Alternatively, you can orchestrate the entire process with a Durable Function.
I'm generating an HTML webpage as a PDF and then exporting it locally. How can I save this file to my Node server and upload it to S3?
Please find the attached pseudo code:
const convertDataToPdf = (exportFlag, cb) => { // flag to switch between export and save
  const doc = new jsPDF();
  // ... adding metadata and styling the pdf
  if (exportFlag) {
    doc.save('sample.pdf'); // export PDF locally
  } else {
    cb(doc.output()); // convert the PDF to raw data to send to the server
  }
}
Based on this answer, I'm appending the raw PDF data to a new FormData object, then making an ajax call to post the raw data to my Node.js server:
convertDataToPdf(false, pdfData => {
  let formData = new FormData();
  formData.append(`file-1`, pdfData);
  $.ajax({
    url: '/file-upload',
    data: formData,
    processData: false,
    contentType: false,
    type: 'POST',
  }).then(data => {
    console.log('PDF upload to s3 successful!', data);
  }).catch(err => {
    console.log('Error! PDF Upload to S3 failed', err);
  });
});
Now, how can I parse the raw PDF data on the server and upload it?
As an alternative, is it possible to save my file locally and then upload the file to s3?
First question - on the Node server you can use multer (https://www.npmjs.com/package/multer). This way you don't have to decode the PDF yourself: you just handle the request and pass the file to S3 (via the S3 Node API). You can check the mimetype to be sure someone is actually sending you a PDF.
If you have an application server such as Nginx in front, you can also limit the transferred file size.
For example, in Nginx: client_max_body_size 10M;. It's more secure to check the limit on the server, because naughty users can always cheat your web validations. Multer also has size validation if you would like to return a specific exception from your backend.
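Putting that together, a sketch of the multer wiring - the bucket name is an assumption, the mimetype gate is testable stdlib code, and the multer/S3 lines (both installed separately) are shown in comments; the "file-1" field name matches the FormData key used in the question:

```javascript
// A cheap mimetype gate for the uploaded file.
function isPdf(file) {
  return !!file && file.mimetype === "application/pdf";
}

// Assumed wiring with multer and the AWS SDK:
//   const multer = require("multer");
//   const upload = multer({ storage: multer.memoryStorage(),
//                           limits: { fileSize: 10 * 1024 * 1024 } });
//   app.post("/file-upload", upload.single("file-1"), async (req, res) => {
//     if (!isPdf(req.file)) return res.status(415).send("PDF only");
//     await s3.upload({ Bucket: "my-bucket", Key: "sample.pdf",
//                       Body: req.file.buffer }).promise();
//     res.send("uploaded");
//   });
```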