For some reason, I don't want to share the URL publicly (it's a secret URL).
My workflow is as follows:
The client sends an API request to my server:
API: mywebsite.com/api/image_abc.jpg
I have a Node.js Express server that fetches the file from the secret URL, e.g.:
secret_url.com/image_abc.jpg
Then, from the image content in the Node.js response, I send the image content back to the client and display it as image_abc.jpg.
I looked around on Stack Overflow, but only found answers about reading a file from disk and sending it to the client. What I want is to relay the image content straight to the client, without saving the file to disk.
Thank you.
Assuming you want to return the contents of a file at a certain URL to the client as a buffer, here's the solution I suggest.
Get the file using axios and return the buffer to your client:
const axios = require('axios');

// inside an async Express route handler
const url = 'some-valid-url'; // e.g. the secret URL
const response = await axios.get(url, { responseType: 'arraybuffer' });
// with responseType 'arraybuffer', response.data is already a Buffer in Node,
// so no text encoding argument is needed
const buffer = Buffer.from(response.data);
res.status(200).send(buffer);
In case you want to save it on your server, you can use fs as follows to write the file to the folder of your choice:
const fs = require('fs');

fs.writeFile(fileName, buffer, (err) => {
  if (!err) console.log('Data written');
});
Could someone help me? I'm trying to send a photo and a caption to a group, but it isn't working.
I'd like to send the photo, which I receive as base64, to a Facebook group API.
What am I doing? I take the base64, convert it into a buffer and write it to local disk, then read it back into a FormData object.
I load the data into a form this way:
const form = new FormData();
const fileContent = Buffer.from(url as any, 'base64');
fs.writeFile('../tmp', fileContent, (err) => {
if (err) return console.log(err)
})
form.append('groupId', groupId)
form.append('caption', caption)
form.append('image ', fs.createReadStream('../tmp'))
Below are the axios configuration and the request:
await client.post(`${config.facebook.BASE_URL}/${groupId}/photos`, form, {
headers: {
...form.getHeaders(),
Authorization: `OAuth ${config.facebook.token}`,
'content-type': 'multipart/form-data',
file_type: "image/jpeg",
}
})
Note: this way I got Error: Request failed with status code 500.
I already resolved this, although I changed the way the file comes to me: instead of receiving the image in base64, I'm now receiving a signed URL from Google Storage. Also, in form.append it's necessary to pass the name and the extension of the file, like this: form.append('source', image, 'file.jpg');
of course, together with the other params and the axios configuration.
I've read this article about a Google Drive implementation in Node.js. I want to give the users of the app the ability to upload the processed files from the app to their Google Drive account. The article shows how to implement a Node.js solution, but since the server will run on localhost, how can I authorize the user on the client side using Vue.js?
I've found this question, but it's very old and I'm not sure it can really help me.
At the moment my Node.js script saves the processed files on the user's machine using fs.writeFile.
// express endpoint
this.app.post('/processFiles', async (req, res) => {
  for (const file in req.files) {
    //console.log(req.files[file]);
    await this.composeData(req.files[file]);
  }
  res.setHeader('Content-Type', 'application/json');
  res.send({ processStatus: '', outputPath: this.tmpDir });
});

// processing file method (fs here is fs.promises, so writeFile can be awaited)
async composeData(file) {
  // some file compression stuff
  this.output = path.format({ dir: this.tmpDir, base: file.name });
  await fs.writeFile(this.output, this.processedData);
}
Since I want to implement a client side solution, I'm thinking to add an endpoint to my express server that will send processed files back so the client code can do the gdrive upload.
//vue app code
processFiles(){
const formData = new FormData();
this.selectedFiles.forEach( (file, i) => {
formData.append(`file${i}`, file);
});
axios({
method: 'POST',
url: 'http://localhost:9000/processFiles',
data: formData
}).then( (res) => {
//here i want to implement gdrive upload
console.log(res);
});
}
Can anyone give me some guidance on this?
I have been trying to get this working for days now.
Using a simple Node.js Express server, I want to upload an image to a Django instance through a POST request, but I just can't figure out how to prepare the request and embed the file.
Later I would like to post an image created from a canvas on the client side, but for testing I was trying to just upload an existing image from the Node.js server.
const fs = require('fs');
const axios = require('axios');
const FormData = require('form-data');

app.post('/images', function(req, res) {
  const filename = "Download.png"; // existing local file on server
  // using FormData to create a multipart/form-data content-type
  let formData = new FormData();
  let buffer = fs.readFileSync(filename);
  formData.append("data", buffer); // appending the file as a buffer; alternatively, read it as a utf-8 string and append as text
  formData.append('name', 'var name here'); // someone told me I need to specify a name
  const config = {
    headers: { 'content-type': 'multipart/form-data' }
  }
  axios.post("http://django:8000/images/", formData, config)
    .then(response => {
      console.log("success!"); // never happens :(
    })
    .catch(error => {
      console.log(error.response.data); // no file was submitted
    });
});
What am I doing wrong or did I just miss something?
EDIT
I just found a nice snippet with a slightly different approach at the very bottom of the npm form-data page (npmjs.com/package/form-data):
const filename = "Download.png"; // existing local file on server
let formData = new FormData();
let stream = fs.createReadStream(filename);
formData.append('data', stream)
let formHeaders = formData.getHeaders()
axios.post('http://django:8000/images/', formData, {
headers: {
...formHeaders,
},
})
.then(response => {
console.log("success!"); // never happens :(
})
.catch(error => {
console.log(error.response.data); // no file was submitted
});
Sadly, this doesn't change anything :( I still receive only Bad Request: No file was submitted.
I don't really have much Django code, just a basic setup using rest_framework with an image model:
class Image(models.Model):
data = models.ImageField(upload_to='images/')
def __str__(self):
return "Image Resource"
which are also registered in the admin.py,
a serializer:
from rest_framework import serializers
from .models import Image

class ImageSerializer(serializers.HyperlinkedModelSerializer):
    class Meta:
        model = Image
        fields = ('id', 'data')
using automatic URL routing.
I wrote a simple test script and put the same image on the Django server to verify that image upload works, and it does:
import requests
url = "http://127.0.0.1:8000/images/"
file = {'data': open('Download.png', 'rb')}
response = requests.post(url, files=file)
print(response.status_code) # 201
I had a similar problem: I used the same Django endpoint to upload a file using axios 1) from the client side and 2) from the server side. From the client side it worked without any problem, but from the server side, the request body was always empty.
My solution was to use the following code:
// assumptions: readFile is fs.promises.readFile, FormData comes from the
// form-data package, and fetch from node-fetch (this runs server-side)
const fileBuffer = await readFile(file.filepath)
const formData = new FormData()
formData.append('file', fileBuffer, file.originalFilename)
const response = await fetch(
  urlJoin(BACKEND_URL),
  {
    method: 'POST',
    body: formData,
    headers: {
      ...formData.getHeaders(),
    },
  }
)
A few relevant references that I found useful:
This blog post; even though the author seems to manage to send form data from the server side using axios, I did not manage to reproduce it in my case.
This issue report in the axios repository, where one comment suggests using fetch.
In your Node.js Express server, instead of adding the image to form data, try sending the stream directly in the body of the POST request:
const filename = "Download.png"; // existing local file on server
let stream = fs.createReadStream(filename);
axios.post('http://django:8000/images/', stream, {
  headers: {
    // with no form data there are no multipart headers to spread;
    // declare the raw stream's content type instead
    'Content-Type': 'image/png',
  },
})
.then(response => {
  console.log("success!"); // never happens :(
})
.catch(error => {
  console.log(error.response.data); // no file was submitted
});
I still didn't manage to get this working with axios, so I tried another package for sending files in POST requests, namely unirest, which worked out of the box for me.
It is also smaller, requires less code, and does everything I needed:
const unirest = require('unirest');

const filename = "Download.png"; // existing local file on server
unirest
  .post(url)
  .attach('data', filename) // reads directly from the local file
  //.attach('data', fs.createReadStream(filename)) // creates a read stream
  //.attach('data', fs.readFileSync(filename)) // 400 - The submitted data was not a file. Check the encoding type on the form. -> maybe check encoding?
  .then(function (response) {
    console.log(response.body) // 201
  })
  .catch((error) => console.log(error.response.data));
If I have some spare time in the future I may look into what was wrong with my axios implementation; if someone knows a solution, please let me know :D
Using an Azure Function App, I want to be able to download images from different URLs to a particular folder, zip them, and send the zip file back in the response.
I'm able to achieve this by following the steps below:
1. Request the file
2. Save the file locally
3. Zip the directory using archiver
4. Read the zipped file and convert it to base64
5. Send the buffer in the response body
Download and save the image:
const img = await request(url, { encoding: "binary" });
fs.writeFile(fileName, img, "binary", err => {
  if (err) {
    reject(`Error while writing the file: ${err}`);
  } else {
    resolve(img);
  }
});
Zip the directory, read the zipped file, and send the response:
const target = await zipDirectory(dirName, targetFile);
context.log('Target ' + targetFile);
const rawFile = await readFile(targetFile); // already returns a Buffer
const fileBuffer = Buffer.from(rawFile);
context.res = {
body: fileBuffer,
headers: {
"Content-Disposition": `filename=target.zip`,
"Content-Type": "application/zip"
},
status: 202
};
Is there a better way to do this?
Create a function with an HTTP trigger, where the input is the URI of the image and the output binding is a blob container; its logic saves the image to blob storage.
Create another function that is blob-triggered: it grabs the file, zips it, and places the zipped file in its output blob binding.
Your zipped file would be in the output blob container.
Alternatively you can orchestrate that entire process with a durable function.
I'm generating an HTML webpage as a PDF and then exporting it locally. How can I send this file to my Node server and upload it to S3?
Please find the attached pseudo code:
const convertDataToPdf = (exportFlag,cb)=>{ //set to switch between export and save
const doc = new jsPDF();
//... adding metadata and styling the pdf
if(exportFlag) {
doc.save('sample.pdf') //export PDF locally
} else {
cb(doc.output()) //this converts the PDF to raw to send to server
}
}
Based on this answer, I'm appending the raw PDF data to a new FormData object and then making an AJAX call to POST the raw data to my Node.js server:
convertDataToPdf(false, pdfData => {
let formData = new FormData();
formData.append(`file-1`, pdfData)
$.ajax({
url: '/file-upload',
data: formData,
processData: false,
contentType: false,
type: 'POST',
}).then(data => {
console.log('PDF upload to s3 successful!', data)
}).catch(err => {
console.log('Error! PDF Upload to S3 failed', err)
})
});
Now, how can I parse the raw PDF data on the server and upload it?
As an alternative, is it possible to save my file locally and then upload the file to s3?
First question: on the Node server you can use multer (https://www.npmjs.com/package/multer). That way you don't have to decode the PDF yourself; you just handle the request and pass the file on to S3 (via the S3 Node API). You can check the mimetype to be sure someone is actually sending you a PDF.
If you have an application server such as Nginx in front, you can also limit the transfer size, for example with client_max_body_size 10M; in Nginx. It's more secure to enforce the limit on the server, because naughty users can always bypass your web validations. Multer also has size validation if you would like to return a specific exception from your backend.