I am trying to attach an image file, but somehow the client side doesn't send anything when a file is attached.
It's a React website rendered inside a React Native WebView.
The bad part is that I can't inspect the client log, because of a mobile browser issue.
1) First attempt:
const formData = new FormData();
formData.append('report[description]', text);
formData.append('token', localStorage.getItem('tempToken'));
formData.append('report[image]', new Blob([this.state.file], { type: 'image/png' }));
axios.post('/accept_report', formData)
2) Second attempt, with an explicit multipart header:
const config = {
  headers: {
    'content-type': 'multipart/form-data'
  }
}
formData.append('report[image]', this.state.file);
axios.post("/accept_report", formData, config)
Solved:
Since it only happened when attaching a file, the cause turned out to be that my production nginx was restricting large request bodies. I was only able to upload around 500 KB, nothing more.
I changed the nginx client body size:
http {
    # ...
    client_max_body_size 10M;  # or 100M, depending on your preference
    # ...
}
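For completeness, a minimal sketch (my own addition, not part of the original setup) of surfacing the oversized-upload case on the client: when the body exceeds client_max_body_size, nginx rejects the request with HTTP 413 before it ever reaches the app, and since console logs were unavailable in the mobile WebView, an alert makes the failure visible.
axios.post('/accept_report', formData)
  .then(() => alert('Report sent'))
  .catch(error => {
    // nginx answers 413 Request Entity Too Large when the body exceeds client_max_body_size
    if (error.response && error.response.status === 413) {
      alert('The attached image is too large for the server to accept.');
    } else {
      alert('Upload failed: ' + (error.message || 'unknown error'));
    }
  });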
Related
I want my users to upload profile pictures. Right now I can choose a file and pass it into a POST request body. I'm working in Node.js + JavaScript.
I am using DigitalOcean's Spaces object storage service to store my images, which is S3-compatible.
Ideally, my storage would store the file as an actual image. Instead, it is stored as a strange file with Content-Type application/octet-stream. I don't know how I'm supposed to work with this -- normally, to display the image, I simply reference the URL that hosts it, but in this case the URL points to this strange file. It is named something like VMEDFS3Q65JV4B7YQKLS (no extension). The size of the file is 14 KB, which seems right, and it appears to hold the file data. It looks like this:
etc...
I know I'm grabbing the right image, and I know the database is hooked up properly as it's posting to exactly the right place; I'm just unhappy with the file type.
Request on front end:
fetch('/api/image/profileUpload', {
  method: 'PUT',
  body: { file: event.target.files[0] },
  'Content-Type': 'image/jpg',
})
Code in backend:
const AWS = require('aws-sdk')

let file = req.body;
file = JSON.stringify(file);

AWS.config.update({
  region: 'nyc3',
  accessKeyId: process.env.SPACES_KEY,
  secretAccessKey: process.env.SPACES_SECRET,
});

const s3 = new AWS.S3({
  endpoint: new AWS.Endpoint('nyc3.digitaloceanspaces.com')
});

const uploadParams = {
  Bucket: process.env.SPACES_BUCKET,
  Key: process.env.SPACES_KEY,
  Body: file,
  ACL: "public-read",
  ResponseContentType: 'image/jpg'
};

s3.upload(uploadParams, function(err, data) {
  if (err) console.log(err, err.stack);
  else console.log(data);
});
What I've tried:
Adding a content-type header in the request and content-type parameters in the backend
Other methods of fetching the data in the backend -- they all result in the same thing
Not stringifying the file after grabbing it from the req.body
Changing POST request to PUT request
Would appreciate any insight into either a) converting this octet-stream file into an image or b) getting this image to upload as an image. Thank you
I solved this problem somewhat -- I changed my parameters to this:
const uploadParams = {
  Bucket: 'oscarexpert',
  Key: 'asdff',
  Body: image,
  ContentType: "image/jpeg",
  ACL: "public-read",
};
The change that matters is the added ContentType.
It still doesn't store the actual image, but that's sort of a different issue.
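One likely reason it still doesn't store an actual image: the fetch body above is a plain object rather than the file itself or a FormData, so the request never carries real multipart file bytes. A rough sketch of an alternative (not the code above; it assumes the multer package on the Express side and reuses the s3 client already configured) that sends the file as multipart/form-data and hands the resulting buffer to Spaces with its mimetype:
// Frontend: send the file itself, not an object wrapping it
const formData = new FormData();
formData.append('file', event.target.files[0]);
fetch('/api/image/profileUpload', { method: 'PUT', body: formData });

// Backend: Express + multer; memory storage keeps the upload as req.file.buffer
const multer = require('multer');
const upload = multer();

app.put('/api/image/profileUpload', upload.single('file'), (req, res) => {
  const uploadParams = {
    Bucket: process.env.SPACES_BUCKET,
    Key: `profile-pictures/${Date.now()}.jpg`, // placeholder key naming scheme
    Body: req.file.buffer,                     // raw image bytes
    ContentType: req.file.mimetype,            // e.g. image/jpeg
    ACL: 'public-read',
  };
  s3.upload(uploadParams, (err, data) => {
    if (err) return res.status(500).send(err.message);
    res.json({ url: data.Location });          // public URL of the stored image
  });
});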
In my application I read a huge amount of image data and send all of it to the client:
const imagesPaths = await getFolderImagesRecursive(req.body.rootPath);
const dataToReturn = await Promise.all(imagesPaths.map((imagePath) => new Promise(async (resolve, reject) => {
  try {
    const imageB64 = await fs.readFile(imagePath, 'base64');
    return resolve({
      filename: imagePath,
      imageData: imageB64,
    });
  } catch {
    return reject();
  }
})));

return res.status(200).send({
  success: true,
  message: 'Successfully retrieved folder images data',
  data: dataToReturn,
});
Here is the client side:
const getFolderImages = (rootPath) => {
  return fetch('api/getFolderImages', {
    method: 'POST',
    headers: { 'Content-type': 'application/json' },
    body: JSON.stringify({ rootPath }),
  });
};

const getFolderImagesServerResponse = await getFolderImages(rootPath);
const getFolderImagesServerData = await getFolderImagesServerResponse.json();
When I try to send the data, it fails because of its size. Sending the data with a plain res.send(<data>) is impossible. So how can I get around this limitation, and how should I receive the data on the client side with the new approach?
The answer to your problem requires some reading:
Link to the solution
One thing you probably haven't taken full advantage of before is that a webserver's HTTP response is a stream by default.
Frameworks just make it easier for you to pass in synchronous data, which is split into chunks under the hood and sent as HTTP packets.
We are talking about huge files here; naturally, we don't want them to be held in memory, at least not as one whole blob. The excellent solution for this dilemma is a stream.
We create a read stream with the help of the built-in Node package 'fs', then pass it to the stream-compatible response.send parameter.
const fs = require('fs');

const readStream = fs.createReadStream('example.png');

return response.headers({
  'Content-Type': 'image/png',
  'Content-Disposition': 'attachment; filename="example.png"',
}).send(readStream);
I used Fastify webserver here, but it should work similarly with Koa or Express.
There are two more configuration details here: the 'Content-Type' and 'Content-Disposition' headers.
The first one indicates the type of blob we are sending chunk by chunk, so the browser can give the file the right extension.
The latter tells the browser that we are sending an attachment, not something renderable, like an HTML page or a script. This will trigger the browser’s download functionality, which is widely supported. The filename parameter is the download name of the content.
Here we are; we accomplished minimal memory stress, minimal coding, and minimal error opportunities.
One thing we haven’t mentioned yet is authentication.
Because the frontend won't be sending an Ajax request, we can't expect an auth JWT header to be present on the request.
Here we will take the good old cookie auth approach. Cookies are attached automatically to every request that matches the criteria defined by the cookie options. More info about this in the frontend implementation part.
By default, cookies arrive as semicolon separated key-value pairs, in a single string. In order to ease out the parsing part, we will use Fastify’s Cookieparser plugin.
await fastifyServer.register(cookieParser);
Later in the handler method, we simply get the cookie that we are interested in and compare it to the expected value. Here I used only strings as auth-tokens; this should be replaced with some sort of hashing and comparing algorithm.
const cookies = request.cookies;

if (cookies['auth'] !== 'authenticated') {
  throw new APIError(400, 'Unauthorized');
}
That’s it. We have authentication on top of the file streaming endpoint, and everything is ready to be connected by the frontend.
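On the frontend, the connection can then be as simple as a plain navigation. A minimal sketch (the endpoint path and the downloadImage helper are my own assumptions, not part of the code above): because authentication travels in the cookie and the response carries Content-Disposition: attachment, a regular anchor click is enough and nothing is buffered in JavaScript.
const downloadImage = (filename) => {
  const a = document.createElement('a');                   // plain navigation, no fetch/Ajax needed
  a.href = `/api/images/${encodeURIComponent(filename)}`;  // assumed endpoint path
  a.download = filename;                                   // browser treats it as a download
  document.body.appendChild(a);
  a.click();
  a.remove();
};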
I have been trying to get this working for days now.
Using a simple Node.js Express server, I want to upload an image to a Django instance through a POST request, but I just can't figure out how to prepare the request and embed the file.
Later I would like to post an image created from a canvas on the client side,
but for testing I am trying to just upload an existing image from the Node.js server.
app.post('/images', function(req, res) {
  const filename = "Download.png"; // existing local file on server

  // using FormData to create a multipart/form-data content type
  let formData = new FormData();
  let buffer = fs.readFileSync(filename);
  formData.append("data", buffer); // appending the file as a buffer; alternatively could read it as a utf-8 string and append as text
  formData.append('name', 'var name here'); // someone told me I need to specify a name

  const config = {
    headers: { 'content-type': 'multipart/form-data' }
  }

  axios.post("http://django:8000/images/", formData, config)
    .then(response => {
      console.log("success!"); // never happens :(
    })
    .catch(error => {
      console.log(error.response.data); // no file was submitted
    });
});
What am I doing wrong or did I just miss something?
EDIT
I just found a nice snippet with a slightly different approach at the very bottom of the npm form-data page (npmjs.com/package/form-data):
const filename = "Download.png"; // existing local file on server

let formData = new FormData();
let stream = fs.createReadStream(filename);
formData.append('data', stream)

let formHeaders = formData.getHeaders()

axios.post('http://django:8000/images/', formData, {
  headers: {
    ...formHeaders,
  },
})
  .then(response => {
    console.log("success!"); // never happens :(
  })
  .catch(error => {
    console.log(error.response.data); // no file was submitted
  });
Sadly, this doesn't change anything :( I still only receive Bad Request: No file was submitted.
I don't really have much Django code, just a basic setup using rest_framework with an image model:
class Image(models.Model):
    data = models.ImageField(upload_to='images/')

    def __str__(self):
        return "Image Resource"
which is also registered in admin.py,
and a serializer:
from rest_framework import serializers
from .models import Image

class ImageSerializer(serializers.HyperlinkedModelSerializer):
    class Meta:
        model = Image
        fields = ('id', 'data')
using automatic URL routing.
I wrote a simple test script and put the same image on the Django server to verify that image uploads work, and they do:
import requests
url = "http://127.0.0.1:8000/images/"
file = {'data': open('Download.png', 'rb')}
response = requests.post(url, files=file)
print(response.status_code) # 201
I had a similar problem: I used the same Django endpoint to upload a file using axios 1) from the client side and 2) from the server side. From the client side it worked without any problem, but from the server side, the request body was always empty.
My solution was to use the following code:
const { readFile } = require('fs/promises')
const FormData = require('form-data') // the npm form-data package

const fileBuffer = await readFile(file.filepath)

const formData = new FormData()
formData.append('file', fileBuffer, file.originalFilename)

const response = await fetch(
  urlJoin(BACKEND_URL),
  {
    method: 'POST',
    body: formData,
    headers: {
      ...formData.getHeaders(),
    },
  }
)
A few relevant references that I found useful:
This blog post; even though it seems the author manages to send form data from the server side using axios, I did not manage to reproduce it in my case.
This issue report in the axios repository, where one comment suggests using fetch.
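A detail that may be relevant here: when you append a raw Buffer, the form-data package cannot infer a filename, and a multipart part without a filename lands in Django's request.POST rather than request.FILES, so the ImageField reports that no file was submitted. Passing the filename as the third argument to append avoids that. A rough axios sketch of the same idea (my own illustration, untested against the original setup; it assumes the fs, form-data and axios packages and the question's Download.png):
const fs = require('fs');
const FormData = require('form-data');
const axios = require('axios');

const formData = new FormData();
// third argument = filename, so Django treats the part as an uploaded file
formData.append('data', fs.readFileSync('Download.png'), 'Download.png');

axios.post('http://django:8000/images/', formData, {
  headers: formData.getHeaders(), // multipart/form-data with the generated boundary
})
  .then(response => console.log(response.status)) // expecting 201
  .catch(error => console.log(error.response.data));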
In your Node.js Express server, instead of adding the image to the form data, try sending the stream directly in the POST request.
const filename = "Download.png"; // existing local file on server
// let formData = new FormData();          // no FormData this time
let stream = fs.createReadStream(filename);
// formData.append('data', stream)
// let formHeaders = formData.getHeaders() // not needed without FormData

axios.post('http://django:8000/images/', stream)
  .then(response => {
    console.log("success!"); // never happens :(
  })
  .catch(error => {
    console.log(error.response.data); // no file was submitted
  });
I still didn't manage to get this working with axios, so I tried another package for sending files in POST requests, namely unirest, which worked out of the box for me.
It is also smaller, requires less code, and does everything I needed:
const unirest = require('unirest');

const filename = "Download.png"; // existing local file on server

unirest
  .post(url)
  .attach('data', filename) // reads directly from the local file
  //.attach('data', fs.createReadStream(filename)) // creates a read stream
  //.attach('data', fs.readFileSync(filename)) // 400 - The submitted data was not a file. Check the encoding type on the form. -> maybe check encoding?
  .then(function (response) {
    console.log(response.body) // 201
  })
  .catch((error) => console.log(error.response.data));
If I have some spare time in the future, I may look into what was wrong with my axios implementation; if someone knows a solution, please let me know :D
I am trying to download large files (3 GB) using a POST request to my backend. The POST request is used to hide the file system from web scraping. It seems like the download is initially loaded into memory: the file doesn't start downloading right away, my RAM usage spikes, and then 30 seconds later it begins downloading with slightly lower RAM usage. Here is the code I am using.
fetch("http://localhost:5000/api/send", {
  method: "POST",
  headers: {
    "Content-Type": "application/json"
  },
  body: JSON.stringify({ "root": root, "path": path, "name": name })
})
  .then(response => response.blob())
  .then(blob => {
    var url = window.URL.createObjectURL(blob)
    var a = document.createElement("a")
    a.href = url
    a.download = name
    document.body.appendChild(a)
    a.click()
    a.remove()
  })
Is there any way to implement this without a RAM/performance hit? Downloading the file with a GET request and something like window.location.href = url works with no problem, but I would prefer not to use that, and I have difficulties specifying file names with symbols like "&" in the URL parameter.
Any ideas are appreciated!
Your server responds with a blob, so the whole file is downloaded into RAM and then referred to as an object URL on the client side.
What the client side really needs is just a URL, and you don't have to create it from a blob. Instead, you can create a hashed URL on the server side, something like "bit.ly/whatever", respond with that URL, and then have the client code do the same trick.
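A rough sketch of what that could look like on the client (the JSON response shape and the /download path are assumptions, not part of the answer): the POST only fetches a short-lived URL, and the actual 3 GB transfer is handled by the browser's own download machinery, so nothing is buffered in JavaScript.
fetch("http://localhost:5000/api/send", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ root, path, name })
})
  .then(response => response.json())       // e.g. { url: "/download/abc123" } (assumed shape)
  .then(({ url }) => {
    var a = document.createElement("a")    // navigation download, no blob held in memory
    a.href = url
    a.download = name
    document.body.appendChild(a)
    a.click()
    a.remove()
  })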
I'm generating a PDF from an HTML webpage and then exporting it locally. How can I save this file to my Node server and upload it to S3?
Please find the attached pseudo code.
const convertDataToPdf = (exportFlag, cb) => { // set to switch between export and save
  const doc = new jsPDF();
  // ... adding metadata and styling the pdf
  if (exportFlag) {
    doc.save('sample.pdf') // export PDF locally
  } else {
    cb(doc.output()) // this converts the PDF to raw data to send to the server
  }
}
Based on this answer, I'm appending the raw PDF data to a new FormData object, and then making an ajax call to post the raw data to my Node.js server.
convertDataToPdf(false, pdfData => {
  let formData = new FormData();
  formData.append(`file-1`, pdfData)

  $.ajax({
    url: '/file-upload',
    data: formData,
    processData: false,
    contentType: false,
    type: 'POST',
  }).then(data => {
    console.log('PDF upload to s3 successful!', data)
  }).catch(err => {
    console.log('Error! PDF Upload to S3 failed', err)
  })
});
Now, how can I parse the raw PDF data on the server and upload it?
As an alternative, is it possible to save my file locally and then upload the file to s3?
First question: on the Node server you can use multer (https://www.npmjs.com/package/multer). This way you don't have to decode the PDF yourself; you just handle the request and pass the file on to S3 (via the S3 Node API). You can check the mimetype to make sure someone is actually sending you a PDF.
If you have an application server such as Nginx in front, you can also limit the transferred file size.
For example, in Nginx: client_max_body_size 10M;. It's more secure to enforce the limit on the server side, because naughty users can always bypass your web validations. Multer also has size validation if you would like to return a specific exception from your backend.
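A rough sketch of how that could fit together (my own illustration, not code from the question: the bucket name and key scheme are placeholders, and it assumes the client appends the PDF as a Blob, e.g. doc.output('blob'), so multer actually receives a file part):
const express = require('express');
const multer = require('multer');
const AWS = require('aws-sdk');

const app = express();
const upload = multer({ limits: { fileSize: 10 * 1024 * 1024 } }); // 10 MB cap, memory storage
const s3 = new AWS.S3();

app.post('/file-upload', upload.single('file-1'), (req, res) => {
  // reject anything that is not a PDF
  if (!req.file || req.file.mimetype !== 'application/pdf') {
    return res.status(400).send('Expected a PDF file');
  }

  s3.upload({
    Bucket: 'my-bucket',               // placeholder bucket name
    Key: `pdfs/${Date.now()}.pdf`,     // placeholder key scheme
    Body: req.file.buffer,             // raw PDF bytes from multer's memory storage
    ContentType: 'application/pdf',
  }, (err, data) => {
    if (err) return res.status(500).send(err.message);
    res.json({ url: data.Location });  // URL of the uploaded object
  });
});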