I am trying to send a base64 string of a file in an axios request body. The file size is around 370 KB.
I got a 413 Payload Too Large error. After doing some research on the internet, I learned that the server limits the request size.
Up to this point my understanding is clear.
Then I switched to FormData and passed that form data as the request body, and I am no longer getting the 413 error. The server processed my request without complaint.
So what happened between FormData and the server?
Server is running on Nginx, Node, Express.
By default axios sends data as JSON, and on the Node.js side Express's JSON body parser has a default size limit of 100 KB. So you can either continue to use FormData or raise the limit in the JSON parser options:
const { json } = require('express');

app.use(json({
  limit: '20mb' // the default is '100kb'
}));
But if you intend to send large content often, consider sticking with FormData, or even sending the content as raw binary.
Update: if you send FormData and process it on the server with multer, the following default limits apply:
Field size: 1048576 bytes (1 MB)
File size: unlimited
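Those defaults can be overridden through multer's limits option. A minimal sketch (the route, the field name 'file', and the 10 MB cap are arbitrary choices for illustration, not defaults):

const express = require('express');
const multer = require('multer');

const app = express();

// Keep uploads in memory and cap sizes explicitly.
const upload = multer({
  storage: multer.memoryStorage(),
  limits: {
    fieldSize: 1048576,         // max size per non-file field, 1 MB (the default)
    fileSize: 10 * 1024 * 1024  // max size per file; unlimited by default
  }
});

app.post('/upload', upload.single('file'), (req, res) => {
  res.json({ receivedBytes: req.file.size });
});

app.listen(3000);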
I want:
To send a file (I have my eye on .docx and text files, but let's use a .pdf as an example) as binary in the body of a POST request from a browser (JavaScript).
Main problem
I can do this just fine in Postman. You can select "binary" as the body type, and voilà! Your file is configured to be the body. But I don't know how to mimic that behavior in JavaScript.
My question is: On the client side, in JavaScript, how can I get a file into my POST request as binary?
Specifically, how can I get the file into the same format that Postman uses when you select Body -> Binary in a POST request?
For context:
I have been using this guide to get everything configured how I want in AWS. It ends with making requests in Postman. But adding a file as binary in Postman is one thing - doing it from a browser in JavaScript is another, and that is the main question I have.
I am sending this through API Gateway to a Lambda function. I have API Gateway configured to handle application/pdf as binary, and the Lambda function to decode it once it arrives.
So I think I want to hand it in as a binary blob, not base64. But not sure exactly how.
JavaScript
postBinary() {
  var settings = {
    "url": "https://<my-aws-api>.amazonaws.com/v1/upload",
    "method": "POST",
    "timeout": 0,
    "headers": {
      "Content-Type": "application/pdf"
    },
    "data": <my pdf file here as binary>
  };
  $.ajax(settings).done(function (response) {
    console.log(response);
  });
},
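A rough sketch of the direction I imagine (the input id and URL are placeholders), in case it helps frame the question:

// Assumes an <input type="file" id="fileInput"> somewhere on the page.
const file = document.getElementById('fileInput').files[0];

// A File/Blob passed as the body is sent as the raw binary payload.
fetch('https://<my-aws-api>.amazonaws.com/v1/upload', {
  method: 'POST',
  headers: { 'Content-Type': 'application/pdf' },
  body: file
})
  .then((response) => response.text())
  .then((data) => console.log(data));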
API Gateway:
Integration has 'When there are no templates defined (recommended)' set to 'Content-Type':'application/pdf'. The API's Binary Media Types have 'application/pdf' set. I know I have CORS set correctly - I can pass strings through the POST request and get success messages back, but I would like to handle files here, not just a simple string. I also want to avoid requiring the client side to parse out data on their end.
My Lambda function will take the file and then parse information out of it, then send it back.
Lambda function:
import json
import base64
import boto3

BUCKET_NAME = 'my-bucket'

def lambda_handler(event, context):
    file_content = base64.b64decode(event['content'])
    parsed_data = some_function(file_content)  # parse information from file
    return {
        'statusCode': 200,
        'body': {
            'parsed_data': parsed_data
        }
    }
In the end, we want a user experience of: choose a file, send to API, get back parsed data. Simple.
Note: I know there are lots of good reasons to put files in S3 instead of going through Lambda first. But our files are small, and we are not concerned about using considerable compute time/power in Lambda. Further, we want to avoid sending to S3 right away because we would like the user to only have to make one call to the API: send the file in a POST request, get results. If we send to S3 first, the user has to make multiple requests: request a pre-signed URL, send the file to S3, then request the results of parsing.
My main point is that this is possible in Postman, so it must be possible from a browser with JavaScript as well.
Thanks, everyone!
The npm-request library allows me to construct HTTP requests using a nice JSON-style syntax, like this:

const request = require('request');

request.post(
  {
    url: 'https://my.own.service/api/route',
    formData: {
      firstName: 'John',
      lastName: 'Smith'
    }
  },
  (err, response, body) => {
    console.log(body);
  }
);
But for troubleshooting, I really need to see the HTTP message body of the request as it would appear on the wire. Ideally I'm looking for a raw bytes representation with a Node.js Buffer object. It seems easy to get this for the response, but not the request. I'm particularly interested in multipart/form-data.
I've looked through the documentation and GitHub issues and can't figure it out.
The simplest way to do this is to start a netcat server on any port:
$ nc -l -p 8080
and point the URL in your code at localhost:
http://localhost:8080/v1beta1/text:synthesize?key=API_KEY
(Plain http, not https: netcat can't terminate TLS, so an https request would only show up as an unreadable handshake.) Now, any request made will print the entire raw HTTP message sent to the localhost server.
Obviously, you won't be able to see the response, but the entire raw request data will be available for you to inspect in the terminal where netcat is running.
I figured out how to dump the HTTP message body with Request. In both cases, I'm just copying the same approach that request uses internally.
Multipart Form Uploads
req._form.pipe(process.stdout);
URL-encoded Forms
console.log(req.body);
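Putting the multipart case together (note that _form is a private, undocumented property of the request object, so this may break between versions):

const request = require('request');

const req = request.post(
  {
    url: 'https://my.own.service/api/route',
    formData: { firstName: 'John', lastName: 'Smith' }
  },
  (err, response, body) => console.log(body)
);

// Tee the internal FormData stream to stdout to see the raw multipart body.
req._form.pipe(process.stdout);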
You could try @jfriend00's suggestion and use a network sniffer like Wireshark, but since you're fetching an https URL this might not be the easiest route, as it requires some setup to intercept TLS connections.
So maybe it would be enough to turn on debug mode for the request module itself; you can do that simply by setting require('request').debug = true. As a third option, you could go with the dedicated request-debug module, which lets you view request and response headers and bodies.
I can think of a number of ways to see the bytes of the request:

1. Turn on debugging in the request module. There are multiple ways to do that, documented here, including setting NODE_DEBUG=request, setting require('request').debug = true, or using the request-debug module.
2. Use a network sniffer to see what's actually being sent over the socket, independent of your node.js code.
3. Create your own dummy HTTP server that does nothing but log the exact incoming request, and send your same request to that dummy server so it can log it for you.
4. Create or use a proxy (like nginx) that can dump the exact incoming request before forwarding it to its final destination, and send the request to the proxy.
5. Step through the sending of the request in the debugger to see exactly what it is writing to the socket (this may be time-consuming, particularly with async callbacks, but will eventually work).
You could use a Node.js server capable of logging the raw request/response string, then direct your request to that server.
I gave an example using both an http and an https server, with no dependencies:
nodejs getting raw https request and response
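As a rough sketch of the dummy-server idea (the port and the canned 200 response are arbitrary):

const net = require('net');

// Print every raw byte of whatever HTTP request arrives, then answer
// with a minimal 200 after a short pause so the client doesn't hang.
const server = net.createServer((socket) => {
  socket.on('data', (chunk) => process.stdout.write(chunk));
  socket.setTimeout(500, () => {
    socket.end('HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n');
  });
});

server.listen(8080, () => console.log('listening on http://localhost:8080'));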
I'm building a simple proxy, with PHP on the server side and JavaScript on the client side (browser).
The PHP server receives POST requests with JSON data, something like this:
{headersArray: ["GET /someurl...","Cookie: some cookie"],url...}
Then, after extracting the headers and the URL, the PHP code uses curl to fetch the resource. At this point I would like to construct a response object on the client via the Response constructor, so I need to transmit the headers back to the client as well. I thought about constructing another JSON object containing the headers and the response body, but a lot of servers use gzip encoding. Can I insert the gzip-encoded response body as a JSON property and safely transmit it back to the client? Would I need to decode it in the client browser? Does it add a lot of overhead? Any better ideas?
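For concreteness, this is a sketch of the client-side reconstruction I have in mind, assuming the proxy returns JSON shaped like { status, headers, body } with the body as a base64 string (raw gzip bytes are not valid inside a JSON string, so some text-safe encoding is assumed):

// Hypothetical client for the PHP proxy described above.
async function fetchViaProxy(targetUrl, headersArray) {
  const res = await fetch('/proxy.php', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ url: targetUrl, headersArray })
  });
  const payload = await res.json();

  // Decode the base64 body back into raw bytes.
  const bytes = Uint8Array.from(atob(payload.body), (c) => c.charCodeAt(0));

  // Rebuild a Response object the rest of the app can consume.
  return new Response(bytes, { status: payload.status, headers: payload.headers });
}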
I'm appending a Blob to a FormData object like this:
const formData = new FormData();
formData.append('xyz', blob);
While uploading this blob, the following is the ugly payload I've seen (the actual data is too long to show here in full):
Content-Disposition: form-data; name="xyz"; filename="xyz.png"
Content-Type: image/png
PNG
IHDRà5ÑÜä IDATx^\½i\i¥g¾/A23««» ÂHh©×ZÔ=ÐHúÿ¿#ôQ½Õ$cñݯð<ÇÞ %öp²2Ép¿÷]l9vìØì¿ÿoÿvº\¯õ°¨ût¯óý\÷ºÖz»ªé~©UÝk5j9¿×v³ªÏ_¾ÔùrÙ|UëÕ¶f³y]n׺MתùTóÅTÓt«ÛíR󥶫}}Ø.....
------WebKitFormBoundary1wDstGejHPb3PhBI
Because of this, the server is blocking the upload of this file with an HTTP 403 Forbidden error, as there are rules applied on the server side using AWS WAF. These rules concern SQL injection, XSS, etc.
I think those rules are blocking the request because of this ugly payload, because if I append a basic File object (not a Blob), the 403 error does not occur! It works properly with a File object, and the error comes only for the Blob upload.
Let me know your thoughts and how to prevent it!
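For reference, the two cases can be A/B tested by wrapping the same bytes in a File before appending (the name and type below are placeholders); whether that changes the WAF's verdict is exactly what I'm unsure about:

// Same bytes, appended as a File instead of a raw Blob.
const file = new File([blob], 'xyz.png', { type: 'image/png' });

const formData = new FormData();
formData.append('xyz', file);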
I am building a frontend application in which I'm going to retrieve files via the API provided by the backend.
The API consumes JSON requests like most RESTful APIs, but responds with files in multipart/form-data. Therefore, when I tried to get the body of the response with axios, the data appeared like this:
--77d4f4ac-bcb2-4457-ad81-810cf8c3ce47
Content-Disposition: attachment; filename=20170822.txt
Content-Type: text/plain
...data...
--77d4f4ac-bcb2-4457-ad81-810cf8c3ce47--
It confused me quite a lot, since I'm used to dealing with raw data via Blob objects. However, it seems that I have to parse the response myself here in order to get the raw data. I searched around, but almost all of the articles and questions I found discuss server-side handling. So my question can be separated into two pieces:
Is it okay/possible to handle multipart/form-data on the client side?
If it is, how can I handle it? (Of course, it will be really appreciated if there's a library for it.)
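For illustration, this is the kind of hand-rolled parsing I imagine, assuming a single text part (the boundary comes from the response's Content-Type header; binary parts would need ArrayBuffer handling instead):

// Rough sketch: pull the body of the first part out of a multipart
// response string (e.g. axios's response.data). Assumes one text part;
// a real parser must handle multiple parts and binary data.
function extractFirstPart(raw, contentTypeHeader) {
  const boundary = contentTypeHeader.split('boundary=')[1];
  const firstPart = raw.split('--' + boundary)[1];
  const headerEnd = firstPart.indexOf('\r\n\r\n');
  return firstPart.slice(headerEnd + 4).trim(); // part body only
}

// Usage with axios (URL is a placeholder):
// const res = await axios.get('/files/20170822');
// const body = extractFirstPart(res.data, res.headers['content-type']);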