Axios in React Native - Cannot POST a Blob or File

I'm trying to post the raw data of a picture, using Axios, after taking it with react-native-image-picker.
I successfully generated a blob using this piece of code:
const file = await fetch(response.uri);
const theBlob = await file.blob();
If I inspect the metadata of the blob it's all right: MIME type, size, and so on.
However, when I try to POST it using Axios, using:
axios({
  method: "POST",
  url: "https://my-api-endpoint-api.example.org",
  data: theBlob,
});
what I receive on the API side is this strange JSON payload:
{"_data":{"lastModified":0,"name":"rn_image_picker_lib_temp_77cb727c-5056-4cb9-8de1-dc5e13c673ec.jpg","size":1635688,"offset":0,"type":"image/jpeg","blobId":"83367ee6-fa11-4ae1-a1df-bf1fdf1d1f57","__collector":{}}}
The same code works fine on React for the web, and I get the same behavior when trying with a File object instead of a Blob.
I see in other answers that I could use something other than Axios, like RNFetchBlob.fetch(), but since I'm using shared functions between the React website and the React Native app, I'd really prefer an approach that lets me use Axios and Blobs.
Is there some way to work around it?

Updated answer
As pointed out by @T.J.Crowder in the comments, there is a cleaner approach that works around the issue with the React Native host environment without touching anything else in the code.
It's enough to add this in the index.js file, before everything else:
Blob.prototype[Symbol.toStringTag] = 'Blob'
File.prototype[Symbol.toStringTag] = 'File'
This works because Object.prototype.toString consults Symbol.toStringTag when producing its [object …] result, so after the patch the check described below sees [object Blob] again. I leave my original answer below, since it's a working alternative if one doesn't want to mess with the prototypes.
Original answer
The described behavior happens because the host environment of React Native does not handle the Blob type nicely: it will actually become just an object.
In fact, if you try to render toString.call(new Blob()) in a component, you'll see [object Blob] in a browser, but [object Object] in React Native.
The problem is that the default transformRequest implementation of Axios uses exactly this method (toString.call) to check whether you're passing it a Blob or some generic object. If it decides you're passing a generic object, it applies JSON.stringify to it, producing the strange JSON you're seeing POSTed.
This happens exactly here: https://github.com/axios/axios/blob/e9965bfafc82d8b42765705061b9ebe2d5532493/lib/defaults.js#L61
What happens there is that utils.isBlob(data) at line 48 returns false, since it just applies toString to the value and checks whether the result is [object Blob], which, as described above, is not the case in React Native.
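For illustration, the check is roughly equivalent to this (a simplified sketch, not the exact Axios source):
// Simplified sketch of the tag-based check Axios relies on.
// In React Native, Blob lacks Symbol.toStringTag, so this returns
// false and the payload falls through to JSON.stringify.
function isBlob(value) {
  return Object.prototype.toString.call(value) === "[object Blob]";
}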
The fastest workaround I see here, since you're sure you're passing a Blob, is to override transformRequest with a function that returns the data as it is, like this:
axios({
  method: "POST",
  url: "https://my-api-endpoint-api.example.org",
  data: theBlob,
  transformRequest: (d) => d,
});
This makes the request work: Axios passes the Blob through untouched, and the host environment serializes it as the request body.

I actually had this problem recently (using Expo SDK 43 on iPhone). I remember using axios over fetch because I had problems with uploading blobs with fetch in the past. But I tried it here and it just worked.
The context in this use case is downloading a GIF from a Giphy URL and then putting it on S3 with a signed request. It worked on both web and phone.
const blob = await fetch(giphyUrl).then((res) => res.blob());
fetch(s3SignedPutRequest, { method: "PUT", body: blob });

You can send the file to the server using FormData through the axios API, like below:
const file = await fetch(response.uri);
const theBlob = await file.blob();

const formData = new FormData();
theBlob.lastModifiedDate = new Date();
theBlob.name = "file_name";
formData.append("file", theBlob);

axios.post('https://my-api-endpoint-api.example.org', formData, {
  headers: {
    'Content-Type': 'multipart/form-data'
  }
});

Related

FastAPI returns "Error 422: Unprocessable entity" when I send multipart form data with JavaScript Fetch API

I have an issue using the JavaScript Fetch API when sending some simple FormData, like so:
function register() {
  var formData = new FormData();
  var textInputName = document.getElementById('textInputName');
  var sexButtonActive = document.querySelector('#buttonsMW > .btn.active');
  var imagesInput = document.getElementById('imagesInput');
  formData.append('name', textInputName.value);
  if (sexButtonActive != null) {
    formData.append('sex', sexButtonActive.html())
  } else {
    formData.append('sex', "");
  }
  formData.append('images', imagesInput.files[0]);
  fetch('/user/register', {
    method: 'POST',
    data: formData,
  })
    .then(response => response.json());
}
document.querySelector("form").addEventListener("submit", register);
And on the server side (FastAPI):
@app.post("/user/register", status_code=201)
def register_user(name: str = Form(...), sex: str = Form(...), images: List[UploadFile] = Form(...)):
    try:
        print(name)
        print(sex)
        print(images)
        return "OK"
    except Exception as err:
        print(err)
        print(traceback.format_exc())
        return "Error"
After clicking the submit button I get Error 422: Unprocessable Entity. Adding the header Content-Type: multipart/form-data doesn't help either, because then I get Error 400: Bad Request instead. I want to understand what I am doing wrong, and how to process the FormData without such errors.
The 422 response body will contain an error message about which field(s) are missing or don't match the expected format. Since you haven't provided that (please do so), my guess is that the error is triggered by how you defined the images parameter in your endpoint. Since images is expected to be a List of File(s), you should define it using the File type instead of Form. For example:
images: List[UploadFile] = File(...)
                           ^^^^
When using UploadFile, you don't have to use File() in the default value of the parameter. Hence, the below should also work:
images: List[UploadFile]
Additionally, in the frontend, make sure to use the body (not data) parameter in the fetch() function to pass the FormData object (see example in MDN Web Docs). For instance:
fetch('/user/register', {
  method: 'POST',
  body: formData,
})
  .then(res => {...
Please have a look at this answer, as well as this answer, which provide working examples on how to upload multiple files and form data to a FastAPI backend, using Fetch API in the frontend.
As for manually specifying the Content-Type when sending multipart/form-data, you don't have to (and shouldn't) do that, but rather let the browser set the Content-Type—please take a look at this answer for more details.
So, I found that I had an error in this part of the code:
formData.append('images', imagesInput.files[0]);
The right way to upload multiple files is:
for (const image of imagesInput.files) {
  formData.append('images', image);
}
Also, we should use File in the FastAPI endpoint arguments, i.e., images: List[UploadFile] = File(...) (instead of Form), and change data to body in the JS fetch call. After these changes there is no error, and when the endpoint is called we get the right types of data, for example:
Name: Bob
Sex: Man
Images: [<starlette.datastructures.UploadFile object at 0x7fe07abf04f0>]
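Combining the fixes above, here is a corrected sketch of the question's register function (same element IDs as the question; note that .html() is not a DOM method, so innerHTML is used here instead, and preventDefault keeps the form from navigating away):
function register(event) {
  event.preventDefault(); // stop the default form submission
  var formData = new FormData();
  var textInputName = document.getElementById('textInputName');
  var sexButtonActive = document.querySelector('#buttonsMW > .btn.active');
  var imagesInput = document.getElementById('imagesInput');
  formData.append('name', textInputName.value);
  formData.append('sex', sexButtonActive != null ? sexButtonActive.innerHTML : '');
  for (const image of imagesInput.files) {
    formData.append('images', image); // one append per file, same field name
  }
  fetch('/user/register', {
    method: 'POST',
    body: formData, // body, not data; let the browser set the Content-Type
  })
    .then(response => response.json());
}
document.querySelector("form").addEventListener("submit", register);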

NodeJS fetch failed (object2 is not iterable) when uploading file via POST request

I'm trying to upload a file using native fetch in NodeJS (added in node 17.5, see https://nodejs.org/ko/blog/release/v17.5.0/).
However, I keep getting the following error -
TypeError: fetch failed
    at Object.processResponse (node:internal/deps/undici/undici:5536:34)
    at node:internal/deps/undici/undici:5858:42
    at node:internal/process/task_queues:140:7
    at AsyncResource.runInAsyncScope (node:async_hooks:202:9)
    at AsyncResource.runMicrotask (node:internal/process/task_queues:137:8)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
  cause: TypeError: object2 is not iterable
    at action (node:internal/deps/undici/undici:1660:39)
    at action.next (<anonymous>)
    at Object.pull (node:internal/deps/undici/undici:1708:52)
    at ensureIsPromise (node:internal/webstreams/util:172:19)
    at readableStreamDefaultControllerCallPullIfNeeded (node:internal/webstreams/readablestream:1884:5)
    at node:internal/webstreams/readablestream:1974:7
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
I'm using the following code to create and submit the form response -
async function upload(hub_entity_id, document_path) {
  let formData = new FormData();
  formData.append("type", "Document");
  formData.append("name", "ap_test_document.pdf");
  formData.append("file", fs.createReadStream("ap_test_document.pdf"));
  formData.append("entity_object_id", hub_entity_id);

  const form_headers = {
    Authorization: auth_code,
    ...formData.getHeaders(),
  };

  console.log(
    `Uploading document ap_test_document.pdf to hub (${hub_entity_id})`
  );
  console.log(formData);

  let raw_response = await fetch(urls.attachments, {
    method: "POST",
    headers: form_headers,
    body: formData,
  });
  console.log(raw_response);
}
Issue with the form-data package:
The form-data structure is not parseable by Node.js's built-in fetch, so it throws: object2 is not iterable.
On the other hand, the sad story is that form-data will not be maintained anymore; you may have noticed that two years have passed since the last version was published, and it has been officially announced that form-data will be archived.
The final nail in the coffin of form-data:
"Will this be the time for deprecation? form-data hasn't been updated in a while and it still lacks some methods that should be provided according to the spec. node-fetch#3 stopped recommending that people use form-data, due to inconsistency with spec-compliant FormData, and recommends the built-in FormData or a spec'ed formdata polyfill that supports iterating over all fields and has Blob/File support."
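For context, a spec-compliant FormData is iterable, which is exactly what undici's fetch relies on when serializing the body. A minimal sketch with Node 18+'s built-in FormData:
// The built-in FormData iterates fine; the legacy form-data package
// is not iterable this way, which is what triggers the
// "object2 is not iterable" error inside undici.
const fd = new FormData();
fd.set("type", "Document");
fd.set("name", "ap_test_document.pdf");
for (const [name, value] of fd) {
  console.log(name, value);
}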
Solutions
1. Using the form-data package
stream.Transform:
By piping the form-data instance through the stream.Transform class from Node.js's stream module, we can send the request with the built-in fetch API.
from Node.js doc:
Transform streams are Duplex streams where the output is in some way related to the input. Like all Duplex streams, Transform streams implement both the Readable and Writable interfaces.
So we can achieve it like this:
import { Transform } from 'stream';

// rest of code

const tr = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk);
  },
});
formData.pipe(tr);

const request = new Request(url, {
  method: 'POST',
  headers: form_headers,
  duplex: 'half',
  body: tr,
});

let raw_response = await fetch(request);
stream.PassThrough:
Instead of re-emitting each chunk ourselves, we can simply use stream.PassThrough, a Transform that passes its input straight through:
import { PassThrough } from 'stream';

// rest of code

const pt = new PassThrough();
formData.pipe(pt);

const request = new Request(url, {
  method: 'POST',
  headers: form_headers,
  duplex: 'half',
  body: pt,
});

let raw_response = await fetch(request);
Important note: If you don't pass duplex: 'half', you will get this error:
duplex option is required when sending a body
2. Using the built-in FormData
Currently, the part of Node.js core that handles fetch is named undici.
Luckily, you don't need any third-party module for handling form data, since undici implements FormData and is now part of Node.js core.
Sadly, working with streaming in undici is not easy and straightforward. However, you can still achieve it:
import { Readable } from 'stream';
// For now, this is essential for encoding the multipart header part;
// alternatively, you can skip this module and implement the encoding yourself.
import { FormDataEncoder } from 'form-data-encoder';

// This is the built-in FormData class; as long as you're using Node.js
// version 18.x and above, there's no need to import any third-party
// form-data package from NPM.
const formData = new FormData();
formData.set('file', {
  name: 'ap_test_document.pdf',
  [Symbol.toStringTag]: 'File',
  stream: () => fs.createReadStream(filePath),
});

const encoder = new FormDataEncoder(formData);

const request = new Request(url, {
  method: 'POST',
  headers: {
    'content-type': encoder.contentType,
    Authorization: auth_code,
  },
  duplex: 'half',
  body: Readable.from(encoder),
});

let raw_response = await fetch(request);
P.S.: You may need to read this issue for the part about encoding.
I have also faced this type of FormData issue in a React app. Using single quotes instead of double quotes resolved my issue.
Try this:
formData.append('type', "Document");
formData.append('name', "ap_test_document.pdf");
formData.append('file', fs.createReadStream("ap_test_document.pdf"));
formData.append('entity_object_id', hub_entity_id);
Let me know if this solved your problem or not.
The usage of
new Request(url, {
  method: 'POST',
  headers: ...,
  duplex: 'half',
  body: ...
})
in Mostafa Fakhraei's answer is necessary to avoid errors caused by the absence of a size attribute in the formData.set statement. See also this GitHub issue, including the comment which considers this approach a misuse of FormData.
Here is an approach without any help from FormData or similar packages. You can make the request with body set to a pass-through stream into which you
write some form fields
then pipe the fs.createReadStream
then write some more form fields.
let body = new stream.PassThrough();
body.write(`--xxx\r
Content-Disposition: form-data; name="type"\r
\r
Document\r
--xxx\r
Content-Disposition: form-data; name="name"\r
\r
ap_test_document.pdf\r
--xxx\r
Content-Disposition: form-data; name="file"; filename="ap_test_document.pdf"\r
Content-Type: application/pdf\r
\r
`);
fs.createReadStream("ap_test_document.pdf")
  .on("end", function() {
    body.end(`\r
--xxx\r
Content-Disposition: form-data; name="entity_object_id"\r
\r
${hub_entity_id}\r
--xxx--\r
`);
  })
  .pipe(body, {end: false});

let raw_response = await fetch(urls.attachments, {
  method: "POST",
  headers: {"content-type": "multipart/form-data; boundary=xxx"},
  duplex: "half",
  body
});
Noteworthy points:
There is a body.write before the .pipe(body, {end: false})
The extra {end: false} allows you to write additional fields into the body after the piping has completed.
This writing of additional fields happens through body.end in the end event of the fs.createReadStream.
The boundary xxx should be a longer, random string. It must be mentioned in the content-type header and must not occur in the streamed file (otherwise the client would think the file has ended; this problem would vanish if the Content-Length for the file was set in an extra line after the Content-Type).
A \r is required before each newline, otherwise the multer middleware does not parse the multipart body ("Unexpected end of form at Multipart._final").
I assume that the FormDataEncoder in Mostafa Fakhraei's answer does something similar internally.
Use fs.readFileSync instead of createReadStream:
formData.append("file", fs.readFileSync("ap_test_document.pdf"));
You can also pass the filename directly in formData.append:
formData.append("file", fs.readFileSync("ap_test_document.pdf"), "ap_test_document.pdf");

Why does FormData appends extra empty arrays?

I am trying to upload multiple images via axios to the backend server of a new application. The same code worked perfectly on previous applications I developed. When I did some inspection, the problem was that a couple of arrays with empty values were being added to the FormData; this didn't happen on the previous apps.
This is the code to upload the files:
const formData = new FormData();
for (var i = 0; i < data.files.length; i++) {
  let name = data.files[i].name;
  let ext = data.files[i].name.split('.');
  ext = ext[ext.length - 1].toLowerCase() == 'dcm' ? 'DCM' : 'IMG';
  let content = data.files[i];
  console.log({ name, ext, content });
  formData.append('types', ext);
  formData.append('names', name);
  formData.append('images', content);
}
const response = await axios.post(`${API_BASE_URL}images/`,
  formData,
  {
    headers: {
      "Content-Type": 'application/x-www-form-urlencoded',
      "Authorization": TOKEN
    }
  });
return response;
This is the request payload of the old application [screenshot], and this is the request payload of the new application [screenshot].
I use the exact same code for both the new and the old app. Notice that in the new app the FormData somehow appends a lot of arrays with empty values, while in the old app there's no such thing.
Can anybody tell me what happened here?
EDIT: It seems that axios's current version (0.27.1) sends data differently than the previous axios version I used (0.26.0). It worked fine when I downgraded the version number. Still curious what happened though...
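For reference, the downgrade described in the edit just means pinning the earlier version, e.g.:
npm install axios@0.26.0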

Node.js Express send huge data to client vanilla JS

In my application I read a huge amount of image data and send all of it to the client:
const imagesPaths = await getFolderImagesRecursive(req.body.rootPath);
const dataToReturn = await Promise.all(imagesPaths.map((imagePath) => new Promise(async (resolve, reject) => {
  try {
    const imageB64 = await fs.readFile(imagePath, 'base64');
    return resolve({
      filename: imagePath,
      imageData: imageB64,
    });
  } catch {
    return reject();
  }
})));
return res.status(200).send({
  success: true,
  message: 'Successfully retrieved folder images data',
  data: dataToReturn,
});
Here is the client side:
const getFolderImages = (rootPath) => {
  return fetch('api/getFolderImages', {
    method: 'POST',
    headers: { 'Content-type': 'application/json' },
    body: JSON.stringify({ rootPath }),
  });
};

const getFolderImagesServerResponse = await getFolderImages(rootPath);
const getFolderImagesServerData = await getFolderImagesServerResponse.json();
When I send the data, the request fails because of the sheer size; sending the data just with res.send(<data>) is impossible. So, then, how can I bypass this limitation, and how should I receive the data on the client side with the new approach?
The answer to your problem requires some reading:
Link to the solution
One thing you probably haven't taken full advantage of before is that a webserver's HTTP response is a stream by default.
Frameworks just make it easier for you to pass in synchronous data, which is split into chunks under the hood and sent as HTTP packets.
We are talking about huge files here; naturally, we don't want them to be held in memory, at least not the whole blob. The excellent solution for this dilemma is a stream.
We create a read stream with the help of the built-in Node package fs, then pass it to the stream-compatible response.send parameter:
const readStream = fs.createReadStream('example.png');
return response.headers({
  'Content-Type': 'image/png',
  'Content-Disposition': 'attachment; filename="example.png"',
}).send(readStream);
I used the Fastify webserver here, but it should work similarly with Koa or Express.
There are two more configurations here: the 'Content-Type' and 'Content-Disposition' headers.
The first one indicates the type of blob we are sending chunk-by-chunk, so the frontend can handle the file extension automatically.
The latter tells the browser that we are sending an attachment, not something renderable like an HTML page or a script. This triggers the browser's download functionality, which is widely supported. The filename parameter is the download name of the content.
Here we are; we accomplished minimal memory stress, minimal coding, and minimal error opportunities.
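On the client side, a minimal sketch of consuming such a streamed response could look like this (an illustration reusing the question's endpoint; for very large payloads you can iterate over response.body with a reader instead of buffering the whole blob):
// Fetch the streamed file and hand it to the browser as a download.
const response = await fetch('api/getFolderImages', {
  method: 'POST',
  headers: { 'Content-type': 'application/json' },
  body: JSON.stringify({ rootPath }),
});
// Buffer the stream into a Blob (or use response.body.getReader()
// to process chunks incrementally).
const blob = await response.blob();
const objectUrl = URL.createObjectURL(blob);
// e.g. assign objectUrl to an <a download> element and click it.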
One thing we haven’t mentioned yet is authentication.
Since the frontend won't send an Ajax request, we can't expect an auth JWT header to be present on the request.
Here we will take the good old cookie auth approach. Cookies are set automatically on every request header that matches the criteria, based on the cookie options. More info about this in the frontend implementation part.
By default, cookies arrive as semicolon-separated key-value pairs in a single string. To ease the parsing part, we will use Fastify's cookie-parser plugin.
await fastifyServer.register(cookieParser);
Later in the handler method, we simply get the cookie that we are interested in and compare it to the expected value. Here I used only strings as auth-tokens; this should be replaced with some sort of hashing and comparing algorithm.
const cookies = request.cookies;
if (cookies['auth'] !== 'authenticated') {
  throw new APIError(400, 'Unauthorized');
}
That’s it. We have authentication on top of the file streaming endpoint, and everything is ready to be connected by the frontend.

Why does S3 file get a prefix after upload?

StackOverflow 👋
I'm putting an object to S3 directly from the browser via a signedUrl.
The code I'm using looks roughly like this:
const formData = new FormData()
const file = await selectorToInput.files[0]
formData.append('file', file)

await fetch(uploadUrl, {
  method: 'PUT',
  body: formData,
  mode: 'cors',
  headers: {
    'Content-Type': 'text/yaml',
  }
}).then(r => r.ok)
The upload is successful, however, when retrieving the objects after they've been uploaded they're all prefixed with characters like this:
-----------------------------3536405376111676041452100156
'Content-Disposition: form-data; name="file"
<THE REST OF THE FILE CONTENTS>
They look like file headers of some kind, but I can't figure out for the life of me where they're coming from. For more context, these files are all YAML, and these additional characters are causing the parser I'm using to throw malformed-YAML errors, so I don't feel like they're supposed to be there.
I've also tried this without using the .text() call on the File object and get the same result with different looking headers. Is this an issue/feature of Fetch?
I wish I could provide more info, but I've been searching for several hours now and haven't found an explanation. Any help is greatly appreciated.
Answering my own issue here. After more digging I found this issue on the Amazon SDK. It turns out that wrapping the file in FormData was the issue: with a presigned PUT URL, the file has to be sent directly as the request body, not wrapped in multipart form data. This goes contrary to all the other documentation I found, so hopefully this helps someone else.
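For reference, a minimal sketch of the corrected upload (same variable names as the question's snippet):
// Send the raw File as the body; with no FormData wrapper,
// no multipart boundary lines end up inside the stored object.
const file = selectorToInput.files[0]
await fetch(uploadUrl, {
  method: 'PUT',
  body: file,
  mode: 'cors',
  headers: {
    'Content-Type': 'text/yaml',
  }
}).then(r => r.ok)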
