I'm trying to get a POST endpoint working with AWS, to post an image to an Amazon S3 bucket, but I'm getting the following error from s3fs/aws-sdk:
Unhandled rejection MalformedXML: The XML you provided was not well-formed or did not validate against our published schema
Why is this error happening and how can it be fixed?
Here is my POST function amongst other relevant things:
import fs from 'fs';
import s3fs from 's3fs';
const S3FS = s3fs;
const s3fsImp = new S3FS('testbucket', {
  accessKeyId: 'asdf...',
  secretAccessKey: '1234...',
});
s3fsImp.create();
...
const file = req.files.file;
const stream = fs.createReadStream(file.path);
return s3fsImp.writeFile(file.originalFileName, stream).then(() => {
  fs.unlink(file.path, (err) => {
    if (err) {
      console.error(err);
    }
    res.json({ working: true });
  });
});
I find the error message quite useless because I'm not providing any XML. Do I need to declare or flag a schema somewhere? I don't understand why this is happening yet.
I'm testing this with Postman:
- no Content-Type header set
- sending the image via the body key 'file', set as type 'file' (as opposed to 'text')
- plain form-data
I've been reading some documentation and tutorials and I'm now unsure how to continue with this. Any help is appreciated.
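In case it helps narrow things down, here is a minimal sketch (not from the original post) of uploading the same stream with the plain aws-sdk v2 client instead of s3fs, reusing the placeholder bucket name and credentials from the question; if this fails too, the problem is not s3fs itself.

import fs from 'fs';
import AWS from 'aws-sdk';

// Placeholder credentials and bucket from the question.
const s3 = new AWS.S3({
  accessKeyId: 'asdf...',
  secretAccessKey: '1234...',
});

// Inside the same POST handler, with `file = req.files.file` and `res` as above.
const stream = fs.createReadStream(file.path);
return s3
  .upload({ Bucket: 'testbucket', Key: file.originalFileName, Body: stream })
  .promise()
  .then(() => res.json({ working: true }))
  .catch((err) => {
    console.error(err); // compare this error against the MalformedXML one
    res.sendStatus(500);
  });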
I use XMLHttpRequest with Node.js, and when handling the request in server.js (an asynchronous POST request) I use this code:
const fsPromises = require('fs').promises;

const filePath = './input.pdf';
fsPromises.readFile(filePath)
  .then(res => {
    const pdfBytes = new Uint8Array(res).buffer;
  })
  .catch(err => {
    console.log('Could not read file-template');
  });
I get the ArrayBuffer through this code. But I want to move the code into a module and call an exported function from there instead. This change broke the file reading, and the devtools Network tab shows an error on my request with the message 'Provisional headers are shown'.
How can I make it work in a module, not in server.js?
I attached images with the devtools messages: img1, img2.
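Not from the original post, but a minimal sketch of what moving this into a module could look like, assuming a hypothetical pdf-reader.js file and that server.js only responds after the returned promise resolves:

// pdf-reader.js (hypothetical module name)
const fsPromises = require('fs').promises;

// Resolves to an ArrayBuffer with the file contents.
function readPdfAsArrayBuffer(filePath) {
  return fsPromises.readFile(filePath)
    .then(buf => new Uint8Array(buf).buffer);
}

module.exports = { readPdfAsArrayBuffer };

// server.js, inside the POST handler (res is the HTTP response here):
// const { readPdfAsArrayBuffer } = require('./pdf-reader');
//
// readPdfAsArrayBuffer('./input.pdf')
//   .then(pdfBytes => {
//     // respond only after the file has been read, otherwise the request
//     // hangs and devtools shows 'Provisional headers are shown'
//     res.end(Buffer.from(pdfBytes));
//   })
//   .catch(err => {
//     console.log('Could not read file-template');
//     res.statusCode = 500;
//     res.end();
//   });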
I Googled this but couldn't find an answer, though it must be a common problem. This is the same question as Node request (read image stream - pipe back to response), which is unanswered.
How do I send an image file as an Express .send() response? I need to map RESTful URLs to images - but how do I send the binary file with the right headers? E.g.,
<img src='/report/378334e22/e33423222' />
Calls...
app.get('/report/:chart_id/:user_id', function (req, res) {
  // authenticate user_id, get chart_id obfuscated url
  // send image binary with correct headers
});
There is an API in Express for this:
res.sendFile
app.get('/report/:chart_id/:user_id', function (req, res) {
  // res.sendFile(filepath);
});
http://expressjs.com/en/api.html#res.sendFile
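As a rough sketch of how that could look (not from the original answer; the path lookup below is a hypothetical placeholder), res.sendFile wants an absolute path and sets the Content-Type from the file extension:

const path = require('path');

app.get('/report/:chart_id/:user_id', function (req, res) {
  // authenticate req.params.user_id, then map req.params.chart_id to a file
  // (this lookup is hypothetical; substitute your own)
  const filepath = path.resolve(__dirname, 'images', req.params.chart_id + '.png');

  res.sendFile(filepath, (err) => {
    if (err && !res.headersSent) {
      res.sendStatus(404); // file missing or unreadable
    }
  });
});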
A proper solution with streams and error handling is below:
const fs = require('fs')
const stream = require('stream')

app.get('/report/:chart_id/:user_id', (req, res) => {
  const r = fs.createReadStream('path to file') // or any other way to get a readable stream
  const ps = new stream.PassThrough() // <---- this PassThrough is the trick for stream error handling
  stream.pipeline(
    r,
    ps, // <---- part of the stream error handling trick
    (err) => {
      if (err) {
        console.log(err) // No such file or any other kind of error
        return res.sendStatus(400);
      }
    })
  ps.pipe(res) // <---- part of the stream error handling trick
})
With Node older than 10 you will need to use pump instead of pipeline.
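A rough equivalent with pump (not from the original answer, and the file path is a placeholder) might look like this:

const fs = require('fs')
const pump = require('pump') // npm install pump

app.get('/report/:chart_id/:user_id', (req, res) => {
  const r = fs.createReadStream('path to file')
  // pump pipes the streams and calls the callback once with the first error, if any
  pump(r, res, (err) => {
    if (err) {
      console.log(err)
      if (!res.headersSent) res.sendStatus(400)
    }
  })
})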
I'm using form-data and Axios to upload a file to another API from a Firebase Function. However, when I add createReadStream(filePath) to my formData object and attach it to my POST request, the response gives a 400 status code, claiming that the file is not present. My code:
const { fileName } = data;
const tempFilePath = path.join(os.tmpdir(), 'image.jpg');
const bucket = admin.storage().bucket();
await bucket.file(fileName).download({ destination: tempFilePath });

let formData = new FormData();
formData.append('photo', fs.createReadStream(tempFilePath));
const formHeaders = formData.getHeaders();

await axios.post('api/endpoint', formData, {
  headers: {
    ...formHeaders
  }
}).catch(err => {
  console.error(err.response.data)
});
The error I get after the POST request has a status code of 400 and is as follows:
{ error: 'invalid_query_missing_photo', ok: false }
I have verified that the tempFilePath does lead to an actual file, and I am able to make the request from Postman. However, even the code generated from Postman's code generation feature results in the same error.
My import statements:
// The Cloud Functions for Firebase SDK to create Cloud Functions and setup triggers.
const functions = require('firebase-functions');
const axios = require('axios');
// The Firebase Admin SDK to access Firestore.
const admin = require('firebase-admin');
admin.initializeApp();
const path = require('path');
const os = require('os');
const fs = require('fs');
const { firebaseConfig } = require('firebase-functions');
const FormData = require('form-data');
The 400 error is coming from the second API. Maybe something else is required to mark the request as valid, or the required field is not named photo.
I tested your code and created another function simulating the second API. In my second API, I get an HTTP 400 error when the file field of the form doesn't match the field the API requires.
Second API code (function written in Python):
import os

def hello_world(request):
    print(request.files)
    # this must match the form field, or you are going to receive an error 400
    file = request.files['photo']
    filename = "cool.png"
    file.save(os.path.join("/tmp", filename))
    # this is to verify that the object was saved and is in /tmp
    print('Size :', os.path.getsize(os.path.join("/tmp", filename)))
    return f'Hello World!'
With this second function I can validate that the issue is not coming from your function code.
I discovered you need to use getRawBody on the createReadStream result, but only for Cloud Functions. This is what finally worked for me for Google Cloud Firebase Functions:
// Assumption: getRawBody comes from the raw-body package; the original snippet
// does not show the import.
// const getRawBody = require('raw-body');

const fullFile = await getRawBody(fs.createReadStream(fileLocation));
...
formData.append('file', fullFile, { filename: 'somename.png' });
...
await axios.post('api/endpoint', formData.getBuffer(), {
  headers: {
    ...formHeaders
  }
}).catch(err => {
  console.error(err.response.data)
});
This solved my error of:
TypeError [ERR_INVALID_ARG_TYPE]: The "string" argument must be one of type string, Buffer, or ArrayBuffer. Received type object
Locally the read stream seems to be converted into a full buffer automatically, but on cloud servers you need to convert it manually. There doesn't seem to be any documentation about this anywhere.
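Pulled together, a self-contained sketch of that approach could look like this (not from the original answer; it assumes getRawBody is the raw-body package and reuses the 'photo' field and endpoint placeholder from the question):

const fs = require('fs');
const axios = require('axios');
const FormData = require('form-data');
const getRawBody = require('raw-body'); // assumption: this is where getRawBody comes from

async function uploadPhoto(fileLocation) {
  // Read the whole stream into a Buffer first (the manual conversion step
  // that seems to be needed on Cloud Functions).
  const fullFile = await getRawBody(fs.createReadStream(fileLocation));

  const formData = new FormData();
  formData.append('photo', fullFile, { filename: 'image.jpg' });

  // Send the multipart body as a Buffer, with the boundary headers form-data generates.
  return axios.post('api/endpoint', formData.getBuffer(), {
    headers: { ...formData.getHeaders() },
  });
}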
I have written a simple function to handle upload of files in my sails.js app.
let upload = file.upload((err, uploadedFiles) => {
  if (err) {
    return res.serverError(err);
  } else {
    return res.send({ data: uploadedFiles });
  }
});
When the upload is complete I am redirected to a page displaying raw JSON, which contains the uploaded file information (including the path).
raw json response
What I am expecting when I console.log(upload) is the same information; however, I am getting the write stream instead.
console.log output
This is a problem for me because I would like to be able to extract the file name from the object and use it in another part of my program, but I can't do this because all I am able to access is the writestream.
I have tried using async/await and callbacks and can't seem to fix my issue.
Hopefully someone can help me!
Thanks
A helpful person on the sails Gitter suggested that I use this package, which supports async/await: https://www.npmjs.com/package/sails-hook-uploads
I tested it out with the following code and it works:
const util = require('util'); // needed for util.inspect below

let upload = await sails
  .uploadOne(file, {
    maxBytes: 3000000,
  })
  .intercept('E_EXCEEDS_UPLOAD_LIMIT', 'tooBig')
  .intercept(
    (err) => new Error('The photo upload failed: ' + util.inspect(err))
  );
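If it helps with extracting the file name: with sails-hook-uploads the resolved value should be the uploaded file's metadata rather than a write stream, so (assuming the usual shape of that result, which the original answer does not show) the stored path and type can be read directly:

// Assumption: sails.uploadOne resolves with the uploaded file's descriptor info.
console.log(upload.fd);   // path/identifier of the stored file
console.log(upload.type); // MIME type

// e.g. hand it to another part of the program, or send it back:
return res.send({ data: upload });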
I send an image file to my Node server via my React app.
I want to host these images on Google Cloud or similar so they have an accessible URL.
I have tried using Cloudinary and Google Cloud but to no avail thus far!
My react-side code (shortened):
const imageFile = this.state.files[0];

const formData = new FormData();
formData.append('file', imageFile);

sendImage(formData);

sendImage(image) {
  axios.post("https://137a6167.ngrok.io/image-upload", image, {
  })
    .then(res => { // then print response status
      console.log(res.statusText)
    })
}
The file is successfully sent to my server and consoled:
app.post('/image-upload', (req, res) => {
  console.log('consoling the req.body!!!!' + JSON.stringify(req.body))
})
THE CONSOLE: consoling the req.body!!!!{"1":"[object File]"}
I did try using the following Cloudinary method, yet it threw errors:
// Assumption: the requires below are not shown in the original, but
// formData.parse() implies the express-form-data middleware:
// const cloudinary = require('cloudinary').v2;
// const formData = require('express-form-data');

cloudinary.config({
  cloud_name: process.env.CLOUD_NAME,
  api_key: process.env.API_KEY,
  api_secret: process.env.API_SECRET
})

app.use(formData.parse())

app.post('/image-upload', (req, res) => {
  const values = Object.values(req.files)
  const promises = values.map(image => cloudinary.uploader.upload(image.path))

  // note: there is no .catch() here, so a failed upload becomes an unhandled rejection
  Promise
    .all(promises)
    .then(results => res.json(results))
})
This gave me an unhandled promise rejection error, and I got a bit lost about where to go beyond that!
I looked at Google Cloud Storage too but couldn't get it working! Any advice?
What I really want to do is return the URL of the hosted image back to my React app, so it can be stored in the DB for the user!
If you can help at all that would be greatly appreciated, thank you.
There are a couple of things you need to fix on the front end before you try to upload to any cloud.
First, you need to set the 'Content-Type': 'multipart/form-data' header in axios to send the file data properly.
Check this thread for more details: How do I set multipart in axios with react?
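For example, a minimal sketch of the axios call with that header, adapted from the question's sendImage (not from the linked thread):

sendImage(image) {
  axios.post("https://137a6167.ngrok.io/image-upload", image, {
    headers: {
      'Content-Type': 'multipart/form-data'
    }
  })
    .then(res => {
      console.log(res.statusText)
    })
    .catch(err => {
      console.error(err)
    })
}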
Then on the Express side you need multer or some other similar library to receive the data. You can't access it from req.body; multer adds req.file and req.files, for example.
https://github.com/expressjs/multer
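A minimal multer sketch for this route could look like the following (the destination folder is an arbitrary placeholder; 'file' matches the field name the React code appends):

const multer = require('multer');
const upload = multer({ dest: 'uploads/' }); // uploaded files are written to this local folder

// upload.single('file') must match formData.append('file', imageFile) on the client
app.post('/image-upload', upload.single('file'), (req, res) => {
  console.log(req.file); // { fieldname, originalname, path, ... }
  // from here you can push req.file.path to Cloudinary / Google Cloud Storage
  // and send the resulting URL back to the React app
  res.json({ path: req.file.path });
});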
Try these steps and then post the exact error message you are receiving from Google Cloud.