Sending large base64 image to socket.io server - javascript

I am trying to send an image encoded in base64 to a socket.io server. Simply emitting it doesn't seem to work. However, sending shorter strings works fine; the problem is with sending a really long base64-encoded image.
// this works
this.socket.emit('imageUpload', {captureType: "documentBack", data: "simple text"})
// this doesn't
this.socket.emit('imageUpload', {captureType: "documentBack", data: "really long base64 encoded image"})
I have a frontend built with React JS and backend built with Node.
Following the comment by @Zac, I've implemented socket.io-stream as the documentation instructs, but I still can't see the request on the backend. The data appears to have been emitted from the client, but I can't see it in the event handler.
This is what I have so far:
// Client (React.js)
import ss from 'socket.io-stream';

this.stream = ss.createStream();
ss(this.socket).emit('imageUpload', this.stream, {
  captureType: 'documentBack',
  data: imageData,
});
// note: nothing is ever piped into this.stream, so the stream itself
// carries no data
// Server (Node.js)
// socket.io-stream events must be handled via ss(socket).on, not socket.on
ss(socket).on("imageUpload", (stream, imageData) => {
  console.log("Image uploaded");
  // console.log(stream);
  // console.log(imageData);
  socket.broadcast.to(sessionId).emit("imageDownload", imageData);
});
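One likely culprit worth ruling out: Socket.IO servers cap the size of a single message (the `maxHttpBufferSize` option, roughly 1 MB by default in recent versions), which would explain why short strings arrive but a large base64 image does not. A workaround that avoids socket.io-stream entirely is to split the base64 string into chunks, emit them one by one, and reassemble them server-side. A minimal sketch of the chunk/reassemble helpers follows; the event name and message shape in the comment are hypothetical, not from the question.

```javascript
// Split a long base64 string into fixed-size chunks so each emit stays
// under the server's per-message size limit.
function chunkString(str, size) {
  const chunks = [];
  for (let i = 0; i < str.length; i += size) {
    chunks.push(str.slice(i, i + size));
  }
  return chunks;
}

// Server side: once all chunks for an upload have arrived, join them
// back into the original base64 string.
function reassemble(chunks) {
  return chunks.join('');
}

// Client sketch (event name 'imageChunk' is illustrative):
//   const chunks = chunkString(imageData, 100 * 1024);
//   chunks.forEach((chunk, index) => {
//     socket.emit('imageChunk', { captureType: 'documentBack', index, total: chunks.length, chunk });
//   });
```

Raising `maxHttpBufferSize` when creating the server is the simpler fix if you control both ends; chunking just keeps each message small enough for the default.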

Related

How to stringify a javascript object with a blob as data in it?

I am recording my webcam with MediaRecorder and sending each blob back to the server using a Websocket, like so:
recorder = new MediaRecorder(canvasStream);
recorder.ondataavailable = e => {
  ws.send(e.data);
};
which works fine. However, I want more control over the type of message or data that will be sent through the Websocket, and therefore I went with the classic
ws.send(JSON.stringify({ type: 'REC', data: e.data }))
to no avail. Obviously I cannot parse the data back on the server. How can I send a blob to the server while stringifying my message?
JSON is a text-based format; it cannot directly include binary data like Blobs. What you can do is obtain the Blob's ArrayBuffer, encode it with base64 or hex, and send it as text. This will make the upload 1.5-2 times larger, though.
Alternatively, you can try a binary transport like MessagePack instead of JSON.
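In Node terms, the base64 approach from the answer above can be sketched like this: take the raw bytes, encode them with `Buffer`, wrap them in the `{ type, data }` shape from the question, and decode on the other end. The helper names are mine, not from the question.

```javascript
// Encode binary data as base64 so it can travel inside a JSON message.
function packMessage(type, bytes) {
  // bytes: Buffer or Uint8Array; base64 text is JSON-safe
  return JSON.stringify({ type, data: Buffer.from(bytes).toString('base64') });
}

// Decode the JSON message back into its type and raw bytes.
function unpackMessage(json) {
  const msg = JSON.parse(json);
  return { type: msg.type, data: Buffer.from(msg.data, 'base64') };
}
```

In the browser you would get the bytes via `await blob.arrayBuffer()` before encoding; on the Node server, `Buffer` handles both directions.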

Nodejs - Fetch file from url and send content to client

For some reason, I don't want to share the URL publicly (secret URL).
My workflow is as follows:
The client sends an API request to my server:
API: mywebsite.com/api/image_abc.jpg
My Node.js Express server fetches the file from the URL, e.g.:
secret_url.com/image_abc.jpg
Then, from the image content in the response, Node sends the image content back to the client, which displays it as image_abc.jpg.
I looked around on Stack Overflow, but only found answers about reading a file from disk and sending it to the client. What I want is to relay the image content to the client without saving the file to disk.
Thank you.
Assuming you want to return the contents of a file from a certain URL to the client as a buffer, here's the solution I suggest.
Get the file using axios and return the buffer to your client:
const axios = require('axios');

const URL = 'some-valid-url';
// inside an async route handler
const response = await axios.get(URL, { responseType: 'arraybuffer' });
const buffer = Buffer.from(response.data);
res.status(200).send(buffer);
In case you want to save it to your server, you can use fs as follows to write the file to the folder of your choice:
const fs = require('fs');

fs.writeFile(fileName, buffer, (err) => {
  if (!err) console.log('Data written');
});
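One detail worth knowing about the conversion step: with `responseType: 'arraybuffer'`, the bytes come back raw, and `Buffer.from` wraps them directly; an encoding argument like `'utf-8'` only applies when the input is a string, and round-tripping binary data through text can corrupt it. A stdlib-only demonstration:

```javascript
// Buffer.from wraps raw bytes directly; no encoding argument is needed
// (encodings only apply when converting from a string).
const bytes = new Uint8Array([0x89, 0x50, 0x4e, 0x47]); // PNG magic bytes
const arrayBuffer = bytes.buffer;

const fromArrayBuffer = Buffer.from(arrayBuffer); // view of the same 4 bytes
const fromString = Buffer.from('\x89PNG', 'utf-8'); // text path mangles 0x89

console.log(fromArrayBuffer.length); // 4
console.log(fromString.length);      // 5: 0x89 became a two-byte UTF-8 sequence
```

This is why serving binary through a string conversion often produces a file that is slightly the wrong size and fails to open.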

Export a file to Node server and then upload to S3

I'm generating an HTML webpage as a PDF and then exporting it locally. How can I save this file to my Node server and upload it to S3?
Please find the attached pseudo code:
const convertDataToPdf = (exportFlag, cb) => { // set to switch between export and save
  const doc = new jsPDF();
  // ... adding metadata and styling the pdf
  if (exportFlag) {
    doc.save('sample.pdf'); // export PDF locally
  } else {
    cb(doc.output()); // converts the PDF to raw data to send to the server
  }
};
Based on this answer, I'm appending the raw PDF data to a new FormData object and then making an ajax call to POST the raw data to my Node.js server:
convertDataToPdf(false, pdfData => {
  let formData = new FormData();
  formData.append('file-1', pdfData);
  $.ajax({
    url: '/file-upload',
    data: formData,
    processData: false,
    contentType: false,
    type: 'POST',
  }).then(data => {
    console.log('PDF upload to s3 successful!', data);
  }).catch(err => {
    console.log('Error! PDF Upload to S3 failed', err);
  });
});
Now, how can I parse the raw PDF data on the server and upload it?
As an alternative, is it possible to save my file locally and then upload the file to s3?
First question - on the Node server you can use multer: https://www.npmjs.com/package/multer . This way you don't have to decode the PDF; you just handle the request and pass the file to S3 (via the S3 Node API). You can check the mimetype to be sure someone is sending you a PDF.
Also, if you have an application server such as Nginx in front, you can limit the transferred file size.
For example, in Nginx: client_max_body_size 10M;. It's more secure to check the limit on the server, because naughty users can always cheat your web validations. Multer also has size validation if you would like to return a specific exception from your backend.
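To make the mimetype check from the answer concrete: multer exposes the client-reported type on the uploaded file object, but since that value is client-supplied, a more reliable PDF check also sniffs the file's leading magic bytes. A hedged sketch (the helper names and route fragment are mine, assuming multer with memoryStorage):

```javascript
// Guard helpers for a multer-style upload: check the reported mimetype
// and, more reliably, the file's leading "%PDF-" magic bytes.
function hasPdfMimetype(file) {
  return Boolean(file && file.mimetype === 'application/pdf');
}

function looksLikePdf(buffer) {
  // A real PDF starts with the ASCII bytes "%PDF-"
  return Buffer.isBuffer(buffer) && buffer.slice(0, 5).toString('ascii') === '%PDF-';
}

// Route sketch (multer with memoryStorage puts the bytes on req.file.buffer):
//   if (!hasPdfMimetype(req.file) || !looksLikePdf(req.file.buffer)) {
//     return res.status(400).send('Expected a PDF upload');
//   }
```

The magic-byte check is what catches a renamed .exe that claims to be application/pdf.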

Node.js Pipe a PDF API Response

So my scenario is: a user clicks a button on a web app, which triggers a server-side POST request to an internal (i.e. non-public) API sitting on another server in the same network. This should return a PDF to my server, which will proxy (pipe) it back to the user.
I want to just proxy the PDF body content directly to the client without creating a tmp file.
I have this code, which works using the npm request module but doesn't feel right:
var pdfRequest = request(requestOptions);

pdfRequest.on('error', function (err) {
  utils.sendErrorResponse(500, 'PROBLEM PIPING PDF DOWNLOAD: ' + err, res);
});

pdfRequest.on('response', function (resp) {
  if (resp.statusCode === 200) {
    pdfRequest.pipe(res);
  } else {
    utils.sendErrorResponse(500, 'PROBLEM PIPING PDF DOWNLOAD: RAW RESP: ' + JSON.stringify(resp), res);
  }
});
Is this the correct way to pipe the PDF response?
Notes:
I need to check the status code to conditionally handle errors; the payload for the POST is contained in the requestOptions (I know this part is all correct).
I would like to keep using the request module
I definitely do not want to be creating any temp files
If possible I would also like to modify the Content-Disposition header to set a custom filename; I know how to do this without using pipes
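On the custom-filename point: headers set on your own response before piping are what the client sees, so you can build the Content-Disposition value yourself and call `res.setHeader` just before `pdfRequest.pipe(res)`. A small sketch of such a builder (the helper name is mine); it quotes a plain ASCII fallback and adds an RFC 5987 `filename*` parameter for non-ASCII names:

```javascript
// Build a Content-Disposition value with a custom download filename.
// Non-ASCII characters go into the RFC 5987 filename* parameter; the
// plain filename parameter keeps an ASCII-only fallback for old clients.
function contentDisposition(filename) {
  const fallback = filename.replace(/[^\x20-\x7e]/g, '_').replace(/"/g, '\\"');
  const encoded = encodeURIComponent(filename);
  return `attachment; filename="${fallback}"; filename*=UTF-8''${encoded}`;
}

// Usage before piping (sketch):
//   res.setHeader('Content-Disposition', contentDisposition('report-2016.pdf'));
//   pdfRequest.pipe(res);
```

Setting the header works with pipes because piping only writes the body; headers are flushed when the first body chunk arrives.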

dojo/request a png image array buffer using NodeJS, and return the image to the client

I am attempting to request a png image using NodeJS and dojo/request from an ArcGIS Server REST feed, and return the image to the client.
I need nodeJS to add a token to the request as a query parameter as my services are secured and I want to control their security through node and not on the resource server (ArcGIS Server 10.3). Without nodeJS in the picture, the png loads in the browser. This is an (open) example of a call returning a png image, using their standard REST feed:
https://sampleserver6.arcgisonline.com/arcgis/rest/services/911CallsHotspot/MapServer/1/images/257104fa1b21d7b483c160ee8f3943bb
I am using dojo/request to access this resource. It seems the png image is coming back to node fine, however when I res.send() back to the client, I get a 'not an image' in Fiddler. I can see the PNG header, and the IHDR and IEND in the hexview, however it is apparent that the encoding (? or other) is not the same as when the resource is returned directly from the ArcServer. When the png is received on the client returned from node, the content-length is a little less than when the same png comes directly from the ArcServer. (Node is attempting, by its default, to use Transfer-Encoding: Chunked. I set content-length to the length of the response, so it sends at one time, which I believe is the behaviour the client is expecting.)
ESRI (ArcGIS Server) have a new sample stub out which does exactly what I want to do, however they are using esri/request (built on dojo/request) client side, and leverage Blob library to create the correct response for the client. https://developers.arcgis.com/javascript/jssamples/data_requestArrayBuffer.html
We can see they use handleAs: "arrayBuffer", which I have tried on dojo/request and it seems to make no difference to the content of the returned png. When they receive the response, they make a new Blob and then read it as data URL with FileReader (which I have a node side implementation of) and directly use that result as the png.
I have tried ad nauseam to replicate this, but it seems that node does not have a reliable Blob library (I have tried "Blob" with no luck), and I read that in node you are supposed to use Buffers instead. No matter what I pass to fileReader, I get the error "Cannot read as File" (I have tried all fileReader methods). The fileReader library: https://www.npmjs.com/package/filereader
ESRI code:
esriRequest(url, { handleAs: "arrayBuffer" }).then(function (response) {
  reader.readAsDataURL(new Blob([response], {
    type: "image/png"
  }));
  // reader.result is png ready to use client side
});
My attempts:
// tried handleAs: "arrayBuffer"
// tried npm Blob library
request(url, { handleAs: "arrayBuffer" }).then(function (response) {
  var reader = new FileReader();
  reader.addEventListener("loadend", function () {
    res.setHeader('Content-Type', 'image/png');
    res.setHeader('Content-Length', reader.result.length);
    res.send(reader.result);
  });
  reader.readAsDataURL(new Blob([response], {
    type: "image/png"
  }));
});
I am using node package filereader and Blob, with no luck creating a blob to pass to fileReader. If nodeJS uses Buffers instead of Blob, given that the content returned by the service appears to be an ArrayBuffer png, how do I pass that back to the client, through node?
I have also attempted to use ArrayBuffer to Buffer methods (through Uint8Array) however once I have the png in buffer form I can never read it.
In short, I just want to proxy the png from its source to the client, adding a token query parameter.
Comments welcome to improve the question.
Cross posted:
https://gis.stackexchange.com/questions/174304/dojo-request-a-png-image-array-buffer-using-nodejs-and-return-the-image-to-the
https://geonet.esri.com/message/575847#575847
I was looking to do a similar action but didn't find any answer. After some trial and error I seem to have figured it out.
In your request options, add the option encoding: null; this makes request return the body from the original source as a raw binary Buffer instead of a decoded string.
request({
  url: yoururl,
  method: 'GET',
  encoding: null
}, function (error, response, body) {
  if (!error && response.statusCode === 200) {
    res.status(response.statusCode).send(body);
  } else {
    res.status(response.statusCode).send(body.toString('utf8'));
  }
});
Then in my client I did a GET on my own URL, which gives you the base64 string. For the image on the client, you can set the source to data:image/PNG;base64,{result}:
var result = getBase64FromMyUrl();
var src = 'data:image/PNG;base64,' + result;
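If you want the server to hand the client a string it can drop straight into an img src, convert the raw Buffer to a data URL before sending. A small sketch (the helper name is mine, not from the answer):

```javascript
// Turn a raw image buffer into a data URL the client can use directly
// as an <img> src attribute.
function bufferToDataUrl(buffer, mimeType) {
  return `data:${mimeType};base64,${Buffer.from(buffer).toString('base64')}`;
}

// Server sketch, replacing the raw-buffer response:
//   res.send(bufferToDataUrl(body, 'image/png'));
```

Alternatively, send the raw bytes with a Content-Type of image/png and skip base64 entirely; the data-URL route is only needed when the client expects text.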
