Axios serving png image is giving broken image - javascript

I am using Express as a middleware API. A person hits my middleware, it hits some third-party API and returns the result. It works fine for all other endpoints except a GET endpoint that creates a PNG and serves it.
The method below returns the response:
```
_execute = R.curry((reqFunction, req, res) => {
  reqFunction(req, res)
    .then(r => {
      res.status(r.status).header(r.headers).send(r.data);
    })
    .catch(err => {
      res.status(err.response.status).header(err.headers).send(err.response.data);
    });
});
```
and reqFunction is a method like:
```
modelDiagram = (req, res) => {
  const headers = req.headers;
  const modelName = req.params['modelName'];
  const endPoint = this.modelDiagramEndPoint + '/' + modelName;
  return this.httpRequests.get(endPoint, headers);
};
```
and the httpRequests.get method is:
```
get = (endPoint, headers) => {
  let options = {
    method: 'GET',
    url: endPoint
  };
  options = this.addHeaders(headers, options);
  return this._send(options);
};
```
and this._send is like
```
_send = (options) => {
  return axios(options);
};
```
The response, if I hit the endpoint directly from the browser, looks like this:
```
�PNG
[... binary PNG data truncated ...]
IEND
```
but the image is broken. I have tried many things like req.end and explicitly setting headers, but I am not able to solve it. Please guide me.

Sorry for the late answer, but maybe someone will find this approach more reliable:
```
axios.get('/path/to/image.png', { responseType: 'arraybuffer' })
  .then(response => {
    let blob = new Blob(
      [response.data],
      { type: response.headers['content-type'] }
    );
    let image = URL.createObjectURL(blob);
    return image;
  });
```
How I understand this:
When you specify { responseType: 'arraybuffer' } in the options, axios will set response.data to an ArrayBuffer.
Next you create a file-like object, a Blob.
You get a URL to the blob via URL.createObjectURL(blob), which you can use in the src attribute of an img tag.
As described in this article, the approach above should be faster.
Here is a jsfiddle demo.
UPD:
Just found out that you can specify responseType: 'blob' in the options, so you don't need to create the Blob object yourself:
```
axios.get('https://picsum.photos/300/300', { responseType: 'blob' })
  .then(response => {
    let imageNode = document.getElementById('image');
    let imgUrl = URL.createObjectURL(response.data);
    imageNode.src = imgUrl;
  });
```
Jsfiddle demo
Here is the axios issue about missing documentation for binary data manipulation. In this comment you can find the same approach that I suggest. You can also use the FileReader API; in this Stack Overflow comment you can find pros and cons for both approaches.
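For reference, a minimal sketch of the FileReader route mentioned above (assuming the same responseType: 'blob' request and an img element with id "image") could look like this; it produces a base64 data URL instead of an object URL:
```
// Sketch: FileReader alternative - read the blob into a data URL and
// assign it to an <img> element (the element id is an assumption).
axios.get('https://picsum.photos/300/300', { responseType: 'blob' })
  .then(response => {
    const reader = new FileReader();
    reader.onload = () => {
      document.getElementById('image').src = reader.result; // "data:image/...;base64,..."
    };
    reader.readAsDataURL(response.data);
  });
```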

Just try this:
```
return axios.get(<URL WHICH RETURNS IMAGE IN STRING ex:'http://host/image.png'>, { responseType: 'arraybuffer' })
  .then((response) => {
    let image = btoa(
      new Uint8Array(response.data)
        .reduce((data, byte) => data + String.fromCharCode(byte), '')
    );
    return `data:${response.headers['content-type'].toLowerCase()};base64,${image}`;
  });
```
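As a usage note, the promise above resolves with a data URL that can be assigned straight to an img tag; a small sketch (the wrapper name fetchImageDataUrl and the element id are hypothetical):
```
// Sketch: fetchImageDataUrl is a hypothetical wrapper around the call above.
fetchImageDataUrl().then(dataUrl => {
  document.getElementById('image').src = dataUrl;
});
```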

Related

How can I send a list of videos from Node.js to React.js?

In my Node backend, I have an endpoint where I get a few videos from where they are stored in Google Cloud, then return these videos as the response. Here is my code for that:
app.get("/getGalleryVideos", async (req, res) => {
const [files] = await bucket.getFiles(bucketName, { prefix: req.query.user });
const fileNames = files.map((file) => file.name);
const sumFiles = [];
for (let i = 0; i < fileNames.length; i++) {
await bucket
.file(fileNames[i])
.download()
.then((data) => {
sumFiles.push(data);
});
}
res.json(sumFiles);
});
Then, when I receive these videos as the response in the React frontend, I want to convert them into a format that can be used as the source for HTML video tags. For now I am just trying to get one video to work, so when I receive the response as a list, I take the first item and work with that. Here is my code for that:
```
function getGalleryVideos() {
  var user = "undefined";
  if (typeof Cookies.get("loggedInUser") != "undefined") {
    user = Cookies.get("loggedInUser");
  }
  axios({
    method: "get",
    url: `http://localhost:3001/getGalleryVideos`,
    params: {
      user: user,
    },
    timeout: 100000,
  }).then((res) => {
    console.log("res");
    console.log(res);
    console.log("res.data");
    console.log(res.data);
    console.log("res.data[1]");
    console.log(res.data[1]);
    console.log("res.data[1][0]");
    console.log(res.data[1][0]);
    console.log("res.data[1][0].data");
    console.log(res.data[1][0].data);
    const vid = new Blob(res.data[1][0].data, { type: "video/mp4" });
    console.log("vid");
    console.log(vid);
    console.log("url");
    console.log(URL.createObjectURL(vid));
    setVideo(URL.createObjectURL(vid));
  });
}
```
This is what I receive in the React frontend console when I make this request (screenshot omitted). But if I navigate to the URL that I printed in the console, the video isn't there.
Now I'm not sure why this is happening, but one thing I have noticed is that my console prints show the size of the buffer before I convert it to a Blob is 773491, while after I convert it the Blob's size is 1987122, which seems quite far off.
Please help explain how I can fix this, or provide a different way I can send these videos over so that I can display them.

How to upload a file into Firebase Storage from a callable https cloud function

I have been trying to upload a file to Firebase Storage using a callable Firebase Cloud Function.
All I am doing is fetching an image from a URL using axios and trying to upload it to Storage.
The problem I am facing is that I don't know how to save the response from axios and upload it to Storage.
First, how do I save the received file in the temp directory that os.tmpdir() creates?
Then, how do I upload it into Storage?
Here I am receiving the data as an arraybuffer, then converting it to a Blob and trying to upload it.
Here is my code. I think I am missing a major part.
If there is a better way, please recommend it. I've been looking through a lot of documentation and ended up with no clear solution. Please guide. Thanks in advance.
```
const bucket = admin.storage().bucket();
const path = require('path');
const os = require('os');
const fs = require('fs');

module.exports = functions.https.onCall((data, context) => {
  try {
    return new Promise((resolve, reject) => {
      const {
        imageFiles,
        companyPIN,
        projectId
      } = data;
      const filename = imageFiles[0].replace(/^.*[\\\/]/, '');
      const filePath = `ProjectPlans/${companyPIN}/${projectId}/images/${filename}`; // Path I am trying to upload to in Firebase Storage
      const tempFilePath = path.join(os.tmpdir(), filename);
      const metadata = {
        contentType: 'application/image'
      };
      axios
        .get(imageFiles[0], { // URL for the image
          responseType: 'arraybuffer',
          headers: {
            accept: 'application/image'
          }
        })
        .then(response => {
          console.log(response);
          const blobObj = new Blob([response.data], {
            type: 'application/image'
          });
          return blobObj;
        })
        .then(async blobObj => {
          return bucket.upload(blobObj, {
            destination: tempFilePath // Here I am wrong... how do I set the path of the downloaded blob file?
          });
        }).then(buffer => {
          resolve({ result: 'success' });
        })
        .catch(ex => {
          console.error(ex);
        });
    });
  } catch (error) {
    // unknown: 500 Internal Server Error
    throw new functions.https.HttpsError('unknown', 'Unknown error occurred. Contact the administrator.');
  }
});
```
I'd take a slightly different approach and avoid using the local filesystem at all, since it's just tmpfs and will cost you memory that your function is already using to hold the buffer/blob. It's simpler to avoid it and write directly from that buffer to GCS, using the save method on the GCS File object.
Here's an example. I've simplified out a lot of your setup, and I am using an HTTP function instead of a callable. Likewise, I'm using a public Stack Overflow image and not your original URLs. In any case, you should be able to use this template and modify it back to what you need (e.g. change the prototype and remove the HTTP response, replacing it with the return value you need):
```
const functions = require('firebase-functions');
const axios = require('axios');
const admin = require('firebase-admin');
admin.initializeApp();

exports.doIt = functions.https.onRequest((request, response) => {
  const bucket = admin.storage().bucket();
  const IMAGE_URL = 'https://cdn.sstatic.net/Sites/stackoverflow/company/img/logos/so/so-logo.svg';
  const MIME_TYPE = 'image/svg+xml';
  return axios.get(IMAGE_URL, { // URL for the image
    responseType: 'arraybuffer',
    headers: {
      accept: MIME_TYPE
    }
  }).then(response => {
    console.log(response); // only to show we got the data, for debugging
    const destinationFile = bucket.file('my-stackoverflow-logo.svg');
    return destinationFile.save(response.data).then(() => { // note: defaults to resumable upload
      return destinationFile.setMetadata({ contentType: MIME_TYPE });
    });
  }).then(() => { response.send('ok'); })
    .catch((err) => { console.log(err); });
});
```
As a commenter noted, in the above example the axios request itself makes an external network access, and you will need to be on the Blaze or Flame plan for that. However, that alone doesn't appear to be your current problem.
Likewise, this also defaults to using a resumable upload, which the documentation does not recommend when you are doing large numbers of small (<10MB) files, as there is some overhead.
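If the resumable behavior is a concern for many small files, save() also accepts options; a minimal sketch (reusing destinationFile, response and MIME_TYPE from the example above, as an assumption) might be:
```
// Sketch: non-resumable upload; setting contentType through metadata here
// would also make the separate setMetadata() call above unnecessary.
return destinationFile.save(response.data, {
  resumable: false,
  metadata: { contentType: MIME_TYPE }
});
```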
You asked how this might be used to download multiple files. Here is one approach. First, let's assume you have a function that returns a promise that downloads a single file given its filename. I've abridged this from the above, but it's basically identical except for the change of IMAGE_URL to filename; note that it does not return a final result such as response.send(), and there's an implicit assumption that all the files have the same MIME_TYPE:
```
function downloadOneFile(filename) {
  const bucket = admin.storage().bucket();
  const MIME_TYPE = 'image/svg+xml';
  return axios.get(filename, ...)
    .then(response => {
      const destinationFile = ...
    });
}
```
Then you just need to iteratively build a promise chain from the list of files. Let's say they are in imageUrls. Once built, return the entire chain:
```
let finalPromise = Promise.resolve();
imageUrls.forEach((item) => { finalPromise = finalPromise.then(() => downloadOneFile(item)); });
// if needed, add a final .then() section for the actual function result
return finalPromise.catch((err) => { console.log(err) });
```
Note that you could also build an array of the promises and pass them to Promise.all(); that would likely be faster, as you would get some parallelism, but I wouldn't recommend it unless you are very sure all of the data will fit inside the memory of your function at once. Even with this approach, you need to make sure the downloads can all complete within your function's timeout.
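For completeness, a sketch of the parallel variant (same memory and timeout caveats apply):
```
// Sketch: download all files in parallel; only appropriate if all of the
// responses comfortably fit in the function's memory at once.
return Promise.all(imageUrls.map(downloadOneFile))
  .catch((err) => { console.log(err); });
```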

How can I send an image from Node.js to React.js?

I have the following setup, by which I send the image, from its URL, to be edited and sent back to be uploaded to S3. The problem I currently have is that the image arrives on S3 corrupted, and I am wondering if there's something in my code that's causing the issue.
Server side:
```
function convertImage(inputStream) {
  return gm(inputStream)
    .contrast(-2)
    .stream();
}

app.get('/resize/:imgDetails', (req, res, next) => {
  let params = req.params.imgDetails.split('&');
  let fileName = params[0]; console.log(fileName);
  let tileType = params[1]; console.log(tileType);
  res.set('Content-Type', 'image/jpeg');
  let url = `https://${process.env.Bucket}.s3.amazonaws.com/images/${tileType}/${fileName}`;
  convertImage(request.get(url)).pipe(res);
});
```
Client side:
```
axios.get('/resize/' + fileName + '&' + tileType)
  .then(res => {
    /** PUT FILE ON AWS **/
    var img = res;
    axios.post("/sign_s3_sized", {
      fileName: fileName,
      tileType: tileType,
      ContentType: 'image/jpeg'
    })
      .then(response => {
        var returnData = response.data.data.returnData;
        var signedRequest = returnData.signedRequest;
        var url = returnData.url;
        this.setState({ url: url });
        // Put the fileType in the headers for the upload
        var options = {
          headers: {
            'Content-Type': 'image/jpeg'
          }
        };
        axios.put(signedRequest, img, options)
          .then(result => {
            this.setState({ success: true });
          }).bind(this)
          .catch(error => {
            console.log("ERROR: " + JSON.stringify(error));
          });
      })
      .catch(error => {
        console.log(JSON.stringify(error));
      });
  })
  .catch(error => console.log(error));
```
Before going any further, I can assure you that uploading images via this setup, minus the convertImage() step, works; otherwise the image gets put on S3 corrupted.
Any pointers as to what the issue behind the image being corrupted is?
Is my understanding of streams here lacking perhaps? If so, what should I change?
Thank you!
EDIT 1:
I tried not running the image through the graphicsmagick API at all (request.get(url).pipe(res);) and the image is still corrupted.
EDIT 2:
I gave up at the end and just uploaded the file from Node.js straight to S3; it turned out to be better practice anyway.
So if your end goal is to upload the image to an S3 bucket using Node.js, there are simpler ways, for example the multer-s3 node module.
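For reference, a minimal multer-s3 sketch (assuming multer-s3 v2 with aws-sdk v2; the bucket name, route and key pattern are placeholders) might look like:
```
// Sketch: upload straight from an Express route to S3 via multer-s3.
const aws = require('aws-sdk');
const multer = require('multer');
const multerS3 = require('multer-s3');

const upload = multer({
  storage: multerS3({
    s3: new aws.S3(),
    bucket: process.env.Bucket,              // placeholder bucket name
    contentType: multerS3.AUTO_CONTENT_TYPE,
    key: (req, file, cb) => cb(null, `images/${Date.now()}-${file.originalname}`)
  })
});

app.post('/upload-sized', upload.single('image'), (req, res) => {
  res.json({ url: req.file.location });      // multer-s3 adds the S3 URL as `location`
});
```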

Streaming JSON data to React results in unexpected end of JSON input

I'm trying to stream a lot of data from a NodeJS server that fetches the data from Mongo and sends it to React. Since it's quite a lot of data, I've decided to stream it from the server and display it in React as soon as it comes in. Here's a slightly simplified version of what I've got on the server:
```
const getQuery = async (req, res) => {
  const { body } = req;
  const query = mongoQueries.buildFindQuery(body);
  res.set({ 'Content-Type': 'application/octet-stream' });
  Log.find(query).cursor()
    .on('data', (doc) => {
      console.log(doc);
      const data = JSON.stringify(doc);
      res.write(`${data}\r\n`);
    })
    .on('end', () => {
      console.log('Data retrieved.');
      res.end();
    });
};
```
Here's the React part:
```
fetch(url, { // this fetch fires the getQuery function on the backend
  method: "POST",
  body: JSON.stringify(object),
  headers: {
    "Content-Type": "application/json",
  }
})
  .then(response => {
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    const pump = () =>
      reader.read().then(({ done, value }) => {
        if (done) return this.postEndHandler();
        console.log(value.length); // !!!
        const decoded = decoder.decode(value);
        this.display(decoded);
        return pump();
      });
    return pump();
  })
  .catch(err => {
    console.error(err);
    toast.error(err.message);
  });
}

display(chunk) {
  const { data } = this.state;
  try {
    const parsedChunk = chunk.split('\r\n').slice(0, -1);
    parsedChunk.forEach(e => data.push(JSON.parse(e)));
    return this.setState({ data });
  } catch (err) {
    throw err;
  }
}
```
It's 50/50 whether it completes with no issues or fails on React's side of things. When it fails, it's always because of an incomplete JSON object in parsedChunk.forEach. I did some digging, and it turns out that every time it fails, the console.log that I marked with three exclamation marks shows 65536. I'm 100% certain it's got something to do with my streams implementation and that I'm not queuing the chunks correctly, but I'm not sure whether I should be fixing it client-side or server-side. Any help would be greatly appreciated.
Instead of implementing your own NDJSON-like streaming JSON protocol, which is basically what you are doing here (with all the pitfalls of dividing the stream into chunks and packets, which is not always under your control), you can take a look at some of the existing tools created to do what you need (see also the sketch after this list), e.g.:
http://oboejs.com/
http://ndjson.org/
https://www.npmjs.com/package/stream-json
https://www.npmjs.com/package/JSONStream
https://www.npmjs.com/package/clarinet
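For illustration, here is a library-free sketch of the usual client-side fix: carry the trailing partial line over to the next chunk so JSON.parse only ever sees complete objects (the parseChunk name is made up; it would replace the splitting logic in display above):
```
// Sketch: buffer incomplete NDJSON lines between chunks.
let leftover = '';

function parseChunk(chunk) {
  const lines = (leftover + chunk).split('\r\n');
  leftover = lines.pop();                    // the last piece may be cut mid-object
  return lines.filter(Boolean).map(line => JSON.parse(line));
}

// usage inside the reader loop:
//   parseChunk(decoder.decode(value, { stream: true }))
//     .forEach(doc => data.push(doc));
```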

how to load an image from url into buffer in nodejs

I am new to Node.js and am trying to set up a server where I get the EXIF information from an image. My images are on S3, so I want to be able to just pass in the S3 URL as a parameter and grab the image from it.
I am using the ExifImage project below to get the EXIF info, and according to their documentation:
"Instead of providing a filename of an image in your filesystem you can also pass a Buffer to ExifImage."
How can I load an image into a buffer in Node from a URL so I can pass it to the ExifImage function?
ExifImage Project:
https://github.com/gomfunkel/node-exif
Thanks for your help!
Try setting up request like this:
```
var request = require('request').defaults({ encoding: null });

request.get(s3Url, function (err, res, body) {
  // process exif here
});
```
Setting encoding to null will cause request to output a buffer instead of a string.
Use axios:
```
const response = await axios.get(url, { responseType: 'arraybuffer' })
const buffer = Buffer.from(response.data, "utf-8")
```
```
import fetch from "node-fetch";

let fimg = await fetch(image.src)
let fimgb = Buffer.from(await fimg.arrayBuffer())
```
I was able to solve this only after reading that encoding: null is required and providing it as a parameter to request.
This will download the image from the URL and produce a buffer with the image data.
Using the request library:
```
const request = require('request');

let url = 'http://website.com/image.png';
request({ url, encoding: null }, (err, resp, buffer) => {
  // Use the buffer
  // buffer contains the image data
  // typeof buffer === 'object'
});
```
Note: omitting encoding: null will result in an unusable string and not in a buffer. Buffer.from won't work correctly either.
This was tested with Node 8
Use the request library.
```
request('<s3imageurl>', function(err, response, buffer) {
  // Do something
});
```
Also, node-image-headers might be of interest to you. It sounds like it takes a stream, so it might not even have to download the full image from S3 in order to process the headers.
Updated with correct callback signature.
Here's a solution that uses the native https library.
```
import { get } from "https";

function urlToBuffer(url: string): Promise<Buffer> {
  return new Promise((resolve, reject) => {
    const data: Uint8Array[] = [];
    get(url, (res) => {
      res
        .on("data", (chunk: Uint8Array) => {
          data.push(chunk);
        })
        .on("end", () => {
          resolve(Buffer.concat(data));
        })
        .on("error", (err) => {
          reject(err);
        });
    });
  });
}

const imageUrl = "https://i.imgur.com/8k7e1Hm.png";
const imageBuffer = await urlToBuffer(imageUrl);
```
Feel free to delete the types if you're looking for javascript.
I prefer this approach because it doesn't rely on 3rd party libraries or the deprecated request library.
request is deprecated and should be avoided if possible.
Good alternatives include got (only for Node.js) and axios (which also supports browsers).
Example of got:
```
npm install got
```
Using the async/await syntax:
```
const got = require('got');

const url = 'https://www.google.com/images/branding/googlelogo/2x/googlelogo_color_272x92dp.png';

(async () => {
  try {
    const response = await got(url, { responseType: 'buffer' });
    const buffer = response.body;
  } catch (error) {
    console.log(error.body);
  }
})();
```
You can do it this way:
```
import axios from "axios";

async function getFileContentById(
  download_url: string
): Promise<Buffer> {
  const response = await axios.get(download_url, {
    responseType: "arraybuffer",
  });
  return Buffer.from(response.data, "base64");
}
```
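Tying this back to the original node-exif question: since the project's README says a Buffer can be passed instead of a filename, a sketch of the full flow (the function name here is illustrative) might be:
```
// Sketch: fetch the image into a Buffer, then hand it to node-exif.
const axios = require('axios');
const ExifImage = require('exif').ExifImage;

async function readExifFromUrl(url) {
  const response = await axios.get(url, { responseType: 'arraybuffer' });
  const buffer = Buffer.from(response.data);
  return new Promise((resolve, reject) => {
    new ExifImage({ image: buffer }, (error, exifData) => {
      if (error) reject(error);
      else resolve(exifData);
    });
  });
}
```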
