Fetch API stream doesn't work on server but does locally - javascript

As I explained in a previous question, I have a FastAPI endpoint that returns a StreamingResponse, which is then consumed by a React application via fetch() and response.body.getReader().
The problem I'm facing appears when I open my React application, select the image(s) using Uppy, and send them to my FastAPI endpoint. Locally it works just fine and the images are returned as a stream response:
But when I deploy my application on Heroku or Render, the rendered response is all broken:
Adding more context to my previous question: I'm rendering the stream using an async generator:
async function* submit({ data }) {
  const formData = new FormData()
  data?.current?.files?.successful.map(image =>
    formData.append('images', image.data)
  )
  formData.append('language', data.current.language.code)
  try {
    const response = await fetch('endpoint', {
      method: 'POST',
      body: formData
    })
    const reader = response.body.getReader()
    while (true) {
      const { value, done } = await reader.read()
      if (done) break
      const base64 = `data:${response.headers.get(
        'content-type'
      )};base64,${btoa(String.fromCharCode(...new Uint8Array(value)))}`
      yield base64
    }
  } catch (error) {
    // ...
  }
}
That is called when the "Gallery" component in the screenshot is rendered:
const [images, setImages] = useState([])
useEffect(() => {
  ;(async () => {
    for await (const image of submit({ data })) {
      setImages(previous => [...previous, image])
    }
  })()
  // eslint-disable-next-line
}, [])
I was expecting to get the same result on the server once the application was deployed, but it just doesn't work as it should, and I'm not sure how to approach the problem. Any tips?
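For context on why the behaviour can differ between environments: each reader.read() resolves with whatever slice of the stream has arrived, not with one server-side write, so nothing guarantees that one read equals one image. Locally the chunks happen to line up, but Heroku's and Render's proxies can buffer and re-chunk the body. A minimal sketch of one way to make the framing explicit; it assumes (unlike the code above) that the FastAPI side is changed to send each image base64-encoded and newline-terminated:
// Sketch only: assumes the server writes one base64-encoded image per
// line ("<base64>\n") instead of raw bytes. Complete lines survive
// proxy re-chunking because the partial trailing line is buffered here.
async function* readImageLines(response) {
  const reader = response.body.getReader()
  const decoder = new TextDecoder()
  const contentType = response.headers.get('content-type')
  let buffered = ''
  while (true) {
    const { value, done } = await reader.read()
    if (done) break
    buffered += decoder.decode(value, { stream: true })
    const lines = buffered.split('\n')
    buffered = lines.pop() // keep the incomplete last line for the next read
    for (const line of lines) {
      if (line) yield `data:${contentType};base64,${line}`
    }
  }
}
The for await loop above would keep working unchanged against a generator like this.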

Related

Can't get json with axios.get and headers

I am trying to get the joke from https://icanhazdadjoke.com/. This is the code I used:
const getDadJoke = async () => {
  const res = await axios.get('https://icanhazdadjoke.com/', {
    headers: { Accept: 'application/json' }
  })
  console.log(res.data.joke)
}
getDadJoke()
I expected to get the joke but instead I got the full html page, as if I didn't specify the headers at all. What am I doing wrong?
If you look at the API documentation for icanhazdadjoke.com, there is a section titled "Custom user agent" explaining that they want all requests to carry a User-Agent header. If you use Axios in a browser context, the User-Agent is set for you by your browser. But I'm going to go out on a limb and say that you are running this code via Node, in which case you may need to set the User-Agent header manually, like so:
const getDadJoke = async () => {
  const res = await axios.get('https://icanhazdadjoke.com/', {
    headers: {
      'Accept': 'application/json',
      'User-Agent': 'my URL, email or whatever'
    }
  })
  console.log(res.data.joke)
}
getDadJoke()
The docs say what they want you to put for the User Agent, but I think it would honestly work if there were any User Agent field at all.
The HTML page you're getting is a 503 response from Cloudflare.
As per the API documentation
Custom user agent
If you intend on using the icanhazdadjoke.com API we kindly ask that you set a custom User-Agent header for all requests.
My guess is they have a Cloudflare Browser Integrity Check configured that's triggering for the default Node / Axios user-agent.
Setting a custom user-agent appears to get around this...
const getDadJoke = async () => {
  try {
    const res = await axios.get("https://icanhazdadjoke.com/", {
      headers: {
        accept: "application/json",
        "user-agent": "My Node and Axios app", // use something better than this
      },
    });
    console.log(res.data.joke);
  } catch (err) {
    console.error(err.response?.data, err.toJSON());
  }
};
Given how unreliable Axios releases have been since v1.0.0, I highly recommend you switch to something else. The Fetch API has been available natively in Node since v18:
const getDadJoke = async () => {
  try {
    const res = await fetch("https://icanhazdadjoke.com/", {
      headers: {
        accept: "application/json",
        "user-agent": "My Node and Fetch app", // use something better than this
      },
    });
    if (!res.ok) {
      const err = new Error(`${res.status} ${res.statusText}`);
      err.text = await res.text();
      throw err;
    }
    console.log((await res.json()).joke);
  } catch (err) {
    console.error(err, err.text);
  }
};
You can use Axios to make the REST API call, since the API responds in JSON format. If you use the API as documented at https://icanhazdadjoke.com/api#authentication, you can use Axios directly. Here is an example.
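A minimal sketch of such a call, reusing the Accept and User-Agent headers discussed in the answers above (the User-Agent value is a placeholder):
const axios = require('axios');

const getJoke = async () => {
  const res = await axios.get('https://icanhazdadjoke.com/', {
    headers: {
      Accept: 'application/json',
      'User-Agent': 'example-joke-app (contact@example.com)', // placeholder
    },
  });
  console.log(res.data.joke);
};

getJoke();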
Alternative method
Alternatively, you can use web scraping for this case, because https://icanhazdadjoke.com/ returns an HTML response by default.
This is an example of how to scrape it using the puppeteer library in Node.js.
Demo code
Save it as a get-joke.js file.
const puppeteer = require("puppeteer");

async function getJoke() {
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    await page.goto('https://icanhazdadjoke.com/');
    const joke = await page.evaluate(() => {
      // The joke text is in the first <p class="subtitle"> element
      const jokes = Array.from(document.querySelectorAll('p[class="subtitle"]'));
      return jokes[0].innerText;
    });
    return joke;
  } finally {
    await browser.close(); // close the browser even if scraping fails
  }
}

getJoke()
  .then((joke) => console.log(joke))
  .catch((error) => console.error(error));
Selector
The main idea is to use a DOM tree selector.
Chrome's DevTools (opened by pressing F12) shows the HTML DOM tree structure.
The <p> tag holding the joke has the class name subtitle:
document.querySelectorAll('p[class="subtitle"]')
Install the dependency and run it:
npm install puppeteer
node get-joke.js
Result
You should get the joke from the website.

How to correctly display images on React frontend that are returned by Laravel as StreamedResponse objects?

The idea is as follows:
Images/documents are stored privately on the server
A logged-in user on the frontend clicks a button which sends an axios request to the backend to get an aggregated result of ModelA from TableA and its associated attachment file list from TableB
For each ModelA, numerous requests are made to the endpoint to fetch images, which are returned as \Symfony\Component\HttpFoundation\StreamedResponse via Storage::download($request->file_name)
This works in the sense that files are returned.
Note - I tried attaching all the files to the response in step 2, but this didn't work, so I added the extra step of getting the file list first and fetching the individual files after that, based on the list. This might kill the webserver if the number of requests becomes too high, so I would appreciate any advice on a different approach.
The problem
How do I display the files in React, and is this the right approach at all, considering the potential performance issues noted above?
I've tried the following:
Create an octet-stream URL with FileReader, but these wouldn't display and all had the same URL, despite await being used on the reader.readAsDataURL(blob) call (more on this await below):
const { email, name, message, files } = props
const [previews, setPreviews] = useState<string[]>([])
const { attachments } = useAttachment(files)

useEffect(() => {
  const p = previews
  files && attachments?.forEach(async filename => {
    const reader = new FileReader()
    reader.onloadend = () => {
      p.push(reader.result as string)
      setPreviews(p)
    }
    const blob = new Blob([filename])
    await reader.readAsDataURL(blob)
  })
}, [files, attachments, previews])
Create src attributes with URL.createObjectURL() but these, although generated and unique, wouldn't display when used in an <img /> tag:
useEffect(() => {
  const p = previews
  files && attachments?.forEach(filename => {
    const blob = new Blob([filename])
    const src = URL.createObjectURL(blob)
    p.push(src)
    setPreviews(p)
  })
}, [files, attachments, previews])
Results example:
<img src="blob:http://127.0.0.1:8000/791f5efb-1b4e-4474-a4b6-d7b14b881c28" class="chakra-image css-0">
<img src="blob:http://127.0.0.1:8000/3d93449e-175d-49af-9a7e-61de3669817c" class="chakra-image css-0">
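A side note on the first attempt: FileReader.readAsDataURL() returns undefined, so awaiting it is a no-op, and the onloadend callbacks can fire in any order. To actually wait for each data URL, the reader has to be wrapped in a Promise; a small sketch of such a wrapper:
// Sketch: promisify FileReader so the resulting data URL can be awaited
const readAsDataURL = (blob: Blob) =>
  new Promise<string>((resolve, reject) => {
    const reader = new FileReader()
    reader.onload = () => resolve(reader.result as string)
    reader.onerror = () => reject(reader.error)
    reader.readAsDataURL(blob)
  })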
Here's the useAttachment hook:
import { useEffect, useState } from 'react'
import { api } from '#utils/useAxios'
const useAttachment = (files: any[] | undefined) => {
const [attachments, setAttachments] = useState<any[]>([])
const handleRequest = async (data: FormData) => {
await api().post('api/attachment', data).then(resp => {
const attach = attachments
attach.push(resp)
setAttachments(attach)
})
}
useEffect(() => {
if (files) {
files.forEach(async att => {
const formData = new FormData()
formData.append('file_name', att.file_name)
await handleRequest(formData)
})
}
}, [files, attachments])
return { attachments }
}
export default useAttachment
Try Storage::response(). This is the same as Storage::download(), except that it sets the Content-Disposition header to inline instead of attachment.
This tells the browser to display it instead of downloading it. See MDN Docs Here
Then you can use it as the src for an <img/>.
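Worth noting: in the attempts above, new Blob([filename]) wraps the resolved axios response object itself (which gets stringified), not the file's bytes, which is why the object URLs never showed an image. A minimal sketch of fetching a single image under the inline approach, reusing the api() helper from the question (the simplified shape of the call is an assumption):
// Sketch only: assumes the backend route now returns the image inline
// via Storage::response(). responseType: 'blob' yields the raw bytes,
// which can back an object URL for <img src={...} />.
const fetchPreview = async (fileName: string) => {
  const formData = new FormData()
  formData.append('file_name', fileName)
  const resp = await api().post('api/attachment', formData, {
    responseType: 'blob'
  })
  return URL.createObjectURL(resp.data)
}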
Solved it by sending the files in a single response, but encoded with base64encode(Storage::get('filename')). Then, on the frontend, it was as simple as:
const base64string = 'stringReturned'
<img src={`data:image/png;base64,${base64string}`} />

How to make call to external api from nodejs

Hi all, I have to develop a utility which makes calls to an external API with different parameters. For example, I have an array val with 100 values, val = ['we23', '22ww', 'gh22', ...n], and a URL, www.google.com. One by one I have to append a value from val to the URL (first API = www.google.com/we23, second API = www.google.com/22ww), make the external API hit, and then store the response in a database. What is the most efficient way to do this? Links to working examples would be helpful.
A very simple example express app using the Fetch API:
const express = require('express')
const fetch = require('node-fetch')
const app = express()

// This sets up a route to localhost:3000/random and goes off and hits
// cat-fact.herokuapp.com/facts/random
app.get('/:apiRoute', async (req, res) => {
  try {
    const { apiRoute } = req.params
    const apiResponse = await fetch(
      'https://cat-fact.herokuapp.com/facts/' + apiRoute
    )
    const apiResponseJson = await apiResponse.json()
    // await db.collection('collection').insertOne(apiResponseJson)
    console.log(apiResponseJson)
    res.send('Done – check console log')
  } catch (err) {
    console.log(err)
    res.status(500).send('Something went wrong')
  }
})

app.listen(3000, () => console.log(`Example app listening on port 3000!`))
Visit http://localhost:3000/random
With the following code you can make concurrent API calls within an endpoint using Node.js + Express:
const [
  LoMasNuevo, LoMasVisto, TeRecomendamos, Categorias,
] = await Promise.all([
  numerosController.getLoMasNuevo(),
  numerosController.getLoMasVisto(),
  numerosController.getRecomendaciones(),
  categoriasController.getCategorias(),
]);
Inside every get function you can make an axios request like this:
const params = {
  method: 'GET',
  url: 'https://development.api.yodlee.com/ysl/transactions',
  headers: {
    'Api-Version': '1.1',
    Authorization: `Bearer ${tokenuser}`,
  },
};
const data = await axios(params);
return data;
In 2022
In Node.js:
const fetch = (...args) =>
  import('node-fetch').then(({ default: fetch }) => fetch(...args));

app.get('/checkDobleAPI', async (req, res) => {
  try {
    const apiResponse = await fetch(
      'https://jsonplaceholder.typicode.com/posts'
    )
    const apiResponseJson = await apiResponse.json()
    console.log(apiResponseJson)
    res.send('Running 🏃')
  } catch (err) {
    console.log(err)
    res.status(500).send('Something went wrong')
  }
})
You can use Express to build an API around your idea, and then call the external API using the axios package. In addition, you can use Express's Router to define the routes that receive the request and send the response.
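A minimal sketch of that combination applied to the question's array (the base URL and the database step are stand-ins, not real endpoints):
const express = require('express')
const axios = require('axios')

const app = express()
const val = ['we23', '22ww', 'gh22'] // the asker's 100 values

app.get('/run', async (req, res) => {
  try {
    // fire all the requests concurrently and wait for every response
    const responses = await Promise.all(
      val.map(v => axios.get(`https://www.example.com/${v}`))
    )
    // await saveToDb(responses.map(r => r.data)) // hypothetical DB step
    res.json({ fetched: responses.length })
  } catch (err) {
    console.log(err)
    res.status(500).send('Something went wrong')
  }
})

app.listen(3000)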

How to upload a file into Firebase Storage from a callable https cloud function

I have been trying to upload a file to Firebase Storage using a callable Firebase Cloud Function.
All I am doing is fetching an image from a URL using axios and trying to upload it to Storage.
The problem I am facing is that I don't know how to save the response from axios and upload it to Storage.
First, how do I save the received file in the temp directory that os.tmpdir() creates?
Then, how do I upload it into Storage?
Here I am receiving the data as an arraybuffer, converting it to a Blob, and trying to upload that.
Here is my code. I think I am missing a major part.
If there is a better way, please recommend it. I've been looking through a lot of documentation and ended up with no clear solution. Please guide me. Thanks in advance.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const axios = require('axios');
const path = require('path');
const os = require('os');
const fs = require('fs');
admin.initializeApp();

const bucket = admin.storage().bucket();

module.exports = functions.https.onCall((data, context) => {
  try {
    return new Promise((resolve, reject) => {
      const {
        imageFiles,
        companyPIN,
        projectId
      } = data;
      const filename = imageFiles[0].replace(/^.*[\\\/]/, '');
      const filePath = `ProjectPlans/${companyPIN}/${projectId}/images/${filename}`; // Path I am trying to upload to in Firebase Storage
      const tempFilePath = path.join(os.tmpdir(), filename);
      const metadata = {
        contentType: 'application/image'
      };
      axios
        .get(imageFiles[0], { // URL for the image
          responseType: 'arraybuffer',
          headers: {
            accept: 'application/image'
          }
        })
        .then(response => {
          console.log(response);
          const blobObj = new Blob([response.data], {
            type: 'application/image'
          });
          return blobObj;
        })
        .then(async blobObj => {
          return bucket.upload(blobObj, {
            destination: tempFilePath // Here I am wrong... how to set the path of the downloaded blob file?
          });
        })
        .then(buffer => {
          resolve({ result: 'success' });
        })
        .catch(ex => {
          console.error(ex);
        });
    });
  } catch (error) {
    // unknown: 500 Internal Server Error
    throw new functions.https.HttpsError('unknown', 'Unknown error occurred. Contact the administrator.');
  }
});
I'd take a slightly different approach and avoid using the local filesystem at all. It's just tmpfs, so it costs you memory that your function is already using to hold the buffer/blob; it's simpler to skip it and write directly from that buffer to GCS, using the save method on the GCS File object.
Here's an example. I've simplified out a lot of your setup, and I am using an HTTP function instead of a callable. Likewise, I'm using a public Stack Overflow image and not your original URLs. In any case, you should be able to use this template and modify it back to what you need (e.g. change the prototype, and replace the HTTP response with the return value you need):
const functions = require('firebase-functions');
const axios = require('axios');
const admin = require('firebase-admin');
admin.initializeApp();

exports.doIt = functions.https.onRequest((request, response) => {
  const bucket = admin.storage().bucket();
  const IMAGE_URL = 'https://cdn.sstatic.net/Sites/stackoverflow/company/img/logos/so/so-logo.svg';
  const MIME_TYPE = 'image/svg+xml';
  return axios.get(IMAGE_URL, { // URL for the image
    responseType: 'arraybuffer',
    headers: {
      accept: MIME_TYPE
    }
  }).then(response => {
    console.log(response); // only to show we got the data, for debugging
    const destinationFile = bucket.file('my-stackoverflow-logo.svg');
    return destinationFile.save(response.data).then(() => { // note: defaults to resumable upload
      return destinationFile.setMetadata({ contentType: MIME_TYPE });
    });
  }).then(() => { response.send('ok'); })
    .catch((err) => { console.log(err); });
});
As a commenter noted, in the above example the axios request itself makes an external network access, and you will need to be on the Blaze or Flame plan for that. However, that alone doesn't appear to be your current problem.
Likewise, this also defaults to using a resumable upload, which the documentation does not recommend when you are doing large numbers of small (<10MB) files, as there is some overhead.
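If that overhead matters for your workload, save() accepts an options object with a resumable flag; a small sketch of the tweak to the example above:
// Sketch: skip the resumable protocol when uploading many small files
return destinationFile.save(response.data, { resumable: false }).then(() => {
  return destinationFile.setMetadata({ contentType: MIME_TYPE });
});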
You asked how this might be used to download multiple files. Here is one approach. First, let's assume you have a function that returns a promise that downloads a single file given its filename. (I've abridged this from the above, but it's basically identical except that IMAGE_URL changes to filename; note that it does not return a final result such as response.send(), and there's an implicit assumption that all the files share the same MIME_TYPE.)
function downloadOneFile(filename) {
  const bucket = admin.storage().bucket();
  const MIME_TYPE = 'image/svg+xml';
  return axios.get(filename, ...)
    .then(response => {
      const destinationFile = ...
    });
}
Then, you just need to iteratively build a promise chain from the list of files. Let's say they are in imageUrls. Once built, return the entire chain:
let finalPromise = Promise.resolve();
imageUrls.forEach((item) => {
  finalPromise = finalPromise.then(() => downloadOneFile(item));
});
// if needed, add a final .then() section for the actual function result
return finalPromise.catch((err) => { console.log(err); });
Note that you could also build an array of the promises and pass them to Promise.all(); that would likely be faster since you would get some parallelism, but I wouldn't recommend it unless you are very sure all of the data will fit inside the memory of your function at once. Even with this approach, you need to make sure the downloads can all complete within your function's timeout.
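For completeness, here is a sketch of that Promise.all() variant, under the same downloadOneFile() assumption:
// Sketch: all downloads run concurrently; only safe if every file fits
// in the function's memory at once and completes within its timeout.
return Promise.all(imageUrls.map(downloadOneFile))
  .then(() => response.send('ok'))
  .catch((err) => { console.log(err); });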

Streaming JSON data to React results in unexpected end of JSON input

I'm trying to stream a lot of data from a NodeJS server that fetches the data from Mongo and sends it to React. Since it's quite a lot of data, I've decided to stream it from the server and display it in React as soon as it comes in. Here's a slightly simplified version of what I've got on the server:
const getQuery = async (req, res) => {
  const { body } = req;
  const query = mongoQueries.buildFindQuery(body);
  res.set({ 'Content-Type': 'application/octet-stream' });
  Log.find(query).cursor()
    .on('data', (doc) => {
      console.log(doc);
      const data = JSON.stringify(doc);
      res.write(`${data}\r\n`);
    })
    .on('end', () => {
      console.log('Data retrieved.');
      res.end();
    });
};
Here's the React part:
fetch(url, { // this fetch fires the getQuery function on the backend
  method: "POST",
  body: JSON.stringify(object),
  headers: {
    "Content-Type": "application/json",
  }
})
  .then(response => {
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    const pump = () =>
      reader.read().then(({ done, value }) => {
        if (done) return this.postEndHandler();
        console.log(value.length); // !!!
        const decoded = decoder.decode(value);
        this.display(decoded);
        return pump();
      });
    return pump();
  })
  .catch(err => {
    console.error(err);
    toast.error(err.message);
  });
}

display(chunk) {
  const { data } = this.state;
  try {
    const parsedChunk = chunk.split('\r\n').slice(0, -1);
    parsedChunk.forEach(e => data.push(JSON.parse(e)));
    return this.setState({ data });
  } catch (err) {
    throw err;
  }
}
It's 50/50 whether it completes with no issues or fails on React's side of things. When it fails, it's always because of an incomplete JSON object in parsedChunk.forEach. I did some digging, and it turns out that every time it fails, the console.log I marked with three exclamation marks shows 65536. I'm 100% certain it's got something to do with my streams implementation and that I'm not queuing the chunks correctly, but I'm not sure whether I should be fixing it client side or server side. Any help would be greatly appreciated.
Instead of implementing your own NDJSON-like streaming JSON protocol, which is basically what you are doing here (with all the pitfalls of the stream being divided into chunks and packets in ways that are not always under your control), you can take a look at some of the existing tools created to do exactly this, e.g.:
http://oboejs.com/
http://ndjson.org/
https://www.npmjs.com/package/stream-json
https://www.npmjs.com/package/JSONStream
https://www.npmjs.com/package/clarinet
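For reference, the underlying failure mode is that each reader.read() resolves with at most one buffered chunk (the 65536 you're logging), so a JSON document can be split across reads; the libraries above all reassemble such fragments for you. A minimal hand-rolled sketch of the same idea, reworking the question's pump() so that a partial trailing line is carried over between reads (appendRow() is a hypothetical stand-in for pushing one parsed document into state):
const reader = response.body.getReader();
const decoder = new TextDecoder();
let remainder = ''; // partial line carried across reads

const pump = () =>
  reader.read().then(({ done, value }) => {
    if (done) {
      if (remainder.trim()) this.appendRow(JSON.parse(remainder)); // flush last line
      return this.postEndHandler();
    }
    // {stream: true} also keeps multi-byte characters that straddle
    // chunk boundaries from being mangled
    const text = remainder + decoder.decode(value, { stream: true });
    const lines = text.split('\r\n');
    remainder = lines.pop(); // the last element may be incomplete
    lines.filter(Boolean).forEach(line => this.appendRow(JSON.parse(line)));
    return pump();
  });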
