How to build an object tree with recursive API calls?

I want to construct a tree where each node is used in an API call to get its child nodes, starting at the root. This is done recursively until it reaches the TREE_DEPTH_LIMIT.
export const search = async (searchTerm) => {
try {
const tree = {};
await createTree(searchTerm, tree);
return tree;
} catch (err) {}
};
const TREE_DEPTH_LIMIT = 3;
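// NB: counter is a global (module-level) variable, presumably declared elsewhere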
const createTree = async (searchTerm, tree) => {
if (counter === TREE_DEPTH_LIMIT) {
counter = 0;
return;
}
counter++;
tree[searchTerm] = {};
try {
const res = await axiosInstance.get(
`/query?term=${searchTerm}`
);
// res.data.terms is an array of strings
res.data.terms.forEach((term) => {
createTree(term, tree[searchTerm]);
});
} catch (err) {}
};
I am trying to do this recursively in the createTree() function above. It uses the searchTerm in the API call, then loops through res.data.terms and calls createTree() on each of the terms. But the output is not what I was expecting.
This is the output:
const tree = {
apple: {
apple_tree: {},
tree: {},
},
};
The expected output (because TREE_DEPTH_LIMIT is 3, the tree should have 3 levels):
const tree = {
apple: {
apple_tree: {
leaf: {},
},
tree: {
trunk: {},
},
},
};
Or is my solution completely incorrect and I should go for another approach?

Some issues:
counter seems to be a global variable, but that will not work out well, as at each return from the recursion counter should have its value restored. It is better to use a local variable, so that every execution context has its own version of it. Even better is to make it a parameter and let it count down instead of up.
The recursive call is not awaited, so in search the promise returned by createTree may resolve before all the children have been populated, and so you would work with an incomplete tree.
Not a real problem, but it is not the most elegant design that the caller must create a tree object and pass it as an argument. I would redesign the functions so that search creates that object itself and then uses a recursive function to create the children -- so I'd name that function createChildren instead of createTree.
Here is a snippet that first mocks the get method so you can actually run it:
// Mock for this demo
const axiosInstance = {
  async get(term) {
    const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
    await delay(50);
    return {
      data: {
        terms: {
          "apple": ["apple_tree", "tree"],
          "apple_tree": ["leaf"],
          "leaf": [],
          "tree": ["trunk"],
          "trunk": []
        }[term.split("=")[1]]
      }
    };
  }
};
const createChildren = async (searchTerm, depth) => {
if (depth-- <= 0) return {};
try {
const res = await axiosInstance.get(`/query?term=${searchTerm}`);
const promises = res.data.terms.map(async (term) =>
[term, await createChildren(term, depth)]
);
return Object.fromEntries(await Promise.all(promises));
} catch (err) {
console.log(err);
}
};
const TREE_DEPTH_LIMIT = 3;
const search = async (searchTerm, depth=TREE_DEPTH_LIMIT) =>
({[searchTerm]: await createChildren(searchTerm, depth)});
// Demo
search("apple").then(tree =>
console.log(tree)
);
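With the mocked responses above, the demo logs the expected three-level tree: { apple: { apple_tree: { leaf: {} }, tree: { trunk: {} } } }.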

Related

JS/React - How should I use a fetch call inside of a for loop?

I'm still a little unfamiliar with asynchronous functions beyond their simplest forms, so I'm hoping someone here can help me out.
Some information:
getCollection() returns an array of objects that look like this.
{
"id": 1,
"userId": 3,
"gameId": 3498,
"review": "Testing review for game 1"
},
getGameById() takes in an integer (the id of a game) and returns that game object from an external API.
gameArray should be getting filled with game objects whose IDs in the external API match the IDs from the array of objects provided by getCollection.
const [games, setGames] = useState([])
const getGames = () => {
getCollection().then(userGames => {
let gameArray = []
for (const eachObj of userGames) {
if (eachObj.userId === currentUserId) {
getGameById(eachObj.gameId).then(game => {
gameArray.push(game)
})
}
}
Promise.all(gameArray).then(() => {
console.log(gameArray) //logs empty array, but shouldn't be empty
setGames(gameArray)
})
})
}
I have never used Promise.all before; it's just something I saw as a possible solution to this issue I'm having.
Promise.all takes an array of promises.
First you must build the array of promises, then call Promise.all with this array to retrieve all the games:
function getGames() {
getCollection().then(userGames => {
const gamePromises = userGames
.filter(userGame => userGame.userId == currentUserId)
.map(userGame => getGameById(userGame.gameId));
Promise.all(gamePromises).then(games => {
console.log(games);
setGames(games)
});
})
}
Here is another solution using an async function, which is maybe more readable:
async function getGames() {
const userGames = await getCollection();
const currentUserGames = userGames.filter(({userId}) => userId == currentUserId);
const games = await Promise.all(currentUserGames.map(({gameId}) => getGameById(gameId)));
setGames(games);
}
The array that you pass to Promise.all needs to contain the promises, but you're pushing the game objects - and you're doing it asynchronously, so the array is still empty when you pass it to Promise.all.
const getGames = () => {
getCollection().then(userGames => {
let gamePromises = []
for (const eachObj of userGames) {
if (eachObj.userId === currentUserId) {
gamePromises.push(getGameById(eachObj.gameId))
// ^^^^^^^^^^^^^^^^^^ ^
}
}
return Promise.all(gamePromises)
// ^^^^^^ chaining
}).then(gameArray => {
// ^^^^^^^^^
console.log(gameArray)
setGames(gameArray)
})
}
To simplify:
async function getGames() {
const userGames = await getCollection()
const gamePromises = userGames
.filter(eachObj => eachObj.userId === currentUserId)
.map(eachObj => getGameById(eachObj.gameId))
const gameArray = await Promise.all(gamePromises)
console.log(gameArray)
setGames(gameArray)
}

How to "multicast" an async iterable?

Can an async generator be somehow broadcast or multicast, so that all its iterators ("consumers"? subscribers?) receive all values?
Consider this example:
const fetchMock = () => "Example. Imagine real fetch";
async function* gen() {
for (let i = 1; i <= 6; i++) {
const res = await fetchMock();
yield res.slice(0, 2) + i;
}
}
const ait = gen();
(async() => {
// first "consumer"
for await (const e of ait) console.log('e', e);
})();
(async() => {
// second...
for await (const é of ait) console.log('é', é);
})();
Iterations "consume" a value, so only one or the other gets it.
I would like for both of them (and any later ones) to get every yielded value, if such a generator is possible to create somehow. (Similar to an Observable.)
This is not easily possible. You will need to explicitly tee it. This is similar to the situation for synchronous iterators, just a bit more complicated:
const AsyncIteratorProto = Object.getPrototypeOf(Object.getPrototypeOf(async function*(){}.prototype));
function teeAsync(iterable) {
const iterator = iterable[Symbol.asyncIterator]();
const buffers = [[], []];
function makeIterator(buffer, i) {
return Object.assign(Object.create(AsyncIteratorProto), {
next() {
if (!buffer) return Promise.resolve({done: true, value: undefined});
if (buffer.length) return buffer.shift();
const res = iterator.next();
if (buffers[i^1]) buffers[i^1].push(res);
return res;
},
async return() {
if (buffer) {
buffer = buffers[i] = null;
if (!buffers[i^1]) await iterator.return();
}
return {done: true, value: undefined};
},
});
}
return buffers.map(makeIterator);
}
You should ensure that both iterators are consumed at about the same rate so that the buffer doesn't grow too large.
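For example, applied to the gen() from the question, both consumers now receive every value (a minimal sketch; the interleaving of the logs may vary):
const [it1, it2] = teeAsync(gen());
(async () => {
  // first consumer reads from its own fork
  for await (const e of it1) console.log('e', e);
})();
(async () => {
  // second consumer reads from the other fork
  for await (const e of it2) console.log('é', e);
})();
// both loops log all six values ("Ex1" … "Ex6")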
Here's a solution using Highland as an intermediary. From the docs:
A stream forked to multiple consumers will pull values, one at a time, from its source as only fast as the slowest consumer can handle them.
import _ from 'lodash'
import H from 'highland'
export function fork<T>(generator: AsyncGenerator<T>): [
AsyncGenerator<T>,
AsyncGenerator<T>
] {
const source = asyncGeneratorToHighlandStream(generator).map(x => _.cloneDeep(x));
return [
highlandStreamToAsyncGenerator<T>(source.fork()),
highlandStreamToAsyncGenerator<T>(source.fork()),
];
}
async function* highlandStreamToAsyncGenerator<T>(
stream: Highland.Stream<T>
): AsyncGenerator<T> {
for await (const row of stream.toNodeStream({ objectMode: true })) {
yield row as unknown as T;
}
}
function asyncGeneratorToHighlandStream<T>(
generator: AsyncGenerator<T>
): Highland.Stream<T> {
return H(async (push, next) => {
try {
const result = await generator.next();
if (result.done) return push(null, H.nil);
push(null, result.value);
next();
} catch (error) {
return push(error);
}
});
}
Usage:
const [copy1, copy2] = fork(source);
Works in Node, browser untested.
I built a library to do this here: https://github.com/tjenkinson/forkable-iterator
Means you can do something like:
import { buildForkableIterator, fork } from 'forkable-iterator';
function* Source() {
yield 1;
yield 2;
return 'return';
}
const forkableIterator = buildForkableIterator(Source());
console.log(forkableIterator.next()); // { value: 1, done: false }
const child1 = fork(forkableIterator);
// { value: 2, done: false }
console.log(child1.next());
// { value: 2, done: false }
console.log(forkableIterator.next());
// { value: 'return', done: true }
console.log(child1.next());
// { value: 'return', done: true }
console.log(forkableIterator.next());
If you no longer need to keep consuming from a fork, then provided you lose references to it, there also shouldn't be a memory leak.

Why can't I use nested yield in for..in loop in Redux-Saga

So I have a tasks object with ids and values. With a for...in loop I want to read the 'members' property. If it exists, and the first element of the array is not 'all', I want to make a request to firebase to read the document with the given id. But I can't use yield there; I get this error:
Failed to compile.
./src/containers/modules/Tasks/store/sagas/tasksSagas.js
Line 27: Parsing error: Unexpected reserved word 'yield'
Why am I getting it? Can't I use nested yields?
//Fetching users for tasks
for (const task in tasks) {
if (tasks[task].hasOwnProperty('members')) {
if (tasks[task].members[0]==='all'){
tasks[task].fetchedMembers = ['all']
}
else {
const fetchedMembers = tasks[task].members.map(member => {
const user = yield db.collection('users').doc(member).get()
const userData = user.data()
return {
uid: member,
...userData
}
})
tasks[task].fetchedMembers = fetchedMembers
}
}
}
const fetchedMembers = tasks[task].members.map(member => {
const user = yield db.collection('users').doc(member).get()
const userData = user.data()
return {
uid: member,
...userData
}
})
The function you pass to .map is not a generator function, so you can't use yield in it. And it can't be a generator function, since map knows nothing about generators or sagas.
One option is to create an array of promises, and then yield that (either wrapped in Promise.all, or redux-saga's all effect):
const promises = tasks[task].members.map(member => {
return db.collection('users').doc(member).get();
})
const members = yield all(promises);
const fetchedMembers = members.map((user, i) => {
  // pair each user snapshot with the member id it was fetched for
  const userData = user.data();
  return {
    uid: tasks[task].members[i],
    ...userData
  };
});
Another option is to write a mini-saga for what you want to do for each member, then create an array of call effects and yield all of those:
const callEffects = tasks[task].members.map(member => {
return call(function* () {
const user = yield db.collection('users').doc(member).get()
const userData = user.data()
return {
uid: member,
...userData
}
});
})
const fetchedMembers = yield all(callEffects);
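In both cases the all effect runs the effects in parallel and resumes the saga once every one of them has resolved, analogous to Promise.all.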

Wrap a resultset callback function with a generator/iterator

I'm working on converting a legacy callback-based API into an async library. But I just can't wrap my head around getting a "resultset" to work as a generator (Node 10.x).
The original API works like this:
api.prepare((err, rs) => {
rs.fetchRows(
(err, row) => {
// this callback is called as many times as rows exist
console.log("here's a row:", row);
},
() => {
console.log("we're done, data exausted");
}
);
});
But here is how I want to use it:
const wrapped = new ApiWrapper(api);
const rs = await wrapped.prepare({});
for (let row of rs.rows()) {
console.log("here's a row:", row);
}
let row;
while(row = await rs.next()) {
console.log("here's a row:", row);
}
I thought I had it under control with generators, but it looks like you cannot use yield inside a callback. It actually seems logical if you think about it.
class ApiWrapper {
constructor(api) {
this.api = api;
}
prepare() {
return new Promise((resolve, reject) => {
this.api.prepare((err, rs) => {
if (err) {
reject(err);
} else {
resolve(rs);
}
});
});
}
*rows() {
this.api.fetchRows((err, row) => {
if (err) {
throw err;
} else {
yield row; // nope, not allowed here
}
});
}
next() { ... }
}
So what alternatives do I have?
Important: I don't want to store anything in an array then iterate that, we're talking giga-loads of row data here.
Edit
I'm able to simulate the behavior I want using stream.Readable but it warns me that it's an experimental feature. Here's a simplified array-based version of the issue I'm trying to solve using stream:
const stream = require('stream');
function gen(){
const s = new stream.Readable({
objectMode: true,
read(){
[11, 22, 33].forEach(row => {
this.push({ value: row });
});
this.push(null)
}
});
return s;
}
for await (let row of gen()) {
console.log(row);
}
// { value: 11 }
// { value: 22 }
// { value: 33 }
(node:97157) ExperimentalWarning: Readable[Symbol.asyncIterator] is an experimental feature. This feature could change at any time
I finally realized I needed something similar to Go's channels that were async/await compatible. Basically the answer is to synchronize an async iterator and a callback, making them wait for each other as next() iterations are consumed.
The best (Node) native solution I found was to use a stream as an iterator, which is supported in Node 10.x but tagged experimental. I also tried to implement it with the p-defer NPM module, but that turned out to be more involved than I expected. Finally I ran across the https://www.npmjs.com/package/@nodeguy/channel module, which was exactly what I needed:
const Channel = require('@nodeguy/channel');
class ApiWrapper {
// ...
rows() {
const channel = new Channel();
const iter = {
[Symbol.asyncIterator]() {
return this;
},
async next() {
const val = await channel.shift();
if (val === undefined) {
return { done: true };
} else {
return { done: false, value: val };
}
}
};
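// NB: assumes this fetchRows variant returns a promise that resolves
// once all rows have been delivered (hence the .then to close the channel)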
this.api.fetchRows(async (err, row) => {
await channel.push(row);
}).then(() => channel.close());
return iter;
}
}
// then later
for await (let row of rs.rows()) {
console.log(row)
}
Note how the core of each iterating function, next() and the fetchRows callback in rows(), has an await that throttles how much data can be pushed across the channel; otherwise the producing callback could push data uncontrollably into the channel queue. The idea is that the callback should wait for data to be consumed by the iterator's next() before pushing more.
Here's a more self-contained example:
const Channel = require('@nodeguy/channel');
function iterating() {
const channel = Channel();
const iter = {
[Symbol.asyncIterator]() {
return this;
},
async next() {
console.log('next');
const val = await channel.shift();
if (val === undefined) {
return { done: true };
} else {
return { done: false, value: val };
}
}
};
[11, 22, 33].forEach(async it => {
await channel.push(it);
console.log('pushed', it);
});
console.log('returned');
return iter;
}
(async function main() {
for await (let it of iterating()) {
console.log('got', it);
}
})();
/*
returned
next
pushed 11
got 11
next
pushed 22
got 22
next
pushed 33
got 33
next
*/
Like I said, Streams and/or Promises can be used to implement this, but the Channel module hides some of the complexity and makes it more intuitive.
The original question has two nested callback-taking async functions:
api.prepare((err,res) => ...)
rs.fetchRows((err,res) => ...)
The first one runs the callback only once, so just promisifying it as follows is sufficient:
function promisified(f){
  return new Promise((v,x) => f((err, res) => err ? x(err) : v(res)));
}
However, the second function will invoke its callback multiple times, and we wish to generate an async iterable from it so that we can consume it in a for await...of loop.
This is also possible by employing async generators, as follows:
async function* rowEmitterGenerator(rs){
let _v, // previous resolve
_x, // previous reject
_row = new Promise((v,x) => (_v = v, _x = x));
rs.fetchRows((err, row) => ( err ? _x(err) : _v(row)
, _row = new Promise((v,x) => (_v = v, _x = x))
));
while(true){
try {
yield _row;
}
catch(e){
console.log(e);
}
}
}
Then, putting it all together in a top-level await context:
const rows = await promisified(api.prepare),
rowEmitter = rowEmitterGenerator(rows);
for await (let row of rowEmitter){
console.log(`Received row: ${row}`);
// do something with the row
}

ReactJS how to wait for all API calls to be ended in componentDidMount of simple component

I'm using the latest React in a very basic app which calls a 3rd-party service API that is, unfortunately, not well designed, in the following sense.
I have to execute one call which returns a list, then iterate over it and call another endpoint to get the data for each item, and that data in turn contains a new list for which I have to call a third API endpoint.
After I receive all the data I combine it into one items array and place it in state in the componentDidMount function, but this final step only works if I surround it with setTimeout.
Is there some elegant way to do that?
I'm using fetch and really plain React components, and I have my own simple service from which I call the API; here are some code parts...
items[tag].sensors = [];
API.getObjects(sessionData, userDetails, tag).then(links => {
Object.keys(links.link).forEach(link => {
API.getObjects(sessionData, userDetails, link).then(objLink => {
Object.keys(objLink.link).forEach(function (key) {
let obj = objLink.link[key];
if (obj && obj.type === 'sensor') {
API.getSensorNames(sessionData, key).then(response => {
const sensor = response.sensor;
// some sensor calculations....
items[tag].sensors.push(sensor);
});
}
});
});
});
});
// this part only works if it's surrounded with timeout
setTimeout(function() {
let processedItems = [];
for (var key in items) {
if (items.hasOwnProperty(key)) {
processedItems.push(items[key]);
}
}
self.setState({
items: processedItems
});
}, 1000);
Thanks in advance.
Simply, you can use a Promise to wait until you get the values from the API call; you can put your code in a function like this:
function prepareItems() {
items[tag].sensors = [];
return new Promise((resolve, reject) => {
API.getObjects(sessionData, userDetails, tag).then(links => {
Object.keys(links.link).forEach(link => {
API.getObjects(sessionData, userDetails, link).then(objLink => {
Object.keys(objLink.link).forEach(function(key) {
let obj = objLink.link[key];
if (obj && obj.type === "sensor") {
API.getSensorNames(sessionData, key).then(response => {
const sensor = response.sensor;
// some sensor calculations....
items[tag].sensors.push(sensor);
// calling resolve settles the promise
// and passes the result to the then function
resolve(items)
});
}
});
});
});
});
});
}
and use then to get the result from the prepareItems function after it resolves:
prepareItems().then(items => {
//Do what ever you want with prepared item
})
What about using the async/await operators?
These operators allow you to wait until the response is ready.
You can use this kind of helper function.
getItems = async (...) => {
  ...
  items[tag].sensors = []
  const links = await API.getObjects(sessionData, userDetails, tag)
  // map + Promise.all instead of forEach: forEach would not wait for the
  // async callbacks, so setState would run before the data arrived
  await Promise.all(Object.keys(links.link).map(async (link) => {
    const objLink = await API.getObjects(sessionData, userDetails, link)
    await Promise.all(Object.keys(objLink.link).map(async (key) => {
      const obj = objLink.link[key]
      if (obj && obj.type === 'sensor') {
        const response = await API.getSensorNames(sessionData, key)
        items[tag].sensors.push(response.sensor)
      }
    }))
  }))
  this.setState({ items })
}
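Because the Promise.all calls are awaited at each level, setState now runs only after every request has resolved, so the setTimeout workaround from the question is no longer needed.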
Also you can see this great documentation.
