I have an async recursive function which returns a promise if there is more work to do, or the result array otherwise. When no recursion is involved it correctly returns the array, but when recursion happens the array is undefined. The code is:
function foo(filepath) {
  var resultArr = [];
  function doo(file) {
    return asyncOperation(file).then(resp => {
      resultArr.push(resp.data);
      if (resp.pages) {
        var pages = resp.pages.split(',');
        pages.forEach(page => {
          return doo(page);
        });
      } else {
        return resultArr;
      }
    });
  }
  return doo(filepath);
}
And the way this is called
foo(abcfile).then(function(result) {
  console.log(result);
});
If I pass abcfile which has no resp.pages, I get the result array, but when there are resp.pages, the result array is undefined.
I think you're just missing a returned promise within the if (resp.pages) block
if (resp.pages) {
  return Promise.all(resp.pages.split(',').map(page => doo(page)))
    .then(pagesArr => {
      return resultArr.concat(...pagesArr)
    })
}
I'm thinking there may be an issue with scoping resultArr outside the doo function, so maybe try this:
function foo(filepath) {
  function doo(file) {
    return asyncOperation(file).then(resp => {
      const resultArr = [ resp.data ]
      if (resp.pages) {
        return Promise.all(resp.pages.split(',').map(page => doo(page)))
          .then(pagesArr => resultArr.concat(...pagesArr))
      } else {
        return resultArr
      }
    })
  }
  return doo(filepath)
}
To explain the use of the spread-operator, look at it this way...
Say you have three pages for a file top: page1, page2 and page3, and each of those resolves with a couple of sub-pages. The pagesArr would look like
[
['page1', 'page1a', 'page1b'],
['page2', 'page2a', 'page2b'],
['page3', 'page3a', 'page3b']
]
and resultArr so far looks like
['top']
If you use concat without the spread operator, you'd end up with
[
"top",
[
"page1",
"page1a",
"page1b"
],
[
"page2",
"page2a",
"page2b"
],
[
"page3",
"page3a",
"page3b"
]
]
But with the spread, you get
[
"top",
"page1",
"page1a",
"page1b",
"page2",
"page2a",
"page2b",
"page3",
"page3a",
"page3b"
]
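If it helps, here is a quick sketch you can paste into a console to see the difference (the literal values below simply stand in for the real responses):
const resultArr = ['top']
const pagesArr = [['page1', 'page1a', 'page1b'], ['page2', 'page2a', 'page2b']]

// without the spread, the sub-arrays are appended as-is
console.log(resultArr.concat(pagesArr))
// ["top", ["page1", "page1a", "page1b"], ["page2", "page2a", "page2b"]]

// with the spread, each sub-array becomes its own concat argument, so it is flattened one level
console.log(resultArr.concat(...pagesArr))
// ["top", "page1", "page1a", "page1b", "page2", "page2a", "page2b"]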
To verify this works, I'll make a fake dataset, and a fakeAsyncOperation which reads data from the dataset asynchronously. To model your data closely, each query from the fake dataset returns a response with data and pages fields.
let fake = new Map([
['root', {data: 'root', pages: ['a', 'b', 'c', 'd']}],
['a', {data: 'a', pages: ['a/a', 'a/a']}],
['a/a', {data: 'a/a', pages: []}],
['a/b', {data: 'a/b', pages: ['a/b/a']}],
['a/b/a', {data: 'a/b/a', pages: []}],
['b', {data: 'b', pages: ['b/a']}],
['b/a', {data: 'b/a', pages: ['b/a/a']}],
['b/a/a', {data: 'b/a/a', pages: ['b/a/a/a']}],
['b/a/a/a', {data: 'b/a/a/a', pages: []}],
['c', {data: 'c', pages: ['c/a', 'c/b', 'c/c', 'c/d']}],
['c/a', {data: 'c/a', pages: []}],
['c/b', {data: 'c/b', pages: []}],
['c/c', {data: 'c/c', pages: []}],
['c/d', {data: 'c/d', pages: []}],
['d', {data: 'd', pages: []}]
]);
let fakeAsyncOperation = (page) => {
  return new Promise(resolve => {
    setTimeout(resolve, 100, fake.get(page))
  })
}
Next we have your foo function. I've renamed doo to enqueue because it works like a queue. It has two parameters: acc for keeping track of the accumulated data, and xs (destructured) which is the items in the queue.
I've used the new async/await syntax that makes it particularly nice for dealing with this. We don't have to manually construct any Promises or deal with any manual .then chaining.
I made liberal use of the spread syntax in the recursive call because I like its readability, but you could easily replace these with concat calls, acc.concat([data]) and xs.concat(pages), if you like that more – this is functional programming, so just pick an immutable operation you like and use that.
Lastly, unlike other answers that use Promise.all, this will process each page in series. If a page were to have 50 subpages, Promise.all would attempt to make 50 requests in parallel, and that may be undesired. Converting a program from parallel to serial is not necessarily straightforward, so that is the reason for providing this answer.
function foo (page) {
  async function enqueue (acc, [x, ...xs]) {
    if (x === undefined)
      return acc
    else {
      let {data, pages} = await fakeAsyncOperation(x)
      return enqueue([...acc, data], [...xs, ...pages])
    }
  }
  return enqueue([], [page])
}
foo('root').then(pages => console.log(pages))
Output
[ 'root',
'a',
'b',
'c',
'd',
'a/a',
'a/a',
'b/a',
'c/a',
'c/b',
'c/c',
'c/d',
'b/a/a',
'b/a/a/a' ]
Remarks
I'm happy that the foo function in my solution is not too far off from your original – I think you'll appreciate that. They both use an inner auxiliary function for looping and approach the problem in a similar way. async/await keeps the code nice and flat and highly readable (imo). Overall, I think this is an excellent solution for a somewhat complex problem.
Oh, and don't forget about circular references. There are no circular references in my dataset, but if page 'a' were to have pages: ['b'] and page 'b' had pages: ['a'], you can expect infinite recursion. Because this answer processes the pages serially, this would be very easy to fix (by checking the accumulated value acc for an existing page identifier). This is much trickier (and out of scope of this answer) to handle when processing the pages in parallel.
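For illustration only, here is a minimal sketch of that check added to enqueue. It works with the fake dataset above because each data value equals its page key; with real data you would track visited page identifiers separately:
async function enqueue (acc, [x, ...xs]) {
  if (x === undefined)
    return acc
  else if (acc.includes(x))      // already visited: skip it to avoid infinite recursion
    return enqueue(acc, xs)
  else {
    let {data, pages} = await fakeAsyncOperation(x)
    return enqueue([...acc, data], [...xs, ...pages])
  }
}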
The issue here is mixing async/sync operations in the if (resp.pages) branch. Basically you have to return a promise from the then callback if you want the promise chain to work as expected.
In addition to Phil's answer, if you want to execute pages in order/sequence
if (resp.pages) {
  var pages = resp.pages.split(',');
  return pages.reduce(function(p, page) {
    return p.then(function() {
      return doo(page);
    });
  }, Promise.resolve());
}
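Put together with the rest of your original function, that could look roughly like this (a sketch only, assuming asyncOperation as in the question):
function foo(filepath) {
  var resultArr = [];
  function doo(file) {
    return asyncOperation(file).then(resp => {
      resultArr.push(resp.data);
      if (resp.pages) {
        // chain the sub-page calls one after another, then resolve with the accumulated array
        return resp.pages.split(',').reduce(function (p, page) {
          return p.then(function () { return doo(page); });
        }, Promise.resolve()).then(function () { return resultArr; });
      }
      return resultArr;
    });
  }
  return doo(filepath);
}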
The issue is that you don't return anything when there are pages.
function foo(filepath) {
  var resultArr = [];
  function doo(file, promises) {
    let promise = asyncOperation(file).then(resp => {
      resultArr.push(resp.data);
      if (resp.pages) {
        var pages = resp.pages.split(',');
        var pagePromises = [];
        pages.forEach(page => {
          return doo(page, pagePromises);
        });
        // Return the pages
        return pagePromises;
      } else {
        return resultArr;
      }
    });
    // They want it added
    if (promises) { promises.push(promise); }
    // Just return normally
    else { return promise; }
  }
  return doo(filepath);
}
Related
This is a bit more of a theoretical question. I originally intended to call this question "Is it possible to iterate over a map twice", but just from the sound of it, it sounds like an anti-pattern. So I assume I'm just approaching this wrong.
Also note: the data here serves as an abstraction. I know that what I'm doing with the data provided here is unnecessary, but please don't get fixated too much on the data structure and what-not. It does not represent the real (more complex) data I'm working with, which furthermore is provided by the client and I can't alter. Instead, please approach the problem as how to return structured async calls for each array item. :-)
My problem boils down to this:
I have an array of ids on which I need to execute two separate asynchronous calls
Both of these calls need to pass (and for all id instances)
So as an example, imagine I have these two data-sets:
const dataMessages = [
{ id: "a", value: "hello" },
{ id: "b", value: "world" }
];
const dataNames = [
{ id: "a", name: "Samuel" },
{ id: "b", name: "John" },
{ id: "c", name: "Gustav"},
];
And an API-call mock-up:
const fetchFor = async (collection: Object[], id: string): Promise<Object> => {
  const user = collection.find(user => user.id === id);
  if (!user) {
    throw Error(`User ${id} not found`);
  }
  return user;
};
Now I need to call the fetchFor() function for both of the data-sets, presumably inside a Promise.all (given forEach is not asynchronous), from a predetermined array of ids.
I was thinking of something akin to mapping a list of Promises for Promise.all to execute. This works fine when you only need to map a single api call:
const run = async () => {
  const result = await Promise.all(
    ['a', 'b'].map(id => fetchFor(dataMessages, id))
  )
  console.log(result) // [{id: 'a', value: 'hello'}, {id: 'b', value: 'world'}]
}
However I somehow need to return both promises for the
fetchFor(dataMessages, id)
fetchFor(dataNames, id)
inside the Promise.all array of Promises.
I guess I could always simply do a flatMap of two maps for both instances of API calls, but that sounds kinda dumb, given
I'd be doing array.map on the same array twice
My data structure would not be logically connected (two separate array items for the same user, which would not even be followed by each other)
So ideally I'd like to return data in the form of
const result = await Promise.all([...])
console.log(result)
/* [
 *   {id: 'a', message: 'hello', name: 'Samuel'},
 *   {id: 'b', message: 'world', name: 'John'},
 * ]
 */
Or do I simply have to do a flatMap of promises and then merge the data into objects based on the id identifier inside a separate handler on the resolved Promise.all?
I've provided a working example of the single-api-call mockup here, so you don't have to copy-paste.
What would be the correct / common way of approaching such an issue?
You could nest Promise.all calls:
const [messages, names] = await Promise.all([
  Promise.all(
    ['a', 'b'].map(id => fetchFor(dataMessages, id))
  ),
  Promise.all(
    ['a', 'b', 'c'].map(id => fetchFor(dataNames, id))
  )
]);
If you then want to merge the results after they're retrieved, it's just a matter of standard data manipulation.
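For instance, a minimal sketch of such a merge, assuming the shapes from the question ({id, value} and {id, name}) and that every message id also appears in names:
const merged = messages.map(({ id, value }) => ({
  id,
  message: value,
  name: names.find(n => n.id === id).name
}));
console.log(merged);
// [{ id: 'a', message: 'hello', name: 'Samuel' }, { id: 'b', message: 'world', name: 'John' }]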
Consider that we have an 'api_fetchData' which fetches data from the server and, based on the route, returns either a table or a tree. We also have two pieces of state that we should update based on the received data, and I should note that if the data is a table we should sort its records by their priority. I wonder how I should do that in a functional programming way (for example using Ramda, Either, ...).
const tree = null;
and
const table = null;
// table
{
  type: "table",
  records: [
    { id: 1, priority: 15 },
    { id: 2, priority: 3 }
  ]
}
// tree
{
  type: "tree",
  children: [
    {
      type: "table",
      records: [
        { id: 1, priority: 15 },
        { id: 2, priority: 3 }
      ]
    }
  ]
}
This is what I did:
// define utils
const Right = x => ({
  map: f => Right(f(x)),
  fold: (f, g) => g(x),
  inspect: () => `Right(${x})`
});
const Left = x => ({
  map: f => Left(x),
  fold: (f, g) => f(x),
  inspect: () => `Left(${x})`
});
const fromNullable = x => (x != null ? Right(x) : Left(null));

const state = reactive({
  path: context.root.$route.path,
  tree: null,
  table: null
});
// final code:
const fetchData = () => {
  return fromNullable(mocked_firewall[state.path])
    .map(data =>
      data.type === "tree"
        ? (state.tree = data)
        : (state.table = R.mergeRight(data, {
            records: prioritySort(data.records)
          }))
    )
    .fold(
      () => console.log("there is no data based on selected route"),
      x => console.log(x)
    );
};
You have several problems here -
safely accessing deeply nested data on an uncertain object
sorting an array of complex data using functional techniques
safely and immutably updating deeply nested data on an uncertain object
1. Using .map and .chain is low-level and additional abstraction should be added when syntax becomes painful –
const state =
{ tree:
{ type: "sequoia"
, children: [ "a", "b", "c" ]
}
}
recSafeProp(state, [ "tree", "type" ])
.getOrElse("?") // "sequoia"
recSafeProp(state, [ "tree", "foobar" ])
.getOrElse("?") // "?"
recSafeProp(state, [ "tree", "children", 0 ])
.getOrElse("?") // "a"
recSafeProp(state, [ "tree", "children", 1 ])
.getOrElse("?") // "b"
recSafeProp(state, [ "tree", "children", 99 ])
.getOrElse("?") // "?"
We can implement recSafeProp easily, inventing our own convenience function, safeProp, along the way –
const { Nothing, fromNullable } =
require("data.maybe")
const recSafeProp = (o = {}, props = []) =>
props.reduce // for each props as p
( (r, p) => // safely lookup p on child
r.chain(child => safeProp(child, p))
, fromNullable(o) // init with Maybe o
)
const safeProp = (o = {}, p = "") =>
Object(o) === o // if o is an object
? fromNullable(o[p]) // wrap o[p] Maybe
: Nothing() // o is not an object, return Nothing
2. The Comparison module. You seem to have familiarity with functional modules so let's dive right in –
const { empty, map } =
Comparison
const prioritySort =
map(empty, record => record.priority || 0)
// or...
const prioritySort =
map(empty, ({ priority = 0}) => priority)
myarr.sort(prioritySort)
If you want immutable sort –
const isort = ([ ...copy ], compare = empty) =>
copy.sort(compare)
const newArr =
isort(origArr, proritySort)
Here's the Comparison module. It is introduced in this Q&A. Check it out there if you're interested to see how to make complex sorters in a functional way –
const Comparison =
{ empty: (a, b) =>
a < b ? -1
: a > b ? 1
: 0
, map: (m, f) =>
(a, b) => m(f(a), f(b))
, concat: (m, n) =>
(a, b) => Ordered.concat(m(a, b), n(a, b))
, reverse: (m) =>
(a, b) => m(b, a)
, nsort: (...m) =>
m.reduce(Comparison.concat, Comparison.empty)
}
const Ordered =
{ empty: 0
, concat: (a, b) =>
a === 0 ? b : a
}
Combining sorters –
const { empty, map, concat } =
Comparison
const sortByProp = (prop = "") =>
map(empty, (o = {}) => o[prop])
const sortByFullName =
concat
( sortByProp("lastName") // primary: sort by obj.lastName
, sortByProp("firstName") // secondary: sort by obj.firstName
)
data.sort(sortByFullName) // ...
Composable sorters –
const { ..., reverse } =
Comparison
// sort by `name` then reverse sort by `age` –
data.sort(concat(sortByName, reverse(sortByAge)))
Functional principles –
// this...
concat(reverse(sortByName), reverse(sortByAge))
// is the same as...
reverse(concat(sortByName, sortByAge))
As well as –
const { ..., nsort } =
Comparison
// this...
concat(sortByYear, concat(sortByMonth, sortByDay))
// is the same as...
concat(concat(sortByYear, sortByMonth), sortByDay)
// is the same as...
nsort(sortByYear, sortByMonth, sortByDay)
3. The lowest hanging fruit to reach for here is Immutable.js. If I find more time later, I'll show a lo-fi approach to the specific problem.
Your solution seems overkill to me.
Functional programming is about many things, and there is no concise definition, but if your main unit of work is the pure function and if you don't mutate data, you're well on the way to being a functional programmer. Using an Either monad has some powerful benefits at times. But this code looks more like an attempt to fit Either into a place where it makes little sense.
Below is one suggested way to write this code. But I've had to make a few assumptions along the way. First, as you discuss fetching data from a server, I'm assuming that this has to run in an asynchronous mode, using Promises, async-await or a nicer alternative like Tasks or Futures. I further assumed that where you mention mocked_firewall is where you are doing the actual asynchronous call. (Your sample code treated it as an object where you could look up results from a path; but I can't really make sense of that for mimicking a real service.) And one more assumption: the fold(() => log(...), x => log(x)) bit was nothing essential, only a demonstration that your code did what it was supposed to do.
With all that in mind, I wrote a version with an object mapping type to functions, each one taking data and state and returning a new state, and with the central function fetchData that takes something like your mocked_firewall (or really like my alteration of that to return a Promise) and returns a function that accepts a state and returns a new state.
It looks like this:
// State-handling functions
const handlers = {
tree: (data, state) => ({... state, tree: data}),
table: (data, state) => ({
... state,
table: {... data, records: prioritySort (data .records)}
})
}
// Main function
const fetchData = (firewall) => (state) =>
firewall (state)
.then ((data) => (handlers [data .type] || ((data, state) => state)) (data, state))
// Demonstrations
fetchData (mocked_firewall) ({path: 'foo', table: null, tree: null})
.then (console .log) // updates the tree
fetchData (mocked_firewall) ({path: 'bar', table: null, tree: null})
.then (console .log) // updates the table
fetchData (mocked_firewall) ({path: 'baz', table: null, tree: null})
.then (console .log) // server returns type we can't handle; no update
fetchData (mocked_firewall) ({path: 'qux', table: null, tree: null})
.then (console .log) // server returns no `type`; no update
// Dummy functions -- only for demonstration purposes
const prioritySort = (records) =>
records .slice (0) .sort (({priority: p1}, {priority: p2}) => p1 - p2)
const mocked_firewall = ({path}) =>
Promise .resolve ({
foo: {
type: "tree",
children: [{
type: "table",
records: [{id: 1, priority: 15}, {id: 2, priority: 3}]
}]
},
bar: {
type: 'table',
records: [{id: 1, priority: 7}, {id: 2, priority: 1}, {id: 3, priority: 4}]
},
baz: {type: 'unknown', z: 45},
} [path] || {})
You will notice that this does not alter the state; instead it returns a new object for the state. I see that this is tagged vue, and as I understand it, that is not how Vue works. (This is one of the reasons I haven't really used Vue, in fact.) You could easily change the handlers to update the state in place, with something like tree: (data, state) => {state.tree = data; return state}, or even skipping the return. But don't let any FP zealots catch you doing this; remember that the functional programmers are well versed in "key algorithmic techniques such as recursion and condescension." 1.
You also tagged this ramda.js. I'm one of the founders of Ramda, and a big fan, but I see Ramda helping here only around the edges. I included, for instance, a naive version of the prioritySort that you mentioned but didn't supply. A Ramda version would probably be nicer, something like
const prioritySort = sortBy (prop ('priority'))
Similarly, if you don't want to mutate the state, we could probably rewrite the handler functions with Ramda versions, possibly simplifying a bit. But that would be minor. For the main function, I don't see anything that would be improved by using Ramda functions.
There is a good testability argument to be made that we should pass not just the firewall to the main function but also the handlers object. That's entirely up to you, but it does make it easier to mock and test the parts independently. If you don't want to do that, it is entirely possible to inline them in the main function like this:
const fetchData = (firewall) => (state) =>
firewall (state)
.then ((data) => (({
tree: (data, state) => ({... state, tree: data}),
table: (data, state) => ({
... state,
table: {...data, records: prioritySort(data .records)}
})
}) [data .type] || ((data, state) => state)) (data, state))
But in the end, I find the original easier to read, either as is, or with the handlers supplied as another parameter of the main function.
1 The original quote is from Verity Stob, but I know it best from James Iry's wonderful A Brief, Incomplete, and Mostly Wrong History of Programming Languages.
🎈 Short and simple way
In my example I use Ramda; you need to compose some functions and voilà ✨:
const prioritySort = R.compose(
  R.when(
    R.propEq('type', 'tree'),
    R.over(
      R.lensProp('children'),
      R.map(child => prioritySort(child))
    )
  ),
  R.when(
    R.propEq('type', 'table'),
    R.over(
      R.lensProp('records'),
      R.sortBy(R.prop('priority')),
    )
  ),
)
const fetchData = R.pipe(
endpoint => fetch(`https://your-api.com/${endpoint}`, opts).then(res => res.json()),
R.andThen(prioritySort),
)
fetchData(`tree`).then(console.log)
fetchData(`table`).then(console.log)
Check the demo
For inspection you can simply use the function const log = value => console.log(value) || value
R.pipe(
  // some code
  log,
  // some code
)
It will log the value flowing through the pipe.
I am very new to Node.js and asynchronous programming seems difficult for me to grasp. I am using promise-mysql to make the flow synchronous, but I have hit a roadblock with a for loop inside a promise chain.
I have a multiple choice question module. One table stores all the mcq questions and the other stores all the related choices for the questions. I am using the output of the first query as an input to the second query, so I did promise chaining as below:
var mcqAll = []
var sql_test_q_ans = 'select qId, q_text from questions'
con.query(sql_test_q_ans)
  .then((result) => {
    for (var i = 0; i < result.length; i++) {
      ques = result[i]
      var sql_test_q_ops = 'SELECT op_text, op_id FROM mc_ops WHERE q_id=' + result[i].q_id
      con.query(sql_test_q_ops)
        .then((resultOps) => {
          mcqAll.push({i: ques, ops: resultOps})
          console.log(mcqAll)
        })
    }
  })
I am trying to create a javascript object array which would look something like this
[{q_text:'How many states in USA', q_ops:{1:25, 2:35, 3:45, 4:50}}
{question2 and its options}
{question3 and its options}....
]
When I run the above code, each object's options are populated correctly, but the same question is repeated in q_text for all of them.
[ { q_text: 'No of states in USA',
[ {op_text: '25', mc_op_id: 113 },
{ op_text: '35', mc_op_id: 114 },
{ op_text: '45', mc_op_id: 115 },
{ op_text: '50', mc_op_id: 116}],
{ q_text: 'No of states in USA',
[ {op_text: 'A', mc_op_id: 1 },
{ op_text: 'B', mc_op_id: 2 },
{ op_text: 'C', mc_op_id: 3 },
{ op_text: 'D', mc_op_id: 4}],
{ q_text: 'No of states in USA',
[ {op_text: 'Yes', mc_op_id: 31 },
{ op_text: 'No', mc_op_id: 32 },
{ op_text: 'No sure', mc_op_id: 33 },
{ op_text: 'Might be', mc_op_id: 34}]
]
I feel like it has something to do with the asynchronous flow, since the console.log before the second query gets printed for every iteration before anything after the second query is printed. Any insight would be appreciated.
Edit: I added a sample output for better understanding. As seen in the output, the options change and get stored in the js object in the for loop, but the question in every object is updated to the last question of the for loop.
Node.js now supports async and await. If you are still getting used to async/await, use this reference URL: https://javascript.info/async-await
async and await work with promises; await is used to wait for a promise to settle before the script continues.
Example:
let mcqAll = []
let sql_test_q_ans = 'select qId, q_text from questions'

async function showAvatar() {
  let result = await con.query(sql_test_q_ans);
  if (result.length > 0) {
    result.forEach(async function (item, index) {
      let q = result[index];
      let sql_test_q_ops = 'SELECT op_text, op_id FROM mc_ops WHERE q_id=' + result[index].q_id
      let executeQuery = await con.query(sql_test_q_ops);
      if (executeQuery.length > 0) {
        mcqAll.push({index: q, ops: executeQuery})
        console.log(mcqAll);
      }
    });
  }
}
You have a scope problem here.
This is an example that reproduces your problem:
ques is a global variable that is updated in the for-loop, so when the async code finally runs, it reads the global variable, which by then holds the last ques = result[i] value.
'use strict'
const result = ['a', 'b', 'c']
const mcqAll = []
var ques

for (var i = 0; i < result.length; i++) {
  ques = result[i]
  var sql_test_q_ops = 'SELECT op_text, op_id FROM mc_ops WHERE q_id = ' + result[i].q_id
  query(sql_test_q_ops)
    .then(() => {
      mcqAll.push({ i: ques })
      console.log(mcqAll)
    })
}

function query() {
  return new Promise(resolve => setTimeout(resolve, 100))
}
But if you simply declare ques inside the loop like this:
for (var i = 0; i < result.length; i++) {
const ques = result[i]
const sql_test_q_op...
everything will work.
It is a good practice to use const or let instead of var, because var is not block-scoped (here it acts as a single variable shared by every iteration), which is dangerous.
Regarding your comment: the output is empty because the for-loop is synchronous, so you send the response synchronously, before the queries have completed.
An example of how to manage this case could look like this:
'use strict'
const result = ['a', 'b', 'c']
const mcqAll = []

const promiseArray = result.map(ques => {
  const sql_test_q_ops = 'SELECT op_text, op_id FROM mc_ops WHERE q_id = ' + ques.q_id
  return query(sql_test_q_ops)
    .then(() => { mcqAll.push({ i: ques }) })
})

// Wait for all the queries to complete before rendering the results
Promise.all(promiseArray)
  .then(() => {
    console.log({ mcqAll });
    res.render('mcqAllPage', { mcqAll })
  })
  .catch(err => res.send(500)) // this is an example

function query() {
  return new Promise(resolve => setTimeout(resolve, 100))
}
Consider that there are many possibilities to implement this:
use a for...of loop with await to run the queries sequentially (sketched below)
improve performance by running only one query with an IN condition instead of a query for each q_id, and group the results with some code afterwards
use the promise array as in the example
Go deeper and choose the one that fits your needs best.
Important: always .catch the promise chain!
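For completeness, here is a rough sketch of the sequential for...of option from the list above. It assumes con.query returns a promise as in the question and that the id column is q_id; in real code a parameterized query would be safer than string concatenation:
async function loadMcq() {
  const mcqAll = []
  try {
    const questions = await con.query('select qId, q_text from questions')
    for (const ques of questions) {
      // each query starts only after the previous one has finished
      const ops = await con.query('SELECT op_text, op_id FROM mc_ops WHERE q_id = ' + ques.q_id)
      mcqAll.push({ question: ques, ops })
    }
    return mcqAll
  } catch (err) {
    console.error(err) // always catch the chain
    throw err
  }
}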
While learning NodeJS, I've been battling to write more concise logic for this code block (see below), either by introducing recursion or by making use of ES6 methods, to provide more elegance and better readability.
I'm bothered by the nesting happening in the for...of loops.
Thoughts?
export function pleaseRefactorMe(measures, metrics, stats) {
  if (!Array.isArray(metrics)) metrics = [metrics] //=> returns array [ 'temperature' ]
  if (!Array.isArray(stats)) stats = [stats]       //=> returns array [ 'min', 'max', 'average' ]

  let statistics = []

  /**** refactor opportunity for nested for of loops ****/
  for (let metric of metrics) {
    for (let stat of stats) {
      try {
        let value = calculateStatsForMetric(stat, metric, measures)
        if (value) {
          statistics.push({
            metric: metric,
            stat: stat,
            value: value
          })
        }
      } catch (err) {
        return err
      }
    }
  }
  return statistics
}
First, always pass arrays in; methods usually shouldn't do this sort of input validation in JavaScript. Also, don't throw in calculateStatsForMetric: if you have throwing code there, wrap it in a try/catch and return a falsey value.
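For example, a minimal sketch of such a wrapper (safeCalculate is a hypothetical name, not something from your code):
const safeCalculate = (stat, metric, measures) => {
  try {
    return calculateStatsForMetric(stat, metric, measures);
  } catch (err) {
    return null; // falsey, so it can be filtered out later
  }
};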
Now, you can use higher order array methods like flatMap and map:
Take each metric
For each metric
Take each stat (this calls for a flatMap on a map)
Calculate a function on it
Keep truthy values (this calls for a filter)
Or in code:
export const refactored = (measure, metrics, stats) =>
  metrics.flatMap(metric => stats.map(stat => ({
    metric,
    stat,
    value: calculateStatsForMetric(stat, metric, measure)
  }))).filter(o => o.value);
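A hypothetical call, assuming measures holds your measurement data as in the question:
const statistics = refactored(measures, ['temperature'], ['min', 'max', 'average']);
// => an array of { metric, stat, value } objects, keeping truthy values only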
A simple approach would be to use forEach -
let statistics = [];

metrics.forEach(m => {
  stats.forEach(s => {
    let value = calculateStatsForMetric(s, m, measures);
    if (value) {
      statistics.push({
        metric: m,
        stat: s,
        value: value
      });
    }
  });
});
I have a data structure that looks like this:
{
  sections: [
    {
      pages: [
        {
          questions: [
          ],
        },
      ],
    },
  ],
}
There's data in the questions array that I'm trying to get at, and return a final 1 dimensional array at the end. There can be an x number of sections, each section can have an x number of pages, and each page can have an x number of questions.
I'm trying to keep the code relatively concise but also readable, this is my current implementation:
function generateQuestionData(survey) {
  let data = [];
  survey.sections.forEach((section) => {
    section.pages.forEach((page) => {
      const newData = page.questions.map(getQuestionDataItem);
      data = [...data, ...newData];
    });
  });
  return data;
}
EDIT
Is there a way to accomplish the same thing without the data variable reassignment? so something along the lines of
function generateQuestionData(survey) {
  return survey.sections.forEach((section) => { // something instead of forEach
    section.pages.forEach((page) => page.questions.map(getQuestionDataItem));
    // data = [...data, ...newData]; no need for this?
  });
}
if that makes sense
How about having a helper for iteration:
function* goOver(array, key, ...keys) {
  for (const el of array) {
    if (keys.length) {
      yield* goOver(el[key] || [], ...keys);
    } else {
      yield* (el[key] || []); // yield each item of the innermost arrays, so the result is flat
    }
  }
}
That can be used as:
const result = [...goOver(survey.sections, "pages", "questions")]
.map(getQuestionDataItem);
Concerning the second question:
Is there a way to accomplish the same thing without the data variable reassignment?
how about:
data.push(...newData);
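So the whole function could stay close to your original, just without the reassignment (a sketch):
function generateQuestionData(survey) {
  const data = [];
  survey.sections.forEach((section) => {
    section.pages.forEach((page) => {
      // push the mapped questions into the one accumulator array
      data.push(...page.questions.map(getQuestionDataItem));
    });
  });
  return data;
}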
You could use reduce in order to show the intent of really doing something per section and page, along the lines of:
const generateQuestionData = (survey) =>
  survey.sections.reduce((data, section) =>
    [
      ...data,
      ...section.pages.reduce((pageData, page) =>
        [
          ...pageData,
          ...page.questions.map(getQuestionDataItem)
        ],
        []
      )
    ],
    []
  );
Code indentation/formatting could surely be optimized.
const data = {
  sections: [
    {
      pages: [
        {
          questions: [
            "hello",
            "world"
          ],
        },
        {
          questions: [
            "foo",
            "bar"
          ]
        }
      ],
    },
  ],
};
const get = (...pathKeys) => o => pathKeys.reduce((o,k ) => o[k], o);
const flatten = (arrays) => [].concat.apply([], arrays);
const result = flatten(get("sections")(data).map(s => get("pages")(s).map(get("questions"))));
console.log(flatten(flatten(result)));
Note that the flatten at the end could be simplified with a flattenDeep function.
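For instance, a small recursive flattenDeep (no library assumed) that would let you drop the nested flatten calls:
const flattenDeep = (arrays) =>
  arrays.reduce(
    (acc, x) => acc.concat(Array.isArray(x) ? flattenDeep(x) : x),
    []
  );

console.log(flattenDeep(get("sections")(data).map(s => get("pages")(s).map(get("questions")))));
// ["hello", "world", "foo", "bar"]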
In "pseudo code" I would express your problem like that, with
compose being composition left to right, and map being fn => arr => arr.map(fn)
compose(
  get("sections"),
  map(compose(get("pages"), map(get("questions")))),
  flattenDeep,
);
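And a minimal runnable version of that pseudo code, with compose as left-to-right composition and map as described (get and flattenDeep as defined above):
const compose = (...fns) => o => fns.reduce((acc, f) => f(acc), o);
const map = fn => arr => arr.map(fn);

const getQuestions = compose(
  get("sections"),
  map(compose(get("pages"), map(get("questions")))),
  flattenDeep,
);

console.log(getQuestions(data)); // ["hello", "world", "foo", "bar"]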