I'm using the native driver for MongoDB. The db has about 7 collections, and I want to create a variable that stores the total number of entries in every collection except the last one. Afterwards I want to create another variable that stores the entries of the last collection, then pass both variables through the res.render() command and show them on the webpage.
The problem I'm having is that I'm used to synchronous execution of functions, which in this case goes straight out the window.
The code below is the way I'm thinking, as if everything were executed synchronously.
var count = 0;
db.listCollections().toArray(function (err, collections) {
  // sum the counts of every collection except the last one
  for (var i = 0; i < collections.length - 1; i++) {
    db.collection(collections[i].name).count(function (err, value) {
      count = count + value;
    });
  }
  // then grab the entries of the last collection
  var count2 = db.collection(collections[collections.length - 1].name).count(function (err, value) {
    return value;
  });
  res.render('index.html', { data1: count, data2: count2 });
});
Obviously this doesn't do what I want, so I tried playing around with Promises, but ended up even more confused.
You could do something like this with Promises:
Get collection names, iterate over them, and return either count, or entries (if it's the last collection). Then sum up individual counts and send everything to the client.
db.listCollections().toArray()
  .then(collections => {
    let len = collections.length - 1
    return Promise.all(collections.map(({name}, i) => {
      let curr = db.collection(name)
      // every collection but the last returns its count; the last returns its entries
      return i < len ? curr.count() : curr.find().toArray()
    }))
  })
  .then(results => { // don't name this `res`, or it shadows the Express response object
    let last = results.pop(),
        count = results.reduce((p, c) => p + c, 0)
    res.render('index.html', {count, last})
  })
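The same flow can be sketched with async/await. This is a minimal, self-contained illustration with stubbed promises in place of the real driver calls (the stubs and their values are invented for the demo): Promise.all keeps results in input order no matter which promise settles first, so popping the last element and summing the rest is safe.

```javascript
// Stand-ins for the driver calls: three "collections" that resolve a
// count, plus a last one that resolves its documents.
const countPromises = [3, 5, 7].map(c => Promise.resolve(c));
const lastEntries = Promise.resolve(['doc1', 'doc2']);

async function gather() {
  // Promise.all preserves input order even if promises settle out of order
  const results = await Promise.all([...countPromises, lastEntries]);
  const last = results.pop();                       // entries of the last collection
  const count = results.reduce((p, c) => p + c, 0); // 3 + 5 + 7
  return { count, last };
}

const gathered = gather();
gathered.then(({ count, last }) => console.log(count, last)); // 15 [ 'doc1', 'doc2' ]
```

In the real route handler the stubs would be replaced by curr.count() and curr.find().toArray(), and the final then would call res.render as above.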
In the code below I work with the CSS DOM, which may be computationally heavy. This is apparently necessary to access the ::after selector. The renderItem method adds a new item to the DOM (including its ::after element), which is why I made it an async function and await its return on each iteration inside loadFromStorage.
However, the await does not seem to work correctly, or something weird happens inside the renderItem function. The n iterator is updated correctly at the beginning of the function (items are rendered to the screen correctly and the first console.debug prints the right value in the right order), but the second printed value, at the bottom, is always the last iteration's value (4 in my case, as I am trying to render 4 items from local storage), so the getCSSRule method receives the wrong number.
let books = []
let n = 0

const renderItem = async (entry, direction = 1) => {
  const li = document.createElement('li')
  const ul = document.querySelector('ul')
  li.classList.add('item')
  n += 1
  console.debug(`iter: ${n}`)
  li.id = `item${n}`
  await addCSSRule(`#item${n}:after`)
  li.innerText = entry.slice(0, entry.length - 13)
  if (direction === 1)
    ul.appendChild(li)
  else
    ul.insertBefore(li, ul.firstChild)
  console.debug(`iter: ${n}`)
  const s = await getCSSRule(`#item${n}::after`).catch(() => {
    console.debug(`Failed to find ':after' selector of 'item${n}'`)
    return false
  })
  if (!s) return false // guard, so a failed lookup doesn't throw below
  s.style.content = '"' + entry.slice(entry.length - 13, entry.length) + '"'
  return true
}

const loadFromStorage = () => {
  books = localStorage.getItem('books').split('//')
  books.forEach(async (entry) => {
    await renderItem(entry)
  })
}
...
Console result (considering localStorage.getItem('books').split('//') returns 4 items):
iter: 1
iter: 2
iter: 3
iter: 4
iter: 4 // Printed x4
I have also tried wrapping this renderItem call in a Promise and awaiting it, which gives the same result. Also, when I update the n iterator at the end of the function instead, the same thing happens, just at the beginning of it.
I am sorry if some of the terminology I have used is not correct in the context of JavaScript; I haven't used this language for many years and am currently trying to catch up.
The key problem here is that you're passing an async function to forEach, so even though you're awaiting inside the body of it, forEach will not wait for the function itself. To illustrate the order of events here, say you have 4 books A, B, C, D. Your execution will look something like this.
renderItem(A)
n += 1 (n is now 1)
console.log(n) (logs 1)
await addCSSRule(`#item${1}:after`) (This is a truly asynchronous event, and so this frees up the event loop to work on other things, namely the next elements in the forEach)
renderItem(B)
n += 1 (2)
console.log(n) (logs 2)
...
renderItem(C) ... n += 1 (3) ... await addCSSRule
renderItem(D) ... n += 1 (4) ... await addCSSRule
And then whenever the addCSSRule calls resolve n will always be 4 no matter which call you're in.
Solution
Use a plain for...of loop with await (making loadFromStorage an async function) instead of Array.prototype.forEach, so each iteration completes before the next begins:
for (const entry of books) {
  await renderItem(entry);
}
(A for await...of loop also works here, but it is designed for async iterables; for an ordinary array, for...of with an await inside is enough.)
Or a traditional for loop, and modify renderItem to take n as an argument
for (let i = 0; i < books.length; i++) {
renderItem(books[i], i+1);
// we don't need to await in this case, and we add 1 to i so that the 'n' value is 1-indexed to match your current behaviour.
}
I would prefer the latter option as it's best practice to avoid mutable global state (your n variable) - as it can lead to confusing control flow and issues just like the one you're having.
One other option is to set a local variable to the value of n after incrementing it inside renderItem, so that for the duration of that function the value won't change, but that seems like a very hacky workaround to me.
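The ideas above can be shown without the DOM. This is a minimal, self-contained sketch (the log array and the stand-in await are invented for the demo) of why the shared counter goes wrong with fire-and-forget calls, and how capturing it in a local variable before awaiting fixes the read after the await:

```javascript
let n = 0;

// Mimics renderItem: bumps the shared counter, awaits, then reads it again.
const renderItem = async (log) => {
  n += 1;
  const myN = n;           // local copy: frozen for the rest of this call
  await Promise.resolve(); // stands in for the real addCSSRule call
  log.push({ shared: n, local: myN });
};

const demo = async () => {
  const log = [];
  for (let i = 0; i < 4; i++) renderItem(log); // fire-and-forget, like forEach
  await Promise.resolve(); // let the queued continuations run
  return log;
};

const result = demo();
result.then(log => console.log(log));
// every `shared` is 4, while `local` is 1, 2, 3, 4
```

After the await, the shared n is always 4, but the captured local value still identifies the call that created it, which is exactly what passing n as an argument achieves.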
I have a JavaScript loop
for (var index = 0; index < this.excelData.results.length; index++) {
  let pmidList = this.excelData.results[index]["PMIDList"];
  if (pmidList.length == 0) {
    continue;
  }
  let count = 0;
  let pmidsList = pmidList.split(',');
  if (pmidsList.length > 200) {
    pmidList = pmidsList.slice(count, count + 200).join(',');
  } else {
    pmidList = pmidsList.join(",");
  }
  // Create some type of mini loop
  // pmidList is a comma separated string so I need to first put it into an array
  // then slice the array into 200 item segments
  let getJSONLink = 'https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi?'
  getJSONLink += 'db=pubmed&retmode=json&id=' + pmidList
  await axios.get(getJSONLink)
    .then(res => {
      let jsonObj = res.data.result;
      // Do Stuff with the data
    }).catch(function (error) {
      console.log(error);
    });
  // Loop
}
The entire process works fine EXCEPT when the PMIDList has more than 200 comma-separated items. The web service will only accept 200 at a time, so I need an inner loop that sends the first 200 and then loops back around for the rest before moving on to the next index. It would also be nice to do this synchronously, since the web service only allows 3 requests a second. And since I'm using Vue, wait is another issue.
The easiest option would be to use async await within a while loop, mutating the original array. Something like:
async function doHttpStuff(params) {
  let getJSONLink = 'https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi?'
  getJSONLink += 'db=pubmed&retmode=json&id=' + params.join(',') // ids must stay comma separated
  return axios.get(getJSONLink)
}
and use it inside your inner while loop:
...
let pmidsList = pmidList.split(',');
// I'll clone this, because mutating might have some side effects?
const clone = [...pmidsList];
while (clone.length) {
  const tmp = clone.splice(0, 200); // take up to 200 ids; splice shrinks the clone
  const data = await doHttpStuff(tmp);
  // do things with data
}
...
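Since the service also limits you to about 3 requests per second, the batching can be paired with a short pause between calls. This is a hedged, self-contained sketch; processInBatches, the pause length, and the stub handler are all made-up names, with doHttpStuff slotting in as the real handler:

```javascript
// Resolve after ms milliseconds; used to space out requests.
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

async function processInBatches(ids, batchSize, handler, pauseMs = 350) {
  const queue = [...ids]; // clone so the caller's array is untouched
  const results = [];
  while (queue.length) {
    const batch = queue.splice(0, batchSize); // splice shrinks the clone
    results.push(await handler(batch));       // e.g. await doHttpStuff(batch)
    if (queue.length) await sleep(pauseMs);   // ~3 requests/second at 350 ms
  }
  return results;
}

// Usage with a stub handler that just reports each batch's size
// (short pause so the demo finishes quickly):
const demoRun = processInBatches([...Array(450).keys()], 200, async b => b.length, 10);
demoRun.then(sizes => console.log(sizes)); // [ 200, 200, 50 ]
```

Because each await finishes before the next batch starts, the requests go out strictly one after another, which is what the rate limit requires.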
I am trying to get the total 'stargazers_count' from a particular user's GitHub repos. This code, using a 'for' loop, works, but I would like to use 'reduce'. I am posting what I tried. Can someone point out where I went wrong?
The JSON can be viewed with this url,using your github username:
https://api.github.com/users/${your username}/repos
This is the working code using for loop:
axios.get(response.data.repos_url)
  .then((res) => {
    let starCount = 0;
    for (let i = 0; i < res.data.length; i++) { // find out the total number of github stars
      starCount += res.data[i].stargazers_count;
    }
    response.data.NoOfStars = starCount; // add the total stars as a property in response object
This is what I tried with 'reduce':
axios.get(response.data.repos_url)
  .then((res) => {
    let starCount = 0;
    const reducer = (acc, currVal) => acc + currVal.stargazers_count;
    const arr = res.data;
    starCount = arr.reduce(reducer);
This did not work. If I can get a brief explanation of where and why I am wrong, it will be helpful.
You need to provide a starting value for your accumulator. Without one, reduce uses the first array element (here, a whole repo object) as the starting value, so acc + currVal.stargazers_count concatenates an object with a number instead of summing the counts.
starCount = arr.reduce(reducer,0);
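A self-contained illustration with made-up repo objects (only the stargazers_count field matters) shows the difference:

```javascript
// Made-up stand-ins for the objects in res.data:
const repos = [
  { name: 'a', stargazers_count: 2 },
  { name: 'b', stargazers_count: 5 },
];
const reducer = (acc, currVal) => acc + currVal.stargazers_count;

console.log(repos.reduce(reducer, 0)); // 7 — numbers sum as intended
console.log(repos.reduce(reducer));    // "[object Object]5" — first repo object became the accumulator
```

With the initial 0, every step is number + number; without it, the first step is object + number, which JavaScript coerces to string concatenation.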
I've got a data set that's several hundred elements long. I need to loop through the arrays and objects and determine if the data in them is less than a certain number (in my case, 0). If it is, I need to remove all those data points which are less than zero from the data set.
I've tried .pop and .slice but I'm not implementing them correctly. I was trying to push the bad data into its own array, leaving me with only the good data left.
Here's my JS
for (var i = 0; i < data.length; i++) {
  if (data[i].high < 0) {
    console.log(data[i].high)
    var badData = [];
    badData.push(data.pop(data[i].high));
    console.log(data[i].high)
  }
}
I'd go with .filter(), keeping everything that is not below zero:
const result = data.filter(row => row.high >= 0);
In case you need the bad results too:
const { good, bad } = data.reduce((acc, row) => {
  const identifier = row.high >= 0 ? 'good' : 'bad';
  acc[identifier].push(row);
  return acc;
}, { bad: [], good: [] });
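A quick self-contained check with made-up rows (assuming "keep" means high >= 0, so that exactly the values below zero are removed):

```javascript
const data = [{ high: 3 }, { high: -2 }, { high: 0 }, { high: 7 }];

// Keep only non-negative rows
const result = data.filter(row => row.high >= 0);

// Partition into good/bad in a single pass
const { good, bad } = data.reduce((acc, row) => {
  acc[row.high >= 0 ? 'good' : 'bad'].push(row);
  return acc;
}, { bad: [], good: [] });

console.log(result.length, good.length, bad.length); // 3 3 1
```

Both approaches leave the original data array untouched, which avoids the index-shifting problems of removing elements while iterating.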
There seem to be many questions about this problem out here, but none directly relate to my question AFAICT. Here is the problem statement:
This problem is the same as the previous problem (HTTP COLLECT) in that you need to use http.get(). However, this time you will be provided with three URLs as the first three command-line arguments.
You must collect the complete content provided to you by each of the URLs and print it to the console (stdout). You don't need to print out the length, just the data as a String; one line per URL. The catch is that you must print them out in the same order as the URLs are provided to you as command-line arguments.
Here is my original solution that fails:
var http = require('http')
var concat = require('concat-stream')
var args = process.argv.slice(2, 5)
var args_len = args.length
var results = []

args.forEach(function (arg, i) {
  http.get(arg, function (res) {
    res.setEncoding('utf8')
    res.pipe(concat(function (str) {
      results[i] = str
      if (results.length === args_len)
        results.forEach(function (val) {
          console.log(val)
        })
    }))
  }).on('error', console.error)
})
This is the solution they recommend:
var http = require('http')
var bl = require('bl')
var results = []
var count = 0

function printResults () {
  for (var i = 0; i < 3; i++)
    console.log(results[i])
}

function httpGet (index) {
  http.get(process.argv[2 + index], function (response) {
    response.pipe(bl(function (err, data) {
      if (err)
        return console.error(err)
      results[index] = data.toString()
      count++
      if (count == 3)
        printResults()
    }))
  })
}

for (var i = 0; i < 3; i++)
  httpGet(i)
What I fail to grok is the fundamental difference between my code and the official solution. I am doing the same as their solution when it comes to stuffing the replies into an array to reference later. They use a counter to count the number of callbacks while I am comparing the length of two arrays (one whose length increases every callback); does that matter? When I try my solution outside the bounds of the learnyounode program it seems to work just fine. But I know that probably means little.... So someone who knows node better than I... care to explain where I have gone wrong? TIA.
They use a counter to count the number of callbacks while I am comparing the length of two arrays (one whose length increases every callback); does that matter?
Yes, it does matter. The .length of an array is one greater than the highest index in the array; it is not the actual number of assigned elements.
The difference surfaces only when the results from the asynchronous requests come back out of order. If you first assign index 0, then 1, then 2 and so on, the .length matches the number of assigned elements and would be the same as their counter. But now try out this:
var results = []
console.log(results.length) // 0 - as expected
results[1] = "lo ";
console.log(results.length) // 2 - sic!
results[0] = "Hel";
console.log(results.length) // 2 - didn't change!
results[3] = "ld!";
console.log(results.length) // 4
results[2] = "Wor";
console.log(results.length) // 4
If you tested the length after each assignment and printed the array (joined) whenever it reached 4, it would print
"Hello ld!"
"Hello World!"
So it turns out there were two different issues here, one of which was pointed out by #Bergi above. The two issues are as follows:
The .length property does not actually return the number of elements in the array. Rather, it returns one more than the highest index in use. This seems quite silly. Thanks to #Bergi for pointing this out.
The scoping of the i variable is improper, and as such the value of i can change. This causes a race condition when results come back.
My final solution ended up being as follows:
var http = require('http')
var concat = require('concat-stream')
var args = process.argv.slice(2, 5)
var args_len = args.length
var results = []
var count = 0
function get_url_save(url, idx) {
http.get(url, function(res) {
res.setEncoding('utf8')
res.pipe(concat(function(str) {
results[idx] = str
if (++count === args_len)
results.forEach(function(val) {
console.log(val)
})
}))
}).on('error', console.error)
}
args.forEach(function(arg, i) {
get_url_save(arg, i)
})
Breaking the outermost forEach body into a named function solves the changing-i issue, since i gets passed in as a parameter by value and thus never changes. The addition of the counter solves the issue described by #Bergi, since the .length property isn't as intuitive as one would imagine.