I am trying to resolve an array of promises together, but I'm not sure how to do it. Let me share the pseudo code for it.
async function sendNotification(user, notificationInfo) {
  const options = {
    method: 'POST',
    url: 'http://xx.xx.xx:3000/notification/send',
    headers: { 'Content-Type': 'application/json' },
    body: { notificationInfo, user },
    json: true,
  };
  console.log('sent');
  return rp(options);
}
I have wrapped the request in the sendNotification method, which returns the promise from the rp (request-promise) module.
Next, I push calls to sendNotification into an array of promises, something like this:
const notificationWorker = [];
for (const key3 in notificationObject) {
  if (notificationObject[key3].users.length > 0) {
    notificationWorker.push(sendNotification(notificationObject[key3].users, notificationObject[key3].payload)); // problem: notifications go out as soon as I push into the notificationWorker array
  }
}

// task 1 - send all notifications
const result = await Promise.all(notificationWorker); // resolving all notification promises together

// task 2 - update values in db, after sending all notifications
const result2 = await Promise.all(updateWorker); // update some values in db
In the above code, my problem is that the notifications go out as soon as I push them into the notificationWorker array. I want all the notifications to go out together, when I run await Promise.all(notificationWorker).
I am not sure how to achieve what I am trying to do.
I only understood the question partially, but I think this comes down to the difference between Node.js working concurrently and what you are trying to achieve, which is parallelism.
Node.js just switches between tasks; it does not actually run them in parallel. Child processes might help you in that case.
So, for example, if you go through this snippet:
function done(i) {
  try {
    return new Promise((resolve, reject) => {
      console.log(i);
      resolve("resolved " + i + "th promise");
    });
  } catch (e) {
    return null;
  }
}

let promises = [];
for (let i = 0; i < 100000; i++) {
  promises.push(done(i));
}
The console logging starts even though you never call Promise.all, right? That was your question. In fact, Promise.all alone will not be enough for what you want; you would have to spawn child processes to achieve parallelism to some extent.
The point I am trying to make is that you are framing the question as "first build an array of promises, then start them all at once when Promise.all is called", but a promise starts executing the moment it is created. Promise.all only waits for the already-running promises concurrently; it does not start them, so in my opinion it will not give you what you want to achieve.
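That said, if all you need is for nothing to be sent until you explicitly kick everything off (the requests will still run concurrently on a single thread, not in parallel), a minimal sketch using the question's own sendNotification and notificationObject is to store functions instead of promises:

const notificationJobs = [];
for (const key3 in notificationObject) {
  if (notificationObject[key3].users.length > 0) {
    // store a function that will create the promise; nothing is sent yet
    notificationJobs.push(() =>
      sendNotification(notificationObject[key3].users, notificationObject[key3].payload));
  }
}

// only now do the requests start, all together
const result = await Promise.all(notificationJobs.map(job => job()));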
For actual parallelism, something like this: https://htayyar.medium.com/multi-threading-in-javascript-with-paralleljs-10e1f7a1cf32 or "How to create threads in nodejs".
Though most of these techniques come up when we need to do a CPU-intensive task, here you can take a map-reduce-like approach: distribute your array of users into parts, then loop over each part and send its notifications (a rough child-process sketch follows below).
All of the solutions I am presenting aim at some kind of parallelism, but I don't think sending notifications to a huge array of users will ever be easy to do at the same instant with limited resources (instance configuration, etc.).
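As a very rough sketch of that splitting idea combined with child processes: the file name notification-worker.js, the worker count, and the message shape are all assumptions made up for the example, and sendNotification is assumed to be called per user here.

// parent: split the users into parts and hand each part to a forked worker
const { fork } = require('child_process');

function sendViaWorkers(users, payload, workerCount = 4) {
  const size = Math.ceil(users.length / workerCount);
  const jobs = [];
  for (let i = 0; i < workerCount; i++) {
    const part = users.slice(i * size, (i + 1) * size);
    if (part.length === 0) continue;
    jobs.push(new Promise((resolve, reject) => {
      const worker = fork('./notification-worker.js');
      worker.send({ users: part, payload });
      worker.on('message', (results) => { resolve(results); worker.kill(); });
      worker.on('error', reject);
    }));
  }
  return Promise.all(jobs);
}

// notification-worker.js: each worker sends its own part
// (sendNotification is the function from the question, e.g. required from a shared module)
process.on('message', async ({ users, payload }) => {
  const results = await Promise.all(users.map((user) => sendNotification(user, payload)));
  process.send(results);
});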
Related
I have hundreds of elements to get from a MongoDB database and print in the front-end.
Fetching them all in a single request could hurt performance, as it carries a big payload in the response body.
So I'm looking for a way to split my Angular request into several requests, with the constraint that they run simultaneously.
Example:
MONGODB
Collection: Elements (_id, name, children, ...)
Documents: 10000+
But we only need ~100-500 elements
ANGULAR:
const observables = [];
const iteration = 5, idParent = 'eufe4839ffcdsj489483'; // example
for (let i = 0; i < iteration; i++) {
  observables.push(
    this.myHttpService.customGetMethod<Element>(COLLECTION_NAME, endpoint, 'idParent=' + idParent + '&limit=??')); // url with query
}
forkJoin(observables).subscribe(
  data => {
    this.elements.push(data[0]);
    this.elements.push(data[1]);
  },
  err => console.error(err)
);
I use forkJoin because I need simultaneous requests for better performance.
The idea is to send multiple requests to the back-end with different limit parameter values and get the whole set of elements into the data object at the end.
The only purpose is to split the request to avoid high latency and possible errors due to the size of the payload body.
How can I proceed with the given stack to perform this operation? I would like to avoid using websockets.
forkJoin is used when you want to resolve all the observables in parallel, but you have to wait for all the requests; what if one fails? forkJoin completes with an error as soon as it encounters the first one, and you can't easily tell which observable it came from. But if you handle errors inside the inner observables, you can easily achieve that:
const observables = [];
const iteration = 5, idParent = 'eufe4839ffcdsj489483'; // example
for (let i = 0; i < iteration; i++) {
  observables.push(
    this.myHttpService.customGetMethod<Element>(COLLECTION_NAME, endpoint, 'idParent=' + idParent + '&limit=??') // url with query
      .pipe(catchError(() => {
        throw `an Error request #: ${i}`;
      }))
  );
}
forkJoin(observables).subscribe(
  data => {
    this.elements.push(data[0]);
    this.elements.push(data[1]);
  },
  err => console.error(err)
);
Another way could be to introduce infinite scroll (ngx-infinite-scroll) if you want to show the data as a list.
You can also add pagination in the frontend if that matches your requirement; one library which might help you is Syncfusion grids. There can be other ways to improve performance on the backend side too.
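For the frontend pagination idea, a hypothetical variant of the question's own loop that pages through the collection with skip/limit query parameters; the parameter names and page size are assumptions about the backend, and of/catchError are imported from rxjs and rxjs/operators.

const pageSize = 100; // assumed page size
const observables = [];
for (let i = 0; i < iteration; i++) { // iteration and idParent as in the question
  observables.push(
    this.myHttpService.customGetMethod<Element>(
      COLLECTION_NAME,
      endpoint,
      `idParent=${idParent}&skip=${i * pageSize}&limit=${pageSize}`)
      .pipe(catchError(() => of(null))) // a null placeholder on failure keeps forkJoin alive
  );
}
forkJoin(observables).subscribe(
  data => data.filter(result => result !== null).forEach(result => this.elements.push(result)),
  err => console.error(err)
);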
I'm working with Cloud Functions for Firebase, and I get a timeout with some of my functions. I'm pretty new to JavaScript. It looks like I need to put a for loop inside a promise, and I'm running into problems: the code exits the for loop too early, and I think it takes a long time. Do you have some way to improve this code and make it faster?
exports.firebaseFunctions = functions.database.ref("mess/{pushId}").onUpdate(event => {
  // first I get the event val and an object inside firebase
  const original = event.data.val();
  const users = original.uids; // THIS IS ALL USERS UIDS!!

  // so first I get all users uids and put them inside an array
  let usersUids = [];
  for (let key in users) {
    usersUids.push(users[key]);
  }

  // so now I make a promise to use all these uids and get each device token
  // and save them inside another child in firebase!!
  return new Promise((resolve) => {
    let userTokens = [];
    usersUids.forEach(element => {
      admin.database().ref('users/' + element).child('token').once('value', snapShot => {
        if (snapShot.val()) { // if the token exists, put it inside the array
          userTokens.push(snapShot.val());
        }
      })
    })
    resolve({
      userTokens
    })
  }) // now I add a then here, to get userTokens and save them in another child inside the firebase database
    .then((res) => {
      return admin.database().ref("USERS/TOKENS").push({
        userTokens: res,
      })
    })
})
You are making network requests with Firebase, so maybe that's why it's slow: you are making one request per user, so if you have 100 ids there, it may well take a while.
But there's another problem that I notice: you are resolving with an essentially empty list. To wait for several promises, create an array of promises, and then use Promise.all to create a promise that waits for all of them in parallel.
When you call resolve, the forEach has already run and every request has started, but none of the results have been added to the list yet. To fix it, change the forEach to a map, collect all the returned promises, and then return Promise.all.
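A minimal sketch of what that could look like, keeping the question's pre-v1.0 Cloud Functions API (event.data.val()) and names; treat it as an illustration rather than a drop-in fix.

exports.firebaseFunctions = functions.database.ref("mess/{pushId}").onUpdate(event => {
  const original = event.data.val();
  const usersUids = Object.values(original.uids || {});

  // map each uid to the promise of its token lookup
  const tokenPromises = usersUids.map(uid =>
    admin.database().ref('users/' + uid).child('token').once('value')
      .then(snapShot => snapShot.val()) // null when the token does not exist
  );

  return Promise.all(tokenPromises)
    .then(tokens => tokens.filter(token => token)) // drop missing tokens
    .then(userTokens => admin.database().ref("USERS/TOKENS").push({ userTokens }));
});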
I am using npm ws module (or actually the wrapper called isomorphic-ws) for websocket connection.
NPM Module: isomorphic-ws
I use it to receive some array data from a websocket++ server running on the same PC. This data is then processed and displayed as a series of charts.
Now the problem is that the handling itself takes a very long time. I use one message to calculate 16 charts, and for each of them I need to compute a lot of logarithms and other slow operations, all of it in JS. The whole refresh operation takes about 20 seconds.
Now I could actually live with that, but the problem is that when I get a new message it is processed only after the whole message handler has finished. And if I get several messages in the meantime, all of them are processed in the order they came in. So the messages queue up and the current state gets more and more outdated as time goes on...
I would like to have a way of detecting that there is another message waiting to be processed. If that is the case, I could just stop the current handler at any time and start over... So when using npm ws, is there a way of telling that there is another message waiting to be processed?
Thanks
You need to create some sort of cancelable job wrapper. It's hard to give a concrete suggestion without seeing your code. But it could be something like this.
const processArray = (array) => {
  let canceled = false;
  const promise = new Promise((resolve, reject) => {
    const results = [];
    // do something with the array
    for (let i = 0; i < array.length; i++) {
      // check on each iteration if the job has been canceled
      if (canceled) return reject({ reason: 'canceled' });
      results.push(doSomething(array[i])); // doSomething = your per-item work
    }
    resolve(results);
  });
  return {
    cancel: () => {
      canceled = true;
    },
    promise
  };
};

const job = processArray(hugeArray); // hugeArray = your big input array
// handle the success (and the cancellation rejection)
job.promise
  .then(result => console.log(result))
  .catch(err => console.log(err)); // { reason: 'canceled' } when canceled
// Cancel the job
job.cancel();
I'm sure there are libraries to serve this exact purpose. But I just wanted to give a basic example of how it could be done.
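As a hypothetical illustration of how this could be wired into an isomorphic-ws message handler; the URL and renderCharts are placeholders standing in for your connection and your existing chart update code.

const WebSocket = require('isomorphic-ws');

const ws = new WebSocket('ws://localhost:8080');
let currentJob = null;

ws.onmessage = (event) => {
  // a newer message supersedes whatever is still being processed
  if (currentJob) currentJob.cancel();

  currentJob = processArray(JSON.parse(event.data));
  currentJob.promise
    .then(result => renderCharts(result))
    .catch(err => {
      if (err && err.reason === 'canceled') return; // superseded, ignore
      console.error(err);
    });
};

One caveat: because Node is single-threaded, the canceled flag can only be observed if the processing yields back to the event loop between chunks (for example via setImmediate, or by processing the array in slices); a fully synchronous loop would block the very message event that sets the flag.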
With node.js I want to http.get a number of remote urls in a way that only 10 (or n) requests run at a time.
I also want to retry a request if an exception occurs locally (m times), but when the status code indicates an error (5XX, 4XX, etc.) the request counts as valid.
This is really hard for me to wrap my head around.
Problems:
Cannot try-catch http.get as it is async.
Need a way to retry a request on failure.
I need some kind of semaphore that keeps track of the currently active request count.
When all requests finished I want to get the list of all request urls and response status codes in a list which I want to sort/group/manipulate, so I need to wait for all requests to finish.
It seems like promises are recommended for every async problem, but I end up nesting too many of them and it quickly becomes undecipherable.
There are lots of ways to approach the 10 requests running at a time.
Async Library - Use the async library with the .parallelLimit() method where you can specify the number of requests you want running at one time.
Bluebird Promise Library - Use the Bluebird promise library and the request library to wrap your http.get() into something that can return a promise and then use Promise.map() with a concurrency option set to 10.
Manually coded - Code your requests manually to start up 10 and then each time one completes, start another one.
In all cases, you will have to manually write some retry code, and as with all retry code, you will have to very carefully decide which types of errors you retry, how soon you retry them, how much you back off between retry attempts, and when you eventually give up (all things you have not specified).
Other related answers:
How to make millions of parallel http requests from nodejs app?
Million requests, 10 at a time - manually coded example
My preferred method is with Bluebird and promises. Including retry and result collection in order, that could look something like this:
const request = require('request');
const Promise = require('bluebird');
const get = Promise.promisify(request.get);

let remoteUrls = [...]; // large array of URLs

const maxRetryCnt = 3;
const retryDelay = 500;

Promise.map(remoteUrls, function(url) {
  let retryCnt = 0;
  function run() {
    return get(url).then(function(result) {
      // do whatever you want with the result here
      return result;
    }).catch(function(err) {
      // decide what your retry strategy is here
      // catch all errors here so other URLs continue to execute
      // (isRetryableError is a placeholder for your own check of which error types to retry)
      if (isRetryableError(err) && retryCnt < maxRetryCnt) {
        ++retryCnt;
        // try again after a short delay
        // chain onto previous promise so Promise.map() is still
        // respecting our concurrency value
        return Promise.delay(retryDelay).then(run);
      }
      // make value be null if no retries succeeded
      return null;
    });
  }
  return run();
}, {concurrency: 10}).then(function(allResults) {
  // everything done here and allResults contains results with null for err URLs
});
The simple way is to use the async library; it has a .parallelLimit() method that does exactly what you need.
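A rough sketch of that approach; the retry logic discussed above is left out, and passing errors through the result object instead of the callback's error argument keeps one bad URL from aborting the whole batch.

const async = require('async');
const http = require('http');

const urls = [/* ...your URLs... */];

// one task per URL; each task reports its status code (or local error) as a result
const tasks = urls.map(url => done => {
  http.get(url, res => {
    res.resume(); // drain the body; collect it here if you actually need it
    done(null, { url, statusCode: res.statusCode });
  }).on('error', err => done(null, { url, error: err.message }));
});

async.parallelLimit(tasks, 10, (err, results) => {
  // results is in the same order as urls; sort/group/manipulate here
  console.log(results);
});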
Advice from a Parse developer forum said to "limit saveAll to 75 objects unless one wants saveAll to make its own batches", which by default are 20 objects, and to put this in a promise chain.
I need to do a saveAll promise chain where I don't know how many promises I need.
How would this be done?
I have an array of arrays. The sub-arrays all have length 75. I need each index of the master array to be saved with its own saveAll in a promise.
var savePromises = []; // this will collect save promises

while ((partition = partitionedArray.pop()) != null) {
  savePromises.push(Parse.Object.saveAll(partition, {
    success: function(objs) {
      // objects have been saved...
    },
    error: function(error) {
      // an error occurred...
      status.error("something failed");
    }
  }));
}

return Parse.Promise.when(savePromises);
}).then(function() {
  // Set the job's success status
  status.success("successful everything");
A nice way to do this is to build the chain of promises recursively. If you've already batched the objects that need saving into batches, then some of the work is done already.
// assume batches is [ [ unsaved_object0 ... unsaved_object74 ], [ unsaved_object75 ... unsaved_object149 ], ... ]
function saveBatches(batches) {
  if (batches.length === 0) { return Parse.Promise.as(); }

  var nextBatch = batches[0];
  return Parse.Object.saveAll(nextBatch).then(function() {
    var remainingBatches = batches.slice(1, batches.length);
    return saveBatches(remainingBatches);
  });
}
EDIT - To call this, just call it and handle the promise it returns...
function doAllThoseSaves() {
  var batches; // your code to build unsaved objects
  // don't save them yet, just create (or update) e.g....
  var MyClass = Parse.Object.extend("MyClass");
  var instance = new MyClass();
  // set, etc
  batches = [ [ instance ] ]; // see? not saved
  saveBatches(batches).then(function() {
    // the saves are done
  }, function(error) {
    // handle the error
  });
}
EDIT 2 - At some point, the transactions you want to run won't fit under the burst limit of the free tier, and even spread out (somehow) they won't fit within the timeout limit.
I've struggled with a similar problem. In my case, it's a rare, admin-facing migration; rare enough, and invisible to the end user, that I've been lazy about a solid solution. This is kind of a different question now, but a few ideas for a solid solution could be (a rough sketch of the spread-them-out-over-time idea follows the list):
see underscore.js _.throttle(), running from the client, to spread the transactions out over time
run your own node server that throttles calls into parse similarly (or the equal) to _.throttle().
a parse scheduled job that runs frequently, taking a small bite at a time (my case involves an import file, so I can save it quickly initially, open it in the job, count the number of objects that I've created so far, scan accordingly into the file, and do another batch)
my current (extra dumb, but functional) solution: admin user manually requests N small batches, taking care to space those requests ("one mississippi, two mississippi, ...") between button presses
heaven forbid - hire another back-end, remembering that we usually get what we pay for, and parse -- even at the free-tier -- is pretty nice.
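For the spread-them-out-over-time idea, a minimal sketch built on the saveBatches function above, assuming the legacy Parse.Promise API already used in this answer; the 1000 ms pause is an arbitrary placeholder.

// resolve after ms milliseconds, using the legacy Parse.Promise API
function delay(ms) {
  var promise = new Parse.Promise();
  setTimeout(function() { promise.resolve(); }, ms);
  return promise;
}

// same recursive batching as above, but with a pause between batches
// so the requests stay under a burst limit
function saveBatchesSlowly(batches) {
  if (batches.length === 0) { return Parse.Promise.as(); }

  return Parse.Object.saveAll(batches[0]).then(function() {
    return delay(1000); // "one mississippi"
  }).then(function() {
    return saveBatchesSlowly(batches.slice(1));
  });
}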