Wait for inner code to finish before looping - javascript

I have a for loop that I would like to run 10 times. Each iteration downloads a 3 MB file from my server and returns the time (in ms) it took; I want to do that 10 times and take the average as the client's download speed.
However, when I run this loop:
$scope.startTest = function () {
    var dlTime = 0;
    for (var i = 0; i < 10; i++) {
        var wait = getLargeData();
        wait.then(function (result) {
            dlTime += result.dlTime;
            $scope.message += "\n finished loop " + i;
        });
    }
    $scope.message += "\n Total download time: " + dlTime;
};
it prints out the following:
finished loop 10
finished loop 10
finished loop 10
finished loop 10
finished loop 10
finished loop 10
finished loop 10
finished loop 10
finished loop 10
finished loop 10
Total download time: 0
I know my problem has to do with asynchronicity, but how can I make the loop wait on the .then call before moving on?
Edit: getLargeData() does return a promise
function getLargeData() {
    var loadTime = 0;
    var dlSpeed = 0;
    var promise = $q.defer();
    var startTime = new Date();
    $networkSvc.getLargeData()
        .success(function (data) {
            loadTime = new Date() - startTime;
            dlSpeed = 3 / (loadTime / 1000);
            var ret = { loadTime: loadTime, dlSpeed: dlSpeed };
            promise.resolve(ret);
        })
        .error(function () {
            $scope.message = "Error - could not contact server.";
        });
    return promise.promise;
}

Use $q.all
let promises = [promiseAlpha(), promiseBeta(), promiseGamma()];
$q.all(promises).then((values) => {
    console.log(values[0]); // value alpha
    console.log(values[1]); // value beta
    console.log(values[2]); // value gamma
    complete();
});
It takes an array of promises, and values is an array of the resolved values.
Your code prints "finished loop 10" ten times because the promises resolve only after the loop has already finished, by which point i is 10.
$q.all waits for all ten promises to resolve; you can then iterate over the results inside its callback.
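Applied to the loop in the question, a minimal sketch (assuming getLargeData() resolves with the { loadTime, dlSpeed } object from the edit) could look like this:
$scope.startTest = function () {
    var requests = [];
    for (var i = 0; i < 10; i++) {
        requests.push(getLargeData());
    }
    $q.all(requests).then(function (results) {
        // results arrive in the same order the promises were created in
        var totalTime = results.reduce(function (sum, r) {
            return sum + r.loadTime;
        }, 0);
        $scope.message += "\n Total download time: " + totalTime;
        $scope.message += "\n Average: " + (totalTime / results.length) + " ms";
    });
};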

You need to take advantage of Angular's $q service. Use it in your getLargeData() function and have that function return a promise upon completion. You can then chain off it, as in getLargeData().then( ... Read more about it in the docs; the first example there is a good starting point.
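For illustration, a rough sketch of getLargeData() without the manual deferred, assuming $networkSvc.getLargeData() returns a promise that can be chained with .then():
function getLargeData() {
    var startTime = new Date();
    return $networkSvc.getLargeData().then(function (response) {
        var loadTime = new Date() - startTime;
        // returning a value from .then() resolves the chained promise with it
        return { loadTime: loadTime, dlSpeed: 3 / (loadTime / 1000) };
    });
}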

You cannot. Either:
Get rid of the asynchronous calls, which means your users really will wait (and, by the way, the files won't download in parallel), or
Use Promise.all() or the equivalent in the promise implementation you use ($q here). This lets you register a callback for when all the operations are done.
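For example, a minimal native-promise sketch of the second option (assuming getLargeData() returns a standard Promise rather than a $q one):
var downloads = [];
for (var i = 0; i < 10; i++) {
    downloads.push(getLargeData());
}
Promise.all(downloads).then(function (results) {
    // fires once, after every download has finished
    console.log("All " + results.length + " downloads are done");
});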

Related

Wait X time, then continue with the loop - JS React

I have a loop that builds and runs a different REST query on each iteration.
In the future I imagine the number of REST queries could grow far beyond what the browser will handle at once (over 20,000).
I want to do something along the lines of counting how many loops have been done and, after every 500 or so, pausing for a few seconds to let the browser catch up with the REST responses, then continuing.
How is this done in React?
example code:
for (var i = 0; i < array.length; i++) {
    query += i;
    axios.get(query).then(/* ...does stuff here... */);
    // want something along the lines of: if multiple of 500, wait(1000)
}
The simplest approach is to make a waitIf function that returns a Promise and accepts a condition.
If the condition is true, it waits and then executes the callback; otherwise, it executes the callback directly.
A simple implementation would be:
function waitIf(condition, duration, callback) {
    if (condition) {
        // return a Promise that waits `duration` ms using setTimeout
        return new Promise((resolve) => {
            setTimeout(() => {
                resolve(callback());
            }, duration);
        });
    }
    return callback();
}

for (let i = 0; i < array.length; i++) {
    query += i;
    // unify the call here
    waitIf(i % 500 === 0, 1000, () => axios.get(query)).then();
}

Callback is being called only after big for loop ends

I'm receiving data in the browser through websockets (paho-mqtt), but the receiving callback only fires after another task (a big for loop) ends, and then it fires with all of the stacked-up data. I'm not losing data, just getting it late. Shouldn't the callback fire even if a loop is running? What is happening here? Otherwise, how can I keep receiving while inside a loop?
What I'm trying to say is equivalent to the following. If I do this in Chrome:
setTimeout(() => {
    console.log('hello!');
}, 10);

for (var i = 0; i < 50000; i++) {
    console.log('for array');
}
I get
50000 VM15292:5 for array
VM15292:2 hello!
Shouldn't I get something like this?
1000 VM15292:5 for array
VM15292:2 hello!
49000 VM15292:5 for array
When you run JavaScript code in the browser (unless using Web Workers or other special technologies), it is executed on a single thread. That might not sound too important, but it is.
Your code consists of a for-loop (synchronous) and a call to setTimeout (asynchronous). Since only one piece of JavaScript can be running at once, your for-loop will never be interrupted by setTimeout.
In fact, if your for-loop contained extremely intensive operations that required more than 10 ms to complete, your setTimeout callback would be delayed past that mark, because the browser always waits for the currently executing code to finish before continuing to run the event loop.
setTimeout(() => {
    console.log('hello!');
}, 10);

for (var i = 0; i < /* 50000 */ 5; i++) {
    console.log('for array');
}
The others have diagnosed the problem well, the single threaded nature of the browser. I will offer a possible solution: generators.
Here's a codepen which demonstrates the problem:
http://codepen.io/anon/pen/zZwXem?editors=1111
// Effectively disable codepen's infinite loop detection
window.CP.PenTimer.MAX_TIME_IN_LOOP_WO_EXIT = 60000;

function log(message) {
    const output = document.getElementById('output');
    output.value = output.value + '\n' + message;
}

function asyncTask() {
    log('Simulated websocket message');
}

function doWork() {
    const timer = setInterval(asyncTask, 1000);
    let total = 0;
    for (let i = 1; i < 100000000; i++) {
        const foo = Math.log(i) * Math.sin(i);
        total += foo;
    }
    log('The total is: ' + total);
    clearInterval(timer);
}
When doWork() is called by clicking the 'Do Work' button, the asyncTask never runs, and the UI locks up. Horrible UX.
The following example uses a generator to run the long running task.
http://codepen.io/anon/pen/jBmoPZ?editors=1111
// Basically disable codepen's infinite loop detection, which is faulty for generators
window.CP.PenTimer.MAX_TIME_IN_LOOP_WO_EXIT = 120000;

let workTimer;

function log(message) {
    const output = document.getElementById('output');
    output.value = output.value + '\n' + message;
}

function asyncTask() {
    log('Simulated websocket message');
}

let workGenerator = null;

function runWork() {
    if (workGenerator === null) {
        workGenerator = doWork();
    }
    const work = workGenerator.next();
    if (work.done) {
        log('The total is: ' + work.value);
        workGenerator = null;
    } else {
        workTimer = setTimeout(runWork, 0);
    }
}

function* doWork() {
    const timer = setInterval(asyncTask, 1000);
    let total = 0;
    for (let i = 1; i < 100000000; i++) {
        if (i % 100000 === 0) {
            yield;
        }
        if (i % 1000000 === 0) {
            log((i / 100000000 * 100).toFixed(1) + '% complete');
        }
        const foo = Math.log(i) * Math.sin(i);
        total += foo;
    }
    clearInterval(timer);
    return total;
}
Here we do the work in a generator and create a generator runner to call from the 'Do Work' button in the UI. This runs on the latest version of Chrome; I can't speak for other browsers. Typically you'd use something like Babel to compile the generators down to ES5 syntax for a production build.
The generator yields every 100,000 iterations of the calculation and emits a status update every 1,000,000 iterations. The generator runner runWork creates an instance of the generator and repeatedly calls next(). The generator then runs until it hits the next yield or return statement. After the generator yields, the runner gives up the thread by calling setTimeout with 0 milliseconds and itself as the handler, so the browser can process pending events (such as the interval) before the next slice of work. This continues until the generator reports it is done, at which point the runner can read the returned value and clean up.
Here is the HTML for the example, in case you need to recreate the codepen:
<input type='button' value='Do Work' onclick='runWork()' />
<textarea id='output' style='width:200px;height:200px'></textarea>
JavaScript engines tend to be single threaded.
So if you are in a long-running tight loop that doesn't yield (e.g. to do some I/O), then the callback will never get a chance to run until the loop finishes.
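If generators feel heavyweight, a common alternative is to chunk the loop by hand and yield between chunks with setTimeout; a rough sketch (the chunk size is arbitrary):
function doWorkChunked(i, total) {
    var end = Math.min(i + 100000, total);
    for (; i < end; i++) {
        // ...one unit of the heavy work...
    }
    if (i < total) {
        // give up the thread so timers and websocket callbacks can fire
        setTimeout(function () {
            doWorkChunked(i, total);
        }, 0);
    }
}

doWorkChunked(0, 100000000);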

Make several requests to an API that can only handle 20 request a minute

I've got a method that returns a promise, and internally that method makes a call to an API which can only handle 20 requests per minute. The problem is that I have a large array of objects (around 300) and I would like to make a call to the API for each one of them.
At the moment I have the following code:
const bigArray = [.....];
Promise.all(bigArray.map(apiFetch)).then((data) => {
...
});
But it doesn't handle the timing constraint. I was hoping I could use something like _.chunk and _.debounce from lodash, but I can't wrap my mind around it. Could anyone help me out?
If you can use the Bluebird promise library, it has a concurrency feature built in that lets you manage a group of async operations to at most N in flight at a time.
var Promise = require('bluebird');
const bigArray = [....];
Promise.map(bigArray, apiFetch, {concurrency: 20}).then(function(data) {
// all done here
});
The nice thing about this interface is that it will keep 20 requests in flight: it will start 20, then each time one finishes it will start another. So it is potentially more efficient than sending 20, waiting for all of them to finish, sending 20 more, and so on.
This also provides the results in the exact same order as bigArray so you can identify which result goes with which request.
You could, of course, code this yourself with generic promises using a counter, but since it is already built into the Bluebird library, I thought I'd recommend that.
The Async library also has a similar concurrency control though it is obviously not promise based.
Here's a hand-coded version using only ES6 promises that maintains result order and keeps 20 requests in flight at all times (until there aren't 20 left) for maximum throughput:
function pMap(array, fn, limit) {
    return new Promise(function (resolve, reject) {
        var index = 0, cnt = 0, stop = false, results = new Array(array.length);
        function run() {
            while (!stop && index < array.length && cnt < limit) {
                (function (i) {
                    ++cnt;
                    ++index;
                    fn(array[i]).then(function (data) {
                        results[i] = data;
                        --cnt;
                        // see if we are done or should run more requests
                        if (cnt === 0 && index === array.length) {
                            resolve(results);
                        } else {
                            run();
                        }
                    }, function (err) {
                        // set stop flag so no more requests will be sent
                        stop = true;
                        --cnt;
                        reject(err);
                    });
                })(index);
            }
        }
        run();
    });
}

pMap(bigArray, apiFetch, 20).then(function (data) {
    // all done here
}, function (err) {
    // error here
});
Working demo here: http://jsfiddle.net/jfriend00/v98735uu/
You could send one block of 20 requests every minute, or space them out to one request every 3 seconds (the latter is probably preferred by the API owners).
function rateLimitedRequests(array, chunkSize) {
    var delay = 3000 * chunkSize;
    var remaining = array.length;
    var promises = [];
    var addPromises = function (newPromises) {
        Array.prototype.push.apply(promises, newPromises);
        // note the parentheses: without them, `==` binds tighter than `-=`
        if ((remaining -= newPromises.length) === 0) {
            Promise.all(promises).then((data) => {
                ... // do your thing
            });
        }
    };
    (function request() {
        addPromises(array.splice(0, chunkSize).map(apiFetch));
        if (array.length) {
            setTimeout(request, delay);
        }
    })();
}
To call 1 every 3 seconds:
rateLimitedRequests(bigArray, 1);
Or 20 every minute:
rateLimitedRequests(bigArray, 20);
If you prefer to use _.chunk and _.throttle¹:
function rateLimitedRequests(array, chunkSize) {
    var delay = 3000 * chunkSize;
    var remaining = array.length;
    var promises = [];
    var addPromises = function (newPromises) {
        Array.prototype.push.apply(promises, newPromises);
        if ((remaining -= newPromises.length) === 0) {
            Promise.all(promises).then((data) => {
                ... // do your thing
            });
        }
    };
    var chunks = _.chunk(array, chunkSize);
    var throttledFn = _.throttle(function () {
        addPromises(chunks.pop().map(apiFetch));
    }, delay, {leading: true});
    for (var i = 0; i < chunks.length; i++) {
        throttledFn();
    }
}
¹ You probably want _.throttle rather than _.debounce, since throttle executes each function call after a delay whereas debounce groups multiple calls into one. See this article linked from the docs:
Debounce: Think of it as "grouping multiple events into one". Imagine that you go home, enter the elevator, and the doors are closing... and suddenly your neighbor appears in the hall and tries to jump in. Be polite and open the doors for him: you are debouncing the elevator's departure. The same situation can happen again with a third person, and so on, probably delaying the departure by several minutes.
Throttle: Think of it as a valve; it regulates the flow of executions. We can determine the maximum number of times a function can be called in a certain time. So in the elevator analogy: you are polite enough to let people in for 10 seconds, but once that delay passes, you must go!
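In lodash terms, the difference looks like this (the scroll handler is just a stand-in):
var onScroll = function () { console.log('scrolled'); };

// debounce: fires once, 1000 ms after the LAST event in a burst
window.addEventListener('scroll', _.debounce(onScroll, 1000));

// throttle: fires at most once every 1000 ms DURING a burst
window.addEventListener('scroll', _.throttle(onScroll, 1000));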

Sleep() not working as expected

In my code, I'm trying to put a certain delay before continuing to the rest of the code. Pretty basic. I'm using a custom sleep function because JavaScript has no native sleep. I'm working in Apps Script in Google Spreadsheets, so maybe that's relevant; the following code is in the <script> tag of the HTML file in the spreadsheet's Apps Script.
Anyway, when I use sleep(), the code after it executes before the setTimeout callback:
function get_ids(db) {
    window.alert("before window.ids is saved?");
    google.script.run.withSuccessHandler(getIdsFromAppscript).getIdsFromModel(db);
    // this returns a result to getIdsFromAppscript, but the following code doesn't
    // wait for the result, so I want to sleep before the ids variable is returned
    setTimeout(function () {
        window.alert("checking if ids is saved after 10s?");
        window.alert("In timeout ids=" + window.ids);
        var ids = window.ids; // is non-empty
    }, 10000);
    sleep(10000);
    var ids = window.ids;
    window.alert("after wait");
    window.alert("after sleep ids=" + ids); // is empty but should be non-empty
    return ids; // = window.ids; need to return a non-empty result
}

function getIdsFromAppscript(result) {
    window.ids = result;
}

and the sleep function:

function sleep(ms) {
    var start = new Date().getTime(), expire = start + ms;
    while (new Date().getTime() < expire) { }
    return;
}
Current Order of output based on window.alert():
1) before window is saved?
2) after sleep
3) after sleep ids= //basically empty which shouldn't be the case
4) checking if ids is saved after 10s?
5) In timeout ids= [.....] //non empty list
However, my desired output order is:
1) before window is saved?
2) checking if ids is saved after 10s?
3) In timeout ids= [.....] //non empty list
4) after sleep
5) after sleep ids= [...] //should be a non empty list
The reason I'm including the setTimeout is to show that after 10 seconds the result is stored in window.ids; however, even after I sleep for 10 seconds, or even 30, I can't get the result from window.ids.
What exactly am I doing wrong here? Thanks in advance~
JavaScript, particularly in the V8 engine, does not sleep. Sleeping would halt the entire thread JavaScript runs on, which defeats the whole point of its asynchrony. setTimeout() only waits the given time before running the function you pass into it; it doesn't pause the rest of the execution, and whatever happens first will happen first.
If it's important to you that things always happen in order, you need to use callbacks or promises.
An example in code could be:
function doTimeout(ms) {
    setTimeout(function () {
        window.alert("checking if ids is saved after 10s?");
        window.alert("In timeout ids=" + window.ids);
        var ids = window.ids; // is non-empty
    }, ms);
}

function sleep(ms, callback) {
    var start = new Date().getTime(), expire = start + ms;
    while (new Date().getTime() < expire) { }
    callback(ms);
}

sleep(10000, doTimeout);
JavaScript is single threaded. You must return from your code before any other queued script can execute. Queued scripts include functions that handle a timeout event, functions called when promises settle, and callback functions provided for asynchronous requests made using an XMLHttpRequest object.
Writing a function and calling it sleep() does not change this. You could have called it waitingForGodot() for all the difference it would make. What the code you provided does is spend a lot of time looping in the thread it was called on. It does not relinquish control, so it blocks all other scripts from executing. If it goes on long enough, my browser will ask me if I wish to abort the script (as it would yours).
I have included two examples below showing that your sleep function blocks the entire JavaScript engine. When I use your sleep function, the interval function does not get executed even though I have set an interval of 100 ms, and the output is delayed by 10 seconds. In the second example, however, the output does get printed immediately at the correct interval. This shows your sleep function blocks the entire execution engine, and that explains why your ids array is empty.
function sleep(ms) {
    var start = new Date().getTime(),
        expire = start + ms;
    while (new Date().getTime() < expire) {}
    return;
}

function get_ids() {
    document.write("before window.ids is saved?" + "<br>");
    var counter = 0;
    setInterval(function () {
        while (counter < 100) {
            document.write("checking if ids is saved after 10s?" + "<br>");
            counter = counter + 1;
        }
    }, 100);
    sleep(10000);
    document.write("after wait");
}

document.write("Start");
get_ids();
document.write("End");
In this example I have commented out your sleep function and as expected the output gets printed every 100 ms:
function sleep(ms) {
    var start = new Date().getTime(),
        expire = start + ms;
    while (new Date().getTime() < expire) {}
    return;
}

function get_ids() {
    document.write("before window.ids is saved?" + "<br>");
    var counter = 0;
    setInterval(function () {
        while (counter < 100) {
            document.write("checking if ids is saved after 10s?" + "<br>");
            counter = counter + 1;
        }
    }, 100);
    //sleep(10000);
    document.write("after wait");
}

document.write("Start");
get_ids();
document.write("End");

Angular $q Service - Limiting Concurrency for Array of Promises

Might help to give a bit of background context for this problem: I'm building an angular service that facilitates uploading chunks of multipart form data (mp4 video) to a storage service in the cloud.
I'm attempting to limit the number of unresolved promises (PUT requests of chunk data) happening concurrently. I am using $q.all(myArrayOfPromises).then()... to listen for all chunk upload promises being resolved, and then return an asynchronous call (POST to complete the file) when that happens. I think I'm encountering a race condition with my algorithm, because $q.all() gets called before all jobs have been scheduled for files with a lot of chunks, but succeeds for smaller files.
Here's my algorithm.
var uploadInChunks = function (file) {
    var chunkPromises = [];
    var chunkSize = constants.CHUNK_SIZE_IN_BYTES;
    var maxConcurrentChunks = 8;
    var startIndex = 0, chunkIndex = 0;
    var endIndex = chunkSize;
    var totalChunks = Math.ceil(file.size / chunkSize);
    var activePromises = 0;

    var queueChunks = function () {
        while (activePromises <= maxConcurrentChunks && chunkIndex < totalChunks) {
            var deferred = $q.defer();
            chunkCancelers.push(deferred); // array with broader scope I can use to cancel uploads as they're happening
            var fileSlice = file.slice(startIndex, Math.min(endIndex, file.size));
            chunkPromises.push(addChunkWithRetry(webUpload, chunkIndex, fileSlice).then(function () {
                activePromises--;
                queueChunks();
            }));
            activePromises++;
            startIndex += chunkSize;
            endIndex += chunkSize;
            chunkIndex++;
        }
    };

    queueChunks();
    return $q.all(chunkPromises).then(function () {
        return filesApi.completeFile(file.fileId);
    });
};
Even though $q.all is called prematurely, the chunks of the file that are still pending / not even scheduled at that time are eventually executed and resolved successfully.
I've done a fair amount of reading about throttling the concurrency of $q and know there are libraries out there to assist, but I'd really like to have an understanding of why this does not work all of the time :)
The promise that you're returning ($q.all) isn't really indicative of the promise you actually want to return. In your code above, the returned promise will finish after the first maxConcurrentChunks get resolved, because that's how many promises are in chunkPromises when you pass it to $q.all().
Another way to handle this (and get the result you want) would be the following pseudocode:
var uploadInChunks = function (file) {
    // ...vars...
    var fileCompleteDeferral = $q.defer();

    var queueChunks = function () {
        chunkPromises.push(nextChunk(chunkIndex).then(function () {
            activePromises--;
            if (allChunksDone()) { // could be activePromises == 0, or chunkIndex == totalChunks - 1
                fileCompleteDeferral.resolve();
            } else {
                queueChunks();
            }
        }));
    };

    return fileCompleteDeferral.promise.then(completeFile);
};
The promise returned by this code will only resolve after ALL the promises are done, rather than just the first 8.
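Fleshed out a little (still a sketch: addChunkWithRetry, webUpload, constants, and filesApi are as in the question):
var uploadInChunks = function (file) {
    var chunkSize = constants.CHUNK_SIZE_IN_BYTES;
    var totalChunks = Math.ceil(file.size / chunkSize);
    var maxConcurrentChunks = 8;
    var chunkIndex = 0, activePromises = 0, doneCount = 0;
    var fileCompleteDeferral = $q.defer();

    var queueChunks = function () {
        while (activePromises < maxConcurrentChunks && chunkIndex < totalChunks) {
            var start = chunkIndex * chunkSize;
            var fileSlice = file.slice(start, Math.min(start + chunkSize, file.size));
            activePromises++;
            addChunkWithRetry(webUpload, chunkIndex, fileSlice).then(function () {
                activePromises--;
                doneCount++;
                if (doneCount === totalChunks) {
                    // only resolve once every chunk has finished
                    fileCompleteDeferral.resolve();
                } else {
                    queueChunks();
                }
            });
            chunkIndex++;
        }
    };

    queueChunks();
    return fileCompleteDeferral.promise.then(function () {
        return filesApi.completeFile(file.fileId);
    });
};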
