Theory:
I create around 100 promises up front and then later wait for all of them using Promise.all().
Each of those 100 promises in turn makes some async REST calls whose response times may vary (for example due to network connectivity).
Resolving all 100 promises takes around 20 seconds. During that time the user should be given live feedback on the progress to keep them engaged.
To implement progress reporting for these async operations, I am thinking of using a progressCounter on the client whose value is incremented by each promise as soon as it resolves.
The thing is, since all those operations are async, I fear hitting a race condition where, for example, two distinct promises read the current value of progressCounter at the same time, say 1, and both try to increment it to the same value, 2. The final value would then not be 3 because of the race condition.
Experiment:
I tried to reproduce this scenario but couldn't, using the following:
var progress = {};
progress.counter = 1;

var promise1 = new Promise(function(resolve, reject) {
  resolve();
});

var promise2 = new Promise(function(resolve, reject) {
  resolve();
});

promise1.then(function() {
  progress.counter += 1;
});

promise2.then(function() {
  progress.counter += 1;
});

setTimeout(function() {
  alert(progress.counter); // progress.counter: 3
}, 1000);
Question:
Can the race condition described in the theory above actually occur? If not, how is the theory flawed?
If yes, what is a good way to track the progress of the promises' resolution?
Question: Can the race condition described in the theory above actually occur? If not, how is the theory flawed?
The answer is no, such a race condition cannot occur in JavaScript, because JavaScript is single-threaded (see: Concurrency Model and the Event Loop on MDN).
This means that while one callback handler is working with the data (assuming that incrementing the counter is a synchronous operation, which += is), nothing can force it to "yield" its execution; the next handler can only run when the previous one has finished.
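To illustrate, here is a minimal sketch of the progress-counter approach (the trackProgress name and the onProgress callback are my own, not from the question):

```javascript
// Track completion of many async tasks with a shared counter.
// This is safe in JavaScript because each .then() handler runs to
// completion before the next one starts; the increments never interleave.
function trackProgress(tasks, onProgress) {
  let completed = 0;
  return Promise.all(
    tasks.map(task =>
      task.then(result => {
        completed += 1; // no race: handlers run one at a time
        onProgress(completed, tasks.length);
        return result;
      })
    )
  );
}
```

Each promise reports as it settles, and Promise.all still delivers all results in order at the end.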
Related
I am having trouble finding a use for promises. Wouldn't these two approaches below work exactly the same way? Since the while loop in loopTest() is synchronous, the logStatement() function wouldn't run until it's complete anyway, so how would the 2nd approach be any different? Wouldn't it be pointless to wait for it to resolve()?
1st approach:
function loopTest() {
  let i = 0;
  while (i < 10000) {
    console.log(i)
    i++
  }
}

function logStatement() {
  console.log("Logging test")
}

loopTest();
logStatement();
2nd approach:
function loopTest() {
  return new Promise((resolve, reject) => {
    let i = 0;
    while (i < 10000) {
      console.log(i)
      i++
      if (i === 999) {
        resolve('I AM DONE')
      }
    }
  });
}

function logStatement() {
  console.log("Logging test")
}

loopTest().then(logStatement());
Promises don't make anything asynchronous,¹ so you're right, there's no point to using a promise in the code you've shown.
The purpose of promises is to provide a standard, composable means of observing the result of things that are already asynchronous (like ajax calls).
There are at least three massive benefits to having a standardized way to observe the results of asynchronous operations:
We can have standard semantics for consuming individual promises, rather than every API defining its own signature for callback functions. (Does it signal error with an initial parameter that's null on success, like Node.js? Does it call the callback with an object with a success flag? Or...)
We can have standard ways of composing/combining them, such as Promise.all, Promise.race, Promise.allSettled, etc.
We can have syntax to consume them with our usual control structures, which we have now in the form of async functions and await.
But again, throwing a promise at a synchronous process almost never does anything useful.²
¹ One very small caveat there: The handler functions to attach to a promise are always triggered asynchronously, whether the promise is already settled or not.
² Another small caveat: Sometimes, you have a synchronous result you want to include in a composition operation (Promise.all, etc.) with various asynchronous operations. In that case, wrapping the value in a promise that's instantly fulfilled is useful — and in fact, all the standard promise combinators (Promise.all, etc.) do that for you, as does await.
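A tiny sketch of that second caveat, mixing a plain value into Promise.all (the values here are made up for illustration):

```javascript
const cached = 42;                          // a synchronous result already in hand
const fetched = Promise.resolve("remote");  // stands in for a real async call

// Promise.all wraps non-promise values in fulfilled promises for you,
// so synchronous and asynchronous results compose cleanly:
Promise.all([cached, fetched]).then(([a, b]) => {
  console.log(a, b); // 42 "remote"
});
```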
There's no point in what you are doing, because your function body is just a blocking loop.
To get a benefit from promises, use them with APIs that do I/O, such as an HTTP request or reading a file from disk.
These APIs all traditionally used callbacks and are now mostly promise-based.
Any function that uses a promise-based function should itself also be promise-based. This is why you see a lot of promises in modern code: a promise only has to be used at one level of a stack for the entire stack to be asynchronous in nature.
Is this a better example of how promises are used? It's all I can think of that makes their usefulness clear to me:
Version 1
function getData() {
  fetch('https://jsonplaceholder.typicode.com/todos/1')
    .then(data => data.json())
    .then(json => console.log(json))
}

function logInfo() {
  console.log("i am a logger")
}

getData()
logInfo()

// "i am a logger"
// {"test": "json"}
Version 2
function getData() {
  return fetch('https://jsonplaceholder.typicode.com/todos/1')
    .then(data => data.json())
    .then(json => console.log(json))
}

function logInfo() {
  console.log("i am a logger")
}

getData().then(logInfo);

// {"test": "json"}
// "i am a logger"
// waits for the API result to log, _then_ logInfo is run, which makes a log statement
There are definitely benefits to using promises, but only in certain scenarios where their usage is viable.
Your example represents what would happen if you retrieved data from an external source synchronously: it would block the thread, preventing further code from executing until the loop terminates (I explain below exactly why that happens). Wrapping it in a promise gives no different output; the thread is still blocked, and the next message in the queue gets processed as normal only after the loop ends.
However, an implementation similar to the following could achieve a while loop running in a non-blocking manner; just an idea (I don't mean to derail this topic with setInterval's implementation details):
let f = () => {
  let tick = Date.now;
  let t = tick();
  let interval = setInterval(() => {
    if (tick() - t >= 3000) {
      console.log("stop");
      clearInterval(interval);
    }
  }, 0);
};

f()
console.log("start");
Basically, the timing is checked/handled by the browser outside the main JavaScript thread, and the callback is executed each time the specified delay elapses (as long as the interval hasn't been cleared), but only once the call stack is empty, i.e. after the currently executing function and anything above it on the stack have finished (so UI work isn't blocked). I don't know the performance implications of doing something like this, but I feel it should only be used when necessary, since the callback has to execute very frequently (with a 0 timeout, although it's not guaranteed to actually be 0).
why it happens
I mainly want to clarify that, while the handler functions are scheduled to execute asynchronously, every message in the queue has to be processed completely before the next one. For the duration of your while loop, no new message can be processed in the event queue, so it would be pointless to involve promises when the same thing would happen without them.
So basically the answer to:
wouldn't it be pointless in waiting for it to resolve() ?
is yes, it would be pointless in this case.
Disclaimer: I'm not experienced with programming or with networks in general so I might be missing something quite obvious.
So I'm making a function in Node.js that should go over an array of image links from my database and check whether they're still working. There are thousands of links to check, so I can't just fire off several thousand fetch calls at once and wait for results; instead I'm staggering the requests, going 10 by 10, and doing HEAD requests to minimize bandwidth usage.
I have two issues.
The first one is that after fetching the first 10-20 links quickly, the remaining requests take quite a bit longer, and 9 or 10 out of every 10 of them will time out. This might be due to some network mechanism that throttles my requests when many are fired at once, but I think it's more likely due to my second issue.
The second issue is that the checking process slows down after a few iterations. Here's an outline of what I'm doing. I take the string array of image links and slice it 10 by 10, then I check those 10 posts in 10 promises (ignore the i and j variables, they're just there to track the individual promises and timeouts for logging/debugging):
const partialResult = await Promise.all(postsToCheck.map(async (post, j) => await this.checkPostForBrokenLink(post, i + j)));
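The 10-by-10 slicing described above can be sketched like this (checkLink stands in for checkPostForBrokenLink; the names are mine):

```javascript
// Process links in sequential batches; each batch runs its checks in parallel.
async function checkInBatches(links, batchSize, checkLink) {
  const results = [];
  for (let i = 0; i < links.length; i += batchSize) {
    const batch = links.slice(i, i + batchSize);
    // Wait for the whole batch before starting the next one
    const partial = await Promise.all(
      batch.map((link, j) => checkLink(link, i + j))
    );
    results.push(...partial);
  }
  return results;
}
```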
Within checkPostForBrokenLink I have a race between the fetch and a 10-second timeout, because I don't want to wait for the connection to time out on its own every time that happens; I give it 10 seconds, then flag it as having timed out and move on.
const timeoutPromise = index => {
  let timeoutRef;
  const promise = new Promise<null>((resolve, reject) => {
    const start = new Date().getTime();
    console.log('===TIMEOUT INIT===' + index);
    timeoutRef = setTimeout(() => {
      const end = new Date().getTime();
      console.log('===TIMEOUT FIRE===' + index, end - start);
      resolve(null);
    }, 10 * 1000);
  });
  return { timeoutRef, promise, index };
};

const fetchAndCancelTimeout = timeout => {
  return fetch(post.fileUrl, { method: 'HEAD' })
    .then(result => {
      return result;
    })
    .finally(() => {
      console.log('===CLEAR===' + index); // index is from the parent function
      clearTimeout(timeout);
    });
};
const timeout = timeoutPromise(index);
const videoTest = await Promise.race([fetchAndCancelTimeout(timeout.timeoutRef), timeout.promise]);
If fetchAndCancelTimeout finishes before timeout.promise does, it cancels that timeout, but if the timeout finishes first, the fetch promise is still "resolving" in the background even though the code has moved on. I'm guessing this is why my code is slowing down. The later timeouts take 20-30 seconds from being set up to firing, despite being set to 10 seconds. As far as I know, this has to be because the main process is busy and doesn't have time to work through the event queue, though I don't really know what it could be doing except waiting for the promises to resolve.
So the questions are: first, am I doing something stupid here that I shouldn't be doing and that's causing everything to be slow? Second, if not, can I somehow manually stop the execution of the fetch promise if the timeout fires first, so as not to waste resources on a pointless process? Lastly, is there a better way to check whether a large number of links are valid than what I'm doing here?
I found the problem and it wasn't, at least not directly, related to promise buildup. The code shown was for checking video links but, for images, the fetch call was done by a plugin and that plugin was causing the slowdown. When I started using the same code for both videos and images, the process suddenly became orders of magnitude quicker. I didn't think to check the plugin at first because it was supposed to only do a head request and format the results which shouldn't be an issue.
For anyone looking at this trying to find a way to cancel a fetch, #some provided an idea that seems like it might work. Check out https://www.npmjs.com/package/node-fetch#request-cancellation-with-abortsignal
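With modern fetch implementations, the standard AbortController can perform that cancellation. A sketch (the helper name and the idea of threading the signal through are my own):

```javascript
// Run an abortable operation with a deadline: the timer is cleared if the
// operation finishes first, and the operation is aborted if the timer fires.
// For a real request, `start` would be something like:
//   signal => fetch(url, { method: 'HEAD', signal })
function withAbortTimeout(start, ms) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  return start(controller.signal).finally(() => clearTimeout(timer));
}
```

Unlike a bare Promise.race, aborting actually tears down the underlying request instead of leaving it running in the background.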
Something you might want to investigate here is the Bluebird Promise library.
There are two functions in particular that I believe could simplify your implementation regarding rate limiting your requests and handling timeouts.
Bluebird Promise.map has a concurrency option (link), which allows you to set the number of concurrent requests and it also has a Promise.timeout function (link) which will return a rejection of the promise if a certain timeout has occurred.
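If you'd rather not pull in Bluebird, the same concurrency limiting can be sketched with plain promises (mapLimit is my own name, mimicking Bluebird's Promise.map concurrency option):

```javascript
// Map over items with at most `limit` mapper calls in flight at once.
async function mapLimit(items, limit, mapper) {
  const results = new Array(items.length);
  let next = 0;
  // Each worker repeatedly claims the next unprocessed index.
  // `next++` is safe here: there is no await between the check and the claim.
  async function worker() {
    while (next < items.length) {
      const i = next++;
      results[i] = await mapper(items[i], i);
    }
  }
  const workers = Array.from({ length: Math.min(limit, items.length) }, worker);
  await Promise.all(workers);
  return results;
}
```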
I'm studying the JavaScript event loop, and I tried some complex and nested async code; one snippet confused me a lot. It looks like this:
console.log(1);

new Promise((resolve, reject) => {
  console.log(2);
  resolve();
}).then(() => {
  setTimeout(() => { console.log(3) }, 0);
});

setTimeout(() => {
  new Promise((resolve, reject) => {
    console.log(4);
    resolve();
  }).then(() => { console.log(5) });
});
And the result is sometimes 1 - 2 - 4 - 5 - 3, and sometimes 1 - 2 - 4 - 3 - 5.
It performs the same in browser environment and node environment.
Is my code written wrong, or is there some issue in how V8 implements the event loop?
Callback execution order is not really deterministic here. Whenever a promise resolves, a task that executes its .then() chain gets pushed onto the event queue. The same happens when a timer fires. Since timers run concurrently, they might not finish in exactly the order they were started if they are close together.
main code executes (1) {
  promise (2) gets pushed onto the queue
  promise (4) gets pushed onto the queue
}
promise resolves (2) {
  a 0ms timer gets set (3)
}
// maybe the timer is done already (3)
promise resolves (4)
// maybe the timer is done now (3)
You are combining two things: Promise.resolve and window.setTimeout. The former runs synchronously and places the resolved item in the queue immediately. window.setTimeout, however, takes a different approach: it starts to handle the provided function once its timer has expired. When you use 0 as the delay in window.setTimeout, as in your first promise's handler:
setTimeout(() => { console.log(3) }, 0);
it does not mean "run this immediately". It is more like "run this function as soon as possible". The actual delay can vary because timers are throttled by browsers. If you don't mind reading specs, see the timer-initialisation-steps section to understand in detail how a timer is initialized.
There is easier-to-read information on MDN: Reasons for delays longer than specified.
Depending on the browser/engine, the engine might be busy with (background) tasks, so timers get throttled. As you have experienced yourself, there are situations where you get different results. The minimum throttling time is (according to the specs, though browsers may use a different value) 4ms. The end result is that the function in a throttled timer gets executed after another timer that did not get throttled.
I have a long, complicated asynchronous process in TypeScript/JavaScript spread across many libraries and functions that, when it is finally finished receiving and processing all of its data, calls a function processComplete() to signal that it's finished:
processComplete(); // Let the program know we're done processing
Right now, that function looks something like this:
let complete = false;
function processComplete() {
complete = true;
}
In order to determine whether the process is complete, other code either uses timeouts or process.nextTick and checks the complete variable over and over again in loops. This is complicated and inefficient.
I'd instead like to let various async functions simply use await to wait and be awoken when the process is complete:
// This code will appear in many different places
await /* something regarding completion */;
console.log("We're done!");
If I were programming in Windows in C, I'd use an event synchronization primitive and the code would look something like this:
Event complete;
void processComplete() {
SetEvent(complete);
}
// Elsewhere, repeated in many different places
WaitForSingleObject(complete, INFINITE);
console.log("We're done!");
In JavaScript or TypeScript, rather than setting a boolean complete value to true, what exactly could processComplete do to make wake up any number of functions that are waiting using await? In other words, how can I implement an event synchronization primitive using await and async or Promises?
This pattern is quite close to your code:
const processComplete = args => new Promise(resolve => {
  // ...
  // In the middle of a callback for an async function, etc.:
  resolve(); // instead of `complete = true;`
  // ...
});

// elsewhere
await processComplete(args);
console.log("We're done!");
More info: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise.
It really depends on what you mean by "other code" in this scenario. It sounds like you want to use some variation of the delegation pattern or the observer pattern.
A simple approach is to take advantage of the fact that JavaScript allows you to store an array of functions. Your processComplete() method could do something like this:
function processComplete(){
arrayOfFunctions.forEach(fn => fn());
}
Elsewhere, in your other code, you could create functions for what needs to be done when the process is complete, and add those functions to the arrayOfFunctions.
If you don't want these different parts of code to be so closely connected, you could set up a completely separate part of your code that functions as a notification center. Then, you would have your other code tell the notification center that it wants to be notified when the process is complete, and your processComplete() method would simply tell the notification center that the process is complete.
Another approach is to use promises.
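For instance, a promise can serve as the one-shot event primitive the question asks for (createEvent is my own name for this sketch):

```javascript
// Minimal one-shot event: every caller that awaits wait() is woken up
// when fire() is called, no matter how many callers there are.
function createEvent() {
  let fire;
  const promise = new Promise(resolve => { fire = resolve; });
  return { wait: () => promise, fire };
}

// Usage sketch:
const done = createEvent();

async function waiter(name, log) {
  await done.wait(); // suspends until fire() is called
  log.push(name);
}
```

Because a promise stays settled, any code that awaits after fire() has already been called continues immediately, just like a signaled Windows event.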
I have a long, complicated asynchronous process in TypeScript/JavaScript spread across many libraries and functions
Then make sure that every bit of the process that is asynchronous returns a promise for its partial result, so that you can chain onto them and compose them together or await them.
When it is finally finished receiving and processing all of its data, calls a function processComplete() to signal that it's finished
It shouldn't. The function that starts the process should return a promise, and when the process is finished it should fulfill that promise.
If you don't want to properly promisify every bit of the whole process because it's too cumbersome, you can just do
function startProcess(…) {
  … // do whatever you need to do
  return new Promise(resolve => {
    processComplete = resolve;
    // don't forget to reject when the process failed!
  });
}
In JavaScript or TypeScript, rather than setting a boolean complete value to true, what exactly could processComplete do to make wake up any number of functions that are waiting using await?
If they are already awaiting the result of the promise, there is nothing else that needs to be done. (The awaited promise internally has such a flag already). It's really just doing
// somewhere:
var resultPromise = startProcess(…);
// elsewhere:
await resultPromise;
… // the process is completed here
You don't even need to fulfill the promise with a useful result if all you need is to synchronise your tasks, but you really should. (If there's no data they are waiting for, what are they waiting for at all?)
Since Python 3.5, the keywords await and async have been part of the language. I'm more of a Python 2.7 person and have been avoiding Python 3 for quite some time, so asyncio is pretty new to me. From my understanding, await/async works very similarly to how it works in ES6 (or JavaScript, ES2015, however you want to call it).
Here are two scripts I made to compare them.
import asyncio

async def countdown(n):
    while n > 0:
        print(n)
        n -= 1
        await asyncio.sleep(1)

async def main():
    """Main, executed in an event loop"""
    # Creates two countdowns
    futures = asyncio.gather(
        countdown(3),
        countdown(2)
    )
    # Wait for all of them to finish
    await futures
    # Exit the app
    loop.stop()

loop = asyncio.get_event_loop()
asyncio.ensure_future(main())
loop.run_forever()
function sleep(n){
  // ES6 does not provide a native sleep method with promise support
  return new Promise(res => setTimeout(res, n * 1000));
}

async function countdown(n){
  while(n > 0){
    console.log(n);
    n -= 1;
    await sleep(1);
  }
}

async function main(){
  // Creates two promises
  var promises = Promise.all([
    countdown(3),
    countdown(2)
  ]);
  // Wait for all of them to finish
  await promises;
  // Cannot stop the interpreter's event loop
}

main();
One thing to notice is that the two pieces of code are very similar and work pretty much the same.
Here are the questions:
In both Python and ES6, await/async is based on generators. Is it correct to think of Futures as the same as Promises?
I have seen the terms Task, Future and Coroutine used in the asyncio documentation. What are the differences between them?
Should I start writing Python code that always has an event loop running?
In both Python and ES6, await/async is based on generators. Is it correct to think of Futures as the same as Promises?
Not Future, but Python's Task is roughly equivalent to JavaScript's Promise. See more details below.
I have seen the terms Task, Future and Coroutine used in the asyncio documentation. What are the differences between them?
They're quite different concepts. Mainly, a Task consists of a Future and a Coroutine. Let's describe these primitives briefly (I am going to simplify a lot to cover only the main principles):
Future
A Future is simply an abstraction of a value that may not be computed yet but will be available eventually. It's a simple container that does only one thing: whenever the value is set, fire all registered callbacks.
If you want to obtain that value, you register a callback via the add_done_callback() method.
But unlike with a Promise, the actual computation is done externally; that external code has to call the set_result() method to resolve the future.
Coroutine
A coroutine is an object very similar to a generator.
A generator is typically iterated within a for loop. It yields values and, since the acceptance of PEP 342, it can also receive values.
A coroutine is typically iterated by the event loop in the depths of the asyncio library. A coroutine yields Future instances. When you iterate over a coroutine and it yields a future, you wait until that future is resolved. After that you send the future's value into the coroutine, then you receive another future, and so on.
An await expression is practically identical to a yield from expression, so by awaiting another coroutine, you stop until that coroutine has all its futures resolved, and then get the coroutine's return value. A Future is a one-tick iterable whose iterator returns the actual Future; that roughly means await future equals yield from future equals yield future.
Task
A Task is a Future whose computation has actually been started and which is attached to an event loop. So it's a special kind of Future (the Task class is derived from the Future class) that is associated with some event loop and has a coroutine serving as the task's executor.
A Task is usually created by the event loop object: you give a coroutine to the loop, it creates a Task object and starts iterating over that coroutine in the manner described above. Once the coroutine is finished, the Task's future is resolved with the coroutine's return value.
You see, a Task is quite similar to a JS Promise: it encapsulates a background job and its result.
Coroutine Function and Async Function
A coroutine function is a factory of coroutines, like a generator function is for generators. Notice the difference between Python's coroutine functions and JavaScript's async functions: a JS async function, when called, creates a Promise and its internal generator immediately starts being iterated, while a Python coroutine does nothing until a Task is created from it.
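That eager start is easy to observe in JavaScript (a small sketch of my own):

```javascript
let started = false;

async function job() {
  started = true; // runs synchronously, as soon as job() is called
  await null;     // first suspension point
  return 'done';
}

const p = job();      // the body has already run up to the first await
console.log(started); // true: no task or event loop was needed to kick it off
```

A Python coroutine called the same way would merely create a coroutine object; nothing in its body would run until it was scheduled.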
Should I start writing Python code that always has an event loop running?
If you need any asyncio feature, then you should. As it turns out, it's quite hard to mix synchronous and asynchronous code; your whole program had better be asynchronous (but you can run synchronous chunks of code in separate threads via asyncio's thread pool API).
I see the main difference downstream.
const promise = new Promise((resolve, reject) => sendRequest(resolve, reject));
await promise;
In JavaScript, the resolve and reject functions are created by the JS engine, and they have to be passed around for you to keep track of them. In the end, you're still using two callback functions most of the time, and the Promise won't really do much more than setTimeout(() => doMoreStuff()) after doStuff calls resolve. There's no way to retrieve an old result or the status of a Promise once its callbacks have been called. The Promise is mostly just the glue code between regular calls and async/await (so you can await the promise somewhere else), plus a bit of error-callback forwarding for chaining .thens.
future = asyncio.Future()
sendRequest(future)
await future
In Python, the Future itself becomes the interface with which a result is returned and it keeps track of the result.
Since Andril has given the closest Python equivalent to JavaScript's Promise (which is Task; you give it a callback and wait for it to complete), I'd like to go the other way.
class Future {
  constructor() {
    // stored as _result/_exception so the fields don't shadow
    // the result()/exception() accessor methods below
    this._result = undefined;
    this._exception = undefined;
    this.done = false;
    this.success = () => {};
    this.fail = () => {};
  }

  result() {
    if (!this.done) {
      throw Error("still pending");
    }
    return this._result;
  }

  exception() {
    if (!this.done) {
      throw Error("still pending");
    }
    return this._exception;
  }

  setResult(result) {
    if (this.done) {
      throw Error("Already done");
    }
    this._result = result;
    this.done = true;
    this.success(this._result);
  }

  setException(exception) {
    if (this.done) {
      throw Error("Already done");
    }
    this._exception = exception;
    this.done = true;
    this.fail(this._exception);
  }

  then(success, fail) {
    this.success = success;
    this.fail = fail;
  }
}
The JS await basically generates two callbacks that are passed to .then, where in a JS Promise the actual logic is supposed to happen. In many examples, this is where you'll find a setTimeout(resolve, 10000) to demonstrate the jump out of the event loop, but if you instead keep track of those two callbacks, you can do with them whatever you want.
function doStuff(f) {
  // keep for some network traffic or so
  setTimeout(() => f.setResult(true), 3000);
}

const future = new Future();
doStuff(future);
console.log('still here');
console.log(await future);
The above example demonstrates that: three seconds after 'still here' you get 'true'.
As you can see, the difference is that a Promise receives a work function and deals with resolve and reject internally, while a Future doesn't internalize the work and only cares about the callbacks. Personally, I prefer the Future because it's one layer less of callback hell, which was one of the reasons for promises in the first place: callback chaining.