Is there a delay between task completion and callback function execution? - javascript

I am learning Node.js and some JavaScript. I have read up on things like queues and execution stacks.
I am trying to calculate the time taken by a websocket request to complete. A very typical emit is of the form:
const microtime = require("microtime"); // using the microtime module

let microtime1 = microtime.now();
socket.emit("message", "data", function (error, data) {
  // calculate time taken by getting the updated time and taking the difference
  const microtime2 = microtime.now();
  const time = microtime2 - microtime1;
});
If I am sending multiple messages, can I rely on the callback getting executed without delay, or can there be a hold-up in the queue so that the callback doesn't get executed right away?
In other words, does the callback only get called once it's on the stack, or does it get executed while it's waiting to be picked up in the queue?
Hope, I was able to explain my question.

In other words, does the callback only get called once it's on the stack, or does it get executed while it's waiting to be picked up in the queue?
The callback gets executed after the event it is waiting for is resolved.
So the callback should work just fine; however, there is a caveat: because Node.js is single-threaded, you could have some other code that blocks the main thread.
In the simple view of execution, one event is processed, and then the next one is processed after it. In reality it can be messier than that: the single thread is for your JavaScript only, while things like I/O operations are done on dedicated threads that notify the main thread when they are done, and only then can the callback be executed. The problem occurs if your main thread becomes busy while waiting for the network action to complete.
This is hard to predict, though, and depends on what the rest of the app is doing. If your app is not doing anything else, this likely won't be an issue. But, IMHO, a better way is to make hundreds or thousands of calls and take an average, which will account for other possible causes of discrepancies in the delta.
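For example, here is a minimal sketch of averaging the round trip over many emits, assuming a socket.io-style socket where the server invokes the acknowledgement callback for each message; the message count and variable names are just illustrative:

const N = 1000;        // number of round trips to sample
let total = 0n;        // accumulated nanoseconds
let completed = 0;

for (let i = 0; i < N; i++) {
  const start = process.hrtime.bigint();
  socket.emit("message", "data", function (error, data) {
    total += process.hrtime.bigint() - start;   // nanoseconds for this round trip
    if (++completed === N) {
      console.log("average round trip:", Number(total / BigInt(N)) / 1e6, "ms");
    }
  });
}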
Additional data from c-sharpcorner.com
The diagram in the original article (not reproduced here) shows the execution process of Node.js. Let's understand it step by step.
Step 1
Whenever a request comes to the Node.js API, that incoming request is added to the event queue, because Node.js can't handle multiple requests simultaneously. So the incoming request first lands in the event queue.
Step 2
As the diagram shows, there is a loop that constantly checks whether any event or request is available in the event queue. If any requests are there, they are served according to the "First Come, First Served" property of a queue.
Step 3
The Node.js event loop is single-threaded and performs non-blocking I/O, so it sends blocking requests to an internal C++ thread pool, where many threads can run. This internal thread pool is the part of the event loop machinery provided by libuv, and it can handle multiple requests. The event loop keeps checking whether there is any event in the event queue; if an event involves a blocking operation, it is handed off to the thread pool.
Step 4
The internal thread pool then handles many kinds of requests, such as database requests, file requests, and many more.
Step 5
Whenever a thread completes its task, the corresponding callback is called and the response is sent back to the event loop.
Step 6
Finally, the event loop sends the response back to the client whose request has completed.
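As a rough illustration of steps 3 through 5, file-system work goes to the libuv thread pool while the main JavaScript thread keeps running; a minimal sketch using the built-in fs module:

const fs = require("fs");

console.log("before readFile");

// The read itself runs on a libuv worker thread; the callback is queued
// and executed back on the main JavaScript thread once the read is done.
fs.readFile(__filename, "utf8", function (err, contents) {
  if (err) throw err;
  console.log("read", contents.length, "characters");
});

console.log("after readFile"); // prints before the callback runs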

Related

How many JS statements get processed per event loop?

Is there a set number of instructions/statements that get processed before checking the event queue / per tick / per loop (ways of saying the same thing, I think)?
Is there a set number of instructions that get processed before checking the event queue/per tick/per loop (ways of saying the same thing, I think?)
No, there is not.
In the node.js architecture, when an event is pulled from the event queue, it's tied to a callback. The interpreter calls that callback and that callback runs to completion. Only when it returns and the stack is again empty does it check to see if there is another event in the event queue to run.
So, it has absolutely nothing to do with a number of instructions. node.js runs your Javascript as single-threaded so there is no time slicing between pieces of Javascript, which it sounds like your question was perhaps anticipating. Once a callback is called that corresponds to an event in the event queue, that callback runs until it finishes and returns control back to the interpreter.
So, it goes like this:
Pull event from the event queue
Call the Javascript callback associated with that event
Javascript callback runs until completion and then returns from the callback
node.js internals check event queue for next event. If an event is there, go to step 1 and repeat
If no event is there, go to sleep until an event is placed into the event queue.
In reality, this is a bit of a simplification because there are several different types of event queues with a priority order for which one gets to go first, but this describes the general process as it relates to your question.
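To see that run-to-completion behavior for yourself, here is a small sketch (the loop bound is arbitrary): the timer callback cannot run until the synchronous loop has finished, no matter how many statements the loop contains.

setTimeout(function () {
  console.log("timer fired"); // only runs after the loop below finishes
}, 0);

// A long synchronous block of work; nothing can pre-empt it.
let sum = 0;
for (let i = 0; i < 1e8; i++) sum += i;
console.log("loop done, sum =", sum);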
There is no set number of instructions that get processed before checking the event queue. Each message is run to completion. From the Mozilla documentation (https://developer.mozilla.org/en-US/docs/Web/JavaScript/EventLoop):
Each message is processed completely before any other message is processed. This offers some nice properties when reasoning about your program, including the fact that whenever a function runs, it cannot be pre-empted and will run entirely before any other code runs (and can modify data the function manipulates). This differs from C, for instance, where if a function runs in a thread, it may be stopped at any point by the runtime system to run some other code in another thread.
A downside of this model is that if a message takes too long to complete, the web application is unable to process user interactions like click or scroll. The browser mitigates this with the "a script is taking too long to run" dialog. A good practice to follow is to make message processing short and if possible cut down one message into several messages.

Are all node js requests initially sent to callback queue? How does Node.js manage concurrent users?

What will happen if there is no IO in the current scenario?
Will the requests get executed in a synchronous way, with each request having to wait for the previous one?
Are all node.js requests initially sent to the callback queue? If not, won't the stack overflow when all requests are being served on the stack...
I have just started using node and am not completely sure how things work. Every online site says to think of the event loop as a waiter in a restaurant taking orders, but I do not get how node handles requests when there is a burst load of hundreds of requests. Will these requests be kept on hold in some sort of queue?
The Event demultiplexer is a notification-issuing interface within Node.js. It gathers every request from watched sources in the form of an event and queues each event; it is the demultiplexer that forms the Event Queue. The event demultiplexer is an API run by libuv.
All the collected events are distributed across several event queues (timers, I/O callbacks, setImmediate's check queue, close handlers, and so on). The queue at the top takes the highest priority, and the queue at the bottom takes the least.
So the event loop checks the timers queue first, and if there are events whose callbacks are ready to be passed to the call stack and executed, the event loop handles them. After the event loop is done with the timers queue, but before jumping to the next queue, it looks at two other queues that are run by Node itself and handles those first. Those are:
Next Ticks Queue — Callbacks added using the process.nextTick function
Other Microtasks Queue — Includes other microtasks such as resolved promise callbacks
After the event loop is done with those two queues, it moves on to the next queue, which is the I/O callbacks queue.
Note that the Node event loop is single-threaded. However, for computationally intensive or time-consuming tasks, libuv has a thread pool (four threads by default) so that the event loop is not blocked. Only a handful of things use this thread pool: dns.lookup, fs, crypto, and zlib. Everything else executes on the main thread, irrespective of whether it is blocking or non-blocking.
So far we have the event loop, which handles events in the event queues, and the thread pool run by libuv for some other functions. The Node standard library also has functions that use code built into the underlying operating system through libuv; all modern operating systems provide APIs (epoll, kqueue, IOCP, etc.) that libuv uses to interact with them.
Libuv delegates some tasks, like network I/O, to the operating system's own asynchronous mechanisms, so the OS runs those tasks and brings the results back to libuv; the fs module's functions go through the thread pool described above.
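Here is a minimal sketch of that thread pool in action, using crypto.pbkdf2 (one of the calls libuv offloads to worker threads); the password, salt, and iteration count are purely illustrative:

const crypto = require("crypto");

const start = Date.now();

// With the default UV_THREADPOOL_SIZE of 4, these four hashes can run in
// parallel on libuv worker threads, so their completion times come out similar.
for (let i = 0; i < 4; i++) {
  crypto.pbkdf2("secret", "salt", 100000, 64, "sha512", function () {
    console.log("hash", i, "done after", Date.now() - start, "ms");
  });
}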

is asynchronicity a constant for a) functions or b) lines of code?

join(user) {
  if (this.room.length === 2) throw new Error('The room is full');
  this.room.push(user);
};
If asynchronicity goes on a 'function' basis, this code is 100% bulletproof. There's no way there can be more than two users in a room.
If asynchronicity goes on a 'per line' basis, this code may fail. Why? Because if three users enter the room at the same time, the following may happen in a 10ms interval:
1ms: Room is empty. Let User A in, push to array.
4ms: Room has one user.
5ms: User B asks to join.
6ms: User C asks to join.
7ms: Check array for length (1).
8ms: Check array for length (1).
9ms: Push user B to array because room length is 1.
10ms: Push user C to array because room length is 1.
15ms: Now room has 3 users and the max was two.
If asynchronicity goes on a 'per line' basis, how do I prevent the previous example from happening in a real scenario? Sorry if I haven't called things by their proper names, but I can't think of a better way to explain this.
The comments suggest (but don't say for sure or in a clear manner) that 'instances' of the same function won't ever overlap each other. What about different functions pointing to the same object/array?
join(user) {
  if (GLOBAL.room1.length === 2) throw new Error('The room is full');
  GLOBAL.room1.push(user);
};
join2(user) {
  if (GLOBAL.room1.length === 2) throw new Error('The room is full');
  GLOBAL.room1.push(user);
};
Javascript is synchronous per event (a greater scope than a function or a line). This excludes any asynchronous operations you start yourself. They will start and then complete some time later with their own event (see discussion of events later on here). Synchronous code for an event (like what you show) will run entirely to completion before the next event can be run.
Your javascript in nodejs runs as single-threaded and event driven. That means that once a piece of Javascript starts executing, it will finish before any other events can be processed. As such, a function runs and completes before anyone can call it again (assuming it doesn't call itself), so you do not have the typical types of race conditions that can exist in a multi-threaded environment.
If asynchronicity goes on a 'function' basis, this code is 100% bulletproof. There's no way there can be more than two users in a room.
Javascript in node.js is single-threaded and it will run an entire event handler to completion (all functions in that event handler) before running the next event.
If asynchronicity goes on a 'per line' basis, this code may fail. Why? Because if three users enter the room at the same time, the following may happen in a 10ms interval:
Javascript's single threadedness is not per line or per function. It's per event. So, in this case it appears you're probably handling an incoming socket.io message. That means that all code in that incoming message event handler will run before ANY other incoming messages can be processed. Only when your event handler returns control back to the system (by returning from its event handler function) will nodejs get the next event in the event queue and call its event handler.
So, let's say that three users are all asking to join the same room at about the same time and examine that scenario to explain further how this works.
All three clients ask to join the same room at about the same time.
One of those requests arrives marginally sooner than the other (incoming packets are serialized on the network cable and in the incoming TCP stack so one of them arrives first).
That packet is processed by the local TCP stack and then the listening socket in nodejs is alerted (this would be in native code and runs in a different thread from the JS engine).
The nodejs platform code gets this socket notification and inserts an event into the nodejs event queue to alert the nodejs code listening for that incoming message.
If the nodejs Javascript engine is not doing anything at the moment, then it immediately fires that event handler and starts running the code associated with that event handler.
If the nodejs Javascipt engine is doing something at the moment, then the event sits in the event queue until the JS engine finishes what it is currently doing and then it fetches the next event in the event queue.
Now the other two requests to also join the same room arrive on the TCP stack.
The native code in nodejs that is servicing incoming TCP packets sees each of these two requests arrive. They are inserted into the nodejs event queue just like the first one was. Because the JS interpreter is busy servicing the first message that arrived, these just sit in the event queue for now.
When the first message finishes its processing, the nodejs interpreter gets the next event from the event queue (whichever message arrived second will get processed now). Its event handler will be called and will run to completion before the third event is processed.
Hopefully from this description of how things are processed, you can see that your check of the room length does not have any possible race conditions. The adding of a user to the room is single threaded so it will start, progress and complete before any other requests to join the room can be processed. This single-threaded nature (while occasionally a limiting feature) vastly, vastly simplifies nodejs programming because it removes many of the possible causes of race conditions that multi-threaded programming has.
Let's look at your sequence with my comments on each step:
1ms: Room is empty. Let User A in, push to array.
4ms: Room has one user.
5ms: User B asks to join.
If the first request is not done processing yet, this request will be put in the event queue until that first request is done. If the first request is done, then this request will start processing.
6ms: User C asks to join.
Assuming the second request takes more than 1ms to process so it's not done yet, this request will go in the event queue and will not be processed until both of the first two requests are completely done.
7ms: Check array for length (1).
I don't understand what this step is meant to be. If this is part of processing the second request, then the second and third requests are still in the event queue and cannot run until the first one is done.
8ms: Check array for length (1).
Same as previous comment.
9ms: Push user B to array because room length is 1.
10ms: Push user C to array because room length is 1.
15ms: Now room has 3 users and the max was two.
This is flawed logic. Because of the event-queue and the single threaded nature of nodejs Javascript, the userB and userC requests do not run until the userA request is done. Thus, the room length is always accurate when any request runs. Multiple requests like this do not run at the same time.
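To make that concrete, here is a minimal sketch of such a join handler, assuming a socket.io-style server where io is the server instance; the event names and the room array are illustrative. The whole handler body runs to completion for one message before the next join event can be processed, so the length check and the push cannot interleave:

const room = [];

io.on("connection", function (socket) {
  socket.on("join-room", function (user) {
    // The check and the push run in the same event turn; no other
    // "join-room" handler can run between these two lines.
    if (room.length === 2) {
      socket.emit("join-error", "The room is full");
      return;
    }
    room.push(user);
    socket.emit("joined", room.length);
  });
});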

Hidden threads in Javascript/Node that never execute user code: is it possible, and if so could it lead to an arcane possibility for a race condition?

See bottom of question for an update, based on comments/answers: This question is really about the possibility of hidden threads that do not execute callbacks.
I have a question about a potential arcane scenario involving the Node Request module in which:
A complete HTTP request is constructed and executed over the network (taking however many ms or even seconds)
... before a single function is executed at runtime on the local machine (typically in the nanoseconds?) - see below for details
I am posting this mostly as a sanity check just to make sure I am not misunderstanding something about Node / JS / Request module code.
From the examples in the Request module documentation (see the SECOND example in that section), here is the code with my scenario injected:
// Copied-and-pasted from the second example in the
// Node Request library documentation, here:
// https://www.npmjs.com/package/request#examples
// ... My ARCANE SCENARIO is injected in the middle
var request = require('request')
request(
    { method: 'GET'
    , uri: 'http://www.google.com'
    , gzip: true
    }
  , function (error, response, body) {
      // body is the decompressed response body
      console.log('server encoded the data as: ' + (response.headers['content-encoding'] || 'identity'))
      console.log('the decoded data is: ' + body)
    }
)
// **************************************************** //
// Is the following scenario possible?
//
// <-- HANG HANG HANG HANG HANG HANG HANG HANG HANG -->
//
// Let us pretend that the current thread HANGS here,
// but that the request had time to be sent,
// and the response is pending being received by the thread
//
// <-- HANG HANG HANG HANG HANG HANG HANG HANG HANG -->
// **************************************************** //
.on('data', function(data) {
  // decompressed data as it is received
  console.log('decoded chunk: ' + data)
})
.on('response', function(response) {
  // unmodified http.IncomingMessage object
  response.on('data', function(data) {
    // compressed data as it is received
    console.log('received ' + data.length + ' bytes of compressed data')
  })
})
I have indicated my arcane scenario in the code snippet.
Suppose the Node process hangs at the point indicated, but that Node internally (in a hidden thread, invisible to Javascript, and therefore not calling any callbacks) WAS able to construct the request, and send it over the network; suppose the hang continues until a response (in two chunks, say) is received and waiting to be processed by Node. (This is the scenario that is certainly arcane, and that I'm not sure is even theoretically possible.)
Then suppose that the hang ends, and the Node thread above wakes up. Further, suppose that (somehow) Node was able to process the response all the way to the point of executing the callback function in the code above (yet without moving past the 'hanged' point in the code in the original code path -- again, if this is even theoretically possible).
Is the above arcane scenario theoretically possible? If so, wouldn't the data packets be received over the network and combined, ready to be passed to the callback function, before the 'data' event was scheduled on the object? In this case, if it's possible, I would imagine that the 'data' event would be missed.
Again, I understand that this is an arcane scenario - perhaps it's not even theoretically possible, given the internal mechanisms and coding involved.
That is my question - is the above arcane scenario, with its extremely unlikely race condition, nonetheless theoretically possible?
I ask just to make sure I'm not missing some key point. Thanks.
UPDATE: From comments & answers: I now have clarified my question. The 'arcane scenario' would require that there is a HIDDEN thread (which therefore CANNOT execute any USER code, including CALLBACKS) that constructs the request, sends it over the network, and receives the response - WITHOUT having any callbacks to trigger, including the 'data' callback - and stops short just at the point that the 'response' callback is ready to be called, waiting for the (single) visible JS thread to wake up.
No, this cannot happen.
Yes, there are indeed "hidden" background threads that do the work for asynchronous methods, but those don't call callbacks. All execution of javascript happens on the same thread, synchronously and sequentially. That data event callback will always be executed asynchronously, that is, after the current script/function has run to completion.
While packets could indeed already arrive from the network before the callback is created and attached to the event emitter, the callback that listens for packets at the lowest level is always created before the request is sent - it is an argument to the native "makeRequest" method, and is available to be called right from the beginning. So when a packet does arrive before the current script (still occupied with constructing event emitters and attaching handlers) has finished, the event is queued up, and the callback will only be executed after the event loop is ready - on the next turn. By then, the data event callback has certainly been created and attached.
nodejs Javascript execution is single-threaded and event driven. That means that everything runs through an event queue. A thread of Javascript execution runs until it's done and then the system checks the event queue to see if there's anything else to do (timers waiting to fire, async callbacks waiting to be called, etc...).
nodejs does use some internal threads in some of its implementation (such as file I/O), but it is my understanding that it does not use threads in networking. But, it's immaterial whether there are some internal threads or not because all communication between sub-systems like networking and the main nodejs JS thread is via the event queue.
A nodejs thread of execution is never interrupted to do something else. It finishes and runs to completion, then the JS engine checks the event queue to see if there's something else waiting to be executed.
When there's incoming data available on socket, an event is placed in the event queue. The current nodejs Javascript that is executing finishes doing what it's doing, then the JS engine sees there's an event in the event queue and fires that event. If there's a function callback or event handler associated with that event (there usually is), then that gets called to execute the event.
If there's a mishap in the internals of some infrastructure such as networking, then all that happens to the nodejs code is that some networking event just doesn't occur. The nodejs code has its event handlers in place and just doesn't receive the event they are waiting for until the infrastructure gets unwedged and creates the event. This doesn't create any sort of hang in the nodejs code.
So, in your update:
From comments & answers: I now have clarified my question. The 'arcane scenario' would require that there is a HIDDEN thread (which therefore CANNOT execute any USER code, including CALLBACKS) that constructs the request, sends it over the network, and receives the response - WITHOUT having any callbacks to trigger, including the 'data' callback - and stops short just at the point that the 'response' callback is ready to be called, waiting for the (single) visible JS thread to wake up.
The nodejs thread runs to completion, then the JS engine waits for a new event to occur (e.g. get put in the event queue). When that event occurs, the JS engine runs the code that corresponds to that event (event handlers, callbacks, etc...). You make it sound like the single visible JS thread is asleep waiting to wake up and it can get stuck there because some other sub-system gets hung. That is not the case. The only thing that can happen is that some event that the single JS thread has an event handler for just never occurs. This would be no different than a situation where you send a message to a server and you have an event handler to see a response, but the server never sends the response. Your nodejs code continues processing other events (timers, other networking, other I/O), but this particular event just never occurs because the other server just never sent the data that would trigger that event. Nothing hangs.
This is "evented I/O" which is how nodejs describes itself.
There’s only one thread involved in Node.js; an event loop is used to process tasks that run asynchronously, and nothing queued will ever interrupt anything already running. So no, there is no race condition there.
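A minimal sketch of why data that arrives "early" is not lost, using Node's built-in https module (the busy-wait stands in for the hypothetical hang): packets that arrive while the JS thread is blocked are buffered, and the 'data' events are only dispatched once control returns to the event loop, after the listeners have been attached.

const https = require("https");

https.get("https://www.google.com", function (response) {
  // Deliberately block the JS thread for a second; any packets that
  // arrive during this time are buffered by the OS and the stream.
  const until = Date.now() + 1000;
  while (Date.now() < until) {}

  // The listeners are attached only now, yet no chunks are missed,
  // because 'data' events can only fire after we return to the event loop.
  response.on("data", function (chunk) {
    console.log("received", chunk.length, "bytes");
  });
  response.on("end", function () {
    console.log("response complete");
  });
});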

Understanding the Event Loop

I am thinking about it and this is what I came up with:
Let's see this code below:
console.clear();
console.log("a");
setTimeout(function(){console.log("b");},1000);
console.log("c");
setTimeout(function(){console.log("d");},0);
A request comes in, and the JS engine starts executing the code above step by step. The first two calls are synchronous calls. But when it comes to the setTimeout method, it becomes an async execution. JS immediately returns from it and continues executing, which is called non-blocking or async, and it carries on with whatever else comes next.
The results of this execution is the following:
a c d b
So basically the second setTimeout got finished first and its callback function gets executed earlier than the first one and that makes sense.
We are talking about a single-threaded application here. The JS engine keeps executing this, and unless it finishes the first request, it won't go to the second one. But the good thing is that it won't wait for operations like setTimeout to resolve, so it will be faster because it can accept new incoming requests.
But my questions arise around the following items:
#1: If we are talking about a single-threaded application, then what mechanism processes setTimeouts while the JS engine accepts more requests and executes them? How does the single thread continue working on other requests? What works on setTimeout while other requests keep coming in and get executed.
#2: If these setTimeout functions get executed behind the scenes while more requests are coming in and being executed, what carries out those async executions behind the scenes? Is it this thing we talk about called the EventLoop?
#3: But shouldn't the whole method be put in the EventLoop so that the whole thing gets executed and the callback method gets called? This is what I understand when talking about callback functions:
function downloadFile(filePath, callback)
{
    blah.downloadFile(filePath);
    callback();
}
But in this case, how does the JS Engine know that a function is async, so that it can put the callback in the EventLoop? Perhaps there is something like the async keyword in C#, or some sort of attribute, which indicates that the method the JS Engine takes on is async and should be treated accordingly.
#4: But an article says quite contrary to what I was guessing on how things might be working:
The Event Loop is a queue of callback functions. When an async function executes, the callback function is pushed into the queue. The JavaScript engine doesn't start processing the event loop until the code after an async function has executed.
#5: And there is an image (not reproduced here) which might be helpful, but the first explanation in the image says exactly the same thing mentioned in question number 4.
So my question here is to get some clarification about the items listed above.
1: If we are talking about a single-threaded application, then what processes setTimeouts while the JS engine accepts more requests and executes them? Won't that single thread continue working on other requests? Then who is going to keep working on the setTimeout while other requests keep coming in and getting executed?
There's only 1 thread in the node process that will actually execute your program's JavaScript. However, within node itself, there are actually several threads handling operation of the event loop mechanism, and this includes a pool of IO threads and a handful of others. The key is the number of these threads does not correspond to the number of concurrent connections being handled like they would in a thread-per-connection concurrency model.
Now about "executing setTimeouts", when you invoke setTimeout, all node does is basically update a data structure of functions to be executed at a time in the future. It basically has a bunch of queues of stuff that needs doing and every "tick" of the event loop it selects one, removes it from the queue, and runs it.
A key thing to understand is that node relies on the OS for most of the heavy lifting. So incoming network requests are actually tracked by the OS itself and when node is ready to handle one it just uses a system call to ask the OS for a network request with data ready to be processed. So much of the IO "work" node does is either "Hey OS, got a network connection with data ready to read?" or "Hey OS, any of my outstanding filesystem calls have data ready?". Based upon its internal algorithm and event loop engine design, node will select one "tick" of JavaScript to execute, run it, then repeat the process all over again. That's what is meant by the event loop. Node is basically at all times determining "what's the next little bit of JavaScript I should run?", then running it. This factors in which IO the OS has completed, and things that have been queued up in JavaScript via calls to setTimeout or process.nextTick.
2: If these setTimeouts get executed behind the scenes while more requests are coming in and being executed, is the thing that carries out the async executions behind the scenes this EventLoop we are talking about?
No JavaScript gets executed behind the scenes. All the JavaScript in your program runs front and center, one at a time. What happens behind the scenes is the OS handles IO and node waits for that to be ready and node manages its queue of javascript waiting to execute.
3: How can JS Engine know if it is an async function so that it can put it in the EventLoop?
There is a fixed set of functions in node core that are async because they make system calls, and node knows which these are because they have to call into the OS or C++. Basically all network and filesystem IO as well as child process interactions will be asynchronous, and the ONLY way JavaScript can get node to run something asynchronously is by invoking one of the async functions provided by the node core library. Even if you are using an npm package that defines its own API, in order to yield the event loop, eventually that npm package's code will call one of node core's async functions, and that's when node knows the tick is complete and it can start the event loop algorithm again.
4 The Event Loop is a queue of callback functions. When an async function executes, the callback function is pushed into the queue. The JavaScript engine doesn't start processing the event loop until the code after an async function has executed.
Yes, this is true, but it's misleading. The key thing is the normal pattern is:
//Let's say this code is running in tick 1
fs.readFile("/home/barney/colors.txt", function (error, data) {
//The code inside this callback function will absolutely NOT run in tick 1
//It will run in some tick >= 2
});
//This code will absolutely also run in tick 1
//HOWEVER, typically there's not much else to do here,
//so at some point soon after queueing up some async IO, this tick
//will have nothing useful to do so it will just end because the IO result
//is necessary before anything useful can be done
So yes, you could totally block the event loop by just counting Fibonacci numbers synchronously all in memory all in the same tick, and yes that would totally freeze up your program. It's cooperative concurrency. Every tick of JavaScript must yield the event loop within some reasonable amount of time or the overall architecture fails.
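For completeness, a sketch of the usual way to keep a long computation cooperative: split it into chunks and yield back to the event loop with setImmediate between chunks (the chunk size and the total here are arbitrary).

// Sum the integers below n without monopolizing the event loop.
function sumChunked(n, done) {
  let i = 0;
  let sum = 0;
  function chunk() {
    const end = Math.min(i + 1e6, n);
    for (; i < end; i++) sum += i;
    if (i < n) {
      setImmediate(chunk); // yield so other queued events can run between chunks
    } else {
      done(sum);
    }
  }
  chunk();
}

sumChunked(1e8, function (sum) {
  console.log("sum =", sum);
});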
Don't think of the host process as being single-threaded; it is not. What is single-threaded is the portion of the host process that executes your javascript code.
Except for background workers, but these complicate the scenario...
So, all your js code runs in the same thread, and there's no possibility of two different portions of your js code running concurrently (so you don't get a concurrency nightmare to manage).
The js code that is executing is the last code that the host process picked up from the event loop.
In your code you can basically do two things: run synchronous instructions, and schedule functions to be executed in the future, when some event happens.
Here is my mental representation (beware: it's just that, I don't know the browser implementation details!) of your example code:
console.clear();     //exec sync
console.log("a");    //exec sync
setTimeout(          //schedule inAWhile to be executed at now + 1 s
  function inAWhile(){
    console.log("b");
  }, 1000);
console.log("c");    //exec sync
setTimeout(
  function justNow(){ //schedule justNow to be executed just now
    console.log("d");
  }, 0);
While your code is running, another thread in the host process keeps track of all the system events that are occurring (clicks on the UI, files read, network packets received, etc.).
When your code completes, it is removed from the event loop, and the host process returns to checking it, to see if there is more code to run. The event loop now contains two more event handlers: one to be executed now (the justNow function), and another within a second (the inAWhile function).
The host process now tries to match all the events that have happened, to see if there are handlers registered for them.
It finds that the event justNow is waiting for has happened, so it starts running its code. When the justNow function exits, it checks the event loop another time, searching for handlers on events. Supposing that 1 s has passed, it runs the inAWhile function, and so on....
The Event Loop has one simple job - to monitor the Call Stack, the Callback Queue, and the Microtask Queue. If the Call Stack is empty, the Event Loop will take the first event from the microtask queue, then from the callback queue, and will push it onto the Call Stack, which effectively runs it. Such an iteration is called a tick of the Event Loop.
As most developers know, JavaScript is single-threaded, which means two statements in JavaScript cannot be executed in parallel. Execution happens line by line, which means each JavaScript statement is synchronous and blocking. But there is a way to run your code asynchronously: if you use the setTimeout() function, a Web API provided by the browser, it makes sure that your code executes after the specified time (in milliseconds).
Example:
console.log("Start");
setTimeout(function cbT(){
console.log("Set time out");
},5000);
fetch("http://developerstips.com/").then(function cbF(){
console.log("Call back from developerstips");
});
// Millions of line code
// for example it will take 10000 millisecond to execute
console.log("End");
setTimeout takes a callback function as first parameter, and time in millisecond as second parameter.
After executing the above statements in the browser console, it will print:
Start
End
Call back from developerstips
Set time out
Note: Your asynchronous code runs after all the synchronous code is done executing.
Understanding how the code executes line by line:
The JS engine executes the 1st line and prints "Start" in the console.
On the 2nd line it sees the setTimeout call with the callback function named cbT; the timer is handed off, and cbT is pushed to the callback queue once the timeout elapses.
Execution then jumps directly to the fetch call, where it sees a promise; when that promise resolves, the JS engine pushes the cbF function to the microtask queue.
Then it executes the millions of lines of code, and at the end it prints "End".
After the main thread finishes executing, the event loop first checks the microtask queue and then the callback queue. In our case it takes the cbF function from the microtask queue and pushes it onto the call stack; then it picks the cbT function from the callback queue and pushes it onto the call stack.
JavaScript is a high-level, single-threaded, interpreted language. This means that it needs an engine (interpreter) which converts the JS code to machine code: V8 for Chrome, JavaScriptCore for Safari (WebKit). Each engine's environment provides memory, a call stack, an event loop, timers, Web APIs, events, etc.
Event loop: microtasks and macrotasks
The event loop concept is very simple. There’s an endless loop, where the JavaScript engine waits for tasks, executes them and then sleeps, waiting for more tasks
Tasks are set – the engine handles them – then waits for more tasks (while sleeping and consuming close to zero CPU). It may happen that a task comes while the engine is busy, then it’s enqueued. The tasks form a queue, so-called “macrotask queue”
Microtasks come solely from our code. They are usually created by promises: an execution of .then/catch/finally handler becomes a microtask. Microtasks are used “under the cover” of await as well, as it’s another form of promise handling. Immediately after every macrotask, the engine executes all tasks from microtask queue, prior to running any other macrotasks or rendering or anything else.
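A small sketch of that ordering: after each macrotask, the whole microtask queue is drained before the next macrotask runs (the comment shows the expected output order).

setTimeout(function () {
  console.log("macrotask 1");
  Promise.resolve().then(function () {
    console.log("microtask queued by macrotask 1");
  });
}, 0);

setTimeout(function () {
  console.log("macrotask 2");
}, 0);

// Expected order:
// macrotask 1
// microtask queued by macrotask 1
// macrotask 2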
