Node.js setInterval to repeat a task every 30s - javascript

I built a Node.js application that should execute several tasks.
My app.js has a function call to the manager module, which controls those tasks.
I want to call that function from my app.js and perform those tasks every 30s.
Something like:
setInterval(() => manager.tasks(), 30000);
My question is whether using setInterval could give me any performance problems (slowing down the computer, blocking resources, or anything else).
Is there a more efficient way to do this or is setInterval ok?

It depends on how heavy the work/processing you want to do is. setInterval is asynchronous, so your code will only be run once every 30 seconds; but at the same time JavaScript is single-threaded, so if you're doing a lot of work in your task, then when it runs every 30 seconds it may take too long to execute, thus blocking resources.
In most cases you should be fine using setInterval, but if you really want to emulate multi-threading, or if you're doing too much work in your task, you may want to spawn a child process (https://nodejs.org/api/child_process.html) or use the newer Worker Threads API (https://nodejs.org/api/worker_threads.html) instead. Keep in mind it's not as simple to implement as a setInterval call.
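For reference, a minimal sketch of offloading a heavy task to a worker thread every 30 seconds; the CPU-heavy loop is a placeholder for whatever your manager's task actually does:
const { Worker, isMainThread, parentPort } = require('worker_threads');
if (isMainThread) {
  setInterval(() => {
    const worker = new Worker(__filename);                    // spawn this same file as a worker
    worker.on('message', (result) => console.log('task done:', result));
    worker.on('error', (err) => console.error(err));
  }, 30000);
} else {
  // placeholder CPU-heavy work; replace with your manager's task
  let sum = 0;
  for (let i = 0; i < 1e8; i++) sum += i;
  parentPort.postMessage(sum);
}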

Use node-cron or node-schedule instead
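For example, a sketch with node-cron, assuming a six-field cron expression with a seconds field (which node-cron accepts) and the asker's manager module:
const cron = require('node-cron');
const manager = require('./manager');   // the asker's manager module
// run every 30 seconds (six-field expression: the first field is seconds)
cron.schedule('*/30 * * * * *', () => {
  manager.tasks();
});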

setInterval is part of standard Node.js; you won't have any performance or blocking problems from the timer itself, and most scheduling libraries use setInterval as well.

It completely depends on the function you are executing inside setInterval. If it is an I/O-bound operation, then you don't need to worry too much, because libuv will handle it. But if it is CPU-bound, then I would suggest using the child_process API to fork a new child process and do your work in that.
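A minimal fork sketch; the task.js file name and the message shape are illustrative:
// parent.js
const { fork } = require('child_process');
setInterval(() => {
  const child = fork('./task.js');                 // runs task.js in a separate Node process
  child.on('message', (msg) => console.log('result from child:', msg));
  child.on('exit', (code) => console.log('child exited with code', code));
}, 30000);
// task.js — the CPU-bound work lives here
let total = 0;
for (let i = 0; i < 1e8; i++) total += i;
process.send(total);                               // report back to the parent over the IPC channel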

Related

What is the best way to schedule a small task to execute as fast as possible, while still making it wait for UI updates?

I'm reading a bit about micro tasks and such, because at the moment I have a project that is poorly optimized and some tasks make the UI hang.
I've solved 95% of that by using a (Service) Worker for the heaviest tasks. But there's still some code that just has to be on the main thread and I'm wondering what the best way is to optimize that code.
I basically have 2 wishes:
I want a function to wait a little bit before executing, just enough to let the browser do any necessary UI drawings / changes.
But after that, I do want that function to execute as soon as possible. If I can prevent it, I don't want the task to be put at the very end of the browser's task queue. The reason for this is because the function changes the value of a variable and other functions down the line would benefit from having the latest update of the value of that variable.
After reading about micro tasks, I'm not sure if they're the right tool for the job. Because as far as I understand, the browser's decision on when to run a microtask has not so much to do with the UI, but more with what other macro tasks are on the task queue.
These are the alternatives that I've been able to find:
setTimeout() with a timeout of 0; I've read that the browser will automatically increase the timeout if it needs to (nested timeouts get clamped to a minimum of a few milliseconds).
requestAnimationFrame(), but that seems to always wait at least 1/60th of a second. If the UI is done sooner than that, then I'd want my function run sooner than that.
requestIdleCallback() sounds perfect, except that it's not supported by all browsers.
using new MessageChannel() to send an empty message from one port to the other; then executing the task when the 2nd port receives the message. The only reason I'm considering this one is because apparently Facebook uses it in React to queue a task in the browser. On Node React uses setImmediate(), but since that doesn't exist in the browser, apparently Facebook's developers thought this was the best alternative. And because I'm assuming they're cleverer than me, I think there must be something to it, right?
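For reference, the MessageChannel approach described in that last option looks roughly like this (a sketch; scheduleTask is an illustrative name):
const channel = new MessageChannel();
let pendingTask = null;
channel.port1.onmessage = () => {
  const task = pendingTask;
  pendingTask = null;
  if (task) task();                       // runs as a fresh macrotask, without setTimeout's clamping
};
function scheduleTask(task) {
  pendingTask = task;
  channel.port2.postMessage(null);        // empty message just to wake up port1
}
scheduleTask(() => console.log('runs soon, after the current work has finished'));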
If web workers are an option, use them (see Comlink or do a vanilla implementation); otherwise, see the rest of my answer.
Considering your wish #1, I understand that you want a consistent frame rate, i.e. use only idle time, and let the next critical UI task supersede your task.
All the options you have listed will not work out if the task runs longer than the time until the next paint. Once the task is running, it has to finish before the browser can do anything else. So the problem is not with scheduling per se, but with preempting the task for more critical UI work.
In terms of scheduling, neither setTimeout nor requestIdleCallback will satisfy your wish #2: setTimeout pushes the new task to the end of the queue, and requestIdleCallback (the worse of the two in terms of priority) waits for all other tasks to complete.
There is no straightforward way to automatically preempt a task in JavaScript as of now (to the best of my knowledge). Even libraries like React can cause UI throttling if pushed hard enough.
Solutions that we have at hand:
Recursive setTimeout.
Generator functions with some form of scheduling.
In both scenarios you are responsible for breaking the task down into smaller chunks that can "hopefully" complete before the next UI paint.
Both work in pretty much the same fashion; the only difference is that with setTimeout you need to chain the chunks yourself (sketched below), while with a generator you can simply yield. The generator approach is much nicer in terms of syntax, in my opinion, since you just add yield statements in different places. Not ideal, but better.
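A minimal sketch of the setTimeout-chaining variant, assuming the work can be split into small steps (the chunk contents are placeholders):
// hypothetical chunks of the original task
const steps = [
  () => console.log('chunk 1'),
  () => console.log('chunk 2'),
  () => console.log('chunk 3'),
];
let index = 0;
function runNextChunk() {
  steps[index]();                 // do one small piece of work
  index += 1;
  if (index < steps.length) {
    setTimeout(runNextChunk, 0);  // give the browser a chance to paint before the next piece
  }
}
setTimeout(runNextChunk, 0);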
Generator Example.
function* task() {
  let i = 0;
  while (true) {
    i = (i + 1) % 10;     // stands in for one small chunk of real work
    yield;                // hand control back to the scheduler
  }
}
const it = task();        // calling the generator function returns an iterator
const callback = () => {
  it.next();              // run one chunk
  setTimeout(callback, 0);
};
setTimeout(callback, 0);
P.S. Personally I wish there was a simple solution, I've looked for one for a long time. Let me know if this answers your question, if you have any doubts or you find anything interesting around this topic.

Why can't Node.js process CPU-intensive tasks even though it is asynchronous?

What I don't understand is this: Node.js is asynchronous, so I can run multiple tasks no matter how long they take, because the interpreter continues on to the other tasks and lets me know via a callback when a long task has completed. So where is the blocking for CPU-intensive tasks? Even if a task takes 10 seconds, the other lines of JS code will continue to execute and start other tasks. Since Node.js uses only 4 threads for heavy tasks, I understand that all those threads can become busy, and that is the scenario where you shouldn't do heavy CPU work in Node.js. Am I right?
var listener = readAsync("I/O heavy calculation", function () {
  console.log("I run after the I/O is done.");
});
// The parser will send the request and move on to the next line of code
console.log("I run before the I/O request is done");
I expect the global declared console.log to run before the callback function.
Node.js programs are single-threaded by design. The event loop prioritizes whatever is most important to the application. There are ways to optimize your performance at scale. For example, there is a cluster module built into Node.js that distributes connections for the same Node server across different worker processes.
https://nodejs.org/en/blog/release/v10.5.0/ As of Node 10.5 there is multi-threading support (worker threads), but it is experimental.
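A minimal cluster sketch; the HTTP server and port are illustrative, and each worker is a separate process with its own event loop:
const cluster = require('cluster');
const http = require('http');
const os = require('os');
if (cluster.isMaster) {
  // fork one worker per CPU core; the master distributes incoming connections
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
} else {
  http.createServer((req, res) => {
    res.end(`handled by worker ${process.pid}\n`);
  }).listen(3000);
}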

Asynchronously running C++ and JS code in V8

I'm currently experimenting with embedding V8 in a project of mine. Since I use libev for listening to sockets and events, and want to be able to script events with JS, I would like to run V8 for a short while, then jump back to C++ to check for events and such, and then go back to running JS code. Since I haven't done much script embedding before, I'm sure there are some clever ways this is usually done, so all ideas are appreciated.
The cleanest way I found of doing this is to create setTimeout and clearTimeout functions within JS. setTimeout creates an ev::Timer whose callback gets called after a certain amount of time. This way, when you call a JS function it executes until it returns, but it can set a number of timeouts which are not called until after you exit the current JS; if any other libev events occurred during the execution, those are handled first (in C++). The limitation of this method is that whoever writes the JS has to remember not to write functions that go into eternal while-loops or similar. A loop is instead done like this:
function repeat() { setTimeout(repeat, 0); }

Asynchronous programming in javascript (NOT AJAX)

Is it possible to do things asynchronously in JavaScript (AJAX aside)? For example, to iterate multiple arrays at the same time. How is it done? A brief example would be nice. Searching for this was hard due to all the AJAX pollution, which is not what I am looking for.
Thanks in advance.
Use Web Workers. But remember that it is a fairly new feature and not all browsers fully support it.
You could use setTimeout.
setTimeout(function () { iterateArray(array1); reportDone(1); }, 0);
setTimeout(function () { iterateArray(array2); reportDone(2); }, 0);
I'm not sure how concurrent it will be, but it is an asynchronous programming model.
As stated by Grumdrig you can write code like this:
setTimeout(function () { iterateArray(array1); reportDone(1); }, 0);
setTimeout(function () { iterateArray(array2); reportDone(2); }, 0);
But it will still not run concurrently. Here's a general idea of what happens after such timeouts are called:
Any code after the setTimeout calls will be run immediately, including returns to calling functions.
If there are other timers in queue that are at or past their delay or interval time, they will be executed one at a time.
While any timer is running, another might hit its interval/delay time, but it will not be run until the last one is finished.
Some browsers give priority to events fired from user interaction such as onclick and onmousemove, in which case the functions attached to those events will execute at the expense of timer accuracy.
This will continue until there is an opening (no previously called timers or event handlers requesting execution). Only then will the functions in the example code be run. Again one at a time, with the first one likely but not certainly executing first. Also, I'm venturing a guess that some browsers might impose a minimum delay time, which would make any timers set with a delay of 0 milliseconds be run even later than expected.
Obviously there is no performance advantage to running code like this. In every case it will make things take longer to complete. However in cases where a single task is taking so long it freezes the browser (and possibly trips "Script is taking too long" browser warnings), it can be helpful to break it up into smaller faster executing pieces that run sequentially after some delay time, thus giving the browser some time to breathe.
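A sketch of that chunking idea, assuming a large array that would otherwise freeze the page if processed in one go (the function name and the 1000-item slice size are illustrative):
function processInChunks(array, handleItem, chunkSize, onDone) {
  let i = 0;
  function nextChunk() {
    const end = Math.min(i + chunkSize, array.length);
    for (; i < end; i++) {
      handleItem(array[i]);          // do a small slice of the work
    }
    if (i < array.length) {
      setTimeout(nextChunk, 0);      // let the browser breathe, then continue
    } else if (onDone) {
      onDone();
    }
  }
  nextChunk();
}
// usage: process 100000 items without locking up the UI
processInChunks(new Array(100000).fill(1), (x) => x * 2, 1000, () => console.log('done'));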
Web Workers have been mentioned, and if you are not concerned about IE compatibility then you can use them for true concurrency. However there are some severe limitations on their use imposed for security reasons. For one they cannot interact with the DOM in any way, meaning any changes to the page still must be done synchronously. Also all data passed to and from workers is serialized in transit, meaning true Javascript objects cannot be used. That being said, for intensive data processing, Web Workers are probably a better solution than breaking a function up into multiple timer delayed tasks.
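A minimal Worker sketch; the file name is illustrative, and note that the data crossing postMessage is copied (structured clone), not shared:
// main.js (runs on the page)
const worker = new Worker('iterate-worker.js');
worker.onmessage = (e) => console.log('sum from worker:', e.data);
worker.postMessage([1, 2, 3, 4, 5]);        // the array is cloned on the way over
// iterate-worker.js (runs in the worker thread, no DOM access here)
self.onmessage = (e) => {
  const sum = e.data.reduce((a, b) => a + b, 0);
  self.postMessage(sum);
};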
One new development in this field is HTML5 Web Workers.
JavaScript is normally single threaded; you cannot do several things at once. If your JavaScript code is too slow, you will need to offload the work. The new way is to use web workers, as others have noted. The old way is often to use AJAX and do the work on the server instead. (Either with web workers or with AJAX, the arrays would have to be serialized and the result deserialized)
I have to agree with MooGoo; I also wonder why you would run through such a big array in one go.
There's an extension to JavaScript called StratifiedJS; it allows you to do multiple things at once as long as they're asynchronous. Also, Web Workers are an awkward "solution" that just makes things more complicated, and they don't work in IE.
In StratifiedJS you could just write.
waitfor {
// do something long lasting here...
}
and {
// do something else at the same time...
}
// and when you get here, both are done
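In plain modern JavaScript (not StratifiedJS), the closest analogue to that waitfor/and block is Promise.all; a sketch with illustrative delays standing in for real async work:
const longLasting = new Promise((resolve) => setTimeout(() => resolve('first done'), 500));
const somethingElse = new Promise((resolve) => setTimeout(() => resolve('second done'), 300));
Promise.all([longLasting, somethingElse]).then(([a, b]) => {
  // and when you get here, both are done
  console.log(a, b);
});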

How to activate two JavaScript functions in parallel?

Can anyone tell me how to activate two (or more) JavaScript AJAX functions in parallel?
This is not possible. JavaScript can only work in a single thread, and there is no way to actually have two functions running in parallel. You need to make one call and then the other. Their callbacks will be called (not necessarily in the same order as the invocations) when data has been returned or an error/timeout occurs. Only when one callback completes will the next one be allowed to run.
Also keep in mind that browsers restrict the number of active AJAX calls. So if you try to make too many AJAX calls, one might wait (blocking all JavaScript code) for other calls to complete.
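To illustrate: the two requests below are in flight at the same time, even though their callbacks run one after another on the single thread (the URLs are placeholders):
var remaining = 2;
function done() {
  remaining -= 1;
  if (remaining === 0) {
    console.log('both responses have been handled');
  }
}
function getAsync(url, onLoad) {
  var xhr = new XMLHttpRequest();
  xhr.onload = function () { onLoad(xhr.responseText); done(); };
  xhr.open('GET', url);
  xhr.send();                      // returns immediately; the response arrives later
}
getAsync('/data/first.json', function (text) { console.log('first:', text.length); });
getAsync('/data/second.json', function (text) { console.log('second:', text.length); });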
Search for Web Workers. These are kind of a new feature in modern browsers and may not be available in old ones.
Is this what you're looking for?
setTimeout(function () { JsFunction1(val); }, 0);
setTimeout(function () { JsFunction2(val); }, 0);
Use Web Workers to run tasks in parallel.
You can find a tutorial here: https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Using_web_workers
Also, this library, which takes advantage of Web Workers, came up pretty fast on Google: https://parallel.js.org/
Using several setInterval calls may make parallel running possible, though it may still run on a single core. The following code is an example of parallelizing a function func over an array of data datas. To run it, use Parallel(func, datas), where func is the name of your global function and datas is an array of data, each item being an input for func.
var i_array = [];
function Parallel(func, datas) {
  // requires jQuery for $().each; each timer fires once, clears itself, then runs func on its item
  $(datas).each(function (i, v) {
    i_array[i] = setInterval(function () {
      clearInterval(i_array[i]);
      window[func](datas[i]);
    }, 10);
  });
}
Here is a jsfiddle test. The integer timestamps show that the two AJAX calls were running in parallel.
Use window.open() to open a new page that calls the first JS function, then call the second function right after window.open(). You are not technically waiting for the first function to complete; you just wait for window.open() to execute, and then the second function runs.
JavaScript runs on a single thread. If the work you want to do doesn't involve any I/O, then it's simply not possible; if an I/O operation is involved, you can execute the two functions one after the other, and the very nature of JavaScript means it will start executing the next function while it waits for the I/O.
Usually, in languages that support threading, the same thing is achieved automatically during a thread's CPU-idle period.
