Async programming in JavaScript - Promise.all() does not work as expected - javascript

I'm stuck on async programming in JavaScript.
I know that Promise.all() runs its promises in parallel.
Could you please tell me what I'm doing wrong in the code below?
It should take about 100ms, but it actually takes 200ms :(
// 1. Define functions here
var getFruit = async (name) => {
  const fruits = {
    pineapple: ":pineapple:",
    peach: ":peach:",
    strawberry: ":strawberry:"
  };
  await fetch('https://jsonplaceholder.typicode.com/photos'); // about 100ms
  return fruits[name];
};
var makeSmoothie = async () => {
  const a = getFruit('pineapple');
  const b = getFruit('strawberry');
  const smoothie = await Promise.all([a, b]);
  return smoothie;
  //return [a, b];
};

// 2. Execute code here
var tick = Date.now();
var log = (v) => console.log(`${v} \n Elapsed: ${Date.now() - tick}`);
makeSmoothie().then(log);

Your logic is fine; it is running as parallel as possible from the client side.
You can test this by waiting on setTimeout instead of the fetch:
await new Promise(resolve => setTimeout(resolve, 100));
It's possible the placeholder site is queuing connections on its side. Either way, you should measure with something predictable, not the network.
Edit:
Just to explain a bit more that won't fit in a comment.
Let's put my waiting trick into an actual function:
const wait = ms => new Promise(resolve => setTimeout(resolve, ms));
OK, now you would call this as await wait(100). Now to compare:
await wait(100)
await fetch(...)
In terms of your code setup, they are the same: they perform some async task that takes around 100ms and then returns. If Promise.all() were not running these in parallel, the wait(100) version would certainly take 200ms or longer.
fetch is not as reliable a test as setTimeout because it runs over the network. There are a lot of things you can't control with this call, like:
Some browsers limit how many parallel connections are made to the same domain
Lag in DNS or the server itself, typical network hiccups
The domain itself might force 1 connection at a time from your IP
It's not clear exactly what is causing the apparent synchronous behavior here. Maybe you will find some answers in the network panel. But in terms of code, there is nothing else for you to do. It is provably parallelized with the setTimeout test, which is much more reliable as a test since it is just a local timer.
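For reference, here is the whole test with the fetch swapped out for the wait helper (a self-contained sketch of the setTimeout approach above); it should log an elapsed time of roughly 100ms, not 200ms:

const wait = ms => new Promise(resolve => setTimeout(resolve, ms));

const getFruit = async (name) => {
  const fruits = { pineapple: ":pineapple:", peach: ":peach:", strawberry: ":strawberry:" };
  await wait(100); // stands in for the ~100ms fetch
  return fruits[name];
};

const makeSmoothie = async () => {
  const a = getFruit('pineapple');
  const b = getFruit('strawberry');
  return Promise.all([a, b]); // both 100ms waits run concurrently
};

const tick = Date.now();
makeSmoothie().then(v => console.log(`${v} \n Elapsed: ${Date.now() - tick}`));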

Related

Replicating Golang's synchronous WaitGroup in JavaScript (node.js) using the Atomics API?

I am trying to replicate Golang's WaitGroup in JavaScript using the Atomics API.
The reason I need this is to add control in certain callback APIs that don't offer the ability to have async callbacks.
This is specifically for Node.js
A basic example:
import sqlite3 from 'sqlite3'

const db = new sqlite3.Database('./data.db')
const sleep = ms => new Promise(res => setTimeout(res, ms))

db.each('SELECT * from my_table', async (err, row) => {
  console.log('start', row)
  await sleep(2000)
  console.log('done', row)
});
Each iteration will not wait for the Promise returned by the callback; instead it flips through all of the rows immediately. This API does not offer me the ability to go row by row to minimize CPU overhead.
While this is one example and perhaps the answer is to open a PR to the library, there are many similar examples and it's more practical to have a solution I can implement on my end to address this while raising an issue with the maintainers.
I have since discovered the new Atomics API, which offers the ability to put threads to sleep.
As an example, here is a thread-blocking sleep for 500ms.
const sleep = milliseconds => Atomics.wait(new Int32Array(new SharedArrayBuffer(4)), 0, 0, milliseconds)
sleep(500)
Obviously it's not ideal to use thread blocking techniques but in this case I believe it's appropriate.
I would like to produce a wrapper around Atomics that allows for the same control-flow management as Golang's sync.WaitGroup.
Assuming the following interface, producing such a wrapper is challenging:
interface SyncWaitGroup {
  new (ticks: number): SyncWaitGroup
  done(): void
  wait(): void
}
Where its usage would be
db.each('SELECT * from my_table', (err, row) => {
  console.log('start', row)
  const wg = new SyncWaitGroup(1) // WaitGroup wants 1 call to done()
  setTimeout(() => {
    wg.done() // This call will satisfy the required number of done() calls
  }, 2000)
  wg.wait() // synchronously wait for all the done() calls
  console.log('done', row)
});
Using this strategy to force the callback to execute synchronously would allow me to write a wrapper that converts db.each into an AsyncIterable, which I could then use idiomatically.
The issue I am having is understanding the Atomics API. Attempting something like the following fails, because the thread (including the event loop) is sleeping, so the scheduled mutation of the shared array never happens:
const int32 = new Int32Array(new SharedArrayBuffer(4))

setTimeout(() => {
  // Lives on the same thread so it's blocked
  Atomics.store(int32, 0, 1)
  Atomics.notify(int32, 0);
  console.log('fired') // Does not run
}, 2000)

Atomics.wait(int32, 0, 0)
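The setTimeout callback can never run here, because the only thread that could run it is the one blocked in Atomics.wait; the store/notify has to happen on a different thread. A minimal CommonJS sketch of that, assuming Node's worker_threads module, where a worker does the store/notify while the main thread blocks:

const { Worker } = require('worker_threads')

const int32 = new Int32Array(new SharedArrayBuffer(4))

// The notify has to come from another thread, so run it in a worker.
const worker = new Worker(`
  const { workerData } = require('worker_threads')
  setTimeout(() => {
    Atomics.store(workerData, 0, 1)
    Atomics.notify(workerData, 0)
  }, 2000)
`, { eval: true, workerData: int32 })

console.log('main thread blocking...')
Atomics.wait(int32, 0, 0) // returns once the worker stores a new value and notifies
console.log('main thread woken') // runs after ~2 seconds
worker.unref()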

How to "queue" requests to run all the time without setInterval?

I am reading data in real time from a device through HTTP requests; however, the way I am currently doing it is something like this:
setInterval(async () => {
  for (let request of requests) {
    await request.GetData();
  }
}, 1000);
However, sometimes there is lag in the network, and since there are 4-5 requests, they don't always finish within a second, so they start stacking up until the device eventually starts to time out. I need to somehow get rid of the setInterval; increasing the interval is not an option.
Essentially, I want to put them in an infinite loop with an inner timer that lets the requests run again once half a second or a second has passed since the last run. But how do I run an infinite loop without blocking the rest of the application?
Or maybe there is a way to make setInterval wait for all requests to finish before it starts counting the 1-second interval?
Try:
(async () => {
  while (true) {
    for (let request of requests) {
      await request.GetData();
    }
    await new Promise((resolve) => setTimeout(resolve, 1000));
  }
})();
This will only start waiting once all the requests are finished, preventing them from stacking up.
Alternatively, on the inside of the async function, use:
while (true) {
  await Promise.all(requests.map(request => request.GetData()));
  await new Promise((resolve) => setTimeout(resolve, 1000));
}
This is different because all the calls to request.GetData() will run concurrently, which may or may not be what you want.
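Since the question mentions the device timing out, it may also be worth guarding each iteration so one failed request doesn't stop the loop. A small sketch of that, assuming request.GetData() rejects on a timeout:

(async () => {
  while (true) {
    try {
      for (let request of requests) {
        await request.GetData();
      }
    } catch (err) {
      console.error('polling iteration failed:', err); // keep looping anyway
    }
    await new Promise((resolve) => setTimeout(resolve, 1000));
  }
})();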

How to manually pause and execute a batch of records using map and promises in JavaScript?

function sleep(ms) {
  var start = new Date().getTime(),
    expire = start + ms;
  while (new Date().getTime() < expire) {}
  return;
}
async function executeWithDelay(offers) {
  return Promise.all(
    offers.map((offer, i) =>
      getDetailedInfo(offer).then(data => {
        offer = data;
        if (i % 5 === 0) {
          console.log('executed but it delays now for 3 seconds');
          sleep(3000);
        }
      })
    )
  ).then(function(data) {
    return offers;
  });
}
I am trying to achieve web scraping with the best solution available. I am combining cheerio and puppeteer together, and I have some code to debug. The above code works fine if the offers data is small, i.e. 5-10 records. If it exceeds that, the browser crashes, even though I am running in headless mode.
This is basically due to an in-house bug in the package I am using, which is unable to handle the load. My plan is to add a little delay for every batch of records and still complete execution of the full set of records.
But with this code, it behaves as non-blocking code: the first time it delays, and the rest executes continuously.
Should I use a for loop instead of map, or is there any alternative to handle this situation?
You never want to "sleep" with a busy-wait loop like that. It means nothing else can be done on the main thread.
Once the operations the promises represent have been started, unless they provide some custom means of pausing them, you can't pause them. Remember, promises don't actually do anything; they provide a standard way to observe something being done.
Another problem with that code is that you're returning offers, not the results of calling getDetailedInfo.
If the problem is calling getDetailedInfo too often, you can insert a delay (using setTimeout, not a busy-loop). For instance, this does the first request to getDetailedInfo almost immediately, the next after one second, the next after two seconds, and so on:
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

function executeWithDelay(offers) {
  return Promise.all(
    offers.map((offer, i) =>
      sleep(i * 1000).then(() => getDetailedInfo(offer))
    )
  );
}
Obviously, you can adjust that delay as required.
(Note that executeWithDelay isn't declared async; no need, since it has to use Promise.all anyway.)
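If the intent is specifically to pause after every batch of 5 offers (as the i % 5 === 0 check in the question suggests) rather than stagger each request by one second, a variation of the same non-blocking idea could delay by batch index instead. A sketch, still using the question's getDetailedInfo:

function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

function executeWithDelay(offers) {
  return Promise.all(
    offers.map((offer, i) =>
      // offers 0-4 start immediately, 5-9 after 3s, 10-14 after 6s, ...
      sleep(Math.floor(i / 5) * 3000).then(() => getDetailedInfo(offer))
    )
  );
}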

Are asynchronous actions in Node.js needed only when doing a server application?

For example in the Node.js app I'm writing there's a piece of code along these lines wrapped in an async function:
await a(param1);
await b(param2);
await c(param3);
await d(param4);
As far as I understand, this would be good if my app were a server: for example, user 1, who sends a request to my server, may be at stage await a(param1), and maybe this request takes a lot of time, so another user 2 who is at stage await b(param2) can still proceed with their own request (not having to wait for the resolution of user 1's request).
However, if it's only one user using my app, I don't see any advantage in using asynchronous code, because using await turns async code into "sync" code: function b will not proceed until a has finished.
Is my understanding correct?
EDIT: functions a, b, c, d return promises and the next promise depends on the previous.
Is my understanding correct?
Not if a, b, c, and d return promises (implicitly, because they're also async functions, or explicitly), which presumably they do if you're using await on them. async/await don't make asynchronous code synchronous (that's impossible), they let you write your code in its logical flow rather than its temporal flow.
If you remove those awaits, you'll change the logic of the code: Instead of running a to completion, then b, then c, then d, it will start all of them and they'll all overlap. So instead of this:
const rnd = () => Math.floor(Math.random() * 800);
const runner = name => new Promise(resolve => {
  console.log(name + " start");
  setTimeout(() => {
    console.log(name + " end");
    resolve();
  }, rnd())
});
const a = () => runner("a");
const b = () => runner("b");
const c = () => runner("c");
const d = () => runner("d");

(async () => {
  await a();
  await b();
  await c();
  await d();
  console.log("after all calls");
})(); // In real code you'd catch errors
you'd get this:
const rnd = () => Math.floor(Math.random() * 800);
const runner = name => new Promise(resolve => {
  console.log(name + " start");
  setTimeout(() => {
    console.log(name + " end");
    resolve();
  }, rnd())
});
const a = () => runner("a");
const b = () => runner("b");
const c = () => runner("c");
const d = () => runner("d");

(async () => {
  a();
  b();
  c();
  d();
  console.log("after all calls");
})(); // In real code you'd catch errors
That probably isn't what that code is meant to do, regardless of whether it's on a server or in a single-user app.
In terms of using Node.js's various "Sync" methods (fs.readFileSync instead of fs.readFile, etc.), then yes, if you don't need the Node.js process to do anything else while it's reading the file (e.g., it doesn't have to do any other processing, like handling other requests as a server), you could use those in that single-user app situation.
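For example, in a one-off CLI script where nothing else needs to happen while a file is read, the blocking variant is often perfectly reasonable (a sketch with a hypothetical config file):

const fs = require("fs");

// Blocks the whole process until the file has been read; fine for a simple script.
const text = fs.readFileSync("./config.json", "utf8");
console.log(text.length);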
The purpose of asynchronous code is not just to allow multiple users to use the application, but to allow multiple tasks to be performed concurrently on a single thread.
Even if a single user uses the application (e.g. a web server), that user may make simultaneous requests; browsers are able to make simultaneous requests, and they do. If the code is synchronous and blocking, the web server won't respond until the synchronous routine has completed.
Even if a synchronous control flow is currently suitable for a non-web application, there's no guarantee that it won't need to be rewritten as asynchronous in the future; a spinner indicator for a CLI application is one example.

However, if it's only one user using my app, I don't see any advantage in using asynchronous code, because using await turns async code into "sync" code: function b will not proceed until a has finished.

Only if a, etc. return promises. Omitting await would result in concurrent and uncontrollable promises in this case. In any other case, await doesn't change how the code works, except that it introduces a one-tick delay.
As another answer mentions, await is syntactic sugar for then. It doesn't make the code synchronous, but it performs promise operations in series, similarly to how synchronous code works.
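If the calls really were independent of each other (the question's edit says they are not), the omitted-await version could still be kept under control by collecting the promises with Promise.all. A sketch, reusing a, b, c, d and the params from the question:

(async () => {
  // Start all four tasks at once, then wait for every one of them to finish.
  const results = await Promise.all([a(param1), b(param2), c(param3), d(param4)]);
  console.log("after all calls", results);
})(); // In real code you'd catch errors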

Is it possible to halt execution of javascript for a finite amount of time?

Is it possible to halt execution of JavaScript for a finite amount of time? await is the closest thing I came across, but it still does not stop the current execution of JavaScript. I actually tried busy-waiting on a variable, but in that case it was never able to come out of the busy-waiting loop. Also, I don't even know whether halting/stopping is ever possible in JavaScript, since it is single-threaded.
Exact scenario in which I am trying :
inside a callback() {
  <make 2 ajax requests, wait for responses>
  <stop for some time until one of the responses is received>
  return <redirect page using the response received>
}
Note that the callback can be called only once, and it should return a redirect URL; if nothing is returned, the page redirects to a default fallback. So I want to stop for some time to actually wait for events inside this callback.
You can certainly achieve it in an async scenario (by simulating it), since you postpone your task to a certain time in the future, when the event loop will pick it up.
Async/Await Example:
const sleep = t => new Promise(res => setTimeout(res, t))

const main = async _ => {
  /* do something */
  await sleep(3000) // 3 seconds
  /* rest of the stuff */
}
Promise way:
const sleep = t => new Promise(res => setTimeout(res, t))

const main = _ => {
  doSometasks().then(_ => sleep(3000)).then(someClosingTasks)
}
Note: doSometasks() must return a promise (someClosingTasks() may or may not return a promise)
I think you could create a JS generator and (re)schedule its execution with setTimeout until the generator completes.
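A rough sketch of that generator idea: a tiny runner that treats each yielded number as a delay and reschedules the generator with setTimeout:

function run(genFn) {
  const it = genFn();
  (function step() {
    const { value: delay, done } = it.next();
    if (!done) setTimeout(step, delay); // resume the generator after the yielded delay
  })();
}

run(function* () {
  console.log('before');
  yield 3000; // "pause" here for 3 seconds without blocking the event loop
  console.log('after');
});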
You can use synchronous XHR requests to block execution without a busy loop, but in any case the condition will not have a chance to change while JS is blocked in your method, so you have to allow execution of microtasks/macrotasks, depending on what you really need.
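For completeness, a synchronous XHR looks like this (deprecated on the main thread, and it freezes the page while it runs, so use with care); a sketch with a placeholder URL:

const xhr = new XMLHttpRequest();
xhr.open('GET', 'https://example.com/endpoint', false); // third argument false = synchronous
xhr.send();
console.log(xhr.status, xhr.responseText); // runs only after the response has arrived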
