How to "multicast" an async iterable? - javascript

Can an async generator be somehow broadcast or multicast, so that all its iterators ("consumers"? subscribers?) receive all values?
Consider this example:
const fetchMock = () => "Example. Imagine real fetch";

async function* gen() {
  for (let i = 1; i <= 6; i++) {
    const res = await fetchMock();
    yield res.slice(0, 2) + i;
  }
}

const ait = gen();

(async () => {
  // first "consumer"
  for await (const e of ait) console.log('e', e);
})();

(async () => {
  // second...
  for await (const é of ait) console.log('é', é);
})();
Iterations "consume" a value, so only one or the other gets it.
I would like for both of them (and any later ones) to get every yielded value, if such a generator is possible to create somehow. (Similar to an Observable.)

This is not easily possible. You will need to explicitly tee it. This is similar to the situation for synchronous iterators, just a bit more complicated:
const AsyncIteratorProto = Object.getPrototypeOf(Object.getPrototypeOf(async function*(){}.prototype));

function teeAsync(iterable) {
  const iterator = iterable[Symbol.asyncIterator]();
  // one buffer of pending results per branch
  const buffers = [[], []];
  function makeIterator(buffer, i) {
    return Object.assign(Object.create(AsyncIteratorProto), {
      next() {
        if (!buffer) return Promise.resolve({done: true, value: undefined});
        if (buffer.length) return buffer.shift();
        // nothing buffered: pull from the source and share the result
        // (a promise) with the other branch's buffer
        const res = iterator.next();
        if (buffers[i^1]) buffers[i^1].push(res);
        return res;
      },
      async return() {
        if (buffer) {
          buffer = buffers[i] = null;
          // close the source only once both branches are done
          if (!buffers[i^1]) await iterator.return();
        }
        return {done: true, value: undefined};
      },
    });
  }
  return buffers.map(makeIterator);
}
You should ensure that both iterators are consumed at about the same rate so that the buffer doesn't grow too large.
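For example, here is a minimal usage sketch assuming the gen() generator from the question; both branches now receive every value:
const [a, b] = teeAsync(gen());

(async () => {
  for await (const e of a) console.log('first', e);
})();

(async () => {
  for await (const e of b) console.log('second', e);
})();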

Here's a solution using Highland as an intermediary. From the docs:
A stream forked to multiple consumers will pull values, one at a time, from its source as only fast as the slowest consumer can handle them.
import _ from 'lodash'
import H from 'highland'

export function fork<T>(generator: AsyncGenerator<T>): [
  AsyncGenerator<T>,
  AsyncGenerator<T>
] {
  const source = asyncGeneratorToHighlandStream(generator).map(x => _.cloneDeep(x));
  return [
    highlandStreamToAsyncGenerator<T>(source.fork()),
    highlandStreamToAsyncGenerator<T>(source.fork()),
  ];
}

async function* highlandStreamToAsyncGenerator<T>(
  stream: Highland.Stream<T>
): AsyncGenerator<T> {
  for await (const row of stream.toNodeStream({ objectMode: true })) {
    yield row as unknown as T;
  }
}

function asyncGeneratorToHighlandStream<T>(
  generator: AsyncGenerator<T>
): Highland.Stream<T> {
  return H(async (push, next) => {
    try {
      const result = await generator.next();
      if (result.done) return push(null, H.nil);
      push(null, result.value);
      next();
    } catch (error) {
      return push(error);
    }
  });
}
Usage:
const [copy1, copy2] = fork(source);
Works in Node, browser untested.

I built a library to do this here: https://github.com/tjenkinson/forkable-iterator
It means you can do something like:
import { buildForkableIterator, fork } from 'forkable-iterator';

function* Source() {
  yield 1;
  yield 2;
  return 'return';
}

const forkableIterator = buildForkableIterator(Source());

console.log(forkableIterator.next()); // { value: 1, done: false }

const child1 = fork(forkableIterator);

// { value: 2, done: false }
console.log(child1.next());
// { value: 2, done: false }
console.log(forkableIterator.next());

// { value: 'return', done: true }
console.log(child1.next());
// { value: 'return', done: true }
console.log(forkableIterator.next());
If you no longer need to keep consuming from a fork then, provided you lose all references to it, there also shouldn't be a memory leak.

Related

How to build an object tree with recursive API calls?

I want to construct a tree where each node is used in an API call to get the children nodes, starting at the root, and this is done recursively until it reaches the TREE_DEPTH_LIMIT.
export const search = async (searchTerm) => {
  try {
    const tree = {};
    await createTree(searchTerm, tree);
    return tree;
  } catch (err: any) {}
};

const TREE_DEPTH_LIMIT = 3;

const createTree = async (searchTerm, tree) => {
  if (counter === TREE_DEPTH_LIMIT) {
    counter = 0;
    return;
  }
  counter++;
  tree[searchTerm] = {};
  try {
    const res = await axiosInstance.get(
      `/query?term=${searchTerm}`
    );
    // res.data.terms is an array of strings
    res.data.terms.forEach((term) => {
      createTree(term, tree[searchTerm]);
    });
  } catch (err) {}
};
I am trying to do this recursively in the createTree() function above. It will use the searchTerm in the API call, then loop through res.data.terms and call createTree() on each of the terms. But the output is not what I was expecting.
This is the output:
const tree = {
  apple: {
    apple_tree: {},
    tree: {},
  },
};
The expected output: (because the TREE_DEPTH_LIMIT is 3, it should have 3 levels in the tree)
const tree = {
  apple: {
    apple_tree: {
      leaf: {},
    },
    tree: {
      trunk: {},
    },
  },
};
Or is my solution completely incorrect, and should I be going for another approach?
Some issues:
counter seems to be a global variable, but that will not work out well as at each return from recursion, counter should have its value restored. It is better to use a local variable for that, so that every execution context has its own version for it. Even better is to make it a parameter and let it count down instead of up.
The recursive call is not awaited, so in search the promise returned by createTree may resolve before all the children have been populated, and so you would work with an incomplete tree.
Not a real problem, but it is not the most elegant that the caller must create a tree object and pass it as argument. I would redesign the functions so that search will create that object itself, and then use a recursive function to create children -- so I'd name that function createChildren, instead of createTree.
Here is a snippet that first mocks the get method so you can actually run it:
// Mock for this demo
const axiosInstance = {
  async get(term) {
    const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
    await delay(50);
    return {
      data: {
        terms: {
          "apple": ["apple_tree", "tree"],
          "apple_tree": ["leaf"],
          "leaf": [],
          "tree": ["trunk"],
          "trunk": []
        }[term.split("=")[1]]
      }
    };
  }
};

const createChildren = async (searchTerm, depth) => {
  if (depth-- <= 0) return {};
  try {
    const res = await axiosInstance.get(`/query?term=${searchTerm}`);
    const promises = res.data.terms.map(async (term) =>
      [term, await createChildren(term, depth)]
    );
    return Object.fromEntries(await Promise.all(promises));
  } catch (err) {
    console.log(err);
  }
};

const TREE_DEPTH_LIMIT = 3;

const search = async (searchTerm, depth = TREE_DEPTH_LIMIT) =>
  ({ [searchTerm]: await createChildren(searchTerm, depth) });

// Demo
search("apple").then(tree =>
  console.log(tree)
);
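With the mocked data above, the demo logs the three-level tree from the expected output:
// { apple: { apple_tree: { leaf: {} }, tree: { trunk: {} } } }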

Async generator class stuck on infinite loop javascript

I'm trying to get the following async generator to work:
class MyIterator {
  constructor(m) {
    this.collection = m;
  }
  async *[Symbol.iterator]() {
    for (let item of this.collection) {
      const resultItem = await Promise.resolve(item)
      console.log("item: ", resultItem)
      yield resultItem
    }
  }
}

(async () => {
  const iterator = new MyIterator([1,2,3])
  let times = 0
  for await (let thing of iterator) {
    console.log("thing: ", thing)
    // this is here to avoid an infinite loop
    times++
    if (times > 1000) break
  }
})()
But it ends up in an infinite loop, and thing is always undefined.
item: 1
thing: undefined
item: 2
thing: undefined
item: 3
thing: undefined (x999)
I've tried similar code, but this time without the Promise/async behaviour, and it seems to work just fine.
class MyIterator {
  constructor(m) {
    this.collection = m;
  }
  *[Symbol.iterator]() {
    for (let item of this.collection) {
      console.log("item: ", item)
      yield item
    }
  }
}

const iterator = new MyIterator([1,2,3])
for (let thing of iterator) {
  console.log("thing: ", thing)
}
item: 1
thing: 1
item: 2
thing: 2
item: 3
thing: 3
The for await..of construct will attempt to iterate over an async iterator.
An async iterator is defined using the @@asyncIterator well-known symbol:
class MyIterator {
  constructor(m) {
    this.collection = m;
  }
  async *[Symbol.asyncIterator]() { //<-- this is async
    for (let item of this.collection) {
      const resultItem = await Promise.resolve(item)
      //console.log("item: ", resultItem)
      yield resultItem
    }
  }
}

(async () => {
  const iterator = new MyIterator([1,2,3])
  let times = 0
  for await (let thing of iterator) {
    //no infinite loop
    console.log("thing: ", thing)
  }
})()
for await..of can also consume plain iterables that produce promises:
const promiseArray = [Promise.resolve("a"), Promise.resolve("b"), Promise.resolve("c")];

(async function() {
  for await (const item of promiseArray) {
    console.log(item);
  }
})()
Attempting to make a regular iterator an async method/function does not work.
If you want to keep your @@iterator-defined method, your best choice is to make it produce promises instead:
class MyIterator {
  constructor(m) {
    this.collection = m;
  }
  *[Symbol.iterator]() { // not async
    for (let item of this.collection) {
      yield Promise.resolve(item); // produce a promise
    }
  }
}

(async () => {
  const iterator = new MyIterator([1,2,3])
  let times = 0
  for await (let thing of iterator) {
    console.log("thing: ", thing)
  }
})()
Although, that might be bad practice if any of the promises rejects: the rejection happens before the loop reaches that item, so the browser reports it as unhandled even though the loop does eventually catch it:
const wait = (ms, val) =>
  new Promise(res => setTimeout(res, ms, val));
const fail = (ms, val) =>
  new Promise((_, rej) => setTimeout(rej, ms, val));

const arr = [
  wait(100, 1),
  wait(150, 2),
  fail(0, "boom"),
  wait(200, 3)
];

(async function(){
  try {
    for await (const item of arr) {
      console.log(item);
    }
  } catch (e) {
    console.error(e);
  }
})()

/* result in the browser console:
Uncaught (in promise) boom
1
2
boom
*/
However, be aware that there is a difference in semantics between these:
A regular iterator produces an IteratorResult - an object with value and done properties.
const syncIterable = {
  [Symbol.iterator]() {
    return {
      next() {
        return {value: 1, done: true}
      }
    }
  }
}

const syncIterator = syncIterable[Symbol.iterator]();
console.log("sync IteratorResult", syncIterator.next());
An async iterator's next() produces a promise for an IteratorResult:
const asyncIterable = {
  [Symbol.asyncIterator]() {
    return {
      next() {
        return Promise.resolve({value: 2, done: true});
      }
    }
  }
}

const asyncIterator = asyncIterable[Symbol.asyncIterator]();
asyncIterator.next().then(result => console.log("async IteratorResult", result));
Finally, an iterator that produces promises will have an IteratorResult where value is a promise:
const promiseSyncIterable = {
  [Symbol.iterator]() {
    return {
      next() {
        return {value: Promise.resolve(3), done: true}
      }
    }
  }
}

const promiseSyncIterator = promiseSyncIterable[Symbol.iterator]();
const syncPromiseIteratorResult = promiseSyncIterator.next();
console.log("sync IteratorResult with promise", syncPromiseIteratorResult);
syncPromiseIteratorResult.value
  .then(value => console.log("value of sync IteratorResult with promise", value));
Side-note on nomenclature: MyIterator is not an iterator. An iterator is an object with a next() method which produces an IteratorResult. An object that you can iterate over has an @@iterator (or @@asyncIterator) method and is called an iterable (or async iterable, respectively).
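A trivial sketch of that distinction:
const iterable = [1, 2, 3];                   // has an @@iterator method
const iterator = iterable[Symbol.iterator](); // has a next() method
console.log(iterator.next()); // { value: 1, done: false }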
As @VLAZ pointed out in a comment to my question, I was using Symbol.iterator instead of Symbol.asyncIterator. The following implementation works as expected:
class MyIterator {
  constructor(m) {
    this.collection = m;
  }
  async *[Symbol.asyncIterator]() {
    for (let item of this.collection) {
      const resultItem = await Promise.resolve(item)
      console.log("item: ", resultItem)
      yield resultItem
    }
  }
}

(async () => {
  const iterator = new MyIterator([1,2,3])
  for await (let thing of iterator) {
    console.log("thing: ", thing)
  }
})()

Wrap a resultset callback function with a generator/iterator

I'm working on converting a legacy callback-based API into an async library. But I just can't wrap my head around getting a "resultset" to work as a generator (Node 10.x).
The original API works like this:
api.prepare((err, rs) => {
  rs.fetchRows(
    (err, row) => {
      // this callback is called as many times as rows exist
      console.log("here's a row:", row);
    },
    () => {
      console.log("we're done, data exhausted");
    }
  );
});
But here is how I want to use it:
const wrapped = new ApiWrapper(api);
const rs = await wrapped.prepare({});

for (let row of rs.rows()) {
  console.log("here's a row:", row);
}

let row;
while (row = await rs.next()) {
  console.log("here's a row:", row);
}
I thought I had it under control with generators, but it looks like you cannot use yield inside a callback. It actually seems logical if you think about it.
class ApiWrapper {
  constructor(api) {
    this.api = api;
  }
  prepare() {
    return new Promise((resolve, reject) => {
      this.api.prepare((err, rs) => {
        if (err) {
          reject(err);
        } else {
          resolve(rs);
        }
      });
    });
  }
  *rows() {
    this.api.fetchRows((err, row) => {
      if (err) {
        throw err;
      } else {
        yield row; // nope, not allowed here
      }
    });
  }
  next() { ... }
}
So what alternatives do I have?
Important: I don't want to store anything in an array then iterate that, we're talking giga-loads of row data here.
Edit
I'm able to simulate the behavior I want using stream.Readable but it warns me that it's an experimental feature. Here's a simplified array-based version of the issue I'm trying to solve using stream:
const stream = require('stream');

function gen(){
  const s = new stream.Readable({
    objectMode: true,
    read(){
      [11, 22, 33].forEach(row => {
        this.push({ value: row });
      });
      this.push(null)
    }
  });
  return s;
}

for await (let row of gen()) {
  console.log(row);
}
// { value: 11 }
// { value: 22 }
// { value: 33 }
(node:97157) ExperimentalWarning: Readable[Symbol.asyncIterator] is an experimental feature. This feature could change at any time
I finally realized I needed something similar to Go's channels that were async/await compatible. Basically the answer is to synchronize an async iterator and a callback, making them wait for each other as next() iterations are consumed.
The best (Node) native solution I found was to use a stream as an iterator, which is supported in Node 10.x but tagged experimental. I also tried to implement it with the p-defer NPM module, but that turned out to be more involved than I expected. Finally I ran across the https://www.npmjs.com/package/@nodeguy/channel module, which was exactly what I needed:
const Channel = require('@nodeguy/channel');

class ApiWrapper {
  // ...
  rows() {
    const channel = new Channel();
    const iter = {
      [Symbol.asyncIterator]() {
        return this;
      },
      async next() {
        const val = await channel.shift();
        if (val === undefined) {
          return { done: true };
        } else {
          return { done: false, value: val };
        }
      }
    };
    this.api.fetchRows(async (err, row) => {
      await channel.push(row);
    }).then(() => channel.close());
    return iter;
  }
}

// then later
for await (let row of rs.rows()) {
  console.log(row)
}
Note how the core of each iterating function, next() and rows(), has an await that throttles how much data can be pushed across the channel; otherwise the producing callback could push data uncontrollably into the channel queue. The idea is that the callback should wait for data to be consumed by the iterator's next() before pushing more.
Here's a more self-contained example:
const Channel = require('@nodeguy/channel');

function iterating() {
  const channel = Channel();
  const iter = {
    [Symbol.asyncIterator]() {
      return this;
    },
    async next() {
      console.log('next');
      const val = await channel.shift();
      if (val === undefined) {
        return { done: true };
      } else {
        return { done: false, value: val };
      }
    }
  };
  [11, 22, 33].forEach(async it => {
    await channel.push(it);
    console.log('pushed', it);
  });
  console.log('returned');
  return iter;
}

(async function main() {
  for await (let it of iterating()) {
    console.log('got', it);
  }
})();

/*
returned
next
pushed 11
got 11
next
pushed 22
got 22
next
pushed 33
got 33
next
*/
Like I said, Streams and/or Promises can be used to implement this, but the Channel module solves some of the complexity that make it more intuitive.
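For reference, the synchronization idea can be sketched without the library. This is a minimal, illustrative channel (the names are mine, and unlike @nodeguy/channel it applies no backpressure on push), just to show how producer and consumer wait for each other:
function makeChannel() {
  const values = []; // buffered values waiting for a consumer
  const takers = []; // pending consumers waiting for a value
  return {
    push(value) {
      // hand the value to a waiting consumer, or buffer it
      if (takers.length) takers.shift()(value);
      else values.push(value);
    },
    shift() {
      // take a buffered value, or wait for the next push
      if (values.length) return Promise.resolve(values.shift());
      return new Promise(resolve => takers.push(resolve));
    },
  };
}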
The original question has two nested callback-taking async functions:
api.prepare((err, rs) => ...)
rs.fetchRows((err, row) => ...)
The first one runs the callback only once so just promisifying it as follows is sufficient.
function promisified(f){
  return new Promise((v, x) => f((err, res) => err ? x(err) : v(res)));
}
However, the second function will invoke its callback multiple times, and we wish to generate an async iterable from it so that we can consume it in a for await...of loop.
This is also possible by employing async generators, as follows:
async function* rowEmitterGenerator(rs){
  let _v,  // previous resolve
      _x,  // previous reject
      _row = new Promise((v, x) => (_v = v, _x = x));
  const done = Symbol("done");
  rs.fetchRows(
    (err, row) => ( err ? _x(err) : _v(row)
                  , _row = new Promise((v, x) => (_v = v, _x = x))),
    // the second, "data exhausted" callback resolves with a sentinel
    // so the loop below knows when to stop
    () => _v(done)
  );
  while (true) {
    try {
      const row = await _row;
      if (row === done) return;
      yield row;
    } catch (e) {
      console.log(e);
    }
  }
}
Then, putting it all together in a top-level await context:
const rows = await promisified(api.prepare.bind(api)),
      rowEmitter = rowEmitterGenerator(rows);

for await (let row of rowEmitter){
  console.log(`Received row: ${row}`);
  // do something with the row
}

How to convert Node.js async streaming callback into an async generator?

I have a function that streams data in batches via a callback.
Each batch will await the callback function before fetching another batch and the entire function returns a promise that resolves when all batches are finished.
(I'm using TypeScript annotations to help with readability)
async function callbackStream(fn: (batch: Array<number>) => Promise<void>) {}
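For concreteness, a hypothetical mock that satisfies this contract (the batch values are made up) might look like:
async function callbackStream(fn: (batch: Array<number>) => Promise<void>) {
  for (const batch of [[1, 2], [3, 4], [5, 6]]) {
    await fn(batch); // wait for the consumer before "fetching" the next batch
  }
}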
How do I turn this function into an async generator that yields one value at a time?
async function* generatorStream(): AsyncIterableIterator<number> {}
This has proven to be quite a difficult task.
I've toyed around with this problem and I've built something that works, but it's very convoluted and I can't justify merging this code and making others on my team deal with it.
Here's my current implementation:
I'm using this helper function that creates a "deferred" promise, which helps with passing promises around callbacks.
interface DeferredPromise<T> {
  resolve: (value: T) => void
  reject: (error: any) => void
  promise: Promise<T>
}

function deferred<T>(): DeferredPromise<T> {
  let resolve
  let reject
  const promise = new Promise<T>((res, rej) => {
    resolve = res
    reject = rej
  })
  return {
    resolve: resolve as (value: T) => void,
    reject: reject as (error: any) => void,
    promise,
  }
}
Next I have this hairball of logic that linearizes the promise callbacks into a chain, where each promise resolves to a batch along with a next function that returns another promise fetching the next batch.
type Done = { done: true }
type More = { done: false; value: Array<number>; next: () => Promise<Result> }
type Result = More | Done

async function chainedPromises() {
  let deferred = PromiseUtils.deferred<Result>()
  callbackStream(async batch => {
    const next = PromiseUtils.deferred<null>()
    deferred.resolve({
      done: false,
      value: batch,
      next: () => {
        deferred = PromiseUtils.deferred<Result>()
        next.resolve(null)
        return deferred.promise
      },
    })
    await next.promise
  }).then(() => {
    deferred.resolve({ done: true })
  })
  return deferred.promise
}
From here, creating a generator that yields one item at a time isn't very difficult:
async function* generatorStream(): AsyncIterableIterator<number> {
  let next = chainedPromises
  while (true) {
    const result = await next()
    if (result.done) {
      return
    }
    for (const item of result.value) {
      yield item
    }
    next = result.next
  }
}
I think we can all agree that the intermediate chainedPromises function is very confusing and convoluted. Is there any way I can transform callbackStream into generatorStream in a way that is easy to understand and easy to follow? I don't mind using a library if it's well established, but I would also appreciate a simple implementation from first principles.
No, I don't think there's a way to implement this transformation in a way that's easy to understand and easy to follow. However, I would recommend to drop the deferreds (you're never rejecting anyway) and just use the promise constructor. Also I'd rather implement an asynchronous generator right away.
function queue() {
  let resolve = () => {};
  const q = {
    put(x) {
      resolve(x);
      q.promise = new Promise(r => { resolve = r; });
    },
    promise: null,
  }
  q.put(); // generate first promise
  return q;
}

function toAsyncIterator(callbackStream) {
  const query = queue();
  const result = queue();
  const end = callbackStream(batch => {
    result.put(batch);
    return query.promise;
  }).then(value => ({value, done: true}));
  end.catch(e => void e); // prevent unhandled promise rejection warnings
  return {
    [Symbol.asyncIterator]() { return this; },
    next(x) {
      query.put(x);
      return Promise.race([
        end,
        result.promise.then(value => ({value, done: false}))
      ]);
    }
  }
}
async function* batchToAsyncIterator(batchCallbackStream) {
  for await (const batch of toAsyncIterator(batchCallbackStream)) {
    // for (const val of batch) yield val;
    // or simpler:
    yield* batch;
  }
}
You need an event bucket; here is an example:
function bucket() {
  const stack = [],
        iterate = bucket();
  var next;

  async function* bucket() {
    while (true) {
      yield new Promise((res) => {
        if (stack.length > 0) {
          return res(stack.shift());
        }
        next = res;
      });
    }
  }

  iterate.push = (itm) => {
    if (next) {
      next(itm);
      next = false;
      return;
    }
    stack.push(itm);
  }

  return iterate;
}

;(async function() {
  let evts = bucket();
  setInterval(() => {
    evts.push(Date.now());
    evts.push(Date.now() + '++');
  }, 1000);
  for await (let evt of evts) {
    console.log(evt);
  }
})();
Here's a more modern TypeScript BufferedIterator implementation that's inspired by @NSD's "bucket" approach: https://github.com/felipecsl/obgen/blob/master/src/bufferedIterator.ts
This class implements the AsyncIterableIterator<T> interface
Sample usage:
(async () => {
  const buffer = new BufferedIterator();
  const callback = (c: string) => buffer.emit(c);
  callback("a");
  callback("b");
  callback("c");
  delay(1000).then(() => callback("d"));
  delay(2000).then(() => callback("e"));
  // make sure you call end() to indicate the end of the stream if applicable
  delay(3000).then(() => buffer.end());
  for await (const value of buffer) {
    console.log(value);
  }
  // or use drain() to collect all items into an array
  // console.log(await buffer.drain());
  console.log("done");
})();
Would it help if there were a TypeScript solution?
It handles the condition where the callback is called faster than the promise is resolved, multiple times in a row.
The callback can be a method with the signature callback(error, result, index).
Iteration finishes when the callback is called with no arguments.
Usage:
asAsyncOf(this.storage, this.storage.each);
Solution:
function asAsyncOf<T1, T2, T3, T4, Y>(c, fn: { (a: T1, a1: T2, a2: T3, a3: T4, cb: { (err?, res?: Y, index?: number): boolean }): void }, a: T1, a1: T2, a2: T3, a3: T4): AsyncGenerator<Y>
function asAsyncOf<T1, T2, T3, Y>(c, fn: { (a: T1, a1: T2, a2: T3, cb: { (err?, res?: Y, index?: number): boolean }): void }, a: T1, a1: T2, a3: T3): AsyncGenerator<Y>
function asAsyncOf<T1, T2, Y>(c, fn: { (a: T1, a1: T2, cb: { (err?, res?: Y, index?: number): boolean }): void }, a: T1, a1: T2): AsyncGenerator<Y>
function asAsyncOf<T, Y>(c, fn: { (a: T, cb: { (err?, res?: Y, index?: number): boolean }): void }, a: T): AsyncGenerator<Y>
function asAsyncOf<Y>(c, fn: { (cb: { (err?, res?: Y, index?: number): boolean }): void }): AsyncGenerator<Y>
async function* asAsyncOf(context, fn, ...args) {
  let next = (result?) => { };
  let fail = (err) => { };
  let finish = {};
  const items = [];
  let started = true;
  try {
    fn.apply(context, [...args, function (err, result, index) {
      const nextArgs = [].slice.call(arguments, 0);
      if (nextArgs.length === 0) {
        started = false;
        next(finish);
        return true;
      }
      if (err) {
        fail(err);
        return true;
      }
      items.push(result);
      next(result);
    }]);
  } catch (ex) {
    fail(ex);
  }
  while (true) {
    const promise = started ? new Promise((resolve, error) => {
      next = resolve;
      fail = error;
    }) : Promise.resolve(finish);
    const record = await promise;
    if (record === finish) {
      while (items.length) {
        const item = items.shift();
        yield item;
      }
      return;
    }
    while (items.length) {
      const item = items.shift();
      yield item;
    }
  }
}

export { asAsyncOf };

How to build an event generator in JavaScript

I am trying to build a way to create a generator which can yield DOM events. More generally, I want to create a way to convert an event system to an async system yielding events.
My initial code example works, but I can see an issue with lifting the resolve function from the Promise so that I can call that function once the event comes in.
class EventPropagation {
  constructor(id) {
    const button = document.getElementById(id);
    let _resolve;
    button.addEventListener("click", event => {
      if (_resolve) {
        _resolve(event);
      }
    });
    let _listen = () => {
      return new Promise(resolve => {
        _resolve = resolve;
      });
    }
    this.subscribe = async function*() {
      const result = await _listen();
      yield result;
      yield* this.subscribe();
    }
  }
}

async function example() {
  const eventPropagation = new EventPropagation("btn");
  for await (const event of eventPropagation.subscribe()) {
    console.log(event);
  }
}

// call the example function
example();
My question is: Is there a better way of building something like this? There are a lot of things to think about, like multiple events coming in at the same time or cleaning up the listener and the subscriptions. My goal is not to end up with a reactive library but I do want to create small transparent functions which yield events asynchronously.
fiddle
Edited 14 Dec 2017 (in response to Bergi's comment)
Async Generators
Babel and a few plugins later; async generators aren't a problem:
const throttle = ms => new Promise(resolve => setTimeout(resolve, ms));

const getData = async () => {
  const randomValue = Math.floor(Math.random() * 5000 + 1);
  await throttle(randomValue);
  return `The random value was: ${randomValue}`;
}

async function* asyncRandomMessage() {
  const message = await getData();
  yield message;
  // recursive call
  yield* asyncRandomMessage();
}

async function example() {
  for await (const message of asyncRandomMessage()) {
    console.log(message);
  }
}

// call it at your own risk, it does not stop
// example();
What I want to know is how I transform a series of individual callback calls into an async stream. I can't imagine this problem isn't tackled. When I look at the library Bergi showed in the comments I see the same implementation as I did, namely: "Store the resolve and reject functions somewhere the event handler can call them." I can't imagine that would be a correct way of solving this problem.
You need an event bucket; here is an example:
function evtBucket() {
  const stack = [],
        iterate = bucket();
  var next;

  async function* bucket() {
    while (true) {
      yield new Promise((res) => {
        if (stack.length > 0) {
          return res(stack.shift());
        }
        next = res;
      });
    }
  }

  iterate.push = (itm) => {
    if (next) {
      next(itm);
      next = false;
      return;
    }
    stack.push(itm);
  }

  return iterate;
}

;(async function() {
  let evts = evtBucket();
  setInterval(() => {
    evts.push(Date.now());
    evts.push(Date.now() + '++');
  }, 1000);
  for await (let evt of evts) {
    console.log(evt);
  }
})();
My best solution thus far has been to have an internal EventTarget that dispatches events when new events are added onto a queue array. This is what I've been working on for a JS modules library (the helper modules it uses are included here). I don't like it... But it works.
Note: This also handles the new AbortSignal option for event listeners in multiple places.
export function isAborted(signal) {
  if (signal instanceof AbortController) {
    return signal.signal.aborted;
  } else if (signal instanceof AbortSignal) {
    return signal.aborted;
  } else {
    return false;
  }
}

export async function when(target, event, { signal } = {}) {
  await new Promise(resolve => {
    target.addEventListener(event, resolve, { once: true, signal });
  });
}

export async function* yieldEvents(what, event, { capture, passive, signal } = {}) {
  const queue = [];
  const target = new EventTarget();

  what.addEventListener(event, event => {
    queue.push(event);
    target.dispatchEvent(new Event('enqueued'));
  }, { capture, passive, signal });

  while (!isAborted(signal)) {
    if (queue.length === 0) {
      await when(target, 'enqueued', { signal }).catch(e => {});
    }
    /**
     * May have aborted between beginning of loop and now
     */
    if (isAborted(signal)) {
      break;
    } else {
      yield queue.shift();
    }
  }
}
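A minimal usage sketch (the button element and the click event are assumptions for illustration):
const button = document.querySelector('button');

(async () => {
  for await (const click of yieldEvents(button, 'click')) {
    console.log('clicked at', click.timeStamp);
  }
})();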
The example provided by @NSD, but now in TypeScript:
class AsyncQueue<T> {
  private queue: T[] = [];
  private maxQueueLength = Infinity;
  private nextResolve = (value: T) => {};
  private hasNext = false;

  constructor(maxQueueLength?: number) {
    if (maxQueueLength) {
      this.maxQueueLength = maxQueueLength;
    }
  }

  async *[Symbol.asyncIterator]() {
    while (true) {
      yield new Promise((resolve) => {
        if (this.queue.length > 0) {
          return resolve(this.queue.shift());
        }
        this.nextResolve = resolve;
        this.hasNext = true;
      });
    }
  }

  push(item: T) {
    if (this.hasNext) {
      this.nextResolve(item);
      this.hasNext = false;
      return;
    }
    if (this.queue.length > this.maxQueueLength) {
      this.queue.shift();
    }
    this.queue.push(item);
  }
}

(async function () {
  const queue = new AsyncQueue<string>();
  setInterval(() => {
    queue.push(Date.now().toString());
    queue.push(Date.now().toString() + "++");
  }, 1000);
  for await (const evt of queue) {
    console.log(evt);
  }
})();
