JS Program using Async & Recursion gets stuck in Main process - javascript

I am trying to retrieve the Firebase users in a project. I can retrieve them, but the main program never ends. Whether I debug in VS Code or run the script with npm run, it gets stuck both ways. In VS Code there is nothing left on the call stack; it just never stops.
Users function (it returns the users without a problem):
admin.auth().listUsers returns a listUsersResult object with properties nextPageToken and a users array:
const BATCH_SIZE = 2;
const listAllUsers = async (nextPageToken = undefined) => {
  const listUsersResult = await admin.auth().listUsers(BATCH_SIZE, nextPageToken);
  if (listUsersResult.nextPageToken) {
    // more pages remain; fetch them recursively and concatenate
    return listUsersResult.users.concat(await listAllUsers(listUsersResult.nextPageToken));
  } else {
    return listUsersResult.users;
  }
};
Main routine (this is the one that gets stuck):
const uploadUsersMain = async () => {
  try {
    // if I comment out this call, then there is no problem
    let firestoreUsers = await listAllUsers();
  } catch (error) {
    log.error(`Unable to retrieve users ${error}`);
  } finally {
    // do stuff
  }
};
uploadUsersMain();
What could be the issue that stops the main program from ending? What should I look for? Thanks

To shut down your script you should use
await admin.app().delete();
Node.js will quietly keep running as long as there are open handles, which can be almost anything; a network socket certainly qualifies.
You could also use process.exit(), but it's good practice to properly shut down the SDK, instead of exiting abruptly when it might not be finished.
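A sketch of the fixed flow follows. The pager below is a stub standing in for admin.auth().listUsers() (its pages and user names are invented) so the example is self-contained; in the real script the important line is awaiting admin.app().delete() in the finally block.

```javascript
// Stubbed pages standing in for real Firebase users (invented data).
const PAGES = [
  { users: ['a', 'b'], nextPageToken: 'p1' },
  { users: ['c', 'd'], nextPageToken: 'p2' },
  { users: ['e'], nextPageToken: undefined },
];

// Stand-in for admin.auth().listUsers(batchSize, pageToken).
const listUsers = async (batchSize, pageToken) =>
  PAGES[pageToken === undefined ? 0 : Number(pageToken.slice(1))];

// Same recursive pagination as the question, using nextPageToken.
const listAllUsers = async (nextPageToken = undefined) => {
  const result = await listUsers(2, nextPageToken);
  return result.nextPageToken
    ? result.users.concat(await listAllUsers(result.nextPageToken))
    : result.users;
};

const uploadUsersMain = async () => {
  try {
    return await listAllUsers();
  } finally {
    // In the real script, release the SDK's handles here:
    // await admin.app().delete();
  }
};
```

With the stub above, uploadUsersMain() resolves to ['a', 'b', 'c', 'd', 'e'] and, once the delete() call is in place, the process has no remaining handles and exits on its own.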

Related

JavaScript: How to wait for camera access before proceeding with execution?

I have the following code from Twilio to access the local camera in the browser:
async function ShowLocalVideo() {
  Twilio.Video.createLocalVideoTrack().then(track => {
    const localMediaContainer = document.getElementById('LocalMedia');
    localMediaContainer.appendChild(track.attach());
  });
}
I would like to make sure the user granted access to the camera, or at least responded, before continuing with the other steps. So I'm calling:
await ShowLocalVideo();
alert('Hi !');
But the alert Hi ! is triggered before the browser asks: This file wants to use your camera.
Is it possible to make sure the code doesn't continue until the user responds to This file wants to use your camera.?
Thanks,
Cheers
You're mixing async/await syntax with Promise .then chaining.
Since you're not awaiting the result of createLocalVideoTrack().then(...), the async function returns early. You can correct this by replacing .then with await as follows:
async function ShowLocalVideo() {
  const track = await Twilio.Video.createLocalVideoTrack();
  const localMediaContainer = document.getElementById('LocalMedia');
  localMediaContainer.appendChild(track.attach());
}
Alternatively, you could await the result of createLocalVideoTrack().then(...), but that would mix the two styles and create confusion.
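The early return is easy to see with plain timers (a node sketch; `delay` stands in for createLocalVideoTrack and the camera prompt):

```javascript
// Why the un-awaited .then returns early: the async function finishes
// before the promise chain inside it does. `delay` stands in for
// Twilio.Video.createLocalVideoTrack() here.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function notAwaited(order) {
  delay(10).then(() => order.push('attached')); // fire-and-forget
}

async function awaited(order) {
  await delay(10);
  order.push('attached');
}

async function demo() {
  const order = [];
  await notAwaited(order);
  order.push('returned early'); // logged before the first 'attached'
  await awaited(order);
  order.push('done'); // logged after the second 'attached'
  return order;
}
```

demo() resolves to ['returned early', 'attached', 'attached', 'done']: the caller of notAwaited moves on before the "camera" work finishes, exactly like the alert firing before the permission prompt.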

Handling errors and recovering with the node pg (postgres) client

I am using node module pg in my application and I want to make sure it can properly handle connection and query errors.
The first problem I have is I want to make sure it can properly recover when postgres is unavailable.
I found there is an error event so I can detect if there is a connection error.
import pg from 'pg'

let pgClient = null

async function postgresConnect() {
  pgClient = new pg.Client(process.env.CONNECTION_STRING)
  pgClient.connect()
  pgClient.on('error', async (e) => {
    console.log('Reconnecting')
    await sleep(5000)
    await postgresConnect()
  })
}
I don't like using a global here, and I want to turn the sleep delay into a small exponential backoff. I noticed "Reconnecting" fires twice immediately, then waits five seconds, and I am not sure why it fires the first time without any waiting.
I also have to make sure the queries execute. I have something like this I was trying out.
async function getTimestamp() {
  try {
    const res = await pgClient.query(
      'select current_timestamp from current_timestamp;'
    )
    return res.rows[0].current_timestamp
  } catch (error) {
    console.log('Retrying Query')
    await sleep(1000)
    return getTimestamp()
  }
}
This seems to work, but I haven't tested it enough to be sure it will keep retrying until the query executes. I should look for specific errors, retry forever only on certain ones, and fail on the others. I need to do more research to find out what errors are thrown, and I need a backoff on this delay too.
It all "seems" to work, but I don't want to fall victim to the Dunning-Kruger effect. I need to ensure this process can handle all sorts of situations and recover.
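One common shape for the exponential backoff described above is a small retry wrapper (the helper names `sleep`, `backoffDelay`, and `withRetry` are my own, not part of pg):

```javascript
// Sketch of retry with exponential backoff (helper names are invented,
// not part of pg). Delay for attempt n is base * 2^n, capped at `max`.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const backoffDelay = (attempt, base = 1000, max = 30000) =>
  Math.min(base * 2 ** attempt, max);

async function withRetry(fn, { retries = 5, base = 1000, max = 30000 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (error) {
      // TODO: rethrow non-recoverable errors (bad SQL, auth failures)
      // instead of retrying; only loop on connection-type errors.
      if (attempt >= retries) throw error;
      await sleep(backoffDelay(attempt, base, max));
    }
  }
}
```

getTimestamp could then become `withRetry(() => pgClient.query(...))`, with the delay growing 1s, 2s, 4s, 8s instead of staying fixed.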

puppeteer-cluster: queue instead of execute

I'm experimenting with puppeteer-cluster and I just don't understand how to use queuing properly. Can it only be used for calls where you don't wait for a response? I'm using Artillery to fire a bunch of requests simultaneously; they all fail with queue, while only some fail when I use execute directly.
I've taken the code straight from the examples and replaced execute with queue, which I expected to work, except the code doesn't wait for the result. Is there a way to achieve this anyway?
So this works:
const screen = await cluster.execute(req.query.url);
But this breaks:
const screen = await cluster.queue(req.query.url);
Here's the full example with queue:
const express = require('express');
const app = express();
const { Cluster } = require('puppeteer-cluster');

(async () => {
  const cluster = await Cluster.launch({
    concurrency: Cluster.CONCURRENCY_CONTEXT,
    maxConcurrency: 2,
  });

  await cluster.task(async ({ page, data: url }) => {
    // make a screenshot
    await page.goto('http://' + url);
    const screen = await page.screenshot();
    return screen;
  });

  // setup server
  app.get('/', async function (req, res) {
    if (!req.query.url) {
      return res.end('Please specify url like this: ?url=example.com');
    }
    try {
      const screen = await cluster.queue(req.query.url);
      // respond with image
      res.writeHead(200, {
        'Content-Type': 'image/jpg',
        'Content-Length': screen.length // variable is undefined here
      });
      res.end(screen);
    } catch (err) {
      // catch error
      res.end('Error: ' + err.message);
    }
  });

  app.listen(3000, function () {
    console.log('Screenshot server listening on port 3000.');
  });
})();
What am I doing wrong here? I'd really like to use queuing because without it every incoming request appears to slow down all the other ones.
Author of puppeteer-cluster here.
Quote from the docs:
cluster.queue(..): [...] Be aware that this function only returns a Promise for backward compatibility reasons. This function does not run asynchronously and will immediately return.
cluster.execute(...): [...] Works like Cluster.queue, just that this function returns a Promise which will be resolved after the task is executed. In case an error happens during the execution, this function will reject the Promise with the thrown error. There will be no "taskerror" event fired.
When to use which function:
Use cluster.queue if you want to queue a large number of jobs (e.g. list of URLs). The task function needs to take care of storing the results by printing them to console or storing them into a database.
Use cluster.execute if your task function returns a result. This will still queue the job, so it is like calling queue in addition to waiting for the job to finish. In this scenario, there is most often an "idling cluster" present which is used when a request hits the server (as in your example code).
So, you definitely want to use cluster.execute, as you want to wait for the results of the task function. The reason you do not see any errors is (as quoted above) that errors from cluster.queue are emitted via a taskerror event, while cluster.execute errors are thrown directly (the Promise is rejected). Most likely your jobs fail in both cases, but the failures are only visible with cluster.execute.
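The contract described above can be modeled with plain promises (a toy sketch of the documented behavior, not the library's actual internals):

```javascript
// Toy model of the queue/execute contract (NOT real puppeteer-cluster
// internals): queue resolves immediately and routes errors to a
// taskerror-style handler; execute resolves/rejects with the task itself.
function makeCluster(task) {
  const errors = [];
  return {
    errors, // stand-in for the 'taskerror' event
    queue(data) {
      task(data).catch((err) => errors.push(err.message));
      return Promise.resolve(undefined); // resolves before the task runs
    },
    execute(data) {
      return task(data); // resolves with the result, rejects on failure
    },
  };
}
```

This is why `await cluster.queue(...)` yields undefined in the example server: the returned promise carries no result, so screen.length is undefined even when the screenshot succeeds.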

Node: Confusing behaviour related to saving a TypeORM model using await inside a callback / promises clearing

I'm making a system that loops over all my emails (from a maildir folder), and I'm using an old NPM package called eml-format to parse each maildir file (a single email). The eml-format package doesn't use promises; it just takes a callback to execute after reading the email file, and inside that callback I'm trying to save the email's metadata to Postgres using await with TypeORM. Here's the relevant part of the code I'm having issues with (it looks a bit pointless because I've removed everything irrelevant to the main problem).
The Maildir() class is my TypeORM model (which refers to a postgres table called maildir).
This snippet of code is looped for every email:
/* *****************************
 * START OF TOGGLEBLOCK
const md1 = new Maildir();
md1.folder = 'md1';
await db.entityManager.save(md1);
 * END OF TOGGLEBLOCK
 ***************************** */
emlformat.read(eml, { headersOnly: false }, async (error, data) => {
  console.log('before save');
  const md2 = new Maildir();
  md2.folder = 'md2';
  await db.entityManager.save(md2);
  console.log('after save');
});
When running with the code as-is above (with TOGGLEBLOCK disabled):
"before save" is repeatedly printed to the console
await db.entityManager.save(md2); does not wait; the saves seem to just queue up all at once (no good when I run this on my entire email account of about 50,000 emails)
after they're all queued up, they all save to the database
then all the "after save" messages get printed to the console at the same time
If I simply enable the TOGGLEBLOCK code, then the md2 await in the callback works exactly as I expect it to, for each email it does these in order:
shows one "before save" message
saves the md2 record to the database - await waits as expected
shows one "after save" message
...then does the same again for each email
The TOGGLEBLOCK/md1 code isn't needed; it's just some junk I put in there while trying to figure this all out. Why does having this extra code outside the callback change whether or not the md2 await works inside the callback?
I'm guessing it's something to do with the TOGGLEBLOCK clearing out the promises or something?
I just want to delete the junk TOGGLEBLOCK/md1 code entirely. How can I get the md2 await to work without it?
If you've got a suggestion you're not entirely sure of, please just post it as an answer rather than a comment directly under the question (gets too confusing with multiple conversations interlaced together).
You could wrap emlformat.read in a Promise, and then it would play well with async/await:
const readEmlFormat = eml => new Promise((resolve, reject) =>
  emlformat.read(eml, { headersOnly: false }, (error, data) => {
    if (error) {
      reject(error);
    } else {
      resolve(data);
    }
  })
);

const data = await readEmlFormat(eml);
console.log('before save');
const md2 = new Maildir();
md2.folder = 'md2';
await db.entityManager.save(md2);
console.log('after save');
Note that the issue was that you used an async function as a callback, which was not awaited anywhere!

What is the correct pattern with generators and iterators for managing a stream

I am trying to figure out how to arrange a pair of routines to control writing to a stream using the generator/iterator functions in ES2015. It's a simple logging system to use in Node.js.
What I am trying to achieve is a function that external processes can call to write to a log. I am hoping that the new generator/iterator functions mean that if it needs to suspend inside this routine, that is transparent to the caller.
stream.write should normally return immediately, but can return false to say that the stream's buffer is full. In that case it needs to wait for stream.on('drain', cb) to fire before returning.
I am thinking that the actual software that writes to the stream is a generator function which yields when it is ready to accept another request, and that the function I provide for external callers is an iterator, but I might have this the wrong way round.
So, something like this
var stopLogger = false;
var it = writer();

function writeLog(line) {
  it.next(line);
}

function* writer() {
  while (!stopLogger) {
    line = yield;
    if (!stream.write(line)) {
      yield* waitDrain(); // can't continue until we get drain
    }
  }
}

function* waitDrain() {
  // Not sure what to do here to avoid waiting
  stream.on('drain', () => { /* do I yield here or something? */ });
}
I found the answer here: https://davidwalsh.name/async-generators. I had it backwards.
The code above should be:
var stopLogger = false;

function* writeLog(line) {
  yield writer(line);
}

var it = writeLog();

function writer(line) {
  if (stopLogger) {
    setTimeout(() => { it.next(); }, 1); // needed so we can get to yield
  } else {
    if (stream.write(line)) {
      setTimeout(() => { it.next(); }, 1); // needed so we can get to yield
    }
  }
}

stream.on('drain', () => {
  it.next();
});
I haven't quite tried this, just translated it from the article above, and there is some complication around errors etc., which the article suggests can be solved by enhancing the iterator to return a promise that gets resolved in a "runGenerator" function. But it solved my main issue, which was how the pattern should work.
