I'm completely new to Bull queue and I have a decent understanding of JavaScript. I am trying to get the following example to work:
const Queue = require('bull');
const myFirstQueue = new Queue('my-first-queue');

myFirstQueue.process(async (job, done) => {
    await doSomething(job.data);
    done();
});

const doSomething = data => {
    return new Promise((resolve, reject) => {
        return resolve(data);
    });
};

myFirstQueue.on('completed', (job, result) => {
    log.debug(`Job completed with result ${result}`);
});

(async function ad() {
    const job = await myFirstQueue.add({
        foo: 'bar',
    });
})();
The error I'm getting is:
\node_modules\ioredis\built\redis\event_handler.js:177
self.flushQueue(new errors_1.MaxRetriesPerRequestError(maxRetriesPerRequest));
^
MaxRetriesPerRequestError: Reached the max retries per request limit (which is 20). Refer to "maxRetriesPerRequest" option for details.
at Socket.<anonymous> (...\node_modules\ioredis\built\redis\event_handler.js:177:37)
at Object.onceWrapper (node:events:652:26)
at Socket.emit (node:events:537:28)
at TCP.<anonymous> (node:net:747:14)
Based on the error I'm getting, it seems like the queue cannot process this job at all, and I'm not sure why. I am running the script with node fileName.js in the correct directory. I can run other, more complex JavaScript files just fine.
EDIT: Edited code to reflect changes from comments, still have the same error.
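For reference, this MaxRetriesPerRequestError comes from ioredis and usually means the queue never managed to reach a Redis server at all (Bull defaults to 127.0.0.1:6379). A minimal sketch, assuming a local Redis instance is running and you want the connection made explicit (the host and port here are placeholders for your own setup):

const Queue = require('bull');

// Point the queue at an explicit Redis connection instead of relying on defaults.
// Host/port are assumptions; replace them with your actual Redis settings.
const myFirstQueue = new Queue('my-first-queue', {
    redis: { host: '127.0.0.1', port: 6379 },
});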
I am testing an async method that returns some data from a web request using the native https.request() method in NodeJS. I am using mocha, chai, and sinon with the relevant extensions for each.
The method I'm testing essentially wraps the boilerplate https.request() code from the NodeJS docs in a Promise and resolves when the response 'end' event is received, or rejects if the request 'error' event is received. The bits relevant to the discussion look like this:
async fetch(...) {
    // setup
    return new Promise((resolve, reject) => {
        const req = https.request(url, opts, (res) => {
            // handle response events
        });
        req.on('error', (e) => {
            logger.error(e); // <-- this is what i want to verify
            reject(e);
        });
        req.end();
    });
}
As indicated in the comment, what I want to test is that if the request error event is emitted, the error gets logged correctly. This is what I'm currently doing to achieve this:
it('should log request error events', async () => {
    const sut = new MyService();
    const err = new Error('boom');
    const req = new EventEmitter();
    req.end = sinon.fake();
    const res = new EventEmitter();
    sinon.stub(logger, 'error');
    sinon.stub(https, 'request').callsFake((url, opt, cb) => {
        cb(res);
        return req;
    });
    try {
        const response = sut.fetch(...);
        req.emit('error', err);
        await response;
    } catch (e) { /* the rejection is expected here */ }
    logger.error.should.have.been.calledOnceWith(err);
});
This feels like a hack, but I can't figure out how to do this correctly using the normal patterns. The main problem is I need to emit the error event after the method is called but before the promise is fulfilled, and I can't see how to do that if I am returning the promise as you normally would with mocha.
I should have thought of this, but #Bergi's comment about using setTimeout() in the fake gave me an idea and I now have this working with the preferred syntax:
it('should log request error events', () => {
    const sut = new MyService();
    const err = new Error('boom');
    const req = new EventEmitter();
    req.end = sinon.fake();
    const res = new EventEmitter();
    sinon.stub(logger, 'error');
    sinon.stub(https, 'request').callsFake((url, opt, cb) => {
        cb(res);
        setTimeout(() => { req.emit('error', err); });
        return req;
    });
    return sut.fetch(...).should.eventually.be.rejectedWith(err)
        .then(() => {
            logger.error.should.have.been.calledOnceWith(err);
        });
});
I don't like adding any delays in unit tests unless I'm specifically testing delayed functionality, so I used setTimeout() with 0 delay just to push the emit call to the end of the message queue. By moving it to the fake I was able to just use promise chaining to test the call to the logger method.
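The same deferral can also be written without a timer; a minimal sketch of just the fake, assuming the rest of the test stays unchanged, using setImmediate() so the emit runs on a later turn of the event loop (whether this reads better than setTimeout(..., 0) is purely a matter of taste):

sinon.stub(https, 'request').callsFake((url, opt, cb) => {
    cb(res);
    // Defer the emit so fetch() has already returned its promise and
    // attached the 'error' listener before the event fires.
    setImmediate(() => req.emit('error', err));
    return req;
});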
I have made a function called return_match_uid that I am exporting from a Node.js module. I am importing the function in another Express routing file and using async/await with try/catch to handle errors. Somehow, the errors produced by return_match_uid always slip through and go unhandled, even though I am using the error handling for the realtime listener recommended by the Firestore docs.
Here is the function:
exports.return_match_uid = function return_match_uid() {
    return new Promise((resolve, reject) => {
        const unsub = db.collection('cities').onSnapshot(() => {
            throw ("matching algo error");
            resolve();
            unsub();
        }, err => {
            console.log(err);
        });
    });
};
In another express router file, I am calling the function:
const Router = require('express').Router;
const router = new Router();
const {return_match_uid} = require("./match_algo");
router.get('/match', async (req, res) => {
try {
var match_user = await return_match_uid(req.query.data, req.query.rejected);
res.send(match_user);
}
catch (error) {
console.log("Matching algorithm return error: " + error);
}
})
The error I am throwing inside the function (matching algo error) does not get caught by either the err => { console.log(err); } handler in the function or the try/catch block in the router. It slips through and crashes my app with the following error:
throw "matching algo error!";
^
matching algo error!
(Use `node --trace-uncaught ...` to show where the exception was thrown)
[nodemon] app crashed - waiting for file changes before starting...
I am throwing an error inside the function because I have some other code in there, and there is a possibility that it produces an error. If it does, I would like to make sure that it gets handled properly.
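For context, a bare throw inside the onSnapshot callback happens on a later tick, outside both the Promise executor and the router's try/catch, so neither can catch it. A minimal sketch, assuming the intent is for such errors to reject the promise returned by return_match_uid (only the try/catch and the reject calls are new):

exports.return_match_uid = function return_match_uid() {
    return new Promise((resolve, reject) => {
        const unsub = db.collection('cities').onSnapshot(snapshot => {
            try {
                // ... matching logic that might fail goes here ...
                resolve(/* matched uid */);
            } catch (e) {
                reject(e); // now surfaces to the router's try/catch via await
            }
            unsub();
        }, err => {
            reject(err); // listener errors reject the promise as well
        });
    });
};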
Let's say I have a function hello written like this:
const functions = require("firebase-functions");

const wait3secs = () => {
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            console.log('3 secs job complete');
            resolve('done 3 secs');
        }, 3000);
    });
}

const wait2secs = () => {
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            console.log('2 secs job complete');
            resolve('done 2 secs');
        }, 2000);
    });
}

exports.hello = functions.https.onRequest((req, res) => {
    wait3secs(); // Unhandled, this uses 3 seconds to complete.
    return wait2secs().then(data => {
        res.send(data); // Send response after 2 secs.
    });
});
The question is: For the above implementation, did I write it correctly (considering the unhandled promise)? And if so, is wait3secs guaranteed to run (asynchronously) to the end in Firebase Functions, even after the response has been sent?
I have searched in Firebase (here) but haven't found a specific answer to my question.
According to the documentation:
Terminate HTTP functions with res.redirect(), res.send(), or res.end().
What this is saying is that when you call res.send(), the function will be terminated. You should not expect any async work to complete after that - the function will be shut down.
If you need to do more work after sending the response, you will have to arrange to trigger another function to run in the background, such as a pubsub function. That function will need to return a promise that resolves only after all the async work is complete, so that it also does not get shut down prematurely.
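A minimal sketch of that pattern, assuming a recent @google-cloud/pubsub client and a Pub/Sub topic named 'background-work' (the topic name and payload are placeholders, not anything from the original question):

const functions = require("firebase-functions");
const { PubSub } = require("@google-cloud/pubsub");
const pubsub = new PubSub();

exports.hello = functions.https.onRequest(async (req, res) => {
    // Hand the slow work off to a background function before responding.
    await pubsub.topic("background-work").publishMessage({ json: { job: "wait3secs" } });
    const data = await wait2secs();
    res.send(data); // The HTTP function is terminated after this.
});

// This function is only shut down once the promise it returns has settled,
// so the 3-second job is allowed to finish here.
exports.doBackgroundWork = functions.pubsub
    .topic("background-work")
    .onPublish(() => wait3secs());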
Snippets are from a Node.js and MongoDB CRUD application (see the GitHub repo for the full code). The code is working fine, but I'm unsure whether my structure and use of promises and async/await are bad practice.
handlers._newbies = {};

handlers._newbies.post = (parsedReq, res) => {
    const newbie = JSON.parse(parsedReq.payload);
    databaseCalls.create(newbie)
        .then((result) => {
            res.writeHead(200, { 'Content-Type': 'application/json' });
            const resultToString = JSON.stringify(result.ops[0]);
            res.write(resultToString);
            res.end();
        })
        .catch(err => console.log(err));
};
const databaseCalls = {};

databaseCalls.create = (newbie) => {
    return new Promise(async (resolve, reject) => {
        try {
            const client = await MongoClient.connect('mongodb://localhost:27017', { useNewUrlParser: true });
            console.log("Connected correctly to server");
            const db = client.db('Noob-List');
            const result = await db.collection('newbies').insertOne(newbie);
            client.close();
            resolve(result);
        } catch (err) {
            console.log(err);
        }
    });
};
When the node server gets a POST request with the JSON payload, it calls the handlers._newbies.post handler, which takes the payload and passes it to the
const newbie = JSON.parse(parsedReq.payload);
databaseCalls.create(newbie)
call. I want this database call to return a promise that holds the result of the db.collection('newbies').insertOne(newbie);
call. I was having trouble doing this by just returning the promise from insertOne, because after returning I can't call client.close();.
Again, maybe what I have done here is fine, but I haven't found anything online about creating promises with promises inside them. Thank you for your time; let me know if anything about my question is unclear.
It is considered an anti-pattern to wrap an existing promise in a manually created promise because there's just no reason to do so and it creates many opportunities for error, particularly in error handling.
And, in your case, you have several error handling issues:
If you get an error anywhere in your database code, you never resolve or reject the promise you are creating. This is a classic problem with the anti-pattern.
If you get an error after opening the DB, you don't close the DB.
You don't communicate back an error to the caller.
Here's how you can do your .create() function without the anti-pattern and without the above problems:
databaseCalls.create = async function(newbie) {
    let client;
    try {
        client = await MongoClient.connect('mongodb://localhost:27017', { useNewUrlParser: true });
        console.log("Connected correctly to server");
        const db = client.db('Noob-List');
        // await here so the finally block doesn't close the client before the insert finishes
        return await db.collection('newbies').insertOne(newbie);
    } catch (err) {
        // log error, but still reject the promise
        console.log(err);
        throw err;
    } finally {
        // clean up any open database connection
        if (client) {
            client.close();
        }
    }
}
Then, you would use this like:
databaseCalls.create(something).then(result => {
    console.log("succeeded");
}).catch(err => {
    console.log(err);
});
FYI, I also modified some other things:
The database connection is closed, even in error conditions
The function returns a promise which is resolved with the result of .insertOne() (if there is a meaningful result there)
If there's an error, the returned promise is rejected with that error
Not particularly relevant to your issue with promises, but you will generally not want to open and close the DB connection on every operation. You can either use one lasting connection or create a pool of connections where you can fetch one from the pool and then put it back in the pool when done (most DBs have that type of feature for server-side work).
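For illustration, a minimal sketch of the "one lasting connection" option with the same MongoDB driver as above (the connection promise is created once at startup and reused by every call; the driver keeps its own internal pool behind that client):

const { MongoClient } = require('mongodb');

// Open the connection once and share the resulting client everywhere.
const clientPromise = MongoClient.connect('mongodb://localhost:27017', { useNewUrlParser: true });

databaseCalls.create = async function(newbie) {
    const client = await clientPromise; // reuses the already-open connection
    return client.db('Noob-List').collection('newbies').insertOne(newbie);
};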
I'm pretty new to coding and to Node.js but trying to learn... The first big difficulty I found with Node.js was thinking in an async way when building code; now I'm facing another monster: Promises.
Up until now, I've built my code just to get something working without minding error handling at all (I know that's dumb, but it helps me learn), but now while running my code I get some errors from time to time.
"My program" is basically (for now) a bunch of requests (using request-promise) made through infinite loops, some manipulation of the info ({objects}) received, and sending it to a MongoDB (via mongoose), so I mostly know where the errors are coming from:
The servers (being requested) are returning an error (statusCode !== 200).
Too many requests were made (which is obviously a specific statusCode)
And my code runs mostly smoothly, until I get one of these errors. My objective now is to go back into the code, handle these errors properly, and create logs when they occur. Ideally I would also restart the loop whose request yielded the error (maybe after a setTimeout or something).
My questions are:
Could you recommend some good material on promise rejection handling that I could dig into? For now I'm using .catch((err) => { foo }) for each promise, but I'm not sure what to do with the err once it is caught.
Is there something out there that helps with logs and promise rejection handling (an npm package maybe)?
How can I handle a loop restart? Again, I'm not expecting you to provide a full response with the whole code (if you can, that's obviously even better), just to point me in the right direction (best practices, well-known npm packages, or articles)!
I really hope I'm on topic and following SO's rules! If not, let me know and I'll delete the question or edit and adapt.
Thanks in advance for your help!
******* EDIT ********
Simplifying my code:
./module/apiRequest.js
var rq = require('request-promise');
var fs = require('fs');

function apiRequest(params, req) {
    var date = new Date().toISOString();
    var logFileName = '../logs/[KRAKEN]' + date + 'errorlog.txt';
    var options = {
        uri: `https://api.kraken.com/0/public/${params}`,
        qs: req,
        json: true,
        resolveWithFullResponse: true
    };
    return rq(options).then((res) => {
        if (res.statusCode === 200 && !res.body.error.length) {
            return res; // <=== THIS FOR AN UNKNOWN PARAM, THIS SERVER RETURNS A 200 WITH AN ERROR OBJECT.
        } else {
            fs.writeFile(logFileName, res, function(err) {
                if (err) return console.log(err); // <==== SINCE A 200 WON'T APPEAR AS AN ERROR I'M TRYING TO HANDLE IT THAT WAY.
                console.log(`[ERROR][KRAKEN]: log created!`);
            });
        }
    }).catch((err) => { // <==== AS FAR AS I UNDERSTAND THIS IS FOR ANY OTHER ERROR (STATUSCODE 500 FOR INSTANCE / OR AN ERROR RELATED WITH THE CODE)
        fs.writeFile(logFileName, err, function(err) {
            if (err) return console.log(err);
            console.log(`[ERROR][KRAKEN]: log created!`);
        });
    });
};

module.exports = {
    apiRequest
};
./module/app.js
var apiRequest = require('./apiRequest.js').apiRequest;

function kraken(item) {
    return apiRequest('Ticker', { pair: item }).then((res) => {
        var result = {};
        var timeStamp = Math.floor(new Date());
        Object.keys(res.body.result).forEach((k) => {
            result = {
                mk: 'kraken',
                name: k,
                a: res.body.result[k].a,
                b: res.body.result[k].b,
                c: res.body.result[k].c,
                v: res.body.result[k].v,
                p: res.body.result[k].p,
                t: res.body.result[k].t,
                l: res.body.result[k].l,
                h: res.body.result[k].h,
                o: res.body.result[k].o,
                n: timeStamp,
            }
        });
        return result; // <=== THIS OCCURS WHEN THERE'S NO ERROR IN THE APIREQUEST.
    }).catch((err) => {
        console.log(err); // <==== THIS I'M NOT SURE IF IT GETS THE ERROR BACK FROM APIREQUEST. IF I'M NOT MISTAKEN THAT'S NOT THE CASE, IT'S THE ERROR HANDLER FOR THE 'kraken' FUNCTION. I'M NOT SURE WHAT KIND OF ERRORS COULD COME OUT OF THIS...
    });
};

module.exports = {
    kraken,
}
./main.js
var fs = require('fs');
var mongo = require('mongodb');
var mongoose = require('mongoose');
mongoose.Promise = global.Promise;

// KRAKEN REQUIRE:
var kraken = require('./module/app.js').kraken;
var Krakentick = require('./module/model/krakenModel').Krakentick; //<=== THIS IS THE MODEL FOR MONGOOSE.

async function loopKR() {
    setTimeout(
        async function () {
            var item = ['XBTUSD'];
            var data = await kraken(item);
            data.forEach((object) => {
                if (object.name === 'XXBTZUSD') {
                    var iname = 'btcusd';
                } else {
                    var iname = 'N/A';
                }
                var tick = new Krakentick({
                    mk: object.mk,
                    name: object.name,
                    a: object.a,
                    b: object.b,
                    c: object.c,
                    v: object.v,
                    p: object.p,
                    t: object.t,
                    l: object.l,
                    h: object.h,
                    o: object.o,
                    n: object.n,
                    iname: iname,
                });
                tick.save(function(err, tick) {
                    if (err) return console.log(err); //<==== THIS IS RELATED WITH MONGOOSE, NOT A PROMISE, IF I'M NOT MISTAKEN... THEN HANDLING WOULD OCCUR IF I HAD A PROBLEM
                    console.log(`[SUCCESS][KRAKEN]: ${tick.name} added to db!`);
                });
            });
            loopKR();
        }, 1100);
};

loopKR();
So as you can see, I'm mainly trying to handle the errors coming out of the request. But how do I send them to a log (is my current code correct? Is there a better way?)? And after an error arises and breaks the loop, how do I restart the loop automatically?
With this code, the errors are not being handled properly... For some reason I get the following message:
TypeError: Cannot read property 'body' of undefined
at apiRequest.then (fast-crypto/module/app.js:34:22)
at tryCatcher (fast-crypto/node_modules/bluebird/js/release/util.js:16:23)
at Promise._settlePromiseFromHandler (fast-crypto/node_modules/bluebird/js/release/promise.js:512:31)
at Promise._settlePromise (fast-crypto/node_modules/bluebird/js/release/promise.js:569:18)
at Promise._settlePromise0 (fast-crypto/node_modules/bluebird/js/release/promise.js:614:10)
at Promise._settlePromises (fast-crypto/node_modules/bluebird/js/release/promise.js:693:18)
at Async._drainQueue (fast-crypto/node_modules/bluebird/js/release/async.js:133:16)
at Async._drainQueues (fast-crypto/node_modules/bluebird/js/release/async.js:143:10)
at Immediate.Async.drainQueues (fast-crypto/node_modules/bluebird/js/release/async.js:17:14)
at runCallback (timers.js:781:20)
at tryOnImmediate (timers.js:743:5)
at processImmediate [as _immediateCallback] (timers.js:714:5)
(node:12507) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): TypeError: Cannot read property 'name' of undefined
(node:12507) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
{ Error: ENOENT: no such file or directory, open '../logs/[KRAKEN]2017-08-20T10:58:03.302Zerrorlog.txt'
  errno: -2,
  code: 'ENOENT',
  syscall: 'open',
  path: '../logs/[KRAKEN]2017-08-20T10:58:03.302Zerrorlog.txt' }
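On the loop-restart part of the question: one common pattern is to wrap each iteration's work in try/catch and always schedule the next iteration, so an error gets logged without breaking the loop. A minimal sketch, assuming the kraken() call and tick-saving code from above (the 1100 ms interval and the longer back-off delay are placeholder values):

async function loopKR() {
    var delay = 1100;
    try {
        var data = await kraken(['XBTUSD']);
        // ... build and save the Krakentick documents as above ...
    } catch (err) {
        console.log('[ERROR][KRAKEN]:', err); // log instead of letting the rejection go unhandled
        delay = 5000; // hypothetical back-off before retrying after a failure
    }
    setTimeout(loopKR, delay); // always schedule the next iteration, error or not
}

loopKR();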