Performance question: if I have a Node HTTP server powered by Express, is it considered bad practice to create a new class instance on every request sent by the user?
The class instance fetches data from another API and exposes some functionality to manipulate the fetched data.
Example of the code:
//--- Handler.js ---
const _ = require("lodash");

class Handler {
  constructor() {
    this.fetchData = this.fetchData.bind(this);
    this.getA = this.getA.bind(this);
    this.getB = this.getB.bind(this);
    this.getC = this.getC.bind(this);
  }
  async fetchData(req, res, id) {
    const result = await fetch(...)
    this.data = result;
  }
  getA() {
    ...
    return this.data.A
  }
  getB() {
    ...
    return this.data.B
  }
  getC() {
    ...
    return this.data.C
  }
}
//---- controller.js ----
const Handler = require("../Handler/");

exports.getDataById = async function(req, res) {
  const handler = new Handler();
  await handler.fetchData(req, res, req.params.id);
  return handler.getA();
}
Would it be better to do this instead?
//---- controller.js ----
const fetchData = require("../Handler/");
const getA = require("../Handler/getA");
const getB = require("../Handler/getB");
const getC = require("../Handler/getC");
exports.getDataById = async function(req, res) {
  // no new handler instance created
  const data = fetchData(url)
  return getA(data);
}
Is it considered bad practice to create a new class instance on every request sent by the user?
No, it's not generally considered bad practice. It's normal to need to create objects while processing an incoming request. Look at any database query that runs inside a request handler: it will likely create multiple objects.
Now, whether or not there is a more efficient way to do what you're doing is a different question, and we would need to see your actual code in order to offer advice on that topic.
Would it be better to do this instead?
I don't see a whole lot of reasons for you to put the data into an object before making a single operation on that object.
Something like you proposed (with the addition of await to make it work properly):
exports.getDataById = async function(req, res) {
  // no new handler instance created
  const data = await fetchData(url)
  return getA(data);
}
Seems perfectly fine to me. When to structure things into a class that holds the data in instance properties, versus when to just use a function that operates on the data, is a classic OOP question. It depends on lots of things that can only be judged by seeing and understanding your real code: what you're doing with it, how it is most likely to be expanded in the future, how often you call multiple functions on the same data in one request, and so on.
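For reference, a minimal sketch of what the function-based Handler module could look like (collapsed into a single file for brevity, whereas your snippet splits the helpers into separate files; the URL handling and JSON parsing are assumptions):
//--- Handler/index.js (sketch) ---
// Plain functions instead of a class: the fetched data is passed around explicitly.
async function fetchData(url) {
  const response = await fetch(url); // assumes a fetch implementation (Node 18+ or a polyfill)
  return response.json();
}
function getA(data) {
  return data.A;
}
function getB(data) {
  return data.B;
}
function getC(data) {
  return data.C;
}
module.exports = { fetchData, getA, getB, getC };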
I'm working on an existing NodeJS web service using HapiJS, with Hapi Lab and Sinon for testing. The service connects to a Postgres DB using massiveJs. There's a method, implemented by someone else, that doesn't have unit tests. Now I'm reusing this method and I want to implement some unit tests for it.
This method runs a massivejs transaction internally, persisting to several tables.
async createSomething(payload) {
  const { code, member } = payload;
  const foundCompany = await this.rawDb.ethnics.tenants.findOne({ code });
  if (foundCompany && foundCompany.companyId) {
    const { companyId } = foundCompany;
    const { foreignId } = member;
    return await this.rawDb.withTransaction(async (tx) => {
      const foundMember = await tx.ethnics.members.findOne({ foreign_id: foreignId, company_id: companyId });
      if (!foundMember) {
        // some business logic...
        const newMember = await tx.ethnics.members.insert(member);
        // more business logic persisting to other tables...
        return newMember;
      }
    });
  }
}
Problem is, I don't know how to stub stuff only inside the arrow function, without stubbing the entire arrow function. I just want to stub the calls on tx. I also don't want to use a database, but rather stub the rawDb property. Is that doable from a unit testing perspective?
Yes, it is doable. There are two alternatives:
1. Stub the MassiveJS methods directly.
Example of stubbing the massive method findOne:
const massive = require('massive');
const sinon = require('sinon');
// Stub the massive Readable class method findOne.
// You need to find where the real findOne method lives and stub it there.
const stub = sinon.stub(massive.Readable, 'findOne');
// You can make it resolve:
stub.resolves();
// Or make it throw:
stub.throws(new Error('xxx'));
2. Use an in-memory pg for testing.
Just for testing purposes, you can use a module like test-pg-pool or pg-mem. Before the tests, start the test pg, and after the tests finish, destroy it.
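As for stubbing only the calls made through tx without touching a real database, a minimal sketch could look like the one below. The table names come from your snippet, but `service` (the object that owns createSomething and rawDb) and the payload values are assumptions that depend on how your project wires things up:
const sinon = require('sinon');
// A test helper: `service` is whatever object owns createSomething and rawDb.
async function exerciseCreateSomething(service) {
  // Fake transaction whose methods we fully control.
  const tx = {
    ethnics: {
      members: {
        findOne: sinon.stub().resolves(null),     // pretend the member doesn't exist yet
        insert: sinon.stub().resolves({ id: 1 }), // pretend the insert succeeded
      },
    },
  };
  // Replace rawDb so no real database is touched.
  service.rawDb = {
    ethnics: {
      tenants: { findOne: sinon.stub().resolves({ companyId: 42 }) },
    },
    // withTransaction simply invokes the callback with our fake tx.
    withTransaction: (fn) => fn(tx),
  };
  const result = await service.createSomething({ code: 'x', member: { foreignId: 'y' } });
  sinon.assert.calledOnce(tx.ethnics.members.insert);
  return result;
}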
I have a producer of data, and a consumer of data. The producer produces asynchronously, and in turn I would like the consumer to consume asynchronously when there is data to consume.
My immediate thought for solving this problem is to use some queue object with an awaitable shift/get, much like the asyncio.Queue in the Python standard library.
However, I searched and I couldn't find any JS libraries that have this type of data structure for me to use. I would have thought this would be a common pattern.
What is the common pattern for solving this problem in JS, and are there any libraries to help?
If the producer of the data is just spontaneously producing data and the consumer just wants to know when there's some new data, then this sounds like the consumer should simply subscribe to an event that is triggered any time there is new data. You can use the EventEmitter object in node.js to create an emitter that the consumer listens to and the producer triggers whenever there's new data. No external library is needed, as the built-in EventEmitter has all the tools you need to register for notifications and to trigger them.
If the consumer of the data requests data and the producer then goes and gets it asynchronously, then this is just a typical asynchronous API. The API should probably return a promise, and the producer will resolve the promise with the new data when it's ready, or reject it if there was an error retrieving the data.
With the little bit of description you've provided, I don't see any particular need for an elaborate queuing system. It just sounds like publish/subscribe or a simple event notification system. If the problem is more complicated, then please give us more details on the producer of the data so we can better match the tools available in node.js to the needs of your particular problem.
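A minimal sketch of the EventEmitter approach (the event name and data shape are just illustrative):
const EventEmitter = require('events');
const feed = new EventEmitter();
// Consumer: subscribe once, get called for every piece of data.
feed.on('data', (item) => {
  console.log('consumed', item);
});
// Producer: emit whenever new data is available.
setInterval(() => {
  feed.emit('data', { ts: Date.now() });
}, 1000);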
In the case of a small, simple program, I would just write something like this:
var data = [];
function Consumer() {
  this.isConsuming = false;
  this.notify = function() {
    if (this.isConsuming) {
      return;
    }
    this.consumeNext();
  }
  this.consumeNext = async function() {
    this.isConsuming = true;
    if (data.length > 0) {
      // consume one datum
      console.log(await this.consume(data.shift()));
      // consume next datum
      this.consumeNext();
    } else {
      this.isConsuming = false;
    }
  }
  this.consume = async function(datum) {
    return datum * datum;
  }
}
var consumer = new Consumer();
// call consumer.notify() when your producer produces
data.push(1, 2, 3, 4, 5);
consumer.notify();
This may give you another idea. In my scenario the producer creates data every 1000 milliseconds, and the consumer waits until the producer has created new data and resolved its promise.
let dataArray = []
let consumerResolver = null
function producer() {
  setInterval(() => {
    const newData = "my new Data"
    dataArray.push(newData)
    if (consumerResolver) {
      consumerResolver()
    }
  }, 1000);
}
async function consumer() {
  while (true) {
    if (dataArray.length === 0) {
      const producerPromise = new Promise((resolve) => {
        consumerResolver = resolve
      })
      await producerPromise
    }
    consumerResolver = null
    const data = dataArray.shift()
    console.log(data)
  }
}
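If you do want the awaitable shift/get shape described in the question, a minimal sketch of such a queue (no library, just Promises; the class name is made up) could look like this:
class AsyncQueue {
  constructor() {
    this.items = [];
    this.waiters = []; // resolvers of shift() calls that are waiting for data
  }
  push(item) {
    const waiter = this.waiters.shift();
    if (waiter) {
      waiter(item); // hand the item straight to a waiting consumer
    } else {
      this.items.push(item);
    }
  }
  shift() {
    if (this.items.length > 0) {
      return Promise.resolve(this.items.shift());
    }
    return new Promise((resolve) => this.waiters.push(resolve));
  }
}
// Producer side: queue.push(data); consumer side: const item = await queue.shift();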
Background
I am writing some asynchronous code in Express. In one of my endpoints I need to retrieve some data from Firebase for 2 separate things:
one posts some data
the other retrieves some data to be used in a calculation and another post.
These 2 steps are not dependent on one another, but obviously the end result that should be returned is (just a success message to verify that everything was posted correctly).
Example code
await postData(request);
const data = await retrieveUnrelatedData(request);
const result = calculation(data);
await postCalculatedData(result);
In the code above, postData will be holding up the other steps in the process, even though those steps (retrieveUnrelatedData & postCalculatedData) do not require the awaited result of postData.
Question
Is there a more efficient way to get retrieveUnrelatedData to fire before the postData promise has resolved?
Yes, of course! The thing you need to know is that async/await uses Promises as its underlying technology. Bearing that in mind, here's how you do it:
const myWorkload = request => Promise.all([
  postData(request),
  calculateData(request)
])
const calculateData = async request => {
  const data = await retrieveUnrelatedData(request);
  const result = calculation(data);
  return await postCalculatedData(result);
}
// Not asked for, but if you had a parent handler calling these it would look like:
const mainHandler = async (req, res) => {
  const [postStatus, calculatedData] = await myWorkload(req)
  // respond back with whatever?
}
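One caveat to keep in mind: Promise.all rejects as soon as either branch rejects. If you would rather let both branches settle and inspect failures afterwards, Promise.allSettled (Node 12.9+) is a possible alternative:
// inside mainHandler (or any async function):
const results = await Promise.allSettled([postData(request), calculateData(request)]);
// Each entry is { status: 'fulfilled', value } or { status: 'rejected', reason }.
const failures = results.filter(r => r.status === 'rejected');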
I wrote this code, but I don't know why my console.log gives me undefined, and how can I get the value of prototype properties or functions?
(function() {
  function Users() {}
  Users.prototype.getUsers = function() {
    $.getJSON('/usersList/users.json', function(request) {
      Users.prototype.allUsers = request.users;
    });
  }
  var users = new Users();
  users.getUsers();
  console.log(users.allUsers); // undefined at this point
}());
What I want to achieve is to have this user list as a property of my object, like users.allUsers, in some array.
Thanks
This is because $.getJSON is asynchronous.
When you create an object from the Users prototype and call getUsers, $.getJSON has not invoked its callback yet by the time you log users.allUsers.
On the other hand, you're trying to simulate global variables by adding data properties to a prototype. This is bad practice and, at the end of the day, you've already realized that you got stuck with this approach.
What you need is initialization code that runs when the site or app starts, so that allUsers is fetched once and injected wherever it's needed:
const initApp = () => Promise.all([
  $.getJSON("/someData/users.json")
  // other comma-separated async functions which return promises
])
initApp().then(([allUsers, otherData, yetAnotherData]) => {
  callSomeFunctionWhichRequiresAllUsers(allUsers)
})
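Alternatively, keeping the shape of the original code, a minimal sketch that just returns the request's promise, so callers can wait for the data instead of reading allUsers too early (assumes jQuery 1.5+, where the jqXHR returned by $.getJSON is thenable):
function Users() {}
Users.prototype.getUsers = function() {
  // Return the chained promise; the arrow function keeps `this` bound to the instance.
  return $.getJSON('/usersList/users.json').then((response) => {
    this.allUsers = response.users; // store on the instance, not the prototype
    return this.allUsers;
  });
};
var users = new Users();
users.getUsers().then(function(allUsers) {
  console.log(allUsers); // the data is actually here now
});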
I've tried to search for a similar problem on here but surprisingly couldn't find one posted already.
I use expressjs v4 framework and I'm constructing my routes like this:
'use strict';
let express = require('express');
let router = express.Router();
let users = require('./modules/users');
router.post('/',users.add);
router.put('/edit/:id',users.edit);
As you can see above, I'm requiring let users = require('./modules/users')
Now the users module looks (let's say) like this:
'use strict';
let usersDbModule = require('...');
let users = {
  'add': (req, res, next) => {
    let callback = (err, record) => {
      // ...do something
      users.function1(record)
    }
    usersDbModule.save(req, callback);
  },
  'function1': (record) => {
    users.function2()
  },
  'function2': () => {
    // ...do something with next() function
  }
}
You can notice that the router from the first code block is using the module's add function. The add function is a standard Express middleware function, but now things are getting more complicated.
As you can see, the add function has next as one of its params. Now, I'm doing some complex callback calls from different functions, and let's say that in the end I want to call next in function2.
My question is: what is the best way of passing the req, res and next params between different callback functions within the same module?
I came up with 3 different methods of doing it:
Method 1:
Pass req, res or next around to all the functions in the chain as necessary, so in this case I would have to pass next to callback, then to function1, and then from function1 to function2.
Not the best way in my opinion: difficult to maintain, read, and probably test as well.
Method 2:
Wrap function1 and function2 with closures in add, passing all the necessary params. In this particular case I would have to wrap only function2 with a closure passing next, so it would look something like this:
'add': (req, res, next) => {
  users.function2(next);
  // ...rest of the code of the add function
}
And then function2 itself:
'function2': (next) => {
  return () => {
    // ...now I have access to next here
    // without a need to pass it to each and every
    // function in the chain
  }
}
Method 3:
Append all the necessary functions/variables to res.locals and pass only the res object around.
It has exactly the same problem as Method 1, so I would personally be in favour of Method 2, but I'm not sure whether it makes the code less readable, and maybe there are other problems with it; I haven't tested it in production nor in a development environment with the team.
I would really like to hear what you guys are using and how it plays out in your projects/teams. Any preferences, best practices, patterns? Please share; I really want to know what the best way is.
Maybe there is an even better way of doing it?
All feedback greatly appreciated!
Real life example:
Example usage for function1 & function2, and possibly more...
Let's say we have an adapter that fetches data from an external API, then it needs to save the data into a database and return a response. Let's also assume that the data returned from the API expires after 5s. If the client hits the route within that 5s span, it gets the data from the database; if the time between calls was longer, it repeats the operation, calling the API.
This would of course be more complicated than just function1 and function2. It would require a lot of callback functions, both from the adapter and the database, plus separate functions for fetching data from the database and the adapter, saving data into the database, and eventually deleting data from the database; that gives at least 4 callback functions already.
I think that mixing Express and app logic is not a good idea.
I use the following structure in my projects:
// middlewares/auth.js
// Example middleware
exports.isAdmin = function (req, res, next) {
  if (smth-admin-check)
    next();
  else
    next(new Error(403));
}

// routes/index.js
// Include only modules from /routes
let user = require('./user');
let auth = require('../middlewares/auth');
...
app.get('/user/:id(\\d+)', user.get);
app.post('/user', auth.isAdmin, user.post); // only admin can add user

// routes/user.js
// Call model methods and render/send data to browser
// Don't know about db
let User = require('/models/user');
...
exports.get = function(req, res, next) {
  let id = req.params.id;
  // I cache most data in memory to avoid callback-hell
  // But in common case code like below
  User.get(id, function(err, u) {
    if (!u)
      return next(new Error('Bad id'));
    ... render page or send json ...
  });
}
...
exports.post = function(req, res, next) { ... }

// models/user.js
// Encapsulate user logic
// Don't use any express features
let db = require('my-db');
...
class User {
  get(id, callback) { ... }
  add(data, callback) { ... } // return Error or new user
  ...
}
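One detail the snippet leaves out: for routes/user.js to call User.get(id, ...) as written, models/user.js presumably exports a shared instance at the end, e.g.:
// at the bottom of models/user.js (assumption: routes expect a ready-to-use instance)
module.exports = new User(); // methods use the module-level `db` required above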