Express performance requiring modules - javascript

I have a question regarding the performance of a Node.js application.
When I have the following Express app in Node.js:
const app = require('express')();
const about = require('./about');
app.use('/about', about);
app.listen(3000, () => console.log('Example app listening on port 3000!'));
My current understanding is that these files only need to be require()d (as CommonJS modules) when the server starts up.
Question:
Does the Express application have to execute the require() statements on every request to the server, or is this only necessary when starting the server?
Any extra information about how Express works under the hood would be nice.

No, those require() calls only run once, when you start the app. It would be different if you included them inside the route handlers:
app.use('/about', (req, res) => {
  const some = require('some');
});
Even in that scenario, required modules are cached, so it's not a big deal.
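A quick way to convince yourself of this: require() resolves a module once, stores it in require.cache, and returns the same object on every later call. A minimal sketch, assuming express is installed:
// require() caches modules: the second call returns the exact same object
const first = require('express');
const second = require('express');
console.log(first === second); // true (resolved from require.cache, not re-read from disk)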

Related

Restart express server every time a request is made to an endpoint

Basically, I would like to 'restart' it every time a client sends a request to /reset. How can I do that? Any help is extremely valuable; I don't know how to approach this yet.
As @jfriend00 mentioned, you can use a process manager like pm2 to monitor the process. Below are the simple steps to follow.
Node.js code: exit the process on a route
const express = require("express");
const app = express();
app.get("/", (_, res) => res.send("hello"));
app.get("/restart", (_, res) => {
  // exiting here lets pm2 bring the process back up
  process.exit(0);
});
app.listen(8080, () => console.log("Server is running on :8080"));
Run the server using pm2 in watch mode:
./node_modules/.bin/pm2 start app.js --watch
He also mentioned that you should not use this approach in a production environment, and I agree with that.
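If the client should actually receive a reply before the process goes down, a small variation (just a sketch, not part of the original answer) is to send the response first and exit once it has been flushed; pm2 restarts the app automatically when the process exits:
app.get("/restart", (_, res) => {
  // exit only after the response has been fully written to the socket
  res.once("finish", () => process.exit(0));
  res.send("Restarting...");
});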

Next.js: middlewares without a custom server or wrappers

Is it possible to create a Next.js app with middlewares without using a custom server or wrapper handlers?
When I create an Express app, I split my code into separate modules, loaded with require statements, that register the different Express middlewares:
const express = require("express");
const winston = require("winston"); // used by the logger call below
const config = require("config");   // provides the fallback port
const app = express();
// I call the functions in each module to use the different middlewares
require("./startup/cors")(app);
require("./startup/routes")(app);
require("./startup/db")();
const port = process.env.PORT || config.get("port");
const server = app.listen(port, () =>
  winston.info(`Listening on port ${port}...`)
);
module.exports = server;
For example, the ./startup/cors module contains the following lines:
const cors = require("cors");
module.exports = function (app) {
  app.use(cors());
};
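For completeness, a ./startup/routes module in the same style might look like the following sketch; the commented-out router path is a placeholder, not taken from the question:
const express = require("express");
module.exports = function (app) {
  app.use(express.json());
  // mount routers here, e.g.:
  // app.use("/api/users", require("../routes/users"));
};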
However, with my Next.js app, I don't understand how to get something like this without creating a custom server.
I already came across the article Use middleware in Next.js without custom server but it uses a wrapper solution I would like to avoid.
Currently, Next.js only supports middleware for API routes; there is no middleware support for regular page routes yet.
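To illustrate what that API-route support looks like, here is a minimal sketch of the commonly documented pattern, assuming the cors package is installed; note that it still relies on a small runMiddleware helper, so it may not fully avoid the kind of wrapper you want to skip:
// pages/api/hello.js
import Cors from "cors";

const cors = Cors({ methods: ["GET", "POST"] });

// run an Express-style middleware inside a Next.js API route
function runMiddleware(req, res, fn) {
  return new Promise((resolve, reject) => {
    fn(req, res, (result) => (result instanceof Error ? reject(result) : resolve(result)));
  });
}

export default async function handler(req, res) {
  await runMiddleware(req, res, cors);
  res.json({ message: "Hello from an API route with CORS enabled" });
}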

React API communication refuse on cloud IDE (MERN Stack)

I'm learning the MERN stack by following the tutorial below.
https://medium.com/#beaucarnes/learn-the-mern-stack-by-building-an-exercise-tracker-mern-tutorial-59c13c1237a1
I've decided to use a cloud IDE called goorm IDE (https://ide.goorm.io), which is similar to the Cloud9 IDE. As I followed the tutorial, I ran into a simple problem: the testing environment is a little different because I cannot access localhost on my machine (or at least I don't know how to).
Working on the back end was not much of a problem, because this IDE provides a domain I can access, so I could run just server.js (not the whole React app) and test the API endpoints easily.
But now that I run the whole React app while working on the front-end side, I've discovered that server.js is no longer accessible the way it was when I ran only the server, and the connection is refused as shown below.
Below is the actual code I'm using on the front-end side to make the API call to the server.
axios.post('http://localhost:5000/users/add', user).then(res => console.log(res.data));
// I tried changing the url to external domain.. changing the directory.. with no luck..
And below is the code for the server.js file.
const express = require('express');
const cors = require('cors');
const mongoose = require('mongoose');
require('dotenv').config();
const app = express();
const port = process.env.PORT || 5000;
app.use(cors());
app.use(express.json());
const uri = process.env.ATLAS_URI;
mongoose.connect(uri, { useNewUrlParser: true, useCreateIndex: true });
const connection = mongoose.connection;
connection.once('open', () => {
  console.log("Mongo DB database connection established successfully");
});
const exercisesRouter = require('./routes/exercises');
const usersRouter = require('./routes/users');
app.use('/exercises', exercisesRouter);
app.use('/users', usersRouter);
app.listen(port, process.env.IP, () => {
  console.log(`Server is running on port: ${port}`);
});
And below is the URL of the page where I'm actually trying to make the API call.
https://zimen.run.goorm.io/user
Other environment information:
Running URL and Port setting from the IDE: https://zimen.run.goorm.io:3000
React app directory: root/mern-exercise-tracker/
Dependencies: express, create-react-app, mongoose, cors
I'm wondering if it would be better to start the whole project again in a clean local environment.
If someone could please help, it would be much appreciated.
If any other information is needed, please let me know; you can even join the IDE online, since this is a cloud IDE.
Thank you in advance.
== UPDATE ==
Sorry, I forgot to attach the error log.
xhr.js:178 POST https://localhost:3000/users/add net::ERR_CONNECTION_REFUSED
Uncaught (in promise) Error: Network Error
at createError (createError.js:16)
at XMLHttpRequest.handleError (xhr.js:83)
It seems you have a CORS error; I solved this problem by installing a CORS extension in Google Chrome.
Set PORT=5000 on the Node.js server side and keep the default PORT=3000 for the React front end.
Set your own separate custom 'Running URL and Port' for Node.js and React.
Example:
https://[custom-client-side-name].goorm.io for React at PORT=3000
and
https://[custom-server-side-name].goorm.io for Node.js at PORT=5000
For the axios call on the React side, use:
axios.post('https://my-server-side.goorm.io/users/add', user).then(res => console.log(res.data));
and similarly, the React front end can be reached at the custom URL you have set up: https://my-client-side.goorm.io/[custom-routes]
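One way to keep that server URL in a single place on the React side is an axios instance with a baseURL; a sketch, where the environment variable name and the fallback URL are placeholders:
// api.js
import axios from "axios";

// every component imports this instance instead of hard-coding the host
const api = axios.create({
  baseURL: process.env.REACT_APP_API_URL || "https://my-server-side.goorm.io",
});

export default api;

// usage: api.post('/users/add', user).then(res => console.log(res.data));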

Can't run puppeteer in react app, Module not found: Can't resolve 'ws' when compiling

I was wondering if it is possible to run Puppeteer in my React app. Whenever I try to run Puppeteer in my React app I get "Module not found: Can't resolve 'ws'". I've tried installing ws but I still get the same error.
Simple answer: you can't run Puppeteer in a React app.
React is a client-side framework, which means it runs in the browser.
Puppeteer, on the other hand, is a Node.js library; it needs Node.js and runs on the server side.
Puppeteer is a Node library which provides a high-level API to control Chrome or Chromium over the DevTools Protocol. Puppeteer runs headless by default, but can be configured to run full (non-headless) Chrome or Chromium.
Expanding on the above answer: you cannot run Puppeteer directly in a React app. React is a front-end framework, and you need Puppeteer to run on a Node.js server. It's a simple "fix", and I wanted to explain it a little more than the answer above does.
The steps to get them to work together would be:
Set up an Express server. You can create one like so: make a separate directory (e.g. reactServer), run npm init in it, and npm i express.
In your Express server you allocate a path:
const express = require('express')
const app = express()
const port = 5000
app.get('/', (req, res) => {
  res.send('Hello World!')
})
app.get('/my-react-path', (req, res) => {
  // run any scripts you need here
})
app.listen(port, () => {
  console.log(`Example app listening at http://localhost:${port}`)
})
Then your React app will interact with your server like so:
localhost:5000/my-react-path
In /my-react-path you can create a function that fires up Puppeteer, does all the things you want on the server side, and returns the results to React. The above example is just to get you started; it doesn't have anything related to Puppeteer yet.
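For example, here is a sketch of what the /my-react-path route could do with Puppeteer; the target URL and the returned data are placeholders rather than anything from the question:
const express = require('express');
const puppeteer = require('puppeteer');

const app = express();

app.get('/my-react-path', async (req, res) => {
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    await page.goto('https://example.com');
    const title = await page.title(); // grab something simple to send back
    res.json({ title });
  } catch (err) {
    res.status(500).json({ error: err.message });
  } finally {
    await browser.close();
  }
});

app.listen(5000, () => console.log('Puppeteer server listening on :5000'));
The React app can then call this route with axios or fetch and render whatever comes back.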

NodeJS .fork() using threads

I need to run Node servers for the development of several independent projects on a single box.
With Nginx in front routing based on virtual hosts, running 15+ Node instances working separately isn't normally much of a problem. The catch is that just starting Node that many times eats all my RAM from the overhead of the internal libraries alone.
So does there exist a solution for this, allowing me to run several largely-independent Node servers while sharing the core libraries? Here is what I've considered:
Threads instead of processes—there would exist a master control process that would be able to create a new thread for each instance. I know Node is built to not require threads, but is there a way to utilize them in order to save memory?
Some library or addition to Node allowing me to do the above.
Shared core memory between processes—is this a possibility for Node?
Just require()ing each server into the same Node instance—I can't think how killing or reloading those servers would work. Is there a way to un-require and restart an external module? This method could be ideal by allowing other libraries (Socket.IO, for example) to be shared.
Anyway, this is probably an edge case, so I'm not surprised that a solution isn't obvious.
Does anyone know of a way to implement this?
The existing thread libraries limit your access to I/O pretty severely and are intended for running your own CPU-bound JavaScript code, not serving out data. Shared core memory is certainly possible, but also a bit of a pain. I think your last option is the most viable: you can put together a virtualization system for Node pretty easily.
Here are a couple standalone Express apps.
app-one.js:
var app = require('express')();
app.get('/', function (req, res) {
  res.send('This is app one.');
});
app.listen(3000);
app-two.js:
var app = require('express')();
app.get('/', function (req, res) {
  res.send('This is app two.');
});
app.listen(3001);
Change those by replacing the listen lines with an assignment to module.exports:
app-one-virtual.js:
var app = require('express')();
app.get('/', function (req, res) {
  res.send('This is app one.');
});
module.exports = app;
app-two-virtual.js:
var app = require('express')();
app.get('/', function (req, res) {
  res.send('This is app two.');
});
module.exports = app;
Then write a master app that requires each of those and delegates requests to them based on the incoming host header:
app-master.js:
var http = require('http');
var appOne = require('./app-one-virtual');
var appTwo = require('./app-two-virtual');
http.createServer(function (req, res) {
  if (req.headers.host === 'one.example.com:3000') {
    return appOne(req, res);
  }
  if (req.headers.host === 'two.example.com:3000') {
    return appTwo(req, res);
  }
  res.writeHead(404);
  res.end('Site not found.');
}).listen(3000);
Now just run node app-master.js and you're set. Repeat the pattern for as many apps as you like.
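If the list grows to the 15+ apps you mentioned, a lookup table keyed on the host name scales better than a chain of if statements; here is a sketch in the same style, with placeholder host names:
var http = require('http');

var apps = {
  'one.example.com': require('./app-one-virtual'),
  'two.example.com': require('./app-two-virtual')
};

http.createServer(function (req, res) {
  var host = (req.headers.host || '').split(':')[0]; // ignore the port
  var app = apps[host];
  if (app) {
    return app(req, res);
  }
  res.writeHead(404);
  res.end('Site not found.');
}).listen(3000);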
