How to add a new method to the response and request - javascript

I want to add a new method to the request and response objects in Node.js.
How can I do this efficiently?
I can't understand how this is done in express.js.

Being JavaScript, there are numerous ways to do this. The pattern that seems most reasonable to me for express is to add the function to each request instance in an early middleware:
// just an example
function getBrowser() {
  return this.get('User-Agent');
}

app.use(function (req, res, next) {
  req.getBrowser = getBrowser;
  next();
});

app.get('/', function (req, res) {
  // you can call req.getBrowser() here
});
In express.js, this is done by adding additional functions to the prototype of http.IncomingMessage.
https://github.com/visionmedia/express/blob/5638a4fc624510ad0be27ca2c2a02fcf89c1d334/lib/request.js#L18
This is sometimes called "monkey patching" or "freedom patching". Opinions vary on whether this is fantastic or terrible. My approach above is more prudent and less likely to cause unintended interference with other code running inside your node.js process. To add your own:
var http = require('http');
http.IncomingMessage.prototype.getBrowser = getBrowser; //your custom method
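For illustration, a minimal self-contained sketch of the prototype approach (the route and port are just examples):
var http = require('http');
var express = require('express');

// patch once, at startup, before any requests are handled
http.IncomingMessage.prototype.getBrowser = function () {
  return this.get('User-Agent'); // req.get is provided by Express
};

var app = express();

app.get('/', function (req, res) {
  // every req now has getBrowser(), no per-request middleware required
  res.send('Your browser: ' + req.getBrowser());
});

app.listen(3000);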

Add methods to the express.response object:
const express = require('express');
express.response.getName = () => { return 'Alice' };
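As a rough usage sketch (the getName method above and the route below are purely illustrative), any handler can then call the new method, because express.response is the prototype every res object inherits from:
const express = require('express');
const app = express();

// add the method once, at startup
express.response.getName = () => { return 'Alice' };

app.get('/name', (req, res) => {
  // res inherits getName from express.response
  res.send(res.getName());
});

app.listen(3000);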

Related

In Node.js, how to have individual "allowed methods" for each Express endpoint?

I am building a REST API with Node.js and Express.
I am trying to implement a small CORS system for the endpoints:
cors.js
export const cors = ({ allowedMethods }) => {
  return (req, res, next) => {
    res.header('Access-Control-Allow-Origin', '*');
    if (!allowedMethods.includes(req.method)) {
      return res.sendStatus(405);
    }
    next();
  };
};
server.js
const app = express();
app.use(express.json());

app.post('/', cors({ allowedMethods: ['POST'] }), (req, res) => {
});

app.put('/photo', cors({ allowedMethods: ['PUT'] }), (req, res) => {
});
Let's say I have these two endpoints. Every time I hit a URL with a method that isn't allowed, I get a 404 response, but I want "405 Method Not Allowed". (That makes sense, because of app.<method>().)
How is it possible to look up the allowedMethods for each endpoint and decide what to do next?
I've tried using app.use(cors()), but that catches all endpoints and I'll never know about the allowedMethods specified for a specific endpoint. And I don't think the cors npm package would do it.
You're conflating two different things here.
CORS has a mechanism for stating which HTTP methods are allowed to be used to make a cross-origin request from JavaScript, which is used in combination with a preflight OPTIONS request.
HTTP has a mechanism to say that a request is using an HTTP method that is not allowed at all.
Since you are attempting to generate 405 Method Not Allowed responses, you are dealing with the latter. This has nothing to do with CORS and should be kept separate from CORS middleware.
app.all() will match all requests for a path, no matter what HTTP method is used, so you can write:
const method_not_allowed = (req, res) => {
  return res.sendStatus(405);
};

app.post('/', cors(), (req, res) => { });
app.all('/', method_not_allowed);
Any POST request will get caught by app.post('/') and any other request to / will carry on and get matched by app.all('/').
That said, it looks like you are reinventing the wheel and could just use this existing module.
If you need to also deal with Access-Control-Allow-Methods then you need to take care of that separately.
That does need to be handled with the rest of your CORS logic, and you need to handle both the methods you want requests to be made with (post in this example) and OPTIONS (for the preflight).
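For completeness, here is a rough sketch of how that CORS side could be handled alongside the 405 handler above (the corsFor helper and the method list are just illustrative; only the header names are standard):
const corsFor = (methods) => (req, res, next) => {
  res.header('Access-Control-Allow-Origin', '*');
  // tell cross-origin JavaScript which methods it may use
  res.header('Access-Control-Allow-Methods', methods.join(', '));
  if (req.method === 'OPTIONS') {
    // answer the preflight request directly
    return res.sendStatus(204);
  }
  next();
};

app.options('/', corsFor(['POST', 'OPTIONS'])); // preflight
app.post('/', corsFor(['POST', 'OPTIONS']), (req, res) => { });
app.all('/', method_not_allowed); // anything else gets a 405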
If you don't mind a bit of configuration:
const routes = {
  "/": {
    get: (request, response) => { ... },
    post: (request, response) => { ... },
  },
  "/photo": {
    put: (request, response) => { ... },
  },
};

app.use((request, response, next) => {
  const route = routes[request.url] || {};
  if (!Object.keys(route).includes(request.method.toLowerCase())) {
    return response.sendStatus(405);
  }
  next();
});

for (let route in routes) {
  for (let method in routes[route]) {
    app[method](route, routes[route][method]);
  }
}
Note, though, that you'll quickly run into trouble with route params (/photos/:photo_id), since the lookup by request.url won't match a parameterised path.
EDIT: didn't know about app.all, much cleaner!
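With app.all the same 405 behaviour also works for parameterised paths; a minimal sketch (the /photo/:photo_id path is just an example):
app.put('/photo/:photo_id', (req, res) => { });
// every other method on /photo/:photo_id falls through to this handler
app.all('/photo/:photo_id', (req, res) => res.sendStatus(405));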

How can I add a custom value to my Node.js HTTP request for Express to consume

I have an Express server listening on a Unix Domain Socket. I'm making an http request like so:
const requestOptions = {socketPath, method, path, headers, _custom: {foo: () => 'bar'}}
http.request(requestOptions, responseCb)
And I want to use _custom in my Express route handlers:
app.get('/', (req, res) => {
  console.log(req._custom)
})
Is this possible?
EDIT: _custom is an object with functions.
EDIT: The answer I marked as correct is the best solution I could find, however it does not allow sending of objects (which is what I really want).
You can add _custom to your req object by adding custom middleware in your Express server prior to your routes that need to use req._custom.
app.use((req, res, next) => {
  if (req.get('X-Custom-Header')) {
    // add custom to your request object
    req._custom = req.get('X-Custom-Header');
  }
  return next();
});
On the client side you can add the custom header:
let headers = {
  'X-Custom-Header': 'my-custom-value'
};
const requestOptions = {socketPath, method, path, headers};
http.request(requestOptions, responseCb)
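Since the EDIT above notes that the real goal is to send an object, one possible workaround (a sketch, assuming the object contains only JSON-serializable data; functions cannot cross the HTTP boundary) is to serialize it into the header and parse it back in the middleware:
// client side: JSON-encode a plain data object into the header
let headers = {
  'X-Custom-Header': JSON.stringify({ foo: 'bar', count: 1 })
};

// server side: parse it back in the middleware
app.use((req, res, next) => {
  const raw = req.get('X-Custom-Header');
  if (raw) {
    try {
      req._custom = JSON.parse(raw);
    } catch (e) {
      // ignore malformed header values
    }
  }
  return next();
});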

What javascript library sets the _parsedUrl property on a request object

I am working with node/express/passport, looking at code that attempts to use the request like this:
req._parsedUrl.pathname;
I cannot figure out where this variable is coming from. Is this a canonical variable name that is set in a common .js library? It doesn't seem exposed in any headers.
req._parsedUrl is created by the parseurl library which is used by Express' Router when handling an incoming request.
The Router doesn't actually intend to create req._parsedUrl. Instead parseurl creates the variable as a form of optimization through caching.
If you want to use req._parsedUrl.pathname do the following instead in order to ensure that your server doesn't crash if req._parsedUrl is missing:
var parseUrl = require('parseurl');

function yourMiddleware(req, res, next) {
  var pathname = parseUrl(req).pathname;
  // Do your thing with pathname
  next();
}
parseurl returns req._parsedUrl if it already exists; otherwise it does the parsing for the first time. That way you get the pathname in a safe way while still not parsing the URL more than once.
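For illustration, a minimal usage sketch calling parseurl directly inside a route handler (the route path is just an example):
var express = require('express');
var parseUrl = require('parseurl');
var app = express();

app.get('/users', function (req, res) {
  // reuses the cached req._parsedUrl when Express has already parsed the URL
  var pathname = parseUrl(req).pathname;
  res.send('pathname: ' + pathname);
});

app.listen(3000);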
Alternatively, you can write a middleware that sets the property on req yourself.
var myMiddleWare = function () {
  return function (req, res, next) {
    req._parsedUrl = 'SOME_THING';
    next();
  };
};

app.get('/', myMiddleWare(), function (req, res) {
  console.log(req._parsedUrl); // SOME_THING
  res.end();
});
See the Express middleware documentation for more details.

Node + Express: Does Express clone the req and res objects for each request handler?

If Javascript copies objects by reference, then does Express clone the req and res objects before passing them down to each request handler? If not, then how does Express handle possible conflicts between routes running simultaneously and using the same reference to req and res?
Express doesn't clone req and res. You can see that in this example app:
var http = require('http');
var express = require('express');
var app = express();

var testReq, testRes;

app.use(function(req, res, next) {
  console.log('middleware');
  testReq = req;
  testRes = res;
  next();
});

app.get("*", function(req, res) {
  console.log('route');
  console.log('req the same? ' + (req === testReq)); // logs true
  console.log('res the same? ' + (res === testRes)); // logs true
  res.send(200);
});

http.createServer(app).listen(8080);
Test with curl:
$ curl localhost:8080
This is a useful feature - it means that middleware functions can use req and res to pass data to downstream functions. For example an authorisation middleware might add a req.user property.
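A rough sketch of that pattern (the token check below is a placeholder, not a real authorisation scheme):
app.use(function (req, res, next) {
  // placeholder check: a real app would verify a session or token here
  if (req.get('Authorization') === 'secret-token') {
    req.user = { name: 'alice' }; // downstream handlers can read req.user
  }
  next();
});

app.get('/profile', function (req, res) {
  if (!req.user) {
    return res.sendStatus(401);
  }
  res.send('Hello ' + req.user.name);
});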
Concurrency isn't a concern here because Node.js is single threaded - it is not possible for two routes to run at the same time.
It also doesn't run a single request through multiple routes - you can add another get("*") route and you'll see that it won't get called.
As JavaScript is single threaded, there is no simultaneous route handling and there are no multithreading pitfalls. req and res are not cloned; they are extended.
When we say parallel requests, it means the browser creates a connection and hands it to the server in the form of a request (or whatever you want to call it).
A sequence of middleware will receive that request sequentially, one after the other, based on the calls to next, with each middleware appending whatever it needs to it.
But router.all('path', [middleware]) (and the '*' path) is triggered for every request made by the clients.

Better way of exchanging data between two Node.js routes

Is there an optimized way of communicating between two Node.js routes?
I mainly use Express and I don't know of an Express module that provides such functionality. I am unsure whether I am missing the right framework or the right npm package.
I know that this functionality can be implemented via simple Ajax, but I am looking for a cleaner solution, such as websockets. I use socket.io, but it seems focused on communication between the client and the server, not between two servers.
Update: what I want to achieve
Instead of placing all the requests I need in one file, such as:
router.post('/entode', function(req, res) {
  var word_rf = req.body.word1;
  if ( /^[a-zA-Z]+$/g.test(word_rf) && word_rf.length < 18 ) {
    entode.entode_function_rf(word_rf, function (res1) {
      io.sockets.emit('series_results', res1);
      res.send('done');
    });
  } else {
    io.sockets.emit('error_rf', 'incorrect input');
    res.send('done');
  }
});
//-------------------------
router.post('/yandex_short', function(req, res) {
  var word_rf = req.body.word1;
  ...
});
//-------------------------
router.post('/yandex_long', function(req, res) {
  var word_rf = req.body.word1;
  ...
});
I prefer having something like:
router.post('/', function(req, res) {
  var word_rf = req.body.word1;
  var aPromise = new Promise(function(resolve, reject) {
    makeRequestFn('./some_route/' + word_rf, function(returnedData) {
      resolve(returnedData);
    });
  });
  aPromise.then(function(data) {
    io.sockets.emit('series_results', data);
    res.send('done');
  });
  //The other routes stay inside this scope
});
This way I wouldn't need to require the modules, as the logic would be transferred directly to the other files.
Sorry I am late to the party, but I think what you want is inter-process communication. Also, instead of reasoning in terms of routes, think in terms of microservices. What you end up with are decoupled mini apps that can share data between themselves via websockets, message queues, HTTP, etc.
Read this excellent article from the NGINX blog. It is part of a series that will guide you through the process of building your own microservices.
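For illustration, a minimal sketch of two decoupled Express services talking over HTTP (both run in one process here only for brevity; the ports, paths and payloads are placeholders):
const express = require('express');
const http = require('http');

// "worker" service: owns the word-processing logic
const worker = express();
worker.use(express.json());
worker.post('/process', (req, res) => {
  // placeholder for the heavy logic that used to sit in one big route file
  res.json({ result: String(req.body.word).toUpperCase() });
});
worker.listen(4001);

// public-facing service: delegates to the worker over HTTP
const gateway = express();
gateway.use(express.json());
gateway.post('/', (req, res) => {
  const payload = JSON.stringify({ word: req.body.word1 });
  const upstreamReq = http.request({
    host: 'localhost',
    port: 4001,
    path: '/process',
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Content-Length': Buffer.byteLength(payload)
    }
  }, (upstreamRes) => {
    let body = '';
    upstreamRes.on('data', (chunk) => { body += chunk; });
    upstreamRes.on('end', () => res.type('json').send(body));
  });
  upstreamReq.end(payload);
});
gateway.listen(4000);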
Cheers,
