I had this idea for a backend design for my single-page app. I would implement it so that routes starting with "/api/" return only the relevant data as JSON; all other routes lead to a full-page load.
Now my idea is to do this in middleware, as such:
app.use(function (req, res, next) {
  if (req.path.split("/")[1] === "api") {
    res.send = res.json;
    // or other custom response function; I see many possibilities here!
  } else {
    ...
  }
  next();
});
app.use(routes);
Now, my question is: does this modify the res object globally, or just for the current request (this instance of res)? My understanding is that only the current one gets modified, and as far as I can tell that's true, but Node is blazingly fast, so it's kind of hard to test on my own (one can only refresh so many tabs in a millisecond!). Does anyone know for sure?
Edit: A lot of the people answering my question have asked why I would want to do this. The point is to abstract what the server does for requests coming from clients with the front end loaded versus clients who need the full page. I'm also considering adding a route for loading partial templates using this same method. By modifying res.send, my controllers can worry only about getting the data and sending it off; res.send will already know whether some rendering is involved. On second thought, though, res.send is really useful on its own, so I might give res a res.answer method or similar instead. (Makes for less confusion too!)
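For illustration, a res.answer could be attached per request like this (a simplified sketch; the plain objects and the returned markers stand in for Express's real req/res and for res.json/res.render):

```javascript
// Hypothetical sketch: attach a per-request res.answer instead of
// overwriting res.send. The name res.answer is illustrative only.
function answerMiddleware(req, res, next) {
  if (req.path.split('/')[1] === 'api') {
    // API clients would get bare JSON (res.json in real Express)
    res.answer = function (data) { return { type: 'json', data: data }; };
  } else {
    // browser clients would get a full page render (res.render)
    res.answer = function (data) { return { type: 'render', data: data }; };
  }
  next();
}

// Demonstrate that each request gets its own res instance:
var res1 = {}, res2 = {};
answerMiddleware({ path: '/api/users' }, res1, function () {});
answerMiddleware({ path: '/home' }, res2, function () {});
console.log(res1.answer({ a: 1 }).type); // "json"
console.log(res2.answer({ a: 1 }).type); // "render"
```

Since the middleware runs once per request and assigns onto that request's own res instance, the two responses end up with different answer behaviors without affecting each other.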
I decided to make this an answer because sometimes future readers don't read the comments.
1) You can modify res and its members to your heart's content. You are operating on an instance of the response, not its prototype. There is no "global" res, but it does have a prototype.
2) Reading the documentation will pay off here. res.send operates identically to res.json when it is passed an object or array. Which is to say that the rest of your code will, in the typical case, run no differently than if you hadn't monkeyed with res.send(), but it will confuse the heck out of someone (maybe you) several months or years later.
I tested this, and each time a request came in, the response object had the original send function instead of the changed value. So no, it doesn't change it globally.
Here is the test I did:
app.use(function (req, res, next) {
  console.log(res.send);
  res.send = 'a';
  console.log(res.send);
  next();
});
Though I'm still not quite sure why you want to do this. It seems like it would be very confusing when someone looks at your API routes and sees res.send(), but the effect is that of res.json(). Why can't you just use res.json in your API route functions?
Related
I joined a small team of devs at a startup. We have not even launched yet. I have been handed a backend service written in Node/Express. I have not worked with this tech beyond small pet projects. I was looking into implementing a style guide just to keep code consistent, with the goal of rolling it out across our other backend services as well.
That brought me to the Airbnb style guide. This part jumped out at me.
Never mutate parameters
// bad
function f1(obj) {
  obj.key = 1;
}

// good
function f2(obj) {
  const key = Object.prototype.hasOwnProperty.call(obj, 'key') ? obj.key : 1;
}
In express there are typically controllers that get defined like so:
async function someController(req, res, next) {
  // I've seen similar code to this
  req.someNewProp = "Some new value.";
  res.status(200).json({ "someJSONKey": "someJSONVal" });
}
Middleware typically gets defined like this:
// Route
router.get('/endpoint', function1, function2);

async function function1(req, res, next) {
  // I've seen similar code to this
  req.someNewProp = "Some new value.";
  // Pass req and res to function2
  next();
}
I notice that the req object gets modified a lot as it is passed around. Data gets added to it in middleware and other functions along the way before the response is returned. The original dev who authored the code referred to it as "keeping things in request scope." But that seems to directly contradict a major point in the style guide, and it made me wonder whether this is bad practice.
So the question now is: is there a "better" or more widely accepted way to keep track of things in the context of a request without mutating the original request object? What are some approaches for doing this?
Express provides a namespace for applications to store request/response processing variables: add them as properties of res.locals. This seems a better choice than attaching non-standard properties to the request or response objects themselves.
In similar fashion, global application variables can be stored as properties of app.locals
Unfortunately there doesn't seem to be a locals property defined for router instances. I have placed a reference to a global router instance's options in res.locals as the first middleware step in a route I wrote, but that was my choice.
It can happen that request properties do need to be changed during processing, such as req.path, but this is not something to avoid at all costs. For example, Express provides req.originalUrl by deliberate design, so you can recalculate path components any time you need to.
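As a runnable sketch of the res.locals pattern (plain objects stand in for Express's req/res, which the framework normally supplies, and the user lookup is a stub):

```javascript
// Middleware stashes per-request values on res.locals instead of
// inventing new properties on req.
function loadUser(req, res, next) {
  // in a real app this would query a database; here it's a stub
  res.locals.user = { id: req.userId, name: 'example' };
  next();
}

function handler(req, res) {
  // later handlers read from res.locals rather than a mutated req
  res.body = 'Hello, ' + res.locals.user.name;
}

// Simulate Express wiring two handlers for one request:
var req = { userId: 42 };
var res = { locals: {} }; // Express creates res.locals for you
loadUser(req, res, function () { handler(req, res); });
console.log(res.body); // "Hello, example"
```

The value lives exactly as long as the request does, which is what "request scope" usually means, but it sits in the namespace Express set aside for it.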
You may find Express gets more interesting with use; I've only recently learned of passing an error object argument to the next function. As for the Airbnb guide quote in the post: underwhelming, in a word! The "good" and "bad" code quoted in the post don't do the same thing.
It's not a bad practice; this is the idea behind middleware in Express. In the simplest definition, middlewares are functions that can modify the request and response objects, or even decide whether the flow of the request continues or is terminated. However, you have to be careful not to set a value on a pre-existing property, or you can get some strange behaviors. Also, if the information you are going to store on the request is big, you can consider other strategies, for instance storing it in an in-memory database such as Redis.
After reviewing the express docs I found this bit in the middleware section:
Middleware functions can perform the following tasks:
Execute any code.
Make changes to the request and the response objects.
End the request-response cycle.
Call the next middleware function in the stack.
Middleware Docs
So it's safe to say that if the docs explicitly state we can modify the request and response objects in middleware, it is probably not bad practice.
Just looking for a bit of theory here as going through the Strongloop documentation doesn't seem to help with my problem.
What I'd like to do is set up some middleware that can combine the requested object with other details; effectively what I have is objects that contain a user's ID and I'd like to have the server query the username and a couple of other details and collate the information into one object and respond with the new object.
So far I've literally just got some middleware running upon every API endpoint call and able to tell which route it is, but I don't know where to go from there. The REST response object is seemingly disconnected and I'm not sure how to access it. I'm not even sure it's been created at this point; is that where the routes:after phase might be used?
Many thanks in advance for any help, completely lost right now.
Here's my default use case: I'm thinking about serving some of my REST resources over a socket.io layer instead of over HTTP (since I end up needing to serve a lot of small API requests to render a typical page, and they're all over HTTPS, which has extra handshaking overhead).
I'm still not sure this is a good idea in general (and have been looking at HTTP/2 as well). In the short term, I don't want to migrate off of hapijs or rewrite a ton of modular code, but I do want to try making this work on some test servers to see how well it performs.
So I wrote a super-basic socket.io event handler that just takes requests off of the websocket event emitter and repackages them into hapi via a server.inject call:
module.exports = {
  addSocket: function (sock) {
    sock.on('rest:get:request', function (url) {
      console.log(url);
      // `hapi` here is the hapi server instance
      hapi.inject({ url: url, credentials: { user: sock.user } }, function (res) {
        sock.emit('rest:get:response', url, res.payload);
      });
    });
  }
};
So, all it really does is make sure the authentication object is set up (I have previously authenticated the user on the socket) and then inject a GET request to the server.
Usually, it seems like server.inject is used for testing, but this seems like a perfectly cromulent plan, unless (of course), it's super-slow or a bad idea for reasons I haven't foreseen. Hence: my question.
Server.inject is a great way of encapsulating sub-requests; however, it can become more complicated than necessary. A lighter approach would be to use shared handler functions and run them as pre-handlers.
What's nice about pre-handlers is that you can run them in parallel if needed.
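For illustration, hapi's route-level `pre` option groups parallel steps in a nested array (a sketch assuming the hapi v8-era reply() signature; the handler names and route are invented):

```javascript
// Stub pre-handlers standing in for real data-fetching functions:
function fetchUser(request, reply) { reply({ id: 1 }); }
function fetchStats(request, reply) { reply({ visits: 10 }); }

var routeConfig = {
  method: 'GET',
  path: '/dashboard',
  config: {
    pre: [
      [ // entries inside a nested array run in parallel
        { method: fetchUser, assign: 'user' },
        { method: fetchStats, assign: 'stats' }
      ]
    ],
    handler: function (request, reply) {
      // each result lands on request.pre under its `assign` name
      reply({ user: request.pre.user, stats: request.pre.stats });
    }
  }
};
```

Top-level `pre` entries run in series, so a step that depends on both results can simply be listed after the nested array.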
One use case where I have used server.inject (other than in tests) is for a health-check route. I'd have routes /health-check/db and /health-check/queue, and then a /health-check route which encapsulated both of them. But again, this could have been done with shared route handlers.
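As a hedged sketch of that aggregation (a tiny stub stands in for the real hapi server so the shape is runnable; with real hapi you would call server.inject the same way, and the sub-route URLs are illustrative):

```javascript
// Stub: pretend both sub-routes respond 200. Real hapi's server.inject
// would actually dispatch through the routing table.
var server = {
  inject: function (options, callback) {
    callback({ statusCode: 200, payload: 'ok' });
  }
};

function healthCheckHandler(request, reply) {
  var checks = ['/health-check/db', '/health-check/queue'];
  var results = {};
  var pending = checks.length;
  checks.forEach(function (url) {
    // re-enter the server's own routing, validation included
    server.inject({ method: 'GET', url: url }, function (res) {
      results[url] = res.statusCode === 200 ? 'ok' : 'failing';
      if (--pending === 0) reply(results);
    });
  });
}

healthCheckHandler({}, function (results) {
  console.log(results); // both sub-checks report "ok" with the stub
});
```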
I had a lengthy discussion on the gitter channel the other day about this and my understanding is that it is neither good nor bad.
I guess a really good use case would be if you have multiple teams building plugins which expose routes and you want to use one of those routes in your plugin. You don't care about the implementation; you can just call the route.
Another reason for using server.inject is that it includes the validation steps defined on the route, so you only need to have your resource defined in one place.
I am starting to learn Node.js and trying to understand its architecture combined with the micro-framework Express.
I see that Express uses Connect as its middleware layer. Connect augments the request and response objects with all kinds of things in a chain of functions, and it provides an API so you can add custom middleware. I guess this augmenting is a way to keep things simple and flexible in the handlers/controllers, instead of having a variable number of parameters and parameter types. Here is an example of a simple GET handler:
app.get('/', function (req, res) {
  res.render('index', { title: 'Hey', message: 'Hello there!' });
});
In tutorials from Node.js experts I have seen things like augmenting the request object with a MongoDB collection. In a blog post by Azat Mardan I saw this code:
var db = mongoskin.db('mongodb://@localhost:27017/test', { safe: true })

app.param('collectionName', function (req, res, next, collectionName) {
  req.collection = db.collection(collectionName)
  return next()
})
The approach above uses the 'collectionName' parameter in the route as a condition to control the augmentation of the request. However, I have seen uglier code where the database middleware is attached to EVERY request that goes through the app, without this conditional approach.
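For context, a route handler then consumes req.collection like this (a sketch with a stub collection standing in for mongoskin; find().toArray() matches the real MongoDB driver API, and the route/handler names are illustrative):

```javascript
// Stub collection standing in for db.collection(collectionName):
var stubCollection = {
  find: function () {
    return { toArray: function (cb) { cb(null, [{ name: 'doc1' }]); } };
  }
};

function listHandler(req, res, next) {
  // req.collection was attached earlier by app.param('collectionName', ...)
  req.collection.find().toArray(function (e, results) {
    if (e) return next(e);
    res.send(results);
  });
}

// Simulate one request with the augmented req and a minimal res:
var sent;
listHandler(
  { collection: stubCollection },
  { send: function (data) { sent = data; } },
  function () {}
);
console.log(sent.length); // 1
```

The handler never mentions mongoskin or the connection; everything it needs arrived on the request object, which is exactly the pattern the question is asking about.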
Looking at standard software principles like the single responsibility principle, separation of concerns, and testability: why is it a good idea to extend the request with a MongoDB collection object and dozens of other objects? Isn't the request or response object bloated with functionality this way, with unpredictable state and behavior? Where does this pattern come from, and what are the pros, cons, and alternatives?
This is fine. IMHO the very purpose of the request object is as a container to pass things down the stack for other handlers to use. It is far cleaner than looking for some agreed-upon-named global holder.
You could argue that it should be mostly empty, and then have the "official" request and response functionality on some property of the request/response objects, so it is cleaner, but I think the benefits are minimal.
As a matter of fact, just about every middleware I have seen, including the Express source code and ones I have authored, uses the request for exactly this sort of "container to pass properties and functionality down the handler stack".
I know these types of question come up fairly often, but I need help with a wait-like mechanism in JavaScript. I know setTimeout-based solutions are going to come up, but I'm not sure how to pull it off in my case.
I'm writing an API that uses a WebSocket internally. There's a connect() method that sets up the WebSocket, and I need to make it not return until after the WebSocket is set up. I'd like it to return a value for whether or not the connection was successful, but that's not the main problem.
The issue I'm hitting is that after a user calls connect(), they may call another method that relies on the WebSocket to be properly set up. If it's called too early, an error is thrown stating that the object is not usable.
My current solution is to set a "connected" flag once I've determined the connection succeeded, and to check for it in each method. If it's not connected, I add the method call to a queue that is run through by the same code that sets the flag. This works, but it introduces that style of code all over my methods, and it also seems misleading from the user's perspective, since the calls to those functions are deferred. Also, if other user code relies on those calls having completed before it runs, it won't behave as expected.
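Simplified, the queue mechanism I'm describing looks something like this (the Client shape and method names are illustrative, and a flag stands in for the real WebSocket send):

```javascript
function Client() {
  this.connected = false;
  this.queue = [];
}

Client.prototype.getServerStatus = function () {
  if (!this.connected) {
    // too early: defer the call until the socket is ready
    this.queue.push(this.getServerStatus.bind(this));
    return;
  }
  this.statusRequested = true; // stand-in for the real WebSocket send
};

Client.prototype._onOpen = function () {
  this.connected = true;
  // drain everything that was called before the socket opened
  this.queue.splice(0).forEach(function (fn) { fn(); });
};

var c = new Client();
c.getServerStatus(); // queued, not executed
c._onOpen();         // connection established: queue drained
console.log(c.statusRequested); // true
```

Every public method needs the same connected-check-or-queue preamble, which is the boilerplate I'd like to avoid.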
I've been racking my brain over how to handle this case. The easiest solution would be to block returning from connect() until after the WebSocket is set up, but that's not really the JavaScript way. The other option is to make the user provide the rest of their code in a callback, but that seems like a weird thing to do in this case. Maybe I'm over-thinking it?
Edit: To better illustrate my problem, here's an example of what the user could do:
var client = new Client(options);
client.connect();
client.getServerStatus();
The getServerStatus() method would be using the WebSocket internally. If the WebSocket is not set up yet, the user will get that not usable error.
Today's JavaScript does not really work like that, unfortunately. In the future (ES6) there may be new language features that address this issue more directly. For now, though, you are stuck with the currently accepted method of handling asynchronous events: callbacks. You may also want to explore promises to handle "callback hell", though you will need a library for this.
And yes, it does seem strange to have callbacks everywhere, especially for someone new to web programming, but it is really the only way to go about it at this stage (assuming you want a cross-browser-friendly solution).
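As a hedged sketch of what that callback-shaped API could look like (Client is an illustrative stand-in, and a zero-delay timer simulates the WebSocket "open" event):

```javascript
function Client() {}

Client.prototype.connect = function (callback) {
  var self = this;
  // simulate the asynchronous WebSocket "open" event
  setTimeout(function () {
    self.connected = true;
    callback(null); // null error => success
  }, 0);
};

Client.prototype.getServerStatus = function () {
  if (!this.connected) throw new Error('not usable');
  return 'ok';
};

var client = new Client();
client.connect(function (err) {
  if (err) return console.error(err);
  // safe: the socket is guaranteed open inside the callback
  console.log(client.getServerStatus());
});
```

Compared with the queue approach, this makes the ordering explicit in the user's own code: anything that needs the socket simply lives inside (or is started from) the connect callback.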
"Wait" is almost the keyword you are looking for. Actually, it's yield that does this. See e.g. MDN's documentation.
There's a connect() method that sets up the WebSocket, and I need to make it not return until after the WebSocket is set up
That isn't going to happen unless you rewrite the JavaScript execution engine.
Either the code trying to send data will need to check the socket state (I'd encapsulate the socket in an object, supply a method that sets a member variable on the open/close events, and poll that member variable from the external code), or you could add messages and callbacks to a queue and process the queue when the socket connects.