Is there an optimized way of communicating between two Node.js routes?
I mainly use Express and I don't know of an Express module that provides such functionality. I am unsure whether I am missing the right framework or the right npm package.
I know that this functionality can be implemented via simple Ajax, but I am looking for a cleaner solution, such as WebSockets. I use socket.io, but it seems focused on communication between the client and the server, not between two servers.
Update - what I want to achieve
Instead of placing all the requests I need in one file, like this:
router.post('/entode', function(req, res) {
    var word_rf = req.body.word1;
    if( /^[a-zA-Z]+$/g.test( word_rf ) && word_rf.length<18 ) {
        entode.entode_function_rf(word_rf, function (res1) {
            io.sockets.emit('series_results', res1);
            res.send('done');
        });
    } else {
        io.sockets.emit('error_rf', 'incorrect input');
        res.send('done');
    }
});
//-------------------------
router.post('/yandex_short', function(req, res) {
    var word_rf = req.body.word1;
    ...
});
//-------------------------
router.post('/yandex_long', function(req, res) {
    var word_rf = req.body.word1;
    ...
});
I would prefer to have something like:
router.post('/', function(req, res) {
    var word_rf = req.body.word1;
    var aPromise = new Promise(function(resolve, reject) {
        makeRequestFn('./some_route/'+word_rf, function(returnedData) {
            resolve(returnedData);
        });
    });
    aPromise.then(function(data) {
        io.sockets.emit('series_results', data);
        res.send('done');
    });
    //The other routes stay inside this scope
});
This way I wouldn't need to require the modules, since the logic would be delegated directly to the other files.
Sorry I'm late to the party, but I think what you want is inter-process communication. Also, instead of reasoning in terms of routes, think in terms of microservices. What you end up with are decoupled mini-apps that can share data among themselves via WebSockets, message queues, HTTP, etc.
Read this excellent article from the NGINX blog. It is part of a series that will guide you through the process of building your own microservices.
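To make the idea concrete, here is a minimal sketch (service names, ports, and paths are all hypothetical) of splitting one of the routes from the question into its own mini-service that the main app consumes over HTTP:
words-service.js
var express = require('express');
var app = express();

// One piece of functionality, exposed over HTTP instead of living in the main router
app.get('/entode/:word', function (req, res) {
    var word = req.params.word;
    if (/^[a-zA-Z]+$/.test(word) && word.length < 18) {
        res.json({ result: 'processed ' + word }); // stand-in for entode_function_rf
    } else {
        res.status(400).json({ error: 'incorrect input' });
    }
});

app.listen(4000);
main-app.js
var express = require('express');
var bodyParser = require('body-parser');
var request = require('request'); // any HTTP client works here

var app = express();
app.use(bodyParser.json());

app.post('/', function (req, res) {
    // The main app delegates to the mini-service over HTTP
    request('http://localhost:4000/entode/' + req.body.word1, function (err, response, body) {
        if (err) return res.status(502).send('service unavailable');
        res.send(body);
    });
});

app.listen(3000);
The same idea scales to message queues or WebSockets if you need push-style communication between the services.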
Cheers,
Background
I'm using express-http-proxy to proxy a handful of requests between my SPA (single-page application) and a CouchDB instance. I'm doing this proxying on a per-call basis, NOT creating a proxy server (this will be important in a moment).
Example of current use
app.use(`some/url`, proxy(dburl, {
    forwardPath: req => { return 'some/url' + require('url').parse(req.url).path; }
}));
Which means I am NOT using httpProxy.createServer. I want to send some snippet of text data along with my responses as a header. After looking through the documentation I've come to the conclusion that what I want requires using intercept. Unfortunately I've not quite managed to grasp how to use it, and the only related questions I've found so far appear to be based on httpProxy.createServer, which appears (from my limited understanding) to work differently.
We are proxying individual requests because we wish to proxy different requests to different micro-services, and found this to be the most concise way (that we knew of at the time) of doing that.
The Question
Given the code
const text = 'asdf';
app.use(`some/url`, proxy(dburl, {
    forwardPath: req => { return 'some/url' + require('url').parse(req.url).path; },
    intercept: function(rsp, data, req, res, callback) {
        //SUSPECT LOCATION
    }
}));
Is there some code at SUSPECT LOCATION which would allow me to place text on the header for the final response, without further effects on the (currently otherwise working) proxy?
Additional Notes
Headers and network requests in general are not very familiar to me; my apologies if the answer seems self-evident.
Bonus points for a link to a resource that helps explain either the finer points of using this library for proxying, a similar library for proxying, or the underlying technologies that would make it clear how to use this library. In other words, I'd rather spend some of my own time looking further into this and not come back with further questions.
I am not entirely confident that SUSPECT LOCATION is the right place for my code, and I will happily listen if it needs to go somewhere else, or if we need to approach this problem in a different way.
The accepted answer is now outdated.
intercept does not exist anymore.
Instead, use your own middleware before the proxy function:
router.route('/my-route').get((req, res, next) => {
    res.set('My-Header', 'my-header-value');
    next();
}, proxyFunction);
This follows the standard Express.js methods on the req and res objects.
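For completeness, a minimal sketch of the full wiring (the mount path, header name, and target host are placeholders):
var express = require('express');
var proxy = require('express-http-proxy');

var app = express();

// Plain Express middleware runs first and decorates the response,
// then express-http-proxy forwards the request to the target.
app.use('/some/url', function (req, res, next) {
    res.set('My-Header', 'my-header-value');
    next();
}, proxy('http://localhost:5984'));

app.listen(3000);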
Within the intercept function body, set the response headers using the standard Express format:
res.set('hola', 'amigos!!!');
Refer to the link below:
http://expressjs.com/en/4x/api.html#res.set
The best way to understand a library when there is no documentation is to follow its test suite. If there is no test suite, don't use that library.
This is the test suite for the express-http-proxy intercept function
https://github.com/villadora/express-http-proxy/blob/master/test/intercept.js
This is the test case
it('can modify the response headers', function(done) {
    var app = express();
    app.use(proxy('httpbin.org', {
        intercept: function(rsp, data, req, res, cb) {
            res.set('x-wombat-alliance', 'mammels');
            res.set('content-type', 'wiki/wiki');
            cb(null, data);
        }
    }));
    request(app)
        .get('/ip')
        .end(function(err, res) {
            if (err) { return done(err); }
            assert(res.headers['content-type'] === 'wiki/wiki');
            assert(res.headers['x-wombat-alliance'] === 'mammels');
            done();
        });
});
If you want to understand the ins and outs of proxying, the best resource is HAProxy:
http://cbonte.github.io/haproxy-dconv/1.7/intro.html
But before that, you need to understand HTTP more (a constructive comment).
I'm running a small Angular application with a Node/Express backend.
In one of my Angular factories (i.e. on the client side) I make a $http request to GitHub to return user info. However, a GitHub-generated key (which is meant to be kept secret) is required to do this.
I know I can't use process.env.XYZ on the client side. I'm wondering how I can keep this API key a secret? Do I have to make the request on the back end instead? If so, how do I transfer the returned GitHub data to the front end?
Sorry if this seems simplistic, but I am a relative novice, so any clear responses with code examples would be much appreciated. Thank you!
Unfortunately you have to proxy the request through your backend to keep the key secret. (I am assuming that you need some user data that is unavailable via an unauthenticated request like https://api.github.com/users/rsp?callback=foo, because otherwise you wouldn't need to use API keys in the first place - but you didn't say specifically what you need to do, so this is just my guess.)
What you can do is something like this: in your backend you can add a new route for your frontend just for getting the info. It can do whatever you need - use secret API keys (or not), verify the request, process the response before returning it to your client, etc.
Example:
var app = require('express')();

app.get('/github-user/:user', function (req, res) {
    getUser(req.params.user, function (err, data) {
        if (err) res.json({error: "Some error"});
        else res.json(data);
    });
});

function getUser(user, callback) {
    // a stub function that should do something more
    if (!user) callback("Error");
    else callback(null, {user: user, name: "The user " + user});
}

app.listen(3000, function () {
    console.log('Listening on port 3000');
});
In this example you can get the user info at:
http://localhost:3000/github-user/abc
The function getUser should make an actual request to GitHub, and before you call it you can check whether it is really your frontend that is making the request, e.g. by checking the "Referer" header or other things, validating the input, etc.
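As an illustration, getUser could be fleshed out with Node's built-in https module, keeping the secret in an environment variable on the server (GITHUB_TOKEN is an assumed name):
var https = require('https');

function getUser(user, callback) {
    if (!user) return callback("Error");
    https.get({
        hostname: 'api.github.com',
        path: '/users/' + encodeURIComponent(user),
        headers: {
            'User-Agent': 'my-app', // GitHub's API requires a User-Agent header
            'Authorization': 'token ' + process.env.GITHUB_TOKEN // never sent to the browser
        }
    }, function (res) {
        var body = '';
        res.on('data', function (chunk) { body += chunk; });
        res.on('end', function () {
            try { callback(null, JSON.parse(body)); }
            catch (e) { callback(e); }
        });
    }).on('error', callback);
}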
Now, if you only need public info, then you may be able to use a public JSON-P API like this - an example using jQuery to keep things simple:
var user = prompt("User name:");
var req = $.getJSON('https://api.github.com/users/' + user);
req.then(function (data) {
    console.log(data);
});
See DEMO
I've split some parts of a Socket.IO application I'm working on into various middleware that perform operations necessary to the application, like managing sessions and enforcing access restrictions.
I am currently trying to write another set of middleware functions to maintain a list of currently connected sockets, intended to be used to kick specific clients from the server. I find myself needing the global io object within the context of a socket.on event callback, to broadcast messages to specific rooms via io.sockets.in('room').emit(...).
As far as I know, if you split your socket events out of your main program the way I have, there is no tidy way to access the global io object in an external file. Merely require-ing the Socket.IO module within the external file is not enough.
I have come up with this workaround: inject the global io object and return a function matching the signature Socket.IO expects from middleware. It appears to work. Still, I worry.
Is there a more elegant solution for my problem? A cleaner alternative, maybe? Have I missed something in the documentation?
And are there any hidden gotchas to the "solution" I've found?
index.js
// Other required stuff: Express.js, Mongoose, whatever.
var io = require('socket.io').listen(server);
var Middleware = require('./middleware.js')(io);
io.use(Middleware);
server.listen(port);
middleware.js
module.exports = function(io) {
    return function(socket, next) {
        socket.on('kickUser', function(data) {
            // Do something with the global io object here.
        });
        return next();
    };
};
Any advice would be welcome!
The io object is an instance of a socket.io Server.
This instance is also available via socket.server from within an event handler.
The following two examples are equivalent:
module.exports = function(io) {
    return function(socket, next) {
        socket.on('kickUser', function(data) {
            io.emit('userKicked', socket.id);
        });
        return next();
    };
};
module.exports = function() {
    return function(socket, next) {
        socket.on('kickUser', function(data) {
            socket.server.emit('userKicked', socket.id);
        });
        return next();
    };
};
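And since socket.server is the same Server instance, the room-targeted broadcast from the question works through it as well (the room name here is a placeholder):
module.exports = function() {
    return function(socket, next) {
        socket.on('kickUser', function(data) {
            // Broadcast to a specific room via the server instance attached to the socket
            socket.server.sockets.in('room').emit('userKicked', socket.id);
        });
        return next();
    };
};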
I want to add new methods to the request and response objects in Node.js.
How can I do this efficiently?
I can't understand how this is done in Express.js.
Being JavaScript, there are numerous ways to do this. The pattern that seems most reasonable to me for express is to add the function to each request instance in an early middleware:
//just an example
function getBrowser() {
    return this.get('User-Agent');
}

app.use(function (req, res, next) {
    req.getBrowser = getBrowser;
    next();
});

app.get('/', function (req, res) {
    //you can call req.getBrowser() here
});
In Express.js, this is done by adding additional functions to the prototype of http.IncomingMessage:
https://github.com/visionmedia/express/blob/5638a4fc624510ad0be27ca2c2a02fcf89c1d334/lib/request.js#L18
This is sometimes called "monkey patching" or "freedom patching". Opinions vary on whether this is fantastic or terrible. My approach above is more prudent and less likely to cause unintended interference with other code running inside your node.js process. To add your own:
var http = require('http');
http.IncomingMessage.prototype.getBrowser = getBrowser; //your custom method
Add methods to the express.response object:
const express = require('express');
express.response.getName = () => { return 'Alice' };
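A route can then call the new method on any response object; a quick sketch of the idea, using the getName patch above:
const app = express();

app.get('/name', (req, res) => {
    res.send(res.getName()); // responds with 'Alice'
});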
I'm building an NPM module that needs to make an HTTP request to itself (the running web server). For example:
var url = "http://127.0.0.1:" + (process.env.PORT || 3000) + path;
request(url, function(error, response, body){
    ...
});
Is there a way to process a request through the NodeJS pipeline without actually doing an HTTP request?
Or is there a better way to form the URL? I'm nervous that 127.0.0.1 isn't the most robust way to handle this for production sites.
Self-Consuming JSON API
In a self-consuming JSON API, you define some functionality in standalone controller functions and then wire that functionality up to Express after the fact. Let's use a library application as an example:
books.js
// Book is assumed to be a data model (e.g. an ORM model) available in this scope
module.exports = {
    browse: function () {
        return Book.findAll()
    },
    read: function (options) {
        return Book.findById(options.book)
    },
    processLateFees: function () {
        // Do a bunch of things to process late fees
    }
}
to-http.js
In this file we build a function that converts a controller function into an HTTP route handler. We take the route params and pass them to our controller as options:
module.exports = function toHTTP (func) {
    return function (req, res) {
        func(req.params).then(function (data) {
            res.send(data)
        })
    }
}
router.js
And then we connect our controller up to our HTTP router:
var express = require('express')
var books = require('./books')
var toHTTP = require('./to-http')
var app = express()
app.get('/books', toHTTP(books.browse))
app.get('/books/:book', toHTTP(books.read))
app.get('/batch-jobs/process-late-fees', toHTTP(books.processLateFees))
So we now have an express application connected up to controller functionality. And the wonderful thing is that we can call these controller functions manually too.
var books = require('./books')
books.processLateFees().then(function () {
    // late fees have been processed
})
If you need a more in depth example of this, the Ghost blog codebase is built around this pattern. It is a very informative read.
If you just have one Node.js application, you can put that method inside your model or controller and call it inside the app; that needs fewer resources than creating a new request.
If you have more than one Node.js app (or other services), it is normal to make requests to the other web services with a specific URL and port.
I do this in one of my projects and it works fine.
I have used it in dev and prod without issues so far, with several Node.js applications and 3 different web services that call each other to log in or check authentication. I use both Express.js and Sails.js (which is based on Express.js).
I think using the request module is acceptable; it is quite fast, and I use this approach when I do unit tests.
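For example, a test can hit the running server over the loopback interface (a rough mocha-style sketch; it assumes the server is already listening on port 3000 and that a /books route exists):
var request = require('request');

describe('GET /books', function () {
    it('returns a successful response', function (done) {
        request('http://127.0.0.1:3000/books', function (err, response, body) {
            if (err) return done(err);
            if (response.statusCode !== 200) return done(new Error('unexpected status ' + response.statusCode));
            done();
        });
    });
});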