Double parameters with require: var io = require('socket.io')(http); - javascript

I'm new to Node and JS and was working through the socket.io chat example (http://socket.io/get-started/chat/). I came across this code in the server:
var app = require('express')();
var http = require('http').Server(app);
var io = require('socket.io')(http);
I've looked at other tutorials and have never seen double parentheses after require before. What does the (http) part do? Is it a parameter for require, does it change the type, or something else?
Thanks!

In JavaScript, functions are first-class citizens. This means that a function can be returned by another function.
Consider the following simple example to understand this:
var sum = function(a) {
  return function(b) {
    return a + b;
  };
};
sum(3)(2); // 5
// ...or...
var func = sum(3);
func(2); // 5
In your example, require('socket.io') returns another function, which is immediately called with the http object as a parameter.
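To make that concrete, here is a minimal sketch of a module that exports a callable, the same shape socket.io uses (the file name and returned object are made up for illustration):
// my-io.js (hypothetical module, just to show the shape)
module.exports = function(httpServer) {
  // attach to the given HTTP server and return an "io"-like object
  return { attachedTo: httpServer };
};

// consumer code, mirroring the pattern from the question
var httpServer = require('http').createServer();
var myIo = require('./my-io')(httpServer);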

To expand: say you had a library http and it has an exported member Server.
Let's say we picked apart the line:
var http = require('http').Server(app);
into two lines:
var http = require('http')
Imports the "http" module library as a JSON object into the http variable. This module library has a bunch of modules that you can access now by calling them via the http var.
var httpServer = http.Server(app)
This calls Server with the express app you created above (kind of like a constructor) and puts the resulting instance into the httpServer variable.
The difference above is that instead of the two steps, the tutorial condenses them into one, so that the http variable ends up holding the Server instance rather than the entire http library. This can be useful if you only want to use that specific part of the http library.
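Put side by side, the two forms look roughly like this (the variable names in the two-step form are just for illustration; both should yield an http.Server instance):
var app = require('express')();

// Two-step form: keep the whole module, then create the server instance
var httpModule = require('http');
var serverA = httpModule.Server(app);

// Condensed form from the tutorial: only the instance is kept
var serverB = require('http').Server(app);

console.log(serverA instanceof httpModule.Server); // should print true
console.log(serverB instanceof httpModule.Server); // should print true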

Node.js allows you to assign an object/function to a module's exports using the statement module.exports = something. So each one of those statements is importing a library and then running the function that was assigned to module.exports.
For example, here is the source code for express where they export the createApplication function.
And here's an article where they go into a bit more detail.
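As a rough sketch of that pattern (simplified, not the actual express source):
// my-express.js (hypothetical, mimicking how express exports createApplication)
function createApplication() {
  var app = function(req, res) { /* handle a request */ };
  // ...attach methods like app.get, app.listen, etc....
  return app;
}
module.exports = createApplication;

// which is why calling the required value works:
var app = require('./my-express')();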

Related

Conditionally filtering dependency-injected async data?

I have a NodeJS application which I've built around dependency injection. The app can run any combination of its functions (modules) at the same time, and any modules that request data from the same async resource will instead share a single request. So if module A and module B both need data from https://example.com/path/to/resource and I run both, then that request gets made once and the result is handed off to both modules. Great.
But what if module C only wants part of the data from the same resource? If I run modules A-C, then it can just await the data and filter the result, but that's wasteful when I only want to run module C. Conversely, if I just have module C request path + ?filter=foo, that's efficient when only module C is run, but wasteful if A-C all run, since two requests would be made for a superset of the same data.
I sort of see a way to create an optimizing technique around this, but I'm somewhat new to dependency injection and I'm afraid I'll end up creating an anti-pattern or something convoluted and hard to maintain. What's the best approach to handling this?
EDIT
To clarify, in an ideal solution, that example would only have three possible flows and request one of two possible URLs depending on the set of modules being run:
Set = {A}, {B}, or {A, B}. We request /path/to/resource and pass the result as is to the modules.
Set = {C}. We request /path/to/resource?filter=foo and pass the result as is to the module.
Set = {C, A}, {C, B}, or {C, A, B}. We request /path/to/resource. To modules A/B, we pass the result as is. But to module C, we first process the result for foo before passing it along.
This way, C never needs to be aware of what the requested URL was, no data is ever wasted, and no unnecessary CPU cycles need to be burned.
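For illustration, here is a rough sketch of the kind of selection logic I have in mind (the names, URLs, and the filterForFoo helper are placeholders, not real code from my app):
// Decide which URL to fetch and how to post-process, given the set of modules to run.
function planRequest(moduleSet) {
  var needsFull = moduleSet.includes('A') || moduleSet.includes('B');
  var url = needsFull ? '/path/to/resource' : '/path/to/resource?filter=foo';
  return {
    url: url,
    // module C gets the raw result only when the filtered URL was used;
    // otherwise the "foo" subset is extracted from the full result first
    resultForC: function(result) {
      return needsFull ? filterForFoo(result) : result;
    }
  };
}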
I would use two Promise.all() calls:
1. one for all the requests that don't need their results
2. one for all the requests that do need their results
Assume modules is an array of the modules that you want to execute, each described by an object of the shape:
{
  requiredResult: boolean;
  url: string;
}
// RunModule(url) is assumed to start the module's request and return a promise
function runCombination(modules = []) {
  let needResultPromises = [];
  let backgroundPromises = [];
  modules.forEach((eachModule) => {
    if (eachModule.requiredResult) {
      needResultPromises.push(RunModule(eachModule.url));
    } else {
      backgroundPromises.push(RunModule(eachModule.url));
    }
  });
  Promise.all(backgroundPromises); // fire and forget
  return Promise.all(needResultPromises);
}
So from the function above, you can see that one Promise.all runs in the background, while the one that is returned waits for the results.
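For example, it could be called like this (the module descriptors are just for illustration):
runCombination([
  { requiredResult: true, url: 'https://example.com/path/to/resource' },
  { requiredResult: false, url: 'https://example.com/path/to/other' }
]).then((results) => {
  // results only contains the resolved values of the "required" modules
  console.log(results);
});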
Hope this helps you.

What exactly is happening when a class is required and a method within another method is invoked?

When a method such as createServer() in a class is invoked as follows:
const http = require("http");
const server = http.createServer();
server.listen(3000);
what exactly is happening? If I'm understanding this correctly, the http class is a constructor and the require function instantiates const http. createServer() is a method of the constructor http, which is, in turn, invoked by http.createServer(). The Node.js documentation says that createServer() "returns a new instance of http.Server."
Does this mean createServer() works as a constructor as well and creates another instance within the already created instance? Oddly enough, server.__proto__ points to the Function setTimeOut, not createServer nor http.
require() is a function, and is invoked with the String "http" here.
Node.js then looks for the http module, interprets it, and require() returns an Object with the exports of the given module. So a module is require()d, which can export multiple things, not just a class.
An Object in JavaScript is what in other languages is called a 'dictionary' or a 'map', associating values to keys of type String. (Map was later added to JavaScript and can have keys of any type.)
createServer() is just a function in this Object, under the key "createServer". When invoked, it returns another Object, an instance of class http.Server. It could be called a factory function. See its implementation here.
listen() in turn is similarly a function in that Object, in other words, a method of class http.Server.
Neither of these are constructors, which are invoked with the new keyword as in: new http.Server().
server.constructor === http.Server
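To see the difference in practice, here is a quick check you can run yourself (only the standard Node http module is used):
const http = require('http');

// createServer() is a factory function; new http.Server() is the constructor call.
const viaFactory = http.createServer();
const viaNew = new http.Server();

console.log(viaFactory instanceof http.Server); // true
console.log(viaNew instanceof http.Server);     // true
console.log(viaFactory.constructor === http.Server); // true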
What exactly is happening when a class is required?
First, http is just a module, not a class. So if you are asking what happens when you invoke require(), it returns a reference to the exported module, in this case http.
What is happening when a method within another method is invoked?
It depends on the behavior of the method. In your case, http.createServer() returns an instance of http.Server. That doesn't mean that http.createServer() is the constructor of http.Server; it just returns an instance of it.
const http = require('http');
console.log(http.createServer() instanceof http.Server)
// true

Javascript - Node.js: When do I need to 'require' a module

I'm fairly new to Node.js and server side programming in general, and still a bit confused about basic Node rules. To start an http server, I need to require the http module. The server itself returns a request and a response object, which (as I understand, correct me if I'm wrong) are both eventEmitters and stream objects. I can still use methods like req.on() and res.write() without requiring the stream and eventEmitter modules. However, when I try to use the pipe function req.pipe(res), an error occurs saying that the pipe function is not defined. I assume this happens because I didn't include the stream module. How come I can use certain stream functions without requiring modules, but not others?
It seems like you're pretty new to Javascript as well, but I'll try to do my best to explain this.
require basically imports an object so you can use the functionality it provides. But if you already have an object, then you don't need to import (require) because it already exists.
For instance, if you want to create a stream, then you'll need to require it since creating a stream is a functionality you need to import. However, your function can still take a stream as a parameter and use it since it already exists.
const Stream = require('stream');
// need to require stream since we'd like to use it to create an object
const mystream = new Stream();
--
// no need to require stream here since you're given the object
function doSomethingWithStream(streamObject) {
  streamObject.on('data', () => { /* do something */ });
}
So in your case you don't need to require stream since you already have the object. Your res object should have a pipe method (see the example below), but piping from it won't work, because piping only goes from a Readable stream to a Writable stream, and res is a Writable stream (see the Node stream docs).
const express = require('express');
const app = express();
app.get('/has-pipe', (req, res) => {
  const success = !!res.pipe; // true if res.pipe exists
  req.pipe(res); // Readable stream to Writable stream
  res.send({ success }); // returns true
});
const port = 3000;
app.listen(port, () => console.log(`Example server running on port ${port}`));
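For comparison, a direction that always makes sense is Readable to Writable, for example streaming a file into the response (the file path is just a placeholder):
const fs = require('fs');
const http = require('http');

http.createServer((req, res) => {
  // fs.createReadStream gives a Readable stream; res is the Writable target
  fs.createReadStream('./some-file.txt').pipe(res);
}).listen(3001);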

Asynchronous initialization of express.js (or similar) apps

Consider an example: I have the following express.js app (see code snippet below). I want to have one persistent connection to the DB, and one persistent connection to my own service (which requires an async call to start) during the entire app lifetime. And there are a few entry points, i.e. one can access my app not only via the HTTP protocol. Of course, I want to avoid duplicating the service initialization code, and there could be several such async-initializing services.
/* app.js */
var app = require('express')();
// set views, use routes, etc.
var db = require('monk/mongoose/etc')(...); // happily, usually it's a sync operation
var myService = require('./myService'); // however, it's possible to have several such services
myService.init(function(err, result) {
  // only here is the initialization process finished!
});
module.exports.app = app;

/* http_server.js (www entry point) */
var app = require('./app').app;
// create an HTTP server with this app and start listening

/* telnet_server.js (other entry point) */
var app = require('./app').app;
// create a Telnet server with this app and start listening
In the code snippet above, by the time the HTTP (or Telnet, or any other) server starts, there is no guarantee that myService has already been initialized.
So, I have to somehow reorganize my app creation code. For now I'm sticking with the following solution:
/* app.js */
var app = require('express')();
module.exports.app = app;
module.exports.init = function(callback) {
  var myService = require('./myService');
  myService.init(callback);
};

/* entry_point.js */
var app = require('./app');
app.init(function(err) {
  if (!err) {
    // create an HTTP/Telnet/etc server and start listening
  }
});
So, my question is: what is the common way to initialize services that require an asynchronous call to start?
I would recommend you to promisify the initialization function of your service(s) and then use them in the following manner:
const app = require('express')();
const util = require('util');
const myService = require('./myService');
const myServiceInit = util.promisify(myService.init);

Promise.all([myServiceInit()]).then(() => {
  // delayed listening of your app
  app.listen(2000);
}).catch(err => {
  // handle error here
});
I've used Promise.all so that you can add the initialization of multiple internal services.
The prerequisite for promisifying your init function is that it must use an error-first callback mechanism. You can read more about it in the Node.js official docs.
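For instance, a minimal myService.init compatible with util.promisify could look like this (the body is just a stand-in for your real startup work):
// myService.js (illustrative)
module.exports.init = function init(callback) {
  // do the real async startup work here, then call back with (err, result)
  setTimeout(() => callback(null, 'service ready'), 100);
};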
Hope this helps your cause.
I've created a gist here with a sample of the code I normally use for this task. (It uses the Q promise library, but could easily be modified to use any other promises lib).
The basic idea is to describe the app backbone as a sequence of asynchronous initialization steps. Each step calls one or more async functions and binds the result to a name; the startup process only progresses to the next initialization step once all values are resolved for the current step, and subsequent steps can then access all values resolved by previous steps. This allows the dependency order of services and components within the app to be easily described.
For example, a backbone can be defined as follows:
var app = [
  { s1: startService1 },
  { s2: startService2, s3: startService3 },
  { s4: startService4 }
];
(Note that in each step definition, just references to each function are given; the start() function - shown in the gist - will invoke each function in the correct order).
Each of the startXxx vars is a function which takes a single argument, and returns a deferred promise, e.g.:
function startService4(app) {
  var s1 = app.s1;
  var s2 = app.s2;
  var deferred = Q.defer();
  // ... start the service, do async stuff ...
  return deferred.promise;
}
The function's app argument represents the configured app backbone, and results from previous initialization steps are available as its named properties.
I've used this pattern in fairly complicated systems, and find it a simple, flexible and effective way to define a system's high level structure.
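The gist itself isn't reproduced here, but the core of the staged-startup idea can be sketched with native Promises instead of Q (the names are illustrative):
// Walk the backbone one step at a time; within a step, starters run concurrently.
async function start(steps) {
  const app = {};
  for (const step of steps) {
    const names = Object.keys(step);
    // each starter receives the app built so far and returns a promise
    const results = await Promise.all(names.map((name) => step[name](app)));
    names.forEach((name, i) => { app[name] = results[i]; });
  }
  return app;
}

// usage with async starter functions
const backbone = [
  { s1: async () => 'service 1 handle' },
  { s2: async (app) => 'service 2 handle (depends on ' + app.s1 + ')' }
];
start(backbone).then((app) => console.log(app.s2));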

Node.js database initialization from multiple modules

I have 3 modules in a project: A,B,C; all of them are using Rethinkdb, which requires an async r.connect call upon initialization.
I'm trying to make a call from module A to B from the command line; however, despite starting r.connect on require(), B couldn't serve it, because RethinkDB hasn't loaded by the time module A makes the call.
In what ways might this code be refactored, such that I can actually make sure all initializations are complete before calling B?
I've tried to use closures to pass state around modules; however, since r.connect is only available as an async function, this would take the form of:
r.connect(config.rethinkdb, function(err, connection) {
  rconn = connection;
  // all module requires
  require("./moduleB")(rconn);
  require("./moduleC")(rconn);
  ...lotsacode...
});
Which feels very wrong. Any better suggestions?
You can use a promise and pass the connection around. Something like this:
r.connect(config.rethinkdb)
  .then(function(connection) {
    // do some stuff here if you want
    initWholeApp(connection)
  })
and inside initWholeApp(connection) you can put your application code.
You can even simplify it to:
r.connect(config.rethinkdb)
  .then(initWholeApp)
where initWholeApp is a function that accepts the established connection as an argument.
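As a sketch (the module paths come from the question's own code), initWholeApp could look like:
function initWholeApp(connection) {
  // hand the shared connection to the modules that need it
  require("./moduleB")(connection);
  require("./moduleC")(connection);
  // ...rest of the application setup...
}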
More than that, you can even run each query on its own connection (just ensure you close the connection once you are done with that query), or use a RethinkDB connection pool with a driver that supports it, such as https://github.com/neumino/rethinkdbdash, or roll your own.
