If one request takes a long time to execute, other tabs wait until the first request has completed. Why does this happen, and how can I solve it? Here is my code:
var http = require("http");
var url = require("url");

http.createServer(function(request, response) {
    if (url.parse(request.url).pathname == '/long') {
        // Busy-wait for 10 seconds, counting iterations
        function longlong() {
            var counter = 0;
            var startTime = new Date().getTime();
            while (new Date().getTime() < startTime + 10000) {
                counter++;
            }
            return counter;
        }
        response.writeHead(200, {"Content-Type": "text/plain"});
        response.write(longlong() + '');
        response.end();
    } else {
        response.writeHead(200, {"Content-Type": "text/plain"});
        response.write(0 + '');
        response.end();
    }
}).listen(8888);
Good read:
http://blog.mixu.net/2011/02/01/understanding-the-node-js-event-loop/
A key line from that write up is:
…however, everything runs in parallel except your code
That means expensive I/O should be async, but your own code can still block. Typically, though, it's expensive I/O that we're worried about blocking the server, not CPU-bound JavaScript.
See this other question for more details with an example: Node.js blocking the event loop?
Because NodeJS is single-threaded (just like browser JavaScript), it can only execute one piece of code at a time. We get the illusion of concurrency because NodeJS has a queue of code blocks that are fired asynchronously. However, while one block is being processed, no other block can run at the same time.
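You can see that single thread directly with a tiny standalone snippet (a sketch, not taken from the question):

// A timer scheduled for 100 ms...
setTimeout(function() {
    console.log('timer fired');
}, 100);

// ...cannot fire while this synchronous loop holds the only thread.
var end = Date.now() + 3000;
while (Date.now() < end) { /* busy-wait */ }

// 'timer fired' prints only now, after ~3 seconds instead of 100 ms.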
With NodeJS you have to make sure that every request ends (either successfully or not); otherwise it may crash beyond any help (the entire server, not only the request).
Also, using process.nextTick instead of a classical loop may help your requests work faster (i.e. be more scalable).
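Here is a minimal sketch of that idea applied to the question's 10-second loop: do the work in short slices and yield to the event loop between slices. setImmediate is used rather than process.nextTick, because recursively scheduled nextTick callbacks run before pending I/O and would still starve other requests.

function longlongAsync(done) {
    var counter = 0;
    var endTime = Date.now() + 10000;
    function slice() {
        // Work for roughly 10 ms, then yield.
        var sliceEnd = Date.now() + 10;
        while (Date.now() < sliceEnd && Date.now() < endTime) {
            counter++;
        }
        if (Date.now() < endTime) {
            setImmediate(slice); // let other requests run, then continue
        } else {
            done(counter);       // all 10 seconds of counting finished
        }
    }
    slice();
}

The /long handler would then call longlongAsync(function(counter) { response.end(counter + ''); }) instead of the synchronous longlong(), and a slow tab would no longer block the others.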
Related
I send JSON requests one by one to the Node.js server. After the 6th request, the server can't reply to the client immediately; it takes a while (15 seconds or a bit more) before it sends back 200 OK. Each request writes a JSON value into MongoDB, and response time is important for my REST calls. How can I find the error in this case? (Which tool or script could help me?) My server-side code is like this:
var controlPathDatabaseSave = "/save";

app.use('/', function(req, res) {
    console.log("req body app use", req.body);
    var str = req.path;
    if (str.localeCompare(controlPathDatabaseSave) == 0) {
        console.log("controlPathDatabaseSave");
        mongoDbHandleSave(req.body);
        res.setHeader('Content-Type', 'application/json');
        res.write('Message taken: \n');
        res.write('Everything all right with database saving');
        res.end("OK"); // res.send() after res.write() would conflict; end the stream instead
        console.log("response body", res.body); // note: res.body is undefined, this logs 'undefined'
    }
});
My client-side code is below:
function saveDatabaseData() {
    console.log("saveDatabaseData");
    var oReq = new XMLHttpRequest();
    oReq.open("POST", "http://192.168.80.143:2800/save", true);
    oReq.setRequestHeader("Content-type", "application/json;charset=UTF-8");
    oReq.onreadystatechange = function() { // called whenever the state changes
        if (oReq.readyState == 4 && oReq.status == 200) {
            console.log("http responseText", oReq.responseText);
        }
    };
    oReq.send(JSON.stringify({links: links, nodes: nodes}));
}
-- MongoDB save code
var MongoClient = require('mongodb').MongoClient; // required import; MongoDBURL is assumed to be defined elsewhere

function mongoDbHandleSave(reqParam) {
    // Connect to the db
    MongoClient.connect(MongoDBURL, function(err, db) {
        if (!err) {
            console.log("We are connected in accordance with saving");
        } else {
            return console.dir(err);
        }
        /*
        db.createCollection('user', {strict: true}, function(err, collection) {
            if (err)
                return console.dir(err);
        });
        */
        var collection = db.collection('user');
        // when saving into the database only use req.body; skip the JSON.stringify() function
        var doc = reqParam;
        collection.update(doc, doc, {upsert: true});
    });
}
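On the "which tool or script code can help me?" part: one cheap check is to time the database write itself, to see whether MongoDB is the slow step. Below is a hypothetical instrumented variant of the function above, not the original code; it assumes the classic pre-3.x mongodb driver API, where update accepts a trailing callback and db.close() releases the connection.

function mongoDbHandleSaveTimed(reqParam) {
    var started = Date.now();
    MongoClient.connect(MongoDBURL, function(err, db) {
        if (err) return console.dir(err);
        db.collection('user').update(reqParam, reqParam, {upsert: true}, function(err, result) {
            if (err) console.dir(err);
            console.log('database save took ' + (Date.now() - started) + ' ms');
            db.close(); // release the connection instead of leaving it open per request
        });
    });
}

If the logged times stay small while the browser still shows pending requests, the delay is on the client side, not in MongoDB.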
You can see my REST calls in the Google Chrome developer tools. (The first six calls get 200 OK; the last one is stuck in a pending state.)
--Client output (screenshot not included)
--Server output (screenshot not included)
Thanks in advance,
Since it looks like these are Ajax requests from a browser, each browser has a limit on the number of simultaneous connections it will allow to the same host. Browsers have varied that setting over time, but it is likely in the 4-6 range. So, if you are trying to run 6 simultaneous ajax calls to the same host, then you may be running into that limit. What the browser does is hold off on sending the latest ones until the first ones finish (thus avoiding sending too many at once).
The general idea here is to protect servers from getting beat up too much by one single client and thus allow the load to be shared across many clients more fairly. Of course, if your server has nothing else to do, it doesn't really need protecting from a few more connections, but this isn't an interactive system, it's just hard-wired to a limit.
If there are any other requests in process (loading images or scripts or CSS stylesheets) to the same origin, those will count toward the limit too.
If you run this in Chrome and open the network tab of the debugger, you can actually see on the timeline exactly when a given request was sent and when its response was received. This should show you immediately whether the later requests are being held up at the browser or at the server.
Here's an article on the topic: Maximum concurrent connections to the same domain for browsers.
Also, keep in mind that, depending upon what your requests do on the server and how the server is structured, there may be a maximum number of server requests that can be efficiently processed at once. For example, if you had a blocking, threaded server configured with one thread for each of four CPUs, then once the server has four requests going at once, it may have to queue the fifth request until the first one is done, causing it to be delayed more than the others.
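As a hypothetical client-side check (the endpoint name here is made up), fire more requests than the per-host limit and log completion times; with a limit of about six, the later requests will visibly start late:

for (var n = 0; n < 10; n++) {
    (function(n) {
        var started = Date.now();
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "/some-slow-endpoint", true); // hypothetical same-origin URL
        xhr.onload = function() {
            console.log("request " + n + " done after " + (Date.now() - started) + " ms");
        };
        xhr.send();
    })(n);
}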
In-browser JavaScript is pathetically broken in that the only way to make cross-domain requests is using script tags and JSONP. To make this useful, I'm trying to make a Node.js server that, given a callback name and an address, loads the page at the address, wraps it in a call to the callback, and serves the result. However, I know next to nothing about Node.js. If the server's response is loaded from a script tag, it would result in actually loading a web page. Currently, I'm writing the request as localhost:8000/callback/address, so a script tag might be <script src="localhost:8000/alert/https://www.google.com" type="text/javascript"></script>. Here is my code for the server:
var http = require("http");
var request = require("request");

var server = http.createServer(function(req, res) {
    req.on("end", function() {
        console.log("alive");
        var url = req.url;
        var i = url.indexOf("/", 1);
        // Fetch the target address and wrap the body in the named callback
        request(url.substring(i + 1), function(err, ret, body) {
            res.writeHead(200);
            res.write(url.substring(1, i) + "(\"" + body + "\");");
            res.end();
        });
    });
});
server.listen(8000);
Why does this stay loading for a very long time but never actually load? Judging by console.log() output, the req.on("end") callback is never even called.
If you don't care about any request data, you could just add req.resume(); after you add your end event handler.
The reason it's getting "stuck" is that since node v0.10, streams start out in a paused state, so you need to unpause them by reading from them in some way. req.resume(); accomplishes this. Once there is nothing left in the request stream (and there may have been nothing to begin with), the end event will be emitted.
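Concretely, that's a one-line addition to the server above (a sketch; the proxying body is elided):

var server = http.createServer(function(req, res) {
    req.on("end", function() {
        // ... fetch the target page and wrap it in the callback, as before ...
    });
    req.resume(); // unpause the request stream so 'end' can actually fire
});
server.listen(8000);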
If you look at the answer by Casey Chu (answered Nov 30 '10) in this question: How do you extract POST data in Node.js?
You will see that he is responding to 'data' events to construct the body of the request. Reproducing the code here:
var qs = require('querystring');

function handlePost(request, response) { // named here so the snippet parses on its own
    if (request.method == 'POST') {
        var body = '';
        request.on('data', function(data) {
            body += data;
            // Too much POST data, kill the connection!
            if (body.length > 1e6)
                request.connection.destroy();
        });
        request.on('end', function() {
            var post = qs.parse(body);
            // use post['blah'], etc.
        });
    }
}
Suppose I don't care about POST requests, and hence never check whether a request is a POST or create a 'data' event handler. Is there a risk that someone can block my thread by sending a really large POST request? For example, instead of the above code, what if I just did:
function hearStory(request, response) {
    response.writeHead(200, {"Content-Type": "text/plain"});
    response.write("Cool story bro!");
    response.end();
}
What happens to really large POST requests then? Does the server just ignore the body? Is there any risk to this approach? GET requests, including their headers, must be less than 80 kB, so it seems like a simple way to avoid flooding my server.
Hopefully these kinds of attacks can be detected and averted before they ever get to your server, via a firewall or something else. You shouldn't handle DoS attacks with the server itself. However, if an attacker has reached your server with malicious intent, there needs to be a way to handle it. If you intend on handling POST requests, the code you're referring to will help.
If you just want to avoid POST requests altogether and not listen for them, as the second code snippet demonstrates, you could do something like the following.
function denyPost(req, res) {
    if (req.method == 'POST') {
        console.log('POST denied...'); // this is optional.
        req.connection.destroy();      // this kills the connection.
    }
}
Of course, this won't work if you plan on handling POST requests somehow. But again, DoS attacks need to be handled before they ever get to your server. If they've gotten there, they've already won.
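For illustration, here is a hypothetical way to wire that helper into a server alongside the question's hearStory handler (the wiring itself is made up, not from the answer):

var http = require('http');

http.createServer(function(req, res) {
    denyPost(req, res);      // destroys the connection if the method is POST
    if (req.method != 'POST') {
        hearStory(req, res); // normal handling from the question's snippet
    }
}).listen(8000);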
Noob question on using callbacks as a control-flow pattern with Node and the http class. Based on my understanding of the event loop, all code is blocking, I/O is non-blocking and uses callbacks. Here's a simple HTTP server and a pseudo REST function:
// Require
var http = require("http");

// Class
function REST() {}

// Methods
REST.prototype.resolve = function(request, response, callback) {
    // Pseudo rest function
    function callREST(request, callback) {
        if (request.url == '/test/slow') {
            setTimeout(function() { callback('time is 30 seconds'); }, 30000);
        } else if (request.url == '/test/foo') {
            callback('bar');
        }
    }
    // Call pseudo rest
    callREST(request, callback);
};

// Class
function HTTPServer() {}

// Methods
HTTPServer.prototype.start = function() {
    http.createServer(function(request, response) {
        // Listeners
        request.resume();
        request.on("end", function() {
            // Execute only if not a favicon request
            var faviconCheck = request.url.indexOf("favicon");
            if (faviconCheck < 0) {
                // Print
                console.log('incoming validated HTTP request: ' + request.url);
                // Instantiate and execute on a new REST object
                var rest = new REST();
                rest.resolve(request, response, function(responseMsg) {
                    var contentType = {'Content-Type': 'text/plain'};
                    response.writeHead(200, contentType); // Write response header
                    response.end(responseMsg);            // Send response and end
                    console.log(request.url + ' response sent and ended');
                });
            } else {
                response.end();
            }
        });
    }).listen(8080);
    // Print to console
    console.log('HTTPServer running on 8080. PID is ' + process.pid);
};

// Process
// Create http server instance
var httpServer = new HTTPServer();
// Start
httpServer.start();
If I open up a browser and hit the server with "/test/slow" in one tab and then "/test/foo" in another, I get the following behavior: "foo" responds with "bar" immediately, and then 30 seconds later "slow" responds with "time is 30 seconds". This is what I was expecting.
But if I open up 3 tabs in a browser and hit the server with "/test/slow" successively in each tab, "slow" is processed and responds serially/synchronously, so that the 3 responses appear at 30-second intervals. I was expecting the responses right after each other if they were being processed asynchronously.
What am I doing wrong?
Thank you for your thoughts.
This is actually not the server's fault. Your browser is opening a single connection and re-using it between the requests, but one request can't begin until the previous one finishes. You can see this in a couple of ways:
Look in the network tab of the Chrome dev tools - the entry for the longest one will show the request in the blocking state until the first two finish.
Try opening the slow page in different browsers (or one each in normal and incognito windows) - this prevents sharing connections.
Thus, this will only happen if the same browser window is making multiple requests to the same server. Also note that XHR (AJAX) requests open separate connections, so they can be performed in parallel. In the real world, this won't be a problem.
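As a hypothetical check, issuing the three slow requests via XHR from a single page, instead of three tabs, should make them come back together rather than 30 seconds apart:

for (var i = 0; i < 3; i++) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "http://localhost:8080/test/slow", true);
    xhr.onload = function() {
        console.log(this.responseText); // all three should arrive ~30 s after sending
    };
    xhr.send();
}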
I've created a Node.js script that scans the network for available HTTP pages, so there are a lot of connections I want to run in parallel, but it seems that some of the requests wait for a previous one to complete.
The following is the code fragment:
// options, callback and errCallback are supplied by the surrounding scanner code
var reply = {};
reply.started = new Date().getTime();

var req = http.request(options, function(res) {
    reply.status = res.statusCode;
    reply.rawHeaders = res.headers;
    reply.headers = JSON.stringify(res.headers);
    reply.body = '';
    res.setEncoding('utf8');
    res.on('data', function(chunk) {
        reply.body += chunk;
    });
    res.on('end', function() {
        reply.finished = new Date().getTime();
        reply.time = reply.finished - reply.started;
        callback(reply);
    });
});
req.on('error', function(e) {
    if (e.message == 'socket hang up') {
        return;
    }
    errCallback(e.message);
});
req.end();
This code performs only 10-20 requests per second, but I need 500-1k requests per second. Every queued request is made to a different HTTP server.
I've tried to do something like this, but it didn't help:
http.globalAgent.maxSockets = 500;
Something else must be going on with your code. Node can comfortably handle 1k+ requests per second.
I tested with the following simple code:
var http = require('http');

var results = [];
var j = 0;

// Make 1000 parallel requests:
for (var i = 0; i < 1000; i++) {
    http.request({
        host: '127.0.0.1',
        path: '/'
    }, function(res) {
        results.push(res.statusCode);
        j++;
        if (j == 1000) { // the last request has returned
            console.log(JSON.stringify(results));
        }
    }).end();
}
To test purely what Node is capable of, and not my home broadband connection, the code requests from a local Nginx server. I also avoid console.log until all the requests have returned, because it is implemented as a synchronous function (to avoid losing debugging messages when a program crashes).
Running the code using time I get the following results:
real 0m1.093s
user 0m0.595s
sys 0m0.154s
That's 1.093 seconds for 1000 requests, which makes it very close to 1k requests per second.
The simple code above will generate OS errors if you try to make a lot of requests (like 10000 or more), because Node will happily try to open all those sockets in the for loop (remember: the requests don't start until the for loop ends; they are only created). You mentioned that your solution runs into the same errors. To avoid this, you should limit the number of parallel requests you make.
The simplest way of limiting the number of parallel requests is to use one of the Limit functions from the async.js library:
var http = require('http');
var async = require('async');

var requests = [];

// Build a large list of task functions:
for (var i = 0; i < 10000; i++) {
    requests.push(function(callback) {
        http.request({
            host: '127.0.0.1',
            path: '/'
        }, function(res) {
            res.resume(); // drain the response so the socket is freed
            callback(null, res.statusCode);
        }).end();
    });
}

// Make the requests, 100 at a time
async.parallelLimit(requests, 100, function(err, results) {
    console.log(JSON.stringify(results));
});
Running this with time on my machine I get:
real 0m8.882s
user 0m4.036s
sys 0m1.569s
So that's 10k requests in around 9 seconds, or roughly 1.1k/s.
Look at the functions available from async.js.
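Since in your case every request goes to a different server, a hypothetical adaptation using async.eachLimit over a host list (host names made up) might look like this:

var http = require('http');
var async = require('async');

// Hypothetical list of hosts produced by the scanner
var hosts = ['host1.example', 'host2.example', 'host3.example'];

// Scan at most 100 hosts at a time
async.eachLimit(hosts, 100, function(host, done) {
    http.request({host: host, path: '/'}, function(res) {
        res.resume(); // drain the body so the socket is released
        console.log(host, res.statusCode);
        done();
    }).on('error', function(e) {
        console.log(host, e.message);
        done(); // keep scanning even when a host is unreachable
    }).end();
}, function(err) {
    console.log('scan complete');
});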
I've found a solution for my case; it is not very good, but it works:
var childProcess = require('child_process');
I'm using curl:
childProcess.exec('curl --max-time 20 --connect-timeout 10 -iSs "' + options.url + '"', function(error, stdout, stderr) {
    // parse stdout for the status line, headers and body here
});
This allows me to run 800-1000 curl processes simultaneously. Of course, this solution has its weaknesses, like the requirement for lots of open file descriptors, but it works.
I've tried node-curl bindings, but that was very slow too.