I am writing a bare-minimum Node.js server to understand how it works. From what I can tell, this line:
res.writeHead(200);
does nothing. If I remove it I get the exact same behavior from the browser. Does it have any purpose? Is it OK to remove it?
// https://nodejs.dev/learn/the-nodejs-http-module
const http = require('http');
const server = http.createServer(handler);
server.listen(3000, () => {
console.log('node: http server: listening on port: ' + 3000);
});
function handler(req, res) {
res.writeHead(200);
res.end('hello world\n');
}
Is it somehow related to HTTP headers?
The default HTTP status is 200, so you do not have to tell the response object that the status is 200; if you don't set it, the response will automatically be 200. You can remove res.writeHead(200); similarly, in Express you wouldn't need the equivalent res.status(200).
The other thing that res.writeHead(200) does is cause the headers to be written out in the response stream at that point. You also don't need to call that yourself, because the headers are sent automatically the first time you start sending the body, e.g. with res.write() or res.end() (or res.send() in Express). In fact, the res.headersSent property keeps track of whether the headers have been sent yet; if not, they are sent as soon as you call any method that starts sending the body.
Is it OK to remove it?
Yes, in this context it is OK to remove it.
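To see both points in action, here is a minimal sketch (plain Node http, nothing assumed beyond the question's own setup) that logs res.headersSent around the implicit flush:
const http = require('http');
const server = http.createServer((req, res) => {
  // No explicit writeHead(): the status defaults to 200 and the
  // headers are flushed implicitly when the body starts going out.
  console.log(res.headersSent); // false: nothing sent yet
  res.end('hello world\n');     // sends headers first, then the body
  console.log(res.headersSent); // true: headers went out with end()
});
server.listen(3000);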
Related
I am trying to set up a very simple Node.js HTTP server. When I call it from the browser, like this: http://localhost:8081, it works fine, but when I call it using the JS fetch() method, I get a 404 error:
GET http://localhost/:8081?q=hi
JS:
fetch(":8081/?q=hi")
NODE JS:
const http = require('http');
const requestListener = function (req, res) {
res.writeHead(200);
res.end('Hello, World!');
}
const server = http.createServer(requestListener);
server.listen(8081);
Everything is fine; you just need to enable CORS, that's it. Use the code below:
const http = require('http')
const requestListener = function (req, res) {
const headers = {
'Access-Control-Allow-Origin': '*', /* #dev First, read about security */
'Access-Control-Allow-Methods': 'OPTIONS, POST, GET',
'Access-Control-Max-Age': 2592000, // 30 days
/** add other headers as per requirement */
};
res.writeHead(200, headers);
res.end(JSON.stringify({"key":"value"}));
}
const server = http.createServer(requestListener);
server.listen(8081);
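With those headers in place, the browser-side call from a page on a different origin would look something like this (a sketch; the port matches the question's server):
// The full URL is required when the page is served from a different origin.
fetch('http://localhost:8081/?q=hi')
  .then(function (response) { return response.json(); })
  .then(function (data) { console.log(data); });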
If you are running both frontend and backend code on the same server, you don't have to use the complete URL, but if you are running frontend and backend on different servers, you need to enable CORS and use the complete URL.
When you're calling your local server through JS fetch from a page served by that same server, you don't need to add the port number; you can call it like below:
fetch('/?q=hi')
The URL handed to the fetch function looks wrong; it would work if you adjust it to:
fetch('http://localhost:8081/?q=hi');
// or
fetch('/?q=hi');
It should work just fine. Also make sure you enable CORS if it needs to work from a different origin.
I'm using the npm request library and am running into an issue where the request is never sent if I call Express's res.send() after calling request. I realize the request callback won't fire if I close the connection, but I'm not even seeing the request being sent in the first place.
This code is being executed on RunKit (formerly TonicDev), an online code editor that allows code execution via endpoints. I'm not seeing this issue on my local machine, so it seems like it may have to do with RunKit. Anyone have any ideas as to what's going on here or how I might work around this?
You can execute the code yourself by going to:
https://runkit.com/gragland/58056bc6e9d9ed00130c84d5 and clicking the endpoint link at the top.
// Helper to return a RunKit compatible express app (runkit.com/tonic/express-endpoint)
var tonicExpress = require("@runkit/tonic/express-endpoint/1.0.0")
// Provide the exports object to the tonicExpress helper
var app = tonicExpress(module.exports)
var request = require('request')
app.get("/", function(req, res){
var request_number = 9
request({
// To see if request is sent go to: https://requestb.in/1coqbqn1?inspect
url: 'http://requestb.in/1coqbqn1',
method: 'POST',
json: {
request_number: request_number,
message: 'hello'
}
})
// The line below has to be commented out for the above request to be sent
// I don't care about the request callback() firing, I just want the request to be sent
res.send('Done')
})
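One workaround worth trying (a sketch, assuming RunKit suspends the sandbox as soon as the endpoint responds): defer res.send() until the outbound request has completed by moving it into the request callback:
app.get("/", function (req, res) {
  request({
    url: 'http://requestb.in/1coqbqn1',
    method: 'POST',
    json: { request_number: 9, message: 'hello' }
  }, function (err, response, body) {
    // Respond only after the outbound request finishes, so the
    // environment has no chance to freeze it mid-flight.
    res.send(err ? 'Request failed: ' + err.message : 'Done');
  });
});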
We have a Node/Restify app and I'm having a problem with this route (some code has been omitted, so please ignore missing identifiers):
server.post('/report', requireAuth, function (req, res, next) {
function generateReport(err, result, extension, contentType) {
// Filename for the report
var filename = 'Report_' + moment().format('YYYY-MM-DD_HH:mm:ss')
// Output the result directly with a lower-level API
res.writeHead(200, {
'Content-Length': Buffer.byteLength(result),
'Content-Type': contentType,
'Content-Disposition': 'attachment;filename='+filename+'.'+extension
})
res.write(result, 'utf8')
res.end()
console.log('ended')
return next()
}
})
The problem is that, after sending the output, the connection with the client is not closed, even if I call res.end()! (res is just a wrapper around the standard HTTP ServerResponse object.) Commenting out return next() doesn't make any difference.
I can see on the console 'ended' and I know the content is sent to the client because if I force the termination of the connection (by simply terminating the Node process), the client has all the output.
What am I doing wrong?
EDIT
Removing the Content-Length header, instead, makes it work. Why is that? (The advice to use that header is part of the documentation)
EDIT2
The value of Buffer.byteLength(result) is 4478 bytes, but the data downloaded by the client (Postman) is 4110, and that's likely why the connection is not being closed (the client is expecting to receive more data). How is that possible?
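Not a diagnosis, but a defensive pattern that rules out header/body disagreements: build the exact Buffer you intend to send and derive Content-Length from it, so the header can never claim more bytes than you actually write:
// result may contain multi-byte UTF-8 characters, so its byte length
// and its string length can differ; convert once and measure the bytes.
var body = Buffer.from(result, 'utf8'); // new Buffer(result, 'utf8') on older Node
res.writeHead(200, {
  'Content-Length': body.length, // exactly the bytes written below
  'Content-Type': contentType,
  'Content-Disposition': 'attachment;filename=' + filename + '.' + extension
});
res.end(body);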
I have just gotten started with Node.js and I am trying to write a simple HTTP client that sends a POST request to a server.
var http = require('http');
var req = http.request(
{
host : 'localhost',
port: 3000,
path: '/',
method:'POST'
},function(res){
console.log('res status - ' + res.statusCode);
res.on('data', function(){}); //<--------
}
);
(I have omitted the code that writes to the request and calls req.end().) I have observed that if the last line is commented out and the client doesn't read the response, the client doesn't terminate. What is the reason behind this?
This behaviour is due to the following Node implementation detail:
Calling http.request() creates an http.ClientRequest object with the behaviour below:
If no 'response' handler (i.e. the 2nd argument of the request method) is added, then the response will be entirely discarded. However, if you add a 'response' event handler, then you must consume the data from the response object: either by calling response.read() whenever there is a 'readable' event, by adding a 'data' handler, or by calling the .resume() method. Until the data is consumed, the 'end' event will not fire. Also, until the data is read, it will consume memory that can eventually lead to a 'process out of memory' error.
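A minimal sketch of the two ways to drain the response so that 'end' can fire and the process can exit:
var http = require('http');
var req = http.request(
  { host: 'localhost', port: 3000, path: '/', method: 'POST' },
  function (res) {
    console.log('res status - ' + res.statusCode);
    // Option 1: attach a 'data' handler, even an empty one.
    // res.on('data', function () {});
    // Option 2: discard the body explicitly.
    res.resume();
    res.on('end', function () {
      console.log('response consumed; socket released');
    });
  }
);
req.end();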
I'm creating a reverse HTTP proxy using Node.js for fun. The code is pretty simple at the moment. It listens on 127.0.0.1:8080 for HTTP requests and forwards these to hostname.com, responses from hostname.com are then forwarded back to the client. Nothing fancy is done yet such as rewriting redirect headers, etc. The code is as follows:
var http = require('http');
var server = http.createServer(
function(request, response) {
var proxy = http.createClient(8080, 'hostname.com')
var proxyRequest = proxy.request(request.method, request.url, request.headers);
proxyRequest.on('response', function(proxyResponse) {
proxyResponse.on('data', function(chunk) {
response.write(chunk, 'binary');
});
proxyResponse.on('end', function() {
response.end();
});
response.writeHead(proxyResponse.statusCode, proxyResponse.headers);
});
request.on('data', function(chunk) {
proxyRequest.write(chunk, 'binary');
});
request.on('end', function() {
proxyRequest.end();
});
proxyRequest.on('close', function(err) {
if (err) {
console.log('close error: ' + err + ' for ' + request.url);
}
});
});
server.listen(8080);
server.on('clientError', function(exception) {
console.log('boo a clientError occurred :(');
});
All appears to work well until I browse to a page that requires many additional resources (such as images) to be fetched. Naturally the browser will generate a number of GET requests to the reverse proxy to fetch these additional resources.
When I do browse to such a page, some of the http.ServerRequests for the additional resources never receive responses. If I reload the page it almost always succeeds, as all the resources that were successfully fetched on the first attempt were cached (hence the browser doesn't try to GET them again), and so now the browser only needs to grab the few missing ones.
At a guess I would imagine I'm hitting some kind of connection limit although I'm not sure. Any help would be greatly appreciated!
If you set up Wireshark on the proxy, you'll almost certainly see what's happening. (Note that you may need a second machine for this, because some TCP/IP stacks don't provide anything that Wireshark can listen on for loopback traffic - see this)
I'm almost certain that the problem(s) you are running into here are all down to the Connection: header - proxies MUST parse this header and handle it correctly. At a guess, I would say your code is handling the first request in a Connection: keep-alive stream and ignoring the rest. As a proxy, you are supposed to parse and remove/replace this header, and any associated headers (in this case the Keep-Alive: header), before forwarding the request to the server.
If you want to build an HTTP/1.1 proxy, it's very important that you read RFC 2616 and adhere to the many, many rules it places on proxy behaviour. The particular problem you are running into here is documented in section 14.10.
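For illustration, here is a sketch of stripping the hop-by-hop headers before forwarding; the header list comes from RFC 2616 (section 13.5.1), while the helper itself is hypothetical:
// Hop-by-hop headers must not be forwarded by a proxy (RFC 2616, 13.5.1).
var HOP_BY_HOP = [
  'connection', 'keep-alive', 'proxy-authenticate', 'proxy-authorization',
  'te', 'trailers', 'transfer-encoding', 'upgrade'
];
function stripHopByHopHeaders(headers) {
  var out = {};
  // Also drop any header explicitly named in the Connection header itself.
  var named = (headers['connection'] || '').split(',')
    .map(function (h) { return h.trim().toLowerCase(); });
  Object.keys(headers).forEach(function (key) {
    var lower = key.toLowerCase();
    if (HOP_BY_HOP.indexOf(lower) === -1 && named.indexOf(lower) === -1) {
      out[key] = headers[key];
    }
  });
  return out;
}
// Usage in the proxy above:
// var proxyRequest = proxy.request(request.method, request.url,
//                                  stripHopByHopHeaders(request.headers));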