I am running this script in node:
var http = require('http');
var server = http.createServer(function (request, response) {
    response.writeHead(200, { "Content-Type": "text/plain" });
    response.write('Hello World\n');
    response.end('Goodbye World', 'utf8', function () {
        console.log(response.body);
    });
});
server.listen(8000);
console.log('running');
When I load the page (localhost:8000) in Chrome I see:
Hello World
Goodbye World
So far so good, but I'm trying to understand where in the response object the data ('Hello World\nGoodbye World') is. That's why I have console.log(response.body) as the callback in response.end() (the Node HTTP documentation says that the callback will be executed when the response has finished streaming). However, the console.log just gives 'undefined'. When I console.log the whole response object it prints the object fine, but I can't see any data or body in there, even though it has hasBody: true.
So the question is:
a) Is there a response.body? I am thinking there has to be one, otherwise nothing would show in the browser window.
b) If so, how can I access it, and why doesn't my way work?
The closest answer I could find was this one: Where is body in a nodejs http.get response?, but I tried adding
response.on('data', function(chunk) {
    body += chunk;
});
response.on('end', function() {
    console.log(body);
});
as suggested there, and it didn't work. Also, people there are just answering HOW you can access the data, not WHY response.body isn't easily accessible.
Thanks
There is no response body; the data you write to the response stream is just sent to the client as you write it (for the most part). It wouldn't make sense to keep in memory everything ever written to the response.
The same goes for requests: if you want the full body, you have to buffer the incoming data yourself. It is not done behind the scenes; the data is merely streamed in.
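For the request side, a minimal sketch of that buffering with the plain http module might look like this (the port and log messages are just illustrative):
var http = require('http');

var server = http.createServer(function (request, response) {
    var body = '';                          // buffer the incoming request body yourself
    request.on('data', function (chunk) {
        body += chunk;                      // chunks arrive as they are streamed in
    });
    request.on('end', function () {
        console.log('Request body:', body); // the full body is only available once 'end' fires
        response.end('ok');
    });
});

server.listen(8000);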
Related
I have a Google Cloud Function based on Node.js 8 and I'd like to process the body of the IncomingMessage object. I can't access the body via req.body as outlined in the Google examples; I get that req.body is undefined.
If I log the req object, I get an IncomingMessage object, so I try to read the body as explained here, and I end up with the following implementation.
'use strict';

exports.insertSuccessfulConsent = (req, res) => {
    console.log(`METHOD: ${req.method}`);
    console.log(`HEADERS: ${JSON.stringify(req.headers)}`);

    let body = "";
    req.on('data', chunk => {
        body += chunk.toString();
    });
    req.on('end', () => {
        console.log(body);
    });

    console.log('Body: ' + body);

    let message = 'POST processed';
    res.status(200).send(message);
};
Unfortunately the body is empty, although the HTTP POST request has data in the body. This is my test call:
curl -X POST HTTP_TRIGGER_ENDPOINT -H "Content-Type:application/json" -d '{"name":"Jane"}'
Headers and HTTP Methods are correct in the log, only the body is missing.
The question is: how do I get the body from the req object?
I'm not sure, but the example written by Google references Express, while the solution you referenced is for an issue with Node's plain http module. I'm not sure it fits here, even though Express uses that module itself.
Is your listener for the 'data' event even being called? If not, that's why your body is empty: you initialized it as an empty string and your listener never got called by the data event.
The reason req.body is undefined is probably that it is undefined by default in Express. You will need a parser, as Express points out in its documentation.
You can use something like the body-parser module to populate req.body.
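As a rough sketch, assuming a plain Express app with body-parser installed (the route name and port here are just illustrative, not your Cloud Function endpoint):
const express = require('express');
const bodyParser = require('body-parser');

const app = express();
app.use(bodyParser.json());               // populates req.body for JSON requests

app.post('/consent', (req, res) => {      // illustrative route
    console.log(req.body);                // e.g. { name: 'Jane' } for the curl call above
    res.status(200).send('POST processed');
});

app.listen(3000);
Newer versions of Express also ship the same middleware as express.json(), so you may not need a separate dependency.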
I hope it helps.
I'm new to Node.js. I understand that createReadStream() is better for performance than readFile(), because createReadStream() reads and writes data in chunks while readFile() first reads the whole content. Thus, if the file is large, readFile() might take longer before the data can be processed further. So I chose to create my server using createReadStream(), as follows.
// Create a server with fs.createReadStream(), better performance and less memory usage.
var http = require('http');
var fs = require('fs');
var url = require('url');

http.createServer(function (request, response) {
    // Parse the request containing the file name.
    var pathname = url.parse(request.url).pathname;

    // Create a readable stream.
    var readerStream = fs.createReadStream(pathname.substr(1));

    // Set the encoding to be UTF8.
    readerStream.setEncoding('UTF8');

    // Handle stream events --> data, end and error.
    readerStream.on('data', function(chunk) {
        // Page found
        // HTTP Status: 200 : OK
        // Content Type: text/html
        response.writeHead(200, {'Content-type': 'text/html'});
        // Write the content of the file to the response body.
        response.write(chunk);
        console.log('Page is being streamed...');
    });

    readerStream.on('end', function() {
        console.log('Page is streamed and emitted successfully.');
    });

    readerStream.on('error', function(err) {
        // HTTP Status: 404 : NOT FOUND
        // Content Type: text/html
        response.writeHead(404, {'Content-type': 'text/html'});
        console.log('Page streaming error: ' + err);
    });

    console.log('Code ends!');
}).listen(8081);
// Console will print the message
console.log('Server running at http://127.0.0.1:8081/');
My .html or .txt file contains three short lines of text. After starting my server I visit my web page by going to http://127.0.0.1:8081/index.html. Everything works fine and the content of index.html is displayed in the browser.
But in the browser tab, the loading icon keeps spinning, as if the page keeps loading, for about a minute.
Is that normal with a Node.js server? Does the icon just keep spinning but cost nothing to the server? Or am I missing something and the icon isn't supposed to keep spinning?
It doesn't look like you are ending your response. The browser probably thinks the request isn't finished and thus continues to "load".
If you look at the Network tab in the developer console you might see the request hasn't finished.
You should be calling response.end():
This method signals to the server that all of the response headers and body have been sent; that server should consider this message complete. The method, response.end(), MUST be called on each response.
I believe you should be calling response.end() in both the readerStream.on('end' and readerStream.on('error' callbacks after you write the head. This will tell the browser the request is finished and it can stop the loading action.
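A sketch of what that change might look like, using the readerStream and response variables from the question:
readerStream.on('end', function() {
    response.end();                       // tell the browser the response is complete
    console.log('Page is streamed and emitted successfully.');
});

readerStream.on('error', function(err) {
    response.writeHead(404, {'Content-type': 'text/html'});
    response.end();                       // end the response on errors as well
    console.log('Page streaming error: ' + err);
});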
-- The background to the situation --
I'm making an e-form signup for a client of our business marketing strategy service blah blah blah...
The form is done and looks great. Now I need to hook it up to the existing API of the service our business uses to hold/sort/query/etc. the submitted information.
I'm a very junior developer and the API is very complex. I just want to make sure my ES6/JavaScript is in proper working order: the AJAX calls are working, there are no bugs in my code, etc. So the quickest, easiest way to test things seemed to be a simple local server, so I can test my calls and get everything working BEFORE I start going through tons of API documentation and hooking it up to our service properly. The first call seems to work fine, but I couldn't get my lil' baby server to "respond" properly with some static info to parse through. I'm primarily a front-end developer, but I'm obsessed with figuring this little server problem out at this point... so help would be VERY appreciated.
-- the fetch request --
fetch('http://localhost:4000/send-zip')
    .then(
        (response) => {
            response.json(),
            console.log('begin fetch to local server'),
            console.log(response),
            populate_store_selector(response)
        })
    .catch(
        (error) => console.log('basic fetch request failed' + error)
    )
-- that other function in case people ask --
(it is simply meant to iterate through the data and populate an HTML select element)
function populate_store_selector(arg) {
    for (i of arg) {
        let new_option = document.createElement('option')
        new_option.innerHTML = arg[i]
        select_shop.appendChild(new_option)
    }
}
-- my little baby server --
const express = require('express')
const server = express()

server.use(function (req, res, next) {
    res.header("Access-Control-Allow-Origin", "*");
    res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
    next();
});

server.get('/send-zip', function (req, res) {
    res.send({ "options": ['option1', 'option2', 'option3'] })
})

server.listen(4000, () => console.log('mock server listening on port 4000!'))
The server works just fine and does its job OTHER than I'm never able to get it to send a JSON object back :(
I've tried lots of things so far. Honestly it doesn't matter much, as my request on the front end works just fine, but I'm too obsessed to let this go right now...
-- what the console.log() shows in the browser --
begin fetch to local server
main.js:104
Response {type: "cors", url: "http://localhost:4000/send-zip", redirected: false, status: 200, ok: true, …}
    body: (...)
    bodyUsed: true
    headers: Headers {}
    ok: true
    redirected: false
    status: 200
    statusText: "OK"
    type: "cors"
    url: "http://localhost:4000/send-zip"
    __proto__: Response
main.js:108
basic fetch request failedTypeError: arg is not iterable
You might try parsing the response after the response stream completes, and then taking action on the data.
fetch('http://localhost:4000/send-zip')
    .then(
        (response) => {
            return response.json();
        })
    .then(
        (response_json) => {
            console.log('begin fetch to local server'),
            console.log(response_json),
            populate_store_selector(response_json)
        })
    .catch(
        (error) => console.log('basic fetch request failed' + error)
    )
The reason you need the additional .then step and the return response.json() is that the HTTP Response object exposes the body data as a readable stream.
The json() method reads that stream and converts it into JSON once the stream completes, returning a promise. This may feel somewhat unintuitive to anyone familiar with axios or other AJAX convenience libraries, since that part of the process is abstracted from view.
What this basically means is that after you wait for the Response object to be returned, you need to wait again for the body stream to finish.
There are a few other methods that act on the stream once it completes, including arrayBuffer(), blob(), and text() (there are a few more as well, I think). They convert the data into whichever format you prefer after the stream has finished.
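If it helps to see the two waits spelled out, here is the same idea with async/await; this is just an illustrative rewrite, and the load_options function name is made up:
async function load_options() {
    const response = await fetch('http://localhost:4000/send-zip'); // first wait: the headers arrive
    const data = await response.json();                             // second wait: the body stream finishes and is parsed
    console.log(data);                                              // e.g. { options: ['option1', 'option2', 'option3'] }
}

load_options().catch(error => console.log('basic fetch request failed ' + error));
Note that your server responds with { options: [...] }, so whatever you pass on to populate_store_selector probably needs to be data.options (the array) rather than the whole object.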
I'm having trouble with something very basic. I'm going through Node.js in Action (great book so far!) and I can't get this simple example to work. Perhaps it's because the stream API was updated after the book came out. I'm not sure. Anyway, here's the code:
var http = require('http');

var server = http.createServer(function (req, res) {
    req.on('data', function (chunk) {
        console.log("Chunk: ", chunk);
    });
    req.on('end', function () {
        console.log("End of Request");
        res.end('yay');
    });
}).listen(3000);
The console.log('Chunk: ', chunk) never fires. It's almost as if the data events never fire, but according to the documentation the presence of the data handler should switch the readable stream (req) into flowing mode. Am I missing something?
Any help would rock!
The above code is correct. Initially the request body is empty; you have to send data in the request body in order for this handler to execute:
req.on('data', function (chunk) {
    console.log("Chunk: ", chunk);
});
Use Postman to send data in the request body and this handler will be executed.
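For example, a request with a body like this one (assuming the server from the question is listening on port 3000) should trigger the 'data' handler:
curl -X POST http://localhost:3000/ -H "Content-Type: text/plain" -d "hello world"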
I really can't figure out what I've done wrong. I've spent about half an hour looking at this code and re-reading code that essentially does the same thing and works. The 'data' event and its corresponding callback are never triggered.
var http = require("http");

http.createServer(function(request, response) {
    response.writeHead(200);
    console.log('Executing');

    request.on('data', function(chunk) {
        console.log('data being read');
        console.log(chunk.toString());
    });

    request.on('end', function() {
        console.log('done');
        response.end();
    });
}).listen(8080);
Please help
You probably aren't sending a request body, so the 'data' event never fires. Try sending a POST or PUT request. If you use a GET request with a query string, you will fire the 'end' event, but not 'data'.
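For example (assuming the server above is listening on port 8080), these two requests behave differently:
A POST with a body fires 'data' and then 'end':
curl -X POST http://localhost:8080/ -d "some data"
A GET with only a query string fires 'end' but no 'data':
curl "http://localhost:8080/?foo=bar"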