How to get information about the client in node.js - javascript

in this very simple example:
var sys = require("sys"),
    http = require("http");

http.createServer(function(request, response) {
    response.writeHead(200, {"Content-Type": "text/plain"});
    response.end("Hello World!");
}).listen(8080);

sys.puts("Server running at http://localhost:8080/");
1.) What kind of information can I get from the client, like browser, screen resolution, etc.?
2.) How can I send information from the client to the server, like parameters?
Thanks!

1) Referrer URL, IP address, user agent, screen size and other stats.
You can also get geolocation, but that is more involved.
2) Some data is available in the headers, so it is sent on every request. Other data, such as screen size, is a little trickier, so you'll want to make an Ajax request to send that along.
// Somewhere on your page(s) - here we use jQuery (and the jquery-cookie plugin for $.cookie)
$(document).ready(function() {
    // Check if they have been logged
    if ($.cookie('logged') == null) {
        // Send screen size and whatever else that is not available from headers
        $.post('/logger', { width: screen.width, height: screen.height }, function(res) {
            // Set cookie for 30 days so we don't keep doing this
            $.cookie('logged', true, { expires: 30 });
        });
    }
});
// Server side - example is an Express controller
exports.logger = function(req, res) {
    var user = {
        agent: req.header('user-agent'), // User agent from the request headers
        referrer: req.header('referrer'), // Likewise for the referrer
        ip: req.header('x-forwarded-for') || req.connection.remoteAddress, // Get IP - allow for proxy
        screen: { // Screen info that we passed in the POST data
            width: req.param('width'),
            height: req.param('height')
        }
    };
    // Store the user in your database
    // User.create(user)...
    res.end();
};
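For completeness, here is a hedged sketch of how such a controller might be wired into an Express app; the paths, module name, and port are assumptions for illustration, not part of the original answer.
// Hypothetical wiring, assuming Express 3.x and that the controller above
// lives in ./controllers/logger.js
var express = require('express');
var logger = require('./controllers/logger');

var app = express();
app.use(express.bodyParser()); // so req.param('width') can read the POSTed body
app.post('/logger', logger.logger);
app.listen(3000);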

You can't get the screen resolution, but you can get the user agent from the "User-Agent" request header.

Have you read the API docs? The req object is an http.ServerRequest object, as documented there. This is HTTP, and things like screen resolution are not part of the protocol. What you can get is the user agent, and from there you might be able to retrieve more information using another service.
Remember that Node.js is a standalone app - it's not running in a browser - it's an HTTP server application running in a JS interpreter.
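To illustrate, a minimal sketch (assuming nothing beyond Node's built-in http module) of reading the header data that is available on a plain http.ServerRequest:
var http = require("http");

http.createServer(function(request, response) {
    // All request headers are available here; Node lowercases the keys.
    var userAgent = request.headers["user-agent"];
    var ip = request.headers["x-forwarded-for"] || request.connection.remoteAddress;

    response.writeHead(200, {"Content-Type": "text/plain"});
    response.end("Your user agent is: " + userAgent + ", your IP is: " + ip);
}).listen(8080);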

Related

Nodejs server with Express not sending responses to HTTP requests from javascript file, but is receiving the requests and processing them

So I have a Node.js server and an Apache server on the same machine, and one of the JavaScript files is sending an HTTP request to the Node.js server. The Node.js server receives the request, reads the data, and puts it in the database, as it should, but it isn't sending back any status codes or data.
Here is the XMLHttpRequest send code snippet:
// Creates a new HTTP request to be sent to the Node.js server
function createNewUser(username, password, email) {
    // The URL is the URL of our local Node.js server
    var userCreateRequest = new XMLHttpRequest();
    userCreateRequest.open("POST", "http://<machine's IP>:8080/api/users");
    // Build URL-encoded user data
    var user = "name=" + username + "&password=" + password + "&email=" + email;
    alert(user);
    // Set content type for the HTTP request
    userCreateRequest.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
    // Event listener for the server response
    // userCreateRequest.addEventListener("readystatechange", processRequest, false);
    // Call process request whenever the state changes
    userCreateRequest.onreadystatechange = function() {
        alert(this.readyState + ", " + this.status);
        if (this.readyState == 4 && this.status == 200) {
            var response = this.response;
            alert(response.name);
        }
    };
    // Send user data to server
    userCreateRequest.send(user);
}
And here is the code for the Node.js server (with Express):
router.route('/users')
    .post(function(req, res) { // create a new user
        var user = new User();
        user.name = req.body.name;
        user.password = req.body.password;
        user.email = req.body.email;
        user.save(function(err) { // add user object to database
            if (err)
                res.send(err);
            res.status(200).json(user);
        });
    });
As I said above, the code works fine in terms of putting the body of the request in the database and what-not, but the server is not sending back the 200 OK response (or I'm failing to receive it for some reason). The only times I get an alert from onreadystatechange is when it's state 2, status 0, and state 4, status 0.
Try the code snippet below.
user.save(function(err, user) {
    if (err)
        return res.send(err);
    res.status(200).json(user);
});
It did end up being a CORS issue. I'm still a little iffy on exactly why, but after configuring the express/CORS package to allow requests from the IP and port of my Apache server, it started working.
My understanding is that cross-origin implies a different domain, whereas both of my servers are (as I understand it) on different ports of the same domain.
Either way, enabling CORS fixed the issue. Thank you to Jaromanda X for pointing it out and getting me on the right track.
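For reference, a minimal sketch of the kind of configuration that resolves this, assuming the cors middleware package for Express; the origin value is a placeholder for wherever the Apache-served page lives.
// Hypothetical sketch: allow cross-origin requests from the Apache-served page.
var express = require('express');
var cors = require('cors');

var app = express();
app.use(cors({ origin: 'http://localhost' })); // placeholder: use your Apache host and port

// ... mount the existing router, e.g. app.use('/api', router);
app.listen(8080);
Note that different ports on the same host count as different origins, which is why the request from the Apache-served page was blocked until CORS was enabled.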

nodejs - how to write a client that connects to a nodeJS Server and receives broadcast message

I have the following Node.js server that seems to work fine. I would like to write a client that receives messages from the server and invokes some JS based on the message.
The steps involved are:
1. User accesses the URL http://server.xyz.com:8080/pa
2. The Node.js server receives that call and broadcasts to the connected clients that pa is the API call received.
3. Node.js clients that are connected to the server invoke some JS related to the pa action.
My questions are:
1. How do I make sure the server broadcasts that message as in Step 2?
2. How do I write a client that performs Step 3 above?
For the client, I am seeing a lot of references to socket.io, but I am not sure what the best framework is in this case.
server.js
var http = require('http');

http.createServer(function(request, response) {
    request.on('error', function(err) {
        console.error(err);
        response.statusCode = 400;
        response.end();
    });
    response.on('error', function(err) {
        console.error(err);
    });
    response.writeHead(200, {'Content-Type': 'application/json'});
    var body = [];
    if (request.method === 'GET' && request.url === '/pa') {
        response.end(JSON.stringify({"action": "pa"}));
    }
    else if (request.method === 'GET' && request.url === '/pi') {
        response.end(JSON.stringify({"action": "pi"}));
    }
    else {
        response.statusCode = 404;
        response.end();
    }
}).listen(8080);
If the clients are also in Node.js then you should be able to set up a webhook/push service. Webhooks are used in every major API today; they are extremely prevalent in Slack's and Microsoft's. The following is inspired by Microsoft's Office 365 push and streaming services APIs.
Add a route that clients can POST to; let's call it /subscribe. In the request body to the /subscribe route, they include a URL - we'll call it the paReceiverUrl.
If a client wishes to be notified when someone else hits the /pa endpoint, it can send a request to the /subscribe endpoint and include the location to notify it at - the paReceiverUrl (along with any other info, probably some auth data). You should store the subscribed clients' information in non-volatile storage to be safe - have a database do this for you, or simply write it to a file until things get more complex. A sketch of such a route follows.
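A minimal sketch of what that /subscribe route could look like, in the same plain-http style as the server above; the in-memory array and the JSON body shape are assumptions for illustration (use a file or database in practice).
// Hypothetical: keep subscriptions in memory for now.
var subscribedClients = [];

// Inside the http.createServer callback, alongside the /pa and /pi branches:
if (request.method === 'POST' && request.url === '/subscribe') {
    var body = '';
    request.on('data', function(chunk) { body += chunk; });
    request.on('end', function() {
        // Expecting something like {"paReceiverUrl": "http://client.example.com/notify"}
        var subscription = JSON.parse(body);
        subscribedClients.push(subscription);
        response.end(JSON.stringify({"subscribed": true}));
    });
    return;
}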
Now your '/pa' route becomes something more like this (your Step 2):
if (request.method === 'GET' && request.url === '/pa') {
    // Send response to the request sender
    response.end(JSON.stringify({"action": "pa"}));
    // Get subscribed clients' information
    var subscribedClients = <Read from file or db>
    // Broadcast
    for (var i = 0; i < subscribedClients.length; i++) {
        // Send them a request at subscribedClients[i].paReceiverUrl
    }
}
Now for your Step 3, the clients simply need to have their own server accepting requests at their paReceiverUrl, and there they can invoke whatever JS they want.
If you need it to be more real-time, then I would go with the WebSocket protocol to keep a persistent connection open and stream data over it.
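To make the broadcast loop above concrete, here is a hedged sketch of notifying each subscriber using Node's built-in http module; the paReceiverUrl field comes from the scheme above, and everything else (helper name, payload shape) is illustrative.
var http = require('http');
var url = require('url');

// Hypothetical helper: POST a small JSON notification to each subscriber.
function notifySubscribers(subscribedClients, action) {
    subscribedClients.forEach(function(client) {
        var target = url.parse(client.paReceiverUrl);
        var payload = JSON.stringify({"action": action});
        var req = http.request({
            hostname: target.hostname,
            port: target.port || 80,
            path: target.path,
            method: 'POST',
            headers: {
                'Content-Type': 'application/json',
                'Content-Length': Buffer.byteLength(payload)
            }
        }, function(res) {
            res.resume(); // drain the response; we only care that it was delivered
        });
        req.on('error', function(err) {
            console.error('Failed to notify ' + client.paReceiverUrl + ': ' + err.message);
        });
        req.end(payload);
    });
}

// Inside the '/pa' branch above: notifySubscribers(subscribedClients, 'pa');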

Pass cookie as part of node.js request

I am using the request package to create my server-side requests. I wrote authentication middleware that checks for a cookie/session ID on all requests. Is there a way to include the user's cookie as part of the request? Here is my current code:
var cookie = parseCookie.parseCookie(req.headers.cookie);

request('http://localhost:3000/users/api', function(error, response, body) {
    console.log(body); // this console.logs my login page since requests w/o valid cookies get redirected to login
    res.render('../views/admin');
});
Currently, this returns 'no cookie found' in the console. However, if I turn off my authentication middleware, the code above works as intended.
Additional info:
The cookie I want is the end user's cookie located on the browser. The end user's cookie is created by the app whenever the user logs in.
Update - solution attempt 1:
I tried this from the documentation:
var cookie = parseCookie.parseCookie(req.headers.cookie);
var cookieText = 'sid='+cookie;
var j = request.jar();
var cookie = request.cookie(cookieText);
var url = 'http://localhost:3000/users/api';
j.setCookie(cookie, url);
request({url: url, jar: j}, function(error, response, body) {
    request('http://localhost:3000/users/api');
});
However, the console is still returning 'no cookie found'
Can someone help?
Thanks in advance!
Let me explain about cookies and that will probably show you why it's hard to get the cookie you want.
When your user's browser logs into http://localhost:3000, that server creates a login cookie and returns it as part of the login response.
When the browser receives that cookie, it saves that cookie persistently within the browser and it associates that cookie with the http://localhost:3000 domain and port.
When the user again makes a request to http://localhost:3000, the browser sends all cookies it has previously saved for that particular domain and port with the request to the server.
When the server receives the request, it can examine any cookies that are sent with the request.
When the browser then makes a request to a different server or even the same server, but on a different port, the browser does NOT send the previously saved cookies with that request because those cookies belong to a different server and port. The browser goes to great security lengths to send cookies only to the servers that the cookies belong to. Since cookies often provide login access, you can clearly see why it's important that things like login credential cookies are not sent to servers they should not be sent to.
Now, on to your node.js code. You show a block of node.js code that is trying to access the same http://localhost:3000 server. But the cookies are stored in the user's browser. Your node.js code cannot get them from the browser, as the browser guards them and will only reveal them when the browser itself sends a request to http://localhost:3000.
If you do actually have the right cookie in your node.js code, then you can set it on your request like this:
request({url: 'http://localhost:3000/users/api', headers: {Cookie: somedataHere}}, function(error, response, body) {
    console.log(body); // this console.logs my login page since requests w/o valid cookies get redirected to login
    res.render('../views/admin');
});
Relevant documentation for custom headers in the request module is here.
Answer:
var cookie = parseCookie.parseCookie(req.headers.cookie);
var cookieText = 'sid=' + cookie;

var options = {
    url: 'https://api.github.com/repos/request/request',
    headers: {
        'User-Agent': 'request',
        'host': 'localhost:3000',
        'cookie': cookieText // this is where you set custom cookies
    }
};

function callback(error, response, body) {
    if (!error && response.statusCode == 200) {
        var info = JSON.parse(body);
        console.log(info.stargazers_count + " Stars");
        console.log(info.forks_count + " Forks");
    }
}

request(options, callback);

Cross-domain jQuery.getJSON from a Node.JS (using express) server does not work in Internet Explorer

This is an annoying problem, and I don't suppose it's only IE that has it. Basically, I have a Node.js server from which I am fetching some JSON data for display via cross-domain calls.
This needs to be a JSONP call, and I give a callback in the URL. What I am not sure of is how to do this.
So the website (domainA.com) has an HTML page with a JS script like this (all works fine in Firefox 3):
<script type="text/javascript">
    var jsonName = 'ABC';
    var url = 'http://domainB.com:8080/stream/aires/'; // The JSON data to get
    jQuery.getJSON(url + jsonName, function(json) {
        // parse the JSON data
        var data = [], header, comment = /^#/, x;
        jQuery.each(json.RESULT.ROWS, function(i, tweet) { ..... }
    }
    ......
</script>
Now my Node.js server is very simple (I'm using express):
var app = require('express').createServer();
var express = require('express');
var request = require('request'); // needed for the outgoing request below
app.listen(3000);

app.get('/stream/aires/:id', function(req, res) {
    // options (host, port, path of the upstream service) is defined elsewhere
    request('http://' + options.host + ':' + options.port + options.path, function(error, response, body) {
        if (!error && response.statusCode == 200) {
            console.log(body); // Print the upstream response body
            res.writeHead(200, {
                'Content-Type': 'application/json',
                'Cache-Control': 'no-cache',
                'Connection': 'keep-alive',
                'Access-Control-Allow-Origin': '*',
                'Access-Control-Allow-Credentials': 'true'
            });
            res.end(JSON.stringify(JSON.parse(body)));
        }
    });
});
How can I change these two so they will work with cross-domain GET in IE? I have been searching the internet and there seem to be a few different suggestions, like jQuery.support.cors = true;, which does not work. There also seem to be a lot of lengthy workarounds.
There is no real 'ideal' design pattern that I have been able to find for this type of thing.
Seeing as I have control over both the web page and the cross-domain web service I'm sending to, what is the best change to make to ensure compatibility across all IE versions along with Firefox, Opera, Chrome, etc.?
Cheers!
Say we have two servers, myServer.com and crossDomainServer.com, both of which we control.
Assuming we want a client of myServer.com to pull some data from crossDomainServer.com, first that client needs to make a JSONP request to crossDomainServer.com:
// client-side JS from myServer.com
// script tag gets around cross-domain security issues
var script = document.createElement('script');
script.src = 'http://crossDomainServer.com/getJSONPResponse';
document.body.appendChild(script); // triggers a GET request
On the cross-domain server we need to handle this GET request:
// in the express app for crossDomainServer.com
app.get('/getJSONPResponse', function(req, res) {
    res.writeHead(200, {'Content-Type': 'application/javascript'});
    res.end("__parseJSONPResponse(" + JSON.stringify('some data') + ");");
});
Then in our client-side JS we need a global function to parse the JSONP response:
// gets called when the cross-domain server responds
function __parseJSONPResponse(data) {
    // now you have access to your data
}
Works well across a wide variety of browsers, IE 6 included.
The following code shows how to handle the GET request (using express) and how to wrap the JSON response using the callback given:
app.get('/foo', function(req, res) {
    res.header('Content-Type', 'application/json');
    res.header('Charset', 'utf-8');
    res.send(req.query.callback + '({"something": "rather", "more": "pork", "tua": "tara"});');
});
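On the client side, jQuery will treat a URL containing callback=? as JSONP automatically; here is a hedged sketch of calling the /foo handler above (the host is a placeholder):
// jQuery replaces the "?" in callback=? with a generated function name,
// injects a <script> tag, and passes the parsed data to the success handler.
jQuery.getJSON('http://domainB.com:8080/foo?callback=?', function(data) {
    console.log(data.something); // "rather"
});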

Node.js http.ServerRequest response never arrives

I'm creating a reverse HTTP proxy using Node.js for fun. The code is pretty simple at the moment. It listens on 127.0.0.1:8080 for HTTP requests and forwards these to hostname.com; responses from hostname.com are then forwarded back to the client. Nothing fancy is done yet, such as rewriting redirect headers. The code is as follows:
var http = require('http');

var server = http.createServer(
    function(request, response) {
        var proxy = http.createClient(8080, 'hostname.com');
        var proxyRequest = proxy.request(request.method, request.url, request.headers);
        proxyRequest.on('response', function(proxyResponse) {
            proxyResponse.on('data', function(chunk) {
                response.write(chunk, 'binary');
            });
            proxyResponse.on('end', function() {
                response.end();
            });
            response.writeHead(proxyResponse.statusCode, proxyResponse.headers);
        });
        request.on('data', function(chunk) {
            proxyRequest.write(chunk, 'binary');
        });
        request.on('end', function() {
            proxyRequest.end();
        });
        proxyRequest.on('close', function(err) {
            if (err) {
                console.log('close error: ' + err + ' for ' + request.url);
            }
        });
    });

server.listen(8080);

server.on('clientError', function(exception) {
    console.log('boo a clientError occured :(');
});
All appears to work well until I browse to a page that requires many additional resources (such as images) to be fetched. Naturally the browser will generate a number of GET requests to the reverse proxy to fetch these additional resources.
When I browse to such a page, some of the http.ServerRequests for the additional resources never receive responses. If I reload the page it almost always succeeds, since all the resources that were successfully fetched on the first attempt were cached (hence the browser doesn't try to GET them again), so the browser only needs to grab the few missing ones.
At a guess I would imagine I'm hitting some kind of connection limit, although I'm not sure. Any help would be greatly appreciated!
If you set up Wireshark on the proxy, you'll almost certainly see what's happening. (Note that you may need a second machine for this, because some TCP/IP stacks don't provide anything that Wireshark can listen on for loopback traffic - see this)
I'm almost certain that the problem(s) you are running into here are all down to the Connection: header - proxies MUST parse this header and handle it correctly. At a guess, I would say your code is handling the first request in a Connection: keep-alive stream and ignoring the rest. As a proxy, you are supposed to parse and remove/replace this header, and any associated headers (in this case the Keep-Alive: header), before forwarding the request to the server.
If you want to build an HTTP/1.1 proxy, it's very important that you read RFC 2616 and adhere to the many, many rules it places on proxy behaviour. The particular problem you are running into here is documented in section 14.10.
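As an illustration only, here is a minimal sketch (not the answerer's code) of stripping the Connection: and Keep-Alive: headers before forwarding, along the lines described above:
// Hypothetical helper: remove hop-by-hop headers (RFC 2616 section 14.10)
// before passing a client's headers on to the upstream server.
function stripHopByHopHeaders(headers) {
    var hopByHop = ['connection', 'keep-alive', 'proxy-authenticate',
                    'proxy-authorization', 'te', 'trailers',
                    'transfer-encoding', 'upgrade'];
    var cleaned = {};
    Object.keys(headers).forEach(function(name) {
        if (hopByHop.indexOf(name.toLowerCase()) === -1) {
            cleaned[name] = headers[name];
        }
    });
    return cleaned;
}

// In the proxy above, forward the cleaned headers instead:
// var proxyRequest = proxy.request(request.method, request.url,
//                                  stripHopByHopHeaders(request.headers));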
