I'm building my first website, using Node.js as the web server and fetch for requests to it. When the MySQL database runs on localhost, each request takes less than 30ms and the webpage has no issues. However, when the database is hosted somewhere else, I get very long response times ONLY when the data returned by the query is '[]'. I'm trying to figure out why this happens only when requesting from a non-localhost database (or why it's happening at all).
See the difference in response times between these two links:
{ link removed }
{ link removed }
You'll see that the one which returns '[]' takes significantly longer than the other.
Here's the server-side code:
router.get('/getgames/:matchid/:season', (req, res) => {
  let sql = `SELECT * FROM games WHERE match_id='${req.params.matchid}' AND season='${req.params.season}'`;
  let query = db.query(sql, (err, result) => {
    if (err) throw err;
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify(result));
  });
});
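(As an aside, unrelated to the slowness: the mysql module supports '?' placeholders, which avoids interpolating user input into the SQL string. A minimal sketch, assuming db is a regular mysql-module connection:)

// Hedged sketch: the same query using placeholders instead of
// template-string interpolation (assumes `db` is a mysql connection).
let sql = 'SELECT * FROM games WHERE match_id = ? AND season = ?';
db.query(sql, [req.params.matchid, req.params.season], (err, result) => {
  if (err) throw err;
  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify(result));
});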
I don't think it's necessary to include the client-side code, because the request takes just as long when it's put into the URL bar.
It's also important to note that the tables being accessed have fewer than 10 rows, so I don't think indexing (or posting my EXPLAINs) would actually fix this problem (or would it?).
Edit: I've added a covering index to the 'games' table, and the query is still just as slow.
Thanks for your help.
For some reason I'm not sure of, Node seems to wait for the keep-alive connection to close before sending that empty-array body. Disabling keep-alive mitigates this issue; see https://nodejs.org/api/http.html#http_server_keepalivetimeout (the default keep-alive timeout is 5 s).
I suppose having a longer body overcomes this issue, which is why responses return immediately for queries with actual SQL results.
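A minimal sketch of that mitigation (keepAliveTimeout requires Node 8+; app.listen here stands in for however the Express app from the question is started):

// app.listen() returns the underlying http.Server, whose keep-alive
// timeout defaults to 5000 ms; setting it to 0 disables the keep-alive
// timeout behavior on incoming connections.
const server = app.listen(3000);
server.keepAliveTimeout = 0;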
Related
I send JSON requests one by one to the Node.js server. After the 6th request, the server can't reply to the client immediately; it takes a while (15 seconds or a bit more) before it sends back a 200 OK. Each request writes a JSON value into MongoDB, and response time matters for my REST calls. How can I find the error in this case (which tool or script could help me)? My server-side code is like this:
var controlPathDatabaseSave = "/save";

app.use('/', function(req, res) {
  console.log("req body app use", req.body);
  var str = req.path;
  if (str.localeCompare(controlPathDatabaseSave) == 0) {
    console.log("controlPathDatabaseSave");
    mongoDbHandleSave(req.body);
    res.setHeader('Content-Type', 'application/json');
    res.write('Message taken: \n');
    res.write('Everything all right with database saving');
    res.send("OK");
    console.log("response body", res.body);
  }
});
My client-side code is below:
function saveDatabaseData() {
  console.log("saveDatabaseData");
  var oReq = new XMLHttpRequest();
  oReq.open("POST", "http://192.168.80.143:2800/save", true);
  oReq.setRequestHeader("Content-type", "application/json;charset=UTF-8");
  oReq.onreadystatechange = function() { // call a function when the state changes
    if (oReq.readyState == 4 && oReq.status == 200) {
      console.log("http responseText", oReq.responseText);
    }
  };
  oReq.send(JSON.stringify({links: links, nodes: nodes}));
}
--MongoDB save code
function mongoDbHandleSave(reqParam) {
  // Connect to the db
  MongoClient.connect(MongoDBURL, function(err, db) {
    if (!err) {
      console.log("We are connected in accordance with saving");
    } else {
      return console.dir(err);
    }

    /*
    db.createCollection('user', {strict: true}, function(err, collection) {
      if (err)
        return console.dir(err);
    });
    */

    var collection = db.collection('user');
    // when saving into the database only use req.body; skip JSON.stringify()
    var doc = reqParam;
    collection.update(doc, doc, {upsert: true});
  });
}
You can see my REST calls in the Google Chrome developer tools. (The first six calls get 200 OK; the last one stays pending.)
--Client output
--Server output
Thanks in advance,
Since it looks like these are Ajax requests from a browser, each browser has a limit on the number of simultaneous connections it will allow to the same host. Browsers have varied that setting over time, but it is likely in the 4-6 range. So, if you are trying to run 6 simultaneous ajax calls to the same host, then you may be running into that limit. What the browser does is hold off on sending the latest ones until the first ones finish (thus avoiding sending too many at once).
The general idea here is to protect servers from getting beat up too much by one single client, and thus to share the load across many clients more fairly. Of course, if your server has nothing else to do, it doesn't really need protecting from a few more connections, but the limit isn't adaptive; it's just hard-wired.
If there are any other requests in process (loading images or scripts or CSS stylesheets) to the same origin, those will count to the limit too.
If you run this in Chrome and open the network tab of the developer tools, you can see on the timeline exactly when a given request was sent and when its response was received. That should show you immediately whether the later requests are being held up at the browser or at the server.
Here's an article on the topic: Maximum concurrent connections to the same domain for browsers.
Also, keep in mind that, depending upon what your requests do on the server and how the server is structured, there may be a maximum number of requests the server can efficiently process at once. For example, if you had a blocking, threaded server configured with one thread for each of four CPUs, then once the server has four requests going at once, it may have to queue the fifth request until the first one is done, delaying it more than the others.
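If the browser's connection limit is the culprit, one workaround is to chain the requests so only one is in flight at a time. A minimal sketch in the style of the client code above (sendSequentially is a hypothetical helper, not part of the original post):

// Send each payload only after the previous request completes, so the
// browser's per-host connection limit is never reached.
function sendSequentially(payloads, url) {
  if (payloads.length === 0) return;
  var oReq = new XMLHttpRequest();
  oReq.open("POST", url, true);
  oReq.setRequestHeader("Content-type", "application/json;charset=UTF-8");
  oReq.onreadystatechange = function() {
    if (oReq.readyState == 4) {
      sendSequentially(payloads.slice(1), url); // next one only when done
    }
  };
  oReq.send(JSON.stringify(payloads[0]));
}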
I want to get the content of a webpage by running JavaScript code on Node.js. I want the content to be exactly the same as what I see in the browser.
This is the URL :
https://www.realtor.ca/Residential/Single-Family/17219235/2103-1185-THE-HIGH-STREET-Coquitlam-British-Columbia-V3B0A9
I use the following code, but I get a 405 in response.
var fs = require('fs');
var request = require('request');

var link = 'https://www.realtor.ca/Residential/Single-Family/17219235/2103-1185-THE-HIGH-STREET-Coquitlam-British-Columbia-V3B0A9';

request(link, function (error, response, body) {
  fs.writeFile("realestatedata.html", body, function(err) {
    if (err) {
      console.log('error in saving the file');
      return console.log(err);
    }
    console.log("The file was saved!");
  });
});
The file which is saved is not related to what I can see in the browser.
I think a real answer will be easier to understand since my comment was truncated.
It seems the method of the request you send is not supported by the server (405 Method Not Allowed: "The method specified in the Request-Line is not allowed for the resource identified by the Request-URI. The response MUST include an Allow header containing a list of valid methods for the requested resource."). Do you have more information about the HTTP response?
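To gather that information, here is a minimal sketch (same request module and URL as in the question) that logs the status code and the Allow header the RFC requires:

var request = require('request');
var link = 'https://www.realtor.ca/Residential/Single-Family/17219235/2103-1185-THE-HIGH-STREET-Coquitlam-British-Columbia-V3B0A9';

request(link, function (error, response, body) {
  if (error) return console.error(error);
  console.log(response.statusCode);    // 405 in your case
  console.log(response.headers.allow); // methods the server says it accepts
});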
Have you tried the following code instead of yours?
request('https://www.realtor.ca/Residential/Single-Family/17219235/2103-1185-THE-HIGH-STREET-Coquitlam-British-Columbia-V3B0A9').pipe(fs.createWriteStream('realestatedata.html'))
You could also have a look at In Node.js / Express, how do I "download" a page and gets its HTML?.
Note that the page will not render the same way anyway when you only open the HTML, since it also requires many other resources (110 requests are made when displaying the page).
I think the following answer can help you to download the whole page.
https://stackoverflow.com/a/34935427/1630604
I might be out of my depth, but I really need something to work. I think a write/read stream will solve both my issues, but I don't quite understand the syntax or what's required for it to work.
I read the stream handbook and thought I understood some of the basics, but when I try to apply it to my situation, it seems to break down.
Currently I have this as the crux of my information.
function readDataTop(x) {
  console.log("Read " + x[6] + " and Sent Cached Top Half");
  jf.readFile("loadedreports/top" + x[6], 'utf8', function (err, data) {
    resT = data;
  });
}
I'm using the jsonfile plugin for Node, which basically shortens fs.readFile/fs.writeFile and saves me from constantly writing try/catch blocks around reads and writes.
Anyway, I want to implement a stream here, but I'm unsure what would happen on the Express end and how the object would be received.
I assume that since it's a stream, Express won't do anything with the object until it receives it? Or would I have to write a callback to make sure that, when my function is called, the stream is complete before Express sends the object off to fulfill the Ajax request?
app.get('/:report/top', function(req, res) {
  readDataTop(global[req.params.report]);
  res.header("Content-Type", "application/json; charset=utf-8");
  res.header("Cache-Control", "max-age=3600");
  res.json(resT);
  resT = 0;
});
I'm hoping that changing the read to a stream will alleviate two problems. The first is occasionally receiving partial JSON files when the browser makes the Ajax call, due to the read speed of larger JSON objects (this might be the callback issue I need to solve, but a stream should make it more consistent).
Secondly, when this Node app loads, it needs to run 30+ file writes while it pulls the data from my DB. The goal is to disconnect the browser from the DB side, so Node acts as the DB by reading and writing files. This is because an old SQL Server is already being bombarded by a lot of requests (stale data isn't an issue).
Any help on the syntax here?
Is there a tutorial I can see in code of someone piping a response into a write stream? (The mssql module I use puts the SQL response into an object, and I need it in JSON format.)
function getDataTop(x) {
  var connection = new sql.Connection(config, function(err) {
    var request = new sql.Request(connection);
    request.query(x[0], function(err, topres) {
      jf.writeFile("loadedreports/top" + x[6], topres, function(err) {
        if (err) {
          console.log(err);
        } else {
          console.log(x[6] + " top half was saved!");
        }
      });
    });
  });
}
Your problem is that you're not waiting for the file to load before sending the response. Use a callback:
function readDataTop(x, cb) {
  console.log('Read ' + x[6] + ' and Sent Cached Top Half');
  jf.readFile('loadedreports/top' + x[6], 'utf8', cb);
}
// ...
app.get('/:report/top', function(req, res) {
  // you should really avoid using globals like this ...
  readDataTop(global[req.params.report], function(err, obj) {
    // setting the content-type is automatically done by `res.json()`
    // cache the data here in-memory if you need to and check for its
    // existence before `readDataTop`
    res.header('Cache-Control', 'max-age=3600');
    res.json(obj);
  });
});
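For the in-memory caching mentioned in the comments, a minimal sketch (the cache object and the error handling are assumptions, not part of the original code):

var cache = {};

app.get('/:report/top', function(req, res) {
  var key = req.params.report;
  res.header('Cache-Control', 'max-age=3600');
  if (cache[key]) return res.json(cache[key]); // serve from memory if present
  readDataTop(global[key], function(err, obj) {
    if (err) return res.status(500).json({ error: 'read failed' });
    cache[key] = obj; // remember it for subsequent requests
    res.json(obj);
  });
});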
I'm having trouble running a Node.js server with Adobe Brackets. Once in live preview (the URL is http://localhost:SOMERANDOMPORT/path/to/file.html), I start the server. If I type http://localhost:3000/test straight into another tab, it displays the correct JSON.
I then added an event function to an element in file.html that upon clicking it makes an AJAX request to my server and uses the response to change some of its inner HTML. However, clicking the element in live preview fails, and the error callback gets called instead.
How can I fix this? I suspect it has to do with the fact that the AJAX request sends to http://localhost:SOMERANDOMPORT/test rather than http://localhost:3000/test, but I can't seem to find a solution.
Everything runs locally. Below is my server:
var express = require('express');
var mysql = require('mysql');
var app = express();

app.get('/test', function(req, res) {
  var connection = mysql.createConnection(...);
  connection.query("SELECT author FROM posts", function(err, results) {
    if (err) {
      console.log(err);
      console.log('Error on retrieving data.');
      res.send(err);
      return;
    }
    console.log(results[results.length - 1]);
    res.send(results[results.length - 1]); // return last row
  });
  connection.end();
});

app.listen(3000);
console.log('Listening on port 3000');
And the event function:
function getAuthor() {
  $.ajax({
    type: 'GET',
    url: '/test',
    success: function(data, status) {
      $('.author').text('Authored by ' + data.author);
    },
    error: function(jqXHR, status, error) { // this always gets called
      $('.author').text('Something went wrong.');
    }
  });
}
I appreciate any help.
The simplest fix is to point Live Preview directly at your own Node server, letting it serve up the pages itself from the correct port number (rather than serving the pages from Brackets's built-in server that's on a different port). See instructions on the Brackets wiki under "Using your own backend."
The downside is that HTML live updating is disabled - but you'll still get CSS live updating, and Brackets falls back on a simpler "live reload" on save for HTML content.
To keep live HTML updating enabled, you'd need to work around the port-number difference somehow. You could hardcode a localhost:3000 base URL for testing, but then you'll run into same-origin problems because the port numbers don't match. Working around that would be pretty involved (set up CORS on your Node server, etc.).
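For illustration, a minimal sketch of that CORS workaround, assuming Express as in your server code (the wide-open '*' origin is only sensible for local testing):

// Let pages served from the live-preview port call this server.
app.use(function(req, res, next) {
  res.header('Access-Control-Allow-Origin', '*');
  next();
});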
One other option for keeping the full Live Preview experience is to shim all your $.ajax() calls so they return hardcoded dummy data without hitting the server. If you're already doing some mocking for unit tests, you might be able to reuse that existing infrastructure for this.
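A sketch of such a shim (the canned payload is made up for illustration):

// Wrap $.ajax so known endpoints resolve with dummy data instead of
// hitting the server.
var realAjax = $.ajax;
$.ajax = function(options) {
  if (options.url === '/test') {
    options.success({ author: 'Dummy Author' }, 'success');
    return;
  }
  return realAjax.apply($, arguments);
};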
I'm creating a reverse HTTP proxy using Node.js for fun. The code is pretty simple at the moment. It listens on 127.0.0.1:8080 for HTTP requests and forwards these to hostname.com, responses from hostname.com are then forwarded back to the client. Nothing fancy is done yet such as rewriting redirect headers, etc. The code is as follows:
var http = require('http');

var server = http.createServer(function(request, response) {
  var proxy = http.createClient(8080, 'hostname.com');
  var proxyRequest = proxy.request(request.method, request.url, request.headers);
  proxyRequest.on('response', function(proxyResponse) {
    proxyResponse.on('data', function(chunk) {
      response.write(chunk, 'binary');
    });
    proxyResponse.on('end', function() {
      response.end();
    });
    response.writeHead(proxyResponse.statusCode, proxyResponse.headers);
  });
  request.on('data', function(chunk) {
    proxyRequest.write(chunk, 'binary');
  });
  request.on('end', function() {
    proxyRequest.end();
  });
  proxyRequest.on('close', function(err) {
    if (err) {
      console.log('close error: ' + err + ' for ' + request.url);
    }
  });
});

server.listen(8080);

server.on('clientError', function(exception) {
  console.log('boo a clientError occurred :(');
});
All appears to work well until I browse to a page that requires many additional resources (such as images) to be fetched. Naturally the browser will generate a number of GET requests to the reverse proxy to fetch these additional resources.
When I browse to such a page, some of the http.ServerRequests for the additional resources never receive responses. If I reload the page it almost always succeeds, because all the resources that were fetched successfully on the first attempt are now cached (so the browser doesn't try to GET them again) and the browser only needs to grab the few missing ones.
At a guess I would imagine I'm hitting some kind of connection limit although I'm not sure. Any help would be greatly appreciated!
If you set up Wireshark on the proxy, you'll almost certainly see what's happening. (Note that you may need a second machine for this, because some TCP/IP stacks don't provide anything that Wireshark can listen on for loopback traffic - see this)
I'm almost certain that the problem(s) you are running into here are all down to the Connection: header - proxies MUST parse this header and handle it correctly. At a guess, I would say your code is handling the first request in a Connection: keep-alive stream and ignoring the rest. As a proxy, you are supposed to parse and remove/replace this header, and any associated headers (in this case the Keep-Alive: header), before forwarding the request to the server.
If you want to build an HTTP/1.1 proxy, it's very important that you read RFC 2616 and adhere to the many, many rules it places on proxy behaviour. The particular problem you are running into here is documented in section 14.10.
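A hedged sketch of that header handling, dropped into the proxy above in place of the line that forwards request.headers verbatim (illustrative, not the poster's code):

// Copy the client's headers, then remove Connection and the hop-by-hop
// headers it names, as RFC 2616 section 14.10 requires, before forwarding.
var headers = {};
Object.keys(request.headers).forEach(function(name) {
  headers[name] = request.headers[name];
});
var hopByHop = (headers['connection'] || '').split(',')
  .map(function(name) { return name.trim().toLowerCase(); })
  .concat(['connection', 'keep-alive']);
hopByHop.forEach(function(name) { delete headers[name]; });
var proxyRequest = proxy.request(request.method, request.url, headers);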