I'm trying to write a simple Node.js http server which stops and exits its process when a given url is accessed: in this case, /stop. This is my program so far:
var http = require('http');
var server = http.createServer(function(request, response) {
    response.end(function() {
        if (request.url === '/stop') server.close();
    });
});
server.listen(12345);
What I expect to happen is that the callback is called when the response finishes sending, and the server closes if the URL is /stop. However, all this ends up doing is showing a blank page (which I expected) and nothing else.
On the other hand, this program almost works if I omit response.end():
var http = require('http');
var server = http.createServer(function(request, response) {
    if (request.url === '/stop') server.close();
});
server.listen(12345);
It does manage to stop the server, but not until I terminate the GET request by closing the browser window.
How can I get the server to stop when I GET a certain URL without having to close the browser window? (It would be nice if I could send a "Server stopped" message to the browser, too.)
NOTE: I know that I don't actually have the code to differentiate between GET and POST requests, but I would like to be able to handle both, and I'm assuming the answer isn't different depending upon the type of request.
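For what it's worth, server.close() only stops the server from accepting new connections; it waits for existing (keep-alive) connections to finish, which is why the second version does nothing until the browser window is closed. A minimal sketch of one way to handle this (assuming it's acceptable to exit the process outright) is to send the reply, ask the client not to keep the socket open, and exit once the listener has shut down:

var http = require('http');

var server = http.createServer(function(request, response) {
    if (request.url === '/stop') {
        response.setHeader('Connection', 'close'); // don't hold the socket open for keep-alive
        response.end('Server stopped');            // reply to the browser first
        server.close(function() {                  // stop accepting new connections...
            process.exit(0);                       // ...then exit once the listener is down
        });
    } else {
        response.end('Still running');
    }
});

server.listen(12345);

Sending Connection: close means the socket is torn down as soon as the response is flushed, so server.close() can complete instead of waiting on an idle keep-alive connection. If other keep-alive connections from earlier requests are still open, server.close() will wait for those too; in that case calling process.exit() directly is the blunt option.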
Related
We need to send an HTTP status code 200 with a body of 'OK' in reply to a notification that comes in through Zapier.
Is it possible to use the following code in Zapier:
var http = require('http');
const server = http.createServer((req, res) => {
    res.statusCode = 200;
    res.end('OK');
}).listen(80);
It returns an error:
Error: You did not define `output`! Try `output = {id: 1, hello: "world"};`
And the reply doesn't work.
David here, from the Zapier Platform team.
To cut to the chase - though it might be possible to start an http server (there's no reason it wouldn't be, as far as I know), it's not going to do what it seems like you're hoping to do. Namely, you can't send a custom response to an incoming webhook. From the docs:
There is no way to customize the response to the request you send to the Catch Hook URL, as the response is sent before the Zap triggers and runs on the webhook request.
If you need behavior like that, I'd suggest running a webserver.
The specific Code step error you're seeing has to do with not defining output in the function. Something goes in and something must come out. You can customize the output based on the input and use that output in later steps, but something has to be returned from the function (even if it's just {}).
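As a concrete illustration (the field name here is just a placeholder), the Code step only needs output to be assigned before it finishes:

// Code by Zapier (JavaScript) step -- a minimal sketch, not Zapier's exact example.
// The step just has to finish with `output` assigned; the object (or array of
// objects) assigned here is what later steps in the Zap can read.
output = {status: 'OK'};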
I send JSON requests one by one to my Node.js server. After the 6th request, the server can't reply to the client immediately; it takes a while (15 seconds or a bit more) before it sends back a 200 OK. Each request writes a JSON value into MongoDB, and response time matters to me for these REST calls. How can I find the cause of the delay? (Which tool or script could help me?) My server-side code looks like this:
var controlPathDatabaseSave = "/save";
app.use('/', function(req, res) {
    console.log("req body app use", req.body);
    var str = req.path;
    if(str.localeCompare(controlPathDatabaseSave) == 0)
    {
        console.log("controlPathDatabaseSave");
        mongoDbHandleSave(req.body);
        res.setHeader('Content-Type', 'application/json');
        res.write('Message taken: \n');
        res.write('Everything all right with database saving');
        res.send("OK");
        console.log("response body", res.body);
    }
});
My client-side code is below:
function saveDatabaseData()
{
    console.log("saveDatabaseData");
    var oReq = new XMLHttpRequest();
    oReq.open("POST", "http://192.168.80.143:2800/save", true);
    oReq.setRequestHeader("Content-type", "application/json;charset=UTF-8");
    oReq.onreadystatechange = function() { // Call a function when the state changes.
        if(oReq.readyState == 4 && oReq.status == 200) {
            console.log("http responseText", oReq.responseText);
        }
    }
    oReq.send(JSON.stringify({links: links, nodes: nodes}));
}
--MongoDB save code
function mongoDbHandleSave(reqParam) {
    // Connect to the db
    MongoClient.connect(MongoDBURL, function(err, db)
    {
        if(!err)
        {
            console.log("We are connected in accordance with saving");
        } else
        {
            return console.dir(err);
        }
        /*
        db.createCollection('user', {strict:true}, function(err, collection) {
            if(err)
                return console.dir(err);
        });
        */
        var collection = db.collection('user');
        // when saving into database only use req.body. Skip JSON.stringify() function
        var doc = reqParam;
        collection.update(doc, doc, {upsert:true});
    });
}
You can see my REST calls in the Google Chrome developer tools. (The first six calls get 200 OK; the last one is stuck in a pending state.)
--Client output (screenshot not reproduced here)
--Server output (screenshot not reproduced here)
Thanks in advance,
Since it looks like these are Ajax requests from a browser, each browser has a limit on the number of simultaneous connections it will allow to the same host. Browsers have varied that setting over time, but it is likely in the 4-6 range. So, if you are trying to run 6 simultaneous ajax calls to the same host, then you may be running into that limit. What the browser does is hold off on sending the latest ones until the first ones finish (thus avoiding sending too many at once).
The general idea here is to protect servers from getting beat up too much by one single client and thus allow the load to be shared across many clients more fairly. Of course, if your server has nothing else to do, it doesn't really need protecting from a few more connections, but this isn't an adaptive system; the browser is just hard-wired to a limit.
If there are any other requests in process (loading images or scripts or CSS stylesheets) to the same origin, those will count to the limit too.
If you run this in Chrome and you open the Network tab of the debugger, you can actually see on the timeline exactly when a given request was sent and when its response was received. This should show you immediately whether the later requests are being held up at the browser or at the server.
Here's an article on the topic: Maximum concurrent connections to the same domain for browsers.
Also, keep in mind that, depending upon what your requests do on the server and how the server is structured, there may be a maximum number of server requests that can be efficiently processed at once. For example, if you had a blocking, threaded server that was configured with one thread for each of four CPUs, then once the server has four requests going at once, it may have to queue the fifth request until the first one is done, causing it to be delayed more than the others.
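If the Network tab shows that the server is the bottleneck, it's also worth making sure every request gets exactly one complete response so the browser can release or reuse the connection. A sketch of what that could look like for the server code above (assuming Express 4-style res.status()/res.json() helpers, and keeping the existing mongoDbHandleSave()):

// Respond exactly once per request so the connection is freed as soon as
// the work has been handed off to the database layer.
app.post('/save', function(req, res) {
    mongoDbHandleSave(req.body);               // fire off the DB write
    res.status(200).json({message: 'OK'});     // complete the response immediately
});

// Anything that doesn't match a route should still get an answer, otherwise
// that connection stays occupied until it times out.
app.use(function(req, res) {
    res.status(404).end();
});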
I'm using the npm request library and am running into an issue where the request is never sent if I call express's res.send() after calling request. I realize the request callback won't fire if I close the connection, but I'm not even seeing the request being sent in the first place.
This code is being executed on RunKit (formerly TonicDev), an online code editor that allows code execution via endpoints. I'm not seeing this issue on my local machine, so it seems like it may have to do with RunKit. Anyone have any ideas as to what's going on here or how I might work around this?
You can execute the code yourself by going to:
https://runkit.com/gragland/58056bc6e9d9ed00130c84d5 and clicking the endpoint link at the top.
// Helper to return a RunKit compatible express app (runkit.com/tonic/express-endpoint)
var tonicExpress = require("#runkit/tonic/express-endpoint/1.0.0")

// Provide the exports object to the tonicExpress helper
var app = tonicExpress(module.exports)
var request = require('request')

app.get("/", function(req, res) {
    var request_number = 9

    request({
        // To see if request is sent go to: https://requestb.in/1coqbqn1?inspect
        url: 'http://requestb.in/1coqbqn1',
        method: 'POST',
        json: {
            request_number: request_number,
            message: 'hello'
        }
    })

    // The line below has to be commented out for the above request to be sent
    // I don't care about the request callback() firing, I just want the request to be sent
    res.send('Done')
})
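One workaround that may be worth trying (a sketch only, not verified against RunKit's behaviour) is to hold off on res.send() until the outgoing request's callback has fired, so the endpoint invocation can't finish before the POST has actually gone out:

app.get("/", function(req, res) {
    request({
        url: 'http://requestb.in/1coqbqn1',
        method: 'POST',
        json: {request_number: 9, message: 'hello'}
    }, function(err, response, body) {
        // err/response/body are ignored here; the point is just to wait
        // for the outgoing request to complete before ending our own response
        res.send('Done')
    })
})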
In-browser JavaScript is pathetically broken in that the only way to make cross-origin requests is using script tags and JSONP. To make this useful, I'm trying to make a Node.js server that, given a callback name and an address, loads the page at that address, wraps it in a call to the callback, and serves the result. However, I know next to nothing about Node.js. If the server's response is loaded from a script tag, it would result in actually loading a web page. Currently, I'm writing the request as localhost:8000/callback/address, so a script tag might be <script src="localhost:8000/alert/https://www.google.com" type="text/javascript"></script>. Here is my code for the server:
var http = require("http");
var request = require("request");
var server = http.createServer(function(req, res){
req.on("end", function(){
console.log("alive");
var url = req.url;
var i = url.indexOf("/", 1);
request(url.substring(i + 1), function(err, ret, body){
res.writeHead(200);
res.write(url.substring(1, i) + "(\"" + body + "\");");
res.end();
});
});
});
server.listen(8000);
Why does this stay loading for a very long time but never actually load? By using console.log() it seems as if the req.on("end") callback is never even called.
If you don't care about any request data, you could just add req.resume(); after you add your end event handler.
The reason it's getting "stuck" is that since node v0.10, streams start out in a paused state, so you need to unpause them by reading from them in some way. req.resume(); accomplishes this. Once there is nothing left in the request stream (and there may be nothing at all), the end event will be emitted.
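Applied to the server above, that is the same code with one line added after the end handler is registered:

var server = http.createServer(function(req, res) {
    req.on("end", function() {
        console.log("alive");
        var url = req.url;
        var i = url.indexOf("/", 1);
        request(url.substring(i + 1), function(err, ret, body) {
            res.writeHead(200);
            res.write(url.substring(1, i) + "(\"" + body + "\");");
            res.end();
        });
    });
    req.resume(); // drain the (possibly empty) request stream so 'end' can fire
});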
Basically, I have client-side JavaScript which sends POST requests (through jQuery), triggered by user interactions with the page, to my Node.js server. The Node.js server then handles the requests and updates content in the database.
For some reason, I am hitting a limit on the number of POSTs I can send to the server in a single page load. That maximum is 6. After 6 POSTs are sent from a page, I get these errors when trying to send any more requests:
EDIT:
These red errors are popping up in my JavaScript console after trying to send more than 6 requests:
send jquery-latest.js:8526
jQuery.extend.ajax jquery-latest.js:7978
jQuery.(anonymous function) jquery-latest.js:7614
haveLikedOrDislikedObject
(anonymous function) localhost:33
fire jquery-latest.js:1037
self.fireWith jquery-latest.js:1148
done jquery-latest.js:8074
callback
My code for sending the post: (Basically a listener is attached to numerous divs, and when it is clicked a post request is sent)
// Sets on click listener for like button of content
$(document).delegate("div[id^='likeDiv']", "click", function() {
    var el = this;
    $.getScript("public/javascripts/load_content.js", function() {
        haveLikedOrDislikedObject(0, $(el).attr('name'), theUser);
    });
});

function haveLikedOrDislikedObject(res, contentNumber, user) {
    if(user != undefined) {
        if(res == 0) {
            $.post("/likeContent", { content: contentNumber, user: user.UserID });
            $("#haveLikedDiv_" + contentNumber).text("You like this.");
        } else {
            $.post("/dislikeContent", { content: contentNumber, user: user.UserID });
            $("#haveLikedDiv_" + contentNumber).text("You dislike this.");
        }
    } else {
        $("#haveLikedDiv_" + contentNumber).text("Sorry, something went wrong.");
    }
};
Just wondering why I would be getting this limit? Also, any thoughts on how I can get around this, or other ways to send numerous things to my server from a single page?
SOLVED:
Turns out I was not sending anything back from my server, and I think this means each POST was left waiting for a response, leaving 6 open requests at once. So, make sure that you are sending something back from the server, even if it is undefined, like this:
app.post('/likeContent', function(req, res) {
    res.send(undefined);
});
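An explicitly empty response does the same job; with Express 4 you could also write it as (a minor variation, not from the original fix):

app.post('/likeContent', function(req, res) {
    // an empty 200 is enough to let the browser release the connection
    res.sendStatus(200);
});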