I am writing a game in JavaScript, and on the server that handles the world/map services I also need to add a command that updates all entities.
Let's say an entity/monster is moving; this means a constant stream of updates is sent to all connected clients.
If I do something like
while (true) {
    sendToAllConnectedClientsNearToThisMonster(data);
    items.forEach(function (item) {
        checkIfItemHasNotExpiredYet(item);
        deleteItemFromWorldIfExpired(item);
    });
}
But at the same time, the same service is doing other things: handling the packets coming in and out, encrypting/decrypting packets, routing packets, forwarding chat packets to the chat server, etc.
Will this not block my node.js server? What is the proper way of handling such tasks?
Use setInterval: it executes your function every X milliseconds (250 in this example). This way you are not blocking your server. Since Node.js is single-threaded, you should always follow the law of turns: Never wait. Never block. And finish fast!
Here is your pseudo code wrapped in setInterval:
setInterval(function () {
    sendToAllConnectedClientsNearToThisMonster(data);
    items.forEach(function (item) {
        checkIfItemHasNotExpiredYet(item);
        deleteItemFromWorldIfExpired(item);
    });
}, 250);
http://nodejs.org/api/timers.html#timers_setinterval_callback_delay_arg
I'm trying to limit the connection time of a socket.io connection on a Node.js server. I asked a previous question as to whether this was possible without causing huge overhead on the server and/or blocking the main thread if we had, say, 1000 concurrent socket connections in various rooms, through something like:
socket.on('connection', function (params) {
    var maxTime = params.maxTime;
    socket.join(params.roomId);
    setTimeout(function () {
        socket.leave(params.roomId);
    }, 180000);
});
The best-case scenario would be to handle this on the client side from a resources perspective, but it isn't exactly secure to send the timeout/disconnection value, as any client-side code that dealt with it could be easily manipulated, and a knowing user could in effect prevent the disconnect event/functionality from being called.
Could I execute, on the client side, a function sent from the server as a string? Say:
setTimeout(function () { /* disconnect */ }, 18000);
socket.emit('timeout_set', { foo: bar });
Then handle appropriately on the server with a response knowing that the timeout has indeed been set:
socket.on('timeout_set', function (params) {
    socket.emit('proceed_with_stuff', { foo: bar }); // includes critical info for proceeding
});
I'm thinking this depends on a few things:
Can you take a string from a server response and execute said string as JS?
Can a client still disrupt the setTimeout function without also triggering the socket.disconnect event?
Is this logic or anything similar possible?
Would the first scenario work on a node.js server given a number of concurrent connections?
Use the Function constructor; see https://developer.mozilla.org/de/docs/Web/JavaScript/Reference/Global_Objects/Function
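A minimal sketch of that approach, assuming the code arrives from the server as a string and that a socket.io socket object is in scope on the client (both are assumptions, not taken from the question):

// Code string received from the server (hypothetical payload).
var code = "setTimeout(function () { socket.disconnect(); }, 180000);";

// Compile the string into a function that takes `socket` as a parameter.
var run = new Function('socket', code);

// Execute it, handing it the client's socket.
run(socket);

Keep in mind that, as the question itself notes, anything executed on the client can be tampered with, so the server should still enforce the timeout on its side.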
I'm using Node.js, and one of the reasons why I switched from a PHP socket server to Node.js is the threading ability. (Essentially, I wanted my monsters in the game server to auto-attack players.)
Let's say in my server.js file for Node I put:
setInterval(function(){
console.log('Hello');
}, 1000);
When I log in and authenticate my character in one browser and then look at the console, I can see 'Hello' being output every second. That's fine, but then I load up a new browser, authenticate another user, and look at the console again: it's actually outputting twice as fast, which is not really the correct way to do this, right?
Edit: I'm using https://github.com/websockets/ws and the setInterval call sits just inside this handler:
socket.on('message', function (Message, flags) {
    // ~~~ gameserver authentication / mysql etc. ~~~
    setInterval(function () {
        console.log('Hello');
    }, 1000);
});
Hope this helps, sorry for not being specific enough.
Your handler code runs for each user (since this is the server). You can of course listen and emit to a specific user. You need to generate an emit for each one, or write the emit in such a way that it sends data only to the desired clients.
This may help you: socket.io and node.js to send message to particular client
Edit after comment:
No, the handler will be run for each user, so you start an interval for each one. If you want to start only one, you can:
1. Assign your interval to a variable and don't start it again if it is already defined (see the sketch below).
2. Start the interval in a separate script that you run from the console or something like that, so it is never accessed by clients.
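A minimal sketch of option 1, using a guard variable (the variable name is hypothetical) together with the ws handler from the question:

var helloInterval = null; // lives outside the handlers, so it is shared across connections

socket.on('message', function (Message, flags) {
    // ~~~ gameserver authentication / mysql etc. ~~~
    if (!helloInterval) {
        // Only the first authenticated connection starts the interval;
        // later connections see it is already defined and skip it.
        helloInterval = setInterval(function () {
            console.log('Hello');
        }, 1000);
    }
});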
I am using the Poco C++ libraries to set up a WebSocket server which clients can connect to and stream some data to their web interface. So I have a loop that continuously sends data, and I also want to listen for the client closing the connection by using the receiveFrame() function; apart from that, the client is totally passive and doesn't send any data at all. The problem is that receiveFrame() blocks the connection, which is not what I want. I basically want to check whether the client has not yet called the close() JavaScript function and stop streaming data if it has. I tried using
ws.setBlocking(false);
But now receiveFrame() throws an exception every time it is called. I also tried removing receiveFrame() entirely, which works if the connection is terminated by closing the browser, but if the client calls close(), the server still tries to send data to the client. So how can I pull this off? Is there a way to check whether there are client frames to be received and, if not, just continue?
You can repeatedly call Socket::select() (with a timeout) in a separate thread; when you detect a readable socket, call receiveFrame(). In spite of the misleading name, the Socket::select() call wraps epoll() or poll() on platforms where those are available.
You can also implement this in a somewhat more complicated but perhaps more elegant fashion with Poco::NotificationQueues, posting a notification every time a socket is readable and reading the data in the handler.
setBlocking() does not do what you would expect it to. Here's a little info on it:
http://www.scottklement.com/rpg/socktut/nonblocking.html
What you probably want to do is use setReceiveTimeout() on your socket to control how long it will wait before giving you back control. Then test the response and loop everything if needed. The Poco docs have more info on how to use that part of the API; just look up WebSockets.
I'm using SignalR to transfer commands from the client to the server without refreshing the page. When the client enters one of my web pages, I start a new hub connection, like this:
var hub = $.connection.siteControllerHub;
$.connection.hub.start();
This "start()" function takes some time (+-5 seconds). mean while, the page is already finished loading and the user start using my UI. SingalR cannot serve the user, until it's finish loading the connection.
I'm know that I'm can use the async approach with the done() register:
$.connection.siteControllerHub.start().done(function () {
// On finish loading...
});
But this kind of operation is not suitable for me, since if I use it I need to disable the UI until that event fires, and that is not cool at all.
I would prefer that loading the page takes longer, but when it's done, everything is ready for use.
What do you think? How do you recommend to implement it?
Thank you.
5 seconds is not normal. Anyway, you can queue the messages and, when done is called, take the queued messages and send them to the server. Look here for an example:
https://github.com/AndersMalmgren/SignalR.EventAggregatorProxy/blob/aa239a7bb9d79346cacd16ea1ee97946b2d5d44b/SignalR.EventAggregatorProxy.Client.JS/signalR.eventAggregator.js#L165
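A minimal sketch of that queuing idea, assuming a hypothetical invoke() helper and that you call your own hub methods through hub.server (the helper and the queue are not part of the linked library):

var hub = $.connection.siteControllerHub;
var queue = [];        // calls made before the connection is ready
var connected = false;

// Queue server calls until the hub connection is up, then send them directly.
function invoke(method, args) {
    if (connected) {
        hub.server[method].apply(hub.server, args);
    } else {
        queue.push({ method: method, args: args });
    }
}

$.connection.hub.start().done(function () {
    connected = true;
    // Flush everything the UI queued while the connection was starting.
    queue.forEach(function (call) {
        hub.server[call.method].apply(hub.server, call.args);
    });
    queue = [];
});

This way the UI can stay enabled: anything the user triggers before the connection is ready is simply sent as soon as done() fires.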
setInterval(function () {
    // send AJAX request and update chat window
}, 1000);
Is there any better way to update the chat with new messages? Is this the right way to update the chat using setInterval?
There are two major options (or rather, two popular approaches).
Pulling
First is pulling (polling): this is what you are doing. Every x (milli)seconds you check whether anything on the server has changed.
This is the HTML4 way (excluding Flash etc., so HTML/JS only). For PHP it is not the best approach, because a single user opens a lot of connections per minute (with your example code, at least 60 connections per minute).
It is also recommended to wait for the response before scheduling the next request. If, for example, you request an update every 1 second but the response takes 2 seconds, you are hammering your server. See tymeJV's answer for more info.
Pushing
Next is pushing. This is more the HTML5 way, implemented with WebSockets. What is happening is that the client is "listening" on a connection, waiting to be updated; when an update arrives, it triggers an event.
This is not great to implement in PHP because you need a constant connection, and your server will be overrun in no time since PHP can't push connections to the background (like Java can, if I am correct).
I personally made a small chat app and used Pusher. It works perfectly. I only used the free version, so I don't know how expensive it is.
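A minimal sketch of what the push approach looks like on the client (the endpoint URL, message shape, and appendToChatWindow() helper are all hypothetical, not from the answer above):

var ws = new WebSocket('wss://example.com/chat'); // assumed chat endpoint

ws.onmessage = function (event) {
    var message = JSON.parse(event.data); // assumed JSON payload from the server
    appendToChatWindow(message);          // hypothetical UI helper
};

Instead of the browser asking every second, the server sends a frame only when a new chat message exists, so there is no polling traffic at all.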
Pretty much, yes, with one minor tweak: rather than encapsulating an AJAX call inside an interval (which could result in a pile-up of unreturned requests if something goes bad on the server), you should put a setTimeout into the AJAX callback to create a recursive call. Consider:
function callAjax() {
    $.ajax(options).done(function () {
        // do your response
        setTimeout(callAjax, 2000);
    });
}
callAjax();