Hub.start - wait for finish - javascript

I'm using SignalR to transfer commands from the client to the server without refreshing the page. When the client enters one of my web pages, I start a new hub connection, like this:
var hub = $.connection.siteControllerHub;
$.connection.hub.start();
This "start()" function takes some time (+-5 seconds). mean while, the page is already finished loading and the user start using my UI. SingalR cannot serve the user, until it's finish loading the connection.
I'm know that I'm can use the async approach with the done() register:
$.connection.hub.start().done(function () {
    // On finish loading...
});
But this kind of operation is not suitable for me, since if I use it I need to disable the UI until the event fires, and that is not cool at all.
I would prefer that the page take longer to load, but once it has loaded, everything is ready for use.
What do you think? How do you recommend to implement it?
Thank you.

5 seconds is not normal. In any case, you can queue the messages and, when done is called, take the queued messages and send them to the server. Look here for an example:
https://github.com/AndersMalmgren/SignalR.EventAggregatorProxy/blob/aa239a7bb9d79346cacd16ea1ee97946b2d5d44b/SignalR.EventAggregatorProxy.Client.JS/signalR.eventAggregator.js#L165
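For illustration, here is a minimal sketch of that queue-until-connected idea, using the hub from the question; the server method name sendCommand is a hypothetical placeholder, not part of the original code.
var hub = $.connection.siteControllerHub;
var queue = [];
var connected = false;

// Callers use this wrapper instead of talking to the hub directly.
function sendCommand(command) {
    if (connected) {
        hub.server.sendCommand(command); // sendCommand is an assumed server method
    } else {
        queue.push(command); // buffer until the connection is ready
    }
}

$.connection.hub.start().done(function () {
    connected = true;
    queue.forEach(function (command) {
        hub.server.sendCommand(command); // flush everything queued while connecting
    });
    queue = [];
});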

Related

How to make client wait for server to finish running a script?

I have a web client that gets arguments from a user and submits them to the server with a simple HTML submit form.
The server runs a script that can sometimes take several minutes to finish.
When the script runs for more than a couple of minutes, the client re-submits the form instead of waiting for the first submit to finish.
How can I prevent that from happening?
Example code:
Submit form:
<form action="/testCall">
<button type="submit" name="Button">Submit</button>
</form>
Calling command on server:
app.get('/testCall', function (req, res) {
    var exec = require('child_process').exec;
    var cmd = "myCommand";
    exec(cmd, function callback(error, stdout, stderr) {
        res.render('newPage.html', { data: stdout });
    });
});
In this case myCommand will run for a long time and the client will attempt to resubmit the form.
That's not the way it's usually done. A problem you don't mention is that some layer between the user and your backend code may signal a timeout at some point; form submits really shouldn't take minutes.
It's more usual to start an asynchronous task on the server, using whatever technology you are using on the server side (apparently Node, though I don't know how it's done in Node), and then immediately return a page to the user without the form, showing instead some indication that the work is now under way.
Every now and then the page can ask the server whether the task has finished and then refresh itself with the result of the script, or perhaps the user can do that by hand.
Also, this script is started by a GET request. Please don't do that; GET requests aren't meant to change anything on the server. If you use a POST instead, then at least the browser will show a warning when users mindlessly re-submit with F5 (though users will also mindlessly ignore that warning).
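For illustration, a minimal sketch of that start-the-task-then-poll pattern, assuming Express; the status route, the pending.html template, and the in-memory jobs object are made up for the example, not part of the original code.
var crypto = require('crypto');
var exec = require('child_process').exec;
var jobs = {}; // jobId -> { done: bool, output: string }

app.post('/testCall', function (req, res) {
    var jobId = crypto.randomBytes(8).toString('hex');
    jobs[jobId] = { done: false, output: null };
    exec("myCommand", function (error, stdout, stderr) {
        jobs[jobId] = { done: true, output: stdout }; // record the result when the script ends
    });
    // Return immediately; the page polls the status route until the job is done.
    res.render('pending.html', { jobId: jobId });
});

app.get('/testCall/status/:jobId', function (req, res) {
    res.json(jobs[req.params.jobId] || { done: false, output: null });
});
The pending page can then request /testCall/status/<jobId> every few seconds, or the user can refresh it by hand, as described above.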
Although the advice by RemcoGerlich was really good, my problem was that Node.js timed out requests that ran for too long (more than two minutes).
To resolve that, you have to disable the response timeout in the Node.js backend code, with something like this:
app.post('/testCall', function (req, res) {
    var date = new Date(Date.now()).toLocaleString();
    res.setTimeout(0); // disable the default two-minute response timeout
    // ...run the command and render the result as before...
});
Took me a while to find that, hope it helps somebody out there!

Isolate setInterval for each thread/user?

I'm using Node.js, and one of the reasons why I switched from a PHP socket server to Node.js is the threading ability. (Essentially, I wanted the monsters in my game server to auto-attack players.)
Let's say in my server.js file for Node I put:
setInterval(function(){
console.log('Hello');
}, 1000);
I log in and authenticate my character in one browser, then look at the console and I can see 'Hello' being output every second. That's fine, but then I load up a new browser, authenticate another user, and look at the console again: it's actually outputting twice as fast, which is not really the correct way to do this, right?
Edit: I'm using https://github.com/websockets/ws, and the setInterval call is just inside the message handler:
socket.on('message', function (Message, flags) {
    // ...game server authentication, MySQL, etc...
    setInterval(function () {
        console.log('Hello');
    }, 1000);
});
Hope this helps, sorry for not being specific enough.
Your script is run for each user (since that is the server). You can of course listen and emit to a specific user: you need to generate an emit for each one, or write the emit in such a way that it sends data only to the desired clients.
This may help you: socket.io and node.js to send message to particular client
Edit after comment:
No, the script will be run for each user, so you start an interval for each. If you want to start only one, you can:
1. Name your interval and, if it is already defined, don't start it again (see the sketch below).
2. Start the interval in a separate script that you run from the console or something like that, so it is never accessed by clients.
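For illustration, a minimal sketch of option 1, reusing the ws handler from the question; the variable name helloInterval is made up.
var helloInterval = null; // shared across all connections

socket.on('message', function (Message, flags) {
    // ...game server authentication, MySQL, etc...
    if (helloInterval === null) { // start the loop only once, no matter how many users log in
        helloInterval = setInterval(function () {
            console.log('Hello');
        }, 1000);
    }
});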

Javascript infinite loop to update entities

I am writing a game in JavaScript, and on the server that handles the world/map services I also need to add a command that updates all entities.
Let us say that an entity/monster is moving; this means that a constant stream of updates is sent to all connected clients.
If I do something like
while (true) {
    sendToAllConnectedClientsNearToThisMonster(data);
    items.forEach(function (item) {
        checkIfItemHasNotExpiredYet(item);
        deleteItemFromWorldIfExpired();
    });
}
But at the same time, the same service is doing other stuff, like handling the packets coming in and out, encrypting and decrypting packets, routing packets, forwarding chat packets to the chat server, etc.
Will this not block my node.js server? What is the proper way of handling such tasks?
Use setInterval; it executes your function every X milliseconds (250 in this example). This way you are not blocking your server. Since Node.js is single-threaded, you should always follow the law of turns: never wait, never block, and finish fast!
Here is your pseudo code wrapped in setInterval:
setInterval(function () {
    sendToAllConnectedClientsNearToThisMonster(data);
    items.forEach(function (item) {
        checkIfItemHasNotExpiredYet(item);
        deleteItemFromWorldIfExpired();
    });
}, 250);
http://nodejs.org/api/timers.html#timers_setinterval_callback_delay_arg

How to update a chat window with new messages

setInterval(function () {
    // send AJAX request and update chat window
}, 1000);
Is there any better way to update the chat with new messages? Is this the right way to do it, using setInterval?
There are two major options (or rather, two popular approaches).
Pulling (polling)
First is pulling, also known as polling; this is what you are doing. Every x (milli)seconds you check whether anything has changed on the server.
This is the HTML4 way (excluding Flash etc., so HTML/JS only). For PHP it is not the best way, because a single user makes a lot of connections per minute (in your example code, at least 60 connections per minute).
It is also recommended to wait for the response before sending the next request. If, for example, you request an update every second but the response takes 2 seconds, you are hammering your server. See tymeJV's answer for more info.
Pushing
Next is pushing. This is more the HTML5 way, implemented with WebSockets. What happens is that the client "listens" on a connection and waits to be updated; when an update arrives, it triggers an event.
This is not great to implement in PHP because you need a constant connection, and your server will be overrun in no time, since PHP can't push connections to the background (like Java can, if I am correct).
I personally made a small chat app and used Pusher. It works perfectly. I only used the free version, so I don't know how expensive it is.
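For illustration, a minimal sketch of the push approach in Node using the ws module mentioned in an earlier question; the port and the broadcast-to-everyone behaviour are assumptions for the example.
// Server: broadcast every incoming chat message to all connected clients.
var WebSocketServer = require('ws').Server;
var wss = new WebSocketServer({ port: 8080 }); // port chosen arbitrarily

wss.on('connection', function (socket) {
    socket.on('message', function (message) {
        wss.clients.forEach(function (client) {
            client.send(message); // push the message out; no polling needed
        });
    });
});

// Browser: update the chat window whenever the server pushes a message.
var chatSocket = new WebSocket('ws://localhost:8080');
chatSocket.onmessage = function (event) {
    // append event.data to the chat window here
};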
Pretty much, yes, with one minor tweak: rather than encapsulating an AJAX call inside an interval (which could result in a pile-up of unreturned requests if something goes bad on the server), you should put a setTimeout into the AJAX callback to create a recursive call. Consider:
function callAjax() {
$.ajax(options).done(function() {
//do your response
setTimeout(callAjax, 2000);
});
}
callAjax();

How would you create an auto-updating newsfeed without a reload?

How would I go about creating an auto-updating newsfeed? I was going to use Node.js, but someone told me that wouldn't work once I got into the thousands of users. Right now, I have it so that you can post text to the newsfeed, and it is saved into a MySQL database. Then, whenever you load the page, it displays all the posts from that database. The problem with this is that you have to reload the page every time there is an update. I was going to use this to tell the Node.js server that someone posted an update:
index.html
function sendPost(name,cont) {
socket.emit("newPost", name, cont);
}
app.js
socket.on("newPost", function (name,cont) {
/* Adding the post to a database
* Then calling an event to say a new post was created
* and emit a new signal with the new data */
});
But that won't work for a ton of people. Does anyone have any suggestions for where I should start, and which APIs and/or programs I would need to use?
You're on the right track. Build a route on your Node web server that causes it to fetch a news post and broadcast it to all connected clients. Then just fire the request to Node.
On the Node-to-client front, you'll need to learn how to do long polling. It's rather easy: you let a client connect and do not end the response until a message goes through to it. You handle this through event handlers (Postal.JS is worth picking up for this).
The AJAX part is straightforward: $.get("your/node/url").then(function(d) { }); works out of the box. When it comes back (either success or failure), relaunch it. Set its timeout to 60 seconds or so, and end the response on the Node side the moment an event targets it.
This is how most sites do it. The problem with websockets is that, right now, they're a bit of a black sheep because old IE versions don't support them. Consider long polling instead if you can afford it.
(Psst. Whoever told you that Node wouldn't work with thousands of users is talking through their ass. If anything, Node is better suited to large concurrency than PHP, because its event-driven nature means a connection costs almost nothing to keep alive. Don't listen to naysayers.)
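For illustration, here is a minimal long-polling sketch along the lines described above, assuming Express with body-parser and jQuery on the client; the route names and the EventEmitter-based broadcast are assumptions for the example, not the asker's code.
var express = require('express');
var bodyParser = require('body-parser');
var EventEmitter = require('events').EventEmitter;

var app = express();
app.use(bodyParser.json());

var feed = new EventEmitter();
feed.setMaxListeners(0); // one parked listener per connected client

// Each client parks a GET here; it is answered when a post arrives,
// or after ~60 seconds so the client relaunches the poll.
app.get('/poll', function (req, res) {
    var onPost = function (post) {
        clearTimeout(timer);
        res.json(post);
    };
    var timer = setTimeout(function () {
        feed.removeListener('post', onPost);
        res.status(204).end(); // nothing new; the client will poll again
    }, 60000);
    feed.once('post', onPost);
});

// Saving a new post wakes every parked request.
app.post('/post', function (req, res) {
    // In the real app this is also where the post would be written to MySQL.
    feed.emit('post', { name: req.body.name, cont: req.body.cont });
    res.end();
});

app.listen(3000);
On the client, the matching loop is simply a $.get('/poll') whose callback renders any returned post and then immediately issues the next request, which is the relaunch-on-return behaviour described above.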
