Reliable timeout in JavaScript

We're implementing an AngularJS application which displays questions to a user and counts the number of correct answers. The test is strictly limited to 20 minutes. But there are a few tricky requirements:
Accuracy
The error for the 20-minute timeout must not exceed 2 seconds, even on not-so-fast devices like Android 2.3 tablets or the iPad 2.
Tolerance of local time modifications
The timeout result must not be affected by changes to the computer's local time. We cannot kick the user out as a cheater when such a change is detected: it might just as easily be caused by an honest NTP update.
Progress tracking
The user must be shown the remaining time by a constantly ticking countdown, which must not accumulate error when the UI lags.
I've tried a few approaches which didn't work:
window.performance.now: hardly implemented anywhere (Mobile Safari is a requirement)
Server pingbacks: constant internet connection must not be required
Is it possible to implement all requirements at once?

First of all, you need a backend that does the counting for you. If you manage the counter entirely on the client side, reloading the page will kill the counter.
You can sync the counter with the backend initially and count down on the client side. The 20-minute expiry should be sent by the backend, perhaps using Server-Sent Events or Node.js.
By moving the real counter to the backend, you remove any possible impact from cheating or lag on the client side.
To make sure the counter stays correct, you could re-synchronize it with the backend at regular intervals.
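To make the idea concrete, here is a minimal sketch of that pattern, assuming two hypothetical endpoints (/api/exam/start and /api/exam/time) that return server timestamps in milliseconds; none of these names come from the answer above:

```javascript
// Hypothetical endpoints and field names, used only to illustrate the idea.
async function startExamCountdown(onTick, onExpired) {
  // Ask the backend when the exam ends (the authoritative 20-minute deadline).
  const res = await fetch('/api/exam/start');
  const { serverNow, endsAt } = await res.json();   // Unix timestamps in ms

  // Offset between the server clock and this device's clock at sync time.
  let clockOffset = serverNow - Date.now();

  // Re-sync the offset periodically so drift or a local clock change
  // cannot accumulate into a large error.
  const resync = setInterval(async () => {
    const r = await fetch('/api/exam/time');        // returns { serverNow }
    const data = await r.json();
    clockOffset = data.serverNow - Date.now();
  }, 60 * 1000);

  // Always recompute the remaining time from the deadline, never by
  // decrementing a counter, so a lagging UI adds no error.
  const tick = setInterval(() => {
    const remainingMs = endsAt - (Date.now() + clockOffset);
    if (remainingMs <= 0) {
      clearInterval(tick);
      clearInterval(resync);
      onExpired();
    } else {
      onTick(remainingMs);
    }
  }, 250);
}
```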

You should try using $interval for this.
Check out the bottom example on the $interval documentation page.
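For reference, a minimal sketch of what that can look like; the module, controller, and scope names are invented for the example, and this only covers the progress-tracking requirement. Computing the remaining time from a fixed deadline on every tick (rather than decrementing a counter) keeps the display from drifting when ticks are delayed.

```javascript
// Hypothetical AngularJS controller, shown only to illustrate $interval usage.
angular.module('examApp', []).controller('CountdownCtrl', function ($scope, $interval) {
  var deadline = Date.now() + 20 * 60 * 1000;   // 20 minutes from now

  var timer = $interval(function () {
    // Recompute from the deadline instead of decrementing, so a slow or
    // delayed digest cycle does not accumulate error in the displayed time.
    $scope.remainingMs = Math.max(0, deadline - Date.now());
    if ($scope.remainingMs === 0) {
      $interval.cancel(timer);
    }
  }, 500);

  $scope.$on('$destroy', function () {
    $interval.cancel(timer);                    // don't leak the interval
  });
});
```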

Related

Start timers on different devices at the exact same time

I am essentially trying to create a web app where one person can start a timer, and everyone else's timers (on different computers/phones) will start at the exact same time. I am currently using node.js and websockets. When the "master" timer hits start, the server uses websockets to tell all the devices to start. Since all the users should be on the same LAN, I thought I would not have to compensate for latency, but the timers are starting a few hundred milliseconds off of each other, and for my purposes, it is very noticeable. There isn't much delay between PCs, but mobile phones tend to be the most off of each other.
What would be the best way to get everything to start at the same exact time, within a margin of error of let's say, 50ms? I do not mind if the timers take a few extra seconds to start if the delay between them is within 50ms.
Send the clients a timestamp telling them when to start the timer.
Then the accuracy is tied to the accuracy of each device's system time.
If you can't ensure that the system time is accurate, another approach would be to measure the latency and add it as an offset.
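A rough sketch of that approach, assuming socket.io on both ends; the event names (time:ping, time:pong, timer:start) and the half-RTT latency estimate are illustrative, not taken from the answer:

```javascript
// Server (Node.js + socket.io). `io` is the socket.io server instance.
io.on('connection', function (socket) {
  socket.on('time:ping', function () {
    socket.emit('time:pong', Date.now());       // reply with the server clock
  });
  socket.on('timer:request-start', function () {
    // Schedule the start a few seconds ahead so every client receives
    // the message before the moment actually arrives.
    io.emit('timer:start', Date.now() + 3000);
  });
});

// Client (browser). `socket` is an already-connected socket.io client.
var offset = 0;                                 // serverTime - localTime
function syncClock() {
  var sentAt = Date.now();
  socket.emit('time:ping');
  socket.once('time:pong', function (serverTime) {
    var rtt = Date.now() - sentAt;
    offset = serverTime + rtt / 2 - Date.now(); // assume symmetric latency
  });
}
syncClock();

socket.on('timer:start', function (startAtServerTime) {
  var startAtLocal = startAtServerTime - offset;
  setTimeout(startTimer, Math.max(0, startAtLocal - Date.now()));
});

function startTimer() { /* placeholder: start the on-screen timer here */ }
```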

Synchronous countdown across users on Meteor

I am developing a multiplayer game (for scientific experiment) where participants engage in 20 rounds of interactive decision making. Each round has 3 stages, each should last maximum for 30 seconds. I wonder what would be a good way to implement the countdown.
Currently, I am using a client side approach. On the creation of the round template, I use client side timer that submits the answer of the participant when it reaches 0. This is working fine so far (because everyone starts the game at the exact same time, and the next round starts only after everyone has submitted an answer). I am not sure that this is a good way to do it, considering that participants might disconnect (go offline, close the browser, have connectivity issues) and might manipulate the sessions or something.
Would it be a better approach to do a server side timer? For instance, a collection that contains the timer, and participants subscribe to that collection? If so, how would one implement a server-side countdown? Also, would this approach cause high demand on the server, given that every second in the countdown (that we display in the template) would require listening to data on the server?
Never trust the client.
With that in mind, we need to find a way for the client to display the remaining time according to the time the server chose... First, the server stores the end time of a round when it is created (or the start time plus the duration).
Now that everyone has the same end time according to the server, we need to sync the clients with server time. Let's use mizzao:timesync; it's pretty straightforward: it receives the server time and computes the difference from the client time, monitors the client clock to make sure no weird changes occur, and even accounts for latency. This might be a bit more than what you need, but it's already done, so less work for us!
Now that we know the current server time and the round's end time, we can easily show how long we have remaining. If a player comes back after a disconnect or a refresh, both of those times will still be valid and they'll be able to continue the game.
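Sketched out on the client, that could look roughly like this; the Rounds collection and the endsAt field are placeholders, and TimeSync.serverTime() is the package's reactive estimate of the current server time:

```javascript
// Client-side Blaze helper. Each round document is assumed to store `endsAt`,
// a timestamp written by the server when the round is created.
Template.round.helpers({
  remainingSeconds() {
    const round = Rounds.findOne({ active: true });   // hypothetical collection/query
    if (!round) return 0;
    const serverNow = TimeSync.serverTime();          // reactive server-time estimate
    return Math.max(0, Math.round((round.endsAt - serverNow) / 1000));
  }
});
```

The display is only cosmetic, though: when an answer is submitted, the server should still compare its own clock against the stored end time before accepting it.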

Node.js scalable/performance capabilities with large number of users

I am in the middle of creating a node.js project however I am concerned as to the performance of the website once it is up and running.
I am expecting it to have a surge of maybe 2000 users for 4-5 hours over a period of one night per week.
The issue is that each user could be receiving a very small message once every second, e.g. a timer adjustment or a price change.
That is 2000 users × 60 seconds = 120,000 messages per minute in total.
Would this be possible? It would be extremely important that there is minimal lag, less than 1 second if possible.
Thanks for the help
You can certainly scale to that many users with socket.io, but the question is how you want to do it. It is unclear whether you have considered the cluster module, as that will significantly take the load off the single Node process for that number of users and reduce latency. Of course, when you do this you need to stop using the in-memory store that socket.io uses by default and use something like Redis instead, so that you don't end up with duplicate authentication handshakes. Socket.io even has a document explaining how to do this.
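As a very rough sketch of that setup (assuming socket.io 1.x, the socket.io-redis adapter, and Node's cluster module; the ports and intervals are placeholders):

```javascript
// One worker per CPU core; all workers share broadcasts through Redis, so an
// io.emit() in any worker reaches sockets connected to every other worker.
var cluster = require('cluster');
var os = require('os');

if (cluster.isMaster) {
  os.cpus().forEach(function () { cluster.fork(); });
} else {
  var io = require('socket.io')(3000 + cluster.worker.id);    // one port per worker here;
  var redisAdapter = require('socket.io-redis');               // in practice a sticky load
  io.adapter(redisAdapter({ host: 'localhost', port: 6379 })); // balancer sits in front

  setInterval(function () {
    io.emit('tick', { serverTime: Date.now() });               // the once-per-second message
  }, 1000);
}
```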
What you really need to do is test the performance of your application by creating 2000 clients and simulating 2000 users. The socket.io client can let you set this up in a node application (and then perhaps sign up for the free tier of an EC2 machine to run them all).
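A rough load-test sketch with socket.io-client; the URL, event name, and client count are placeholders, and the lag measurement assumes the test machine's clock is close to the server's:

```javascript
// Open N connections and log any broadcast that arrives more than a second late.
var ioClient = require('socket.io-client');
var N = 2000;

for (var i = 0; i < N; i++) {
  var socket = ioClient('http://localhost:3000', { forceNew: true });
  socket.on('tick', function (msg) {
    var lag = Date.now() - msg.serverTime;
    if (lag > 1000) console.log('late message: ' + lag + 'ms');
  });
}
```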
It's worth noting that I haven't actually seen v1.0 benchmarks, and really the socket.io developers should have a page dedicated to benchmarks, as this is always a common question among developers.

Synchronize Javascript Timer

I am working on an app that has a JavaScript interval timer. I would like the timer to run on a server, and then grab that time on multiple devices on click.
Ex: Countdown Timer to be displayed on a large screen in a gym (this is coming from the server). I would like users on mobile devices to be able to capture time and record it in real-time.
If the timer on the big screen said 10:35 when I push the button on my mobile device I want it to grab 10:35 and put it into an input field.
I have the timer code figured out. I just don't know how to get the time that is displayed on the server to the other devices.
I know that this is a very general question. I am just wondering if it is possible, and if someone can point me in the right direction.
Thanks!!!
If you are using JavaScript/Node.js, you can use WebSockets to synchronize your timers in real time.
Here is a link to a good tutorial :-) http://www.youtube.com/watch?v=pNKNYLv2BpQ
Hmmm. This is a particularly difficult problem given the intricacies of keeping time in sync across multiple devices.
Here's the architecture I'd suggest:
The server serves an HTML document to the clients that contains three timestamps: the start time, the end time, and the current server time. This could conceivably be fetched over AJAX.
From the current server time, the client can calculate its clock skew and display a countdown timer matching the server ("big board") using client-side JavaScript.
A button could then grab the value of a timer and populate the input field.
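A small client-side sketch of the last two steps (the skew calculation and the capture button); the element IDs and the shape of the fetched payload are assumptions for the example:

```javascript
// Values returned by the server (step 1), e.g. embedded in the page or fetched
// over AJAX. The numbers below are placeholders.
var payload = { startAt: 1500000000000, endAt: 1500000600000, serverNow: 1500000300000 };

// Skew between the server clock and this device's clock, captured once on load.
var skew = payload.serverNow - Date.now();

function remainingMs() {
  var serverTimeNow = Date.now() + skew;
  return Math.max(0, payload.endAt - serverTimeNow);
}

function formatTime(ms) {
  var totalSec = Math.floor(ms / 1000);
  var min = Math.floor(totalSec / 60);
  var sec = totalSec % 60;
  return min + ':' + (sec < 10 ? '0' : '') + sec;
}

// Ticking display that matches the big board.
setInterval(function () {
  document.getElementById('clock').textContent = formatTime(remainingMs());
}, 250);

// The button freezes the currently displayed value into the input field.
document.getElementById('capture').addEventListener('click', function () {
  document.getElementById('result').value = formatTime(remainingMs());
});
```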
Here's a pretty similar question.

Why setting a client-side timeout when using long polling?

In almost every long polling example I see, there is something like a 30-second timeout on the client side. What is the precise reason for this?
Some routers, proxies, or other devices in the middle might decide to drop TCP/IP connections that have been idle for an extended period of time. Also, refreshing the connection once in a while makes sure you'll discover a server failure more quickly.
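For illustration, a minimal long-poll loop with a ~30-second client-side timeout; the /poll endpoint and handleUpdate are placeholders:

```javascript
function longPoll() {
  var controller = new AbortController();
  // Give up after ~30 s so idle connections are refreshed before some
  // intermediary silently drops them, and so a dead server is noticed quickly.
  var timeout = setTimeout(function () { controller.abort(); }, 30000);

  fetch('/poll', { signal: controller.signal })
    .then(function (res) { return res.json(); })
    .then(function (data) { handleUpdate(data); })     // placeholder handler
    .catch(function () { /* timed out or failed; just reconnect */ })
    .finally(function () {
      clearTimeout(timeout);
      longPoll();                                      // reopen immediately
      // (a real client would back off after repeated failures)
    });
}
longPoll();
```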
