I am developing a bidding system on Google App Engine with Python. A product is opened for bidding at a predefined time and the bidding lasts for 3 minutes. During that 3-minute window the price drops by a constant amount every second until it reaches zero, and the bidder who presses the "Bid" button first wins. One critical issue in this application is that time should be synced across all users so that everybody sees the same price at the same moment. The following is what I have come up with so far, but it suffers from inaccurate timers.
When a user visits or refreshes the page, the Python code passes the server time to the JavaScript timer and the timer starts counting down. The simplified JavaScript is as follows:
var count = {{timeToBid}};      // 'timeToBid' is in seconds and is inserted by the template engine
var price = {{initialPrice}};   // The initial product price
var currentPrice;               // Current price

function updateTime() {
    var timeStr = ...;          // Set the time string to be HH:MM:SS
    $("#timeToBid").html(timeStr);          // Display time in the 'timeToBid' div
    if (count <= 180) {                     // Start bidding
        currentPrice = ...;                 // Calculate the current price
        $("#currentPrice").html(currentPrice);  // Update current price
    }
    count--;
    setTimeout(updateTime, 1000);           // Executed every second
}
updateTime();
The above JavaScript suffers significantly from unsynced timers across user browsers, as well as unsynced server times (see this). The current price displayed in each browser can vary over a huge range, which makes the system totally unusable. This is a rather complicated problem and I am asking for good solutions. One or two seconds of inaccuracy is acceptable. My questions:
(a) To make the JavaScript timer more accurate, I plan to use AJAX to retrieve the server time every 10 seconds and then update 'count' (see the sketch after the snippet below). Is this a good solution? Or are there other options?
$.ajax({
    url: "/getServerTime",
    cache: false,
    success: function(serverCount) {
        count = parseInt(serverCount);
        return false;
    }
});
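Wrapped in an interval, the 10-second resync would look roughly like this (a sketch reusing the endpoint and the 'count' variable above, not tested):

setInterval(function() {
    $.ajax({
        url: "/getServerTime",
        cache: false,
        success: function(serverCount) {
            count = parseInt(serverCount, 10); // overwrite the local countdown with the server's value
        }
    });
}, 10000); // resync every 10 seconds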
(b) I have absolutely no idea how to solve the server time inconsistency problem. Can anybody help? Thanks.
System clocks on App Engine servers are kept in sync: see this authoritative answer.
Every App Engine HTTP response has a Date header, for example Date: Wed, 24 Apr 2013 06:03:37 GMT. You can use this to sync the browser clock with server time.
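A minimal sketch of that idea (assuming jQuery is already loaded and reusing the question's /getServerTime endpoint; any cache-busted same-origin request would do):

var clockOffsetMs = 0; // estimated serverTime - clientTime

function syncClock() {
    var requestStart = Date.now();
    $.ajax({
        url: "/getServerTime",
        cache: false,
        success: function(data, textStatus, xhr) {
            var roundTrip = Date.now() - requestStart;
            // The Date header only has one-second resolution; adding half the
            // round trip is a rough correction for network latency.
            var serverTime = new Date(xhr.getResponseHeader("Date")).getTime();
            clockOffsetMs = (serverTime + roundTrip / 2) - Date.now();
        }
    });
}

// Wherever the countdown needs "now", use the corrected clock:
function serverNow() {
    return Date.now() + clockOffsetMs;
}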
I got this code over here:
var date = new Date();

setTimeout(function(e) {
    var currentDate = new Date();
    if (currentDate - date >= 1000) {
        console.log(currentDate, date);
        console.log(currentDate - date);
    } else {
        console.log("It was less than a second!");
        console.log(currentDate - date);
    }
}, 1000);
On my computer, it always executes correctly, logging 1000 to the console. Interestingly, on another computer running the same code, the timeout callback starts in less than a second and the difference currentDate - date is between 980 and 998.
I know of libraries that address this inaccuracy (for example, Tock).
Basically, my question is: why does setTimeout not fire after the given delay? Could it be that the computer is too slow and the browser automatically tries to adapt to the slowness and fires the event early?
PS: Here is a screenshot of the code and the results executed in the Chrome JavaScript console:
It's not supposed to be particularly accurate. There are a number of factors limiting how soon the browser can execute the code; quoting from MDN:
In addition to "clamping", the timeout can also fire later when the page (or the OS/browser itself) is busy with other tasks.
In other words, the way setTimeout is usually implemented, it is just meant to execute after a given delay, once the browser's thread is free to execute it.
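You can see this for yourself with a quick experiment (a sketch; run it in any browser console): blocking the main thread keeps the callback from firing anywhere near its nominal delay.

var scheduled = Date.now();
setTimeout(function() {
    // With the busy-wait below, this typically logs roughly 500 ms rather than 100 ms.
    console.log("fired after", Date.now() - scheduled, "ms");
}, 100);

// Busy-wait for ~500 ms so the thread is not free when the timer comes due.
var start = Date.now();
while (Date.now() - start < 500) { /* spin */ }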
However, different browsers may implement it in different ways. Here are some tests I did:
var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    console.log(currentDate - date);
}, 1000);

// Browser    Test1   Test2   Test3   Test4
// Chrome       998    1014     998     998
// Firefox     1000    1001    1047    1000
// IE 11       1006    1013    1007    1005
Perhaps the < 1000 times from Chrome could be attributed to inaccuracy in the Date type, or perhaps Chrome uses a different strategy for deciding when to execute the code; maybe it's trying to fit it into the nearest available time slot, even if the timeout delay hasn't quite completed yet.
In short, you shouldn't use setTimeout if you expect reliable, consistent, millisecond-scale timing.
In general, computer programs are highly unreliable when they try to execute anything with better than roughly 50 ms precision. The reason is that even on an octa-core, hyperthreaded processor the OS is usually juggling several hundred processes and threads, sometimes thousands or more. The OS makes all that multitasking work by scheduling each of them for a slice of CPU time one after another, which means each one gets at most a few milliseconds at a time to do its thing.
Implicitly this means that if you set a timeout for 1000 ms, chances are far from small that the browser process won't even be running at that exact point in time, so it's perfectly normal for the browser not to notice until 1005, 1010 or even 1050 milliseconds have passed that it should be executing the given callback.
Usually this is not a problem; it happens, and it's rarely of utmost importance. If it is, all operating systems supply kernel-level timers that are far more precise than 1 ms and allow a developer to execute code at precisely the right point in time. JavaScript, however, as a heavily sandboxed environment, doesn't have access to kernel objects like that, and browsers refrain from using them since it could theoretically allow someone to attack OS stability from inside a web page, by carefully constructing code that starves other threads by swamping them with a lot of dangerous timers.
As for why the test yields 980 I'm not sure - that would depend on exactly which browser and JavaScript engine you're using. I can, however, fully understand it if the browser manually corrects a bit downwards for system load and/or speed, ensuring that "on average the delay is still about the correct time" - it would make a lot of sense from the sandboxing principle to just approximate the amount of time required without potentially burdening the rest of the system.
Someone please correct me if I am misinterpreting this information:
According to a post from John Resig regarding the inaccuracy of performance tests across platforms (emphasis mine):
With the system times constantly being rounded down to the last queried time (each about 15 ms apart) the quality of performance results is seriously compromised.
So there is up to a 15 ms fudge on either end when comparing to the system time.
I had a similar experience.
I was using something like this:
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000));
setTimeout(function () {
    CountDownClock(ElementID, RelativeTime);
}, iMillSecondsTillNextWholeSecond); // Wait until the next whole second to start.
I noticed it would skip a second every couple of seconds, sometimes it would go longer without skipping.
However, I'd still catch it skipping after 10 or 20 seconds, and it just looked rickety.
I thought, "Maybe the timeout is too slow or waiting for something else?"
Then I realized, "Maybe it's too fast, and the timers the browser is managing are off by a few milliseconds?"
After adding +1 millisecond to my variable I only saw it skip once.
I ended up adding +50 ms, just to be on the safe side.
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000) + 50);
I know it's a bit hacky, but my timer is running smoothly now. :)
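For reference, the same idea written as a repeating loop (a sketch, not the poster's exact code; CountDownClock, ElementID and RelativeTime are the symbols used above). Re-aligning to the next whole second on every tick keeps the error from accumulating:

function scheduleNextTick() {
    // Recompute the delay each time, plus the 50 ms safety margin from above.
    var msToNextWholeSecond = 1000 - (new Date().getTime() % 1000) + 50;
    setTimeout(function () {
        CountDownClock(ElementID, RelativeTime);
        scheduleNextTick(); // re-align every tick so drift doesn't accumulate
    }, msToNextWholeSecond);
}
scheduleNextTick();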
JavaScript has a way of dealing with exact time frames. Here's one approach: save Date.now() when you start waiting, create an interval with a short update period (a low number of ms), and calculate the difference between the two timestamps.
Example:
const startDate = Date.now()
const intervalId = setInterval(() => {
    const currentDate = Date.now()
    if (currentDate - startDate >= 1000) {
        // at least a second has elapsed
        clearInterval(intervalId)
        return
    }
    // less than a second has elapsed so far
}, 50)
I am implementing a live clock in the front end of a large application. For that I have come up with the following approach:
JavaScript
var span = document.getElementById('span');

function time() {
    var d = new Date();
    var s = d.getSeconds();
    var m = d.getMinutes();
    var h = d.getHours();
    span.textContent = h + ":" + m + ":" + s;
}

setInterval(time, 1000);
HTML
<span id="span"></span>
This approach works perfectly fine in isolation, but when the code is integrated into the application, which has several events and function calls, the clock starts lagging by a few minutes after, say, a couple of hours, until the page is refreshed.
I think this delay occurs because setInterval is a web (browser) API and is handled asynchronously: it may not execute exactly every 1 second as written if the call stack is not empty when the callback is due, because of other function calls and events queued in the event loop.
So if the page is not refreshed for a long time, the delay continues to grow. The application is also written in Angular, a single-page application where the page never reloads on navigation because of routing, unless it is forcefully refreshed.
So how do I build a precise clock in JavaScript that never drifts when integrated into a large application?
Update: Thanks everyone for the responses. I think some of you are right; I may be missing some details. I was actually implementing this a few days back at work but had to leave it for some reason and lost track of it. There was definitely some delay issue when working with Date and timers, though. It suddenly came to mind again, so I thought I'd ask here. Sorry for not providing concrete details.
I will try to recollect the details and update the question accordingly if possible.
the clock starts lagging by few minutes after say couple of hours until the page is refreshed.
That's impossible with the code you've shown; new Date should return the correct time no matter how often you reflect its value onto the page.
So if the page is not refreshed for long time the delay continues to grow.
Most browsers today will adjust the timers slightly so that they are quite accurate on average (e.g. if one timer gets called 1 ms too late, the next will be called 1 ms earlier), so you can only cause a drift over a longer time if you leave the page, which pauses the timer. That still shouldn't affect new Date, though.
Have a look at the Chromium source
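A quick way to check this yourself (a rough sketch, not from the answer): log how far each nominal one-second tick drifts from the wall clock; with the tab focused the numbers should stay small rather than keep growing.

var expected = Date.now() + 1000;
setInterval(function () {
    // Positive values mean the tick fired late, negative values mean it fired early.
    console.log("drift:", Date.now() - expected, "ms");
    expected += 1000;
}, 1000);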
Timers in web browsers get dialled back when the page doesn't have focus. You can't change or prevent that. You're already doing the most important thing: using the current time to update the clock, so that even if your time function isn't called for three seconds, when it does run it updates with the then-current time, skipping over the intermediate values. (You often see people assume the timer will fire at exactly 1000 ms intervals and just add to the seconds value rather than reading the current time, which is incorrect.)
If I were doing this, I'd probably decrease the interval (run the callback more often) and use a chained series of setTimeout rather than a single setInterval, not least because different browsers have historically handled setInterval in different ways.
So for instance:
function time() {
    var d = new Date();
    var s = d.getSeconds();
    var m = d.getMinutes();
    var h = d.getHours();
    span.textContent = h + ":" + m + ":" + s;
    setTimeout(time, 250);
}
time();
But if the page is inactive, the clock will get out of date, because the browser will either completely suspend timer execution or at least markedly dial it back. When the page becomes active again, though, hopefully it'll correct itself after no more than 250ms.
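If that brief stale period matters, one optional addition (not part of the answer above, just a sketch) is to repaint as soon as the tab regains focus using the visibilitychange event. This only refreshes the display; the setTimeout chain started by time() keeps running on its own.

document.addEventListener("visibilitychange", function () {
    if (document.visibilityState === "visible") {
        // Re-read the current time and update the display immediately.
        var d = new Date();
        span.textContent = d.getHours() + ":" + d.getMinutes() + ":" + d.getSeconds();
    }
});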
I have created a script in JavaScript that is injected into our Ext JS application during automated browser testing. The script measures the amount of time taken to load the data in our grids.
Specifically, the script polls each grid, looks to see if there is a first row or a 'no data' message, and once all grids have satisfied this condition the script records the value between Date.now() and performance.timing.fetchStart, and treats this as the time the page took to load.
This script works more or less as expected; however, when compared with human-measured timings (Google stopwatch ftw), the time reported by this test is consistently around 300 milliseconds longer than when measured by stopwatch.
My questions are these:
Is there a hole in this logic that would lead to incorrect results?
Are there any alternative and accurate ways to achieve this measurement?
The script is as follows:
function loadPoll() {
    var i, duration,
        dataRow = '.firstRow', noDataRow = '.noData',
        grids = ['.grid1', '.grid2', '.grid3', '.grid4', '.grid5', '.grid6', '.grid7'];
    for (i = 0; i < grids.length; ++i) {
        var data = grids[i] + ' ' + dataRow,
            noData = grids[i] + ' ' + noDataRow;
        if (!(document.querySelector(data) || document.querySelector(noData))) {
            window.setTimeout(loadPoll, 100);
            return;
        }
    }
    duration = Date.now() - performance.timing.fetchStart;
    window.loadTime = duration;
}
loadPoll();
Some considerations:
Although I am aware that human response time can be slow, I am sure that the 300 millisecond inconsistency is not introduced by the human factor of using Google stopwatch.
Looking at the code it might appear that the polling of multiple elements could lead to the 300 ms inconsistency, however when I change the number of elements being monitored from 7 to 1, there still appears to be a 300 ms surplus in the time reported by the automated test.
Our automated tests are executed in a framework controlled by Selenium and Protractor.
Thanks in advance if you are able to provide any insight to this!
If you use performance.now() the time should be accurate to 5 microseconds. According to MDN:
The performance.now() method returns a DOMHighResTimeStamp, measured in milliseconds, accurate to five thousandths of a millisecond (5 microseconds).
The returned value represents the time elapsed since the time origin (the PerformanceTiming.navigationStart property).
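Applied to the script above, that would mean replacing the Date-based subtraction with a single call (a sketch; performance.now() counts from navigationStart, which is normally within a few milliseconds of fetchStart):

// Inside loadPoll(), once every grid has either a data row or a 'no data' row:
var duration = performance.now();   // ms since navigationStart, sub-millisecond resolution
window.loadTime = Math.round(duration);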
If I were you, I would revise how the actual time measurement is captured. Rather than evaluating the time for each loadPoll() call, you can evaluate how many calls you can perform in a given period of time. In other words, count the number of function iterations over a longer period, e.g. 1000 milliseconds. Here's how this can be done:
var timeout = 1000;
var startTime = new Date().getTime();
var elapsedTime = 0;
for (var iterations = 0; elapsedTime < timeout; iterations++) {
    loadPoll();
    elapsedTime = new Date().getTime() - startTime;
}
// output the number of achieved iterations
console.log(iterations);
This approach will give you more consistent and accurate time estimates. Faster systems will simply achieve a greater number of iterations. Keep in mind that setInterval()/setTimeout() are not perfectly precise and for really small interval timers these functions may give you invalid results due to garbage collection, demands from events and many other things that can run in parallel while your code is being executed.
At our office we check in by opening a web page and clicking a check-in button. The following function is called when the button is clicked:
function checkInOutSubmit(thisController, thisAction, checkName) {
    var visitortime = new Date();
    var visitortimezone = "GMT " + -visitortime.getTimezoneOffset() / 60;
    var timeZone = jstz.determine_timezone();
    var timeZoneName = timeZone.name();
    var checkInCheckOut = checkName;
    jQuery.ajax({
        type: 'POST',
        data: { checkInCheckOut: checkInCheckOut, currentController: thisController, currentAction: thisAction, timez: timeZoneName },
        url: '/pms/attendanceForgot/checkInCheckOut',
        success: function (data, textStatus) { jQuery('#successdiv').html(data); successCheckInOut(); },
        error: function (XMLHttpRequest, textStatus, errorThrown) {}
    });
}
But I want to post an old time when clicking the button, not the current time. (If I click the button at 11:00 am, I want to post 10:00 am as my check-in time.)
How can this be done?
There is not enough information here to answer your question.
All this code is doing is finding the current time zone, not the current time. It passes that to the server via an ajax request, which makes me think the time is generated server side. It's possible you could alter the logged time on the server by changing the timezone to an offset that would make it look like you are clocking in at the right time, but it would have to be some seriously deficient code on the server for that to work.
In all likelihood, the server is storing the time of the request in universal time as the clock-in time, and when you leave it stores that time in universal time as well (think of a point in time that isn't dependent on time zones). If your goal is to get more hours, you'll just have to work later when you come in late. If you want it to look like you came in "on time", then changing the timezone might help until they notice that you've been there from 10am to 6pm but are only logging 7 hours.