When I use
a custom zoneinfo file for TAI or /usr/share/zoneinfo-leaps and
a modified NTP client (currently it just adds 27 seconds and waits for a timestamp that is about 1 second off)
on my ArchLinux box,
the system time behaves nicely:
> date
Tue Oct 23 17:10:34 TAI 2018
> date -d @1483228827
Sun Jan 1 00:00:00 UTC 2017
> date -d @1483228826
Sat Dec 31 23:59:60 UTC 2016
> date -d @1483228825
Sat Dec 31 23:59:59 UTC 2016
but JavaScript does not (see the test page and screenshot).
Does Mozilla/Firefox/JavaScript use its own zoneinfo files somewhere?
How can I fix it?
Not even websites dedicated to time seem to get it right... Or am I missing something?
-arne
The JavaScript Date object specifically adheres to the concept of Unix Time (albeit with higher precision). This is part of the POSIX specification, and thus is sometimes called "POSIX Time". It does not count leap seconds, but rather assumes every day has exactly 86,400 seconds. You can read about this in section 20.3.1.1 of the current ECMAScript specification, which states:
Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. In time values leap seconds are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day.
JavaScript is not unique in this regard. This is what the vast majority of other languages do, including Python, Ruby, .NET, the typical implementation of time_t in C, and many others.
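As a concrete illustration (a minimal sketch, not part of the spec text above), Date arithmetic treats the day containing the 2016-12-31 leap second as if it were exactly 86,400,000 ms long:

// The leap second at the end of 2016-12-31 simply does not exist in the Date model.
const before = Date.UTC(2016, 11, 31);  // 2016-12-31T00:00:00Z
const after = Date.UTC(2017, 0, 1);     // 2017-01-01T00:00:00Z
console.log(after - before);            // 86400000, even though that UTC day really had 86,401 seconds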
Because you have altered your system to track TAI instead of UTC, and the date command on your system understands a leap second table, time_t on your system isn't a Unix timestamp but rather a TAI-based variant masquerading as one. Just because the date command and the other underlying functions recognize this doesn't mean that carries through to all platforms and runtimes on your machine.
The fact is that the unpredictable nature of leap seconds makes them very difficult to work with in APIs. One can't generally pass around timestamps that need leap second tables to be interpreted correctly and expect one system to interpret them the same as another. For example, while your example timestamp 1483228826 is 2017-01-01T00:00:00Z on your system, it would be interpreted as 2017-01-01T00:00:26Z on POSIX-based systems, or on systems without leap second tables. So they aren't portable. Even on systems that have fully updated tables, there's no telling what those tables will contain in the future (beyond the 6-month IERS announcement period), so I can't produce a future timestamp without the risk that it may eventually change.
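You can see this directly in JavaScript (a sketch as it would behave on an ordinary POSIX/Unix-time system, not on your TAI-based one):

// On a Unix-time system, 1483228800 is 2017-01-01T00:00:00Z, so your timestamp
// lands 26 seconds *after* midnight rather than on the leap second.
console.log(new Date(1483228826 * 1000).toISOString()); // "2017-01-01T00:00:26.000Z"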
To be clear: to support leap seconds in a programming language, the implementation must go out of its way to do so, and must make tradeoffs that are not always acceptable. Though there are exceptions, the general position is not to support them, not because of any subversion or active countermeasures, but because supporting them properly is much, much harder.
That said, there is hope for you if you really care about leap seconds in JavaScript. You can add your thoughts to the TC39 Temporal proposal (of which I am one of the champions). This won't change the behavior of the Date object; that is baked in and has been for decades. But we are developing a new set of standard objects for date and time in JavaScript, and would love your feedback and participation. There is a thread in issue #54 where we have been considering various ways that leap seconds could be part of this. At the moment, we haven't put much thought into TAI-based systems. If this is an area you have experience in, please add your thoughts there. Keep in mind we'll need to balance this with the general needs of the community, but we'd like your thoughts. Thanks!
Related
I'm using Node v16.17 on MacBook Pro M1.
I want to use microsecond timestamps, so I tried process.hrtime().
But this is very strange: the first array element (which should be seconds, i.e. comparable to getTime() once multiplied by 1000) looks like some date in 2017:
> new Date().getTime();
1669997280728
> process.hrtime();
[ 1486038, 90680583 ]
So if I take 1486038000, it is Thu, 02 Feb 2017 12:20:00 GMT.
If I drop the milliseconds from new Date().getTime(), it is correctly Fri, 02 Dec 2022 16:08:00 GMT.
What is the issue here? I thought process.hrtime() would be the high-resolution time, so why is it so far off?
Thanks
Fritz
Per the docs,
These times are relative to an arbitrary time in the past, and not related to the time of day and therefore not subject to clock drift. The primary use is for measuring performance between intervals
https://nodejs.org/api/process.html#processhrtimetime
It is only a coincidence that you got a somewhat relevant date: the first element is seconds since an arbitrary origin (likely boot, roughly 17 days here), and it only reads as a 2017 date because you multiplied it by 1000 and then interpreted the result as seconds since the Unix epoch.
You should be using process.hrtime.bigint(), however, because process.hrtime() has been legacy for a while (even in Node v16.17).
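A minimal sketch of how the two clocks are typically used, assuming Node.js 10.7+ (the loop is just a stand-in for the work being timed):

// Monotonic clock: good for measuring intervals, meaningless as a wall-clock time.
const start = process.hrtime.bigint();              // nanoseconds since an arbitrary origin
for (let i = 0; i < 1e6; i++) {}                    // work being timed
const elapsedNs = process.hrtime.bigint() - start;
console.log(`elapsed: ${elapsedNs / 1000n} µs`);

// If what you actually want is a wall-clock timestamp in microsecond units,
// note this still only has millisecond precision:
const wallClockMicros = BigInt(Date.now()) * 1000n;
console.log(wallClockMicros);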
What?
process.hrtime() has nothing to do with the real-time clock, as is explained by the docs:
These times are relative to an arbitrary time in the past, and not related to the time of day and therefore not subject to clock drift.
(emphasis mine)
And,
The primary use is for measuring performance between intervals:
Running a simple new Date().toString() on Node 11, you get something like
'Fri May 10 2019 10:44:44 GMT-0700 (Pacific Daylight Time)'
While on Node 8 you get
'Fri May 10 2019 10:44:44 GMT-0700 (PDT)'
Note the difference in the timezone abbreviation. Why is that? And how can you force toString() to always return the zone as an abbreviation?
Stolen answer from @ssube, who was too lazy to log in and post.
the whole Intl object and default formats were introduced between those two versions, which may have become the new default for Date as well.
After some digging on my own, and reading some of the Intl spec:
The ECMAScript 2015 Internationalization API Specification identifies time zones using the Zone and Link names of the IANA Time Zone Database. Their canonical form is the corresponding Zone name in the casing used in the IANA Time Zone Database.
As to how to get back to an abbreviated timezone, there are several GitHub repos that suggest using a regex, others that use an abbreviation map, and Ben Nadel uses a regex to pull out either the short or the long timezone name, as described in his blog post.
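For what it's worth, a sketch of one approach that doesn't depend on parsing toString() at all (assuming an engine with the Intl API; the abbreviation you get still depends on the locale and the CLDR data the engine ships):

// Ask the formatter for the short zone name instead of scraping Date#toString().
const parts = new Intl.DateTimeFormat('en-US', { timeZoneName: 'short' })
  .formatToParts(new Date());
const zoneAbbr = parts.find(p => p.type === 'timeZoneName').value;
console.log(zoneAbbr); // e.g. "PDT" (may be a "GMT-7"-style value for zones without a CLDR abbreviation)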
Looks like JavaScript leaves this up to the implementer. Based on the below GitHub Issue for ECMA262, there are known differences between the ways UNIX & Windows handle the timezone value.
Across multiple JS implementations, it seems that Date.prototype.toString writes the timezone (in parens) in a long, locale-dependent form on Windows, but in a short form (2-4 letters) from the tz database on Unix-based OSes. More details in the V8 bug tracker.
The spec is really light on details for Date.prototype.toString:
Return an implementation-dependent String value that represents tv as a date and time in the current time zone using a convenient, human-readable form.
Does anyone have a good memory of why this is the definition? Looks like it goes all the way back to ES1.
Fortunately, it seems that, at this point, implementations have converged on something that's almost always the same, with the exception of the timezone string.
For the timezone string, would it be a good idea to pick one of the two alternatives and standardize it across all platforms? Does anyone have evidence one way or the other whether either of the two is likely to be more web-compatible, or whether we need to preserve the variation?
Additionally, it looks like there is still active discussion in the V8 Issues for Date.prototype.toString() normalization.
Going through the Node.js change logs for v10+, there doesn't seem to be an explicit mention of this.
Update
After digging through V8 commits, it looks like there is a new timezone name cache implemented for performance in V8 when using Date.prototype.toString(). Based on the excerpt below from the commit message, it seems like this change is why there is a difference between Node v8 and Node v11:
To speed up Date.prototype.toString(), this patch adds a cache in the
DateCache for the string short name representing the time zone.
Because time zones in a particular location just have two short names
(for DST and standard time), and the DateCache already understands
whether a time is in DST or not, it is possible to keep the result of
OS::LocalTimezone around and select between the two based on whether
the time is DST or not.
In local microbenchmarks (calling Date.prototype.toString() in a
loop), I observed a 6-10% speedup with this patch. In the browser, the
speedup may be even greater as the system call needs to do some extra
work to break out of the sandbox. I don't think the microbenchmark is
extremely unrealistic; in any real program which calls
Date.prototype.toString() multiple times, the cache should hit almost
all of the time, as time zone changes are rare.
The proximate motivation for this patch was to enable ICU as a backend
for timezone information, which is drafted at
https://codereview.chromium.org/2724373002/ The ICU implementation of
OS::LocalTimezone is even slower than the system call one, but this
patch makes their performance indistinguishable on the microbenchmark.
In the tz database, many timezones actually do have a number of
different historical names. For example, America/Anchorage went
through a number of changes, from AST to AHST to YST to AKST. However,
both ICU and the Linux OS interfaces just report the modern timezone
name in tests for the appropriate timezone name, even for historical
times. I can see why this would be:
- For ICU, CLDR only has two short names in the data file: the one for dst and non-dst
- For Linux, the timezone names do seem to make it into the /etc/localtime file. However, glibc assumes there are only two
relevant names and selects between them, as you can see in its
implementation of localtime_r:
http://bazaar.launchpad.net/~vcs-imports/glibc/master/view/head:/time/tzset.c#L573
So, this cache should be valid until we switch to a more accurate
source of short timezone names.
So a little context, I have an array of 24 arrays -- one for every hour in the day.
So midnight, index 0, would be [133.00, 234.00], which would indicate 133 actions from 12:00 to 12:30 and 234 actions between 12:30 and 1:00 am.
I need to adjust these indexed arrays to account for the user's timezone in the browser with JS, so that if the user is in New York, index 0 (midnight on the user's home turf) is displayed in China's offset (12 pm the next day, from the user's perspective).
I've been trying to think of a solution; here is a simple function for what I've come up with so far:
function offsetHourIndex(hourIndex, dataCenterTimeZone) {
  let userTime = new Date().setHours(hourIndex); // setHours returns a millisecond timestamp
  return moment(userTime).tz(dataCenterTimeZone).hour();
}
How reliable would this approach be?
Your approach has a few problems:
You are assuming that the current date in the local time zone is the correct date for the target time zone. Most of the time, there are two dates active somewhere around the world. For example, 2019-04-02 04:00 in London is 2019-04-01 23:00 in New York. If you just take hour 4 from London but apply it to the current date in New York, you've created a whole new point in time, a day too early.
You assume there will be exactly 24 hours in every day. In time zones that have transitions for daylight saving time or changes in standard time, you may have more or fewer hours of local time on the day of the transition.
In the case of a backward transition, there is a period of ambiguous local time. For example, when US Pacific Time moves from PDT to PST in November, the hour from 1:00-1:59 is repeated. If data from both hours are summarized into array element 1, then you will have higher than normal results for that hour. The opposite is true for forward transitions - you will have an hour with no data.
The time zone setting of the server can be a fickle thing. What if you change data centers? What if you move to the cloud? What if you are operating multiple data centers? What if a server administrator thinks all they are affecting by changing the system time zone is the readout on the taskbar or front panel, etc., and then it affects your application? In general one should avoid these things by never relying on the server's local time zone setting.
You can avoid all of these problems by basing everything on Coordinated Universal Time (UTC) - especially the array. Then you can be ignorant of any server time zone setting, and just base everything off the current UTC day, which is the same all over the world.
This will give you the local hour from the given UTC hour in your index:
var localHour = moment.utc({hour: hourIndex}).local().hour();
You do not need moment-timezone for the above recommendation.
However, if you really feel like you need to convert from a specific time zone to the browser local time, then you would use moment-timezone like this:
var localHour = moment.tz({hour: hourIndex}, timeZoneName).local().hour();
Note when you do this, you also have another problem - not every time zone is offset by a whole number of hours. For example, India uses UTC+05:30. There are many that are :30 and a few that are :45. By tracking hours only, you're not providing enough information to properly convert to the correct local hour. Your results may be off by one.
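To illustrate that last point (a sketch assuming moment and moment-timezone are loaded, as in the snippets above), converting hour 0 UTC to India Standard Time and keeping only the hour silently drops the half-hour offset:

const m = moment.utc({ hour: 0 }).tz('Asia/Kolkata'); // 00:00 UTC today, viewed in UTC+05:30
console.log(m.format('HH:mm')); // "05:30"
console.log(m.hour());          // 5 -- the :30 is lost once you bucket by hour index alone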
It seems reasonable, and the code should work as long as you have properly formatted inputs. I like the brevity and clarity of the function. Any reason you are concerned about reliability?
You might mention in your question that you are using the moment and moment-timezone packages to derive your data via their functions (moment and tz) on this line of code:
return moment(userTime).tz(dataCenterTimeZone).hour();
Your function may appear a bit cryptic to folks reading here without the imports in your example, such as:
import * as moment from 'moment';
import 'moment-timezone';
I want to use shell to get an epoch time,
and later use JavaScript on an HTML page to get another epoch time,
and then take the difference between them.
But I'm afraid that the epoch times may not be synchronized between the different scripts,
which would make the difference useless.
So I want to know: if I use shell and JavaScript to get the epoch time at the very same moment,
will the results be the same or not?
If not, how big is the difference?
Thanks!
If you mean the number of seconds since the Unix epoch (1970-01-01T00:00:00Z), it is governed by that very definition. The only differences you should be able to see are caused by:
different times of invocation of the system call that returns it*);
unsynchronized clocks on different systems.
and possibly also:
unsynchronized clocks on different processor cores/physical processors;
implementation-dependent handling of the function that returns the current time (e.g. a JS engine might possibly cache the value for a short time so as not to have to do the actual syscall, although I would doubt this).
Depending on the time resolution you need, some of these are not a problem. My guess is that if you don't need granularity finer than 1 s, you should be more than fine (on the same machine).
*) Also note that on a single-core system you can't really get the same time (at least at ns resolution) from different syscalls, unless the kernel caches it, simply because they have to happen one after another.
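In practice, a quick way to compare the two on one machine (a sketch; date +%s is the usual shell command, and the JavaScript side just truncates Date.now() to seconds):

// Shell side (for comparison):   date +%s        -> seconds since the Unix epoch
// JavaScript side:
const seconds = Math.floor(Date.now() / 1000);    // Date.now() is milliseconds since the epoch
console.log(seconds);
// Run back to back on the same machine, the two values should differ by at most
// the second boundary that happens to fall between the two calls.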
According to ECMA-262 section 15.9.1.1:
Time is measured in ECMAScript in milliseconds since 01 January, 1970
UTC. In time values leap seconds are ignored. It is assumed that there
are exactly 86,400,000 milliseconds per day.
So, yes, every JavaScript implementation that adheres to the standard must return exactly the same values, taking care of any physical clock quirks by itself. Barring a wrongly set system clock, you will get the same value everywhere.
Note that the definition of "epoch" and "seconds since" in other languages and systems could be different (at the very least, most other systems use seconds rather than milliseconds, and some take leap seconds into account), so you can't guarantee that the JS time, even divided by 1000, will match a timestamp from another platform or OS.
"During the "energy crisis" years, Congress enacted earlier starting dates for daylight time. In 1974, daylight time began on 6 January and in 1975 it began on 23 February. After those two years the starting date reverted back to the last Sunday in April. "
(via http://aa.usno.navy.mil/faq/docs/daylight_time.php )
There appears to be a bug in the JavaScript Date object for these dates. If you convert 127627200000 milliseconds to a date, it should be Thu Jan 17 00:00:00 EDT 1974. This is correct on http://www.fileformat.info/tip/java/date2millis.htm, but incorrect on
http://www.esqsoft.com/javascript_examples/date-to-epoch.htm, which says it converts to Wed Jan 16 1974 23:00:00 GMT-0500 (Eastern Standard Time). If you create a new Date(127627200000) object in javascript, it gives the latter date conversion. This happens in all major browsers.
I can't imagine this is the first time this has been a problem for anyone, but I can't find any other cases of it with a few searches online. Does anyone know if there is an existing fix, or an easier fix than manually checking the dates JavaScript gets wrong? Are there other dates where this is a problem?
As ever, it's best to check the spec :)
In this case, I was pretty shocked to see this in section 15.9.1.9 [1] of ECMA-262:
The implementation of ECMAScript should not try to determine whether the exact time was subject to daylight saving time, but just whether daylight saving time would have been in effect if the current daylight saving time algorithm had been used at the time. This avoids complications such as taking into account the years that the locale observed daylight saving time year round.
In other words, a conformant ECMAScript implementation is not allowed to be historically accurate.
Now whether all implementations follow this or not, I'm not sure... but it does suggest you'd need some kind of separate library if you wanted to get historically accurate time zones... where "historically accurate" doesn't have to be nearly as far back as 1974, of course: the US changed its DST schedule in 2007, and other countries have done so more recently than that (and with less warning).
[1] The first occurrence of 15.9.1.9. For some reason it occurs twice, once for "Daylight Saving Time Adjustment" and once for "Local Time". Wow.
Java does historical time zones (back to about 1920); JavaScript apparently does not.
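If you want to check what a particular engine does with the asker's example, a quick probe (the output depends on the engine, its ICU/tzdata version, and the system time zone; older engines followed the "current algorithm" rule quoted above, while newer ICU-backed ones generally apply the historical rules):

// Probe how this engine localizes a timestamp from the January 1974 DST period.
const d = new Date(127627200000);
console.log(d.toString());           // the EDT vs. EST discrepancy described in the question
console.log(d.getTimezoneOffset());  // minutes behind UTC: 240 for EDT, 300 for EST (US Eastern)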