I read in the MDN documentation that:
toLocaleTimeString() without arguments depends on the
implementation, the default locale, and the default time zone
What does this mean exactly?
And I tried the following code in both Chrome (Version 87.0.4280.88) and Safari (Version 14.0):
new Date().toLocaleTimeString()
In Chrome it gives the output
16:57:37
whereas in Safari it gives the output
4:57:37 PM
With regard to the above example, can someone explain how the implementation changes from one browser to another, and why that is?
Edit:
All this was done on a Mac. I tried changing the preferred language under Settings -> "Language and Region" from English (India) to English (US). As soon as I made that change and restarted Chrome, the result became
4:57:37 PM
But Safari was able to show the 12-hour format without this change. What was the reason for that?
The specification for toLocaleTimeString states:
This function returns a String value. The contents of the String are implementation-defined, but are intended to represent the “time” portion of the Date in the current time zone in a convenient, human-readable form that corresponds to the conventions of the host environment's current locale.
with the definition of implementation-defined being:
An implementation-defined facility is one that defers its definition
to an external source without further qualification. This
specification does not make any recommendations for particular
behaviours, and conforming implementations are free to choose any
behaviour within the constraints put forth by this specification.
Therefore browsers are free to implement this feature as they see fit.
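If you need consistent output across browsers, a safer approach is to pass an explicit locale and options instead of relying on the defaults. A minimal sketch (the locale and option choices here are illustrative, not the only correct ones):

// Pin the locale and the fields so every engine formats the same way.
var time = new Date().toLocaleTimeString('en-US', {
  hour: '2-digit',
  minute: '2-digit',
  second: '2-digit',
  hour12: true // force the 12-hour clock regardless of locale defaults
});
console.log(time); // e.g. "04:57:37 PM"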
Related
Getting different output in each browser if we select '(UTC+01:00) Brussels, Copenhagen, Madrid, Paris' in the system time zone settings.
var tz = jstz.determine(); // jsTimeZoneDetect: guess the user's IANA time zone
var tzName = tz.name();    // e.g. 'Europe/Paris'
Output:
IE11: Europe/Berlin
Chrome: Europe/Paris
From the jsTimeZoneDetect docs:
Limitations
This script does not do geo-location, nor does it care very much about historical time zones. So if you are unhappy with the time zone "Europe/Berlin" when the user is in fact in "Europe/Stockholm" - this script is not for you. They are both identical in modern time.
Indeed, if we carefully examine and compare the history of time zone changes for Berlin and Paris, we find that they have been identical since 1980. Thus, unless your application is dealing with dates before 1980, it is inconsequential whether you detect Europe/Berlin or Europe/Paris.
As to which is more correct, the CLDR windowsZones.xml file (which is the canonical mapping between Windows and IANA time zones) contains the following:
<!-- (UTC+01:00) Brussels, Copenhagen, Madrid, Paris -->
<mapZone other="Romance Standard Time" territory="001" type="Europe/Paris"/>
Chrome uses the Intl API, which internally uses ICU, which contains the CLDR data. Thus, Chrome is providing the more correct answer. You should get the same answer in Firefox, Edge, and other modern web browsers.
Internet Explorer is older and doesn't contain the data needed to resolve this correctly. Thus, libraries like jsTimeZoneDetect (and also moment-timezone via moment.tz.guess()) first try to use the Intl approach, but when not available they make an educated guess by testing various known points in time for their UTC offset changes. Since it's just a guess, it's sometimes going to be imprecise.
If you are interested, there's a community-maintained compatibility chart that tracks which browsers support the "correct" time zone detection process. Expand the "DateTimeFormat" section and check the row labeled "resolvedOptions().timeZone defaults to the host environment".
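The fallback strategy those libraries use can be sketched roughly like this (an illustration of the idea, not the actual jsTimeZoneDetect source; the lookup table entry is a made-up example):

// Prefer the engine's own answer (backed by ICU/CLDR) when available.
function guessTimeZone() {
  if (typeof Intl !== 'undefined' && Intl.DateTimeFormat) {
    var tz = Intl.DateTimeFormat().resolvedOptions().timeZone;
    if (tz) return tz;
  }
  // Otherwise probe known points in time: differing January/July
  // offsets reveal DST use, and the standard-time offset narrows the
  // zone down to a group of equivalent candidates.
  var jan = new Date(Date.UTC(2020, 0, 1)).getTimezoneOffset();
  var jul = new Date(Date.UTC(2020, 6, 1)).getTimezoneOffset();
  var usesDst = jan !== jul;
  var stdOffset = Math.max(jan, jul); // standard time, in minutes behind UTC
  // A real library keeps a full table; this single entry is illustrative.
  var table = { '-60,true': 'Europe/Berlin' };
  return table[stdOffset + ',' + usesDst] || null;
}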
Running a simple new Date().toString() on Node 11, you get something like
'Fri May 10 2019 10:44:44 GMT-0700 (Pacific Daylight Time)'
While on Node 8 you get
'Fri May 10 2019 10:44:44 GMT-0700 (PDT)'
Note the difference in the time zone abbreviation. Why is that? And how can you force toString() to always return the abbreviated zone name?
Stolen answer from #ssube who was too lazy to log in and post.
the whole Intl object and default formats were introduced between those two versions, which may have become the new default for Date as well.
After some digging on my own, and reading some of the Intl spec:
The ECMAScript 2015 Internationalization API Specification identifies time zones using the Zone and Link names of the IANA Time Zone Database. Their canonical form is the corresponding Zone name in the casing used in the IANA Time Zone Database.
As to how to revert back to an abbreviated time zone, I see several GitHub repos that suggest using a regex, others an abbreviation map; Ben Nadel, for example, uses a regex to extract the short or long time zone name, as seen in his blog here.
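As a sketch of the regex approach (the long-name-to-initials step is my own assumption for illustration, not a standard API):

// Pull the parenthesized zone name out of Date#toString() and, if it
// is a long name, collapse it to its initials.
function shortZoneName(date) {
  var match = /\(([^)]+)\)$/.exec(date.toString());
  if (!match) return '';
  var zone = match[1]; // e.g. 'Pacific Daylight Time' or 'PDT'
  return /\s/.test(zone)
    ? zone.split(' ').map(function (w) { return w[0]; }).join('') // 'PDT'
    : zone; // already abbreviated
}

console.log(shortZoneName(new Date())); // e.g. 'PDT'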
Looks like JavaScript leaves this up to the implementer. Based on the below GitHub Issue for ECMA262, there are known differences between the ways UNIX & Windows handle the timezone value.
Across multiple JS implementations, it seems that Date.prototype.toString writes the timezone (in parens) in a long, locale-dependent form on Windows, but in a short form (2-4 letters) from the tz database on Unix-based OSes. More details in the V8 bug tracker.
The spec is really light on details for Date.prototype.toString:
Return an implementation-dependent String value that represents tv as a date and time in the current time zone using a convenient, human-readable form.
Does anyone have a good memory of why this is the definition? Looks like it goes all the way back to ES1.
Fortunately, it seems that, at this point, implementations have converged on something that's almost always the same, with the exception of the timezone string.
For the timezone string, would it be a good idea to pick one of the two alternatives and standardize it across all platforms? Does anyone have evidence one way or the other whether either of the two is likely to be more web-compatible, or whether we need to preserve the variation?
Additionally, it looks like there is still active discussion in the V8 Issues for Date.prototype.toString() normalization.
Going through the Node.js change logs for v10+, there doesn't seem to be an explicit mention of this.
Update
After digging through V8 commits, it looks like there is a new Timezone Names Cache implemented for performance in V8 when using Date.prototype.toString(). Based on the excerpt below from the message for this commit, it seems this change is why there is a difference between Node v8 and Node v11:
To speed up Date.prototype.toString(), this patch adds a cache in the
DateCache for the string short name representing the time zone.
Because time zones in a particular location just have two short names
(for DST and standard time), and the DateCache already understands
whether a time is in DST or not, it is possible to keep the result of
OS::LocalTimezone around and select between the two based on whether
the time is DST or not.
In local microbenchmarks (calling Date.prototype.toString() in a
loop), I observed a 6-10% speedup with this patch. In the browser, the
speedup may be even greater as the system call needs to do some extra
work to break out of the sandbox. I don't think the microbenchmark is
extremely unrealistic; in any real program which calls
Date.prototype.toString() multiple times, the cache should hit almost
all of the time, as time zone changes are rare.
The proximate motivation for this patch was to enable ICU as a backend
for timezone information, which is drafted at
https://codereview.chromium.org/2724373002/ The ICU implementation of
OS::LocalTimezone is even slower than the system call one, but this
patch makes their performance indistinguishable on the microbenchmark.
In the tz database, many timezones actually do have a number of
different historical names. For example, America/Anchorage went
through a number of changes, from AST to AHST to YST to AKST. However,
both ICU and the Linux OS interfaces just report the modern timezone
name in tests for the appropriate timezone name, even for historical
times. I can see why this would be:
- For ICU, CLDR only has two short names in the data file: the one for dst and non-dst
- For Linux, the timezone names do seem to make it into the /etc/localtime file. However, glibc assumes there are only two
relevant names and selects between them, as you can see in its
implementation of localtime_r:
http://bazaar.launchpad.net/~vcs-imports/glibc/master/view/head:/time/tzset.c#L573
So, this cache should be valid until we switch to a more accurate
source of short timezone names.
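The idea behind that cache can be sketched like this (in JavaScript rather than V8's C++; lookupFromOS stands in for OS::LocalTimezone and is an assumption for illustration):

// Two-entry cache: one short name for standard time, one for DST.
var zoneNameCache = { std: null, dst: null };

function cachedZoneName(date, isDst, lookupFromOS) {
  var key = isDst ? 'dst' : 'std';
  if (zoneNameCache[key] === null) {
    // The expensive OS/ICU lookup runs at most twice per process.
    zoneNameCache[key] = lookupFromOS(date);
  }
  return zoneNameCache[key];
}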
I am using JavaScript Intl API to detect the user's timezone in the browser:
Intl.DateTimeFormat().resolvedOptions().timeZone
While this API looks stable, I don't know what to make of the following. On my Windows 10 machine, the time zone is set to US Eastern Standard Time:
C:\> tzutil /g
US Eastern Standard Time
In Firefox and Edge the above JavaScript code results in "America/Indiana/Indianapolis", while Chrome returns just "America/Indianapolis".
Why does this yield different results in different browsers? Is it a Chrome bug, or should I transform the time zones somehow before processing them on the server?
On Ubuntu (which is my server's OS), if I list available time zones with timedatectl list-timezones, there is "America/Indiana/Indianapolis", but there is no "America/Indianapolis". Because of this issue I cannot accept Chrome's "America/Indianapolis" from the client, as I grab the "available time zones list" from the output of timedatectl list-timezones. Which behavior should I apply here?
Any help is appreciated. Thank you!
Chrome, like many others, gets its time zone information through ICU, which in turn sources from CLDR, which in part sources from IANA.
There is a subtle difference between IANA and CLDR with regard to canonicalization.
IANA treats the preferred form of the time zone identifier as canonical. If the preferred form changes, they make the new one primary (a Zone entry) and move the old one to an alias (a Link entry).
CLDR treats the first form of the time zone identifier as canonical. That is, the first time it appeared in CLDR, it is locked in forever. If a new form ever appears, it is added as an alias in the /common/bcp47/timezone.xml file.
Taking your case of Indianapolis:
IANA Zone entry (reference here)
Zone America/Indiana/Indianapolis ...
IANA Link entry (reference here)
Link America/Indiana/Indianapolis America/Indianapolis
In CLDR, the primary zone is listed first in the "aliases" attribute, followed by other aliases. (reference here)
<type name="usind" description="Indianapolis, United States" alias="America/Indianapolis America/Fort_Wayne America/Indiana/Indianapolis US/East-Indiana"/>
The same thing can be found with Asia/Calcutta vs. Asia/Kolkata and several other examples.
Additionally, be aware that the Windows time zone mappings also source from CLDR, in the /common/supplemental/windowsZones.xml file. You'll notice there that US Eastern Standard Time is mapped to America/Indianapolis. So the difference between browsers depends very much on which canonicalization rules are followed.
In the end, it doesn't really matter which alias is used. They point at the same data.
Also worth pointing out, that particular Windows zone should only be selected if you care about historical time changes in Indiana. If you are just in the US Eastern time zone, you should set your system to "Eastern Standard Time", rather than "US Eastern Standard Time". (Yes, those IDs are confusing...)
zone.tab is incomplete. Some time zones are represented as symbolic links:
$ find /usr/share/zoneinfo/America -name Indianapolis -exec file {} \;
/usr/share/zoneinfo/America/Indiana/Indianapolis: symbolic link to ../Indianapolis
/usr/share/zoneinfo/America/Indianapolis: timezone data, version 2, 7 gmt time flags, 7 std time flags, no leap seconds, 99 transition times, 7 abbreviation chars
I'm not sure how or why each browser returns a different result (probably they read from different sources, or it may be due to how each implements Intl.DateTimeFormat). But it doesn't matter, because both are the same time zone.
If you check this list, you'll see that America/Indianapolis links to America/Indiana/Indianapolis. Those names in the format Region/City are maintained by IANA and it's probably one of the best sources we have about timezone information.
IANA also keeps a backward file that maps those 2 names, with America/Indiana/Indianapolis being the most recent one.
So one alternative is to get a copy of the backward file and use it to map whatever name you receive to the respective new name, and also validate against your server's timezone list.
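A sketch of that mapping step, assuming you have already parsed the backward file into a lookup table (the two entries shown are examples, not the full file):

// Map legacy aliases from IANA's backward file to current names.
var ianaAliases = {
  'America/Indianapolis': 'America/Indiana/Indianapolis',
  'Asia/Calcutta': 'Asia/Kolkata'
};

function canonicalZone(name) {
  return ianaAliases[name] || name;
}

// Normalize whatever the browser reports before validating it against
// the server's timedatectl list-timezones output.
var reported = Intl.DateTimeFormat().resolvedOptions().timeZone;
console.log(canonicalZone(reported)); // e.g. 'America/Indiana/Indianapolis'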
Hmmm... Okay, so I've been searching the web for 2 days now without any luck. I've seen a lot of answers on how to format a JavaScript date, for example new Date().toString("yyyy-MM-dd"), which would return something like 2013-04-05.
This is absolutely not the problem.
What I want, is the possibility to set the format in which my OS displays dates, then retrieve that specific format and display it in the browser.
For example, let's say I changed the format of the date in my OS to MM-yyyy/dd (this is for argument's sake; whether that would work or not is irrelevant). Then I'd expect to see 04-2013/05 in my OS, right? Next I want to retrieve this specific format in my browser via JavaScript so that I can use it to format my dates throughout my webpage.
If this is lunacy and cannot be done, please tell me, as I've got a headache from searching.
Also, if you say use someDateObject.toLocaleDateString() without explaining exactly why .toLocaleDateString would work, I'm going to ignore it, because I've tried setting my date format in my OS to numerous formats, and every single time I use .toLocaleDateString() I receive the same format: dd/MM/yyyy.
The first argument of the .toLocaleDateString() method is the locale (or locales) to use.
You can obtain the current locale through navigator.language (in Firefox or Chrome).
In IE you can obtain navigator.browserLanguage or navigator.systemLanguage.
In browsers other than IE it is impossible to obtain the system language this way.
After this you can call new Date().toLocaleDateString(navigator.language || navigator.browserLanguage) and it will be formatted according to the browser language.
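For instance, a minimal sketch (the 'en-US' fallback is my own assumption for when neither property exists):

// Pick the best available language hint, then format with it.
var lang = navigator.language ||        // Firefox, Chrome, modern browsers
           navigator.browserLanguage || // old IE
           'en-US';                     // last-resort fallback (assumption)
console.log(new Date().toLocaleDateString(lang)); // e.g. '4/5/2013' for en-US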
I understood that the JavaScript method toLocaleDateString() uses the computer's settings.
Let's take the W3Schools example:
When I change the date and hour formats on my computer, the result is different in Firefox and IE (as expected), but Chrome still shows the same date format. Why?
From the MDN:
"The exact format depends on the platform, locale and user's settings."
And,
"You shouldn't use this method in contexts where you rely on a particular format or locale."
Basically, "Why" is because that's how Chrome does it. If you need a specific format, you're going to have to specify it yourself.
From the ECMAScript 5 standard:
15.9.5.6 Date.prototype.toLocaleDateString ( )
This function returns a String value. The contents of the String are implementation-dependent, but are intended to represent the “date” portion of the Date in the current time zone in a convenient, human-readable form that corresponds to the conventions of the host environment’s current locale.
Chrome can represent the date as a locale date string in whatever manner it likes. The standard only supplies guidelines; it does not mandate a particular format. And, in fact, the result will vary not only between browsers but also within Chrome itself depending on your locale settings.
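If you do need a specific format, one option is to spell it out explicitly rather than relying on the host locale; a sketch:

// Pin both the locale and the field layout so the output never
// depends on browser or OS settings.
var fixed = new Intl.DateTimeFormat('en-GB', {
  year: 'numeric', month: '2-digit', day: '2-digit'
});
console.log(fixed.format(new Date())); // always dd/mm/yyyy, e.g. '05/04/2013'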
It looks like Chrome does not use the Windows regional settings but its own settings instead. These are available via Settings > Advanced Settings > Language. However, the date format is not explicitly defined; it is inferred from the language + country choice, for instance:
English (US) sets date format to mm/dd/yyyy
English (UK) sets date format to dd/mm/yyyy
(For anyone trying to change these, don't forget - like I did - to restart Chrome for the settings to take effect)
Back to the original question: it looks like it is legitimate to use toLocaleDateString() as long as the idea is to present the information in a format the human user understands. But that assumes an ideal world, where every user has his/her browser properly configured. Instead, Chrome defaults to English (US) as long as people leave it in English, and it takes some googling (which most users won't do) to change these settings.
This makes it risky to use toLocaleDateString() even when not "relying on a particular format or locale". It looks like the only "serious" option for any cross-browser web application is to manage its own date format preferences (per user, of course...)