Is it safe to compare JavaScript's getTime() across different systems?

JavaScript's getTime() returns "the number of milliseconds since 1 January 1970 00:00:00 UTC".
Can I rely on this being similar across different machines? I don't need it to be accurate to the millisecond, just to a few seconds.
Or do I need to use an external time service API, as in this question?
Where does JavaScript get the current time from - is it dependent on the machine's clock?

Can I rely on this being similar across different machines?
No.
Where does JavaScript get the current time from
From the system clock of the machine on which the JavaScript runs.
Or do I need to use an external time service API, as in this question?
You could use the server's time and send it to the client.

JavaScript is definitely going to use the internal clock of the machine (where else could it possibly get this information?).
Whether it is safe depends on what you are writing and why. If you do decide to compare it across machines, make sure your code is very fault tolerant, as system clocks can easily be wrong or tampered with.
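As a hedged illustration of the server-time approach mentioned above (the /server-time endpoint and the 5-second threshold are assumptions for the sketch, not a real API):

// Estimate the offset between the client clock and a server clock so
// comparisons can tolerate skewed machines.
async function clockOffsetMs() {
  const t0 = Date.now();
  const res = await fetch('/server-time'); // assumed to return { now: <ms since epoch> }
  const { now: serverNow } = await res.json();
  const t1 = Date.now();
  // Approximate the server's time at the midpoint of the round trip.
  return serverNow - (t0 + (t1 - t0) / 2);
}

clockOffsetMs().then(offset => {
  if (Math.abs(offset) > 5000) {
    console.warn('Client clock differs from the server by', offset, 'ms');
  }
});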

Related

Node 8 vs 11 Timezone is different

Running a simple new Date().toString(): on Node 11 you get something like
'Fri May 10 2019 10:44:44 GMT-0700 (Pacific Daylight Time)'
while on Node 8 you get
'Fri May 10 2019 10:44:44 GMT-0700 (PDT)'
Note the difference in the timezone abbreviation. Why is that? And how can you force toString() to always return the zone as the abbreviation?
Stolen answer from @ssube, who was too lazy to log in and post.
the whole Intl object and default formats were introduced between those two versions, which may have become the new default for Date as well.
After some digging on my own, and reading some of the Intl spec:
The ECMAScript 2015 Internationalization API Specification identifies time zones using the Zone and Link names of the IANA Time Zone Database. Their canonical form is the corresponding Zone name in the casing used in the IANA Time Zone Database.
As for how to revert to an abbreviated timezone, there are several GitHub repos that suggest using a regex, others that use an abbreviation map, and Ben Nadel uses a regex to extract the short or long timezone name, as shown in his blog here.
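A less fragile option than regexing the toString() output, assuming a runtime where the Intl API is available (Node 11 has it), is to ask Intl for the short zone name directly:

// Get the short timezone name from Intl instead of parsing the
// implementation-dependent Date.prototype.toString() output.
const parts = new Intl.DateTimeFormat('en-US', { timeZoneName: 'short' })
  .formatToParts(new Date());
const tzAbbr = parts.find(p => p.type === 'timeZoneName').value;
console.log(tzAbbr); // e.g. "PDT"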
Looks like JavaScript leaves this up to the implementer. Based on the below GitHub Issue for ECMA262, there are known differences between the ways UNIX & Windows handle the timezone value.
Across multiple JS implementations, it seems that Date.prototype.toString writes the timezone (in parens) in a long, locale-dependent form on Windows, but in a short form (2-4 letters) from the tz database on Unix-based OSes. More details in the V8 bug tracker.
The spec is really light on details for Date.prototype.toString:
Return an implementation-dependent String value that represents tv as a date and time in the current time zone using a convenient, human-readable form.
Does anyone have a good memory of why this is the definition? Looks like it goes all the way back to ES1.
Fortunately, it seems that, at this point, implementations have converged on something that's almost always the same, with the exception of the timezone string.
For the timezone string, would it be a good idea to pick one of the two alternatives and standardize it across all platforms? Does anyone have evidence one way or the other whether either of the two is likely to be more web-compatible, or whether we need to preserve the variation?
Additionally, it looks like there is still active discussion in the V8 Issues for Date.prototype.toString() normalization.
Going through the NodeJS change logs for v10+, there doesn't seem to be an explicit mention of this.
Update
After digging through V8 commits, it looks like there is a new Timezone Names Cache implemented for performance in V8 when using Date.prototype.toString(). Based on the below excerpt from the message for this commit, it seems like this change is why there is a difference between Node v8 & Node v11
To speed up Date.prototype.toString(), this patch adds a cache in the
DateCache for the string short name representing the time zone.
Because time zones in a particular location just have two short names
(for DST and standard time), and the DateCache already understands
whether a time is in DST or not, it is possible to keep the result of
OS::LocalTimezone around and select between the two based on whether
the time is DST or not.
In local microbenchmarks (calling Date.prototype.toString() in a
loop), I observed a 6-10% speedup with this patch. In the browser, the
speedup may be even greater as the system call needs to do some extra
work to break out of the sandbox. I don't think the microbenchmark is
extremely unrealistic; in any real program which calls
Date.prototype.toString() multiple times, the cache should hit almost
all of the time, as time zone changes are rare.
The proximate motivation for this patch was to enable ICU as a backend
for timezone information, which is drafted at
https://codereview.chromium.org/2724373002/ The ICU implementation of
OS::LocalTimezone is even slower than the system call one, but this
patch makes their performance indistinguishable on the microbenchmark.
In the tz database, many timezones actually do have a number of
different historical names. For example, America/Anchorage went
through a number of changes, from AST to AHST to YST to AKST. However,
both ICU and the Linux OS interfaces just report the modern timezone
name in tests for the appropriate timezone name, even for historical
times. I can see why this would be:
- For ICU, CLDR only has two short names in the data file: the one for dst and non-dst
- For Linux, the timezone names do seem to make it into the /etc/localtime file. However, glibc assumes there are only two
relevant names and selects between them, as you can see in its
implementation of localtime_r:
http://bazaar.launchpad.net/~vcs-imports/glibc/master/view/head:/time/tzset.c#L573
So, this cache should be valid until we switch to a more accurate
source of short timezone names.

Java and JavaScript timestamps are not the same

I've got a problem with timestamps between Java and JavaScript.
I already found these two questions about timestamps, and I know about the time changes over the years.
Timestamp deviation Java vs Javascript for old dates (3600secs)
Why is subtracting these two times (in 1927) giving a strange result?
Basically at midnight at the end of 1927, the clocks went back 5
minutes and 52 seconds. So "1927-12-31 23:54:08" actually happened
twice, and it looks like Java is parsing it as the later possible
instant for that local date/time.
The problem is that when I take the timestamp from Java and use it in JavaScript, I get a different date than the Java date. I need to show the correct date on the webpage. I know I can request the date as a string, but I prefer using a timestamp.
Java date 0001-01-01 timestamp is -62135773200000
JavaScript date 0001-01-01 timestamp is -62135596800000
The difference is -176400000; 49 hours.
Does anybody know what I can do about this?
Personally, I would avoid passing numerical timestamps around from a system in one language to a system in another language for the sole reason that the languages may differ in the algorithm they use to generate them.
There is an international standard in place (ISO-8601) to deal with passing timestamps from system to system. In this your date representation becomes 0001-01-01T00:00:00+00:00. I would recommend using this approach, as it's a widely accepted solution for this very problem.
This might be related to TZ and DST settings that differ between the browser and Java. To nail it down, I recommend using ISO-8601 formats like 2008-02-01T09:00:22+05, which are unambiguous.
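A minimal sketch of the ISO-8601 exchange both answers recommend; the Java serializer named in the comment is an assumption:

// The server sends an ISO-8601 string (e.g. from Java's Instant.toString())
// instead of a raw epoch number; JavaScript parses the standard format
// unambiguously.
const iso = '0001-01-01T00:00:00Z';
const d = new Date(iso);
console.log(d.getTime()); // -62135596800000, matching the JavaScript value above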

For a given hour, how to find the timezones where it is that hour

So to clarify why I want to do this: I need to send push notifications to users when it's 7pm in their timezone.
For each registered device, I have a timezone string, like "Europe/Paris".
I'm creating a background job which will run every hour. It should fetch the list of users for which it's 7pm, and send them a notification.
So the question I'd like to answer is:
"Where in the world is it 7pm now"
Edit: The important thing is to get the timezone; even if it's not formatted like "Europe/Paris", I can do that conversion manually with an array.
There's no built-in support in JavaScript for converting the "standard" timezone names (e.g. Europe/London) to timezone offsets.
You mention push notifications so that suggests you're not running in a browser. If you're using Node.js there's a good library I've used called timezone which uses a local set of timezone spec files to handle conversion between timezones.
Note that timezone specs do sometimes change, for example when a national government decides with little notice that they're not doing daylight savings this year. It's vital that your local mappings are kept up to date accordingly.
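That said, on runtimes that ship full ICU you can make this check with the built-in Intl API and no extra library; a hedged sketch (the zone list stands in for whatever your device registry stores):

// For each stored IANA zone, ask Intl what the local hour is, and keep
// the zones where it is currently 19:00 (7pm).
function hourInZone(timeZone, date = new Date()) {
  return Number(new Intl.DateTimeFormat('en-US', {
    timeZone, hour: 'numeric', hour12: false,
  }).format(date));
}

const zones = ['Europe/Paris', 'America/New_York']; // from your device registry
const sevenPmZones = zones.filter(z => hourInZone(z) === 19);
console.log(sevenPmZones);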
There may be an easier way to do this, but: keep a database of users, each with a timezone, and store the hour difference between your notification server and that timezone. So if your server is in Pacific time and the user's timezone is Eastern, the difference is +3.
Use the difference between the hour you want and the hour it currently is on your server to determine the timezone. You want 16:xx and it is 13:xx, so you get a +3 hour difference; use that to look up the timezone, then push to all users associated with that timezone (see the sketch below).
There may be an easier way, but this is a really simple solution with a DB and a little SQL knowledge.
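A hedged sketch of that offset-lookup idea (the tz_offset_hours column and the schema are assumptions for illustration; half-hour zones and DST shifts need extra care):

// Compute how many hours ahead of the server the target zone must be,
// then look up users stored with that offset.
const serverHour = new Date().getHours(); // server's local hour, e.g. 13
const targetHour = 19;                    // notify at 7pm local time
const diff = ((targetHour - serverHour) % 24 + 24) % 24;
// Assumed query: SELECT user_id FROM users WHERE tz_offset_hours = :diff
console.log('Push to zones', diff, 'hours ahead of the server');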
EDIT: Also, if you don't mind the time delay of using a web API, you should check this out: https://developers.google.com/maps/documentation/timezone/
Whoops, the numbers were off with the timezones. Thanks Trent.

Is epoch time the same on the same machine with different scripts/programs

I want to use the shell to get an epoch time, and later use JavaScript on an HTML page to get another epoch time, and then take the difference between them. But I'm afraid the epoch time may not be synchronized among different scripts, which would make the difference useless.
So I want to know: if, at the very same time, I use the shell and JavaScript to get the epoch time, will the results be the same or not? If not, how big is the difference?
Thanks!
If you mean the number of seconds since the Unix epoch (1970-01-01T00:00:00Z), it is governed by this very definition. The only differences you should be able to see are caused by:
- different times of invocation of the system call that returns it *);
- unsynchronized clocks on different systems;
and possibly also:
- unsynchronized clocks on different processor cores/physical processors;
- implementation-dependent handling of the function that returns the current time (e.g. a JS engine might cache the value for a short time so as not to have to do the actual syscall, although I would doubt this).
Depending on the time resolution you need, some of these are not a problem. My guess is that if you don't need granularity finer than 1s, you should be more than fine (on the same machine).
*) Also note that on a single-core system you can't really get the same time (at least at ns resolution) from different syscalls, unless the kernel caches it, simply because they have to happen one after another.
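Both read the same system clock, so a quick sanity check on one machine (assuming a POSIX shell for the date command) looks like this:

// Shell side (run separately): `date +%s` prints whole seconds since the epoch.
// JavaScript side: Date.now() is milliseconds since the epoch, so divide by 1000.
const epochSeconds = Math.floor(Date.now() / 1000);
console.log(epochSeconds); // should match `date +%s` run at the same moment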
According to ECMA-262 section 15.9.1.1:
Time is measured in ECMAScript in milliseconds since 01 January, 1970
UTC. In time values leap seconds are ignored. It is assumed that there
are exactly 86,400,000 milliseconds per day.
So, yes, every JavaScript implementation that adheres to the standard must return exactly the same values, taking care of any physical clock quirks by itself. Barring a wrongly set system clock, you will have the same value everywhere.
Note that the definition of "epoch" and "seconds since" in other languages and systems can differ (at the very least, most other systems use seconds, not milliseconds, and leap-second handling varies), so you can't guarantee that a JS time, even divided by 1000, will match a timestamp from another platform or OS.

How to find browser date time format [duplicate]

I have set a deadline in UTC, as shown below, and I'm wondering what exactly the toLocaleString() method will do to it on user's local machines. For instance, will it account for daylight savings if they are in a timezone that recognizes it? Or will I need to insert additional code that checks where the user is, and then fixes the displayed time?
http://javascript.about.com/library/bldst.htm
var deadline = new Date('5/1/2013 ' + "16:15" + ' UTC');
alert(deadline.toLocaleString());
In general, the answer is yes. JavaScript will represent the UTC value at the appropriate local time based on the time zone settings of the computer it is running on. This includes adjustment for DST. However, as others have pointed out, the details are implementation specific.
If you want a consistent output, I would use a library to format your dates instead of relying on the default implementation. The best library (IMHO) for this is moment.js. The live examples on their main page will give you an idea of what it can do.
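For example, a small sketch with moment.js (the format string is just an illustration):

// moment.utc(...) parses the value as UTC, .local() converts it to the
// viewer's zone, and .format() renders it consistently across browsers.
const moment = require('moment');
const s = moment.utc('2013-05-01T16:15:00Z').local().format('LLLL');
console.log(s); // e.g. "Wednesday, May 1, 2013 6:15 PM" for a CEST viewer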
UPDATE
If you are passing UTC values that you want converted to the correct local time, and that time falls into a period where the time zone rules are different than the current one - then the results will be invalid. This is crazy, but true - and by design in the ECMA spec. Read - JavaScript Time Zone is wrong for past Daylight Saving Time transition rules
We don't know what exactly the toLocaleString method does (§15.9.5.5):
This function returns a String value. The contents of the String are
implementation-dependent, but are intended to represent the Date in
the current time zone in a convenient, human-readable form that
corresponds to the conventions of the host environment’s current
locale.
But yes, most implementations will consider DST if it is active in the current local timezone. For your example I'm getting "Mittwoch, 1. Mai 2013 18:15:00" - CEST.
Will I need to insert additional code that checks where the user is, and then fixes the displayed time?
I think you can trust toLocaleString - the browser should respect the user's settings. If you want to do it manually, check out timezone.js.
As you use "UTC", the date itself will be in UTC, but toLocaleString() takes the client's locale into account, which means it returns the date as a string adjusted for all of the client's regional and locale settings (DST, date/time format, etc.). As the JS documentation describes it: "The toLocaleString() method converts a Date object to a string, using locale settings." If you want to avoid this, use the toUTCString() method instead (see the sketch below). I'd also recommend reading the accepted solution for the question Javascript dates: what is the best way to deal with Daylight Savings Time? to avoid (at least, to try to avoid :) future issues related to JS, browsers and locales. Hope this helps!
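A short sketch of the two methods side by side, using an unambiguous UTC construction instead of the non-standard '5/1/2013 16:15 UTC' string:

// Date.UTC builds the instant explicitly (month is 0-based: 4 = May).
const deadline = new Date(Date.UTC(2013, 4, 1, 16, 15));
console.log(deadline.toUTCString());    // "Wed, 01 May 2013 16:15:00 GMT" everywhere
console.log(deadline.toLocaleString()); // varies with the viewer's locale and zone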
