What is the maximum and the minimum date that I can use with the Date object in Javascript?
Is it possible to represent ancient historical dates (like January 1, 2,500 B.C.) or dates that are far into the future (like October 7, 10,000)?
If these far-from-present dates can't be represented with the Date object, how should I represent them?
According to §15.9.1.1 of the ECMA-262 specification,
Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC.
...
The actual range of times supported by ECMAScript Date objects is ... exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.
So the earliest date representable with the Date object is fairly far beyond known human history:
new Date(-8640000000000000).toUTCString()
// "Tue, 20 Apr -271821 00:00:00 GMT", i.e. 20 April, 271,822 B.C.
The latest date is sufficient to last beyond Y10K and even beyond Y100K, but will need to be reworked a few hundred years before Y276K.
new Date(8640000000000000).toUTCString()
// "Sat, 13 Sep 275760 00:00:00 GMT"
Any time value outside this range produces an Invalid Date.
new Date(8640000000000001) // Invalid Date
new Date(-8640000000000001) // Invalid Date
In short, the JavaScript Date type will be sufficient for measuring time to millisecond precision within approximately 285,616 years before or after January 1, 1970. The dates posted in the question are very comfortably inside of this range.
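If you genuinely need instants outside this range, a Date simply can't hold them. One possible fallback (a sketch with a hypothetical helper, not a standard API) is to keep the calendar fields as plain numbers and only convert to a Date when the instant is representable:
function toDateIfRepresentable(year, month, day) {
  const d = new Date(0);
  d.setUTCFullYear(year, month - 1, day); // avoids the constructor's 0-99 -> 1900-1999 remapping
  return Number.isNaN(d.getTime()) ? { year, month, day } : d;
}
toDateIfRepresentable(-2499, 1, 1);   // January 1, 2500 B.C. (astronomical year -2499) -> in range, a Date
toDateIfRepresentable(-300000, 1, 1); // out of range -> plain record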
Related
I'm trying to construct a Date from a year, month, day tuple, however I'd like to be able to construct "historic" dates as well. For instance Nero's birthday: 15 December 37. The straightforward implementation
function constructDate(year, month, day) {
  return new Date(year, month - 1, day);
}
however returns Wed Dec 15 1937 00:00:00 GMT+0100 for constructDate(37, 12, 15), because of the constructor's magic of treating years in the range 0-99 as 1900-1999. I'm trying to find a good way to bypass this behavior.
A naive approach would be to construct a temporary date with an arbitrary year, and then call setFullYear() to get the actual target date. However, I have a few concerns:
Trying to do so for Nero's birthday in Firefox returns Tue Dec 15 0037 00:00:00 GMT+0053. What has happened to the time zone offset there?
I'm not sure whether there are subtleties, e.g. when constructing February 29: if the target year is a leap year but the temporary year isn't, i.e. doesn't have a February 29, am I in trouble?
Internally, constructing from year/month/day requires computing the underlying Unix epoch timestamp. With the setFullYear() workaround there would be two timestamp computations, one of them in vain.
Is there a way to construct such a Date without a temporary / computing the timestamp twice internally?
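As far as I know there is no constructor form that bypasses the remapping, but the usual workaround (a sketch; it doesn't remove the second internal timestamp computation, though it does sidestep the February 29 worry) is to let setFullYear() overwrite year, month, and day in a single call:
function constructDate(year, month, day) {
  const d = new Date(2000, 0, 1);      // temporary value; its fields don't matter
  d.setFullYear(year, month - 1, day); // sets all three components at once
  return d;
}
constructDate(37, 12, 15); // Dec 15 0037, not 1937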
I was executing the statements below in the Node.js REPL, and I got two different results for the same date:
var dateStr1 = "2015/03/31";
var dateStr2 = "2015-03-31";
var date1 = new Date(dateStr1);//gives Tue Mar 31 2015 00:00:00 GMT+0530 (IST)
var date2 = new Date(dateStr2);//gives Tue Mar 31 2015 05:30:00 GMT+0530 (IST)
In the first, the hours, minutes, and seconds are all zero, while in the second the time is set to the local time zone's offset from UTC, which is 5:30.
It boils down to how Date.parse() handles the ISO-8601 date format.
The date time string may be in ISO 8601 format. For example, "2011-10-10" (just date) or "2011-10-10T14:48:00" (date and time) can be passed and parsed. The UTC time zone is used to interpret arguments in ISO 8601 format that do not contain time zone information (note that ECMAScript ed 6 draft specifies that date time strings without a time zone are to be treated as local, not UTC)
Your first date format 2015/03/31 is assumed to be March 31st, 2015 at 12am in your current time zone.
Your second date format 2015-03-31 is seen as ISO-8601 and is assumed to be March 31st, 2015 at 12am UTC time zone.
The "Differences in assumed time zone" heading from the linked documentation goes into a bit more detail:
Given a date string of "March 7, 2014", parse() assumes a local time zone, but given an ISO format such as "2014-03-07" it will assume a time zone of UTC. Therefore Date objects produced using those strings will represent different moments in time unless the system is set with a local time zone of UTC. This means that two date strings that appear equivalent may result in two different values depending on the format of the string that is being converted (this behavior is changed in ECMAScript ed 6 so that both will be treated as local).
The first string, "2015/03/31", is an unsupported format according to the ECMAScript standard. Whenever an unsupported value is passed to the constructor its behaviour is implementation dependent, i.e. the standard doesn't say what the implementation must do. Some browsers, like Firefox, try to guess what the format is and apparently it creates a date object with that date at midnight local time. Other browsers may return NaN or interpret the parts differently.
The second string, "2015-03-31", is a properly formatted ISO 8601 date. For these strings there are well defined rules and all browsers will interpret it as that date, midnight, UTC.
I have created a date variable pointing to the 9th of July, 2014:
var date = new Date(2014, 6, 9);
When I try to obtain the time from this date, I expect that the time variable
var time = date.getTime();
would give me the milliseconds value of July 9th, 2014 at 00:00:00.
Instead it gives me
1404860400000
which is the milliseconds value of 8th July 2014 at 23:00:00.
Can someone explain to me why this is?
Your code here:
var date = new Date(2014, 6, 9);
creates a Date instance initialized to midnight on July 9th, 2014 in your local time zone. Timestamp numbers (both JavaScript's milliseconds-since-the-epoch and Unix's seconds-since-the-epoch) are unaffected by time zones; the value is always measured from midnight, January 1st, 1970 UTC.
If you were to construct this date:
var newDate = new Date(1404860400000);
...you'd have a date that was exactly equivalent to your first one. If you asked it for the local version of the moment it represented, it would say midnight on July 9th, 2014.
In both date and newDate above, if you ask for the UTC version of the date, you'll see it's offset from midnight (the direction depends on whether you are west or east of Greenwich, England). At the moment I'm writing this, almost no one is on GMT, not even those of us in the UK who normally are, because of Summer Time. But for most people who are never in GMT, the value will always be offset.
If you want a Date instance giving you midnight on July 9th, 2014 UTC (e.g., not local time), use new Date(Date.UTC(2014, 6, 9)). Date.UTC gives you the time value for the given date in UTC, and then if you feed that time value into new Date, you get a Date for it.
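For illustration, a small sketch contrasting the two constructions (the comments assume a UTC+01:00 zone, matching the question):
const local = new Date(2014, 6, 9);           // midnight July 9th, 2014, local time
const utc   = new Date(Date.UTC(2014, 6, 9)); // midnight July 9th, 2014, UTC
local.getTime(); // 1404860400000 in UTC+01:00 (one hour before UTC midnight)
utc.getTime();   // 1404864000000 in every time zone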
getTime() returns the number of milliseconds since January 1, 1970, 00:00:00 GMT represented by this Date object.
The Mozilla docs usually cover documentation issues like this quite well.
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date
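As a quick sanity check of what getTime() measures (nothing here beyond the standard API):
new Date(Date.UTC(1970, 0, 1)).getTime();   // 0 -- the epoch itself
new Date(Date.UTC(1970, 0, 2)).getTime();   // 86400000 -- one day of milliseconds
new Date(Date.UTC(1969, 11, 31)).getTime(); // -86400000 -- instants before the epoch are negative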
I need to translate an integer representing the number of days since 01.01.1601 (as of 6th November 2012: 150422) to a javascript Date object.
Each year has approximately 365.242199 days, so the calculation should be as follows:
var daysPerYear = 365.242199;
var daysSince = 150422;
var year = 1601 + Math.floor(daysSince / daysPerYear); // correct, gives me 2012
var days = Math.floor(daysSince % daysPerYear); // wrong, gives me 307
Now I create the Date object:
var date = new Date(year, 0);
date.setDate(days);
The date now points to 'Fri Nov 02 2012 00:00:00 GMT+0100 (CET)' which is off by about 4 days.
What is wrong with my calculation? Is there an easier way to obtain the Date object?
Clone out a copy of OpenCOBOL 1.1, and look through libcob/intrinsic.c for computations.
See cob_intr_date_of_integer in particular.
For an SVN readonly checkout
svn checkout svn://svn.code.sf.net/p/open-cobol/code/trunk open-cobol-code
or browse to
http://sourceforge.net/projects/open-cobol/files/open-cobol/1.1/open-cobol-1.1.tar.gz/download
JavaScript dates are measured from midnight on 1st January, 1970 UTC. If you call new Date().getTime(), for example, you'll get back the number of milliseconds from that point. Therefore, to convert your dates from 1st January, 1601, you need to calculate the exact number of milliseconds between 1/1/1601 and 1/1/1970 and subtract that offset from your date (also converted into milliseconds).
This way, all you are doing is adding numbers together and won't suffer from any floating point inaccuracies or error from your approximations.
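A sketch of that approach (it assumes your integer counts 1601-01-01 as day 0; shift by one if your source counts it as day 1):
const MS_PER_DAY = 86400000;
// Exact span from 1601-01-01 to 1970-01-01, both midnight UTC.
const EPOCH_OFFSET_MS = Date.UTC(1970, 0, 1) - Date.UTC(1601, 0, 1); // 11644473600000
function dateFromDaysSince1601(days) {
  return new Date(days * MS_PER_DAY - EPOCH_OFFSET_MS);
}
dateFromDaysSince1601(150422).toUTCString(); // a day in early November 2012, with no rounding drift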
I was wondering which is the minimum and which is the maximum date allowed for a JavaScript Date object. I found that the minimum date is something like 200,000 B.C., but I couldn't find any reference for it.
Does anyone know the answer? I just hope that it doesn't depend on the browser.
An answer in "epoch time" (= milliseconds from 1970-01-01 00:00:00 UTC+00) would be the best.
From the spec, §15.9.1.1:
A Date object contains a Number indicating a particular instant in time to within a millisecond. Such a Number is called a time value. A time value may also be NaN, indicating that the Date object does not represent a specific instant of time.
Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. In time values leap seconds are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day. ECMAScript Number values can represent all integers from –9,007,199,254,740,992 to 9,007,199,254,740,992; this range suffices to measure times to millisecond precision for any instant that is within approximately 285,616 years, either forward or backward, from 01 January, 1970 UTC.
The actual range of times supported by ECMAScript Date objects is slightly smaller: exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.
The exact moment of midnight at the beginning of 01 January, 1970 UTC is represented by the value +0.
The third paragraph is the most relevant: based on it, we can get the precise earliest date per the spec from new Date(-8640000000000000), which is Tuesday, April 20th, 271,821 BCE (BCE = Before Common Era, i.e., the year -271,821).
To augment T.J.'s answer, exceeding the min/max values generates an Invalid Date.
let maxDate = new Date(8640000000000000);
let minDate = new Date(-8640000000000000);
console.log(new Date(maxDate.getTime()).toString());
console.log(new Date(maxDate.getTime() - 1).toString());
console.log(new Date(maxDate.getTime() + 1).toString()); // Invalid Date
console.log(new Date(minDate.getTime()).toString());
console.log(new Date(minDate.getTime() + 1).toString());
console.log(new Date(minDate.getTime() - 1).toString()); // Invalid Date
A small correction of the accepted answer; the year of the minimum date is actually 271,822 BCE, as you can see when running the following snippet:
console.log(new Date(-8640000000000000).toLocaleString("en", {"year": "numeric", "era": "short"}))
Indeed, year -271,821 is actually 271,822 BCE because JavaScript's Date (along with ISO 8601) uses astronomical year numbering, which uses a year zero.
Thus, year 1 is 1 CE, year 0 is 1 BCE, year -1 is 2 BCE, etc.
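A small demonstration (setUTCFullYear() is used to avoid the constructor's remapping of two-digit years):
const d = new Date(0);
d.setUTCFullYear(0, 0, 1); // astronomical year 0
d.toLocaleString("en", { year: "numeric", era: "short" }); // "1 BC"
d.setUTCFullYear(-1, 0, 1); // astronomical year -1
d.toLocaleString("en", { year: "numeric", era: "short" }); // "2 BC"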
var startingDate = new Date().toISOString().split('T')[0] + "T00:00:00.001Z"; // start of today, e.g. "2022-07-25T00:00:00.001Z"
var endingDate = new Date().toISOString().split('T')[0] + "T23:59:59.999Z"; // end of today, e.g. "2022-07-25T23:59:59.999Z"
As you can see, 1970-01-01T00:00:00Z returns 0, but that makes it the zero point of the epoch, not the lowest possible date; earlier instants simply have negative millisecond values, down to the -8,640,000,000,000,000 limit shown above.
new Date('1970-01-01T00:00:00.000Z') // returns Thu Jan 01 1970 01:00:00 GMT+0100 (Central European Standard Time)
new Date('1970-01-01T00:00:00.000Z').getTime() // returns 0
new Date('1970-01-01T00:00:00.001Z').getTime() // returns 1