Minimum and maximum date - JavaScript

I was wondering what the minimum and maximum dates allowed for a JavaScript Date object are. I found a claim that the minimum date is something like 200,000 B.C., but I couldn't find any reference for it.
Does anyone know the answer? I just hope that it doesn't depend on the browser.
An answer in "epoch time" (= milliseconds since 1970-01-01 00:00:00 UTC+00) would be best.

From the spec, §15.9.1.1:
A Date object contains a Number indicating a particular instant in time to within a millisecond. Such a Number is called a time value. A time value may also be NaN, indicating that the Date object does not represent a specific instant of time.
Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. In time values leap seconds are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day. ECMAScript Number values can represent all integers from –9,007,199,254,740,992 to 9,007,199,254,740,992; this range suffices to measure times to millisecond precision for any instant that is within approximately 285,616 years, either forward or backward, from 01 January, 1970 UTC.
The actual range of times supported by ECMAScript Date objects is slightly smaller: exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.
The exact moment of midnight at the beginning of 01 January, 1970 UTC is represented by the value +0.
The third paragraph is the most relevant. Based on that paragraph, we can get the precise earliest date per the spec from new Date(-8640000000000000), which is Tuesday, April 20th, 271,821 BCE (BCE = Before Common Era, i.e., the year -271,821).

To augment T.J.'s answer, exceeding the min/max values generates an Invalid Date.
let maxDate = new Date(8640000000000000);  // latest representable instant
let minDate = new Date(-8640000000000000); // earliest representable instant
console.log(new Date(maxDate.getTime()).toString());     // valid
console.log(new Date(maxDate.getTime() - 1).toString()); // valid
console.log(new Date(maxDate.getTime() + 1).toString()); // Invalid Date
console.log(new Date(minDate.getTime()).toString());     // valid
console.log(new Date(minDate.getTime() + 1).toString()); // valid
console.log(new Date(minDate.getTime() - 1).toString()); // Invalid Date

A small correction to the accepted answer: the year of the minimum date is actually 271,822 BCE, as you can see by running the following snippet:
console.log(new Date(-8640000000000000).toLocaleString("en", {"year": "numeric", "era": "short"}))
Indeed, year -271,821 is actually 271,822 BCE because JavaScript's Date (along with ISO 8601) uses astronomical year numbering, which uses a year zero.
Thus, year 1 is 1 CE, year 0 is 1 BCE, year -1 is 2 BCE, etc.
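A small sketch showing both conventions side by side (exact Intl output may vary slightly by engine):
const minDate = new Date(-8640000000000000);
console.log(minDate.getUTCFullYear()); // -271821 (astronomical numbering, which has a year zero)
console.log(minDate.toLocaleString("en", {year: "numeric", era: "short", timeZone: "UTC"})); // "271822 BC" (era numbering, no year zero)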

var startingDate = new Date().toISOString().split('T')[0] + "T00:00:00.001Z"; // e.g. "2022-07-25T00:00:00.001Z", just after midnight UTC
var endingDate = new Date().toISOString().split('T')[0] + "T23:59:59.999Z"; // e.g. "2022-07-25T23:59:59.999Z", just before the next midnight UTC

As you can see, 1970-01-01T00:00:00.000Z returns 0; it is the zero point of the time scale (the epoch), not the lowest possible date.
new Date('1970-01-01T00:00:00.000Z') // returns Thu Jan 01 1970 01:00:00 GMT+0100 (Central European Standard Time)
new Date('1970-01-01T00:00:00.000Z').getTime() // returns 0
new Date('1970-01-01T00:00:00.001Z').getTime() // returns 1
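Note that negative time values are also valid and represent instants before the epoch:
new Date(-1).toISOString() // returns "1969-12-31T23:59:59.999Z"
new Date(-8640000000000000).toISOString() // returns "-271821-04-20T00:00:00.000Z", the actual minimum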

Related

Weird number when I use currentddate.setDate(currentddate.getDate() + 30) to add 30 days from today

I have the following code to add 30 days to the current date:
var currentddate = new Date();
alert(currentddate);
var currentddate2 = new Date();
alert(currentddate2.setDate(currentddate2.getDate() + 30));
Now the first alert returns the current date, as follows: Wed Jun 12 2019 22:15:49 GMT+0100 (British Summer Time). But the second alert returns this weird number: 1562966157303. Can anyone advise on this?
Even though setDate changes the date, its return value is:
The number of milliseconds between 1 January 1970 00:00:00 UTC and the
given date (the Date object is also changed in place)
See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date/setDate
If you alert(currentddate2) again, you'll see that it was changed correctly.
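A minimal demonstration of the difference between the return value and the mutated object:
var d = new Date();
var ms = d.setDate(d.getDate() + 30); // returns the new time value in milliseconds
console.log(ms); // a number like 1562966157303
console.log(d.toString()); // the same Date object, now 30 days ahead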
If an API returns an unexpected result, first of all look into the docs!
Return value
The number of milliseconds between 1 January 1970 00:00:00 UTC and the given date (the Date object is also changed in place).

Issue with the return value of getTime()

I have created a date variable pointing to 9th of July 2014
var date = new Date(2014, 6, 9);
When I try to obtain the time from this date, I expect that the time variable
var time = date.getTime();
would give me the milliseconds value of July 9th 2014 at 00:00:00.
Instead it gives me
1404860400000
which is the milliseconds value of 8th July 2014 at 23:00:00.
Can someone explain to me why this happens?
Your code here:
var date = new Date(2014, 6, 9);
creates a Date instance initialized to midnight on July 9th, 2014, in your local time. Timestamp numbers (both JavaScript's milliseconds-since-the-epoch and Unix's seconds-since-the-epoch) are unaffected by timezones; the value is measured from midnight, 1 January 1970, UTC.
If you were to construct this date:
var newDate = new Date(1404860400000);
...you'd have a date that was exactly equivalent to your first one. If you asked it for the local version of the moment it represented, it would say midnight on July 9th, 2014.
In both date and newDate above, if you ask for the UTC version of the date, you'll see that it's offset from midnight (the direction depends on whether you are west or east of Greenwich, England). At the moment I'm writing this, almost no one is on GMT, not even those of us in the UK who normally are, because of Summer Time. And for most people, who are never on GMT, the value will always be offset.
If you want a Date instance giving you midnight on July 9th, 2014, in UTC (rather than local time), use new Date(Date.UTC(2014, 6, 9)). Date.UTC gives you the time value for the given date in UTC, and feeding that time value into new Date gives you a Date for it.
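A short sketch of the difference (the first output assumes a UTC+01:00 timezone, as in the question):
var local = new Date(2014, 6, 9); // midnight July 9th in local time
var utc = new Date(Date.UTC(2014, 6, 9)); // midnight July 9th in UTC
console.log(local.getTime()); // 1404860400000 in UTC+01:00 (one hour before UTC midnight)
console.log(utc.getTime()); // 1404864000000, regardless of the machine's timezone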
January 1, 1970. getTime() returns the number of milliseconds since January 1, 1970, 00:00:00 GMT represented by the Date object.
The Mozilla docs usually cover documentation issues like this quite well.
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date

ANSI date (COBOL Integer date) to current date

I need to translate an integer representing the number of days since 01.01.1601 (as of 6th November 2012: 150422) to a javascript Date object.
Each year has approximately 365.242199 days, so the calculation should be as follows:
var daysPerYear = 365.242199;
var daysSince = 150422;
var year = 1601 + Math.floor(daysSince / daysPerYear); // correct, gives me 2012
var days = Math.floor(daysSince % daysPerYear); // wrong, gives me 307
Now I create the Date object:
var date = new Date(year, 0);
date.setDate(days);
The date now points to 'Fri Nov 02 2012 00:00:00 GMT+0100 (CET)' which is off by about 4 days.
What is wrong with my calculation? Is there an easier way to obtain the Date object?
Clone out a copy of OpenCOBOL 1.1, and look through libcob/intrinsic.c for computations.
See cob_intr_date_of_integer in particular.
For an SVN read-only checkout:
svn checkout svn://svn.code.sf.net/p/open-cobol/code/trunk open-cobol-code
or browse to
http://sourceforge.net/projects/open-cobol/files/open-cobol/1.1/open-cobol-1.1.tar.gz/download
JavaScript dates are measured from midnight on 1st January, 1970. If you call new Date().getTime(), for example, you'll get back the number of milliseconds from that point. Therefore, to convert your dates from 1st January, 1601, you need to calculate the exact number of milliseconds between 1/1/1601 and 1/1/1970 and take the difference from your date (also converted into milliseconds).
This way, all you are doing is adding numbers together, and you won't suffer from any floating-point inaccuracies or errors from your approximations.
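Here's a minimal sketch of that approach, assuming the count treats 01.01.1601 as day zero (if your scheme counts it as day one, subtract one day):
var MS_PER_DAY = 24 * 60 * 60 * 1000;
function cobolIntegerToDate(daysSince1601) {
    // Date.UTC(1601, 0, 1) is the (negative) millisecond offset of 1601-01-01
    // from the 1970 epoch, so plain integer arithmetic reaches the target day.
    return new Date(Date.UTC(1601, 0, 1) + daysSince1601 * MS_PER_DAY);
}
console.log(cobolIntegerToDate(150422).toUTCString()); // Sun, 04 Nov 2012 00:00:00 GMT under this convention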

What range of dates are permitted in JavaScript?

What are the maximum and minimum dates that I can use with the Date object in JavaScript?
Is it possible to represent ancient historical dates (like January 1, 2,500 B.C.) or dates that are far into the future (like October 7, 10,000)?
If these far-from-present dates can't be represented with the Date object, how should I represent them?
According to §15.9.1.1 of the ECMA-262 specification,
Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC.
...
The actual range of times supported by ECMAScript Date objects is ... exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.
So the earliest date representable with the Date object lies far before recorded human history:
new Date(-8640000000000000).toUTCString()
// Tue, 20 Apr -271821 00:00:00 GMT (that is, April 20th, 271,822 B.C.)
The latest date is sufficient to last beyond Y10K and even beyond Y100K, but will need to be reworked a few hundred years before Y276K.
new Date(8640000000000000).toUTCString()
// Sat, 13 Sep 275760 00:00:00 GMT
Any date outside of this range will return Invalid Date.
new Date(8640000000000001) // Invalid Date
new Date(-8640000000000001) // Invalid Date
In short, the JavaScript Date type will be sufficient for measuring time to millisecond precision within approximately 285,616 years before or after January 1, 1970. The dates posted in the question are very comfortably inside of this range.

Tips for working with Pre-1000 A.D. Dates in JavaScript

I'm curious if anyone has good solutions for accurately building dates prior to the year 1000 A.D., particularly the years 1-100 A.D.
For example, if I want to build a date for the start of the 1st millennium, I can't just do...
new Date(Date.UTC(1,0,1,0,0,0,0));
because it tries to be "smart" and assumes that 1 means 1901, which gives me...
Sun Dec 31 1900 18:00:00 GMT-0600 (CST)
The same thing goes for the year 99...
new Date(Date.UTC(99,0,1,0,0,0,0));
which becomes
Thu Dec 31 1998 18:00:00 GMT-0600 (CST)
Thoughts?
I prefer:
var d = new Date(Date.UTC(year, month, day, hour, min, sec, 0));
d.setUTCFullYear(year);
This always works for all supported year values.
The setUTCFullYear() call fixes JavaScript's intentional bug, if you ask me.
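For example (a quick sketch), midnight UTC on 1 January 1 CE:
var d = new Date(Date.UTC(1, 0, 1, 0, 0, 0, 0)); // year 1 alone would be mapped to 1901
d.setUTCFullYear(1); // overwrite with the real year; no mapping is applied here
console.log(d.toISOString()); // "0001-01-01T00:00:00.000Z"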
Have you tried using the setUTC... functions on a date object after its creation?
setUTCDate()
setUTCFullYear()
setUTCMonth()
setUTCHours()
setUTCMinutes()
setUTCSeconds()
Here is the basic solution I came up with. If a date is prior to year 1000, I just add 1000 to it while constructing the date, then use setUTCFullYear() afterwards.
if (year >= 0 && year < 1000) {
    var d = new Date(Date.UTC(year + 1000, mon, day, hour, min, sec, 0));
    d.setUTCFullYear(d.getUTCFullYear() - 1000); // use the UTC year; getFullYear() can be off by one near year boundaries in non-UTC timezones
    return d;
}
1000 may be overkill since I was only having problems with pre-100 dates... but, whatever.
You have to set the year again, using setFullYear() or setUTCFullYear().
The Date type can store 285,616 years before and after 1 January 1970.
var d = new Date( 0000, 1, 29 ); // Thu Mar 01 1900 00:00:00 GMT+0100
d.setFullYear(-0004); // Wed Mar 01 -0004 00:00:00 GMT+0057
d.setFullYear( 0000, 1, 29 ); // Tue Feb 29 0000 00:00:00 GMT+0057
// Yes, year zero was a leap year
Explanation:
new Date(year [4-digit number; 0-99 map to 1900-1999], month [0-11], day [default 1], hours, minutes, seconds, millisecs) is the same as Date.UTC, but in the local timezone.
Hours, minutes and seconds are automatically filled with zeros.
A year from 0 to 99 is converted to 1900-1999 by default, so year 0 becomes 1900.
1900 is not a leap year, so Feb 29 is shifted to the next closest day, 1 Mar.
So we have to set the year to zero again, along with the month and day. We use the setFullYear() method with its optional month and day parameters; since year 0 is a leap year, Feb 29 is no longer shifted.
It's exactly what Date.UTC and the Date constructor (called with numbers as arguments) are specified to do. A simple workaround is to use Date.parse, which will not apply any corrections:
new Date(Date.parse('0001-01-04')); // Jan 4, 1 CE, midnight UTC
new Date(Date.parse('0001-01-04T18:00:00Z')); // Jan 4, 1 CE, 18:00 UTC
Make some arbitrary date d and call d.setUTCFullYear(myYear). This seems to work for me in Chrome's console.
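Roughly like this (a sketch; the year 50 is just an example):
var d = new Date(); // any arbitrary starting date
d.setUTCFullYear(50); // set year 50 CE directly; setUTCFullYear applies no 1900 mapping
console.log(d.toISOString()); // "0050-…", with today's month/day/time carried over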
Others have provided hints at a fix. The JavaScript Date object was copied from Java, so it has all of Java's bugs too. There is not much point in Gregorian dates before 1582 anyway, since various other calendars were in use before then.
