Issue with the return value of getTime() - JavaScript

I have created a date variable pointing to 9th of July 2014
var date = new Date(2014, 6, 9);
When I try to obtain the time from this date, I expect that the time variable
var time = date.getTime();
would give me the milliseconds value of July 9th, 2014 at 00:00:00.
Instead it gives me
1404860400000
which is the milliseconds value of 8th July 2014 at 23:00:00.
Can someone explain why this happens?

Your code here:
var date = new Date(2014, 6, 9);
creates a Date instance initialized to midnight on July 9th, 2014 in your local time. Timestamp numbers (both JavaScript's milliseconds-since-the-Epoch and Unix's seconds-since-the-Epoch) are unaffected by timezones: the value always counts from midnight, January 1st, 1970 UTC.
If you were to construct this date:
var newDate = new Date(1404860400000);
...you'd have a date that was exactly equivalent to your first one. If you asked it for the local version of the moment it represented, it would say midnight on July 9th, 2014.
In both date and newDate above, if you ask for the UTC version of the date, you'll see it's offset from midnight (the direction depends on whether you are west or east of Greenwich, England). At the moment I'm writing this, almost no one is on GMT, not even those of us in the UK who normally are, because of Summer Time. For people whose timezone is never GMT, the value will always be offset.
If you want a Date instance giving you midnight on July 9th, 2014 UTC (i.e., not local time), use new Date(Date.UTC(2014, 6, 9)). Date.UTC gives you the time value for the given date in UTC, and if you feed that time value into new Date, you get a Date for it.
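For illustration, here is a minimal sketch contrasting the two constructions (the local-time output assumes a UTC+01:00 timezone, as in the question; yours will differ):
var local = new Date(2014, 6, 9);           // midnight July 9th, local time
var utc   = new Date(Date.UTC(2014, 6, 9)); // midnight July 9th, UTC
console.log(local.getTime()); // 1404860400000 in UTC+01:00 (varies with timezone)
console.log(utc.getTime());   // 1404864000000 everywhere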

January 1, 1970:
getTime() returns the number of milliseconds since January 1, 1970, 00:00:00 GMT represented by this Date object. A quick Google search for the documentation turns this up.

The Mozilla (MDN) docs usually cover questions like this quite well.
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date

Related

Explain JavaScript's Date() functions

Why is it when I have
var dt = new Date(2015, 6, 1);
dt.toUTCString()
My output is Tue, 30 Jun 2015 23:00:00 GMT
And
var dt = new Date(2015, 6, 2);
dt.toUTCString()
Wed, 01 Jul 2015 23:00:00 GMT
I'm clearly missing something here. I want to be able to loop through each day of the month and get a Date() for that day, and I don't understand why, if the day is 1, it says the date is the 30th.
JavaScript dates created with the new Date(year, month, day) constructor are interpreted in your local time zone. Using toUTCString() converts the time in the Date object to UTC, and apparently in your case that means shifting back one hour. If you want to initialize a Date object with UTC time, use:
var dt = new Date(Date.UTC(2015, 6, 1));
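Since the goal is to loop through each day of the month, here is a sketch that stays in UTC throughout, so the local timezone and DST can't shift the dates (assuming July 2015, as in the question):
var year = 2015, month = 6; // months are 0-based: 6 = July
// day 0 of the next month is the last day of this month
var daysInMonth = new Date(Date.UTC(year, month + 1, 0)).getUTCDate();
for (var day = 1; day <= daysInMonth; day++) {
  var dt = new Date(Date.UTC(year, month, day));
  console.log(dt.toUTCString()); // "Wed, 01 Jul 2015 00:00:00 GMT", etc.
}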
The toUTCString() method converts a Date object to a string, according to universal time.
Coordinated Universal Time (UTC) is the time set by the World Time Standard.
Note: UTC time is the same as GMT time.
Try a method other than dt.toUTCString(). There are many local times around the planet: when it is 5 o'clock in America, it may be 10 o'clock in Japan. UTC is one fixed reference, so if you want local output, use a local-time method such as toString() instead.

JavaScript getFullYear() Date method weird behavior

Can anyone explain why getFullYear does not return 2014?
console.log(new Date('2014-01-01').getFullYear()) //2013
console.log(new Date('2014-01-01').getUTCFullYear()) //2014
From MDN:
The dateString of "March 7, 2014" returns a different date than "2014-03-07" unless the local time-zone is UTC. When converting a dateString of "March 7, 2014" the local time-zone is assumed. When converting a dateString of "2014-03-07" the UTC time-zone is assumed. This results in two different Date values depending on the format of the string that is being converted.
So when you ask it to parse "2014-01-01", you're getting the time in UTC.
Then you call .getFullYear() on your object, which uses local time. If you live in the Eastern US like I do, local time on January 1st is five hours behind UTC (EST is UTC-05:00).
So here's what happens:
"2014-01-01" is parsed as midnight UTC, giving the time value 1388534400000.
.getFullYear() is called, and that same instant is interpreted in local time.
In the Eastern US, that instant is 7:00 PM on December 31st, 2013.
New Year's hasn't yet occurred locally at that instant, so the year is still 2013.
All of this implies that if we do something like
console.log(new Date('January 1, 2014').getUTCFullYear()); // 2014
console.log(new Date('January 1, 2014').getFullYear()); // 2014
We'll get the same year, because the long-form string is parsed in our local timezone, so midnight falls on New Year's Day locally; but the two dates are not equivalent:
console.log(new Date('January 1, 2014').getUTCHours()); // 5
console.log(new Date('January 1, 2014').getHours()); // 0
According to another explanation:
"The difference is when you specify a string in the format YYYY-MM-DD, you get a date that is 12am in the GMT timezone, and when you specify a date in a format like DD-MM-YYYY, you get a date that is 12am in your current timezone."
So basically, since you are specifying New Year's Day 2014, when it gets converted from GMT to your local time, it believes it is 12-31-13, not 01-01-14.
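A short sketch makes the format-dependent parsing concrete (output shown assuming a US Eastern timezone, UTC-05:00 in January; other zones will differ):
// ISO date-only strings are parsed as midnight UTC...
new Date('2014-01-01').toUTCString();      // "Wed, 01 Jan 2014 00:00:00 GMT"
// ...while long-form strings are parsed as midnight local time.
new Date('January 1, 2014').toUTCString(); // "Wed, 01 Jan 2014 05:00:00 GMT"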

What range of dates are permitted in Javascript?

What is the maximum and the minimum date that I can use with the Date object in Javascript?
Is it possible to represent ancient historical dates (like January 1, 2,500 B.C.) or dates that are far into the future (like October 7, 10,000)?
If these far-from-present dates can't be represented with the Date object, how should I represent them?
According to §15.9.1.1 of the ECMA-262 specification,
Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC.
...
The actual range of times supported by ECMAScript Date objects is ... exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.
So the earliest date representable with the Date object is fairly far beyond known human history:
new Date(-8640000000000000).toUTCString()
// Tue, 20 Apr 271,822 B.C. 00:00:00 UTC
The latest date is sufficient to last beyond Y10K and even beyond Y100K, but will need to be reworked a few hundred years before Y276K.
new Date(8640000000000000).toUTCString()
// Sat, 13 Sep 275,760 00:00:00 UTC
Any date outside of this range will return Invalid Date.
new Date(8640000000000001) // Invalid Date
new Date(-8640000000000001) // Invalid Date
In short, the JavaScript Date type will be sufficient for measuring time to millisecond precision within approximately 285,616 years before or after January 1, 1970. The dates posted in the question are very comfortably inside of this range.
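If you need to guard against out-of-range time values before constructing a Date, a simple bounds check against this spec limit is enough (a sketch; MAX_TIME_VALUE and isRepresentable are illustrative names, not built-ins):
// ±8,640,000,000,000,000 ms = ±100,000,000 days around the epoch
var MAX_TIME_VALUE = 8640000000000000;
function isRepresentable(ms) {
  return Math.abs(ms) <= MAX_TIME_VALUE;
}
isRepresentable(8640000000000000); // true
isRepresentable(8640000000000001); // false: new Date(...) would be an Invalid Date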

Minimum and maximum date

I was wondering which is the minimum and the maximum date allowed for a Javascript Date object. I found that the minimum date is something like 200000 B.C., but I couldn't get any reference about it.
Does anyone know the answer? I just hope that it doesn't depend on the browser.
An answer in "epoch time" (i.e., milliseconds from 1970-01-01 00:00:00 UTC+00) would be best.
From the spec, §15.9.1.1:
A Date object contains a Number indicating a particular instant in time to within a millisecond. Such a Number is called a time value. A time value may also be NaN, indicating that the Date object does not represent a specific instant of time.
Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. In time values leap seconds are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day. ECMAScript Number values can represent all integers from –9,007,199,254,740,992 to 9,007,199,254,740,992; this range suffices to measure times to millisecond precision for any instant that is within approximately 285,616 years, either forward or backward, from 01 January, 1970 UTC.
The actual range of times supported by ECMAScript Date objects is slightly smaller: exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.
The exact moment of midnight at the beginning of 01 January, 1970 UTC is represented by the value +0.
The third paragraph is the most relevant. Based on that paragraph, we can get the precise earliest date per spec from new Date(-8640000000000000), which is Tuesday, April 20th, 271,821 BCE (BCE = Before Common Era, i.e., the year -271,821).
To augment T.J.'s answer, exceeding the min/max values generates an Invalid Date.
let maxDate = new Date(8640000000000000);
let minDate = new Date(-8640000000000000);
console.log(new Date(maxDate.getTime()).toString());     // the maximum date
console.log(new Date(maxDate.getTime() - 1).toString()); // still valid
console.log(new Date(maxDate.getTime() + 1).toString()); // Invalid Date
console.log(new Date(minDate.getTime()).toString());     // the minimum date
console.log(new Date(minDate.getTime() + 1).toString()); // still valid
console.log(new Date(minDate.getTime() - 1).toString()); // Invalid Date
A small correction to the accepted answer: the year of the minimum date is actually 271,822 BCE, as you can see by running the following snippet:
console.log(new Date(-8640000000000000).toLocaleString("en", {"year": "numeric", "era": "short"}))
Indeed, year -271,821 is actually 271,822 BCE because JavaScript's Date (along with ISO 8601) uses astronomical year numbering, which uses a year zero.
Thus, year 1 is 1 CE, year 0 is 1 BCE, year -1 is 2 BCE, etc.
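To make the numbering concrete, here is a small sketch (eraYear is an illustrative helper, not part of the Date API) that converts an astronomical year to a labelled era year:
// Astronomical year numbering: year 0 = 1 BCE, year -1 = 2 BCE, ...
function eraYear(astronomicalYear) {
  return astronomicalYear > 0
    ? astronomicalYear + " CE"
    : (1 - astronomicalYear) + " BCE";
}
eraYear(2014);                                         // "2014 CE"
eraYear(0);                                            // "1 BCE"
eraYear(new Date(-8640000000000000).getUTCFullYear()); // "271822 BCE"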
On a related note, if you want the minimum and maximum timestamps within the current day (in UTC):
var startingDate = new Date().toISOString().split('T')[0] + "T00:00:00.000Z"; // e.g. "2022-07-25T00:00:00.000Z"
var endingDate = new Date().toISOString().split('T')[0] + "T23:59:59.999Z";   // e.g. "2022-07-25T23:59:59.999Z"
Note that 1970-01-01 00:00:00 UTC returns 0: it is the zero point of the time scale (the epoch), not the lowest possible date, since earlier dates are represented by negative time values.
new Date('1970-01-01T00:00:00.000Z') // returns Thu Jan 01 1970 01:00:00 GMT+0100 (Central European Standard Time)
new Date('1970-01-01T00:00:00.000Z').getTime() // returns 0
new Date('1970-01-01T00:00:00.001Z').getTime() // returns 1

DST problem storing date in Javascript Date object

I have this simple Javascript code
var d = new Date(2011, 9, 8);
alert(d); // Shows: Fri Oct 7 23:00:00 UTC-0400 2011
Important: my time zone is "Santiago de Chile", and I set my computer's clock to 2-Oct-2011.
The alert shows that the day is 7! Why? How can I get it right? (The problem occurs only on this day.)
According to this page, the DST change in Chile always happens on the second full weekend of October and starts at midnight (let's disregard this year's extended DST, as it's very likely your computer doesn't know about it).
That means that, as the script I linked to earlier also shows, in your timezone midnight on 8 Oct 2011 can't be represented in UTC-0300, so it automatically switches to UTC-0400 to represent the same moment.
I can reproduce your issue in my timezone (Amsterdam, UTC+1/+2) by asking for Sun 27 March 2011, 2:00 AM, the moment at which summer time (CEST, UTC+0200) starts and the clock jumps from UTC+0100 to UTC+0200 (note that DST falls in the opposite months in the Northern Hemisphere):
new Date(2011, 2, 27, 2, 0, 0, 0);
which effectively can't be represented in UTC+0200, so the system chooses UTC+0100 and outputs
Sun Mar 27 2011 01:00:00 GMT+0100 (CET)
So, it's the same time in Unix time, only represented differently in your local timezone.
And yes, I know 8 October is a Saturday and DST change starts at Sunday, midnight, but this is the closest I can get.
Update: I can now reproduce your issue by setting the timezone of a Windows XP machine to Santiago, letting the OS adjust automatically for DST, and alerting new Date(2011, 9, 9) (which is a Sunday and is shown as a Saturday; there must be something strange with your settings for it to happen on the 8th). Of course, you can't control these settings on the client.
To work around this issue, I'd have to know what exactly you need. If you merely want a date/time display that doesn't take timezone and DST into account, just use UTC, like:
var d = new Date(Date.UTC(2011, 9, 9));
alert(d.toUTCString()); // Shows: Sun, 09 Oct 2011 00:00:00 GMT
To show/calculate with the individual parts of the Date object, use the corresponding UTC methods.
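For example, a sketch of reading the parts in UTC, where DST can never shift them:
var d = new Date(Date.UTC(2011, 9, 9));
d.getUTCFullYear(); // 2011
d.getUTCMonth();    // 9 (October; months are 0-based)
d.getUTCDate();     // 9
d.getUTCDay();      // 0 (Sunday)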
