How to construct a historic Date (year 0-100) in JavaScript? - javascript

I'm trying to construct a Date from a year, month, day tuple, however I'd like to be able to construct "historic" dates as well. For instance Nero's birthday: 15 December 37. The straightforward implementation
function constructDate(year, month, day) {
return new Date(year, month-1, day);
}
however returns Wed Dec 15 1937 00:00:00 GMT+0100 for constructDate(37, 12, 15), because the constructor treats years in the range 0–99 as 1900–1999. I'm trying to find a good way to bypass this behavior.
A naive approach would be to construct a temporary date with an arbitrary year, and then call setFullYear() to get the actual target date. However, I have a few concerns:
Trying to do so for Nero's birthday in Firefox, returns Tue Dec 15 0037 00:00:00 GMT+0053. What has happened to the time zone offset there?
I'm not sure if there are subtleties, e.g. when constructing February 29. If the target year is a leap year but the temporary year isn't, i.e. doesn't have a February 29, am I in trouble?
Internally, constructing from year/month/day requires computing the underlying Unix epoch timestamp. With the setFullYear() workaround there would be two timestamp computations, one of them wasted.
Is there a way to construct such a Date without a temporary / computing the timestamp twice internally?
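One workaround sketch (not the only possibility): setFullYear() optionally accepts month and day arguments, so all three fields can be set in a single call, which sidesteps the leap-year rollover concern about the temporary date:

```javascript
// A sketch of the setFullYear() workaround. Because year, month and day
// are set in one call, the temporary date's own month/day never interact
// with the target values (no February 29 rollover problem).
function constructDate(year, month, day) {
  const d = new Date(0);        // temporary value; every field is overwritten
  d.setHours(0, 0, 0, 0);       // local midnight
  d.setFullYear(year, month - 1, day); // setters do NOT remap years 0-99
  return d;
}

const nero = constructDate(37, 12, 15);
console.log(nero.getFullYear()); // 37, not 1937
```

Note this still performs the second internal timestamp computation the question asks about; it only avoids the rollover subtleties.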

Related

Javascript Date, setFullYear changes timezone

JavaScript Dates work with timezones. If I create a date, it sets the timezone. When I update the year of that date, I don't expect the timezone to change. It does, however. Worse, it changes the timezone but not the time, causing the actual time to shift by one hour!
This causes an issue: if I have a person's birth date and want to know their birthday this year, I cannot simply set the year to the current year using birthdate.setFullYear(2018), because it returns the birthday minus one hour. That means it falls on the day before the actual birthday, at eleven o'clock.
let test = new Date('1990-10-20');
console.log(test);
console.log(test.toISOString().substring(0, 10));
// 1990-10-20 ( Sat Oct 20 1990 01:00:00 GMT+0100 (Central European Standard Time) )
test.setFullYear(2000);
console.log(test);
console.log(test.toISOString().substring(0, 10));
// 2000-10-19 ( Fri Oct 20 2000 01:00:00 GMT+0200 (Central European Summer Time) === one hour too soon!! )
It might be that your timezone does not reproduce, here is my output:
"1990-10-20T00:00:00.000Z"
1990-10-20
"2000-10-19T23:00:00.000Z"
2000-10-19
The only workaround I found is to take a substring of the date and replace the year as a string value. How can I do this better using the Date object?
Hm, maybe it would be better to use setUTCFullYear(...) (MDN Docs) instead? At least in that case, it won't mess up the time.
let test = new Date('1990-10-20');
console.log(test);
console.log(test.toISOString().substring(0, 10));
// "1990-10-20T00:00:00.000Z")
test.setUTCFullYear(2000);
console.log(test);
console.log(test.toISOString().substring(0, 10));
// "2000-10-20T00:00:00.000Z"
BEWARE THE TIMEZONE If you want to work with just-a-date using the date object (and you more or less have to) your date objects should represent UTC midnight at the start of the date in question. In your case, to indicate a year, use UTC midnight at the start of the 1st January. This is a common and necessary convention, but it requires a lot of attention to make sure that the timezone doesn't creep back in, as you've discovered.
As for "JavaScript dates work with timezones": a JavaScript Date is a moment in time (milliseconds since the epoch) with handy methods for converting that moment into a meaningful string in the local timezone (or a specified timezone). The Date object itself does not have a timezone property.
So, you can create your UTC midnight year with something like...
var myDate = new Date(Date.UTC(2018,0,1)); // months are zero-indexed !
Then serialize it specifying UTC...
myDate.toISOString();
myDate.toLocaleDateString("en", {timeZone: "UTC"}); // note: the option is timeZone, not timezone
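Putting the pieces together, the "birthday this year" case from the question can be handled entirely in UTC so no offset can creep in. The helper name below is made up for illustration:

```javascript
// Sketch: shift a UTC-midnight birth date to a target year using the
// UTC setter, so the time-of-day (and thus the calendar day) is preserved.
function birthdayInYear(birthdate, year) {
  const d = new Date(birthdate.getTime()); // copy; don't mutate the input
  d.setUTCFullYear(year);
  return d;
}

const birth = new Date('1990-10-20'); // date-only strings parse as UTC midnight
console.log(birthdayInYear(birth, 2018).toISOString().substring(0, 10));
// "2018-10-20"
```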

Not getting the correct date when converting timestamp to date in javascript

I'm not used to working with dates in JavaScript. I am trying to understand how they work, and to do that, I am converting a string to a timestamp and then back to a string.
So the first step is getting the timestamp:
const dateTime = new Date('2012-06-08').getTime()
console.log(dateTime) //1339113600000
Then, I am using this timestamp to generate a new date
const d = new Date(dateTime)
console.log(d) //Fri Jun 08 2012 02:00:00 GMT+0200 (W. Europe Daylight Time)
Everything looks good until there. Now I want to separate the date components (year, month, day)
console.log(d.getFullYear()) //2012
console.log(d.getMonth()) //5
console.log(d.getDay()) //5
The date is correct, but the month and the days are wrong. I don't get why it's returning 5th of May instead of 8th of June.
Does anyone have the answer to that?
Thanks!
The getDay() method returns the day of the week. You want getDate, which returns the day of the month.
Months start at 0: 0 is January, 5 is June.
JavaScript is returning everything correctly: it returns 5 as the month because of the way it indexes months (0: January, 1: February, 2: March...), so June has an index of 5.
Now about the day: what you actually want is the date. What you're asking for is the day of the week, so use getDate() instead.
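A quick illustration of the difference, using the UTC getters so the output does not depend on the local timezone:

```javascript
const d = new Date('2012-06-08'); // parsed as 2012-06-08T00:00:00Z

console.log(d.getUTCFullYear()); // 2012
console.log(d.getUTCMonth());    // 5 -> June (months are zero-indexed)
console.log(d.getUTCDate());     // 8 -> day of the month
console.log(d.getUTCDay());      // 5 -> Friday (day of the week, 0 = Sunday)
```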

Issue about the return value of getTime()

I have created a date variable pointing to 9th of July 2014
var date = new Date(2014, 6, 9);
When I try to obtain the time from this date, I expect that the time variable
var time = date.getTime();
would give me the value milliseconds value of July 9th 2014 at 00:00:00.
Instead it gives me
1404860400000
which is the milliseconds value of 8th July 2014 at 23:00:00.
Can someone explain to me why this is?
Your code here:
var date = new Date(2014, 6, 9);
creates a Date instance initialized to your local time of midnight on July 9th, 2014. Timestamp numbers (both JavaScript's milliseconds-since-the-Epoch and Unix's seconds-since-the-epoch) are unaffected by timezones, the value is since midnight Jan 1 1970 UTC.
If you were to construct this date:
var newDate = new Date(1404860400000);
...you'd have a date that was exactly equivalent to your first one. If you asked it for the local version of the moment it represented, it would say midnight on July 9th, 2014.
In both date and newDate above, if you ask for the UTC version of the date, you'll see it's offset from midnight (the direction depends on whether you are west or east of Greenwich, England). At the moment I'm writing this, almost no one is on GMT, not even those of us in the UK who normally are, because of Summer time. But for most people who are never in GMT, the value will always be offset.
If you want a Date instance giving you midnight on July 9th, 2014 UTC (e.g., not local time), use new Date(Date.UTC(2014, 6, 9)). Date.UTC gives you the time value for the given date in UTC, and then if you feed that time value into new Date, you get a Date for it.
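A small sketch contrasting the two constructions: the local-time Date and the UTC Date differ by exactly the local timezone offset at that instant, which is what produced the "8th July at 23:00" value in the question.

```javascript
const local = new Date(2014, 6, 9);           // local midnight, July 9th 2014
const utc   = new Date(Date.UTC(2014, 6, 9)); // UTC midnight, July 9th 2014

// The difference between the two time values is the local offset in minutes
// (positive west of Greenwich, negative east), matching getTimezoneOffset():
const offsetMs = local.getTime() - utc.getTime();
console.log(offsetMs / 60000 === local.getTimezoneOffset()); // true
```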
getTime() returns the number of milliseconds since January 1, 1970, 00:00:00 GMT represented by this Date object.
Google.
The Mozilla docs usually cover documentation issues like this quite well.
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date

What range of dates are permitted in Javascript?

What is the maximum and the minimum date that I can use with the Date object in Javascript?
Is it possible to represent ancient historical dates (like January 1, 2,500 B.C.) or dates that are far into the future (like October 7, 10,000)?
If these far-from-present dates can't be represented with the Date object, how should I represent them?
According to §15.9.1.1 of the ECMA-262 specification,
Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC.
...
The actual range of times supported by ECMAScript Date objects is ... exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.
So the earliest date representable with the Date object is fairly far beyond known human history:
new Date(-8640000000000000).toUTCString()
// Tue, 20 Apr 271,822 B.C. 00:00:00 UTC
The latest date is sufficient to last beyond Y10K and even beyond Y100K, but will need to be reworked a few hundred years before Y276K.
new Date(8640000000000000).toUTCString()
// Sat, 13 Sep 275,760 00:00:00 UTC
Any date outside of this range will return Invalid Date.
new Date(8640000000000001) // Invalid Date
new Date(-8640000000000001) // Invalid Date
In short, the JavaScript Date type will be sufficient for measuring time to millisecond precision within approximately 285,616 years before or after January 1, 1970. The dates posted in the question are very comfortably inside of this range.
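One practical consequence: an out-of-range Date has a time value of NaN, so a range/validity check can be written against getTime(). A minimal sketch (the function name is made up):

```javascript
// An invalid Date (out of range or unparseable) carries NaN as its time
// value, so isNaN(date.getTime()) is a reliable validity check.
function isValidDate(date) {
  return !isNaN(date.getTime());
}

console.log(isValidDate(new Date(8640000000000000)));  // true  (last representable instant)
console.log(isValidDate(new Date(8640000000000001)));  // false (out of range)
```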

The fifteenth of February isn't found

I'm in javascript, running this in the console
d = new Date();
d.setMonth(1);
d.setFullYear(2009);
d.setDate(15);
d.toString();
outputs this:
"Sun Mar 15 2009 18:05:46 GMT-0400 (EDT)"
Why would this be happening? It seems like a browser bug.
That's because when you initialize a new Date, it starts with today's date. Today is Oct 30, 2008, so when you set the month to February you get February 30, which doesn't exist and rolls over into March. Set the day first, then the month, then the year:
d = new Date();
d.setDate(15);
d.setMonth(1);
d.setFullYear(2009);
But as Jason W says, it's better to use the Date constructor:
new Date(year, month, date [, hour, minute, second, millisecond ]);
It's probably best to construct a Date object in one step to avoid the Date object being in an ambiguous or invalid state:
d = new Date(2009, 1, 15);
d = new Date();
d.setDate(15);
d.setMonth(1);
d.setFullYear(2009);
d.toString();
This works.
After a bunch of testing in FF3 on XP with Firebug, here are the things I can tell you:
Calling Date.setDate() after calling Date.setMonth() will generate this odd behavior.
Date.setMonth() forces the timezone to be CST (or, some non DST-aware zone)
Date.setDate() forces the timezone to be CDT (or, some DST-aware zone)
So, there's definitely something wonky going on with setMonth() and setDate() in respect to the timezone.
The only solution I can offer is this: Set the date before you set the month.
This will work generally to avoid the rollover behavior of the javascript Date API:
d.setDate(1);
d.setFullYear(year);
d.setMonth(month);
d.setDate(day);
Given that year + month + day are in a "valid" combination, e.g. taken from another Date object using getFullYear(), getMonth(), getDate().
The important parts are:
starting with setDate(1) to avoid possible rollover when the current date value is 29, 30 or 31
call setMonth(month) before setDate(day), to avoid the same rollover in case the month the date currently holds is too short for the target day (otherwise the initial setDate(1) guard would already have been undone)
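The recipe above, wrapped as a hypothetical helper (the name is made up for illustration):

```javascript
// Sketch of the rollover-safe field-setting order described above.
// month is zero-indexed, as with the Date constructor.
function setYmd(d, year, month, day) {
  d.setDate(1);         // guard: avoid rollover if the current day is 29-31
  d.setFullYear(year);
  d.setMonth(month);    // month before day, in case the target month is short
  d.setDate(day);
  return d;
}

const d = setYmd(new Date(), 2009, 1, 15); // February is month 1
console.log(d.toDateString()); // "Sun Feb 15 2009"
```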
