I have seen that Date.UTC() returns the number of milliseconds since January 1, 1970, 00:00:00, universal time, for the given date.
If I do new Date(Date.UTC(1900, 0, 1)).toString() I get
Sun Dec 31 1899 23:45:16 GMT-0014 (Central European Standard Time)
Why?
Test here:
const date = new Date(1900, 0, 1);
console.log(date.toString())
console.log("==========================");
const date2 = new Date(Date.UTC(1900, 0, 1));
console.log(date2.toString())
toString() will return your local time, based on your OS configuration.
If you want UTC, then use toUTCString():
console.log(new Date(Date.UTC(1900, 0, 1)).toString())    // rendered in your local time zone
console.log(new Date(Date.UTC(1900, 0, 1)).toUTCString()) // Mon, 01 Jan 1900 00:00:00 GMT
Related
I create a date in two ways:
new Date('some date').getTime();
new Date().getTime('some date');
I did this before I had read on MDN that Date.prototype.getTime() doesn't take a parameter, which means the second way is wrong. Nevertheless, it gives the same date value as the right way (new Date('*some date*').getTime();), but the number of milliseconds is different and I don't get why.
Could someone explain this to me?
(function () {
let dateToCount = "Jan 01, 2022 00:00:00";
let date1 = new Date(dateToCount).getTime();
let date2 = new Date().getTime(dateToCount);
console.log(Date(date1).toString()); // Tue Oct 19 2021 22:41:59 GMT+0300 (Eastern European Summer Time)
console.log(Date(date2).toString()); // Tue Oct 19 2021 22:41:59 GMT+0300 (Eastern European Summer Time)
console.log(`date1 = ${date1} ms`); // date1 = 1640988000000 ms
console.log(`date2 = ${date2} ms`); // date2 = 1634672519002 ms
console.log(`date1 - date2 = ${+date1 - (+date2)} ms`); // date1 - date2 = 6315480998 ms
})();
"it gives the same date value the right way gives"
No, it doesn't - it's just that when you were debugging with console.log(Date(date1).toString()); you fell into yet another trap: missing the new operator in the call to Date. As MDN puts it:
Calling the Date() function (without the new keyword) returns a string representation of the current date and time, exactly as new Date().toString() does. Any arguments given in a Date() function call (without the new keyword) are ignored; regardless of whether it’s called with an invalid date string — or even called with any arbitrary object or other primitive as an argument — it always returns a string representation of the current date and time.
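A minimal sketch of that difference (the timestamp below is arbitrary, just for illustration):
const ts = 1640988000000;              // some arbitrary timestamp
console.log(Date(ts));                 // string for the *current* date and time; the argument is ignored
console.log(new Date(ts).toString());  // string for the date the timestamp actually represents
console.log(typeof Date(ts));          // "string"
console.log(typeof new Date(ts));      // "object"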
So if you fix that as well, you'll realise that the two different millisecond values you get back from getTime() actually do represent two different dates:
const dateToCount = "Jan 01, 2022 00:00:00";
const date1 = new Date(dateToCount).getTime();
const date2 = new Date().getTime(dateToCount);
console.log(new Date(date1).toString()); // Sat Jan 01 2022 00:00:00, as expected
console.log(new Date(date2).toString()); // Surprise! This is just the current date and time, because getTime() ignored its argument
console.log(`date1 = ${date1} ms`);
console.log(`date2 = ${date2} ms`);
How to convert a JavaScript Date object to another time zone, where the result must be a Date object with the correct time zone
let date = new Date();
console.log(date);
date = date.toLocaleString('en-US', { timeZone: 'America/Vancouver' });
date = new Date(date);
console.log(date);
That gives the following result. The last result line is correct as a date/time, but the time zone is incorrect: it is still GMT-0500 (Colombia Standard Time), while it should be GMT-0800 (Pacific Standard Time).
Wed Jan 20 2021 00:14:11 GMT-0500 (Colombia Standard Time)
Tue Jan 19 2021 21:14:11 GMT-0500 (Colombia Standard Time)
You may try this:
let date = new Date();
console.log(date);
date = date.toLocaleString("en-CA", {
timeZone: "America/Vancouver",
timeZoneName: "long",
});
console.log(date);
Output:
Wed Jan 20 2021 09:18:16 GMT+0300 (Arabian Standard Time)
2021-01-19, 10:18:16 p.m. Pacific Standard Time
Once you get the correct time zone, you may change how the date and time are displayed by string manipulation if you need to.
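For instance, a minimal sketch that formats the Vancouver time through toLocaleString options instead of slicing the string (the option values are just one possible choice):
let vancouver = new Date().toLocaleString("en-CA", {
  timeZone: "America/Vancouver",
  year: "numeric",
  month: "2-digit",
  day: "2-digit",
  hour: "2-digit",
  minute: "2-digit",
  second: "2-digit",
  hour12: false,
});
console.log(vancouver); // e.g. "2021-01-19, 22:18:16"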
Update:
This may not look pretty, but I believe it should satisfy the requirements:
let date = new Date().toLocaleString("en-US", {
timeZone: "America/Vancouver",
timeZoneName: "short",
});
let date1 = new Date(date);
// adding a new property to the Date object called tz and initializing it to null
Date.prototype.tz = null;
// setting the tz value to the time zone output from toLocaleString
date1.tz = date.slice(date.length - 3);
console.log(date1.toISOString() + " " + date1.tz);
console.log(date);
console.log(typeof date1);
Output:
2021-01-20T09:01:06.000Z PST
1/20/2021, 1:01:06 AM PST
object
What I've done is create a new property on the date object to replace the built-in time zone of Date, hence you get an object with a user-specified time zone.
I'm using EJS to compare the current date and an event date (data from the database).
if (event[i].date >= new Date()) {
render this html
}
The problem is that event[i].date always has its time at 00:00:00, for example:
Fri Jan 23 2015 00:00:00 GMT-0200 (BRST)
And when i try to get the new Date(), now for example, this will happen:
Fri Jan 23 2015 01:28:42 GMT-0200 (BRST)
So new Date() is greater than event[i].date, and this makes the HTML not render.
How can I set the time of new Date() to 0?
Thanks!
You can do the following:
var a = new Date();
a.setHours(0, 0, 0, 0);
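Plugging that into the comparison from the question, a minimal sketch (the event date is hard-coded here as an assumption):
var now = new Date();
now.setHours(0, 0, 0, 0);              // local midnight today: hours, minutes, seconds and ms all zeroed
var eventDate = new Date(2015, 0, 23); // e.g. Fri Jan 23 2015 00:00:00 local time
if (eventDate >= now) {
  // render this html
}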
Why are these two dates different:
var date1 = new Date();
date1.setFullYear(2012); // year (four digits)
date1.setMonth(10); // month (from 0-11)
date1.setDate(1); // day of the month (from 1-31)
var date2 = new Date(2012, 10, 1, 0, 0, 0, 0);
Result:
Date 1 : Sat Dec 01 2012 14:56:16 GMT+0100
Date 2 : Thu Nov 01 2012 00:00:00 GMT+0100
whereas these two dates are equal:
var date3 = new Date();
date3.setFullYear(2012); // year (four digits)
date3.setMonth(9); // month (from 0-11)
date3.setDate(1); // day of the month (from 1-31)
var date4 = new Date(2012, 9, 1, 0, 0, 0, 0);
Result:
Date 3 : Mon Oct 01 2012 14:56:16 GMT+0200
Date 4 : Mon Oct 01 2012 00:00:00 GMT+0200
Another question is why date1.setMonth(10) gives a date in December (it should be November).
Finally got it. new Date() sets the date to the current date and time. In other words, October 31st (at the time of this writing).
When you then try to set the month to November, what's it to do? November only has 30 days... so it wraps it round to December.
If you change the order so that you set the day-of-month before the month, it works:
var date1 = new Date();
date1.setFullYear(2012); // year (four digits)
date1.setDate(1); // day of the month (from 1-31)
date1.setMonth(10); // month (from 0-11)
Or as implied by jbabey's answer:
var date1 = new Date();
date1.setFullYear(2012); // year (four digits)
date1.setMonth(10, 1); // month (from 0-11) and day (1-31)
The documentation isn't terribly clear, but it's at least suggestive:
If a parameter you specify is outside of the expected range, setMonth attempts to update the date information in the Date object accordingly. For example, if you use 15 for monthValue, the year will be incremented by 1 (year + 1), and 3 will be used for month.
("Accordingly" is far from precise, but it means the implementation is at least arguably correct...)
setMonth accepts a second parameter:
If you do not specify the dayValue parameter, the value returned from the getDate method is used.
When you set the month to 10 (November), it grabs the current day value (31) and sets that as the day. Since there are only 30 days in November, it rolls you over to December 1st.
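A minimal sketch of that rollover, with the start date hard-coded (rather than the current date) so the result is reproducible:
var d = new Date(2012, 9, 31); // Oct 31 2012
d.setMonth(10);                // keeps day 31, but November has only 30 days...
console.log(d.toString());     // ...so it rolls over to Dec 01 2012
d = new Date(2012, 9, 31);
d.setMonth(10, 1);             // passing the day as well avoids the rollover
console.log(d.toString());     // Thu Nov 01 2012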
You're creating a var containing the current date (new Date()) and then you're changing some of its fields (year, month and day).
On the other hand new Date(2012, 10, 1, 0, 0, 0, 0) means "create a date object with those exact values".
And that's why your date objects aren't equal.
I am trying to understand more about the Date object in JavaScript.
I thought that when you call valueOf(), you get the number of milliseconds since January 1, 1970.
So what I would expect is that the following should return exactly zero:
alert((new Date(1970, 1, 1).valueOf() )/ ( 86400 * 1000));
but it does not, it returns 30.958333333333332. What am I missing?
gr,
Coen
new Date(1970, 1, 1) actually is Feb. Months are zero-indexed. Try changing it to new Date(1970, 0, 1).
Second parameter, month, starts with 0, so you need to do:
alert((new Date(1970, 0, 1).valueOf() )/ ( 86400 * 1000));
but even with this you'll still see your offset from GMT (the expression above is in days, so the offset shows up as a fraction of a day).
The value you posted says you are at GMT+1 :)
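The arithmetic behind that, as a quick check (assuming a GMT+1 zone, as in the posted value):
// new Date(1970, 1, 1) is Feb 1, 1970 in *local* time.
// In a GMT+1 zone that is Jan 31, 1970 23:00 UTC: 31 days minus 1 hour.
console.log(31 - 1 / 24);                               // 30.958333333333332
console.log(new Date(1970, 1, 1).valueOf() / 86400000); // same value when run in a GMT+1 zone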
If you're looking to work with the Unix epoch time, you have a few options:
UTC() - Returns the number of milliseconds in a date string since midnight of January 1, 1970, according to universal time
setTime() - Sets a date and time by adding or subtracting a specified number of milliseconds to/from midnight January 1, 1970
parse() - Parses a date string and returns the number of milliseconds since midnight of January 1, 1970
getTime() - Returns the number of milliseconds since midnight Jan 1, 1970
valueOf() returns a primitive of the value; I'd stay away from it and work with the above options.
source: http://www.w3schools.com/jsref/jsref_obj_date.asp.
edit: also, you're asking for Feb 1, 1970
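Putting a few of the methods listed above together, a minimal sketch (the ISO string is just an illustrative input):
var epochMs = Date.UTC(1970, 0, 1);                // 0 - milliseconds since the epoch, in universal time
var parsedMs = Date.parse("1970-01-01T00:00:00Z"); // 0 - the same value, from a date string
var d = new Date();
d.setTime(0);                                      // point an existing Date at the epoch
console.log(epochMs, parsedMs, d.getTime());       // 0 0 0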
use this, it's dangerous to go alone:
var d=new Date(1970, 0, 1);
document.write(d.getTime());
or
var d = Date.parse("Jan 1, 1970"); // Note, we don't use the NEW keyword.
document.write(d);
Remember, the epoch is Wed Dec 31 1969 19:00:00 GMT-0500. If you use .getTime() you'll see the UTC time Thu, 01 Jan 1970 00:00:00 GMT+0000.
The method you are looking for is .getTime(), not .valueOf().
Months are zero based in Date objects.
January 1st, 1970 is new Date(1970, 0, 1), since months start at 0 = January.
The first of january 1970 with the Date object is new Date(1970, 0, 1)
It was the month that should have been 0, in combination with an hour difference from GMT:
alert((new Date(1970, 0, 1, 1, 0, 0, 0).valueOf()));
produces 0
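An equivalent check that doesn't depend on your local offset at all (a minimal sketch using Date.UTC):
// Date.UTC builds the timestamp in universal time, so no hour correction is needed
alert(new Date(Date.UTC(1970, 0, 1)).valueOf()); // 0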