I am trying to understand more about the Date object in JavaScript.
I thought that when you call valueOf(), you get the number of milliseconds since January 1, 1970.
So what I would expect is that the following should return exactly zero:
alert((new Date(1970, 1, 1).valueOf() )/ ( 86400 * 1000));
But it does not; it returns 30.958333333333332. What am I missing?
Regards,
Coen
new Date(1970, 1, 1) is actually February 1st; months are zero-indexed. Try changing it to new Date(1970, 0, 1).
The second parameter, the month, starts at 0, so you need to do:
alert((new Date(1970, 0, 1).valueOf() )/ ( 86400 * 1000));
But even with this you'll get your offset from GMT, expressed here as a fraction of a day, since the division by 86400 * 1000 converts milliseconds to days.
The value you posted says you are at GMT+1 : )
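You can see this directly (a sketch; the commented output assumes a system clock set to GMT+1):
var days = new Date(1970, 0, 1).valueOf() / (86400 * 1000);
console.log(days); // -0.041666..., i.e. minus one hour expressed in days
console.log(new Date(1970, 0, 1).getTimezoneOffset()); // -60 minutes on GMT+1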
If you're looking to work with Unix epoch time, you have a few options:
UTC(): returns the number of milliseconds since midnight of January 1, 1970, according to universal time, for the given date components (note it takes numeric year, month, and day arguments, not a date string)
setTime(): sets a date and time by adding or subtracting a specified number of milliseconds to/from midnight January 1, 1970
parse(): parses a date string and returns the number of milliseconds since midnight of January 1, 1970
getTime(): returns the number of milliseconds since midnight January 1, 1970
valueOf() returns a primitive representation of the value; I'd stay away from it and work with the options above.
Source: http://www.w3schools.com/jsref/jsref_obj_date.asp
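A minimal sketch exercising those methods (parsing of a non-ISO string like this is implementation-dependent, so the GMT suffix is spelled out explicitly):
console.log(Date.UTC(1970, 0, 1)); // 0: milliseconds since the epoch, in UTC
console.log(Date.parse("Jan 1, 1970 GMT")); // 0: string parsed to milliseconds
var d = new Date();
d.setTime(0); // set d to the epoch
console.log(d.getTime()); // 0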
Edit: also, you're asking for Feb 1, 1970.
use this, it's dangerous to go alone:
var d = new Date(1970, 0, 1);
document.write(d.getTime());
or
var d = Date.parse("Jan 1, 1970"); // Note: no new keyword; without a timezone the string is parsed as local time
document.write(d);
Remember, the epoch is Wed Dec 31 1969 19:00:00 GMT-0500 in my local (GMT-5) time, which is Thu, 01 Jan 1970 00:00:00 GMT+0000 in UTC; .getTime() counts milliseconds from that UTC instant.
The method you are looking for is .getTime(), not .valueOf().
Months are zero based in Date objects.
January 1st, 1970 is new Date(1970, 0, 1), since months start at 0 = January.
The first of January 1970 with the Date object is new Date(1970, 0, 1).
It was the month that should have been 0, in combination with the one-hour difference from GMT:
alert((new Date(1970, 0, 1, 1, 0, 0, 0).valueOf()));
produces 0
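Note that the extra hour argument only yields 0 on a GMT+1 machine; a timezone-independent check is to build the date in UTC instead (a sketch):
alert(new Date(Date.UTC(1970, 0, 1)).valueOf()); // 0 in every timezone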
Related
I have seen that Date.UTC() returns the number of milliseconds since January 1, 1970, 00:00:00, universal time.
If I do new Date(Date.UTC(1900, 0, 1)).toString() I get
Sun Dec 31 1899 23:45:16 GMT-0014 (hora estándar de Europa central)
Why?
Test here:
const date = new Date(1900, 0, 1);
console.log(date.toString())
console.log("==========================");
const date2 = new Date(Date.UTC(1900, 0, 1));
console.log(date2.toString())
toString() will return your local time, based on your OS configuration. The odd GMT-0014 offset comes from historical timezone data: before standard time was adopted around 1900, Madrid used Local Mean Time, 14 minutes and 44 seconds behind GMT, which is exactly the 23:45:16 you see.
If you want UTC, then use toUTCString():
console.log(new Date(Date.UTC(1900, 0, 1)).toString())
console.log(new Date(Date.UTC(1900, 0, 1)).toUTCString())
I'm trying to get the last day of the previous month using the current date:
var myDate = new Date();
According to MDN:
if 0 is provided for dayValue, the date will be set to the last day of the previous month.
But when I set the date to zero:
myDate.setDate(0)
console.log(JSON.stringify(myDate));
I get "2021-08-01T01:18:34.021Z" which first day of the current month. What is wrong with this approach?
JSON.stringify() is serializing the timestamp with a Z timezone suffix, indicating UTC. The difference between UTC and your local timezone is causing the date to roll over to the next day.
You can use toLocaleString() to print the date in your local timezone:
var myDate = new Date();
myDate.setDate(0);
console.log(myDate.toLocaleString());
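A side-by-side sketch of the two representations (the commented output assumes a GMT-4 system clock and an en-US locale):
var myDate = new Date(2021, 7, 5, 21, 18); // Thu Aug 05 2021 21:18, local time
myDate.setDate(0); // Sat Jul 31 2021 21:18, local time
console.log(JSON.stringify(myDate)); // "2021-08-01T01:18:00.000Z", rolled over in UTC
console.log(myDate.toLocaleString()); // "7/31/2021, 9:18:00 PM", correct locally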
I would use dateInstance.toString() or dateInstance.toLocaleString():
const myDate = new Date();
myDate.setDate(0);
myDate.setHours(0, 0, 0, 0);
console.log(myDate.toString());
console.log(myDate.toLocaleString());
You can use the date-fns package:
var df = require("date-fns")
let myDate = new Date() //Thu Aug 05 2021 22:16:09
let lastDayOfPrevMonth = df.endOfMonth(df.subMonths(myDate, 1)) //Sat Jul 31 2021 23:59:59 GMT-0400
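For comparison, a dependency-free sketch of the same thing: day 0 of a month is the last day of the month before it.
var now = new Date();
var lastDayOfPrevMonth = new Date(now.getFullYear(), now.getMonth(), 0);
console.log(lastDayOfPrevMonth.toString()); // e.g. Sat Jul 31 2021 00:00:00 GMT-0400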
I'm using EJS to compare the current date and an event date (data from a database).
if (event[i].date >= new Date()) {
render this html
}
The problem is that event[i].date always has the time set to zero, for example:
Fri Jan 23 2015 00:00:00 GMT-0200 (BRST)
And when I get new Date() now, for example, this happens:
Fri Jan 23 2015 01:28:42 GMT-0200 (BRST)
So new Date() is greater than event[i].date, which makes the HTML not render.
How can I set the time of new Date() to zero?
Thanks!
You can do the following:
var a = new Date();
a.setHours(0, 0, 0, 0);
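Applied to the comparison from the question (a sketch; event is assumed to be the array from your template), setHours(0, 0, 0, 0) zeroes the hours, minutes, seconds, and milliseconds in one call:
var today = new Date();
today.setHours(0, 0, 0, 0);
if (event[i].date >= today) {
  // render this html
}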
I want to print a Date in the ISO-8601 format YYYY-MM-DDTHH:mm:ss.sssZ, so I used the following lines of code, but I am getting unexpected output:
var date = new Date(2012, 10, 30, 6, 51);
print('UTC Format: '+date.toGMTString());
print('toString() method: '+date.toString());
print('toJSON() method: '+date.toJSON()); // prints hours and minutes incorrectly
print('to UTCString() method: ' + date.toUTCString());
The corresponding output is:
UTC Format: Fri, 30 Nov 2012 01:21:00 GMT
toString() method: Fri Nov 30 2012 06:51:00 GMT+0530 (India Standard Time)
toJSON() method: 2012-11-30T01:21:00.000Z
to UTCString() method: Fri, 30 Nov 2012 01:21:00 GMT
The toJSON() method prints the hours and minutes incorrectly, but toString() prints them correctly; I want to know the reason for that.
Do I have to add a time offset to the Date object, and if so, how?
var date = new Date();
console.log(date.toJSON(), new Date(date.getTime() - (date.getTimezoneOffset() * 60000)).toJSON());
date.toJSON() prints the UTC date as a JSON-formatted (ISO-8601) string.
If you want your local time to be printed, you have to use getTimezoneOffset(), which returns the offset in minutes. You have to convert this value to milliseconds and subtract it from the timestamp of your date:
var date = new Date(2012, 10, 30, 6, 51);
new Date(date.getTime() - (date.getTimezoneOffset() * 60000)).toJSON()
In a previous version of this answer, the offset was erroneously added instead of subtracted.
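A worked example of that correction (a sketch; the commented values assume an India Standard Time, UTC+05:30, system clock). Note the trailing Z is then technically a lie, since the string shows local wall-clock time rather than UTC:
var date = new Date(2012, 10, 30, 6, 51);
console.log(date.getTimezoneOffset()); // -330 minutes on IST
console.log(new Date(date.getTime() - (date.getTimezoneOffset() * 60000)).toJSON());
// "2012-11-30T06:51:00.000Z", the local wall-clock time the question expected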
Why are these two dates different:
var date1 = new Date();
date1.setFullYear(2012); // year (four digits)
date1.setMonth(10); // month (from 0-11)
date1.setDate(1); // day of the month (from 1-31)
var date2 = new Date(2012, 10, 1, 0, 0, 0, 0);
Result:
Date 1 : Sat Dec 01 2012 14:56:16 GMT+0100
Date 2 : Thu Nov 01 2012 00:00:00 GMT+0100
whereas these two dates are equal:
var date3 = new Date();
date3.setFullYear(2012); // year (four digits)
date3.setMonth(9); // month (from 0-11)
date3.setDate(1); // day of the month (from 1-31)
var date4 = new Date(2012, 9, 1, 0, 0, 0, 0);
Result:
Date 3 : Mon Oct 01 2012 14:56:16 GMT+0200
Date 4 : Mon Oct 01 2012 00:00:00 GMT+0200
Another question is why date1.setMonth(10) gives a date in December (it should be November).
Finally got it. new Date() sets the date to the current date and time. In other words, October 31st (at the time of this writing).
When you then try to set the month to November, what's it to do? November only has 30 days... so it wraps it round to December.
If you change the order so that you set the day-of-month before the month, it works:
var date1 = new Date();
date1.setFullYear(2012); // year (four digits)
date1.setDate(1); // day of the month (from 1-31)
date1.setMonth(10); // month (from 0-11)
Or as implied by jbabey's answer:
var date1 = new Date();
date1.setFullYear(2012); // year (four digits)
date1.setMonth(10, 1); // month (from 0-11) and day (1-31)
The documentation isn't terribly clear, but it's at least suggestive:
If a parameter you specify is outside of the expected range, setMonth attempts to update the date information in the Date object accordingly. For example, if you use 15 for monthValue, the year will be incremented by 1 (year + 1), and 3 will be used for month.
("Accordingly" is far from precise, but it means the implementation is at least arguably correct...)
setMonth accepts a second parameter:
If you do not specify the dayValue parameter, the value returned from the getDate method is used.
When you set the month to 10 (November), it grabs the current day value (31) and sets that as the day. Since there are only 30 days in November, it rolls you over to December 1st.
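A sketch that makes the rollover visible without depending on today's date:
var d = new Date(2012, 9, 31); // Wed Oct 31 2012
d.setMonth(10); // November only has 30 days...
console.log(d.toString()); // ...so it rolls over to Sat Dec 01 2012
d = new Date(2012, 9, 31);
d.setMonth(10, 1); // set the month and day together
console.log(d.toString()); // Thu Nov 01 2012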
You're creating a variable containing the current date (new Date()) and then changing some of its fields (year, month, and day).
On the other hand, new Date(2012, 10, 1, 0, 0, 0, 0) means "create a date object with those exact values".
And that's why your date objects aren't equal.