Is there any real case where getUTCFullYear() differs from getFullYear() in javascript?
The same goes for:
getUTCMonth() vs getMonth()
getUTCDate() vs getDate()
Am I missing something here?
EDIT:
See getUTCFullYear() documentation.
Is there any real case where getUTCFullYear() differs from getFullYear() in javascript?
The getUTC...() methods return the date and time in the UTC timezone, while the other functions return the date and time in the local timezone of the computer that the script is running on.
Sometimes it's convenient to have the date and time in the local timezone, and sometimes it's convenient to have the date and time in a timezone that's independent of the computer's local timezone.
There are ways to convert between timezones in JavaScript, but they're cumbersome.
So, I guess it's just for convenience.
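To see the two views of the same instant side by side, a minimal sketch (the local values depend entirely on the timezone of the machine running it):
var now = new Date();
console.log(now.getHours());    // hours in the machine's local timezone
console.log(now.getUTCHours()); // hours in UTC; the two differ by the local UTC offset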
Yes, around New Year. If your timezone is UTC-12, the values differ between noon and midnight local time on 31 December, because UTC is already into 1 January.
The functions you list essentially report the time in different timezones, so a case where getUTCFullYear() differs from getFullYear() arises on the evening of 31 December if you live west of the Greenwich meridian.
For example, in your local timezone the time could be 9pm on 31 December, but in UTC it might already be 2am on 1 January, so:
var d = new Date();
d.getFullYear() == 2009 //True
d.getUTCFullYear() == 2010 //True
It is confusing, since the methods operate on a Date object rather than reporting the current UTC time. But the method effectively asks: if this is the time here, what year is it in the UTC timezone? So for roughly 364.75 days of the year, the values reported by the two methods will be the same.
As 31 December rolls over to 1 January, this happens at different times in different parts of the world. Hence, the UTC date will differ from, say, the local date in New Zealand, so at some point getFullYear() will differ from getUTCFullYear() by 1.
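To make that concrete, here is a sketch; the comments assume the machine running it is set to a timezone west of Greenwich, for example UTC-5:
var d = new Date(Date.UTC(2010, 0, 1, 2, 0, 0)); // 2am UTC on 1 January 2010
console.log(d.getUTCFullYear()); // 2010
console.log(d.getFullYear());    // 2009 on a UTC-5 machine, where it is still 9pm on 31 December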
Related
I need to get the start of day and end of day. I have been reading that if I am going to do date stuff I should use momentjs. While I might go that route, right now I do not think I will need that much date manipulation, so I am going to try not to add more npm packages to this project than needed. I was reading that you can set a date to the start of day with .setHours(0,0,0,0), though when I try this in my terminal I see the hours get set to T07:00:00.000Z. Can someone explain why? It feels like it should be T00:00:00.000Z.
let date = new Date('2019-08-16T20:30:38Z');
date.setHours(0,0,0,0);
console.log(date);
I live in the Central timezone, UTC -5 this time of year, so I get T05:00:00.000Z when I run it. Since you live in the Pacific timezone (presumably), UTC -7, you get 7am UTC. You are setting the local time but outputting the time in UTC. From the documentation (https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date/setHours):
The setHours() method sets the hours for a specified date according to local time, and returns the number of milliseconds since January 1, 1970 00:00:00 UTC until the time represented by the updated Date instance
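If the goal is midnight UTC rather than local midnight, one option is the UTC variant of the same method; a sketch based on the snippet above:
let date = new Date('2019-08-16T20:30:38Z');
date.setUTCHours(0, 0, 0, 0);
console.log(date.toISOString()); // "2019-08-16T00:00:00.000Z" regardless of the machine's timezone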
Javascript Dates work with timezones. If I create a date, it sets the timezone. When I update the year of that date, I don't expect the timezone to change. It does, however. The worst thing is that it changes the timezone but not the time, causing the actual time to shift by one hour!
This causes the issue that if I have a person's birth date and want to know their birthday this year, I cannot simply set the year to the current year using birthdate.setFullYear(2018), because it will return the birthday minus one hour. That means it occurs one day before the actual birthday, at eleven o'clock.
let test = new Date('1990-10-20');
console.log(test);
console.log(test.toISOString().substring(0, 10));
// 1990-10-20 ( Sat Oct 20 1990 01:00:00 GMT+0100 (Central European Standard Time) )
test.setFullYear(2000);
console.log(test);
console.log(test.toISOString().substring(0, 10));
// 2000-10-19 ( Fri Oct 20 2000 01:00:00 GMT+0200 (Central European Summer Time) === one hour too soon!! )
It might be that this does not reproduce in your timezone; here is my output:
"1990-10-20T00:00:00.000Z"
1990-10-20
"2000-10-19T23:00:00.000Z"
2000-10-19
The only workaround I found is to take substrings of the date and replace the year as string values. How can I do this better using the Date object?
Hm, maybe it would be better to use setUTCFullYear(...) (MDN Docs) instead? At least in that case, it won't mess up the time.
let test = new Date('1990-10-20');
console.log(test);
console.log(test.toISOString().substring(0, 10));
// "1990-10-20T00:00:00.000Z")
test.setUTCFullYear(2000);
console.log(test);
console.log(test.toISOString().substring(0, 10));
// "2000-10-20T00:00:00.000Z"
BEWARE THE TIMEZONE. If you want to work with just-a-date using the Date object (and you more or less have to), your date objects should represent UTC midnight at the start of the date in question. In your case, to indicate a year, use UTC midnight at the start of 1 January. This is a common and necessary convention, but it requires a lot of attention to make sure that the timezone doesn't creep back in, as you've discovered.
As for "JavaScript dates work with timezones": a JavaScript date is a moment in time (ticks since the epoch) with handy methods for converting that moment into a meaningful string in the local timezone (or a specified timezone). The Date object itself does not have a timezone property.
So, you can create your UTC midnight year with something like...
var myDate = new Date(Date.UTC(2018,0,1)); // months are zero-indexed !
Then serialize it specifying UTC...
myDate.toISOString(); // "2018-01-01T00:00:00.000Z"
myDate.toLocaleDateString("en", { timeZone: "UTC" }); // "1/1/2018"; note the option key is timeZone
Would you recommend using
A) the Date-object as returned by new Date(...)
or
B) milliseconds since epoch as returned by Date.parse(...)
for handling dates in a client-side/browser Javascript application?
I would love to hear any experiences you've had, or pitfalls you've hit.
My worries are primarily comparisons between e.g. new Date() (in case A) and the dates I've received from the server -- or in case B that would be comparisons with Date.now().
The dates are instantiated from string values from the server, that are of the form 2011-10-10T14:48:00Z (ISO 8601)
I would use Date objects with a date format to parse and format dates, along the lines of this pattern (note this is Java's SimpleDateFormat; JavaScript has no built-in pattern-based formatter):
DateFormat format = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSX");
As with many things, it depends on your use cases.
Generally, you'll have a safer time using milliseconds since epoch, especially if you need to manipulate the date or store it somewhere outside of your code.
A date as milliseconds is safer to manipulate because it's a number and we have a lot of tools for manipulating numbers. You can use the converters provided by Date to switch between numbers and actual Date instances when you need to do something more fancy, like display the date, get the day of the week, etc.
A date as milliseconds is safer to store because, as you already know from your date strings coming from the server, most languages have a different implementation of how they structure and store dates, but most languages handle numbers the same way. Again, using milliseconds gives you flexibility to always convert the number to the Date implementation in whatever language in which you're working.
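For the comparisons you're worried about, a minimal sketch (assuming an engine that parses the ISO 8601 form your server sends, which all modern browsers do):
var serverMs = Date.parse('2011-10-10T14:48:00Z'); // milliseconds since epoch, as a plain number
if (serverMs < Date.now()) {
  console.log('the server timestamp is in the past');
}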
If you're interested in using a third-party library to make working with dates much easier, I'd strongly recommend checking out date-fns.
You can get milliseconds since epoch from new Date(...) itself.
For example,
var date = new Date();
var epoch = date.getTime(); // the number of milliseconds elapsed between
                            // 1 January 1970 00:00:00 UTC and the given date
I prefer new Date(), as we have multiple constructor forms to initialize the date.
For example, we can do
Date.parse("2012")// returns 1325376000000
Date.parse("2012","12")// also returns 1325376000000
but
var d = new Date("2012")
// return Sun Jan 01 2012 05:30:00 GMT+0530 (IST)
d = new Date("2012", "01")
// Wed Feb 01 2012 00:00:00 GMT+0530 (IST)
I have two timestamps in Unix format, and I need a way to compare them to find out which one is the newest (closest to the present date).
The two timestamps are:
1299925246
1300526796
Is there a simple way of doing this in Javascript?
UNIX time is expressed as the number of seconds elapsed since January 1st, 1970, 00:00:00 UTC.
Comparison is therefore straightforward: in your example, the second timestamp (1300526796) is the newest, because 1300526796 (March 19th, 2011, 09:26:36 UTC) is greater than 1299925246 (March 12th, 2011, 10:20:46 UTC).
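As a sketch of that comparison (keeping in mind that Unix timestamps are in seconds, while JavaScript Dates use milliseconds):
var a = 1299925246;
var b = 1300526796;
var newest = Math.max(a, b); // a plain numeric comparison is enough
console.log(new Date(newest * 1000).toUTCString()); // "Sat, 19 Mar 2011 09:26:36 GMT"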
Just encountered the same issue, but with multiple timestamps, like so:
1356036198452
1356039026690
1356039067568
1356035166411
In my case, there could be anywhere from 1 to 100 timestamps available.
So the quickest way to get to the "newest" date (in my opinion) would be to do this:
var newestDate = Math.max(1356036198452,1356039026690,1356039067568,1356035166411);
alert(newestDate);
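Since the number of timestamps varies (anywhere from 1 to 100), you can pass an array to Math.max instead of hard-coding the arguments:
var stamps = [1356036198452, 1356039026690, 1356039067568, 1356035166411];
var newestDate = Math.max.apply(null, stamps); // or Math.max(...stamps) in ES2015+
alert(newestDate); // 1356039067568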
Hope that helps someone out there in the real world.
If JavaScript (new Date()).getTime() is run from 2 different timezones simultaneously, will you get the same value?
Will this value be affected by the system time set on the machine where the browser is running?
Yes, it's affected by system time. However, if the local time is correct (for whatever time zone the computer's set to), it should be the same in any time zone.
The ECMAScript standard says (§15.9.1.1): "Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC."
Code:
var today = new Date();
console.log(today);
var t = today.getTime();
console.log(t);
My Computer in the UK:
Sat Sep 21 2013 03:45:20 GMT+0100 (GMT Daylight Time)
1379731520112
My VPS:
Sat, 21 Sep 2013 02:44:31 GMT
1379731471743
The difference between the two getTime() values is 48,369 milliseconds (about 48 seconds) of clock drift between the machines, not the one-hour zone difference.
You won't get the same value exactly: there will be some difference between the two clients' browsers picking up their system time. But if their clocks are set correctly, you should get two values with only a minimal difference, because new Date() is based on the number of milliseconds elapsed since January 1, 1970 UTC, which is universal and location-agnostic.
There will most likely always be some deviation between the times obtained on different machines, but (I was wrong before) JavaScript's Date measures time in UTC by default.
Usually, when time is essential, it's best to simply use the server time and apply timezone corrections to it in the output if required.
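For example, a sketch assuming the server sends a UTC timestamp in ISO 8601:
var serverTime = new Date('2013-09-21T02:44:31Z'); // the instant, fixed in UTC
console.log(serverTime.toLocaleString());          // rendered in the viewer's local timezone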