For example, I have this:
d = new Date(2013,04,20,14,56,10)
Mon May 20 2013 14:56:10 GMT+0800 (SGT)
dt = d.getTime() /1000
1369032970
Now, the timezoneOffset value is
d.getTimezoneOffset()*60
-28800
So if I subtract it, I get
dt -= d.getTimezoneOffset()*60
1369061770
My question is: is 1369032970 my local timestamp, and 1369061770 the UTC timestamp?
Can I safely say that any current timestamp minus the timezone offset is the UTC timestamp?
The result from getTime is in milliseconds since 1/1/1970 UTC. The local time zone plays no part in it. So if your question was how to get the UTC timestamp, simply use the result from getTime without any modification.
The idea of a "local timestamp" isn't very useful. One might apply an offset to the UTC timestamp before rendering it to a human-readable date string - but in Javascript, that's already done for you behind the scenes. You really don't want to pass a numeric timestamp to anyone else unless it is strictly UTC, because the meaning of what is "local" would be lost.
Also, when you call getTimezoneOffset, you are getting back the specific offset at the moment represented by the date, in minutes. Also, the sign is the opposite of what we normally see for time zone offsets. For example, I live in Arizona where the offset is UTC-07:00 year-round, but a call to getTimezoneOffset returns a positive value of 420. If you were to apply it to a timestamp, you would do the following:
dt -= d.getTimezoneOffset() * 60 * 1000;
You almost had it, but forgot to convert from seconds to milliseconds. But like I said, this value is meaningless. If you created a new Date object from it, it would display with the offset applied twice - once by your own code, and again by the Javascript internals.
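To see that double application concretely, here is a small sketch; the values shown assume a machine set to UTC+08:00, like the one in the question:
var d = new Date(2013, 4, 20, 14, 56, 10); // local wall-clock time
var utcMillis = d.getTime(); // 1369032970000 - already a UTC-based timestamp
var shifted = utcMillis - d.getTimezoneOffset() * 60 * 1000; // offset applied once by your code
new Date(shifted).toString(); // "Mon May 20 2013 22:56:10 GMT+0800 (SGT)" - applied again on display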
I have a piece of code which finds the difference between two dates (in the format yyyy-MM-dd hh:mm:ss). This code is run on multiple servers across the globe. One of the two dates is the current time in the particular timezone where the code is being run (server time), and the other is a time obtained from a database. If the difference between these two times is greater than 86400 seconds (1 day), then it should print "invalid"; else, it should print "valid".
The problem I'm facing with the code is that when I run it locally, it works fine, but when I deploy it onto a server in the US, it takes GMT time into consideration and not local time.
Wherever the code is run, I want the difference between the current time and the time fetched from the database, and if it's greater than 86400 seconds, I want to print "invalid". How do I achieve this in Java?
PS: I tried with the Date object, but it considers GMT everywhere.
I would use GMT everywhere and only convert to the local times for display purposes.
To find the difference, convert both times to the same timezone (say GMT) and take the difference.
You can do it with the example code below.
Date date = new Date();
DateFormat formatter = new SimpleDateFormat("dd MMM yyyy HH:mm:ss z");
formatter.setTimeZone(TimeZone.getTimeZone("CET"));
Date date1 = formatter.parse(formatter.format(date));
// Set the formatter to use a different timezone
formatter.setTimeZone(TimeZone.getTimeZone("IST"));
Date date2 = formatter.parse(formatter.format(date));
// Prints the date in the IST timezone
// System.out.println(formatter.format(date));
Now compare date1 with date2.
First, I concur with Peter Lawrey's answer above. It is usually good practice to store all times in the database in a single zone, and render them with an offset based upon the user's locale.
To find the difference, use the method getTime() to get the time in milliseconds from the epoch for each date. The calculation for the difference of 1 day is then 86400 * 1000 milliseconds. Or, perhaps, store the time in milliseconds from epoch in the database, and use a DB procedure/function at the time of retrieval.
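For example, here is a minimal sketch of that comparison in Java; storedTime is a hypothetical java.util.Date already read from the database:
Date now = new Date(); // the current instant; not affected by the server's timezone
long diffMillis = Math.abs(now.getTime() - storedTime.getTime());
System.out.println(diffMillis > 86400L * 1000L ? "invalid" : "valid"); // 1 day = 86400 * 1000 ms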
Hope this helps.
I'm passing back a UTC date from the server, and I need some JS to find the difference in seconds between "now" and the date passed back from the server.
I'm currently trying moment, with something like
var lastUpdatedDate = moment(utcStringFromServer);
var currentDate = moment();
var diff = currentDate - lastUpdatedDate;
The problem is, this gives an incorrect answer, because UTC is coming down from the server, and creating a new moment() makes it local. How can I do a calculation with respect to full UTC so it's agnostic of any local timing?
What you aren't quite understanding is that Dates are stored as the number of milliseconds since midnight, Jan 1, 1970 in UTC time. When determining the date/time in the local timezone of the browser, it first works out what the date/time would be in UTC time, then adds/subtracts the local timezone offset.
When you turn a Date back into a number, using +dateVar or dateVar.valueOf() or similar, it is back to the number of milliseconds since 01/01/1970T00:00:00Z.
The nice part about this is that whenever you serialise dates in UTC (either as a number, or as ISO String format), when it gets automatically converted to local time by Javascript's Date object, it is exactly the same point in time as the original value, just represented in local time.
So in your case, when you convert a local date to a number, you get a value of milliseconds in UTC that you are subtracting a value in milliseconds in UTC from. The result will be the number of milliseconds that has passed between the time from the server and the time the new Date() call is made.
Where it gets tricky is when you want a timestamp from the server to not be translated to local time, because you want to show the hours and minutes the same regardless of timezone. But that is not what you need in this case.
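To make that concrete, the whole calculation can be written in one line, assuming utcStringFromServer is in a format moment can parse (e.g. ISO 8601):
var diffSeconds = moment().diff(moment(utcStringFromServer), 'seconds'); // elapsed seconds since the server timestamp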
Try this approach; hope it helps:
var lastUpdatedDate = moment.utc(utcStringFromServer);
var currentDate = moment.utc(); // the current moment, kept in UTC mode
var diff = currentDate - lastUpdatedDate;
I'm assuming that utcStringFromServer is something like this:
Fri, 19 Aug 2016 04:27:27 GMT
If that's the case, you don't really need Moment.js at all. If you pass that string to Date.parse(), it'll return the number of milliseconds since Jan. 1, 1970. You can then use the .toISOString() method to get the same info about right now (converted to UTC) and parse it the same way to get milliseconds since 1970. Then, you can subtract the former from the latter and divide it by 1000 to convert back to seconds.
All in all, it would look something like this:
var lastUpdatedDate = Date.parse(utcStringFromServer);
var currentDate = Date.parse((new Date()).toISOString());
var diff = (currentDate - lastUpdatedDate) / 1000.0; // Convert from milliseconds
I have dates in my database set to Europe/London time. I am using Moment.js to show relative time, e.g. "3 minutes ago". This works fine for me as I am in the same timezone, but someone in the PST timezone, for example, would see "in 8 hours". How can I fix this?
My current code is like this:
$('time').text( moment( '2016-01-22 18:00:00' ).fromNow() );
To echo Jon's answer, moment's relative time functionality is strictly UTC based, so the behavior you describe won't actually happen, unless you are interpreting the original timestamp in local time.
It's hard to say if you're doing that or not, as you didn't give a sample value of the input string.
If your times are indeed UTC based, but that's not reflected in the input string, then use moment.utc instead of just moment.
And no, London is not the same as UTC.
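As a sketch of that suggestion applied to the code in the question, assuming the stored string is UTC but carries no offset marker:
$('time').text( moment.utc('2016-01-22 18:00:00').fromNow() ); // parse as UTC, then render relative to now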
I believe that the best approach is to store the date in UTC and then convert it to the local time zone for display. Note that this is not necessarily the same as London time, because UTC does away with the daylight saving time nonsense. You can do everything that you need with the Date class, provided the timestamp stored in the database does not have to deal with the vagaries of time zones and DST. The Date class maintains its own epoch internally as milliseconds elapsed since midnight 1 January 1970 UTC. You can evaluate the difference between two Date objects as follows:
var agora = new Date(); // a Date object, so getTime() below works
var stored = ... // the date that was stored in your database
var diff_msec = agora.getTime() - stored.getTime();
Knowing that the difference and that its units are milliseconds, you can convert the difference to whatever units are best for presentation.
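For instance, a small sketch of that conversion:
var seconds = Math.floor(diff_msec / 1000);
var minutes = Math.floor(seconds / 60);
var hours = Math.floor(minutes / 60);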
The documentation seems to suggest getTimezoneOffset() always returns the offset of the current locale, regardless of the date object. But I'm getting inconsistent results that I can't understand.
new Date().getTimezoneOffset() // -120
new Date("2015-03-10T15:48:05+01:00").getTimezoneOffset() // -60
new Date("2015-03-10T15:48:05-04:00").getTimezoneOffset() // -60
Also, is there a better way to get the timezone off a datetime string (maybe with moment.js)?
getTimezoneOffset returns the offset for the specific moment in time represented by the Date object it is called on, using the time zone setting of the computer that is executing the code.
Since many time zones change their offset for daylight saving time, it is perfectly normal for the value to differ for different dates and times. When you call it on new Date(), you're getting the current offset.
The value returned from getTimezoneOffset is in terms of minutes west of UTC, as compared to the more common offsets returned in [+/-]HH:mm format, which are east of UTC. Therefore, the time zone you gave alternates between UTC+1 and UTC+2. My guess is that the computer that gave this output was in one of the zones that use Central European Time, though it could be one of several others.
Also, when you pass in an offset as part of an ISO8601 formatted string, that offset is indeed taken into account - but only during parsing. The offset is applied, and the Date object holds on to the UTC timestamp internally. It then forgets anything about the offset you supplied. On output, some of the functions will explicitly use UTC, but most of them will convert to the local time zone before emitting their result.
You also asked about how to get the offset of a datetime string using moment.js. Yes, that is quite simple:
// Create a moment object.
// Use the parseZone function to retain the zone provided.
var m = moment.parseZone('2015-03-10T15:48:05-04:00');
// get the offset in minutes EAST of UTC (opposite of getTimezoneOffset)
var offset = m.utcOffset(); // -240
// alternatively, get it as a string in [+/-]HH:mm format
var offsetString = m.format("Z"); // "-04:00"
It's because of Daylight Saving Time. For your timezone, on June 11th it is in UTC+2 and on March 10th it is in UTC+1:
// when in DST (since it's June)
new Date("2015-06-11T00:00:00Z").getTimezoneOffset(); // -120
// when not in DST
new Date("2015-03-10T15:48:05+01:00").getTimezoneOffset(); // -60
For me, since I'm in the Eastern Time Zone, the following will happen:
// when in EST
new Date("2015-03-01T00:00:00Z").getTimezoneOffset(); // 300
// when in EDT
new Date("2015-06-01T00:00:00Z").getTimezoneOffset(); // 240
For sure, there are a lot of questions about Date objects and timezones, but many of them are about converting the current time to another timezone, and others are not very clear about what they want to do.
I want to display the day, hour, minute etc. in an arbitrary timezone, in an arbitrary day. For example, I would like a function f(t, s) that:
given the timestamp 1357041600 (which is 2013/1/1 12:00:00 UTC) and the string "America/Los Angeles", would satisfy the comparison below:
f(1357041600, "America/Los Angeles") == "2013/01/01 04:00:00"
given the timestamp 1372680000 (2013/07/01 12:00:00 UTC), would satisfy the comparison below:
f(1372680000, "America/Los Angeles") == "2013/07/01 05:00:00"
will always behave this way even if the timezone in the browser is, let us say, "Europe/London" or "America/Sao_Paulo";
will always behave this way even if the time in the browser is, let us say 2014/02/05 19:32, or 2002/08/04 07:12; and
as a final restriction, will not request anything from the server side (because I'm almost doing it myself :) )
Is it even possible?
given the timestamp 1357041600 (which is 2013/1/1 12:00:00 UTC)
That appears to be seconds since the UNIX epoch (1970-01-01T00:00:00Z). Javascript uses the same epoch, but in milliseconds, so to create a suitable Date object:
var d = new Date(timestamp * 1000);
That will create a Date object with a suitable time value. You then need to determine the time zone offset using something like the IANA time zone database. That can be applied to the Date object using UTC methods. E.g. resolve the offset to minutes, then use:
d.setUTCMinutes(d.getUTCMinutes() + offset)
UTC methods can then be used to get the adjusted date and time values to create a string in whatever format you require:
var dateString = d.getUTCFullYear() + '/' + pad(d.getUTCMonth() + 1) + '/' ...
where pad is a function to add a leading zero to single digit values. Using UTC methods avoids any impact of local time zone offsets and daylight saving variances.
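Putting those pieces together, here is a minimal sketch. Note that it takes a numeric offset in minutes rather than a zone name, because resolving a name like "America/Los Angeles" to an offset is exactly the part that needs tz data:
function pad(n) { return (n < 10 ? '0' : '') + n; }
function toZonedString(timestamp, offsetMinutes) { // hypothetical helper name
  var d = new Date(timestamp * 1000); // seconds -> milliseconds
  d.setUTCMinutes(d.getUTCMinutes() + offsetMinutes); // shift so the UTC getters read zone-local values
  return d.getUTCFullYear() + '/' + pad(d.getUTCMonth() + 1) + '/' + pad(d.getUTCDate()) +
    ' ' + pad(d.getUTCHours()) + ':' + pad(d.getUTCMinutes()) + ':' + pad(d.getUTCSeconds());
}
toZonedString(1357041600, -480); // "2013/01/01 04:00:00" (PST is UTC-08:00)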
There are also libraries like timezone.js that can be used to determine the offset, however I have not used them so no endorsement is implied.
For JavaScript runtime environments that support the ECMAScript Internationalization API, and adhere to its recommendation of supporting the IANA time zone database, you can simply do this:
new Date(1357041600000).toLocaleString("en-US", {timeZone: "America/Los_Angeles"})
For other environments, a time zone library is required; several are available.
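As a sketch, the f(t, s) from the question can then be built directly on toLocaleString; the exact output shape depends on the locale and options, so the fixed two-digit fields below only approximate the question's format:
function f(timestamp, zone) {
  return new Date(timestamp * 1000).toLocaleString('en-GB', {
    timeZone: zone,
    year: 'numeric', month: '2-digit', day: '2-digit',
    hour: '2-digit', minute: '2-digit', second: '2-digit',
    hour12: false
  });
}
f(1357041600, 'America/Los_Angeles'); // "01/01/2013, 04:00:00"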