In C# I would like to get the UTC (+00:00) time as milliseconds, so I can use it in JavaScript with an offset (like below). I have tried several things but have not managed it.
new Date(1528204115692 - (new Date().getTimezoneOffset() * 60000)).toString()
The code below gives me milliseconds according to my timezone:
((DateTimeOffset)DateTime.UtcNow).ToUnixTimeMilliseconds()
I want to keep the UTC time in milliseconds in the DB so I can show the datetime according to the user's browser's regional zone.
For example: in the +03:00 zone, the time now is 06.05.2018 16:12:20.568.
I want to keep the UTC-zone (+00:00) time in milliseconds (epoch time).
Can you help?
Thank you.
Your C# code was correct.
From Mozilla:
new Date(value);
value
Integer value representing the number of milliseconds since January 1, 1970, 00:00:00 UTC, with leap seconds ignored (Unix Epoch; but consider that most Unix timestamp functions count in seconds).
So you only need:
var date = new Date(1528204115692);
Where 1528204115692 is the value you obtain from your C# code.
JavaScript dates are internally stored as milliseconds (a Date is simply a number) and "start" at 01 Jan 1970 00:00 UTC (that is "time" 0).
So:
// 01 Jan 1970 00:00 UTC -- the Unix/ECMAScript epoch
public static readonly DateTime Date01Jan1970 = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

public static long MillisecondsFrom01Jan1970(DateTime dt)
{
    // Ticks are 100-nanosecond units; divide to get milliseconds
    return (dt.Ticks - Date01Jan1970.Ticks) / TimeSpan.TicksPerMillisecond;
}
Use it like:
long ms = MillisecondsFrom01Jan1970(DateTime.UtcNow);
This returns the number of milliseconds that have passed between DateTime.UtcNow (the "now" in UTC) and 01 Jan 1970.
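On the JavaScript side, the returned value plugs straight into the Date constructor; no manual getTimezoneOffset() correction (as in the question) is needed. A minimal sketch, using the example value from the question:
var utcMillisFromDb = 1528204115692; // value produced by the C# helper above
var date = new Date(utcMillisFromDb);
console.log(date.toISOString()); // "2018-06-05T13:08:35.692Z" in every zone
console.log(date.toString());    // rendered in the browser's local zone, e.g. 16:08:35 at +03:00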
The code below displays differently in different browser time zones.
var date = new Date(1528204115692);
You can test this with the same number (milliseconds) by changing your computer's time zone; the code then shows a different local datetime for each zone, even though it is the same instant.
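A quick sketch of that test (the millisecond value is the one from the question):
var ms = 1528204115692;
new Date(ms).getTime();     // 1528204115692 -- identical in every zone
new Date(ms).toISOString(); // "2018-06-05T13:08:35.692Z" -- identical in every zone
new Date(ms).toString();    // zone-dependent, e.g. "Tue Jun 05 2018 16:08:35 GMT+0300"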
Related
I'm passing back a UTC date from the server, and I need some JS to find the difference in seconds between "now" and the date passed back from the server.
I'm currently trying Moment.js, with something like:
var lastUpdatedDate = moment(utcStringFromServer);
var currentDate = moment();
var diff = currentDate - lastUpdatedDate;
The problem is that this gives an incorrect answer, because a UTC string is coming down from the server while new moment() creates a local-time moment. How can I do the calculation entirely in UTC so it's agnostic of the local time zone?
What you aren't quite understanding is that Dates are stored as the number of milliseconds since midnight, Jan 1, 1970 in UTC time. When determining the date/time in the local timezone of the browser, it first works out what the date/time would be in UTC time, then adds/subtracts the local timezone offset.
When you turn a Date back into a number, using +dateVar or dateVar.valueOf() or similar, it is back to the number of milliseconds since 01/01/1970T00:00:00Z.
The nice part about this is that whenever you serialise dates in UTC (either as a number, or as ISO String format), when it gets automatically converted to local time by Javascript's Date object, it is exactly the same point in time as the original value, just represented in local time.
So in your case, when you convert a local date to a number, you get a value in UTC milliseconds, and you are subtracting another UTC-milliseconds value from it. The result is the number of milliseconds that passed between the time from the server and the time the new Date() call was made.
Where it gets tricky is when you want a timestamp from the server to not be translated to local time, because you want to show the hours and minutes the same regardless of timezone. But that is not what you need in this case.
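In other words, a moment-free sketch of the same calculation (assuming utcStringFromServer is an ISO 8601 string with an explicit Z or offset):
var diffSeconds = (Date.now() - new Date(utcStringFromServer).getTime()) / 1000;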
Try this way, hope it may help:
var lastUpdatedDate = moment(utcStringFromServer);
var date = Date.now(); // current epoch milliseconds (note: Date.UTC() without arguments returns NaN)
var currentDate = moment(date);
var diff = currentDate - lastUpdatedDate; // difference in milliseconds
I'm assuming that utcStringFromServer is something like this:
Fri, 19 Aug 2016 04:27:27 GMT
If that's the case, you don't really need Moment.js at all. If you pass that string to Date.parse(), it'll return the number of milliseconds since Jan. 1, 1970. You can then use the .toISOString() method to get the same info about right now (converted to UTC) and parse it the same way to get milliseconds since 1970. Then, you can subtract the former from the latter and divide it by 1000 to convert back to seconds.
All in all, it would look something like this:
var lastUpdatedDate = Date.parse(utcStringFromServer);
var currentDate = Date.parse((new Date()).toISOString());
var diff = (currentDate - lastUpdatedDate) / 1000.0; // convert from milliseconds to seconds
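Since Date.parse() already returns UTC milliseconds and Date.now() is the current UTC-millisecond value, the toISOString() round trip can be skipped; an equivalent one-line sketch:
var diff = (Date.now() - Date.parse(utcStringFromServer)) / 1000; // seconds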
As the title says, JavaScript and PHP give me different results, even though I use the same single value.
In Javascript:
var n = (new Date("2015 Oct 17")).getTime()/1000;
console.log(n);
// result: 1445014800
And PHP:
$unix = date('d-m-Y', 1445014800);
echo $unix;
// result: 16-10-2015
Please leave some explanations.
Thanks a lot!
In your JavaScript:
var n = (new Date("2015 Oct 17")).getTime()/1000;
console.log(n);
// result: 1445014800
The /1000 simply converts the millisecond value from getTime() into seconds, so the numeric answer is correct!
In JavaScript, parsing of a string like "2015 Oct 17" is entirely implementation-dependent. If it works at all, it is likely to be converted to a Date object representing 00:00:00 at the start of that day in the time zone of the host system.
For a system whose time zone offset is, say, UTC+1000, and that parses the string as a local time, the time value in seconds will be 1445004000.
However, such a system might decide that the string is a bit like an ISO 8601 string, and it might decide that since such strings without a time zone were treated as UTC in ES5, that it will treat it as a UTC time. In that case, the time value in seconds will be 1445040000 (i.e. equivalent to 2015-10-17T00:00:00Z).
To reliably transfer dates between systems, it is often considered best to transfer time values in either seconds or milliseconds since the UNIX (and ECMAScript) epoch of 1970-01-01T00:00:00Z.
To create such a time value for 2015-Oct-17 you can use:
var timeValue = Date.UTC(2015, 9, 17);
To convert the UNIX time value 1445014800 to an ECMAScript date, you can do (noting that UNIX time values are in seconds and ECMAScript in milliseconds):
console.log(new Date(1445014800*1000).toISOString()); // 2015-10-16T17:00:00.000Z
So I'll assume that your JavaScript host is in a time zone around UTC+07:00 (local midnight on 2015-Oct-17 there is exactly 1445014800 seconds), and that the PHP host's configured time zone is west of UTC+07:00, so the same instant falls on 16-10-2015 there.
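To make the ambiguity concrete, a sketch comparing the possible interpretations (the output values follow from the arithmetic above):
new Date("2015 Oct 17").getTime() / 1000; // implementation-dependent: 1445014800 at UTC+07:00,
                                          // 1445040000 if treated as UTC, or NaN
Date.UTC(2015, 9, 17) / 1000;             // always 1445040000 (2015-10-17T00:00:00Z)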
I have checked it, and this JS expression returns
1445036400
on my machine, which in 'human time' is
Fri, 16 Oct 2015 23:00:00 GMT
(i.e. local midnight in my UTC+01:00 zone, which illustrates that the parsed result depends on the host's time zone).
https://jsbin.com/wuvawupede/edit?js,console,output
http://www.epochconverter.com/
I'm having the hardest time trying to convert this date from an API to UTC milliseconds. Right now the dates I display are seven hours ahead, rolling over into the next day, which I don't even have data for. Here is the example format:
8/31/2012 9:00:00 AM
I currently have this code
var formattedDate = new Date(data[i].Time);
formattedDate = formattedDate.getTime();
which seems like it's returning the correct value type, but the date is wrong. I've also tried getUTCMilliseconds(), which returns 0.
EDIT: jsfiddle example : http://jsfiddle.net/b2NK6/
So you want the raw timestamp in UTC time, instead of local time?
Compare:
(new Date(Date.UTC(2012, 7, 31, 9, 0, 0, 0))).getTime(); /* month 7 is August */
with
(new Date(Date.parse("8/31/2012 9:00:00 AM"))).getTime();
When you parse the string (the second example) it applies your local timezone information when it creates the date object. If you are in timezone -0700, then the date that is created will actually correspond to 4:00pm UTC.
But if you create the date object by explicitly saying that you are specifying the UTC value, it will give you 9:00am UTC, which corresponds to 2:00am in timezone -0700.
Edited to give a clearer and more correct code example.
var dateString = "8/31/2012 9:00:00 AM"; // assuming this is expressed in local time
var millisecondsSinceTheEpoch = (new Date(dateString)).valueOf(); // 1346418000000
var isoString = (new Date(millisecondsSinceTheEpoch)).toISOString(); // 2012-08-31T13:00:00.000Z
// Note: example return values from a computer on U.S. Eastern Daylight Time (-4:00).
From W3Schools:
The valueOf() method returns the primitive value of a Date object.
Note: The primitive value is returned as the number of millisecond[s] since midnight January 1, 1970 UTC.
Also see W3Schools for a comprehensive overview of the Date object.
HighStocks expects to get its dates aligned to UTC-midnight date boundary.
Assuming your chart only deals with dates (without the time component) here is a trick you can use:
Do originalDate.getTime() to get the number of milliseconds since midnight UTC 1/1/1970 , e.g. 1362286800000.
Divide the number of milliseconds by (1000*60*60*24) to get the number of days since midnight UTC 1/1/1970 e.g. 15767.208333333334.
Do Math.round() to round the number to the nearest UTC midnight, e.g. 15767.
Multiply the number by (1000*60*60*24) to get it back into the milliseconds scale e.g. 1362268800000.
Here is the final formula:
var utcMidnight=new Date(Math.round(anyZoneMidnight.getTime()/86400000)*86400000)
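A worked sketch with the numbers from the steps above (1362286800000 is local midnight in a UTC-05:00 zone):
var anyZoneMidnight = new Date(1362286800000);
var utcMidnight = new Date(Math.round(anyZoneMidnight.getTime() / 86400000) * 86400000);
utcMidnight.toISOString(); // "2013-03-03T00:00:00.000Z" (1362268800000 ms)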
I've written a function in .net that returns a date. I need to get that date into a Date Object in JavaScript.
According to https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Date, I should be able to invoke new Date(x) where x is the number of milliseconds in my date.
Therefore, I've written the following in my ASP.net MVC 3 code:
ViewBag.x = new TimeSpan(someDate.Ticks).TotalMilliseconds;
Then, in JavaScript, I get the following code:
new Date( 63461023004794 )
The date being represented should be January 1st, 2012.
However, the date that JavaScript reads is December 31st, 3980.
What's going wrong here?
Your .NET code is giving you the number of milliseconds since Jan. 1st, 0001.
The JavaScript constructor takes the number of milliseconds since Jan. 1st, 1970.
The easiest thing would probably be to change your .NET code to:
ViewBag.x = (someDate - new DateTime(1970, 1, 1)).TotalMilliseconds;
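On the JavaScript side you can sanity-check the result; assuming someDate is 1 January 2012 UTC, the expression above yields 1325376000000:
new Date(1325376000000).toISOString(); // "2012-01-01T00:00:00.000Z"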
someDate.Ticks is measured since January 1st, 0001.
Javascript dates take milliseconds since January 1st, 1970, UTC.
That's because the DateTime structure counts ticks since 0001-01-01, while the Date object counts milliseconds since 1970-01-01.
Take the difference from 1970-01-01 as milliseconds:
ViewBag.x = (someDate - new DateTime(1970, 1, 1)).TotalMilliseconds;
The Unix epoch is 1970-01-01 00:00:00 UTC. Assuming your times are already UTC (not a given):
DateTime someDate = GetSomeDate();
DateTime UNIX_EPOCH = new DateTime(1970, 1, 1);
TimeSpan ts = someDate - UNIX_EPOCH;
should do it. Then pass JavaScript the TimeSpan's TotalMilliseconds property.
Rules:
C# Ticks measures since 0001-01-01.
Javascript dates take milliseconds since 1970-01-01, UTC.
Then you need to subtract 62135596800000 milliseconds from your C# DateTime variable's millisecond count (someDate):
62135596800000 is a constant: the difference between 1970-01-01 and 0001-01-01 in milliseconds (621355968000000000 ticks).
Use:
ViewBag.x = (someDate - new DateTime(621355968000000000)).TotalMilliseconds;
I have a problem converting a timestamp in javascript.
I have this timestamp:
2011-10-26T12:00:00-04:00
I have been trying to format it to be readable. So far, it converts this using the local time of my system instead of the GMT offset in the timestamp. I know the timezone that this was created in is EST. I'm in PST, so the times are being offset by 3 hours.
Instead of this showing as:
Wednesday October 26, 2011 12:00 pm
It shows as:
Wednesday October 26, 2011 9:00 am
I have tried a few different solutions, but the latest one is found here: http://blog.stevenlevithan.com/archives/date-time-format
I am less concerned with the formatting part as I am with figuring out how to handle the GMT offsets. Would appreciate any help and insight anyone can provide.
Date objects are created in the local zone. If the date string was created in a different time zone, then you need to adjust the date object to allow for the difference.
The abbreviations PST and EST are ambiguous on the web; there is no standard for time zone abbreviations and some represent two or more zones. You should express your zone only in terms of +/- UTC or GMT (which are the same thing, more or less).
You can get the local time zone offset using Date.prototype.getTimezoneOffset, which returns the offset in minutes that must be added to a local time to get UTC. Calculate the offset for where the time string was created and apply it to the created date object (simply add or subtract the difference in minutes as appropriate).
If your time zone is -3hrs, getTimezoneOffset will return +180 for a date object created in that zone. If the string is from a zone -4hrs, its offset is +240. So you can do:
var localDate = new Date('2011-10-26T12:00:00'); // date from string
var originOffset = 240;
var localOffset = localDate.getTimezoneOffset();
localDate.setMinutes( localDate.getMinutes() + originOffset - localOffset );
Adding the origin offset sets it to UTC, subtracting the local offset sets it to local time.
It would be much easier if the time string that was sent by the server was in UTC, that way you just apply the local offset.
Edit
IE will not parse a time string with an offset, and Chrome treats the above time string as UTC and adjusts it by the local offset. So don't let Date parse the string; do it manually.
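A minimal manual-parsing sketch (the helper name and the regular expression are illustrative, written for strings shaped like the one in the question):
function parseOffsetDate(s) {
    // Split "2011-10-26T12:00:00-04:00" into its numeric fields
    var m = s.match(/(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2})([+-])(\d{2}):(\d{2})/);
    var offsetMin = (m[7] === '-' ? -1 : 1) * (+m[8] * 60 + +m[9]);
    // Build the UTC value for the date/time fields, then remove the string's own offset
    return new Date(Date.UTC(+m[1], m[2] - 1, +m[3], +m[4], +m[5], +m[6]) - offsetMin * 60000);
}
parseOffsetDate('2011-10-26T12:00:00-04:00').toUTCString(); // "Wed, 26 Oct 2011 16:00:00 GMT"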
It doesn't matter what time zone you are in: the timestamp will result in a different local time in every time zone, but they will all be correct, and anyone checking the UTC time of the date will get the same timestamp:
new Date('2011-10-26T12:00:00-04:00').toUTCString()
returns
Wed, 26 Oct 2011 16:00:00 GMT
and getTime() anywhere returns the same universal millisecond timestamp: 1319644800000
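A quick check tying this back to the question (a sketch; the local rendering shown assumes a UTC-07:00 browser, as in the question):
var d = new Date('2011-10-26T12:00:00-04:00');
d.getTime();     // 1319644800000 in every zone
d.toUTCString(); // "Wed, 26 Oct 2011 16:00:00 GMT"
d.toString();    // local rendering, e.g. "Wed Oct 26 2011 09:00:00 GMT-0700 (PDT)"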