Moment.js - UTC date to millis since unix epoch - javascript

I am trying to get a UTC Date using the moment.js library (just to send it to my server), as follows:
const eighteenYearsAgoUTC = moment().utc().subtract(18, "years").toDate();
const eighteenYearsAgoUTCSinceUnixEpoch = eighteenYearsAgoUTC.valueOf();
console.log(eighteenYearsAgoUTCSinceUnixEpoch);
The millis since unix epoch are: 1051875596343
But... if I do the same without utc, I get the same result
const eighteenYearsAgoUTC = moment().subtract(18, "years").toDate();
const eighteenYearsAgoUTCSinceUnixEpoch = eighteenYearsAgoUTC.valueOf();
console.log(eighteenYearsAgoUTCSinceUnixEpoch);
1051875596343
Why am I getting the same milliseconds since Unix Epoch for a UTC date and a local Date?
My local date is: Fri May 02 2003 13:37:00 GMT+0200 (Central Europe)

The utc method just changes how moment parses and formats dates. The underlying information is still the same.
Milliseconds-since-The-Epoch values are always UTC. Both of your code snippets do the same thing:
Get "now"
Subtract 18 years
Get the result as the milliseconds-since-The-Epoch value
You'd notice a difference if you were formatting a date or parsing one, but you aren't doing either of those things.
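For example (a minimal sketch using moment's clone(), format(), and valueOf(); the formatted outputs assume the GMT+0200 zone from the question):
const now = moment();
const local = now.clone().subtract(18, "years");
const utc = now.clone().utc().subtract(18, "years");
console.log(local.valueOf() === utc.valueOf()); // true: same instant, same millis
console.log(local.format()); // e.g. "2003-05-02T13:37:00+02:00" (local rendering)
console.log(utc.format());   // e.g. "2003-05-02T11:37:00Z" (UTC rendering)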

Related

How can I convert GMT-0600 to ISO without adding hours

I'm working on a form that has a date field which shows the current date by default.
I set the date using this:
var date = new Date(); // Tue May 25 2021 17:06:01 GMT-0600 (Mountain Daylight Time)
Everything works fine, but when I send the data to the controller, the JSON serialization automatically converts it to ISO, and the date received by the controller is 6 hours ahead.
I understand the GMT-0600 context a little (my current timezone is 6 hours behind UTC), and I see that my controller receives the date in ISO format; when I convert to ISO myself I get the same problem:
date.toISOString() // "2021-05-25T23:06:01.861Z" (6 hours ahead)
So my question is: is there a way to create a date that lets me use .toISOString() and keep the same local time?
Or to create a date with my current hour but a -0000 offset, so that converting it with toISOString() keeps it the same?
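One common workaround (a sketch, not an answer from this thread): shift the Date by its own getTimezoneOffset() before serializing, so the ISO string carries the local wall-clock time. Be aware that the resulting string claims to be UTC when it isn't, so only do this if the server treats it as a plain local time.
var date = new Date();
// Shift by the local offset (in minutes) so toISOString() shows the local wall-clock time
var shifted = new Date(date.getTime() - date.getTimezoneOffset() * 60000);
console.log(shifted.toISOString()); // e.g. "2021-05-25T17:06:01.861Z" on a GMT-0600 machine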

How to work with a Javascript +Date() numeric time stamp in Python?

I have a JavaScript date timestamp which I want to convert into an understandable form, e.g.
new Date(1415988000000)
which gives the output
Fri Nov 14 2014 23:30:00 GMT+0530 (IST)
I want to do this in Python. I don't want to use PyV8. Is there any other alternative?
You don't need to write Javascript in Python to convert the timestamps.
The following works in pure Python.
Note:
The JS Date format is the number of milliseconds since Jan 1, 1970 UTC.
Oddly enough, the Python format for time.time() is the number of seconds since Jan 1, 1970 UTC.
This standard is sometimes called Unix time.
First, for the milliseconds-to-seconds conversion, you need to divide 1415988000000 by 1000, obtaining 1415988000.
Then you can use the datetime library like this:
import datetime

# fromtimestamp() converts to the *local* timezone;
# use datetime.datetime.utcfromtimestamp() for the UTC wall-clock time.
d = datetime.datetime.fromtimestamp(1415988000)
print(d)
Obtaining:
2014-11-14 13:00:00
This output has converted d to my timezone, which is UTC-5, so the UTC time would be 18:00.
That explains the later time, 23:30, that you see in JS for the same stamp in IST (UTC+5:30).
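You can cross-check this from the JavaScript side (a quick sketch):
new Date(1415988000000).toUTCString()
// "Fri, 14 Nov 2014 18:00:00 GMT", i.e. 13:00 in UTC-5 and 23:30 in UTC+5:30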

Javascript getFullYear() Date method weird behavior

Can anyone explain why getFullYear does not return 2014?
console.log(new Date('2014-01-01').getFullYear()) //2013
console.log(new Date('2014-01-01').getUTCFullYear()) //2014
From MDN:
The dateString of "March 7, 2014" returns a different date than "2014-03-07" unless the local time-zone is UTC. When converting a dateString of "March 7, 2014" the local time-zone is assumed. When converting a dateString of "2014-03-07" the UTC time-zone is assumed. This results in two different Date values depending on the format of the string that is being converted.
So when you ask it to parse "2014-01-01", you're getting the time in UTC.
Then you call .getFullYear() on your object, which uses local time. If you live in the Eastern US like I do, then it basically subtracts 5 hours from the internal time and returns the year.
So here's what happens:
"2014-01-01" is parsed as UTC and stored as the timestamp 1388534400000 (2014-01-01T00:00:00Z)
.getFullYear() is called, which interprets that stored instant in local time
In Eastern US local time, that instant is 2013-12-31T19:00:00
New Year's hasn't yet occurred at that local time, so the year is still 2013
All of this implies that if we do something like
console.log(new Date('January 1, 2014').getUTCFullYear()); // 2014
console.log(new Date('January 1, 2014').getFullYear()); // 2014
We'll get the same year, because we told the browser to use our timezone right on New Year's, but they're not equivalent:
console.log(new Date('January 1, 2014').getUTCHours()); // 5
console.log(new Date('January 1, 2014').getHours()); // 0
According to this:
"The difference is when you specify a string in the format YYYY-MM-DD, you get a date that is 12am in the GMT timezone and when you specify a date in the format DD-MM-YYYY, you get a date that is 12am in your current timezone."
So basically, since you are specifying New Year's Day 2014, when it gets converted from GMT to your local time it believes it is 12-31-13, not 01-01-14.
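A quick way to see the stored instant directly (a sketch; the last line's output assumes an Eastern US machine):
const d = new Date('2014-01-01');
console.log(d.getTime());     // 1388534400000, the stored UTC instant
console.log(d.toISOString()); // "2014-01-01T00:00:00.000Z"
console.log(d.toString());    // e.g. "Tue Dec 31 2013 19:00:00 GMT-0500 (EST)"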

Retrieving the timezone offset that was already input

I create a date like this:
var date = new Date('Wed, 19 Mar 2014 18:17:00 +0200');
This resolves to:
Wed Mar 19 2014 17:17:00 GMT+0100 (Central European Standard Time)
Is there a way to retrieve the "+0200" portion from the date object once it is created? I am trying to get this without parsing the input string and without the use of external libraries.
EDIT:
When I use
date.getTimezoneOffset();
It returns "-60", which corresponds to the local timezone offset, which in my case is GMT+0100. The question I am asking is whether the "+0200" from the input is lost in the Date object upon creation, or is it stored somewhere?
You can retrieve the timezone offset with date.getTimezoneOffset() and use that in your calculations.
From MDN:
The getTimezoneOffset() method returns the time-zone offset from UTC, in minutes, for the current locale.
date.getTimezoneOffset()
The time-zone offset is the difference, in minutes, between UTC and local time. Note that this means that the offset is positive if the local timezone is behind UTC and negative if it is ahead. For example, if your time zone is UTC+10 (Australian Eastern Standard Time), -600 will be returned. Daylight saving time prevents this value from being a constant even for a given locale.
A date object is stored as the number of milliseconds since the Unix epoch, so your input string is parsed and stored as a primitive number. So no, you can't retrieve aspects of your original input after it has been converted to a date object; the "+0200" is lost.
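A quick demonstration (a sketch; the second logged value depends on the machine it runs on, not on the string):
var date = new Date('Wed, 19 Mar 2014 18:17:00 +0200');
console.log(date.getTime());           // 1395245820000, the only thing stored
console.log(date.getTimezoneOffset()); // e.g. -60 on a GMT+0100 machine, regardless of the +0200 input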

What format is this timestamp, and how can I format it in its own time

I have a problem converting a timestamp in javascript.
I have this timestamp:
2011-10-26T12:00:00-04:00
I have been trying to format it to be readable. So far, it converts this using the local time of my system instead of the GMT offset in the timestamp. I know the timezone that this was created in is EST. I'm in PST, so the times are being offset by 3 hours.
Instead of this showing as:
Wednesday October 26, 2011 12:00 pm
It shows as:
Wednesday October 26, 2011 9:00 am
I have tried a few different solutions, but the latest one is found here: http://blog.stevenlevithan.com/archives/date-time-format
I am less concerned with the formatting part as I am with figuring out how to handle the GMT offsets. Would appreciate any help and insight anyone can provide.
Date objects are created in the local zone. If the date string was created in a different time zone, then you need to adjust the date object to allow for the difference.
The abbreviations PST and EST are ambiguous on the web; there is no standard for time zone abbreviations, and some represent two or more zones. You should express your zone only in terms of +/- UTC or GMT (which are the same thing, more or less).
You can get the local time zone offset using Date.prototype.getTimezoneOffset, which returns the offset in minutes that must be added to a local time to get UTC. Calculate the offset for where the time string was created and apply it to the created date object (simply add or subtract the difference in minutes as appropriate).
If your time zone is -3hrs, getTimezoneOffset will return +180 for a date object created in that zone. If the string is from a zone -4hrs, its offset is +240. So you can do:
var localDate = new Date('2011-10-26T12:00:00'); // date from string
var originOffset = 240;
var localOffset = localDate.getTimezoneOffset();
localDate.setMinutes(localDate.getMinutes() + originOffset - localOffset);
Adding the origin offset sets it to UTC, subtracting the local offset sets it to local time.
It would be much easier if the time string that was sent by the server was in UTC, that way you just apply the local offset.
Edit
IE will not parse a time string with an offset, and Chrome thinks that the above time string is UTC and adjusts for local offset. So don't let Date parse the string, do it manually.
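For example, a minimal manual parse for strings in exactly this layout (a sketch: parseOffsetString is a hypothetical helper, and the regex assumes a full YYYY-MM-DDTHH:MM:SS+/-HH:MM string):
function parseOffsetString(s) {
  var m = s.match(/^(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2})([+-])(\d{2}):(\d{2})$/);
  var offsetMin = (m[7] === '-' ? -1 : 1) * (Number(m[8]) * 60 + Number(m[9]));
  // The UTC instant is the wall-clock time minus the stated offset
  return new Date(Date.UTC(+m[1], m[2] - 1, +m[3], +m[4], +m[5], +m[6]) - offsetMin * 60000);
}
console.log(parseOffsetString('2011-10-26T12:00:00-04:00').toUTCString());
// "Wed, 26 Oct 2011 16:00:00 GMT"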
It doesn't matter what time zone you are in: the timestamp will result in a different local time in every time zone, but all of them will be correct, and anyone checking the UTC time of the date will get the same timestamp:
new Date('2011-10-26T12:00:00-04:00').toUTCString()
returns
Wed, 26 Oct 2011 16:00:00 GMT
and getTime() anywhere returns the same universal milliseconds timestamp: 1319644800000
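To display the stamp in its own -04:00 offset (a sketch, assuming you know the offset up front): shift the instant by that offset, then read the UTC fields of the shifted date.
var d = new Date('2011-10-26T12:00:00-04:00');
var shifted = new Date(d.getTime() + (-4 * 60) * 60000); // apply the -04:00 offset
console.log(shifted.getUTCHours()); // 12, the original wall-clock hour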
