I have the epoch value 1474148715, and I want to add a millisecond to it. According to epochconverter it's: Sat, 17 Sep 2016 21:45:15 GMT
I'm trying momentJS, but no luck:
const startMomentized = moment(Date($state.params.start_epoch)).add(1, 'milliseconds').unix();
^ This gives me Thu, 09 Feb 2017 19:39:43 GMT instead of the September 2016 date I'm trying to get.
Hoping for something like the following:
var start = params.start_epoch;
var startUpdated = start+millisec;
The value is just a number representing the time since 1970. Yours is in seconds (that is what epochconverter assumes), so convert it to milliseconds first and then you can add milliseconds directly to that number, like: 1474148715 * 1000 + 1
Why don't you add the milliseconds directly to the value you have?
var newDateTime = new Date(1474148715*1000+1);
Don't forget to multiply the value by 1000 to convert it to milliseconds.
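If you want to stay with Moment, here is a minimal sketch (assuming $state.params.start_epoch holds the Unix timestamp in seconds from the question): moment.unix() takes seconds, and calling .unix() on the result would truncate back to whole seconds, so use .valueOf() if you need to keep the added millisecond.
var start_epoch = 1474148715; // seconds, per the question
var startMomentized = moment.unix(start_epoch).add(1, 'milliseconds');
console.log(startMomentized.toISOString()); // 2016-09-17T21:45:15.001Z
console.log(startMomentized.valueOf());     // 1474148715001 ms; .unix() would drop the extra millisecond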
I'm confused about how this time conversion works. I have the timestamp 1462060800000, which, when I turn it into a date, correctly becomes:
Sun May 01 2016 02:00:00 GMT+0200 (Central European Summer Time)
but then when I want to get the month with const startMonth = start.getUTCMonth() I get 4 instead of 5. Why is this happening and what do I need to do to get the correct month?
const timestamp = 1462060800000
const start = new Date(timestamp)
console.log(start) // Sun May 01 2016 02:00:00 GMT+0200 (Central European Summer Time)
const startYear = start.getUTCFullYear()
const startMonth = start.getUTCMonth()
console.log(startMonth) // 4
getUTCMonth() returns zero-based months. 4 is correct. See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date/getUTCMonth.
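A quick way to see the zero-based indexing (the timestamp from the question is exactly midnight UTC on 1 May 2016):
const may = new Date(Date.UTC(2016, 4, 1)); // month index 4 = May
console.log(may.getTime());      // 1462060800000, the timestamp above
console.log(may.getUTCMonth());  // 4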
I get it: it's actually a month index that starts at 0.
The getUTCMonth() returns the month of the specified date according to universal time, as a zero-based value (where zero indicates the first month of the year).
From the docs see Date.prototype.getUTCMonth()
The getUTCMonth() method, like getMonth(), uses a zero-based count. This means the values range from 0 to 11. To get the desired month number, you need to add 1:
const startMonth = start.getUTCMonth() + 1;
Like this:
const timestamp = 1462060800000;
const start = new Date(timestamp);
console.log(start); // Sun May 01 2016 02:00:00 GMT+0200 (Central European Summer Time)
const startYear = start.getUTCFullYear();
const startMonth = start.getUTCMonth() + 1;
console.log(startMonth); // 5
I have the following code to increment the hours in a date:
let timerExpireDate = new Date(countdownStartDate);
console.log(`original date is ${timerExpireDate}`);
console.log(`add on ${countdownHours} hours`);
timerExpireDate.setHours(timerExpireDate.getHours() + countdownHours);
console.log(`New date is ${timerExpireDate}`);
However, it also seems to be incrementing the day by 6; here is the console log:
original date is Sun Jul 19 2020 16:36:39 GMT+0800 (Taipei Standard Time)
add on 2 hours
New date is Sat Jul 25 2020 18:36:39 GMT+0800 (Taipei Standard Time)
What am I doing wrong here?
It is likely that countdownHours is of type string instead of number, so timerExpireDate.getHours() + countdownHours is '162' (6 days later) instead of 18.
The fix is to cast countdownHours to number first, like countdownHours = +countdownHours.
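A minimal sketch of that failure mode and the fix, assuming countdownHours arrives as a string (for example from an input field or a query parameter):
const countdownStartDate = 'Sun Jul 19 2020 16:36:39 GMT+0800';
let countdownHours = '2'; // a string, not a number
let timerExpireDate = new Date(countdownStartDate);
// getHours() returns 16, and 16 + '2' is the string '162';
// setHours(162) overflows into Jul 25 18:36 (6 extra days).
// Coercing to a number first gives the intended result:
timerExpireDate.setHours(timerExpireDate.getHours() + Number(countdownHours));
console.log(timerExpireDate); // two hours later, same day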
I want to get the date from a GMT time string, but it returns a date that is one day ahead. How can I always get the date mentioned in the GMT string?
new Date("Mon, 27 Aug 2018 22:00:00 GMT").getDate()
This command returns 28 as the output, but I want 27.
Is there anything I need to add?
Thanks in advance.
Try this one. I think your problem will be solved.
<script>
function myFunction() {
  var d = new Date();
  var n = d.getUTCDate();
  document.getElementById("demo").innerHTML = n;
}
</script>
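Applied to the string from the question, a quick check (the one-day difference only appears when the device timezone is ahead of GMT, e.g. GMT+2):
var d = new Date("Mon, 27 Aug 2018 22:00:00 GMT");
console.log(d.getDate());    // 28 in a timezone ahead of GMT (local date)
console.log(d.getUTCDate()); // 27, the date in the GMT string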
When you create a new Date() the browser returns a date based on your device's timezone. You can use getTimezoneOffset() to get the difference from GMT in minutes, multiply it by 60000 to convert it to milliseconds, and add it to the timestamp to adjust the date to GMT.
// Your date
var myDate = new Date("Mon, 27 Aug 2018 22:00:00 GMT")
// Shift the timestamp by the timezone offset (minutes * 60000 = milliseconds) so the local getters return the GMT values
var myDateLocal = new Date(myDate.valueOf() + myDate.getTimezoneOffset() * 60000).getDate();
document.getElementById("myDate").innerHTML=myDateLocal;
<h1 id="myDate" ></h1>
This is how I calculate the duration:
var duration = new Date().getTime() - customerForgotPassword[0].createdTime.getTime();
This is how I compare:
var TEN_MINUTES = 10*60*1000;
if(duration > TEN_MINUTES){
//do smtg
}
new Date().getTime() returns 1528291351108 which is Wed Jun 06 2018 13:22:31 in UTC
customerForgotPassword[0].createdTime returns Wed Jun 06 2018 13:20:04 GMT+0800 (Malay Peninsula Standard Time) in my code.
customerForgotPassword[0].createdTime.getTime() returns 1528262404000 which is Wed Jun 06 2018 05:20:04 in UTC
In the database, customerForgotPassword[0].createdTime is stored in UTC, but when I retrieve it, it shows the local timezone. How can I get it in UTC as well when I retrieve it in node.js using Sequelize?
I had to add 8 hours to get the time in UTC:
var timeInUTC = customerForgotPassword[0].createdTime.getTime() - (customerForgotPassword[0].createdTime.getTimezoneOffset() * 60 * 1000)
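Putting it together, a sketch of the full check with the adjusted timestamp (this assumes, as described above, that createdTime comes back from Sequelize as a JS Date that is off by the local offset):
var createdTime = customerForgotPassword[0].createdTime;
var createdTimeUTC = createdTime.getTime() - createdTime.getTimezoneOffset() * 60 * 1000;
var TEN_MINUTES = 10 * 60 * 1000;
var duration = new Date().getTime() - createdTimeUTC;
if (duration > TEN_MINUTES) {
  // more than 10 minutes have passed, do smtg
}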
I need to store a Unix value from a time input.
The problem is this:
// I create a Moment Date from my input :
var date = moment({hour: 10, minute: 0})
// gives this _d : Mon May 04 2015 10:00:00 GMT+0200 (CEST)
// I convert it to unix value
date = date.unix()
// -> 1430726400
moment( date ).format('HH:mm')
// -> "14:25" // Should give me 10:00
// Online conversion unix time gives me : Mon, 04 May 2015 08:00:00 GMT
So how can I keep my 10:00 in memory as a Unix value using these transformations?
Per documentation:
moment.unix( date ).format('HH:mm')
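The original attempt showed 14:25 because .unix() returns seconds while moment(number) interprets a plain number as milliseconds; parse the stored value back with moment.unix() instead. A quick round trip (formatted in local time):
var date = moment({hour: 10, minute: 0});
var seconds = date.unix();                          // e.g. 1430726400
console.log(moment(seconds).format('HH:mm'));       // wrong: seconds treated as milliseconds (a time in Jan 1970)
console.log(moment.unix(seconds).format('HH:mm'));  // "10:00"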
I kind of sorted it out this way:
// I create a Moment Date from my input :
var date = moment({hour: 10, minute: 0})
// gives this _d : Mon May 04 2015 10:00:00 GMT+0200 (CEST)
// Convert to unix
date = date.toDate();
date = date.getTime();
console.log(moment( date ).format('HH:mm')); // Gives me 10:00
I hope it will keep the correct time? In France the clocks change twice a year (forward one hour, back one hour).