I have the following JS code:
// set a date time say 2 Oct
var localTime = new Date(2016, 9, 2, 4, 0, 0);
// set time to 23:59:59
localTime.setHours(23, 59, 59, 0);
console.log(localTime); // Sun Oct 02 2016 23:59:59 GMT+0800 (MYT), which is expected
// now subtract 600 minutes, which should be 10 hours
localTime.setMinutes(-600);
console.log(localTime); // Sun Oct 02 2016 13:00:59 GMT+0800 (MYT)
When I subtract 600 minutes from that time, I expect it to go back 10 hours to 13:59:59, but it prints 13:00:59.
What am I missing here?
Date.prototype.setMinutes does not add minutes to or subtract minutes from the time you have. It sets the minutes field of your date. The argument you provide is:
minutesValue
An integer between 0 and 59, representing the minutes.
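If the goal is to move the time back by 600 minutes, a minimal sketch (reusing the localTime from the question) is to set the minutes relative to their current value; out-of-range values simply roll the other fields:
var localTime = new Date(2016, 9, 2, 4, 0, 0);
localTime.setHours(23, 59, 59, 0);
// subtract 600 minutes relative to the current minutes value;
// values outside 0-59 roll the hours (and date) accordingly
localTime.setMinutes(localTime.getMinutes() - 600);
console.log(localTime); // Sun Oct 02 2016 13:59:59 in the question's GMT+0800 (MYT) zone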
Related
I cannot clearly understand how Date.getHours works. I live in Montenegro and for me the GMT offset is +2. I checked it in Google and in the browser console:
new Date() // Date Tue Aug 02 2022 15:26:32 GMT+0200 (Central European Summer Time)
const date1 = new Date('December 31, 1975, 16:00:00 GMT+04:00');
const date2 = new Date('December 31, 1975, 16:00:00 GMT-04:00');
const date3 = new Date('December 31, 1975, 16:00:00 GMT+00:00');
console.log(date1.getUTCHours()); // 12
console.log(date2.getUTCHours()); // 20
console.log(date3.getUTCHours()); // 16
console.log('-------------------------')
console.log(date1.getHours()); // 13
console.log(date2.getHours()); // 21
console.log(date3.getHours()); // 17
The UTC hours I understand clearly: plus/minus 4 hours, and 0 hours. But why do I get odd numbers when I get the non-UTC hours?
I expected to get 14 in the first log, because the time was in GMT+4 and my timezone is GMT+2, so the difference is 2, not 3.
By the same logic I expected to see 22, because the difference is -6.
And 18 in the third log, because the difference is -2.
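For what it's worth, one thing to check here (a small sketch, not part of the original question) is the offset the runtime actually applies to those 1975 instants; getTimezoneOffset() is evaluated per date, so it need not be the +2 observed for an August 2022 date:
// the offset is UTC minus local time, in minutes, so GMT+2 shows up as -120
console.log(new Date().getTimezoneOffset()); // e.g. -120 in summer 2022
console.log(date1.getTimezoneOffset());      // may differ for a December 1975 date,
                                             // depending on the historical zone rules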
I have an issue with a date.
I have two dates (start and end) and I need to build an array of all the dates between these two.
My script is something like:
while (currentData < nodeLastDate) {
  currentData.setDate(currentData.getDate() + 1);
  console.log(currentData)
}
But at Sat Mar 30 2019 something odd happens: the date changes, but so does the time.
If you run this simple script you can see it:
let test = new Date(2019, 2, 30, 2)
console.log(test)
test = test.setDate(test.getDate() + 1)
console.log(new Date(test))
This is the result:
Sat Mar 30 2019 02:00:00 GMT+0100 (Central European Standard Time)
Sun Mar 31 2019 03:00:00 GMT+0200 (Central European Summer Time)
Is this normal?
Date.getDate() gets the day of the month, so you lose any other information. If you want to add a day to a date, simply use Date.getTime() and add the number of milliseconds in a day:
let test = new Date(2019, 2, 30, 2)
console.log(test)
test = test.setTime(test.getTime() + (1440 * 60000))
console.log(new Date(test))
Date.getTime returns the number of milliseconds since the Unix epoch (1 January 1970, 00:00:00 UTC), so adding the number of milliseconds in a day advances your date by exactly 24 hours. (1440 * 60000 is the number of milliseconds in a day because there are 1440 minutes in a day and 60,000 milliseconds in a minute.)
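Applied to the loop from the question, a minimal sketch (assuming currentData and nodeLastDate are Date objects, as in the original snippet) that also collects the dates into an array could look like this; note that each step advances by exactly 24 hours of elapsed time, so the printed local hour can still shift by one across a DST change:
const msPerDay = 1440 * 60000;                   // milliseconds in 24 hours
const allDates = [];
let current = new Date(currentData.getTime());   // copy so the original is not mutated
while (current < nodeLastDate) {
  allDates.push(new Date(current.getTime()));    // store a snapshot, not the mutated object
  current = new Date(current.getTime() + msPerDay);
}
console.log(allDates);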
This month (March) has 31 days.
I want to get the last day of the month, and instead of getting Wed Mar 31 2021 23:59:59 I get Fri Apr 30 2021 23:59:59. Look:
let d = new Date()
d.setMonth( d.getMonth() + 1) // April
d.setDate(0) // should bring the 31 of March
d.setHours(23, 59, 59, 999)
console.log(d) // Fri Apr 30 2021 23:59:59 GMT+0300 (IDT)
Why does it happen on a date with 31 days?
When I tried it in other months, each one worked fine, for example:
let d = new Date("2021-02-25") // notice that we point to February
d.setMonth( d.getMonth() + 1)
d.setDate(0)
d.setHours(23, 59, 59, 999)
console.log(d) // Sun Feb 28 2021 23:59:59 GMT+0200 (IST)
Notice that in the second example, which works correctly, we get the last day of Feb and GMT+2 (IST) rather than GMT+3 (IDT).
Also notice that if I declare it like let d = new Date('2021-03-25') it also works fine (with a specific date, instead of just new Date()).
It happens because April only has 30 days.
let d = new Date()
d.setMonth( d.getMonth() + 1) // Actually April 31st -> May 1st.
Try this way instead of the two separate calls:
d.setMonth(d.getMonth() + 1, 0);
Passing 0 as the second (day of month) argument resolves to the last day of the month before the one you set, so with getMonth() + 1 you land on the last day of the current month and the date never overflows in between.
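Put together, a minimal sketch of that fix (same goal as in the question: the end of the current month at 23:59:59.999):
let d = new Date()
d.setMonth(d.getMonth() + 1, 0)  // day 0 of next month = last day of the current month
d.setHours(23, 59, 59, 999)
console.log(d)                   // e.g. Wed Mar 31 2021 23:59:59 when run in March 2021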
Got it!
I set +1 for the month while the current day of the month is 31, so it jumps to April 31, which doesn't exist, and the date overflows to May 1.
The day before May 1 is April 30.
I should set the date to 1 before incrementing the month, look:
let d = new Date()
d.setDate(1) // this is the change - important!
d.setMonth( d.getMonth() + 1)
d.setDate(0)
d.setHours(23, 59, 59, 999)
console.log(d) // Wed Mar 31 2021 23:59:59
That way it starts from March 1, increments to April 1, and then steps back to the last day of March.
Oddly enough, this also works:
var date = new Date(), y = date.getFullYear(), m = date.getMonth();
var firstDay = new Date(y, m, 1, 0, 0, 0, 0);
var lastDay = new Date(y, m + 1, 0, 23, 59, 59, 999)
console.log(lastDay)
var today = new Date();
var endYear = new Date(1995, 11, 31, 23, 59, 59, 999); // Set day and month
endYear.setFullYear(today.getFullYear()); // Set year to this year
console.log("Version 1: end year full date is ", endYear);
var msPerDay = 24 * 60 * 60 * 1000; // Number of milliseconds per day
var daysLeft = (endYear.getTime() - today.getTime()) / msPerDay;
var daysLeft = Math.round(daysLeft); //returns days left in the year
console.log(daysLeft,endYear);
// when I run that code the answer is 245.
var today = new Date();
var endYear = new Date(2021, 0, 0, 0, 0, 0, 0); // Set day and month
console.log("Version 2: end year full date is ", endYear);
var msPerDay = 24 * 60 * 60 * 1000; // Number of milliseconds per day
var daysLeft = (endYear.getTime() - today.getTime()) / msPerDay;
var daysLeft = Math.round(daysLeft); //returns days left in the year
console.log(daysLeft,endYear);
// but when I add only 1 ms, the answer comes back as 244. How is that possible? Where has 1 day gone?
That is the difference caused by the time you set.
To be clear,
the first endYear will print Thu Dec 31 2020 23:59:59
the second endYear will print Thu Dec 31 2020 00:00:00
That is the difference you see there.
I will post the complete output I received on the console here as well.
Thu Dec 31 2020 23:59:59 GMT+0530 (India Standard Time)
245.0131708912037
245
Thu Dec 31 2020 00:00:00 GMT+0530 (India Standard Time)
244.01317090277777
244
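To make the gap concrete, here is a small sketch (assuming the current year resolves to 2020, as in the output above) that measures how far apart the two endYear values actually are; they differ by just under one full day, which is exactly where the rounded 245 vs 244 comes from:
var a = new Date(1995, 11, 31, 23, 59, 59, 999);
a.setFullYear(2020);                        // Thu Dec 31 2020 23:59:59.999 local time
var b = new Date(2021, 0, 0, 0, 0, 0, 0);   // Thu Dec 31 2020 00:00:00.000 local time
console.log((a.getTime() - b.getTime()) / (24 * 60 * 60 * 1000)); // ~0.99999999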
==================EDIT==================
new Date(2021, 0, 0, 0, 0, 0, 0) resolves to Dec 31st 2020 because the day of the month is indexed from 1, not from 0. When that value is 0 it is computed as the day before the 1st, i.e. the last day of the previous month.
For example,
new Date(Date.UTC(2021, 1, 0, 0, 0, 0, 0)) will print out Sun Jan 31 2021 05:30:00 GMT+0530 (India Standard Time)
and
new Date(Date.UTC(2021, 1, -1, 0, 0, 0, 0)) will print out Sat Jan 30 2021 05:30:00 GMT+0530 (India Standard Time)
Just want some help with adding numbers bi-weekly.
Let's say:
Start date: Jan 15, 2012
End date: May 15, 2012
Value: 300.00
What I want to accomplish is that for every 15th and last day of the month, 300 is multiplied by how many 15ths and last days of the month have occurred before May 15, 2012,
so
Jan 15, 2012 to Jan 31, 2012 the value must be 300.00
Feb 01, 2012 to Feb 15, 2012 the value must be 600.00
Feb 16, 2012 to Feb 28/29, 2012 the value must be 900.00
Mar 01, 2012 to Mar 15, 2012 the value must be 1200.00
Mar 16, 2012 to Mar 31, 2012 the value must be 1500.00
Apr 01, 2012 to Apr 15, 2012 the value must be 1800.00
Apr 16, 2012 to Apr 30, 2012 the value must be 2100.00
May 01, 2012 to May 15, 2012 the value must be 2400.00
Hope you get what I mean.
Hoping for your helpful replies. Thanks.
You can loop as long as you have whole months, then check if there is a half month to add at the end:
var date = new Date(startDate);  // copy so startDate itself is not mutated
var sum = 0;
// peek one month ahead; add two payments for every whole month that still fits
var next = new Date(date);
next.setMonth(next.getMonth() + 1);
while (next <= endDate) {
  sum += value * 2;
  date = next;
  next = new Date(date);
  next.setMonth(next.getMonth() + 1);
}
// a remaining partial month contributes one more payment
if (date < endDate) sum += value;
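Wrapped in a function for quick testing (a sketch built on the loop above; the name biWeeklyTotal is just an illustrative choice), the figures from the question come out as expected:
function biWeeklyTotal(startDate, endDate, value) {
  var date = new Date(startDate);
  var sum = 0;
  var next = new Date(date);
  next.setMonth(next.getMonth() + 1);
  while (next <= endDate) {            // count whole months, two payments each
    sum += value * 2;
    date = next;
    next = new Date(date);
    next.setMonth(next.getMonth() + 1);
  }
  if (date < endDate) sum += value;    // trailing half month
  return sum;
}
console.log(biWeeklyTotal(new Date(2012, 0, 15), new Date(2012, 4, 15), 300)); // 2400
console.log(biWeeklyTotal(new Date(2012, 0, 15), new Date(2012, 3, 30), 300)); // 2100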