How to add seconds to seconds in JavaScript? - javascript

I have my current time in seconds and a duration time in seconds.
I want to add the two together to calculate the end time of the song (for example).
But I get a weird format in the result.
Here is my code:
// This is my song's duration
var duration = new Date("Sept 21, 2019 00:03:32");
var durationSeconds = duration.getSeconds();
// Current second
var date = new Date();
var seconds = date.getSeconds();
var differenceSecondsConverted = date.setSeconds(seconds + durationSeconds);
console.log(differenceSecondsConverted);
And the result is something like: 1569102592740
Thanks

Actually, the code works as it should; you are probably just missing a concept.
new Date() is a constructor call that returns an instance of the Date object.
This object has several methods. When you instantiate it and print it, you get something like Sun Sep 22 2019 01:21:14 GMT+0200 (CEST), which is a string representation of the current time.
However, this string representation is not how JS actually "thinks" about the time.
Internally, "time" for JS is the number of milliseconds elapsed since January 1, 1970, 00:00:00 UTC.
It looks something like this: 1569108461979.
You can see it if you run Date.now();
Also, if you do any calculations with a Date (not directly, but using methods like .setDate()), they are carried out internally on that millisecond count since January 1, 1970, 00:00:00 UTC.
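For illustration, a minimal sketch showing both views of the same instant:
// Two views of the same instant
var d = new Date();
console.log(d.getTime());  // e.g. 1569108461979 (milliseconds since the epoch)
console.log(String(d));    // e.g. "Sun Sep 22 2019 01:27:41 GMT+0200 (CEST)"
console.log(Date.now());   // roughly the same number, sampled a moment later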
So, the main problem in your code is that your duration variable is not actually a duration.
It just contains an object that represents Sept 21, 2019 00:03:32.
It is just a moment in time (3 minutes, 32 seconds after midnight on 21 September 2019).
To calculate when the song will end if it starts right now, you'd do something like:
let now = Date.now();
// Song duration in milliseconds
let songDuration = 201000;
let songEndTime = now + songDuration;
console.log( new Date(songEndTime) );

You can get a Date object for the time, say, 3:32 from now by adding 3 minutes and 32 seconds to the current date, e.g.
// time is minutes and seconds as mm:ss
function nowPlus(time) {
  let [m, s] = time.split(':').map(Number);
  let now = new Date();
  now.setMinutes(now.getMinutes() + m, now.getSeconds() + s);
  return now;
}
console.log('In 3:32 it will be: ' + nowPlus('3:32').toLocaleString(undefined, {hour12:false, hour: 'numeric', minute:'2-digit', second:'2-digit'}));

Related

Increment date and batch create Firestore documents

I have a map of data:
{
  name: name,
  time: <epoch in seconds>,
}
My goal is to create 6 documents with this data in Firestore, but with each document, the time field should be incremented by 1 week. So if my initial time is Sun Nov 17 2019 in epoch, there should be 6 documents with time (as Firestore timestamp) Sun Nov 17 2019, Sun Nov 24 2019, Sun Dec 1 2019, etc.
I'm using a https cloud function and tried doing this:
var name = 'test';
var time = 1573992000; // 7 AM EST on Sun, Nov 17 2019
var data = {
  name: name,
  time: time,
};
var start = new Date(time * 1000); // convert to milliseconds
var i;
for (i = 0; i < 6; i++) {
  var ref = db.collection('shifts').doc();
  start.setDate(new Date(start) + 7);
  data['time'] = new admin.firestore.Timestamp(start.getTime() / 1000, 0);
  batch.set(ref, data);
}
return batch.commit();
But all the documents appear in Firestore with the same time (6 weeks after the initial one). My guess is that it has something to do with the batch only setting the latest value. How can I avoid this without having to create 6 separate maps?
Your usage of Date looks incorrect to me overall.
setDate() sets the day of the month to the value you pass; out-of-range values roll the date over into the adjacent month, but only if what you pass is actually a number.
And this doesn't make sense to me: new Date(start) + 7. You can't simply add 7 to a Date object; the Date is coerced to a string, so the result isn't a number at all. It looks like you meant to add 7 to the day of the month represented by the date, i.e. start.setDate(start.getDate() + 7).
All things considered, the Date object isn't going to help you out too much here. I suggest just doing date math on integer numbers. For example, if your start time is:
var time = 1573992000
You can add 7 days worth of seconds to it like this:
time + (7 * 24 * 60 * 60)
This should be easier to work with.
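Putting that together, a sketch of the loop might look something like this (it reuses the db, batch, and admin objects from your snippet, so the surrounding setup is assumed):
var name = 'test';
var time = 1573992000;                // starting epoch time in seconds
var weekInSeconds = 7 * 24 * 60 * 60; // 604800
for (var i = 0; i < 6; i++) {
  var ref = db.collection('shifts').doc();
  // Build a fresh map each iteration so every document carries its own time
  batch.set(ref, {
    name: name,
    time: new admin.firestore.Timestamp(time + i * weekInSeconds, 0),
  });
}
return batch.commit();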

setHours is not a function when converting time from string

I am trying to compare a time in string format to the current time. I've tried setting up two Date objects and calling .now() on both of them, then adjusting one of them to the time in the string by splitting it and parsing the hours and minutes to integers, but I get the following error:
setHours is not a function
The 'cutoff' value I'm using is '15:00', and when following along in the debugger I can see this splits into split[0] = 15 and split[1] = 00 (this is before they are parsed into integers).
var cutoff = data.CutOff;
var split = cutoff.split(":");
var today = Date.now();
var hours = parseInt(split[0]);
var min = parseInt(split[1]);
today.setHours(hours, min);
if (Date.now() < today) {
  // Do Something
}
You want to use new Date() as opposed to Date.now().
new Date() creates a Date instance, which gives you access to the Date methods such as setHours().
Date.now() just returns the number of milliseconds elapsed since 1 January 1970 00:00:00 UTC, which is a plain number with no setHours method.
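Applied to the snippet in the question, the fix might look something like this (a sketch; data.CutOff is assumed to still hold an 'HH:mm' string such as '15:00'):
var cutoff = data.CutOff;
var split = cutoff.split(':');
var today = new Date(); // a Date instance, so setHours is available
today.setHours(parseInt(split[0], 10), parseInt(split[1], 10), 0, 0);
if (new Date() < today) {
  // Do Something
}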

JavaScript setHours() and getHours() [duplicate]

I am trying to subtract hours from a given date time string using javascript.
My code is like:
var cbTime = new Date();
cbTime = selectedTime.setHours(-5.5);
Where selectedTime is the given time (time that i pass as parameter).
So suppose selectedTime is Tue Sep 16 19:15:16 UTC+0530 2014
The answer I get is: 1410875116995
I want the answer in datetime format.
Am I doing something wrong here? Or is there some other solution?
The reason is that setHours(), setMinutes(), etc., take an integer as a parameter. From the docs:
...
The setMinutes() method sets the minutes for a specified date
according to local time.
...
Parameters:
An integer between 0 and 59, representing the minutes.
So, you could do this:
var selectedTime = new Date(),
    cbTime = new Date();
cbTime.setHours(selectedTime.getHours() - 5);
cbTime.setMinutes(selectedTime.getMinutes() - 30);
document.write('cbTime: ' + cbTime);
document.write('<br>');
document.write('selectedTime: ' + selectedTime);
Well, first off, setting the hours to -5.5 is nonsensical: the code will truncate it to an integer (-5) and then take that as "five hours before midnight", which is 7 PM yesterday.
Second, setHours (and other functions like it) modify the Date object (try console.log(cbTime)) and return the timestamp (number of milliseconds since the epoch).
You should not rely on the output format of the browser converting the Date object to a string for you, and should instead use get*() functions to format it yourself.
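For example, a minimal sketch of formatting the adjusted time yourself with the get*() methods (the padStart zero-padding is just one option):
var cbTime = new Date();
cbTime.setHours(cbTime.getHours() - 5, cbTime.getMinutes() - 30);
// Assemble a readable string from the parts instead of relying on toString()
var formatted = cbTime.getHours() + ':' +
  String(cbTime.getMinutes()).padStart(2, '0') + ':' +
  String(cbTime.getSeconds()).padStart(2, '0');
console.log(formatted);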
According to this:
http://www.w3schools.com/jsref/jsref_sethours.asp
You'll get "Milliseconds between the date object and midnight January 1 1970" as a return value of setHours.
Perhaps you're looking for this:
http://www.w3schools.com/jsref/tryit.asp?filename=tryjsref_sethours3
Edit:
If you want to subtract 5.5 hours, first you have to subtract 5 hours, then 30 minutes. Optionally you can convert 5.5 hours to 330 minutes and subtract them like this:
var d = new Date();
d.setMinutes(d.getMinutes() - 330);
document.getElementById("demo").innerHTML = d;
Use:
var cbTime = new Date();
cbTime.setTime(cbTime.getTime() - 5.5 * 60 * 60 * 1000); // subtract 5.5 hours in milliseconds
cbTime.toLocaleString();

How to programmatically determine the prior time period based upon Unix timestamps in JavaScript?

Essentially I have two Unix timestamps, representing the first and last days of a given month. Is it possible to programmatically determine the timestamps for the first and last days of the previous month?
For example, I have the following two timestamps:
1467331201 --> July 1, 2016
1469923201 --> July 31, 2016
Essentially, can I manipulate these two numbers in a consistent way in order to get the Unix time (or Date object) for June 1, 2016 and June 30, 2016, respectively? The problem I'm running into is that you cannot simply subtract a fixed amount, because the number of days in a month varies.
You could use this function:
function getPreviousMonthRange(unixTime) {
  var dt = new Date(unixTime * 1000);
  dt.setUTCDate(0); // flips to the last day of the previous month
  var unixLast = dt.getTime();
  dt.setUTCDate(1); // back to the first day of that same month
  var unixFirst = dt.getTime();
  return [unixFirst / 1000, unixLast / 1000];
}
// given first and last date (only one is really needed)
var unixTimeFirst = 1467331201;
var unixTimeLast = 1469923201;
// get previous month's first & last date
var [first, last] = getPreviousMonthRange(unixTimeFirst);
// output
console.log('previous month first day: ', first, new Date(first*1000));
console.log('previous month last day: ', last, new Date(last*1000));
Take a look at the following example:
// Specify a timestamp
var timestamp = 1467331201;
// Create a date object for the time stamp, the object works with milliseconds so multiply by 1000
var date = new Date(timestamp * 1000);
// Set the date to the previous month, on the first day
date.setUTCMonth(date.getUTCMonth() - 1, 1);
// Explicitly set the time to 00:00:00
date.setUTCHours(0, 0, 0);
// Get the timestamp for the first day
var beginTimestamp = date.getTime() / 1000;
// Increase the month by one, and set the date to the last day of the previous month
date.setUTCMonth(date.getUTCMonth() + 1, 0);
// Explicitly set the time to 23:59:59
date.setUTCHours(23, 59, 59);
// Get the timestamp for the last day
var endTimestamp = date.getTime() / 1000;
// Print the results
console.log('Timestamps for previous month: ');
console.log('Begin timestamp: ' + beginTimestamp);
console.log('End timestamp: ' + endTimestamp);
A timestamp must be specified in the variable at the top; it can be either of the two timestamps from your question, or any other timestamp within the month.
This code then calculates the begin and end timestamps for the previous month as you've requested, and prints the results to the console.
Please note that in this example the begin timestamp uses 00:00:00 as its time and the end timestamp uses 23:59:59 (the last second of that day). This can be configured the way you'd prefer.
In this case, we're working with the UTC variants of the Date methods (setUTCMonth, setUTCHours, and so on), because a Unix timestamp is defined in UTC, not in the user's timezone.
The statement date.setUTCMonth(date.getUTCMonth() + 1, 0); is used to select the last day of the month. The next month is selected first, but because the day is set to 0 (and not 1), one day is subtracted, giving you the desired result.
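As a quick standalone illustration of that day-0 behavior (separate from the code above):
// Passing 0 as the day selects the last day of the previous month
var d = new Date(Date.UTC(2016, 6, 15)); // July 15, 2016 (months are 0-based)
d.setUTCMonth(d.getUTCMonth() + 1, 0);   // day 0 of August -> July 31, 2016
console.log(d.toISOString());            // "2016-07-31T00:00:00.000Z"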
You can consider using Moment.js. I'm sure this is not exactly how you'd end up handling it but see below for an example of some helpful methods.
var lastDayOfJuly = moment.unix(1469923201);                 // moment.unix() expects seconds
var firstDayOfJuly = lastDayOfJuly.clone().startOf('month'); // clone() so the original isn't mutated
var lastDayOfJune = firstDayOfJuly.clone().subtract(1, 'day');
var firstDayOfJune = lastDayOfJune.clone().startOf('month');
Moment.js

How to get the beginning of day of a date in javascript -- factoring in timezone

I am struggling to find out the beginning of day factoring in timezones in javascript. Consider the following:
var raw_time = new Date(this.created_at);
var offset_time = new Date(raw_time.getTime() + time_zone_offset_in_ms);
// This resets timezone to server timezone
var offset_day = new Date(offset_time.setHours(0,0,0,0))
// always returns 2011-12-08 05:00:00 UTC, no matter what the offset was!
// This has the same issue:
var another_approach_offset_day = new Date(offset_time.getFullYear(),offset_time.getMonth(),offset_time.getHours())
I expect when i pass a Pacific Timezone offset, to get: 2011-12-08 08:00:00 UTC and so on.
What is the correct way to achieve this?
I think part of the issue is that the setHours method sets the hour (from 0 to 23) according to local time.
Also note that I am using javascript embedded in mongo, so I am unable to use any additional libraries.
Thanks!
Jeez, so this was really hard for me, but here is the final solution I came up with. The trick was that I needed to use setHours or setUTCHours to get the beginning of a day -- the only choices I have are system time and UTC. So I get the beginning of a UTC day, then add back the offset!
// Goal: given a time and a timezone offset, find the beginning of the day
function beginningOfDay(timestamp, selected_timezone_offset) {
  var raw_time = new Date(timestamp);
  var offset_time = new Date(raw_time.getTime() + selected_timezone_offset);
  offset_time.setUTCHours(0, 0, 0, 0);
  var beginning_of_day = new Date(offset_time.getTime() - selected_timezone_offset);
  return beginning_of_day;
}
In JavaScript all dates are stored as UTC. That is, the serial number returned by date.valueOf() is the number of milliseconds since 1970-01-01 00:00:00 UTC. But, when you examine a date via .toString() or .getHours(), etc., you get the value in local time. That is, the local time of the system running the script. You can get the value in UTC with methods like .toUTCString() or .getUTCHours(), etc.
So, you can't get a date in an arbitrary timezone, it's all UTC (or local). But, of course, you can get a string representation of a date in whatever timezone you like if you know the UTC offset. The easiest way would be to subtract the UTC offset from the date and call .getUTCHours() or .toUTCString() or whatever you need:
var d = new Date();
d.setMinutes(d.getMinutes() - 480); // get pacific standard time
d.toUTCString(); // returns "Fri, 9 Dec 2011 12:56:53 UTC"
Of course, you'll need to ignore that "UTC" at the end if you use .toUTCString(). You could just go:
d.toUTCString().replace(/UTC$/, "PST");
Edit: Don't worry about timezones that cross date boundaries. If you pass setHours() a negative number, it counts backwards from midnight at the start of the current day. E.g.:
var d = new Date(2011, 11, 10, 15); // d represents Dec 10, 2011 at 3pm local time
d.setHours(-1); // d represents Dec 9, 2011 at 11pm local time
d.setHours(-24); // d represents Dec 8, 2011 at 12am local time
d.setHours(52); // d represents Dec 10, 2011 at 4am local time
Where does the time_zone_offset_in_ms variable you use come from? Perhaps it is unreliable, and you should be using Date's getTimezoneOffset() method. There is an example at the following URL:
http://www.w3schools.com/jsref/jsref_getTimezoneOffset.asp
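For instance, a short sketch of deriving the offset in milliseconds from the runtime's own timezone (the sign is flipped because getTimezoneOffset() reports minutes behind UTC):
// e.g. 480 for PST, so the resulting offset is -28800000 ms (UTC-8)
var time_zone_offset_in_ms = -new Date().getTimezoneOffset() * 60 * 1000;
console.log(time_zone_offset_in_ms);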
If you know the date from a different date string you can do the following:
var currentDate = new Date(this.$picker.data('date'));
var today = new Date();
today.setHours(0, -currentDate.getTimezoneOffset(), 0, 0);
(based on the codebase for a project I did)
var aDate = new Date();
var startOfTheDay = new Date(aDate.getTime() - aDate.getTime() % 86400000)
This will give you the beginning of the day (in UTC) for the date in question.
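If you need the day boundary for a particular offset rather than UTC, the same trick can be applied after shifting (a sketch; the -8 hour offset is only an example):
var offsetMs = -8 * 60 * 60 * 1000;       // example offset (UTC-8)
var shifted = aDate.getTime() + offsetMs; // wall-clock ms in that zone
var startOfDay = new Date(shifted - (shifted % 86400000) - offsetMs);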
You can make use of Intl.DateTimeFormat. This is also how Luxon handles timezones.
The code below can convert any date in any timezone to its beginning or end of day.
const beginningOfDay = (options = {}) => {
  const { date = new Date(), timeZone } = options;
  // Get the wall-clock time of `date` in the target time zone
  const parts = Intl.DateTimeFormat("en-US", {
    timeZone,
    hourCycle: "h23",
    hour: "numeric",
    minute: "numeric",
    second: "numeric",
  }).formatToParts(date);
  const hour = parseInt(parts.find((i) => i.type === "hour").value);
  const minute = parseInt(parts.find((i) => i.type === "minute").value);
  const second = parseInt(parts.find((i) => i.type === "second").value);
  // Subtract the elapsed hours/minutes/seconds to land on that zone's midnight
  return new Date(
    1000 *
      Math.floor(
        (date - hour * 3600000 - minute * 60000 - second * 1000) / 1000
      )
  );
};

const endOfDay = (...args) =>
  new Date(beginningOfDay(...args).getTime() + 86399999);

const beginningOfYear = () => {};

console.log(beginningOfDay({ timeZone: "GMT" }));
console.log(endOfDay({ timeZone: "GMT" }));
console.log(beginningOfDay({ timeZone: "Asia/Tokyo" }));
console.log(endOfDay({ timeZone: "Asia/Tokyo" }));
