In Python, using calendar.timegm(), I get a 10-digit result for a Unix timestamp. When I put this into JavaScript's setTime() function, it comes up with a date in 1970. It evidently needs a Unix timestamp that is 13 digits long. Why is that? Are they both counting from the same date?
How can I use the same Unix timestamp between these two languages?
In Python:
In [60]: parseddate.utctimetuple()
Out[60]: (2009, 7, 17, 1, 21, 0, 4, 198, 0)
In [61]: calendar.timegm(parseddate.utctimetuple())
Out[61]: 1247793660
In Firebug:
>>> var d = new Date(); d.setTime(1247793660); d.toUTCString()
"Thu, 15 Jan 1970 10:36:55 GMT"
calendar.timegm() returns seconds since Jan 1, 1970 (it is the inverse of time.gmtime()). JavaScript's setTime() method expects milliseconds since that same date, so you'll need to multiply your seconds by 1000 to convert to the format JavaScript expects.
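For example, re-running the Firebug snippet from the question with the value scaled to milliseconds gives the expected date (a minimal sketch using the timestamp from the question):
var unixSeconds = 1247793660;   // seconds from calendar.timegm()
var d = new Date();
d.setTime(unixSeconds * 1000);  // setTime() wants milliseconds
d.toUTCString();                // "Fri, 17 Jul 2009 01:21:00 GMT"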
Here are a couple of Python helpers I use to convert between JavaScript timestamps and datetime objects:
import datetime
import time

def to_datetime(js_timestamp):
    # JavaScript timestamps are in milliseconds; fromtimestamp() wants seconds
    return datetime.datetime.fromtimestamp(js_timestamp / 1000)

def js_timestamp_from_datetime(dt):
    # mktime() gives seconds; JavaScript wants milliseconds
    return 1000 * time.mktime(dt.timetuple())
In JavaScript you would do:
var dt = new Date();
dt.setTime(js_timestamp);
Are you possibly mixing up seconds-since-1970 with milliseconds-since-1970?
The JavaScript Date constructor works with milliseconds, so you should multiply the Python Unix time by 1000.
var unixTimestampSeg = 1247793660;
var date = new Date(unixTimestampSeg*1000);
Given a JavaScript date, how can I convert this to the same format as Swift JSON encoding?
e.g. I would like to get a value of 620102769.132999 for the date 2020-08-26 02:46:09
The default Swift JSON encoding outputs a value which is the number of seconds that have passed since ReferenceDate. https://developer.apple.com/documentation/foundation/jsonencoder/2895363-dateencodingstrategy
It seems ReferenceDate is 00:00:00 UTC on 1 January 2001.
https://developer.apple.com/documentation/foundation/nsdate/1409769-init
function dateToSwiftInterval(date: Date): number {
    // Milliseconds between the given date and the Swift reference date (2001-01-01 UTC)
    const referenceDate = Date.UTC(2001, 0, 1);
    const timeSpanMs = date.getTime() - referenceDate;
    return timeSpanMs / 1000;
}
const myDate = new Date('2020-08-26T02:46:09.133Z');
console.log(dateToSwiftInterval(myDate)); // 620102769.133
As Elwyn says, Swift represents dates as time intervals that are seconds since 1 Jan 2001 UTC. Javascript Dates use milliseconds since 1 Jan 1970 UTC, so all you need to do is adjust by the reference date difference and divide by 1000, e.g.
// Javascript function to convert a Date to a Swift time interval
// date is a Javascript Date, defaults to current date and time
function toSwiftTI(date = new Date()) {
    return (date - Date.UTC(2001,0,1)) / 1000;
}
console.log(toSwiftTI());
Since the time difference is a constant, 978307200000, you might just use that instead of calculating it every time, so:
return (date - 978307200000) / 1000;
Going the other way, just multiply by 1,000 and add the constant:
function swiftToDate(ti) {
    return new Date(ti * 1000 + 978307200000);
}
console.log(swiftToDate(620102769.132999).toISOString());
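As a quick sanity check (assuming the two helpers above), converting a Date to a Swift interval and back returns the same instant:
const original = new Date('2020-08-26T02:46:09.133Z');
const interval = toSwiftTI(original);       // seconds since 2001-01-01 UTC
const roundTripped = swiftToDate(interval); // back to a JS Date
console.log(interval);                                      // 620102769.133
console.log(roundTripped.getTime() === original.getTime()); // true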
I am trying to subtract hours from a given date-time string using JavaScript.
My code looks like:
var cbTime = new Date();
cbTime = selectedTime.setHours(-5.5);
where selectedTime is the given time (the time that I pass as a parameter).
So suppose selectedTime is Tue Sep 16 19:15:16 UTC+0530 2014
The answer I get is: 1410875116995
I want the answer in date-time format.
Am I doing something wrong here? Or there is some other solution?
The reason is that setHours(), setMinutes(), etc. take an integer as a parameter. From the docs:
...
The setMinutes() method sets the minutes for a specified date
according to local time.
...
Parameters:
An integer between 0 and 59, representing the minutes.
So, you could do this:
var selectedTime = new Date(),
    cbTime = new Date();
cbTime.setHours(selectedTime.getHours() - 5);
cbTime.setMinutes(selectedTime.getMinutes() - 30);
document.write('cbTime: ' + cbTime);
document.write('<br>');
document.write('selectedTime: ' + selectedTime);
Well, first off, setting the hours to -5.5 is nonsensical: the code will truncate it to an integer (-5) and then take that as "five hours before midnight", which is 7 PM the previous day.
Second, setHours (and other functions like it) modify the Date object (try console.log(cbTime)) and return the timestamp (number of milliseconds since the epoch).
You should not rely on the output format of the browser converting the Date object to a string for you, and should instead use get*() functions to format it yourself.
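A minimal sketch of that idea, formatting the adjusted time yourself from the individual components (the pad() helper and the hard-coded selectedTime are just for illustration):
// selectedTime from the question, minus 5.5 hours
var selectedTime = new Date('2014-09-16T19:15:16+05:30');
var cbTime = new Date(selectedTime.getTime() - 5.5 * 60 * 60 * 1000);

function pad(n) { return (n < 10 ? '0' : '') + n; }
var formatted = cbTime.getFullYear() + '-' +
    pad(cbTime.getMonth() + 1) + '-' +
    pad(cbTime.getDate()) + ' ' +
    pad(cbTime.getHours()) + ':' +
    pad(cbTime.getMinutes()) + ':' +
    pad(cbTime.getSeconds());
console.log(formatted); // "2014-09-16 13:45:16" when run in IST (UTC+05:30)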
According to this:
http://www.w3schools.com/jsref/jsref_sethours.asp
You'll get "Milliseconds between the date object and midnight January 1 1970" as a return value of setHours.
Perhaps you're looking for this:
http://www.w3schools.com/jsref/tryit.asp?filename=tryjsref_sethours3
Edit:
If you want to subtract 5.5 hours, first you have to subtract 5 hours, then 30 minutes. Optionally you can convert 5.5 hours to 330 minutes and subtract them like this:
var d = new Date();
d.setMinutes(d.getMinutes() - 330);
document.getElementById("demo").innerHTML = d;
Use:
var cbTime = new Date();
// setHours() truncates fractions, so subtract 5.5 hours in milliseconds instead
cbTime.setTime(cbTime.getTime() - 5.5 * 60 * 60 * 1000);
cbTime.toLocaleString();
I am reading Excel data using PHP and JavaScript, storing the results in a variable and showing them on the page.
Simple code example:
var yearend = "<?php echo ($connection->sheets[0]["cells"][2][5]); ?>";
This works for text and number fields. But when I format a cell as "Date", it returns a value such as:
Excel field is: 31-Dec-2015 - JavaScript returns value: 40542
I know it is MS Excel's DATEVALUE formatting.
But I need to convert it to a date using JavaScript so it shows 31-Dec-2015 or 31 December 2015 only.
So in short:
From Excel 40542 to JavaScript 31 December 2015.
Also, I only need the date as above, without the trailing time and timezone, so removing:
00:00:00 00:00 GMT
Also, is it possible to modify the date by +1 day or -1 day?
// Convert an Excel date serial into a JS Date object
// @param excelDate {Number}
// @return {Date}
function getJsDateFromExcel(excelDate) {
    // JavaScript dates can be constructed by passing milliseconds
    // since the Unix epoch (January 1, 1970), e.g. new Date(12312512312).
    // 1. Subtract the number of days between Jan 1, 1900 and Jan 1, 1970,
    //    plus 1 (Google "excel leap year bug").
    // 2. Convert days to milliseconds.
    return new Date((excelDate - (25567 + 1)) * 86400 * 1000);
}
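For example (a small sketch assuming the helper above and the serial value from the question; the 'en-GB' locale is just one way to get the "31 December 2015" style of output):
var d = getJsDateFromExcel(40542);

// Date only, no trailing time or timezone; timeZone: 'UTC' keeps the day
// from shifting in browsers west of Greenwich
var opts = { day: 'numeric', month: 'long', year: 'numeric', timeZone: 'UTC' };
console.log(d.toLocaleDateString('en-GB', opts));

// Shift the date by +1 day (use -1 to go back a day)
d.setDate(d.getDate() + 1);
console.log(d.toLocaleDateString('en-GB', opts));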
Try this:
function toDate(serialDate, time = false) {
    let locale = navigator.language;
    // Local timezone offset in minutes at the epoch
    let offset = new Date(0).getTimezoneOffset();
    // Year 0 maps to 1900, so day = serialDate counts days from the Excel
    // epoch (Dec 31, 1899); minutes = -offset shifts the result to midnight UTC
    let date = new Date(0, 0, serialDate, 0, -offset, 0);
    if (time) {
        return date.toLocaleTimeString(locale);
    }
    return date.toLocaleDateString(locale);
}
Use the following PHP function to convert the DATEVALUE into a PHP timestamp. You can then use the standard date functions to format it however you wish:
function xl2timestamp($xl_date){
    return ($xl_date - 25569) * 86400;
}
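If you echo that timestamp into the page the same way as the yearend variable above, the JavaScript side still needs the usual seconds-to-milliseconds conversion (a sketch, assuming xl2timestamp() has been applied on the PHP side):
// Seconds since the Unix epoch, echoed by PHP via xl2timestamp()
var unixSeconds = <?php echo xl2timestamp($connection->sheets[0]["cells"][2][5]); ?>;
var d = new Date(unixSeconds * 1000); // the JS Date constructor wants milliseconds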
I have a UNIX timestamp 1411866803 which is equivalent to Sun, 28 Sep 2014 01:13:23 GMT
I want to get the month from the timestamp according to local time of the user. I use the following function:
<script>
var date = new Date(1411866803);
var month = date.getMonth();
alert(month);
</script>
This should return 9, but instead it returns 0. Is there something wrong with the above syntax? Thanks :)
JavaScript timestamps are expressed in milliseconds. Multiply by 1000:
var date = new Date(1411866803000);
var month = date.getMonth();
alert(month); // => 8
Also, months are 0-based, so September is 8, not 9.
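If you want the calendar month number or its name rather than the zero-based index, adjust the index or let toLocaleString() do the naming (a small sketch, assuming an en-US locale):
var date = new Date(1411866803 * 1000);
var monthNumber = date.getMonth() + 1;                           // 9 for September
var monthName = date.toLocaleString('en-US', { month: 'long' }); // "September"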
The Date object takes in milliseconds, whereas UNIX timestamps use seconds. You have to convert it over.
var date = new Date(1411866803*1000);
var month = date.getMonth();
alert(month);
The Date object takes milliseconds instead of seconds, so just append three zeroes and you will be fine.
var date = new Date(1411866803000);
var month = date.getMonth();
alert(month);
If I test your script in a console, I get this date to be:
var date = new Date(1411866803)
Date {Sat Jan 17 1970 20:11:06 GMT+1200 (NZST)}
And according to this page: http://www.w3schools.com/jsref/jsref_getmonth.asp
The getMonth() method returns the month (from 0 to 11) for the specified date, according to local time.
Note: January is 0, February is 1, and so on.
I'm having difficulty working with dates in Python and Javascript.
>>> d = date(2004, 01, 01)
>>> d
datetime.date(2004, 1, 1)
>>> time.mktime(d.timetuple())
1072944000.0
Then, in JavaScript (data sent over Ajax):
>>> new Date(1072944000.0)
Tue Jan 13 1970 02:02:24 GMT-0800 (PST) {}
I'm confused. Shouldn't the Javascript date be the same as the one that I entered in Python? What am I doing wrong?
JavaScript's Date() takes milliseconds as an argument; Python's mktime() gives seconds. You have to multiply by 1,000.
Python:
import datetime, time
d = datetime.datetime.utcnow()
for_js = int(time.mktime(d.timetuple())) * 1000
Then in JS:
new Date({{ for_js }});
In Flask you can do:
@app.template_filter('date_to_millis')
def date_to_millis(d):
    """Converts a datetime object to the number of milliseconds since the Unix epoch."""
    return int(time.mktime(d.timetuple())) * 1000
And then do:
new Date({{ current_user.created|date_to_millis }});
Python is returning the time since the epoch in seconds. Javascript takes the time in milliseconds. Multiply the time by 1000 before passing it to Date() and you should get the expected result.
new Date(1072944000.0 * 1000)
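Checking the corrected value shows it now lands on the date entered in Python; the 08:00 offset is there because mktime() interpreted midnight in the asker's Pacific timezone:
new Date(1072944000.0 * 1000).toUTCString();
// "Thu, 01 Jan 2004 08:00:00 GMT"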