
Author Eli Collins
Recipients Eli Collins
Date 2017-01-22.17:47:33
I've found odd behavior when passing very large values to ``datetime.datetime.utcfromtimestamp()`` and ``.fromtimestamp()`` under Python 3.6.

Under Python 3.5, ``utcfromtimestamp(1<<40)`` would raise a ValueError saying the year was out of range. Under Python 3.6, this now returns a datetime in year 36812 (which seems reasonable given the input).

The unexpected behavior appears when more bits are added: ``utcfromtimestamp(1<<41)`` returns a datetime with a *smaller* year (6118). The pattern continues as the bit count grows, with the year increasing and then wrapping around again, until the value exceeds time_t (at which point Python 3.6 raises the same OSError as 3.5).
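The wrap-around is easy to see with a short probe loop (a sketch along the lines of the attached script, not the script itself; on a fixed interpreter every value past year 9999 raises, while on 3.6.0 the year wraps instead):

```python
from datetime import datetime

# Walk up through power-of-two timestamps and report what
# utcfromtimestamp() does with each.
for bits in range(36, 42):
    ts = 1 << bits
    try:
        print(f"1 << {bits}: year {datetime.utcfromtimestamp(ts).year}")
    except (ValueError, OverflowError, OSError) as exc:
        print(f"1 << {bits}: {type(exc).__name__}: {exc}")
```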

It looks to me like 3.6 dropped a bounds check somewhere, and is now truncating the high bits off the resulting year?
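As a rough sanity check on that guess (plain arithmetic only, not the actual CPython internals): approximating the year reached by each timestamp with an average Gregorian year, then masking it to 16 bits as a stand-in for "truncating high bits", reproduces the observed values exactly.

```python
# Approximate the calendar year for a POSIX timestamp using the average
# Gregorian year length, then truncate to the low 16 bits to model a
# 16-bit overflow of the year. (Illustrative arithmetic -- the real
# computation in CPython is exact, not an average-year estimate.)
SECONDS_PER_YEAR = 31_556_952  # 365.2425 days * 86400 seconds
EPOCH_YEAR = 1970

def approx_year(ts):
    return EPOCH_YEAR + ts // SECONDS_PER_YEAR

print(approx_year(1 << 40))              # 36812: the year 3.6 reports
print(approx_year(1 << 41))              # 71654: the year it "should" be
print(approx_year(1 << 41) % (1 << 16))  # 6118: the wrapped year 3.6 reports
```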


Attached is the "" script I was using to examine the boundary behavior of ``utcfromtimestamp()`` when I found this bug.

The system was running Linux Mint 18.1 x86_64, using the Python 3.6.0 build from (Ubuntu's Python 3.6.0 build also shows this behavior).