Testsuite fails on x86_64 #88891
Comments
The following tests fail on x86 32-bit: test_cmath, test_math, test_posix, test_turtle. It looks like the expected precision is too high. Full log is attached.
Extract from the log - from the configure output:
and from the test output
The first part indicates that your math library does provide expm1, so Python goes ahead and uses it. The errors from test_math show that the expm1 implementation provided by your math library has accuracy problems for large inputs. From the other failures, I suspect that the underlying issue is actually an issue with exp (but our tests for exp are not as thorough as those for expm1). In short, the tests represent issues with the underlying C math library. What OS is this, and who supplies the libm?
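To illustrate the kind of accuracy check involved, here is a hypothetical probe (not the actual test_math code) that compares the platform's expm1 against a high-precision reference computed with the `decimal` module:

```python
import math
from decimal import Decimal, getcontext

def expm1_rel_error(x: float) -> float:
    """Relative error of math.expm1(x) against a 50-digit Decimal reference."""
    getcontext().prec = 50
    ref = Decimal(x).exp() - 1          # high-precision exp(x) - 1
    got = Decimal(math.expm1(x))
    return float(abs(got - ref) / abs(ref))

# On a correct libm this stays within a few ULPs, i.e. a relative
# error on the order of 1e-16 for doubles; a buggy expm1 for large
# inputs would show much larger errors here.
for x in (1e-10, 1.0, 50.0, 700.0):
    print(f"x={x:>8}: relative error {expm1_rel_error(x):.3e}")
```

A check like this sidesteps the libm entirely for the reference value, which is why it can detect a broken platform expm1.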
The test_posix failure appears to be unrelated; I'll let others look into that one. The test_turtle failure again looks like a libm issue, perhaps combined with an overeager test: we're expecting a …
Update: test_turtle is fixed (44734); I think test_math and test_cmath should be resolved as either "third party" or "won't fix", since the issue is almost certainly the platform libm. That leaves test_posix, which we should probably open a fresh issue for.
Here's the test_posix failure, extracted from the attachment:

```
======================================================================
Traceback (most recent call last):
  File "/build/python/src/Python-3.9.6/Lib/test/test_posix.py", line 1347, in test_sched_rr_get_interval
    interval = posix.sched_rr_get_interval(0)
PermissionError: [Errno 1] Operation not permitted
```
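The failing call can be probed directly outside the test suite. A hypothetical helper (using `os.sched_rr_get_interval`, the same call exposed as `posix.sched_rr_get_interval`) that degrades gracefully on platforms or sandboxes that refuse it might look like:

```python
import os

def probe_sched_rr_interval(pid: int = 0):
    """Return the SCHED_RR quantum for `pid` in seconds, or None if the
    platform forbids or does not support the call."""
    try:
        return os.sched_rr_get_interval(pid)
    except AttributeError:
        return None  # call not available on this OS (e.g. macOS)
    except PermissionError:
        return None  # Errno 1, as seen in the failing test
    except OSError:
        return None  # other platform-specific refusals

interval = probe_sched_rr_interval()
print("SCHED_RR interval:", interval if interval is not None else "unavailable")
```

A PermissionError here (rather than in the interpreter itself) would support the idea that the build environment, not CPython, is denying the syscall.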
One more thought on the libm issues: from the logs, it looks as though the libm implementation is coming from a fairly recent version of glibc (glibc 2.33, released in February 2021). There were updates to the exp implementation in glibc in 2018 [1], but I'm not seeing anything more recent than that.

Is it possible that glibc was miscompiled somehow on this platform, e.g. that it's using code designed to work with the x87 FPU and its extended precision, but was compiled with flags that force use of SSE2 instead? If not, and if there's a genuine accuracy problem in glibc itself, we should expect to see more Python test_math and test_cmath failures in the future.

If anyone is able to run test_math or test_cmath on a machine using glibc 2.33 or glibc 2.34 and report results, that would be useful.

[1] https://sourceware.org/git/?p=glibc.git;a=commit;h=e70c17682518fab2fad164fecf73341443bc2ed3
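For anyone gathering the requested data points, a quick illustrative sketch (not the cpython test itself) can measure the error of the platform's exp in ULPs, again using `decimal` as the reference; `math.ulp` requires Python 3.9+:

```python
import math
from decimal import Decimal, getcontext

def exp_ulp_error(x: float) -> float:
    """Error of math.exp(x) in units in the last place (ULPs),
    measured against a 50-digit Decimal reference."""
    getcontext().prec = 50
    ref = Decimal(x).exp()
    got = math.exp(x)
    return float(abs(Decimal(got) - ref)) / math.ulp(got)

# A well-behaved libm exp is typically well under 1 ULP off; a
# miscompiled build (e.g. x87 extended-precision code paths forced
# onto SSE2) could show noticeably larger errors.
for x in (0.5, 10.0, 100.0, 700.0):
    print(f"exp({x}) off by {exp_ulp_error(x):.3f} ULP")
```

Reporting these numbers alongside the glibc version would make it easy to compare a suspect build against a known-good one.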