PEP 3121, 384 Refactoring applied to decimal module #59927
Comments
Thank you for the patch and the large amount of work you are doing on this. I'm afraid, though, that the patch is not acceptable in its current state:

If 2) cannot be fixed due to the increased number of state lookups, I'm …
Ouch. That's an interesting observation, because I guess other modules could be affected as well. Robin, can you please take care that other modules don't have significant performance regressions after your work? I'm not sure what kind of changes it involves, or if it's possible at all, but we don't want to slow down Python for what is mostly (IMO) a code cleanup.
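The concern here is repeated per-call module-state lookups in the C code. As a rough Python analogy (illustrative only, not the C `PyState_FindModule` API itself), the fix amounts to hoisting a repeated lookup out of a hot path:

```python
import decimal

def sum_slow(n):
    # Resolves the module global and its attribute on every iteration,
    # loosely analogous to re-fetching module state on each C-level call.
    total = decimal.Decimal(0)
    for i in range(n):
        total += decimal.Decimal(i)
    return total

def sum_fast(n):
    # Hoist the lookup once, outside the loop -- the same idea as
    # caching the module-state pointer instead of looking it up again.
    D = decimal.Decimal
    total = D(0)
    for i in range(n):
        total += D(i)
    return total
```

Both functions return the same result; the difference only shows up in timing on hot paths, which mirrors why the C-level change affects performance but not semantics.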
I have removed some redundant module-state lookups, and the test suite now executes the decimal tests as fast as before the patch was applied (at least for me).

Regarding the failing test:

```c
static int
hackcheck(PyObject *self, setattrofunc func, char *what)
{
    PyTypeObject *type = Py_TYPE(self);
    while (type && type->tp_flags & Py_TPFLAGS_HEAPTYPE)
        type = type->tp_base;
    /* If type is NULL now, this is a really weird type.
       In the spirit of backwards compatibility (?), just shut up. */
    if (type && type->tp_setattro != func) {
        PyErr_Format(PyExc_TypeError,
                     "can't apply this %s to %s object",
                     what,
                     type->tp_name);
        return 0;
    }
    return 1;
}
```

As the context type is now a heap type, the while loop keeps jumping to the base type of the context type, which is the object type. There it ultimately fails when it compares func (context_setattr) to the tp_setattro of object.
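The base-type walk can be observed from Python: for any heap type (e.g. a class defined in Python code, or now the heap-allocated context type), following `__base__` until the `Py_TPFLAGS_HEAPTYPE` flag is clear always lands on a static type such as `object`. A small sketch, where `1 << 9` is CPython's value for `Py_TPFLAGS_HEAPTYPE`:

```python
Py_TPFLAGS_HEAPTYPE = 1 << 9  # from CPython's object.h

class HeapContext:
    # Stand-in for the now heap-allocated decimal context type.
    pass

t = HeapContext
while t is not None and t.__flags__ & Py_TPFLAGS_HEAPTYPE:
    t = t.__base__

print(t)  # → <class 'object'>, the first static base type
```

Since `object.tp_setattro` is not `context_setattr`, hackcheck's comparison fails exactly as described above.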
The test suite is not a good benchmark: it also tests decimal.py. For benchmarking:

```
cd Modules/_decimal/tests
```

You can hit Ctrl-C after the first cdecimal result, since that's usually …

Running an unpatched _decimal.c three times gives (Linux, 64-bit, Core2 Duo):

```
0.162576s
0.165146s
0.163242s
```

With your second patch:

```
0.204383s
0.204383s
0.206919s
```
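For a quick, self-contained micro-benchmark outside the CPython tree (a sketch, not the Modules/_decimal/tests harness referred to above), `timeit` can be pointed at a decimal-heavy loop directly:

```python
import timeit

stmt = "total = sum(Decimal(i) / Decimal(7) for i in range(1000))"
setup = "from decimal import Decimal"

# Take the best of several repeats to reduce noise, the same reason
# the unpatched and patched builds are each timed three times above.
best = min(timeit.repeat(stmt, setup=setup, repeat=3, number=10))
print(f"{best:.6f}s")
```

Running this against an unpatched and a patched build would expose a slowdown of the magnitude reported here (roughly 25%) without involving decimal.py.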
Thanks for the analysis. Perhaps Martin can comment on that.
hackcheck fixes the "Carlo Verre hack", which goes like this:

```
py> object.__setattr__(str, 'lower', str.upper)
```

It shouldn't be possible to monkey-patch a builtin type; I believe this is to prevent crashes when self has an incorrect layout. Other than that, I find it an arbitrary restriction: if regular attribute assignment prevents an assignment from occurring, it shouldn't be possible to bypass that by calling __setattr__ directly. So if the restriction could be lifted, it should be lifted in both cases. What specific decimal test is failing?
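The hack in question, and the TypeError that hackcheck raises to block it, can be reproduced directly:

```python
# Attempting to monkey-patch a built-in (static) type is rejected
# by hackcheck before the assignment can take effect.
try:
    object.__setattr__(str, 'lower', str.upper)
    blocked = False
except TypeError as exc:
    blocked = True
    print(exc)  # e.g. "can't apply this __setattr__ to type object"
```

Without this guard, every subsequent `.lower()` call on every string in the interpreter would silently return uppercase text.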
The failing test is "# Attributes cannot be deleted":

```
test test_decimal failed -- Traceback (most recent call last):
  File "/home/stefan/pydev/pep-3121-cpython/Lib/test/test_decimal.py", line 3683, in test_invalid_context
    self.assertRaises(AttributeError, c.__delattr__, attr)
  File "/home/stefan/pydev/pep-3121-cpython/Lib/unittest/case.py", line 571, in assertRaises
    return context.handle('assertRaises', callableObj, args, kwargs)
  File "/home/stefan/pydev/pep-3121-cpython/Lib/unittest/case.py", line 135, in handle
    callable_obj(*args, **kwargs)
TypeError: can't apply this __delattr__ to object object
```

1 test failed.
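The assertion follows the standard assertRaises pattern. With a hypothetical stand-in class (since the real C context type is exactly what the patch changes), the expectation the test encodes is:

```python
import unittest

class FakeContext:
    # Hypothetical stand-in for the C decimal context: attributes may
    # be set but never deleted, which is what test_invalid_context expects.
    def __delattr__(self, name):
        raise AttributeError(name + " cannot be deleted")

class InvalidContextTest(unittest.TestCase):
    def test_delattr(self):
        c = FakeContext()
        # Passes only if AttributeError is raised; the patched module
        # raised TypeError from hackcheck instead, so the test failed.
        self.assertRaises(AttributeError, c.__delattr__, "prec")

suite = unittest.TestLoader().loadTestsFromTestCase(InvalidContextTest)
outcome = unittest.TextTestRunner(verbosity=0).run(suite)
```

Under the patch, the TypeError from hackcheck propagates out of `c.__delattr__` before the expected AttributeError can be raised, which is why assertRaises reports a failure rather than a pass.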
In order to avoid the significant slowdown: could we create a new …
This sounds like a question for python-dev (or perhaps python-ideas).
Yes, python-ideas is probably better. -- I just noticed that the …
Sorry Robin, I was wrong about the context -- it should be fine.
I would like to reject this until either the performance problems …

If you do, you get tons of warnings about libmpdec being reinitialized, …