Optimize dict_merge for copy #85603
Although there are `dict.copy()` and `PyDict_Copy()`, `dict_merge` can also be used for copying a dict.

Microbenchmark for commit cf4f61c:

```
timeit -s 'd=dict.fromkeys(range(8))' -- 'dict(d)'
timeit -s 'd=dict.fromkeys(range(1000))' -- 'dict(d)'
timeit -s 'd=dict.fromkeys(range(8))' -- '{}.update(d)'
timeit -s 'd=dict.fromkeys(range(1000))' -- '{}.update(d)'
```
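For context, the copy idioms benchmarked above can be compared directly in pure Python. This is just a sketch of the observable behavior; the optimization under discussion lives in the C-level `dict_merge` path that `dict(d)` and `update()` go through:

```python
# Three ways to copy a dict: dict(d) and {}.update(d) route through
# dict_merge, while d.copy() calls PyDict_Copy().
d = dict.fromkeys(range(8))

via_constructor = dict(d)
via_copy = d.copy()
via_update = {}
via_update.update(d)

# All three produce an equal but distinct dict.
assert via_constructor == via_copy == via_update == d
assert via_constructor is not d
```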
To reduce code size, I am considering removing `clone_combined_dict`. I will check how performance-critical `PyDict_Copy()` is. This is the microbenchmark result for `d.copy()` and `dict(d)`:

```
$ ./python -m pyperf timeit --compare-to ./python-master -s 'd=dict.fromkeys(range(1000))' -- 'd.copy()'
python-master: ..................... 4.36 us +- 0.07 us
python: ..................... 5.96 us +- 0.10 us

Mean +- std dev: [python-master] 4.36 us +- 0.07 us -> [python] 5.96 us +- 0.10 us: 1.37x slower (+37%)

$ ./python -m pyperf timeit --compare-to ./python-master -s 'd=dict.fromkeys(range(1000))' -- 'dict(d)'
python-master: ..................... 21.6 us +- 0.2 us
python: ..................... 6.01 us +- 0.09 us

Mean +- std dev: [python-master] 21.6 us +- 0.2 us -> [python] 6.01 us +- 0.09 us: 3.59x faster (-72%)
```
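The comparison can also be reproduced roughly with the stdlib `timeit` module instead of pyperf (pyperf gives more stable numbers; absolute timings here are machine-dependent, so this is only a sketch):

```python
import timeit

# Same setup as the pyperf runs above: a 1000-key dict.
d = dict.fromkeys(range(1000))

copy_time = timeit.timeit('d.copy()', globals={'d': d}, number=10_000)
ctor_time = timeit.timeit('dict(d)', globals={'d': d}, number=10_000)

print(f"d.copy(): {copy_time:.3f}s for 10k calls")
print(f"dict(d):  {ctor_time:.3f}s for 10k calls")
```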
`PyDict_Copy()` is not used in the eval loop or when calling functions, so removing `clone_combined_dict()` is a reasonable option. Another option is to use `clone_combined_dict()` in `dict_merge`, instead of adding `dict_copy2()`. Pros: no performance regression; `PyDict_Copy()` is as fast as before. I suppose most dicts used by […]
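A rough pure-Python analogy for the trade-off being discussed: `clone_combined_dict` copies the combined keys/values table wholesale, while the generic merge path re-inserts entries one at a time. The sketch below only mirrors the shape of the two strategies, not the actual C code:

```python
def copy_item_by_item(src):
    # Analogous to the generic dict_merge path: every entry is
    # re-inserted, re-hashing each key into the destination table.
    dst = {}
    for k, v in src.items():
        dst[k] = v
    return dst


def copy_wholesale(src):
    # Analogous to the clone path: reuse the source's table layout
    # in one shot (dict.copy() uses PyDict_Copy() underneath).
    return src.copy()


d = dict.fromkeys(range(1000))
assert copy_item_by_item(d) == copy_wholesale(d) == d
```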