classification
Title: Use PyMem_New instead of PyMem_Malloc
Type: enhancement Stage: resolved
Components: Extension Modules, Interpreter Core Versions: Python 3.4, Python 3.5, Python 2.7
process
Status: closed Resolution: fixed
Dependencies: Superseder:
Assigned To: Nosy List: Arfrever, benjamin.peterson, python-dev, serhiy.storchaka, skrah, vstinner
Priority: normal Keywords: patch

Created on 2015-02-11 17:30 by serhiy.storchaka, last changed 2015-03-02 18:59 by benjamin.peterson. This issue is now closed.

Files
File name        Uploaded
pymem_new.patch  serhiy.storchaka, 2015-02-11 17:30
Messages (8)
msg235758 - (view) Author: Serhiy Storchaka (serhiy.storchaka) * (Python committer) Date: 2015-02-11 17:30
The proposed patch replaces PyMem_Malloc() with PyMem_New() wherever the former is used in the form PyMem_Malloc(len * sizeof(type)). This avoids possible integer overflow errors and makes the code cleaner.
msg235770 - (view) Author: Stefan Krah (skrah) * (Python committer) Date: 2015-02-11 21:25
In _testbuffer.c:  ndim <= 64, so the changes aren't really necessary.
Somehow this fact needs to get widely known, since it does not make
sense to check for overflow anytime ndim is used.

The reason is of course that even an array with only 2 elements per
dimension gets quite large with ndim=64. :)
msg235785 - (view) Author: Benjamin Peterson (benjamin.peterson) * (Python committer) Date: 2015-02-12 01:16
Very nice. I think you should also apply it to older versions, since (as we know) this sort of thing is very liable to cause security problems.
msg235807 - (view) Author: Serhiy Storchaka (serhiy.storchaka) * (Python committer) Date: 2015-02-12 11:18
> In _testbuffer.c:  ndim <= 64, so the changes aren't really necessary.

Indeed, I'll remove these changes.

> The reason is of course that even an array with only 2 elements per
> dimension gets quite large with ndim=64. :)

But an array can have 1 element per dimension. In any case it is good that there is a strict limitation on ndim values.
msg235818 - (view) Author: Stefan Krah (skrah) * (Python committer) Date: 2015-02-12 12:26
Yes, but these (degenerate) arrays tend to arise only as a result of slicing.
Last time I looked NumPy had MAX_NDIM=32, so we should be fine.
msg236100 - (view) Author: Roundup Robot (python-dev) (Python triager) Date: 2015-02-16 11:35
New changeset d83884b3a427 by Serhiy Storchaka in branch '2.7':
Issue #23446: Use PyMem_New instead of PyMem_Malloc to avoid possible integer
https://hg.python.org/cpython/rev/d83884b3a427

New changeset 036a2aceae93 by Serhiy Storchaka in branch '3.4':
Issue #23446: Use PyMem_New instead of PyMem_Malloc to avoid possible integer
https://hg.python.org/cpython/rev/036a2aceae93

New changeset d12c7938c4b0 by Serhiy Storchaka in branch 'default':
Issue #23446: Use PyMem_New instead of PyMem_Malloc to avoid possible integer
https://hg.python.org/cpython/rev/d12c7938c4b0
msg237063 - (view) Author: Serhiy Storchaka (serhiy.storchaka) * (Python committer) Date: 2015-03-02 17:14
Maybe apply the fix to 3.3?
msg237076 - (view) Author: Benjamin Peterson (benjamin.peterson) * (Python committer) Date: 2015-03-02 18:59
That would be nice.
History
Date User Action Args
2015-03-02 18:59:12  benjamin.peterson  set     messages: + msg237076
2015-03-02 17:14:41  serhiy.storchaka   set     messages: + msg237063
2015-02-28 15:27:35  Arfrever           set     nosy: + Arfrever
                                                versions: + Python 2.7, Python 3.4
2015-02-16 14:31:52  serhiy.storchaka   set     status: open -> closed
                                                resolution: fixed
                                                stage: patch review -> resolved
2015-02-16 11:35:27  python-dev         set     nosy: + python-dev
                                                messages: + msg236100
2015-02-12 12:26:07  skrah              set     messages: + msg235818
2015-02-12 11:18:58  serhiy.storchaka   set     messages: + msg235807
2015-02-12 01:16:50  benjamin.peterson  set     messages: + msg235785
2015-02-11 21:25:20  skrah              set     nosy: + skrah
                                                messages: + msg235770
2015-02-11 17:30:15  serhiy.storchaka   create