Author: pitrou
Recipients: lemburg, loewis, pitrou, stutzbach
Date: 2010-05-21 13:19:02
Message-id: <1274447945.54.0.456108066249.issue8781@psf.upfronthosting.co.za>
In-reply-to:
Content
The problem with a signed Py_UNICODE is implicit sign extension (rather than zero extension) in some conversions, for example from "char" or "unsigned char" to "Py_UNICODE". The effects could range from incorrect results to outright crashes, not only in our own code but also in C extensions that rely on the unsignedness of Py_UNICODE.
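
As a minimal, self-contained illustration of the hazard (not CPython code: the signed 16-bit typedef below is hypothetical, chosen only to contrast with the unsigned type CPython actually uses on narrow builds):

#include <stdio.h>

/* Hypothetical types for illustration only: a signed 16-bit "Py_UNICODE"
 * versus the unsigned type CPython actually uses. */
typedef short          signed_ch_t;
typedef unsigned short unsigned_ch_t;

int main(void)
{
    signed_ch_t   s = (signed_ch_t)0xFFFD;  /* U+FFFD, high bit set */
    unsigned_ch_t u = 0xFFFD;

    /* When widened to int (array index, comparison, switch, ...),
     * the signed variant is sign-extended to a negative value on
     * typical two's-complement platforms; the unsigned one is not. */
    printf("signed:   %d\n", (int)s);   /* typically -3 */
    printf("unsigned: %d\n", (int)u);   /* 65533 */

    /* So range checks and table lookups silently misbehave: */
    if (s > 0xFF)
        printf("signed branch taken\n");     /* not reached */
    if (u > 0xFF)
        printf("unsigned branch taken\n");   /* reached */
    return 0;
}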

Is there a way to enable those optimizations while keeping an unsigned Py_UNICODE type? It seems Py_UNICODE doesn't have to be typedef'ed to wchar_t; it could instead be defined as an unsigned integer type of the same width (see the sketch below). Or would that break some part of the C standard?
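
A rough sketch of that alternative, assuming the SIZEOF_WCHAR_T macro from pyconfig.h is available; this is only an illustration of the idea, not the actual header, and the helper functions are made up for the example:

#ifndef SIZEOF_WCHAR_T
#  define SIZEOF_WCHAR_T 4          /* normally supplied by pyconfig.h */
#endif

#include <wchar.h>

/* Define Py_UNICODE as an unsigned integer of the same width as wchar_t
 * instead of as wchar_t itself. */
#if SIZEOF_WCHAR_T == 2
typedef unsigned short Py_UNICODE;   /* narrow (16-bit) build */
#elif SIZEOF_WCHAR_T == 4
typedef unsigned int   Py_UNICODE;   /* wide (32-bit) build */
#else
#  error "unsupported wchar_t width"
#endif

/* Conversions to and from wchar_t then need explicit casts at the
 * boundaries where the wchar_t-based code paths apply. */
static wchar_t    py_as_wchar(Py_UNICODE ch) { return (wchar_t)ch; }
static Py_UNICODE py_from_wchar(wchar_t ch)  { return (Py_UNICODE)ch; }

The explicit casts keep the sign/zero-extension behavior visible at the conversion points instead of depending on the platform-specific signedness of wchar_t.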
History
Date                 User    Action  Args
2010-05-21 13:19:05  pitrou  set     recipients: + pitrou, lemburg, loewis, stutzbach
2010-05-21 13:19:05  pitrou  set     messageid: <1274447945.54.0.456108066249.issue8781@psf.upfronthosting.co.za>
2010-05-21 13:19:03  pitrou  link    issue8781 messages
2010-05-21 13:19:02  pitrou  create