Message106236
The problem with a signed Py_UNICODE is implicit sign extension (rather than zero extension) in some conversions, for example from "char" or "unsigned char" to "Py_UNICODE". The effects could range from incorrect results to outright crashes, and not only in our code, but also in C extensions that rely on the unsignedness of Py_UNICODE.
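For illustration, a minimal sketch of the hazard, assuming a platform where plain "char" is signed (as on x86). A byte with the high bit set becomes a large, wrong code point when widened through a signed type:

    #include <stdio.h>

    int main(void)
    {
        /* A byte with the high bit set, e.g. 0xE9 ('é' in Latin-1). */
        char c = (char)0xE9;

        /* If Py_UNICODE were a signed 16-bit type, widening c would
           sign-extend: the stored value becomes 0xFFE9, not 0x00E9. */
        signed short as_signed = c;                    /* sign extension */
        unsigned short as_unsigned = (unsigned char)c; /* zero extension */

        printf("signed:   0x%04X\n", (unsigned short)as_signed); /* 0xFFE9 */
        printf("unsigned: 0x%04X\n", as_unsigned);               /* 0x00E9 */
        return 0;
    }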
Is there a way to enable those optimizations while keeping an unsigned Py_UNICODE type? It seems Py_UNICODE doesn't have to be typedef'ed to wchar_t; it could instead be defined as an unsigned integer of the same width. Or would that break some part of the C standard?
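To make that suggestion concrete, here is a hypothetical sketch (not the actual pyconfig.h logic) of picking an unsigned integer type of the same width as wchar_t:

    #include <stdint.h>  /* uint16_t, uint32_t, WCHAR_MAX */

    /* Hypothetical: choose an unsigned type matching wchar_t's width,
       rather than typedef'ing wchar_t itself. */
    #if WCHAR_MAX <= 0xFFFF
    typedef uint16_t Py_UNICODE;   /* 16-bit wchar_t (e.g. Windows) */
    #else
    typedef uint32_t Py_UNICODE;   /* 32-bit wchar_t (e.g. most Unix) */
    #endif

The open question is whether compilers would still apply their wchar_t-specific optimizations to such a distinct, if layout-compatible, type.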