
Author ggenellina
Recipients ggenellina, mark.dickinson, ncoghlan
Date 2008-03-26.06:36:58
Message-id <1206513421.92.0.703000109012.issue2483@psf.upfronthosting.co.za>
Content
Are numbers so special that they get to break the rules? Why stop here? What about other types that might want to accept ASCII bytes instead of characters? Isn't this like going back to the 2.x world?

The protocol with embedded ASCII numbers isn't a very convincing case for me. One can read a binary integer in C with a single function call. In Python 2.x this can't be done in a single call: one has to use struct.unpack to decode the bytes read, and there were no complaints that I know of. In 3.0 the same happens for ASCII numbers too: one will have to decode them first. The conversion may look like a stupid step, but it's exactly as stupid as having to use struct.unpack to convert some bits to the *same* bits inside the integer object.
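
As a sketch of that two-step read (the file name and the big-endian 4-byte field layout here are just assumptions for illustration):

    import struct

    # Read a 4-byte big-endian integer from a binary stream; what is a
    # single fread()+cast in C takes a read plus a decode step here.
    with open('data.bin', 'rb') as f:
        raw = f.read(4)                      # the raw bytes off the wire
        (value,) = struct.unpack('>i', raw)  # decode them into an int object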

Writing int(str(value, 'ascii')) doesn't look so terrible.
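
For an ASCII count field read off the wire, that idiom is simply (the field value is a made-up example):

    field = b'1234'                    # digits as received from the stream
    number = int(str(field, 'ascii'))  # decode first, then convert
    assert number == 1234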

And one may even argue that int(b'1234') should return 0x34333231 instead of 1234: b'1234' *is* the binary representation of 0x34333231 in little-endian format.
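
A quick check of that claim, using struct itself:

    import struct

    # b'1234' is the bytes 0x31 0x32 0x33 0x34; read as a little-endian
    # 32-bit unsigned integer they spell 0x34333231, not the number 1234.
    (as_binary,) = struct.unpack('<I', b'1234')
    assert as_binary == 0x34333231
    assert int(str(b'1234', 'ascii')) == 1234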