
Author phr
Date 2001-10-13.10:24:05
Content

1) If Python longs are currently implemented as vectors of 15-bit
digits (yikes--why on earth would anyone do that) and marshalled in
that form, then I agree that THAT much weirdness doesn't need to be
propagated to future versions.  Wow!  I never looked at the long int
code closely, and the marshal code certainly didn't make it obvious.
It's still possible to freeze the current marshal format and let
future versions define a new mechanism for loading .pyc's.  Out of
self-interest (wanting to distribute apps that work across versions)
that idea attracts me, but it's probably not the right thing in the
long run.  Better may be to fix the long int format right away and
THEN document and freeze it.  (Use a new format byte so the 2.2
demarshaller can still read 2.1 .pyc files.)  By "fix" I mean a
simple packed binary format: no 15-bit digits, no BER, and a length
prefix that is a byte or bit count, not a count of multibyte
"digits".

2) Unfortunately it's not easy, in portable C with 32-bit longs, to
use digits wider than 16 bits--multiplication becomes too
complicated.  If the compiler supports wide ints (long long int),
conditionalized code to use them might or might not be deemed
worthwhile.  Python's long int arithmetic (unlike Perl's Math::BigInt
class) is fast enough to be usable for real applications, and I don't
expect it to go to the extremes that gmpy does (highly tuned
algorithms for everything, asm code for many CPUs, etc.).  So
currently I use gmpy when it's available and fall back on longs if
gmpy won't import--this works pretty well so far.
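
The fallback in (2) is just the usual try/except import dance; a
minimal sketch (gmpy.mpz is real, the bignum() wrapper name is mine):

    # Use gmpy's mpz type when the extension is importable, otherwise
    # fall back on Python's built-in arbitrary-precision integers
    # (longs, in the 2.x terms of this discussion).
    try:
        import gmpy
        def bignum(x):
            return gmpy.mpz(x)
    except ImportError:
        def bignum(x):
            return int(x)

    # Callers just use bignum() and get whichever arithmetic is available:
    p = bignum(2) ** 127 - 1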

3) I like the idea of a BER/DER library for Python, but I don't feel
like being the guy who writes it.  I'd probably use it if it were
available, though maybe not for this purpose.  (I'd like to handle
X.509 certificates in Python.)  BER really isn't the most efficient
way to store long ints, by the way, since it puts just 7 useful bits
in each byte.
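
To illustrate the 7-bits-per-byte point in (3), here is a sketch of
BER-style base-128 integer encoding (the same scheme Perl's pack "w"
uses, as far as I know); it covers only the compressed-integer part,
not real ASN.1:

    def ber_encode(n):
        """Base 128, high bit set on every byte except the last,
        so each byte carries only 7 payload bits."""
        if n < 0:
            raise ValueError("BER compressed integers are unsigned")
        out = [n & 0x7F]
        n >>= 7
        while n:
            out.append((n & 0x7F) | 0x80)
            n >>= 7
        return bytes(reversed(out))

    def ber_decode(data):
        n = 0
        for byte in data:
            n = (n << 7) | (byte & 0x7F)
        return n

    # A 64-bit value needs 10 BER bytes instead of 8 packed-binary bytes.
    assert len(ber_encode(2**64 - 1)) == 10
    assert ber_decode(ber_encode(2**64 - 1)) == 2**64 - 1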

4) My suggestion of BER in 465045 was motivated slightly differently:
I wanted to add a feature of Perl's pack/unpack functions that's
missing from Python's struct.pack/unpack.  I understand a little
better now what the struct module is for, so binascii may be a better
place for such a thing.  However, I believe Python really needs a
pack/unpack module that does everything Perl's does.  Data conversion
like that is an area where Perl is still beating Python pretty badly.
(Again, I don't feel like being the one who writes the module.)

5) Sorry I didn't notice Guido's post of 20:24 earlier (several
arrived at once).  I guess I'm willing to submit a patch for binascii
to read and write longs in binary.  It's slightly humorous to put it
in binascii, since it's a binary-to-binary conversion with no ASCII
involved, but the function fits well there in any case.  I'd still
rather use marshal, since I want to write out more kinds of data than
longs, and with only a long->binary conversion function I'd still
need to supply Python code to traverse dictionaries and lists and to
encode strings.  Btw, the struct module doesn't have any way to
encode a string with a length prefix, except the Pascal format, which
is limited to 256 bytes and is therefore useless for many things.
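
On that last point: struct's "p" count is a single byte, so the data
is capped at 255 bytes (256 counting the length byte).  A sketch of
the explicit length-prefix workaround I end up writing by hand (the
pack_string/unpack_string names are mine, not from any module):

    import struct

    def pack_string(s):
        """Prefix the encoded bytes with a 4-byte big-endian length,
        instead of struct's one-byte Pascal count."""
        data = s.encode("utf-8")
        return struct.pack(">I", len(data)) + data

    def unpack_string(buf, offset=0):
        """Return (string, new offset)."""
        (length,) = struct.unpack_from(">I", buf, offset)
        start = offset + 4
        return buf[start:start + length].decode("utf-8"), start + length

    # struct's "p" format silently truncates anything past 255 bytes...
    packed = struct.pack("300p", b"x" * 300)
    assert len(struct.unpack("300p", packed)[0]) == 255
    # ...while an explicit length prefix round-trips any size.
    s, _ = unpack_string(pack_string("y" * 300))
    assert len(s) == 300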