Message54223
Yes, I run into this bug all the time. For example, I may
want to generate all strings of length n:

BINARY_CHARSET = ''.join([chr(i) for i in range(256)])

def all_strs(n, charset=BINARY_CHARSET):
    m = len(charset)
    for i in xrange(m**n):
        yield ''.join([charset[i // m**j % m] for j in range(n)])

This is correct algorithmically, but fails once m**n exceeds
the system integer size, due to the buggy Python xrange()
function. So I end up writing my own irange() function.
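Such an irange() replacement is short; here is a minimal sketch (the author's actual implementation isn't shown, so this is only an illustration) that accepts arbitrarily large bounds because the counter is an ordinary Python int:

```python
def irange(stop):
    # Generator replacement for xrange(): the counter is a
    # plain Python int, which grows without bound, so large
    # stop values work where xrange() raises OverflowError.
    i = 0
    while i < stop:
        yield i
        i += 1
```

With this, all_strs() above works unchanged for any m**n.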
Other cases where I've run into this problem: Sieving for
primes starting at a given large prime, checking for integer
solutions to an equation starting with a large integer, and
searching very large integer sets.
itertools.count and itertools.repeat are similarly annoying.
Often I want to search all positive integers starting
from 1, and have to use an awkward while loop to do this, or
write my own replacement for itertools.count.
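The awkward while-loop workaround looks something like this (search_positive_ints and the example predicate are hypothetical names, used only to illustrate the pattern that itertools.count would otherwise express):

```python
def search_positive_ints(predicate):
    # Manual replacement for iterating itertools.count(1):
    # walk 1, 2, 3, ... until the predicate holds.
    n = 1
    while True:
        if predicate(n):
            return n
        n += 1
```

The same loop, seeded with a large starting value instead of 1, covers the prime-sieving and integer-equation cases mentioned above.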
There should be little performance hit for apps which use
xrange(n) where n fits in the system's integer size:
xrange() can just return an iterator that yields system
ints, so the only penalty is an extra if statement in the
call to xrange() itself, and there is no performance penalty
in the iterator's next() method.
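A sketch of that dispatch, assuming a hypothetical flexible_range() (sys.maxsize stands in for the system integer size so the sketch runs on modern Python):

```python
import sys

def _long_range(stop):
    # Slow path: generic generator used only when stop is
    # beyond the system integer size; Python's
    # arbitrary-precision ints handle the counting.
    i = 0
    while i < stop:
        yield i
        i += 1

def flexible_range(stop):
    # One extra 'if' at call time picks the fast
    # system-int iterator when possible, so next() itself
    # pays no per-iteration penalty.
    if stop <= sys.maxsize:
        return iter(range(stop))  # fast path
    return _long_range(stop)
```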
Finally, this move appears consistent with Python's values,
i.e. simplicity, long ints supported with no extra effort,
no gotchas for newbies, no special cases, etc.
Date | User | Action | Args
2007-08-23 16:08:18 | admin | link | issue1003935 messages
2007-08-23 16:08:18 | admin | create |