Message340467
> The tokenize() generator requires one argument, readline, which must be a callable object which provides the same interface as the io.IOBase.readline() method of file objects. Each call to the function should return one line of input as bytes.
Adding an example like this should make it easier to understand:
# example.py
class Foo:
    pass

# tokenize_example.py
import tokenize

with open('example.py', 'rb') as f:
    token_gen = tokenize.tokenize(f.readline)
    for token in token_gen:
        # The output includes lines like these:
        # TokenInfo(type=1 (NAME), string='class', start=(1, 0), end=(1, 5), line='class Foo:\n')
        # TokenInfo(type=1 (NAME), string='Foo', start=(1, 6), end=(1, 9), line='class Foo:\n')
        # TokenInfo(type=53 (OP), string=':', start=(1, 9), end=(1, 10), line='class Foo:\n')
        print(token)
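As a side note, the quoted docs only require a readline-compatible callable, not a real file, so a variant of this sketch could also tokenize an in-memory source string via io.BytesIO (this variant is an illustration, not part of the proposed docs wording):

```python
import io
import tokenize

# tokenize() accepts any callable with the io.IOBase.readline()
# interface, so an in-memory io.BytesIO works with no file on disk.
source = b"class Foo:\n    pass\n"
tokens = list(tokenize.tokenize(io.BytesIO(source).readline))

# The first token reports the detected source encoding.
print(tokens[0].string)  # 'utf-8'

# Keywords and identifiers both arrive as NAME tokens.
names = [t.string for t in tokens if t.type == tokenize.NAME]
print(names)  # ['class', 'Foo', 'pass']
```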
Date | User | Action | Args
2019-04-18 03:43:56 | Windson Yang | set | recipients: + Windson Yang, docs@python
2019-04-18 03:43:56 | Windson Yang | set | messageid: <1555559036.7.0.0522143852215.issue36654@roundup.psfhosted.org>
2019-04-18 03:43:56 | Windson Yang | link | issue36654 messages
2019-04-18 03:43:56 | Windson Yang | create |