Message354816
I think anyone using the tokenize module to programmatically edit Python source wants to use, and probably does use, the undocumented behavior, which should then be documented.
I ran into this issue because for me this manifested as a crash:
$ python3
>>> import tokenize
>>> tokenize.untokenize([(tokenize.STRING, "''", (1, 0), (1, 0), None)])
"''"
>>> tokenize.untokenize([(tokenize.STRING, "''", None, None, None)])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/<snip>/virtualenv/lib/python3.6/tokenize.py", line 338, in untokenize
    out = ut.untokenize(iterable)
  File "/<snip>/virtualenv/lib/python3.6/tokenize.py", line 272, in untokenize
    self.add_whitespace(start)
  File "/<snip>/virtualenv/lib/python3.6/tokenize.py", line 231, in add_whitespace
    row, col = start
TypeError: 'NoneType' object is not iterable
The second call gives untokenize() input that is documented to be valid, yet it causes a crash.
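As a workaround sketch (my own, not from the report): untokenize() also accepts 2-element tuples of (type, string), and that code path reconstructs the source without consulting start/end positions at all, so it sidesteps the crash shown above:

```python
import tokenize

# Passing 2-tuples (token type, token string) makes untokenize()
# take its position-free path, so no start/end coordinates are
# needed and the 'NoneType' unpacking error cannot occur.
tokens = [(tokenize.STRING, "''")]
result = tokenize.untokenize(tokens)
print(repr(result))  # → "''"
```

The trade-off is that without positions, untokenize() has to guess at spacing between tokens, so the output may not round-trip byte-for-byte with the original source.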
Date | User | Action | Args
2019-10-16 20:26:13 | Zachary McCord | set | recipients: + Zachary McCord, csernazs, docs@python, utkarsh2102, donovick
2019-10-16 20:26:13 | Zachary McCord | set | messageid: <1571257573.8.0.502517277902.issue35297@roundup.psfhosted.org>
2019-10-16 20:26:13 | Zachary McCord | link | issue35297 messages
2019-10-16 20:26:13 | Zachary McCord | create |