
Author berker.peksag
Recipients Aivar.Annamaa, berker.peksag
Date 2017-09-08.09:40:34
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1504863634.52.0.764611409416.issue31394@psf.upfronthosting.co.za>
In-reply-to
Content
Thank you for the report, but this behavior is already documented at https://docs.python.org/3/library/tokenize.html

    To simplify token stream handling, all Operators and Delimiters
    tokens are returned using the generic token.OP token type. The
    exact type can be determined by checking the exact_type property
    on the named tuple returned from tokenize.tokenize().

If you replace the following line

    print(ellipsis)

with

    print(ellipsis.exact_type)

you will see that it prints 52, the numeric value of token.ELLIPSIS.
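
For reference, here is a minimal self-contained sketch of that check (assuming Python 3; the numeric token values vary between versions, so the names from token.tok_name are more reliable than the raw numbers):

    import io
    import token
    import tokenize

    source = b"x = ...\n"
    for tok in tokenize.tokenize(io.BytesIO(source).readline):
        if tok.string == "...":
            # tok.type is the generic OP token; exact_type distinguishes it.
            print(tok.type, token.tok_name[tok.type])              # OP
            print(tok.exact_type, token.tok_name[tok.exact_type])  # ELLIPSIS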
History
Date                  User            Action   Args
2017-09-08 09:40:34   berker.peksag   set      recipients: + berker.peksag, Aivar.Annamaa
2017-09-08 09:40:34   berker.peksag   set      messageid: <1504863634.52.0.764611409416.issue31394@psf.upfronthosting.co.za>
2017-09-08 09:40:34   berker.peksag   link     issue31394 messages
2017-09-08 09:40:34   berker.peksag   create