Author serhiy.storchaka
Recipients Devin Jeanpierre, eric.araujo, eric.snow, petri.lehtinen, serhiy.storchaka, terry.reedy, vstinner
Date 2012-10-15.20:48:43
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1350334124.33.0.546970507682.issue12486@psf.upfronthosting.co.za>
In-reply-to
Content
The patch to allow tokenize() to accept a string is very simple, only 4 lines. But it requires a lot of documentation changes.
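
For context, here is a rough sketch of what accepting a string directly might amount to. This is not the actual patch; the wrapper name tokenize_any and the type dispatch are only illustrative, built on the existing entry points.

    import io
    import tokenize

    def tokenize_any(source):
        """Hypothetical wrapper: accept str, bytes, or a readline callable."""
        if isinstance(source, str):
            # str source: reuse the str-based generator.
            return tokenize.generate_tokens(io.StringIO(source).readline)
        if isinstance(source, bytes):
            # bytes source: reuse the documented bytes-based entry point.
            return tokenize.tokenize(io.BytesIO(source).readline)
        # Otherwise assume it is already a readline callable returning bytes.
        return tokenize.tokenize(source)

    for tok in tokenize_any("x = 1\n"):
        print(tok)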

Then we can get rid of the undocumented generate_tokens(). Note that the stdlib and Tools use only generate_tokens(); none uses tokenize().
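
A small example of why that is: the documented tokenize() wants a readline callable that returns bytes, while the undocumented generate_tokens() wants one that returns str, so tokenizing in-memory source currently needs io wrapping either way.

    import io
    import tokenize

    source = "x = 1 + 2\n"

    # Documented entry point: readline must return bytes.
    for tok in tokenize.tokenize(io.BytesIO(source.encode("utf-8")).readline):
        print(tok)

    # Undocumented entry point: readline must return str;
    # this is what the stdlib and Tools actually use.
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        print(tok)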

Here is a preliminary patch. I would be thankful for help with the documentation and for discussion.

Of course, it would be better if tokenize() worked with the iterator protocol.
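
One way the iterator-protocol idea could look, sketched here as a wrapper around the existing generate_tokens() rather than as a change to tokenize() itself (tokenize_from_lines is a made-up name):

    import tokenize

    def tokenize_from_lines(lines):
        """Wrap any iterable of str source lines for generate_tokens()."""
        it = iter(lines)
        def readline():
            try:
                return next(it)
            except StopIteration:
                return ""          # an empty string signals EOF to the tokenizer
        return tokenize.generate_tokens(readline)

    # Usage: feed lines from any iterator, e.g. a list or an open text file.
    for tok in tokenize_from_lines(["x = 1\n", "y = 2\n"]):
        print(tok.type, tok.string)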
History
Date User Action Args
2012-10-15 20:48:44  serhiy.storchaka  set     recipients: + serhiy.storchaka, terry.reedy, vstinner, Devin Jeanpierre, eric.araujo, eric.snow, petri.lehtinen
2012-10-15 20:48:44  serhiy.storchaka  set     messageid: <1350334124.33.0.546970507682.issue12486@psf.upfronthosting.co.za>
2012-10-15 20:48:44  serhiy.storchaka  link    issue12486 messages
2012-10-15 20:48:44  serhiy.storchaka  create