
Author Joshua.Landau
Recipients Joshua.Landau, benjamin.peterson, flox
Date 2015-04-12.06:08:01
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1428818881.87.0.670478075687.issue9712@psf.upfronthosting.co.za>
In-reply-to
Content
This doesn't seem to be a complete fix; the regex used does not cover the Other_ID_Start or Other_ID_Continue characters listed in

https://docs.python.org/3.5/reference/lexical_analysis.html#identifiers

Hence tokenize does not accept '℘·', even though it is a valid identifier.

Credit to modchan from http://stackoverflow.com/a/29586366/1763356.
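A short reproduction sketch may make the gap concrete (illustrative only, assuming an affected Python 3 build; the exact tokens emitted can vary by version):

    import io
    import tokenize

    # '℘' (U+2118) has Other_ID_Start and '·' (U+00B7) has Other_ID_Continue,
    # so the language reference accepts '℘·' as an identifier.
    source = "℘· = 1\n"
    print("℘·".isidentifier())               # True
    exec(compile(source, "<test>", "exec"))  # the compiler accepts it

    # tokenize's \w-based Name pattern does not cover those characters, so on
    # affected versions the identifier is not emitted as a single NAME token
    # (ERRORTOKEN entries show up for the unmatched characters instead).
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        print(tokenize.tok_name[tok.type], repr(tok.string))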
History
Date                 User           Action         Args
2015-04-12 06:08:01  Joshua.Landau  setrecipients  + Joshua.Landau, benjamin.peterson, flox
2015-04-12 06:08:01  Joshua.Landau  setmessageid   <1428818881.87.0.670478075687.issue9712@psf.upfronthosting.co.za>
2015-04-12 06:08:01  Joshua.Landau  link           issue9712 messages
2015-04-12 06:08:01  Joshua.Landau  create