Rietveld Code Review Tool

Unified Diff: Lib/token.py

Issue 3353: make built-in tokenizer available via Python C API
Patch Set: Created 4 years, 11 months ago
--- a/Lib/token.py
+++ b/Lib/token.py
@@ -100,7 +100,7 @@ def _main():
     with fp:
         lines = fp.read().split("\n")
     prog = re.compile(
-        "#define[ \t][ \t]*([A-Z0-9][A-Z0-9_]*)[ \t][ \t]*([0-9][0-9]*)",
+        "#define[ \t][ \t]*PYTOK_([A-Z0-9][A-Z0-9_]*)[ \t][ \t]*([0-9][0-9]*)",
         re.IGNORECASE)
     tokens = {}
     for line in lines:
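
For reference, a minimal sketch of what the revised pattern does, using an assumed sample #define line (PYTOK_LPAR and its value are illustrative, not taken from the patch): the PYTOK_ prefix is now required by the regex but excluded from the captured group, so the token names written into Lib/token.py remain unprefixed (e.g. LPAR).

    import re

    # Revised pattern from the patch: requires the PYTOK_ prefix and captures
    # only the bare token name plus its numeric value.
    prog = re.compile(
        "#define[ \t][ \t]*PYTOK_([A-Z0-9][A-Z0-9_]*)[ \t][ \t]*([0-9][0-9]*)",
        re.IGNORECASE)

    # Assumed sample header line; the real names/values come from the patched header.
    match = prog.match("#define PYTOK_LPAR 7")
    if match:
        name, value = match.group(1), int(match.group(2))
        print(name, value)  # -> LPAR 7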