
Author db3l
Date 2002-01-10.04:01:11
Content
If you have a module that you wish to compile using the built-in compile() function (in 'exec' mode), it will fail with a SyntaxError if the module's source does not end in a newline.
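
To illustrate, a minimal reproduction sketch (the SyntaxError only appears on interpreter releases affected by this bug; the same text in a .py file imports and compiles without complaint):

    source = "x = 1"          # note: no trailing newline
    try:
        compile(source, "<string>", "exec")
        print("compiled successfully")
    except SyntaxError:
        print("SyntaxError: the missing trailing newline trips the tokenizer")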

The same module can be executed directly by the 
interpreter, or imported by another module, and Python 
can properly compile and save a pyc for the module.

I believe the difference is rooted in the fact that 
the tokenizer (tokenizer.c, in tok_nextc()) 
will "fake" a newline at the end of a file if it 
doesn't find one, but it will not do so when 
tokenizing a string buffer.

What I'm not certain of is whether faking such a token for strings as well might break something else (such as when parsing a string for an expression rather than a full module).  But without such a change, a module that works (and compiles) in other circumstances cannot be read into memory and compiled with the compile() built-in.
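
One workaround along those lines (not part of the original report; the helper name is purely illustrative) is to supply the missing newline yourself before handing the buffer to compile():

    def compile_module_source(source, filename):
        # Hypothetical helper: mimic what the tokenizer does for file
        # input by guaranteeing a trailing newline before compiling.
        if not source.endswith("\n"):
            source = source + "\n"
        return compile(source, filename, "exec")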

This came up while tracking down a problem with failures using Gordon McMillan's Installer package, which compiles modules using compile() before including them in the archive.

I believe this is true for all releases since at least 
1.5.2.

-- David