Message186034
Just came across this when running Hadoop jobs:
Hadoop takes your .py script folder, puts each file in its own cache folder, and then recreates the script folder, populating it with individual symlinks to those files.
When run like this, the scripts can no longer import each other, because sys.path[0] is set to the "real" location of the file (after resolving the symlink), rather than the location it was invoked from.
I just reproduced this with Python 2.7.3 on a fresh Ubuntu system:
repro:
mkdir pydir
mkdir pydir/lnk
echo 'import sys; print ">", sys.path[0]' >> pydir/lnk/test.py
ln -s lnk/test.py pydir/test.py
python pydir/test.py
> /home/kristjan/pydir/lnk
I would have expected "/home/kristjan/pydir", since that is the apparent directory of the file. When "pydir" contains many .py files, each residing in its own unique real target directory, they cannot import each other.
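A possible workaround sketch (not part of the original report; the sys.path manipulation here is my own suggestion, not an endorsed fix): each script can prepend the directory of the path it was invoked with, taken from sys.argv[0] without resolving symlinks, so that symlinked siblings in the same apparent directory remain importable.

```python
import os
import sys

# sys.argv[0] holds the path the script was invoked with; os.path.abspath
# makes it absolute WITHOUT resolving symlinks (unlike os.path.realpath),
# so this preserves the "apparent" directory, e.g. /home/kristjan/pydir.
invoked_dir = os.path.dirname(os.path.abspath(sys.argv[0]))

# Prepend it so sibling scripts in the apparent directory can be imported,
# even when sys.path[0] was set to the symlink's real target directory.
if invoked_dir not in sys.path:
    sys.path.insert(0, invoked_dir)
```

This only papers over the behaviour in the affected scripts; the underlying issue is that sys.path[0] is derived from the resolved script path rather than the invoked one.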
Date | User | Action | Args
2013-04-04 14:34:15 | kristjan.jonsson | set | recipients: + kristjan.jonsson, jackjansen, kowaltowski
2013-04-04 14:34:15 | kristjan.jonsson | set | messageid: <1365086055.58.0.588039246332.issue1387483@psf.upfronthosting.co.za>
2013-04-04 14:34:15 | kristjan.jonsson | link | issue1387483 messages
2013-04-04 14:34:15 | kristjan.jonsson | create |