Message39150
Logged In: YES
user_id=89016
I'm currently writing a make-like tool in Python. It should be
able to handle not only local files but also remote files
(http, ftp, etc.). One project might have several thousand
targets, some of them remote. I want to handle both types
uniformly, i.e. via urllib/urllib2. That means I call
urllib2.urlopen() to get the header information about the last
modification date, but I don't want to open the file right
away. Only when the data is actually required (because the
source resource is newer than the target) should the file be
read.
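The timestamp comparison described above could be sketched roughly like this (my own sketch for modern Python: the `http_mtime` helper and its plain-dict `headers` argument are hypothetical names; a real urllib2/urllib.request response exposes its headers via `.info()` rather than a dict):

```python
import calendar
from email.utils import parsedate

def http_mtime(headers):
    """Parse a Last-Modified header value into a Unix timestamp.

    Returns None when the header is absent or unparseable, so the
    caller can fall back to treating the resource as always-newer.
    """
    value = headers.get("Last-Modified")
    if value is None:
        return None
    parsed = parsedate(value)  # RFC 2822 date -> 9-tuple, or None
    if parsed is None:
        return None
    return calendar.timegm(parsed)  # interpret as UTC

def source_is_newer(source_headers, target_mtime):
    """make-style check: rebuild if the source's timestamp is unknown
    or strictly newer than the target's mtime."""
    src = http_mtime(source_headers)
    return src is None or src > target_mtime
```

For local targets, `target_mtime` would simply come from `os.stat(path).st_mtime`.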
This might also open the door to making the streams returned
from urlopen() writable: simply use open(..., "wb") instead of
open(..., "rb") when the first write is called.
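For the local-file case, the deferred open with the mode decided by the first operation could look like this (a minimal sketch; the class name `LazyLocalFile` and its interface are my own, and only the local backend is shown, not the urlopen() one):

```python
class LazyLocalFile:
    """Defer opening a local file until data is actually needed.

    The first read() opens the file "rb"; the first write()
    opens it "wb" instead. Nothing touches the filesystem
    until one of the two is called.
    """

    def __init__(self, path):
        self.path = path
        self._fp = None  # stays None until first read/write

    def read(self, size=-1):
        if self._fp is None:
            self._fp = open(self.path, "rb")
        return self._fp.read(size)

    def write(self, data):
        if self._fp is None:
            self._fp = open(self.path, "wb")
        return self._fp.write(data)

    def close(self):
        if self._fp is not None:
            self._fp.close()
            self._fp = None
```

A remote counterpart would do the same but call urlopen() (read-only) in place of open().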
Another possibility might be using urllib.urlretrieve(), but
its API is horrible (one global cleanup function) and it is
not supported by urllib2.
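A urlretrieve()-style download can be rebuilt on top of urlopen() without the global cleanup state, by injecting the opener (this is my own sketch, not tracker-endorsed API; the `opener` parameter would typically be urllib2.urlopen or urllib.request.urlopen):

```python
import shutil

def retrieve(url, dest, opener):
    """Copy the resource at `url` to local path `dest`.

    `opener` is any callable mapping a URL to a readable
    file-like object, so the same code works with urllib2-style
    openers or a stub in tests. No global state to clean up.
    """
    src = opener(url)
    try:
        with open(dest, "wb") as out:
            shutil.copyfileobj(src, out)  # stream in chunks
    finally:
        src.close()
```

Because the opener is a plain callable, per-project settings (proxies, auth handlers) can live in an urllib2 OpenerDirector passed in by the caller instead of in module-level globals.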
History:
Date                | User  | Action | Args
2007-08-23 15:11:26 | admin | link   | issue525945 messages
2007-08-23 15:11:26 | admin | create |