
Author nixdash
Recipients gregory.p.smith, nixdash, pitrou
Date 2012-12-12.22:57:27
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1355353047.96.0.733895762072.issue16660@psf.upfronthosting.co.za>
In-reply-to
Content
OK, it did not crash when I recompiled with --with-pydebug.
I observed the crash when I was working with urllib.request.
It was crashing when I imported urllib.request. I drilled down into the urllib.request code and found that it was crashing because of the hashlib import.
Now it is not crashing when I import hashlib and urllib.request.
But urllib.request.urlopen fails to open one site, although it does open a few others.
This might be a completely different issue.

I think we first need to find out why it is not crashing after the recompilation.
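
To narrow that down, a quick check of whether the OpenSSL-backed _hashlib extension built and imports cleanly in the non-debug build might help. This is only a diagnostic sketch I would run next, not output I already have:

import hashlib

# Check whether the OpenSSL-backed _hashlib extension is present,
# since the crash appeared on "import hashlib".
try:
    import _hashlib
    print('_hashlib loaded from:', _hashlib.__file__)
except ImportError as e:
    print('_hashlib missing, hashlib falls back to built-in modules:', e)

print(sorted(hashlib.algorithms_available))
print(hashlib.md5(b'test').hexdigest())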

Following is a comparison of urlopen on the same machine using two different Python versions.


ir[40] [~/Python-3.3.0/]$ python3
Python 3.3.0 (default, Dec 12 2012, 17:26:56) 
[GCC 4.4.6 20110731 (Red Hat 4.4.6-3)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import hashlib
[71516 refs]
>>> import urllib
[71550 refs]
>>> import urllib.request
[123410 refs]
>>> u = urllib.request.urlopen('http://en.wikipedia.org/wiki/Wikipedia')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usa/arao/Python-3.3.0/lib/python3.3/urllib/request.py", line 160, in urlopen
    return opener.open(url, data, timeout)
  File "/usa/arao/Python-3.3.0/lib/python3.3/urllib/request.py", line 479, in open
    response = meth(req, response)
  File "/usa/arao/Python-3.3.0/lib/python3.3/urllib/request.py", line 591, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usa/arao/Python-3.3.0/lib/python3.3/urllib/request.py", line 517, in error
    return self._call_chain(*args)
  File "/usa/arao/Python-3.3.0/lib/python3.3/urllib/request.py", line 451, in _call_chain
    result = func(*args)
  File "/usa/arao/Python-3.3.0/lib/python3.3/urllib/request.py", line 599, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 403: Forbidden
[126943 refs]
>>> u = urllib.request.urlopen('http://mit.edu/')
[127082 refs]
>>> len(u.read())
13857
[127068 refs]

It does open some other sites.
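
My guess (not verified) is that the 403 from Wikipedia is the server rejecting the default Python-urllib User-Agent rather than anything wrong in the build. Retrying the same URL with an explicit User-Agent header should show whether that is the case; this is just a sketch of the check:

import urllib.request

# Retry the same URL with a non-default User-Agent.
# If this succeeds, the 403 is a server-side policy, not a urllib bug.
req = urllib.request.Request(
    'http://en.wikipedia.org/wiki/Wikipedia',
    headers={'User-Agent': 'Mozilla/5.0 (compatible; test-script)'},
)
u = urllib.request.urlopen(req)
print(len(u.read()))
u.close()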


r[37] [~/Python-3.3.0/]$ python
Python 2.6.6 (r266:84292, Dec  7 2011, 20:48:22) 
[GCC 4.4.6 20110731 (Red Hat 4.4.6-3)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import urllib
>>> u = urllib.urlopen('http://en.wikipedia.org/wiki/Wikipedia')
>>> len(u.read())
3000
>>> import hashlib
>>> 


Thanks,
Ashwin