
Author joseph_myers
Recipients joseph_myers
Date 2019-02-06.20:33:03
RobotFileParser.crawl_delay and RobotFileParser.request_rate raise AttributeError when the parsed robots.txt contains no entry matching the given user-agent and no default entry, instead of returning None as the documentation specifies.  E.g.:

>>> from urllib.robotparser import RobotFileParser
>>> parser = RobotFileParser()
>>> parser.parse([])
>>> parser.crawl_delay('example')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python3.6/urllib/", line 182, in crawl_delay
    return self.default_entry.delay
AttributeError: 'NoneType' object has no attribute 'delay'
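Until the library itself returns None in this case, a caller can guard against the AttributeError. The sketch below wraps crawl_delay in a hypothetical helper (safe_crawl_delay is not part of urllib) that converts the erroneous AttributeError into the documented None result; on a fixed Python the except branch is simply never taken:

```python
from urllib.robotparser import RobotFileParser

def safe_crawl_delay(parser, useragent):
    # Hypothetical workaround: return None instead of raising when the
    # robots.txt has no matching entry and no default entry.
    try:
        return parser.crawl_delay(useragent)
    except AttributeError:
        # Affected versions hit this branch via self.default_entry.delay
        # when default_entry is None.
        return None

parser = RobotFileParser()
parser.parse([])  # empty robots.txt: no entries at all
print(safe_crawl_delay(parser, 'example'))  # None
```

The same pattern applies to request_rate, whose affected code path dereferences self.default_entry.req_rate in the same way.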