New submission from Joseph Myers <[email protected]>:
RobotFileParser.crawl_delay and RobotFileParser.request_rate raise
AttributeError for a robots.txt with no entry matching the given
user-agent and no default ('*') entry, rather than returning None as the
documentation specifies. E.g.:
>>> from urllib.robotparser import RobotFileParser
>>> parser = RobotFileParser()
>>> parser.parse([])
>>> parser.crawl_delay('example')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python3.6/urllib/robotparser.py", line 182, in crawl_delay
return self.default_entry.delay
AttributeError: 'NoneType' object has no attribute 'delay'
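Until this is fixed upstream, a possible workaround is a subclass that
guards the default_entry lookup. This is only a sketch (the subclass name
SafeRobotFileParser is made up for illustration); it relies on the
parser's entries/default_entry attributes behaving as in the stdlib
source quoted in the traceback:

```python
from urllib.robotparser import RobotFileParser

class SafeRobotFileParser(RobotFileParser):
    """Hypothetical workaround: return None when no entry matches
    and robots.txt defined no default ('*') entry."""

    def crawl_delay(self, useragent):
        for entry in self.entries:
            if entry.applies_to(useragent):
                return entry.delay
        # default_entry is None when robots.txt had no '*' record;
        # the stdlib version dereferences it unconditionally.
        if self.default_entry:
            return self.default_entry.delay
        return None

    def request_rate(self, useragent):
        for entry in self.entries:
            if entry.applies_to(useragent):
                return entry.req_rate
        if self.default_entry:
            return self.default_entry.req_rate
        return None

parser = SafeRobotFileParser()
parser.parse([])
print(parser.crawl_delay('example'))    # None rather than AttributeError
print(parser.request_rate('example'))   # None rather than AttributeError
```

When a default entry does exist, the subclass falls through to it exactly
as the stdlib code does, so only the no-match case changes.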
----------
components: Library (Lib)
messages: 334982
nosy: joseph_myers
priority: normal
severity: normal
status: open
title: robotparser crawl_delay and request_rate do not work with no matching entry
type: behavior
versions: Python 3.6, Python 3.7, Python 3.8
_______________________________________
Python tracker <[email protected]>
<https://bugs.python.org/issue35922>
_______________________________________