mirror of https://github.com/python/cpython
Commit f0953b9dff: The cause seems to be that when a file URL doesn't exist, urllib.urlopen() raises OSError instead of IOError. Simply add OSError to the except clause. Not elegant, but effective. :-)
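As a rough illustration of that pattern (not the actual webchecker code), the sketch below catches OSError around the urlopen() call so that a missing file URL is reported as a bad link rather than crashing the run. It assumes Python 3, where IOError is simply an alias for OSError, so one except clause covers both; the check_link() helper and the file URL are made up for the example.

    import urllib.request

    def check_link(url):
        # Return True if the URL can be opened, False if it looks like a bad link.
        try:
            f = urllib.request.urlopen(url)
            f.read(1)          # touch the resource so any error surfaces here
            f.close()
            return True
        except OSError as err:  # in Python 3, IOError is an alias for OSError
            print("bad link %s: %s" % (url, err))
            return False

    if __name__ == "__main__":
        # A file URL that does not exist takes the except branch.
        check_link("file:///no/such/file")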
README
tktools.py
wcgui.py
wcmac.py
webchecker.py
websucker.py
wsgui.py
README
Webchecker
----------

This is a simple web tree checker, useful to find bad links in a web
tree.  It currently checks links pointing within the same subweb for
validity.  The main program is "webchecker.py".  See its doc string
(or invoke it with the option "-?") for more details.

History:

- Jan 1997.  First release.  The module robotparser.py was written by
  Skip Montanaro; the rest is original work by Guido van Rossum.

- May 1999.  Sam Bayer contributed a new version, wcnew.py, which
  supports checking internal links (#spam fragments in URLs) and some
  other options.

- Nov 1999.  Sam Bayer contributed patches to reintegrate wcnew.py
  into webchecker.py, and corresponding mods to wcgui.py and
  websucker.py.
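For orientation, a run from this directory might look like the lines
below.  Only the "-?" help option is confirmed by this README; the
root-URL argument form and the URL shown are assumptions for the
example.

    python webchecker.py -?                       # print the usage message and option summary
    python webchecker.py http://www.example.com/  # check the tree rooted at this URL (assumed form)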