Akismet thinks this bug is spam, so I cannot submit it to Trac.
A URLField will report that all links to en.wikipedia.org are invalid, because urllib2, along with wget and libwww-perl, is blocked by Wikipedia by default:

http://mail.wikipedia.org/pipermail/wikitech-l/2003-December/019849.html

The block exists because of poorly designed bots, not because accessing the site automatically violates Wikipedia policy, so a good fix is to set the User-Agent header to indicate that Django is making the request.

A patch is attached; it's against 0.95, but this also affects trunk. It would be nice if the Django version could be included in the User-Agent, but I didn't see where it was accessible from the code.

-- Shields.
--- django/core/validators.py.orig	2006-08-21 06:13:11.000000000 +0000
+++ django/core/validators.py	2006-08-27 00:43:37.000000000 +0000
@@ -203,8 +203,10 @@
 def isExistingURL(field_data, all_data):
     import urllib2
+    req = urllib2.Request(url=field_data)
+    req.add_header('User-Agent', 'Django/0.0')
     try:
-        u = urllib2.urlopen(field_data)
+        u = urllib2.urlopen(req)
     except ValueError:
         raise ValidationError, gettext("Invalid URL: %s") % field_data
     except urllib2.HTTPError, e:
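
On the version question: here's a rough sketch of how the real version string might be put into the User-Agent, assuming django.get_version() is available (I haven't verified that 0.95 exposes it, so treat the import and call as an assumption):

    import urllib2

    import django  # assumed to provide get_version(); not checked against 0.95

    def open_with_django_user_agent(url):
        # Same request the patch builds, but with the actual Django version
        # in the User-Agent instead of the 'Django/0.0' placeholder.
        req = urllib2.Request(url=url)
        req.add_header('User-Agent', 'Django/%s' % django.get_version())
        return urllib2.urlopen(req)

    # e.g. open_with_django_user_agent('http://en.wikipedia.org/wiki/Django')

If get_version() isn't available in 0.95, hard-coding the release string would at least be better than 'Django/0.0'.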