On Mon, 29 Jun 2009 09:18:20 +0200, Andras.Horvath wrote:

>> For a urllib-style interface, there's not much point in performing
>> verification after the fact. Either the library performs verification or
>> it doesn't. If it doesn't, you've just sent the (potentially confidential)
>> request to an unknown server; discovering this after the fact doesn't
>> really help.
> 
> I was more thinking about supplying a/some CA certificate(s) and
> requiring that the site cert be valid (otherwise the connection should
> fail). This sounds very EAFP to me.

This is easier to do with urllib2 than urllib. For urllib, you would need
to either "fix" URLopener.open_https() or clone half of urllib (URLopener
and FancyURLopener). For urllib2, you can use urllib2.install_opener() to
replace the built-in HTTPSHandler with a subclass that performs
validation. Validation should just be a matter of passing
cert_reqs=ssl.CERT_REQUIRED and ca_certs= to ssl.wrap_socket(), then
checking that SSLSocket.getpeercert() returns a non-empty dictionary.

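Something along these lines should work (untested; the class names and the
CA bundle path are just placeholders):

import httplib
import socket
import ssl
import urllib2

class ValidatingHTTPSConnection(httplib.HTTPSConnection):
    """HTTPSConnection that requires the server cert to chain to ca_certs."""
    ca_certs = '/path/to/ca_bundle.pem'  # placeholder: your CA certificate(s)

    def connect(self):
        sock = socket.create_connection((self.host, self.port), self.timeout)
        # Wrap the socket with verification enabled; the handshake fails if
        # the server's certificate doesn't chain to ca_certs.  Note that
        # wrap_socket() does not itself match the hostname; getpeercert()
        # on self.sock returns the validated cert dict if you want to.
        self.sock = ssl.wrap_socket(sock,
                                    keyfile=self.key_file,
                                    certfile=self.cert_file,
                                    cert_reqs=ssl.CERT_REQUIRED,
                                    ca_certs=self.ca_certs)

class ValidatingHTTPSHandler(urllib2.HTTPSHandler):
    def https_open(self, req):
        return self.do_open(ValidatingHTTPSConnection, req)

urllib2.install_opener(urllib2.build_opener(ValidatingHTTPSHandler()))
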
Note: the above is purely theoretical, based upon the (2.6) documentation
and source code. I suggest that you verify it by connecting to a site
with a bogus (e.g. self-signed) certificate and checking that it fails.
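A quick check could look like this (the URL is only a stand-in for whatever
self-signed test host you use):

try:
    urllib2.urlopen('https://self-signed.example.invalid/')
except urllib2.URLError, e:
    # With validation in place this branch should be reached, with e.reason
    # being an ssl.SSLError ("certificate verify failed").
    print 'rejected as expected:', e.reason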

-- 
http://mail.python.org/mailman/listinfo/python-list
