aahz> Here's the stark simple recipe: when you use Unicode, you *MUST*
aahz> switch to a Unicode-centric view of the universe.  Therefore you
aahz> encode *FROM* Unicode and you decode *TO* Unicode.  Period.  It's
aahz> similar to the way floating point contaminates ints.
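
Concretely, that rule looks something like this (a minimal 2.x sketch;
the variable names are just for illustration):

    text = u'caf\xe9'              # Unicode text
    data = text.encode('utf-8')    # encode *FROM* Unicode to a byte string
    back = data.decode('utf-8')    # decode *TO* Unicode from the bytes
    assert back == text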

That's what I do in my code.  Why, then, do Unicode objects have a decode
method?
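
It only seems to cause trouble.  As I understand it, u.decode() in 2.x
first encodes the object through the default ASCII codec before decoding,
so it blows up on any non-ASCII data:

    >>> u'caf\xe9'.decode('utf-8')
    Traceback (most recent call last):
      ...
    UnicodeEncodeError: 'ascii' codec can't encode character u'\xe9' in
    position 3: ordinal not in range(128)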

Skip
