Hi, I am puzzled. I am using the latest stable Red Hat Mozilla (1.0.1-26), and have noticed that dynamic (PHP) pages encoded in UTF-8 (with the correct meta tag) are recognized as such by Konqueror, but not by Mozilla.
However, if I save the page and then view the saved copy with Mozilla, it is displayed properly. This leads me to believe that either PHP or Apache (version 2) sends out an HTTP header specifying the character set as ISO-8859-1, that such a header is only sent for dynamic documents (hence my suspicion that PHP is the guilty party), and that Konqueror and Mozilla simply differ in which source of charset information they give priority to. Anyway, I tried to find the mysterious header, as you can see below, but I see nothing unusual:

    [afolger@localhost html]$ php index.php
    X-Powered-By: PHP/4.2.2
    Set-Cookie: lang=english; expires=Wed, 26-Nov-03 15:08:52 GMT
    Content-type: text/html

    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
    <html>
    <head>
    <title>Pokeach Ivrim </title>
    <META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=UTF-8">
    ===== OUTPUT TRUNCATED =====

I also tested with static ISO-8859-8-i documents, and not surprisingly everything works. I have not yet tested Mozilla with dynamic ISO-8859-8-i documents, so I am not sure about that case. I also used a simple CGI script in Perl that echoes back the contents of a form, and there UTF-8 was set properly, which more or less rules out the Apache theory. (The script is attached.)

So is it Apache that sends out nasty stuff? PHP? Is Mozilla allergic to PHP? Any help?

Arie
-- 
It is absurd to seek to give an account of the matter to a man who cannot himself give an account of anything; for insofar as he is already like this, such a man is no better than a vegetable.
	-- Book IV of Aristotle's Metaphysics
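One thing I realize while writing this: running "php index.php" at the shell only shows the headers PHP itself emits, not anything Apache may add when serving the page. If I understand the Apache 2 docs correctly, the stock httpd.conf carries an AddDefaultCharset directive (the upstream default value is ISO-8859-1) that appends a charset to text/html responses whose Content-Type header has none, and a charset in the HTTP header takes precedence over the meta tag. That would also explain why the Perl CGI works, since it puts the charset into the header itself. This is only a guess about my configuration; the relevant httpd.conf lines would look something like:

```apache
# Upstream Apache 2 default; appends ";charset=ISO-8859-1" to
# text/html responses that carry no charset in the Content-Type
# header, and that header overrides the <meta> tag in the body:
#AddDefaultCharset ISO-8859-1
# Either switch it off...
AddDefaultCharset Off
# ...or make it match the pages:
#AddDefaultCharset UTF-8
```

An alternative fix on the PHP side, assuming this really is the cause, would be to send the charset explicitly before any output, e.g. header('Content-Type: text/html; charset=UTF-8'), which no server-wide default should override. I still need to check what actually goes over the wire (e.g. with "curl -I" against the running server rather than the CLI).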
foo.cgi
Description: Perl program
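A minimal sketch of such a form-echoing CGI (hypothetical; not necessarily the attached foo.cgi), setting the charset in the Content-Type header itself, where no server default should replace it:

```perl
#!/usr/bin/perl -w
# Hypothetical sketch of a form-echoing CGI; not the attached script.
# Putting charset=UTF-8 into the Content-Type header itself means it
# cannot be overridden by a server-wide default charset.
use strict;
use CGI;

my $q = CGI->new;
print $q->header(-type => 'text/html', -charset => 'UTF-8');
print "<html><head><title>echo</title></head><body><pre>\n";
for my $name ($q->param) {
    printf "%s = %s\n", $name, scalar $q->param($name);
}
print "</pre></body></html>\n";
```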