On Sep 19, 2005, at 5:53 PM, Trejkaz wrote:

> As you can probably tell, I've been playing around with macros a lot.
>
> Something I just discovered... if my macro outputs characters in UTF-8
> encoding, they don't get picked up properly. I did some digging and it
> seems that even if the HTML has UTF-8 specified both in a <meta> element
> and in the <?xml?> declaration, if the HTTP specifies the encoding to be
> something else, it's all for nought.
>
> Presently, it seems that Typo sends almost all pages as encoding
> ISO-8859-1. Would it make sense to change this globally to UTF-8? Is it
> possible perhaps to do this using an Apache directive, so that Typo
> doesn't have to add the feature?
I haven't noticed this, but it's possible that Safari is defaulting to
UTF-8 for me. We *absolutely* should be using UTF-8 everywhere;
ISO-8859-1 is wrong.

Changing Apache's default would only help with Apache; WEBrick and
lighttpd would still be broken. So we'll need to add a UTF-8
content-type everywhere. That shouldn't be all that hard.

> Interestingly, when you create a page which has multi-byte characters
> in it, those characters encode using XML entities. This is probably a
> problem in itself, as XML-encoding the characters results in around 8
> bytes per character, whereas UTF-8 results in an average of 2-3.

This is almost certainly an editor or web browser issue; I've created
posts with UTF-8 characters using Ecto on OS X.

> Are either of these issues worth a ticket? :-)

The first one is. I'm not sure about the second one; can you provide
more details?

Scott
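P.S. A minimal sketch of the kind of fix I have in mind, as plain Ruby (the `ensure_utf8` helper name is made up for illustration, not anything in Typo), plus a quick check of the entity-size point:

```ruby
# Hypothetical helper: force a UTF-8 charset onto a Content-Type value
# if one isn't already present. Illustrative only -- not Typo code.
def ensure_utf8(content_type)
  return content_type if content_type =~ /charset=/i
  "#{content_type}; charset=utf-8"
end

ensure_utf8("text/html")  # => "text/html; charset=utf-8"

# Why entities cost more: a numeric character reference spends one byte
# per ASCII digit, while raw UTF-8 needs at most a few bytes per character.
entity = "&#12354;"       # XML entity for U+3042 (Hiragana 'a')
raw    = "\u3042"         # the same character as raw UTF-8
entity.bytesize           # => 8
raw.bytesize              # => 3
```

For Apache alone, `AddDefaultCharset UTF-8` in the config would do it, but that still leaves WEBrick and lighttpd untouched, so setting the header inside the app is the better place.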