On Sun, Feb 04, 2007 at 03:46:41PM -0800, Allen wrote:
> An idle question. English has a relatively low entropy as a
> language. Don't recall the exact figure, but if you look at words
> that start with "q" it is very low indeed.
I seem to recall Shannon did some experiments which showed that with
a human as your probability oracle, it's roughly 1-2 bits per letter.
Many of his papers were online last time I looked, but some of his
experimental results are harder to locate.

> What about other languages? Does anyone know the relative entropy
> of other alphabetic languages? What about the entropy of
> ideographic languages? Pictographic? Hieroglyphic?

IIRC, it turned out that Egyptian hieroglyphs were actually syllabic,
like Mesopotamian scripts, so no fun there. Mayan, on the other hand,
remains an enigma. I read not long ago that they also had a way of
recording stories on bundles of knotted string, like the end of a mop.
--
The driving force behind innovation is sublimation.
-><- <URL:http://www.subspacefield.org/~travis/>
For a good time on my UBE blacklist, email [EMAIL PROTECTED]
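P.S. For anyone who wants to see where those numbers come from: a plain
letter-frequency count gives the order-0 entropy, around 4 bits per
letter for English, and it is Shannon's guessing experiments (humans
predicting the next letter from context) that push the estimate down to
roughly 1-2 bits. A quick frequency-only sketch in Python (not
Shannon's actual method, just the zeroth-order upper bound):

```python
from collections import Counter
import math

def entropy_per_symbol(text):
    """Order-0 Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(text)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Any English sample will do; this one just has a spread of letters.
sample = "the quick brown fox jumps over the lazy dog " * 20
print(entropy_per_symbol(sample))
```

Context-aware models (digraph/trigraph statistics, or a human guesser)
only lower this figure, which is the point of the 1-2 bit estimate.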