I'm contemplating limiting bayes tokens to 128 chars in the tokenize
method.  Anyone see a problem with that?

My current bayes DB has 17 tokens > 128 chars, and they all look like
they are either poison tokens or fragments of uuencoded/base64 msgs;
in other words, garbage.
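For illustration, dropping over-long tokens at tokenize time might look
something like this (a Python sketch; MAX_TOKEN_LEN and tokenize are
hypothetical names, not the actual bayes tokenize method):

```python
# Sketch only: skip any token longer than the cap, on the theory that
# such tokens are poison tokens or base64/uuencoded noise that will
# never usefully recur in the bayes DB.
MAX_TOKEN_LEN = 128  # proposed cap from the message above

def tokenize(text):
    """Split on whitespace and drop tokens longer than MAX_TOKEN_LEN."""
    return [tok for tok in text.split() if len(tok) <= MAX_TOKEN_LEN]
```

An alternative would be truncating long tokens to 128 chars instead of
dropping them, but truncated base64 runs would still just be garbage
keys in the DB.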

Michael
