So basically, the current best compressors use:
- Many predictors combined
- Arithmetic coding
- Huffman coding too
- Grouping of related words, e.g. sister/brother
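
On the first point ("many predictors combined"): in context-mixing compressors 
of the PAQ family, each model outputs a probability for the next bit and a 
mixer combines them by weighted logistic mixing. A minimal sketch of that idea 
(my own illustration, not any particular program's code; the class name and 
learning rate are made up):

```python
import math

def stretch(p):   # logit: probability -> unbounded score
    return math.log(p / (1.0 - p))

def squash(x):    # inverse logit: score -> probability
    return 1.0 / (1.0 + math.exp(-x))

class Mixer:
    """Combine several bit predictions with learned weights
    (PAQ-style logistic mixing); names are illustrative."""
    def __init__(self, n_models, lr=0.02):
        self.w = [0.0] * n_models   # one weight per model
        self.lr = lr
        self.st = [0.0] * n_models  # stretched inputs from last mix()
    def mix(self, probs):
        # Mix in the logistic domain, then map back to a probability.
        self.st = [stretch(min(max(p, 1e-6), 1.0 - 1e-6)) for p in probs]
        return squash(sum(w * s for w, s in zip(self.w, self.st)))
    def update(self, p_mixed, bit):
        # Online gradient step on coding cost: accurate models gain weight.
        err = bit - p_mixed
        self.w = [w + self.lr * err * s for w, s in zip(self.w, self.st)]
```

Feeding it a stream where one model is accurate and another is uninformative 
drives the first model's weight up, so the mixed prediction tracks the better 
model.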

Predicting the next bit is a completely separate step from the Huffman coding, 
if I'm correct here. Is the arithmetic coding also a separate, additive step, 
like the Huffman coding? If not, how does arithmetic coding work with the 
predictions or with Huffman?
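
For context on the arithmetic-coding question: in the context-mixing designs I 
know of, arithmetic coding is not an extra stage on top of Huffman; it replaces 
Huffman. The predictor and the coder form one loop: the model outputs 
P(next bit = 1), the coder narrows an interval by that fraction, and both 
sides update the model identically, so no code table is transmitted. A minimal 
sketch, assuming a PAQ-style 32-bit binary coder with an order-0 adaptive 
model (all names here are mine):

```python
MASK = 0xFFFFFFFF  # 32-bit coder state

class Predictor:
    """Order-0 adaptive model: P(1) from smoothed bit counts, 12-bit scale."""
    def __init__(self):
        self.counts = [1, 1]
    def p1(self):
        p = self.counts[1] * 4096 // (self.counts[0] + self.counts[1])
        return max(1, min(4095, p))  # keep strictly inside (0, 4096)
    def update(self, bit):
        self.counts[bit] += 1

class Encoder:
    def __init__(self):
        self.low, self.high, self.out = 0, MASK, bytearray()
    def encode(self, bit, p1):
        mid = self.low + ((self.high - self.low) * p1 >> 12)
        if bit:
            self.high = mid     # 1 takes the lower [low, mid] slice
        else:
            self.low = mid + 1  # 0 takes the upper (mid, high] slice
        while ((self.low ^ self.high) & 0xFF000000) == 0:
            self.out.append(self.high >> 24)  # top byte settled: emit it
            self.low = (self.low << 8) & MASK
            self.high = ((self.high << 8) | 0xFF) & MASK
    def flush(self):
        for _ in range(4):      # any value in [low, high] works; use high
            self.out.append(self.high >> 24)
            self.high = (self.high << 8) & MASK

class Decoder:
    def __init__(self, blob):
        self.blob = bytes(blob) + b"\x00" * 4  # padding for lookahead
        self.low, self.high = 0, MASK
        self.x = int.from_bytes(self.blob[:4], "big")
        self.pos = 4
    def decode(self, p1):
        mid = self.low + ((self.high - self.low) * p1 >> 12)
        bit = 1 if self.x <= mid else 0
        if bit:
            self.high = mid
        else:
            self.low = mid + 1
        while ((self.low ^ self.high) & 0xFF000000) == 0:
            self.low = (self.low << 8) & MASK
            self.high = ((self.high << 8) | 0xFF) & MASK
            self.x = ((self.x << 8) & MASK) | self.blob[self.pos]
            self.pos += 1
        return bit

def compress(data):
    enc, model = Encoder(), Predictor()
    for byte in data:
        for i in range(7, -1, -1):
            bit = (byte >> i) & 1
            enc.encode(bit, model.p1())  # coder uses the prediction...
            model.update(bit)            # ...then the model learns the bit
    enc.flush()
    return bytes(enc.out)

def decompress(blob, n_bytes):
    dec, model = Decoder(blob), Predictor()  # identical model on both sides
    out = bytearray()
    for _ in range(n_bytes):
        byte = 0
        for _ in range(8):
            bit = dec.decode(model.p1())
            model.update(bit)
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)
```

The key point the sketch shows: the decoder rebuilds the same predictions from 
the bits it has already decoded, which is why the prediction step and the 
coding step belong to one scheme rather than being separate add-ons.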

As for the sister/brother clustering: do you predict the next bit, and seeing 
one word makes both more likely? Or is it about compressing a dictionary for 
the Huffman and/or arithmetic coding?
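
On the clustering question, my understanding (an assumption on my part, not a 
fact about any specific compressor) is closer to the former: grouped words 
share statistics, so evidence seen after "sister" also raises the model's 
predictions after "brother". A toy sketch with a hand-made cluster table 
(everything here is invented for illustration):

```python
from collections import defaultdict

# Hypothetical hand-made clusters; a real compressor would learn or
# ship such groupings (e.g. via a dictionary preprocessor).
CLUSTER = {"sister": "SIBLING", "brother": "SIBLING"}

def cluster_of(word):
    return CLUSTER.get(word, word)

class ClusterModel:
    """Count next-word frequencies keyed by the CLUSTER of the previous
    word, so sister/brother pool their evidence."""
    VOCAB = 1000  # nominal vocabulary size for additive smoothing
    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))
    def update(self, prev, word):
        self.counts[cluster_of(prev)][word] += 1
    def prob(self, prev, word):
        ctx = self.counts[cluster_of(prev)]
        total = sum(ctx.values())
        return (ctx[word] + 1) / (total + self.VOCAB)
```

Training on "sister said" raises the probability of "said" after "brother" 
too, which is the sense in which both become more likely.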
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T2d0576044f01b0b1-Mdd9adf8e22dbe59d34dfa607
Delivery options: https://agi.topicbox.com/groups/agi/subscription
