James Cooley wrote:

...I'm sure there's a way to come up with a compression scheme that's tailored to the sort of sample data we see...

A topic for a dissertation if ever there was one :-)

In the general case, you're going to have to work hard to beat Ziv-Lempel. You might improve performance somewhat by tweaking the parameters, but probably not by much.
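To make the "probably not by much" concrete, here's a minimal sketch sweeping zlib's effort parameter on hard-to-compress input (random bytes stand in for densely packed, high-entropy sample data; the sizes and levels are illustrative, not a claim about any particular GNU Radio stream):

```python
import os
import zlib

# Random bytes as a stand-in for wideband, densely packed samples:
# already close to maximum entropy, so DEFLATE has nothing to grab.
data = os.urandom(64 * 1024)

# Sweep the DEFLATE effort parameter (1 = fastest, 9 = best).
# For high-entropy input the output sizes barely move, and all of
# them come out slightly LARGER than the input (header + framing).
sizes = {level: len(zlib.compress(data, level)) for level in (1, 6, 9)}
print(sizes)
```

The point is that parameter tuning shifts speed, not the entropy floor: on incompressible data every level degenerates to roughly stored blocks plus overhead.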

The problem is that, in the general case, the entropy rate of the source puts a hard floor under the compressed size: the higher the entropy, the lower the compressibility. Ziv-Lempel wins exactly where the conditional entropy (given the recent past) is lower than the marginal entropy, i.e. where the source is Markov.
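A quick way to see the Markov effect: on a source whose symbols are nearly determined by the previous ones, Ziv-Lempel beats the zeroth-order (i.i.d.) entropy bound, because it is implicitly exploiting the much lower conditional entropy. A toy sketch (the repeating-byte source is my own example, not from the thread):

```python
import math
import zlib
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Empirical zeroth-order (i.i.d.) entropy in bits per byte."""
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A strongly Markov source: each byte is fully determined by the last.
text = b"abc" * 1500

h0 = entropy_bits_per_byte(text)                  # ~log2(3) = 1.58 bits/byte
lz = 8 * len(zlib.compress(text)) / len(text)     # far below h0
print(f"zeroth-order entropy: {h0:.3f} bits/byte, LZ: {lz:.3f} bits/byte")
```

A memoryless coder (e.g. a plain Huffman code over single bytes) can't do better than h0 here; Ziv-Lempel sails under it because the conditional entropy of this source is essentially zero.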

For most human-generated signals, the influence of the past (the markovity) decays exponentially with lag. For general signals, the decay is very rapid, so there's going to be a lot of slop at the boundaries of the underlying Markov model. There's also a theorem in there somewhere (due to Rissanen, I think) that says there's a limit on how much of that slop you can trim by tuning the adaptation.

Where you might win is by picking different compression algorithms depending on the signal. For example, for voice-bandwidth channels, you might gain a lot by first converting to mu-law and then gzipping. And so on. For wide, densely packed signals, you can probably forget it: the fact that they're densely packed means they're already high-entropy.
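The mu-law-then-gzip idea can be sketched like this. The companding formula is the standard mu-law curve (mu = 255); the decaying 1 kHz test tone at 8 kHz sampling is a made-up stand-in for a voice-band signal, so the exact sizes are illustrative only:

```python
import math
import struct
import zlib

MU = 255.0

def mulaw_encode(sample: float) -> int:
    """Compand a float in [-1, 1] to an 8-bit mu-law code (sign in the top bit)."""
    s = max(-1.0, min(1.0, sample))
    mag = math.log1p(MU * abs(s)) / math.log1p(MU)   # standard mu-law curve
    code = int(mag * 127)
    return (code | 0x80) if s >= 0 else code

# Hypothetical voice-band signal: one second of a decaying 1 kHz tone at 8 kHz.
samples = [0.5 * math.exp(-n / 4000.0) * math.sin(2 * math.pi * 1000 * n / 8000.0)
           for n in range(8000)]

raw16 = b"".join(struct.pack("<h", int(s * 32767)) for s in samples)  # 16-bit PCM
mulaw = bytes(mulaw_encode(s) for s in samples)                       # 8-bit mu-law

# Companding halves the byte count AND coarsens the quantization, so the
# mu-law stream gzips far better than the raw PCM does.
print(len(zlib.compress(raw16)), len(zlib.compress(mulaw)))
```

The design point is that mu-law throws away precision the ear doesn't need, which both shrinks the input and makes runs of samples repeat exactly, so the Ziv-Lempel stage behind gzip suddenly has matches to find.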

Regards
Frank



_______________________________________________
Discuss-gnuradio mailing list
Discuss-gnuradio@gnu.org
http://lists.gnu.org/mailman/listinfo/discuss-gnuradio
