Re: MD: ATRAC lossless compression techniques...

2001-01-12 Thread Stainless Steel Rat
* Anthony Lalande [EMAIL PROTECTED] on Thu, 11 Jan 2001 | Does any lossless compression algorithm require the entire set of data for | read access before it begins compression? No. In fact, none do. Conventional compression algorithms operate on fixed-size blocks of data. Real-time
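A minimal sketch of the block-at-a-time idea, using Python's zlib (DEFLATE rather than any ATRAC-style codec, but the streaming principle is the same): the compressor consumes the input one chunk at a time and never needs the whole recording up front.

```python
import zlib

def compress_stream(chunks):
    """Compress an iterable of byte chunks incrementally, never
    holding the whole input in memory at once."""
    comp = zlib.compressobj()
    out = []
    for chunk in chunks:
        out.append(comp.compress(chunk))  # may buffer some data internally
    out.append(comp.flush())              # emit whatever remains
    return b"".join(out)

# Feed the compressor one "block" at a time, as a recorder would.
data = [b"sample audio frame " * 100 for _ in range(50)]
packed = compress_stream(data)
assert zlib.decompress(packed) == b"".join(data)
```

The decompressed output is byte-identical to the concatenated input, which is exactly the lossless guarantee under discussion.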

Re: MD: ATRAC lossless compression techniques...

2001-01-12 Thread Anthony Lalande
No. In fact, none do. Conventional compression algorithms operate on fixed-size blocks of data. Real-time compression of an audio stream is easily possible with a bit of buffering. The issue is not that, but compressing fast enough that the buffer is not overrun. Well, in effect, the

Re: MD: ATRAC lossless compression techniques...

2001-01-12 Thread Stainless Steel Rat
* Anthony Lalande [EMAIL PROTECTED] on Fri, 12 Jan 2001 | I'm wondering if you would get better compression by treating the whole | stream as 1 block and then compressing that, or compressing it in many smaller | blocks. I guess it all depends on the compression used. All data compression
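The one-big-block versus many-small-blocks trade-off is easy to demonstrate with zlib (used here purely for illustration): compressing each chunk independently throws away the redundancy shared between chunks, so the total is larger than compressing the stream as a whole.

```python
import zlib

data = b"the quick brown fox jumps over the lazy dog. " * 2000

# One block: a single compressor sees all cross-chunk redundancy.
whole = len(zlib.compress(data))

# Many blocks: each chunk is compressed independently, so repeats
# that span chunk boundaries cannot be exploited.
chunk_size = 512
parts = sum(len(zlib.compress(data[i:i + chunk_size]))
            for i in range(0, len(data), chunk_size))

print(f"one block: {whole} bytes, {len(data) // chunk_size + 1} "
      f"independent blocks: {parts} bytes")
```

The gap narrows as blocks grow, which is why the block size ends up being a memory/latency versus ratio trade-off rather than "bigger is always free".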

Re: MD: ATRAC lossless compression techniques...

2001-01-12 Thread Anthony Lalande
100-900K blocks are the norm for high-level compression these days. That is what bzip2 uses, and boy is it slow even on a fast Pentium-III. One minute of linear PCM is ~10MB. You would need a supercomputer the size of a refrigerator to utilize a block size that large. Well, I can go to
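The per-minute PCM figure is straightforward arithmetic, assuming CD-quality parameters (44.1 kHz sample rate, 16-bit samples, two channels):

```python
# CD-quality linear PCM: 44,100 samples/s x 2 bytes/sample x 2 channels.
bytes_per_sec = 44_100 * 2 * 2          # 176,400 B/s
per_minute = bytes_per_sec * 60         # 10,584,000 B
print(f"{per_minute / 2**20:.2f} MiB per minute")
```

That works out to roughly 10 MiB per minute, orders of magnitude larger than the block sizes general-purpose compressors of the era worked with.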

Re: MD: ATRAC lossless compression techniques...

2001-01-11 Thread Dave Kimmel
Does any lossless compression algorithm require the entire set of data for read access before it begins compression? If you wanted to encode audio with lossless compression, could you do it in real-time, or would you need to wait until the entire recording is complete and then compress