Douglas A. Tutty wrote:
I tried making a very sparse file (100 MB data, 1000 GB sparseness) and
gave up trying to compress it.  gzip has to process the whole thing,
sparseness and all.  Sure it would probably end up with a very small
file, but the whole thing has to be processed.

Yes, it does, and I'm not sure anyone said it would be less work; I certainly didn't. The whole thing does need to be processed, and I demonstrated that with the time it takes to rsync with a sparse file versus without: in my test, 45+ minutes as opposed to 17 seconds.
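The gzip point above can be sketched with a small illustrative example (sizes here are much smaller than the 1000 GB case in the thread; filenames are made up). The kernel synthesizes zeros when the hole is read, so gzip has to stream the full apparent size even though almost nothing is on disk:

```shell
# A fully sparse file: 100 MB apparent size, no blocks allocated.
truncate -s 100M hole.img
du -h hole.img                       # ~0: nothing is actually on disk

# gzip still has to read all 100 MB of zeros the kernel returns for
# the hole, so it takes full read time despite the tiny result.
time gzip -c hole.img > hole.img.gz
ls -lh hole.img.gz                   # very small, but the work was done
```

So the compressed file does end up small, as Douglas expected, but only after every byte of the hole has been read and compressed.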

I imagine it takes no less time than rsync does to process it.  Rsync
takes lots of time and computation but saves on bandwidth.

Yes, it's a lot of processing, a lot of time wasted, and a lot of CPU power wasted. And in the case of rsync, if you don't use -S, you can't even sync the file when the free space on the destination is smaller than the apparent size of the sparse file, as opposed to just the real data part.

The short of it is that sparse files are a good thing as long as you don't have to copy them across filesystems on different servers, in which case it's a whole different ball game.

It's been interesting learning and testing anyway.

Hopefully it was useful to others; if not, it was to me anyway.

Best,

Daniel
