[EMAIL PROTECTED] wrote:

> > > How come that the time granularity of the backup processing chain
> > > does not get finer as the systems get faster ?
> > 
> > What do you understand by time granularity?
>
> I see a FIFO as a method to smooth out peaks and gaps in an
> input stream and to bring the output closer to the input
> average.
> The intensity of the peaks and gaps, i.e. the deviation from the
> average, can be characterized by the time span within which one
> may expect those irregularities to compensate each other.
> The product of this time span and the average speed determines
> the size needed for an effective buffer.
> This time span is what I mean by "time granularity".
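The relation described above (buffer size = time span x average speed) can be put into a few lines of Python; the numbers here are hypothetical, purely to illustrate the product:

```python
# Illustrative numbers only (not from the original discussion):
# if irregularities compensate within ~2 s and the stream averages
# 10 MB/s, an effective buffer needs roughly their product.
time_span_s = 2.0       # assumed time span over which peaks/gaps average out
avg_rate_mb_s = 10.0    # assumed average throughput in MB/s

buffer_size_mb = time_span_s * avg_rate_mb_s
print(buffer_size_mb)   # 20.0 (MB)
```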

Time span looks correct, but I still cannot see how you arrive at
"time granularity".

A FIFO allows you to survive a period with low input data rates.
If everything goes faster, you need to increase the size of the FIFO
in proportion to the speed improvement.
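A minimal simulation sketch of this point, assuming (as the argument implies) that the wall-clock duration of input irregularities stays fixed while the rates scale: the same burst/stall pattern at double the rates needs a FIFO twice as large. All rates and durations are made-up illustrative values.

```python
def max_fill(input_rates_mb_s, out_rate_mb_s, dt=0.1):
    """Simulate an unbounded FIFO fed at the given per-step input rates
    and drained at a constant rate; return the peak amount buffered (MB)."""
    fill = peak = 0.0
    for r_in in input_rates_mb_s:
        fill = max(0.0, fill + (r_in - out_rate_mb_s) * dt)
        peak = max(peak, fill)
    return peak

# Bursty input: 1 s at 20 MB/s, then a 1 s stall, against a 10 MB/s drain.
burst = [20.0] * 10 + [0.0] * 10

peak_1x = max_fill(burst, 10.0)                    # ~10 MB buffered
peak_2x = max_fill([2 * r for r in burst], 20.0)   # same wall-clock pattern,
                                                   # doubled rates: ~20 MB
print(peak_1x, peak_2x)
```

The stall has the same wall-clock length in both runs (seeks and tape repositioning do not shrink just because the data rates rise), so the peak fill, and hence the required FIFO size, doubles with the speed.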


> Other views which lead me to the same result:
>
> My considerations about the benefits and effectiveness of a FIFO
> have always dealt with the relative speeds of input and output,
> never with absolute speed.
> Thus one would expect that if both input and output speed
> increase in the same proportion, the effectiveness should
> stay the same. But it obviously doesn't.
>
> A simple thought experiment:
> Imagine a movie of an old backup run played back at double
> speed. The report messages about buffer size and buffer fill would
> stay exactly the same.

It's the absolute speed that counts.
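The difference between the two views can be shown with one hypothetical number pair: the double-speed movie halves the wall-clock length of every stall along with doubling the rates, so the buffer demand is unchanged, whereas a genuinely faster system doubles the rates but faces stalls of the same wall-clock length.

```python
def demand(stall_s, drain_mb_s):
    """Data (MB) the FIFO must hold to bridge a stall of the given
    wall-clock length while the output drains at the given rate."""
    return stall_s * drain_mb_s

print(demand(1.0, 10.0))   # original run:   10.0 MB
print(demand(0.5, 20.0))   # "movie at 2x":  stall halves too -> 10.0 MB
print(demand(1.0, 20.0))   # real 2x system: stall unchanged  -> 20.0 MB
```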

Jörg

-- 
 EMail:[EMAIL PROTECTED] (home) Jörg Schilling D-13353 Berlin
       [EMAIL PROTECTED]                (uni)  
       [EMAIL PROTECTED]     (work) Blog: http://schily.blogspot.com/
 URL:  http://cdrecord.berlios.de/old/private/ ftp://ftp.berlios.de/pub/schily

