Suppose we measure a process and determine that the average is
100 +/- 8% with 99% confidence; that is, the average is between
92 and 108.

We improve the process, and when we measure again, the average
is 120 +/- 5% with 99% confidence; that is, the average is
between 114 and 126.  Assume that a larger number is an improvement.

What is the percentage of the improvement?  And what is the
confidence interval, if any, of that percentage improvement?

Do we simply state the improvement of the average (20%)?

That does not seem right to me.  The improvement might be as low
as 5.6% (114 vs. 108) or as high as 37.0% (126 vs. 92).
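A quick sketch (in Python) of where those extremes come from, assuming
the improvement is computed as new/old - 1 and taking the interval
endpoints at face value:

```python
# Interval endpoints from the two measurements (99% confidence).
old_lo, old_hi = 92, 108    # 100 +/- 8%
new_lo, new_hi = 114, 126   # 120 +/- 5%

# Improvement as a relative change: new/old - 1.
worst = new_lo / old_hi - 1   # smallest plausible improvement
best  = new_hi / old_lo - 1   # largest plausible improvement

print(f"{worst:.1%} to {best:.1%}")   # prints "5.6% to 37.0%"
```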

Or do we state the improvement as the average of the deltas over the
9 pairings of the low, average, and high of each range (20.5%)?

(The 9 pairs are given in the table below.)

If so, what is the confidence interval?  Is it based on the
distribution of the 9 deltas, for example +/- 8.5 percentage
points with 99% confidence?

Or do we state the improvement simply as the extremes of the
deltas, namely "5.6% to 37%"?

If so, would we add:  "with an average improvement of 20%",
based on the delta of the averages?

-- Ken Mintz


Table of Improvements (delta = new/old - 1)
-------------------------------------------

old\new    114    120    126
-------  -----  -----  -----
    92:  23.9%  30.4%  37.0%
   100:  14.0%  20.0%  26.0%
   108:   5.6%  11.1%  16.7%
=================================================================
Instructions for joining and leaving this list, remarks about the
problem of INAPPROPRIATE MESSAGES, and archives are available at:
.                  http://jse.stat.ncsu.edu/                    .
=================================================================