I am not involved in academics or statistics nowadays, so I don't get
here much any more, but hi to anyone who remembers me.

I am not sure who cares, but I was thinking about the concept of average
today. I have been wrestling with fractions, and came to the conclusion
that they have three different meanings. One is a process -- 2/3 means
to divide 2 by 3. The second is a number -- 2/3 is one of the rational
numbers. Finally, and completely different from the first two, there is
the fraction as a proportion. So we might say that 2/3 of the patients
died, without saying how many died or how many patients there were, or
even caring.

To illustrate 2/3, a teacher might draw a circle, divide it into 3 equal
parts, and then shade two parts to represent 2/3. This illustrates 2/3
as a proportion. I see little or no connection to the process of
division (I guess we are illustrating 1/3 * 2, but that's not the same
concept as 2/3). Similarly, the connection to the number 2/3 is tenuous
at best -- it requires assuming that the area of the circle is one,
which is very difficult to draw or depict and irrelevant to whatever is
to be done with your circle (such as to show that 2/3 = 4/6).
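
As a small aside, the three readings can be lined up in a few lines of
Python (using the standard fractions module); the patient counts below
are made up purely for illustration.

    from fractions import Fraction

    # 2/3 as a process: actually carry out the division (a decimal approximation).
    print(2 / 3)                             # 0.6666666666666666

    # 2/3 as a number: an exact rational, so 2/3 = 4/6 holds outright.
    print(Fraction(2, 3) == Fraction(4, 6))  # True

    # 2/3 as a proportion: 2 out of every 3, whatever the size of the group.
    patients, deaths = 300, 200              # made-up counts
    print(Fraction(deaths, patients))        # prints 2/3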

Similarly, I have read that a percentage is just a number, so that for
example 50% = .5. This is true, but it obscures the actual role of
percentages. We don't say, "Mary had 200% apples, and John gave her 300%
apples; how many apples does she have?" We say that 70% of the
patients died, without much caring about how many patients died or how
many patients there were. And if you had to visualize 70%, you just
might choose a shaded circle.

My thought for the day is that the average is part of this same class.
It is a number designed to talk about the features of a group, without
caring about how many things are in the group.

The impetus is the occasional student who can pass a statistics class
without getting the concept of average. I find them with my problem on
averages -- if you go to the mall and sample 20 people at random, what
do you think the average age will be? And, if you go to the mall and
sample 40 people at random, what do you think the average age will be?
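
If it helps to see the point of the mall problem, here is a minimal
simulation sketch in Python. The "mall" population of ages is completely
made up; the only claim is that the average of the sample means comes
out about the same whether you sample 20 people or 40.

    import random

    random.seed(0)

    # A made-up "mall" population of ages, purely hypothetical.
    population = [random.randint(5, 80) for _ in range(10_000)]

    def mean(xs):
        return sum(xs) / len(xs)

    # Repeat each survey many times and average the sample means.
    for n in (20, 40):
        sample_means = [mean(random.sample(population, n)) for _ in range(2_000)]
        print(f"n = {n:2d}: average of the sample means = {mean(sample_means):.2f}")

    print(f"population mean = {mean(population):.2f}")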

So then I started asking what the concept of average actually is. The
information "the expected average is the same no matter how many people
you randomly sample" did not seem to be a part of the concept of
average. (I would guess that people can have the concept of average
without having the concept of expected average or random sampling.) The
concept of average isn't the formula. I can argue this better for the
concept of standard deviation, which has three different formulas. Or,
to be more precise, the average is computed by a formula, but there is a
more primitive concept, and the formula for calculating the average is
just one of many formulas we could have used to put a number on that
primitive concept (albeit probably the best one for the job).
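
For what it's worth, one way to read "three different formulas" for the
standard deviation is: the definitional (divide-by-n) form, the
algebraically equivalent computational form, and the divide-by-(n-1)
sample form. A short Python sketch, with an arbitrary data set:

    import math

    data = [4, 8, 6, 5, 3, 7]   # any small data set will do
    n = len(data)
    m = sum(data) / n           # the mean

    # Definitional (population) form: root mean squared deviation from the mean.
    sd_definitional = math.sqrt(sum((x - m) ** 2 for x in data) / n)

    # Computational ("raw score") form: algebraically the same as the one above.
    sd_computational = math.sqrt(sum(x ** 2 for x in data) / n - m ** 2)

    # Sample form: divides by n - 1 instead of n, so it gives a larger number.
    sd_sample = math.sqrt(sum((x - m) ** 2 for x in data) / (n - 1))

    print(sd_definitional, sd_computational, sd_sample)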

Sorry if all this was obvious, and thanks for listening; it helped me to
think it through. Looking back, this view of average doesn't seem very
controversial, but perhaps the view of fractions and percentages is
different. I am now attaching averages to this very fundamental and
hopefully primitive notion of describing a set independently of the
number of items in the set.

Bob
