Hello Clarence:
On Tue, 08 May 2001 00:08:45 -0400, Clarence Verge wrote:
>> This is an interesting point for you to make, but who are you as an
>> individual living in a temperate climate zone to arbitrarily say that zero
>> degrees is the point at which it becomes bloody cold and a hundred degrees is
>> the exact mark at which it is too damn hot for the human body to withstand
>> without considerable discomfort? An Eskimo or an African tribesman might
>> have a quite different perception as to when it is bloody cold or when it is
>> too damn hot. Their sensations are just as valid to them as your sensations
>> are to you. One's chances of surviving the heat or the cold are directly
>> related to the particular climate to which his body has become accustomed.
> Well Sam, I agree with your comments re perception of discomfort, but
> the survivability is independent of adaptation at those temperatures
> assuming reasonable attire and no wind chill.
You will find that most military historians would disagree with you if you
would examine their explanations of the reasons why the armies of Napoleon
and the armies of Hitler were so soundly defeated when they made the very
stupid mistake of invading Russia in the dead of winter. The Russian
soldiers were not at all as well attired and as well equipped as their
enemies. Yet the Russians could fight and carry on and survive the elements
of the weather much more effectively than their enemies because the Russians
were quite accustomed to the severe cold. They could adapt to the conditions
while their enemies just huddled themselves together in a futile attempt to
keep themselves from getting frozen to death. Because of such observations
as these and other similar notes from history, most war strategists from all
countries now place great emphasis on the importance of first
acclimatizing their troops to weather conditions similar to whatever may be
encountered in the area where they are planning to conduct military combat
operations.
>>> I don't mind the metric system in all other respects, but the Centigrade/
>>> Celsius temperature scale is just plain stupid for relating to the human
>>> sensations.
>>> With F you know that 0 is bloody cold and 100 is too damn hot - the degrees
>>> of discomfort are about the same, and you will survive.
>>> With C you find 0 rather pleasant and at 100 you've been dead a long time.
>> With C, 0 is when water freezes and 100 is when water boils (assuming
>> a sea level mean atmospheric pressure of about 1013 millibars). This is a most
>> clearly defined scientific standard that all of us can universally relate
>> to. There are absolutely no subjective sensations involved in our
>> judgement here.
> That's the point. Regardless of your adaptation, the comfort range in C
> or K is very small compared to our ability to discriminate.
This so called ability to discriminate among ranges of numbers is just
simply attributable to our acculturation and the particular system of
mathematics we are taught. We are told that since we use the decimal
system we should think of zero to ten and ten to one hundred as orders
of magnitude. To the Aztecs and Mayans and other peoples who use a vigesimal
system, zero to twenty and twenty to four hundred are orders of magnitude.
Their concept of orders of magnitude is just as valid to them as our
concept is to us. It makes no more sense to base our mathematical system
on the number of fingers we have than to base it on the total number of
our fingers and toes. To assembly language programmers, zero to sixteen
and sixteen to two hundred fifty-six are orders of magnitude. Our so
called ability to discriminate among ranges of numbers is simply relative
to the numerical system we are using.
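For what it is worth, here is a little sketch in Python (my choice of bases
is the only assumption here) showing that the boundaries of an "order of
magnitude" fall wherever the chosen base puts them:

# Nothing is special about ten; each base draws its own boundaries.
for base, name in [(10, "decimal"), (20, "vigesimal"), (16, "hexadecimal")]:
    print(name, "orders of magnitude end at", base, "and", base ** 2)

# decimal orders of magnitude end at 10 and 100
# vigesimal orders of magnitude end at 20 and 400
# hexadecimal orders of magnitude end at 16 and 256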
>> <snip>
>> So, Kelvin is based on the point at which nothing happens. Celsius is
>> based on the points at which water changes states from liquid to solid or
>> to gaseous form. Both the Kelvin scales and the Celsius scales are
>> scientifically defined. They both make perfect sense to me.
> As the size of a degree is the same in both systems, and neither relate
> adequately to human sensations, the C scale is redundant/pointless IMHO.
My doctor once asked me how much it hurts on a scale of one to ten. I told
him seven. He could just as well have asked me how much it hurts on a
scale of one to a hundred and I would have told him seventy. Either way I
could have self-assessed my pain just as well and my answer would have made
just as much sense to my doctor. BTW, my doctor gave me a shot of morphine
and then I was happy <g>.
>> Does anyone know when and how and by whom the Fahrenheit scale was developed?
>> Is the Fahrenheit scale based on any scientific observations or theory, or is
>> it just an arbitrary scale based on somebody's subjective sensations as to
>> what's hot and what's not?
> The Fahrenheit scale was arrived at by an interesting series of mistakes.
> Fahrenheit, German by birth, thought he was using a method described by
> a contemporary Dane, Roemer, to calibrate his thermometers. He got the zero
> point right - it was the coldest temperature they could regularly achieve
> in the early 1700s - obtained by mixing ice and salt.
Very interesting! I never did know how low a temperature one could expect
to regularly achieve by this method. I've done that to make ice cream, but
I didn't have any thermometer.
> The first mistake was the high point which he misunderstood to be "BLOOD"
> heat.
Also very interesting! BTW, was "BLOOD" heat ever scientifically defined?
How is "BLOOD" heat measured? As a medic in the Army I was trained to
measure oral, rectal, and axillary temperatures and I was taught how to apply
the appropriate adjustment to conform to the "oral" standard when plotting
vital signs graphs. Whatever method I used for taking a patient's
temperature, I always had to do it according to the prescribed and defined
standards. Sometimes I used Celsius thermometers and I had to know how to
perform the conversions. This is something I had of course already learned
in high school. (In my high school days we used to criticize our teachers
for teaching us things that we all thought we would never need to know about
in "real" life. Every one of us found out about how wrong we were as soon
as we got into the "real" world.)
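Since I mentioned doing those conversions, here is the arithmetic written out
as a small Python sketch (just the standard formulas, nothing more):

def c_to_f(celsius):
    # degrees Celsius to degrees Fahrenheit
    return celsius * 9.0 / 5.0 + 32.0

def f_to_c(fahrenheit):
    # degrees Fahrenheit to degrees Celsius
    return (fahrenheit - 32.0) * 5.0 / 9.0

print(c_to_f(0))    # 32.0  -- water freezes
print(c_to_f(37))   # 98.6  -- "blood heat"
print(c_to_f(100))  # 212.0 -- water boils at standard pressure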
> The next happened when the scale was modified (rationalized) to put the
> high point at an even multiple of the freezing point of water, which was 32.
> This now made blood heat 96. (wrong u say?)
I can't say whether this is right or wrong, because I don't know what is
the prescribed standard for measuring "BLOOD" heat.
> Finally, an inaccurate measurement he made of the boiling point of water
> came out to 212 on his scale. This number later became accepted as the
> high point for calibration which, when measured properly, finally resulted
> in blood heat being 98.6F.
What do you suppose could have been the cause of his initially inaccurate
measurement for the boiling point of water? What temperature was he
expecting to read on his scale? Do you suppose that the scientific community
in those days was unaware that the atmospheric pressure is a variable that
affects the boiling point of water? Surely someone living before the early
1700s must have published some observations about the problems with
cooking things in boiling water at high altitudes.
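The size of the effect is easy enough to estimate today. Here is a rough
Python sketch, using the Antoine equation for water and a crude exponential
pressure-versus-altitude approximation; the constants are standard textbook
values, and the whole thing is only an approximation, not a claim about
what Fahrenheit actually measured:

import math

def pressure_mmHg(altitude_m):
    # crude isothermal-atmosphere estimate, ~8400 m scale height
    return 760.0 * math.exp(-altitude_m / 8400.0)

def boiling_point_c(p_mmHg):
    # Antoine equation for water; constants valid roughly 1-100 degrees C
    A, B, C = 8.07131, 1730.63, 233.426
    return B / (A - math.log10(p_mmHg)) - C

for altitude in (0, 1500, 3000):
    print(altitude, "m:", round(boiling_point_c(pressure_mmHg(altitude)), 1), "C")

# 0 m: 100.0 C
# 1500 m: 95.1 C
# 3000 m: 90.3 C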
> It WAS the 1700s and you have to admit there appears to be some human
> element in the calibration - the LACK of which is what I complained
> about re: "C".
>> Maybe we don't really need a Fahrenheit scale any more than we need a
>> Clarence scale.
> It could be that I just don't care for any kind of "C".
> Centigrade, Celsius, "C" code or the Clarence scale.
> BTW, I knew about the salt and ice and the blood heat, but I just now got the error
> detail via GOOGLE. <G>
What do you mean by the error detail?
Regards,
Sam Heywood
-- See our Big Gizmotimetemp at
-- http://banners.wunderground.com/banner/gizmotimetempbig/US/VA/Mt_Jackson.gif