The following message is a courtesy copy of an article
that has been posted to bit.listserv.ibm-main,alt.folklore.computers as well.


peter.nutt...@euroclear.com (Peter Nuttall) writes:
> I remember working as an operator on shift at a site in London (mid
> 80's).  Think we had a 3081/3084.  The water cooling was done by a
> bank of radiators out in the car park. The site was set on a
> roundabout and during the summer months the radiators would get so
> clogged up with the heat/congestion and general rubbish flying about
> that they employed a guy to just hose down the radiators during the
> peak times of the day ... Water cooling the water cooling system
> ... :-).

re:
http://www.garlic.com/~lynn/2010d.html#43 What was old is new again (water chilled)

from this thread
http://www.garlic.com/~lynn/2010c.html#78 SLIGHTLY OT - Home Computer of the Future (not IBM)

discussing the amount of heat and cooling required for old computers.

much lower density of circuits ... so that air flow could be used to
pull heat away from the circuits ... but the aggregate heat was such
that room air cooling units required lots of power and water.
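
as a rough back-of-the-envelope sketch (python; the kW load figure is purely
an assumption for illustration, not a number for any particular installation):
essentially all of the electrical load in a machine room ends up as heat that
the room cooling plant has to remove

# all the electrical load in the machine room ends up as heat that the
# room cooling plant has to remove; the load figure below is an assumption
# for illustration only
MACHINE_ROOM_LOAD_KW = 100.0          # assumed total electrical load
BTU_PER_HR_PER_KW = 3412              # 1 kW = roughly 3412 BTU/hr
KW_PER_TON_REFRIGERATION = 3.517      # 1 ton of refrigeration = 3.517 kW

btu_per_hour = MACHINE_ROOM_LOAD_KW * BTU_PER_HR_PER_KW
tons_of_cooling = MACHINE_ROOM_LOAD_KW / KW_PER_TON_REFRIGERATION

print(f"{MACHINE_ROOM_LOAD_KW:.0f} kW of gear -> {btu_per_hour:,.0f} BTU/hr"
      f" -> about {tons_of_cooling:.0f} tons of refrigeration")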

also mentioned in the above, the science center's 360/67 machine room
cooling had a large city water pipe coming in, going thru the cooling unit,
and then dumping directly into the city sewer system.

in the mid-70s, the city started making noises about conserving water.
the problem was that it would have required a water tower on the roof to
recycle the water ... and the building hadn't been constructed to handle
such loading (some comments from the period were that few of the large
multi-story office buildings had been built to handle such loading; also,
the science center's 360/67 datacenter wasn't the only one in the
building). misc. past posts mentioning the science center at 545 tech sq
(offices were on 4th flr, machine room was on the 2nd flr)
http://www.garlic.com/~lynn/subtopic.html#545tech

there was an interesting problem with the (new) almaden research bldg.
the amount of computing in the machine room was dropping ... and the
mainframe computers were getting smaller ... so the machine room was
starting to have empty spaces. however, pc/rts were starting to be placed
in most of the offices. the pc/rt used quite a bit of power and produced
a lot of heat.

a conservation notice started reminding people to turn the machines off
at the end of the day. the problem was that the building air conditioning
system hadn't been designed to handle the enormous fluctuation in heat
(with all the PC/RTs being turned on in the morning and then turned off
at the end of the day) while still maintaining a stable/comfortable
temperature throughout the building. The solution to avoid the wide
swings in temperature ... was to start leaving the machines on.

The building had lots of high-tech wiring for each office ...  a large
part was CAT5 for 16mbit T/R. However, some early tests found that
star-wired (10mbit) ethernet running over CAT5 ... actually had both
higher aggregate data transfer thruput and lower latency than
16mbit T/R (over the same CAT5).
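
the methodology of those early tests isn't described here ... but a minimal
modern sketch of that kind of comparison (timing a small round trip for
latency, and a bulk transfer for aggregate thruput, between two hosts) might
look like the following python; "peer_host" and the classic inetd
echo/discard services are assumptions, not anything from the original tests:

import socket
import time

PEER = "peer_host"               # hypothetical peer on the same LAN
ECHO_PORT, DISCARD_PORT = 7, 9   # classic inetd echo/discard services, assumed enabled
CHUNK = 64 * 1024
TOTAL = 16 * 1024 * 1024         # 16 MB bulk transfer for the thruput test

# latency: time one small round trip against the echo service
with socket.create_connection((PEER, ECHO_PORT)) as s:
    t0 = time.perf_counter()
    s.sendall(b"x" * 64)
    remaining = 64
    while remaining:
        data = s.recv(remaining)
        if not data:
            break
        remaining -= len(data)
    print(f"round trip: {(time.perf_counter() - t0) * 1000:.2f} ms")

# thruput: push a fixed amount of data at the discard service and time it
with socket.create_connection((PEER, DISCARD_PORT)) as s:
    t0 = time.perf_counter()
    sent = 0
    while sent < TOTAL:
        s.sendall(b"\0" * CHUNK)
        sent += CHUNK
    elapsed = time.perf_counter() - t0   # approximate: ignores data still in kernel buffers
    print(f"thruput: {sent * 8 / elapsed / 1e6:.1f} Mbit/sec")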

We started to make use of that datapoint ... as well as several others.
We had come up with a 3tier network architecture (with lots of routers
and ethernet) and were out pitching it to customer executives ... bringing
down the wrath of the communication group, which was pushing T/R and
terminal emulation and trying to suppress 2tier/client-server.
http://www.garlic.com/~lynn/subnetwork.html#3tier
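
purely as a conceptual sketch of the 3tier split (python; host names and
ports are hypothetical, not anything from the actual pitch): desktop clients
(tier 1) talk to a departmental middle-tier server (tier 2), which in turn
is the only thing talking to the backend host (tier 3), rather than every
desktop running its own terminal-emulation session straight to the mainframe:

# tier 1: desktop clients; tier 2: this departmental server; tier 3: backend host.
# host names and port numbers are hypothetical.
import socket
import socketserver

BACKEND = ("backend_host", 9000)      # hypothetical tier-3 service

class MiddleTierHandler(socketserver.StreamRequestHandler):
    def handle(self):
        request = self.rfile.readline()     # one request line from a tier-1 client
        # the middle tier could cache, aggregate, or reformat here, rather than
        # every desktop holding its own session to the backend
        with socket.create_connection(BACKEND) as backend:
            backend.sendall(request)
            reply = backend.makefile("rb").readline()
        self.wfile.write(reply)

if __name__ == "__main__":
    # tier-2 server listening for desktop clients on the local LAN
    with socketserver.ThreadingTCPServer(("", 8000), MiddleTierHandler) as srv:
        srv.serve_forever()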

-- 
42yrs virtualization experience (since Jan68), online at home since Mar1970

