Actually, if I remember correctly, he read the Japanese book of change...
called Cha-Ching, which sold millions of copies and was one of the biggest
money makers for the publishing company...

Which is also why we refer to making a lot of money as "cha-ching"... (as in
the cash register door opening...).
This is where he discovered that one means full, and zero, of course, is
very empty!!

And they bridged the gap for algebra?????

Damn, I'm out of the loop!!!!

----- Original Message -----
From: Priscilla Oppenheimer 
To: 
Sent: Thursday, August 16, 2001 12:13 PM
Subject: RE: Friday Funnie #2, Couldn't let this one go by!! [7:14809]


> In ancient India, binary numbers were used in music to classify meters.
>
> African bush tribes sent messages via a combination of high and low pitches.
>
> Australian aborigines and New Guinea tribesmen counted by twos.
>
> In 1666, Gottfried Wilhelm Leibniz wrote the essay "De Arte Combinatoria,"
> which laid out a method for expressing the laws of thought with mathematical
> precision, including binary numbers. After reading the Chinese
> "Book of Changes," or "I Ching," he refined his work and came to believe
> that binary numbers represented Creation, the number one portraying God,
> and zero depicting the Void.
>
> In the 19th Century, British mathematician George Boole invented the system
> of symbolic logic called Boolean algebra.
>
> In 1867, Charles Sanders Peirce introduced Boolean algebra to the United
> States.
>
> In 1936, Claude Shannon, may he RIP, bridged the gap between algebraic
> theory and practical application.
>
>
> At least that's what I read on the Internet, so it must be true!? ;-)
>
> Priscilla
>
> P.S. I don't think the UNIVAC I was core either.
>
>
> At 08:34 AM 8/16/01, Howard C. Berkowitz wrote:
> > >That's what I meant, Howard. I think I left out a few words, as I do
> > >that most of the time. I think much quicker than I type.
> > >
> > >My understanding of this:
> > >
> > >All computer machines were decimal [base 10] until the 40's. Atanasoff was
> > >the original one who suggested binary be used instead of base 10 to
> > >correct the computational problems that existed in measuring
> > >current/voltage. In those days with base 10, one was a little current, two
> > >was a little more, three a little more than that, and so on and so on. It
> > >was not a very good way to be accurate and was met with many failures.
> > >With the introduction of binary for measuring current, it became easy, and
> > >computers were on their way to being a successful marketing venture. One
> > >was on, zero was off. Very simple. But the original idea of the binary
> > >counting concept started with Ada.  Not in the computer sense, but in a
> > >general sense of numbers.
> > >
> > >Or at least that's what I have read.
> > >
> > >Jenn
> >
> >
> >It could have been that Ada, Lady Lovelace, did invent binary as a
> >means of representation.   There's no question that Boolean algebra,
> >and logical binary operations, come from George Boole.
> >
> >I honestly don't know who made the suggestion of binary computer
> >electronics.  It had to have taken place before the invention of
> >magnetic core memory, which is binary or, at best, ternary. Before
> >core, there were essentially analog storage devices like specialized
> >CRTs (storage as light) or mercury delay lines (storage as
> >vibrations).
> >
> >Now I'm trying to remember what was the first fully core-based
> >machine.  I want to say the ATLAS* in the UK, but I'm not sure.
> >UNIVAC I was commercial, but I don't think it was core based. The
> >first commercial core machine might have been a later UNIVAC or
> >possibly the IBM 701.  The IBM 650 -- and I actually worked in the
> >same computer room as one still chugging away before it was
> >successfully emulated -- used a magnetic head-per-track disk (called
> >a drum) as main memory.  (It was the first computer that produced the
> >Consumer Price Index, one of those applications that HAD to work).
> >
> >
> >*Our UK list members especially should learn something about the
> >history of the ATLAS, which was done at an English university and
> >pioneered a great number of computer innovations, such as interrupts.
> >It never gets the historical credit it should.
> >
> > >
> > >
> > >-----Original Message-----
> > >From: Howard C. Berkowitz [mailto:[EMAIL PROTECTED]]
> > >Sent: Sunday, August 05, 2001 4:23 AM
> > >To: Jennifer Cribbs; [EMAIL PROTECTED]
> > >Subject: RE: Friday Funnie #2, Couldn't let this one go by!! [7:14809]
> > >
> > >
> > >Not serious, but the intellectual credit here goes to George Boole--as in
> > >"boolean arithmetic."  Babbage/Lovelace machines were decimal.
> > >
> > >
> > >
> > >At 02:01 PM 8/3/2001 -0400, Jennifer Cribbs wrote:
> > >>Is this serious?
> > >>
> > >>I was under the impression that Ada Lovelace invented the binary counting
> > >>system.  I was also under the impression that John Atanasoff came up with
> > >>the brilliant coding system that expressed everything in terms of two
> > >>numbers for the methodology of measuring the current or lack of current in
> > >>regards to computers way back in the 40's.
> > >>
> > >>Before that everyone kept trying to incorporate the base 10 system in
> > >>computers, which was a major headache and unsuccessful, but that was in
> > >>the vacuum tube days.
> > >>
> > >>hmmm.  Surely Microsoft doesn't think they can do this... Maybe this is
> > >>a joke, however, and I am just too d*** serious.
> > >>
> > >>Jenn
> ________________________
>
> Priscilla Oppenheimer
> http://www.priscilla.com




