Re: [Simh] Why 36-bit computing?
As I recall from way back, wasn't the 36-bit word potentially split into 32 bits of data and a 4-bit offset, to allow a fast jump to the next "card" in the deck on a branch? Not that (m)any implemented this, but I seem to recall it from my early days at ADP 30+ years ago. With the advent of fast memory and disk drives this effectively became unnecessary, but mainframe architecture took a while to adjust.

Dave

-----Original Message-----
From: simh-boun...@trailing-edge.com [mailto:simh-boun...@trailing-edge.com] On Behalf Of Ian King
Sent: 19 March 2013 19:05
To: simh@trailing-edge.com
Subject: Re: [Simh] Why 36-bit computing?

[ ... ]
Re: [Simh] Why 36-bit computing?
> -----Original Message-----
> From: simh-boun...@trailing-edge.com [mailto:simh-bounces@trailing-edge.com] On Behalf Of Michael Mondy
> Sent: Tuesday, March 19, 2013 7:37 AM
> To: simh@trailing-edge.com
> Subject: [Simh] Why 36-bit computing?
>
> [ ... ]
>
> By the time IBM introduced System/360, scientific calculations had shifted to floating point and mechanical calculators were no longer a competitor. [...]
> [ At which point the advantages of using powers of two became more important than feature parity with mechanical calculators. ]

The following is from a biography of Fred Brooks, the project manager for the IBM 360, on UNC-Chapel Hill's Computer Science department website (http://www.cs.unc.edu/cms/our-people/faculty/frederick-p.-brooks-jr):

"In 1957, Dr. Brooks and Dura Sweeney invented a Stretch interrupt system that introduced most features of today's interrupt systems. Dr. Brooks coined the term computer architecture. His System/360 team first achieved strict compatibility, upward and downward, in a computer family. His early concern for word processing led to his selection of the 8-bit byte and the lowercase alphabet for the System/360, engineering of many new 8-bit input/output devices, and providing a character-string datatype in PL/I."

Keep in mind that the S/360 was not targeted only at scientific computation. It was intended to consolidate IBM's customer bases.

-- Ian
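Brooks's lowercase-alphabet point is easy to quantify: a 6-bit character code has only 2^6 = 64 code points, too few to cover upper- and lowercase letters, digits, and punctuation together, which is why six-bit codes of the era were uppercase-only. A minimal sketch in Python, using ASCII's printable repertoire purely for the count (the S/360 itself used EBCDIC, not ASCII):

    import string

    # Upper- and lowercase letters, digits, punctuation, and space.
    printable = string.ascii_letters + string.digits + string.punctuation + " "
    print(len(printable))           # 95 distinct printable characters
    print(len(printable) <= 2**6)   # False: 64 code points can't include lowercase too
    print(len(printable) <= 2**8)   # True: an 8-bit byte has room to spare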
Re: [Simh] Why 36-bit computing?
My very first computer as a kid was a 37-bit computer we had at school, which I reckon must be considered superior to 36-bit computers.

From: Michael Mondy
To: simh@trailing-edge.com
Sent: Tue, March 19, 2013 7:39:07 AM
Subject: [Simh] Why 36-bit computing?

[ ... ]
Re: [Simh] Why 36-bit computing?
> By the time IBM introduced System/360, scientific calculations had shifted to floating point and mechanical calculators were no longer a competitor.

... but the PDP-6 and the S/360 are roughly contemporary - both were introduced around 1964. I guess IBM was more forward-thinking than DEC was at the time :-) Of course, DEC had a prior history with 18-bit computers, so I'm sure 36 seemed like a more natural choice.

Bob Armstrong
[Simh] Why 36-bit computing?
On Tue, Mar 19, 2013 at 02:43:10PM +0100, Johnny Billquist wrote:
> [ ... ]
>
> It wasn't just DEC. Back in the day, most everyone used word lengths that weren't a power of two. I can't really comment much on why other word lengths were more popular. I've seen it mentioned that floating-point formats were pretty nice to do with something like 60 or 72 bits, the reason being that you had large enough exponents for useful things, and enough precision for most calculations. So a word length that related to this made sense.
>
> Number of bits being a power of two started with IBM in the 60s, and became common with the PDP-11 in the 70s. (Or so I'd like to think.)
>
> Johnny

Wikipedia has an article on 36-bit computing:
http://en.wikipedia.org/wiki/36-bit

Snipped from the wikipedia article:

[ ... ]

Many early computers aimed at the scientific market had a 36-bit word length. This word length was just long enough to represent positive and negative integers to an accuracy of ten decimal digits (35 bits would have been the minimum). It also allowed the storage of six alphanumeric characters encoded in a six-bit character encoding. Prior to the introduction of computers, the state of the art in precision scientific and engineering calculation was the ten-digit, electrically powered, mechanical calculator, such as those manufactured by Friden, Marchant and Monroe. These calculators had a column of keys for each digit, and operators were trained to use all their fingers when entering numbers, so while some specialized calculators had more columns, ten was a practical limit. Computers, as the new competitor, had to match that accuracy. Decimal computers sold in that era, such as the IBM 650 and the IBM 7070, had a word length of ten digits, as did ENIAC, one of the earliest computers.

[ ... ]

By the time IBM introduced System/360, scientific calculations had shifted to floating point and mechanical calculators were no longer a competitor. [...] [ At which point the advantages of using powers of two became more important than feature parity with mechanical calculators. ]

-- Mike
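The snippet's arithmetic is easy to verify: the largest ten-decimal-digit magnitude, 10^10 - 1, needs 34 bits, so a signed value needs 35, and six 6-bit characters fill a 36-bit word exactly. A quick check in Python (a sketch of the arithmetic only; the character codes below are arbitrary, hypothetical 6-bit values, not a real encoding):

    # Ten signed decimal digits: 34 bits of magnitude + 1 sign bit = 35 bits minimum.
    magnitude_bits = (10**10 - 1).bit_length()
    print(magnitude_bits, magnitude_bits + 1)     # 34 35

    # Packing six 6-bit character codes into one 36-bit word.
    codes = [0o27, 0o11, 0o16, 0o24, 0o05, 0o22]  # arbitrary 6-bit values
    word = 0
    for c in codes:
        word = (word << 6) | (c & 0o77)           # shift in 6 bits per character
    assert word < 2**36                           # the six codes fit a 36-bit word exactly

That matches the quoted text: 35 bits is the minimum for ten signed decimal digits, and a 36-bit word holds exactly six six-bit characters.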