Re: Parallel computation

2017-01-24 Thread Paul Koning

> On Jan 23, 2017, at 8:33 PM, Paul Koning <paulkon...@comcast.net> wrote:
> 
>> 
>> On Jan 23, 2017, at 5:09 PM, Toby Thain <t...@telegraphics.com.au> wrote:
>> 
>> On 2017-01-23 6:55 PM, Paul Koning wrote:
>>> 
>>>> On Jan 23, 2017, at 3:52 PM, Chuck Guzis <ccl...@sydex.com> wrote:
>>>> 
>>>> ...
>>>> It's just that I bridle a bit when hearing the young 'uns refer to any
>>>> physically large machine as a "supercomputer".
>>>> 
>>>> It's the same feeling that I get when I see press releases today that
>>>> relate that David Gelernter single-handedly developed parallel
>>>> computation.  He's not old enough; at 61, he was still in high school
>>>> during the ILLIAC IV era.
>>> 
>>> Even earlier...
>>> 
>>> From what I've read, ENIAC supported parallel computing, but in practice it 
>>> wasn't used because it was too hard to get the code right.  At least, 
>>> that's what a computer design course from 1948 states.
>>> 
>> 
>> Has this been scanned anywhere?
> 
> Yes, it's on the CWI website in Amsterdam.  The trouble for most readers is 
> that it's in Dutch.  I'm working on translating it.  Report CR3, Principles 
> of electronic computers, course Feb 1948, by A. van Wijngaarden.
> 
> His comments on ENIAC can presumably be confirmed (or refuted) from ENIAC 
> documentation.

Here is a translation of the relevant text from that report (page 6):

General strategy.

A very important question is how many operations are performed in
parallel in the machine.  Here we must distinguish between large scale
parallelism and small scale parallelism.

Large scale parallelism means the possibility of performing multiple
arithmetic operations on distinct operands at the same time.  This of
course requires multiple arithmetic elements, for example 4 adders or
2 multipliers.  The immediate benefit is faster calculation.  In
addition, such multiple arithmetic elements have a significant
additional storage capacity.  In the Eniac essentially the entire fast
storage capacity is implemented by its adder elements.  However, the
distribution of the calculation program among these elements is not a
trivial matter if we really want the whole system to work
efficiently.  For this reason, the most effective approach to making
Eniac a more manageable machine is that of von Neumann, in which 19 of
the 20 adders are simply demoted to memories.  In some machines (for
example the relay machine of Stibitz) great ingenuity has been applied
to allow the machine to be split arbitrarily into a number of
independent machines, or to act as a single unit.



So the implication is that Eniac is a 20-unit MIMD computer.  Amazing.
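For readers unfamiliar with the term: MIMD ("multiple instruction, multiple data") means each processing element runs its own instruction stream on its own operands, as opposed to SIMD, where all units execute the same instruction in lockstep.  A minimal Python sketch of the idea (purely illustrative, not an ENIAC model; each thread stands in for one arithmetic element):

```python
# Minimal MIMD illustration: each worker runs its *own* operation on
# its *own* operands, concurrently -- unlike SIMD, where every unit
# performs the same operation in lockstep.
from concurrent.futures import ThreadPoolExecutor
from operator import add, mul, sub

# Three independent "arithmetic elements", each paired with its own
# instruction and its own data (the values are arbitrary examples).
units = [(add, 2, 3), (mul, 4, 5), (sub, 9, 1)]

with ThreadPoolExecutor(max_workers=len(units)) as pool:
    futures = [pool.submit(op, a, b) for op, a, b in units]
    results = [f.result() for f in futures]

print(results)  # [5, 20, 8]
```

The point of the sketch is only the shape of the computation: distinct operations on distinct operands in flight at once, which is what van Wijngaarden's "large scale parallelism" describes.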

paul




Re: Parallel computation

2017-01-23 Thread Paul Koning

> On Jan 23, 2017, at 5:09 PM, Toby Thain <t...@telegraphics.com.au> wrote:
> 
> On 2017-01-23 6:55 PM, Paul Koning wrote:
>> 
>>> On Jan 23, 2017, at 3:52 PM, Chuck Guzis <ccl...@sydex.com> wrote:
>>> 
>>> ...
>>> It's just that I bridle a bit when hearing the young 'uns refer to any
>>> physically large machine as a "supercomputer".
>>> 
>>> It's the same feeling that I get when I see press releases today that
>>> relate that David Gelernter single-handedly developed parallel
>>> computation.  He's not old enough; at 61, he was still in high school
>>> during the ILLIAC IV era.
>> 
>> Even earlier...
>> 
>> From what I've read, ENIAC supported parallel computing, but in practice it 
>> wasn't used because it was too hard to get the code right.  At least, that's 
>> what a computer design course from 1948 states.
>> 
> 
> Has this been scanned anywhere?

Yes, it's on the CWI website in Amsterdam.  The trouble for most readers is 
that it's in Dutch.  I'm working on translating it.  Report CR3, Principles of 
electronic computers, course Feb 1948, by A. van Wijngaarden.

His comments on ENIAC can presumably be confirmed (or refuted) from ENIAC 
documentation.

paul




Re: Parallel computation

2017-01-23 Thread Toby Thain

On 2017-01-23 6:55 PM, Paul Koning wrote:
> 
>> On Jan 23, 2017, at 3:52 PM, Chuck Guzis <ccl...@sydex.com> wrote:
>> 
>> ...
>> It's just that I bridle a bit when hearing the young 'uns refer to any
>> physically large machine as a "supercomputer".
>> 
>> It's the same feeling that I get when I see press releases today that
>> relate that David Gelernter single-handedly developed parallel
>> computation.  He's not old enough; at 61, he was still in high school
>> during the ILLIAC IV era.
> 
> Even earlier...
> 
> From what I've read, ENIAC supported parallel computing, but in practice it 
> wasn't used because it was too hard to get the code right.  At least, that's 
> what a computer design course from 1948 states.
> 
> paul

Has this been scanned anywhere?

--Toby




Parallel computation (was: IBM 7074 and then some)

2017-01-23 Thread Paul Koning

> On Jan 23, 2017, at 3:52 PM, Chuck Guzis <ccl...@sydex.com> wrote:
> 
> ...
> It's just that I bridle a bit when hearing the young 'uns refer to any
> physically large machine as a "supercomputer".
> 
> It's the same feeling that I get when I see press releases today that
> relate that David Gelernter single-handedly developed parallel
> computation.  He's not old enough; at 61, he was still in high school
> during the ILLIAC IV era.

Even earlier...

From what I've read, ENIAC supported parallel computing, but in practice it 
wasn't used because it was too hard to get the code right.  At least, that's 
what a computer design course from 1948 states.

paul