On Oct 3, 2012, at 7:53 AM, Paul Homer <paul_ho...@yahoo.ca> wrote:
> If instead, programmers just built little pieces, and it was the computer 
> itself that was responsible for assembling it all together into mega-systems, 
> then we could reach scales that are unimaginable today. To do this of course, 
> the pieces would have to be tightly organized.


The missing element is an algorithmic method for decomposing the
representation of large, distributed systems that is general enough that it
does not impose significant restrictions on operator implementation at the
level of an individual piece, restrictions that would force the programmer
to be aware of the whole-system implementation. The details of the local
implementation will matter much less, but it will also be a very different
type of interface than programmers are used to designing toward.

Programming environments improperly conflate selecting data models and
operators with selecting data structures and algorithms. Reimplementation is
common because I need sorting, maps, vectors, etc. in the abstract to build
the software, but the algorithms someone else uses to optimally implement
them for their data model may be pathological for mine. The algorithms used
to operate on a data model are not separable from the data model, yet most
programming environments treat them as though they are. Large systems are
not nicely decomposable because they impose global implementation details
that reach beyond the interfaces of the pieces.
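
To make the conflation concrete, here is a minimal sketch in C++ (the
workload and the numbers are invented). Both containers satisfy the same
abstract "map" data model, but the structure behind each one fixes the
algorithms you get, and the wrong structure is pathological for a range-scan
access pattern:

    // Hypothetical workload, for illustration only.
    #include <cstdint>
    #include <cstdio>
    #include <map>
    #include <unordered_map>

    int main() {
        std::unordered_map<std::uint64_t, double> hashed;  // O(1) point lookups, no key order
        std::map<std::uint64_t, double>           ordered; // O(log n) lookups, keeps key order

        for (std::uint64_t k = 0; k < 100000; ++k)
            hashed[k] = ordered[k] = k * 0.5;

        // Range scan over keys [10000, 20000): natural on the ordered map...
        double sum = 0.0;
        auto first = ordered.lower_bound(10000);
        auto last  = ordered.lower_bound(20000);
        for (auto it = first; it != last; ++it)
            sum += it->second;

        // ...but on the hash map the same operation degenerates into probing
        // every candidate key (or a full table scan if the keys are sparse),
        // even though both containers expose the abstract "map" interface.
        double sum2 = 0.0;
        for (std::uint64_t k = 10000; k < 20000; ++k)
            sum2 += hashed.at(k);

        std::printf("%f %f\n", sum, sum2);
        return 0;
    }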

A canonical example is the decomposition of an ad hoc relational join
operation. Common representations of the data model impose algorithms that
are pathological for this purpose. You can't get there, or even close, with
the data structures commonly used to represent data models. At root, it is a
data structure problem.
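
A rough sketch of the join case, with made-up relations and data. Holding
each relation as a plain vector of structs -- a common in-memory
representation of the data model -- admits only a nested-loop join; a hash
join requires building a different data structure around the join key
before the join can run at all:

    // Hypothetical relations and data, for illustration only.
    #include <cstdio>
    #include <string>
    #include <unordered_map>
    #include <vector>

    struct User  { int id; std::string name; };
    struct Order { int user_id; double total; };

    int main() {
        std::vector<User>  users  = {{1, "ada"}, {2, "grace"}};
        std::vector<Order> orders = {{1, 9.99}, {2, 3.50}, {1, 12.00}};

        // Nested-loop join: the only algorithm the plain vector
        // representation admits, O(|users| * |orders|).
        for (const Order& o : orders)
            for (const User& u : users)
                if (u.id == o.user_id)
                    std::printf("%s %.2f\n", u.name.c_str(), o.total);

        // Hash join: requires restructuring one relation around the join
        // key before the join itself can run in roughly linear time.
        std::unordered_map<int, const User*> by_id;
        for (const User& u : users)
            by_id[u.id] = &u;
        for (const Order& o : orders) {
            auto it = by_id.find(o.user_id);
            if (it != by_id.end())
                std::printf("%s %.2f\n", it->second->name.c_str(), o.total);
        }
        return 0;
    }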

--
J. Andrew Rogers


_______________________________________________
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc
