I see something deeper in what Zed is saying. 

My first really strong experiences with programming came from the 
data-structures world in the late 80s at the University of Waterloo. There was 
an implicit view that one could decompose all problems into data-structures 
(and a few algorithms and a little bit of glue). My sense at the time was that 
the newly emerging concepts of OO were a way of entrenching this philosophy 
directly into the programming languages.

When applied to tasks like building window systems, OO is an incredibly 
powerful approach. When what one sees on the screen matches the objects 
built behind it, there is a strong one-to-one mapping that lets the 
programmer diagnose problems at a speed that just wasn't possible before.
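
That one-to-one mapping might be sketched like this (a minimal illustration; the class and widget names are mine, not anything from the original discussion): every visible thing on the screen corresponds to exactly one object, so a misdrawn button points straight at its Button instance.

```python
# Hypothetical widget sketch: each on-screen element is exactly one object,
# so diagnosing a rendering problem means inspecting exactly one object.

class Widget:
    def __init__(self, x, y, width, height):
        self.x, self.y = x, y
        self.width, self.height = width, height

    def describe(self):
        # Report what this object claims about its on-screen state.
        return f"{type(self).__name__} at ({self.x}, {self.y})"

class Button(Widget):
    def __init__(self, x, y, width, height, label):
        super().__init__(x, y, width, height)
        self.label = label

class Window(Widget):
    def __init__(self, x, y, width, height):
        super().__init__(x, y, width, height)
        self.children = []

    def add(self, child):
        self.children.append(child)

# If the "OK" button renders in the wrong place, we know exactly
# which object to interrogate.
win = Window(0, 0, 640, 480)
ok = Button(10, 10, 80, 24, "OK")
win.add(ok)
print(ok.describe())  # Button at (10, 10)
```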

But for many of the things that I've built in the back-end, I find that OO 
causes me to jump through what I think are artificial hoops. Over the years 
I've spent a lot of time pondering why. My underlying sense is that there are 
some fundamental dualities in computational machines: static vs. dynamic, data 
vs. code, nouns vs. verbs, location vs. time. It is possible, of course, to 
'cast' one onto the other; there are plenty of examples of 'jumping', 
particularly in languages, with respect to nouns and verbs. But I think that 
decompositions become 'easier' for us to understand when we partition them 
along the 'natural' lines of what they are underneath.

My thinking some time ago, as it applies to OO, is that the fundamental 
primitive, an object, essentially mixes its metaphors (sort of). That is, it 
contains both code and data. I think it's this relatively simple point that 
underlies the problems people have in grokking OO. What I've also found is 
that this mixing wasn't there in that earlier philosophy at Waterloo. Sure, 
there were atomic primitives attached to each data-structure, but the way we 
built heavy-duty mechanics was more often to push the 'actions' into something 
like an intermediary data-structure and then do a clean, simple traversal to 
actuate it (like Lisp), so fundamentally the static/dynamic duality was 
daintily skipped over.
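
That style might look something like the following sketch (the operation names and the stack-machine framing are my own, chosen only to illustrate the shape): the 'actions' live as plain data in an intermediary structure, and a single simple traversal actuates them, keeping data and code apart rather than bundling them into objects.

```python
# The 'actions' are pure data: a list of (operation-name, argument) pairs.
# No behavior is attached to the data itself.
plan = [
    ("push", 2),
    ("push", 3),
    ("add", None),
    ("push", 4),
    ("mul", None),
]

def run(plan):
    # The code lives in one place: a table mapping names to functions.
    stack = []
    ops = {
        "push": lambda arg: stack.append(arg),
        "add": lambda _: stack.append(stack.pop() + stack.pop()),
        "mul": lambda _: stack.append(stack.pop() * stack.pop()),
    }
    # The clean, simple traversal that actuates the intermediary structure.
    for name, arg in plan:
        ops[name](arg)
    return stack.pop()

print(run(plan))  # (2 + 3) * 4 = 20
```

The data can be built, inspected, stored, or transformed without ever running it; the traversal is the only moment the static and dynamic sides meet.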

It is more than obvious that OO opened the door to massive systems. 
Theoretically they were possible before, but OO gave us a way to manage the 
complexity of these beasts. Still, like all technologies, it comes with a 
built-in 'threshold' that imposes a limit on what we can build. If we are to 
exceed that, then I think we are in the hunt for the next philosophy, and as 
Zed points out, the ramifications of finding it will cause yet another 
technological wave to overtake the last one.

Just my thoughts.


Paul.  
_______________________________________________
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc
