Marcus wrote:
> Phil Henshaw wrote:
> > Growth taken to its absolute limit always leads to an absolutely
> > impenetrable wall of complexity, at which point turbulence
> > or its equivalents interrupt the whole process. I don't think we
> > want to do that.
>
> To control a system a regulator must be able to absorb and respond to
> at least as much information as the regulated entity can produce.

..well, unless you rely on how a system 'out of control' will take care of itself, like Tom Sawyer, or like cities or markets designed around infrastructure that defines a fair playing field and lets people do what they like, out of control. That's what free markets are all about, after all: fair exchange where the participants are completely free to do what they like with what they take away from it.

> To reduce forms of company-internal entropy, large companies tend to
> spin off successful and unsuccessful business units. To reduce forms
> of external entropy, we also see big companies buy smaller companies
> simply to nip potential competition in the bud. The need for control
> is built in and forces companies away from overly complicated
> decision making. The need for control by government is also present,
> and one form it takes is antitrust law.

How businesses actually operate certainly provides great evidence of how complex systems work. One of the things I find interesting is the role of an organization's central structure in providing a path of least resistance for things that are essentially out of control. You have great people? The whole job is getting out of their way and letting them work. When government does that for society at large, sometimes it means fostering basic scientific research, and sometimes it means removing barriers so the powerful can do whatever they want with the rest of us. Some good, some bad.

> Given these forces, there is a push away from the absolute limit. And
> provided there is room for the participants and the raw materials
> that make them go, it's not clear to me why this kind of system
> couldn't expand, and indefinitely. It is a `small' matter of
> technology. By genetically engineering more energy-efficient food or
> people, spreading to other planets, etc., the problems of
> sustainability could be addressed.

Well, sure, things veer away from absolute limits or they don't survive, and the world seems mostly made up of things that have in some way or another passed that test. An analogy I've been thinking about is that walking is understood as 'organized falling': you take a step because it catches you from falling flat on your face, but it's just shy of catching you entirely, so you shortly need to take another step to keep from falling flat on your face again, and so on.
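For fun, the falling-and-catching picture can be put into a toy difference equation (entirely my own illustrative sketch, not a biomechanical model; the `catch` parameter and the numbers are made up): each stride, falling forward adds some speed and the step absorbs a fraction of it. Under-catching steps let speed build toward a fast terminal pace; raising the catch fraction steers you back down to a slower one.

```python
def stride_speeds(catch, fall_gain=1.0, steps=30, v0=0.0):
    """Forward speed after each stride in a toy 'organized falling' model.

    catch     -- fraction of current speed each step removes (0 < catch <= 1)
    fall_gain -- speed added by falling forward between steps
    v0        -- starting speed
    """
    v, out = v0, []
    for _ in range(steps):
        v += fall_gain      # fall forward: pick up speed
        v -= catch * v      # the step catches part of the fall
        out.append(v)
    return out

# Steps just a little 'less successful' (small catch): speed climbs
# toward the terminal value fall_gain * (1 - catch) / catch.
sprint = stride_speeds(catch=0.2)

# Steering: take steps *more* than sufficient for the current speed
# (larger catch) and the same dynamics ease you back to a jog.
jog = stride_speeds(catch=0.5, v0=sprint[-1])
```

The point the toy model makes is the one in the text: the speeding up and the slowing down are the same mechanism, differing only in how completely each step catches the fall.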
If you decide to pick up the speed to a jog or a sprint, it's a simple matter of having your steps be just a little less successful at keeping you upright; as your steps lag in the task of preventing you from falling, you accelerate, moving faster and faster. At some point before you're exhausted it's good to take steps that are *more* than sufficient to stabilize your fall, either to pace yourself by easing back to a jog, or simply to avoid running smack into things as you arrive at your door. That is steering, and a plan to fall forward ever faster forever, trusting your legs to magically keep up somehow, is dreaming, not steering.

I think there is good reason to suspect that any plan for me and my descendants to continually multiply the complexity of our tasks forever is faulty. I notice that you use as your assurance, as is rather common, the phrase "it's not clear to me why this kind of system couldn't expand, and indefinitely". Isn't it odd the way we establish long-range plans on a test showing an absence of clarity? I find it hard to know when to use that test, though it pays off SO well sometimes, but I hope to have the sense to just say "I don't know.." when that's the case. That seems like a much stronger and more versatile general observation.

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org
============================================================