Marcus G. Daniels wrote at 07/26/2013 10:42 AM:
A set of people ought to be able to falsify a proposition faster than one 
person, who may be prone to deluding themselves, among other things.   This is 
the function of peer review, and arguing on mailing lists.  Identification of 
truth is something that should move slowly. I think `negotiated truth' occurs 
largely because people in organizations have different amounts of power, and 
the powerful ones may insist on something false or sub-optimal.   The weak, 
junior, and the followers are just fearful of getting swatted.

Fantastic point.  So, the (false or true) beliefs of the more powerful people 
are given more weight than the (true or false) beliefs of the less powerful.  
That would imply that the mechanism we need is a way to tie power to 
calibration, i.e. the more power you have, the smaller your error must be.
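
To make the idea concrete, here is a toy sketch (all names and numbers hypothetical) of what "tying power to calibration" might look like: each member's belief is weighted by the inverse of their mean historical error, so the better-calibrated junior member outweighs the poorly calibrated executive regardless of rank.

```python
# Toy sketch: weight each member's belief by inverse historical error,
# so better-calibrated members get more "power" in the group estimate.
# All names and numbers are hypothetical illustrations.

def calibration_weight(errors, eps=1e-6):
    """Smaller mean historical error -> larger weight."""
    mean_err = sum(errors) / len(errors)
    return 1.0 / (mean_err + eps)

def group_estimate(beliefs, histories):
    """beliefs: member -> current probability estimate.
    histories: member -> list of past absolute errors."""
    weights = {m: calibration_weight(histories[m]) for m in beliefs}
    total = sum(weights.values())
    return sum(beliefs[m] * weights[m] / total for m in beliefs)

beliefs = {"exec": 0.9, "junior": 0.4}                   # exec is overconfident
histories = {"exec": [0.5, 0.4], "junior": [0.1, 0.2]}   # junior better calibrated
print(group_estimate(beliefs, histories))                # lands nearer 0.4 than 0.9
```

Of course this presupposes exactly the objective ground discussed below, since the error histories have to be scored against something.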

If an objective ground is impossible, we still have parallax ... a kind of 
continually updating centroid, like that pursued by decision markets.  But a 
tight coupling between the most powerful and a consensual centroid would 
stultify an organization.  It would destroy the ability to find truth in 
outliers, disruptive innovation.  I suppose that can be handled by a healthy 
diversity of organizations (scale free network). But we see companies like 
Intel or Microsoft actively opposed to that... they seem to think such 
behemoths can be innovative.  So, it's not clear to me we can _design_ an 
artificial system where calibration (tight or loose) happens against a parallax 
ground for truth (including peer review or mailing lists).
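
For what a "continually updating centroid" could mean operationally, a minimal sketch (purely illustrative, not how any actual decision market is implemented): an exponentially weighted running mean that drifts toward each incoming estimate, where the rate parameter is exactly the tight-vs-loose coupling knob.

```python
# Toy sketch of a continually updating centroid: an exponentially
# weighted running mean of incoming estimates. The alpha parameter
# controls how tightly the consensus couples to new input.

class Centroid:
    def __init__(self, alpha=0.1):
        self.alpha = alpha   # coupling strength: 0 = frozen, 1 = chase every outlier
        self.value = None

    def update(self, estimate):
        if self.value is None:
            self.value = estimate
        else:
            self.value += self.alpha * (estimate - self.value)
        return self.value

c = Centroid(alpha=0.5)
for est in [0.2, 0.8, 0.6]:
    c.update(est)
```

With a small alpha the centroid stultifies (outliers barely register); with a large one it loses all memory, which is the trade-off the paragraph above is pointing at.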

It still seems we need an objective ground in order to measure belief error.  The only 
way around it is to rely on natural selection, wherein "problems" with 
organizations may well turn out to be the particular keys to their survival/success.  So,
that would fail to address the objective of this conversation, which I presume is how to 
reorganize organizations either before they die off naturally (because they cause so much harm) or
without letting them die off at all.  (Few sane people want, say, GM to die, or our 
government to shut down ... oh wait, many of our congressional reps _do_ want our govt to 
shut down.)

--
⇒⇐ glen e. p. ropella
Some of our guests are ... how shall I say? Hyperbolic V.I.P.
============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
