I never understand your posts.  But I can say that I think this one has
a fatally wrong assumption underneath: that "we" can be distinguished
from "technology".  I'm pretty sure we've covered this ground as well.
I can sum it up with the aphorism:

  "The problem with communication is the illusion that it exists."

I believe that tools are part of our extended phenotype.  That implies
(I think) that a tool created by one person only translates to another
person if the other person is similar to the former ... similar
according to the measures defined by the tool's aspect.

A corollary is that if you use a dissimilar person's tool, you are
abusing that tool, or at least using it in a way the first person did
not intend it to be used.

The implications are that any attempt to "repress" technology or to
distinguish benign from destructive information is an attempt to repress
ourselves or to classify the population into two types: good vs. evil
people (which, in a way, is its own type of repression).

And I'll end with as clear a statement as I can make:  There is no
conflict between technology and any other human trait.  All our traits
are a part of the same phenotype.  They are multiple aspects of the same
thing.

Steve Smith wrote at 01/15/2013 01:19 PM:
> I think we've made the rounds here on gun control already, but the
> energy of the topic in this current moment might help to drive a larger,
> more interesting (to me) discussion about the intrinsic paradoxes of
> technology, information, freedom and responsibility.
> 
> This is not the first example of these things colliding; it is
> merely a recent and newsy one.
> 
> <screed>
> Technology provides a lever (in some cases literally, but generally in a
> metaphoric sense), amplifying force by translating a small force over a
> large distance into a larger force over a smaller distance.  Technology
> can also translate across domains, allowing energy in one domain (e.g.
> chemical) to be translated into another domain (e.g. mechanical, as in
> the internal combustion engine, explosives, firearms, etc.)...
> 
> Greek mythology gives us the tale of Prometheus, who was condemned by
> Zeus to eternal torment for having brought the gift of fire to humans.
> Fire is also a powerful metaphorical stand-in for technology in
> general.  If the Greek gods did not trust us with something as
> (relatively) mundane as fire, why in the world should we be trusted with
> everything else?
> 
> A close friend of Freedom is Individualism. Technology (whether it be a
> lever, fire, or firearm) gives the individual "leverage" with nature and
> with his fellow man.  As the old saying goes, "God made men, but Sam Colt
> made them equal". Individuals, when wielding proper leverage, feel more
> "free" from the threats of man and nature.  At least for a few moments,
> until the feedback loop of paranoia or the very real facts of an arms
> race ratchets up another notch.  On a happier note, the Sufi saying
> "most able, least threatened; least threatened, most able" suggests
> that cynicism needn't be the only perspective on this condition.
> 
> Levers (literal and metaphorical) can also support various forms of
> megalomaniacal thinking. Archimedes said: "Give me a long enough lever
> and a place to stand and I will move the Earth".  Think of the Nazi
> Wehrmacht, a "defensive might" built on the tools and the metaphors of
> the industrial age it was born amidst.  The Wehrmacht translated the
> German will to power and the German intellectual prowess into the
> domination of others (the Übermenschen over the Untermenschen) and the
> acquisition of their resources.   At one level it was a simple lever,
> but of course, at another it was a hugely complex "machine", and in
> fact, also a Complex Adaptive System, as were the Allied "machines" that
> ultimately dismantled it.
> 
> I'm not arguing that this tragic view of technology is the only way to
> live or to perceive the universe, and our place in it.  Rather, this is
> roughly the logic that has brought us to this place where our
> technologists (present company included) are often racing blindly into
> the kinds of futures often foretold in the dystopias of Orwell, Dick,
> Sterling, Gibson, et al... and our politicians and other social
> engineers are grasping to regulate first the products of technology
> (e.g. guns, cryptology, nuclear technology, biotech), and then out of
> desperation (or righteous ignorance) the knowledge that it is built upon.
> 
> While we've encountered this problem before, the current era may be a
> unique example, one that could yield a qualitative change in the pattern.
> Governments (or other ruling groups) have tried to repress knowledge with
> varying degrees of success throughout history.  Today it might be a 3D
> printer and high-capacity clips, or even small-caliber handguns,
> tomorrow it may be a desktop nucleic-acid printer and the sequence for
> the 1918 influenza, weaponized anthrax, ebola or worse.   The day after
> tomorrow it may be a universal molecular replicator and the threat of
> "grey goo".
> 
> Kotler/Diamandis' "Abundance" tells a happier story.  I want to believe it.
> 
> I'm convinced that we cannot repress the advance of technology.  I'm
> convinced that we cannot distinguish, much less repress, benign vs.
> devastatingly destructive information.   We've shown that few of us have
> the self-knowledge to self-regulate around this kind of power.   Perhaps
> the examples of Nelson Mandela or Mohandas Gandhi indicate a possibility
> that we could.
> 
> Gautama Buddha, Jesus Christ, and Muhammad ibn ʿAbd Allāh (and
> countless others who did not make it "above the fold") each brought us
> messages of peace and love, but for the most part, I'd say their
> teachings didn't take.  I'm not waiting for another prophet to bring us
> the answer.
> 
> I, for one, keep waking from my pop-culture-soaked, consumerist-driven
> dreams of comfort and entertainment to find my lead foot pressing
> heavily on the accelerator pedal, increasing my personal velocity even
> as my headlights (or is that my vision) get dimmer.
> 
> I cannot restrict my questioning of technology to weapons.
> I cannot restrict the making of rules and their enforcement to my own
> worst fears.
> I cannot restrict some knowledge without risking restricting all.
> 
> I know there to be people here who have grappled with this both on a
> personal level and within the scholastic or intellectual sphere.  If
> this is not a supremely hard problem, it is probably a supremely subtle one.
> 
> Scissors, Paper, Stone.
> 
> </screed>


-- 
glen

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
