I agree that it's applicable to any imaginable goal; this is the usual
prisoner's dilemma. What I'm not sure about is precisely how we could all
agree on some collective goal.

The pre-democracy solution was to enforce allegiance to some set of
religious beliefs. You accept these goals if you don't want to spend
eternity in torment, case closed. Democracy, at face value, asks the majority
what they want. If you're in the minority, tough shit. At least there is a
rational explanation for your suffering. Democracy in reality does no such
thing, but that's another story.

I can't take you seriously until you tell me how to agree on the goal.

On Tue, Nov 25, 2014 at 10:37 PM, Alberto G. Corona <agocor...@gmail.com>
wrote:

> I haven't thought it through very much. But it is applicable to any imaginable
> goal, since certain self-interests will go against any collective interest that
> we can imagine. And the best way to advance self-interest is indeed to use
> ideology to hide deleterious self-interests behind some collective interest,
> true or false, good or bad, already existing or promoted as such.
>
>
>
>
>
> 2014-11-24 13:01 GMT+01:00 Telmo Menezes <te...@telmomenezes.com>:
>
>> Hi Alberto,
>>
>> You talk of "advancement of society", so this implies some collective
>> goal. What is the goal, in your view?
>>
>> Cheers
>> Telmo.
>>
>>
>> On Mon, Nov 24, 2014 at 9:14 AM, Alberto G. Corona <agocor...@gmail.com>
>> wrote:
>>
>>> I laugh at the anthropological optimists who are confident that humans
>>> will be like gods, for the same reason that I laugh loudly at cybernetic
>>> optimists.
>>>
>>> Most of the effort of intelligent people is devoted to lying to themselves
>>> and to others in order to gain power and "enslave" people, or at least to
>>> seduce others with their sophisticated lies. The more intelligence, the more
>>> chance for creation and destruction. On average, intelligence alone
>>> contributes zero to the advancement of society and thus contributes nothing
>>> to the advancement of anything. It is often the case that dumb people are
>>> wiser than intelligent people from Harvard or Yale, saturated with ideology
>>> (self-profitable ideology, I might say).
>>>
>>> Having intelligent machines, whether autonomous or not, doesn't change the
>>> fact that they can be used for good or for evil, contributing nothing, not
>>> even to the progress of machines.
>>>
>>> 2014-11-24 7:45 GMT+01:00 John Clark <johnkcl...@gmail.com>:
>>>
>>>>
>>>> > A.I. is no closer than it was 20 or 30 or 40 years ago.
>>>>>
>>>>
>>>> Of one thing I am certain: someday computers will become more
>>>> intelligent than any human who ever lived, by any measure of intelligence
>>>> you care to name. And I am even more certain that we are 20 years closer to
>>>> that day than we were 20 years ago.
>>>>
>>>> > But what is new and big is Big Data. Big Data does not involve A.I.
>>>>> theories or efforts; it's about taking very large sets of paired data and
>>>>> converging by some basic rule to a single result. This is how translation
>>>>> services work.
>>>>>
>>>>
>>>> Well... Big Data computers are artificial and good translation requires
>>>> intelligence, so why in the world isn't that AI?
>>>>
>>>> > Big Data does not involve theories of A.I
>>>>>
>>>>
>>>> I think it very unlikely that the secret to intelligence is some grand
>>>> equation you could put on a T-shirt; it's probably 1001 little hacks and
>>>> kludges that all add up to something big.
>>>>
>>>> >  It's very large sets of translations of sentences, and sentence
>>>>> components, simply rehashed for best fit
>>>>>
>>>>
>>>> Simply? Is convoluted better than simple?  Are you saying that if we
>>>> can explain how it works then it can't be intelligent?
>>>>
>>>> > It actually works fairly adequately for most translation needs. Which
>>>>> would be great, except this: The Big Data system is not independent at any
>>>>> point. Every day there needs to be a huge scrape of the translations
>>>>> performed by human translators.
>>>>>
>>>>
>>>> And human beings move from being mediocre translators to being very
>>>> good translators by observing how great translators do it.
>>>>
>>>> > Human translation professions are in a state of freefall. There used
>>>>> to be a career structure with rising income and security and status. Now
>>>>> there isn't.
>>>>>
>>>>
>>>> Translation certainly won't be the last profession where machines
>>>> become better at their job than any human; and I predict that the next
>>>> time it happens somebody will try to find an excuse for it, just like you
>>>> did, and say "Yes, a machine is a better poet or surgeon or joke writer or
>>>> physicist than I am, but it doesn't really count because (insert lame
>>>> excuse here)".
>>>>
>>>>   John K Clark
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>>
>>> --
>>> Alberto.
>>>
>>
>
>
>
> --
> Alberto.
>

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to everything-list+unsubscr...@googlegroups.com.
To post to this group, send email to everything-list@googlegroups.com.
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.
