Cool, but ... I maintain that none of this is about consciousness. Knowledge
representation, abstraction and compression via symbolism, communication of
structure, common protocols and standards for describing physical phenomena ...
these are all intelligence tasks. Specifically, the
John
I know you know this, so perhaps you're clickbaiting me. :-)
Digital devices do not pass the qualia test. Therefore, the example is invalid.
However, it has relevance for this debate.
As a thought experiment, perhaps try an example of the interaction between
yourself, your PC's
On Wednesday, August 28, 2019, at 6:49 PM, WriterOfMinds wrote:
> Great, seems like we've reached agreement on something.
> When we communicate with words like "red," we're really communicating about
> the frequency of light. I would argue that we are not communicating our
> qualia to each
On Wednesday, August 28, 2019, at 5:09 PM, WriterOfMinds wrote:
> People can only communicate their conscious experiences by analogy. When you
> say "I'm in pain," you're not actually describing your experience; you're
> encouraging me to remember how I felt the last time *I* was in pain, and to
All my work, images, notes, algorithms, a big movie, and my future research. I
am planning to release it all to the AI community soon. I am doing this
because there is no actual monetary gain from owning or patenting AGI, or any
other piece of knowledge, because of how fast evolution will move
People can only communicate their conscious experiences by analogy. When you
say "I'm in pain," you're not actually describing your experience; you're
encouraging me to remember how I felt the last time *I* was in pain, and to
assume you feel the same way. We have no way of really knowing
On Wednesday, August 28, 2019, at 4:07 PM, WriterOfMinds wrote:
> Are you sure you wouldn't be better served by calling your ideas some other
> names than "consciousness" and "qualia," then? We're all getting "hung-up
> on" the concepts that those terms actually refer to.
Good question.
On Wednesday, August 28, 2019, at 3:35 PM, Secretary of Trades wrote:
> https://philpapers.org/archive/CHATMO-32.pdf#page=50
Blah blah blah.
From an AGI perspective we are interested in the multi-agent computational
advantages in distributed systems that consciousness (or by other names)
https://philpapers.org/archive/CHATMO-32.pdf#page=50
On 28.08.2019 22:19, Secretary of Trades wrote:
clrscr();
On 28.08.2019 16:03, johnr...@polyplexic.com wrote:
On Wednesday, August 28, 2019, at 8:44 AM, WriterOfMinds wrote:
That is not what qualia are. Qualia are incommunicable and private.
As Matt would say:
printf("Ouch!\n");
John
Artificial General Intelligence List
Does a quale always have to pass the "qualitative character of sensation" test?
Any generalized system relying on random, subjective input values of qualia
would give rise to the systems constraint of ambiguity. Therefore, as a policy,
all subjectively-derived data would introduce a
On Monday, August 26, 2019, at 5:25 PM, WriterOfMinds wrote:
> "What it feels like to think" or "the sum of all a being's qualia" can be
> called phenomenal consciousness. I don't think this type of consciousness is
> either necessary or sufficient for AGI. If you have an explicit goal of
>
This attention system is also at work in cells, in DNA translation
encoding/decoding, in the repair of errors, and in signaling the growth and
specialization of new cell types. It works on all levels, cities to cells,
even wound healing, because it is information repair and Generative
If you shake a bucket of rocks, they will settle and take up less volume; you
let the matter do the calculations on its own. This is indeed similar to the
Self-Attention system in the Transformer architecture; let me get a picture:
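For anyone who hasn't seen it, the self-attention being referred to can be
sketched in a few lines of NumPy. This is a minimal scaled dot-product
self-attention computation; the matrix sizes and random projection weights are
illustrative assumptions, not anything from this thread:

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence x of shape (n, d)."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v              # project each token
    scores = q @ k.T / np.sqrt(k.shape[-1])          # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over each row
    return weights @ v                               # mix values by attention

rng = np.random.default_rng(0)
n, d = 4, 8                                          # 4 tokens, 8 dims (made up)
x = rng.standard_normal((n, d))
w = [rng.standard_normal((d, d)) for _ in range(3)]
out = self_attention(x, *w)
print(out.shape)  # (4, 8): each token becomes a weighted mix of all tokens
```

Every output row is a convex combination of all the input tokens, which is the
sense in which the tokens "settle" into place together rather than being
processed one at a time.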