Quan,
Lots of words. None of which mean anything to me...
OK "soft-systems ontology" turns up something:
https://en.wikipedia.org/wiki/Soft_systems_methodology
This British guy Checkland wrote some books on management techniques.
Some kind of "seven step" process:
1) Enter situation in which
No, I don't believe I am talking about PCA. But anyway, you are unable
to demonstrate how you implement PCA or anything else, because your
algorithm is "far from complete".
You are unable to apply your conception of the problem to my simple
example of re-ordering a set.
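The "re-ordering a set" example can be made concrete. A minimal sketch (the set and sort keys below are my own illustration, not the actual example from the thread): the same underlying set admits different total orderings depending on the key used, which is the "different orderings of an entire set" being discussed.

```python
# Hypothetical items; the field names are illustrative only.
items = [{"name": "b", "size": 3}, {"name": "a", "size": 7}, {"name": "c", "size": 1}]

by_name = sorted(items, key=lambda d: d["name"])   # one ordering of the set
by_size = sorted(items, key=lambda d: d["size"])   # a different ordering of the same set

# Same underlying set, two incompatible sequences:
assert {d["name"] for d in by_name} == {d["name"] for d in by_size}
assert [d["name"] for d in by_name] != [d["name"] for d in by_size]
```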
How about PCA itself? If
Note to my own opening post: Humans can't dream up and hold an entire game in
their heads; we rely on computers to store its thousands of items perfectly:
where they were left, where they start, and how they work.
--
Artificial General Intelligence List: AGI
Permalink:
> What I mean by contradiction is different orderings of an entire set
> of data, not points of contrast within a set of data
That's not what people usually mean by contradiction, definitely not in a
general sense.
You are talking about reframing a dataset (subset) of multivariate items along
the
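Since PCA keeps coming up as the point of comparison: PCA re-expresses ("reframes") a set of multivariate items along new axes ordered by variance. A minimal sketch in plain NumPy, as one standard reading of "reframing along the principal axes"; the function name and toy data are my own illustration:

```python
import numpy as np

def pca_reframe(X, k):
    """Re-express the rows of X along their top-k principal axes."""
    Xc = X - X.mean(axis=0)              # center each variable
    cov = np.cov(Xc, rowvar=False)       # covariance between variables
    vals, vecs = np.linalg.eigh(cov)     # eigh: symmetric matrix, ascending eigenvalues
    order = np.argsort(vals)[::-1][:k]   # take the top-k directions of variance
    return Xc @ vecs[:, order]           # project: same items, new axes

# Toy example: 2-D points that mostly vary along the line y = x.
X = np.array([[1.0, 1.1], [2.0, 1.9], [3.0, 3.2], [4.0, 3.9]])
Z = pca_reframe(X, 1)  # each item summarized by one coordinate
```

Note that this changes the coordinates of the items, not their order as a set, which is one way of locating the disagreement in this thread.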
Rob. I'm referring to contextualization as general context management
within complex systems management. As an ontology. The application of which
has relevance for knowledge graphs, LLMs, and other knowledge-based
representations. Your quotation: "Contextualization ... in LLM systemic
On Sun, Jun 23, 2024 at 11:05 PM Boris Kazachenko wrote:
>
> There can be variance on any level of abstraction, be that between pixels or
> between philosophical categories. And it could be in terms of any property /
> attribute of compared elements / clusters / concepts: all these are derived
On Thursday, June 20, 2024, at 10:36 PM, immortal.discoveries wrote:
> Consciousness can be seen as goal creation/ learning/ changing. Or what you
> might be asking is to have them do long horizon tasks, and solve very tricky
> puzzles. I think all that will happen and needs to happen.
This
On Sun, Jun 23, 2024 at 1:20 AM wrote:
>
> @Matt what year do you expect AGI? (which I classify as something that
> works on AI on its own but much faster due to having many copied clones and
> being a computer)
Every time you start a ChatGPT session, it creates a new copy so your data
doesn't leak
There can be variance on any level of abstraction, be that between pixels or
between philosophical categories. And it could be in terms of any property /
attribute of compared elements / clusters / concepts: all these are derived by
lower-order comparisons. None of that falls from the sky, other
There were 5 or 6 total misinterpretations of my words in there,
Boris. Misinterpretations of my words were almost the whole content of
your argument. I'll limit myself to the most important
misinterpretation below.
On Sun, Jun 23, 2024 at 7:10 PM Boris Kazachenko wrote:
> ...
> Starting
Rob, a lot of your disagreements stem from your language-first mindset.
Which is perverse: you must agree that language is a product of a basic
cognitive ability possessed by all mammals.
Starting from your "contradiction": that's simply a linguistic equivalent of my
variance.
I have no idea