On Sunday, June 16, 2024, at 6:49 PM, Matt Mahoney wrote:
> Any LLM that passes the Turing test is conscious as far as you can tell, as
> long as you assume that humans are conscious too. But this proves that there
> is nothing more to consciousness than text prediction. Good prediction
> requires ...
It is an interesting paper. But even though it references Tononi's
integrated information theory, I don't think it says anything about
consciousness. It is just the name they gave to part of their model. They
refer to a "consciousness vector" as the concatenation of vectors
representing perceptions.
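
To make concrete what I mean, here is a minimal sketch of that construction
(my own illustration in Python/NumPy; the modality names and dimensions are
assumptions, not taken from the paper):

    import numpy as np

    # Hypothetical per-modality perception embeddings; names and sizes
    # here are illustrative only, not the paper's.
    vision = np.random.rand(64)    # e.g. an image-encoder output
    audio = np.random.rand(32)     # e.g. an audio-encoder output
    text = np.random.rand(128)     # e.g. an LLM text embedding

    # The "consciousness vector" is then just these vectors concatenated.
    consciousness_vector = np.concatenate([vision, audio, text])
    print(consciousness_vector.shape)  # (224,)

Nothing about that construction carries any claim about consciousness; it is
an ordinary feature vector with a suggestive name.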
> For those of us pursuing consciousness-based AGI this is an interesting paper
> that gets more practical... LLM agent based but still v. interesting:
>
> https://arxiv.org/abs/2403.20097
I meant to say that this is an exceptionally well-written paper just teeming
with insightful research on the topic.