> I've developed Molly, a conversational agent with an inner monologue, using OpenAI's Large Language Model (LLM).
> This experiment gives the illusion of artificial consciousness and raises questions about the nature of consciousness.
>
> [...]
>
> One theory of consciousness called Attention Schema Theory, developed by neuroscientist Michael Graziano at Princeton University,
> strikes me as particularly interesting. For one thing, this theory is built around the concept of "attention", which is reminiscent
> of the famous "Attention Is All You Need" paper that triggered the recent leap in Language Model performance.
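The quoted post does not show Molly's implementation, but the "inner monologue" pattern it describes can be sketched as two-pass prompting: one private pass produces a hidden thought, and a second pass produces the visible reply conditioned on that thought. This is a minimal illustration, not the author's actual code; the LLM backend is abstracted as a plain callable, and the stub below stands in for a real chat-completion API.

```python
# Minimal sketch of the "inner monologue" pattern: the agent first
# "thinks" privately, then answers the user with that thought in context.
# The `llm` argument abstracts any chat-completion backend; `stub_llm`
# below is a placeholder, not a real API client.

def inner_monologue_reply(user_msg, llm):
    """Two-pass prompting: a hidden 'thought' pass, then a public reply."""
    thought = llm(f"Think privately about how to answer: {user_msg}")
    reply = llm(f"Given your private thought '{thought}', answer: {user_msg}")
    return thought, reply

def stub_llm(prompt):
    # Canned responses for demonstration only.
    if prompt.startswith("Think privately"):
        return "The user greets me; respond warmly."
    return "Hello! Nice to meet you."

thought, reply = inner_monologue_reply("Hi, Molly!", stub_llm)
print(thought)  # the hidden monologue, never shown to the user
print(reply)    # the visible answer
```

Keeping the thought out of the user-facing channel is what creates the "illusion" the post describes: the transcript the user sees looks deliberate because the deliberation happened off-stage.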
PS: I replicated the experiment with ChatGPT-4, with entirely comparable results, if not better.
Fabio