The Brain and Mind discussion


message 1: by Joe (new)

Joe Danielewicz | 1 comments David Chalmers is a widely acknowledged expert on AI and consciousness. He is a Professor of Philosophy and Neural Science at New York University and co-director of NYU’s Center for Mind, Brain, and Consciousness. Chalmers is famous for naming the ‘Hard Problem’ of consciousness: how do physical processes in the brain give rise to first-person conscious experience?

Chalmers admits we do not fully understand consciousness, nor is there an accepted operational definition of the phenomenon.

In his article “Could a Large Language Model Be Conscious?” (2022), Chalmers outlines the special features of consciousness that most people would agree on:
• Self-Awareness
• Sense Perception
• Affective Experience (i.e., emotional and mood-related experiences)
• Robust World Model (i.e., can generalize to new unseen data)
• Memory


message 2: by Michael (new)

Michael B. Morgan | 3 comments Joe wrote: "David Chalmers is a widely acknowledged expert on AI and consciousness. He is a Professor of Philosophy and Neural Science at New York University and co-director of NYU’s Center for Mind, Brain, and ..."

That's really interesting! Thanks, Joe :)


message 3: by Joe (new)

Joe Danielewicz | 1 comments Consciousness is linked to intention. When we are conscious, we are intentionally directing our thinking toward something. It could be an object or person in our immediate field of perception, a problem that needs solving, a memory we have deliberately called to mind, or some combination of these.

The other distinguishing feature of consciousness is self-awareness. When we are conscious, we are aware of what we are directing our thinking toward: the object of our thought. Just as important, we are also aware of ourselves. We are aware that we are directing our thinking toward something. We are self-aware or self-conscious.

AI chatbots like ChatGPT or Gemini lack self-awareness and intention. They are not conscious.


message 4: by Michael (new)

Michael B. Morgan | 3 comments Joe wrote: "AI chatbots like ChatGPT or Gemini lack self-awareness and intention. They are not conscious."

I totally agree with you, Joe, but strong AI supporters say that self-consciousness is just a matter of time. To them, a human brain and a computer are simply different hardware, so consciousness becomes software that could, in principle, run on either; it would just be a matter of finding the right algorithm. That amounts to saying that consciousness can emerge from a sufficiently complex system of algorithms.

I’m reading a book by Roger Penrose on the subject, and I find it very interesting:
The Emperor's New Mind: Concerning Computers, Minds, and the Laws of Physics


message 5: by Edson (new)

Edson Pinheiro (edsonpinheiro) | 5 comments That's indeed interesting! Thanks, Joe and Michael!

I think that the debate on whether AI can achieve consciousness hinges on our understanding of the mind, which is still incomplete. David Chalmers highlights the "Hard Problem" of consciousness, emphasizing that we don't fully understand how physical processes in the brain translate to subjective experience.

While AI systems like ChatGPT can simulate certain aspects of human thought and behavior, they lack self-awareness and intentionality, crucial components of consciousness. Strong AI proponents argue that consciousness might emerge from sufficiently advanced algorithms, suggesting that it's a matter of complexity rather than a fundamental difference in kind.

However, critics like Roger Penrose in "The Emperor's New Mind" argue that human consciousness might involve non-computable processes beyond current AI capabilities. Thus, while the prospect of conscious AI is intriguing, it remains speculative and deeply tied to unresolved questions about the nature of consciousness itself.

What do you think about it?

