You’ve likely heard of conscious thought and subconscious thought, but humans may in fact possess three levels of consciousness, a new review suggests — and this concept could help scientists develop truly conscious artificial intelligence (AI) someday.
Though AI technology has been advancing at a rapid clip, in many ways, computers still fall short of human performance.
“Human consciousness is not just about recognizing patterns and crunching numbers quickly,” said review co-author Hakwan Lau, a neuroscientist at the University of California, Los Angeles. “Figuring out how to bridge the gap between human and artificial intelligence would be the holy grail.”
To address the controversial question of whether computers may ever develop consciousness, the researchers first sought to explore how consciousness arises in the human brain. In doing so, they outlined three key levels of consciousness.
These three levels could serve as a road map for designing truly conscious AI. “If you want to make your robots conscious, this is what we suggest you think about,” Lau told Live Science.
The first level, C0, refers to the unconscious operations that take place in the human brain, such as face and speech recognition, according to the review. Most of the calculations done by the human brain take place at this level, the researchers said — in other words, people aren’t aware of these calculations taking place.
Despite recent advances in AI technology, machines are still mostly functioning at this level of consciousness, the researchers said.
For example, AI systems known as “convolutional neural networks” can now carry out many human C0 computations, including facial recognition.
The next level of consciousness, C1, involves the ability to make decisions after drawing upon a vast repertoire of thoughts and considering multiple possibilities. The researchers suggested that this ability for a thought, or train of thoughts, to temporarily dominate the mind evolved to help guide a broad variety of behaviors.
C1 is seen in human infants as well as in animals. For instance, the scientists noted that thirsty elephants know how to locate and move straight toward the nearest water hole, even if it is 30 miles (50 kilometers) away. Such decision making requires a sophisticated architecture of neural circuits to pool together information from the environment and from memory, select the best choice from a set of available options, stick to this decision over time and coordinate a variety of operations, such as navigating over terrain to achieve that goal.
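The selection loop described above — pool information from the environment and from memory, commit to the best available option, and hold that commitment while sub-steps execute — can be sketched in a few lines. All of the place names, distances, and the inverse-distance scoring rule below are invented for illustration; they are not from the review.

```python
# Hedged sketch of a C1-style decision loop: combine memory with current
# environmental evidence, select the best option, and stick to it.

def choose_goal(environment, memory):
    """Score each remembered option against the environment; return the best."""
    scores = {}
    for place, distance in memory.items():
        # Closer water that current evidence confirms scores higher.
        available = environment.get(place, False)
        scores[place] = (1.0 / distance) if available else 0.0
    return max(scores, key=scores.get)

# Remembered water holes and their distances in km (hypothetical values).
memory = {"north_hole": 50, "east_hole": 12, "dry_hole": 5}
# Current environmental evidence: which holes still hold water.
environment = {"north_hole": True, "east_hole": True, "dry_hole": False}

goal = choose_goal(environment, memory)
print(goal)  # -> east_hole

# "Sticking to the decision": the goal stays fixed while sub-steps run.
for step in ["orient", "walk", "drink"]:
    print(f"{step} -> {goal}")
```

Note that the nearest hole in memory (dry_hole) loses to a farther one once environmental evidence is pooled in — a crude stand-in for the integration of memory and perception the researchers describe.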
In humans and other primates, the prefrontal cortex of the brain serves as a central hub for information processing, where many of the actions described in C1 consciousness take place. By analyzing the neural circuits in this part of the brain, scientists could derive the computational principles underlying their operation “and code them into computers,” Lau said.
The final level, C2, involves “metacognition,” or the ability to monitor one’s own thoughts and computations — in other words, the ability to be self-aware. Level C2 consciousness results in subjective feelings of certainty or error, which help people realize mistakes and correct them. Self-awareness also helps people figure out what they know and do not know, leading to curiosity, a mechanism that drives people to find out more about what they know little or nothing about.
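A minimal sketch of C2-style metacognition might look like the following: a toy agent that not only answers questions but also tracks how confident it is, flags what it doesn't know, and queues those gaps for later exploration (a crude "curiosity" mechanism). The facts, confidence values, and threshold are all invented for illustration.

```python
# Toy sketch of C2-style metacognition: an agent that monitors its own
# certainty and knows when it doesn't know.

class SelfMonitoringAgent:
    def __init__(self, facts, threshold=0.5):
        self.facts = facts          # {question: (answer, confidence)}
        self.threshold = threshold  # below this, the agent reports uncertainty
        self.to_explore = []        # "curiosity" queue of known unknowns

    def answer(self, question):
        answer, confidence = self.facts.get(question, (None, 0.0))
        if confidence < self.threshold:
            # Metacognitive step: the agent registers that it doesn't know.
            self.to_explore.append(question)
            return "uncertain"
        return answer

agent = SelfMonitoringAgent({
    "capital_of_france": ("Paris", 0.95),
    "weather_tomorrow": ("rain", 0.3),
})

print(agent.answer("capital_of_france"))  # -> Paris
print(agent.answer("weather_tomorrow"))   # -> uncertain
print(agent.to_explore)                   # -> ['weather_tomorrow']
```

The second-order judgment ("how sure am I?") is what separates this sketch from the C0 example: the output of the first computation becomes the input of a monitoring computation.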
The scientists noted that some robots have achieved aspects of C2, in that they can monitor their progress at learning how to solve problems. The researchers noted that magnetic resonance imaging (MRI) studies of humans link metacognition to the prefrontal cortex.
All in all, the researchers suggested that human consciousness may arise from a set of specific computations. “Once we can spell out in computational terms what the differences may be in humans between conscious and unconscious processing, coding that into computers may not be that hard,” Lau said.
The scientists detailed this research in the Oct. 27 issue of the journal Science.
Originally published on PJERA.