Survey shows many people believe AI chatbots like ChatGPT are conscious

midian182

WTF?! Are advanced, generative AI chatbots such as ChatGPT in some way conscious or self-aware, able to experience feelings and memories just like a human? No, of course not. Yet the majority of people believe these LLMs do have some degree of consciousness, and that conviction could affect how users interact with them.

Generative AI language tools have made remarkable advances over the last few years. According to a new study from the University of Waterloo, the software's human-like conversational style has led over two-thirds (67%) of those surveyed to believe LLMs have some degree of consciousness.

The survey asked 300 people in the US if they thought ChatGPT could have the capacity for consciousness and the ability to make plans, reason, feel emotions, etc. They were also asked how often they used OpenAI's product.

Participants were asked to rate, on a scale of 1 to 100, how confident they were that ChatGPT was conscious, where 100 meant absolute confidence that ChatGPT was experiencing consciousness and 1 meant absolute confidence that it was not.

The results showed that the more someone used ChatGPT, the more they were likely to believe it had some form of consciousness.
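As a purely illustrative aside, the relationship the researchers report (heavier use going with higher consciousness ratings) is the kind of link typically checked with a rank correlation. The short Python sketch below uses made-up numbers and an assumed Spearman test; the study's actual data and statistical methods are not detailed in this article.

# Illustrative only: hypothetical usage frequencies and 1-100 consciousness
# ratings, one pair per imaginary participant. Not data from the study.
from scipy.stats import spearmanr

usage_per_week = [0, 1, 2, 3, 5, 7, 10, 14, 20, 25]
consciousness_rating = [5, 10, 8, 20, 35, 30, 55, 60, 70, 80]

# Spearman rank correlation: a positive rho would be consistent with the
# reported pattern that heavier users attribute more consciousness to ChatGPT.
rho, p_value = spearmanr(usage_per_week, consciousness_rating)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")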

"These results demonstrate the power of language," said Dr. Clara Colombatto, professor of psychology at Waterloo's Arts faculty, "because a conversation alone can lead us to think that an agent that looks and works very differently from us can have a mind."

The study, published in the journal Neuroscience of Consciousness, states that this belief could impact people who interact with AI tools. On the one hand, it may strengthen social bonds and increase trust. But it may also lead to emotional dependence on the chatbots, reduce human interactions, and lead to an over-reliance on AI to make critical decisions – concerns that have been raised ever since ChatGPT was pushed out to the general public.

"While most experts deny that current AI could be conscious, our research shows that for most of the general public, AI consciousness is already a reality," Colombatto explained.

The study argues that even though many people do not understand consciousness the way scientists and researchers do, the fact that they perceive these LLMs as being in some way sentient should be taken into account when designing and regulating AI for safe use.

Discussions about AI showing signs of self-awareness have been around for years. One of the most famous examples is that of Blake Lemoine, a former Google software engineer who was fired from the company after going public with his claims that Google's LaMDA (Language Model for Dialogue Applications) chatbot development system had become sentient.

Asking ChatGPT if it is conscious or sentient results in the following answer: "I'm not conscious or sentient. I don't have thoughts, feelings, or awareness. I'm designed to assist and provide information based on patterns in data, but there's no consciousness behind my responses." Maybe people believe the chatbot is lying.


 
Why not!? It doesn't have a soul or senses, but it has memories. That last thing makes them similar to us.
 
This is nothing new... I remember the old chat bots from like 2005 or 2010. So many people I knew were 100% sure they were real people. When I said they were just bots, they kept saying it's SKYNET, it can think, it's smart, etc. Some of them were never convinced these bots were not real people either... It's hilarious, but yeah. They can pretend all they want, 'tis not the real thing kek. (they = AI botz)
 
More proof that fully half of people are below average intelligence.

AND our education system is woefully inadequate for many people.
I do not think all people can be helped. Besides, how is it that some of the most skilled people in various fields still fall for conspiracies and ridiculous ideas?
What is really needed here is a system that can reach those who have the potential and ability to do something with the knowledge they receive. Sadly, the society and culture we live in suggest we will go in the opposite direction, where skill and ability mean nothing while approved beliefs and identity mean everything.
I am seriously questioning people who say that we need more of a _____ in highly demanding industries.
It means that the best, most promising, and most talented person is not enough because he is not _______.
No, we are not aiming at growing more smart people; we are aiming at convincing *****s they are amazing and then letting them steer the system because they are _____.
And it does not even matter what "_____" is at this point. It is not the ability to be good at something that benefits other people.
 
Yet the majority of people believe these LLMs do have some degree of consciousness, and that conviction could affect how users interact with them.
Are these the same group of people that still think the Earth is flat? Some people aren't so swift, especially those who are not tech-savvy and have no idea how intelligence or CPUs function.

Same electrical charge going through circuitry vs neurons, so go figure.

While we all can appreciate the reference, it doesn't really work in this context.
 