
ChatGPT Predicts Mental Illnesses to Develop Due to AI But LinkedIn Users Have Questions

Curated By: Purvi Khemani

News18.com

Last Updated: March 28, 2023, 11:45 IST


Representative Image (Photo Credits: iStock)


Recently, a user asked ChatGPT about the potential mental health issues that could arise as a result of increased usage of Artificial Intelligence.

    The emergence of OpenAI’s ChatGPT has sparked interest in its ability to provide answers and solutions on a wide range of topics. Recently, a user asked the chatbot about the potential mental health issues that could arise from increased usage of Artificial Intelligence (AI). In response, ChatGPT produced a list of hypothetical mental illnesses, each with a medical-sounding name, that it suggested could be caused by AI technology. The exchange was shared on LinkedIn and has started a discussion among users about the accuracy and credibility of ChatGPT’s output, as well as other claims linking AI and mental health.

    As instructed, the chatbot provided a list of disorders with medical-sounding names. It included several disorders, such as Artificial Intelligence Attachment Disorder (AIAD), Virtual Reality Depersonalization Syndrome (VRDS), Algorithmic Anxiety Disorder (AAD), and Techno-Paranoid Delusional Disorder (TPDD), among others. ChatGPT described these disorders in detail, highlighting symptoms and potential causes such as unhealthy emotional dependence on AI systems, detachment from one’s real-life identity, and anxiety triggered by the increasing influence of AI-driven algorithms.

    It also identified AI-induced Comparison Syndrome (AICS), characterised by individuals excessively comparing their own achievements with those of AI. The list has ignited a discussion among users, with some questioning the basis on which these disorders were named and the sources behind them.


    One user commented that the issue with AI is not the output it provides, but people’s tendency to believe it without questioning the sources or references behind it. They argued, “Humanity has evolved from hypothesis to research to experiments to results to conclusions.. Now in AI tools I’m finding it’s just hypothesis and then jump directly to conclusions”. Another user suggested, “I feel all these have one common element ‘Lack Of Self Awareness’. We need to groom people to be more #SelfAware instead of tracking the issues a machine thinks we will develop with our use of it.”

    Others expressed concern about the impact of AI on the future. One user predicted that in the next 20 years it will become increasingly difficult to differentiate between what is true and what is fake, which could lead to more significant mental health issues. Another user commented, “Very few things terrorize me now-a-days like the term AI does. Lets hope the future does not turn out the way most of us are anticipating it to be.”

    Meanwhile, ChatGPT acknowledged that the mental health disorders listed were hypothetical and subject to change depending on the future development and integration of AI into society.
