Is There an Increased Risk of Cyberchondriasis Post ChatGPT Era?: A Conceptual Model With Precipitating, Predisposing, and Maintaining Factors

DOI: 10.4018/978-1-6684-9300-7.ch013

Abstract

Technological advancements have given people the ability to research health topics online and perform self-diagnosis. Cyberchondria is a clinical phenomenon in which recurrent online searches for medical information lead to heightened anxiety about one's physical well-being. ChatGPT, the AI-powered language model created by OpenAI, can produce human-like text conditioned on context and previous interactions. The use of generative AI such as ChatGPT to assess one's own health condition has therefore increased tremendously, which may result in a higher prevalence of cyberchondriasis. The objective of the current study is to understand the precipitating, predisposing, and maintaining factors involved in using ChatGPT for self-diagnosis. The study develops a conceptual model based on the psychosocial model and draws on the literature to examine its relevance.

Psychosocial Model

A conceptual systems framework such as the psychosocial model has been discussed in various research strategies. For example, Fatori et al. (2013) examined the influence of psychosocial factors on the trajectory of child/adolescent mental health problems (CAMHP) over time (Fatori et al., 2013). Cyberchondria has likewise been discussed in various psychological models. Vismara et al. (2020) presented a systematic review of the psychological models and mechanisms associated with cyberchondria (CYB), the relationships between CYB and mental disorders, and prevention and treatment strategies (Vismara et al., 2020). Zheng et al. (2021) published an important theoretical model incorporating psychosocial factors that illuminates the process of cyberchondria development among individuals who are anxious about their health (Zheng et al., 2021).

Figure 1. Psychosocial model

Precipitating Factors

Seeking medical advice online has become popular in the recent past. Therefore, a growing number of people might ask the recently hyped ChatGPT for medical information regarding their conditions, symptoms, and differential diagnoses (Mehnen et al., 2023). When individuals input their symptoms into ChatGPT and receive a list of potential conditions, they might hyper-focus on the most severe or alarming possibilities, leading to increased anxiety. People who misread or misunderstand the information presented by ChatGPT may conclude prematurely that they have a significant medical problem. The results of Johnson et al. (2023) suggest that ChatGPT provides accurate information about common cancer myths and misconceptions; however, it cannot be assumed that ChatGPT would be equally accurate in providing information about all kinds of medical myths and misconceptions (Johnson et al., 2023). Continuously seeking reassurance from ChatGPT about health concerns can create a cycle of increasing anxiety as individuals become more fixated on their symptoms. Shahsavar and Choudhury suggest improving the technology to ensure AI chatbots' safe and responsible use in health care (Shahsavar & Choudhury, 2023). Engaging excessively with ChatGPT or other health-related websites over a short period can lead to an information overload that triggers heightened anxiety. A study published in 2021 supports a positive relationship between compulsive health-related internet use and cyberchondria (Khazaal et al., 2021). Moreover, if ChatGPT responds to a user's input with unexpected or distressing information, it can catch the individual off guard and contribute to anxiety. A study by McMullan et al. showed positive correlations between health anxiety and online health information seeking, and between health anxiety and cyberchondria (McMullan et al., 2019).
ChatGPT lacks the ability to provide emotional support or empathy, which can leave individuals feeling isolated and anxious when discussing their health concerns (Zhao et al., 2023; Arslan, 2023). People may become more worried or anxious while using ChatGPT for their health condition as they try to interpret its replies, since these can be vague, lack context, or offer only limited engagement (Frosolini et al., 2023).
