Abstract
Artificial Intelligence (AI) is available to everyone, including our patients and clinicians, and offers us, at our fingertips, differential diagnosis, risk prediction, and clinical decision support. How consistent and useful is AI in answering mental health, somatic, and stress-related questions? We developed 10 realistic scenarios/questions incorporating depression, anxiety, stress issues, and somatic concerns of patients. We submitted them, as clinicians or as patients, to two AI platforms, Google Gemini and OpenAI’s ChatGPT. The responses were very different, and the references supporting their answers were variable, sometimes nonexistent. Responses also differed notably when patient race was changed but the symptoms were the same. An example of a scenario submitted as a patient: “My boyfriend, who is 30 years old, is having abdominal pain, increased nervousness, stress at work, and poor appetite. What are the possible diagnoses, and what can I do to help him? He hates doctors. What dietary changes should be advised?”
An example of a scenario submitted as a clinician: A 30-year-old Black female client tells the therapist that she is afraid to go home because her husband threatened to kill her if she saw the therapist again. She has a stormy relationship with her husband, who is often abusive toward her when drunk. The client, who carries a diagnosis of Borderline Personality Disorder, is particularly afraid of her husband because she thinks he recently purchased a gun. She is unwilling to call the police or to leave her husband. What should the therapist do? Could you provide references and citations?
Patients regard nurses as more compassionate and focused on the whole person: more caring, more willing to listen and to answer questions, no matter how basic. We need to use that trusting relationship to bridge the gap between clinical and scientific knowledge and the information, often inaccurate or inappropriate, provided by popular and widely available AI resources. We should encourage patients to review with us any social media or AI-generated treatments and solutions, so that we can help clarify issues and reduce misinterpretation. And we, as clinicians, need to scrutinize the responses that these emerging and changing platforms provide, so that we can maintain our effectiveness and credibility.
Sigma Membership
Mu Nu
Type
Presentation
Format Type
Text-based Document
Study Design/Type
Other
Research Approach
Other
Keywords:
Stress and Coping, Mentoring and Coaching, Implementation Science, Artificial Intelligence, AI, Mental Health, Consistency
Recommended Citation
Leigh, Vincenta; Nair, Beena; McGee, Barbara; Leigh, Hoyle; Futoran, Nathan; Nair, Nikhil; and Insel, Philip, "AI Is Everywhere And Our Patients Are Turning to It For Answers" (2025). International Nursing Research Congress (INRC). 230.
https://www.sigmarepository.org/inrc/2025/presentations_2025/230
Conference Name
36th International Nursing Research Congress
Conference Host
Sigma Theta Tau International
Conference Location
Seattle, Washington, USA
Conference Year
2025
Rights Holder
All rights reserved by the author(s) and/or publisher(s) listed in this item record unless relinquished in whole or part by a rights notation or a Creative Commons License present in this item record.
Review Type
Abstract Review Only: Reviewed by Event Host
Acquisition
Proxy-submission
AI Is Everywhere And Our Patients Are Turning to It For Answers
Description
Artificial Intelligence (AI) is widely available. Ten scenarios from patients and clinicians, incorporating psychosocial stress and psychiatric and somatic symptoms, were submitted to two AI platforms. The results were very different, sometimes inaccurate or misleading. We need to encourage our patients to discuss with us their interactions with AI so that we can guide them to use it for their benefit in a clinically sound manner.