To the Editor: Smartphone applications (apps) have garnered public interest in psychiatry. These interventions can extend mental health services to underserved populations. Early apps helped users manage aversive emotional states such as depression and anxiety.1 The technology has since expanded to embodied platforms addressing a range of mental health problems, including social robots for autism and sex robots such as Roxxxy.2 Many mental health apps employ cognitive-behavioral therapy (CBT) and can be used alone or with professional assistance.
A recent article3 in The Primary Care Companion for CNS Disorders compared traditional CBT with application-guided therapy. Six trials were analyzed across a variety of patient populations, including individuals with anxiety, depression, acrophobia, and panic disorder. Younger patients showed greater benefit, as smartphones play a more central role in their lives than in those of older individuals. Across all age groups, however, adherence was higher when clinicians were involved to some degree.3
In the “real world,” self-guided treatments may not involve clinicians at all. While some apps are portrayed as purely educational, others are designed to develop therapeutic relationships. These relational apps have endearing names such as Woebot and Wysa and possess engaging, empathetic properties. Available 24 hours a day, these “chatbot clinicians” are designed to form an alliance with users.2 They are ready to chat about the troubles of the day, provide advice, and teach CBT strategies. With a captivating screen presence, they interact with patients to help them recognize their symptoms and build self-management skills. Nevertheless, more research is needed to establish the efficacy and safety of these virtual platforms.
Although technological advancements in this field are promising, the associated risks should not be disregarded. Long-term use of artificial intelligence (AI) interventions could lead some patients to develop clinically meaningful attachments to these apps. People may humanize chatbots, fostering an elevated level of trust that could be misused. As in human therapeutic relationships, there is a risk of transference of emotions, thoughts, and feelings onto the robot. Can the robot adequately manage the transferential aspects of the relationship?
Ethical principles must be considered when attempting to replace face-to-face mental health services with virtual apps. Although apps are accessible, issues of safety, confidentiality, and privacy must be addressed. Applications such as Woebot are available through social media platforms and are therefore connected to patients’ real names; when a third-party site is used in conjunction with a CBT app, sensitive information receives less protection.4 Conversely, Wysa allows users to remain anonymous without a third party. Clear guidelines are needed on handling confidential data collected by assistive robots. International standards for clinical trials of AI systems have been developed in Europe to promote transparent protocols.5 Unregulated growth raises concerns about the effects of these applications on vulnerable persons. In terms of beneficence, the primary advantage of apps is their potential to reach populations where mental health services are scarce. After all, it is hard to say no to therapy when it is at your fingertips.
Article Information
Published Online: August 20, 2024. https://doi.org/10.4088/PCC.24lr03754
© 2024 Physicians Postgraduate Press, Inc.
Prim Care Companion CNS Disord 2024;26(4):24lr03754
To Cite: Modesto-Lowe V, Adams S, Rossi A. Smartphone applications: therapy at your fingertips. Prim Care Companion CNS Disord. 2024;26(4):24lr03754.
Author Affiliations: Hartford Behavioral Health, Hartford, Connecticut (Modesto-Lowe); Department of Psychiatry, University of Connecticut, Farmington, Connecticut (Modesto-Lowe); School of Health Sciences, Quinnipiac University, Hamden, Connecticut (Modesto-Lowe, Adams, Rossi).
Corresponding Author: Vania Modesto-Lowe, MD, MPH, Hartford Behavioral Health, 2550 Main St, Hartford, CT 06120 ([email protected]).
Relevant Financial Relationships: None.
Funding/Support: None.
References
1. Sinha C, Meheli S, Kadaba M. Understanding digital mental health needs and usage with an artificial intelligence–led mental health app (Wysa) during the COVID-19 pandemic: retrospective analysis. JMIR Form Res. 2023;7:e41913.
2. Fiske A, Henningsen P, Buyx A. Your robot therapist will see you now: ethical implications of embodied artificial intelligence in psychiatry, psychology, and psychotherapy. J Med Internet Res. 2019;21(5):e13216.
3. Pelucio L, Quagliato LA, Nardi AE. Therapist-guided versus self-guided cognitive-behavioral therapy: a systematic review. Prim Care Companion CNS Disord. 2024;26(2):23r03566.
4. Kretzschmar K, Tyroll H, Pavarini G, et al; NeurOx Young People’s Advisory Group. Can your phone be your therapist? Young people’s ethical perspectives on the use of fully automated conversational agents (chatbots) in mental health support. Biomed Inform Insights. 2019;11:1178222619829083.
5. Ibrahim H, Liu X, Rivera SC, et al. Reporting guidelines for clinical trials of artificial intelligence interventions: the SPIRIT-AI and CONSORT-AI guidelines. Trials. 2021;22(1):11.