Commentary June 22, 2016

To Use or Not? Evaluating ASPECTS of Smartphone Apps and Mobile Technology for Clinical Care in Psychiatry

John B. Torous, MD; Steven R. Chan, MD, MBA; Peter M. Yellowlees, MD, MBBS; Robert Boland, MD

J Clin Psychiatry 2016;77(6):e734-e738

ABSTRACT

In this commentary, we discuss smartphone apps for psychiatry and the lack of resources to assist clinicians in evaluating the utility, safety, and efficacy of apps. Evaluating an app requires new considerations that are beyond those employed in evaluating a medication or typical clinical intervention. Based on our software engineering, informatics, and clinical knowledge and experiences, we propose an evaluation framework, "ASPECTS," to spark discussion about apps and aid clinicians in determining whether an app is Actionable, Secure, Professional, Evidence-based, Customizable, and TranSparent. Clinicians who use the ASPECTS guide will be more informed and able to make more thorough evaluations of apps.


dx.doi.org/10.4088/JCP.15com10619

aDepartment of Psychiatry, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, Massachusetts

bDepartment of Psychiatry, Brigham and Women’s Hospital, Harvard Medical School, Boston, Massachusetts

cDepartment of Psychiatry and Behavioral Sciences, University of California, Davis, School of Medicine, Sacramento

*Corresponding author: John B. Torous, MD, 221 Longwood Ave, Boston, MA 02115 ([email protected]).

The expanding use of mobile health (mHealth) technologies is unprecedented in the history of medicine. Every month, companies and researchers release new smartphone apps, smartwatches, and sensor technologies for the health care market. Psychiatry has been no exception to this trend. There has also been growing patient, clinician, government, and payor interest in the potential of mHealth technologies for psychiatric clinical care. Psychiatrists, clinical psychologists, psychotherapists, and other mental health clinicians are increasingly faced with questions regarding the efficacy and risks of these technologies, typically presented as mobile apps, underscoring the importance of adopting a methodology to evaluate them.

Mobile phones and apps, with their exponential growth, represent the most rapidly adopted technology in human history.1 Recent estimates report that over 165,000 health care apps are now directly available to patients.2 While there is no accurate count of the number of psychiatry-related apps, the same study2 suggests that mental health and behavioral disorders are the largest group of apps for a specific disease state, larger than cardiology, cancer, endocrine, or musculoskeletal.2 Research has also shown that mental health outpatients increasingly own smartphones that can run these apps and are interested in using apps in their clinical care.3,4 Today’s psychiatric patient is well connected in this digital world. For example, those with schizophrenia not only own phones5 but also have no trouble using smartphone apps related to their condition.6

Yet, while patients have access to an exponentially increasing number of apps, the research literature has not kept pace. A recent literature review of smartphone apps for bipolar disorder and major depressive disorder found only 14 articles, with the majority being pilot feasibility studies.7 A systematic review of the literature on smartphone apps for schizophrenia found only 7 published studies.6 These studies primarily reported pilot and feasibility data, with little efficacy, safety, or clinical outcome data in the published literature, across all psychiatric diagnostic categories and disease states. This lack of data has not held back the high level of industry and consumer interest.

Without research to guide clinical practice in mobile health technologies like smartphone apps, psychiatrists may look to professional organizations, regulatory bodies, research literature, and other third parties offering app reviews. Yet, currently, professional bodies—such as the American Psychiatric Association (APA)—offer little guidance, although an APA workgroup on smartphone app evaluation hopes to offer such guidance in the near future. The US Food and Drug Administration (FDA) has announced that it does not intend to regulate apps that appear to be of low risk and do not transform a smartphone into a medical device or perform patient-specific analyses, diagnoses, or treatments.8 It is important to note that the concept of patient risk in psychiatry is often dynamic, and consequently it is hard to predict the risk, or the unintended consequences, that an app could pose to a patient. Regardless, clinicians will not find actionable information for evaluating apps from the FDA. Organizations like the UK National Health Service (NHS), the Anxiety and Depression Association of America, and other third parties have recently attempted to professionally evaluate and rate health care apps, although the utility and validity of such recommendations remain uncertain.9 The recent closure of the British NHS app rating website after a study revealed serious flaws in the security and privacy of many vetted apps10 is further evidence of the difficulty in curating apps.

The landscape is further complicated by the fact that there are several broad categories of health care apps that are very different in scope, purpose, and use. Consider how a single app can be categorized in numerous ways: patient- or clinician-facing; self-help or "prescribed" by a clinician; used for diagnosis and monitoring versus treatment focused; and with local data, shared data connected to a consumer-driven social network, or shared data connected to an electronic medical record (EMR) system. The combinations of these categories—along with other features—make it a challenge to categorize apps.

EVALUATING APPS

Currently, psychiatrists have 2 choices regarding the use of smartphone apps and other consumer devices for clinical care. The first choice is not to use them because of the weak evidence base and unclear professional and regulatory standards. This is certainly a reasonable choice. The alternative is to use them and acknowledge that apps are here to stay, that they will be increasingly used as clinical tools, and that some smartphone apps may have efficacy—proven or as yet unproven—that could be of benefit to certain patients. In some cases, specific apps have undergone clinical studies suggesting efficacy.11,12 In addition, patients are already bringing apps, sleep-tracking devices, and activity-monitoring devices to psychiatrists to ask for a professional opinion on their use, in the same way that many patients bring Internet resources and Google searches to physicians for second opinions.13

With the second choice, psychiatrists must decide whether the potential benefits of apps outweigh the potential risks. While the general principles that are used for evaluating the use of a medication apply to evaluating an app, there are also new aspects to consider that clinicians may not be familiar with. The use of a framework for clinicians to approach smartphone app evaluation can help facilitate an informed discussion and ensure all facets, or aspects, of apps are considered. The points proposed below—based on industry reports, telemedicine guidelines, and usability principles—are by no means comprehensive and are based on the authors’ informatics knowledge and clinical experience with apps in health care as opposed to rigorous clinical trials. Given the nascency of apps for health care, though, the authors believe this to be the beginning of an important tool to help clinicians to consider all "ASPECTS" of an app: whether the app is Actionable, Secure, Professional, Evidence-based, Customizable, and TranSparent. This checklist builds on the authors’ prior work on app evaluation,14 with the goal of presenting a clinically focused framework to spark discussion and guide a comprehensive discourse when evaluating apps. Given the numerous variations in apps, not all elements of this checklist will apply for each app. For example, a self-help app that offers psychoeducation via access to a video library may not require the same security standards as an app transmitting patients’ medication adherence data to a medical record system. Still, it is important to consider if each element applies in order to ensure none are overlooked.

Actionable

An app is useful only if the data it collects, and the results it produces, are actionable and meaningful. With powerful, ubiquitous smartphone sensors and data processing abilities, it is increasingly possible to capture tremendous amounts of self-reported, behavioral, and physiologic symptom data. For example, it may be possible for an app to collect data on patterns of patient movement throughout the day. Although interesting, the clinical relevance of such data is currently unknown. A practical app should instead produce data that the patient and clinician can use to make informed decisions about the course of clinical care. The simple mantra "just because we can measure and collect it does not inherently make it valuable or clinically useful" is good to keep in mind. Thus, in considering whether to use an app in clinical care, a psychiatrist should consider how those app data will be incorporated into clinical decision-making and how the data will inform care. For example, if an app is able to detect patterns of medication nonadherence, the clinician and patient should have a plan for how to use such information to improve adherence and not assume that the collection of data itself is a treatment plan.15,16

To ensure that app data are actionable in the health care setting, it will be increasingly valuable for some categories of apps to integrate seamlessly with electronic health records and complement clinical practices. Any app that aims to provide information to clinicians should not increase the workload or burden on those clinicians by making data hard to access. New data standards—such as the Fast Healthcare Interoperability Resources (FHIR) standard—are creating a common language that enables smartphone apps to exchange data directly with EMRs (http://www.govhealthit.com/news/fhir-and-future-interoperability). This standardization makes app results actionable within the EMR, sparing both patient and physician from spending precious clinic time fumbling with multiple devices, apps, and patient portals.
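As a rough, purely illustrative sketch of what FHIR-based integration can look like, the Python example below packages a single app-collected PHQ-9 total score as a FHIR Observation resource and posts it to a FHIR server. The server base URL and patient identifier are hypothetical placeholders, and a real integration would also handle authentication, consent, and error responses.

```python
# Minimal sketch (not a production integration) of sending one app-collected
# data point to an EMR that exposes a FHIR REST endpoint.
import requests

FHIR_BASE = "https://fhir.example-hospital.org"  # hypothetical FHIR server

# A FHIR "Observation" resource carrying a PHQ-9 total score for one patient.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "44261-6",  # LOINC code commonly used for the PHQ-9 total score
            "display": "Patient Health Questionnaire 9 item total score",
        }]
    },
    "subject": {"reference": "Patient/example-123"},  # placeholder patient ID
    "effectiveDateTime": "2016-04-26T09:30:00Z",
    "valueQuantity": {"value": 14, "unit": "{score}"},
}

# POST the resource; FHIR servers expect the application/fhir+json media type.
response = requests.post(
    f"{FHIR_BASE}/Observation",
    json=observation,
    headers={"Content-Type": "application/fhir+json"},
    timeout=10,
)
response.raise_for_status()
print("Created:", response.headers.get("Location"))
```

Because the data arrive in a standard structure, the EMR can file the score alongside other clinical results rather than leaving it stranded in a separate app or patient portal.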

Secure

While prescribing a medication may not warrant significant concerns about data security, this issue becomes critical when evaluating an app for clinical use. While regulations such as the Health Insurance Portability and Accountability Act (HIPAA) mandate certain security requirements for apps, there are core security features that psychiatrists should seek to ensure are present in any app involving patient information. First, apps should be protected by passphrases, biometric authentication, or other security features such as 2-factor authentication; this should be easy to verify with simple testing. Second, an app should encrypt patient data on the device itself so that others cannot easily read the data if the device is stolen or hacked. While it may be difficult for the typical psychiatrist to verify an app’s encryption protocol, ensuring that encryption is listed as a feature of the app is important, as it was recently discovered that none of the 79 apps studied from the British NHS app library encrypted data on the device.9 If an app does not clearly state that it encrypts data on the device, it may be best to look elsewhere. Third, an app should also clearly state that it encrypts patient data during transmission and then stores the data in an encrypted and secure manner. Checking for this is important, as the same NHS report found that of the 35 studied apps that transmitted information, two-thirds did not encrypt data, and 4 sent personal health information with no encryption. When considering an app for clinical use, it is best not to rely solely on claims made about the app but to discuss security concerns with the information technology department of a local hospital or institution.
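To make the second and third points concrete, the following Python sketch illustrates the two encryption layers described above: encrypting a journal entry at rest with a symmetric key before storing it, and sending it over an encrypted (HTTPS/TLS) connection. The endpoint, field names, and key handling are simplified assumptions for illustration only; a real app would keep the key in the platform’s secure keystore.

```python
# Minimal sketch (not production code) of encryption at rest and in transit
# for a hypothetical mood-journal entry. Requires the third-party packages
# "cryptography" and "requests".
import json
import requests
from cryptography.fernet import Fernet

# Encryption at rest: generate a symmetric key once and reuse it; in a real
# app the key would live in the device's secure keystore, not in source code.
key = Fernet.generate_key()
cipher = Fernet(key)

entry = json.dumps({"date": "2016-04-26", "mood": 3, "note": "slept poorly"})
encrypted_entry = cipher.encrypt(entry.encode("utf-8"))
# ...write encrypted_entry, never the plaintext, to local storage...

# Encryption in transit: an https:// URL means the payload travels over TLS;
# a plain http:// URL would expose it on the network. The endpoint below is
# a hypothetical placeholder.
response = requests.post(
    "https://example-mhealth-server.org/api/entries",
    data=encrypted_entry,
    headers={"Content-Type": "application/octet-stream"},
    timeout=10,
)

# Only the holder of the key can recover the original entry.
assert cipher.decrypt(encrypted_entry).decode("utf-8") == entry
```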

In addition, psychiatrists can educate and counsel patients on protective measures, such as limiting the information apps can access (for example, through Android App Ops and iOS’s permission dialog boxes), not entering unnecessary personal information into apps, and using vetted third-party security applications and built-in operating system features such as biometric and 2-factor authentication.

Professional

When considering an app for clinical use, a psychiatrist should ensure that such use is in line with professional standards, including legal and ethical considerations. While there are clear professional, legal, and ethical standards for prescribing medications, less clarity exists for recommending apps. One new factor that should be considered is compliance with numerous state and certain federal regulations, such as HIPAA, when using digital technologies. While laws governing electronic health data vary from state to state, HIPAA is federal; it requires, in part, strict protection and confidential handling of protected health information and imposes severe penalties for violations. A full description of HIPAA is beyond the scope of this article, but clinicians should carefully assess whether using a particular app in clinical practice falls under the scope of HIPAA, as many commercial apps do not meet HIPAA’s privacy, security, and encryption standards. A website recently released by the federal government offers an accessible resource for learning more about HIPAA and apps (https://www.healthit.gov/providers-professionals/your-mobile-device-and-health-information-privacy-and-security). State and local laws are also important to consider. It is advisable to refer to the American Telemedicine Association state policy resources (http://www.americantelemed.org/policy/state-policy-resource-center) and the Federation of State Medical Boards (FSMB) telemedicine policies (https://www.fsmb.org/Media/Default/PDF/FSMB/Advocacy/FSMB_Telemedicine_Policy.pdf) on legal compliance, to seek advice from local professional organizations such as APA chapters, to review the soon-to-be-released APA technology resource kit, or to consult a lawyer familiar with local health care law to ensure that the planned app use is legal. For example, an app that allows a psychiatrist to video chat with a patient in a different state may actually constitute malpractice17 in the form of practicing medicine without a license (if the psychiatrist is not also licensed in the state where the patient is using the app).

New ethical considerations beyond those frequently thought of when evaluating medications also need to be considered. There are now well-established ethical standards employed in the pharmaceutical industry that affect how drug companies interact with psychiatrists, as well as ethical standards between biotechnology device manufacturers and surgeons. Similar ethical standards between app developers and physicians are still nascent and underdeveloped. Psychiatrists need to be mindful of the ethical considerations of app use and employ their professional judgment to maintain a strong patient-clinician relationship that remains independent from the consumer-user relationship that may be present between an app developer and app user, where conflicts of interest can easily occur.

Evidence-Based

One of the most important aspects of evaluating a medication, therapy, or app is the need to seek out those with the most clinical evidence and efficacy data, while balancing against potential unintended consequences, risks, and harm. While the risks of app use may initially appear minimal, there are already documented cases in which apps designed to reduce alcohol intake led to increased alcohol use18 and several cases in which apps designed to deliver specific types of therapy did not yield the desired results when tested.19 Another concern is that app time is also screen time, which might replace other useful activities such as exercise or socialization.20 A further risk is that some apps could lead patients or clinicians to think the patient is treatment refractory when in reality the app is ineffective. Thus, it is important for clinicians to look for apps with clinical evidence and a strong research base, both to understand the potential risk and to assess the quality of the data. Apps with no or limited data may be risky to use, and the use of such apps should include a clinician-led discussion with the patient in which the current lack of evidence and potential risks are explained. While the evidence base for apps is still nascent, research efforts are accelerating, and there is a small but growing number of apps with some clinical data to support their use. That said, this lack of evidence may be one of the greatest barriers to app use at the present time, and it is something that psychiatrists should discuss with patients.

Customizable

One size does not fit all for psychiatric treatments, and the same is true for psychiatric apps. When considering an app for clinical use, psychiatrists should look for those that offer more customizable and flexible features. Both patients and clinicians are more likely to be invested in, and adherent to, something they created together as a team and designed to fit the problem at hand. For example, a mood-tracking app that allows the patient and clinician to pick from a menu of evidence-based scales, sensor data streams, and adjunctive digital therapies ensures that the app will best meet the needs of the clinician and patient, while ideally collecting the minimum amount of data necessary. An app that is customizable is not necessarily unvalidated; rather, it may be built from many validated elements that can be turned off or on depending on the clinical situation. Such an app empowers both patients and clinicians to be active and invested in its use for clinical care, becoming a key element of personalized care. Along these same lines, a psychiatrist can also suggest different combinations of apps for each patient, customizing the use of several apps for the patient’s condition.
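As one hypothetical way to picture an app "built from many validated elements that can be turned off or on," the short Python sketch below expresses a per-patient configuration in which the clinician and patient enable only the scales, sensor streams, and adjunctive modules they need; all names and fields are invented for illustration and do not describe any existing app.

```python
# Hypothetical per-patient configuration for a modular, customizable app.
# Each switch enables or disables a validated, self-contained module so that
# only the data needed for this patient's care plan are collected.
patient_app_config = {
    "patient_id": "example-patient",  # placeholder identifier
    "symptom_scales": {
        "phq9": {"enabled": True, "frequency_days": 7},  # depression scale, weekly
        "gad7": {"enabled": False},                      # anxiety scale, not needed
    },
    "sensor_streams": {
        "step_count": True,     # passive activity data
        "gps_location": False,  # off: not required, minimizes data collection
        "call_logs": False,
    },
    "adjunctive_modules": {
        "sleep_diary": True,
        "cbt_psychoeducation_videos": True,
    },
}

def enabled_sensor_streams(config):
    """Return the names of the sensor streams the patient and clinician turned on."""
    return [name for name, on in config["sensor_streams"].items() if on]

print(enabled_sensor_streams(patient_app_config))  # ['step_count']
```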

Transparent

Transparency takes on a dual role when evaluating apps, first for understanding how apps work and second for buttressing privacy. Ensuring that an app openly reports how data are collected, stored, analyzed, used, and shared is critical in selecting an app for clinical care. If there is uncertainty about how an app is using a patient’s health care data, then there is inherent uncertainty in any conclusions or recommendations that app may offer. While many app companies may "black box," or obscure, their data analysis methods in order to protect intellectual property, it is difficult for clinicians to trust results that are produced in secret and not supported by strong clinical evidence. For example, imagine an app that uses the GPS signal on a patient’s phone, as well as data from the call logs, with the claim of predicting relapse of depression. If the way in which the GPS and call log data are being used and the methods by which the company is predicting depression risk are not transparent to the clinician and patient, then using such an app is risky. Furthermore, patients also need to understand how their data are being used, as trust is always necessary for accurate reporting of symptoms. "Black box" and cryptic data analysis methods produce apps that are of questionable clinical value and likely not deserving of clinician or patient trust, whereas transparent apps and methods are a step in the right direction.

Transparency is equally important outside of the app for supporting privacy. Security features such as passphrases and encryption are necessary, but not sufficient, for privacy. Security alone does not guarantee privacy, as an app can have the world’s best security features but also sell patient information to data brokers and marketers. It is important for clinicians and patients to select apps that not only guarantee data security but also have easy-to-understand privacy policies. The best privacy policy is likely one in which the patient controls the use of his or her own health care data—and whether, with whom, and when it may be shared. Privacy policies that are vague or difficult to understand should act as a warning sign. Some apps may have business models that involve selling patient data to third parties or advertising companies. Unless there is transparency in the privacy policy of an app, it may be best to avoid that app in clinical care.

CONCLUSION

Mobile technologies such as smartphones and apps are not currently core clinical tools in psychiatric care, but their prevalence and potential mean that psychiatrists should be aware of them and their possible applications. The limited evidence base and unknown risks of app use in clinical care are justifiable reasons to delay their implementation. However, patients will increasingly bring apps into the clinical visit with them. Understanding the complexity of evaluating apps—with important differences from other more standard practices and interventions—is important for leading an informed discussion with patients regarding app use.

No single app will be right for every patient, no single app rating will be customized to each patient’s unique needs, and even the best app rating will become obsolete with new and updated versions of an app. Likewise, not all features of our ASPECTS checklist will apply to every app, but at least considering each feature will help ensure that none are overlooked. When working directly with each patient to consider all of the checklist items outlined here, psychiatrists may not be able to know for sure whether an app is effective or safe, but they will be asking the right questions and starting the right discourse with patients to ensure that more personalized, informed, and educated choices are made. Whether that choice is to use an app is for the psychiatrist and patient to decide at this time, but it seems clear that apps are here to stay, so all physicians will need to learn how to work with them and use them as clinical tools in the future.

Submitted: December 29, 2015; accepted March 1, 2016.

Online first: April 26, 2016.

Potential conflicts of interest: None reported.

Funding/support: None reported.

REFERENCES

1. Rainie L, Wellman B. Networked: The New Social Operating System. Cambridge, MA: MIT Press; 2012.

2. Patient adoption of mHealth. IMS Health Web site. http://www.imshealth.com/en/thought-leadership/ims-institute/reports/patient-adoption-of-mhealth. Accessed October 30, 2015.

3. Torous J, Friedman R, Keshavan M. Smartphone ownership and interest in mobile applications to monitor symptoms of mental health conditions. JMIR Mhealth Uhealth. 2014;2(1):e2. PubMed doi:10.2196/mhealth.2994

4. Torous J, Chan SR, Yee-Marie Tan S, et al. Patient smartphone ownership and interest in mobile apps to monitor symptoms of mental health conditions: a survey in four geographically distinct psychiatric clinics. JMIR Ment Health. 2014;1(1):e5. PubMed doi:10.2196/mental.4004

5. Firth J, Cotter J, Torous J, et al. Mobile phone ownership and endorsement of "mHealth" among people with psychosis: a meta-analysis of cross-sectional studies. Schizophr Bull. 2016;42(2):448-455. PubMed doi:10.1093/schbul/sbv132

6. Firth J, Torous J. Smartphone apps for schizophrenia: a systematic review. JMIR Mhealth Uhealth. 2015;3(4):e102. PubMed doi:10.2196/mhealth.4930

7. Torous J, Powell AC. Current research and trends in the use of smartphone applications for mood disorders. Internet Interventions. 2015;2(2):169-173. doi:10.1016/j.invent.2015.03.002

8. US Department of Health and Human Services. Mobile medical applications: examples of MMAs that are NOT medical devices. FDA.gov Web site. http://www.fda.gov/MedicalDevices/DigitalHealth/MobileMedicalApplications/ucm388746.htm. Updated April 4, 2016. Accessed October 30, 2015.

9. Wicks P, Chiauzzi E. "Trust but verify"—five approaches to ensure safe medical apps. BMC Med. 2015;13(1):205. PubMed doi:10.1186/s12916-015-0451-z

10. Huckvale K, Prieto JT, Tilney M, et al. Unaddressed privacy risks in accredited health and wellness apps: a cross-sectional systematic assessment. BMC Med. 2015;13(1):214. PubMed doi:10.1186/s12916-015-0444-y

11. Ben-Zeev D, Brenner CJ, Begale M, et al. Feasibility, acceptability, and preliminary efficacy of a smartphone intervention for schizophrenia. Schizophr Bull. 2014;40(6):1244-1253. PubMed doi:10.1093/schbul/sbu033

12. Gustafson DH, McTavish FM, Chih MY, et al. A smartphone application to support recovery from alcoholism: a randomized clinical trial. JAMA Psychiatry. 2014;71(5):566-572. PubMed doi:10.1001/jamapsychiatry.2013.4642

13. Murray E, Lo B, Pollack L, et al. The impact of health information on the internet on the physician-patient relationship: patient perceptions. Arch Intern Med. 2003;163(14):1727-1734. PubMed doi:10.1001/archinte.163.14.1727

14. Chan S, Torous J, Hinton L, et al. Towards a framework for evaluating mobile mental health apps. Telemed J E Health. 2015;21(12):1038-1041. PubMed doi:10.1089/tmj.2015.0002

15. Dayer L, Heldenbrand S, Anderson P, et al. Smartphone medication adherence apps: potential benefits to patients and providers. J Am Pharm Assoc (2003). 2013;53(2):172-181. PubMed doi:10.1331/JAPhA.2013.12202

16. Heron KE, Smyth JM. Ecological momentary interventions: incorporating mobile technology into psychosocial and health behaviour treatments. Br J Health Psychol. 2010;15(pt 1):1-39. PubMed doi:10.1348/135910709X466063

17. Shore JH. Telepsychiatry: videoconferencing in the delivery of psychiatric care. Am J Psychiatry. 2013;170(3):256-262. PubMed doi:10.1176/appi.ajp.2012.12081064

18. Gajecki M, Berman AH, Sinadinovic K, et al. Mobile phone brief intervention applications for risky alcohol use among university students: a randomized controlled study. Addict Sci Clin Pract. 2014;9(1):11. PubMed doi:10.1186/1940-0640-9-11

19. Heffner JL, Vilardaga R, Mercer LD, et al. Feature-level analysis of a novel smartphone application for smoking cessation. Am J Drug Alcohol Abuse. 2015;41(1):68-73. PubMed doi:10.3109/00952990.2014.977486

20. Przybylski AK. Electronic gaming and psychosocial adjustment. Pediatrics. 2014;134(3):e716-e722. PubMed doi:10.1542/peds.2013-4021