    Insight Article
    Riya Subbaiah

One in five adults in the United States lives with mental illness.2 Advancing mental health treatment with novel technological tools is therefore one of the highest-priority goals for physicians and engineers, especially given the increased number of suicide attempts and depressive episodes during the tumultuous circumstances of the COVID-19 pandemic.1

A.I. is viewed with skepticism by some clinicians, who worry it might interfere with the clinician-patient relationship. But new A.I. resources are another tool in clinicians’ arsenal to fill gaps in care and help patients identify and cope with mental illness. Operational intelligence solutions are not meant to replace mental health professionals; instead, they can support practice management and serve as an important tool in patient care.

How can we use a binary-processing computer to treat patients with complex psychological conditions? One study harnessed the immense data-processing power of computers to analyze thousands of Facebook posts from a wide range of people, some of whom had known psychological disorders. The system identified linguistic patterns present across posts from the people with known depression, or “depression-associated language markers.”3
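The core idea behind language-marker detection can be illustrated with a minimal sketch: compare how often marker terms appear in two groups of posts. The lexicon and sample posts below are purely hypothetical; real studies derive markers statistically from large labeled datasets rather than from a hand-picked word list.

```python
# Hypothetical marker lexicon; actual research learns markers from data.
DEPRESSION_MARKERS = {"alone", "tired", "hopeless", "hurt", "cry"}

def marker_rate(posts):
    """Fraction of words across a set of posts that are marker terms."""
    words = [w.strip(".,!?").lower() for post in posts for w in post.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in DEPRESSION_MARKERS)
    return hits / len(words)

group_a = ["I feel so alone and tired today", "Nothing helps, I am hopeless"]
group_b = ["Great hike this morning", "Dinner with friends tonight"]
```

A higher `marker_rate` for one group would flag its language for closer (human) review; it is a screening signal, not a diagnosis.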

Reflecting on those findings, study co-author Johannes C. Eichstaedt, PhD, assistant professor of psychology, Stanford University, argues the case for incorporating A.I. programs like this into common practice: “With surprisingly similar methods to those used in genomics, we can comb social media data to find these markers. Depression appears to be something quite detectable in this way; it changes people’s use of social media in a way that something like skin disease or diabetes doesn’t.”4

Another A.I. approach, the super learning framework, combines multiple candidate algorithms (logistic regression, penalized regression, random forests and deep learning neural networks) to predict successful substance use disorder (SUD) treatment based on patient history.5 After analyzing data from 100,000 patients, super learning proved to be among the best-performing approaches for predicting reasonable, effective treatments for patients with substance use disorders.
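The heart of super learning is simple to sketch: score several candidate models on held-out data and let the best performer make the final prediction. This is the "discrete" form of the idea; the full framework uses cross-validation and weighted combinations of models. The toy models, threshold, and validation data below are illustrative assumptions, not the cited study's actual specification.

```python
def accuracy(model, data):
    """Fraction of (x, y) pairs the model predicts correctly."""
    return sum(model(x) == y for x, y in data) / len(data)

def super_learner(candidates, validation):
    """Discrete super learner: pick the candidate with best held-out accuracy."""
    return max(candidates, key=lambda m: accuracy(m, validation))

# Toy "models" predicting treatment success (1) from a single risk feature x.
always_yes = lambda x: 1
threshold = lambda x: 1 if x < 0.5 else 0

validation = [(0.1, 1), (0.2, 1), (0.8, 0), (0.9, 0)]
best = super_learner([always_yes, threshold], validation)
```

The design choice matters: because the winner is chosen on held-out data, the ensemble is guaranteed to do at least as well (asymptotically) as its best individual candidate.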

Programs such as this can identify high-risk patients through prognostic algorithms and increase support for those patients, lowering the rate of overdose in substance abuse clinical scenarios. Similar A.I. systems could be extended to evaluate the effectiveness of various treatment models in all areas of mental health, as well as pave the way for innovative diagnostic strategies.
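Prognostic risk stratification reduces, in its simplest form, to thresholding a model's risk score so staff can prioritize outreach. The field names, scores, and 0.7 cutoff below are hypothetical; in practice the threshold is set against clinical criteria and validated outcomes.

```python
def stratify(patients, threshold=0.7):
    """Split patients into high- and low-risk groups by predicted risk score."""
    high = [p for p in patients if p["risk"] >= threshold]
    low = [p for p in patients if p["risk"] < threshold]
    return high, low

# Illustrative cohort with model-assigned overdose-risk scores.
cohort = [
    {"id": "A", "risk": 0.92},
    {"id": "B", "risk": 0.35},
    {"id": "C", "risk": 0.71},
]
high_risk, low_risk = stratify(cohort)
```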

    If A.I. can analyze Facebook posts to identify markers of depressive thoughts across thousands of patients, it could be one of the most valuable, cutting-edge tools in mental-health treatment, especially in diagnosing suicidal patients. Suicide represents poorly controlled and/or unidentified mental health illness, and technological sentiment analysis can quickly identify linguistic markers in social media linked to suicidal thoughts. National prevention helplines, which receive thousands of calls a day, have already been set up with natural language processing technology, and the rise in innovative A.I. systems can further support these crisis lines.6 Results from this linguistic marker analysis can also be brought directly into the clinician’s office: If patients are willing, they can go over identified trends with their therapist or other mental health professional to start a conversation about prolonged feelings of depression or suicidal thoughts.

    A.I. to enhance clinical workflows and maintain patient engagement

Operational A.I. can help support mental health professionals in their clinical workflows. Tree-based machine learning algorithms can analyze data and suggest treatment options after extracting important “features” through their decision-making process. The same principle can be used to monitor patient progress across different responses to treatment. Natural language processing-enabled “chatbots” can deliver evidence-based cognitive behavioral therapy, approximating psychotherapy for at-risk patients until a mental health provider can see them in person. The first randomized controlled trial of such a system involved Woebot, a Facebook-integrated computer program designed to replicate conversations a patient might have with his or her therapist. Trial participants experienced significant reductions in depression and anxiety while maintaining a high level of engagement, interacting with the program almost daily.7
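A tree-based treatment suggestion is essentially a learned sequence of feature splits. The tiny hand-written tree below shows the shape of such a decision path over two hypothetical features (a PHQ-9 depression score and prior treatment response); real systems learn these splits and their thresholds from patient data, and output would always be reviewed by a clinician.

```python
def suggest_next_step(phq9, responded_before):
    """Toy decision tree over two hypothetical features (illustrative only)."""
    if phq9 >= 20:                      # severe symptoms: escalate
        return "refer to psychiatrist"
    if phq9 >= 10:                      # moderate symptoms: branch on history
        return "continue therapy" if responded_before else "adjust treatment plan"
    return "monitor"                    # mild symptoms: watchful waiting
```

Each branch condition corresponds to a "feature" the algorithm found informative, which is also what makes tree models comparatively easy for clinicians to audit.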

    A.I. can improve access, risk stratification and outcomes

A.I. chatbots and process automation tools can provide 24/7 access, allowing patients to interact with a chatbot when clinical staff are unavailable or in underserved areas where care is limited. Process automation can also help answer questions and coordinate referrals.
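At its simplest, such routing is keyword-based intent matching: answer routine questions automatically and escalate crisis language to a human-staffed line. The routes and canned responses below are hypothetical stand-ins; production chatbots use trained intent classifiers rather than keyword sets.

```python
# Hypothetical intent routes: (trigger keywords, canned response).
ROUTES = [
    ({"suicide", "harm", "crisis"}, "Connecting you to a crisis counselor now."),
    ({"refill", "prescription"}, "I can forward a refill request to your clinic."),
    ({"appointment", "schedule"}, "I can help you find the next available opening."),
]

def reply(message):
    """Match the first route whose keywords overlap the message's words."""
    words = set(message.lower().split())
    for keywords, response in ROUTES:
        if words & keywords:
            return response
    return "A staff member will follow up during business hours."
```

Note the deliberate ordering: crisis keywords are checked first, so safety escalation always takes priority over routine requests.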

    There are obvious obstacles to implementing A.I. solutions, such as financial barriers, equity concerns and the innate level of trust required for people to invest in these novel systems. However, the benefits of effective treatment management, social media analysis, and efficient diagnosis of high-risk patients could revolutionize mental health treatment for generations to come.

As more innovative research is published and more mental health professionals collaborate with A.I. developers to create healthcare-changing technology, more people can be treated effectively — the shared ambition of clinicians, engineers and scientists.

    Acknowledgment: The author would like to thank Oskar Pineno, PhD, associate professor of psychology, Hofstra University, for his expertise and assistance with this research.

    Notes:

    1. Czeisler M, et al. “Mental Health, Substance Use, and Suicidal Ideation During the COVID-19 Pandemic.” Morbidity and Mortality Weekly Report, Centers for Disease Control and Prevention, 13 Aug. 2020. Available from: bit.ly/3kj70WH.
    2. “NIMH: Mental Illness.” National Institute of Mental Health, U.S. Department of Health and Human Services, Feb. 2019. Available from: bit.ly/34bEKzD.
    3. Guntuku SC, Yaden DB, Kern ML, Ungar LH, Eichstaedt JC. 2017. “Detecting depression and mental illness on social media: An integrative review.” Current Opinion in Behavioral Sciences, 18, 43–49. doi: 10.1016/j.cobeha.2017.07.005.
    4. Ibid.
    5. Acion L, Kelmansky D, van der Laan M, Sahker E, Jones D, Arndt S. “Use of a machine learning framework to predict substance use disorder treatment success.” PLoS One. 2017 Apr 10;12(4):e0175383. doi: 10.1371/journal.pone.0175383. PMID: 28394905; PMCID: PMC5386258.
    6. Czeisler, et al.
    7. Fitzpatrick, Kathleen Kara, et al. “Delivering Cognitive Behavior Therapy to Young Adults With Symptoms of Depression and Anxiety Using a Fully Automated Conversational Agent (Woebot): A Randomized Controlled Trial.” JMIR Mental Health, JMIR Publications Inc., Toronto, Canada, 6 June 2017. Available from: bit.ly/3j9uEDG.

    Written By

    Riya Subbaiah

    Riya Subbaiah can be reached at riyasubbaiah17@gmail.com.

