Lentis/Ellie, the Microsoft Kinect, and Psychotherapy

Ellie is the name of the virtual psychotherapist used in the SimSensei system, an artificial intelligence (AI) platform that can interview patients and collect information to assess their mental health. Ellie is being developed by researchers at multiple universities and funded by the Defense Advanced Research Projects Agency (DARPA) to investigate the potential of automated technology in psychological screening. The system is currently being tested with National Guard members in Afghanistan, and researchers believe it may be especially helpful for mental health care in the military by increasing accessibility and countering the stigma sometimes associated with mental disorders. Few people have expressed strong opinions about Ellie because it has only been tested on a few small groups. However, studies of the emergence of other medical diagnostic tools and automated technologies suggest that some could oppose Ellie's use out of concern for the accuracy of the system or the job security of psychologists.

History of Diagnosis in Psychotherapy

Clinical diagnosis of mental disorders has evolved significantly over the last century. In the early 1900s, professionals in the nascent fields of psychology and psychiatry focused primarily on describing observable symptoms of mental illnesses. Sigmund Freud, in the 1920s and 1930s, shifted the field by investigating the causes of mental disorders; however, psychotherapists of this era tended to attribute mental illness to intentional misbehavior or personal weakness. It was only in 1980, when the third edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-III) was published, that mental disorders began to be considered legitimate medical conditions that could be diagnosed with an evidence-based scientific system.[1]

Ellie and the SimSensei Project

Ellie was developed by computer scientist Louis-Philippe Morency of Carnegie Mellon University and psychologist Albert “Skip” Rizzo, along with other staff at the University of Southern California’s Institute for Creative Technologies (ICT). Funded by DARPA, the SimSensei project aims to study a virtual human-based psychotherapy system, which could eventually become a platform to help diagnose post-traumatic stress disorder (PTSD), anxiety, and depression in military personnel and their families.[2] A RAND study estimates that PTSD and depression affect a significant portion of returning military members, including one in five Iraq and Afghanistan veterans.[3] Current and retired military members fear the stigma, and the resulting career and personal repercussions, associated with receiving professional help for these disorders.[3] Developers currently classify Ellie as a clinical decision support tool that could supplement questionnaires or serve as a screening platform.

Quantitative Observation

The system connects a Microsoft Kinect platform to an artificial intelligence virtual human (Ellie), allowing it to interact with the patient in real time and carry out a fifteen- to twenty-minute interview. Ellie collects and processes a large amount of information.[4] Tracking 60 different movement features at 30 measurements per second, the integrated platform analyzes voice patterns, posture, and facial expressions. Distress, depression, and other psychological disorders have been linked to certain behavioral and speech patterns. Measurements are synthesized into indicators of attention and fidgeting, gaze aversion, speaking fraction, smile level, and upper body activity, which inform SimSensei's reactions. In an initial study, scientists were able to distinguish between depressed and non-depressed patients using only the information gathered by Ellie.[5]
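
To make the data flow concrete, the Python sketch below shows how per-frame measurements sampled 30 times per second might be reduced to the session-level indicators named above. It is illustrative only: SimSensei's actual pipeline is unpublished, and every name here (Frame, session_indicators, and the field names) is a hypothetical stand-in.

```python
# Hypothetical sketch of frame-level signals being collapsed into the
# session-level indicators described in the text. All names are invented;
# SimSensei's real feature pipeline is not public.
from dataclasses import dataclass

FRAME_RATE = 30  # measurements per second, as reported for the Kinect pipeline

@dataclass
class Frame:
    gaze_on_interviewer: bool  # head/eye-tracking output
    is_speaking: bool          # voice activity detection
    smile_intensity: float     # 0.0-1.0, from facial expression analysis
    body_movement: float       # magnitude of upper-body joint displacement

def session_indicators(frames: list[Frame]) -> dict[str, float]:
    """Reduce a 15- to 20-minute stream of frames to summary indicators."""
    n = len(frames)
    return {
        "gaze_aversion": sum(not f.gaze_on_interviewer for f in frames) / n,
        "speaking_fraction": sum(f.is_speaking for f in frames) / n,
        "mean_smile_level": sum(f.smile_intensity for f in frames) / n,
        "fidgeting": sum(f.body_movement for f in frames) / n,
    }
```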

Patient Interaction

Ellie uses machine learning and natural language processing algorithms to listen actively and to incorporate mimicry and body language cues. The current version of Ellie is programmed to respond after a conversational pause with pre-recorded audio clips, which limits the possible responses to about 100 words or phrases and 20 non-verbal movements. These are classified by the function they serve in the system: interview questions, neutral backchannels (“uh huh”), positive empathy, negative empathy, surprise responses, and continuation prompts.[4] Ellie is programmed to first build rapport and then ask psychological questions over the course of a conversation. The developers believe Ellie could improve consistency in how questions are asked (and interpreted), prevent patients from feeling judged, and avoid biases introduced by human evaluators.[5]
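
A rough sketch of this kind of response policy appears below. It is a hypothetical simplification, not SimSensei's algorithm: the clip inventory, pause threshold, and sentiment input are all invented for illustration, though the response categories follow the text.

```python
# Hypothetical simplification of the response policy described above: once the
# patient pauses, choose a pre-recorded clip by conversational function.
# Thresholds and the sentiment score are invented for the example.
import random

RESPONSES = {
    "backchannel": ["uh huh", "mmm"],
    "positive_empathy": ["That's great to hear."],
    "negative_empathy": ["I'm sorry to hear that."],
    "surprise": ["Wow."],
    "continuation": ["Could you tell me more about that?"],
    "question": ["How have you been sleeping lately?"],
}

def choose_response(sentiment: float, pause_seconds: float) -> str:
    """Pick a clip once the patient has paused; mirror their sentiment."""
    if pause_seconds < 1.0:  # brief pause: just keep the patient talking
        return random.choice(RESPONSES["backchannel"])
    if sentiment > 0.5:
        return random.choice(RESPONSES["positive_empathy"])
    if sentiment < -0.5:
        return random.choice(RESPONSES["negative_empathy"])
    return random.choice(RESPONSES["continuation"])
```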

Ellie’s Effect on Willingness to Disclose

Patients may withhold personal information about behavior that is illegal, unethical, or stigmatized, yet therapists need this information to treat patients effectively. Ellie has been found to increase patients' willingness to disclose information compared to a human-controlled system.[6] Research suggests two factors can increase a patient's willingness to disclose: rapport and anonymity.

Rapport

Rapport is a relationship in which parties understand each other and communicate well. When rapport is built, patients reveal more personal details, such as fears and anxieties, to therapists.[7] Verbal and non-verbal behaviors during communication can influence rapport. For example, interviewers who ask follow-up questions elicit more disclosure than those who do not.[8] Body language such as facial expressions, gestures, gaze, and posture also affects rapport.[9] This could explain why rapport suffers and people feel less connected when interviewed by nonhuman agents unable to express verbal cues and body language.[10]

Ellie's capabilities are still being assessed. Users ranked Ellie's listening abilities below those of a human-controlled interface, most likely because of the current state of artificial intelligence technology. However, users gave Ellie and face-to-face interviewers similar rapport scores.[5]

Uncanny Valley

In 1970, Masahiro Mori proposed that as robots become more human-like, there is a point, just before they appear perfectly human, at which they begin to unsettle people.[11] He named this effect the uncanny valley. Attempts to make Ellie lifelike could cause her to fall into the uncanny valley, making it harder to build rapport with users and reducing their willingness to disclose.

Anonymity

A sense of anonymity arises when people feel their identity is protected. When interviewed through a computer, patients felt more anonymous and disclosed more than when interviewed face-to-face.[12] Ellie could help patients feel more anonymous because she allows them to respond in the absence of a human psychiatrist. However, patients may feel less anonymous if the data Ellie collects is shared with psychiatrists. Researchers are investigating whether knowing that their specific responses will later be viewed changes patients' willingness to disclose. Initial results suggest that even when participants know their responses will be viewed later, they disclose more information than in face-to-face interviews.[13]

Lessons from Current Computer-Aided Psychotherapy

Because Ellie is not yet in clinical use and is not intended to be a primary method of diagnosis or treatment, existing Skype-based psychotherapy and telemedicine counseling may indicate how future computer-aided diagnosis methods will be received and what benefits they might offer. Online systems provide a broader selection of caregivers, allowing patients to receive care targeted at very specific populations and needs.[14] As could be the case with Ellie, patients can interact with the therapist on their own schedule and in a comfortable location, enhancing convenience. In areas where little care is available, telehealth significantly reduces a barrier to access. Because practitioners are required to hold a license in the patient’s state, many instead advertise their services as psychoanalysis or life coaching; Ellie could prove similarly difficult to classify.[15] If the patient is in immediate danger, the remote counselor needs a way to get help for the patient locally.[14] As with Ellie, rapport and comfort with disclosing personal information may be affected by the system's technological capabilities and the lack of face-to-face interaction.

Debate Over Accuracy of Diagnosis

Though only a few studies have quantified the consistency of psychological diagnosis, many have suggested that diagnoses remain insufficiently reliable. Researchers believe numerous factors lead to inconsistent diagnoses, including inadequate psychiatric nomenclature, clinician biases, and insufficient time to complete a comprehensive assessment.[16] Robert Spitzer, a key author of the DSM-III, believes the field of psychology still faces a reliability problem.[17]

Support for Computer-Aided Diagnosis

Some researchers believe that computers can help improve the reliability of psychiatric and medical diagnoses. In a review of 163 studies, Grove et al. found that mechanical prediction techniques (which use statistics and data-processing algorithms) were on average 10% more accurate than clinical judgment (which relies on informal and subjective methods such as personal interviews).[18] Others believe computerized systems like Ellie will make clinicians less susceptible to anchoring bias, a phenomenon in which they place too much weight on the symptoms they initially observe when making a diagnosis.[19][20] Developers of other computer-aided diagnostic tools believe such systems will increase diagnostic accuracy because computers can hold far more information than a single human clinician could ever know. Scientists at IBM who are adapting the AI platform Watson for medical diagnosis assert that doctors would need to spend up to 160 hours per week to stay current with the medical literature.[21] As an objective system containing a larger volume of up-to-date medical information, a computer algorithm could improve diagnostic accuracy by providing clinicians with differential diagnoses to consider.
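
To make the distinction concrete, here is a minimal sketch of mechanical prediction in Grove et al.'s sense: a fixed statistical rule applied identically to every case. The weights and bias are invented for illustration, and the input keys assume the hypothetical indicators from the earlier sketch; this is not a validated clinical model.

```python
# Minimal example of "mechanical prediction": a fixed logistic rule over
# behavioral indicators. Weights are invented for illustration; a real
# screening model would be fit to clinical data.
import math

WEIGHTS = {"gaze_aversion": 1.8, "speaking_fraction": -1.2, "mean_smile_level": -2.0}
BIAS = -0.4

def depression_risk(indicators: dict[str, float]) -> float:
    """Logistic score in [0, 1] from session-level behavioral indicators."""
    z = BIAS + sum(w * indicators[name] for name, w in WEIGHTS.items())
    return 1 / (1 + math.exp(-z))
```

Under these assumptions, the sketches compose as depression_risk(session_indicators(frames)). Unlike a clinician, such a rule cannot be anchored by a salient first impression, because it weighs every input the same way in every case.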

Opposition

Because Ellie has only been tested in a few locations, few psychologists are familiar enough with the system to express opposition. However, the arguments made against the use of X-ray imaging after its invention in 1895 provide insight into possible complaints. While most believed X-rays would improve surgery, a Yale professor of medicine worried that doctors would become too reliant on X-ray images and discount the importance of human judgment and face-to-face interaction with patients. “We must get back to training students to look at the patient rather than simply the data base,” he stated.[22] As described in the chapter on bedside manner, Dr. Abraham Verghese voices similar opinions. He criticizes overreliance on data and believes that human touch and personal interaction are crucial to preventing doctors from overlooking simple diagnoses and to building trust in the patient-doctor relationship. Finally, the computer scientist Joseph Weizenbaum believed computer-aided psychotherapy was immoral and dehumanizing because a computer cannot provide genuine warmth and empathy.[23] Weizenbaum made these comments after creating ELIZA, one of the first crude computer therapist simulators. Ellie and the SimSensei system may provide more comprehensive information and be more human-like than earlier technologies, but criticism like that of the Yale professor, Verghese, and Weizenbaum may persist.

Impact on the White Collar Job Market

Jobs in manufacturing and other blue-collar sectors have made up a decreasing proportion of total U.S. employment in recent decades, a trend some analysts attribute to the growth of automation.[24] The fact that manufacturing output has grown while manufacturing jobs have decreased since the 1970s supports the idea that automated technologies have replaced humans in certain blue-collar jobs by offering higher productivity. Now, white-collar jobs might also be threatened by technological advances like SimSensei. Although the current version of Ellie still requires humans to analyze the collected data and make a diagnosis, Ellie practically eliminates the time clinicians would otherwise spend conducting in-person interviews, reducing the number of psychologists needed to care for the same number of patients. Computerized systems are also beginning to take on the work of journalists, lawyers, and other white-collar workers. For instance, Narrative Science has developed a system that can generate sports articles from discrete pieces of data and facts,[25] and computer algorithms are being used to screen thousands of legal documents and select only those most relevant for lawyers to review in a case.[26] In a survey of experts in AI, the Internet, information technology, and other fields, the Pew Research Center found that the “vast majority” of the 1,896 respondents believed that robotics and AI will “permeate wide segments of daily life” by 2025. However, about half of the experts believed that the rise of robotics and AI would shrink the white-collar sector by replacing human labor, while the other half believed it would lead to a net increase in white-collar jobs by creating entirely new types of work.[27]

Conclusion

Ellie is not designed to completely replace the human component of psychological diagnosis, though other medical technologies and human-computer interfaces may demonstrate Ellie's potential impact on psychological care. Current research is still exploring how computer-based methods affect patient disclosure, and how Ellie may reduce stigma associated with seeking mental health care. More research is needed to understand the risks of moving towards increasingly automated diagnostic technologies, as well as how Ellie will affect a patient's sense of anonymity and willingness to disclose if an individual's results are revealed to psychiatrists later. Ellie raises a key question: how can technology affect healthcare for patients and clinicians?

References

  1. Mayes, R., & Horwitz, A.V. (2005). DSM‐III and the revolution in the classification of mental illness. Journal of the History of the Behavioral Sciences, 41(3), 249-267. https://facultystaff.richmond.edu/~bmayes/pdf/dsmiii.pdf
  2. University of Southern California Institute for Creative Technologies. (2014). SimSensei. http://ict.usc.edu/prototypes/simsensei/
  3. RAND. (2008). One in five Iraq and Afghanistan veterans suffer from PTSD or major depression. http://www.rand.org/news/press/2008/04/17.html
  4. a b Morbini, F., DeVault, D., Georgila, K., Artstein, R., Traum, D., & Morency, L.-P. (2014). A demonstration of dialogue processing in SimSensei Kiosk. In 15th Annual Meeting of the Special Interest Group on Discourse and Dialogue (p. 254). http://www.aclweb.org/anthology/W/W14/W14-43.pdf#page=274
  5. a b c DeVault, D., Artstein, R., Benn, G., Dey, T., Fast, E., Gainer, A., … (2014). SimSensei kiosk: a virtual human interviewer for healthcare decision support. In Proceedings of the 2014 International Conference on Autonomous Agents and Multi-Agent Systems (pp. 1061–1068). International Foundation for Autonomous Agents and Multiagent Systems. http://dl.acm.org/citation.cfm?id=2617415
  6. Lucas, G. M., Gratch, J., King, A., & Morency, L. P. (2014). It’s only a computer: Virtual humans increase willingness to disclose. Computers in Human Behavior, 37, 94-100.
  7. Dijkstra, W. (1987). Interviewing Style and Respondent Behavior An Experimental Study of the Survey-Interview. Sociological Methods & Research, 16(2), 309–334. doi:10.1177/0049124187016002006
  8. Miller, L. C., Berg, J. H., & Archer, R. L. (1983). Openers: Individuals who elicit intimate self-disclosure. Journal of Personality and Social Psychology, 44(6), 1234–1244. doi:10.1037/0022-3514.44.6.1234
  9. Hall, J. A., Harrigan, J. A., & Rosenthal, R. (1995). Nonverbal behavior in clinician—patient interaction. Applied and Preventive Psychology, 4(1), 21–37. doi:10.1016/S0962-1849(05)80049-6
  10. Gratch, J., Kang, S.-H., & Wang, N. (2013). Using Social Agents to Explore Theories of Rapport and Emotional Resonance. In J. Gratch & S. Marsella (Eds.), Social Emotions in Nature and Artifact (pp. 181–197). Oxford University Press. http://www.oxfordscholarship.com/view/10.1093/acprof:oso/9780195387643.001.0001/acprof-9780195387643-chapter-12
  11. The Uncanny Valley - IEEE Spectrum. (n.d.). http://spectrum.ieee.org/automaton/robotics/humanoids/the-uncanny-valley
  12. Weisband, S., & Kiesler, S. (1996). Self Disclosure on Computer Forms: Meta-analysis and Implications. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 3–10). New York, NY, USA: ACM. doi:10.1145/238386.238387
  13. G. Lucas, personal communication (email), December 4, 2014.
  14. a b Novotney, A. (2011). A new emphasis on telehealth. http://www.apa.org/monitor/2011/06/telehealth.aspx
  15. Burgo, J. (2014). The skype psychologist. http://www.theatlantic.com/health/archive/2014/12/the-skype-psychologist/382910/
  16. Aboraya, A., Rankin, E., France, C., El-Missiry, A., & John, C. (2006). The reliability of psychiatric diagnosis revisited: The clinician's guide to improve the reliability of psychiatric diagnosis. Psychiatry (Edgmont), 3(1), 41.
  17. Spiegel, A. (2005, January 3). The dictionary of disorder. The New Yorker.
  18. Grove, W. M., Zald, D. H., Lebow, B. S., Snitz, B. E., & Nelson, C. (2000). Clinical versus mechanical prediction: a meta-analysis. Psychological assessment, 12(1), 19.
  19. Cohn, J. (2013, February 20). The robot will see you now. The Atlantic.
  20. Ofri, D. (2012, July 19). Falling into the diagnostic trap. The New York Times. Retrieved from http://well.blogs.nytimes.com/
  21. Kohn, M. (2013, April 22). Innovator chat: How Watson can transform healthcare. The Atlantic.
  22. Blume, S. S. (1992). Insight and industry: on the dynamics of technological change in medicine. MIT Press.
  23. Ford, B. D. (1994). Ethical and professional issues in computer-assisted therapy. Computers in human behavior, 9(4), 387-400.
  24. Sherk, J. (2010). Technology explains drop in manufacturing jobs. The Heritage Foundation. Retrieved from www.heritage.org.
  25. Fassler, J. (2012, April 12). Can the computers at Narrative Science replace paid writers? The Atlantic.
  26. Palazzolo, J. (2012, June 18). Why hire a lawyer? Computers are cheaper. The Wall Street Journal.
  27. Pew Research Center. (2014, August 6). Digital life in 2025: AI, robotics, and the future of jobs. Retrieved from www.pewinternet.org.