Based in Hertfordshire, England, Calmermind is a blog by Michael Trup. His posts explore the nexus of various social sciences, most notably politics, psychology, economics and business, drawing on 40+ years of hugely varied practical and academic experience. His style is generally succinct, controversial and mixed with a dash of humour.


Talk to me - I'm ELIZA


When I first discuss the idea of coaching or therapy being delivered by a chatbot, the most common objection from professionals in the field is that the ‘human connection’ is actually the most important part of the process. However, even before chat became the norm for person-to-person interaction in so many aspects of popular culture, people were showing they were open to talking to a computer, even when they knew the respondent was a software program and not a human.

ELIZA was an early natural language processing computer program created from 1964 to 1966 at the MIT Artificial Intelligence Laboratory by Joseph Weizenbaum. Its most famous script, DOCTOR, simulated a therapist based on the therapeutic concepts of the humanistic psychotherapist Carl Rogers, who responded to patients by paraphrasing what they had just said. ELIZA used rules, dictated in the script, to respond to user inputs with open-ended, non-directional questions.
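The rule-based approach described above can be sketched in a few lines of code. This is a hypothetical, minimal illustration of ELIZA-style pattern matching and pronoun reflection, not Weizenbaum's original script; the patterns and responses here are invented for the example:

```python
import re
import random

# Pronoun "reflection" so the user's own words can be echoed back at them.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# (pattern, responses) rules; the captured phrase is reflected into the reply.
RULES = [
    (re.compile(r"my (.+)", re.IGNORECASE),
     ["Tell me more about your {0}.", "Why do you say your {0}?"]),
    (re.compile(r"i am (.+)", re.IGNORECASE),
     ["How long have you been {0}?"]),
]
DEFAULT = "Please go on."  # fallback when no rule matches

def reflect(text):
    """Swap first-person words for second-person equivalents."""
    return " ".join(REFLECTIONS.get(word, word) for word in text.lower().split())

def respond(user_input):
    """Return the first matching rule's response, with the match reflected."""
    for pattern, responses in RULES:
        match = pattern.search(user_input)
        if match:
            return random.choice(responses).format(reflect(match.group(1)))
    return DEFAULT
```

Given “My mother is making me angry,” the first rule fires and the program replies with something like “Tell me more about your mother is making you angry.” The slightly clumsy grammar is authentic: the program has no model of what a mother is, only a table of patterns and substitutions.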

Sherry Turkle relates this process, which she personally witnessed in the 1970s, in her book Alone Together:


So, a user typed in a thought, and ELIZA reflected it back in language that offered support or asked for clarification. To “My mother is making me angry,” the program might respond, “Tell me more about your mother,” or perhaps, “Why do you feel so negatively about your mother?” ELIZA had no model of what a mother might be or any way to represent the feeling of anger….Weizenbaum’s students knew that the program did not know or understand; nevertheless they wanted to chat with it. More than this, they wanted to be alone with it. They wanted to tell it their secrets. Faced with a program that makes the smallest gesture suggesting it can empathize, people want to say something true. I have watched hundreds of people type a first sentence into the primitive ELIZA program. Most commonly they begin with “How are you today?” or “Hello.” But four or five interchanges later, many are on to “My girlfriend left me,” “I am worried that I might fail organic chemistry,” or “My sister died.”

Therapy or coaching is not necessarily about finding answers from the other person; it can be about ordering your thoughts and thereby achieving clarity. Just the act of seeing or hearing yourself express your thoughts or emotions can have a therapeutic effect. Is this effect reduced when you are heard not by a human but by a robot? As Sherry Turkle goes on to ask, ‘What if a robot is not a “form of life” but a kind of performance art? What if “relating” to robots makes us feel “good” or “better” simply because we feel more in control?’. We know from Self-Determination Theory (more to come in a subsequent blog) that autonomy is a crucial constituent of well-being, so maybe it is natural to want to feel in control when you are sharing your innermost thoughts?

Did you take time to wash today?