
Are We Redundant Yet?

We know that artificial intelligence can score above the 99.9th percentile on traditional measures of intelligence, outperforming most humans on subtests of vocabulary, general knowledge, arithmetic, concept formation and abstract reasoning. Wisdom, the capacity for good judgement in making life decisions, is another form of intelligence and one that people seek in therapy. Has AI rendered therapy redundant in this domain?

What is Wisdom? 

Igor Grossmann of the University of Waterloo in Canada explored this question by asking people to think aloud about social and political issues; their responses were then rated by psychologists. He identified five components of wise reasoning: recognising the limits of our knowledge, identifying the possibility for change, considering multiple perspectives, searching for compromise, and seeking a resolution to the conflict. His research has shown that this capacity predicts wellbeing better than IQ alone.

How Well Does AI Do? 

BBC journalist David Robson asked Grossmann if he could measure AI's wisdom and, while clear that any results were tentative and must be treated with caution, Grossmann accepted the challenge. AI platforms like ChatGPT are large language models: they are trained on vast amounts of text and then refined through feedback from humans. Grossmann developed suitable prompts and fed them to OpenAI's GPT-4 and to Claude Opus, a large language model from Anthropic. The responses were analysed by research assistants and scored from 1 to 3 on each dimension of wisdom. The chatbots scored 2 out of 3 for their capacity to recognise the possibility of change, search for compromise and predict conflict resolution, but failed dismally in their capacity to consider different perspectives or show intellectual humility. Grossmann concluded: "Overall, it seems to me that the systems can be perceived as doing better than humans on a range of dimensions, except for intellectual humility."

Should we be Afraid? 

Probably not, because we are comparing two completely different things. 

An 18th century forerunner of the ChatGPT experience is the agony aunt column. The form dates to 1709, when Della Manley began a paper called The Female Tatler, which included an advice column: readers wrote in with their own perspective on their problem and, with no other human interaction, questioning, meeting or discussion, the answer came back published in the paper. The tradition continues today, most notably in The New York Times' "The Ethicist" column, and it appears that the issues presented are not so different from those of the 18th century.

Therapy is different, with the relationship between practitioner and client being one of the key variables for change. Genuineness, warmth, empathy, unconditional acceptance, and responding to the other person as a fellow human have all been identified as central to the process. Therapy relies on a conversation in which the practitioner introduces new information, both verbally and visually, that allows a renewed understanding and different solutions to life's challenges to emerge. Our enquiry encompasses a bigger view than that of one person and suggests that different and equally valid perspectives should be considered. We do not tell; we ask and show. Our task is not to 'know better' but to engage with another person with empathy, compassion and professional knowledge that allows them to 'know better' for themselves.

 

David Robson, "Dear AI: This is what happens when you ask an algorithm for relationship advice", BBC, 5 May 2024.
