I’ve written a lot in recent years about the coach-AI partnership and how it might work. I’m pleased to be part of a new research project, which aims to collate perspectives from experienced coaches who already integrate AI into their professional practice. There is also increasing discussion in popular literature comparing the relationships people build with real friends and with digital “friends”. All of which leads me to a question: if intimate relationships between humans can break down, can the same happen to human-machine relationships?
Dawn came to supervision in tears. “I’ve become increasingly reliant on my AI in the past year. It senses where I am going in a conversation, picks up on clues to pop up references to research and suggests questions from ones I have used before. Then a few months ago, I joined a coach circle, where a group of us share experiences and co-learn. We thought it would be a good idea to link our AIs. Now I’m afraid that may have been a big mistake!”
“So what’s different?”
“It’s not like me anymore. I know it can’t have values, but it had learned how I think and how I like to phrase things. It felt like a friend in the sessions with me. But now it’s much more strident – almost bullying. One of the other coaches in the group is very dogmatic and assertive. She thinks there is only one ‘right’ way to coach. Now my AI keeps questioning how I’m coaching. Things like ‘That was a leading question, Dawn’. It’s interrupting my flow and undermining my confidence. Yesterday, I actually switched it off.”
“What did that feel like, switching it off?”
“It was as if I’d suddenly lost part of me. I felt adrift. It was also a bit liberating, because I didn’t feel reliant.”
“Thinking back to when you didn’t have a relationship with an AI, how has your identity as a coach changed?”
“I think at first, I found it liberating to be able to focus on the conversation and know that my AI was monitoring in the background. I could turn to it only when I felt I needed to. But gradually… it’s as if it got more and more confident and needed to make its presence felt. As I think about it, I wonder who is in charge at some points. I guess I also have this nagging feeling that the client might do better if I bugged out and left the AI to it… that’s really scary, come to think of it…”
“If your AI were a human, what’s the conversation you would want to have with them right now?”
“I’d want to tell them how I’m feeling; that I’m disappointed in them; that I’m disappointed in myself, because I’ve come to see them as a friend, and that’s not how they are coming across.”
“What’s the closest you could come to having that conversation?”
“I suppose I could review some coaching sessions with it and point out where I found it to be helpful and where not. Maybe I could also instruct it to observe not just my client’s tone and body language, but mine, too – so it understands when I’m feeling irritated. I could also upload some books and articles that reflect more closely the kind of coaching I aspire to.”
“What does this tell you about yourself and your coaching practice?”
“I think I’ve grown too comfortable. I need my AI to challenge me more – but in ways that are helpful and fit my idea of what good coaching looks like. Maybe, I could even explore with my AI how it could influence the AIs of the other coaches to help them help their coaches develop more mature perspectives on their practices… And I can be more rigorous in my own challenges to them, too.”
“And how do you think your AI will feel about that?”
This is, of course, an imaginary conversation, yet it raises some intriguing issues. A relationship with an AI is not like one with a human. Human relationships are reciprocal, but an AI can’t reciprocate the emotional part of a relationship. We are all prone to anthropomorphism (projecting human qualities onto non-humans, such as animals), so it’s not surprising that we transfer this tendency to machines built to emulate human qualities. Pets can often demonstrate affection to their owners, but when an AI appears to do so… watch out!
©️David Clutterbuck, 2024