The executive knows she has a tough week ahead, with some difficult decisions to make. Once upon a time, she might have had a PA; now she has a dedicated AI. On the way to work, she talks through with her AI assistant what has to be accomplished today. When she gets to the office, the papers she needs for each meeting are already printed, or ready to upload as needed, complete with highlights. Ten minutes before the meeting, the AI pulls up a summary of:
- Who will be there
- What their objectives will be
- What she wants to get out of the meeting
After the meeting, the AI prompts her to review what happened and creates a list of actions expected of her and of other participants. The AI reminds her she has forgotten one person, who needs to be informed.
As the day goes on, the AI alerts her when her energy starts to flag, as it often does at that time of day. She takes the hint, goes for a walk and restores her blood sugar levels. The AI also monitors her anxiety levels, prompting her to prioritise the most worrying issues. Instead of dealing with those straight away, the AI asks if she would first like to have a short mindfulness break. It leads her through some breathing and relaxation exercises.
When she is in a good frame of mind to tackle the difficult issues, the AI offers her a menu of decision-making algorithms, from which she selects. As she thinks out aloud, the AI captures and edits her words, occasionally offering a question from a library it has learned from previous conversations with her. Most of these she ignores, but two she highlights and comes back to at a later stage in her reflections. The AI helps her summarise what has changed in her thinking and what actions she now intends to take, along with when to remind her to follow through.
Some of the actions require her to write a memo or email. She asks: “What previous text do I have?” The AI searches for similar situations and offers several examples, one of which can be adapted relatively easily for the current purpose – saving her half an hour or more in creating a memo from scratch.
One of her meetings the next morning is with her coach. The AI brings up the notes she made after the previous coaching session. Does she want to explore the same theme in more detail? Or bring a new issue? She consults her “Frustration-Elation Log”, a weekly review of the things that have gone very well and those that haven’t. The AI has helpfully clustered these by key word. She doesn’t find the selection helpful and selects a different key word — immediately a pattern emerges. She now has a clear issue to bring to coaching, along with specific examples to draw upon.
She drafts an email to her coach outlining the issue, and starts a checklist of preparation for the meeting. The AI suggests some simple diagnostic questionnaires that might be helpful. She completes one of these and sends the results to her coach.
So, what’s left for the human coach to do? They add value in several ways. Firstly, they can draw upon their intuition and imagination (qualities the AI can’t emulate) to step into the issue with the client. This allows the coach to ask questions drawing upon a much wider reservoir of experience and knowledge than an AI can access. For example, the coach can help the client contextualise and make sense of the diagnostic. How reliable an indicator is it? Are her responses a reflection of particular circumstances or of a deeper pattern of responses and behaviours?
The coach also helps the client step outside the immediate issue and see the complexity of the systems in play. What are the motivations of all the other people involved, and who is pulling their strings? When it comes to drawing a systems map, the coach and the client can again engage with the AI, which will capture a visual image of connections as they talk. This visual image will be more flexible than anything drawn on paper – it will effectively be three-dimensional, rotating as instructed so they can see the system from different perspectives.
Another key attribute the coach brings is a combination of friendship and belief in the client. People’s ability to tackle challenges is greatly enhanced when someone else believes in them. Those same qualities allow the coach to be a critical friend, giving tough feedback when needed.
Above all, the coach brings themselves to the conversation. The Gestalt of their own mental and physical reactions is a key part of the co-discovery process. Our self-awareness is integral to the client’s experience of coaching. We cannot yet envisage how an AI could recognise projection and counter-projection, for example.
Current coach education assumes that we have to start coaches off with a model or process that they apply. Intelligent, self-aware coaches rapidly outgrow this and learn that being is more important than doing. If AI can do the doing as well as or better than a human, do we need to train beginner coaches in these models at all?
© David Clutterbuck, 2023