AI in Coaching

AI is a hot topic. Everyone’s talking about it. Some love it, some hate it. A lot have mixed feelings. What’s going on? In the world of coaching there is a big push on how you can use AI to support you in your business. The challenge is that some are bringing AI into their coaching sessions. Additionally, AI has been trained to coach individuals. There are so many pros and cons involved here.

The biggest concern with AI being in a coaching session, or with AI conducting a coaching session, is confidentiality. Specifically, AI uses what happens in the session to continue learning. Training AI to coach involves giving it recorded sessions. Sometimes coaches and clients realize the session is being used by AI and sometimes they don’t; even when they do realize it, the problem remains: the information is in the cloud and in multiple systems accessed by many.

If AI is coaching, what are the pros? What are the cons? On the one hand, some people say they like the idea of the perceived anonymity of having AI as a coach. That brings up another question: where is the information going? If AI is doing the coaching, or helping the coach, what happens with that content? In some organizations that have offered an AI coach to employees, the employees have said, “No thanks. We don’t know where that’s going or what’s happening with it or anything else.” There are concerns about confidentiality.

The comfort level somebody has with an AI coach is absolutely across the board. Is it great? Is it horrible or somewhere in between? And of course, each person is making their own decision and choice. What’s important here is that if somebody is given the choice of AI as a coach, how are they informed in terms of confidentiality? What are they told about what happens with the content of their conversations? What’s their comfort level with that? If somebody is a trained coach from a good school, they’ve learned the Code of Ethics and they’ve learned about protecting client confidentiality. They understand transparency. AI isn’t adequately trained in coaching ethics, nor in being transparent.

An interesting problem is that people are using AI to create the recorded session they submit for assessment to earn a credential – meaning AI is doing the coaching, not the coach. That’s absolutely astounding. The idea of the assessment, the recorded session, is that it is the coach who is being assessed. If AI is doing the coaching or coaching the coach, it isn’t the coach who is being assessed. It’s fraudulent. Here people are pursuing a credential in a profession that is very much about ethics. How can it even be considered acceptable to have AI do the coaching for you and then submit that for an assessment? Absolutely mind-boggling!

Based on conversations and information I have had access to, AI is currently coaching at an ACC level. That in and of itself can be scary. Alternatively, I know people who have tested the AI coach. In one case, the AI supported planning something illegal. Others had an AI coach that was making suggestions, giving them advice, and telling them what to do. That isn’t coaching at all. That’s a consultant, advisor, or mentor. A coach is not going to tell you what to do or how to do it. The role of a coach is to partner with the individual to develop them at an advanced level. That means they discover their own answers and create their own strategies, which is how they achieve advanced results. The client is maximizing their own personal or professional potential.

I had the privilege recently of presenting with a group of people at the Global Leaders Forum for ICF. We shared a case study:

A global company integrates coaching and AI via trained ICF-member HR leaders who drive an internal coaching program, and an AI-powered platform that tracks engagement and performance, suggests development plans, and summarizes confidential coaching conversations. The client, Emma, has a coach, Alex. Alex is an HR business partner from another division. Emma discussed her uncertainty about relocating and sought support in exploring the question rather than embracing or rejecting the idea of mobility. In other words, she wanted to think it through and talk it through so that she could decide. Separately, the company’s AI-driven succession tool shortlisted candidates for an overseas role, factoring in the coaching summaries, which were stripped of personal details. Emma wasn’t selected. She suspected that the AI misinterpreted her discussions as reluctance to relocate. She filed a complaint through the ethics complaint review process alleging a confidentiality breach: her coaching conversations had influenced career decisions and opportunities, an AI-driven bias. Alex’s coaching summaries, fed into the system, shaped the AI’s decision. The AI’s assumptions misjudged her willingness based on coaching discussions, and role boundaries were blurred: Alex is both a coach and a human resource business partner, and Alex did not clarify how the data was used.

The group discussed the situation with an eye on two standards. Standard 2.5: “Fulfill my ethical and legal obligations to my coaching client(s), sponsor(s), colleagues, and to the public at large directly and through any technology systems I may utilize (i.e. technology-assisted coaching tools, databases, platforms, software, and artificial intelligence).” Standard 3.2 was also discussed: “Manage conflicts of interest and potential conflicts of interest with coaching client(s) and sponsor(s) through self-reflection, coaching agreement(s), and ongoing dialogue. This includes addressing organizational roles, responsibilities, relationships, records, confidentiality, and other reporting requirements.” The participants were invited to share whether they thought there was a breach of these standards. While most recognized the ethical breach, some did not.

There are additional standards to consider. Standard 3.7: “Understand that ICF professionals often serve in multiple professional roles based on prior training and/or experience (i.e. mentor, therapist, HR specialist, assessor), and it is my responsibility to disclose to the client when I am acting in a capacity other than the role of an ICF professional.” Another standard that may apply is 4.2: “Recognize my personal limitations or circumstances that may impair my coaching performance or professional commitments. I will seek support if necessary, including relevant professional guidance. This may require suspending or terminating my coaching relationship(s).”

In this case study, the Code of Ethics was breached. When you’re coaching somebody, confidentiality is paramount, because if you are not protecting confidentiality, it is not a safe place for that individual to have a coaching conversation. It is known that the number one indicator of success in a coaching relationship is rapport between the coach and the client. Rapport is based on the client’s trust in their coach, knowing that what they discuss is protected. It is confidential. That means the client truly can open up, be vulnerable, and share, which increases the likelihood that they have breakthrough moments. It means they’re truly supported as an individual. The minute you bring AI into a coaching conversation, there are going to be problems in terms of confidentiality. In addition to that, you’re taking away the human element.

Consider for a moment the difference between doing coaching and being a coach. Doing coaching is the technical skill set. It involves following a general process (with flexibility). For a human, there are also the parameters of the Code of Ethics. If you’re well trained, you’ve learned about language and ethics for the coaching conversation. You’ve learned about the coaching mindset and holding the client as their own best expert. As soon as you bring AI in, you’re dealing with a machine, not a human being. The level of respect is not going to be there. A machine cannot be fully present to a client. The machine can only process based on what it’s programmed to do. It means that if you really want coaching, it is important to have a well-trained human being doing the coaching. That in turn means that well-trained human being is aware of their responsibilities in terms of protecting confidentiality.

Be aware of the implications of any AI use in terms of coaching. Reflect for yourself. If you were offered an AI coach, how do you feel about it? Do you like the idea of perceived anonymity because it’s a machine? Are you afraid of the confidentiality risk? Do you feel a machine can truly serve you at the same level that a human being can? People are making this choice on a daily basis. There are pros to it. There are a lot of concerns about it. Each person is making their own choice. What is your choice? What are you doing to effectively support others?
