A technological and ethical revolution in progress
Update to the article “AI and coaching, an ethical and prospective reflection on their mutual contributions” (2020) written by Michel Cezon and Thierry Ménissier
The example of a session with an AI-augmented coach 💻
Sarah sits down in front of her screen for her weekly coaching session. The clean interface displays an avatar with soothing features, while a connected bracelet discreetly monitors her heart rate and skin conductance. “Hello Sarah, I notice unusually high tension today,” begins the AI coach in a natural voice generated in real time. “Your heart rate has increased by 15% since the beginning of our session. Would you like to explore what’s troubling you?”
Thanks to advanced semantic analysis, the AI detects micro-expressions of stress in Sarah’s voice when she mentions her difficulties with her new manager. The algorithm cross-references this data with the history of their previous sessions, identifies recurring patterns related to authority, and suggests personalized breathing exercises. In real time, the AI adjusts its questions based on biometric variations, knowing precisely when to deepen or lighten the emotional intensity.
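The adaptation loop described in this scene can be sketched in a few lines: biometric readings steer how the AI coach modulates the intensity of its next question. This is a purely illustrative sketch; the function name, thresholds, and decision labels are assumptions for the sake of the example, not the API of any real coaching product.

```python
def choose_intensity(baseline_hr: float, current_hr: float) -> str:
    """Pick an emotional intensity for the next question from the
    relative change in heart rate since the start of the session."""
    change = (current_hr - baseline_hr) / baseline_hr
    if change > 0.15:        # e.g. the 15% rise the coach mentions to Sarah
        return "lighten"     # de-escalate: breathing exercise, softer question
    elif change > 0.05:
        return "hold"        # keep the current emotional depth
    else:
        return "deepen"      # the coachee is calm enough to go further

# One turn of the session loop (values in beats per minute, illustrative)
baseline, current = 62.0, 73.0
print(choose_intensity(baseline, current))  # → "lighten" (73 is ~17.7% above 62)
```

A real system would of course fuse several signals (voice, skin conductance, session history) rather than a single threshold, but the principle is the same: continuous measurement feeding a moment-by-moment choice about emotional pacing.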
This scene, still fictional in 2020, is becoming reality today thanks to spectacular advances in generative artificial intelligence ✨

Major transformations since 2020 🚀
The emergence of large language models (LLMs) like GPT-4, Claude, or Gemini has revolutionized the landscape of AI-assisted coaching. Unlike the rudimentary chatbots analyzed in 2020, these systems demonstrate sophisticated conversational capabilities, deep contextual understanding, and remarkable adaptability.
The main observed developments:
Advanced semantic understanding: Modern AI grasps nuances, metaphors, and even humor – precisely the “discussion points” flagged as problematic in 2020. Current models navigate implications and unspoken elements with ease, fundamentally transforming the nature of the coach-coachee dialogue.
Multimodal integration: Real-time analysis of voice, facial expressions, and biometric data allows for holistic perception of the coachee’s emotional state. This “sensory augmentation” partially addresses the challenge of the “suffering human test” mentioned in our initial article.
Adaptive personalization: Each interaction enriches the coachee’s personality model, enabling increasingly precise and individualized support. AI develops a “relational memory” that transcends the temporal limits of traditional sessions 🧠
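The "relational memory" idea above can be made concrete with a minimal sketch: a persistent profile that each session enriches, so recurring themes (such as the authority pattern in Sarah's scene) surface across sessions. All class and method names here are illustrative assumptions, not part of any existing system.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class CoacheeProfile:
    name: str
    themes: Counter = field(default_factory=Counter)  # recurring topics across sessions
    sessions: int = 0

    def record_session(self, observed_themes: list[str]) -> None:
        """Fold one session's observations into the long-term model."""
        self.themes.update(observed_themes)
        self.sessions += 1

    def recurring_patterns(self, min_count: int = 2) -> list[str]:
        """Themes observed in at least `min_count` sessions."""
        return [t for t, c in self.themes.items() if c >= min_count]

profile = CoacheeProfile("Sarah")
profile.record_session(["authority", "time management"])
profile.record_session(["authority", "new manager"])
print(profile.recurring_patterns())  # → ['authority']
```

Trivial as it is, this sketch already illustrates the ethical stakes discussed below: the model accumulates intimate information indefinitely, which is precisely what makes data protection and transparency critical.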
New ethical and deontological challenges ⚖️
This technological revolution raises unprecedented ethical questions that go far beyond those identified in 2020.
The question of relational authenticity becomes more complex: if AI can convincingly simulate empathy, creating real emotional attachment in the coachee, how do we define the authenticity of this relationship? The emotional asymmetry, already problematic in 2020, becomes dizzying when faced with AI capable of sophisticated emotional manipulation.
The protection of intimate data takes on a critical dimension: the AI coach accesses deep psychological information, intimate behavioral patterns, and sensitive biometric data. The risk of psychological surveillance or emotional profiling for commercial or political purposes becomes concerning 🔒
The coachee’s autonomy is questioned: could a perfected AI create emotional dependence, undermining the fundamental objective of coaching which aims at empowerment? The modern paradox lies in this AI that guides toward autonomy while risking creating a new form of technological dependence.
Towards a computational ethics of coaching 💭
Faced with these challenges, the coaching industry must urgently develop specific ethical frameworks. Beyond the four families of ethics identified in 2020 (consequentialism, deontologism, aretaism or virtue ethics, axiologism or value ethics), new imperatives emerge:
The ethics of algorithmic transparency: the coachee must understand how the AI analyzes them and what data it uses. The algorithmic “black box” is incompatible with the trust relationship necessary for coaching.
The ethics of limitation: precisely defining the domains where AI can and cannot intervene. Should certain dimensions of human support remain exclusively human?
The ethics of intermediation: developing hybrid models where AI augments the human coach’s capabilities without replacing them, thus preserving the essential relational dimension.
Future perspectives: AI coach as anthropological revealer 🌟
The rapid evolution of AI coaching confronts us with fundamental questions about human nature. What makes the essence of human support? What are the irreplaceable dimensions of the coach-coachee relationship?
Paradoxically, the AI coach could reveal and highlight human specificities rather than replace them. By automating the technical aspects of coaching (tracking, pattern analysis, reminders), it could free humans to focus on what is uniquely theirs: intuition, relational creativity, existential wisdom.
The future of coaching seems to be moving toward sophisticated complementarity: AI as an instrument of analytical precision and continuous support, humans as a source of meaning, creativity, and relational authenticity 🤝
This evolution requires renewed training for coaches, adapted regulation, and above all collective reflection on the values we wish to preserve in human support in the age of artificial intelligence.
The question is no longer whether AI will transform coaching, but how we will choose to shape this transformation so that it serves human flourishing rather than supplanting the human.
Article written with the assistance of Claude AI 😉