
    The Rise of AI in Business & Leadership

    2023 was the breakout year for AI, and since then it’s been going from strength to strength. It’s seen as the solution to everything from boosting productivity and closing the skills gap to fixing the NHS, with Tony Blair calling for AI doctors and chatbots to ‘save’ the health service.

    There has been explosive growth in generative AI, with most companies now using it in at least one of their business functions. It’s gone from being a subject discussed by tech teams to something regularly raised in the boardroom.

    According to a survey of C-suite executives, 40 percent say their companies will increase their investment in AI tools overall.

    Over-reliance on AI in business

    Over-reliance on AI in business risks removing that vital human input from key decision-making and team building, and threatens to make some areas of leadership redundant – at significant cost.

    The costs might not immediately show up on a P&L spreadsheet, but they will be felt in other areas where they might not be so easy to quantify.

    AI is a useful tool, but it’s just that – a tool. It cannot replace the ‘gut feeling’ or connection that a great leader of any team has. That’s the magic which turns a group of disparate people into a team – a whole that is greater than the sum of its parts.

    My concern is that we need to strike the right balance between our use of technology and building and developing our own skills. If all we do is put resources into AI at the expense of other training and development, or replace people with machines, we will be the worse for it.

    In short, I worry that people will become reliant on AI and could lose that innate human element that we bring to leadership and team building.

    Leaders have understanding and insights

    Leaders have understanding and insights built over many years which cannot be replaced by data or machines. If AI cannot find the answer to a question, it can make things up. Look at the infamous court case in America which made headlines in May 2023, when the law firm Levidow, Levidow & Oberman was caught citing fake cases generated by ChatGPT.

    The ‘soft skills’ which good leaders at all levels have, and which make the difference between a good leader and a great one, can’t be replaced by software. These personal qualities enable people to interact effectively and successfully with others: communication, teamwork, problem-solving and adaptability – not to mention empathy.

    These should be highly valued in the workplace, and especially by leaders, as they contribute to improved teamwork, productivity and overall job satisfaction.

    My experience of working in elite sports has shown me that humans cannot just be replaced by technology, no matter how smart.

    AI has a role to play

    AI has a role to play in any team, particularly with more basic tasks of information gathering or data analysis, but it cannot replace the power of a human connection.

    An emotionally intelligent leader goes beyond the numbers and knows their team. AI can’t pick up that real emotional response, and it can’t read people like humans do.

    It’s called ‘machine learning’ for a reason – it learns from whatever data you feed it. But there are some cues and traits which simply cannot be quantified, and that is what cannot be replaced.

    We’ve used technology and data in sport for a long time; elite sport is incredibly data driven. We’ve got all the information that says the players should stop training at this point because they are at risk of pulling a hamstring, picking up another injury or overtraining.

    And whilst it’s useful to have that information, it should be used as a guide.

    I’ve seen many top-level coaches in the moment when the sports scientists run over, saying, ‘We need to stop training now.’

    But the coach looks at the players, can see that they have another ten minutes in them, and pushes them through.

    The data could also get it wrong the other way and put players at risk by pushing them too hard, whereas a coach will see when his players look physically fatigued and need to stop training early.
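    To make that point concrete, here is a minimal sketch – my own illustration, not anything used by a real club – of treating such data as a guide rather than a decision-maker: a hypothetical load metric flags a risk, but the coach watching the players keeps the final say. The function names and the 1.5 threshold are assumptions for illustration only.

```python
# Illustrative sketch only: a hypothetical training-load check that flags
# injury risk from data but leaves the final call to the coach.

def injury_risk_flag(session_load: float, weekly_avg_load: float,
                     risk_ratio: float = 1.5) -> bool:
    """Flag a player when today's load far exceeds their recent average.
    The 1.5 ratio is an assumed figure, not validated sports science."""
    return session_load > risk_ratio * weekly_avg_load


def training_decision(session_load: float, weekly_avg_load: float,
                      coach_override: bool | None = None) -> str:
    """The data only advises; a coach who is watching the players can override it."""
    if coach_override is not None:
        return "continue" if coach_override else "stop"
    return "stop" if injury_risk_flag(session_load, weekly_avg_load) else "continue"


# The numbers say stop, but the coach sees the players have ten more minutes in them.
print(training_decision(820, 500))                       # data alone -> "stop"
print(training_decision(820, 500, coach_override=True))  # human judgment -> "continue"
```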

    Costs of bad investment

    The costs of misreading that at elite sports level go beyond the investment in a machine: we’re talking millions of pounds’ worth of players, or the opportunity to achieve a lifetime goal of a medal on the world stage.

    And even when Olympic golds aren’t in the picture, the impact of a toxic working environment is felt not only in the lower productivity or staff turnover but in more personal impacts on someone’s health and sense of value.

    This knowledge and the consequent decision making come from a human understanding of people. In my 25 years of working both within teams and in leadership roles in sport, I have yet to see a coach get that wrong.

    Because we’ve used data in sport for so many years, I think it provides a lesson for other areas on how we can use AI: as a way of gathering data, making predictions or analysing patterns, but only as a tool to help inform our own human-driven intelligence.

    To rely on AI wholly would be a mistake: nothing can replace that human intelligence.


    What examples best illustrate AI’s limitations in understanding human emotions?

    Several real-world examples and studies highlight the limitations of AI in understanding human emotions:

    1. Biased Emotional Analysis

      • Ethnic Bias in Emotion Recognition: Studies show that AI systems designed to interpret human emotions can assign more negative emotions to people of certain ethnicities than to others, revealing inherent biases in the technology. This can have serious consequences, especially in hiring or customer service applications, where misreading emotions can lead to unfair treatment or inappropriate responses.

      • Inconsistent Emotional Detection: Research comparing different AI models (like ChatGPT, Grok, DeepSeek, and Yiaho) found significant inconsistencies in how they detect and label emotions in written texts. For example, sadness was much less frequently identified by ChatGPT compared to other models, and the category of “other emotions” often masked nuances that AI failed to capture.

    2. Lack of Empathy and Contextual Understanding

      • Pre-programmed Responses: AI-powered chatbots and virtual assistants can simulate empathy by responding to keywords or phrases, but their responses are based on pre-set algorithms and lack genuine emotional depth or understanding of complex human experiences (a brief illustration follows this list).

      • Inability to Grasp Context: AI often misinterprets emotional context. For instance, if a person is angry due to an injustice, AI might simply try to calm them down, failing to recognize the underlying cause or provide a truly empathetic response.

    3. Inability to Interpret Non-Verbal Cues

      • Limited to Data Inputs: AI can analyze text, voice, or video for emotional cues, but it struggles to interpret subtle non-verbal signals such as body language or micro-expressions that humans naturally pick up on.

      • Privacy and Consent Issues: Emotion AI that analyzes video or voice can raise privacy concerns and may not always be accurate, especially if users are unaware they are being analyzed.

    4. No Genuine Emotional Experience

      • Lack of Consciousness: AI lacks self-awareness and consciousness, making it incapable of truly experiencing or understanding emotions as humans do. It can only simulate empathy or analyze patterns, not feel or relate to emotions on a personal level.

      • Superficial Empathy Simulation: While AI can mimic empathetic language, it cannot provide genuinely empathic attention or care, as empathy is rooted in biological and lived human experience.
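
    As a toy illustration of the ‘pre-programmed responses’ point above, the sketch below – my own example, not drawn from any real assistant – matches keywords and returns canned replies: it can sound sympathetic, but it never engages with why the person feels the way they do.

```python
# Illustrative sketch only: a keyword-driven "empathy" responder of the kind
# described above. It matches surface phrases and returns canned replies, so it
# cannot grasp the underlying cause of an emotion.

CANNED_REPLIES = {
    "angry": "I'm sorry you're frustrated. Let's try to calm things down.",
    "sad": "That sounds difficult. I'm here for you.",
    "stressed": "Take a deep breath. Would a short break help?",
}


def empathetic_reply(message: str) -> str:
    """Return the first canned reply whose keyword appears in the message."""
    lowered = message.lower()
    for keyword, reply in CANNED_REPLIES.items():
        if keyword in lowered:
            return reply
    return "Thanks for sharing. How can I help?"


# The user is angry about an injustice; the bot only tries to calm them down
# and never addresses the cause - the contextual gap described in point 2.
print(empathetic_reply("I'm angry because I was passed over for promotion unfairly."))
```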
