The emotional cost of working with AI

AI discussions have predominantly focused on technical outcomes like productivity, task performance, and error reduction. However, a deeper exploration of the emotional and psychological impact is essential.

Not long ago, AI in the workplace seemed like science fiction. Today, it's prevalent across industries, helping doctors read scans, recommending resumes to HR teams, or suggesting product names to marketers. While much has been said about AI's efficiency, a critical aspect is often overlooked: the emotional experience of working alongside intelligent machines.

Behind the headlines and hype, workers are navigating a complex mix of emotions like curiosity, anxiety, awe, and sometimes resentment. These emotions significantly shape how we engage with our work, perceive ourselves, and relate to one another.

A recent multilevel review by Bankins and colleagues (2024) in the Journal of Organizational Behavior unpacks what researchers have learned so far about AI in organisational life. What they found is clear: introducing AI into the workplace is not just a technical or strategic change; it's an emotional one, too.

A question of identity and emotional response

Consider the scenario of an experienced journalist whose creativity is now complemented, or perhaps overshadowed, by an AI system generating headlines at lightning speed. Or think about a recruiter whose instincts are second-guessed by an algorithm. These situations are becoming increasingly common.

AI often challenges workers' professional identities, forcing them to confront a fundamental question: What unique value do I bring to this job that a machine cannot replicate?

The emotional impact of working with AI is often overlooked. (Photo: Pexels)

Dr Tony Nguyen, Interim Associate Program Manager for the MBA program at RMIT University Vietnam, noted that this tension manifests in subtle yet profound ways.

“People may feel devalued and reduced to mere data points, or like ‘just a human’ compared to an error-proof machine. For some, AI serves as a catalyst for reinvention, but for others, it breeds quiet unease,” he said.

Research indicates that emotional responses to AI vary widely. Some workers feel excitement and optimism, particularly when AI is framed as a supportive tool that handles repetitive tasks, allowing them to focus on more creative or strategic endeavours.

Others experience fear, frustration, or helplessness, especially when AI is imposed from the top down without adequate training, communication, or agency. Many may feel monitored, replaced, or left behind.

Dr Nguyen said the emotional ambiguity surrounding AI is telling. “It transcends the characteristics of a simple tool. It makes decisions, learns, and can outperform humans in specific tasks. This blurring of boundaries between tool and teammate can be deeply unsettling.”

Trust issues and change fatigue

According to RMIT Senior Lecturer in Management Dr Giang Hoang, trust is a recurring theme in discussions about AI.

“Employees often question whether they can trust the decisions made by AI systems and whether they feel safe in questioning or overriding those decisions. Concerns arise when these systems are used to evaluate performance,” he said.

Research shows that emotional trust is influenced not only by the accuracy of the technology but also by how AI is introduced, who controls it, and whether employees feel like active participants in the process. Low trust can lead to resentment and resistance, while high trust fosters collaboration and creativity.

Recent studies have also identified what could be termed AI-induced change fatigue. In workplaces already inundated with transformations – new software, shifting roles, constant reskilling – AI can feel like yet another disruptive force. Workers may silently wonder: When will it end? When will I feel competent again?

“Emotional fatigue manifests as disengagement, burnout, or cynicism. It’s not merely about whether AI functions effectively. It’s about how much change individuals can absorb before emotionally shutting down,” Dr Giang said.

Dr Tony Nguyen (left) and Dr Giang Hoang (Photo: RMIT)

Moving forward: Emotional intelligence meets artificial intelligence

Many human-centred questions remain unaddressed in AI implementation discussions: What implications does it have for motivation when AI receives credit for key insights? How do team dynamics shift when some members rely on AI while others resist it? How can we support individuals whose sense of competence has been quietly undermined?

These questions highlight human dimensions of AI adoption that leadership and organisational psychology now need to address.

So, what can organisations do? Dr Tony Nguyen and Dr Giang Hoang proposed three key ways forward:

  • Acknowledge the emotional reality of AI integration: This involves not only offering technical training but also creating spaces for reflection, open dialogue, and even venting.
  • Involve employees in the AI journey: When individuals have a say in how AI is utilised and how their roles evolve alongside it, they are more likely to engage positively.
  • Support leaders in developing emotional intelligence: Leaders should be attuned to early signs of withdrawal or tension and frame AI as a partner that relies on human judgment, ethics, and creativity rather than as a replacement.

As AI becomes integral to our decision-making, collaboration, and creativity, it is also growing as an emotional presence in our daily work lives. “We don’t need to fear AI, but we must navigate our emotional responses to it. Understanding how AI affects our inner world may be the missing piece in making it work effectively in the outer one,” Dr Nguyen concluded.

A version of this article first appeared on Psychology Today.

Masthead image: tippapatt - stock.adobe.com
