When AI is Everywhere and Nowhere at Once

I’ve been sitting with the emotional tone of conversations I’m having with leaders lately, and there’s a very particular feeling in the room.

It’s not excitement, but it’s not resistance either.

It’s something closer to uncertainty mixed with pressure and a quiet sense of “we should be further along by now”.

A recent article by Emma Partis (HR Leader, 2026) resonated with me deeply. Drawing on KPMG Australia’s latest Keeping Us Up at Night survey, Emma reported that AI and technological disruption are now the number one concern for executives heading into 2026. And, in fact, for the next few years at least…

Not inflation. Not supply chains. Not even geopolitics. Their concern is AI execution.

Nearly two-thirds of executives surveyed said new technologies such as AI are their biggest worry, particularly around use cases, ethics and how to actually implement them responsibly.

That tells us something important.

This isn’t just a technology issue. It’s an emotional one.

Whenever something arrives with this much hype, promise and pressure, it triggers a response. Collective or shared emotions create an atmosphere, and those feelings drive behaviour and action.

And like any atmosphere, it shapes behaviour long before it shapes outcomes.

What I’m noticing is a collective emotional response that looks like even more uncertainty:

Teams feel watched, evaluated or quietly replaced, even when nobody has said that out loud.
Boards want confidence and direction, while executives are still trying to make sense of what is real and what is noise.

That emotional fog matters, because it affects decision making, trust and performance.

And here’s the part that doesn’t get talked about enough.

Large-scale surveys from firms like Gartner, Deloitte and Stanford’s AI Index show that most organisations are still struggling to translate AI adoption into meaningful commercial outcomes. Many pilots stall. Many tools are bolted on rather than embedded. Many leaders privately admit they cannot yet point to a clear return on investment.

So we end up in this strange place.

AI is everywhere in conversation. But nowhere when it comes to clear, measurable value.

That gap between expectation and reality is where anxiety lives.

This is where HOLD THE SPACE becomes essential.

We need to hold space for:

The uncertainty leaders are feeling but not saying out loud.
The fear teams experience when change feels constant and unclear.
The pressure to perform without a clear definition of success.

If we skip this step, we see predictable patterns.

Shiny AI initiatives launched to look progressive. Overpromising and underdelivering.

Teams disengaging quietly because the emotional atmosphere feels unsafe or incoherent.

What successful leaders are doing differently right now

The leaders who are navigating this well are not the ones chasing every new AI capability.

They are the ones who are:

Being honest about what AI can and cannot do today

Connecting AI initiatives to real problems rather than abstract possibility

Creating psychological safety so people can question, challenge and contribute

Resisting the urge to perform certainty when certainty does not exist

They are holding the space long enough for clarity to emerge.

The invitation as we head into 2026

AI is not a passing concern. Executives are thinking about this not just for next year, but for the next three to five years.

That means the real leadership task is not about keeping up with technology.

It’s about regulating the emotional environment in which technology is introduced.

Because performance does not come from tools alone.
It comes from people who feel safe enough to think clearly, speak honestly and adapt together.
