I’ve been sitting with the emotional tone of conversations I’m having with leaders lately and there’s a very particular feeling in the room.
It’s not excitement, but it’s not resistance either.
It’s something closer to uncertainty mixed with pressure and a quiet sense of “we should be further along by now”.
A recent article by Emma Partis (HR Leader, 2026) resonated with me. Drawing on KPMG Australia’s latest Keeping Us Up at Night survey, she reported that AI and technological disruption are now the number one concern for executives heading into 2026, and likely for the next few years at least.
Not inflation. Not supply chains. Not even geopolitics. Their concern is AI execution.
Nearly two-thirds of executives surveyed said new technologies such as AI are their biggest worry, particularly around use cases, ethics and how to implement them responsibly.
That tells us something important.
This isn’t just a technology issue. It’s an emotional one.
Whenever something arrives with this much hype, promise and pressure, it triggers a response. Collective or shared emotions create an atmosphere.
And like any atmosphere, it shapes behaviour long before it shapes outcomes.
What I’m noticing is a collective emotional response that looks like even more uncertainty:
Teams feel watched, evaluated or quietly replaced, even when nobody has said that out loud.
Boards want confidence and direction, while executives are still trying to make sense of what is real and what is noise.
That emotional fog matters, because it affects decision making, trust and performance.
And here’s the part that doesn’t get talked about enough.
Large-scale surveys from firms like Gartner and Deloitte, along with Stanford’s AI Index, show that most organisations are still struggling to translate AI adoption into meaningful commercial outcomes. Many pilots stall. Many tools are bolted on rather than embedded. Many leaders privately admit they cannot yet point to a clear return on investment.
So we end up in this strange place.
AI is everywhere in conversation, but nowhere when it comes to clear, measurable value.
That gap between expectation and reality is where anxiety lives.
This is where HOLD THE SPACE becomes essential.