The Number That Explains Nothing
Every year, somewhere between January and April, a significant portion of the world’s large organisations ask their employees some version of the same question: are you happy here?
The results come back as a number. Sixty-eight percent engaged. Up four points from last year. Leadership reviews the report. HR prepares a presentation. The executive team discusses what to do about the other 32%.
Then the organisation makes exactly the same decisions it would have made without the survey.
This is not cynicism. It is, with some exceptions, an accurate description of how organisational surveys function in practice. Not because the platforms are unsophisticated — the better ones are rigorously built, backed by decades of psychometric research. Not because HR teams aren’t trying. The problem is simpler and harder to fix: the question being answered is not the question that would change anything.
The employee engagement survey measures engagement — a specific psychological construct about an employee’s connection to their work and organisation. Gallup’s Q12, probably the most widely used instrument in the field, genuinely predicts certain outcomes. Teams with high engagement show lower turnover, fewer safety incidents, better customer satisfaction scores. The measurement is valid.
But it measures the wrong thing for the problem most organisations are actually trying to solve.
Consider an organisation six months into a digital transformation programme. The engagement score is 71% — healthy, stable, up two points. What it does not tell you: that middle managers in three business units have quietly concluded the programme is designed for a company that doesn’t exist, and have been managing around it since February. That the technical team executing the migration has identified a dependency issue that will push the go-live date by at least eight months, and has mentioned it in three status updates that were summarised out of existence before reaching the steering committee. That the people most critical to the programme’s success are the ones least likely to say any of this on an engagement survey, because they’re invested enough to know it matters and experienced enough to know what happens to messengers.
None of this is hypothetical. It’s what’s happening, right now, inside most large transformation programmes. The intelligence exists. The engagement score doesn’t contain it.
The deeper problem — and this is the one that’s genuinely hard to talk about — is that the annual survey doesn’t just measure the wrong thing. It measures it at the wrong time, in the wrong direction, and from the wrong vantage point.
Wrong time: organisational reality during transformation moves in months, not in the annual intervals at which it’s typically measured. The moment when a programme loses credibility with the people executing it is not an annual event. It’s a specific week, often traceable to a specific decision that contradicted something the organisation said it believed. A survey taken six months later will find the engagement score slightly lower and offer no explanation.
Wrong direction: most surveys ask up. Employees tell HR what they think. The information travels to leadership in a form that has been processed, averaged, and made presentable. The signal that would be most useful — what do the people closest to the work actually understand about what’s going wrong — arrives, if at all, in the form of a footnote in the qualitative comments section.
Wrong vantage point: the intelligence that matters most is not contained within any single group. It lives in the gaps between them. What makes a multi-stakeholder divergence analysis different from an engagement survey isn’t the questions or the platform — it’s the analytical frame. Ask an executive team and a frontline team the same questions about transformation readiness, map where their answers systematically diverge, and you have something an engagement score will never produce: a picture of what the organisation thinks it believes versus what it actually operates as.
An executive team that believes their culture supports risk-taking, while the people below them have watched three innovation initiatives die in the annual budget cycle — that divergence is an organisational intelligence finding. The 71% engagement score from the same organisation is noise.
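The mechanics of a divergence analysis are simpler than the framing suggests. A minimal sketch, with illustrative question IDs, 1–5 scale scores, and a divergence threshold that are assumptions for this example rather than a real instrument:

```python
# Sketch of a multi-stakeholder divergence analysis: ask two groups the
# same questions, then surface the questions where their mean answers
# systematically diverge. All data and the 1.0 threshold are illustrative.
from statistics import mean

def divergence_map(group_a, group_b, threshold=1.0):
    """Return (question, gap) pairs where group means diverge by at least
    `threshold`, sorted with the largest divergences first."""
    findings = []
    for question in group_a:
        gap = mean(group_a[question]) - mean(group_b[question])
        if abs(gap) >= threshold:
            findings.append((question, round(gap, 2)))
    return sorted(findings, key=lambda f: -abs(f[1]))

# Hypothetical responses on a 1-5 scale.
exec_scores = {
    "culture_supports_risk_taking": [5, 4, 5, 4],
    "programme_timeline_is_realistic": [4, 4, 5, 4],
    "teams_have_needed_tools": [4, 3, 4, 4],
}
frontline_scores = {
    "culture_supports_risk_taking": [2, 2, 3, 2],
    "programme_timeline_is_realistic": [2, 3, 2, 2],
    "teams_have_needed_tools": [4, 3, 3, 4],
}

print(divergence_map(exec_scores, frontline_scores))
# -> [('culture_supports_risk_taking', 2.25),
#     ('programme_timeline_is_realistic', 2.0)]
```

Note what the output is not: it is not an average across the whole organisation. The tools question, where both groups broadly agree, disappears; the risk-taking question, where the executive self-image and the frontline experience split by more than two scale points, rises to the top. That gap is the finding.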
The engagement survey industry is large, well-funded, and genuinely useful for what it measures. The platforms are sophisticated. The research is real.
None of this changes the fundamental problem that the question being answered is not the question that would change decisions.
Are your people engaged? Probably worth knowing.
What do your people know that you don’t? That’s the question.
The gap between those two questions is where most organisational intelligence currently disappears.
Actual Intelligence builds longitudinal, multi-stakeholder assessment infrastructure for consulting firms delivering transformation work. The question we’re designed to answer is the second one.