The industry has been asking this question for twenty years
For almost as long as I can remember, people in the corporate and brand film industry have been trying to prove that what we do actually works.
I was on a panel in London discussing it fifteen years ago. We've tried various methods since. Some got in the way of the communication itself.
Some didn't matter because the client just wanted to look good in front of their boss — and they did, so everyone moved on. Too busy to go back and check what actually happened.
A friend of mine who ran a face-tracking platform told me once that half the time, agencies swept their own effectiveness studies under the rug because the results weren't what they wanted.
And nobody had a good answer
Traditional advertising has had measurement frameworks for decades. With corporate communications, it's harder.
We're not usually driving something as cleanly quantifiable as sales. We're trying to shift how people think, feel, and act — which are fairly intangible. That makes them hard to measure, and so we stopped really trying.
Advertising at least had sales data to anchor to. Corporate comms never had that luxury, so the habit of measuring never really took hold.
A flooded content ecosystem has changed the calculus
There is now 25,000 times more content uploaded to YouTube alone than every broadcaster, streamer, and TV network globally produced last year combined. Corporate audiences hold everything they watch to the same standard as Netflix or their TikTok feed.
To get noticed, to drive action, you have to be genuinely right for that person in that moment.
And yet too many businesses still don't grasp the opportunity cost of getting this wrong. In a world of hyper-accessible production, the inability to demonstrate the additional benefit of doing things properly is an existential issue for our industry.
Our failure. We sold the video, not the outcome. The drill, not the hole.
Until now
But we know that the right video, for the right person, at the right time, can be a hugely powerful driver of behaviour change. We know that if we can deliver on those 'right things' efficiently and consistently, that change will come. Then it's a matter of tracking and optimising for the audience, and iterating as necessary to get it right.
That's how we can start to offer a guarantee for effectiveness. We already guarantee the quality of the films we make. Now we can guarantee that they'll actually work.
How do we do that though?
Director, Narrative Strategy
This is where our newly promoted Director of Narrative Strategy, Oliver Atkinson, comes in.
Olly has been with Casual for thirteen years. Before that, he spent sixteen years in television, during which he worked on a BAFTA-winning series. He has run our EMEA business since 2018. Last week, we sat down for an in-depth chat about how he's thinking about what comes next.
The metrics we've been using are the wrong ones
The industry's answer to "did it work?" has, for years, been views, likes, and comments. These are Gen 2 advertising metrics - data points that exist because they're measurable, not because they're meaningful.
"They understand clicks. But they don't really understand whether you connected with somebody."
There's a structural reason for this. YouTube, Meta, and TikTok haven't built some of the most valuable businesses in the world by being indifferent to your money.
This is a fundamental misalignment in incentives. They're not optimising for your message to reach your audience. They're optimising for your spend with them. Those are not the same thing.
"A lot of the time we've been part of that conceit - making clients as happy as they can be, getting that approval from the organisation. But what's the film actually doing? Is it changing behaviour, or is it just getting approval internally? Because that's not good enough."
It isn't. And we've known it isn't. What we haven't had, until now, is anything better to offer.
The most loved ad isn't necessarily the most effective
Ninety-five percent of decisions are made subconsciously. The part of the brain that drives behaviour change is not the part that fills in a survey afterwards.
"What we say we like is a scrambled self-report of us post-rationalising behaviours we've already taken."
The 2018 Super Bowl clearly illustrates this. Amazon's ad - full of celebrities, universally liked, number one in awareness. Diet Coke's - widely mocked, consciously hated by almost everyone.
And yet Diet Coke was the most immersive piece of content in the room. The one most likely to actually change behaviour. By the metrics that matter - the successful launch of the flavoured Diet Coke line - it worked.
The things we put on our award shelves are not necessarily the things that work.
So how does StoryPulse™ actually work?
I asked Olly to walk me through how StoryPulse™ actually addresses this. He thinks about it in layers.
The first is understanding your audience's cognitive profile: not demographics, not personas, but psychological makeup.
It starts with the five major personality traits (openness, conscientiousness, extraversion, agreeableness, neuroticism): actual predictive science, not just a slightly more in-depth Myers-Briggs.
One example Olly uses is a whiskey brand that spent years convinced its audience was male, extrovert, and bar-going. But when they profiled by cognitive traits, they found they were mostly reaching introverted women. They made a film: a cat, a book, a fire, a whiskey. No media spend. A fifty-six percent revenue increase in three months, after five years of no growth.
"If you're speaking the language of the right audience, that's when you actually start to connect."
The second layer is measuring immersion during the edit - tracking neurological engagement in real time, finding exactly where you've lost people, and fixing it.
Spotify used this methodology to predict a hit song from twenty seconds of unreleased music with 97% accuracy, three months ahead of release. Asking the test group which songs they 'liked' showed no correlation at all.
Apply that to film, and a genuine effectiveness guarantee stops feeling like a fantasy.
"When you're producing from gut feel — and let's be honest, you're guessing — it's very hard to put a guarantee on the end."
The third layer is what happens after: tracking how the content performs with real audiences, iterating based on what the data shows, and tightening the feedback loop over time. That's what turns a one-off guarantee into something systematic.
Optimising for genuine connection
StoryPulse™ doesn't replace craft. Olly has nearly three decades of storytelling experience, which makes him the right person to lead on this. When you deliver, you're not hoping. You can point at the data: this is connecting, and here's why it will drive the behaviour change you need.
Not every brief needs this. Brand films, event content, and internal comms that just need to look good are generally fine as they are. But transformations, culture change, and communications where something meaningful actually has to shift?
"That's where this starts to become really quite good."
Oliver Atkinson is MD EMEA and Director of Narrative Strategy at Casual.

