Failing Forward: Why we should focus on what doesn’t work
This article was originally published in AMEC's 2019 e-book, EVOLVING COMMUNICATION MEASUREMENT
The majority of data programs will fail. That’s a good thing.
This was a bold point made by Allyson Hugley of Prudential at the start of the most recent AMEC Summit. It was also emphasized by Facebook’s Daniel Stauber: “Success is a lousy teacher.”
Learning to accept – and even celebrate – failure feels more challenging for public relations than for perhaps any other discipline. This is even more true in the measurement space, where conversations often start from a point of ‘proving your worth’.
But as the marketing and communications world becomes increasingly digital-first, we’ve seen a shift toward a test-and-learn approach from other disciplines where accountability is paramount. Surely if these data-led disciplines can learn not to fear failure, so can we?
So, what can PR professionals do to start to change the culture? First and foremost: we have to be willing to try, and to accept that not all outcomes will be what we expect.
We can look to some of the world’s largest companies to understand how experimentation (and failing forward) is crucial to their aggressive growth. Amazon CEO Jeff Bezos has spoken about the importance of failure in the company’s growth and innovation strategy.
“If you’re going to take bold bets, they’re going to be experiments,” he explained in HBR shortly after Amazon bought Whole Foods. “And if they’re experiments, you don’t know ahead of time if they’re going to work. Experiments are by their very nature prone to failure. But a few big successes compensate for dozens and dozens of things that didn’t work.”
In the PR context, you might not want to start with a large-scale acquisition.
But there are lots of small ways you can begin to embrace failing forward to improve your content and campaigns. Here are three to consider trying in your organization.
Embrace a test-and-learn model on your next campaign
Always-on reporting is becoming commonplace for brands during big launch moments. Why not take that data and turn it on its head?
Focus on what’s not working, and what you can learn from it – rather than only telling a victory story with the big numbers. Make sure you can segment your data easily, then start to pull out the individual tactics that aren’t landing. Are you struggling to get coverage in lifestyle titles? Is message 2 never included? Are GIFs working but short-form video struggling?
Focusing on these areas helps ensure you’re using budget (and time) effectively, and can reveal truths about your audience you’d never uncover otherwise.
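As a concrete illustration, here is a minimal sketch of what that segmentation might look like, assuming you can export per-piece results with a tactic label and basic outcome metrics. The column names, figures and the 3% benchmark are all hypothetical – swap in whatever your own reporting uses.

```python
import pandas as pd

# Hypothetical campaign export: one row per piece of content,
# with the tactic used and a couple of outcome metrics.
results = pd.DataFrame({
    "tactic":      ["gif", "gif", "short_video", "short_video", "lifestyle_pitch", "press_release"],
    "impressions": [12000, 15000, 9000, 7000, 3000, 20000],
    "engagements": [840, 1125, 180, 98, 30, 1400],
})

# Engagement rate per piece, then averaged per tactic.
results["engagement_rate"] = results["engagements"] / results["impressions"]
by_tactic = results.groupby("tactic")["engagement_rate"].mean().sort_values()

# Flag tactics falling below an illustrative 3% benchmark --
# these are the candidates to rework or retire, not to hide.
benchmark = 0.03
underperforming = by_tactic[by_tactic < benchmark]
print("Tactics below benchmark:")
print(underperforming)
```

Even a simple cut like this moves the conversation from “how big was the total number?” to “which specific tactics deserve another look?”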
Where you don’t have the information – trial it in a real-world scenario
There are multiple ways this might work. Within a social context, you might consider creating some dark posts targeted at a segment of your audience with a new piece of collateral and test how it performs against your existing benchmarks. For a relatively small investment, you’ll know whether it is worth pursuing further… or if you should go back and try again. If you don’t want to trial it with a live audience, you can also use primary research methods to understand how your audience is likely to respond.
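For example, here is a minimal sketch of how you might judge a dark-post trial against an existing benchmark, assuming you can export impressions and engagements for the test and that you hold a historical engagement-rate benchmark. All the figures are illustrative, and a one-sample proportion test is just one simple way to check whether the gap is more than noise at that sample size.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical numbers from a small dark-post trial of a new creative.
test_impressions = 5000
test_engagements = 210          # 4.2% observed engagement rate

# Existing benchmark engagement rate from previous campaigns (illustrative).
benchmark_rate = 0.035

# One-sample test: is the new creative's rate meaningfully different
# from the benchmark, or could the gap just be noise?
stat, p_value = proportions_ztest(
    count=test_engagements,
    nobs=test_impressions,
    value=benchmark_rate,
)

observed_rate = test_engagements / test_impressions
print(f"Observed rate: {observed_rate:.3%} vs benchmark {benchmark_rate:.1%} (p = {p_value:.3f})")
```
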
Within H+K’s behavioral science unit, we’ve worked with traditional survey vendors to create real-world experiences to test whether our hypotheses (for example, about messaging or design) bear out when put to use in a real-world scenario. Do they always work? No. Do we always come out with a better final product for having tested multiple, bolder iterations?
Absolutely.
Be bold in your next big measurement or research project
This really comes down to study design. For most researchers, there are tried-and-true ways to answer a question. Maybe you even follow templates when it comes to certain requests.
If you’re looking to fail forward, though, you have to be willing to take risks in your study design. Look at new metrics, or novel combinations of data. Ask your clients and partners for other datasets you can include. Know that not every piece of research will work out; sometimes you will hit the data equivalent of a dead end. But some projects will pay off, and those with a willingness to fail and experiment are going to have a much higher rate of innovation in their use of data over the longer term.
Shifting our mindsets – individually or collectively – around our approach to failure is no small task. Measurement that really looks out for our blind spots and digs into what isn’t working well won’t always paint a rosy picture.
When you embrace failing forward, however, you can improve content and reputations over time by listening to what your target audience is telling you. And really, what could be a better use of data and measurement than that?