Numbers only tell part of the story. A deeper analysis is required to glean actionable intelligence.
Predictions are getting a bad rap. From the improbable Chicago Cubs World Series win to the Atlanta Falcons somehow snatching defeat from the jaws of victory, and most notably, last year’s stunning presidential upset, some of the smartest data-crunchers and metrics mavens have made seemingly rock-solid predictions, only to be proven disastrously (and embarrassingly) wrong.
It’s enough to make a fortune teller break her crystal ball into 538 pieces.
Truth is, it isn’t the data that’s the problem, but the analysis, and that’s an important distinction. How we view numbers – and, for marketers, how we build plans around those numbers – is an art form that needs improvement. Carpenters are taught to measure twice so they only have to cut once, but data-driven predictions so often go wrong because of an instinct to jump straight to a thousand cuts after only a cursory pass with the yardstick.
In the aftermath of the presidential election, data geeks like Nate Silver, caught with egg on their faces, have issued all manner of mea culpas, blaming their faulty predictions on a “small, systematic polling error” or on undecided voters breaking for Trump en masse. Silver even gave himself a big pat on the back for the fact that his FiveThirtyEight gave Hillary Clinton only a 70 percent chance of winning the election—much lower odds than The New York Times’ Upshot (85 percent) and Huffington Post (98 percent).
But not everyone was so assured of a Clinton victory.
Wired, working with its partner, Networked Insights, posted its final map on November 7. The prediction: an Electoral College tie—269 to 269. Still wrong, but far closer to the actual result (Trump 306, Clinton 232). What did Wired do differently from Silver and the other data geeks?
They understood that numbers only tell part of the story. Voting in a presidential election is an emotional act. We vote for an intangible quality in a leader, not simply a set of policy positions. If you take out the emotion, you miss an essential part of the story. To solve for this, the company built a data model that also captured emotion as expressed through social media posts. While some national polls missed swings driven by big events—like the Comey announcement—this metric picked them up.
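Networked Insights hasn’t published the details of that model, but the underlying idea is easy to sketch: blend a conventional polling signal with an emotion score pulled from social posts. The toy example below is purely illustrative; the keyword scorer, the blending weight, and the scaling are invented stand-ins, not the real system.

```python
# Purely illustrative: the scorer, keywords, and weight below are invented
# stand-ins for the idea of blending polls with social-media emotion.
def sentiment_score(post):
    """Crude keyword scorer: +1 for enthusiasm, -1 for anger."""
    positive = ("excited", "proud", "fired up")
    negative = ("angry", "fed up", "disgusted")
    text = post.lower()
    raw = sum(word in text for word in positive) - sum(word in text for word in negative)
    return max(-1.0, min(1.0, float(raw)))

def blended_margin(polling_margin, social_posts, emotion_weight=0.3):
    """Nudge a state's polling margin (in points) toward the online mood."""
    if not social_posts:
        return polling_margin
    avg_emotion = sum(sentiment_score(p) for p in social_posts) / len(social_posts)
    # A sudden swing in emotion (say, after the Comey announcement) moves the
    # blended number even before traditional polls catch up.
    return (1 - emotion_weight) * polling_margin + emotion_weight * (avg_emotion * 10)

print(blended_margin(3.0, ["So fed up with this election", "Angry and staying home"]))
```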
The takeaway from the 2016 election: Data does not live in a vacuum.
Whether you’re trying to predict an upcoming presidential race or evaluate the effectiveness of a digital-marketing campaign, raw data is a good place to start, but it’s not the end of the story.
Use your CRM to determine whether you’re meeting your goals. Sometimes, you’ll need to talk to the people on the front lines: your salespeople. Find out if there’s been an increase in sales volume or closed business. This kind of qualitative evidence will help you determine whether your campaign is meeting your business goals, not just producing a bunch of vanity metrics.
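For instance, if your CRM can export closed deals to a CSV, a few lines of Python are enough to compare closed business before and after a campaign launch. This is a minimal sketch: the file name, column names, and launch date are placeholders for your own export.

```python
# A minimal sketch, assuming a CRM export with "close_date" (ISO format)
# and "amount" columns; the file name and launch date are placeholders.
import csv
from datetime import date

CAMPAIGN_LAUNCH = date(2017, 1, 15)  # hypothetical launch date

def closed_business(csv_path="crm_closed_deals.csv"):
    before = after = 0.0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            amount = float(row["amount"])
            if date.fromisoformat(row["close_date"]) < CAMPAIGN_LAUNCH:
                before += amount
            else:
                after += amount
    return before, after

before, after = closed_business()
print(f"Closed business before launch: ${before:,.0f}")
print(f"Closed business after launch:  ${after:,.0f}")
```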
Dig deep into the numbers to unearth larger truths. Who’s most engaged with your campaign? Get granular. Is there a particular persona or demographic that is especially drawn to your message? Knowing this will help you focus your efforts moving forward, whether that means targeting distinct groups or tweaking your message to appeal to a wider audience.
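As a rough illustration, assuming your analytics export tags each interaction with a persona label and an engagement flag (both stand-ins here), a quick tally shows which audience is actually responding:

```python
# Illustrative only: persona labels and records are stand-in data for
# whatever your analytics platform actually exports.
from collections import Counter

interactions = [
    {"persona": "IT decision-maker", "engaged": True},
    {"persona": "Marketing manager", "engaged": False},
    {"persona": "IT decision-maker", "engaged": True},
    {"persona": "Small-business owner", "engaged": True},
]

engaged_by_persona = Counter(
    row["persona"] for row in interactions if row["engaged"]
)
for persona, count in engaged_by_persona.most_common():
    print(f"{persona}: {count} engaged interactions")
```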
Always go back and evaluate to make sure your metrics correspond with your goals. If you’re not measuring the right things, you’ll never get the answers you’re looking for.
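One simple discipline is to write that mapping down. The goals and metrics below are hypothetical, but the check itself takes seconds: any goal with no metric attached is a sign you’re about to measure the wrong things.

```python
# Hypothetical goal-to-metric map; flag any goal that nothing is measuring.
goal_to_metrics = {
    "Grow qualified pipeline": ["MQLs", "SQLs"],
    "Increase closed business": ["closed-won revenue"],
    "Raise brand awareness": [],  # nothing mapped yet
}

for goal, metrics in goal_to_metrics.items():
    if not metrics:
        print(f"No metric tied to goal: {goal}")
```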
They say “you can’t manage what you don’t measure.” But you need to make sure your measurements correspond with your goals. Otherwise, it’s like having a crystal ball without a fortune teller to ask the questions.