Starry Night


Published in The Huffington Post, December 14, 2016
by Kevin McDermott, Jeremy Rabson, and Daryl Twitchell

Imagine a truth machine. To any question you asked it would respond with perfect accuracy. It would account for the human tendency to reject truths we don’t like by dispassionately explaining our biases to us and addressing them. We’d probably still have a problem with it.

Not that we’d necessarily be lying to ourselves. More likely we’d want a truth different from the one we say we want. Because a truth machine is dumb. It answers the question we ask and nothing more.

When people are presented with disagreeable facts they tend to double down on their own truth. It may be that we are neurologically wired to make decisions now and defend them later. Regardless of who you voted for in the last election, for example, you were probably complicit in editing data that didn’t suit your preferences—and almost certainly never acknowledged to yourself that you did. Editing the data need not always be a bad thing as long as it is conscious.

And as long as we understand that data isn’t truth. It’s signal.

Knowing what you believe in

Once we had a client for whom we did a marketing survey. The survey told us that the ad campaign the client was running was not yielding anything like its hoped-for return. “I know what the data says,” the client told us, “but I’m trying to build a brand.”

This was a woman who knew what she believed. She didn’t deny the data, but she understood the limits of its usefulness for her. It was OK to privilege belief over data as long as she could say out loud that she was doing so.

We don’t argue against a vision that steers an organization to a place the market doesn’t yet know it wishes to go. Entrepreneurs do that all the time, and we all admire their daring. They have a bias toward action, we say. When they take a risk in the face of discouraging data we admire their guts. They are willing to make choices in the absence of certainty.

Placing a considered bet on where the world is going is evidence of leadership. It is a strategic choice about the future. The risk lies in getting all emotional about it.

Detaching

The old joke about consultants is that we’ll borrow your watch and tell you what time it is. The main reason companies hire consultants is not that they know more than the people inside the client organization (usually they don’t) but that they bring a process from outside. Presumably the process is detached—a kind of truth machine.

The usefulness of a detached perspective—whether it comes from a consultant or from our superegos—is that it helps us observe the process by which we make choices.

Consider, for example, a large retail chain that knew it needed to acquire an electronics brand but didn’t know which one. Actually, it did know which one. Except that it was the wrong one. We did a study of acquisition options and recommended a choice different from the one the senior team already had in mind. The client commissioned a second study. Same result. The client could always ask for one more piece of missing information to justify not deciding, because it couldn’t distinguish between what it wanted and what it needed.

As outsiders we could see what the client couldn’t, which was that it was hoping for data to back up a decision the senior team had already made. This lacked common sense. It’s a moral hazard common to all of us. How many of us, for example, have been in romances where we fight the evidence and believe we can make everything right? People have an amazing capacity for ignoring cognitive dissonance.

Most of us don’t like saying we’re wrong—as we discovered with another client that kept throwing good money after bad on a hyperlocal strategy that didn’t work economically.

“It’s gotta work,” the CEO said. He hunted among the research for glimmers of encouragement. He was largely steering by his emotions, not by the data. Once again, want had become need.

Truth metrics

If we were defining a person with common sense, we would probably describe someone who makes plausible choices knowing they won’t ever have all the facts. They acknowledge their biases and navigate them. Their definition of success would boil down to whether they are making good decisions most of the time.

Organizations don’t always want a process for forcing common sense on themselves; they resist telling the emperor he has no clothes. That’s especially so when the emperor has had some success; no one wants to challenge what works. It only gets harder once money and careers are invested.

That’s the reason for establishing truth metrics before emotions are heightened. Dashboards and key performance indicators, for instance, are designed to be truth metrics. Their intended usefulness is to enable detachment and decisions based on data. That way at least we’re all agreed on what we’re arguing about.

Bear in mind that the very effort to generate hard data can carry a risk of confirmation bias: in the things we choose to track, in the questions we ask our truth machine. Wells Fargo, for example, tracked the number of accounts its customers opened. Gaming of the system started immediately.

Metrics should show more than mere progress. Just as your doctor does with your cholesterol level, decide what a healthy level of success looks like and hold yourself to it. If the thing you’re measuring isn’t in that healthy range, be honest with yourself and think about ending it.

To test any choice it helps to include perspectives different from your own. As we described in a prior piece there is value in having people who challenge your beliefs. That’s what boards of directors do, in theory.

You might choose to ignore the metrics and declare out loud that you’re doing so. But differentiate belief from data. Remember, belief’s not always bad (who knows, maybe you’re a visionary). But mistaking belief for data ignores the role data plays in signaling where you might find the truth.

