One of the great buzzwords of the effective philanthropy movement is the idea of “proven effective” programs. Since so many nonprofit programs are never tested and are based on ideas that have little research behind them, it makes sense to encourage the funding and deployment of programs that have been shown to work. Sensible as it is, I think this concept can be dangerous unless funders and nonprofits understand that “proof” is a process, not an event.
In 2010, the New Yorker published an article titled “The Truth Wears Off” that examined the “Decline Effect”: the seemingly inevitable way that, when scientific studies are repeated over and over, their positive results tend to diminish.
In the article, Jonah Lehrer wrote:
“The test of replicability, as it’s known, is the foundation of modern research. Replicability is how the community enforces itself. It’s a safeguard for the creep of subjectivity. Most of the time, scientists know what results they want, and that can influence the results they get. The premise of replicability is that the scientific community can correct for these flaws.
But now all sorts of well-established, multiply confirmed findings have started to look increasingly uncertain. It’s as if our facts were losing their truth: claims that have been enshrined in textbooks are suddenly unprovable. This phenomenon doesn’t yet have an official name, but it’s occurring across a wide range of fields, from psychology to ecology.”
Lehrer’s article incited a flood of angry letters and emails claiming that he was undermining scientific research and drumming up a controversy that doesn’t exist. However, my reading of the article (and of Lehrer’s responses to his critics) suggests that a much more modest claim lies at its heart.
Human knowledge is an evolving concept.
For all the perceived precision of a large study “proving” that something is true, the fact remains that over time our understanding of facts and truths changes.
Lehrer explains a number of reasons behind what is known as the “Decline Effect”. Taken together, much of the problem comes down to human cognitive biases and the behavioral quirks in the way we process information. For instance, Lehrer points to the way scientific journals strongly prefer to publish studies that prove something to be true, which gives scientists a significant incentive to produce exactly those results.
But even if you peel away all of the messiness of the human practice of scientific study, you are still left with the idea that seeking truth is a process, not an event.
Lehrer writes:
“The decline effect is actually a decline of illusion. While Karl Popper imagined falsification occurring with a single, definitive experiment—Galileo refuted Aristotelian mechanics in an afternoon—the process turns out to be much messier than that.
[The Decline Effect is so troubling] Not because it reveals the human fallibility of science, in which data are tweaked and beliefs shape perceptions. (Such shortcomings aren’t surprising, at least for scientists.) And not because it reveals that many of our most exciting theories are fleeting fads and will soon be rejected. (That idea has been around since Thomas Kuhn.) The decline effect is troubling because it reminds us how difficult it is to prove anything. We like to pretend that our experiments define the truth for us. But that’s often not the case. Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true.”
Now the silly way to react to the decline effect is to turn our backs on science and decide that if it cannot present us with unquestionable truth, then it doesn’t work (this is the message some of Lehrer’s critics thought he was pushing). The more useful way to react is simply to understand that the concepts of “truth” and “fact” are far less rigid and precise than we tend to treat them. The search for truth, for “proven programs,” will not end some day when we finally, finally, finally discover the real truth.
As Ralph Waldo Emerson said, “Life is a journey, not a destination.”
What this means for nonprofits and funders who want to direct their resources toward programs that actually work is that doing so will always be a continuous process. There will never be a final, definitive study that tells us the “truth” about the best way to eradicate poverty, end obesity, or give every individual the opportunities they deserve.
But that doesn’t mean we should lower our ambitions or reject the scientific process. Instead, I think that Ted Cadsby, writing in the Harvard Business Review, had it right when he argued in favor of adopting a mindset of “provisional truth”:
“Provisional truth requires that we think of our explanations as hypotheses — always subject to replacement based on new information or alternative ways of structuring existing information. Provisional truth means challenging our interpretations with disconfirming evidence and alternative perspectives. Provisional truth does not preclude drawing conclusions or taking action; but it demands that we be skeptical about our first reasonable explanations in the realm of complex problems. It keeps us humble and mentally flexible, constantly asking ourselves if we’ve really got everything figured out and responding, ‘Probably not.’”
But of course the scientists among you will recognize that the skepticism embedded in the idea of “provisional truth” is in fact a core aspect of the scientific process. The Decline Effect doesn’t discredit the process of scientific inquiry. Instead, it simply lays waste to the fetishism of the scientific process that deludes people into thinking that we can at last completely understand and control our world once we discover “the truth”.