During the Social Innovation Fund design process, I argued that the goal of the Fund should be to support the “next Nurse-Family Partnership”. Given the extremely limited number of nonprofits with rigorous evidence showing their programs work, I argued that funders should focus on supporting the organizations most likely to build strong evidence bases in the future.
The phrase “next Nurse-Family Partnership” was a reference to the nonprofit Nurse-Family Partnership, which is widely viewed as being one of the best examples of a high impact nonprofit with strong evidence.
So today I’m happy to feature a guest post from Peggy Hill, Nurse-Family Partnership’s Chief Strategic Relations Officer, in which she recommends how we can give rise to more nonprofits with strong evidence of impact. Who better to discuss how to give rise to the next Nurse-Family Partnership than the NFP itself?
By Peggy Hill
The Nurse-Family Partnership® stands as a widely recognized example of careful program development, scientific testing, and extensive, disciplined implementation in community settings. We are now confident that transformational changes of public health importance are possible when this program of nurse home visits to first-time mothers living in poverty is conducted properly. That is why we often hear the philanthropic call to invest in the “next NFP.”
Nurse-Family Partnership has a leadership position in evidence-based practice, and is eager to share lessons learned. We also continue to face new challenges. In fact, some of these new challenges are even more difficult as we enter the arenas of policy-making, finance, inter-governmental relations, and complex public-private partnerships. We continue to need philanthropic support to reach significant scale without sacrificing effectiveness.
What can philanthropy and government do to increase the odds of success in moving evidence-based programs into broader practice? We have learned three things that are powerful and could be focal points.
1. Invest in preparing a program’s host agencies and personnel to implement research-based programs properly. Different evidence-based programs have different requirements, and their practitioners need specific skills to be competent. New pilots practice with flight simulators before they are made responsible for multi-million dollar equipment and hundreds of lives. Let’s assume it’s no easier to prepare people to fix the most intransigent ills in our society.
New host agencies need to acquire accurate knowledge about how to implement a program successfully; they need sufficient funding well-suited to the program’s design; and they need guidance to recruit capable staff and help them become expert. And all programs need a data system to track performance so staff can figure it out quickly when things go wrong. (And things will go wrong.) Invest in these critical efforts to lay strong foundations for program operation.
2. Set policy in a way that fosters practice excellence and good outcomes. Rules and money shape behavior. Rules can be good. They lend consistency and can set standards that define success. They can also force compliance with standards that are over-broad or irrelevant to what particular program models must do to produce desired outcomes. What makes the most sense is for policy language to specify the outcomes that funders want to achieve, and to require grantees to specify the nature of the intervention they want to use, the rationale and evidence for that intervention’s effectiveness, and precisely how they will hold themselves accountable for implementing it well and achieving intended results.
3. Design evaluation of evidence-based programs based on specific knowledge of what each program requires to accomplish its outcomes. Evaluation design matters because the results are used to inform policy, practice and future funding — and what gets measured tends to be what gets done. Attend to indicators of good program implementation before you expect good outcomes. Measure what is most important for each program’s success, not what is easiest to measure or a least common denominator across very diverse programs; data collected that way may not be relevant or sufficient to inform decisions. And be realistic about what a small evaluation budget can buy in terms of methods and rock-solid conclusions.
The road ahead…
H.L. Mencken is often quoted as saying, “For every complex problem there is a solution that is simple, neat, and wrong.” Dr. Del Elliott and his team at the Center for the Study and Prevention of Violence at the University of Colorado reviewed more than 800 studies of violence prevention programs. They found 11 programs that worked, and 19 more that looked promising. The painful reality is that many social programs simply don’t make a big difference, for a whole host of reasons. Those that do make a difference tend to have clear, powerful core principles made operational through well-articulated intervention strategies, and equally well-crafted implementation supports in practice, policy and financing. When elegantly integrated at scale in the hands of diverse, competent practitioners, they can produce the impacts we all want to see.