The Center for Effective Philanthropy (CEP) has become one of the major thought leaders in philanthropy. Its research is taken seriously by major foundations and has had a real effect on foundation behavior. CEP's Grantee Perception Reports are probably the best case in point, and many foundations now publish these reports on their websites.
The CEP conference I spoke at last week showed just how influential the organization has become simply by looking at the audience. In a year when many foundations have cut back on the number of conferences they attend, the CEP conference was packed with sector leaders. Of the 285 people in attendance, fully 30% were foundation CEOs, and roughly 70% were CEOs, trustees, or other senior management.
So what's the fuss? At the core of CEP's message is their belief (backed by data) that foundation effectiveness rests on three essentials:
- Clear goals;
- Coherent, well-implemented strategies;
- Relevant performance indicators.
They call this the What, How and How Will We Know of philanthropic effectiveness.
The following quotes are excerpted from the opening plenary presentation given by CEP executive director Phil Buchanan and vice president of research Lisa Jackson.
One way to be clear about a goal is to be specific: specific about who your work is for or about; specific about where your work will take place; specific about how long you think it should take to achieve your goals; specific about the issue you are trying to address.
Forty percent of respondents to our survey provided goals that were not at all specific. They were goals like this:
- To catalyze the development of a community ethic in support of children
Let’s contrast this with a specific goal provided by another respondent to our survey:
- To increase the number of low-income youth in this state who complete high school on time and attain a postsecondary credential by the age of 25
Whatever your core values, whatever goals matter most to you, whatever you choose to try to do, the questions to ask are these: "Are your goals clear and specific enough to help you make choices about what strategy to use to achieve them?" "Are they clear and specific enough that you can assess whether they are being achieved?"
Coherent, well-implemented strategies
We define strategy as a framework for decision-making that is 1) focused on the external context in which a foundation works and 2) includes a logical causal connection between use of funder resources and goal achievement.
Our newest research on foundation strategy shows that those who are more strategic act in some specific ways:
- They make decisions about how to achieve their goals based on information external to the foundation: they look at what other funders are doing and consider models, best practices, and relevant research. They don't rely only on what they know, what the board thinks they should do, or what the foundation has done in the past.
- They have thought through logically what is required to achieve their goals beyond selecting grantees and making grants.
- They get feedback from the community, stakeholders, grantees, and others to inform their strategy.
- They tend to be more proactive in their grantmaking.
- They have a strategic plan that they reference regularly.
- They have performance indicators to assess whether or not their strategies are helping them achieve their goals.
Relevant performance indicators
One of the most difficult things for you to know is whether you are making progress toward achieving your goals and whether you are making an impact with your work. The challenges are familiar, and we have researched and written about them over the years: determining causality; aggregating results across disparate programs; the frequent long time lag before impact can be seen.
Still, we’ve seen a dramatic increase since 2002 in the number of foundations using robust performance indicators to assess effectiveness.
For example, the James Irvine Foundation now publishes an Annual Performance Report, and the Robert Wood Johnson Foundation has made public the scorecard its board uses to assess its effectiveness. These are just two examples among a growing number of funders that routinely report to their boards on a set of performance indicators, ranging from results on grantee, stakeholder, and staff surveys to program evaluation data closely connected to their outcome goals.
One of the things that most impressed me at the conference was that, while everyone I spoke with really admired CEP, the opening plenary presentation was followed by a long series of tough questions from the audience about CEP's approach. What do you think of their framework?