Today’s interview is with James Canales. Jim is president and CEO of the James Irvine Foundation, a 1.7 billion dollar private foundation dedicated to expanding opportunity for the people of California.
You can read a complete background report on Jim here (feel free to add your own background notes via the comments). Briefly: Jim was one of the panelists at the Demonstrating Impact session at the Council on Foundations conference in Seattle this year (you can read my write-ups here and here). The Irvine Foundation recently released a report called Midcourse Corrections (I wrote about it briefly here). After committing to a $60 million initiative, the biggest in their history, the Irvine Foundation realized that all was not going as planned. They released the Midcourse Corrections report to help other foundations avoid the mistakes they made. This is exactly the sort of knowledge sharing that I’ve been advocating.
Be sure to visit the Comments section at the bottom of this post to follow along with, and participate in, the follow-up conversation with Jim.
Sean: Hello, and welcome to the Tactical Philanthropy podcast. I’m Sean Stannard-Stockton, author of the Tactical Philanthropy Blog, and Principal and Director of Tactical Philanthropy at Ensemble Capital. My guest today is James Canales. Jim is president and CEO of the James Irvine Foundation, a 1.7 billion dollar private foundation dedicated to expanding opportunity for the people of California. Hi, Jim. It’s nice to have you on the show.
Jim: Oh, thanks very much, Sean. Nice to be here.
Sean: So, Jim, I’d like to just jump right in and start talking about Midcourse Corrections. I’ve blogged about that some already. You released this report in May of this year and detailed a number of problems you were seeing with your largest initiative in the foundation’s history. What led to the release of the report, and how did Midcourse Corrections further the mission of the Irvine Foundation?
Jim: Well, in our case, the Midcourse Corrections report, which we commissioned Gary Walker, President Emeritus of Public/Private Ventures, to write, was really an effort on our part to take a step back from a decision we made in 2003: we took a very close, hard look at an initiative that at that point was several years into its course, and we made a number of fundamental changes based on that assessment.
And now, as we near the conclusion of this eight-year effort on our foundation's part, we thought it might be valuable to share some of the lessons we learned from making that midcourse correction, lessons that might be of value not just to us, as we think about other complicated initiatives in the future work of the Irvine Foundation, but perhaps to other philanthropies, given that these kinds of big, multi-site, complicated, long-term initiatives are something you tend to see across other foundations.
So it was really our desire and effort to share what we had learned from our experience in an effort to inform the field and, hopefully, in an effort to prevent others from perhaps making some of the same mistakes that we made a number of years ago.
Sean: You know, I really like this concept, that foundations, or any philanthropists, can view themselves as a member of a field, and therefore, doing work to help other philanthropists is in your own self-interest as opposed to a truly competitive market situation where your own success is all that matters. But releasing a report like Midcourse Corrections is relatively rare. I mean, there are some other instances of it, but do you think that that’s because the Irvine Foundation has a different mindset in viewing these issues, or is this something we’re just going to start seeing a lot of across the field?
Jim: Well, it’s my hope that we are going to see more of this across the field. And I was pleased that the Hewlett Foundation (there was no collusion here, if you will) also recently released a report on a major initiative, a neighborhood initiative, that they had undertaken for many years. Paul Brest, the president of that foundation, commissioned a similar report to look at some of the lessons learned from that effort. These two reports came out at roughly the same time.
I hope that it’s a refreshing signal that we will see other foundations taking a step back from time to time and finding ways to share lessons from their experiences.
One of the things I noted in the foreword to our report is that one of the many great privileges we have as a private foundation is access to knowledge. We have so much knowledge in our institutions because of the work that we are able to do with other partners. The question is how we can capture that knowledge and share it in ways that might be useful to others. That’s really what we’re trying to do through this report.
Sean: I’ve been writing a lot about transparency on the Tactical Philanthropy Blog. I believe that transparency is most important not as a public accountability issue, but as a highly effective way for foundations to increase their own effectiveness and to help all foundations complete their mission.
However, I recently had a foundation employee leave the following comment on a blog that I’ll quote from: “I worked for a foundation that had great success and big failures. We published some of these failures. All transparency did was to allow our so-called failure to eclipse the many successes in discussion.” How would you respond to this view of transparency?
Jim: Well, I think it gets back to what you’re saying: transparency is a mindset, it’s a value, and it’s something that ought to permeate the foundation. In the case of an institution that maybe every once in a while releases a report about a mistake it made, or an effort that perhaps didn’t go as planned, and that basically becomes its sole effort at transparency — I can understand how that may lead some to feel that all attention has been diverted to that effort.
But if you’re an institution that has transparency as a core value, it’s manifested in a whole number of ways: not just through the release of reports that talk about mistakes, but also through sharing some of the work that your grant partners are doing, through dynamic websites that help people access stories of the work you are supporting as well as your grants themselves, and through making an effort to be out in the field at conferences, to be a visible partner to others in the fields in which you work. All of that, to me, bespeaks a commitment to transparency.
So therefore, transparency, for me and for our foundation, is really about being open about the work that we are doing: hopefully, the successes that we are able to support as well as ways that perhaps things didn’t go as we had hoped they’d have gone — not just to say we made a mistake, but also to say, “Here’s what we learned from it, and here’s how it might inform our own work going forward, and potentially might inform the work of others.”
Sean: Jim, I attended and wrote about the Demonstrating Impact session at the Council on Foundations Conference this year in Seattle. In fact, my post on the session has become the most widely read post in the blog’s history. During that session, your fellow panelist James Knickman said, “We need to frame our release of failures as an attempt to learn. No one tells scientists they’re failures when their experiments don’t work.”
This view of philanthropy as a field that should be characterized by experimentation, risk-taking, and thinking far outside the box fits with my own view of where philanthropy needs to go. Do you agree with this prescription for philanthropy, and if so, what do you think prevents the field from making this shift?
Jim: Well, I absolutely agree with Jim Knickman. I think it is true of philanthropy, and certainly it’s our value here at Irvine and I think it’s the same for Jim as well, that our role is to foster innovation, to take risks, to make bets, and to test hypotheses in the work that we do — and obviously this is all work that we do through others, not just on our own.
And because we are doing that kind of work, I think it is important to take a step back from time to time and ask, “What are we learning from the risks that we’re taking? What are we learning from the bets that we’re making? What are we learning from the hypotheses that we’re testing that may well lead us to find ways to improve upon the work that we do going forward?”
To me, foundation work is very much a give-and-take process: you test a strategy, you learn from that test, and then hopefully you refine the strategy as you go along. It’s very much at the core of the way we approach our work here, and I think it’s a good movement in philanthropy. As for why this may not have been a value in the field over time, I’m not sure I know much about the motives of others, but I can say that in our case, having this agenda of finding ways to share our knowledge broadly is an important value, and it’s something that we’re committed to.
Sean: If you think about that analogy Jim made, comparing scientists and philanthropists: scientists have that culture of sharing information, and they don’t view it as admitting failure; it’s simply sharing what they know. That’s a cultural value they hold as scientists. It would be wonderful to see that become a widespread cultural trait in philanthropy.
Jim: I would agree with that, and I think one of the obstacles — there are probably a number of obstacles that get in the way of our ability to have that mindset — is that in philanthropy, in private foundations at least, the market force is really to make grants.
You have budgets to meet, you have payouts to meet, and our institutions are structured as grantmakers. We make grants.
The question for me is, how much time are we able to devote to monitoring those grants in an effective way, thereby accumulating knowledge from the grants we made a year ago or two years ago, and then finding ways to roll that knowledge up so it is accessible to others?
I think that’s one of the challenges. If you are trying to keep your staff fairly lean (and many of us in foundations make an effort to keep our staffs lean so that we can invest the largest possible resources in the community), you also have pressure to make grants and board dockets that you need to attend to.
It’s often hard to have the discipline to find the time to take a step back and ask, “What are we learning from this work?” And not just in terms of our internal conversation.
Obviously all foundations have those conversations on an ongoing basis, but how do we find the time and carve out the space to share that learning, whether it’s through an article on a website, through publishing these kinds of reports, through writing up [unintelligible], or through serving on panels where we have an opportunity to take a step back and reflect?
I guess it’s my hope that we will see in philanthropy a higher premium on that kind of activity. I think it ultimately enriches the field as a whole.
Sean: That would be great. Absolutely. Yeah, Jim Knickman made another comment in the session that was very well put in responding to a question on measuring impact. Knickman said, “A good soup is made up of lots of parts, it is hard to identify if the garlic or the carrots is what’s making it good. But you can identify if the soup is good. If you’re making a bad soup, do something different.”
How do you think that the impact of philanthropy can be best measured in service of providing us today with what we need to improve as a field?
Jim: Well, I think you need to begin by unpacking the term “impact” and asking what we mean by having impact. The way we have thought about it at our foundation is fairly holistic.
Clearly, as foundations, and given where our resources go, the priority needs to be on finding ways to measure and assess whether we are having impact through the grants we are making, with the partners we are engaging, and toward the ends we agree to when we support work.
Obviously there is work we can do: we can have grantees self-report on their progress toward goals and objectives, and from time to time we can hire evaluators to assess the effectiveness of particular initiatives or clusters of grants we’re undertaking. That’s all well and good; foundations are doing it, and it is a worthy and important activity to undertake.
At the same time, it’s been our view here at Irvine that we should take a broader approach to thinking about our performance, and we’ve created something we call a Performance Assessment Framework, which we’ve described on our website.
This was something we engaged our board on a couple of years ago, and the outcome of that process was the development of a framework that we wanted to test for a few years, primarily as a way to report back to our board on how we assess our performance as an institution.
We then decided not just to share the results with our board but also to share them publicly, again hopefully to make a contribution to the field and to develop our own thinking by sharing it broadly.
One of the things we found when we undertook this process is that there weren’t a lot of examples out there of foundations that had thought holistically about the way they assess their performance as an institution.
There were some: the Lumina Foundation does this, the Rockefeller Brothers Fund had some examples, and the Robert Wood Johnson Foundation, where James Knickman worked, had some good ones. Those were the examples we drew from to create our framework.
I won’t go into details; anyone who’s interested can go to www.irvine.org and find the Performance Assessment Framework there. But it effectively looks at three areas related to programmatic impact and three areas related to organizational impact.
The whole effort, again, is to create a holistic picture: how do we as an institution think about our performance, assess our performance, and report to our board on it on an annual basis? Program impact is obviously a big part of that, but we think there are other dimensions that an institution like ours ought to be looking at.
Sean: So we have time for one last question. I want to swing back to transparency a little bit. A lot of the focus seems to be on really large instances of transparency, like the Midcourse Corrections Report, which took a lot of time and money and should hopefully have a very large impact.
A lot of the new social media tools, which are referred to as Web 2.0, make it extremely inexpensive for people to update the world on every little action they take. Have you seen anything in philanthropy, maybe something you’re doing, or ways that people in philanthropy think about being transparent around smaller issues (not giant programs, but the everyday things they do) without getting bogged down in revealing meaningless day-to-day details?
Jim: Well, you know, it is interesting: one of the other panelists at the Council on Foundations session was Joel Fleishman, who of course recently wrote a book called “The Foundation,” subtitled “A Great American Secret.” In many respects I’ve been very pleased to see this movement toward transparency in the philanthropic community, because I do think there is a lack of widespread understanding of what our institutions do, the role that they play, and, hopefully, the contributions that we make.
And, again, in talking about the contributions that foundations make, I want to stress that these contributions are all made in partnership with those we are privileged to support. It’s not just us; it could never be just us. We are an enabler of a wonderful constellation of organizations that make up the non-profit sector.
So in that context, I do think that the movement toward transparency is an important one. I find that I’m learning every day from others in the field about ways that we can do this more effectively and perhaps even in smaller ways.
Now, your blog is an example of something that might be characterized as Web 2.0. We are seeing a proliferation of blogs on philanthropy and the non-profit sector, and I have been pleased to see some of my colleagues in foundations take this on themselves.
There is a woman whose name I don’t recall, who runs a community foundation somewhere in the Midwest, I believe, who has started her own blog. And I know that Albert Ruesga of the Meyers Foundation has his own blog.
I think these are great tools to shed light on the work of organizations like ours, a world that many people don’t know much about. The more people understand the work of foundations, the better our field is, and I hope that we at Irvine are taking some very small steps toward contributing to that broader aspiration.
Sean: Well, Jim, thank you for the steps you have taken toward transparency already. Thank you for joining us today.
Jim: Sean, thanks again for having me.
Sean: This has been the Tactical Philanthropy Podcast. You can visit us at tacticalphilanthropy.com. For more information about James Canales and the James Irvine Foundation, visit irvine.org. Thanks so much.
25 Comments
It is refreshing to see foundations such as Irvine speaking openly about their work–even thoughtfully planned initiatives that veer off-course and need correcting. Special credit is also due Jim Canales for acknowledging his responsibility in the report for launching the initiative. I hope other foundation leaders see that there is nothing to fear from being so open, candid and willing to let others take a look at the inner workings and thinking of foundations–something that is still a mysterious process for many.
Thanks for participating in this interview. Your “Midcourse corrections” report is largely clear about what you did and what you learned, a rare quality (and one I don’t think the Hewlett report has at all). It’s great that you’re making it possible for people to ask questions about your work, and that’s what I’d like to do:
1. Your report makes it very clear why you decided to target after-school programs; I’m with you up till that point. Then, you lose me as you describe how the content of the programs was mapped out. What “key players” did you consult and how did you choose them? What was the case for the content you did choose?
2. Your report refers (page 12) to a body of research implying things like “It is very difficult to change educational performance through after-school programming.” What is this research? Do you have a writeup summarizing what you’ve looked at and what conclusions it’s led you to? Can I/we see this writeup?
3. When you describe what went wrong, it looks to me like you had always agreed that a strong educational component was necessary, but it wasn’t happening, so you agreed to let P/PV design programs, and cut funds from less education-focused programs. Is that right? If so, how did you come to the original conclusion that educational content was so important, and what was P/PV’s case for focusing specifically on literacy?
4. Are you concerned at all that you hired P/PV to do an evaluation and they concluded that what you needed was more involvement from (and pay for) P/PV? How was the potential untrustworthiness of that conclusion addressed?
I generally find your “lessons learned” to mesh with common sense, especially regarding having a theory of change and testing it throughout the program. Some of these lessons (“Think critically about the facts on which the initiative is based”) are so common sense as to seem vapid to me. But lessons 1, 4, and 5 are lessons I can get behind.
One more unrelated question: what is your time frame for giving out all your money? Do you agree that the “return on good” from helping people to become productive citizens (and thus help others) is higher than the return on financial investment? If so, does your time frame reflect this?
A note to Sean: I think this is your worst interviewing job, by far. A foundation publishes a report that actually speaks honestly about what it did – to me, the appropriate response is to read it critically and take the conversation further. Or, just push more on foundation transparency – I mean, this report is a step, but we’ve all discussed thousands of other steps we’d like to see. Instead, you threw a bunch of softball questions along the lines of “Do you agree that transparency is good?”
Holden–
I could use some clarification about your comment re: P/PV. Are you questioning the organization’s credibility because of their conclusions? For what it is worth, foundations and others commission P/PV because of its sterling reputation, the knowledge that its work is unquestionably of the highest possible quality, and that undoubtedly what they have to say is worth paying attention to. Gary Walker, in particular, is one of the best in the business. It certainly is fair to ask your question about P/PV’s bona fides, such as why they were chosen among others, what made them best suited, the scope of the assignment, etc. But I hope (dare I say it) you don’t really see ulterior motives behind their conclusions. I’m sure Gary and company, as well as Irvine, can answer your question to your satisfaction. But until then, I hate to see that comment sitting there unanswered.
Sean: Thanks again for doing these podcasts and for helping to take the conversation about philanthropy out to the streets (however virtual those streets might be).
Jim: Thanks so much for doing this interview. A big ibid. to Bruce’s comment. The public can see part of your commitment to transparency in that section of your website that discusses Irvine’s grantmaking philosophy and approach, as well as its guiding principles. I know that often a foundation’s very specific funding priorities and the detailed rationales for them are, for many good reasons, not made public. Do your published guidelines give a complete picture of your funding priorities? Has Irvine ever considered publishing program officers’ write-ups, for example? I know that many folks in the field who are committed to transparency struggle with these issues. Can you give us a sense of how Irvine resolved them?
“I know that often a foundation’s very specific funding priorities and the detailed rationales for them are, for many good reasons, not made public.”
Albert, what are these reasons? I’ve been looking for them (the good ones, anyway) in vain.
Holden, foundations that desire to be more transparent about their rationales want also to be responsive to members of the public when they challenge these rationales. Publishing rationales and the like on a website takes little time, but foundations can eat up considerable staff resources defending their thinking. It’s not every grantmaker that can afford an ombudsman.
Then there’s the nature of these rationales: they’re not simple deductive arguments. A decision to fund or not fund in a specific area is often a complex function of a foundation’s history and ethos; the knowledge and experience of its program staff; and other factors. Many foundations will commission studies, surveys, scans, and the like, but between these tools and the setting of funding priorities there’s often a gulf—smaller or wider as the case may be. This gulf can easily become the locus of a lot of fruitless criticism.
Thanks to those who have taken the time to post their comments in response to both the Irvine Midcourse Corrections report and the interview.
I would divide Holden’s various questions into substantive ones related to the content of the CORAL grantmaking program and its redesign (questions #1, #2, and #3 in his post) and process questions (the other comments and questions in his post), which was the primary focus of the report. In this response this Thursday morning, I will address the latter, at the same time as I commit to posting a later reply that will seek to address his substantive, programmatic questions.
Let me start with the comment about P/PV and echo Bruce’s observations about that firm. Bruce’s comments about P/PV’s reputation are very much on the mark, and their work with us over the years has borne that out. Let me also clarify P/PV’s role here. P/PV was selected both as the intermediary organization for the initiative and as the evaluator in parallel RFP processes. In other words, we carried out a rigorous RFP process to identify an intermediary organization and an evaluator to work with us on this next phase of CORAL in the fall of 2003, and in the process we interviewed other firms prior to selecting P/PV to carry out that work in each domain. In their role as intermediary, they then identified some of the gaps based on the evaluation data that had been produced by the previous firm (not P/PV), and that led to the focus of the midcourse correction. As far as the implication that their conclusions were intended to generate additional business for them, that is problematic on three fronts: first, it’s not factually accurate; second, it would not be consistent with the integrity of their work; and finally, the conclusion that we needed to retool the effort was neither controversial nor difficult to reach.
That said in regard to P/PV, I think Holden is raising an interesting question as it relates to firms that work with foundations and which have various competencies. We have found ourselves in other circumstances using the same firm to carry out both evaluation activities for an initiative at the same time as they provide technical assistance to the grantees. As the cost and burden on grantees of dealing with multiple organizations for evaluation and for technical assistance can be high, in certain circumstances, we have found that these roles can best be coordinated within a single entity. This has obviously raised questions for us, for our board, and likely for our partners, about the objectivity of the evaluation work if the firm is also invested in its implementation. It also raises questions for the grantees about how much they share with the firm, knowing that they are not only technical assistance providers, but also evaluators. We have managed this at Irvine in three ways: first, only hiring the same firm for both roles when we are persuaded in our professional judgment that the firm is the best equipped to carry out each role; second, by ensuring that a “firewall” is created between these two roles, typically involving different professionals in each of the two roles; and finally, by communicating clearly to the grantee partners about this dual role and its implications. At the same time, I’m sure there are better ways to do this and other perspectives on this question that would be worth exploring if it’s of interest to other readers. So fire away with your thoughts. I’d be interested in learning more.
Holden’s final question about time frame, if I read it correctly, invites a discussion about perpetuity vs. spend down, a subject that quite frankly I have not considered as thoroughly as many others in the field and where there is rich literature available on each side of the debate. Irvine has operated under the perpetuity model, though the board can elect to change that, and I have accepted that approach as the one that continues while I am in this role.
Albert’s post asks about funding priorities and how “transparent” those really are. We, like many others, make an effort to be as clear as we can about our priorities so that prospective grantseekers have as much information as possible to guide their inquiries to us. But it is true that there are more programs and organizations that, on paper, are a fit with our guidelines than we can fund, which then leads us more to the art than the science of grantmaking in areas such as assessing organizational health, leadership, governance, finances, planning/evaluation capacities, etc. In other words, there are indeed various factors that contribute to our internal decision-making, and there are conclusions we reach that others may disagree with. We have not discussed internally whether to post our grant write-ups publicly, and that’s an interesting idea. If there are models out there we could learn from, we’d value that.
Finally, while the note to Sean from Holden about the interview was not intended for me, I do want to probe one aspect of it. Holden notes that “the report is a step, but we’ve all discussed thousands of other steps we’d like to see.” I would be interested in learning more about some of these steps that have been most resonant for the “we” in that statement. This blog strikes me as a great opportunity not only for me to address other questions, but also for me (and us at Irvine) to learn from all of you, so I hope there will be some postings that address ways we could advance transparency as a field.
With thanks again,
Jim Canales
Haven’t read Jim’s comment yet … Albert, your reasons basically come down to “Foundations might draw a lot of criticism they don’t have time to answer.” This presumes:
A. That the cost of time taken to answer questions outweighs the benefit of good criticism. That, to me, is an incredibly arrogant attitude. Given how small and like-minded the communities from which foundations currently receive criticism are, this just doesn’t seem plausible.
B. That foundations not only don’t have the staff to spend this time, but they can’t hire it. But then – foundations have tons of extra money! At least, according to the only plausible argument I’ve seen for why they give out only ~5% a year, that there simply aren’t enough worthwhile projects to fund. They’ve got cash sitting around begging for a place to spend it – why not hire a Director of Feedback?
C. That publishing this stuff without responding to comments would be worse than simply keeping it locked in the vault. Even if my analysis of A and B is off, this is ridiculous. If a foundation publishes its reasoning and is unable to respond to comments, maybe the foundation looks worse, but I can’t imagine how you’d argue that the world isn’t better off net. And the foundation’s mission is to improve the world, not its image.
Jim, first a couple quick notes/questions:
1. I don’t think the decision to use P/PV as you did is ipso facto bad, but it needs to be better explained. The report makes it sound like P/PV’s recommendation was the main reason the action was taken, and that just isn’t acceptable, no matter how good their reputation. (Ulterior motives can be subtle, even unconscious – that’s why conflict of interest policies are followed even by the most upright citizens.) It would be better to outline the content of their argument to continue employing them, i.e., what evidence they presented that their plan was the right way to go.
2. That’s fine if you don’t want to debate perpetuity, but would you mind pointing me to a starting place for the literature you mention?
Jim, in response to your question about promoting transparency, I’m just going to list some of the things that we at The Clear Fund are doing. Our project has the explicit aim of being the world’s most transparent grantmaker, in order to foster better dialogue and help inform individual donors, who collectively dwarf foundation giving. It’s a startup, so some of what I describe doesn’t have examples yet.
Is this extra work? Yes, although I don’t think it just pays off in transparency. I know that every single thing I do has to be written down and documented. That forces me into certain habits, and it’s good for the interaction between me and our Board of Directors, even if we had no interest in more general outreach. I believe that forcing as much as possible (about your values, priorities, research, decisions) into writing will improve the quality of those values, priorities, research, and decisions … and once it’s in writing, may as well stick it on the web.
Thoughts?
Jim, thanks for participating in all of this. I have a question for you and then my thoughts on how (and why) foundations can be more transparent.
In my first question during the podcast, I asked you, “What led to the release of the report, and how did Midcourse Corrections further the mission of the Irvine Foundation?” I’d like to know more about how releasing this report to the public furthers your mission. I believe “transparency” is NOT a public accountability issue; it is a technique for being a more effective foundation (by improving the field of philanthropy and being able to leverage the grant making of other foundations that learn from you). What benefits do you see in releasing the report, and why should other foundations follow your lead?
Regarding what I think can be done by foundations: I think it all boils down to engaging in public conversations about how philanthropic dollars can best be spent. My professional background is in the investment management industry. Although Wall Street is about as cutthroat competitive as any industry, there is a culture of sharing and discussing ideas. We have a 24-hour news station, every investment professional is constantly sharing ideas with colleagues (even those at other firms), and there are blogs, articles, books, etc. that share ideas about how to invest (and much of it is communicated publicly).
The huge hurdle of course is that investors want to move first and capitalize on their information. Philanthropy doesn’t have this issue! Getting everybody to direct their money in the best way is GOOD and does not hurt each foundation’s ability to accomplish their mission (it only enhances it).
So my advice would be for philanthropic entities (and large foundations in particular) to actively engage the public at large in a grand conversation about how philanthropy can best utilize its considerable power to make the world a better place.
Luckily, information sharing tools are plentiful and dirt cheap. Blogs, podcasts and regularly updated websites are not expensive. But the trick is to not just pump out information, but to engage in a conversation. To actively seek out other people’s opinions and to comment on their ideas.
As Holden has regularly written, the only downside to transparency is that you might look bad sometimes.
Who cares. Information is power and the more valuable information you can gather the better.
So how do you get started on all of this, when most foundation employees won’t even leave a comment on a blog let alone write their own? I have a very specific recommendation. Hire Beth Kanter to teach you about social media tools. She has been helping nonprofits harness their power for years and has worked with Robert Wood Johnson.
Thanks for the steps you have taken so far.
Great conversation. This is what nonprofit fundraising blogs should be all about.
I think I agree with Sean that this discussion about “transparency” is NOT about public accountability; it is about a technique for being a more effective foundation (by improving the field of philanthropy and being able to leverage the grant making of other foundations that learn from you). The report’s foreword states the purpose: “to glean what specific lessons might be applicable to others.”
There are many other topics we could be discussing related to the transparency of foundations… but that wasn’t the point of the Midcourse Corrections report, and that isn’t the point of this debate.
With that distinction behind us, I would move on to my core disagreement with Stephanie Strom’s article in the NYT (those who read my Don’t Tell the Donor blog know how much I love Strom).
I think it may not always be beneficial or helpful for foundations to make these kinds of reports publicly available. As I read the Irvine Foundation’s report, it is clear that it is NOT “intended to question the commitment, competence, or sincerity of foundation or grantee personnel.” Rather, it focuses on strategic direction and approach.
…but what if the report had come to the conclusion that the grantee was incompetent and the logistical implementation was flawed? What if the foundation had picked a local group to be the intermediary organization, and that group failed, and therefore the foundation pulled the plug?
Should foundations be making grantee failures public in order to warn other foundations?
This is going to get us back into the Some Nonprofits Just Suck debate… But it seems there are two options: 1) don’t tell anyone that the grantee failed, and watch other people make the same mistake and waste their money, or 2) let people know what happened and help them learn from your experience.
If there was a restaurant in your neighborhood that you went to and had a bad experience, would you keep it to yourself and watch friend after friend go and find out for themselves? Or would you share your experience with people?
Does anyone think that the fact that Amazon.com features user reviews is a bad idea? Is sharing factual data and your personal opinions a way to help everyone?
I’ll tell you this. If a client of mine had a bad experience with my firm, I may not want them to say bad things about us to their circle of influence, but I would certainly expect it! Of course the reverse is true too. In fact, most excellent organizations rely on the fact that people share their experiences with other people. Only poor-performing organizations should want their “customers” not to share their experiences.
Regarding “public accountability” vs. “technique for being a more effective foundation” … Not sure who introduced the idea that we have to choose one. Great taste, less filling.
Sean,
You asked a direct question about how producing and releasing this Midcourse Correction report advances Irvine’s mission, beyond illustrating a core institutional commitment to transparency. In response, let me begin with Irvine’s mission which is to “expand opportunity for the people of California to participate in a vibrant, successful and inclusive society.” The primary means to achieve this mission is to provide financial support to organizations aligned with the specific programmatic goals of the Foundation, which is our core business. However, I’d argue that’s not enough.
Private foundations (and I specify “private foundations” because they are the ones I know best, not because other types of foundations can’t do these things) are uniquely situated to play other roles as well—for example, we can convene parties with shared interests, build new relationships that will advance mutual goals, complement grants with other offers of assistance, say in communications, evaluation, or organizational development, or distill knowledge gained from the work we do (with others) and share that openly with targeted audiences. Each foundation has the right to decide how or whether it wishes to take these various approaches. Our view at Irvine historically has been that we enhance our contribution as a foundation by employing these various tools at various times, hopefully in a thoughtful and strategic manner.
So, as a result, targeted opportunities to learn from what we do, to capture that knowledge in ways accessible to others, and to share those understandings broadly are not only consistent with our mission and an extension of it, but frankly one of the ways we demonstrate our recognition of the privilege of philanthropic work. Yes, it’s more work, and at least for now, no one is demanding that we do this, but as foundations come more broadly onto the radar screen (which is a good thing), we should be certain we are taking full advantage of all of the resources at our disposal to create social benefit. That’s how I see it, but again, I’m eager to hear what others think.
Jim Canales
There is a foundation in my home town that lists one of its main goals as “fostering an appreciation of the arts.”
To fulfill this part of its mission, it’s often giving $50,000 grants to local museums, galleries, and historical societies for specific exhibits.
What I tried to convey in my earlier post was that if they want to publish a review which says, “we’ve found that our goals are not being achieved and instead we will redirect our strategy to funding elementary and high school arts programs” – I think that is a good idea.
However, I’m not sure it always makes sense to use a midcourse correction report if the objective would have been to point fingers and blame specific museums that failed to deliver on their attendance goals.
To me, that’s where the differences between the goals of “improving the field of philanthropy” and “public accountability” begin to appear.
Let me begin by emphasizing support for an increase in accountability and transparency in philanthropy, and commending Jim and the Irvine Foundation for their report (while acknowledging that I have not read it).
However, as Albert noted, I also think we have to acknowledge some of the good reasons for privacy.
1) There is already a great deal of conformity in philanthropy. While expanding the number of people critiquing foundation efforts could lead to new perspectives, it is just as likely to lead to more conformity and an unwillingness to take on unpopular issues or to try new approaches.
2) While the open source movement has proven that high quality innovation can come from transparency, it is equally true that sometimes high quality innovation requires more of a “skunk works” approach that allows for more risk-taking. Additionally, while there aren’t financial incentives for foundations to be innovative, there are reputational incentives, and we eliminate those at our peril. While it’s true, as Holden notes, that foundations shouldn’t have a goal of protecting their image, this presumes we’re talking about large institutional foundations with funding in perpetuity. If we remove the “image” incentive, we remove a large reason why many foundations are started in the first place. While we may cast stones all we like about mixed motives, the reality is that money given with mixed motives does a lot of good in the world.
3) Straying into the “public accountability” issue, there are good reasons why we generally do not allow the elected government to decide what constitutes “speech” and is protected by the First Amendment. In the same way, I think we always need to be concerned about the slippery slope that ends in an official definition of what is “good”.
None of these arguments are meant to suggest that transparency is not useful or should not increase from its present state. We do need to acknowledge that there are costs as well as benefits to transparency.
Ultimately I believe that foundations should be transparent about the goals they are pursuing, the ways in which they pursue those goals and the results they are achieving. There should also be room for keeping some of this information “classified” for a limited amount of time (say 3 to 5 years).
Tim, all of your arguments are about what will happen if we “impose” transparency. But I don’t think anyone is advocating that. I’m certainly not.
I’m just saying to foundations: if you have the strength of conviction and character to share what you find and listen to criticism open-mindedly (rather than shutting yourself off to it, or letting yourself get pushed around), do it. If you don’t, please give your money to someone who does. I still haven’t heard good reasons this request is off base, though I’ve heard reasons it shouldn’t be imposed by force.
Three separate replies here to the posts thus far:
1. In my earlier post I committed to addressing Holden’s programmatic questions, which I will do here. His questions are essentially about the design of CORAL and the research that informed that design. He rightly notes that we didn’t cover these issues in the Midcourse Corrections report, and indeed it’s important to remind everyone that the Midcourse Corrections report is NOT an evaluation report of the initiative; it is a report that details changes we made midway through an initiative that was off track, a very important distinction. In any event, with respect to evaluation results, we have published separate reports that specifically address these technical issues. In December 2005, we published an interim report based on data collected in the first year of the evaluation of the CORAL program called Launching Literacy in After-school Programs: Early Lessons from the CORAL Initiative. That report also describes the early thinking behind the initiative and subsequent research findings. You can read or download the report here: http://irvine.org/publications/by_topic/evaluation.shtml#ev4. Over the next six months, we will be publishing a final report on CORAL as well as two research briefs and a tool kit for practitioners in the field. Together, these resources will address the questions Holden posed in terms of the context for CORAL, including key findings from existing research on large-scale, after-school program initiatives and the major barriers to realizing impact on student academic development and achievement.
2. Holden also asks about the literature on perpetuity vs. spend-down, and I’d invite others to weigh in on what is out there. I believe that the National Center on Family Philanthropy probably has resources available, as might the Council on Foundations. There are certainly high-profile examples of articulate and passionate spokespeople on the spend-down side of the argument, from Julius Rosenwald in the early part of the 20th century to Richard Goldman today.
3. Finally, on the more recent debate in the postings yesterday about how far to take transparency, I’d like to build on a concept introduced in Tim Ogden’s post where he writes: “We do need to acknowledge that there are costs as well as benefits to transparency.” Taking a cost-benefit approach to the question of how transparent to be as a foundation strikes me as a useful one. So, for example, one might cite as “costs”: (a) harming grantees by publicizing their failures/shortcomings; (b) damaging the foundation’s reputation by revealing how resources were not well-used; (c) discouraging others from engaging in important fields of work that might be viewed as risky or with little chance of success; (d) providing fodder to those who question the contributions of philanthropy. On the “benefits” side: (a) sharing valuable knowledge that might inform the work of others and improve their prospects for success; (b) ensuring that mistakes are not repeated; (c) underscoring the importance of key principles for good grantmaking, such as clarity about outcomes, etc; (d) improving your own work as a funder through careful self-reflection and a commitment to learning and applying that learning (I’m sure most funders ascribe to that, but when you commit to sharing lessons publicly, it certainly guarantees follow-through on the commitment!) This is an incomplete analysis, to be sure, but I do think we should carefully weigh the costs and benefits. In our case at Irvine, as one of the earlier postings rightly observes, this report is not about grantees who failed; it is about flaws in program design and execution on our part, and that’s probably an easier report to put out there.
A parting thought: there is yet another thread on here about our obligations to share knowledge so that other funders don’t “waste their money”. Unlike the examples offered by Sean about the neighborhood restaurant or book reviews on Amazon, the relationship between funder/grantee is much more complex and nuanced. Indeed, I just don’t buy the parallel: as a funder, I don’t see us “buying a service” that lends itself to simple assessment of whether we were satisfied as a “customer”. I see us as investing in organizations who share our priorities about how to improve California. Indeed, while I fully acknowledge the power imbalance inherent in the funder/grantee relationship, I hope we work hard to build the kinds of relationships with our partners that minimize that imbalance. And I’m not sure that viewing our grantees in ways we view consumer products helps to lessen the power imbalance in any way. In fact, I’d argue it increases it.
Jim Canales
Jim, fair enough. The funder/nonprofit relationship is not the same as a seller/buyer relationship. I sometimes use that analogy because everyone has experience with the seller/buyer relationship.
However, the funder/nonprofit relationship is also not the same as an investor/company relationship because at the end of the day, you have no control over the investee. But let’s look at for-profit investing (my professional background) for lessons about transparency and power imbalances.
In the US today, the top-performing, successful companies enjoy a power imbalance over investors. Startup companies clearly are on the weak end of the relationship and have to depend on investors for their success. Historically there has not been a lot of transparency in for-profit companies. But that is changing: investors are demanding that companies act in their best interest (they own the company, after all) and are demanding transparency from the company so they can judge for themselves. There is a massive amount of factual and opinionated information about which companies are good investments.
I think that most observers think transparency, even (or especially) when it reveals negative info, is a huge positive for investors, the economy and even the companies themselves.
That being said, investors in startups (where the investor needs the company to attract other investors in the future) might very well find that the cost of letting other people know about problems with the startup is not worth the benefit of transparency.
I think what it comes down to is that under conditions of perfect information, the benefit of transparency hugely outweighs the costs. However, when limited information exists, the release of negative data can hurt a “good” nonprofit because information about its positive points is not freely available. (This is reflected in the foundation employee’s comment that I referenced in the interview.)
So in my view, transparency is very good, but getting there is tough. This is a classic network effect: the more transparency there is, the better it is. But if only some foundations and nonprofits are transparent, they might well find that the cost outweighs the benefit.
I believe we can make it over the hump.
Just to clarify, I don’t believe that foundations have an “obligation” to be transparent. They don’t have an “obligation” to other funders. I just think they should, not in a moral sense, but in a self interested sense. I think that we will find that transparency is good for foundations, good for nonprofits and good for the public. But at the end of the day, I think that the transparency decision is completely up to each individual foundation.
This is why I make the distinction between “public accountability” transparency and “philanthropic effectiveness” transparency. I’m interested in the second type.
I think it’s both in your interest and a moral (though not legal) obligation. All taxpayers subsidize you; don’t forget it. And here’s what I think of your “costs”:
(a) harming grantees by publicizing their failures/shortcomings
This is a good thing. Nonprofits that suck at what they do should go out of business. There’s currently no mechanism for that to happen. Let’s create one. “But they might not suck at what they do, they might just have made one mistake, and people will get the wrong impression,” says the Straw Man. I’ll get to this below.
(b) damaging the foundation’s reputation by revealing how resources were not well-used
WHO CARES
(c) discouraging others from engaging in important fields of work that might be viewed as risky or with little chance of success
(d) providing fodder to those who question the contributions of philanthropy.
The basic line of reasoning here (and in the Straw Man response above) is “If we’re honest that mistakes were made, people will mistakenly interpret that as evidence that we’re totally incompetent.” This presumes that people are currently the least bit fooled into associating the lack of disclosure with a lack of weakness. It also presumes that they don’t recognize the basic truth that nobody’s perfect.
Both of these presumptions are wrong – particularly of any person who would bother to read your disclosures of weakness. I outlined this argument more fully here.
In the end, sharing the good and the bad is the best approach with any set of people whose goals you share. To do otherwise is to assume that you know so much more about what’s right than they do that you’re not even going to give them the ability to make their own judgments, that it’s better to “dupe them into the truth.” Well, that’s wrong. This argument has been had and basically settled regarding government transparency, business transparency (as Sean points out), within-organization relations (teammates, coaches, coworkers), etc. Now let’s settle it for charity.
Kudos to you, Sean, for advancing this dialogue which I blogged about in the first of three parts today on PhilanthroMedia.org. Love the back and forth of this commentary. My three-part posting on the Irvine and Hewlett reports, which was picked up by OnPhilanthropy and will be included in the next quarter of Grantmakers for Education, can be found here: http://www.philanthromedia.org/archives/2007/06/truth_when_kindly_fibs_would_f.html
Over the past six years, I’ve worked with Community Foundations of America and leading community foundations on two major ideas for advancing accountability. Both were driven by the desire to make impact data available to donors (a motivation that private foundations don’t share, and which makes this hard work even harder.)
Impact Data — About six years ago, CFA began an effort to capture performance data about grants that could be made available to the high-net-worth donors they serve. We created a truncated version of the United Way’s Logic Model. It was built on the idea that inputs are a good starting point, and that you can’t have accountability if you don’t count. We taught this process to program officers and grantees at ten community foundations around the country with varying degrees of success. We also identified two sets of metrics (beginner and advanced, if you will) that community foundations which want to demonstrate their accountability should consider gathering and making available about their grantees. About three years ago, the data elements driving this process were refined by a group of community foundation leaders. White papers describing this process and the resulting metrics can be found here: http://www.givingnet.net/page7207.cfm
This work also led to the effort, undertaken by the Urban Institute and the Center for What Works, to develop a taxonomy of performance metrics that can be a resource to nonprofits. (Info here: http://www.urban.org/center/cnp/projects/outcomeindicators.cfm)
Impact Database – In tandem with the data effort, we worked with a now-defunct technology company, called B2P Commerce Corporation, to build a web-based system for both capturing and making this data available. I worked with 12 community foundations who implemented the system, called ImpactMgr, to varying degrees of success. The Kansas City Community Foundation was part of the beta phase for ImpactMgr but then spun off to create DonorEdge. I won’t go into the post-mortem now about why ImpactMgr didn’t fly but will attribute much of it to first mover disadvantage. We were at least five years too early. The technology was Web 1.0, meaning too unwieldy and too expensive. The process was also unwieldy for both nonprofits and foundations. And because this data was a ‘nice to have’ for donors, neither foundation program staff nor grantees could afford to put the requisite time into building these metrics. I do believe the imperative for accountability continues to grow and that efforts like this will increasingly gain traction.
The summer weekend beckons so I’ll be back next week with a summary of ideas I’ve blogged about but haven’t tried that foundations could use to advance accountability.
Gentlemen:
(And it is all gentlemen, isn’t it? If I’m not the only girl person involved in these discussions, I’m one of the few; and it might be interesting some time to try to figure out why that is–though not right now.)
The discussion of transparency seems to me fundamentally beside the point. The only secret I really want to learn about any given foundation is its rationale for existence in perpetuity, or to put it another way its excuse for not spending more money. The primary purpose of foundations, lest we forget, is to give grants to agencies that serve people, and the philanthropies’ tax-favored status is based on the notion that giving money to a foundation is simply a deferred way of giving it to charity. But that wasn’t supposed to be deferred as in “What happens to a dream deferred?”–i.e., delayed indefinitely.
Actually, I lied: the other thing I’d like foundations to be transparent about is their reason(s) for having Boards of Directors less diverse not only than the communities they serve but than their corporate counterparts. The very minute a foundation tried trotting out that old chestnut “But we just can’t find any qualified minorities or women . . .,” the public ridicule that ensued would do more than a thousand diversity initiatives to cure the problem.
I’m all for the transfer of lessons learned, but it’s not clear that this transfer takes place best through the medium of foundations. Wouldn’t it be better for philanthropies to concentrate on setting up systems of communication between operating nonprofits so they could find out, e.g., whether bednets are more cost-effective than malaria pills, or transitional housing more effective than counseling in curing domestic violence or homelessness? That’s a “convening role” worth performing.