The “dirty little secrets” of program evaluation

Posted by Kirstin Pryor & filed under CGR Staff.

Dirty little secret #1—When you say “We do program evaluation,” typical reactions include polite but confused nods and rolling or glazed-over eyes. Unfortunately, the real value of evaluation is often crowded out by rhetoric about investing only in “evidence-based” programs on the one hand and the pressures of grant compliance on the other.

CGR’s clients operate in the real world, where program evaluation isn’t quite as pristine as it is in academia. Some of CGR’s evaluations have formal designs with control groups, sophisticated statistical analysis, and measurable “hard” outcomes. But the vast majority of evaluations are not as “pure”—and this is where the fun begins. Real programs serve real people who are affected by many things besides the program in question. They have real limited budgets, real funders funding different outcomes, real challenges getting data, and operate in real dynamic contexts. Evaluation often gets a bum rap—either it’s watered down to the point of PR or it’s done just for compliance. So what’s the point?

If you understand evaluation as a way of thinking and a set of tools that, when applied well, improve how programs function and what they accomplish, there’s a good deal of value in it. Recently, we’ve had the chance to work for several clients who grasp the learning spirit of evaluation:

  • Providence Center in Philadelphia, PA, is a small faith-based nonprofit seeking to build staff capacity to think about their work using evaluation as a lens. The neighborhood center ran after-school, teen mentoring and adult English programs, and had never used evaluation before. CGR worked with the entire staff to articulate program goals, align their daily activities with those goals, and identify ways to assess their progress. We helped them design surveys, internal protocols and tracking tools, and supported them through one full cycle of using the tools, analyzing the results and reflecting on them.
  • Providence Center took brave, substantive action as a result of this new capacity. Staff and board members concluded that they should stop offering the school-based after-school program or modify its format. Sounds like a loss, another reason to be scared of evaluation, right? But it’s actually good news: the evaluation process led them to see their weakness—they struggled to provide and assess an effective remedial reading program in a nearby elementary school. It also revealed what school teachers, parents and students saw as their strength—offering a trusted, safe, nurturing place for children in an unsafe neighborhood to do their homework and cultivate positive attitudes toward learning. So instead of perpetually writing grants to maintain a costly program and pulling out their hair over how to hire qualified instructors, they closed the program at the school site. Providence Center then changed course, engaging service-learning students from a local university and neighborhood teens to develop a Homework Club at the center that focuses exclusively on homework completion in English. It’s a win for kids, the community, and Providence Center. And it was possible because they embraced the true spirit of evaluation, which at its best acts like a mirror, enabling us to learn from our reflection.

  • Rochester City School District engaged CGR to conduct an implementation evaluation of the first year at each of the five new high schools it opened in 2010. There is no clear bar for what a school should achieve in Year One, but the district knows it must monitor these schools closely to ensure success. We analyzed student profiles and outcomes, spent a day observing classes in each school, and surveyed all staff and students. Our school-level reports reflect our findings back to each principal; an aggregate report shares the big picture with the district and the public. Next week there will be a public presentation—the plan is to have a productive dialogue with principals, district leaders and board members about where the schools are and how they can use the evaluation findings to refine their work in Year Two.
  • Greater Rochester Health Foundation employs a balanced approach to evaluation; CGR has been pleased to play a small role. GRHF takes seriously the idea that evaluation should empower local organizations and “real people” to assess and drive change at the neighborhood level, rather than simply be used in the boardroom. They know that reporting “just the numbers” doesn’t fully convey impact, so our work has been to develop evaluations that capture the stories and experiences of grantees who are increasing physical activity among children in neighborhoods across Rochester. GRHF has used what we learned to improve their internal operations and to fine-tune the information they collect and disseminate to grantees. They also build capacity to use evaluation, engaging us to provide technical assistance to a few grantees by working with staff to design evaluation measures and protocols that are both rigorous and feasible.

Dirty little secret #2—Sometimes the work we do at CGR sounds removed from the day-to-day world of people who provide direct service. But I’m a teacher at heart, and what I love best is the opportunity to help people channel their passion into plans, look with fresh eyes at what they do, ask questions about effectiveness, consider new approaches, and participate in an ongoing cycle of assessment and adjustment. As it turns out, that’s what good program evaluation is all about. And it’s why I’ve actually come to see program evaluation as important, even fascinating…which maybe should be the third dirty little secret.