Need practical how-to information to help you build your evaluation capacity? This collection includes suggested readings from our friends at BetterEvaluation, the Center for Evaluation Innovation, the Center for Effective Philanthropy, and Grantmakers for Effective Organizations, as well as content hand-picked by Candid. Thousands of actual evaluations are available for download.

Know of content that should be considered for this collection? Please suggest a report!

"GenY Unfocus Group - KP Digital Health 47613" by Ted Eytan licensed under CC BY-SA 2.0


A Funder's Guide to Using Evidence of Program Effectiveness in Scale-Up Decisions

May 1, 2014

This Guide provides funders with practical advice on how to think about and use evidence of effectiveness when considering investments in scale-up opportunities. The Guide does not seek to turn private funders into evaluation experts or to delve into the methodological details of particular research approaches. Rather, the focus is on the right questions funders should ask and the pitfalls they should avoid, including how to recognize the limitations of certain kinds of evidence. The Guide is divided into three sections:

Section I, Eight Key Questions to Ask Throughout the Scale-Up Process, presents what funders should look for to determine whether programs are effective. These questions provide the building blocks for the discussion in the following section.

Section II, Application of the Eight Questions to Scale-Up Decisions, shows how the questions apply to the different stages of a program's evidence-building and scale-up.

Section III, Next Steps for the Field, highlights some remaining challenges for the field to consider in using evidence of effectiveness to guide scale-up decisions.

Evaluation in Foundations; Guidelines and Best Practices

Four Essentials for Evaluation

May 30, 2012

GEO created this guide to help grantmakers get to the next level in their evaluation efforts. The target audience is champions and supporters of evaluation who want to embed these practices more deeply in the work of their organizations.

The term "evaluation" can refer to a lot of different activities, including data collection, information gathering and research about grantmaker-supported activities. GEO's emphasis, however, is on "evaluation for learning." Evaluation is about more than ensuring that grantees are doing what they promise, or that a specific program area at a foundation is meeting its goals. Rather, it's about advancing knowledge and understanding among grantmakers, their grantees and their partners about what's working, what's not and how to improve their performance over time.

Using evaluation in this way requires grantmakers to transform themselves into learning organizations. Beyond getting smarter about specific evaluation methods and approaches, this means adopting a continuous process, a culture and a commitment to support the capacity of people to see patterns and insights that can lead to ever-improving results.

Guidelines and Best Practices; Must-Reads; Tools and Frameworks

A Handbook of Data Collection Tools: Companion to "A Guide to Measuring Advocacy and Policy"

May 21, 2012

This handbook of data collection tools is intended to serve as a companion to A Guide to Measuring Advocacy and Policy. Organizational Research Services (ORS) developed this guide on behalf of the Annie E. Casey Foundation to support efforts to develop and implement an evaluation of advocacy and policy work. The companion handbook is dedicated to providing examples of practical tools and processes for collecting useful information from policy and advocacy efforts. Included within this handbook are a legislative process tracking log, a meeting observation checklist, a policy brief stakeholder survey, a policy tracking analysis tool, and a policy tracking form.

This best practice provides an approach to measuring advocacy and policy change efforts, starting with a theory of change, identifying outcome categories, and selecting practical approaches to measurement.

Guidelines and Best Practices; Tools and Frameworks

Born Learning Washington Monthly Media Tracking Form

December 5, 2007

This tool measures a strengthened base of media support. It tracks the amount of campaign-generated media visibility that has been developed, noting dimensions such as type of media (TV, radio, print, other) and type of placement (PSA, news story, op-ed, programming).

Campaign Champions Data Collection Tool

December 5, 2007

This tool measures a strengthened base of public support. It tracks and measures the number of "champions" (people who take actions to advance the public will) engaged and the actions these champions undertake as part of the Born Learning campaign.

A Guide to Measuring Advocacy and Policy

February 20, 2007

The purpose of this guide is twofold: to help grantmakers think and talk about the measurement of advocacy and policy, it puts forth both a framework for naming outcomes associated with advocacy and policy work and directions for evaluation design. The framework is intended to provide a common way to identify and talk about outcomes, giving philanthropic and nonprofit audiences an opportunity to react to, refine and adopt the outcome categories presented. The directions for evaluation design cover a broad range of methodologies, intensities, audiences, timeframes and purposes for grantmakers to consider. Included in the guide are a tool to measure improved policies, a tool to measure a strengthened base of public support, and a survey to measure community members' perceptions about the prioritization of issues.

Guidelines and Best Practices