Need practical how-to information to help you build your evaluation capacity? This collection includes suggested readings from our friends at BetterEvaluation, the Center for Evaluation Innovation, the Center for Effective Philanthropy, and Grantmakers for Effective Organizations, as well as content hand-picked by Candid. Thousands of actual evaluations are available for download.

More ways to engage:
- Add your organization's content to this collection.
- Easily share this collection on your website or app.


Collaborative Outcomes from the Youth Justice and Employment Community of Practice

October 18, 2022

Established in mid-2021, the Youth Justice and Employment Community of Practice (CoP) is a partnership of the Annie E. Casey Foundation (AECF), the National Youth Employment Coalition (NYEC), and the Pretrial Justice Institute (PJI) formed to improve outcomes for youth with justice involvement by increasing collaboration among local workforce and juvenile justice systems. The CoP began in the midst of the COVID-19 pandemic, at a time when counterparts in each jurisdiction were seeking to reestablish pandemic-disrupted communication and collaboration. CoP participants met monthly to share knowledge and expertise on topics of importance to both systems. Building on the CoP's work, participating cities and counties made notable progress in building relationships, expanding partnerships, and promoting investments that benefit justice-involved young people in their communities. This report documents those successes and offers recommendations for others seeking to improve outcomes for these young people.

Evaluation of the Baltimore Health Corps Pilot: An Economic and Public Health Response to the Coronavirus

September 30, 2022

The Baltimore Health Corps was a city-run pilot launched in June 2020 and concluded in December 2021. The pilot simultaneously addressed two issues: the spread of COVID-19 and the resulting employment crisis faced by Baltimore residents.

The Baltimore City Health Department and the Mayor's Office of Employment Development led the Baltimore Health Corps, drawing on their experiences with equitable recruitment and hiring practices, workforce-supporting activities, and public health worker training. Together, they led a team of public and private partners that included the Baltimore Civic Fund, Baltimore Corps, HealthCare Access Maryland (HCAM), Jhpiego, and the Mayor's Office of Performance and Innovation.

The initiative tracked those who contracted the virus at the height of the pandemic and connected COVID-19-positive individuals with testing, resources, and other assistance. In doing so, the Baltimore Health Corps also placed unemployed workers on a path to high-quality, lasting careers via temporary positions as community health workers with the Baltimore City Health Department and HCAM. The program hired from a pool of Baltimore residents who reflected the city's racial and ethnic demographics and were unemployed, underemployed, or furloughed because of the pandemic. By September 2021, 336 health workers had received training and taken on roles within either the Health Corps' contact tracing and outreach program or the care coordination and access program.

While these health worker positions were intended to last just eight months, the jobs were extended as the pandemic persisted, thanks to funding from the American Rescue Plan Act. As of May 2022, 126 Baltimore Health Corps workers remain employed with either the health department or HCAM, while 119 former staff members have since moved on to other employment opportunities.

This is the Final Report following the Early Lessons Report for the Baltimore Health Corps Pilot Study. Readers are encouraged to review the Early Lessons Report for a detailed description of the formation of the Pilot Study, the role of each partner, and findings from the Pilot Study's first year.

A Funder's Guide to Using Evidence of Program Effectiveness in Scale-Up Decisions

May 1, 2014

This Guide provides funders with practical advice on how to think about and use evidence of effectiveness when considering investments in scale-up opportunities. The Guide does not seek to turn private funders into evaluation experts or to delve into the methodological details of particular research approaches. Rather, the focus is on the right questions that funders should ask and the pitfalls they should avoid, including how to recognize the limitations of certain kinds of evidence. The Guide is divided into three sections:
- Section I, Eight Key Questions to Ask Throughout the Scale-Up Process, presents what funders should look for to determine whether programs are effective. These questions provide the building blocks for the discussion in the following section.
- Section II, Application of the Eight Questions to Scale-Up Decisions, shows how the questions apply to the different stages of a program's evidence-building and scale-up.
- Section III, Next Steps for the Field, highlights some remaining challenges for the field to consider in using evidence of effectiveness to guide scale-up decisions.

Topics: Evaluation in Foundations; Guidelines and Best Practices

Four Essentials for Evaluation

May 30, 2012

GEO created this guide to help grantmakers get to the next level in their evaluation efforts. The target audience is champions and supporters of evaluation who want to embed these practices more deeply in the work of their organizations.

The term "evaluation" can refer to a lot of different activities, including data collection, information gathering and research about grantmaker-supported activities. GEO's emphasis, however, is on "evaluation for learning." Evaluation is about more than ensuring that grantees are doing what they promise, or that a specific program area at a foundation is meeting its goals. Rather, it's about advancing knowledge and understanding among grantmakers, their grantees and their partners about what's working, what's not and how to improve their performance over time.

Using evaluation in this way requires grantmakers to transform themselves into learning organizations. Beyond getting smarter about specific evaluation methods and approaches, this means adopting a continuous process, a culture and a commitment to support the capacity of people to see patterns and insights that can lead to ever-improving results.

Topics: Guidelines and Best Practices; Must-Reads; Tools and Frameworks

A Handbook of Data Collection Tools: Companion to "A Guide to Measuring Advocacy and Policy"

May 21, 2012

This handbook of data collection tools is intended to serve as a companion to A Guide to Measuring Advocacy and Policy. Organizational Research Services (ORS) developed the guide on behalf of the Annie E. Casey Foundation to support efforts to design and implement evaluations of advocacy and policy work. The companion handbook provides examples of practical tools and processes for collecting useful information from policy and advocacy efforts, including a legislative process tracking log, a meeting observation checklist, a policy brief stakeholder survey, a policy tracking analysis tool, and a policy tracking form. The approach to measuring advocacy and policy change efforts starts with a theory of change, identifies outcome categories, and selects practical approaches to measurement.

Topics: Guidelines and Best Practices; Tools and Frameworks

Born Learning Washington Monthly Media Tracking Form

December 5, 2007

This tool measures a strengthened base of media support. It tracks the amount of campaign-generated media visibility that has been developed, noting dimensions such as type of media (TV, radio, print, other) and type of placement (PSA, news story, op-ed, programming).
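To make the form's structure concrete, here is a minimal Python sketch of a monthly tally along the two dimensions the form names; the class, field names, and example records are hypothetical illustrations, not taken from the form itself.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical record mirroring the form's two dimensions:
# type of media (TV, radio, print, other) and type of
# placement (PSA, news story, op-ed, programming).
@dataclass(frozen=True)
class MediaPlacement:
    media_type: str      # e.g. "TV", "radio", "print", "other"
    placement_type: str  # e.g. "PSA", "news story", "op-ed", "programming"

def monthly_tally(placements: list[MediaPlacement]) -> Counter:
    """Count placements by (media type, placement type) pair."""
    return Counter((p.media_type, p.placement_type) for p in placements)

# Example: three hypothetical placements logged in one month.
log = [
    MediaPlacement("TV", "PSA"),
    MediaPlacement("print", "op-ed"),
    MediaPlacement("TV", "news story"),
]
print(monthly_tally(log))  # Counter({('TV', 'PSA'): 1, ...})
```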

Campaign Champions Data Collection Tool

December 5, 2007

This tool measures a strengthened base of public support. The form tracks and measures the number of "champions" (people who take actions to advance the public will) engaged and the actions these champions undertake as part of the Born Learning campaign.
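As a rough illustration of what this tool counts, the sketch below models an action log and derives the two headline measures (champions engaged and actions undertaken); all names and example entries are hypothetical, not drawn from the tool.

```python
from collections import defaultdict

# Hypothetical action log: champion name -> list of actions taken
# (e.g., testified at a hearing, wrote a letter, hosted an event).
actions_by_champion: dict[str, list[str]] = defaultdict(list)

def record_action(champion: str, action: str) -> None:
    """Log one action taken by a campaign champion."""
    actions_by_champion[champion].append(action)

record_action("A. Rivera", "wrote letter to editor")
record_action("A. Rivera", "hosted house meeting")
record_action("B. Chen", "testified at hearing")

# The tool's two headline measures: number of champions engaged
# and total actions undertaken.
num_champions = len(actions_by_champion)
total_actions = sum(len(a) for a in actions_by_champion.values())
print(num_champions, total_actions)  # 2 3
```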

A Guide to Measuring Advocacy and Policy

February 20, 2007

The overall purpose of this guide is twofold. First, to help grantmakers think and talk about the measurement of advocacy and policy, the guide puts forth a framework for naming outcomes associated with advocacy and policy work, along with directions for evaluation design. The framework is intended to provide a common way to identify and talk about outcomes, giving philanthropic and nonprofit audiences an opportunity to react to, refine, and adopt the outcome categories presented. Second, grantmakers can consider some key directions for evaluation design that span a broad range of methodologies, intensities, audiences, timeframes, and purposes. Included in the guide are a tool to measure improved policies, a tool to measure a strengthened base of public support, and a survey to measure community members' perceptions of how issues are prioritized.

Topics: Guidelines and Best Practices