Measure Results

Need practical "how-to" information to help you build your evaluation capacity? This collection includes suggested readings from our friends at BetterEvaluation, the Center for Evaluation Innovation, the Center for Effective Philanthropy, and Grantmakers for Effective Organizations, as well as content hand-picked by IssueLab. Actual evaluations are available at Find Results. This site is part of our #OpenForGood campaign.

"GenY Unfocus Group - KP Digital Health 47613" by Ted Eytan licensed under CC BY-SA 2.0

76 results found

Impacting Responsibly

May 24, 2019

Candid; New Philanthropy Capital (NPC); Salesforce.org; Urban Institute;

Designed to help the social sector measure its impact in a responsible manner, the report, Impacting Responsibly, gathers insights from thought leaders in the fields of philanthropy, measurement, and evaluation in nine areas — impact capacity building, impact frameworks and standards, constituent feedback, current reporting burden, resource inequities, impact data ownership, roles and responsibilities, collaboration, and limits of quantitative evidence. The contributions also address questions such as: How can organizations of all sizes and budgets use impact data? How can they better engage those they serve through impact data? How should they handle privacy and data protection? And how can they collaborate to maximize what they can learn from impact data?

Learning in Philanthropy: A Guidebook

May 12, 2019

Grantmakers for Effective Organizations (GEO);

When grantmakers focus on learning for improvement, we use evaluation and learning to generate information and insights that will help us better understand both how we're doing in our work and how to improve. A focus on taking action based on what we learn ensures that we are engaged in strategic or applied learning. Our learning should be tied directly to the strategies we are pursuing and the decisions we are making. Learning in Philanthropy: A Guidebook provides a solid basis for thinking and talking about the next steps in our organization's learning work. The guidebook is designed to serve as a resource to help grantmakers answer critical learning questions and embed learning more deeply into the day-to-day work and cultures of our organizations.

MEL Practice at the David and Lucile Packard Foundation: Evaluation in Support of Moving from Good to Great

Jan 31, 2019

David and Lucile Packard Foundation; ORS Impact;

In early 2017, ORS Impact evaluated and re-examined the David and Lucile Packard Foundation's monitoring, evaluation, and learning (MEL) principles and practice. The purpose of this evaluation was to discover what works well, identify areas for improvement, and stimulate reflection and experimentation. While the report uncovered many examples of strong MEL practice across the Foundation, it also highlighted opportunities for improvement. Research findings fed into Foundation decisions to update both internal and external MEL processes and requirements, including refinement of the Foundation's Guiding Principles for MEL. A key audience for this report includes readers wrestling with how best to support MEL in philanthropic settings so that it can support greater learning and impact, such as MEL staff working inside foundations and external evaluators working with foundations.

Weaving Successful Partnerships: When Funders, Evaluators, and Intermediaries Work Together

Jan 23, 2019

Engage R+D; Equal Measure; Harder+Company;

The aim of this report is to contribute to field dialogue and learning about how to structure complex systems change strategies involving multiple partners.

Understanding & Sharing What Works: The State of Foundation Practice

Nov 08, 2018

The Center for Effective Philanthropy (CEP);

The Center for Effective Philanthropy (CEP) surveyed private and community foundation leaders regarding what they know about what is and isn't working in their foundations' efforts to achieve their goals. Drawing from 119 survey responses and in-depth interviews with 41 foundation CEOs, the report finds that while the majority of foundation CEOs believe they understand well what is working in their programmatic efforts, more than 40 percent believe their foundation is not investing enough time and money in developing that understanding.

A Primer About Monitoring and Evaluation, for Funders and Implementers

Oct 01, 2018

Giving Evidence; Keystone Accountability;

This primer on monitoring and evaluation was developed jointly by Giving Evidence and Keystone Accountability for a funder-client who had asked us to help review and rethink their monitoring and evaluation practices, but we feel it is relevant for a broader audience of donors and implementers. The primer establishes a four-level framework covering the monitoring of inputs and outputs, results monitoring, evaluation of grantee impact, and evaluation of the contributions made to impact by the funders themselves.

Knowledge Management - A Primer

Oct 01, 2018

Giving Evidence; Keystone Accountability;

Foundations create a lot more knowledge than they use, and most don't capture the full value of this data. This primer covers trends in knowledge management: learn what's changed, and discover easier ways to work. If you'd like to take advantage of machine learning and artificial intelligence to generate better insights, it offers tips on how to design better systems and workflows for your team. It also includes a nine-stage scale for assessing how sophisticated your current approach is and where you can improve.

Mona Foundation Monitoring Evaluation Framework: A Field-generated View of Measuring Sustained Change

Mar 02, 2018

Mona Foundation;

In January 2016, Mona Foundation received a grant from the Glaser Progress Foundation to support a process for defining Mona Foundation's theory of change and lay the groundwork for developing a framework to document impact. That goal was accomplished, with Mona Foundation developing a theory of change, partnership framework, and strategic goals. In June 2017, the Glaser Progress Foundation awarded Mona Foundation a grant to identify the metrics and process for documenting change. A committee, made up of several external representatives with expertise in education and evaluation and two Mona Foundation staff, provided guidance and feedback on the process.

