This publication offers a brief overview of how grantmakers are looking at evaluation through an organizational learning and effectiveness lens. It is based on a review of the current literature on evaluation and learning, outreach to grantmakers that have made these activities a priority, and the work of GEO and the Council to raise this issue more prominently among their memberships. Many of these grantmakers are testing new approaches to gathering and sharing information about their work and the work of their grantees. We share the learning and evaluation stories of 19 GEO members in the pages that follow.
For foundations, there are many questions to reflect on when deciding which evaluation practices best align with their strategy, culture and mission. How much should a foundation invest in evaluation? What can it do to ensure that the information it receives from evaluation is useful? With whom should it share what it has learned? Considering these questions in light of benchmarking data about what other foundations are doing can be both informative and important.

Developed in partnership with the Center for Evaluation Innovation (CEI), Benchmarking Foundation Evaluation Practices is the most comprehensive data collection effort to date on evaluation practices at foundations. The report shares data points and infographics on crucial topics related to evaluation at foundations, such as evaluation staffing and structures, investment in evaluation work, and the usefulness of evaluation information.

Findings in the report are based on survey responses from individuals who were either the most senior evaluation or program staff at foundations in the U.S. and Canada giving at least $10 million annually, or members of the Evaluation Roundtable, a network of foundation leaders in evaluation convened by CEI.
This free digital resource provides a clear and systematic guide for managers of an evaluation, whether it is being conducted by an external evaluator, an internal team or a hybrid team. In addition to guidance, it provides links to further detail and examples as required. A particular feature is the GeneraTOR, which prompts for specific information to produce a draft Terms of Reference document that can be shared, reviewed and finalised with other stakeholders.
Many funders are already committed to evaluation as a tool to improve their programs and strategies, and are seeking ways to engage in this learning process in partnership with grantees. One critical aspect of understanding and growing the impact of grantmaking dollars is supporting the capacity of grantees to assess their progress and adopt a learning-for-improvement mindset. Many nonprofits already collect and analyze data on program performance and hope to develop more comprehensive learning agendas. However, the nonprofit sector continues to wrestle with finding the resources, time and space to take evaluation to its full potential. This Smarter Grantmaking Playbook piece offers grantmakers ideas for how to provide evaluation capacity support more effectively.
A majority of grantmakers are struggling to make evaluation and learning meaningful to anyone outside their organizations. Not only is evaluation conducted primarily for internal purposes, but it is usually done by the grantmaker entirely on its own -- with no outside learning partners except perhaps an external evaluator -- and it provides little value, and may even be burdensome, to the grantee. It may be that some funders do not consider expanding the scope of learning efforts beyond their own walls. Or perhaps the gap is driven by funding constraints or funder-grantee dynamics. In any case, grantees and other stakeholders are critical partners in the achievement of grantmakers' missions and are therefore critical learning partners as well.

In this publication, GEO offers actionable ideas and practices to help grantmakers make learning with others a priority. The publication includes stories about foundations that are learning together with a variety of partners, plus a discussion of the key questions that can help shape successful shared learning. It is based on research and interviews conducted from late 2013 to 2015, including extensive outreach to grantmakers, evaluation practitioners and others. The focus of GEO's inquiry: documenting the challenges facing grantmakers as they set out to learn with others, lifting up what it takes to do this work successfully and identifying grantmakers that show a commitment to learning together.
We felt we should begin our work together by crafting a common definition of "high-performance organization." We knew that without a thoughtfully developed, thoroughly vetted definition of "high performance," any call for raising performance in our sector would ring hollow. In addition to providing a common definition of "high performance," the PI also lays out in detail the seven organizational pillars that can help you achieve high performance. To crib from the late author Stephen Covey, these are the seven habits of highly effective organizations.

We do not intend this document to be a manifesto. We hope it will be a North Star to guide leaders on a journey of continuous learning and improvement -- so they can make as much difference as they possibly can for the people and causes they serve.
Place-based initiatives involve multiple partners joining together to tackle pressing community-wide issues. In order to better understand and quantify the positive impact of these complex, long-term initiatives, this publication offers a framework of measures and potential indicators that can help grantmakers evaluate and ultimately improve their work.
Foundations need to build new ways of thinking and interaction that help to combat cognitive traps, support rigorous inquiry, and foster more deliberative decision making. This brief highlights several common cognitive traps that can trip up philanthropic decision making, and suggests straightforward steps that strategists, evaluators, and organizational learning staff can take to address them.
As part of our ongoing work to strengthen our support for communities, the trustees and staff of the Otto Bremer Foundation engaged in a series of learning seminars on evaluation. In order to make the core concepts easily accessible and retrievable, we asked Michael Quinn Patton, who led these seminars, to create a set of basic reference cards. These became the Evaluation Flash Cards presented here, with the idea that a core concept can be revisited "in a flash." Illustrations of the concepts are drawn from Otto Bremer Foundation grants. We hope this resource is useful to other organizations committed to understanding and improving the results of the programs they support.
This Learning Brief draws from literature and research, as well as more than a dozen interviews with foundation leaders, evaluation practitioners, and social sector thought leaders, with the intention of starting the conversation in the field around Next Generation Evaluation characteristics and approaches.
This briefing shares five principles for engaging community stakeholders in evaluation planning, data collection and the interpretation and use of findings as part of place-based initiatives. These insights emerged from the shared experiences of grantmakers and evaluation practitioners during the first year of GEO's "Embrace Complexity" Community of Practice -- a group focused on the evaluation of place-based grantmaking.
Foundations have important but unrealized potential to contribute value to strategy by building, supporting, and engaging in learning. While learning is important for strategic success in most circumstances, it becomes essential when foundations take on large, extraordinarily difficult and complex concerns: improving food security in Africa; addressing global warming, poverty, or issues of equity in difficult urban settings.

This article identifies three common "traps" that hinder foundation capacity to learn and adapt: 1) linearity and certainty bias; 2) the autopilot effect; and 3) indicator blindness. The authors propose a framework to avoid these traps. Through learning from action, a truly powerful strategy -- one with the potential to foster change and better outcomes -- can emerge and take hold.