Developmental evaluation (DE) has emerged as an approach well suited to evaluating innovative early-stage or market-based initiatives that address complex social issues. However, because DE theory and practice are still evolving, there are relatively few examples of its implementation on the ground. This paper reviews the practical experience of a monitoring and evaluation (M&E) team in conducting a developmental evaluation of a Rockefeller Foundation initiative in the field of digital employment for young people, and offers observations and advice on applying developmental evaluation in practice.

Through its work with The Rockefeller Foundation's team and its grantees, the M&E team drew lessons relating to context, intentional learning, tools and processes, trust and communication, and adaptation associated with developmental evaluation. It found that success depends on commissioning a highly qualified DE team with strong interpersonal and communication skills and, whenever possible, some sectoral knowledge. The paper also responds to three major criticisms frequently leveled against developmental evaluation: that it displaces other types of evaluation, is too focused on "soft" methods and indicators, and downplays accountability.

Through its reporting of lessons learned and its responses to the challenges and shortcomings of developmental evaluation, the M&E team makes the case for including developmental evaluation in the evaluation toolbox, recommending that it be employed across a wide range of geographies and sectors. It also calls for future undertakings to experiment with new combinations of methods within the DE framework to strengthen its causal, quantitative, and accountability dimensions.
You're encouraged to use data to describe your program's benefits, but where do you start? Lengthy reports and big spreadsheets are hard to interpret and don't always help you prioritize next steps or plan for program improvement. For many people, data analysis can seem like a daunting task requiring specialized knowledge and years of training. This guide provides a gentle introduction to practical approaches for exploring and analyzing the data commonly used in youth-serving organizations. The activities in this guide are suitable for a variety of group sizes and stakeholders, so you can draw on the experiences, knowledge, and insights of everyone involved in your program, including youth!
This best practice outlines the aspects of successful program management and evaluation, with a focus on the types of software that can facilitate them. It breaks program evaluation data collection, use, and reporting down into five discrete elements (central program data hub, auxiliary data systems, proactive data gathering, existing data gathering, and reporting/visualization) and examines the tools that various types of organizations use in each category.