Event Date: 2026-07-14 The Evaluators' Institute Deadline: 2026-07-10
This course presents the tools and techniques of cost-benefit and cost-effectiveness analysis, with the goal of equipping analysts to interpret both kinds of analyses. Content includes identification and measurement of costs using the ingredients method; how to specify effectiveness; shadow pricing of benefits using revealed preference and contingent valuation methods; discounting; and calculation of cost-effectiveness ratios, net present value, cost-benefit ratios, and internal rates of return. Sensitivity testing and uncertainty will also be addressed. Participants will work in groups to assess the costs, effects, and benefits of selected case studies drawn from a range of policy fields (e.g., health, education, environmental science).
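The metrics named above can be sketched in a few lines of plain Python. This is an illustrative example only, not course material; the cash flows, the 3% discount rate, and the bisection-based IRR solver are all assumptions made for demonstration:

```python
# Illustrative sketch of two core cost-benefit metrics: net present value
# and internal rate of return. All numbers here are invented for demonstration.

def npv(rate, flows):
    """Net present value, where flows[t] occurs at the end of year t."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

def irr(flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return (the rate at which NPV = 0), found by
    bisection; assumes NPV is decreasing in the rate over [lo, hi]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical program: $1,000 cost up front, $400 in benefits each year for 3 years.
flows = [-1000, 400, 400, 400]
print(round(npv(0.03, flows), 2))  # NPV at a 3% discount rate
print(round(irr(flows), 4))        # break-even discount rate
```

A benefit-cost ratio would divide the discounted benefit stream by the discounted cost stream using the same `npv` helper applied to each stream separately.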
Event Date: 2026-07-15 The Evaluators' Institute Deadline: 2026-07-13
Most evaluation data are more complex than standard statistical methods assume. The constructs we care about cannot be observed directly and are measured imperfectly through surveys and instruments. At the same time, participants are nested within sites, sites within regions, and measures are often repeated over time. When evaluators apply conventional approaches to data like these, the result is often biased estimates, misleading conclusions, and findings that don’t hold up to scrutiny. This three-day course introduces structural equation modeling (SEM), including confirmatory factor analysis (CFA) and multilevel modeling (MLM). The course is organized around problems that evaluators regularly encounter, with methods introduced as tools to address them. Each day builds on the last, moving from measurement to data structure to their integration. Lectures and discussions are paired with applied software demonstrations throughout the course. Participants are encouraged to bring their own data for use during Day 3 application sessions.
Strategic planning is becoming a common practice for governments, nonprofit organizations, businesses, and collaborations. The severe stresses facing these entities make strategic planning more necessary than ever. To be truly effective, strategic planning should include systematic learning informed by evaluation. This course examines the theory and practice of strategic planning and management with an emphasis on practical approaches to identifying and effectively addressing organizational challenges – and doing so in a way that makes systematic learning and evaluation possible. The approach engages evaluators much earlier in the process of organizational and programmatic design and change than is usual. Topics covered include understanding why strategic planning has become so important; understanding what strategic planning is – and is not; the Strategy Change Cycle; key strategic planning tools and techniques; and designing formative, summative, and developmental evaluations of strategic planning processes, missions, strategies, and organizational performance.
Event Date: 2026-07-16 The Evaluators' Institute Deadline: 2026-07-14
In an era of rapid change, evaluators are increasingly called upon to lead – whether guiding teams, managing stakeholder (interest holder) relationships, or navigating complex ethical landscapes. This two-day, hands-on workshop equips current and aspiring evaluator-leaders with a practical toolkit to lead people and projects with confidence. Blending evidence-based frameworks, positive psychology, and easy-to-use AI tools, you’ll build repeatable habits and tools for leading in dynamic contexts. Through hands-on activities and guided reflection, you will:
- Explore how self-awareness and a “change mindset” drive ethical leadership.
- Use digital self-assessments to identify strengths, leadership styles, and growth areas.
- Practice reflective journaling and scenario-based reflection with AI tools.
- Conduct evaluator bias checks to support more equitable decision-making.
- Build a future-oriented leadership vision using storytelling and generative AI.
- Apply strategies to lead with clarity and integrity during times of uncertainty.
Qualitative data analysis involves creativity, sensitivity, and rigor. In its most basic form, it involves some form of labeling, coding, and clustering to make sense of data collected from evaluation fieldwork, interviews, and/or document analysis. This intermediate-level workshop builds on the basic coding and categorizing familiar to most evaluators and extends the array of strategies available to support rigorous interpretation, with an emphasis on procedures for the analysis of interview data. Strategies such as enumerative and interpretive content analysis, thematic analysis, narrative analysis, and the framework method of analysis are presented and illustrated with examples from evaluation and from a range of disciplines, including sociology, education, political science, and psychology. Issues of quality – including the validity, trustworthiness, and authenticity of qualitative data – are integrated throughout the workshop.