Event Date: 2026-05-19 Cooperation Canada/Salanga Deadline: 2026-05-19
Curious about how to design an ethical AI chatbot? Come learn the practical "how-to" of designing ethical AI chatbots in real-world development contexts.
Event Date: 2026-07-06 The Evaluators' Institute Deadline: 2026-07-01
This foundational course introduces a range of basic quantitative and qualitative social science research methods that apply to evaluating various programs. It lays the groundwork for methods developed more fully in other TEI courses and is designed to ensure basic familiarity with a range of social science research methods and concepts. Topics include qualitative research, with special emphasis on focus groups and interviews; experimental design; quasi-experimental design; and survey research methods. The course is suitable for those who want to update their existing knowledge and skills, and will serve as an introduction for those new to the topic.
This three-day, hands-on course teaches evaluators to leverage artificial intelligence and machine learning to transform program administrative data into automated evaluation systems. Using KNIME, a free visual analytics platform that requires no coding experience, participants will learn to implement causal modeling techniques that significantly reduce evaluation timeframes while increasing quasi-experimental rigor, insight depth, and actionability. Through guided exercises, participants will gain the foundational tools and techniques for the complete workflow of modern AI-powered evaluation, including:
- Cleaning and transforming both structured (numeric, ordinal, and categorical) and unstructured (text) program data
- Using large language models for qualitative analysis
- Training machine learning algorithms to conduct causal modeling that identifies and evaluates natural experiments within historical program data
The workshop includes hands-on practice with KNIME, working with sample datasets to build actual evaluation models with automation. Participants will leave with foundational skills to begin applying these techniques in their evaluation work.
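To give a flavor of the quasi-experimental causal modeling described above, here is a minimal, illustrative sketch of a difference-in-differences estimate on toy program data. This is only an assumption about the style of analysis involved: the course itself uses KNIME's no-code visual workflows, and the data values below are invented for illustration.

```python
# Illustrative sketch only: a toy difference-in-differences estimate, one
# common quasi-experimental approach to "natural experiments" in program
# data. All values are hypothetical; the course uses KNIME, not hand-coded
# Python.

def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Estimate a program effect as (treated change) - (control change)."""
    mean = lambda xs: sum(xs) / len(xs)
    treated_change = mean(treated_post) - mean(treated_pre)
    control_change = mean(control_post) - mean(control_pre)
    return treated_change - control_change

# Hypothetical outcome scores before and after a program rollout
effect = diff_in_diff(
    treated_pre=[10, 12, 11], treated_post=[18, 20, 19],
    control_pre=[10, 11, 12], control_post=[13, 14, 12],
)
print(effect)  # treated improved by 8, controls by 2 -> estimated effect 6.0
```

The estimate nets out background trends (the control group's change) from the treated group's change, which is what lends such designs their quasi-experimental rigor.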
Event Date: 2026-07-07 The Evaluators' Institute Deadline: 2026-07-06
Program evaluations are often complex, challenging, multi-faceted endeavors that require evaluators to juggle the needs of interest holders, funder requirements, data collection logistics, and their internal teams. Fortunately, many of these challenges can be minimized with effective evaluation management. In this interactive workshop, we provide tools, resources, and strategies to help evaluators manage their evaluations successfully. During Day 1, using case studies, mini-lectures, and group discussions, we explore traditional evaluation management practices, focusing on the processes and logistics of how to manage an evaluation team and the entire evaluation process from project initiation and contracting through final reporting. During Day 2, we continue to build participants’ evaluation management toolkit by introducing four essential, experience-tested strategies that will elevate all participants’ project management game. Across both days, there will be ample opportunities to share your own perspective, ask relevant questions, and apply the content covered to your own work.
Event Date: 2026-07-09 The Evaluators' Institute Deadline: 2026-07-07
The overall goal of Monitoring and Evaluation (M&E) is to assess program progress in order to optimize outcomes and impact – program results. Monitoring activities systematically observe assumed indicators of results, while evaluation activities build on monitoring indicator data to assess intervention/program effectiveness, the adequacy of program impact pathways, the likelihood of program sustainability, program strengths and weaknesses, the value, merit, and worth of the initiative, and the like. This interactive course focuses on practical application and will cover: the purpose and scope of M&E; engaging stakeholders and establishing an evaluative climate; connecting program design and M&E frameworks; performance and results-based M&E approaches; data collection and methods; measuring program progress and success; and sustaining an M&E culture. Course participants will gain an understanding of M&E frameworks and fundamentals, M&E tools, and practice approaches. Case examples will be used to illustrate the M&E process.