The CES is glad to announce a webinar to be delivered on January 18, 2017, in English by Robert Lahey on connecting the dots between Monitoring (M), Evaluation (E), Results-Based Management (RBM), and deliverology.
Two government-wide changes came along in 2016 that will affect the way Evaluation (E) is conducted and used by federal officials in Canada: one is the introduction of deliverology, with the creation of a centrally led Results and Delivery Unit in the Privy Council Office; the other is the new Treasury Board Policy on Results, which introduces changes to the federal evaluation function, including closer links to performance measurement. Both put a focus on ‘results' (RBM, or results-based management), and both offer opportunities for evaluation, assuming the right conditions hold. Yet there is limited experience in Canada with the deliverology concept, and the roll-out of the new Results Policy affecting evaluation and monitoring is still less than a year old.

The webinar session will take an analytical approach in examining the critical tools needed to measure ‘results', the experience in Canada to date, and what the introduction of deliverology might mean. Drawing on the experience of other countries where deliverology has been introduced, the intent is to identify opportunities for evaluation going forward. This includes consideration of the conditions needed for these opportunities to be realized, and of the potential threats to evaluation and its use in the federal domain and beyond, should the assumptions for success not hold.
WHEN: Wednesday, January 18, 2017 from 12 pm to 1 pm Eastern Time
WHERE: CES webinars take place online using the GoToWebinar platform. Please verify in advance that your computer meets the platform's system requirements.
COST: Free for all CES members.
Profile of the facilitator
Robert Lahey was the founding Head of Canada's Centre of Excellence for Evaluation, the federal government's policy centre for evaluation, and, over three decades, headed the evaluation function in four government departments and agencies in Canada. Since 2004, as President of REL Solutions Inc., Bob has been advising agencies in Canada and internationally on evaluation capacity building, working with the World Bank, UN agencies and directly with several countries in Africa, South America, Europe, the Middle East and the Caribbean. He has developed national Monitoring and Evaluation Strategies and multi-year Action Plans for Botswana, Ethiopia, Guyana and Trinidad and Tobago. His work has been published by the World Bank, the United Nations Evaluation Group (UNEG) and the International Labour Organization (ILO), as well as in journal articles and other publications he has authored or co-authored in recent years. Bob is a member of the Canadian Evaluation Society's (CES) Credentialing Board and an Emeritus member of the CES Educational Fund (CESEF). In 2004, the CES recognized Bob with its award for ‘Contribution to Evaluation in Canada' and, in 2015, inducted him into the CES Fellowship.
The goal of this webinar is to critically appraise what it means, in practice, to ‘measure results', identifying both the technical and the political considerations that Evaluators need to be aware of if they are to remain relevant. In the process, the session examines the relationship between the key tools used to measure ‘results' – Evaluation and Performance Measurement/Monitoring – and ‘deliverology', an RBM-like management approach recently introduced by the Government of Canada.
The intent of this session is to raise awareness of the institutional and broader political considerations that make up a large part of the enabling environment for Evaluation in the public sector. Evaluators need to be cognizant of the reality that ensuring their effectiveness goes well beyond delivering a good (i.e., technically sound) evaluation report.
In addition, this webinar aims to strengthen capacities in line with the following Competencies for Canadian Evaluation Practice:
- Situational practice competencies that focus on the application of evaluative thinking in analyzing and attending to the unique interests, issues and contextual circumstances in which evaluation skills are being applied; and,
- Technical practice competencies, with particular reference to understanding theories of change, the complementarity between Evaluation and Performance Measurement/Monitoring, and their respective use in measuring ‘results'.