Winner, Award for Contribution to Evaluation in Canada, 2000

Harry Cummings has been a member of the Canadian Evaluation Society (CES) since 1989. He teaches program evaluation, rural development planning, regional economics, and research methods at the University of Guelph and provides consulting services through his company, Harry Cummings and Associates (HCA). He was interviewed by Michael Obrecht, a member of the CES since 1983, who is currently with the Canadian Institutes of Health Research (CIHR), a federal organization responsible for the creation of new health knowledge, its transfer to users in the health care system and health-related firms, and innovation and integration in the Canadian health research system. The interview was conducted by e-mail.

MO: Harry, I have been involved with the CES Evaluation Case Competition since its inception in 1996. The fact that for two years in a row you coached the winning team of students has made me curious about your views on program evaluation and your approach to teaching evaluative skills. First, could you tell us about your initial exposure to program evaluation and how your involvement in program evaluation has evolved since then?

HC: I was initially exposed to program evaluation while working for the federal government in Edmonton, Alberta. I worked in regional development, and we had to have an evaluation plan in place for our program. I remember very clearly being concerned about how we would evaluate our program when the province handled the implementation in most cases. I suppose my first exposure might have been through learning cost-benefit analysis in my graduate and undergraduate geography degrees. Since then, I have become heavily involved in teaching and doing evaluations. My first evaluations were of international development programs run by CIDA (the Canadian International Development Agency). My first teaching in evaluation was in Guelph in about 1984. I have developed strong interests in applied methods and in the use of program logic, logical frameworks, and results-based management, and I teach and use all of these tools in Canadian and international contexts.

MO: So, would I be right in guessing that you bring extensive practical experience to bear on your teaching of evaluation?

HC: Yes, lots of practical experience has been a big help in the classroom. First of all, it keeps you current and gives you credibility. My evaluation experience means I can always speak to what is useful in evaluation. I also use my evaluation reports as case studies. I am excited about the evaluation work I do, and that shows in the classroom, I hope. Secondly, it gives you lots of material for teaching. I learned how to do Logical Framework Analysis (LFA) while working on an international evaluation. I then read more about it and subsequently taught my students how to use the Logical Framework. This also feeds back into my evaluation work. Because I have taught the LFA method, read about it, and expanded the ideas to logic models more generally, I hope I do a better job of evaluation practice. The variety of experience I have also works well in the classroom: international, Canadian, economic impact, health and quality assurance, community development, etc., are all present in my work and in the interests of my students.

MO: Just in case some readers must, as I do, confess ignorance about Logical Framework Analysis (LFA), could you recommend an article or book about it?

HC: Readers may wish to look at the Canadian Journal of Development Studies (University of Ottawa), Special Issue, Vol. XVIII, 1997, Results Based Performance Reviews and Evaluations, edited by H. Cummings. It contains articles by myself (pp. 587-596) and by Sawadogo and Dunlop (pp. 597-612).

MO: In teaching a group of students who may not previously have thought about program evaluation, what is the one basic principle that you consider the most important to instill?

HC: Evaluation makes sense. If you are going to invest in a program or project, you want to learn whether your investment was a success. If it was not a success, why not, and how can you do it better? Evaluation will help you answer those questions. More importantly, evaluation gives you a systematic approach to assessing the program or project that ensures all angles are covered. Of course, this assumes you learn to do evaluation well.

MO: In the Evaluation Case Competition, each team of students has only five hours in which to read and understand an evaluation case file and then develop specific recommendations in response to it. Of the many things you tell your students about program evaluation, what do you think best prepares them for outstanding performance in this highly intense activity?

HC: The most important thing is to be flexible and innovative, and to think outside the box. Apply systematic evaluation approaches, but don't get locked into the jargon. Other key elements are: appoint a team leader, manage your time, and have a conflict resolution strategy. And last but not least — have fun!!

MO: Speaking of fun, I learned from one of the members of the 1998 winning team that your coaching had included an evening discussion on evaluation over pizza at your place. Is working with the students in that sort of relaxed environment part of the team-building that prepares them for success in the Competition?

HC: Yes, Michael, building a commitment to the Competition and to the team is a very important part of the process. Also, getting away from the University allows us to focus without distraction. Students also need to know how committed I am to evaluation and to them. Making my home available to them sends that clear message — and it gives them the chance to ask any questions they might have.

MO: I suppose it is also quite important that the time and effort students put into the Competition be recognized as the equivalent of evaluation course work. Do you give students some sort of academic credit for participating in the Competition?

HC: Yes. I develop a learning contract with the group of students. The contract has two levels, depending on how far students go in the Competition (round 1 or round 2). Their involvement in the Competition may count for anywhere from 25% to 75% of their final mark in my program evaluation course.

MO: In chatting with members of the winning teams that you coached in 1998 and 1999, I got the sense that the students were exceptionally gifted and cosmopolitan in outlook. Do you think that the Program in Rural Planning and Development tends to attract unusually talented students?

HC: That sounds like a compliment that I am proud to accept, Michael. They certainly are cosmopolitan in outlook and very experienced in group work. We have a nice combination of international and Canadian content in our student body and program. By the time students reach the graduate level, they are generally a bright and committed group.

MO: Well thanks very much, Harry. I really enjoyed this e-mail exchange.

HC: Thank you, Michael, and others for organizing this Competition. It certainly has changed how I teach evaluation at Guelph.