Volume 28, 2013 - Spring

The Challenges of Developmental Evaluation in Research: An Implementation Analysis of a "Health Promoting Hospital" Project

Pages: 1-26

Developmental evaluation (DE), an approach developed by Patton to support the implementation of innovations, remains relatively untested in research. This article describes an attempt to apply DE as part of an implementation study of a health-promoting hospital project in a university hospital. The article highlights the dichotomous demands of academic research and evaluation use inherent in applying DE from a research perspective, and analyzes the challenges encountered in this case. The conclusion is that DE can be used in research, provided that a preliminary context analysis is done to anticipate the challenges of integrating the demands of research and evaluation use.

The Reciprocal Relationship Between Implementation Theory and Program Theory in Assisting Program Design and Decision-Making

Pages: 27-41

This article focuses on how Theory-Driven Evaluation (TDE) and two of its central tenets—program theory and implementation theory—can be used simultaneously to inform and assist programmatic decision-making. The article argues that there is a paucity of evaluation literature demonstrating how program theory can benefit the design and interpretation of implementation theory. A case example illustrates the importance of program theory in developing and interpreting implementation theory.

Using the Results of Economic Evaluations of Public Health Interventions: Challenges and Proposals

Pages: 43-66

Faced with the combined pressures of economic recession and growing healthcare costs, public health administrators recognize the value of using economic arguments to justify public health interventions. Given the expense and time involved in conducting new economic evaluations, decision-makers regularly speculate on the possibility of using results from studies done in a different context. This article analyzes the potential for using the results of economic evaluations of public health interventions in contexts other than those in which the studies were done. More specifically, it sheds light on issues of quality and transferability of analyses for public health decision-making and offers practical proposals for increasing the transferability of studies.

Comparison of the Use of Self-Report Surveys and Organizational Documents in Knowledge Translation Research

Pages: 67-85

We compared the same outcome data obtained from two different sources (self-report surveys and organizational documents) to examine their relative performance in evaluating the effect of knowledge translation strategies on evidence-informed decision-making. Our data came from a randomized controlled trial that evaluated the impact of knowledge translation strategies on promoting evidence-informed decision-making in public health units across Canada. We found that self-report surveys identified more outcome data than organizational documents; the types of documents that identified the most outcome data were evaluation plans, operational plans, work plans, and evaluation data; the types that identified the least were meeting minutes, statistics/annual reports, and strategic plans; and evaluation plans, operational plans, and work plans together provided more outcome data than other combinations. Overall, our study suggests that evidence-informed decision-making may be appropriately measured by using multiple data sources, allowing comparison across sources and a more accurate representation of the results. Our findings also suggest that if organizational documents are used as a data source in knowledge translation research, specific types should be used to maximize the likelihood of identifying measures of effectiveness.

Learning Circles for Advanced Professional Development in Evaluation

Pages: 87-96

Studies of Canadian evaluators have consistently shown them to be dissatisfied with opportunities for advanced training, suggesting a need to diversify the forms of professional development available to seasoned evaluators. This article reports on a trial implementation of an alternative learning model: learning circles for advanced professional development in evaluation. This model is grounded in approaches drawn from self-directed learning, self-improvement movements, adult and popular education, quality improvement, and professional journal clubs. Learning circles bring together experienced practitioners in structured collaborative learning cycles about topics of mutual interest. We experimented with an evaluation learning circle over several cycles, and report on what we learned about purpose, process, and outcomes for professional development. We hope that this model will be of interest to other evaluators, especially in the context of the competency maintenance requirements of the CE designation.

BOOK REVIEW: Fetterman, D. M. (2013). Empowerment Evaluation in the Digital Villages: Hewlett Packard's $15 Million Race Towards Social Justice. Stanford, CA: Stanford University Press. 154 pages. Available in paperback (ISBN 978-0-8047-8112-1),...

Pages: 97-99

BOOK REVIEW: Hurteau, M., Houle, S., & Guillemette, F. (Eds.). (2012). L'évaluation de programme axée sur le jugement crédible. Québec, QC: Presses de l'Université du Québec. 200 pages. Available in paperback (ISBN 978-27605-3548-0) and...

BOOK REVIEW: Bold, C. (2012). Using Narrative in Research. London: Sage Publications Ltd. 200 pages. Available in paperback (ISBN 978-1-8486- 0719-4), hardcover (ISBN 978-1-8486-0718-7), and e-book (ISBN 978-1-4462-5426-4).

Pages: 107-110
