2018
The editorial team of the Canadian Journal of Program Evaluation (CJPE) is pleased to announce that volume 33(1) is now published on-line. Given recent changes to CES policy regarding open access, the entire volume will be accessible to all starting July 1st. Reproduced below is the introduction to the spring 2018 issue.
This issue of the Canadian Journal of Program Evaluation will interest evaluators working in many different sectors and with a wide range of interests. The articles and practice notes featured in these pages focus on innovative methodological approaches applied to various practice settings, such as health and education. First, the paper by Rusticus, Eva and Peterson argues for construct-aligned rating scales as one of the evaluator's tools, specifically in the area of medical education. The paper makes an important contribution by helping us conceptualize scale development so that data can be collected more efficiently. Next, Rosella and her colleagues show that a team-based knowledge brokering strategy was effective in supporting the use of the Diabetes Population Risk Tool (DPoRT) in public health settings. The following paper, presented by Chen and his co-authors, summarizes the findings of an empirical comparative study of evaluation models using a large-scale education initiative in Taiwan. This paper focuses specifically on the usefulness of evaluation models for planning and development purposes. The paper by Contandriopoulos, Larouche and Duhoux will be of interest to evaluators who work closely with research granting institutions or universities. Using social network analysis methods, these authors find a positive correlation between collaborations and research productivity, and push their investigation further to consider the role played by formal networks in academic collaborations. The following paper, by Mediell and Dionne, presents an empirically developed and validated quality control checklist for evaluation design. The checklist will certainly be of interest to novice and experienced evaluators alike as they design and implement future evaluation studies.
Our practice notes also have something of interest for both evaluation practitioners and researchers. First, Kallemeyn discusses three frameworks for using photographs in evaluation practice: documenting social change, facilitating sense-making, and inspiring and imagining social change. Next, Nadin and her colleagues discuss a new way of obtaining informed consent for youth participation in evaluation research when parental consent is not applicable or possible. And finally, Britto and Visano identify self-assessment performance metrics for Canadian microcredit programs, an area in which the standard performance metrics used by public organizations cannot always be applied.
I hope that these papers provide you with new insights into exciting developments in our field and inspire you to approach your evaluation work with renewed enthusiasm and brand-new additions to your toolbox! The editorial team would love your feedback on these papers and our previous publications, so don't hesitate to contact us!
Isabelle Bourgeois, Ph.D.
Editor-in-Chief