Michael D. Fetters. (2020). The Mixed Methods Research Workbook: Activities for Designing, Implementing, and Publishing Projects. Sage. Paperback, 293 pages.
In The Mixed Methods Research Workbook, Michael D. Fetters has developed a practical resource for scholars, graduate students, and individuals with an interest in mixed methods research (MMR). Since many of the concepts presented are essential to all types of human research, the workbook would make excellent required reading for an introductory graduate-level MMR course. The workbook is intricately scaffolded, with engaging lessons covering vital areas of MMR training—literature review, ethics, publishing, validity, grant applications, and philosophical worldviews.
Mixed Methods Design in Evaluation, by Donna Mertens
Mixed Methods Design in Evaluation by Donna Mertens is a useful, well-organized book that explores the numerous applications and benefits of mixed methods approaches. In the words of the volume editors, after examining published evaluation examples, Mertens “concludes with prompts that engage the reader in thinking about how mixed methods can improve social inquiry” (p. xiv). Mertens argues that mixed methods in evaluation “strengthen the credibility of evaluation findings” (p.
Learning and Leading: Integrating Mixed Methods in a Collaborative Approach to Educational Evaluation
This practice note describes the benefits of integrating mixed methods in a collaborative approach to evaluation with school districts and community partners in southwestern Ontario. We discuss the ways in which the integration of qualitative and quantitative data generated a multi-faceted perspective about a new mental health professional role as a complex educational phenomenon.
Collaborative Evaluation Designs as an Authentic Course Assessment
Competency-Based Evaluation Education: Four Essential Things to Know and Do
Scope Creep and Purposeful Pivots in Developmental Evaluation
A Case Study of the Guiding Principles for Collaborative Approaches to Evaluation in a Developmental Evaluation Context
Recently, Shulha, Whitmore, Cousins, Gilbert, and Al Hudib (2015) proposed a set of principles to guide collaborative practices. Through a case study, we explore these principles in a developmental evaluation context. Data were collected at two points over an 18-month period during a collaboration between a group of evaluators and the program team of a national organization. The article explores the contributions of certain collaborative approaches to upholding these principles within developmental evaluation.
Optimizing Use in the Field of Program Evaluation by Integrating Learning from the Knowledge Field
It has been almost 20 years since Shulha and Cousins (1997) published their landmark article on evaluation use. Their work reviewed a decade, from 1986 to 1996, of theory, practice, and research on evaluation use. Since then, there have been important developments related to the phenomenon of evaluation use. Beyond evaluation, a new field has emerged on the use of research in practice and policy; in health, the term knowledge translation has been used, while in the social sciences the term is knowledge mobilization.
Introduction - Setting the Evaluation Use Context
This special issue honours Dr. Lyn Shulha’s 25-year contribution to the Canadian field of program evaluation by bringing together the perspectives of authors from across North America to identify Dr. Shulha’s influence on their thinking and evaluation practices. Dr. Shulha’s scholarship is best described as a nonlinear influence because the effect of her work on evaluators’ thinking about collaboration, use, standards, and innovation cannot be directly traced.