Michael D. Fetters. (2020). The Mixed Methods Research Workbook: Activities for Designing, Implementing, and Publishing Projects. Sage. Paperback, 293 pages.
In The Mixed Methods Research Workbook, Michael D. Fetters has developed a practical resource for scholars, graduate students, and anyone with an interest in mixed methods research (MMR). Since many of the concepts presented are essential to all types of human research, the workbook would make excellent required reading for an introductory graduate-level MMR course. The workbook is intricately scaffolded, with engaging lessons covering vital areas of MMR training—literature review, ethics, publishing, validity, grant applications, and philosophical worldviews.
Mixed Methods Design in Evaluation, by Donna Mertens
Mixed Methods Design in Evaluation by Donna Mertens is a useful, well-organized book that explores the numerous applications and benefits of mixed methods approaches. In the words of the volume editors, after examining published evaluation examples, Mertens “concludes with prompts that engage the reader in thinking about how mixed methods can improve social inquiry” (p. xiv). Mertens maintains that mixed methods in evaluation “strengthen the credibility of evaluation findings” (p.
Learning and Leading: Integrating Mixed Methods in a Collaborative Approach to Educational Evaluation
This practice note describes the benefits of integrating mixed methods in a collaborative approach to evaluation with school districts and community partners in southwestern Ontario. We discuss the ways in which the integration of qualitative and quantitative data generated a multi-faceted perspective about a new mental health professional role as a complex educational phenomenon.
Collaborative Evaluation Designs as an Authentic Course Assessment
Competency-Based Evaluation Education: Four Essential Things to Know and Do
Scope Creep and Purposeful Pivots in Developmental Evaluation
A Case Study of the Guiding Principles for Collaborative Approaches to Evaluation in a Developmental Evaluation Context
Recently, Shulha, Whitmore, Cousins, Gilbert, and al Hudib (2015) proposed a set of evidence-based principles to guide collaboration. Our research takes a case study approach to explore these principles in a developmental evaluation context. Data were collected at two points during an 18-month period in which an evaluation group collaborated with the program team from a national organization. This article explores the contributions of selected collaborative approaches to evaluation principles as they are applied in a developmental evaluation.
Optimizing Use in the Field of Program Evaluation by Integrating Learning from the Knowledge Field
It has been almost 20 years since Shulha and Cousins (1997) published their seminal paper exploring evaluation use. The paper examined a decade, 1986 to 1996, of theory, practice, and research on evaluation use. Since that time, there have been significant developments related to the phenomenon of evaluation use. Outside of evaluation, a new and burgeoning field has focused on the use of research in practice and policy; in health care, the term knowledge translation has been used, while in the social sciences the term is knowledge mobilization.
Introduction - Setting the Evaluation Use Context
This special issue honours Dr. Lyn Shulha’s 25-year contribution to the Canadian field of program evaluation by bringing together the perspectives of authors from across North America to identify Dr. Shulha’s influence on their thinking and evaluation practices. Dr. Shulha’s scholarship is best described as a nonlinear influence because the effect of her work on evaluators’ thinking about collaboration, use, standards, and innovation cannot be directly traced.