Volume 21, Fall 2006

Simulating or imputing non-participant intervention durations using a flexible semi-parametric model

Authors:
Pages: 181-200

In the evaluation of labour market training programs using matching, evaluators must decide when to start comparing participant outcomes against non-participant outcomes. Measuring outcomes relative to an intervention period permits the separation of training opportunity costs from possible benefits, but an equivalent period must be determined for the comparison group. One method imputes the timing of the intervention for comparison cases from that of the matched participant; with Propensity Score Matching, however, this may produce biased outcome estimates. Instead, the authors develop and apply semi-parametric duration models, using Human Resources and Social Development Canada data, to simulate the timing and duration of the intervention for non-participants.
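The following is a minimal sketch of the general idea only, not the authors' actual specification: it fits a semi-parametric (Cox proportional hazards) duration model to participants' time-to-intervention using the third-party lifelines package, then samples a hypothetical intervention start time for each non-participant from that subject's predicted survival curve. The synthetic data, covariates, and column names are illustrative assumptions, not the HRSDC data.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)

# Synthetic "participants": weeks from a common baseline to intervention start,
# with covariates that shift the hazard. Entirely made-up data.
n = 500
age = rng.integers(20, 60, n)
earnings = rng.normal(30, 8, n)
hazard = 0.05 * np.exp(0.02 * (age - 40) - 0.03 * (earnings - 30))
participants = pd.DataFrame({
    "age": age,
    "earnings": earnings,
    "weeks_to_start": rng.exponential(1.0 / hazard),
    "started": 1,  # every participant eventually starts the intervention
})

# Fit the semi-parametric duration model on participants only.
cph = CoxPHFitter()
cph.fit(participants, duration_col="weeks_to_start", event_col="started")

# Synthetic non-participants (the comparison group): covariates only.
m = 200
comparisons = pd.DataFrame({
    "age": rng.integers(20, 60, m),
    "earnings": rng.normal(30, 8, m),
})

# Simulate a hypothetical intervention start for each non-participant by
# inverse-transform sampling from that subject's predicted survival curve.
surv = cph.predict_survival_function(comparisons)  # rows: times, cols: subjects

def sample_start(surv_col, u):
    # First time the predicted survival drops to u; else the largest time seen.
    below = surv_col.values <= u
    return surv_col.index[below.argmax()] if below.any() else surv_col.index[-1]

u = rng.uniform(size=m)
comparisons["simulated_weeks_to_start"] = [
    sample_start(surv[col], u_j) for col, u_j in zip(surv.columns, u)
]
print(comparisons.head())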

Performance studies: the missing link?

Authors:
Pages: 201-208

Typically, a good measurement strategy to support results-based management includes both ongoing performance measures and periodic evaluations. It is argued that this set of measurement tools is too limited and, not infrequently, results in ongoing performance measures that are costly yet of limited use. It is proposed that, in addition to ongoing performance measures and periodic evaluations, an alternative measurement tool called a performance study should be used in many situations and, further, that in a number of circumstances performance studies should replace specific ongoing performance measures.

Suggestions d'améliorations d'un cadre conceptuel de l'évaluation participative

Authors:
Pages: 1-23

Due to the diversity of practices in participatory evaluation, this evaluation approach is not always easy to teach or practice. This article provides a preliminary attempt to validate the Weaver and Cousins (2005) framework for characterizing three forms of participatory evaluation (stakeholder-based, practical, and empowerment). Applying the framework to three case studies resulted in two recommendations to enhance the framework's validity (detailing the values of the scale and reflecting on the relevance of stakeholder-based evaluation) and one recommendation for further research (verifying the impact of empowerment evaluation).

Challenges of participatory evaluation within a community-based health promotion partnership: Mujer Sana, Comunidad Sana — Healthy Women, Healthy Communities

Authors:
Pages: 25-57

Evaluating multiple-member partnerships is always a challenging task. This article is based on our experiences using a participatory approach to evaluating a community-based health promotion research project. "Mujer Sana, Comunidad Sana—Healthy Women, Healthy Communities" was partnership-based, multi-sector, multicultural, and participatory. We describe our experiences working with participatory methods to evaluate the partnership per se. We applied three evaluation frameworks sequentially, and these provided insight into the functioning of the partnership in a complex and changing environment. Our experiences suggest that, through the process of participatory evaluation, the partnership itself also changed in ways that were not fully captured at the time of the original project evaluation. We reflect on the process with questions that might help other groups consider ways to evaluate partnerships in community-based, participatory health research projects with minority and majority community partners.

Understanding cultural competence through the evaluation of "Breaking the silence: a project to generate critical knowledge about family violence within immigrant communities"

Authors:
Pages: 59-79

This article examines the topical concept of cultural competence for evaluators by presenting reflections on the evaluation of "Breaking the silence: A project to generate critical knowledge about family violence within immigrant communities" as a case example. Experiences of the internal evaluator in relation to cultural competency are explored and implications for practice are presented. Including sufficient time to build relationships, facilitating a learning process, and developing evaluator competencies are among the salient themes presented.

Using community-based participatory research for an online dementia care program

Authors:
Pages: 81-104

In this article we describe our experiences using a Community-Based Participatory Research orientation (CBPR; Minkler & Wallerstein, 2003) with a group of community professionals in healthcare institutions. The purpose of the project was to design, develop, deliver, and evaluate an online dementia care program for registered and non-registered healthcare workers in long-term care homes who work with residents with dementia. The Demand-Driven Learning Model (DDLM; MacDonald, Stodel, Farres, Breithaupt, & Gabriel, 2001) was used to guide this process. The CBPR approach allowed multiple views, attitudes, and experiences to strengthen the content, delivery, and evaluation of the program. By addressing some of the issues involved in the process, we hope that the experiences documented in this article will help others develop research partnerships with community professionals, as well as plan, implement, and evaluate collaborative online healthcare training programs.

Comment favoriser la réussite d'une démarche d'implantation d'un programme au sein d'un milieu d'intervention : leçons tirées d'une étude de cas

Authors:
Pages: 105-131

Using a case study methodology, this article describes the process of implementing a cognitive-behavioural program in eight residential facilities in a Centre jeunesse du Québec. The objectives were to better understand the steps required to consolidate the implementation of the program and to identify structural factors that facilitate and impede the progress of successful implementation. Documentary analysis and qualitative interviews allowed deconstruction of the implementation process into seven phases: initial reluctance, training, exploration, resistance, application, reconciliation, and integration. The analysis shows that several structural factors have a noteworthy influence on the implementation process in this case study: the opportunities for discussion, the quality of the program to be implemented, the organizational context, and the team social climate. These factors largely shape the framework of the implementation process. The results of the case study may be useful in other implementation processes, allowing identification of potential difficulties and the mechanisms required for their resolution.

Development of a framework for comprehensive evaluation of client outcomes in community mental health services

Authors:
Pages: 133-180

Conducting outcomes research on clients with serious mental illness who use community mental health services is a challenge. Causal models that include mediating and moderating variables, drawn from social science evaluation methods, provide a framework for conceptualizing and evaluating the complexity of community mental health services. This article presents the conceptualization and development of a framework for the comprehensive evaluation of client outcomes in community mental health services and describes a case example of operationalizing and testing the framework in an evaluation of Assertive Community Treatment (ACT) in Southwestern Ontario, Canada. The initial framework was developed by hypothesizing a cause-effect pathway and links among delivered treatment variables, the implementation system, external factors, and intermediate and longer-term outcomes. The framework was further validated and modified through stakeholder input. All variables identified in the framework were then operationally defined, and instruments with good psychometric properties were chosen to measure them. This framework can serve as a generic example for the conduct of community mental health evaluations.
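As a generic illustration of what including a mediating variable in a cause-effect pathway can mean operationally (this is not the authors' framework, variables, or instruments), the sketch below estimates total, direct, and indirect effects with ordinary least squares via statsmodels on synthetic data; all names and effect sizes are assumptions.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400

# Entirely synthetic data: treatment -> mediator (service engagement) -> outcome.
treatment = rng.integers(0, 2, n)
engagement = 0.8 * treatment + rng.normal(0, 1, n)
outcome = 0.5 * engagement + 0.2 * treatment + rng.normal(0, 1, n)
df = pd.DataFrame({"treatment": treatment, "engagement": engagement, "outcome": outcome})

# Total effect of treatment on the outcome.
total = smf.ols("outcome ~ treatment", data=df).fit()
# Direct effect, controlling for the mediator; the drop relative to the total
# effect suggests part of the effect runs through the mediator.
direct = smf.ols("outcome ~ treatment + engagement", data=df).fit()
# Path from treatment to the mediator.
path_a = smf.ols("engagement ~ treatment", data=df).fit()

# A moderating variable would instead enter as an interaction term, e.g.
# smf.ols("outcome ~ treatment * some_external_factor", data=df).
print("total effect:  ", round(total.params["treatment"], 3))
print("direct effect: ", round(direct.params["treatment"], 3))
print("indirect (a*b):", round(path_a.params["treatment"] * direct.params["engagement"], 3))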