
Volume 22, 2007 - Special Issue

Secondary Analysis with Minority Group Data: A Research Team’s Account of the Challenges

Authors:
Pages: 73-97

Understanding the challenges associated with conducting secondary analysis of large-scale assessment data is important for identifying the strengths and weaknesses of various statistical models, and it can lead to the improvement of this type of research. The challenges encountered in the analysis of assessment data from subpopulations may be of particular value for this purpose. To date, few studies have discussed the problems associated with the secondary analysis of large-scale assessment data. By relating the experiences of a research team that engaged in several projects involving secondary analyses of linguistic minority population data from three different large-scale assessment programs, this article aims to help readers understand the practical, conceptual, and technical/statistical challenges that can be encountered.

Évaluations à grande échelle de l'écriture : lien entre le score holistique et les composantes de l'écriture / Large-scale Writing Assessments: The Link Between the Holistic Score and Writing Components

Authors:
Pages: 99-119

Holistic scores generated by large-scale writing assessments should serve as indicators for decision makers. But to what extent do holistic scores represent the different components of writing? Based on a sample of essays written by 3,107 13- and 16-year-old Canadian students, the analyses show that six writing components are related to the holistic score and that these relationships do not vary by language (essays written in English versus essays written in French). The results support a discussion of how holistic scores should be interpreted in the context of writing assessments.

School Determinants of Achievement in Writing: Implications for School Management in Minority Settings

Authors:
Pages: 121-150

This study identified school factors that determine writing achievement for 13- and 16-year-old Francophone students in minority settings (Manitoba, Ontario, New Brunswick, and Nova Scotia) and a majority setting (Quebec) (N = 5,700). Factor analysis retained three factors, which were then subjected to binary logistic regression against students’ academic performance: Human and Material Resources and School-Community-Family Relations; Principal’s Vision and Beliefs; and Rules and Procedures. The logistic regression identified two major determinants of writing achievement: Human and Material Resources and School-Community-Family Relations, and Principal’s Vision and Beliefs. In minority settings, t-tests showed significant deficits in Human and Material Resources and School-Community-Family Relations, affecting schools’ ability to deliver their teaching programs. The influence of principals and of teaching staff (considered individually) on general activities, school programs, and staff morale was weaker in minority than in majority contexts, whereas community support, school spirit, and students’ and teachers’ level of pride were stronger in minority settings.
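As an illustration only, and not the authors' actual analysis, the following minimal Python sketch shows what a factor-analysis-then-binary-logistic-regression workflow of this general kind can look like; the synthetic data, item counts, loadings, and outcome definition are all assumptions made for the example.

```python
# Illustrative sketch only: factor analysis followed by binary logistic
# regression on synthetic data, loosely mirroring the design described above.
# All data values, item counts, and loadings here are assumptions.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_students = 5700                      # matches the reported N; data are synthetic

# Synthetic questionnaire items presumed to load on three latent school factors
latent = rng.normal(size=(n_students, 3))
loadings = rng.normal(scale=0.5, size=(3, 12))
items = latent @ loadings + rng.normal(scale=1.0, size=(n_students, 12))

# Step 1: retain three factors from the questionnaire items
fa = FactorAnalysis(n_components=3, random_state=0)
factor_scores = fa.fit_transform(items)

# Step 2: a binary achievement outcome (e.g., meeting the writing standard),
# generated synthetically for the example
logit = 0.8 * factor_scores[:, 0] + 0.5 * factor_scores[:, 1]
passed = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Step 3: binary logistic regression of achievement on the factor scores
model = LogisticRegression().fit(factor_scores, passed)
print("Coefficient per factor:", model.coef_.round(2))
```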

La comparabilité des échantillons dans les enquêtes de l'International Association for the Evaluation of Educational Achievement / Sample Comparability in the Surveys of the International Association for the Evaluation of Educational Achievement

Authors:
Pages: 151-156

This research and practice note offers a perspective on interpreting student results from the International Association for the Evaluation of Educational Achievement's TIMSS 2003 survey, focusing specifically on the sampling plan for participating schools and students. Because the observations concern sampling rather than mathematics content, they can be extended to other surveys of this kind, including those not focused on mathematics.

Book reviews / Comptes rendus de livres

Commentaire de la rédactrice invitée : Programmes d'évaluation du rendement en lecture et écriture : Quelles leçons en tirer? / Guest Editor's Comment: Achievement Assessment Programs in Reading and Writing: What Lessons Can Be Learned?

Authors:
Pages: 165-181

Un mot de la rédactrice invitée / Guest Editor’s Remarks

Authors:
Pages: v-xii

Teachers’ and researchers’ uses of assessment and evaluation can bring reading and writing together

Authors:
Pages: 1-28

Researchers and teachers of reading and writing can assess from different viewpoints or from a common one. In this manuscript, two viewpoints, a responsive view for writing and a developmental view for reading, illustrate these different vantage points, and the responsive view is used to show one way of bringing reading and writing together. In general, this article advocates assessment for learning, as distinct from assessment of learning. Overall, my goal is to examine whether our uses of assessment and evaluation derive from our beliefs. Sometimes, as will be shown, this is the case; at other times there appears to be a discrepancy between what researchers and teachers assess and what they value. This article may provide guidelines for assessment in classroom contexts, for uses of large-scale assessments, and for program evaluation.

Walking in their shoes: students’ perceptions of large-scale high-stakes testing

Authors:
Pages: 29-52

With the implementation of the Ontario Secondary School Literacy Test (OSSLT) in 2002, Ontario became the first province in Canada to require successful completion of a large-scale, high-stakes literacy test for high school graduation. We explored and analyzed students' perceptions of this testing program in terms of their preparation for the test, the test's impact and value, and the potential influence of such a program on students' views about literacy. Our study used qualitative data obtained through focus groups and interviews with students who were either successful or unsuccessful on the OSSLT. The students recalled specific test preparation for the OSSLT, often at the expense of their regular schooling. They received a clear message to follow test instructions, which led to formulaic test responses and a narrowly expressed view of literacy. Potentially important differences were also found between the successful and unsuccessful students.

Design and Development Issues in Provincial Large-scale Assessments: Designing Assessments to Inform Policy and Practice

Authors:
Pages: 53-71

Over the past four decades, there has been much debate on key sources of data in evaluating education, determining school effectiveness, and providing evidence to inform accountability and education planning. Entangled in this debate has been the extent to which large-scale assessments of learning provide valid evidence about the quality of schooling and education in Canada and how they can be used to inform education practice and policy. This article discusses five issues in large-scale assessments that are key to their usefulness and to making valid inferences. Based on recent research on assessment design and validity, the authors offer recommendations for designing large-scale assessments to better serve the multiple purposes for which they are intended.