Assessment Tools

Student Attitude Surveys: We have developed a Learning Attitudes Survey for Statistics. Students complete this online attitude survey near the start and end of STAT 200. The survey gauges how students perceive the relevance of the discipline, their enthusiasm for studying it, and how they go about learning in Statistics. A robust method for analyzing the resulting data has been devised and implemented in R (a freely available package for statistical computing), and a user guide has been created. Anyone wishing to apply our method to their own data should contact Dr. Bruce Dunham. A description of the method, and our findings from the analysis of our data, are being written up for future publication.
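Our actual analysis method is unpublished and implemented in R; the Python sketch below only illustrates the kind of pre/post comparison a survey of this sort invites. The item wording, the 1-5 Likert coding, and the shift summary are illustrative assumptions, not the method itself.

```python
# Hypothetical pre/post comparison of one Likert-scale attitude item.
# The item, coding, and summary statistics are illustrative only.
from statistics import mean

def shift_summary(pre, post):
    """Summarize per-student shifts on one survey item.

    pre, post: lists of Likert scores (1-5) for the same students,
    in the same order, at the start and end of the course.
    """
    shifts = [b - a for a, b in zip(pre, post)]
    return {
        "mean_shift": mean(shifts),
        "improved": sum(s > 0 for s in shifts),
        "declined": sum(s < 0 for s in shifts),
        "unchanged": sum(s == 0 for s in shifts),
    }

# Example: responses to a hypothetical item such as
# "Statistics is relevant to my future career."
pre = [3, 2, 4, 3, 2, 5, 3]
post = [4, 3, 4, 4, 2, 5, 4]
print(shift_summary(pre, post))
```

A real analysis would also need to handle students who complete only one of the two administrations; the paired structure above assumes matched responses.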

Concept Inventory for STAT 241/251: Work is ongoing on the validation of a proposed concept inventory for STAT 241/251. This course is a calculus-based introduction to probability and statistics; although such courses are widely offered, no other concept inventory exists for them. Any instructor wishing to trial this concept inventory should contact Dr. Bruce Dunham.
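One standard ingredient of concept-inventory validation is classical item analysis: each item's difficulty (proportion correct) and discrimination (how well it separates strong from weak students). The sketch below shows a simple top-half/bottom-half version of these indices; it is a generic illustration, not the validation procedure being used for the STAT 241/251 inventory.

```python
# Classical item analysis for dichotomously scored inventory items.
# Illustrative only; not the actual STAT 241/251 validation procedure.

def item_statistics(responses):
    """Compute per-item difficulty and discrimination indices.

    responses: list of lists; responses[s][i] is 1 if student s
    answered item i correctly, else 0.
    """
    n_students = len(responses)
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    # Rank students by total score; compare top and bottom halves.
    order = sorted(range(n_students), key=lambda s: totals[s])
    half = n_students // 2
    bottom, top = order[:half], order[-half:]
    stats = []
    for i in range(n_items):
        p = sum(responses[s][i] for s in range(n_students)) / n_students
        p_top = sum(responses[s][i] for s in top) / half
        p_bot = sum(responses[s][i] for s in bottom) / half
        stats.append({"difficulty": p, "discrimination": p_top - p_bot})
    return stats
```

Items with very high or very low difficulty, or near-zero discrimination, are the usual candidates for revision during validation.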

WeBWorK Online Homework Tool: We are developing and implementing online homework problems for the large-enrolment courses. The online homework application WeBWorK has been enhanced to integrate the statistical software R, and questions are being devised that use R's capacity to generate data, perform analyses, and create graphics. WeBWorK homework sets are currently used in STAT 200, 203, 241/251, 300, 302, 305, 404, and 443.
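WeBWorK questions are written in its own PG language, not Python, so the sketch below only illustrates the underlying idea: each student receives freshly generated data, and the correct answer is computed from that data so that responses can be auto-graded. The question wording, parameters, and tolerance are illustrative assumptions.

```python
# Illustration of a data-generating, auto-graded homework question.
# Not actual WeBWorK PG code; parameters and wording are hypothetical.
import random
from statistics import mean, stdev

def make_question(seed):
    """Generate a per-student sample and its model answer."""
    rng = random.Random(seed)  # per-student seed gives reproducible data
    data = [round(rng.gauss(50, 10), 1) for _ in range(12)]
    prompt = (f"The sample below gives exam scores for 12 students: {data}. "
              "Compute the sample mean and sample standard deviation.")
    answer = {"mean": round(mean(data), 2), "sd": round(stdev(data), 2)}
    return prompt, data, answer

def grade(submitted, answer, tol=0.01):
    """Accept a submitted answer within a small numerical tolerance."""
    return all(abs(submitted[k] - answer[k]) <= tol for k in answer)

prompt, data, answer = make_question(seed=1)
print(prompt)
```

Because the seed determines the data, an instructor can regenerate any student's version of the question when reviewing a disputed answer.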

Assessing the Difficulty Level of Examinations: When a course is transformed, changes are inevitably reflected in its assessment tools. Typically, for example, examination questions become more concept-oriented after the methods of teaching and learning change, which makes it difficult to evaluate the effectiveness of the new pedagogy. One promising approach is to calibrate the difficulty of an examination by mapping its questions to levels of Bloom's taxonomy. In this way an examination can be scored for difficulty and compared with other examinations for the same course. Since student performance data on assessments are readily accessible, we are developing a way of "Blooming" our examinations to investigate how students perform relative to objective measures of examination difficulty. We hope this can be used to validate the effectiveness of course transformations in Statistics.
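As a minimal sketch of what "Blooming" an exam could produce, one might assign each question a Bloom level and compute a mark-weighted average level for the whole paper. The level numbering and weighting below are illustrative assumptions, not our actual rubric.

```python
# Mark-weighted Bloom score for an exam; numbering and weights are
# illustrative, not the group's actual "Blooming" rubric.

# Conventional ordering of the revised Bloom's taxonomy (1 = lowest).
BLOOM_LEVELS = {
    "remember": 1, "understand": 2, "apply": 3,
    "analyze": 4, "evaluate": 5, "create": 6,
}

def bloom_score(questions):
    """Mark-weighted mean Bloom level for a list of (level, marks) pairs."""
    total_marks = sum(marks for _, marks in questions)
    weighted = sum(BLOOM_LEVELS[level] * marks for level, marks in questions)
    return weighted / total_marks

# Example: a short exam with three questions worth 10, 20, and 10 marks.
exam = [("remember", 10), ("apply", 20), ("analyze", 10)]
print(bloom_score(exam))  # → 2.75; higher scores indicate a more demanding exam
```

Scores computed this way let pre- and post-transformation exams for the same course be compared on a common difficulty scale.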