The last reaction to the recently released PISA results that I would like to address is the notion that the discrepancy between class marks and Diploma Exam marks in Alberta is due to grade inflation. One local columnist suggests that this grade inflation is deliberate on the part of Alberta teachers, acting on pressure from above.
As an example, let’s look at Math 30-1 from last year in Alberta. The average school-awarded mark was 74.8%, and the average diploma exam mark was 68.8%. The argument goes, then, that classroom teachers inflated grades by a full 6 percentage points. It should be noted that other subjects had a wider gap. I’m a math guy, though.
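Since I’m a math guy, it’s worth being precise about what that gap is. A quick sketch (using the two averages above) shows the difference between a gap in percentage points and the same gap expressed as a relative difference:

```python
# Math 30-1 averages from the post (province-wide, last year)
school_awarded = 74.8  # average school-awarded mark, in %
diploma_exam = 68.8    # average diploma exam mark, in %

# The gap in percentage points: simple subtraction
gap_points = school_awarded - diploma_exam
print(f"Gap: {gap_points:.1f} percentage points")  # 6.0 percentage points

# The same gap as a relative difference from the exam mark
gap_relative = (school_awarded - diploma_exam) / diploma_exam * 100
print(f"Relative difference: {gap_relative:.1f}%")  # about 8.7%
```

Neither number, on its own, tells you *why* the gap exists, which is really the point of this post.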
I think it’s important to decide what my classroom mark should represent. Is it my best prediction of what I think students will get on a diploma exam, or should it be an assessment of what I have seen all semester in class? They’re different things.
I’m a huge fan of Freakonomics. One of the things they point out is that it’s almost impossible to nail down one single cause of an event when there are numerous variables involved. There are so many reasons why classroom grades would be higher, on average, than diploma exam marks. Let’s look at things other than grade inflation that might contribute to this discrepancy.
- Immediacy of Assessment – In my classroom, I assess frequently and in small chunks. Students are tested on a unit while it is still very fresh in their minds. The diploma exam covers multiple units, some of which were learned a long time ago. In all my classes, students tended to do better on the individual unit exams than they did on the final exam, even when it was a final exam I created myself.
- Variety of Assessment – I assess my students in a variety of ways. The diploma exam is a machine-scored test. There are no part marks for partial solutions. I look at the student’s work. I give questions that are not multiple choice. I give assessments that are not tests. Alberta Education gives one exam in one format. They would be the first to tell you that they hope teachers use a variety of different assessments.
- Psychometrics – The diploma exam undergoes a psychometric procedure of equating that I cannot pretend to understand. I do not know how (or even if) this affects the discrepancy, but just in case it does, I’ll mention it.
- Pressure – The diploma exam is high-stakes, high-pressure. Students know this, and some perform lower than they did over the course of the semester.
These are some reasons I thought of quickly. I’m sure there are more. My point is to suggest that the difference between school-awarded mark and diploma exam mark is likely the result of many factors, and is not necessarily cause for alarm.
I would be remiss in writing this post if I didn’t acknowledge that I have occasionally seen assessment practices that inflate grades. I have also seen assessment practices that artificially lower grades. Both of these situations tend to stem from assessing behaviour rather than mastery of curriculum. I have made both mistakes in my career. Too much has already been written on this blog about accurate assessment. I’m not going there again.
When we use our program of studies and assess the outcomes that are contained within that document, we can be more confident that we have assessed our students accurately, regardless of how they score on a diploma exam.