
At a session I did on Friday, I mentioned that if anyone was interested in links to bloggers speaking in support of our revised math curriculum, I’d pass them along. Rather than email them over and over again, I thought it would be more efficient if I catalogued them here.

These are the educators I know of who are speaking in support of our revised curriculum. If I’m missing any, please let me know and I’ll add them. If you don’t have a blog and would like to add something here, let me know and I can publish it as a guest post. I can even turn off commenting so mean people don’t insult you.

Update (March 18, 2014):  A Petition For Educators to Speak Out In Support of Revised Curriculum

Deirdre Bailey

Cherra-Lynne Olthof

Geri Lorway

David Martin

Joe Bower

Me

Radical SNAP

I take no credit for this idea at all. I was in a classroom this morning, and the teacher had the students play a game of Radical SNAP. The students were totally engaged, and were enthusiastically converting between mixed and entire radicals. It’s pretty simple to set up.

Materials: For each pair of students, you need one deck of cards with the 10s, Js, Qs and Ks removed, and one giant square root symbol. This one should do the trick: Giant Root

Pair off the students in your class. Each pair gets a deck of cards and removes the 10s, Js, Qs and Ks. Shuffle the remaining cards, and deal them so that each person has half the deck, face down.

Mixed to Entire

The students flip over their top cards. The student on the left puts his card in front of the radical, and the student on the right puts her card under the radical. The first student to correctly convert the mixed radical to an entire radical wins the round.
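The conversion behind this round is just the identity a√b = √(a²b): the coefficient moves under the root as its square. A quick sketch in Python (the function name is my own, not part of the game):

```python
def mixed_to_entire(a, b):
    """Convert the mixed radical a*sqrt(b) into the entire radical sqrt(n).

    The coefficient moves under the root as its square:
    a*sqrt(b) = sqrt(a^2 * b). Returns the new radicand n.
    """
    return a * a * b

# The left card is the coefficient, the right card the radicand:
# 3*sqrt(5) becomes sqrt(45)
print(mixed_to_entire(3, 5))  # → 45
```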

Entire to Mixed

The students flip over their top cards, and put them both under the radical. The first student to correctly simplify the radical or to identify that it can’t be simplified wins the round.
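Simplifying an entire radical means pulling the largest perfect-square factor out from under the root; if no such factor exists, the radical can't be simplified. A quick sketch (again, the helper is my own illustration):

```python
def entire_to_mixed(n):
    """Simplify sqrt(n) to the mixed radical a*sqrt(b) with b square-free.

    Returns (a, b); a == 1 means sqrt(n) cannot be simplified.
    """
    a = 1
    k = 2
    while k * k <= n:
        # Divide out the perfect-square factor k*k as many times as it fits
        while n % (k * k) == 0:
            n //= k * k
            a *= k
        k += 1
    return a, n

print(entire_to_mixed(72))  # sqrt(72) = 6*sqrt(2), so (6, 2)
print(entire_to_mixed(35))  # 35 is square-free: (1, 35), can't be simplified
```

With cards 7 and 2 under the root, for example, √72 simplifies to 6√2.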


2013 in review

The WordPress.com stats helper monkeys prepared a 2013 annual report for this blog.

Here’s an excerpt:

The concert hall at the Sydney Opera House holds 2,700 people. This blog was viewed about 36,000 times in 2013. If it were a concert at Sydney Opera House, it would take about 13 sold-out performances for that many people to see it.

Click here to see the complete report.

Math History

With much of what has been going on in the media around math seeming so new for some people, I thought I’d go back in time and do some research into whether this truly was a new problem. I thought it would be even more fun if I went to my old University paper to find that information. My dad edited The Gateway in the ’50s, and it is a truly great University newspaper, so that’s where I went looking.

I found some interesting articles.

Students aren’t prepared for first year Calculus – The Gateway – 1992

Universities complain students don’t have the basics mastered – The Gateway – 1975

Lack of drill and operation in grade school math – The Gateway – 1956

If intellectual curiosity were held more important than a good memory – The Gateway 1968

An editorial by my dad that has nothing at all to do with math education – The Gateway – 1959

Recently, Andrea Sands of the Edmonton Journal published an article outlining the current debate in Alberta around math curriculum. It was well-balanced and well-researched. It presented both sides nicely. I sent her a short email pointing out my only concern with the article. She quotes University of Alberta Math Professor Gerda de Vries, who I have met and respect. de Vries says,

Alarm bells have been going off for a while. The students, in theory, are supposed to be better problem-solvers and that’s not what we’re seeing. And the message I’m getting from my colleagues in physics and chemistry is that the students just are not as well prepared to solve problems.

Nowhere in Andrea’s article does she mention that de Vries and the rest of her colleagues at U of A have seen exactly one term’s worth of students who were taught using the new curriculum. Last September’s 12th grade students were the first batch of high school graduates to have any exposure at all to the revised curriculum, and they only had it for three years. If alarm bells have been going off for a while, then they were going off under our old curriculum.

Andrea replied quickly and assured me that de Vries was careful to say that she wasn’t sure whether the curriculum was the problem or whether there might be other factors at play. Unfortunately, the article fails to mention that. Andrea then asked about blending the approaches. Here’s what I sent her (slightly edited).

Andrea,

You’re bang on. This whole debate is a little silly. One camp thinks if we only teach algorithms, understanding will be lost. The other camp seems to think that if we only focus on understanding, proficiency will be lost. Why would we consider doing one without the other? Why is there the perception in the public that we are doing one without the other?

Bumpy is a great way to describe curriculum implementation, mostly from the teacher perspective. It takes a while to figure out what the new standards are. This curriculum was new, not so much in content, but in how we were being asked to teach. Older curricula focused on algorithms (procedures) and hoped understanding would come along for the ride. This one flipped it so that we were to start with understanding, and have students build algorithms on that foundation of understanding. This is where things got bumpy for some of us (teachers).

The debate seems to center on two main things.

  1. At what point do we give students the algorithm if they are unable to develop it on their own?
  2. Do students have to do it one way that makes sense to them, show it to me in multiple ways regardless of their preferred method, or do it the one traditional way that many of us were taught?

The classes I taught (at the secondary level) were always based on the idea that I had to help my students get at the understanding in addition to the proficiency. I did this because when I went to University, I struggled with math at first despite doing quite well in high school. I was a great imitator. I didn’t have great understanding. My teacher would show a problem and then put the same one with different numbers on the test, which I would ace. When I hit university and had to think, I was in trouble. In addition, I had never struggled with math before, and didn’t know how to get out of trouble. This was in 1988, which is why I can say with some personal experience that this notion of students struggling in university math is not a new thing.

So when I became a teacher, I tried hard to make sure my students knew not only what to do to solve questions, but WHY that method worked. I wanted them to avoid a struggle at University like I had. Instead of starting with the algorithm, I tried hard to start with the understanding, and build the algorithm out of that understanding. In the end, whether the understanding was there or not, I had to give my kids an algorithm so they could do certain things. I just flipped it so that instead of starting with the algorithm, I started with understanding and built (or gave out) an algorithm from that understanding. To me, that’s the more effective way to teach math. I agree with your assertion that it’s not one or the other.

This may oversimplify things, but this debate is about proficiency (the ability to DO math) and understanding (knowing WHY they are doing what they are doing and WHEN to do it). We can arrange those two things four ways.

Proficiency with understanding – I think we would all agree that this is what we are shooting for. If every kid got there, we’d be doing a great job.

Lack of proficiency with lack of understanding – Clearly, none of us want our students ending up here.

This whole debate seems to come down to how to rank the other two permutations of those two states, which is a little silly. Very few kids end up with one without some of the other.

Lack of understanding with proficiency – These kids would be able to multiply numbers quickly and efficiently. They’d struggle with why it worked, and with when to use multiplication. This was me. I could take complicated derivatives with the best of them. I had no idea at that time what a derivative represented. People who think this is the better of the other two states are happy that kids can multiply without a calculator (and make change; it always seems to be about making change).

Lack of proficiency with understanding – These kids would struggle to multiply numbers quickly and efficiently on paper. They’d know what multiplication represents, and when to use it. People who think this is the better of the other two states think that kids can use a calculator for mundane calculations, and that knowing why and when to multiply is the important skill.

That got long. That wasn’t my intention. Thanks for following up.

John

Curriculum and PISA

It turns out I wasn’t done. There is one more thing I need to look at regarding the PISA results in Alberta, because now our curriculum is being blamed by some Alberta parents and Manitoba mathematicians.

First of all, it needs to be pointed out that the students in Alberta who wrote PISA in 2012 came all the way through elementary school on the old math curriculum. Their curriculum included all the “back to the basics” stuff that some parents and some mathematicians want us to emphasize. The Alberta students who wrote PISA in 2012 had timed multiplication facts on their grade 3 PAT. If their PISA results are because of poor foundational skills, those skills were learned (or not learned) under our old program of studies.

Note: I used this implementation table, and counted back to grade 1 for students who were 15 in 2012.

Based on that same table, the 2021 writing of PISA will be the first time that Alberta students who have been on the revised curriculum since kindergarten will be assessed internationally.

I just don’t think that the curriculum is a problem. I don’t believe that if we reverted to curricula from the 1980s that all students would suddenly thrive in math. Alberta classrooms are evolving, complicated environments. Since the beginning of my time in education (1975 – Joseph Welsh Elementary School), I have seen that many students struggle with math in grade school. I have seen many students who did well in grade school math struggle with University math. These are not new phenomena.

Further, it is important that Alberta parents (and Manitoba mathematicians) realize that Alberta stepped out of the WNCP curriculum in a number of key places.

This document highlights some of the key changes Alberta made to the K-9 curriculum that make it different from the curriculum elsewhere in the WNCP. A couple that I’d like to highlight for the “back to the basics” movement are below.

  • Alberta added this statement, which clearly indicates that recall of number facts (i.e., the ability to multiply without a calculator) is important: “Mastery of number facts is expected to be attained by students as they develop their number sense. This mastery allows for facility with more complex computations but should not be attained at the expense of an understanding of number.”
  • In grade 5, Alberta changed the phrase, “determine answers for basic multiplication facts” to this phrase, which clearly indicates that recall must be efficient, “determine, with fluency, answers for basic multiplication facts.”

I spent some time on the Singapore Ministry of Education site. Singapore is being held up as a country that is doing a better job than us. This quote from their own curriculum says essentially the same thing that Alberta’s curriculum does in the bullets above. In fact, their entire curriculum is strikingly similar to ours.

The development of skill proficiencies in students is essential in the learning and application of mathematics. Although students should become competent in the various mathematical skills, over-emphasising procedural skills without understanding the underlying mathematical principles should be avoided.

In my high school world, Alberta significantly modified the -2 stream (Foundations of Mathematics elsewhere in the WNCP). In Alberta, this stream contains much more algebra. The intent was to gain wider acceptance for that stream among University programs that do not require calculus. It seems to have worked.

The problem with blaming a curriculum is that it is an ideological debate that is hard to prove definitively one way or another. Compound that with the fact that curriculum implementation is a long, slow process, and reacting too quickly to one measure (PISA results) can lead to some rash decisions being made. We face way bigger challenges in educating Alberta’s future than how we teach division algorithms.

Grade Inflation

The last reaction to the recently released PISA results that I would like to address is the notion that the discrepancy between class marks and Diploma Exam marks in Alberta is due to grade inflation. One local columnist suggests that this grade inflation is deliberate on the part of Alberta teachers, acting on pressure from above.

As an example, let’s look at Math 30-1 from last year in Alberta. The average school-awarded mark was 74.8%, and the average diploma exam mark was 68.8%. The argument goes, then, that classroom teachers inflated grades by a full six percentage points. It should be noted that other subjects had a wider gap. I’m a math guy, though.

I think it’s important to decide what my classroom mark should represent. Is it my best prediction of what I think students will get on a diploma exam, or should it be an assessment of what I have seen all semester in class? They’re different things.

I’m a huge fan of Freakonomics. One of the things they point out is that it’s almost impossible to nail down one single cause of an event when there are numerous variables involved. There are so many reasons why classroom grades would be higher, on average, than diploma exam marks.  Let’s look at things other than grade inflation that might contribute to this discrepancy.

  • Immediacy of Assessment – In my classroom, I assess frequently and in small chunks. Students are tested on a unit while it is still very fresh in their minds. The diploma exam covers multiple units, some of which were learned a long time ago. In all my classes, students tended to do better on the individual unit exams than they did on the final exam, even when it was a final exam I created myself.
  • Variety of Assessment – I assess my students in a variety of ways. The diploma exam is a machine scored test. There are no part marks for partial solutions. I look at the student’s work. I give questions that are not multiple choice. I give assessments that are not tests. Alberta Education gives one exam in one format. They would be the first to tell you that they hope teachers use a variety of different assessments.
  • Psychometrics – The diploma exam undergoes a psychometric procedure of equating that I cannot pretend to understand. I do not know how (or even if) this affects the discrepancy, but just in case it does, I’ll mention it.
  • Pressure – The diploma exam is high-stakes, high-pressure. Students know this, and some perform lower than they did over the course of the semester.

These are some reasons I thought of quickly. I’m sure there are more. My point is that the difference between school-awarded mark and diploma exam mark is likely the result of many factors, and is not necessarily cause for alarm.

I would be remiss in writing this post if I didn’t acknowledge that I have occasionally seen assessment practices that inflate grades. I have also seen assessment practices that artificially lower grades.  Both of these situations tend to stem from assessing behaviour rather than mastery of curriculum. I have made both mistakes in my career. Too much has already been written on this blog about accurate assessment. I’m not going there again.

When we use our program of studies and assess the outcomes that are contained within that document, we can be more confident that we have assessed our students accurately, regardless of how they score on a diploma exam.
