These are the educators I know of who are speaking in support of our revised curriculum. If I’m missing any, please let me know and I’ll add them. If you don’t have a blog and would like to add something here, let me know and I can publish it as a guest post. I can even turn off commenting so mean people don’t insult you.

Update (March 18, 2014): A Petition For Educators to Speak Out In Support of Revised Curriculum

Deirdre Bailey

Cherra-Lynne Olthof

Geri Lorway

- There is no “fuzzy” math but there sure is some need for better math teaching….
- It has nothing to do with “FUZZY MATH” or “FUZZY CURRICULUM” I teach so that students can learn…

David Martin

- Larger shoes make your child smarter!
- Case for the new math curriculum
- Curriculum Redesign
- Problems with PISA argument
- Math in a new setting
- A view from another David

Joe Bower

- Return of the Math Wars
- Mindless Math Mimicry
- 9 x 7 = 63
- Harmful Effects of Algorithms
- Alberta’s New Math Curriculum

Me

- Better Teachers
- Does Size Matter?
- More Standardized Tests
- Grade Inflation
- Curriculum and PISA
- This Whole Thing is Silly
- Math History
- Comparing Mathematicians to Math Educators (Scared to hit “Publish” on this one)


Materials: You need one deck of cards (with the 10, J, Q and K removed) and one giant square root symbol for each pair of students. This one should do the trick: Giant Root

Pair off the students in your class. Each pair gets a deck of cards, and should remove the 10, J, Q and K. Shuffle the remaining cards, and deal them so that each person has half the deck, face down.

**Mixed to Entire**

The students flip over their top cards. The student on the left puts his in front of the radical, and the student on the right puts hers under the radical. The first student to correctly convert the mixed radical to an entire radical wins the round.

**Entire to Mixed**

The students flip over their top cards, and put them both under the radical. The first student to correctly simplify the radical or to identify that it can’t be simplified wins the round.
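The math behind both rounds is compact enough to sketch in a few lines of Python. This is just an illustrative check of the two conversions the game practices (the function names are mine, not part of the activity): converting a mixed radical uses a√b = √(a²b), and simplifying an entire radical means pulling out the largest perfect-square factor.

```python
import math

def to_entire(coeff, radicand):
    """Convert the mixed radical coeff*sqrt(radicand) to the entire
    radical sqrt(n), using a*sqrt(b) == sqrt(a^2 * b)."""
    return coeff * coeff * radicand

def simplify(n):
    """Write sqrt(n) as the mixed radical coeff*sqrt(radicand) by
    extracting the largest perfect-square factor of n."""
    for k in range(math.isqrt(n), 1, -1):
        if n % (k * k) == 0:
            return k, n // (k * k)
    return 1, n  # no square factor: sqrt(n) can't be simplified

print(to_entire(3, 5))  # 45, since 3*sqrt(5) == sqrt(45)
print(simplify(12))     # (2, 3), since sqrt(12) == 2*sqrt(3)
print(simplify(14))     # (1, 14): flipping a 2 and a 7 gives an
                        # unsimplifiable radical, so the win goes to
                        # whoever says "can't be simplified" first
```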


Here’s an excerpt:

The concert hall at the Sydney Opera House holds 2,700 people. This blog was viewed about 36,000 times in 2013. If it were a concert at Sydney Opera House, it would take about 13 sold-out performances for that many people to see it.

Click here to see the complete report.


I found some interesting articles.

Students aren’t prepared for first year Calculus – The Gateway – 1992

Universities complain students don’t have the basics mastered – The Gateway – 1975

Lack of drill and operation in grade school math – The Gateway – 1956

If intellectual curiosity were held more important than a good memory – The Gateway 1968

An editorial by my dad that has nothing at all to do with math education – The Gateway – 1959


Alarm bells have been going off for a while. The students, in theory, are supposed to be better problem-solvers and that’s not what we’re seeing. And the message I’m getting from my colleagues in physics and chemistry is that the students just are not as well prepared to solve problems.

Nowhere in Andrea’s article does she mention that de Vries and the rest of her colleagues at U of A have seen exactly one term’s worth of students who were taught using the new curriculum. Last September’s 12th grade students were the first batch of high school graduates to have any exposure at all to the revised curriculum, and they only had it for three years. If alarm bells have been going off for a while, then they were going off under our old curriculum.

Andrea replied quickly and assured me that de Vries was careful to say that she wasn’t sure whether the curriculum was the problem or whether there might be other factors at play. Unfortunately, the article fails to mention that. Andrea then asked about blending the approaches. Here’s what I sent her (slightly edited).

Andrea,

You’re bang on. This whole debate is a little silly. One camp thinks if we only teach algorithms, understanding will be lost. The other camp seems to think that if we only focus on understanding, proficiency will be lost. Why would we consider doing one without the other? Why is there the perception in the public that we are doing one without the other?

Bumpy is a great way to describe curriculum implementation, mostly from the teacher perspective. It takes a while to figure out what the new standards are. This curriculum was new, not so much in content, but in how we were being asked to teach. Older curriculums focused on algorithms (procedures) and hoped understanding would come along for the ride. This one flipped it so that we were to start with understanding, and have students build algorithms with that foundation of understanding. This is where things got bumpy for some of us (teachers).

The debate seems to center on two main things.

- At what point do we give students the algorithm if they are unable to develop it on their own?
- Do students have to do it one way that makes sense to them, show it to me in multiple ways regardless of their preferred method, or do it the one traditional way that many of us were taught?

The classes I taught (at the secondary level) were always based on the idea that I had to help my students get at the understanding in addition to the proficiency. I did this because when I went to University, I struggled with math at first despite doing quite well in high school. I was a great imitator. I didn’t have great understanding. My teacher would show a problem and then put the same one with different numbers on the test, which I would ace. When I hit university and had to think, I was in trouble. In addition, I had never struggled with math before, and didn’t know how to get out of trouble. This was in 1988, which is why I can say with some personal experience that this notion of students struggling in university math is not a new thing.

So when I became a teacher, I tried hard to make sure my students knew not only what to do to solve questions, but WHY that method worked. I wanted them to avoid a struggle at University like I had. Instead of starting with the algorithm, I tried hard to start with the understanding, and build the algorithm out of that understanding. In the end, whether the understanding was there or not, I had to give my kids an algorithm so they could do certain things. I just flipped it so that instead of starting with the algorithm, I started with understanding and built (or gave out) an algorithm from that understanding. To me, that’s the more effective way to teach math. I agree with your assertion that it’s not one or the other.

This may oversimplify things, but this debate is about proficiency (the ability to DO math) and understanding (knowing WHY they are doing what they are doing and WHEN to do it). We can arrange those two things four ways.

**Proficiency with understanding** – I think we would all agree that this is what we are shooting for. If every kid got there, we’d be doing a great job.

**Lack of proficiency with lack of understanding** – Clearly, none of us want our students ending up here.

This whole debate seems to come down to how to rank the other two permutations of those two states, which is a little silly. Very few kids end up with one without some of the other.

**Lack of understanding with proficiency** – These kids would be able to multiply numbers quickly and efficiently. They’d struggle with why it worked, and would struggle with when to use multiplication. This was me. I could take complicated derivatives with the best of them. I had no idea at that time what a derivative represented. People who think this is the better of the other two states are happy kids can multiply without a calculator (and make change – It always seems to be about making change).

**Lack of proficiency with understanding** – These kids would struggle to multiply numbers quickly and efficiently on paper. They’d know what multiplication represents, and when to use it. People who think this is the better of the other two states think that kids can use a calculator for mundane calculations, and knowing why and when to multiply is the important skill.

That got long. That wasn’t my intention. Thanks for following up.

John


First of all, it needs to be pointed out that the students in Alberta who wrote PISA in 2012 came all the way through elementary school on the old math curriculum. Their curriculum included all the “back to the basics” stuff that some parents and some mathematicians want us to emphasize. The Alberta students who wrote PISA in 2012 had timed multiplication facts on their grade 3 PAT. If their PISA results are because of poor foundational skills, those skills were learned (or not learned) under our old program of studies.

Note: I used this implementation table, and counted back to grade 1 for students who were 15 in 2012.

Based on that same table, the 2021 writing of PISA will be the first time that Alberta students who have been on the revised curriculum since kindergarten will be assessed internationally.

I just don’t think that the curriculum is the problem. I don’t believe that if we reverted to curricula from the 1980s, all students would suddenly thrive in math. Alberta classrooms are evolving, complicated environments. Since the beginning of my time in education (1975 – Joseph Welsh Elementary School), I have seen many students struggle with math in grade school. I have seen many students who did well in grade school math struggle with University math. These are not new phenomena.

Further, it is important that Alberta parents (and Manitoba mathematicians) realize that Alberta stepped out of the WNCP curriculum in a number of key places.

This document highlights some of the key changes Alberta made to the K-9 curriculum that make it different from the curriculum elsewhere in the WNCP. A couple that I’d like to highlight for the “back to the basics” movement are below.

- Alberta added this statement, which clearly indicates that recall of number facts (i.e., the ability to multiply without a calculator) is important. “Mastery of number facts is expected to be attained by students as they develop their number sense. This mastery allows for facility with more complex computations but should not be attained at the expense of an understanding of number.”
- In grade 5, Alberta changed the phrase, “determine answers for basic multiplication facts” to this phrase, which clearly indicates that recall must be efficient, “determine, with fluency, answers for basic multiplication facts.”

I spent some time on the Singapore Ministry of Education site. Singapore is being held up as a country that is doing a better job than us. This quote from their own curriculum says essentially the same thing that Alberta’s curriculum does in the bullets above. In fact, their entire curriculum is strikingly similar to ours.

The development of skill proficiencies in students is essential in the learning and application of mathematics. Although students should become competent in the various mathematical skills, over-emphasising procedural skills without understanding the underlying mathematical principles should be avoided.

In my high school world, Alberta significantly modified the -2 stream (Foundations of Mathematics elsewhere in the WNCP). In Alberta, this stream contains much more algebra. The intent was to have that stream accepted more widely by University programs that do not require calculus. It seems to have worked.

The problem with blaming a curriculum is that it is an ideological debate that is hard to prove definitively one way or another. Compound that with the fact that curriculum implementation is a long, slow process, and reacting too quickly to one measure (PISA results) can lead to some rash decisions being made. We face way bigger challenges in educating Alberta’s future than how we teach division algorithms.


As an example, let’s look at Math 30-1 from last year in Alberta. The average school-awarded mark was 74.8%, and the average diploma exam mark was 68.8%. The argument goes, then, that classroom teachers inflated grades by a full 6 percentage points. It should be noted that other subjects had a wider gap. I’m a math guy, though.

I think it’s important to decide what my classroom mark should represent. Is it my best prediction of what I think students will get on a diploma exam, or should it be an assessment of what I have seen all semester in class? They’re different things.

I’m a huge fan of Freakonomics. One of the things they point out is that it’s almost impossible to nail down one single cause of an event when there are numerous variables involved. There are so many reasons why classroom grades would be higher, on average, than diploma exam marks. Let’s look at things other than grade inflation that might contribute to this discrepancy.

- Immediacy of Assessment – In my classroom, I assess frequently and in small chunks. Students are tested on a unit while it is still very fresh in their minds. The diploma exam covers multiple units, some of which were learned a long time ago. In all my classes, students tended to do better on the individual unit exams than they did on the final exam, even when it was a final exam I created myself.
- Variety of Assessment – I assess my students in a variety of ways. The diploma exam is a machine scored test. There are no part marks for partial solutions. I look at the student’s work. I give questions that are not multiple choice. I give assessments that are not tests. Alberta Education gives one exam in one format. They would be the first to tell you that they hope teachers use a variety of different assessments.
- Psychometrics – The diploma exam undergoes a psychometric procedure of equating that I cannot pretend to understand. I do not know how (or even if) this affects the discrepancy, but just in case it does, I’ll mention it.
- Pressure – The diploma exam is high-stakes, high-pressure. Students know this, and some perform lower than they did over the course of the semester.

These are some reasons I thought of quickly. I’m sure there are more. My point is that the difference between school-awarded mark and diploma exam mark is likely the result of many factors, and is not necessarily cause for alarm.

I would be remiss in writing this post if I didn’t acknowledge that I have occasionally seen assessment practices that inflate grades. I have also seen assessment practices that artificially lower grades. Both of these situations tend to stem from assessing behaviour rather than mastery of curriculum. I have made both mistakes in my career. Too much has already been written on this blog about accurate assessment. I’m not going there again.

When we use our program of studies and assess the outcomes that are contained within that document, we can be more confident that we have assessed our students accurately, regardless of how they score on a diploma exam.


Some of you who know me well are about to be shocked. I don’t hate our Diploma Exams (12th grade exit exam). In fact, I used to be a big fan of them. These exams are written by Alberta teachers (not publishing companies), are field tested, and then analyzed by psychometricians. They are high-quality exams.

I’m not so fond of the Provincial Achievement Tests given at the end of grades 3, 6 and 9 in Alberta. Fortunately, they are on the way out, and will be replaced by diagnostic exams to be given at the start of those years. The intention is that teachers can use the data to help and support where needed. It’s a great idea in theory. Hopefully it maintains that intent in practice.

The problem with all these exams is that outside organizations started using them for the wrong reasons (ranking schools). Alberta Education made them entirely multiple choice, and the whole thing just wasn’t as useful to me as it once was.

Let’s back up to 1992 when I was a brand-new teacher. The internet wasn’t yet up and running. Email was just starting out. I taught in a small school where I was the only math teacher. There was no Twitter helping me collaborate with other math teachers. There were no blogs to read about how other people were teaching things. I had a textbook, a curriculum guide, and some students. That was it.

The diploma exam was the only feedback I ever got about whether my teaching was meeting Alberta expectations. I loved looking at my results. Some students surprised me. Some students disappointed me. What was really useful, though, was seeing which units went well and which units had room for improvement when I looked at my class data as a whole. I used the results to improve my instruction.

Early on, it did occur to me though, that the diploma exam might be given in the wrong year. The data I got about my 12th graders didn’t help them a bit, because they had moved on. It helped me help next year’s class, but did nothing for this year’s class. Maybe this exam would be more useful if given in 11th grade.

In the late 1990s, however, outside institutions began using the provincial exam data to rank schools. This exercise is pure statistical lunacy. They didn’t control for poverty, parental education or any other relevant factor. They simply ranked schools. The folly of this ranking can be illustrated by looking at two schools.

Old Scona is in Edmonton, and is the #1 ranked school in 2012 according to the Fraser Institute. I am not criticizing them. It’s a school in my district, and I know some fine people who teach there. They do great things with great kids. Old Scona has selective entrance. They reserve their 120 spots for students whose grade 9 average is over 80%, and they further separate students by having them write a standardized entrance exam. Old Scona selects the best of the best in Edmonton Public Schools. According to the Fraser Institute, the average income of parents of Old Scona students is $103 300 per year. There are no special needs students at Old Scona, and 8.3% of their students are ESL.

Contrast Old Scona with Mistassiny in Wabasca, the Fraser Institute’s last ranked school from 2012 (#279 out of 279). It’s a whole different world. 79.8% of their students are ESL. 20.7% of their students have special needs. The average income of their parents is $30 600 per year. The town of Wabasca is mostly FNMI, and is surrounded by five reserves. This school serves a difficult population. They do not select their students.

If these rankings truly reflect the work being done in these schools by the teachers, then we should be able to swap the entire staffs of these two schools, and within a year or two, Mistassiny would be #1, right? It’s an absolutely ludicrous suggestion.

One year, I switched from a school with a high population of struggling learners to a school with a strong academic population. I taught one class that achieved a 92% average on the Math 30 Pure diploma exam. Only one student in the class failed to get honors on that exam (and he missed by a percent). People were patting me on the back. It was ridiculous praise. I was the same teacher I had been the year before in a different setting. The reason, and the sole reason, that my class did so well was that they were really strong students. Any teacher would have had those results with that class. I didn’t suddenly become teacher of the year simply because I switched schools. I was the same old teacher I always was.

Ranking schools was never the intention of these tests. The data, which could be useful to teachers, is being misused by those with political agendas.

Later on, in 2008, Alberta Education eliminated the written response portion of the Math and Science exams. This was a huge mistake, particularly since it coincided with the implementation of a revised curriculum focused on communication and personal strategies. You can’t assess those things on a machine scored test. The written questions provided students with the opportunity to demonstrate their strategies. The written questions were a great way to lend value to important parts of our curriculum like communication and problem solving. Losing them made the exams worse.

Further, I have come to understand that Alberta Education is now exploring ways to have English and Social Studies writing scored by machine as well. As I understand it, these guys are working with Alberta Education to come up with a way to mark the writing of Alberta Students.

Over the years, I have taught more than 30 classes that wrote diploma exams. I loved teaching those classes. I didn’t teach to a test. I taught the curriculum to the best of my ability, and the test results always seemed to come out all right. There were some classes and some students that did better than I expected. There were some classes and some students who did worse than I expected. I always looked at the data, though, and tried to figure out what I could have done better.

I don’t think provincial exams are a terrible evil. I do think the data are being misused. I do think we should consider moving the grade 12 exams back a year so we could still help those students who need it. I definitely think we need to re-institute written response on math and science exams, and leave the scoring of all writing to actual human beings.


A **columnist** is someone who writes for publication in a series, creating an article that usually offers commentary and opinions.

A **journalist** collects, writes, and distributes news and other information, while refraining from bias.

An Edmonton Journal columnist, one who could never be accused of even partially refraining from bias, has been arguing that class size reduction efforts in Alberta have been a waste of money. He cites research and even quotes the head of PISA (sounding dangerously like a journalist), but then conveniently leaves out relevant details that would contradict his argument (he is, after all, a columnist).

Some high-performing countries (according to PISA) have larger class sizes than we do. The obvious conclusion is that we should increase class sizes, right? Not so fast, Mr. Biased Columnist. It’s only a logical conclusion if you ignore all the other factors at play.

Currently, the norm in high schools in Alberta is to have teachers teach 7 out of 8 blocks. That means each semester I see four unique classes of around 35 students, for a total of 140 students per semester. I get an 80 minute preparation period every other day. Because of the already large class sizes, I spend most of my preparation time creating and grading assessments. Very little of my preparation time is spent on actually thinking about how to teach my material better.

Mr. Biased Columnist points out that places like Finland, Korea, Singapore (among others) have class sizes that are larger than in Alberta, and still perform better on PISA (this is fact). What he deliberately neglects to tell us (Logical fallacy of Omission – Stacking the Deck) though, is that teachers in those countries spend far less time in front of students than we do in North America. From the Singapore Ministry of Education:

The workload of our teachers varies across the year, depending on whether it is peak or non-peak periods. Over the entire year, our teachers teach, on average, about 15 hours per week. To deliver classroom teaching effectively, teachers also spend approximately twice as much time on teaching-related duties such as preparing for lessons, providing remediation for weaker students, setting and marking of homework and examinations.

**They spend twice as much time on teaching-related duties as they spend teaching.** I spend 1/7 as much time on teaching-related duties as I spend teaching (one prep block for every seven teaching blocks).

Singapore has secondary classes in the neighbourhood of 40 students, but based on what I read above, they would only see two of those a day. That’s a total of 80 students per semester, which is far fewer than we see each semester in Alberta. In addition, the Ministry of Education in Singapore indicates:

Some schools also deploy two teachers in a class of 40 students—one teacher brings the class through the curriculum, while the other teacher assists specific students who may have difficulty understanding the materials being covered.

Wow! Singapore teachers actually have less marking to do than I do, more time built into their schedules to collaborate with colleagues and plan good lessons, AND they get to team-teach in large classes? It’s a model I’d be willing and eager to explore. Are they hiring in Singapore?
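The arithmetic behind this comparison is worth laying out explicitly. Here it is as a few lines of Python — the Alberta numbers describe my own situation, and the Singapore numbers come from the Ministry of Education statements quoted above, so treat this as a rough sketch rather than a rigorous study:

```python
# Students seen per semester.
alberta_students = 4 * 35      # four classes of ~35
singapore_students = 2 * 40    # two classes of ~40 per day

# Ratio of teaching-related (prep) time to teaching time.
singapore_prep_ratio = 30 / 15  # ~30 prep hours for ~15 teaching hours
alberta_prep_ratio = 1 / 7      # one prep block for seven teaching blocks

print(alberta_students, singapore_students)  # 140 80
# Singapore teachers get roughly 14 times my prep-to-teaching ratio:
print(round(singapore_prep_ratio / alberta_prep_ratio))  # 14
```

Despite the bigger classes, the Singapore teacher marks for fewer students and has a dramatically larger share of the workday for preparation and collaboration.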

Mr. Biased Columnist suggests that good teachers will do well no matter how many students we give them. I agree. Under our current conditions, however, they will likely burn out from all the marking and management problems that large classes can bring. We don’t want to burn out our good teachers, do we? One third of new teachers in Alberta burn out within 5 years. Let’s revisit Singapore.

The annual resignation rate for teachers has remained low at around 3% over the past five years. In our exit interviews and surveys, workload has not been cited as a major reason for leaving the Education Service. Nonetheless, we will continue to monitor the workload of teachers through internal employee feedback channels to ensure that workload is maintained within reasonable levels.

Singapore has bigger classes, fewer teaching hours, more collaborative time built into their day and retains 97% of their teachers. Does class size matter? Not nearly as much as teacher collaboration built into the school day.

On a completely different note, I do need to point out that in my travels across Alberta, I already see classrooms that were built to hold 25 students jam-packed with 40 desks. I don’t know how we can physically put more bodies into those classrooms. Are we going to build a bunch of new schools with large lecture theatres?

Some Resources I Used

- Singapore Ministry of Education
- Singapore Parliament Class Size
- PISA Key Findings 2012
- Comparing the US to Finland and Singapore
- Teacher Workload and Pay – Various Countries


I taught in 5 schools in both rural and urban Alberta, where I worked with hundreds of teachers. Since becoming a consultant, I have worked with dozens of schools and several hundred teachers across Alberta. I can assure you that the vast majority of our teachers are good. Some are exceptional. Yes, some struggle. I am confident that each of them, regardless of ability, would tell you that they can be better. All of them want to improve. So how do we help them?

We need to change our PD model. Workshops are fun, but they don’t change teacher practice. Coaching and collaboration do. Let’s steal from the Japanese. Let’s give teachers time to collaborate with colleagues. Build this time into the school day, so it’s not at 4:00 when everybody is exhausted. Have teachers plan lessons together, then observe the lesson in action with real kids in a real classroom. Have them get back together and discuss how the lesson worked. It’s called lesson study. It makes everybody better.

Connected and collaborative teachers have strong social capital, and one study concludes that “even low-ability teachers can perform as well as teachers of average ability *if* they have strong social capital.”

More and more of what I read and experience when I’m in classrooms tells me one thing. It doesn’t matter what school your child is in. It doesn’t matter what program your child is in. It doesn’t matter what the curriculum looks like. What matters is the adult that is in front of your child. Great teachers move children multiple grade levels in a year.

Great teachers come in all styles. Some lecture. Some use discovery learning. Some are constructivist. Some are disorganized. Some dress well. Some are sloppy. What they all have in common, though, is a deft ability to build positive relationships with students.

When students have a positive connection with their teacher, good things happen regardless of style, curriculum, subject area, program or any other variable you can name. In my experience, teachers that struggle seem to have a disconnect with their classes, despite the fact that they may be well planned and hard-working. I’m not sure this ability to connect can be taught. I think it’s directly related to people’s personalities. Beat me up in the comments over that one.

While we are at it, we might as well do one more thing pertaining to better teachers. Let’s make sure our best teachers are where they are needed most. Put them in the schools with high populations of our most at-risk students. Let’s give those kids a chance.

And to the people who suggest we fire a bunch of under-performing teachers, all I can say is this: You still need to convince me that there are hundreds of people out there desperate for teaching jobs who will be better than the ones you want to get rid of.
