
## Homework Research

In my last post, I speculated that there were three reasons I read educational research.

1. I encounter it (via Twitter, blogs, or in journals) and I’m curious, so I read it.
2. I deliberately seek it out to confirm a bias. (Don’t judge me. We all do this.)
3. I’m genuinely interested in what the research has to say on a certain topic, so I search for it.

Since biases are fun, let’s look at an article I dug up for the second reason. I’ve made my views on homework pretty clear on this blog in a couple of posts. Here’s a study I found on the subject of homework. Unfortunately, it failed to confirm my bias.

Are We Wasting Our Children’s Time By Giving Them More Homework?

The study is by Daniel J. Henderson of New York, and was published as IZA Discussion Paper No. 5547 in March 2011.

Following an identification strategy that allows us to largely eliminate unobserved student and teacher traits, we examine the effect of homework on math, science, English and history test scores for eighth grade students in the United States. Noting that failure to control for these effects yields selection biases on the estimated effect of homework, we find that math homework has a large and statistically meaningful effect on math test scores throughout our sample. However, additional homework in science, English and history are shown to have little to no impact on their respective test scores.

Yikes. Math homework has a large and statistically meaningful effect on math test scores throughout our sample? Uh. Oh. I guess I’d better read more than just the abstract and see if I can figure out what is going on. The math used in the study is complicated, which might make it tricky to read.

Here’s something I wonder about. Page 9.

…higher able students benefit more from additional homework.

Perhaps higher able students are the only ones who actually do the homework, because they’re the only ones who are capable of doing it.

Later, on page 17.

Taking the Peabody Individual Achievement Test in math as our benchmark, the gain from math homework (1.77 points) corresponds to one-fourth of the raw black-white test score gap between the ages of 6 and 13

My question would be: Can we be sure the gain on that test is solely attributable to homework? Maybe we can. I’ll admit to not fully understanding the tables in the study.

Here’s a finding on page 19 that I am glad to hear. At least one of my biases was confirmed by this study.

The teacher’s treatment of the homework (whether it is being recorded and/or graded) does not appear to affect the returns to math homework.

## Educational Research

I’m going to write some posts over the next little while about educational research. Just before Christmas, Michael Pershan and Chris Robinson were going back and forth on Twitter about research vs. blog posts.

“If teachers can rely on blog posts, where does that leave ed research?”

That question got me thinking a whole lot about how and why I use educational research and how and why I use blog posts.

I read research for several reasons.

1. I encounter it (via Twitter, blogs, or in journals) and I’m curious, so I read it.
2. I deliberately seek it out to confirm a bias. (Don’t judge me. We all do this.)
3. I’m genuinely interested in what the research has to say on a certain topic, so I search for it.

My recent blog post about delayed feedback falls into the first category. A colleague showed it to me and I was curious, so I read it.

I tend to mine blogs for ideas that I can use immediately in classrooms and workshops. Those ideas don’t have to be research-based, in my opinion. The fact that a colleague already tried something and it worked for her is sufficient for me to try it out. That endorsement is worth one class period or one unit of study of my time. I see these shared ideas the same way I saw lunchroom conversations in the 1990s. “I did this cool thing in my class today. You should try it out.”

If I were contemplating a major shift in my practice, I’d probably go to research in addition to listening to colleagues. SBG would be an example of something I’d research before changing my whole practice. A blog might inspire me to try it, and the research would confirm that it’s worth doing. One year in Math 8, I did the entire course in cooperative learning groups and activities. That’s a big commitment. That’s a big shift. Research supported and justified my change.

In the next few blog posts, I’m going to look at some of the research I’ve read over the past few years. I’ll explain how I happened across it, and how I use it now.

## Delayed Feedback

It’s been an interesting enough week in the assessment world that I’m compelled to blog for the first time in a long time.

Early last week, I encountered this “Focus on Formative Feedback” literature review by Valerie Shute.

Table 4, near the end, on page 179 lists “formative feedback guidelines in relation to timing issues.” Shute recommends using immediate feedback for difficult tasks, and delayed feedback for simple tasks. She says that to promote transfer of learning, teachers should consider using delayed feedback. To support that claim, she says,

According to some researchers (e.g., Kulhavy et al., 1985; Schroth, 1992), delayed may be better than immediate feedback for transfer task performance, although initial learning time may be depressed. This needs more research.

Then, just yesterday, Dan Meyer jumps in with a post on delayed feedback.

My gut says that the timing of the feedback is far less important than the quality of the feedback. Dylan Wiliam has entire chapters dedicated to providing feedback that moves learners forward. Next steps are useful to all students. Evaluative feedback that evokes emotion isn’t particularly useful to anyone.

I’m not sure this does need more research.

## The Sky Is Falling!

There’s been a lot of twitter and media buzz about a new app that scans math questions and gives answers.

Dan Meyer has compiled some thoughts on the app over on his blog, and he has been commenting on Twitter as well.

I decided to test Dan’s comment with (what else?) a test. I gave the app one exam from each of grades 7, 8, 9, 10, 11 and 12. My conclusion is that it doesn’t solve them anywhere near as well as most kids would.

Full disclosure: The grades 10, 11 and 12 exams were ones I created, and I’m always conscious of trying to avoid having questions that can be answered with a calculator alone. The grades 7, 8 and 9 exams were from a publisher.

I tried to pick topics it had a shot at solving. I tried to pick topics with mostly number and equations.

Grade 10 – Algebra and Number

The app got 0/30 on my exam. On the questions I thought it should be able to answer, it got 0/10.

This one was its most blatant error. I did have it centered properly prior to snapping a screenshot. It registered the 2, and ignored it.

These were its first steps. It had trouble recognizing that square bracket.

Grade 11

The app got 3/32 on my exam. On the questions I thought it should be able to answer, it got 3/13. I was a little surprised. Clearly I need to tweak some questions. Here’s one it got right.

Grade 12 – Exponents and Logarithms

The app can’t recognize logs, or manipulate anything but the most rudimentary equations. It got 0/30 overall and 0/9 on the ones I thought it should get.

Math 7 – Integers

The app struggles with brackets. I hovered over expressions like (-2) + (4) - (-7) endlessly, waiting for an answer of any kind (right or wrong), and never got anything. It got 0/20 overall and 0/9 on the ones I thought it would get.

Math 8 – Fractions

The app nails fraction calculations. It got 7/20 overall and got 7/7 on the ones I would have expected it to get. Here’s one it got right.

Math 9 – Equations

The app got 6/20 overall and 6/6 on the ones I would have expected it to get. It solves basic equations (no logs, no powers, no quadratics, few brackets) correctly every time I try it. Some of the steps seem convoluted to me.
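Tallying all six exams gives a sense of the app’s overall accuracy. The question counts below come straight from the scores above; the short Python sketch that sums them is my own, not part of the original test.

```python
# Scores reported above: (exam, questions correct, total questions).
# "Grade 11" is the 3/32 exam, whose topic isn't named in the post.
results = [
    ("Grade 10 Algebra and Number", 0, 30),
    ("Grade 11", 3, 32),
    ("Grade 12 Exponents and Logarithms", 0, 30),
    ("Math 7 Integers", 0, 20),
    ("Math 8 Fractions", 7, 20),
    ("Math 9 Equations", 6, 20),
]

correct = sum(c for _, c, _ in results)
total = sum(t for _, _, t in results)
print(f"Overall: {correct}/{total} ({correct / total:.0%})")
# prints: Overall: 16/152 (11%)
```

Roughly one question in ten, across all six exams, which is the basis for saying it doesn’t solve them anywhere near as well as most kids would.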

I’m not sure this is the game changer some people fear it is. It’s a calculator, and not a particularly accurate one. As long as we’re asking the right questions, let them use this app. Just have them check their answers on a calculator.

## My AAC Work

My last post reflecting on my two and a half years at Alberta Assessment Consortium was too much about feelings and not enough about number crunching. Here are some numbers that reflect what I did.

Travel

• Total distance driven = 39 126 km
• Total distance flown = 13 962 km
• Nights in hotels = 97 (48 this school year alone)

Far too many of the drives this winter looked like this:

With all that driving, my 160 GB iPod, with its 7824 songs, was my best friend.

• Most of the time I play it on shuffle mode, all songs in the queue.
• The top 25 most played is a diverse list including Adele, Biz Markie, Edward Sharpe, Pitbull, Gwen Stefani, Leonard Cohen, Project Jenny, Project Jan, Shaggy, Mumford & Sons, and Band of Horses.
• The most played song (81 plays) was Whale of a Tale by Danny Michel.
• On shuffle, a lot of songs come up that I’m not in the mood for.
• The most skipped song (50 skips) was something called April Showers by Sugarland. I wonder why it’s on my iPod.
• High on both lists are Hate Me by Blue October (39 plays, 37 skips) and A-Punk by Vampire Weekend (37 plays and 37 skips).
• I worked my way through all the Freakonomics podcasts from start to finish.

Cities and Towns Visited For Work

• 29 Different Cities
• Most Visited City – Grande Prairie – 30 Days
• Second Most Visited City – Fort McMurray – 19 Days
• Closest City Visited – St. Albert (or is Sherwood Park closer?)
• Farthest City Visited – Toronto, Ontario

Work

School Visits

• 153 School Visits
• 42 Unique Schools
• Grande Prairie Composite was stuck with me the most, at 23 visits.

Coaching Visits

• 85 Coaching Visits
• 41 Different Teachers Coached

Workshops/Presentations

• Total, including full day, half day, and shorter – 93
• Teachers in workshops – 2017
• Unique teachers in workshops – A subset of that 2017
• Workshops in French – 5
• Most common workshop theme – Formative Assessment, of course.

Meetings

• Meetings Attended – 121 (Ugh!)

Demo Lessons

• Total – 43
• Total Flops – 2

Meals With Keynote Speakers

• Steve Leinwand – 1 (But it was actually the second time I dined with him)
• Cathy Lassiter – 1
• Ruth Sutton – 2
• Ken O’Connor – 1
• David Coffey – 1
• Kathryn Coffey – 1

## What I Learned At AAC

My secondment at Alberta Assessment Consortium ends next week. For the past 2.5 years, I have traveled the province conducting a research study in which I worked with math teachers on embedded formative assessment. We also studied the coaching model as a professional learning tool.

As I transition back to my district, I’m reflecting on my time at AAC. I’d like to share with you what I think I took most from this experience.

I could tell you about all the people I met across the province who are doing great things in high school math classrooms, but that would sound trite.

I could tell you about how much I learned about assessment, but I’d have been doing an absolutely terrible job of this work if I didn’t learn a whole lot.

When I took the job, I had no idea I would need to make videos as part of the project. The ones I made are posted here. They’re not in the order I made them, but an astute viewer will see my progression. After the first one, we bought new camera equipment because the flip camera wasn’t cutting it. At one point, we had a videographer come in and teach us about cuts, B-roll, transitions, multiple cameras and other tricks. We hired a video “intern”, who made one video for me, and helped me dabble in Adobe. For the most part, though, those videos are all me, and are all iMovie.

The thing is, I had no idea I’d enjoy that creative process so much. Let me tell you how much I enjoyed it.

Last week, I spent a day at a local elementary school filming K-3 students talking about their writing. I hit it with three cameras, one on a boom giving an overhead shot of the students’ work. I recorded an audio track on a separate microphone. I brought a colleague to interview the students so I could focus on filming. I did my best to film it like a pro. In the end, I had more than 90 minutes of footage, filmed from three different angles. This footage is to be used by our video intern under the guidance of future AAC employees to make 30 second snippets to use in workshops and to post on our website.

The thing is, I couldn’t let it go.

Even though I don’t own the footage, and can’t use it myself, I had to make something from it. Knowing full well that no one would ever see it outside our office, I spent hours piecing it all together into something I loved: 15 minutes of young kids talking about feedback. I built in multiple angles. I worked in their funny comments and their insightful comments. I pieced it all together in a manner that really amuses me. I added transitions and pulled audio from my best track into the clips from the other cameras. I learned how to line that audio up with the students’ lips. It’s some of my best work. I’ve revised it twice more after rendering it and showing it to people.

On Friday, I’ll wipe my work laptop clean and pass all my video (including this one) on to the boss on a hard drive. At that point, I won’t even have a copy of this creation any more.

Why did I do all that knowing that very few people would ever see it, and that I couldn’t keep it? Because it reflects the thing I learned most about and really enjoyed doing during this job. Who (other than that Bloom guy) knew that a creative process could be so enjoyable and valuable? That’s a nice thing for a rigid math guy to come to understand.

## An Elementary Teacher’s Perspective on AMD

I’ve been blogging about my experience at the Alberta Mathematics Dialogue last week, in which a group of university mathematics professors offered a critique of the K-12 math program in Alberta. My colleague, Pat, attended as well. Pat has more than 30 years’ experience as a teacher and consultant in Alberta. She has a BSc (math major), BEd and MEd. As a high school teacher, I can’t pretend to know a whole lot about how young children learn mathematics. Pat, however, is truly an expert in this area. I asked her if she would be willing to share a few words here, and she agreed. What follows are her words.

I’ve been an elementary teacher since 1979. It’s a designation I’ve always been proud of, even though it seems the complexity of the work is poorly understood and not always respected. For most of the past 4 years I’ve been out of my classroom, supporting Alberta teachers in the areas of mathematics and assessment. I attended the 2014 Alberta Mathematics Dialogue in Camrose on May 1.

In addition to attending the presentations examining the Alberta K-12 mathematics curriculum, I was able to join a round-table discussion at the end of the day. The presenters from the earlier sessions were there, along with other interested participants. The discussion focused again on the math curriculum – past, present and future – and its impact on mathematics learning in Alberta classrooms.

There was overwhelming agreement among the post-secondary faculty in attendance that the math skills of their students have significantly declined over the past 10 or more years. This is not an area I have expertise in, but I’m willing to work under the assumption that they know what they’re talking about, and are not guilty of looking to the past with rose-coloured glasses. However, almost no one in the room seemed prepared to question the causes of this perceived decline. It seemed accepted as a truth that changes to the Alberta curriculum caused the problem, and that reversing those changes would fix it.

Alberta teachers (as well as teachers in many jurisdictions around the world) have been asked to teach math through more of an inquiry approach – teaching math through problem-solving rather than for problem-solving, if you will. Teachers present problems for students to explore, and then help them use this exploration to develop an understanding of math concepts and strategies they need to move their learning forward. Personal strategies for operations are part of the equation, and a mastery of basic facts is still critical. (Even as I try to explain this in a nutshell, I sense the eye-rolling of the masses of critics who see this approach as so much hogwash. Please accept for a moment that I have some serious experience to back up my opinions.)

In my classes I have mathematically talented students who need to be challenged, as well as students whose past experiences have made them fragile, uncooperative, discouraged and hard to motivate. I need to find a way to interest all my students, sometimes almost against their will, in the problems I’m asking them to explore so they can begin to grapple with the ideas that might be useful to solve them. Once students have worked to solve a problem, sometimes unsuccessfully, they are far more likely to be interested in thinking about an approach (mine or another student’s) that might do the trick. I try to give them a need for the math I want them to learn. A hard lesson I’ve learned after many years of teaching math to elementary students: as much as I’d like to, I can’t do the understanding for my students. All I can do is my best to engage them in thinking about what I need them to think about. I have to rely on them to do the hard work of making sense of it.

It is unbelievably complex work, but an inquiry approach in my math classroom helped me and my diverse students function as a mathematics community. Without a doubt, I was a better and more successful math teacher using the current math curriculum, as well as the one before it, than I was using the 1975 mathematics curriculum (which, according to Anna Stokke of the University of Manitoba, was the last excellent math curriculum in Alberta). My students thrived under an inquiry approach.

I’m pretty sure I don’t need to lecture the mathematicians in the crowd about the difference between “correlation” and “cause and effect.” The perceived decline in math abilities is correlated with an enormous number of changes and challenges that have impacted students and teachers in Alberta schools in the past years, and the curriculum is just one of them. I find it fascinating and disturbing that critics, particularly in the media, seem so unwilling to consider the possibility that the task of improving math achievement is far more complex than it might seem at first glance (and, in my opinion, impossible to measure using a single standardized test). An easy fix like making the curriculum more rigorous or traditional or focused on basics almost certainly does not exist.

Recently, when I polled a roomful of university-educated adults about their opinion of math as students, about a third of them admitted to having hated it. I fail to see this as evidence of the great success we had back in the “good old days.” Instead of blindly charging back in that direction, why don’t we take a deep breath, set aside the destructive, combative nature of the current debate, and support the work of our teachers and curriculum developers (who, believe it or not, bring essential skills and expertise to the table) in whatever way we can? The challenges we face are more than failure to memorize times tables. The world we live in is changing at a dizzying rate. Preparing our students to navigate it successfully is the most important work I can imagine.