An anonymous commenter on one of my recent posts accused me of not providing data backing up my criticisms of Reform Math. I explained that I see my main contribution to the debate about Reform Math as being in my side-by-side comparisons of Reform Math with Singapore Math and pre-1960s U.S. math.

Detailed comparisons of particular problem sets and elicited skill sets also have their role, and help to ground such concepts as "higher level thinking," "conceptual understanding," and "mathematical challenge."

I also noted that, "for a combination of practical, ethical, and ideological reasons," the kinds of randomized studies Anonymous was requesting are not as abundant as one might wish.

Meanwhile, my new friend Leigh Lieberman has been digesting and disseminating tons of what data there is. For example, she's shared with me a November 2007 Washington Post article on the "unusual approach" of performance-based grouping, as conducted at Rock View Elementary, a low-income school with limited resources in Montgomery County, MD. From the article:

The Kensington school's 497 students are grouped into classrooms according to reading and math ability for more than half of the instructional day.

These groupings, the article notes, are "fluid and temporary":

Students are tested regularly in multiple areas and are promoted to more challenging course work as their skills improve. No one is ever demoted.

The results:

While some other Montgomery County schools serving low-income populations have posted higher test scores, few have shown such improvement or consistency across socioeconomic and racial lines.

Leigh writes:

What I find most exciting about this case, besides the dramatic speed and degree of success in just a few years of use, is what happened the year when it was banned and then subsequently restored.

From the article:

In the 2005-06 academic year, Roberson [Rock View's principal] was instructed to halt performance-based grouping, for at least one year, "to see if it really had an impact on student performance." Students returned to mixed-ability classrooms. Test scores fell. The next fall, performance-based grouping resumed. Scores rebounded to all-time highs.

Leigh also notes that "Many of the reform movement's so-called 'successes' cannot be readily reproduced elsewhere (very few schools fall in their economic and professional parent range)."

I'll be sharing more of Leigh's data/links here on this blog. To start us out, here is a study she's forwarded me that examines how Michigan State students who used the Reform Math Core-Plus program in high school fared in comparison with those who didn't:

A Study of Core-Plus Students Attending Michigan State University (Richard Hill and Thomas Parker).

## 7 comments:

I posted about this before, but it seems appropriate to do so again in light of what you've written above. When the National Math Advisory Panel was meeting, Sherry Fraser, one of the bigwigs of the Interactive Math Program (IMP), which was developed with an NSF grant, gave testimony, part of which follows:

"How many of you remember your high school algebra? Close your eyes and imagine your algebra class. Do you see students sitting in rows, listening to a teacher at the front of the room, writing on the chalkboard and demonstrating how to solve problems? Do you remember how boring and mindless it was? Research has shown this type of instruction to be largely ineffective. Too many mathematics classes have not prepared students to use mathematics, to be real problem-solvers, both in the math classroom and beyond as critical analyzers of their world."

I wrote her an email and asked her for references to support her statements. Her reply:

"I'm a firm believer in people doing their own research. I'm sure you won't have any trouble finding a number of sources to confirm this. I certainly didn't."

Hmm. We were in rows, but Mrs. Tannehaus used the overhead, not the board. And we solved problems during class too, to practice what she taught us.

And I recall it being fun. Well, some things (factoring) made us groan because they were hard, but we eventually got them, with practice and guidance.

It sounds like Fraser just didn't like math and is trying to punish the rest of us because of it!

"side-by-side comparisons"

The studies that try to show improvement should provide side-by-side comparisons of the before and after curricula. Our school improved using Everyday Math, but that was over MathLand, and the question is how large the improvement was, even if it was statistically significant. How good is slightly better than really bad?

In addition, the change corresponded with a new superintendent who set higher expectations for teaching because of NCLB. I think the improvement had more to do with the fact that NCLB forced the school to pay a little more attention to basic skills. The improvement probably would have happened even if they had kept MathLand. In any case, most reform supporters select their curriculum and then look for any few percentage points of relative improvement. They don't seem to understand that huge improvements are possible, especially with some kids, and especially if you separate the kids by willingness or ability. Why do high-SES kids do so much better? Parents ensure that basic skills are mastered, and many kids are separated out and taught more at home. Instead of looking for a few relative percentage points, why not pick out some basic skills (which nobody would disagree with) and test mastery of those? As I always say, what understanding and problem-solving skills make it OK to do poorly on the simple NAEP test?

In fifth grade, while my son was at a private school that used Everyday Math, they were considering moving to a new math curriculum. I had a discussion with a (very nice) head of curriculum and told her about Singapore Math. She had never heard of it, so I loaned her my books. In the end, she said that they were good, but "not right for our mix of kids". It was then that I realized that she thought the curriculum was too difficult. So much for the glories and benefits of EM. Reform math supporters are desperately trying to claim the high ground of problem solving and understanding, but at best, they are only achieving them through lower expectations, and there is no proof that it works. You can see these differences in a side-by-side analysis. In our public school's case, EM is all about supporting our full-inclusion model. EM supposedly works by definition because teachers are told to "trust the spiral".

In theory, you can always trade content and speed for better understanding or something. This might improve test scores at the lower end. The funny thing is that the numbers they use to look for improvement (like state tests) are more likely based on a better mastery of basic skills. Does anyone ever look at where improvement happens in these comparisons?

The big lie is that reform math is supposed to be better even for the most capable students. They never talk about how they are really lowering expectations, instead covering it up with fancy talk of understanding and problem solving.

Religion might use data as "proof," but that doesn't mean its adherents are open to data that proves the religion false.

We sat in rows. The teacher lectured. It was my favorite class. Please tell me this doesn't count as research.

FWIW, I am not a math-brained person... I like math, and I can do math, but I am not particularly gifted at it. It's always been hard work for me, but hard work can be enjoyable. Must be that midwestern upbringing I had.

As for Fraser: in her own way, she is also right. There have always been lousy teachers, math and otherwise, in whose classrooms you could find bored students sitting in rows, hating the class they were in. That is probably a function of the teacher's ability to teach and of the preparation the students had before they found themselves in that room.

Steve, "not right for our mix of kids" might also mean not right for our mix of teachers; that is, Singapore Math demands a fair competency from the teacher. For teachers with so-so math skills, a program like Everyday Math or Trail Blazers is far preferable. You don't need any mathematical knowledge to teach them... you just follow the script. If the kids get lost or confused and ask questions you can't answer, you can always "trust the spiral."

My point in posting the story about Fraser is that when asked to provide the research that she said "shows" what she claimed, she told me to look it up myself.

Lynne G is correct. That traditional math was sometimes done poorly doesn't mean it can't be done properly and effectively.

wow!
