Wednesday, March 5, 2008

So asks Dennis DeTurck, an award-winning math professor and dean of the College of Arts and Sciences at the University of Pennsylvania. Children, he argues, don't understand fractions; fractions are as obsolete as slide rules. Calculators and decimals let us breeze where once we slogged, baffled by expressions like 1/2, by finding common denominators, and by inverting and multiplying.
Of course, many children--left-brainers and math buffs--start grasping fractions as early as first grade. And surely all children must master them in order to advance through algebra?
Pondering this, I once asked Professor DeTurck about rational expressions like 1/x and y/z. How do you express these as decimals and manipulate them with calculators?
Maybe we should wait until algebra before introducing fractions, he replied. Or (see above link) until calculus.
Those of us who teach children know well the pedagogical nightmare that arises when we introduce two tough concepts at once--e.g., fractions and variables, or fractions and derivatives.
A second reason for fractions before algebra struck me last night. I was helping my son through an algebra problem, and amid all the messy denominators--e.g., (a-b)*(x-a)--he'd lost sight of how to find common denominators. It was only when I gave him an analogous problem with numbers in place of the variables that he rediscovered the algorithm and saw why it works. Had fractions with numbers not been familiar ground to return to, he might have continued to flounder.
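The numeric problem I actually gave him isn't recorded above, so here is a minimal sketch of the same idea with made-up numbers, using Python's standard-library fractions module to check the hand algorithm--scale each numerator by the other denominator, then add over the product of the denominators:

```python
from fractions import Fraction

# Hypothetical numeric analogue of the algebra problem: add 1/3 + 1/4.
a = Fraction(1, 3)
b = Fraction(1, 4)

# The hand algorithm: rewrite both fractions over the common
# denominator 3 * 4, scaling each numerator by the other denominator.
common = 3 * 4                            # common denominator: 12
by_hand = Fraction(1 * 4 + 1 * 3, common)  # (4 + 3) / 12 = 7/12

# Fraction does the same bookkeeping automatically and reduces the result.
assert by_hand == a + b
print(by_hand)  # -> 7/12
```

The same steps carry over letter for letter to 1/(a-b) + 1/(x-a), which is exactly why the numeric warm-up unsticks the algebra.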