## Strange Accounts of the Common Core State Standards

The mathematicians Sol Garfunkel and David Mumford discuss the Common Core State Standards for Mathematics (CCSS) in their August 24 *New York Times* editorial “How to Fix Our Math Education,” concentrating their remarks on high school.

In contrast, the education researcher Andrew Porter and colleagues concentrate on Grades 3–6 and 8 in “Common Core Standards: The New U.S. Intended Curriculum” in the April issue of *Educational Researcher*. In August, Porter described these findings in his *Education Week* article “In Common Core, Little to Cheer About.”

As someone who has read the CCSS, I find these articles peculiar. (Disclosure: I edited the penultimate version of the CCSS and am the editor for the CCSS Progressions.)

Here’s why.

In brief, although Garfunkel and Mumford make assertions about the CCSS, they don’t seem to have read them.

The list is longer for Porter and his colleagues. Their analysis seems to omit the CCSS Standards for Mathematical Practice and some conclusions apparently rely on a mistake in a table. Their measures of focus and cognitive demand miss important characteristics of intended curricula in the US and other countries. The authors acknowledge this possibility, saying, “Judging the quality of the Common Core standards is of great importance, but it is only *partially and tentatively addressed here*” (emphasis added). Despite this caveat, Porter asserts in *Education Week*, “Our research shows that the common-core standards do not represent a meaningful improvement over existing state standards.”

It would be more accurate to say, “Our research *does not show* that common-core standards do—or do not—represent a meaningful improvement over existing state standards.” Absence of evidence is not evidence of absence.

Here are some details.

**The New York Times article**

Garfunkel and Mumford say, “Today, American high schools offer a sequence of algebra, geometry, more algebra, pre-calculus and calculus. . . . This has been codified by the Common Core State Standards. . . . a highly abstract curriculum.”

However:

- The CCSS do not dictate how topics are sequenced in courses. (See the note on courses in the CCSS.)

- The CCSS do not include standards for topics in calculus.

- For high school, the CCSS include substantial requirements in statistics and probability (see Statistics and Probability, pp. 79–83) and in modeling (see Modeling, pp. 72–73). Another relevant high school domain is Number and Quantity, which says:

> In high school, students encounter a wider variety of units in modeling, e.g., acceleration, currency conversions, derived quantities such as person-hours and heating degree days, social science rates such as per-capita income, and rates in everyday life such as points scored per game or batting averages. They also encounter novel situations in which they themselves must conceive the attributes of interest. (CCSS, p. 58)

This is not a “highly abstract” codification of “algebra, geometry, more algebra, pre-calculus and calculus.”

**The Educational Researcher article**

In their *Educational Researcher* article, Porter et al. compare the CCSS with the mathematics standards from 14 states and with grade 8 mathematics requirements in Finland, Japan, and Singapore. (They also consider the English Language Arts standards, but that analysis will not be discussed here.) Here are four questions raised by their methods, analysis, and conclusions.

*Where are the practice standards?* The CCSS Standards for Mathematical Practice are not mentioned, nor does the analysis appear to consider them, because they do not fit the categories used. (These categories are discussed in more detail below.) For example, Standard for Mathematical Practice 4 begins:

> 4. Model with mathematics.
>
> Mathematically proficient students can apply the mathematics they know to solve problems arising in everyday life, society, and the workplace. (CCSS, p. 7)

Also, are overall aims and objectives from Finland, Japan, and Singapore included in the analysis? These include:

> Aim 4. Recognize and use connections among mathematical ideas, and between mathematics and other disciplines. (Singapore grades 7–9 syllabus, p. 1)

> The pupils will come to understand the importance of mathematical concepts and rules, and to see the connections between mathematics and the real world. (Finland, grades 7–9, p. 164)

> To enable students to understand deeply the fundamental concepts, principles, and rules relating to numbers, quantities, figures and so forth, to acquire methods of algebraic expressions and strategies, and to improve their ability to examine phenomena mathematically. To enable students to enjoy mathematical activities, to appreciate mathematical ways of approaches and thinking, and to foster positive attitudes toward making use of them. (Japan, grades 7–9, p. 50)

If such “process standards” have not been included in the analysis, a potential similarity between the CCSS and international objectives has not been considered, and a potential difference between the CCSS and state standards not discussed.

*Mistake in grades 3–6 algebra characterization?* Porter et al. focus on grades 3–6, saying repeatedly that “state standards [i.e., the standards of the 14 states examined] place a much greater emphasis on advanced algebra” (p. 106; see also pp. 111, 115). This characterization is confirmed by Table 4. However, it is contradicted by Figure 2, which shows “Topographical maps comparing Common Core and state standards at coarse-grain topic level for math.” Something appears to be amiss with Table 4, because it also indicates that state standards for grades 3–6 place 25% of their emphasis on instructional technology but only 13% on number sense. This is contradicted by Figure 2, which does not show such a large emphasis on instructional technology.

[Update: Porter et al.’s erratum in the October 2011 issue of *Educational Researcher* says that in “Table 4, the numbers for ‘Math’ under the ‘State’ column should have been located one row lower, with the bottom percentage of 25.71 moving to the top to describe number sense. The percentage 13.84 describes operations, 15.08 describes measurement, and so forth.”*]

*Relevant measures of focus?* Porter et al. measure focus in two ways (see p. 108):

- “how many cells were needed in the content matrix of topics by cognitive demand to capture 80% of the total content; the fewer the cells, the greater the focus.”
- “how many cells [of the matrix] contained 1% or more of total content; the more such cells, the greater the focus.”
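To make the two counts concrete, here is a minimal sketch in Python. It is hypothetical, not Porter et al.’s actual coding: it assumes the content-by-cognitive-demand matrix is given as a list of rows whose entries are percentages of total content, and the `toy` matrix is invented for illustration.

```python
def cells_for_80_percent(matrix):
    """Measure 1: how many of the largest cells are needed to capture
    80% of total content (fewer cells = greater focus)."""
    values = sorted((v for row in matrix for v in row), reverse=True)
    total, count = 0.0, 0
    for v in values:
        total += v
        count += 1
        if total >= 80.0:
            break
    return count

def cells_above_1_percent(matrix):
    """Measure 2: how many cells contain 1% or more of total content."""
    return sum(1 for row in matrix for v in row if v >= 1.0)

# Toy matrix: rows are fine-grained topics (e.g., place value, whole
# numbers); columns are the five cognitive-demand categories; entries
# are percentages of total content (they sum to 100).
toy = [
    [10.0, 30.0, 20.0, 5.0, 5.0],
    [5.0, 10.0, 10.0, 3.0, 2.0],
]
print(cells_for_80_percent(toy))   # cells needed to reach 80% of content
print(cells_above_1_percent(toy))  # cells holding at least 1% of content
```

Note that both counts are taken over the cells of a single matrix with no grade dimension, which is the point pursued below.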

Neither of these measures is consistent with the meaning of focus in the CCSS. To see this, it helps to know what the content by cognitive demand matrix looks like.

“Content” consists of subcategories of the coarse-grain topic level mentioned earlier, e.g.,

- “number sense”: place value, whole numbers, fractions, decimals
- “basic algebra”: absolute value, use of variables
- “advanced algebra”: quadratic equations, systems of equations

A complete list of categories and subcategories is here.

“Cognitive demand” has five categories: Memorize; Perform procedures; Demonstrate understanding; Conjecture, generalize, prove; Solve nonroutine problems.

Thus, two rows of the content by cognitive demand matrix are:

|  | memorize | perform procedures | demonstrate understanding | conjecture, generalize, prove | solve nonroutine problems |
|---|---|---|---|---|---|
| place value |  |  |  |  |  |
| whole numbers |  |  |  |  |  |

With these measures of focus, a collection of standards that involves less repetition of topics in each grade and several levels of cognitive demand could be indistinguishable from standards with more repetition of topics over the grades but less cognitive demand. However, according to the CCSS, the former would be more focused than the latter.

In the CCSS, “focus” has the meaning used by William Schmidt and his colleagues (see the CCSS references). Schmidt et al. measure focus by examining the *number of topics* that occur in each *grade*. The results of their analysis for US state standards and top-achieving countries are shown in their 2002 *American Educator* article.

The first two rows of Schmidt et al.’s matrix are:

| grade level | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
|---|---|---|---|---|---|---|---|---|
| whole number: meaning |  |  |  |  |  |  |  |  |
| whole number: operations |  |  |  |  |  |  |  |  |

Schmidt’s analysis of intended grades 1–8 curricula as reflected in standards from 21 US states and in documents from top-achieving countries is here. This shows a considerable difference in number and distribution of topics per grade between the states and the top-achieving countries. A video showing Schmidt’s analysis of a close-to-final version of the CCSS and standards of 21 states is here. The CCSS slide occurs a bit less than two-thirds of the way through the video.
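Schmidt et al.’s measure can likewise be sketched in a few lines. This is a hypothetical illustration, not their actual analysis: it assumes a curriculum is represented as a mapping from grade to the set of topics intended in that grade, and the two toy curricula are invented to show the contrast the grade dimension makes visible.

```python
def topics_per_grade(curriculum):
    """Schmidt-style focus: the number of topics intended in each grade."""
    return {grade: len(topics) for grade, topics in curriculum.items()}

# Toy contrast: a focused curriculum treats a few topics per grade and
# then moves on; an unfocused one repeats many topics in every grade.
focused = {
    1: {"whole number: meaning"},
    2: {"whole number: operations"},
    3: {"fractions"},
}
unfocused = {
    1: {"whole number: meaning", "whole number: operations", "fractions"},
    2: {"whole number: meaning", "whole number: operations", "fractions"},
    3: {"whole number: meaning", "whole number: operations", "fractions"},
}
print(topics_per_grade(focused))    # few topics in each grade
print(topics_per_grade(unfocused))  # many repeated topics in each grade
```

Because the count is taken per grade, this measure distinguishes the two toy curricula, whereas a count over a single content-by-cognitive-demand matrix with no grade dimension need not.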

Of course, Porter et al. are free to use their own measures of focus. However, although their discussion of focus cites Schmidt et al. 2001 (see p. 103), they do not mention that their measures of focus are quite different from those of Schmidt and his colleagues.

[Update: Schmidt and Houang comment in their November 2012 article in *Educational Researcher*:

> Using a somewhat unique and different definition of focus from that used by Schmidt et al. (2001), they [Porter et al.] considered not only the number of topics but also the cognitive demand associated with the topics as found in the standards.]

*Accurate characterization of intended curricula in top-performing countries?* In *Education Week*, Porter says:

> But curricula in top-performing countries we studied—like Finland, Japan, and New Zealand—put far less emphasis on higher-order thinking, and far more on basic skills, than does the common core.

But do they?

For mathematics, Porter et al. analyzed grade 8 curriculum documents from three countries: Finland, Japan, and Singapore. Aside from the question of whether any region’s curriculum should be characterized by a single grade, there are other technical details that suggest a less sweeping conclusion.

Finland, Japan, and Singapore do not have *standards* for what students should know. Instead, they have curricula, courses of study, or syllabuses, together with *learning objectives* or *aims* describing what students should achieve.

One difference between standards and learning objectives is that the latter don’t require nearly as much space to describe. How were these less detailed specifications coded in the content by cognitive demand matrix? Or, did Porter et al. use some other description of the mathematics requirements? All that we are told is: “Wisconsin’s SEC database contains some information on content standards for other countries. In mathematics, there are data for Finland, Japan, and Singapore on eighth-grade standards” (p. 113).

The Finnish content objectives for grade 8 are not separated from those in grades 6–9 (see pp. 163–167). These fit on four pages. They include:

> The pupils will know how to
>
> solve a first-degree equation
>
> . . .
>
> use pairs of equations for solving simple problems. (p. 165)

The content objectives for the Japanese grade 8 course of study fit on two pages. They include:

> To enable students to develop their ability of calculating and transforming algebraic expressions depending on the intended purpose, and to enable them to understand simultaneous linear equations with two variables, and to motivate their ability to use such equations. (p. 52)

Some might consider these to be procedures, and in fact, solving an equation is classified under “perform procedure” in Porter et al.’s analysis (see Table 5, p. 109). Such classifications may have contributed toward the high percentage of objectives classified as “basic skills.” (Here I am assuming that “basic skills” correspond to “memorize” and “perform procedures” in the hierarchy of cognitive demand.) However, solving an equation is not necessarily just a matter of using a procedure. More generally, it’s also possible that other “basic skills” are not so basic. This is illustrated by the findings of the TIMSS Video Study.

This study analyzed videos of eighth-grade classrooms in seven countries. US classrooms had the highest percentage of classroom discussions of problem solutions that were classified as “using procedures” and none classified as “making connections.” In contrast, top-performing Japan, Czech Republic, and Hong Kong had the highest percentages of “making connections” problems. (Singapore and Finland did not participate in this study.)

One notable finding was that problems whose *statements* were coded as “using procedures” were sometimes *solved in the classroom* in a manner that was coded as “making connections” (Hiebert et al., 2003, p. 116). In particular, a large percentage of problem statements in Hong Kong lessons were coded as using procedures, but a smaller percentage of problem solutions were coded in this way. One such example (which involves solving equations) is a public release lesson from Hong Kong, which can be viewed here.

In light of these findings, the education researchers James Hiebert and Yuichi Handa proposed a hypothesis in their article “A Modest Proposal for Reconceptualizing the Activity of Learning Mathematical Procedures.”

> Perhaps the coding in this [TIMSS] study did not capture what is critical in procedural and conceptual activity. Maybe the traditional distinction between conceptual and procedural is too crudely defined and starkly contrasted to capture what is really intended and/or what is operationalized as procedural problems play out in classroom activity. (2004, p. 3)

Their reconceptualization of procedural learning is illustrated by their analysis of a TIMSS public release lesson on square roots from Hong Kong. This lesson can be viewed here.

In this article, Hiebert and Handa state: “The implication derived from reviewing the Hong Kong SAR lesson is that the interplay between procedural and conceptual makes clear distinctions between them problematic” (p. 11). This proposal has received considerable discussion in the mathematics education community (see, e.g., Baroody et al., 2007; Star, 2005; and the articles that cite them).

Porter et al. do not seem to be aware of this discussion. Instead, they say:

> Our conclusions are based on the assumption that the content distinctions made by our . . . procedures are important. But a key question remains: Do we describe content at too crude or too precise a level of detail? (2011, p. 115)

I would like to suggest that the answer is “none of the above.”

**Endnote**

*Porter et al.’s erratum (published in the October 2011 issue of Educational Researcher) says that the following paragraph from the April 2011 article:

> There are many notable differences between Common Core and state standards. For example, whereas Common Core standards place a much greater emphasis on basic algebra, state standards place a much greater emphasis on advanced algebra. Similarly, Common Core standards place greater emphasis than do states on geometric concepts, but less emphasis on advanced geometry. There is a huge difference in emphasis on instructional technology (e.g., calculator use); in the state standards, nearly 26% of content is on instructional technology, compared with none in the Common Core standards.

should be replaced by:

> There are some differences between Common Core and state standards. For example, Common Core puts a heavier emphasis on number sense and operations than do state standards. In contrast, Common Core puts much less emphasis on geometric concepts, data displays, and probability than do states.

**References**

Baroody, A., Feil, Y., & Johnson, A. (2007). An alternative reconceptualization of procedural and conceptual knowledge. *Journal for Research in Mathematics Education, 38*(2), 115–131. Retrieved from http://www.nctm.org/publications/article.aspx?id=17382

Common Core State Standards Initiative. (2010). *Common Core State Standards for mathematics*. Retrieved from http://www.corestandards.org/assets/CCSSI_Math%20Standards.pdf

Finnish National Board of Education. (2004). *National core curriculum for basic education 2004.* Vammala: Vammalan Kirjapaino Oy. Retrieved from http://www.oph.fi/english/sources_of_information/publications

Garfunkel, S., & Mumford, D. (2011, August 24). How to fix our math education. *New York Times*. Retrieved from http://www.nytimes.com/2011/08/25/opinion/how-to-fix-our-math-education.html

Hiebert, J. et al. (2003). *Teaching mathematics in seven countries: Results from the TIMSS 1999 Video Study.* Washington, DC: US Department of Education, National Center for Education Statistics. Retrieved from http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2003013

Hiebert, J., & Handa, Y. (2004). *A modest proposal for reconceptualizing the activity of learning mathematical procedures.* Paper presented at the annual meeting of the American Educational Research Association, San Diego. Retrieved from http://www.udel.edu/education/mathed/downloads/Hiebert2004.pdf

Ministry of Education, Culture, Sports, Science and Technology of Japan. (2004). *The courses of study in Japan* (Study Group for U.S–Japan Comparative Research on Science Mathematics and Technology Education, trans.). Retrieved from http://www.seiservices.com/APEC/APEC_KB/KBDisplay.aspx?lngPkID=1567

Ministry of Education, Singapore. (2006). *Secondary mathematics syllabuses.* Singapore: Curriculum Planning and Development Division, Ministry of Education. Retrieved from http://www.moe.gov.sg/education/syllabuses/sciences/files/maths-secondary.pdf

Porter, A. (2011). In Common Core, little to cheer about. *Education Week*, *30*(37), 24–25.

Porter, A., McMaken, J., Hwang, J., & Yang, R. (2011). Common core standards: The new U.S. intended curriculum. *Educational Researcher, 40*(3), 103–118. Retrieved from http://www.aera.net/uploadedFiles/Publications/Journals/Educational_Researcher/4003/103-116_04EDR11.pdf

Schmidt, W., Houang, R., & Cogan, L. (2002). A coherent curriculum: The case of mathematics. *American Educator*, 1–17. Retrieved from http://www.aft.org/pdfs/americaneducator/summer2002/curriculum.pdf

Star, J. (2005). Reconceptualizing procedural knowledge. *Journal for Research in Mathematics Education, 36*(5), 404–411. Retrieved from http://www.nctm.org/publications/article.aspx?id=17463
