Math Task Force’s Bad Calculation

The number of incoming college students who require developmental mathematics coursework is a national problem. As reported by the National Center for Education Statistics, 42% of students entering college for the first time in fall 2003 took a developmental math course. At our institution, Worcester State University, 54% of students entering in fall 2004 placed into developmental math. This is an enormous area of concern for several reasons: there is a monetary cost to students who must take courses for which they are granted no credit, and colleges and universities must pay instructors to teach such courses. In this article we examine a policy change recently implemented by the Massachusetts Board of Higher Education (MBHE) that seeks to address this issue by drastically changing how students are placed into their first college-level math class.
The current process for placing incoming students into their first mathematics course was established by the MBHE's 1998 Common Assessment Policy of Massachusetts, which was based on a report from the Mathematics Assessment Task Force. All members of that committee had a background in mathematics, and half held mathematics faculty positions. Detailed minutes of all meetings were included in the report, as were all votes. The policy dictated that all incoming students take the Accuplacer Elementary Algebra exam, which covers topics found in an Algebra I course (typically taken in eighth or ninth grade), and it mandated a "cut score" that determines whether an incoming student is placed into a developmental course.

A decision to change this process was made in October 2013, when the MBHE voted to implement four "primary and comprehensive recommendations" made by the Task Force on Transforming Developmental Math Education. The MBHE report indicates that the task force had 17 members. In contrast to the 1998 task force, only five of the members are listed as holding current positions that involve math instruction; one other member is a former mathematics professor. In further contrast, the report includes neither minutes nor records of votes, and our attempts to obtain them suggest that no such records exist.

The task force's first recommendation is that recent high school graduates with a high school GPA of 2.7 or higher be exempted from the initial placement exam (currently Accuplacer) and placed directly into the lowest college-level math course appropriate for their chosen pathway of study. Further, graduates whose high school GPA is lower than 2.7 but higher than 2.4, and who have successfully passed four math courses including math in their senior year, would likewise be exempted from the placement exam and placed directly into the college-level math course appropriate for their chosen field of study. To be clear, this refers to overall high school GPA, not GPA in high school math classes alone. We strongly disagree with this recommendation.

Our first point of contention is the stance the task force takes regarding developmental coursework. The report states that “students who enroll in developmental coursework are less likely to graduate … these students often become discouraged and never reach a point where they even attempt an entry-level course.” The first part of the statement should surprise no one. Along the same lines, first-year students who fail courses are less likely to graduate. Should we deal with this by banning failing grades for first-year students?

The second part of the statement suggests that the best way to eliminate the discouragement a student experiences when faced with developmental work is to place them automatically into a credit-bearing course. Inherent in this conclusion is the assumption that such a student will succeed in that course, even though, in many cases, they will lack vital background skills necessary for success (skills a developmental math course would teach them). The task force neglects to consider how discouraging it can be for a student to repeat, multiple times, a course for which they are not prepared.

Our concern about students struggling in courses for which they are not prepared is grounded in experience. A few years ago, our administration waived placement test requirements for transfer students. During the summer of 2013, we had a pre-calculus student who transferred in a course equivalent to college algebra from a community college. Under the old policy, this student would still have been required to take Accuplacer; under the new policy, however, the student was allowed to register for pre-calculus. This student worked hard, asked questions, and scored 4% on the first midterm exam. He subsequently withdrew from the course. In discussions with him, we suggested that he retake college algebra, because he was clearly lacking the skills necessary to succeed in pre-calculus. He opted instead to try pre-calculus again in the fall. Once again, he worked hard and asked questions, and this time he scored 6% on the first midterm. He again withdrew from the course, and finally, after a year, he agreed to sit in a college algebra course to build up his skills. This student was done an incredible disservice by being deemed prepared for a course for which he was clearly not ready. He wasted hundreds of dollars, and we can say from direct conversations that he was deeply discouraged by the experience. We understand that this anecdote does not specifically apply to developmental coursework, but our overall point is that placement tests serve a vital purpose, and when students are deemed prepared for the class of their choosing without any testing of their basic skills, we fear such situations will occur far more frequently.

Another point of contention we have with the task force recommendations is the exemption from placement testing of all incoming students with a high school GPA of 2.7 or above (and, in the case of students who passed four math classes including one in their senior year, a high school GPA of 2.4 or above). We again worry that such a policy will produce a multitude of students who are woefully underprepared for their first college-level math class. Under this policy, a student who received a D- in every math class he took in high school (and, in the first case, who did not even take a math course in his senior year) will be deemed ready for college-level math so long as his high school GPA is 2.7 or higher. Anyone who assumes that such a student will succeed in his first college-level course either grossly underestimates the rigor of a college-level mathematics course or expects that the standards in such courses will be lowered to accommodate so many ill-prepared students. We have spoken with several members of the task force to try to ascertain the research basis for this recommendation. No specific information was provided. Responses ranged from "I missed that meeting" to "Ask DHE staff."

The College Board's 2013 State Profile Report for Massachusetts provides information about the high school GPAs of college-bound seniors. The average GPA of those who provided this information was 3.23, and only 13% indicated that they had GPAs below 2.7. This suggests to us that 2.7 is an extremely low threshold.

We also have serious reservations about the apparent research basis for the task force recommendations, namely "Predicting Success in College: The Importance of Placement Tests and High School Transcripts," a 2012 paper by Clive Belfield and Peter M. Crosta of the Community College Research Center (CCRC) at Teachers College, Columbia University. First, the paper examined only community colleges and included no data from the Commonwealth of Massachusetts. Second, we are concerned with how the task force interpreted the paper. In its draft report, the task force summarizes the paper as follows:

“This paper uses student data from a statewide community college system to examine the validity of placement tests and high school information in predicting course grades and college performance. The authors find that placement tests do not yield strong predictions of how students will perform in college. In contrast, high school GPAs are useful for predicting many aspects of students’ college performance.”

This is a misleading summary, as the task force seems to confuse predictors of college success with accurate course placement. While the two are related, a placement test does not tell us whether someone will be successful in college math classes, but rather whether they have the knowledge base needed to be successful. Before we implemented our current placement program at WSU, students were able to enroll in college-level math classes regardless of their placement test scores. Not surprisingly, we saw very high failure rates among these students across a broad range of classes.

But perhaps most importantly, the paper's authors acknowledge that their work on validity metrics is based on extrapolation. In particular, because students who score below the Accuplacer cut score are placed into a developmental class, there is no direct data on how such students would fare if placed directly into a college-level class. To predict that outcome, extrapolation must be used (see Appendix for hypothetical examples).

According to our first hypothetical example, we can linearly extrapolate below the cut score to "conclude" that 60% of students who score a 20 on Accuplacer (for reference, the cut score is 82) would pass their first college-level math course. If this conclusion were in fact valid, one could argue for eliminating placement testing. Unfortunately, there is absolutely no evidence supporting a curve of this shape for placement test scores below the cut score. To further illustrate the dangers of extrapolation, consider the next set of scatterplots in our Appendix, which plot median heights of boys ages 2 to 20. Suppose we had never seen a boy under age 15 and had data only for boys ages 15 to 20. As in the hypothetical above, we could extrapolate below age 15 to "conclude" that a 2-year-old child would be over 5 feet tall!
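This failure mode is easy to reproduce numerically. The short sketch below (the height values are rough approximations used only for illustration; they are not taken from our Appendix) fits a least-squares line to median heights for ages 15 to 20 and then extrapolates that line down to age 2:

```python
# Illustration of extrapolation risk: fit a line to (approximate) median
# heights of boys ages 15-20, in inches, then extrapolate far below the data.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = slope*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

ages = [15, 16, 17, 18, 19, 20]
heights = [67.0, 68.3, 69.0, 69.2, 69.5, 69.7]  # illustrative medians, inches

slope, intercept = fit_line(ages, heights)

# Prediction inside the observed range is reasonable ...
print(f"Predicted height at 17: {slope * 17 + intercept:.1f} in")

# ... but extrapolating 13 years below the data is not.
predicted_at_2 = slope * 2 + intercept
print(f"Predicted height at 2:  {predicted_at_2:.1f} in")
# The line predicts over 5 feet for a 2-year-old, far above a real
# toddler's median height of roughly 34 inches.
```

The fit is perfectly good within ages 15 to 20; the absurdity appears only when the line is pushed outside the region where data exist, which is exactly the situation for placement scores below the cut score.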

We also question why community colleges and state universities are being subjected to the same recommendations. The two have major differences, ranging from mission statements to student populations and demographics. In particular, community colleges offer open enrollment, which yields far greater numbers of unprepared students than state universities see. According to data in the task force's report, in fall 2010, 53% of incoming community college students required developmental math education, compared with 23% of incoming students at state universities. The composition of the task force itself also slants heavily toward a community college perspective: of its 17 members, more than half are affiliated in some way with a community college, while only one is currently a faculty member in the mathematics department of a state university.

In addition, the majority of the research (nine of 17 papers) used to support the task force's recommendations originates from the Community College Research Center at Columbia University. In fact, although the task force cites 17 resources to support its recommendations, they actually originate from only three sources: the Community College Research Center, Jobs for the Future, and Complete College America. We certainly understand the need for reform at the community college level, and we appreciate the vitally important role that community colleges play. However, we are very concerned that state universities are being subjected to the same recommendations as community colleges when voices from the state university system were in the vast minority in both task force membership and the research base.

In the section of the task force's report titled "Charge to the Task Force on Transforming Developmental Math Education," four areas are cited as being highlighted in the 2011 Final Report of the Working Group on Graduation and Student Success Rates. Under the first area, "Research and Education," a bullet point states: "Review innovative practices currently in place at colleges within and outside of Massachusetts and create initiatives which successfully scale up best practices across multiple campuses." Yet the successful program we have at Worcester State was ignored in the task force report. Why? As mentioned above, in 2004 we had a failing entry-level program, with 54% of our entering students placed into developmental math and only 30% of those students passing their developmental math course. Working closely with our administration, we embarked on a data-driven redesign of the program, including careful statistical analysis of the effectiveness of our changes. Through efforts to increase students' awareness of the placement process and to improve their mathematical preparation, the percentage of entering students requiring remediation decreased from 54% in 2004 to 24% in 2006. Additionally, in our redesigned developmental math classes, pass rates increased from 30% in 2004 to 80% in 2009, where they have remained. (For more detail on our program, see "Successful Developmental Math: 'Review-Pretest-Retest' Model Helps Students Move Forward," published in The New England Journal of Higher Education. It is surprising that this paper was not included as a reference in the task force report.)

We feel that implementation of the task force recommendations will result in either pressure to lower standards in entry-level math courses or increasing numbers of students failing their first college-level math course. We understand and commend the desire of the task force to improve developmental education, but there is scant evidence that these recommendations will have that effect.

Mike Winders is associate professor of mathematics at Worcester State University. Richard Bisk is professor of mathematics at Worcester State University and was math department chair from 2004 to 2012.
