Manhattan Institute for Policy Research.
The New York Times

 

The Test Passes, Colleges Fail

November 17, 2008

By Peter D. Salins

Stony Brook, NY

FOR some years now, many elite American colleges have been downgrading the role of standardized tests like the SAT in deciding which applicants are admitted, or have even discarded their use altogether. While some institutions justify this move primarily as a way to enroll a more diverse group of students, an increasing number claim that the SAT is a poor predictor of academic success in college, especially compared with high school grade-point averages.

Are they correct? To get an answer, we need to first decide on a good measure of "academic success." Given inconsistent grading standards for college courses, the most easily comparable metric is the graduation rate. Students' families and society both want college entrants to graduate, and we all know that having a college degree translates into higher income. Further, graduation rates among students and institutions vary much more widely than do college grades, making them a clearer indicator of how students are faring.

So, here is the question: do SATs predict graduation rates more accurately than high school grade-point averages? If we look merely at studies that statistically correlate SAT scores and high school grades with graduation rates, we find that, indeed, the two standards are roughly equivalent, meaning that the better applicants do on either of these indicators, the more likely they are to graduate from college. However, since students with high SAT scores tend to have better high school grade-point averages, these data don't tell us which of the indicators, independent of the other, is a better predictor of college success.
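The statistical point here is worth making concrete. The following is a toy sketch in Python with synthetic numbers (not the SUNY data): when two predictors are themselves correlated, each will correlate with the outcome even if only one of them actually drives it, so raw correlations alone cannot separate their contributions.

```python
# Toy illustration (synthetic data, not SUNY figures): graduation is driven
# by "SAT" alone here, yet "GPA" also correlates with graduation simply
# because GPA is correlated with SAT.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
sat = rng.normal(0, 1, n)                    # standardized SAT score
gpa = 0.7 * sat + rng.normal(0, 0.7, n)      # GPA correlated with SAT
grad = 0.8 * sat + rng.normal(0, 0.6, n)     # outcome driven by SAT only

print(np.corrcoef(sat, grad)[0, 1])          # high
print(np.corrcoef(gpa, grad)[0, 1])          # also high, though GPA adds nothing

# Regressing on both predictors at once separates their independent roles:
X = np.column_stack([sat, gpa, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, grad, rcond=None)
print(coef[:2])                              # SAT coefficient ~0.8, GPA ~0
```

This is why the natural experiment described next, in which one predictor moved while the other stayed roughly flat, is so informative.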

Instead, we need to look at the two factors separately. And we can, thanks to the recent experience of the State University of New York, America's largest comprehensive university system, where I was provost from 1997 to 2006. SUNY is blessed with many different types of campuses, mirroring most of the collegiate options (other than small elite private institutions) that characterize contemporary higher education. The university also collects a gold mine of student data, including statistics on pre-admission academic profiles and graduation rates.

In the 1990s, several SUNY campuses chose to raise their admissions standards by requiring higher SAT scores, while others opted to keep them unchanged. With respect to high school grades, all SUNY campuses consider applicants' grade-point averages in decisions, but among the total pool of applicants across the state system, those averages have remained fairly consistent over time.

Thus, by comparing graduation rates at SUNY campuses that raised the SAT admissions bar with those that didn�t, we have a controlled experiment of sorts that can fairly conclusively tell us whether SAT scores were accurate predictors of whether a student would get a degree.

The short answer is: yes, they were. Consider the changes in admissions profiles and six-year graduation rates of the classes entering in 1997 and 2001 at SUNY's 16 baccalaureate institutions. Among this group, nine campuses raised the emphasis they put on the SAT after 1997. This group included two prestigious research universities (Buffalo and Stony Brook) and seven smaller, regional colleges (Brockport, Cortland, New Paltz, Old Westbury, Oneonta, Potsdam and Purchase).

Among the campuses that raised selectivity, the average incoming student's SAT score increased by 4.5 percent (at Cortland) to 13.3 percent (at Old Westbury), while high school grade-point averages increased only 2.4 percent to 3.7 percent, a gain in grades almost identical to that at campuses that did not raise their SAT cutoff.

Yet when we look at the graduation rates of those incoming classes, we find remarkable improvements at the increasingly selective campuses. These ranged from 10 percent (at Stony Brook, where the six-year graduation rate went to 59.2 percent from 53.8 percent) to 95 percent (at Old Westbury, which went to 35.9 percent from 18.4 percent).
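Note that these figures are relative changes in the graduation rate, not percentage-point changes. A quick sketch in Python, using the Stony Brook and Old Westbury figures from the paragraph above, shows the arithmetic:

```python
def relative_change(before: float, after: float) -> float:
    """Relative change (not percentage-point change), in percent."""
    return (after - before) / before * 100

# Stony Brook: six-year graduation rate rose to 59.2 percent from 53.8 percent
stony_brook = relative_change(53.8, 59.2)   # ~10 percent relative gain
# Old Westbury: rose to 35.9 percent from 18.4 percent
old_westbury = relative_change(18.4, 35.9)  # ~95 percent relative gain
print(round(stony_brook, 1), round(old_westbury, 1))  # → 10.0 95.1
```

Old Westbury's striking 95 percent figure thus reflects a 17.5-point jump from a low starting base.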

Most revealingly, graduation rates actually declined at the seven SUNY campuses that did not raise their cutoffs and whose entering students' SAT scores from 1997 to 2001 were stable or rose only modestly. Even at Binghamton, always the most selective of SUNY's research universities, the graduation rate declined by 2.8 percent.

The change is even more striking if we compare experiences of three pairs of similar SUNY campuses that, from 1997 to 2001, took sharply divergent paths. First, Stony Brook and Albany, both research universities: over four years, at Stony Brook the average entering freshman SAT score went up 7.9 percent, to 1164, and the graduation rate rose by 10 percent; meanwhile, Albany's average freshman SAT score increased by only 1.3 percent and its graduation rate fell by 2.7 percent, to 64 percent.

Next, Brockport and Oswego, two urban colleges with about 8,000 students each: Brockport's average freshman SAT score rose 5.7 percent, to 1080, and its graduation rate increased by 18.7 percent, to 58.5 percent. At the same time, Oswego's freshman SAT average rose by only 3 percent and its graduation rate fell by 1.9 percent, to 52.6 percent.

Finally, Oneonta and Plattsburgh, two small liberal arts colleges with 5,000 students each: Oneonta's freshman SAT score increased by 6.2 percent, to 1069, and its graduation rate rose 25.3 percent, to 58.9 percent. Plattsburgh's average freshman SAT score increased by 1.3 percent and its graduation rate fell sharply, by 6.3 percent, to 55.1 percent.

Clearly, we find that among a group of SUNY campuses with very different missions and admissions standards, and at which the high school grade-point averages of enrolling freshmen improved by the same modest amount (about 2 percent to 4 percent), only those campuses whose incoming students' SAT scores improved substantially saw gains in graduation rates.

Demeaning the SAT has become fashionable at campuses across the country. But college administrators who really seek to understand the value of the test based on good empirical evidence would do well to learn from the varied experiences of New York's state university campuses.

Original Source: http://www.nytimes.com/2008/11/18/opinion/18salins.html?_r=3&ref=opinion

 

 
