Boston Globe.

Study finds MCAS an accurate gauge
February 16, 2003

By Michele Kurtz

In the ongoing debate over high-stakes testing, a new study has found that such exams accurately reflect real student achievement, findings its authors say show that concerns about tests like the MCAS are unfounded.

''This should give us some confidence that the high-stakes test results are reliable indicators of student performance,'' said Jay P. Greene of the Manhattan Institute, a conservative think tank that authored the study.

The study compares schools' results on high-stakes exams such as the Massachusetts Comprehensive Assessment System test with their results on other standardized tests that are not used in rating schools or deciding if students can graduate. It examines results for schools in two states and seven districts, including Boston, a sampling that represents 9 percent of the nation's public school students.

But critics blasted the study, which they said draws broad conclusions based on a small percentage of school results nationwide. And they criticized the authors for not addressing questions of whether high-stakes tests drive students to drop out.

''The little data Greene [and the other authors] have decided to rely on does not warrant their conclusions,'' said Audrey Amrein, a researcher at Arizona State University's College of Education.

The authors of the Manhattan Institute study say their work rebuts claims that Amrein and other researchers made in a recent national study, which concluded that students in states with high-stakes tests were learning the material covered by those tests, but not much else. That study by Arizona State researchers found that students in states with high-stakes tests generally did not outpace their peers on other independent exams, such as the ACT, the SAT, and the National Assessment of Educational Progress.

Greene said that whatever so-called teaching to the test is going on does not distort the scores. ''Teaching to the test is only a bad thing if teachers are only teaching test-specific skills,'' he said. ''But if they're teaching general skills, such as reading or how to do math, that may be OK.''

But while the researchers' findings show a high correlation between scores on high-stakes tests and such exams as the Stanford 9 and the Iowa Test of Basic Skills, the results are generally mixed when comparing gains from year to year. ''If we want to reward or punish schools for the difference they make in educating students, high-stakes testing has not shown itself to be reliable,'' Greene said.

Gains on test scores are a key component in how schools are judged under the new federal No Child Left Behind law, which mandates that students be proficient in English and math by 2014.

''A lot of students in wealthy suburban neighborhoods are going to pass these tests no matter what,'' said Marcus Winters, an author of the study. ''Gains only look at the effects schools are having.''

Lost in the debate, some say, is a bigger question.

''There's an assumption underneath all of this that any of these tests, including NAEP, actually represents a high-quality education,'' said Monty Neill, executive director of FairTest, a Cambridge-based group that opposes the MCAS graduation requirement. ''There's an awful lot that one would hope kids would learn and often do learn that's just not tested.''

©2003 Boston Globe
