
Richmond Times-Dispatch.

Study rates Florida first in school tests
Va., other areas lagging behind
February 12, 2003

By Jason Wermers

A study of standardized testing systems in Virginia, Florida and seven large public-school divisions around the country concluded that Virginia and the others lag behind Florida.

The report, issued yesterday, concludes that high-stakes standardized testing programs, if correctly implemented, can be reliable measures of what students are learning in the classroom.

But the study by the Manhattan Institute for Policy Research's Center for Civic Innovation found that of the programs it studied, only Florida's actually achieves this goal.

The school systems studied in the report - "Testing High Stakes Tests: Can We Believe the Results of Accountability Tests?" - together represent nearly one-tenth of the nation's total public-school enrollment.

These systems' high-stakes testing programs - in particular, Virginia's Standards of Learning tests - provide accurate yearly snapshots of how well schools are teaching students what they need to learn, according to the study. But these snapshots cannot confidently be compared over time to say that student achievement is improving or declining, the report finds.

Critics of Virginia's SOL tests have repeatedly questioned whether the state's tests actually measure what is most important for students to learn. The tests, critics say, emphasize rote memorization at the expense of important learning skills and concepts.

Mark Christie, president of the Virginia Board of Education, took the study's conclusions as an endorsement of the SOL testing program.

"Overall, the study supports our basic approach in Virginia, while illustrating the need to make some modifications to our testing program that we were already planning to make under the No Child Left Behind law," Christie said.

Jay P. Greene, a senior fellow at the institute and lead author of the report, compared high-stakes standardized tests with their "low-stakes" counterparts.

High-stakes tests have significant consequences attached to them. In Virginia, SOL test results now determine a school's accreditation rating and eventually will decide whether a school keeps its accreditation. Beginning with this year's high school juniors, the tests also determine whether a student can graduate.

Low-stakes tests do not have clearly defined consequences and, therefore, do not tempt test administrators to distort or manipulate test results - something that has happened in isolated instances with high-stakes tests.

Florida and Virginia were singled out because they were the only states that administer both high- and low-stakes tests to roughly the same students.

Virginia gives the high-stakes SOL tests in third, fifth and eighth grades and at the end of high school courses. The state's students take a national test, the Stanford 9, in fourth, sixth and ninth grades.

Florida's state tests, given annually, look at gains made by individual students year-to-year, the report says, in contrast to the programs in Virginia and the other school systems that were studied.

Christie noted that Florida's annual state tests in math and reading in grades three through 10 allow that state to track individual students' scores. Virginia will be able to do this once it starts annual SOL testing in reading and math in grades three through eight, which it plans to do by 2005-06 to comply with No Child Left Behind.

The study says that results of Florida's state tests and the Stanford 9 show similar gains and similar levels of achievement. But while Virginia's SOLs and the Stanford 9 yield similar results in any given year, the scoring gains over time on the two tests show only a weak correlation.

Virginia education officials have used Stanford 9 test results to bolster their claims that the SOLs are raising state students' achievement levels.

The report concludes that "well-designed high-stakes accountability systems can and do produce reliable measures of student progress, as they appear to have done in Florida, but we can have less confidence that other states' high-stakes tests are as well designed and administered as Florida's."

Bob Schaeffer, public education director for the Massachusetts-based National Center for Fair and Open Testing, questioned the use of the Stanford 9 for comparison with the SOLs or the Florida test.

"The [Stanford 9] is so similar to the SOLs in structure, it doesn't really give you an independent check on the Virginia test," Schaeffer said.

He said the National Assessment of Educational Progress, given every four years in several subjects, would provide a more valid comparison. But the Manhattan Institute report says the infrequency of NAEP tests keeps them from being a valuable comparison.

©2003 Richmond Times-Dispatch