The Testing Iceberg

If you’re looking for a controversy in K-12 education, look no further than standardized testing. We loathe these tests for their high stakes, the anxiety they induce in our children, and the billions of dollars spent on tutors and programs to prepare for them. Yet standardized tests offer quick results and insight to guide the increasing micromanagement of our children’s education.

Examples of this test-prep culture cover the hallways in many DC Public and Charter Schools. Posters showcasing the District’s mandated exam, currently the DC CAS, highlight practice sessions, study-prep rallies, and exam reporting dates. Schools know how crucial preparation is for these exams and why it is important to celebrate a culture of high expectations.

Our understanding of standardized testing in K-12 schools became clearer last month when Teach Plus released its report, The Student and the Stopwatch. The report calculated the amount of time students spend actually taking state- and district-mandated tests in English Language Arts and math in urban and suburban schools. It found that urban school districts spend considerably more time on testing than their suburban counterparts.

Two issues the report didn’t address, however, are: 1) does the amount of time students spend taking tests reflect the real time dedicated to testing, and 2) why do urban schools spend 24% more time on testing than their suburban neighbors?

Real Time Spent on Testing
The report concluded that across the 12 urban districts it evaluated, students spend 1.7% of their time taking tests. So why does it feel like 50% of education conversations are related to testing when it occupies less than 2% of students’ time? That’s because this finding reveals only the tip of the time-on-testing iceberg.

Beneath the water, where the great majority of the iceberg is hidden, lies the amount of time students spend preparing for these exams. When exams carry such high stakes, as they do in Washington, DC, and many other cities, extensive preparation time becomes imperative.

A study by the American Federation of Teachers, reported in The Washington Post, found that test prep and testing absorbed 19 full school days in one district and a month and a half in another. Although the Teach Plus report frames the time spent on standardized testing as quite low, the amount of time dedicated to preparing for these exams has grown into a major chunk of the teaching and learning experience, especially in many of the urban districts highlighted in the report.

Urban vs. Suburban
So why do suburban schools spend considerably less time on testing than urban schools? After decades of being largely ignored by those beyond (and sometimes within) their immediate communities, underperforming urban schools are now the focal point of the education reform movement in the United States.

Major corporations, private foundations, and city leaders are heavily investing in urban schools. These stakeholders want quick, easy-to-understand, and tangible returns on their investments, and testing produces them. Therefore, testing, and the preparation culture it requires, is significantly more present in urban schools.

Regardless of the value we may place on testing, it’s important to understand that when calculating the total time students spend on testing, the amount of time they spend preparing cannot be overlooked. Evaluating this in both urban and suburban schools is a worthy study to pursue because it will provide a clearer picture of the true amount of time committed to testing in students’ education.


Thanks, Eric, for this analysis. It is important to further emphasize that many districts use somewhat similar testing tools to measure students’ gains. The question remains unanswered: does the DC CAS represent a measurable assessment for monitoring students’ progress? Obviously, the answer is no. Another point of interest that should have been developed in your analysis is the upcoming assessment mandated under the Common Core State Standards, called PARCC. Assessments alone cannot determine or assess college readiness. It is time to shift our priorities toward another framework for creating and implementing measurable assessments. It is especially important to realize that the data collected from the DC CAS, or from any other assessment out there, are no better than a teacher’s own data or assessments. What these assessments and data do not take into consideration are socio-economic and literacy factors. What we need is progress in addressing students’ needs in terms of literacy and work ethic, in order to support personalized/individual assessment of social behavior and academic enablers (i.e., “attitudes and behaviors that allow a student to participate in, and ultimately benefit from, academic instruction”; DiPerna and Elliott, 2002, p. 294). If we continue ignoring this reality, then the DC CAS, or any future assessment, will continue to reproduce the same unrealistic status quo!

I agree with your comments. Like many important, large-scale reform initiatives, the new CCSS assessment review process was pushed along at a fast pace. Because of this, some of the more important measures it should track, which you point out, will not be revealed by these assessments.
