There is a subset of the population out there that adores standardized tests. It is hardly a homogeneous group. Some like them because they did well on them as kids, turned out successful, and conclude that standardized tests must be good at predicting future success. While a single example doesn’t make a compelling argument, there is some logic to it. People who endorse private or charter schools love standardized tests because they can consistently point to how poor a job public schools do … provided they never look at changes in test data from freshman to senior year. Others simply see no better way to measure student accomplishment.
However, this all hinges on at least one point: can we trust these tests? Clearly, if the answer is “no”, then the entire argument for their use becomes moot. Having delved into arguments with some of these people, I can tell that some of them have never looked in depth at standardized tests before.
I have. There are lots of problems with them, and rather than getting better, these problems seem, at best, to be staying the same. I was, thus, drawn to this Washington Times article (which originally appeared in the Huffington Post) that really says something about what goes on with standardized tests.
Sara Holbrook is an author who specializes in young adult poetry. Two of Holbrook’s poems (“A Real Case” and “Midnight”) ended up being used as part of the reading assessment on the Texas STAAR test. A teacher (who may now be under investigation for revealing test questions) wrote to her about one of the poems and asked how she might answer a question posed on the test.
As you might guess, Holbrook was unable to answer questions analyzing her own poetry. If you read the article, she gives great arguments as to why this happened, but what it comes down to is that the test writers simply do not understand what they are writing about. Keep in mind, at least with testing giant Pearson, question writers need not even have a college degree. And as for grading the tests … according to that same article, a college degree is required, but the want ad was posted on Craigslist.
In short: the writers of these all-important tests may have no college degree and no meaningful experience in the subject they are writing questions for, and even when they do, they may not understand the subject well enough to write questions that actually measure what they purport to measure.
Keep that in mind the next time these all-important scores come out and describe a school.