A diary of the self-absorbed...

Friday, April 8, 2016

Questions of Reading and Writing Proficiency

It’s not often that I find myself agreeing with David Coffey on education, but in his April 7 “spotlight” submission, he raises what probably should be a given in education: strong leadership produces results. I am not sure we need $100 million or a slate of longitudinal studies to demonstrate this, as it is “conventional wisdom,” but filling in the knowledge gaps with evidence is hardly ever a bad thing.

That’s why I went and looked up the test scores for Quitman County Elementary School. Assuming the scores were accurately presented on the school report card website (and in today’s world that tends to be a scary assumption to make), I found that test scores did indeed rise significantly with the implementation of the reading initiatives described in the Wall Street Journal op-ed submitted by Richard Grant and referenced at length by Mr. Coffey. (Incidentally, scores have declined since then, which would make for an interesting value-added discussion, but we can save that for another day!)

What struck my fancy in the Quitman raw data was a fascinating comparison of the language arts scores to those of their kindred cousin: writing proficiency. Fourth grade language arts scores did in fact reveal that only 18% of the student population was proficient in 2010, and only 22% the following year, in 2011. No one questions that this is an abysmal failure in either education or motivation, or perhaps both.

But what about writing? Fourth grade writing scores in 2010 for Quitman County Elementary stood at 79% proficient, and in 2011 the school recorded a whopping 93%.

The data leaves us with what I will refer to as a “first things first” question. Simply put, as adults interested in the education of our children, we need to ask: how does a cohort of students that is 93% proficient in writing see only 1 in every 5 of those exact same students (22%) pass a language arts test?

Here are the possible answers to that question, and before we take even the first step toward upending education or spending bucket-loads of money on solutions, we need to determine which answer is correct:

#1  Some students can write very well, but still can’t read for comprehension. We need look no further than Lewis Carroll and his nonsense poem “Jabberwocky” to see how writing can be fluent and well formed while defying comprehension.

#2  The language arts test was either flawed or scored with a bad rubric. If the students truly can write proficiently, isn’t this possible? Perhaps the reading portions were poorly constructed or graded much more harshly than deserved.

#3  The writing test was either flawed or scored with a bad rubric. Perhaps the students did not write nearly as “proficiently” as the test let on. It could be that their writing submissions were graded on some kind of curve.

#4  Students were less interested in the language arts test and, in their utter boredom, started choosing random answers just to be done faster. (This was me in school.)

I don’t know which of the above, or what combination of the above, is the correct answer. What I do know is that we owe it to ourselves and to our children to find out before we begin talking about “solutions” to problems that we do not fully understand.

If the rubric of either test was flawed, then that is information parents and school systems should be made aware of immediately. If our standards are too high for reading comprehension or too low for writing proficiency, we owe it to ourselves to know which of these is true.

I am probably one of the few parents who actually took the sample portions of our state’s elementary reading tests, and I can wholeheartedly say that if I were one of the students who passed writing but not language arts, it would be 100% due to answer #4. The sample test that I took was ridiculously tedious and boring, so much so that I found the sixth grade sample practically unreadable. I doubt my scores as a college-educated adult would be all that much better than those of my sixth-grade counterpart who sat marveling over a spider web in the window when faced with monotony in the classroom.

There may be answers other than the four I listed here. There may also be teams of “experts” out there who actually know the answer outright and can respond without batting an eye. Nevertheless, until we have had conversations such as the one I am suggesting here (and many more like them… let’s do math next!), we can claim neither clarity nor sincerity with regard to what will and won’t work in education.

No one denies that we can raise test scores, and there are many ways to do that. Many of these “methods” have significant trade-offs (just ask South Korea and look at its youth suicide rate). At issue, though, is whether we’ve actually educated our children in the process of raising their test scores, and there aren’t many ways to measure that. But we can start by better educating ourselves. We need to determine why roughly four times as many kids at Quitman County Elementary can apparently write proficiently as can read (93% versus 22% in 2011).

We need to decide whether we are bringing the right ruler to the desk and whether we are measuring the right things, because the raw data here suggests that maybe we aren’t.