On Data, Part One: Responding to Data-Driven Instruction
Basing decisions on data is not the same as basing them on knowledge.
- Jeff Henig, Guest Post for Rick Hess Straight Up
"Please explain why it's so important for data to drive the decisions of educators," directed a principal during a recent interview I had for a teaching job.
A red flag went up in my head. My experience with people who like to talk about data-driven decision making in education has not been a positive one. I gathered myself and tried to address the question as sincerely as possible.
I said (as best I can remember):
I think good teachers have always used evidence of student understanding to guide their instruction. The evidence I find myself using most often to inform my instruction includes the looks on my students' faces when I ask them a question, the ideas they write about in essays and reflections, the conversations I hear them have with each other about my content, and the answers they give to the questions, "How do you know?" and "Why do you think that?" This kind of evidence often betrays what's really going on in their heads; allows me to see what information, concepts, or habits of mind they might be missing; and helps me consider ways forward.
I think it's important for me to say, however, that I've become wary of the way "data" is often used in schools. It's not that I think there's anything wrong with using data per se; it's just been my experience that a lot of schools have recently become so caught up in the accumulation of numbers and data that they've forgotten about teaching and learning.
The data I've been asked to use in the past has too often been limited to numbers derived from attendance, behavior logs, and both standardized and non-standardized tests. While I think these can be useful, I find them far less useful than sitting down with a student and asking what they thought about the tasks they were asked to complete on an assessment.
The further we are from a student's decision-making process when we attempt to "analyze data," and the less sure we are about the quality of the tools we use to create that data, the less likely we are to draw conclusions that will allow us to adjust our instruction in meaningful ways. Knowing that forty percent of the school scored proficient on the state reading tests is less useful than knowing that fifty-five percent of my students scored proficient on it, which is, in turn, less useful than knowing that a particular student answered ninety percent of the questions correctly, which is, in turn, less useful than knowing what that particular student thought about the items s/he answered, both correctly and incorrectly.
Using quantitative data can be useful, but only when we can look under its hood for the reasons it emerged as it did, and not when we're using it merely as a means of demonstrating accountability.

I wanted to let the principal know that I am not willing to play games with data the way I've been expected to by administrators who have harped on data-driven instruction in the past.
In a series of posts inspired by one of Jeff Henig's recent guest posts for Rick Hess Straight Up, I'd like to explore the concept of data-driven instruction over the next few days: why it has so suddenly become all the rage in low-achieving schools, and what its promises and pitfalls are. I intend to do this by relating my own experiences with data-driven instruction, exploring the culture and systems that gave rise to it, and soliciting feedback from other educators with similar and divergent perspectives.
Prior to delving into data-driven instruction, it might be useful to read what New Leaders for New Schools (a corporate-reform-minded organization that trains principals) thinks about it here and something Larry Cuban wrote on it here. I recently wrote something of an allegory attempting to describe my experience with data here. Over a year ago I wrote a long-winded piece on data here. And lastly, Mr N posted on my blog in May on TFA's data emphasis here.
The misuse of data is at the heart of our misguided attempts to reform public education. Understanding the way it's being used is critical to understanding the problems it's created.
I agree heartily with this concept: "Knowing that forty percent of the school scored proficient on the state reading tests is less useful than knowing that fifty-five percent of my students scored proficient on it, which is, in turn, less useful than knowing that a particular student answered ninety percent of the questions correctly, which is, in turn, less useful than knowing what that particular student thought about the items s/he answered, both correctly and incorrectly."
Well said!
Great answer, James.
If I may say so, principals who ask questions that way sound scared and cowardly. My lord, what has happened to some of these people?
But did you get the job?
Thanks for writing. I don't have the courage.