By UELIP Associate: Angel Gonzales
Having spent some time researching and reporting on state-level data for the Thomas B. Fordham Institute, an education think tank, I believed I understood the discourse around accountability pretty well. Students either hit proficiency levels or show growth through value-added measures. These data give analysts a good idea of how a school is performing and a general idea of how school districts are functioning. At the state and national levels, government officials and policy institutions use data to advance their ideas about how children should be taught. This is all well and good, but I kept asking myself, “How do districts conceptualize data?” In the month I have spent at DCPS as a UELIP, I have come to understand how the unique experiences of students and schools make the district’s work with data much more fluid.
In my role as a UELIP, I work with data specialists under the Office of the Chief of Schools on two projects that use data as a tool for improving school performance. Jessica Morris has tasked me with reporting on the data delivery systems for English language learners and dual language students, which will give DCPS a clearer sense of what relevant information to provide to superintendents and principals. Sarah Lee has me researching the metrics other districts use to hold alternative schools accountable, which can then be suggested to the Office of the State Superintendent of Education (OSSE), our state agency. Alternative schools in DCPS serve students who have dropped out and re-enrolled in high school.
What have I learned from my research so far? In short, data is messier than I thought. The accountability benchmarks that define a school’s performance do not always take everything into account. Take, for example, Luke C. Moore High School. If you look strictly at student performance, you see some very upsetting statistics. Very few of the school’s students reach “proficient” on the DC CAS, and it is labeled “Priority,” meaning it needs intense intervention to improve its poor performance. Luke C. Moore High School, however, is an alternative school, with 71% of its student population between the ages of 18 and 24. Labeling this school “Priority” glosses over its unique circumstances: its students are piecing together their academic careers after a prolonged disruption.
Data can tell us more about what is happening at alternative schools in DCPS. Graduation rates show how many students are accomplishing their goal of earning a high school diploma (far less likely for students who have been separated from school). Year-over-year growth in student attendance may show how a school like Luke C. Moore keeps highly mobile students in the classroom. These statistics may tell us more about the quality of an alternative school than a proficiency rate on the DC CAS. Luke C. Moore, however, is still compared at the state level against schools whose students have not shared the same experiences, perpetuating the assumption that it is a poor-performing school. District staff like Sarah have taken notice of this, creating an opportunity for me to explore better ways to determine how a school like Luke C. Moore is performing. More importantly, the disconnect between what the district and the state see in this school has led to some constructive discussion about the data being tracked for alternative schools.
Part of the reason Sarah has worked to redefine accountability structures for alternative schools is proximity. Because she collaborates with schools on a daily basis, she can better understand how a school is served by the benchmarks the district and state ask of it. In the best circumstances, this understanding works its way back up to create better accountability tools at the district, state, and national levels, as in the work Sarah and I are currently doing on alternative schools.
Our work is still in progress, but it has already unpacked a dimension of data that I was not privy to in my previous work. Instead of seeing data as rigid markers of school ability, I am beginning to see the collection and interpretation of data as a malleable process that should respond to local context.