
Drowning in Data but Thirsty for Analysis


by Melissa Roderick - 2012

This commentary frames the importance of the topic of this special issue by highlighting the changes that have occurred in school systems around data use, particularly in large urban districts, and the need for a more rigorous evidence base. Collectively, the articles in this volume provide a jumping-off point for such a research agenda around data use in schools. Each of the articles identifies significant gaps in our knowledge base and develops useful conceptual frameworks within which to think about the dimensions of data use, the quality of the research evidence, and the implications for the field.

Here then is our situation at the start of the twenty-first century. We have accumulated stupendous know-how. We have put it in the hands of some of the most highly trained, highly skilled, and hardworking people in our society. . . . Nonetheless, that know-how is often unmanageable. Avoidable failures are common and persistent, not to mention demoralizing and frustrating, across many fields, from medicine to finance, business to government. And the reason is increasingly evident: the volume and complexity of what we know has exceeded our individual ability to deliver its benefits correctly, safely, or reliably. Knowledge has both saved and burdened us. – Atul Gawande, The Checklist Manifesto, p. 13


In this special issue, Cynthia Coburn and Andrea Bueschel bring together a diverse set of articles around one of the most central reform ideas in contemporary public school policy and practice (Turner & Coburn, 2012, this issue). That strategy is not charters, or small schools, or standards, or accountability, or any of the other reform platforms that have shaped American education over the last several decades. It is, simply, data. The articles in this volume offer a range of phrases to capture where we find ourselves. Julie Marsh (2012, this issue) writes that the K-12 education landscape is "lush with data." Jennifer Jennings (2012, this issue) notes that the Duncan administration has described a focus on data as the driving force behind its new educational reform strategy. Major foundations such as the Bill and Melinda Gates Foundation and the Dell Foundation are making substantial investments in data as a strategy for school improvement, funding an array of organizations like the Data Quality Campaign. Together, building data systems and improving data quality make up one of the four core educational reform areas identified in the Race to the Top program. Many states, moreover, are engaged in building new assessments to align with the Common Core standards.


The authors of these articles are not asking whether the growing focus on data is a good reform strategy. There is an unstated assumption, perhaps justified, that the question of whether we should or should not invest in data is moot. In this new technological century, shutting off the supply of data to schools would be nearly impossible. Instead, the questions that these articles pose are: How do we identify the interventions that will best promote data use? What do we know about creating conditions that promote the effective use of data? Each article looks at these unifying questions from a different perspective. What emerges, however, is a useful list. These articles suggest that promoting data use is about (1) designing assessments or indicators that give the right feedback to teachers (Supovitz, 2012, this issue); (2) developing accountability systems that incentivize the right behavior (Jennings, 2012, this issue); (3) designing interventions that build the capacity of adults to translate analysis of data into changes in practice (Marsh, 2012, this issue); (4) creating organizational environments that promote and support effective data use (Daly, 2012, this issue; Marsh, 2012, this issue); and (5) attending to the broader political context within which data use initiatives unfold (Henig, 2012, this issue). In each article, the authors develop useful conceptual frameworks within which to think about the dimensions of data use, the quality of the research evidence, and the implications for the field.


Ultimately, the general consensus of the authors of these articles is that the research base is very weak. The lack of clarity in findings is well illustrated in Julie Marsh's (2012, this issue) literature review, "Interventions Promoting Educators' Use of Data." Marsh makes a heroic attempt to make sense of this messy field, and her well-organized review demonstrates how difficult it is to make definitive statements about what we know. As Turner and Coburn (2012, this issue) summarize in their introduction to this issue on interventions to promote data use, "Despite the proliferation of data use interventions and the growing public attention to and enthusiasm for the idea of data use, there is shockingly little empirical research on these approaches."


In the remainder of this commentary, I highlight the importance of this topic, particularly in large districts, and then return to the preceding list to discuss what I view as the significant contributions of this special issue.


FROM DATA DESERT TO DATA DELUGE


In "Interventions to Promote Data Use: An Introduction," Turner and Coburn (2012, this issue) characterize the current state of the research base as nascent. This is not surprising. I come at this topic from the perspective of a large urban school system and as a researcher at the Consortium on Chicago School Research (CCSR). I've spent most of my career at CCSR promoting the use of research and data in schools. Yet, I am astounded by the fast pace of change in data use and the impact of that change on teachers, principals, and school administrators. I can't think of another example in education where practice has changed as fast as the use of data. The Internet, software, and other technological advances have made getting, assembling, analyzing, and disseminating data cheap and easy. This has transformed educators' work environment and districts' approaches to reform. When the first computers were introduced into classrooms and schools, I don't think many would have imagined that the most significant impact of these technologies would be to transform the way principals interact with their staff around instruction, the way educators work together in schools, what they use as evidence, and ultimately the skills required of teachers and administrators.


To use an example from Chicago, in a very short period of time, we have gone from a school system in which high school principals had little access to data in their building to a system in which principals are inundated. In 2006, a group of high school principals approached me about forming a group of reform-minded principals to work together around problems and to get access to CCSR research and researchers, data on their own schools' performance, and assistance in developing strategies for improvement. The group that eventually formed is called the Network for College Success. A high priority for these principals was getting access to information on their own schools' performance because the CPS data system was not set up for schools to be able to analyze their own data on a regular basis. One principal, after hearing a presentation on CCSR's research on ninth grade and the development and use of the on-track indicator, contacted me on a Sunday afternoon soon after the first meeting with questions about how to calculate a ninth-grade on-track rate. She and her vice principal had hand-entered the grades of her entire freshman class into an Excel spreadsheet because at that time, CPS high school principals had access only to a student's transcript, not the grades of the entire class. Principals were also unable to compare a student's performance in eighth and ninth grades because elementary school grades and attendance were recorded on paper life cards and not electronically.
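To make concrete the calculation that principal was doing by hand, here is a minimal sketch in Python of a ninth-grade on-track rate. It assumes CCSR's published definition of the on-track indicator (a freshman is on track if he or she earns at least five full-year course credits and has no more than one semester F in a core course); the file name and column names are hypothetical.

```python
import pandas as pd

# Hypothetical export: one row per student per semester course.
# Columns: student_id, course, core (bool), grade, credit (0.5 per
# passing semester course).
grades = pd.read_csv("freshman_grades.csv")

def is_on_track(student: pd.DataFrame) -> bool:
    """On track under the CCSR definition: at least five full-year
    credits earned and no more than one semester F in a core course."""
    credits_earned = student["credit"].sum()
    core_semester_fs = ((student["grade"] == "F") & student["core"]).sum()
    return credits_earned >= 5.0 and core_semester_fs <= 1

on_track = grades.groupby("student_id").apply(is_on_track)
print(f"Ninth-grade on-track rate: {on_track.mean():.1%}")
```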


By comparison, during recent planning for a retreat in a Network high school, the school-based team requested data to look at students' attendance and grades across the transition to high school. The data analyst downloaded his freshman watch list, which included students' grades and test scores in eighth grade, and merged it with data from the freshman success report, which tracks those same students' ninth-grade grades, attendance, and test scores. In a very short time, the analyst could compare changes in attendance and grades between the end of eighth grade and the end of the first semester of high school and see how these changes differed by achievement, gender, or other characteristics.
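The analyst's task is, at bottom, a merge and a grouped comparison. The sketch below illustrates the idea in Python with pandas; the file names and column names are hypothetical stand-ins for the watch list and success report described above.

```python
import pandas as pd

# Hypothetical exports of the two reports described above.
watch = pd.read_csv("freshman_watch_list.csv")        # student_id, gender, gpa_8, attend_8, test_8
success = pd.read_csv("freshman_success_report.csv")  # student_id, gpa_9_s1, attend_9_s1

merged = watch.merge(success, on="student_id", how="inner")

# Change from the end of eighth grade to the end of the first
# semester of ninth grade.
merged["gpa_change"] = merged["gpa_9_s1"] - merged["gpa_8"]
merged["attend_change"] = merged["attend_9_s1"] - merged["attend_8"]

# Break the changes out by student characteristics, e.g., gender or
# eighth-grade achievement quartile.
merged["achievement_q"] = pd.qcut(merged["test_8"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
print(merged.groupby("gender")[["gpa_change", "attend_change"]].mean())
print(merged.groupby("achievement_q", observed=True)[["gpa_change", "attend_change"]].mean())
```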


There are numerous examples like this one in which the barriers to data access have been removed. Sophisticated graphing programs allow for visually exciting displays. iPads and other hand-held devices allow educators to take notes during observations and walk-throughs, assemble them, and have a product completed within minutes. Online surveys are easy to post and allow for aggregation of data in real time. Data dashboards give principals and administrators easy access to a range of summary data on their performance. And in the classroom, computerized tests allow for interim assessments that provide quick results.


Perhaps as a result, in a short period of time, the problem with access to data has shifted from not having enough data to having too much. And, as is well documented in these articles, the results are quite uneven, and the implications unclear. Are all these data actually supporting instruction and school practice? And what are the conditions under which they do? In my 20 years working in school reform in Chicago high schools, I have seen principals transform school and student outcomes through the effective use of data. I have seen examples of data use where all levels of the school system were working toward the same end and where data reports came from the central office to schools in ways that changed behavior and transformed outcomes. I have seen instances of this, but they are pretty rare.


What is more common, particularly in recent years, is watching principals and teachers struggle to make sense of the deluge of information and data they face daily: incomprehensible performance management decks, data dashboards, packaged test and survey reports, all in three colors with beautiful graphs but little guidance, and school report cards filled with trends on 20 different indicators that don't seem to provide any insight beyond whether a school is red, yellow, or green. I have watched high school teams struggle with the proliferation of formative and interim assessments, all of which are supposed to be aligned with performance on the accountability test; however, very few provide teachers with meaningful feedback on their teaching, on why students are struggling or succeeding, or on what they should do differently. These days in Chicago, schools are frequently overwhelmed with data. One principal, contemplating joining the principals' professional network we were forming, commented, "Whatever you do, just please don't give me a PIN or login ID." Anyone working in urban schools can't help but ask the question: Is this data deluge making a difference?


ASK THE RIGHT QUESTIONS, FOCUS ON THE RIGHT OUTCOMES, AND PROVIDE USEFUL FRAMEWORKS: A RESEARCH AGENDA


If we are drowning in data, how could more research possibly be helpful? What is clear in reading this set of articles is that although the technical barriers to data use have been removed, how to use data to maximize its benefits remains an open question. Although we have the topic list of things to worry about (e.g., whether assessments are effective in informing teacher practice, whether accountability incentivizes the right behavior), we do not yet have the evidence that could guide a teacher, a principal, or a district in deciding how to choose from and implement data processes in ways that effectively improve instruction.


Moving the evidence base ahead on this issue is particularly important over the next several years given the amount of money the federal government has invested in the development of data systems. Improving the quality of assessments is especially important given the time and effort that states, districts, administrators, and teachers are putting into aligning with the Common Core standards. Will the Common Core be implemented in a way that makes teachers once again feel embattled, judged, and constrained, while students, particularly the most vulnerable, lose whole swaths of the school year to test prep? Or will the Common Core become an effective approach that (1) informs teachers' work, (2) builds teachers' capacity, and (3) makes data, assessment, and standards part of the toolkit of good schools and good teachers? From my reading of these articles, the answers to these questions will in many ways depend on that list: how the assessment system is designed, how the intervention supports capacity building, how the organizational environment supports teachers and administrators in using data effectively, and so forth.


The articles in this set make two important contributions. First, they provide a useful jumping-off point for a research agenda. Each of the articles poses critical questions that are broad but detailed enough to wrap a research agenda and discussion around. Collectively, these reviews identify significant gaps in our knowledge base and provide usable frameworks within which to begin thinking about how to promote effective data use. Second, the editors of this volume set the bar high by focusing on outcomes and processes as much as on the design of the intervention. They ask: What is the link between data use, teacher response, and student outcomes? What are the processes by which a data intervention changes teacher practice, and how does the change in teacher practice change student outcomes? These are certainly the right questions, but they are challenging. I was impressed by each author's commitment to grappling with the evidence and asking tough questions.


In my read, answering these challenging questions may require a very different set of studies and research designs than have been used in this field to date. Turner and Coburn (2012, this issue), in their review, conclude that we need a new generation of more analytic, high-quality, and theoretically informed studies. Each of these articles demonstrates that shift. For example, in a field that has been largely qualitative, Julie Marsh (2012, this issue) suggests using a randomized experiment that would assign teachers to different treatments in order to understand how teachers' data use changes under different conditions. Jennifer Jennings (2012, this issue) echoes this theme and calls for research examining how the specific designs of accountability systems affect teacher behavior. Thus, both authors conclude that understanding how teachers respond under different conditions, whether those conditions are the accountability system, designed interventions, or the nature of the assessment system, is critical to supporting data use. My interpretation of the literature review, however, is that this is an area where the current research falls short and where disconnected studies lead to a long list of contextual influences that provide little direction. Several of the articles express equal frustration at how little attention has been paid to connecting the dots between data use, instructional change, and student learning. Filling in these gaps will require, as Turner and Coburn argue, much larger, more theoretically based, and methodologically tight studies that would take research in this area in a new direction.


HOW DO WE GET EDUCATORS TO USE DATA EFFECTIVELY? THREE POTENTIALLY CRITICAL ELEMENTS


In my day-to-day interactions with teachers, principals, and leadership in the Chicago Public Schools, the most frequent request I get is for data that will inform instruction. The problem is not that teachers and school administrators don't want to improve. So what are the problems? The articles in this special issue provide some compelling clues.


First, the articles suggest that one potential reason that teachers struggle with using data to inform instruction is that the interim, formative, and summative assessments we are using are not designed to inform instruction. There are similar problems with the way data are reported back to teachers. Jonathan Supovitz (2012, this issue) makes a compelling case that if we want data to inform instruction, we have to design assessments that shift the emphasis from a test of a student's performance to an evaluation of his or her understanding. He argues that most interim assessments are not currently designed for that purpose. They simply provide interim measures of what students can and cannot do, a list of what topics or skills to reteach. Proponents of the Common Core argue that for students to reach the standards, many teachers will have to teach differently. There is no disagreement that the Common Core standards place a new emphasis on demonstrating understanding and analysis (Porter, McMaken, Hwang, & Yang, 2011). But designing formative assessments under the Common Core that are merely miniature versions of the summative assessment will simply tell teachers earlier what students don't understand. It will not give teachers insight into how or why students are struggling, nor will it build teachers' capacity to teach those skills. The first challenge, then, is to find ways to design assessments and their reporting so that they contribute to a teacher's analysis of her students' needs.


A second explanation, one touched on in this set of articles, is that teachers do not have a research base for understanding the problem or a set of strategies for how to respond: the "thanks, but now what?" problem (Marsh, 2012, this issue). Although it is important to investigate the link between data use, teachers' responses, and outcomes, we must also ask: Do we know how teachers should respond and which responses produce better effects? What I have learned at CCSR and in working in schools in Chicago is that practitioners will use data if they know they can influence the outcome and if they have effective strategies for doing so. If a group of students is not performing well on an assessment, what are effective response strategies? When educators don't have a rich toolkit, or when they don't believe they can affect the outcome, they fall back on test prep or rote reteaching because they don't perceive alternatives. There is thus a critical role for research not only in identifying the strategies and levers for improvement but also in providing evidence of the link between different responses and student outcomes.


A third important piece of the puzzle is that teachers and principals do not have access to the critical supports that would guide them in creating a culture of data use. This is perhaps the most important lesson I have learned from my work at CCSR. Educators need tremendous support, problem-solving processes, and structures to help them move from understanding the problem, to analyzing how it affects their day-to-day work, to planning and managing the response. This point is made convincingly in Alan Daly's article, "Data, Dyads, and Dynamics" (2012, this issue). Daly highlights the role of social relationships as central to the process by which data do or do not effectively change classroom practice. Marsh (2012, this issue) provides insight into ways that carefully constructed interventions can foster these relationships. She writes, "The [data use] process thrives when interventions ensure that data are easy to understand and usable; include norms and structures promoting the safety and confidentiality of data and data discussions; target multiple leverage points; and involve opportunities for cross-site (or level) collaboration."


The work I do with principals and their teams in the Network for College Success is deeply embedded in the relationships we've built over the last six years. In reading this set of articles, I naturally reflected on this work, which has become a quite successful experience of trying to make data useful. I was struck by Julie Marsh's summary of effective processes because it captured so much of what we have learned in our practice. The work was not giving the principals, counselors, and teachers the data. It was making the data understandable, usable, and relevant to the central problems they face. It was grappling with their questions, posing questions back, and always presenting the research evidence about effective ways to respond to the data. It was bringing educators in like roles across schools together to identify and solve common problems and, using data, to develop strategies for improvement. Even then, it took multiple experiences looking at data before the principals in the group could easily begin to analyze the data, talk about how the problems played out in their buildings, and identify strategies. It took time for educators to stop seeing every piece of data about their school as a judgment on the quality of their work. And it took even more time for educators to begin making the data live in their schools by supporting teams in ongoing monitoring and data use.


Thus, in the end, for data use to be effective, we need to go beyond just providing data. We need a better, more robust theory of action and a strong evidence base for understanding what data schools need to address different problems and how data can be used effectively at various organizational levels and under what conditions. The articles in this issue are an important first step.


References


Daly, A. J. (2012). Data, dyads, and dynamics: Exploring data use and social networks in educational improvement. Teachers College Record, 114(11).


Gawande, A. (2009). The checklist manifesto: How to get things right. New York, NY: Holt and Company.


Henig, J. R. (2012). The politics of data use. Teachers College Record, 114(11).


Jennings, J. L. (2012). The effects of accountability system design on teachers' use of test score data. Teachers College Record, 114(11).


Marsh, J. A. (2012). Interventions promoting educators' use of data: Research insights and gaps. Teachers College Record, 114(11).


Porter, A., McMaken, J., Hwang, J., & Yang, R. (2011). Common Core standards: The new U.S. intended curriculum. Educational Researcher, 40, 103–115.


Supovitz, J. A. (2012). Getting at student understanding: The key to teachers' use of test data. Teachers College Record, 114(11).


Turner, E. O., & Coburn, C. E. (2012). Interventions to promote data use: An introduction. Teachers College Record, 114(11).





About the Author
  • Melissa Roderick
    University of Chicago
    MELISSA RODERICK is the Hermon Dunlap Smith Professor at the School of Social Service Administration at the University of Chicago and a codirector at the Consortium on Chicago School Research, where she leads CCSR's research on postsecondary education. Professor Roderick is also codirector of the Network for College Success, a network of high school principals and their teams focused on developing high-quality leadership and student performance in Chicago's high schools. Professor Roderick is an expert in urban school reform, high school reform, high-stakes testing, minority adolescent development, and school transitions. Recent publications include: Farrington, C., Roderick, M., Allensworth, E., & Nagaoka, J. (2012). Teaching adolescents to become learners: The role of non-cognitive factors in shaping school performance, a critical literature review. Chicago, IL: Consortium on Chicago School Research; and Roderick, M., Nagaoka, J., & Coca, V. (2011). Potholes on the road to college: High school effects in shaping urban students' participation in college application, search and enrollment. Sociology of Education, 84, 178–211.
 