
Using Data to Alter Instructional Practice: The Mediating Role of Coaches and Professional Learning Communities

by Julie A. Marsh, Melanie Bertrand & Alice Huguet — 2015

Background: Despite increased access to student learning data, scholars have demonstrated that teachers do not always know how to use these data in ways that lead to deep changes in instruction and often lack skills and knowledge to interpret results and develop solutions. In response, administrators have invested in instructional coaches, data coaches, and professional learning communities (PLCs) to support teachers in this process. Despite their popularity, there is limited research on the ways in which coaches and PLCs mediate teachers’ use of data and the various types of expertise brought to bear on this process.

Purpose: This exploratory study examined how working with a coach or PLC shaped teachers’ responses to data in six middle schools and the factors that influenced the activities and effects of coaches and PLCs. Our intent was to deeply examine processes and identify key constructs and relationships to guide future research and practice.

Research Design: Our research involved a year-long comparative case study of six low-performing middle schools in four districts that supported teacher data use via literacy coaches, data coaches, or PLCs. We draw on cultural historical activity theory and data from 92 interviews, 6 focus groups, 20 observations of meetings, and monthly surveys of case study teachers (15), coaches (4), and PLC lead teachers (2).

Findings: We found that coaches and PLCs played important roles in mediating teachers’ responses to data and were often associated with instances in which teachers used data to alter their instructional delivery (as opposed to surface-level changes in materials and topics). Further, the dynamic relationship between vertical expertise (an individual’s knowledge and skills) and horizontal expertise (knowledge that is co-created through interactions and movement across contexts) may help explain the ways in which PLCs and coaches facilitated deeper level changes in pedagogy. Finally, dialogue was a central mediating practice, and school leadership and the district-level context shaped the possibility for change.

Conclusions: Our research adds conceptual clarity to what types of expertise may be needed to ensure that teachers respond productively to data. The study suggests that administrators should consider multiple facets of expertise when designing interventions, recruiting coaches, assembling PLCs, and developing professional development for coaches and teacher leaders. The centrality of dialogue also suggests the need for policies and structures allowing for uninterrupted time for educators to collectively reflect on data.

Recent scholarship points to the potential of data-driven decision making to change classroom instruction (Campbell & Levin, 2009; Herman, Wardrip, Hall, & Chimino, 2012; McNaughton, Lai, & Hsiao, 2012; Nelson, Slavit, & Deuel, 2012). These studies show that, in some instances, student learning data—often defined as student assessment results but inclusive of other forms, such as student work and behavioral data—can substantively inform and shape teachers’ practice. However, research also indicates that educators often use data in simplistic ways that do not significantly alter instruction (Ikemoto & Marsh, 2007; Marsh, Pane, & Hamilton, 2006; Nelson et al., 2012; Oláh, Lawrence, & Riggan, 2010)—a finding that conforms with prior literature on the resistance of pedagogy to structural reforms (Elmore, 1995). Teachers have access to voluminous amounts and types of data and may take steps to analyze them but often fail to respond in the classroom based on knowledge gained. In fact, scholars have demonstrated that teachers do not always know how to use data in ways that lead to deep changes in instruction and often lack skills and knowledge to formulate questions, interpret results, and develop solutions (Cosner, 2012; Gummer & Mandinach, 2015, this issue; Heritage, Kim, Vendlinski, & Herman, 2009; Marsh et al., 2006; Means, Chen, DeBarger, & Padilla, 2011; Oláh et al., 2010; Supovitz & Klein, 2003).

Many educators have recognized the need to better assist teachers in bridging the data–practice divide and have invested in “interventions” to assist in this process (for a full review, see Marsh, 2012). Three popular interventions are a literacy coach, data coach, and professional learning community or PLC (often called a data or inquiry team)—each with slightly different conceptions of the expertise needed to support teachers. Widespread throughout the United States, literacy coaches have become a central part of federal, state, and district literacy reforms. Defined as specially trained teachers with pedagogical and language arts content-area expertise, literacy coaches offer on-site and ongoing support for teachers, including assistance with data (Bean, Draper, Hall, Vandermolen, & Zigmond, 2010; Coburn & Woulfin, 2012; Rodgers & Rodgers, 2007). In theory, their expertise is believed to help teachers make sense of data and adjust their literacy practice in response.  In contrast, data analysis experts or coaches tend to focus less on content, in theory bringing data literacy expertise (e.g., how to draw sound inferences from data) to guide teachers with accessing, interpreting, and using data. According to a U.S. survey about technology supporting data use, half of districts have made available data analysis experts or coaches to at least some of their schools (Means, Padilla, & Gallagher, 2010). They may be school-based or shared among a set of schools and are sometimes associated with the central office or an intermediary organization (Lachat & Smith, 2005; Love, Stiles, Mundry, & DiRanna, 2008). Finally, PLCs are frequently associated with data-driven reform initiatives and can also take the name of inquiry groups or data teams (Nelson, Slavit, Perkins, & Hathorn, 2008; Vescio, Ross, & Adams, 2008). They typically involve collaborative work among peers, guided by a lead teacher or facilitator. 
In theory, PLCs are effective in influencing teachers’ thinking and practice because the discussions occur among trusted peers who may bring to the process diverse expertise and knowledge that enrich the conversations and analysis process.

Despite their popularity, there is limited research on the ways in which coaches and PLCs specifically mediate teachers’ use of data and the various types of expertise brought to bear on this process. In the spirit of Little’s (2012) call for research on the micro-processes of data use, this article focuses on a critical yet underexamined phase of data use: the actions taken in response to data. We examine how working with a coach or PLC mediated teachers’ responses to data in a sample of six middle schools and the factors that influenced the activities and effects of coaches and PLCs. Given the exploratory nature of our study, our intent is to deeply examine processes and identify key constructs and relationships to guide future research and practice.

We focus particular attention on instances in which teachers report using data to change the delivery of their instruction—reorganizing how students acquire information or skills. The national discourse around data-driven decision making in education frequently touts the benefits of data for changing teachers’ practice and assumes that improved student achievement will occur if teachers use data to reflect on and alter instruction. U.S. Secretary of Education Arne Duncan, for example, asserted, “Good data promotes transparency and accountability. It shows the public the value that they’re getting in their investment in education. It gives teachers information they need to change their practices to improve student achievement” (2010). District and school leaders in our case study districts also frequently mentioned the goals of using data to “modify instruction” and “change your instructional program.” Yet, as noted, studies have found that teachers often respond to data by reteaching content, identifying particular students for out-of-class support, and making other procedural changes to practice (Goertz, Nabors Oláh, & Riggan, 2009; Oláh et al., 2010). Still others have documented teachers responding to test results by teaching test-taking skills or engaging in “gaming” practices (e.g., focusing on the “bubble kids”), which may yield short-term improvements in achievement but not necessarily deeper learning as intended by those promoting assessment- and data-driven reform (Booher-Jennings, 2005; Hamilton et al., 2007; Jennings, 2012; Marsh et al., 2006; Pedulla et al., 2003). As such, we believe it is critical to investigate cases in which teachers use data to adjust their pedagogy and understand the supports and conditions facilitating these actions.

In the end, we, like other scholars, found that teachers often responded to student learning data with surface-level changes to instruction. However, we focus on the less common response of change in instructional delivery, finding that coaches and PLCs played significant mediating roles—drawing on both vertical expertise (an individual’s knowledge and skills) and horizontal expertise (knowledge that is co-created through interactions and movement across contexts) to help teachers adjust instruction based on data. In fact, when both forms of expertise were strong, we were more likely to observe these types of changes in practice. We further found that dialogue—particularly dialogue focused on both data and instruction—was a central mediating practice and that school leadership and the district-level context shaped these interactions.

These findings have significant implications for education policy and practice. The current policy context—with evolving state accountability systems and new college and career ready standards—places high demands on teachers to adapt and improve their practice. Knowing how to facilitate instructional change is critical for realizing the goals of these new policies. Evidence on efforts to support teachers in rethinking and improving their teaching is essential for ensuring positive outcomes for all students and can benefit both practitioners and researchers. Given the significant resources currently devoted to data and data-use supports, policy makers could also benefit from knowing how to maximize their return on these investments. Our research can inform these multiple stakeholders.  

In the remainder of this article, we first provide a brief review of relevant empirical research on coaches and PLCs. Next, we present a theoretical framework that reconceptualizes traditional notions of expertise, followed by a description of our research methods. We then present our findings and conclude with implications for policy, practice, and future research.


Research shows that teachers can use data to set informed goals for classroom improvement; adjust instruction at the whole-class, group, and individual levels; and alter curriculum and content sequencing (Dembosky, Pane, Barney, & Christina, 2006; Heppen et al., 2012; Kerr, Marsh, Ikemoto, Darilek, & Barney, 2006). Generally, the literature implies that data analysis can be leveraged positively for classroom and school improvement (Herman et al., 2012; Ikemoto & Marsh, 2007). Yet research also indicates that educators often use data in ways that do not lead to changes in practice (Marsh et al., 2006; Nabors Oláh, Lawrence, & Riggan, 2010). Teacher habit has in some cases proved intractable; even when intending to change instruction, teachers frequently reproduce old practices (Spillane, 2002) or engage in surface-level “instructional tinkering” (Cuban, 1993; Nelson et al., 2012). Although data-use skills may improve over time with experience (Cosner, 2011; Jacobs, Gregory, Hoppey, & Yendol-Hoppey, 2009), working with others has the potential to develop teachers’ ability to understand and use data to inform and improve practice. Two common sources of capacity building for current teachers are coaches and PLCs, which we discuss next.


Research demonstrates that teachers working with instructional coaches may use newly introduced skills more often and appropriately than unassisted teachers. For example, a study of Reading First schools in Michigan found that teachers who had professional development (PD) in addition to the support of a literacy coach “showed patterns of reading instruction aligned with the instructional vision of the PD program” more so than teachers with professional development only (Carlisle & Berebitsky, 2011, p. 793). In a longitudinal study of Reading First implementation, Coburn and Woulfin (2012) similarly found that teachers were more likely to make adjustments to their instruction when policy-directed changes were interpreted by a coach. Coburn and Russell’s (2008) study of two district coaching initiatives also found that teachers who were actively coached continued using district routines and norms in interactions within their professional network even when the coach was not present. A number of other studies also indicate that coaches have a positive effect on teachers’ implementation of standards in the classroom (Brown, Reumann-Moore, Hugh, du Plessis, & Christman, 2006; Garet et al., 2008; Kohler, Crilley, Shearer, & Good, 1997; Wong & Nicotera, 2006).

In addition to general instructional support, coaches often provide guidance to teachers in the area of data-driven decision making. The U.S. Department of Education’s National Educational Technology Trends Study found that primary data-use leadership in schools was often provided by instructional coaches who connected teachers with student data and assisted in translating the data to classroom application (Means, Padilla, DeBarger, & Bakia, 2009). In a mixed methods study of Florida’s reading coach program in middle schools, teachers who received more data support from coaches reported making more instructional changes in their classrooms than those who received less (Marsh, McCombs, & Martorell, 2010).

Some schools and systems tailor coaching specifically to data use in the form of a data coach position. Although research isolating the effects of data coaches is limited, the available evidence of their influence on teacher instruction is mixed. A 2008 study of reading program implementation in Boston found that when teachers were given access to professional development on using benchmark assessment results, the addition of data coaching support had only a narrow impact on teachers’ reported understanding of how to use data to inform their instructional practices (Quint, Sepanik, & Smith, 2008).

Extant literature also sheds light on the importance of coaches’ expertise, which is often considered an important influence on teacher practice (Neufeld & Roper, 2003). Teachers are more likely to reach out to individuals they feel hold useful skills and knowledge (Spillane, Hallett, & Diamond, 2003). A study of Florida’s reading coach program showed that teacher and principal perceptions of coach “quality”—as measured by teacher ratings of coach skills and experience—were correlated with their perceptions of the coach’s influence on their instructional change (Marsh, McCombs, & Martorell, 2012).

The success of coaching programs may also rest on the strength of relationships. Coaches can support the development of more collective forms of expertise by crossing social boundaries, acting, for instance, as intermediaries between policy and practice. In crossing boundaries, the quality of a coach’s relationships and social ties with others may help to “diffuse resources throughout a system” (Daly, 2012, p. 6). Those social ties may be strengthened if a coach is skilled at cultivating relationships; Ertmer and colleagues’ (2005) study of 31 instructional coaches indicated that their “people skills” were critical in facilitating teacher growth.  


Another capacity-building strategy of schools and systems has been to promote collaboration among teachers (Darling-Hammond & Richardson, 2009; Wei, Darling-Hammond, Andree, Richardson, & Orphanos, 2009), often taking the form of a PLC, which involves collaboration to reflect on and improve practice (Stoll, Bolam, McMahon, Wallace, & Thomas, 2006; Stoll & Louis, 2007; Talbert, 2009). Studies indicate that teacher participation in collaborative inquiry may have positive effects on instruction. Louis and Marks (1998), in a mixed methods study of 144 teachers at 24 schools, found that teachers who participated collaboratively in reflective dialogue with a focus on learning were more likely to engage their students in authentic pedagogy—measured as instruction requiring students to engage in higher order thinking—than those with less cooperative teaching environments. In contrast, an in-depth, yearlong case study of teacher collaboration found that, although collaboration led to improved connectedness among group members, it did not necessarily alter teacher instruction (Tschannen-Moran, Uline, Woolfolk Hoy, & Mackley, 2000).

Through the collaborative process, PLCs also have the potential to affect teacher data-use skills. Datnow, Park, and Kennedy-Lewis’s (2012) multischool data-use study concluded that social interactions were a leading influence in developing the ways that teachers utilized student data. Similarly, Symonds’s (2003) study of Bay Area schools that made achievement gains found that professional collaboration assisted teachers in the data-use process. As part of a larger, 6-year investigation of 11 Milwaukee public schools, Mason (2003) found that teachers at one school made significant changes in their mindsets about data use through participation in PLCs, viewing student data as a tool for improvement rather than an accountability measure.

The joint review of data in a PLC provides a key opportunity for groups to build on their collective expertise (Daly, 2012). Nelson’s (2009) 5-year case analysis of teacher inquiry groups found that groups that made a routine of focusing on collaboration in their inquiry—rather than simply sharing their individual expertise—were successful in creating instructional improvements. Interpersonal relationships foster collaboration within PLCs (Mitchell & Sackney, 2011) and are particularly important to the sharing of teacher data. Trust in the PLC setting is therefore necessary to enable complex, inquiry-focused efforts (Bryk, Camburn, & Seashore Louis, 1999; Ikemoto & Marsh, 2007) that contribute to teacher learning.

Individual expertise also plays a role in PLC work. Often a coach or head teacher trained in facilitation takes on a leadership role (Hipp, Huffman, Pankake, & Olivier, 2008). Beyond this role, however, each teacher in a PLC also contributes personal knowledge to group inquiries: Some teachers may add strong content familiarity, whereas others may be experts in delivery (Putnam & Borko, 2000). The collaborative process also generates deeper individual expertise for teachers (Hadar & Brody, 2010).

Although several studies of coaching and PLCs touch on issues of expertise in supporting data use, few consider the relationship between individuals and groups in terms of the co-creation of knowledge. Our study takes a unique approach to teacher data use by drawing on the concepts of vertical and horizontal expertise, which we explain next.


When considering the mediating role of coaches and PLCs in teachers’ use of data, we employ a theoretical framework that can account for both the knowledge of individuals and the ways that individuals work together to generate knowledge across contexts. The cultural historical activity theory concepts of vertical and horizontal expertise capture the relationship between individual knowledge and the processes through which individuals may interact to generate new knowledge (Engeström, 1999; Engeström, Engeström, & Kärkkäinen, 1995; Gutiérrez, 2008). Vertical expertise encompasses much of what is traditionally considered expertise: knowledge and skills that can be augmented over time (Engeström, 1999; Engeström et al., 1995). This conception of expertise understands learning as movement from novice to expert (Engeström et al., 1995; Gutiérrez, 2008, p. 149). Coaches and PLC members can possess varying degrees of vertical expertise in several areas, including data analysis, literacy instruction, and adult learning. For example, a coach could have strong vertical expertise in data analysis, having previously moved from novice to expert in her understanding of how to interpret and draw inferences from test results.

Recent scholarship has expanded on the vertical, or traditional, notion of expertise. Now some scholars consider the ways that individuals with different forms of vertical expertise cross boundaries between multiple, overlapping, or even conflicting contexts (Anagnostopoulos, Smith, & Basmadjian, 2007; Engeström et al., 1995; Gutiérrez, 2008; Tuomi-Gröhn, Engeström, & Young, 2003). Anagnostopoulos et al. (2007) noted, “Horizontal expertise emerges from these boundary crossings as professionals from different domains enrich and expand their practices through working together to reorganize relations and coordinate their work” (p. 139). We view horizontal expertise as knowledge that is co-created through interactions and movement across contexts (Anagnostopoulos, Smith, & Nystrand, 2008; Engeström et al., 1995). Encapsulated in this definition are several elements. First is the idea of boundary crossing. Both people and ideas move horizontally among contexts and social worlds (Akkerman & Bakker, 2011; Engeström et al., 1995; Gutiérrez & Larson, 2007). For our study, the movement is both literal—as coaches and teachers cross into different physical spaces and groups within schools—and cognitive—as they discuss ideas originating from different professional experiences, such as subject-matter training. A second element of horizontal expertise focuses on the contexts involved in crossing boundaries. Oftentimes the movement is considered to be across the boundaries of domains, activity systems, and/or institutions—that is, contexts that are very different from one another (Anagnostopoulos et al., 2008; Engeström, 2001; Engeström et al., 1995). However, people can also cross boundaries between and among social practices (Gutiérrez & Larson, 2007), which can unfold within the same institution, for example.

Synthesizing these conceptions, we consider the range of contexts in which the teachers and coaches in our study interacted—classrooms, PLC groups, one-on-one meetings, staff meetings, administrator meetings, and contexts beyond schools. For instance, coaches may attend district-level meetings and bring insights back to teachers at individual schools. Even PLC members teaching the same grade level and subject at the same school may interact in various contexts, including their individual classrooms and the PLC group itself.  Third, horizontal expertise entails the co-construction of hybrid ideas. As people cross boundaries into different contexts, they not only spread ideas but also may co-create new knowledge through interactions (Anagnostopoulos et al., 2008; Engeström et al., 1995). We consider the ways that teachers and coaches may jointly construct knowledge in some instances but not others. As an example of all three elements of horizontal expertise, a coach could engage in boundary crossing as she moves between the contexts of administrator meetings and PLC meetings. In this movement, she not only transmits but also alters ideas gained in both contexts, resulting in hybrid ideas.1

Our framework recognizes that horizontal and vertical forms of expertise operate synergistically. As individuals increase their knowledge and skills, they also work together and cross boundaries, potentially creating hybrid ideas (Anagnostopoulos et al., 2007). In this article, we highlight the dynamic relationship between the two types of expertise. For instance, the vertical expertise of coaches, teachers, and PLC members could bolster the collective’s capacity to co-create ideas. Conversely, the horizontal expertise generated in teacher–coach and PLC interactions could lead to individuals increasing their vertical expertise.

It is instructive to consider the concepts of horizontal and vertical expertise along with an understanding of teachers’ potential trajectories through a cycle of data use. When completed in an ideal fashion, this data cycle, adapted from the theory of action promoted by data-use advocates (Ackoff, 1989; Mandinach & Jackson, 2013; Marsh et al., 2006), begins with (1) a teacher accessing data. From there, the teacher (2) analyzes the data to turn them into information, (3) combines the information with her or his own understanding and expertise to form actionable knowledge, and (4) uses the knowledge to take action. PLCs and coaches may influence teachers’ use of data at any point along the cycle through a range of practices, including assessing teacher needs, modeling, observing, providing feedback, dialoguing, and brokering (e.g., transporting tools across contexts) (Marsh & Farrell, 2014). Some of these practices involve more vertical expertise, and some involve more horizontal expertise. Along the vertical dimension, coaches and PLC members could engage in the practices traditionally involved in a mentor–novice relationship, such as observing a teacher as she analyzes data and formulates a response to it. On the other hand, the practices of dialoguing and brokering may entail either or both vertical and horizontal expertise. As an example of when these practices involve horizontal expertise, PLC members could jointly discuss data and, in response, co-create lesson plans that draw on ideas from multiple contexts.

In this article, we use the concepts of vertical and horizontal expertise to better understand how coaches and PLCs may support teachers and alter their approaches to one part of the data-use cycle: that of the response. Specifically, we focus on responses in which teachers reported changing their delivery of instruction. We now turn to our study methods and findings.


In this article, we draw on data from a yearlong comparative case study of six low-performing middle schools in four districts that supported teacher data use via literacy coaches, data coaches, or PLCs (Merriam, 1998; Ragin & Becker, 1992). We seek to answer the following research questions: (1) How does working with a coach or PLC mediate teachers’ responses to data? (2) What factors influence the activities and effects of coaches and PLCs?



Districts and schools were purposefully selected to maximize the conditions identified by prior research as supporting effective data-use interventions, ensure that the coach or PLC had been in place for a minimum of two years, and provide variation in characteristics of coaches and PLCs (e.g., content-area expertise). Three of the four districts (Sequoia, Rainier, and Mammoth2) were medium sized and located in one state; the fourth district (Shenandoah) was significantly larger and located in another state. All the schools had failed to meet state accountability targets for more than 5 years. As Table 1 illustrates, the size of case study schools varied, but all six enrolled significant proportions of students of color and/or English language learners. Two of the districts invested in literacy coaches (Mammoth, Shenandoah), one invested in data coaches (Rainier), and one invested in PLCs (Sequoia). At the school level, data support often came in the form of multiple interventions. Table 1 also provides additional details of the case study schools and participants. The appendix illustrates the variety of data emphasized in each school.

Table 1. Study Schools and Participants


Middle School

Interviewed Coaches and Teachers



   410 students

       25% English language learners

       95% Latina/o students

       4% Asian/Pacific Islander students

       1% White students

   Main: PLC

   Secondary: Literacy coach

1.  Mr. Castillo, 7th grade English and social studies

2.  Ms. Wallace, 7th grade English and social studies  

3.  Ms. Zhang, 7th grade English and social studies  

4. Literacy coach “teacher on special assignment”


   460 students

       25% English language learners

       95% Latina/o students

       5% White students

   Main: PLC

   Secondary: Literacy & math coach

1. Literacy and math coach

2. 8th grade English  

3. 8th grade English  

4. 8th grade English  

5. 7th grade English  



   850 students

        27% English language learners

        85% Latina/o students

        5% African American students

        5% Asian/Pacific Islander students

        5% White students

   Main: Data coach/Secondary: PLC

1. Ms. Jenson, data coach

2. 8th grade English  

3. 7th grade English  

4. 8th grade English


     700 students

        35% English language learners

        85% Latina/o students

        5% African American students

        5% Asian/Pacific Islander students

        5% White students

   Main: Data coach/Secondary: PLC

1. Data coach  

2. Mr. Donovan, 7th grade English

3. 7th and 8th grade English  

4. 7th grade English and social studies   



      800 students

        27% English language learners

        90% Latina/o students

        4% Asian/Pacific Islander students

        4% White students

        2% African American students

    Main: Literacy coach / Secondary: PLC

1. Ms. Mendoza, literacy coach  

2. Ms. Berkowicz, 7th grade English

3. Ms. Santos, 8th grade English

4. Science  

5. 6th grade English  


Blue Ridge  

    340 students

        0% English language learners

        97% African American students        

        3% White students

    Main: Reading coach/Secondary: PLC

1. Ms. Simonson, reading coach

2. 7th grade English  

3. 7th grade English  

Note. Although the numbers have been slightly altered to maintain anonymity, the basic proportions remain true. “Main” represents the intervention emphasized by the district and given significant district support and/or resources. “Secondary” represents additional interventions in which individual schools invested. Focus group teachers are not listed here. Names are pseudonyms. The named participants are mentioned in the article. Unnamed individuals are either not discussed in the article or mentioned briefly and not named.


We visited each school at least three times during the 2011–2012 school year. At each school site, we interviewed the coach or PLC lead teacher, school administrators, and two to three case study teachers per school (who primarily taught language arts and were either PLC members or teachers working with a coach), for 79 total interviews. Case study teachers, coaches, and PLC lead teachers were interviewed three times during the year. We also conducted one focus group per site with a total of 24 non-case-study teachers (many of whom taught subjects other than language arts) and conducted a total of 20 observations of school and district meetings throughout the year. Teacher interview and focus group protocols were designed to elicit credible information about the nature of interactions with coaches or PLCs, interpretations of data, and responses by anchoring the questions to data “artifacts” teachers were asked to bring to most interviews (e.g., “What do they tell you about your students or class? Based on the data, what would be your next steps as an instructor?”) and to a proximate time period (e.g., “Can you please recall the last time you and the coach met?  Can you describe it to me? Where did you meet? What kinds of things did you do during that meeting?”). Finally, we interviewed 13 districtwide leaders, such as superintendents, assistant superintendents, and staff overseeing literacy efforts, and collected documents, such as data analysis protocols. Semistructured protocols guided all interviews and focus groups, which lasted 45 minutes to an hour and were recorded and transcribed. We coded all transcripts and documents using NVivo software.

We also administered monthly Web-based surveys to the case study teachers (15), coaches (4), and PLC lead teachers (2). These surveys asked educators to reflect on their work in the prior month, including meetings with teachers and/or coaches around data, strategies used to support data interpretation and response, and challenges and successes around using data to inform instruction. The final survey also asked educators to reflect on the ways in which working with PLC teachers or a coach over the course of the year affected their knowledge, skills, and practice. On average across the year, we received completed surveys for 94% of the case study teachers and 91% of the coaches or lead PLC teachers.3

Data collection and analysis were continuous and iterative (Miles & Huberman, 1994; Strauss & Corbin, 1997). We conducted basic descriptive analyses of the survey results and compared responses of educators from different types of schools. We began coding all the qualitative data with an initial set of codes related to the data use cycle and capacity-building practices, as well as contextual conditions at the individual, school, district, and external levels. In later phases, we reanalyzed all passages coded as “response to data” to identify instances for which we had sufficient detail to determine the specific type of response, for a total of 343 instances. In these instances, teachers generally described how they responded to a specific set of student learning data that they either (1) had described or shared with us in the course of an interview, (2) had examined in a PLC or coach meeting we observed, or (3) were asked to describe in a survey. We further narrowed the sample of coded instances to examine only those responses that were instrumental (Coburn, Toure, & Yamashita, 2009; Murnane, Sharkey, & Boudett, 2005; Weiss, 1980) (meaning that some sort of action was reported to have been taken) and directly related to instruction—which represented the vast majority of instances coded.4 The resulting 294 instances were fairly evenly distributed across the sample of schools.

We coded all 294 instances for evidence of “change in delivery,” defined as a reorganization of how students acquire information or skills. This encompasses the adoption of a single strategy employed perhaps only once, in addition to long-term shifts in pedagogical approaches. For example, change in delivery occurred when a teacher altered his instruction following an informal formative assessment conducted as his students were reading. He realized that students had not understood the literary device of “flashback.” To address this, he reorganized how his students were acquiring an understanding of the device by playing a clip from the film Toy Story 3 in which a flashback occurs. By providing a nontext example of a flashback from a popular film, he bridged a familiar concept to the realm of literacy, thereby changing his delivery of instruction. Other instances involved more extensive changes, such as a shift from teacher-led instruction to peer-to-peer interaction, as described later. Responses coded as “no change in delivery” included instances in which teachers retaught a topic not mastered by students (using the same mode of delivery), retested students, referred students for out-of-class tutoring, or asked students to analyze their data (absent a change in delivery).5

In the final phase of analysis, we identified emergent themes related to horizontal and vertical expertise (discussed in later sections) and cross-referenced these with our coded instances. We also turned to theoretical literature to inform our conceptual framework and our analysis of the response types observed. Further, we analyzed all coded text related to context to identify possible associations with the observed patterns of teachers’ responses to data. We created data displays and tables to tally and identify patterns and associations, using multiple sources of evidence to corroborate our findings (Miles & Huberman, 1994). To ensure trustworthiness (Patton, 2002), we conducted consistency checks in which two researchers coded all instances and discussed and reconciled disagreements. After one researcher completed the matrix of response instances, a second researcher coded 298 of the 343 total instances using the same coding categories. The two researchers then compared similarities and differences in their coding, coming to consensus on the meaning of particular codes. From there, the first researcher recoded 173 instances, reaching 80% agreement with the second researcher. The remaining discrepant instances were recoded until both researchers agreed.
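The consistency check described above rests on simple percent agreement: the share of instances to which both coders assigned the same code. As an illustrative sketch only (the code labels and lists below are hypothetical, not the study’s data), the calculation looks like this:

```python
# Minimal sketch of a percent-agreement check between two coders.
# The instance codes below are hypothetical examples, not the study's data.

def percent_agreement(coder_a, coder_b):
    """Share of instances to which both coders assigned the same code."""
    if len(coder_a) != len(coder_b):
        raise ValueError("Both coders must rate the same set of instances")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Hypothetical codes for ten response instances
coder_1 = ["change", "no_change", "change", "no_change", "no_change",
           "change", "no_change", "no_change", "change", "no_change"]
coder_2 = ["change", "no_change", "no_change", "no_change", "no_change",
           "change", "no_change", "change", "change", "no_change"]

print(percent_agreement(coder_1, coder_2))  # prints 0.8, i.e., 80% agreement
```

In practice, disagreement-adjusted statistics such as Cohen’s kappa are often reported alongside raw percent agreement, since the latter does not correct for chance agreement.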


Our analysis revealed that coaches and PLCs helped to mediate teachers’ responses to data, especially in instances in which teachers used data to change the delivery of instruction.6 These instances, however, were somewhat rare. Echoing past research, we found that very few (57 of 294, or 19%) of the teachers’ instructional responses to data—with or without coach or PLC support—involved change in delivery. In the remaining instances, teachers generally reported responding to data by reteaching topics, asking students to reflect on their results, or providing students extra support out of the classroom without changing how students acquired information or skills. To be clear, change in delivery could involve these practices. However, for these practices to be part of change in delivery, an alteration of how students acquire knowledge or skills would need to occur. For instance, if a teacher responded to data by reteaching a concept in the same manner she had used previously, no change in delivery would have occurred.

The finding that little change in delivery occurred is consistent with observations of several school and district leaders, who were aware of teachers’ struggles to adjust their practice based on data. Commenting on the nature of PLC discussions, one principal reported,

There will be a conversation about, “Well I taught it this way. How did you teach it?” … But there isn’t that next conversation about, “Okay, so did it make a difference? Did you specifically teach it differently or use a whole different whatever to teach it, or did you just kind of review it?” … They reviewed it, they went over it, maybe they explained it a little bit different. But I wouldn’t say the instruction was wholeheartedly different.

Our analysis suggests that coach and PLC support was pivotal in helping teachers make such changes. As Table 2 illustrates, more than two thirds of change in delivery responses involved a coach or a PLC, whereas about half of the other responses entailed such involvement.

Table 2. Association Between Data Response Types and Involvement of Coach or PLC

Data response types | Number of examples | Number of examples involving a coach or PLC | % of examples involving a coach or PLC
Change in delivery | 57 | |
No change in delivery | 237 | |


How did coaches and PLCs mediate teachers’ use of data in ways that supported change in delivery? Next we focus on instances in which teachers reported using data to alter their pedagogy. This focus necessitates an examination of particular case study schools because not all coaches and PLCs promoted such change. First we examine differences in the ways that coaches and PLCs mediated teachers’ use of data, and then discuss a key mediation practice: dialogue. Finally, we turn to an exploration of the factors that shaped possibilities for this change.


The work of coaches and PLCs in mediating teachers’ data use leading to change in delivery was by no means uniform. Instead, key differences appeared between the practices of coaches and PLCs. Coaching relationships had the potential to foster hybrid ideas and long-term change in delivery, whereas PLCs appeared to limit possibilities for such long-term shifts in that members often focused on sharing discrete strategies. Variation in the role and form of horizontal and vertical expertise accounts for much of this difference.


Countering conventional wisdom that expects coaches to draw on vertical expertise in a master–novice relationship, we found that coach–teacher interactions involved both forms of expertise, which worked in tandem to either foster or hinder change in delivery. When coach–teacher interactions and boundary crossing fostered horizontal expertise and capitalized on relevant vertical expertise, possibilities arose for hybrid ideas and long-term change in delivery. Hybrid ideas sometimes formed as coaches and teachers collaborated and not only shared their views but also melded them to create new understandings. Long-term change in delivery involved a teacher adopting a practice for a span of several weeks or permanently incorporating it into her or his repertoire. However, when interactions inhibited horizontal expertise or did not involve relevant forms of vertical expertise, possibilities for change in delivery were significantly constrained. Next we describe two examples of coach–teacher interactions that show how change in delivery was either fostered or hindered.

No change in delivery with a coach. Ms. Jenson, the data coach at Cascades, provides an instructive case illustrating a coaching relationship that did not result in change in delivery. This example shows how limited horizontal expertise and relevant vertical expertise closed off possibilities for change in delivery. Literacy instruction was one of the areas of vertical expertise with which Ms. Jenson struggled even though she was expected to work with all teachers at the school, including the language arts teachers. On a survey at the beginning of the year, she rated herself as having an emerging “ability to deliver effective literacy instruction.” As for her prior experience, she had been an elementary school teacher for three years and a middle school math and science teacher for five, and then taught at the high school level. Her only direct experience with literacy was during her tenure at an elementary school. Also, just as important, the teachers did not perceive her as holding such expertise. As one teacher put it in an interview, “I would never go to our coach for advice on language arts, not in a million years . . . . I mean, her background is math.”

In addition, tensions characterized several relationships between Ms. Jenson and teachers, perhaps indicating that the coach had a novice level of expertise in interpersonal skills and understanding how to work with adult learners. One observer characterized the coach’s demeanor as condescending: “I have personally witnessed [it] and I thought it was a little bit of condescending behavior, like, ‘I am smarter than you.’ One thing about Ms. Jenson is she wants to be the smart person in the room. . ..” Others perceived Ms. Jenson as controlling. In describing a PLC meeting that the data coach attended, a teacher commented,

And [the] data coach kind of took over.  I mean she wasn’t really letting us be a PLC or be a lesson planning group of teachers. And we are English teachers so of course we want to, we could work well together; we could work and figure out strategies and we know what to do. We know how to plan our lessons, but the data coach came in and it felt like she was imposing her agenda on us.

Both of these descriptions suggest that Ms. Jenson experienced difficulty working with adults.

Ms. Jenson’s novice-level vertical expertise in English language arts, interpersonal skills, and adult learning greatly affected the possibilities for the development of horizontal expertise with case study teachers. As suggested in past research (Marsh et al., 2008, 2012), we found that a coach’s vertical expertise seemed to be a precondition to enact horizontal expertise with the teachers. The most glaring indicator of this occurred mid–school year, when an administrator at Cascades barred Ms. Jenson from working with language arts teachers because of past interpersonal conflicts. This action blocked the development of horizontal expertise because boundary crossing—in the form of meetings and classroom visits—was prohibited. Even prior to this mandate, however, possibilities for horizontal expertise were constrained. As shown in the preceding quotation, a teacher characterized the coach as “imposing her agenda” at a PLC meeting, thereby inhibiting the co-construction of knowledge. Considering this limitation of horizontal expertise, it is not surprising that there were only two instances of change in delivery documented at Cascades, and neither of these involved the coach.

The example of Ms. Jenson illustrates how the dynamic interaction of weak relevant vertical expertise and constrained horizontal expertise conspired to block changes in delivery. Ms. Jenson’s lack of credibility with the teachers seemed to undermine possibilities for boundary crossing, thereby inhibiting the development of horizontal expertise. In turn, the absence of horizontal expertise may have limited possibilities for change in delivery.

Change in delivery with a coach. In contrast to Cascades, the case of Green illustrates how coach–teacher boundary crossing involving relevant vertical expertise fostered horizontal expertise and promoted potentially long-term change in delivery and hybrid ideas. The literacy coach at Green, Ms. Mendoza, had strong vertical expertise in language arts, interpersonal skills, and adult learning. She had been a literacy coach for three years at Green after serving as a middle school teacher for seven years. Also, she had particular knowledge about language acquisition for English language learners and had undergone training in balanced literacy, an initiative of her school district. On a survey at the beginning of the school year, she rated herself as “proficient” in her “ability to deliver effective literacy instruction.” She explained,

I think I can determine a student’s need and then be able to help teachers figure out what we can do to service those students, so that’s why I consider myself pretty proficient. Not advanced because I haven’t written a book or anything, so I feel like I can’t, but I feel like I’m pretty comfortable with literacy instruction.

Administrators and teachers, however, held an even higher opinion of her knowledge of English language arts. A teacher remarked, “I know when I go to her that she is a credible resource; she knows what she is talking about.”

In addition to possessing an understanding of literacy instruction, Ms. Mendoza had strong interpersonal skills. Administrators and teachers alike considered her to be highly personable. One administrator remarked, “Working [with] adults, she’s advanced. She’s doing really well. She’s able to communicate the passion behind the work that we’re doing. She keeps it very collaborative and the teachers feel very supported. She can joke with them; she’s got a really great balance.” Teachers shared this estimation of Ms. Mendoza: “She’s very open-minded,” one teacher remarked, “and she will throw her ideas out there. It’s not like, ‘You should do this.’ And she is always looking for the positive in what I do.” Ms. Mendoza made conscious decisions that may have contributed to her colleagues’ positive opinions, including choosing to eat lunch in the staff lounge. Also, when she first began at Green, she helped teachers in any way they requested, including accomplishing administrative assistant-type tasks, such as making copies. This assistance allowed her to “bridge” into providing more substantial literacy support.  

Beyond interpersonal skills, Ms. Mendoza possessed a sophisticated understanding of adult learning. She had learned the concepts of “heavy” and “light” coaching from a professional reading and applied these to her work at Green. She explained,

Coaching light has more to do with being a support person, providing resource[s], gathering text, demo-ing lots of lessons, doing a lot of the work. . . . You’re being the friend; you’re being helpful; you’re being supportive, which is part of the coach role. But on the heavy side, you’re analyzing their practice; you’re forcing them to reflect. You’re asking them questions that push their comfort level.

She applied light coaching to her one-on-one work with teachers, who seemed to respond favorably to this approach. For instance, Ms. Berkowicz, a seventh-grade language arts teacher, implied that she felt comfortable approaching the coach with poor test results because the coach would not judge her. Describing a particular instance, she said, “You know, I went to her and I said, ‘Oh my god, I gave them a quiz and they bombed.’ And I know I can share that with her without any insecurities because she’s not going to go, well, like, ‘What did you do?’” Ms. Mendoza reserved heavy coaching for meetings with groups of teachers. She described her role in this setting as questioning the causes of cross-classroom student outcomes as reflected in data and discussing common teaching practices. In addition to her application of the heavy and light coaching concept, Ms. Mendoza considered how her physical presence affected her interactions with teachers. She explained,   

A lot of times I find that I lower myself or I kind of do this. [She stood up from the table, approached the interviewer, and crouched down.] And it’s the weirdest thing, but I do think it’s better than this type of thing [stood up, leaned over the interviewer], physically. It’s just something I've noticed in myself that I do when you’re looking at something like [student test results].

Ms. Mendoza’s comments indicate that she considered teachers’ affective states to be an important aspect of the learning process.

Ms. Mendoza’s strong vertical expertise in literacy instruction, interpersonal skills, and adult learning facilitated the development of horizontal expertise. In conjunction, the two forms of expertise seemed to promote hybrid ideas and long-term change in delivery. To illustrate the dynamic relationship between the two forms of expertise, we discuss the coach’s work with a veteran eighth-grade language arts teacher named Ms. Santos. Over the course of the school year, the two worked together to use interim assessment results, state test scores, and student work to create small groups and differentiated curriculum designed to improve state test scores. These alterations, taken together, represented change in delivery.

The dynamic relationship between the two forms of expertise was apparent at the outset of the coach and teacher’s collaboration, which entailed multiple forms of boundary crossing. Ms. Santos approached Ms. Mendoza asking for support in analyzing interim assessment “cluster scores” for one of her language arts classes that was struggling. In approaching Ms. Mendoza for help, Ms. Santos moved between the context of the classroom and the meetings with the coach.

The interplay between vertical and horizontal expertise arose in their first conversation about grouping. Both participants’ descriptions of this conversation suggest that the idea for grouping was co-constructed. Ms. Santos reported, “So I really kind of just wanted to vent and brainstorm with her during lunch, and we did that, but then the next day . . . she came to me with cluster scores and with maybe we could try doing small groups and just giving me more ideas.” Here Ms. Santos described brainstorming with Ms. Mendoza, suggesting the enactment of horizontal expertise, while also indicating that the coach had given her ideas, reflecting the coach’s vertical expertise. Ms. Mendoza described this conversation similarly but first provided some background information, indicating that Ms. Santos’s original idea had been to replicate a strategy Ms. Mendoza had employed for English language learners in the whole school. Another difference in the descriptions is that Ms. Mendoza attributed the grouping idea to Ms. Santos:

At first. . . [Ms. Santos] said, “I want to work with looking at the data.”  She wanted to do the same thing that I had done with the EL [English learner] students with her language arts students, which is basically to let them know what, and that’s how she printed out the cluster scores. So like word analysis is a struggle for this kid or reading comprehension is a struggle for that kid. So at first she was just going to kind of have those conversations with kids [about their scores], but then she’s like, “So what do we do with it? Like once a kid knows that they’re struggling in reading comprehension, like, how can I work with them in small groups?” So then she asked if I could come up with some things that she could do with them to work in the small groups.

These two descriptions of the same conversation point to the ways that boundary crossing contributed to what could be viewed as a co-constructed idea. Ms. Santos sought to replicate Ms. Mendoza’s strategy of speaking with ELL students about their data, which had unfolded within the context of the whole school. Ms. Mendoza passed along insights from this context to Ms. Santos, who contributed insights from the particular context of her classroom. Through this boundary crossing, the original idea of speaking to students about data became an idea to work with students in small groups. Interestingly, each attributed the idea of grouping to the other.

As the school year progressed, the groups became vehicles for state test “clinics.” Ms. Mendoza and Ms. Santos worked together to create separate small-group activities focusing on writing strategies that unfolded over the course of six weeks. Their boundary crossing extended beyond one-on-one meetings in that Ms. Mendoza spent time in Ms. Santos’s classroom helping to implement the new instruction involving small groups practicing specific writing strategies either independently or with one of the adults. This was a significant shift from past instruction in that peer-to-peer learning was introduced.

The collaboration of Ms. Santos and Ms. Mendoza indicates the potential of coaches mediating teachers’ use of data in ways that lead to change in delivery. The example shows how vertical and horizontal expertise can dynamically interact through boundary crossing, possibly resulting in hybrid ideas and long-term classroom changes.


As with coach–teacher interactions, horizontal and relevant vertical expertise in PLCs appeared to foster change in delivery, whereas a lack of these impeded it. The difference, however, arose in the ways that expertise unfolded. In PLCs, horizontal expertise often took the form of individual teachers sharing discrete instructional strategies. One teacher would describe a successful classroom strategy, and the other teachers would sometimes use these in their classrooms. Even though this practice did not entail real-time co-construction of ideas, the creation of hybrid ideas was still possible. For instance, a teacher could alter another teacher’s idea through implementation. However, the clearinghouse function of PLCs did seem to limit possibilities for long-term change in that discrete strategies were often the focus.

Next we provide two examples of PLCs. The first illustrates how a lack of horizontal and relevant vertical expertise limited possibilities for change in delivery. The second, on the other hand, illustrates how these forms of expertise in PLCs resulted in change in delivery involving the implementation of discrete strategies.

No change in delivery with a PLC. Earlier we introduced Green as an exemplar of horizontal and vertical expertise in a coach–teacher relationship that led to change in delivery. However, this school was also the site of a struggling PLC that illustrates how a lack of these forms of expertise hindered instructional change. The PLC—composed of three seventh-grade language arts teachers: Ms. Berkowicz, introduced earlier, Mr. Morgan, and another male teacher—was characterized as tension-filled and unproductive. Ms. Berkowicz, a veteran teacher at Green who had recently joined the PLC, did not feel like a valued member of the group. In order to assist the PLC, Ms. Mendoza, the literacy coach discussed earlier, facilitated some of the group’s meetings, but to little avail. The coach commented,

There seemed not to be a value in each other's work, to be able to listen and gather information and say, “Well, let me try it this way.” It was more like, “Let’s just give each other information and I'm still going to do it the way I want to do it,” type of thing.

This commentary suggests that one or more PLC members lacked sufficient vertical expertise in interpersonal skills, thereby inhibiting a transfer of ideas. Ms. Berkowicz pinned the blame for these issues mainly on Mr. Morgan, explaining,

Well, . . . [Mr. Morgan] is totally confident in what he’s doing. And he doesn’t want to spend the time, like three hours, you know, planning a lesson [with the PLC]. His idea is just passing work, you know. And it comes across as egotistical. But he’s only going to use what he uses because he thinks it’s the best.

We did not interview Mr. Morgan or the other teacher, but had we done so, we might have heard two additional perspectives on the PLC. What seems clear, however, is that weak vertical expertise in interpersonal skills inhibited the development of effective working relationships, in turn hindering horizontal expertise. Compounding this problem was the fact that the three teachers did not have a common planning period, making collaboration all the more difficult. Not surprisingly, we found no instances of change in delivery associated with the work of this PLC.

Change in delivery with a PLC. A PLC at Whitney provides an instructive case, illustrating how, through the interplay of horizontal and vertical expertise, some PLCs mediated teachers’ use of data leading to change in delivery. Three seventh-grade language arts teachers at this school—Ms. Zhang, Ms. Wallace, and Mr. Castillo—met weekly to discuss data and plan lessons. In terms of vertical expertise in literacy instruction, Ms. Zhang and Ms. Wallace often looked to Mr. Castillo for assistance with strategies for effective lessons even though Ms. Zhang formally held the role of PLC leader and often guided group discussions. Also, Ms. Wallace, who was new to the school and the curriculum, looked to the other two teachers for guidance. She explained, “I really rely on Ms. Zhang and Mr. Castillo because they’re used to teaching this curriculum and they know what to teach.” However, by the end of the school year, she began to offer instructional strategies to her fellow PLC members. Reflecting this shift, Ms. Zhang began to view Ms. Wallace as an important source of vertical expertise. Ms. Zhang stated,

Ms. Wallace has more things to bring to the table now. It’s like, “Oh I tried this,” or “You guys gave me this strategy, and I used it and it worked great.” Or, “I thought of this.” And, you know, even just conversationally, like, she shares a lot of strategies with me. It’s not so much her just relying on me. Like, I feel like I could lean on her.  

Ms. Zhang and Mr. Castillo demonstrated expertise in adult learning in their mentorship of Ms. Wallace in the literacy curriculum. Both teachers commented about their efforts to provide explicit guidance to her. Ms. Zhang expounded,

And we don’t just say, “Okay, we’re teaching theme and next week plot.” Like, it’s, “We’re going to teach theme, and then here’s what we want to do with it.  Here’s what we want to accomplish with it.” You know, especially with Ms. Wallace being new to us, we’ve had to get more into that, because we can’t just tell her, “Theme, okay, goodbye, see you later.”

For example, before a unit on summary writing, Ms. Zhang explained to Ms. Wallace how she taught the unit, providing a graphic organizer and examples. Several times throughout the school year, Ms. Wallace praised the other two teachers in interviews for their skilled guidance.

Also, the three teachers exhibited vertical expertise in interpersonal skills. They listened to one another’s ideas, as is evident in the preceding quotations. At the beginning of the school year, Mr. Castillo and Ms. Zhang routinely shared ideas with Ms. Wallace, which she often implemented in her class. Toward the end of the year, however, the sharing of ideas flowed more evenly, with Ms. Wallace contributing instructional strategies that Ms. Zhang valued.

These rich forms of vertical expertise allowed for the development of horizontal expertise, which served to facilitate change in delivery in response to data, specifically in the form of sharing discrete classroom strategies. One particular instance arose when, upon analyzing data from a practice writing test, the teachers realized that in answering writing prompts, students had confused the “response to literature” and “persuasive writing” genres. The PLC responded by focusing on graphic organizers to help students identify the genre required by a given test prompt, including persuasive, response to literature, narrative, and summary genres. For the first three, the organizers were tables that students filled out to provide them with an outline, while the other organizer featured a display of the parts of a traditional fictional story. Ms. Zhang created these graphic organizers herself, not in collaboration with her PLC members.

Even though the graphic organizers originated with Ms. Zhang, each team member used them in her or his classroom, showing how boundary crossing extended beyond the movements of the teachers. In addition, the graphic organizers themselves crossed boundaries, becoming “boundary objects”—artifacts that have different meanings in different settings and allow people to work together across contexts (Akkerman & Bakker, 2011; Engeström et al., 1995). As the organizers crossed into different classroom contexts, each teacher used them in slightly different ways, thereby generating hybrid ideas. For instance, part of Mr. Castillo’s approach to teaching the graphic organizers involved the use of technology that was not available to the other two teachers. His classroom was equipped with interactive clickers, hand-held devices resembling TV remote controls with which students entered question responses. He explained,

So the kids give the answer on the clicker and then they have to write the answer. So like today I did testing . . . [on the] four types of genres. So I put a prompt up; the kids [had to] to say which one it was; then they have to write out the graphic organizer.  

Mr. Castillo infused hybridity into Ms. Zhang’s original idea by adding clickers. The work with the graphic organizers, then, involved horizontal expertise as both teachers and the graphic organizers themselves crossed contextual boundaries, giving rise to hybrid ideas. Interestingly, these hybrid ideas arose from the sharing of a discrete instructional strategy that each teacher then adapted.

In contrast, the coach–teacher interaction between Ms. Mendoza and Ms. Santos entailed real-time co-construction of hybrid ideas through discussion. These dissimilarities may account for the differences in the duration of the change in delivery in the two examples. The use of the graphic organizers unfolded over a period of several weeks and involved the implementation of a single strategy. There is no evidence to suggest that any of the three teachers permanently shifted their pedagogical practice—which contrasted sharply with the long-term and more complex changes in delivery that unfolded in Ms. Santos’s classroom.


In both PLCs and coach–teacher interactions, dialogue was a key mediating practice associated with change in delivery. There were two main foci of dialogue—instruction and data. Somewhat echoing previous research (Horn & Little, 2010), we found that dialogue about instruction was especially important in fostering horizontal expertise, providing teachers the necessary support to make changes in delivery. As Table 3 indicates, dialogue about instruction was associated with 87% of relevant response instances, whereas dialogue about data was associated with 47% of instances. (In this section, denominators are numbers of instances associated with a coach or PLC.) Of note, the non–change in delivery responses were less often associated with dialogue about instruction and more often associated with dialogue about data than responses of change in delivery.

Table 3. Association Between Data Response Types Linked to Coach or PLC and Type of Dialogue Involved in Mediation

Responses linked to PLCs and coaches | Number of examples | % of examples involving dialogue about instruction | % of examples involving dialogue about data
Change in delivery |  | 87% | 47%
No change in delivery |  |  |

To illustrate the different forms of dialogue, we draw on examples that fostered change in delivery, first discussing dialogue about instruction. One form of this dialogue entailed PLC members or coach–teacher groups sharing teaching strategies, often in conjunction with a reflection on data and/or past instruction. The teachers in the Whitney PLC engaged in dialogue of this nature prior to implementing instruction on the graphic organizers. As described earlier, the teachers jointly analyzed data from a writing test, sharing a strategy that led to change in delivery across all three teachers’ classrooms.

Dialogue about instruction also took the form of discussion about lessons. Often PLC members and coach–teacher groups responded to data by discussing past and future lesson plans and jointly planning upcoming instruction. At Emmons, for example, the school’s seventh-grade English language arts PLC analyzed interim assessment scores and discovered that students struggled with one of the state standards: supporting thesis statements in essays. A PLC member, Mr. Donovan, attributed the poor results to the fact that he and the other teachers had focused instruction at a “lower thinking level.” He recounted the teachers’ conversation: “You know what? Here they’re asking us to evaluate. We didn’t ask them to evaluate anything. We didn’t ask them to judge. We asked them to make up their own. Judging somebody else is [something different]. . ..” In response, he and another teacher jointly planned a lesson. This co-constructed lesson—a hybrid idea in concrete form—encouraged students to think at a “higher level”:

What we have our students do is we have our students go through these really weak essays and . . . determine, okay, what is the point of this essay? What are its pieces? So first of all they identify the pieces. Then from there, then they have to go down. Okay, what in this essay supports this thesis? Does this paragraph really try to prove it that is right? Does it really try to prove that. . . [is] right? So now we’re getting to that evaluation level of evaluating support here.

In co-planning a lesson—which functioned as a boundary object between two classroom contexts—Mr. Donovan and his partner teacher jointly enacted horizontal expertise, which in turn supported change in delivery.

As these examples suggest, dialogue about instruction often coincided with dialogue about data. Of the 34 instances of change in delivery involving dialogue about instruction, 19 also involved dialogue about data. Our survey results (shown in Table 4) support this assertion, indicating that PLCs and coach–teacher pairs or groups frequently discussed data and instructional responses. Approximately two-thirds or more of teachers across schools reported that, with their coaches and PLC members, they discussed and reflected on data, shared knowledge or suggestions for how to build and test hypotheses about the data, and shared suggestions for how to develop action plans and modify instruction based on the data. They also rated these opportunities for dialogue as effective “to a moderate extent” in improving their understanding of data and informing their instruction. This overlap, and the perceived usefulness of both, point to the importance of the confluence of the two types of dialogue, which is predicted by the data cycle discussed in our theoretical framework. A focus on simply accessing and analyzing data, without a complementary focus on possible responses to the data, constitutes an incomplete data cycle.

Table 4. Teachers’ Reports on Use of and Perceived Effectiveness of Dialogue About Data and Instruction

“Did your coach/members of the PLC use the following strategies in working with data with you during the past month? If so, to what extent was the strategy effective in improving your understanding of data and informing your instruction?”

Strategy | % Reporting Use (average) | Rating of Effectiveness (average)
Allowing me to discuss and reflect on data |  |
Sharing knowledge or suggestions of how to build and test hypotheses about the data |  |
Sharing knowledge or suggestions of how to develop action plans based on the data |  |
Sharing suggestions of how to modify instruction based on data |  |

Note. Teachers were asked these questions monthly. Use percentages and ratings reported above represent the average of case study teacher responses to these questions across schools over the course of the year. Response options to the second question of effectiveness included: not effective (1), effective to a small extent (2), effective to a moderate extent (3), and effective to a great extent (4).

Dialogue about data, like its counterpart about instruction, took several forms but was less frequently associated with horizontal expertise. In some cases it involved a PLC member or coach presenting quantitative data reports and graphics to teachers. Ensuing discussions frequently focused on perceived student weaknesses on certain standards and varied in their depth of analysis, with some remaining at a cursory level. Importantly, dialogue about data alone did not appear to encourage change in delivery: there were no instances of such change associated exclusively with dialogue about data; it was always accompanied by dialogue about instruction. This finding points to the centrality of dialogue about instruction in fostering change in classrooms. It also suggests that dialogue about data, disconnected from a consideration of instruction, may fail to provide adequate guidance for teachers to respond to data in substantive ways.

The two schools that invested in data coaches—Cascades and Emmons—provide insights about a strong focus on data without an equivalent focus on instruction. At both schools, teachers and coaches focused on state test data more than most other types of data (see the appendix) and engaged in more dialogue about data than about instruction. The coaches at these schools helped teachers access data reports and guided them individually or in groups in spotting trends in the data. For instance, at Cascades, the data coach, Ms. Jenson, met with a group of math teachers to discuss district interim benchmark assessment results. A teacher reported,

She presented the results of benchmark three. And we looked ahead to benchmark four to see which of the standards that we had problems with on benchmark three were also tested on benchmark four so that we could see what we needed to do to get ready for the next benchmark.

As the teacher made clear, the coach’s main focus was presenting the data and determining areas of concern in light of the next test, with scant explicit focus on instructional responses to these data.

The dearth of dialogue about instruction at data coach schools did not translate into a lack of instructional responses in general (e.g., teachers retaught content or asked students to retake tests). However, the response of change in delivery in particular was less common at these schools than at other schools. Of all the responses documented at data coach schools, 10% involved changing delivery. In contrast, 21% of responses at PLC schools and 18% of responses at literacy coach schools involved such changes.7 As the case of the data coach schools suggests, a heavy focus on the front end of the data cycle (accessing and analyzing data), without a comparable focus on the back end (translating data into actionable knowledge and responding), appeared to limit the possibilities for change in delivery.

To recap, an overwhelming focus on dialogue about data seemed to limit the flow of ideas about instruction, constraining the possibility of horizontal expertise forming around the key area of change in delivery in response to data. Dialogue about instruction, however, appeared to enable the enactment of horizontal expertise, allowing teachers to use data to move beyond simply reteaching. In this way, dialogue about instruction not only supported teachers in moving through a complete data cycle—from accessing data to responding—but also provided a key resource for more complex changes. This finding suggests that discussion about instructional responses to data may be the linchpin to encouraging reflective instructional changes. Moreover, dialogue about data may augment dialogue about instruction.


Two main factors facilitated or constrained the possibility of change in delivery in response to data. For one, school leadership shaped possibilities for expertise and subsequent change in delivery. In addition, district-level factors provided not only the context of schools but also the policies that drove reforms.  

School Leadership

Resonating with research on the critical role principals play in facilitating data use (Kerr et al., 2006; Lachat & Smith, 2005; Mandinach & Jackson, 2012; Schildkamp & Ehren, 2013; Young & Kim, 2010), principals in our study influenced opportunities for change in delivery in response to data, providing a vision for data use and instructional change. Principals also oversaw scheduling efforts, sometimes setting aside time for collaboration, an important aspect of successful data-related reforms (Nelson et al., 2012).

As an example of providing a vision, the Green principal sought to nurture the staff’s focus on instructional improvement. One of the school’s priorities, he explained, was small-group instruction driven by data:

Yeah, small-group instructions really are important in the school and we’re still pushing that because teachers are still having difficulty understanding. . . what that looks like, and how it feels to do that, and you release the work to students and feel comfortable leaving kids to teach other kids. . . . [The coach] is a leader in the small-group instructional practices, and we used the data to drive those groups’ benchmarks, formative [and] summative assessments.

As our preceding narrative about Ms. Mendoza makes clear, the principal’s vision became reality in at least the classroom of Ms. Santos, who used benchmark scores and other data to form student groups. In this way, the principal’s approach to data use was translated through coach–teacher interaction into changes in delivery. However, a strong vision and its transfer to classrooms were not apparent at all our schools. At Cascades, the principal did not seem to fully embrace data-driven decision making, appearing to Ms. Jenson to be concerned merely with following district mandates. This compliance orientation transferred to the teachers as well. In describing data-use reforms at the school, one teacher noted,

Now we know . . . [the principal] has pretty much said, “This is what we are going to do; this is the way it’s going to be.” So prior to that, like, . . . [another teacher] and I had planned stuff out. It was like, “Throw that in the can.”

Principals also shaped PLC and coach–teacher work by structuring time for collaboration—a factor identified in past studies as critical to facilitating collaboration and data use (Datnow, Park, & Wohlstetter, 2007; Louis, Marks, & Kruse, 1996; Marsh, 2012). At two schools with PLCs in the same district, the amount of meeting time differed dramatically. At one, Whitney, PLCs were given one meeting time a week, whereas at the other, Sherman, PLCs were provided meeting time four days a week. The frequent meetings at Sherman had come about several years before in response to the school being in Program Improvement status under No Child Left Behind. The leadership team’s resulting plan included increased collaboration time. The principal explained, “We came up with the seven periods. . . . We looked at how we could adjust for our seven-period day where teachers by contract . . . get a prep period and one collaboration period.” With the added collaboration time—providing more opportunities for dialogue and enactment of horizontal expertise—the potential for change in delivery increased. Indeed, at Sherman, we documented 19 instances of change in delivery, and at Whitney only 9.

At the schools with coaches, some principals were able to protect the time of the coach to focus on instructional matters. For instance, shortly after arriving at Emmons, the principal encouraged the data coach to support literacy instruction and spend time in classrooms, marking a shift from previous years. The coach remarked, “I feel like. . . [the principal’s] focus is more on instruction . . . which is the right thing. And I don’t think that we’ve had that focus per se until this year.” In contrast, at Cascades, in the same district as Emmons, the principal loaded the coach’s schedule with noninstructional tasks, such as supervision duty. Ms. Jenson commented that this duty took her away from spending more time with teachers, and one teacher explicitly complained that she was difficult to access.

District-Level Policy and Practice

School district policies also influenced the possibilities for change in instructional delivery. Funding choices were one avenue for this influence. For instance, in Mammoth School District, home to Green, district leaders were committed to reform efforts centered on balanced literacy, a reading and writing workshop model supported by literacy coaches. This reform had begun before the arrival of the current superintendent of the district, who had served in that post for several years at the time of the study. Despite budget shortfalls that had beset the district, the superintendent chose to maintain allocations for the coaches even though funding these positions was costly. She noted,  “We’ve really squeezed a lot of money out of categorical to make that happen [to keep coaches]. They are a big hit to the general fund, but it all depends on where your priorities are.” The superintendent prioritized coaches because “it would be very difficult . . . to move student achievement forward if you didn’t have that support of the coaches.” In this way, she created possibilities for the coach–teacher interactions at Green that led to change in delivery.

In contrast, the Rainier School District, in which Cascades and Emmons are located, responded to budgetary pressures by combining an instructional coach position with a position responsible for coordinating testing and distributing data. Both the Emmons coach, mentioned earlier, and the Cascades coach, Ms. Jenson, had previously occupied the data coordinator roles and then moved into data coach positions. At both of these schools, we found an emphasis on dialogue about data, without a corresponding focus on dialogue about instruction. This neglect of the latter end of the data cycle constrained the possibility for change in delivery stemming from coach mediation. The dual nature of the new data coach role and both coaches’ backgrounds as providers of data, not supporters of instruction, likely contributed to this focus.

Districts also influenced the potential for change in delivery through policies regarding data management systems and technology. Each of the four study districts invested in data management programs that allowed administrators, coaches, and teachers to examine interim and state test results by specific content foci tied to state standards. Through these programs, teachers accessed reports that were color coded to show students’ levels of proficiency on specific standards. This disaggregation of data encouraged a strong focus on reteaching content related to discrete standards, a response that usually did not entail change in delivery. In describing a specific data report showing assessment results by content foci, Ms. Simonson, the literacy coach at Blue Ridge, explained how the data guided instructional responses:

And we kind of look at this to kind of like build our focus lessons that we’re doing in the classroom to see which one of these—I think it was the reading application, [that] is the one that has pretty much everything on it like, compare and contrast, cause and effect. We looked at that one and you see that’s the one we really focus on.

In encouraging an emphasis on reteaching, the snapshots provided by the disaggregated data reports may have stifled opportunities for change in delivery.


In summary, we found that coaches and PLCs played important roles in mediating teachers’ responses to student learning data and were often associated with instances in which teachers used data to alter their instructional delivery (as opposed to surface-level changes in materials and topics). Our research also indicates that a dynamic relationship between horizontal and vertical expertise may help explain the ways in which PLCs and coaches facilitated deeper level changes in pedagogy, as well as differences we observed across these types of supports. Notably, we were more likely to observe the formation of hybrid ideas and change in practice when both forms of expertise were present. In this way, the application of the concepts of horizontal and vertical expertise helps unpack the complexity of the process of building capacity and using data to change practice. Finally, dialogue appeared to be a key mediating practice associated with changes in instructional delivery, and school leadership and district-level context shaped the possibility for such changes.

These findings have important implications for policy, practice, and research. The demand to be data-driven is likely to persist for many years to come. In an era of changing standards, new school accountability systems, and performance-based teacher evaluation systems, there is significant if not growing pressure on teachers to use data to guide practice. New formative, interim, and summative assessment results aligned with new college- and career-ready standards will provide vital sources of data and direction to educators as they adjust to new accountability and evaluation benchmarks and more expansive learning goals. Our research adds conceptual clarity to what types of expertise may be needed to ensure that teachers are able to productively respond to these data. In particular, we hypothesize that vertical expertise may be a precondition to developing horizontal expertise and that within the vertical domain, content-area knowledge may be necessary but not sufficient. If a coach or teacher leader with subject-matter mastery lacks interpersonal skills and a solid understanding of how to work with adult learners, he or she may be unlikely to gain much traction in facilitating joint understandings and deep-level changes in teachers’ practice in response to data.

Our study suggests that administrators should consider these facets of expertise when designing interventions, recruiting for coaches, assembling PLC groups, and developing professional development for coaches and teacher leaders. The centrality of dialogue also suggests the need for policies and structures allowing for uninterrupted time and a safe environment for educators to reflect on data and to identify strategies in response to what is learned from analyses.

Given the limits of our study, more research is needed to further advance the field’s understanding of how best to support teachers in using data to improve instruction. Because our research design does not allow us to definitively attribute changes in instructional delivery to coaches and PLCs, future studies might compare how teachers respond to data when working with a coach or PLC and how they respond when working alone and relying on their own professional judgment. Further, our study does not shed light on the extent to which decisions to alter instruction in response to data occur in settings outside of a PLC or coaching relationship. With a sample of six low-performing schools and a handful of educators in these schools, our data also may not be wholly representative of the population of teachers within a school and schools in other contexts. Future studies could expand this type of analysis in other types of schools. Also, we draw heavily on self-reports of teacher behavior and responses to data and have only limited observational data on actual behavior. Although we did our best to anchor interviews to artifacts and recent events to ensure credible reports of instruction, we recognize that future research would be strengthened by including additional observation data.

In addition, all our case study teachers entered voluntarily into the coaching relationships observed. Without additional research, we cannot know whether similar changes in instruction would be observed among teachers who are not willingly paired with a coach, as may be required to truly transform the quality of instruction schoolwide. Further studies are needed to understand how to best engage teachers who may be less enthusiastic about, if not resistant to, working with a coach or reflecting on their practice in a PLC. Research should also examine the long-term influence of data on teachers’ practice and student learning. For example, it is possible that in some cases, changes in delivery of instruction may not occur or even be possible until a year after a teacher reflects on data, which we have not captured in this study. Further, do teachers continue to use new strategies in subsequent years or apply them to instruction in other areas or with particular types of students? Some argue that data use is most effective when it facilitates professional learning and that true learning involves permanent changes in behavior and knowledge (Katz & Dack, 2013; Woolfolk, Winne, & Perry, 2012). As such, longitudinal research is needed to document effects on teachers and students. Comparative studies of data use and support in mathematics and literacy might also advance a more nuanced understanding of the domains of expertise associated with shifts in instructional practice as well as improvements in student outcomes.

Finally, future studies should continue to examine the ways in which teachers and administrators use data to foster equity in classrooms—equity in not only learning outcomes but also the quality of instruction afforded to students.  In the schools we studied, the majority of students were Latina/o and African American. Research indicates that such students often receive reductive, non-enriching curricula and instruction that emphasize basic and discrete skills and lead to students’ development of “unproductive and weak strategies for literacy learning” (Gutiérrez, Morales, & Martinez, 2009, p. 225). Future research might further consider the quality of the instructional changes occurring in these classrooms and whether student learning data help enrich instruction for these students, potentially addressing the so-called achievement gap.


1. Although similar, the concept of horizontal expertise is not synonymous with the concept of group knowledge. Much like conceptions of knowledge within communities of practice (Lave & Wenger, 1991), group knowledge is knowledge that is possessed in common by a group (Cook & Brown, 1999). This understanding of knowledge, however, does not account for the movement and interactions across contexts that can lead to learning and change (Tuomi-Gröhn et al., 2003). Because our analysis here is focused on the ways that coaches and PLCs may mediate teachers’ uses of data, the concept of horizontal expertise is not only fitting but preferable.

2. Throughout the article, all names of districts, schools, and individuals are pseudonyms.

3. A total of 15 respondents completed all their monthly surveys; the remaining 6 educators failed to respond to one or two of the surveys during the year.

4. That we found a preponderance of instrumental responses (330 of 343, or 96%) may be due to our interview protocols, which directly asked participants about their responses to data. Of the instrumental responses, only six were not directly related to instruction (e.g., teachers sharing ideas with one another).

5. Responses coded as “change in delivery” were responses that occurred within the classroom. For this reason, referral to out-of-class tutoring by definition did not constitute change in delivery because it occurred outside the classroom. We recognize that not all analyses of data necessitate a change in instructional delivery. It is possible that, on examining data, teachers concluded that their students performed well or met their goals and therefore decided to continue instructing without changes. In our data set, however, we rarely encountered such an instance.

6. We refer to this as “change in delivery” for the remainder of the article.

7. These differences may not appear to be substantial. However, they are notable considering the low number of this type of response across schools.


The authors gratefully acknowledge support for this research from the Spencer Foundation. We also greatly appreciate the cooperation of educators in our case schools and district, as well as contributions from other members of our research team, including Caitlin Farrell, Jennifer McCombs, Beth Katz, and Brian McInnis. In addition, we benefited greatly from helpful feedback from Judith Warren Little, Gina Biancarosa, Ellen Mandinach, Edith Gummer, and Martin Orland.

Appendix: Types of Data Emphasized in Case Study Schools


Note.  These data come from the monthly survey asking teachers, “In your meetings with the [literacy/data coach or PLC] last month, how much did your work emphasize the following data sources?” Response options included: no emphasis (1), small emphasis (2), moderate emphasis (3), and great emphasis (4). Means reported above represent the average of case study teacher responses over the course of the year. Assessments marked “external” refer to those developed by an external party, whereas assessments marked “teacher” refer to those that were teacher developed. The top two ranked types of data are shaded gray for each school.


Ackoff, R. L. (1989). From data to wisdom. Journal of Applied Systems Analysis, 16, 3–9.

Akkerman, S. F., & Bakker, A. (2011). Boundary crossing and boundary objects. Review of Educational Research, 81(2), 132–169.

Anagnostopoulos, D., Smith, E. R., & Basmadjian, K. G. (2007). Bridging the university-school divide: Horizontal expertise and the “two-worlds pitfall.” Journal of Teacher Education, 58(2), 138–152.

Anagnostopoulos, D., Smith, E. R., & Nystrand, M. (2008). Creating dialogic spaces to support teachers' discussion practices: An introduction. English Education, 41(1), 4–12.

Bean, R. M., Draper, J. A., Hall, V., Vandermolen, J., & Zigmond, N. (2010). Coaches and coaching in Reading First schools: A reality check. Elementary School Journal, 111(1), 87–114.

Booher-Jennings, J. (2005). Below the bubble: “Educational triage” and the Texas accountability system. American Educational Research Journal, 42, 231–268.

Brown, D., Reumann-Moore, D. R., Hugh, R., du Plessis, P., & Christman, J. B. (2006). Promising InRoads: Year One report of the Pennsylvania High School Coaching Initiative. Philadelphia, PA: Research for Action. Retrieved from http://www.researchforaction.org/publication-listing/?id=25

Bryk, A. S., Camburn, E., & Seashore Louis, K. (1999). Professional community in Chicago elementary schools: Facilitating factors and organizational consequences. Educational Administration Quarterly, 35(5), 751–781.

Campbell, C., & Levin, B. (2009). Using data to support educational improvement. Educational Assessment, Evaluation and Accountability, 21(1), 47–65.

Carlisle, J. F., & Berebitsky, D. (2011). Literacy coaching as a component of professional development. Reading and Writing: An Interdisciplinary Journal, 24(7), 773–800.

Coburn, C. E., & Russell, J. L. (2008). District policy and teachers’ social networks. Educational Evaluation and Policy Analysis, 30(3), 203–235.

Coburn, C. E., Toure, J., & Yamashita, M. (2009). Evidence, interpretation, and persuasion: Instructional decision making at the district central office. Teachers College Record, 111(4), 1115–1161.

Coburn, C. E., & Woulfin, S. L. (2012). Reading coaches and the relationship between policy and practice. Reading Research Quarterly, 47(1), 5–30.

Cook, D. N., & Brown, J. S. (1999). Bridging epistemologies: The generative dance between organizational knowledge and organizational knowing. Organization Science, 10(4), 381–400.

Cosner, S. (2011). Teacher learning, instructional considerations and principal communication: Lessons from a longitudinal study of collaborative data use by teachers. Educational Management Administration & Leadership, 39(5), 568–589.

Cosner, S. (2012). Leading the on-going development of collaborative data practices: Advancing a schema for diagnosis and intervention. Leadership and Policy in Schools, 11(1), 26–65.

Costa, A. L., Garmston, R. J., Ellison, J., & Hayes, C. (2002). Cognitive coaching: A foundation for renaissance schools. Norwood, MA: Christopher-Gordon.

Cuban, L. (1993). How teachers taught: Constancy and change in American classrooms, 1890–1990 (2nd ed.). New York, NY: Teachers College Press.

Daly, A. J. (2012). Data, dyads, and dynamics: Exploring data use and social networks in educational improvement. Teachers College Record, 114(11), 1–38.

Darling-Hammond, L., & Richardson, N. (2009). Research review/teacher learning: What matters. Educational Leadership, 66(5), 46–53.

Datnow, A., Park, V., & Kennedy-Lewis, B. (2012). High school teachers' use of data to inform instruction. Journal of Education for Students Placed at Risk, 17(4), 247–265.

Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-performing school systems use data to improve instruction for elementary students. Los Angeles: Center on Educational Governance, University of Southern California.

Dembosky, J. W., Pane, J. F., Barney, H., & Christina, R. (2006). Data driven decisionmaking in Southwestern Pennsylvania school districts. Santa Monica, CA: RAND.

Duncan, A. (2010, July). Unleashing the power of data for school reform. Paper presented at the STATS-DC 2010 National Center for Education Statistics Data Conference, Bethesda, MD. Retrieved from http://www.ed.gov/news/speeches/unleashing-power-data-school-reform-secretary-arne-duncans-remarks-stats-dc-2010-data

Elmore, R. (1995). Structural reform and educational practice. Educational Researcher, 24(9), 23–26.

Engeström, Y. (1999). Learning by expanding: Ten years after. Introduction to the German and Japanese editions of Learning by expanding. Retrieved from http://lchc.ucsd.edu/mca/Paper/Engestrom/expanding/intro.htm

Engeström, Y. (2001). The horizontal dimension of expansive learning: Weaving a texture of cognitive trails in the terrain of health care in Helsinki. Paper presented at the New Challenges to Research on Learning, Helsinki, Finland.

Engeström, Y., Engeström, R., & Kärkkäinen, M. (1995). Polycontextuality and boundary crossing in expert cognition: Learning and problem solving in complex work activities. Learning and Instruction, 5, 319–336.

Ertmer, P. D., Richardson, J., Cramer, J., Hanson, L., Huang, W., Lee, Y., . . . Um, E. J. (2005). Professional development coaches: Perceptions of critical characteristics. Journal of School Leadership, 15(1), 52–75.

Garet, M. S., Cronen, S., Eaton, M., Kurki, A., Ludwig, M., Jones, W., . . . Sztejnberg, L. (2008). The impact of two professional development interventions on early reading instruction and achievement (NCEE 2008-4030). National Center for Education Evaluation and Regional Assistance. Retrieved from http://ies.ed.gov/ncee/pubs/

Goertz, M. E., Nabors Oláh, L., & Riggan, M. (2009). From testing to teaching: The use of interim assessments in classroom instruction. Philadelphia, PA: Consortium for Policy Research in Education.

Gummer, E. S., & Mandinach, E. B. (2015). Building a conceptual framework for data literacy. Teachers College Record, 117(4).

Gutiérrez, K. D. (2008). Developing a sociocritical literacy in the Third Space. Reading Research Quarterly, 43(2), 148–164.

Gutiérrez, K. D., & Larson, J. (2007). Discussing expanded spaces for learning. Language Arts, 85(1), 69–77.

Gutiérrez, K. D., Morales, P. Z., & Martinez, D. C. (2009). Re-mediating literacy: Culture, difference, and learning for students from nondominant communities. Review of Research in Education, 33, 212–245.

Hadar, L., & Brody, D. (2010). From isolation to symphonic harmony: Building a professional development community among teacher educators. Teaching and Teacher Education, 26(8), 1641–1651.

Hamilton, L., Stecher, B., Marsh, J., McCombs, J. S., Robyn, A., Russell, J. L., . . . Barney, H. (2007). Standards-based accountability under No Child Left Behind: Experiences of teachers and administrators in three states. Santa Monica, CA: RAND Corporation.

Heppen, J., Jones, W., Faria, A., Sawyer, K., Lewis, S., Horwitz, A., . . . Casserly, M. (2012). Using data to improve instruction in the Great City Schools: Documenting current practice. Washington, DC: American Institutes for Research and The Council of Great City Schools.

Heritage, M., Kim, J., Vendlinski, T., & Herman, J. (2009). From evidence to action: A seamless process in formative assessment? Educational Measurement: Issues and Practice, 28(3), 24–31.

Herman, P., Wardrip, P., Hall, A., & Chimino, A. (2012). Teachers harness the power of assessment: Collaborative use of student data gauges performance and guides instruction. JSD, 33(4), 26–29.

Hipp, K. K., Huffman, J. B., Pankake, A. M., & Olivier, D. F. (2008). Sustaining professional learning communities: Case studies. Journal of Educational Change, 9(2), 173–195.

Horn, I. S., & Little, J. W. (2010). Attending to problems of practice: Routines and resources for professional learning in teachers’ workplace interactions. American Educational Research Journal, 47(1), 181–217.

Ikemoto, G. S., & Marsh, J. A. (2007). Cutting through the "data-driven" mantra: Different conceptions of data-driven decision making. Yearbook of the National Society for the Study of Education, 106(1).

Jacobs, J., Gregory, A., Hoppey, D., & Yendol-Hoppey, D. (2009). Data literacy: Understanding teachers' data use in a context of accountability and response to intervention. Action in Teacher Education, 31(3), 41–55.

Jennings, J. L. (2012). The effects of accountability system design on teachers’ use of test score data. Teachers College Record, 114(11), 1–12.

Katz, S., & Dack, L. A. (2013). Towards a culture of inquiry for data use in schools: Breaking down professional learning barriers through intentional interruption. Studies in Educational Evaluation. Retrieved from http://www.sciencedirect.com/science/article/pii/S0191491X13000503

Kerr, K. A., Marsh, J. A., Ikemoto, G. S., Darilek, H., & Barney, H. (2006). Strategies to promote data use for instructional improvement: Actions, outcomes, and lessons from three urban districts. American Journal of Education, 112(2), 496–520.

Kohler, F. W., Crilley, K. M., Shearer, D. D., & Good, G. (1997). Effects of peer coaching on teacher and student outcomes. Journal of Educational Research, 90(4), 240–250.

Lachat, M. A., & Smith, S. (2005). Practices that support data use in urban high schools. Journal of Education for Students Placed at Risk, 10(3), 333–349.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. New York, NY: Cambridge University Press.

Little, J. W. (2012). Understanding data use practice among teachers: The contribution of micro-process studies. American Journal of Education, 118(2), 143–166.

Louis, K. S., & Marks, H. M. (1998). Does professional community affect the classroom? Teachers' work and student experiences in restructuring schools. American Journal of Education, 106(4), 532–575.

Louis, K. S., Marks, H. M., & Kruse, S. (1996). Teachers’ professional community in restructuring schools. American Educational Research Journal, 33(4), 757–798.

Love, N., Stiles, K. E., Mundry, S. E., & DiRanna, K. (2008). A data coach's guide to improving learning for all students: Unleashing the power of collaborative inquiry. Thousand Oaks, CA: Corwin Press.

Mandinach, E. B., & Gummer, E. S. (2013). A systemic view of implementing data literacy in educator preparation. Educational Researcher, 42(1), 30–37.

Mandinach, E. B., & Jackson, S. S. (2012). Transforming teaching and learning through data-driven decision making. Thousand Oaks, CA: Corwin Press.

Marsh, J. A. (2012). Interventions promoting educators' use of data: Research insights and gaps. Teachers College Record, 114(11), 1–48.

Marsh, J. A., & Farrell, C. C. (2014). How leaders can support teachers with data-driven decision making: A framework for understanding capacity building. Educational Management Administration and Leadership. DOI: 10.1177/1741143214537229.

Marsh, J. A., McCombs, J. S., Lockwood, J. R., Martorell, F., Gershwin, D., Naftel, S., . . . Crego, A. (2008). Supporting literacy across the Sunshine State: A study of Florida middle school reading coaches. Santa Monica, CA: RAND.

Marsh, J. A., McCombs, J. S., & Martorell, F. (2010). How instructional coaches support data-driven decision making: Policy implementation and effects in Florida middle schools. Educational Policy, 24(6), 872–907.

Marsh, J. A., McCombs, J. S., & Martorell, F. (2012). Reading coach quality: Findings from Florida middle schools. Literacy Research and Instruction, 51(1), 1–26.

Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision making in education. Santa Monica, CA: RAND.

Mason, S. A. (2003, April). Learning from data: The role of professional learning communities. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.

McNaughton, S., Lai, M. K., & Hsiao, S. (2012). Testing the effectiveness of an intervention model based on data use: A replication series across clusters of schools. School Effectiveness and School Improvement, 23(2), 203–228.

Means, B., Chen, E., DeBarger, A., & Padilla, C. (2011). Teachers' ability to use data to inform instruction: Challenges and supports. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation and Policy Development.

Means, B., Padilla, C., DeBarger, A., & Bakia, M. (2009). Implementing data-informed decision making in schools: Teacher access, supports and use. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation and Policy Development.

Means, B., Padilla, C., & Gallagher, L. (2010). Use of education data at the local level: From accountability to instructional improvement. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation, and Policy Development.

Merriam, S. B. (1998). Qualitative research and case study applications in education (revised and expanded from Case study research in education). San Francisco, CA: Jossey-Bass.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. Thousand Oaks, CA: Sage.

Mitchell, C., & Sackney, L. (2011). Profound improvement: Building capacity for a learning community (2nd ed.). London, England: Taylor & Francis.

Murnane, R. J., Sharkey, N. S., & Boudett, K. P. (2005). Using student-assessment results to improve instruction: Lessons from a workshop. Journal of Education for Students Placed at Risk, 10(3), 269–280.

Nabors Oláh, L., Lawrence, N. R., & Riggan, M. (2010). Learning to learn from benchmark assessment data: How teachers analyze results. Peabody Journal of Education, 85(2), 226–245.

Nelson, T. H. (2009). Teachers' collaborative inquiry and professional growth: Should we be optimistic? Science Education, 93(3), 548–580.

Nelson, T. H., Slavit, D., & Deuel, A. (2012). Two dimensions of an inquiry stance toward student-learning data. Teachers College Record, 114(8), 1–42.

Nelson, T. H., Slavit, D., Perkins, M., & Hathorn, T. (2008). A culture of collaborative inquiry: Learning to develop and support professional learning communities. Teachers College Record, 110(6), 1269–1303.

Neufeld, B., & Roper, D. (2003). Coaching: A strategy for developing instructional capacity: Promises & practicalities. Washington, DC: Aspen Institute Program on Education and the Annenberg Institute for School Reform.

Patton, M. Q. (2002). Two decades of developments in qualitative inquiry: A personal, experiential perspective. Qualitative Social Work, 1(3), 261–283.

Pedulla, J. J., Abrams, L., Madaus, G., Russell, M., Ramos, M., & Miao, J. (2003). Perceived effects of state-mandated testing programs on teaching and learning: Findings from a national survey of teachers. Chestnut Hill, MA: National Board on Educational Testing and Public Policy.

Putnam, R. T., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29(1), 4–15.

Quint, J. C., Sepanik, S., & Smith, J. K. (2008). Using student data to improve teaching and learning. Boston, MA: MDRC.

Ragin, C. C., & Becker, H. S. (1992). What is a case? Exploring the foundations of social inquiry. Cambridge, England: Cambridge University Press.

Rodgers, A., & Rodgers, E. M. (2007). The effective literacy coach: Using inquiry to support teaching and learning. New York, NY: Teachers College Press.

Schildkamp, K., & Ehren, M. (2013). From "intuition"- to "data"-based decision making in Dutch secondary schools? In K. Schildkamp, M. K. Lai, & L. M. Earl (Eds.), Data-based decision making in education: Challenges and opportunities [online]. Dordrecht, Netherlands: Springer.

Spillane, J. (2002). Local theories of teacher change: The pedagogy of district policies and programs. Teachers College Record, 104(3), 377–420.

Spillane, J., Hallett, T., & Diamond, J. B. (2003). Forms of capital and the construction of leadership: Instructional leadership in urban elementary schools. Sociology of Education, 1–17.

Stoll, L., Bolam, R., McMahon, A., Wallace, M., & Thomas, S. (2006). Professional learning communities: A review of the literature. Journal of Educational Change, 7, 221–258.

Stoll, L., & Louis, K. S. (2007). Professional learning communities: Divergence, depth and dilemmas. Maidenhead, England: Open University Press.

Strauss, A., & Corbin, J. M. (Eds.). (1997). Grounded theory in practice. Thousand Oaks, CA: Sage.

Supovitz, J., & Klein, V. (2003). Mapping a course for improved student learning: How innovative schools use student performance data to guide improvement. Philadelphia, PA: Consortium for Policy Research in Education.

Symonds, K. W. (2003). After the test: How schools are using data to close the achievement gap. San Francisco, CA: Bay Area School Reform Collaborative.

Talbert, J. E. (2009). Professional learning communities at the crossroads: How systems hinder or engender change. In Second international handbook of educational change (pp. 555–571). New York, NY: Springer.

Tschannen-Moran, M., Uline, C., Woolfolk Hoy, A., & Mackley, T. (2000). Creating smarter schools through collaboration. Journal of Educational Administration, 38(3), 247–272.

Tuomi-Gröhn, T., Engeström, Y., & Young, M. (2003). From transfer to boundary-crossing between school and work as a tool for developing vocational education: An introduction. In T. Tuomi-Gröhn & Y. Engeström (Eds.), Between school and work: New perspectives on transfer and boundary-crossing (pp. 1–15). Amsterdam, Netherlands: Pergamon.

Vescio, V., Ross, D., & Adams, A. (2008). A review of research on the impact of professional learning communities on teaching practice and student learning. Teaching and Teacher Education, 24(1), 80–91.

Wei, R. C., Darling-Hammond, L., Andree, A., Richardson, N., & Orphanos, S. (2009). Professional learning in the learning profession: A status report on teacher development in the United States and abroad. Oxford, OH: National Staff Development Council.

Weiss, C. H. (1980). Knowledge creep and decision accretion. Knowledge: Creation, Diffusion, Utilization, 1(3), 381–404.

Wong, K., & Nicotera, A. (2006). Peer coaching as a strategy to build instructional capacity in low performing schools. In K. Wong & A. Nicotera (Eds.), System-wide efforts to improve student achievement (pp. 271–302). Charlotte, NC: Information Age.

Woolfolk, A., Winne, P., & Perry, N. (2012). Educational psychology (5th Canadian ed.). Toronto, Ontario, Canada: Pearson.

Young, V. M., & Kim, D. H. (2010). Using assessments for instructional improvement: A literature review. Education Policy Analysis Archives, 18(19).

Cite This Article as: Teachers College Record, Volume 117, Number 4, 2015, p. 1–40. http://www.tcrecord.org ID Number: 17849

About the Author
  • Julie Marsh
    University of Southern California
    JULIE A. MARSH is an associate professor at the Rossier School of Education at the University of Southern California. She specializes in research on K-12 policy implementation, educational reform, and accountability. Her research blends perspectives in education, sociology, and political science. Recent publications on data use include: “Trickle down accountability? How middle school teachers engage students in data use,” Educational Policy, 2014, 1–28 (online first), and “Interventions promoting educators’ use of data: Research insights and gaps,” Teachers College Record, 2012, 114(11), 1–48.
  • Melanie Bertrand
    Arizona State University
    MELANIE BERTRAND is an assistant professor at Arizona State University in Mary Lou Fulton Teachers College. Her research employs micro- and macro-level lenses to expand conceptions of leadership and explore the role of student voice in challenging systemic racism in education. She recently published an article in Educational Administration Quarterly about possibilities for reciprocal dialogue between students of color and educational decision makers.
  • Alice Huguet
    University of Southern California
    ALICE HUGUET is a Dean’s Ph.D. fellow at the University of Southern California’s Rossier School of Education. Her research interests include data use policies for school improvement in urban contexts, interorganizational relationships between schools of varying governance models, and implementation of parent engagement policies. Recent publications include: “Democratic engagement in district reform: The evolving role of parents in the Los Angeles Public School Choice Initiative,” Educational Policy, 2014, 1–34 (online first), and “Building teachers’ data-use capacity: Insights from strong and struggling coaches,” in Education Policy Analysis Archives, 2014, 22(52), 1–26.