
The Development of Capacity for Data Use: The Role of Teacher Networks in an Elementary School

by Elizabeth Farley-Ripple & Joan Buttram - 2015

Background: Amid calls for increased data use, there is little research or policy guidance for how to build schools’ capacity to leverage data to improve teaching and learning. Building on previous research highlighting the social nature of data use, we contend that in order to understand how capacity develops, research must focus on relationships and networks that support educators’ practice, conceptualizing capacity as socially embedded.

Purpose: This article explores the development of data use capacity in an elementary school through a social network approach. Our analysis focuses on the structure of data advice networks, the characteristics of perceived experts in the network, and the productiveness of the network in terms of influencing beliefs and practice.

Population: Data come from a sample of 42 educators in an elementary school identified by its district as a strong user of data to improve teaching and learning. Participants completed a survey about their data use beliefs, practices, and school context, as well as a social network questionnaire indicating from whom they sought advice on using data.

Research Design: We used the survey data to identify characteristics of the school’s data use networks using descriptive statistics and social network analysis (SNA). SNA was also used to develop measures of structural location in those networks, which were then used to predict similarities in teachers’ beliefs and practices around data use.

Findings: Findings reveal that data use networks are influenced by the larger professional structure of the school, with educators seeking data advice from colleagues who are already part of their larger professional network. Network structure reveals few highly central “advice givers” and many “advice seekers,” connected by teachers and leaders who serve as brokers of advice. We find that brokers may play an important role in developing shared practices, given that the indirect relationships they support are predictive of shared data use practices.

Conclusions: This research is among the first to explore data use through a social network approach and offers early evidence about how educators’ networks enable schools to build capacity for data use. Our findings have implications for the design of professional development, for professional development for school leaders, and for successful implementation of reforms related to data use.

In the last decade, expectations for using data to improve teaching and learning have grown exponentially. Underlying policy efforts to increase educators’ use of data is a theory of action in which data use improves instruction, whether data inform school improvement efforts or teachers’ daily instructional decision making. Central to this underlying theory of action is educators’ capacity for data use. As Daly (2012) argued, “The ultimate success of data use for educational improvement may depend on how states and local education agencies build capacity” (p. 2).

However, amid calls for increased expectations for data use, there is little understanding of how capacity for data use develops. A great deal of policy effort is directed toward data use, focusing generally on creating data systems, management information systems, and assessment systems that provide teachers timely access to quality data (Data Quality Campaign, 2013; Means, Padilla, & Gallagher, 2010; NCES, n.d.), belying a belief that “if you build it, they will come.” However, having access to data does not equate to using it to improve teaching and learning (Schildkamp & Kuiper, 2010; Stecker, Fuchs, & Fuchs, 2005). In fact, there is evidence that data systems may be underutilized (Cho & Wayman, 2014; Means, Padilla, DeBarger, & Bakia, 2009; Wayman, Cho, & Shaw, 2009; Wayman, Jimerson, & Cho, 2012).

Additionally, interventions designed to build capacity have been implemented, and research is beginning to provide evidence of their effectiveness (Konstantopoulos, Miller, & van der Ploeg, 2013; May & Robinson, 2007; Saunders, Goldenberg, & Gallimore, 2009). However, as Marsh (2012) illustrated in a review, data use interventions rely on a range of strategies, from professional development to technology support to setting expectations and norms for data use, generally producing mixed evidence in terms of outcomes for educators’ knowledge, skills, and practices. The range of strategies suggests that there is little consensus on the mechanisms by which educators’ skills are influenced. Thus, amid calls for increased data use, there is little evidence on how capacity for data use develops.

In this article, we offer evidence of a potentially powerful mechanism for the development of capacity to use data: teacher networks. In the literature to date, we identify two dominant views of capacity—capacity as situated at the individual and organizational levels. However, we argue that greater attention needs to be paid to a third conceptualization—capacity as embedded in social relations. Using the perspective of social network theory, we explore how interactions between educators can support capacity development within schools. In the following pages, we present a brief review of the data use literature, attending specifically to the issue of capacity. We then examine this conceptualization of capacity in the context of an elementary school identified by its district as a strong user of data to improve teaching and learning. Our analysis focuses on the structure of data advice networks, the characteristics of perceived experts in the network, and the productiveness of the network in terms of influencing beliefs and practice. Finally, we discuss our results in terms of how a socially embedded conceptualization of capacity may be applied in research, policy, and practice.


Research findings often frame data use as an outcome, with capacity understood as organizational conditions that support or enable data use. Specifically, research has identified the importance of structured time as critical for successful collaboration (Blanc et al., 2010; Coburn & Turner, 2011; Datnow, Park, & Wohlstetter, 2007; Halverson, Grigg, Prichett, & Thomas, 2007; Ikemoto & Marsh, 2007; Ingram, Louis, & Schroeder, 2004; Kerr, Marsh, Ikemoto, Darilek, & Barney, 2006; Sharkey & Murnane, 2006; Supovitz, 2002; Supovitz & Klein, 2003; Wayman, 2005). Other resources shown to impact collaborative data use include timely access to data or other evidence of student learning (Kerr et al., 2006; Lachat & Smith, 2005; Lasley, 2009; Marsh, Pane, & Hamilton, 2006; Supovitz & Klein, 2003; Wayman & Cho, 2007; Young, 2006), tools or guides for collaborative activities (Cosner, 2011; Little & Curry, 2009; Nelson, 2008), and additional professional development (Cosner, 2011; Datnow et al., 2007; Fuchs, Fuchs, Karns, Hamlett, & Katz, 1999; Hamilton et al., 2009; Kerr et al., 2006; Saunders et al., 2009; Supovitz, 2002; Wayman & Cho, 2007).

Relatedly, organizational capacity is associated with leadership, given that both the allocation and coordination of resources are in large part functions of district and school leaders. This literature has documented a number of leadership actions that constrain or support data use (Anderson, Leithwood, & Strauss, 2010; Blanc et al., 2010; Knapp, Copland, & Swinnerton, 2007; Datnow et al., 2007; Firestone & Gonzalez, 2007; Hamilton et al., 2009; Honig & Venkateswaran, 2012; Ikemoto & Marsh, 2007; Supovitz & Klein, 2003; Wayman, 2005; Weinbaum, 2009; Young, 2006).

A second research lens focuses on individual capacity—that is, educators’ knowledge and skills related to data use (see Mandinach, Friedman, & Gummer, 2015, this issue). The activity of using data can be described as comprising analytical and action-oriented tasks in which educators must make sense of the data (analysis and interpretation) as well as make choices about how to act (Coburn & Turner, 2011; Cosner, 2011; Marsh et al., 2006). This entails a degree of knowledge about both instruction and data—a combination recently termed data literacy for teaching but also labeled pedagogical data literacy (Mandinach & Gummer, 2013) and instructional decision-making (Means, Chen, DeBarger, & Padilla, 2011). Related to educators’ data literacy are their beliefs and attitudes regarding data. Research has suggested that preexisting cognitive frameworks mediate interpretation and search for information (Honig & Coburn, 2008; Kennedy, 1982) and that individuals vary in their beliefs about forms of evidence and their value (Coburn, 2001; Coburn & Talbert, 2006; Corcoran, Fuhrman, & Belcher, 2001; Light et al., 2005). However, research on data use more often than not identifies the lack of individual capacity in schools and districts (Ikemoto & Marsh, 2007; Means et al., 2009; Supovitz & Klein, 2003; Wayman & Stringfield, 2006). As a result, findings often suggest a need to improve human capacity through investments in professional development linking data to instructional practice (Cosner, 2011; Datnow et al., 2007; Kerr et al., 2006; Saunders et al., 2009; Schildkamp & Kuiper, 2010; Supovitz, 2002; Wayman & Cho, 2007) or to address data literacy needs through preservice preparation programs (Mandinach & Gummer, 2013).

Nested between these two conceptualizations of capacity is a third form—data use capacity embedded in social relations or the social structure of the organization. The significance of a socially embedded conceptualization of capacity is suggested by data use research in three ways. First, research has found that data use in schools and school systems is rarely an individual activity, but rather that practice is often social in nature (Halverson et al., 2007; Means et al., 2009, 2011). As discussed above, research has documented how preexisting cognitive frameworks and beliefs influence how data are valued and interpreted. Therefore, to the extent that data use occurs in a social context, how data are interpreted and transformed into actionable information will depend on the individuals involved in the conversation. Thus, capacity for data use is not an individual attribute but one that occurs at the level of interpretation and action—which may often be groups or teams of teachers. Research further suggests that effective data use stems from collaborative processes (e.g., Datnow et al., 2007; Kerr et al., 2006; Lachat & Smith, 2005; Means et al., 2009; Schildkamp & Kuiper, 2010; Wayman & Stringfield, 2006) and can result in better instructional decision making and foster deeper use of data (Lachat & Smith, 2005; Supovitz, Merrill, & Conger, 2010; Young, 2006). Further, most prescriptive models of data use incorporate substantial collaborative elements (Supovitz & Morrison, 2011).

In addition to serving as opportunities for co-construction of knowledge based on data, social relations are opportunities for knowledge to be shared. That is, educators draw on social relations to identify expertise and to access information, resources, or skills throughout the organization. Research documents the role of formal instructional leaders, such as coaches, to support learning about data use (Cosner, 2011; Lachat & Smith, 2005). Further, evidence suggests that learning about data use is not limited to formal or expert resources; rather, research provides descriptions of successful collaboration around data based on learning from and with colleagues with only slightly more advanced data use skills (Wayman, Jimerson, & Cho, 2012; Wayman & Stringfield, 2006; Young, 2006). Thus, conceptualizing capacity as socially embedded elucidates the process by which knowledge and expertise move through an organization.

Relatedly, research on data use consistently has found that the culture of an organization influences data use or inquiry more broadly through the development of norms and expectations for inquiry (Honig & Venkateswaran, 2012; Ikemoto & Marsh, 2007; Ingram et al., 2004; Knapp et al., 2007; Lachat & Smith, 2005; Supovitz & Klein, 2003; Wayman & Stringfield 2006) and through development of norms associated with professional community, such as a need for trust, a focus on student learning, shared values, deprivatized practice, and reflective dialogue (Blanc et al., 2010; Bryk & Schneider, 2002; Hord, 1997; Kruse, Louis, & Bryk, 1995; McLaughlin & Talbert, 2001; Saunders et al., 2009; Supovitz & Morrison, 2011; Vescio, Ross, & Adams, 2008). These dimensions of organizational culture shape the nature of interaction among educators, which further supports the conceptualization of capacity as socially embedded.

These findings suggest that both schools’ and individuals’ capacity to use data are influenced by the nature and quality of social relations among educators, which we refer to as teacher networks. However, little work specifically has focused on examining the presence, quality, strength, or impact of teacher networks with respect to data use or connected these relationships to the development of school or individual capacity for data use, though calls for such research have been made (Daly, 2012). The purpose of this research is to explore these issues through a case study of an elementary school. Because this research is concerned with educators’ interactions and relationships with regard to data, we utilize a social network approach in which focus is shifted from individuals and their attributes to their interactions with others and position in a larger organizational network.  This approach, therefore, enables an understanding of how individuals’ or actors’ interactions (or lack thereof) enable or constrain certain opportunities—for example, access to information, expertise, or other resources. Thus, in the context of data use, such analysis permits an understanding of schoolwide networks (e.g., who interacts around data, who learns from whom about data use) and individuals’ position in the network (e.g., who are considered “experts,” who has access to expertise).  


Social network theory has been applied to a number of issues in education research, particularly to understand implementation and sustainability of reforms (Atteberry & Bryk, 2010; Coburn & Russell, 2008; Cole & Weinbaum, 2010; Daly & Finnigan, 2009, 2011; Daly, Moolenaar, Bolivar, & Burke, 2010; Penuel, Riel, Krause, & Frank, 2009; Spillane et al., 2009), teacher collaboration (see Moolenaar, 2012, for a discussion), professional learning (Penuel, Sun, Frank, & Gallagher, 2012), and diffusion of innovation (Frank, Zhao, & Borman, 2004; Frank, Zhao, Penuel, Ellefson, & Porter, 2011). Though at the time of this study no work had specifically focused on the development of capacity for data use, a number of relevant network theory concepts employed in education research are instructive in understanding the role of networks in developing teacher capacity. Although a comprehensive review of network concepts is beyond the scope of this analysis (see Daly, 2012, for such a discussion), we focus on three key issues relevant to our purposes.


The foundation of social network theory is the social ties between actors, and the pattern of ties among individuals creates a network structure. A good deal of social network analysis in education research has focused on characteristics of network structure as a way of understanding how resources flow to and from individuals in the network. Of particular interest here are the network characteristics density (a measure of how many of the organization’s members interact with each other) and centralization (how evenly distributed interactions are across actors). Education research typically uses these measures to assess organizational attributes such as social capital, capacity for reform, and organizational learning (Atteberry & Bryk, 2010; Daly & Finnigan, 2010; Daly et al., 2010; Finnigan & Daly, 2012; Moolenaar & Sleegers, 2010). If capacity is conceptualized as embedded in the social ties among teachers, then the information about the structure of those relations helps us to understand how capacity moves or develops in an organization.


From a network theory perspective, not all ties among actors are the same. Relations between individuals can be characterized in three ways: relations, interactions, and flows (Borgatti, Mehra, Brass, & Labianca, 2009). Social relations describe the nature of the relationship (e.g., colleagues, friends). Interactions quantify the nature of a social relation (e.g., frequency, strength). Flows indicate resources that move through those relations (e.g., expertise, knowledge). Research on social networks in education has documented that the structure of networks can vary by the type of relation and that each type of relationship can have a different impact on outcomes such as beliefs, attitudes, and practice (Cole & Weinbaum, 2010; Finnigan & Daly, 2012). To help categorize ties, previous studies have characterized networks as expressive, referring to natural and social affinities, or instrumental, developing for a particular purpose, such as advice networks (e.g., Cole & Weinbaum, 2010; Daly & Finnigan, 2010, 2011; Finnigan & Daly, 2012; Spillane & Kim, 2012). Research also shows that the frequency or strength of interactions may be related to outcomes of reform, such as in the case of implementation of mathematics reform (Coburn, Russell, Kaufman, & Stein, 2012; Penuel et al., 2012). That is, how often actors interact or the importance of the relationship may help explain outcomes such as degree or fidelity of implementation. There is also growing attention to the resources that move through relations in an organization. Research in education has focused on how networks serve as a mechanism for creating or constraining access to resources such as expertise (Baker-Doyle & Yoon, 2010; Frank et al., 2004; Penuel et al., 2012) or research evidence (Finnigan, Daly, & Che, 2013).


Network theory holds that an actor’s position in a network is related to his or her degree of influence in the organization. The concept of centrality is one often applied in education research. Central actors are hypothesized to have substantial influence or control over how resources, information, or expertise are distributed throughout an organization. Recent work suggests the power of central actors in the diffusion of knowledge and attitudes. For example, studies have explored the centrality of leadership in instructional advice networks (Spillane & Kim, 2012) as a way of understanding influence on the instructional program of a school. In contrast, individuals on the periphery may have fewer ties to others and therefore depend on a few relations for access to organizational resources. Those in between can serve as connectors or bridges, such as central offices serving as brokers of policy to schools and between groups (Daly & Finnigan, 2011) or coaches supporting the diffusion of an initiative (Atteberry & Bryk, 2010).

In the context of data use, network structure, content and strength of relations, and actor position are useful concepts in understanding the socially embedded nature of capacity to use data. They provide ways to measure which resources, such as knowledge and expertise around data use, are shared, as well as how often and between whom they are shared. Thus, an understanding of teacher networks should be a productive lens for understanding both individuals’ and schools’ development of capacity. To explore this proposition, our research examines the role of teacher networks in the development of capacity for data use in an elementary school. Drawing on social network theory, we focus on the structure of data advice networks, the characteristics of those from whom advice is sought, and the productiveness of the network in terms of influencing beliefs and practice. Specifically, the present study seeks to address the following questions: (1) What are the characteristics of professional and data advice networks in the school? (2) Who are the providers of support for data use, and what are their characteristics? (3) Are data advice networks productive in developing shared data beliefs and practices?


Data for this analysis were drawn from a broader multiple case study of school data use in four elementary schools (serving Grades K–5) in two districts in a mid-Atlantic state. Districts were selected based on comparable size and diversity of population. Schools were selected with the advice of the superintendents, who identified one school where data use was considered a strength and another where it was considered an area for improvement. In separate analyses conducted as part of the research project, one school was identified as an exemplar in terms of data use. We elected to focus on this school—Allegheny Elementary—for further investigation of the development of data use capacity.


Allegheny Elementary School is a high-performing school of about 550 students and 53 educators. Allegheny is one of 12 schools in a primarily urban district. The school serves a majority low-socioeconomic-status (88% eligible for free/reduced lunch) and non-White student population (50% African American, 30% Hispanic), with a substantial portion of English language learners (20%) and students requiring special education services (12%).1 In 2010–2011, 85% of students exceeded math and reading standards.

To further describe the context of Allegheny, we draw on previous analyses (see Farley-Ripple & Buttram, 2013, 2014). Many of the conditions research has identified as important for data use were evident in this school, including leadership for data use, a strong sense of professional community, and key resources. The principal was highly involved in the instructional program and targeted resources to meet instructional needs. For example, by hiring a literacy coach, a math coach, and several reading specialists, the principal increased the instructional staff of the school; this enabled him to assign and integrate instructional specialists into professional learning communities, provide release time for teacher collaboration and professional development, and provide more targeted support to struggling teachers and students. The principal emphasized team and leadership development through collaboration and suggested that many teaching-related activities should be done in this collaborative environment, including discussing lesson plans, sharing activities, discussing how students were doing, and ultimately delving into specific topics based on an agenda and goals they would develop together. The school had been implementing professional learning communities for grade-level teams for two years prior to 2010 and had a structure in place to allocate 90 minutes during and after school time for collaborative data use.

As a result of both leadership and collaboration, the school had established norms and practices for using data to drive improvement. Further, teachers and administrators utilized a wide range of data—facilitated by district investment in a comprehensive data system—to inform classroom, grade-level, and schoolwide decisions. For example, teachers were expected to regularly examine data and to set short-term goals for student performance between assessments. At faculty meetings, teams posted both goals and student results by individual teacher; instructional strategies were shared, improvements were suggested, and successes—both group and individual—were celebrated.

In many ways, Allegheny is an outlier. In our previous research (Farley-Ripple & Buttram, 2013, 2014), we found that Allegheny was uniquely successful in producing schoolwide data use for instructional improvement. More broadly, it outperformed most elementary schools in the state and all schools in its district despite serving a traditionally disadvantaged population. As an outlier, it is valuable as a case study because the context for developing capacity for data use is not confounded with other issues that research has identified as constraining data use—including those organizational conditions examined in previous research—thus offering insight into the specific influence of networks on the development of data use capacity. Although its uniqueness limits generalizability of specific findings, we can use what we learn from Allegheny as a foundation for exploring capacity development in other contexts. Furthermore, our analysis of Allegheny illustrates several ways in which a network analytical approach can be valuable in understanding individual and school capacity for data use.


The larger study from which our data are drawn employed a mixed-methods approach in which qualitative data (interviews, focus groups, and document analysis) were coupled with survey data. Data were collected during the 2010–2011 school year. For the purposes of the present analyses, we focus on data collection as relevant to the examination of social networks at Allegheny, which was drawn from a schoolwide survey administered in May 2011, with an 80% response rate (n = 42).

The survey included three types of items: social network items; items related to beliefs about data and data-use practices; and teacher background questions. To collect information about teachers’ social networks, respondents were asked two separate sets of questions using a bounded approach in which we provided a list of all the staff in the school. First, we asked individuals to identify their professional network by listing up to five of their closest professional colleagues, indicating how often they interacted with them on a 5-point scale from daily to less than a few times a year. Although professional networks are not a focus of this research specifically, we included this measure to gauge the frequency with which teachers generally interacted professionally in comparison with interactions around data use and as a measure of the social capital of the teachers and the school. In addition, we asked a set of instrumental questions related to data advice. Specifically, we asked them to identify their data advice network by (1) listing up to five colleagues to whom they go for help or advice using data and (2) indicating how often they go to those colleagues on a 5-point scale from daily to less than a few times a year.

To explore the productivity of networks, we selected two outcomes intended to capture teachers’ reported beliefs and practices with respect to data use. We utilize two scales from Wayman and colleagues (2009): A five-item scale related to beliefs about the value of data in instruction was utilized to represent the construct pedagogy, and a second five-item scale related to how data were used in instructional decision-making was utilized to represent the construct practice (see Appendix A for items). We recognize that these items represent simple conceptualizations of belief and practice and may not include important dimensions of each construct; however, responses to the items demonstrated sufficient reliability and variability for our exploratory purposes. Last, we included background questions related to role in the school, years of experience, and perceived comfort and expertise using data. Descriptive statistics on all measures used are presented in Appendix A.


Analyses of scale items were conducted in SPSS. Appendix B presents more information about the items and reliability for pedagogy, practice, and comfort. Social network data were analyzed using UCINET (Borgatti, Everett, & Freeman, 2002) and NetDraw (Borgatti, 2002). The social network items from the survey yielded two matrices—professional and data advice—in which ties were characterized by both direction (who the respondent identified) and strength (how often the respondent went to that person). We utilized these matrices to answer research questions 1 and 2.

The first research question focuses on the characteristics of professional and data advice networks in the school. We use network measures of density and centrality to characterize the structure of professional and data advice networks. We first calculated density—the number of ties between actors divided by the total number of possible ties—as a measure of the connectedness of the school. That is, if there are more social ties between actors, we might expect information or resources to flow more quickly through the school. In this case, we are interested in general professional interactions and interactions specific to advice related to data use. Therefore, we compare density of the professional network to the density of the data advice network using a bootstrap paired sample t test.  
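For readers less familiar with these network measures, the density calculation described above can be sketched in a few lines of pure Python. The 4-actor matrix below is hypothetical, not the Allegheny data.

```python
def density(adj):
    """Density of a directed network: observed ties divided by
    possible ties, n * (n - 1) for n actors."""
    n = len(adj)
    ties = sum(1 for i in range(n) for j in range(n)
               if i != j and adj[i][j] > 0)
    return ties / (n * (n - 1))

# Hypothetical 4-actor advice network: adj[i][j] > 0 means
# actor i seeks advice from actor j.
advice = [
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [1, 0, 0, 0],
    [0, 1, 0, 0],
]
print(density(advice))  # 4 observed ties / 12 possible
```

The bootstrap paired-sample t test then compares this statistic across the two networks while accounting for the dependence of observations within a single network.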

As a second overall network measure, we calculated Freeman’s degree centralization. This measure is based on the number of incoming ties to each actor (indegree) and the number of ties from an actor to others (outdegree). Because our data constrained the outdegree of individuals by requesting identification of up to five of their colleagues, we focus on indegree as a measure of centrality. At a network level, the indegree centrality statistic is a measure of how distributed indegree is for a given network. Thus, higher values (closer to 1) indicate a concentration of indegrees on few individuals and thus a more centralized network. Central actors can be considered influential and may exert disproportionate control over resources. In an advice network, a highly centralized structure may indicate that expertise (or perceived expertise) is located within a few individuals rather than dispersed throughout the organization.
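The intuition behind Freeman’s centralization is that it compares the observed network to a perfect “star,” in which every actor nominates a single central individual. A minimal sketch of the indegree version on binary ties (again with hypothetical data) is:

```python
def indegree_centralization(adj):
    """Freeman degree centralization on indegree (binary ties): the sum
    of differences between the maximum indegree and each actor's
    indegree, normalized by the largest value that sum can take,
    (n - 1) ** 2, attained by a perfect 'star' network."""
    n = len(adj)
    indeg = [sum(1 for i in range(n) if i != j and adj[i][j] > 0)
             for j in range(n)]
    c_max = max(indeg)
    return sum(c_max - c for c in indeg) / ((n - 1) ** 2)

# A star: actors 1-3 all seek advice from actor 0 (hypothetical data).
star = [
    [0, 0, 0, 0],
    [1, 0, 0, 0],
    [1, 0, 0, 0],
    [1, 0, 0, 0],
]
print(indegree_centralization(star))  # 1.0, maximally centralized
```

A value near 0 would instead indicate that advice seeking is spread evenly across the staff.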

We also use the quadratic assignment procedure (QAP) for measuring the association between these networks. This procedure calculates the correlation—because all matrices are valued data, the Pearson correlation coefficient is most appropriate—between any pair of matrices and produces a measure of association. The procedure then runs a series of permutations (here, n = 500) that randomly matches actors, reporting the proportion of permutations where the association is larger or smaller than the observed association. These proportions are used to determine the statistical significance of the observed association. We correlated the two matrices in two ways. First, we dichotomized the matrices to observe the correlation based entirely on whom individuals identified as part of their network, absent measures of strength. This provides a way of assessing whether the members of the networks differ, irrespective of how often individuals interact. Second, we correlated the matrices in which ties indicated strength, which provides a measure of whether the frequency of interaction with members differs between networks.
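The logic of the QAP test can be illustrated with a pure-Python sketch: correlate the off-diagonal cells of the two matrices, then repeatedly permute the rows and columns of one matrix together (so its internal structure is preserved while the actor labels are scrambled) and see how often the permuted correlation matches or exceeds the observed one. The matrices here are toy data, and UCINET’s implementation differs in its details.

```python
import random

def _offdiag(m):
    """Flatten a square matrix, dropping the diagonal (self-ties)."""
    n = len(m)
    return [m[i][j] for i in range(n) for j in range(n) if i != j]

def _pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def qap(m1, m2, n_perm=500, seed=1):
    """Observed correlation plus the share of actor permutations of m2
    whose correlation with m1 is at least as large."""
    rng = random.Random(seed)
    n = len(m1)
    obs = _pearson(_offdiag(m1), _offdiag(m2))
    as_large = 0
    for _ in range(n_perm):
        p = list(range(n))
        rng.shuffle(p)  # permute rows and columns with the same mapping
        permuted = [[m2[p[i]][p[j]] for j in range(n)] for i in range(n)]
        if _pearson(_offdiag(m1), _offdiag(permuted)) >= obs:
            as_large += 1
    return obs, as_large / n_perm

# A valued matrix correlated with itself yields an observed r of 1.0;
# only permutations that reproduce its structure match that.
m = [[0, 2, 1], [2, 0, 1], [1, 1, 0]]
obs, p_ge = qap(m, m)
print(obs)
```

Permuting actors rather than individual cells is what distinguishes QAP from an ordinary correlation test: it respects the fact that cells sharing a row or column are not independent observations.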

To answer the second research question, which focuses on providers of support for data use, we draw again on centrality as a means of identifying key actors in the data advice network. We calculate Freedman’s indegree for each individual to assess consistency across networks and to delineate central from peripheral actors in the advice network. Indegree is calculated at the actor level by summing the number of actors seeking advice from that individual. Because relations are valued by the frequency of interaction, indegree is the sum of values for all incoming ties to an actor. We articulate three groups: those individuals who were in the top 10th percentile for indegree centrality (n = 6), those from whom anyone sought advice (indegree > 0, n = 24), and those from whom no one sought advice (n = 27). We then utilized descriptive analyses to compare characteristics of each group-based role in the school as well as on responses for the following survey items: years’ experience in the school and in education; pedagogy and practice scales; school culture scales; items related to perceived comfort and expertise using data; and value of particular professional development opportunities related to data use. We also use flow-betweenness as a measure of centrality to identify those who connect other advice seekers with central advice givers. These analyses provide a profile of the empirically defined “experts” in the school.
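The actor-level, valued version of indegree used here differs from the binary count: each incoming tie contributes its frequency value rather than 1. A small sketch with hypothetical frequencies shows the calculation and the grouping into advice givers and non-givers:

```python
def valued_indegree(adj):
    """Actor-level indegree on a valued network: for each actor, the sum
    of the strengths of all incoming advice-seeking ties."""
    n = len(adj)
    return [sum(adj[i][j] for i in range(n) if i != j) for j in range(n)]

# Hypothetical valued advice matrix: adj[i][j] is how often
# actor i seeks advice from actor j (0 = never).
adj = [
    [0, 3, 0, 1],
    [0, 0, 0, 2],
    [0, 4, 0, 2],
    [0, 0, 0, 0],
]
indeg = valued_indegree(adj)
print(indeg)  # [0, 7, 0, 5]
givers = [a for a, d in enumerate(indeg) if d > 0]     # someone sought them out
isolates = [a for a, d in enumerate(indeg) if d == 0]  # no one sought them out
```

In the study, the same logic separates the top decile of advice givers, those with any incoming tie, and those with none.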

Last, we utilize a QAP regression model to explore the productivity of the advice network in producing similarity in responses to the practice and pedagogy scales (research question 3). Survey responses were used to generate attribute matrices to complement matrices measuring social networks. Attribute matrices allow nodes in a network to be grouped by a characteristic. For example, it is possible to examine relationships within and between groups, with a variable indicating group membership in the attribute data file. Attribute matrices may also be transformed into similarity or difference matrices. Such a transformation produces a measure of how similar a respondent is to (or dissimilar from) each other member of the network. Importantly, in this third analysis, we do not consider whether the advice networks function to improve practice; rather, we focus on development of shared practices (as measured by similarities/differences in the practice scale). We do not presume that measures of data practice utilized here are sensitive to quality of data use, nor are we certain that there is consensus on what constitutes “good” use of data. Thus, we qualify the results of the third research question in that we focus on the productivity, rather than the effectiveness, of advice networks.

We utilize the similarity transformation in UCINET to generate symmetrical similarity matrices for the scales pedagogy and practice, which produces a continuous variable capturing the absolute difference in scale scores on these two measures. These matrices therefore indicate how similar one’s beliefs and practices are to those of every other member of the matrix.
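The similarity transformation amounts to taking pairwise absolute differences of scale scores; a minimal sketch (not the UCINET routine itself):

```python
import numpy as np

def difference_matrix(scores):
    """Absolute-difference matrix for a vector of scale scores: cell (i, j)
    holds |score_i - score_j|, so smaller values mean greater similarity."""
    s = np.asarray(scores, dtype=float)
    return np.abs(s[:, None] - s[None, :])
```

The result is symmetric by construction, matching the symmetrical similarity matrices described above.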

We hypothesize that a productive advice network would lead to similarities in these scales. To predict that similarity, we incorporate a set of covariate matrices. First, we include the professional matrix to control for the baseline social structure of the school. Alternatively, we consider models in which educators’ role (grade level, administrator, and so on) is included because professional relations are stronger within role categories than between, and role more clearly distinguishes between groups than the professional network does. Second, we add as predictors matrices that capture the frequency of data advice networks. We do this in three ways. We first model similarities in terms of the frequency network itself—a simple measure of reported advice seeking. This matrix, however, does not link all actors to all other actors; that is, Actor A may seek advice from Actor B, who seeks advice from Actor C, yet the frequency matrix will not show any relation between A and C even though it is likely that C has significant but indirect influence on A. We therefore add two alternatives based on transformations of the frequency matrix. The first is a matrix of the geodesic distance—the shortest path between two actors based on the presence of ties irrespective of strength. Because the data advice network measures strength of ties as well, we can further refine the measure to capture the optimum path—defined as reachability. The strength of a path is defined as the strength of its weakest link. The optimal path between two nodes is defined as the one with the highest overall strength, which is to say, the one with the strongest weakest link. The weighted reachability matrix gives the strength of the optimal path between each pair of nodes.
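The reachability measure (the "strongest weakest link" over all paths) can be computed with a max-min variant of the Floyd-Warshall algorithm; the sketch below illustrates the definition, not the UCINET implementation:

```python
import numpy as np

def weighted_reachability(W):
    """Optimal-path strength between every pair of nodes: the strength of a
    path is its weakest tie, and the optimal path maximizes that minimum
    (a max-min Floyd-Warshall pass over the valued tie matrix W)."""
    n = W.shape[0]
    R = W.astype(float).copy()
    for k in range(n):
        # Best strength of any path i -> k -> j is min(R[i, k], R[k, j]).
        via_k = np.minimum(R[:, k:k + 1], R[k:k + 1, :])
        R = np.maximum(R, via_k)
    return R
```

For example, if A reaches C only through B, the A-to-C entry becomes the weaker of the A-B and B-C ties rather than zero, which is exactly the indirect influence the text describes.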

We utilize a double-semi-partialling multiple regression technique offered in UCINET. Like the QAP correlation procedure, QAP regression (Krackhardt, 1987) produces estimates for R-square and coefficients based on observed data, then compares the results with the results of permutations (here, n = 2,000) in which the rows and columns of the dependent matrix are randomly permuted. Reported results indicate the proportion of permutations in which a coefficient as low or as high as the observed one was found, with a low proportion (< .05) indicating that the observed relationship is unlikely to have occurred by chance. Because our model controls for professional ties, we use the modified technique for multiple regression suggested by Dekker, Krackhardt, and Snijders (2007), referred to as double-semi-partialling.
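To show the shape of a QAP regression, the sketch below fits ordinary least squares on vectorized off-diagonal cells and permutes the dependent matrix. Note that the double-semi-partialling variant the study actually uses permutes residuals rather than Y, so this is only the simpler illustration of the idea:

```python
import numpy as np

def qap_regression(Y, Xs, n_perm=500, seed=0):
    """QAP regression of a dependent relation matrix Y on predictor matrices
    Xs: fit OLS on vectorized off-diagonal cells, then re-fit under random
    relabelings of Y's nodes to build the reference distribution."""
    rng = np.random.default_rng(seed)
    n = Y.shape[0]
    mask = ~np.eye(n, dtype=bool)

    def fit(y):
        X = np.column_stack([np.ones(mask.sum())] + [M[mask] for M in Xs])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return beta[1:]                          # drop the intercept

    observed = fit(Y[mask])
    extreme = np.zeros_like(observed)
    for _ in range(n_perm):
        p = rng.permutation(n)
        extreme += np.abs(fit(Y[np.ix_(p, p)][mask])) >= np.abs(observed)
    return observed, extreme / n_perm            # coefficients and p-values
```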


Results
In this section, we present the results of our analyses and organize this discussion in relation to the specified research questions.


A visual representation of network structure is useful in contextualizing numeric and statistical measures. Figures 1 and 2 illustrate the professional and general data advice frequency networks, respectively. Relations are represented by arrows, with arrow heads indicating the direction of the relationship. The shape of the nodes indicates the educators’ role in school: Circles identify classroom teachers in Grades 1–5, diamonds indicate related arts teachers, triangles indicate instructional specialists such as math coaches and reading teachers, circle-within-squares indicate counselors or educational diagnosticians, and squares indicate administrators. The size of each node illustrates its centrality in the network, with larger nodes having greater indegree, that is, more people coming to them for advice. These figures shed light on key differences in the networks, illustrated statistically later in this article.

Figure 1. Teachers’ general professional network, by role

Figure 2. Teachers’ data advice network


Table 1 presents the network characteristics of, and correlation between, Allegheny Elementary’s professional and advice-seeking networks. Both average degree and density suggest that there are substantial differences between professional and data advice networks. In the professional network, ties are quite dense, with 50% of all possible ties between members of the staff present if we discount directionality (that is, if we assume that it doesn’t matter whether the relationship is reciprocal or not). In contrast, only 24% of ties are present in the overall frequency of data advice network. If we account for directionality, then these networks are even sparser, with a small percentage of ties being utilized for data advice. These findings are visible in Figures 1 and 2, where the professional network appears more densely connected than the data advice network.
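Density as used here is the share of possible ties that are present; a small sketch shows the directed and direction-discounted versions (an illustration under the assumption that, when directionality is ignored, a tie in either direction counts):

```python
import numpy as np

def density(W, directed=True):
    """Share of possible ties present in an adjacency matrix W.
    Undirected density counts a pair as tied if a tie exists in either
    direction (directionality discounted)."""
    n = W.shape[0]
    A = (W > 0).astype(int)
    np.fill_diagonal(A, 0)                 # ignore self-ties
    if not directed:
        A = np.maximum(A, A.T)             # symmetrize: any tie counts
    return A.sum() / (n * (n - 1))
```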

Table 1. Network Characteristics and Pearson’s r Correlation

[Table values are not recoverable in this copy. The table reported, for the professional and data advice networks, density ignoring directionality, density accounting for directionality, indegree centralization, and the dichotomized and valued matrix correlations between the two networks. ***p < .001.]

Centralization of the networks differs substantially as well, with the data advice network being far more centralized than the professional network. Similarly, this is evident in Figures 1 and 2, where several actors appear central in the professional network while the data advice network is highly focused on actor 18.

The correlation of dichotomized matrices indicates the degree to which actors interact with the same individuals for multiple purposes. As is evident, the networks are highly correlated, which can be interpreted as an indicator that educators at Allegheny Elementary generally work within their existing and primary professional network when seeking advice related to data use, rather than seeking out advice from others with whom they interact less frequently. Professional and data advice networks are less strongly associated, however, in terms of strength of interaction; the correlation between valued matrices is .44. This suggests that although the individuals constituting the networks are highly similar, there is a difference in the frequency of interaction around general professional issues and data use.


We used actors’ indegree centrality as a measure of how influential they were in the data advice network of the school. The mean value-weighted indegree across actors was 8.24, and the range was 0 to 96. Discounting frequency of interaction, the average number of people seeking advice from each individual was 2.11, with a range of 0 to 26. As discussed earlier, the data advice network is highly centralized (indegree centralization 35%), meaning that a few individuals are particularly important in distributing advice related to data use. We therefore identified three groups to profile as part of the second research question. First, we identified those highly central actors, defined as exceeding the 90th percentile threshold for frequency-weighted indegree centrality (25.4). Six individuals fit that profile (3 responded to the survey). Second, we identified the set of actors with indegree greater than zero but below that threshold, indicating that they were a source of advice to others in the school (n = 20, 17 responded to the survey). Last, we identified those who were not sources of advice but rather only served as seekers (n = 27, 18 responded to the survey). We present characteristics of these three groups in Tables 2 and 3.

Table 2. Role Differences by Centrality Categories

[Table values are not recoverable in this copy. Rows covered the centrality categories (>90th percentile, other support givers, and advice seekers) and columns covered role in school (core subject classroom teacher; formal leader such as administrator or coach; related arts teacher such as library or art), reporting counts and percentages within role. Note: Chi-square test of statistical significance yields p < .05.]

Table 3. Characteristics of School Staff Based on Centrality Categories

[Table values are not recoverable in this copy. Rows covered the centrality categories (>90th percentile, other support providers, and advice seekers); columns included years in school, years in education, agreement that respondents still need to learn a lot about data use, and flow betweenness***. ***Analysis of variance (ANOVA) test of statistical significance yields p < .001.]

Those offering advice, including the most central actors, were more likely to be in nonclassroom roles and hold formal leadership positions. The principal, four coaches, and a third-grade teacher constituted the most influential actors in the data advice network, whereas core classroom teachers (grade-level teachers) were far more likely to be advice seekers. Results also suggest that highly central actors (those from whom advice is most frequently sought) are less likely to seek advice themselves and that those from whom no advice is sought are also less inclined to seek advice. In the middle are those who both routinely seek and give advice. This creates a picture in which key actors, though not necessarily highly central in terms of indegree, are mediators of information flowing between more and less central actors in the network. We include a measure of this mediating role, termed flow-betweenness, in the analysis. Aside from role, flow-betweenness is the only other highly significant difference between centrality groups.

In addition, there is some indication that highly central actors had higher scores on the pedagogy scale and were more experienced than their peers, though missing data here are particularly problematic. There is also evidence that advice seekers had lower responses on pedagogy and comfort using data, as well as stronger agreement that they still need to learn a lot about data use and higher scores on practice. However, these differences are not statistically significant.

It is worth focusing here on the most central actor in the data advice network—the literacy coach—who had an indegree 2.5 times the next highest advice giver. This indicates that this individual serves a highly critical role in supporting data use in the school. Interview data from the larger study confirms that the literacy coach is charged by the district with training teachers to use a variety of instructional programs and that she is deployed by the principal in a number of leadership and support roles. Although these responsibilities are not specific to data use, the instructional staff clearly perceives her role to encompass support for data use.


Answers to our first two research questions inform our analysis of the third: to assess whether the data advice network can be productive in influencing beliefs about the value of data for instruction (pedagogy) as well as reported practices with data use (practice). First, we know that professional and data advice networks are highly correlated in terms of whom an individual interacts with but that the frequency and purpose of interaction are far less tightly associated. Therefore, we want to control for teachers’ preexisting professional networks to distinguish between “any” professional interaction and those focused on data advice. Second, we know that the data advice network is highly centralized but that a set of actors mediates the flow of information from the perceived experts (i.e., highly central actors) to those who are primarily advice seekers. Based on this information, we suspect our dependent variables may be a result not just of direct interaction between actors but of indirect influence as well. Therefore, we produced several regression models for each of the dependent variables. Results are presented in Table 4.

Table 4. QAP Multiple Regression Models Predicting Responses on Pedagogy and Practice Scales

[Table values are not recoverable in this copy. The table reported standardized β coefficients and R-square values for a series of QAP regression models predicting similarity in pedagogy and practice, using the professional network and the data advice measures (frequency, geodesic distance, and reachability) as predictors. Note: QAP = quadratic assignment procedure.]
We first ran models including only the professional network and then only the data advice network measures (frequency, geodesic, and reachability). As in ordinary least squares linear regression models, the R-square is interpreted as the degree to which variance in the dependent variable is explained by the independent variables. The dependent variables are a measure of similarity and difference between responses on each scale, with smaller values interpreted as less difference or greater similarity, and larger values interpreted as less similarity and greater difference. The independent variables are measures of strength of interaction, thus higher values are associated with stronger interactions.

Analyses produced no meaningful explanation for the variability in the pedagogy scale (R2 = 0), indicating that the most important predictors of beliefs about data are not accounted for in this model or that there is insufficient variability in this measure within the school context. In either case, our models offer no evidence suggesting the extent to which data advice networks influence beliefs about the value of data in instruction.

We focus then on predicting reported data use practices. Professional network ties predict less than 1% of variability in practice, though the effect was statistically significant and in the expected direction. A one-unit increase in the frequency of professional ties is associated with a .25 SD increase in similarity of practice. Models that incorporated direct measures of interaction (frequency) and nonoptimal measures of indirect interaction (geodesic) did not explain much of the similarity in practices (R2 < .01). However, the optimal data advice measure, reachability, explained a substantially greater proportion of the variability in practice: 15.1%. Further, an increase in reachability is associated with a .39 SD decrease in differences in practice. When controlling for professional network relations, the model fit improves slightly to 15.3%, and the association with reachability is .38 SD.


Discussion
The purpose of this research has been to consider how a socially embedded conceptualization of capacity can help us to understand how data use capacity develops. Drawing on the results of our analyses, we focus the following discussion on what can be learned from the structure of data advice networks, actor position within those networks, and the productivity of those networks in producing shared practice.


Analysis suggests that educators’ professional networks and data advice networks are highly similar in terms of which individuals teachers seek support from (the relation), yet are less similar when evaluating the strength of relationships as measured by frequency. Further, analyses indicate that professional networks are denser than the data advice network, whereas the latter is more centralized. These findings are interesting for a number of reasons.

The finding that the actors in professional and data advice networks are generally the same suggests the power of professional networks for building capacity. That is, the individuals with whom one primarily interacts are those through whom one accesses organizational resources, including knowledge/expertise, as in this study. Put differently, resources may flow primarily through preexisting networks rather than through issue-specific networks formed on the basis of other factors such as expertise. Professional and data advice networks are less strongly associated, however, in terms of strength of interaction. This finding is intuitive: Professional interactions are not limited to data use, nor is data use part of all professional interactions. Therefore, we might expect frequencies of interaction to differ. Nonetheless, there still exists a moderate, statistically significant correlation indicating that data use may be a substantive part of teachers’ professional relationships in Allegheny.

Though the composition and frequency of interaction in data advice and professional networks are correlated, centralization and density differ significantly. Again, this is a somewhat intuitive finding. First, we expect professional ties to be denser because the professional network is a more general measure of collegial relations at Allegheny, whereas the data advice network is more instrumental and limited to relations based on specific needs (here, advice on data use). As summarized by Daly and Finnigan (2009) and Daly and colleagues (2010), denser networks are associated with capacity for exchanging resources and deep implementation of reform, which is confirmed in our finding that data advice networks generally tap into the denser web of professional ties. Conversely, sparse ties (i.e., the data advice network) may be associated with the diffusion of innovation because these ties are utilized to seek new information, which is confirmed in our finding that data advice networks may be productive resources for developing shared practice. Therefore, our findings support earlier work arguing that both denser and sparser networks may be important within an organizational structure because they provide access to different kinds of information and resources (Haythornthwaite, 2001; Tenkasi & Chesmore, 2003, as cited in Daly et al., 2010).

Relatedly, we might expect the network for data use advice to be more centralized—as it is found to be at Allegheny—because expertise may not be evenly distributed across an organization; rather, there is likely to be a subset of individuals from whom advice is sought. A greater degree of centralization may indicate that educators in Allegheny perceive different degrees of expertise and seek knowledge only from relevant sources. In this case, advice-seeking based on perceived expertise appears to be productive in developing shared practice, suggesting that capacity for data use can develop through a centralized instrumental network.


Our analysis of data advice networks examines characteristics of highly central actors (those perceived as “experts”) in the Allegheny community, moderate advice givers, and advice seekers. Findings reveal few distinct characteristics that define each group: There were no significant differences in terms of experience, responses to pedagogy and practice scales, or comfort using data.

In an advice network, we might expect that actors with a particular level of skill or knowledge would be tapped for advice, which explains earlier findings related to centralization. However, in the data advice network, proxies for expertise—experience and responses to pedagogy and practice scales—revealed few differences, none of which were statistically significant. Alternatively, “expertise” might be whether the actor feels that he or she needs to learn a lot more about data use as well as his or her personal comfort using data. It may be that educators turn to those who seem fluent in particular activities. However, these measures did not distinguish between advice giving/seeking groups either. Thus, whatever basis educators in Allegheny use to define expertise was not captured by the measures in this study. Nonetheless, findings related to comfort and learning may suggest an additional characteristic of the Allegheny school community. Perceived experts’ moderate comfort using data and their belief that they still need to learn a lot about data use imply that even “experts” are engaged in ongoing learning to improve their practice. The practice of ongoing learning is consistent with other norms and expectations that characterize the culture of Allegheny and may be a contributing factor to its exemplary performance under challenging conditions.

Although proxies for “expertise” failed to distinguish among advice giving/seeking groups, major differences were associated with position in the school and “betweenness” in the network. Highly central actors (> 90th percentile) were far more likely to have formal leadership positions, which, absent other observable differences in expertise, suggests that a key component of the data advice network structure in Allegheny is positional authority. That is, educators are more likely to seek advice from those with formal instructional leadership positions, which included the literacy coach, math coach, reading teachers, and principal. This may be due to structural issues in how particular roles and responsibilities are assigned, with formal leaders given duties related to data use support. An alternative may be that formal leaders achieved those roles in part because of expertise not observable in existing measures, in which case positional and expert authority may coexist in highly central individuals.

A related finding is the existence of a group that both seeks and gives advice. These actors—generally classroom teachers—have a high degree of betweenness and mimic the structure that underlies a coach-to-coach model often employed in professional development. Betweenness makes these individuals critical connection points between highly central actors, or perceived “experts,” and the majority of classroom teachers who are advice seekers only—a position that defines them as “brokers” (Moolenaar, Daly, & Sleegers, 2010). These actors facilitate or constrain the distribution and flow of resources—in this case, advice on data use—throughout the organization. Given the exemplary status of Allegheny as a data user, it seems that these individuals have supported the development of data use capacity. Similarly, this “between” group may serve a valuable function in making the work of central actors, such as the literacy coach, more manageable. If a single central actor were assigned responsibility for data use support, providing that support to approximately 40 classroom teachers would prove highly time consuming. Through interaction with the between group, support can reach the advice seekers indirectly—an important aspect of the productivity of data advice networks.


Our last objective was to explore whether Allegheny’s network structure was productive. That is, can advice-seeking networks be a productive way for schools and teachers to generate data use capacity? Findings reveal little explanatory power in predicting beliefs (pedagogy) about the role of data in instruction. This may indicate measurement error or that other factors (e.g., preservice preparation, professional development) not considered in our model are highly influential on beliefs about data use. Findings related to practice—that is, a measure of teachers’ use of data during the course of instruction—are quite different. First, we find that direct interaction does not explain similarities in practice but that a measure of indirect interaction explains a substantial proportion of variance (15%) in differences in practice, even controlling for general professional relationships. This can be interpreted to mean that practice is influenced by those beyond one’s immediate and direct interactions. That is, it isn’t just whom one seeks advice from, but from whom those advisors seek advice as well. Influence, therefore, is exerted indirectly. In the previous discussion, we referred to the significance of the between group who linked highly central actors to advice-seeking actors. Their significance is confirmed in this finding. Overall, the analyses offer empirical evidence that advice networks can be productive in influencing teachers’ practice.


Conclusion
The purpose of this research has been to explore the development of data use from the perspective that capacity is socially embedded. Through social network analysis, we are able to present information drawn from a unique case study about the nature of professional and advice networks, the characteristics of perceived experts, and the productivity of the network in terms of shared beliefs and practices. In Allegheny, teacher networks reveal a centralized structure for seeking advice on data use that occurs largely within existing professional relationships. In these networks, we observed key roles for administrators and teacher leaders but also for classroom teachers who serve as bridges between perceived experts and less connected colleagues. Ultimately, these networks generated shared practices around data use.

The uniqueness of the case and the methodology employed have limitations that qualify our findings. First, Allegheny is an exemplar in many ways, and it is not expected that relationships observed here translate directly to other, broader contexts. However, the context of this study is generally free of many factors that constrain the development of capacity for data use (e.g., absence of access to data, poor instructional leadership), providing a unique opportunity to examine the mechanism of advice networks independent of other confounding issues. Second, our response rate on the survey, while high, still resulted in missing data (n = 11). Missing data can be highly problematic in social network analysis. In this case, nonrespondents may have reported patterns of relations that differ from those we did observe, creating an opportunity for us to draw inaccurate conclusions about Allegheny. Third, the measures used in this study could be refined and improved. The measures of practice and pedagogy have been used in a number of contexts, but extensive validation of the instruments has not been conducted. There is a need for widely available and validated measures that allow for more consistent estimation of data use behaviors. Findings also suggest that other factors that may differentiate levels of data use expertise were not accounted for in our survey and would be beneficial in future work.  

In spite of these limitations, this research has implications for both research and practice. It offers evidence that advice networks may be effective in establishing shared practices within this school community. Although it does so in a context that enjoys strong instructional leadership, structures that support data use, and a school culture reflecting professional community, we can use what we have learned about Allegheny as a foundation for exploring capacity development in other contexts. That is, under what conditions are teacher advice networks viable mechanisms for developing capacity? Given that data advice networks were built from professional networks in Allegheny, in schools with weaker ties among teachers and less social capital among colleagues, we might expect data advice networks to be sparser or more diffuse, perhaps lessening their productivity. We might also anticipate that schools lacking strong leadership for data use will have different structures for seeking advice, perhaps a less centralized structure resulting in more varied sets of data use practices. These are speculations informed by what we observed at Allegheny, but they offer important directions for future research.

The evidence provided here also offers an empirical argument for conceptualizing capacity as socially embedded and for viewing teacher networks as a powerful mechanism for developing the capacity to use data. As consensus grows about the social nature of data use, we anticipate greater attention to network analysis moving forward. First, we do not attend to the quality of social relations or the quality of practices; specifically, we do not focus on what types of relations are likely to produce improvement in data use practice. Work by Coburn and colleagues (2012) focuses on depth or quality of interaction around mathematics reform implementation, and such an approach would be a fruitful way of advancing the findings presented here. Second, research is needed to explore the role of leadership in instrumental networks. Our analysis suggests that those with positional authority—in this case, those with formal instructional leadership positions—are positioned to exert substantial influence over practice. Whether this is by design (e.g., in the roles and responsibilities delegated to leaders) or a result of the coexistence of expert and positional authority is not observable in the scope of this project. Further, the prominence of formal instructional leaders in data advice networks may be related to the context of the school and the nature of leadership in general; other research has found varied degrees of centrality for school leadership in advice networks (Spillane & Kim, 2012).

Findings related to the characteristics of professional and data advice networks, key actors in data advice networks, and the productivity of networks may also be instructive for several aspects of policy and practice. Our findings are relevant to the design and implementation of initiatives or interventions to support data use. That capacity to use data is socially embedded, and that social relations serve as a mechanism for building it, suggests that efforts to improve teachers’ use of data may benefit from collaborative structures that focus on the interactions of teachers around data, confirming findings of previous research discussed earlier. Efforts that target individual knowledge, such as traditional professional development for select teachers, could be augmented by creating structures or processes to ensure that knowledge becomes part of the flow of information within the school (e.g., peer coaching models of professional development; Joyce & Showers, 2002). Although the effectiveness of any one strategy is not known, evidence presented here suggests that interventions that tap into social ties among teachers may be highly productive.

Our research also suggests a need to invest in the professional development of school leaders. As in Allegheny, formal instructional leaders may, by position or expertise, serve as highly central actors in data advice networks, yet professional development resources are typically focused on classroom teachers. Furthermore, leadership is not limited to administrators but includes instructional specialists (e.g., coaches or reading teachers) available to support classroom-based teachers. To the extent that these leaders are influential in the advice networks of their school, it is critical that they have a deep understanding of the strategy—here, data use—for which teachers seek support.

Relatedly, our findings are useful in considering leadership strategies for successful implementation of reform. Here, the reform is data use, and findings support that dense ties, instrumental networks, central actors, and “between” connectors may be defining aspects of capacity development. As school leaders adopt and implement reform, knowledge of the underlying professional structure and the nature of advice networks in the school could be instrumental in successful reform, as suggested by Daly and Finnigan (2009). For example, if employing a peer coaching model of professional development, leaders can capitalize on existing networks by identifying those key teachers who connect expert and advice-seeking colleagues. Similarly, leadership can focus on building dense ties to build socially embedded capacity for reform. For example, leaders can develop collaborative structures that enable meaningful interaction among teachers, creating a denser network to support reform efforts.

Finally, as in all research on data use, there is a need to connect data use more explicitly to changes in teacher practice and, in turn, to improvements in student learning. The theory of action underlying data use reform is that this work will improve the quality of teaching and learning, and as research and practice move forward, it is essential to retain this critically important focus.


This research was funded by the Spencer Foundation’s Data Use and Educational Improvement Initiative. We would also like to thank Steve Borgatti (University of Kentucky) for his support in the production of this research.


1. Data accessed through school profiles on the Department of Education website: http://profiles.doe.k12.de.us/SchoolProfiles/State/Default.aspx


Anderson, S., Leithwood, K., & Strauss, T. (2010). Leading data use in schools: Organizational conditions and practices at the school and district levels. Leadership and Policy in Schools, 9(3), 292–327.

Atteberry, A., & Bryk, A. S. (2010). Analyzing the role of social networks in school-based professional development initiatives. In A. J. Daly (Ed.), Social network theory and educational change (pp. 51–76). Cambridge, MA: Harvard Education Press.

Baker-Doyle, K. J., & Yoon, S. A. (2010). Making expertise transparent: Using technology to strengthen social networks in teacher professional development. In A. J. Daly (Ed.), Social network theory and educational change (pp. 115–126). Cambridge, MA: Harvard Education Press.

Blanc, S., Christman, J. B., Liu, R., Mitchell, C., Travers, E., & Bulkley, K. E. (2010). Learning to learn from data: Benchmarks and instructional communities. Peabody Journal of Education, 85(2), 205–225.

Borgatti, S. P. (2002). NetDraw: Graph visualization software. Harvard, MA: Analytic Technologies.

Borgatti, S. P., Everett, M. G., & Freeman, L. C. (2002). UCINET for Windows: Software for social network analysis. Harvard, MA: Analytic Technologies.

Borgatti, S. P., Mehra, A., Brass, D. J., & Labianca, G. (2009). Network analysis in the social sciences. Science, 323, 892–895.

Bryk, A. S., & Schneider, B. L. (2002). Trust in schools: A core resource for improvement. New York, NY: Russell Sage Foundation.

Cho, V., & Wayman, J. C. (2014). Districts’ efforts for data use and computer data systems: The role of sensemaking in system use and implementation. Teachers College Record, 116(2). Retrieved from http://www.tcrecord.org/Content.asp?ContentId=17349

Coburn, C. E. (2001). Collective sensemaking about reading: How teachers mediate reading policy in their professional communities. Educational Evaluation and Policy Analysis, 23(2), 145–170.

Coburn, C. E., & Russell, J. L. (2008). District policy and teachers’ social networks. Education Evaluation and Policy Analysis, 30(3), 203–235.

Coburn, C. E., Russell, J. L., Kaufman, J. H., & Stein, M.K. (2012). Supporting sustainability: Teachers’ advice networks and ambitious instructional reform. American Journal of Education, 119, 137–182.

Coburn, C. E., & Talbert, J. E. (2006). Conceptions of evidence use in school districts: Mapping the terrain. American Journal of Education, 112(4), 469–495.

Coburn, C. E., & Turner, E. O. (2011). Research on data use: A framework and analysis. Measurement: Interdisciplinary Research and Perspectives, 9(4), 173–206.

Cole, R. P., & Weinbaum, E. H. (2010). Changes in attitude: Peer influence in high school reform. In A. J. Daly (Ed.), Social network theory and educational change (pp. 77–96). Cambridge, MA: Harvard Education Press.

Corcoran, T., Fuhrman, S. H., & Belcher, C. L. (2001). The district role in instructional improvement. Phi Delta Kappan, 83(1), 78–84.

Cosner, S. (2011). Supporting the initiation and early development of evidence-based grade-level collaboration in urban elementary schools: Key roles and strategies of principals and literacy coordinators. Urban Education, 46(4), 786–827.

Daly, A. J. (2012). Data, dyads, and dynamics: Exploring data use and social networks in educational improvement. Teachers College Record, 114(11), 1–38.

Daly, A. J., & Finnigan, K. (2009). A bridge between worlds: Understanding network structure to understand change strategy. Journal of Educational Change, 11(2), 111–138.

Daly, A. J., & Finnigan, K. (2011). The ebb and flow of social network ties between district leaders under high stakes accountability. American Educational Research Journal, 48(1), 39–79.

Daly, A. J., Moolenaar, N., Bolivar, J., & Burke, P. (2010). Relationships in reform: The role of teachers' social networks. Journal of Educational Administration, 48(3), 20–49.

Data Quality Campaign. (2013). Data for action, 2013. Washington, DC: Author. Retrieved from http://dataqualitycampaign.org/files/DataForAction2013.pdf

Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-performing school systems use data to improve instruction for elementary students. Los Angeles: University of Southern California, Center on Educational Governance.

Dekker, D., Krackhardt, D., & Snijders, T. A. B. (2007). Sensitivity of MRQAP tests to collinearity and autocorrelation conditions. Psychometrika, 72, 563–581.

Farley-Ripple, E. N., & Buttram, J. (2013). Developing collaborative data use through professional learning communities: Implementation evidence from Delaware. Studies in Educational Evaluation. http://dx.doi.org/10.1016/j.stueduc.2013.09.006

Farley-Ripple, E. N., & Buttram, J. (2014). Schools’ use of interim data: Practices in classrooms, teams, and schools. In A. R. Shoho, B. Barnett, & A. Bowers (Eds.), Using data in schools to inform leadership and decision making. International Research on School Leadership Series (Vol. 5, pp. 39–66). Charlotte, NC: Information Age.

Finnigan, K. S., & Daly, A.J. (2012). Mind the gap: Organizational learning and improvement in an underperforming system. American Journal of Education, 119, 41–70.

Finnigan, K. S., Daly, A. J., & Che, J. (2013). Systemwide reform in districts under pressure: The role of social networks in defining, acquiring, using, and diffusing research evidence. Journal of Educational Administration, 51(4), 476–497.

Firestone, W. A., & Gonzalez, R. A. (2007). Culture and processes affecting data use in school. In P. A. Moss (Ed.), Evidence and decision making (pp. 132–154). Malden, MA: Blackwell.

Frank, K. A., Zhao, Y., & Borman, K. (2004). Social capital and the diffusion of innovations within organizations: Application to the implementation of computer technology in schools. Sociology of Education, 77(2), 148–171.

Frank, K. A., Zhao, Y., Penuel, W. R., Ellefson, N., & Porter, S. (2011). Focus, fiddle, and friends: Experiences that transform knowledge for the implementation of innovations. Sociology of Education, 84(2), 137–156.

Fuchs, L. S., Fuchs, D., Karns, K., Hamlett, C. L., & Katz, M. (1999). Mathematics performance assessment in the classroom: Effects of teacher planning and student problem solving. American Educational Research Journal, 36, 609–646.

Halverson, R., Grigg, J., Prichett, R., & Thomas, C. (2007). The new instructional leadership: Creating data-driven instructional systems in schools. Journal of School Leadership, 17(2), 158–193.

Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J. A., & Wayman, J. C. (2009). Using student achievement data to support instructional decision making (No. NCEE 2009-4067). Washington, DC: National Center for Educational Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Haythornthwaite, C. (2001). Tie strength and the impact of new media. In Proceedings of the 34th Annual Hawaii International Conference on System Sciences. Maui, HI.

Honig, M. I., & Coburn, C. (2008). Evidence-based decision-making in school district central offices: Toward a policy and research agenda. Educational Policy, 22(4), 578–608.

Honig, M. I., & Venkateswaran, N. (2012). School-central office relationships in evidence use: Understanding evidence use as a systems problem. American Journal of Education, 118(2), 199–222.

Hord, S. M. (1997). Professional learning communities: What are they and why are they important? Issues…about Change, 6(1), 1–8.

Ikemoto, G. S., & Marsh, J. A. (2007). Cutting through the “data driven” mantra: Different conceptions of data-driven decision making. National Society for the Study of Education Yearbook, 106(1), 105–131.

Ingram, D., Louis, K. S., & Schroeder, R. G. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106, 1258–1287.

Joyce, B. R., & Showers, B. (2002). Student achievement through staff development. Alexandria, VA: ASCD.

Kennedy, M. M. (1982). Evidence and decision. In M. M. Kennedy (Ed.), Working knowledge and other essays (pp. 59–103). Cambridge, MA: Huron Institute.

Kerr, K. A., Marsh, J. A., Ikemoto, G. S., Darilek, H., & Barney, H. (2006). Strategies to promote data use for instructional improvement: Actions, outcomes, and lessons from three urban districts. American Journal of Education, 112(4), 496–520.

Knapp, M., Copland, M., & Swinnerton, J. (2007). Understanding the promise and dynamics of data-informed leadership. In P. A. Moss (Ed.), Evidence and decision making (pp. 74–104). Malden, MA: Blackwell.

Konstantopoulos, S., Miller, S. R., & van der Ploeg, A. (2013). The impact of Indiana’s system of interim assessments on mathematics and reading achievement. Educational Evaluation and Policy Analysis, 35(4), 481–499.

Krackhardt, D. (1987). Predicting with networks: Nonparametric multiple regression analysis of dyadic data. Social Networks, 10, 359–381.

Kruse, S. D., Louis, K. S., & Bryk, A. S. (1995). An emerging framework for analyzing school-based professional community. In K. S. Louis, S. Kruse, & Associates (Eds.), Professionalism and community: Perspectives on reforming urban schools (pp. 3–22). Thousand Oaks, CA: Corwin Press.

Lachat, M. A., & Smith, S. (2005). Practices that support data use in urban high schools. Journal of Education for Students Placed at Risk, 10(3), 333–350.

Lasley, T. J. (2009). Using data to make critical choices. In T. J. Kowalski & T. J. Lasley (Eds.), Handbook of data-based decision making in education (pp. 243–258). New York, NY: Routledge.

Light, D., Honey, M., Henze, J., Brunner, C., Wexler, D., Mandinach, E., & Fasca, C. (2005). Linking data and learning: The Grow Network study. New York, NY: Education Development Center, Center for Children and Technology.

Little, J. W., & Curry, M. W. (2009). Structuring talk about teaching and learning: The use of evidence in protocol-based conversation. In L. M. Earl & H. Timperley (Eds.), Professional learning conversations: Challenges in using evidence for improvement (pp. 29–42). Dordrecht, the Netherlands: Springer.

Mandinach, E. B., Friedman, J. M., & Gummer, E. S. (2015). How can schools of education help to build educators’ capacity to use data? A systemic view of the issue. Teachers College Record, 117(4).

Mandinach, E. B., & Gummer, E. S. (2013). A systemic view of implementing data literacy in educator preparation programs. Educational Researcher, 42(1), 30–37.

Marsh, J. A. (2012). Interventions promoting educators’ use of data: Research insights and gaps. Teachers College Record, 114(11), 1–48.

Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision making in education: Evidence from recent RAND research (OP-170). Santa Monica, CA: RAND.

May, H., & Robinson, M. A. (2007). A randomized evaluation of Ohio’s Personalized Assessment Reporting System (PARS). Philadelphia, PA: Consortium for Policy Research in Education.

McLaughlin, M. W., & Talbert, J. E. (2001). Professional communities and the work of high school teaching. Chicago, IL: University of Chicago Press.

Means, B., Chen, E., DeBarger, A., & Padilla, C. (2011). Teachers’ ability to use data to inform instruction: Challenges and supports. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation, and Policy Development.

Means, B., Padilla, C., DeBarger, A., & Bakia, M. (2009). Implementing data-informed decision making in schools: Teacher access, supports and use. Menlo Park, CA: SRI International. Prepared for U.S. Department of Education, Office of Planning, Evaluation and Policy Development.

Means, B., Padilla, C., & Gallagher, L. (2010). Use of education data at the local level: From accountability to instructional improvement. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation, and Policy Development.

Moolenaar, N. M. (2012). A social network perspective on teacher collaboration in schools: Theory, methodology, and applications. American Journal of Education, 119(1), 7–39.

Moolenaar, N. M., Daly, A. J., & Sleegers, P. J. (2010). Occupying the principal position: Examining relationships between transformational leadership, social network position, and schools’ innovative climate. Educational Administration Quarterly, 46(5), 623–670.

Moolenaar, N. M., & Sleegers, P. J. (2010). Social networks, trust, and innovation: How social relationships support trust and innovative climates in Dutch schools. In A. J. Daly (Ed.), Social network theory and educational change (pp. 97–115). Cambridge, MA: Harvard Education Press.

National Center for Education Statistics. (n.d.). About the SLDS grant program. Retrieved from https://nces.ed.gov/programs/slds/about_SLDS.asp

Nelson, T. (2008). Teachers’ collaborative inquiry and growth: Should we be optimistic? Science Teacher Education, 93, 548–580.

Penuel, W. R., Riel, M. R., Krause, A. & Frank, K. A. (2009). Analyzing teachers’ professional interactions in a school as social capital: A social network approach. Teachers College Record, 111(1), 124–163.

Penuel, W. R., Sun, M., Frank, K. A., & Gallagher, H. A. (2012). Using social network analysis to study how collegial interactions can augment teacher learning from external professional development. American Journal of Education, 119, 103–136.

Saunders, W. M., Goldenberg, C. N., & Gallimore, R. (2009). Increasing achievement by focusing grade-level teams on improving classroom learning: A prospective quasi-experimental study of Title I schools. American Educational Research Journal, 46(4), 1006–1033.

Schildkamp, K., & Kuiper, W. (2010). Data informed curriculum reform: Which data, what purposes, and promoting and hindering factors. Teaching and Teacher Education, 26(3), 482–496.

Sharkey, N. S., & Murnane, R. J. (2006). Tough choices in designing a formative assessment system. American Journal of Education, 112(4), 572–578.

Spillane, J. P., Hunt, B., & Healey, K. (2009). Managing and leading elementary schools: Attending to the formal and informal organization. International Studies in Educational Administration, 37(1), 5–28.

Spillane, J. P., & Kim, C. M. (2012). An exploratory analysis of formal school leaders’ position in instructional advice networks in elementary schools. American Journal of Education, 119, 73–102.

Stecker, P. M., Fuchs, L. S., & Fuchs, D. (2005). Using curriculum-based measurement to improve student achievement: Review of research. Psychology in the Schools, 42(8), 795–819.

Supovitz, J. A. (2002). Developing communities of instructional practice. Teachers College Record, 104(8), 1591–1626.

Supovitz, J. A., & Klein, V. (2003). Mapping a course for improved student learning: How innovative schools use student performance data to guide improvement. Philadelphia, PA: Consortium for Policy Research in Education.

Supovitz, J. A., Merrill, E., & Conger, M. (2010). Productive teacher use of student performance and related data in professional learning communities. Paper presented at the annual meeting of the American Educational Research Association, Denver, CO.

Supovitz, J. A., & Morrison, K. (2011). Does collaboration facilitate data use in schools? Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.

Tenkasi, R., & Chesmore, M. (2003). Social networks and planned organizational change. Journal of Applied Behavioral Science, 39(3), 281–300.

Vescio, V., Ross, D., & Adams, A. (2008). A review of research on the impact of professional learning communities on teacher practice and teacher learning. Teaching and Teacher Education, 24(1), 80–91.

Wayman, J. C. (2005). Involving teachers in data-based decision-making: Using computer data systems to support teacher inquiry and reflection. Journal of Education for Students Placed at Risk, 10(3), 295–308.

Wayman, J. C., & Cho, V. (2007). Preparing educators to effectively use student data systems. In T. J. Kowalski, & T. J. Lasley (Eds.), Handbook of data-based decision making in education (pp. 89–104). New York, NY: Routledge.

Wayman, J. C., Cho, V., & Shaw, S. (2009). Survey of educator data use. Unpublished document.

Wayman, J. C., Jimerson, J. B., & Cho, V. (2012). Organizational considerations in establishing the data-informed district. School Effectiveness and School Improvement: An International Journal of Research, Policy and Practice, 23(2), 159–178.

Wayman, J. C., & Stringfield, S. (2006). Technology-supported involvement of entire faculties in examination of student data for instructional improvement. American Journal of Education, 112(4), 549–571.

Weinbaum, E. (2009). Learning about assessment: An evaluation of a ten-state effort to build assessment capacity in high schools. Philadelphia, PA: Consortium for Policy Research in Education.

Young, V. M. (2006). Teachers’ use of data: Loose coupling, agenda setting, and team norms. American Journal of Education, 112(4), 521–548.

Appendix A. Descriptive Statistics for Measures (n = 42)

[The statistic values in the original table are not preserved here; the measures were as follows.]

Role: Core subject classroom teacher; Formal leader (administrator, coach, etc.); Related arts teacher (library, art, etc.)

Experience: Years in school; Years in district; Years in education

Item: I feel I have a lot to learn about using data.

Scales: Comfort using data; Pedagogy; Practices

Appendix B. Items and Reliability Measures for School Context Scales

All items used a 6-point agreement scale. [The Cronbach’s alpha values in the original table are not preserved here.]

Data Practice (Wayman et al., 2009)

I use data to assess student learning.
I use data to diagnose student learning needs.
I adjust my instruction based on student data.
I use multiple data sources to understand student learning.
I use data to plan lessons.
I use data to set student learning goals.

Pedagogy (Wayman et al., 2009)

Data can help educators plan instruction.
Data can offer information about students that was not already known.
Data can help educators know what concepts students are learning.
Data can help educators identify learning goals for students.
Students benefit when teacher instruction is informed by data.

Comfort Using Data (Wayman et al., 2009)

I am comfortable accessing data on my own.
I am able to analyze or manipulate data to get information that I need.
I feel confident when discussing data with others.
I feel comfortable making curricular and instructional decisions based on data.
I am more capable of using data than most of my colleagues.

Wayman, J. C., Cho, V., & Shaw, S. (2009). Survey of educator data use. Unpublished document.

Cite This Article as: Teachers College Record, Volume 117, Number 4, 2015, p. 1–34. https://www.tcrecord.org ID Number: 17852

About the Authors
  • Elizabeth Farley-Ripple
    University of Delaware
    ELIZABETH FARLEY-RIPPLE is an assistant professor at the University of Delaware. Her research focuses on issues of leadership, organizational improvement, and capacity building at the school and district levels. She is an experienced mixed methodologist, with expertise using large administrative data sets, multilevel models, survey research, and social network analysis as well as engaging in large-scale qualitative data collection and analysis. Dr. Farley-Ripple has published research in respected journals such as Educational Researcher, Educational Policy, the American Journal of Education, Educational Management, Administration, and Leadership, and Urban Education.
  • Joan Buttram
    University of Delaware
    JOAN BUTTRAM is the director of the Delaware Education Research and Development Center at the University of Delaware. Dr. Buttram’s research focuses on educational leadership, professional development, and school improvement. She is currently conducting research and/or evaluation studies related to high school reform, professional learning communities, multiple professional development and training programs in both K–12 and higher education, and health services for individuals with disabilities. Prior to coming to UD, she directed the Southwest Regional Educational Laboratory in Austin, Texas, for nine years.