
The Development of Capacity for Data Use: The Role of Teacher Networks in an Elementary School


by Elizabeth Farley-Ripple & Joan Buttram - 2015

Background: Amid calls for increased data use, there is little research or policy guidance for how to build schools’ capacity to leverage data to improve teaching and learning. Building on previous research highlighting the social nature of data use, we contend that in order to understand how capacity develops, research must focus on relationships and networks that support educators’ practice, conceptualizing capacity as socially embedded.

Purpose: This article explores the development of data use capacity in an elementary school through a social network approach. Our analysis focuses on the structure of data advice networks, the characteristics of perceived experts in the network, and the productiveness of the network in terms of influencing beliefs and practice.

Population: Data come from a sample of 42 educators in an elementary school identified by its district as a strong user of data to improve teaching and learning. Participants completed a survey about their data use beliefs, practices, and school context, as well as a social network questionnaire indicating from whom they sought advice on using data.

Research Design: We used the survey data to identify characteristics of the school's data use networks using descriptive statistics and social network analysis (SNA). SNA was also used to develop measures of structural location in those networks, which were then used to predict similarities in teachers' beliefs and practices around data use.

Findings: Findings reveal that data use networks are influenced by the larger professional structure of the school, with educators seeking data advice from colleagues who are part of their larger professional networks. Network structure reveals few highly central “advice givers” and many “advice seekers,” connected by teachers and leaders who serve as brokers of advice. We find that brokers may play an important role in developing shared practices, given that the indirect relationships they support are predictive of shared data use practices.

Conclusions: This research is among the first to explore data use through a social network approach and offers early evidence about how educators’ networks enable schools to build capacity for data use. Our findings have implications for the design of professional development, for professional development for school leaders, and for successful implementation of reforms related to data use.



In the last decade, expectations for using data to improve teaching and learning have grown exponentially. Underlying policy efforts to increase educators' use of data is a theory of action in which data use improves instruction, whether data inform school improvement efforts or educators' daily instructional decision making. Central to this underlying theory of action is educators' capacity for data use. As Daly (2012) argued, “The ultimate success of data use for educational improvement may depend on how states and local education agencies build capacity” (p. 2).


However, amid calls for increased expectations for data use, there is little understanding of how capacity for data use develops. A great deal of policy effort is directed toward data use, focusing generally on creating data systems, management information systems, and assessment systems to provide teachers timely access to quality data (Data Quality Campaign, 2013; Means, Padilla, & Gallagher, 2010; NCES, n.d.), betraying a belief that if you build it, they will come. However, having access to data does not equate to using it to improve teaching and learning (Schildkamp & Kuiper, 2010; Stecker, Fuchs, & Fuchs, 2005). In fact, there is evidence that data systems may be underutilized (Cho & Wayman, 2014; Means, Padilla, DeBarger, & Bakia, 2009; Wayman, Cho, & Shaw, 2009; Wayman, Jimerson, & Cho, 2012).


Additionally, interventions designed to build capacity have been implemented, and research is beginning to provide evidence of their effectiveness (Konstantopoulos, Miller, & van der Ploeg, 2013; May & Robinson, 2007; Saunders, Goldenberg, & Gallimore, 2009). However, as Marsh (2012) illustrated in a review, data use interventions rely on a range of strategies, from professional development to technology support to setting expectations and norms for data use, generally producing mixed evidence in terms of outcomes for educators' knowledge, skills, and practices. The range of strategies suggests that there is little consensus on the mechanisms by which educators' skills are influenced. Thus, amid calls for increased data use, there is little evidence on how capacity for data use develops.


In this article, we offer evidence of a potentially powerful mechanism for the development of capacity to use data: teacher networks. In the literature to date, we identify two dominant views of capacity: capacity as situated at the individual level and capacity as situated at the organizational level. However, we argue that greater attention needs to be paid to a third conceptualization: capacity as embedded in social relations. Using the perspective of social network theory, we explore how interactions between educators can support capacity development within schools. In the following pages, we present a brief review of the data use literature, attending specifically to the issue of capacity. We then examine this conceptualization of capacity in the context of an elementary school identified by its district as a strong user of data to improve teaching and learning. Our analysis focuses on the structure of data advice networks, the characteristics of perceived experts in the network, and the productiveness of the network in terms of influencing beliefs and practice. Finally, we discuss our results in terms of how a socially embedded conceptualization of capacity may be applied in research, policy, and practice.


CONCEPTUAL APPROACHES TO UNDERSTANDING CAPACITY FOR DATA USE


Research findings often frame data use as an outcome, with capacity understood as organizational conditions that support or enable data use. Specifically, research has identified the importance of structured time as critical for successful collaboration (Blanc et al., 2010; Coburn & Turner, 2011; Datnow, Park, & Wohlstetter, 2007; Halverson, Grigg, Prichett, & Thomas, 2007; Ikemoto & Marsh, 2007; Ingram, Louis, & Schroeder, 2004; Kerr, Marsh, Ikemoto, Darilek, & Barney, 2006; Sharkey & Murnane, 2006; Supovitz, 2002; Supovitz & Klein, 2003; Wayman, 2005). Other resources shown to impact collaborative data use include timely access to data or other evidence of student learning (Kerr et al., 2006; Lachat & Smith, 2005; Lasley, 2009; Marsh, Pane, & Hamilton, 2006; Supovitz & Klein, 2003; Wayman & Cho, 2007; Young, 2006), tools or guides for collaborative activities (Cosner, 2011; Little & Curry, 2009; Nelson, 2008), and additional professional development (Cosner, 2011; Datnow et al., 2007; Fuchs, Fuchs, Karns, Hamlett, & Katz, 1999; Hamilton et al., 2009; Kerr et al., 2006; Saunders et al., 2009; Supovitz, 2002; Wayman & Cho, 2007).


Relatedly, organizational capacity is associated with leadership, given that both the allocation and coordination of resources are in large part functions of district and school leaders. This literature has documented a number of leadership actions that constrain or support data use (Anderson, Leithwood, & Strauss, 2010; Blanc et al., 2010; Knapp, Copland, & Swinnerton, 2007; Datnow et al., 2007; Firestone & Gonzalez, 2007; Hamilton et al., 2009; Honig & Venkateswaran, 2012; Ikemoto & Marsh, 2007; Supovitz & Klein, 2003; Wayman, 2005; Weinbaum, 2009; Young, 2006).


A second research lens focuses on individual capacity, that is, educators' knowledge and skills related to data use (see Mandinach, Friedman, & Gummer, 2015, this issue). The activity of using data can be described as comprising analytical and action-oriented tasks in which educators must make sense of the data (analysis and interpretation) as well as make choices about how to act (Coburn & Turner, 2011; Cosner, 2011; Marsh et al., 2006). This entails a degree of knowledge about both instruction and data, a combination recently termed data literacy for teaching but also labeled pedagogical data literacy (Mandinach & Gummer, 2013) and instructional decision making (Means, Chen, DeBarger, & Padilla, 2011). Related to educators' data literacy are their beliefs and attitudes regarding data. Research has suggested that preexisting cognitive frameworks mediate interpretation and the search for information (Honig & Coburn, 2008; Kennedy, 1982) and that individuals vary in their beliefs about forms of evidence and their value (Coburn, 2001; Coburn & Talbert, 2006; Corcoran, Fuhrman, & Belcher, 2001; Light et al., 2005). However, research on data use more often than not identifies a lack of individual capacity in schools and districts (Ikemoto & Marsh, 2007; Means et al., 2009; Supovitz & Klein, 2003; Wayman & Stringfield, 2006). As a result, findings often suggest a need to improve human capacity through investments in professional development linking data to instructional practice (Cosner, 2011; Datnow et al., 2007; Kerr et al., 2006; Saunders et al., 2009; Schildkamp & Kuiper, 2010; Supovitz, 2002; Wayman & Cho, 2007) or to address data literacy needs through preservice preparation programs (Mandinach & Gummer, 2013).


Nested between these two conceptualizations of capacity is a third form: data use capacity embedded in social relations, or the social structure of the organization. The significance of a socially embedded conceptualization of capacity is suggested by data use research in three ways. First, research has found that data use in schools and school systems is rarely an individual activity, but rather that practice is often social in nature (Halverson et al., 2007; Means et al., 2009, 2011). As discussed earlier, research has documented how preexisting cognitive frameworks and beliefs influence how data are valued and interpreted. Therefore, to the extent that data use occurs in a social context, how data are interpreted and transformed into actionable information will depend on the individuals involved in the conversation. Thus, capacity for data use is not an individual attribute but one that occurs at the level of interpretation and action, which may often involve groups or teams of teachers. Research further suggests that effective data use stems from collaborative processes (e.g., Datnow et al., 2007; Kerr et al., 2006; Lachat & Smith, 2005; Means et al., 2009; Schildkamp & Kuiper, 2010; Wayman & Stringfield, 2006) and that collaboration can result in better instructional decision making and foster deeper use of data (Lachat & Smith, 2005; Supovitz, Merrill, & Conger, 2010; Young, 2006). Further, most prescriptive models of data use incorporate substantial collaborative elements (Supovitz & Morrison, 2011).


In addition to serving as opportunities for co-construction of knowledge based on data, social relations are opportunities for knowledge to be shared. That is, educators draw on social relations to identify expertise and to access information, resources, or skills throughout the organization. Research documents the role of formal instructional leaders, such as coaches, in supporting learning about data use (Cosner, 2011; Lachat & Smith, 2005). Further, evidence suggests that learning about data use is not limited to formal or expert resources; rather, research provides descriptions of successful collaboration around data based on learning from and with colleagues with only slightly more advanced data use skills (Wayman, Jimerson, & Cho, 2012; Wayman & Stringfield, 2006; Young, 2006). Thus, conceptualizing capacity as socially embedded elucidates the process by which knowledge and expertise move through an organization.


Relatedly, research on data use consistently has found that the culture of an organization influences data use or inquiry more broadly through the development of norms and expectations for inquiry (Honig & Venkateswaran, 2012; Ikemoto & Marsh, 2007; Ingram et al., 2004; Knapp et al., 2007; Lachat & Smith, 2005; Supovitz & Klein, 2003; Wayman & Stringfield, 2006) and through development of norms associated with professional community, such as a need for trust, a focus on student learning, shared values, deprivatized practice, and reflective dialogue (Blanc et al., 2010; Bryk & Schneider, 2002; Hord, 1997; Kruse, Louis, & Bryk, 1995; McLaughlin & Talbert, 2001; Saunders et al., 2009; Supovitz & Morrison, 2011; Vescio, Ross, & Adams, 2008). These dimensions of organizational culture shape the nature of interaction among educators, which further supports the conceptualization of capacity as socially embedded.


These findings suggest that both schools' and individuals' capacity to use data is influenced by the nature and quality of social relations among educators, which we refer to as teacher networks. However, little work has focused specifically on examining the presence, quality, strength, or impact of teacher networks with respect to data use or connected these relationships to the development of school or individual capacity for data use, though calls for such research have been made (Daly, 2012). The purpose of this research is to explore these issues through a case study of an elementary school. Because this research is concerned with educators' interactions and relationships with regard to data, we utilize a social network approach in which focus is shifted from individuals and their attributes to their interactions with others and their position in a larger organizational network. This approach, therefore, enables an understanding of how individuals' or actors' interactions (or lack thereof) enable or constrain certain opportunities, for example, access to information, expertise, or other resources. Thus, in the context of data use, such analysis permits an understanding of schoolwide networks (e.g., who interacts around data, who learns from whom about data use) and individuals' position in the network (e.g., who are considered experts, who has access to expertise).


USING NETWORK ANALYSIS TO UNDERSTAND CAPACITY DEVELOPMENT


Social network theory has been applied to a number of issues in education research, particularly to understand implementation and sustainability of reforms (Atteberry & Bryk, 2010; Coburn & Russell, 2008; Cole & Weinbaum, 2010; Daly & Finnigan, 2009, 2011; Daly, Moolenaar, Bolivar, & Burke, 2010; Penuel, Riel, Krause, & Frank, 2009; Spillane et al., 2009), teacher collaboration (see Moolenaar, 2012, for a discussion), professional learning (Penuel, Sun, Frank, & Gallagher, 2012), and diffusion of innovation (Frank, Zhao, & Borman, 2004; Frank, Zhao, Penuel, Ellefson, & Porter, 2011). Though at the time of this study no work had specifically focused on the development of capacity for data use, a number of relevant network theory concepts employed in education research are instructive in understanding the role of networks in developing teacher capacity. Although a comprehensive review of network concepts is beyond the scope of this analysis (see Daly, 2012, for such a discussion), we focus on three key issues relevant to our purposes.


NETWORK CHARACTERISTICS


The foundation of social network theory is the social ties between actors, and the pattern of ties among individuals creates a network structure. A good deal of social network analysis in education research has focused on characteristics of network structure as a way of understanding how resources flow to and from individuals in the network. Of particular interest here are the network characteristics density (a measure of how many of the organization's members interact with each other) and centralization (how evenly distributed interactions are across actors). Education research typically uses these measures to assess organizational attributes such as social capital, capacity for reform, and organizational learning (Atteberry & Bryk, 2010; Daly & Finnigan, 2010; Daly et al., 2010; Finnigan & Daly, 2012; Moolenaar & Sleegers, 2010). If capacity is conceptualized as embedded in the social ties among teachers, then information about the structure of those relations helps us to understand how capacity moves or develops in an organization.


CHARACTERISTICS OF RELATIONS


From a network theory perspective, not all ties among actors are the same. Relations between individuals can be characterized in three ways: relations, interactions, and flows (Borgatti, Mehra, Brass, & Labianca, 2009). Social relations describe the nature of the relationship (e.g., colleagues, friends). Interactions quantify the nature of a social relation (e.g., frequency, strength). Flows indicate resources that move through those relations (e.g., expertise, knowledge). Research on social networks in education has documented that the structure of networks can vary by the type of relation and that each type of relationship can have a different impact on outcomes such as beliefs, attitudes, and practice (Cole & Weinbaum, 2010; Finnigan & Daly, 2012). To help categorize ties, previous studies have characterized networks as expressive, referring to natural and social affinities, or instrumental, characterized as developing for a particular purpose, such as advice networks (e.g., Cole & Weinbaum, 2010; Daly & Finnigan, 2010, 2011; Finnigan & Daly, 2012; Spillane & Kim, 2012). Research also shows that the frequency or strength of interactions may be related to outcomes of reform, such as in the case of implementation of mathematics reform (Coburn, Russell, Kaufman, & Stein, 2012; Penuel et al., 2012). That is, how often actors interact or the importance of the relationship may help explain outcomes such as degree or fidelity of implementation. There is also growing attention to the resources that move through relations in an organization. Research in education has focused on how networks serve as a mechanism for creating or constraining access to resources such as expertise (Baker-Doyle & Yoon, 2010; Frank et al., 2004; Penuel et al., 2012) or research evidence (Finnigan, Daly, & Che, 2013).


ACTOR POSITION


Network theory holds that an actor's position in a network is related to his or her degree of influence in the organization. The concept of centrality is one often applied in education research. Central actors are hypothesized to have substantial influence or control over how resources, information, or expertise are distributed throughout an organization. Recent work suggests the power of central actors in the diffusion of knowledge and attitudes. For example, studies have explored the centrality of leadership in instructional advice networks (Spillane & Kim, 2012) as a way of understanding influence on the instructional program of a school. In contrast, individuals on the periphery may have fewer ties to others and therefore be dependent on few relations for access to organizational resources. Those in between can serve as connectors or bridges, such as central offices serving as brokers of policy to schools and between groups (Daly & Finnigan, 2011) or coaches supporting the diffusion of an initiative (Atteberry & Bryk, 2010).


In the context of data use, network structure, content and strength of relations, and actor position are useful concepts in understanding the socially embedded nature of capacity to use data. They provide ways to measure which resources, such as knowledge and expertise around data use, are shared, as well as how often and between whom they are shared. Thus, an understanding of teacher networks should be a productive lens for understanding both individuals' and schools' development of capacity. To explore this proposition, our research examines the role of teacher networks in the development of capacity for data use in an elementary school. Drawing on social network theory, we focus on the structure of data advice networks, the characteristics of those from whom advice is sought, and the productiveness of the network in terms of influencing beliefs and practice. Specifically, the present study seeks to address the following questions: (1) What are the characteristics of professional and data advice networks in the school? (2) Who are the providers of support for data use, and what are their characteristics? (3) Are data advice networks productive in developing shared data beliefs and practices?


DATA AND METHODS


Data for this analysis were drawn from a broader multiple case study of school data use in four elementary schools (serving Grades K-5) in two districts in a mid-Atlantic state. Districts were selected based on comparable size and diversity of population. Schools were selected with the advice of the superintendents, who identified one school where data use was considered a strength and another where it was considered an area for improvement. In separate analyses conducted as part of the research project, one school was identified as an exemplar in terms of data use. We elected to focus on this school, Allegheny Elementary, for further investigation of the development of data use capacity.


SAMPLE AND CONTEXT


Allegheny Elementary School is a high-performing school of about 550 students and 53 educators. Allegheny is one of 12 schools in a primarily urban district. The school serves a majority low-socioeconomic-status (88% of students were eligible for free/reduced-price lunch) and non-White student population (50% of students were African American, 30% were Hispanic), with a substantial portion of English language learners (20%) and students requiring special education services (12%).1 In 2010-2011, 85% of students exceeded math and reading standards.


To further describe the context of Allegheny, we draw on previous analyses (see Farley-Ripple & Buttram, 2013, 2014). Many of the conditions research has identified as important for data use were evident in this school, including leadership for data use, a strong sense of professional community, and key resources. The principal was highly involved in the instructional program and targeted resources to meet instructional needs. For example, by hiring a literacy coach, a math coach, and several reading specialists, the principal increased the instructional staff of the school; this enabled him to assign and integrate instructional specialists into professional learning communities, provide release time for teacher collaboration and professional development, and provide more targeted support to struggling teachers and students. The principal emphasized team and leadership development through collaboration and suggested that many teaching-related activities should be done in this collaborative environment, including discussing lesson plans, sharing activities, discussing how students were doing, and ultimately delving into specific topics based on an agenda and goals they would develop together. The school had been implementing professional learning communities for grade-level teams for two years prior to 2010 and had a structure in place to allocate 90 minutes during and after school time for collaborative data use.


As a result of both leadership and collaboration, the school had established norms and practices for using data to drive improvement. Further, teachers and administrators utilized a wide range of data, facilitated by district investment in a comprehensive data system, to inform classroom, grade-level, and schoolwide decisions. For example, teachers were expected to regularly examine data and to set short-term goals for student performance between assessments. At faculty meetings, teams posted both goals and student results by individual teacher; instructional strategies were shared, improvements were suggested, and successes, both group and individual, were celebrated.


In many ways, Allegheny is an outlier. In our previous research (Farley-Ripple & Buttram, 2013, 2014), we found that Allegheny was uniquely successful in producing schoolwide data use for instructional improvement. More broadly, it outperformed most elementary schools in the state and all schools in its district despite serving a traditionally disadvantaged population. As an outlier, it is valuable as a case study because the context for developing capacity for data use is not confounded with other issues that research has identified as constraining data use, including those organizational conditions examined in previous research, thus offering insight into the specific influence of networks on the development of data use capacity. Although its uniqueness limits generalizability of specific findings, we can use what we learn from Allegheny as a foundation for exploring capacity development in other contexts. Furthermore, our analysis of Allegheny illustrates several ways in which a network analytical approach can be valuable in understanding individual and school capacity for data use.


DATA


The larger study from which our data are drawn employed a mixed-methods approach in which qualitative data (interviews, focus groups, and document analysis) were coupled with survey data. Data were collected during the 2010-2011 school year. For the purposes of the present analyses, we focus on data relevant to the examination of social networks at Allegheny, drawn from a schoolwide survey administered in May 2011, with an 80% response rate (n = 42).


The survey included three types of items: social network items; items related to beliefs about data and data use practices; and teacher background questions. To collect information about teachers' social networks, respondents were asked two separate sets of questions using a bounded approach in which we provided a list of all the staff in the school. First, we asked individuals to identify their professional network by listing up to five of their closest professional colleagues and indicating how often they interacted with them on a 5-point scale from daily to less than a few times a year. Although professional networks are not a specific focus of this research, we included this measure to gauge the frequency with which teachers generally interacted professionally in comparison with interactions around data use and as a measure of the social capital of the teachers and the school. In addition, we asked a set of instrumental questions related to data advice. Specifically, we asked respondents to identify their data advice network by (1) listing up to five colleagues to whom they go for help or advice using data and (2) indicating how often they go to those colleagues on a 5-point scale from daily to less than a few times a year.


To explore the productivity of networks, we selected two outcomes intended to capture teachers' reported beliefs and practices with respect to data use. We utilized two scales from Wayman and colleagues (2009): a five-item scale related to beliefs about the value of data in instruction to represent the construct pedagogy, and a second five-item scale related to how data were used in instructional decision making to represent the construct practice (see Appendix A for items). We recognize that these items represent simple conceptualizations of belief and practice and may not include important dimensions of each construct; however, responses to the items demonstrated sufficient reliability and variability for our exploratory purposes. Last, we included background questions related to role in the school, years of experience, and perceived comfort and expertise using data. Descriptive statistics on all measures used are presented in Appendix A.


ANALYSIS


Analyses of scale items were conducted in SPSS. Appendix B presents more information about the items and reliability for pedagogy, practice, and comfort. Social network data were analyzed using UCINET (Borgatti, Everett, & Freeman, 2002) and NetDraw (Borgatti, 2002). The social network items from the survey yielded two matrices, professional and data advice, in which ties were characterized by both direction (whom the respondent identified) and strength (how often the respondent went to that person). We utilized these matrices to answer research questions 1 and 2.
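To make the data structures concrete, the sketch below shows how valued, directed matrices of this kind can be assembled from survey nominations. The respondent IDs, nominations, and frequencies are hypothetical, and the code is illustrative only; the study's actual processing was carried out in SPSS and UCINET.

```python
import numpy as np

# Hypothetical nominations: (respondent, nominated colleague, frequency 1-5),
# where 5 = daily and 1 = less than a few times a year. IDs and values are
# illustrative, not the study's data.
advice_nominations = [
    ("T01", "T18", 5),
    ("T02", "T18", 4),
    ("T02", "T05", 2),
    ("T05", "T18", 3),
]

staff = sorted({person for tie in advice_nominations for person in tie[:2]})
index = {name: i for i, name in enumerate(staff)}

# Valued, directed matrix: rows = advice seekers, columns = advice givers.
advice = np.zeros((len(staff), len(staff)))
for seeker, giver, freq in advice_nominations:
    advice[index[seeker], index[giver]] = freq

# Dichotomized version: keeps only whether a tie exists, ignoring its strength.
advice_binary = (advice > 0).astype(int)
```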


The first research question focuses on the characteristics of professional and data advice networks in the school. We use network measures of density and centrality to characterize the structure of professional and data advice networks. We first calculated density, the number of ties between actors divided by the total number of possible ties, as a measure of the connectedness of the school. That is, if there are more social ties between actors, we might expect information or resources to flow more quickly through the school. In this case, we are interested in general professional interactions and interactions specific to advice related to data use. Therefore, we compare the density of the professional network to the density of the data advice network using a bootstrap paired-sample t test.
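A minimal sketch of the two density calculations follows, assuming a dichotomized adjacency matrix like the one built above; the bootstrap paired-sample t test itself is a UCINET routine and is not reproduced here.

```python
import numpy as np

def densities(binary_adj):
    """Density of a directed binary network, with and without directionality."""
    a = np.array(binary_adj, dtype=float)
    np.fill_diagonal(a, 0)
    n = a.shape[0]

    # Accounting for directionality: present arcs / possible arcs n(n - 1).
    directed_density = a.sum() / (n * (n - 1))

    # Ignoring directionality: a pair counts if a tie exists in either direction.
    symmetrized = np.maximum(a, a.T)
    undirected_density = (symmetrized.sum() / 2) / (n * (n - 1) / 2)

    return undirected_density, directed_density
```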


As a second overall network measure, we calculated Freeman's degree centralization. This measure is based on the number of incoming ties to each actor (indegree) and the number of ties from an actor to others (outdegree). Because our data constrained the outdegree of individuals by requesting identification of up to five of their colleagues, we focus on indegree as a measure of centrality. At the network level, the indegree centralization statistic measures how concentrated indegree is for a given network. Thus, higher values (closer to 1) indicate a concentration of indegree among few individuals and thus a more centralized network. Central actors can be considered influential and may exert disproportionate control over resources. In an advice network, a highly centralized structure may indicate that expertise (or perceived expertise) is located within a few individuals rather than dispersed throughout the organization.
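The sketch below illustrates one common way to compute indegree centralization for a directed, dichotomized network; UCINET's exact normalization may differ, so this should be read as a conceptual illustration rather than the study's computation.

```python
import numpy as np

def indegree_centralization(binary_adj):
    """Freeman-style indegree centralization for a directed, dichotomized network.

    Uses a common normalization for directed graphs, (n - 1)^2; UCINET's
    convention may differ, so treat this as a sketch of the idea.
    """
    a = np.array(binary_adj, dtype=float)
    np.fill_diagonal(a, 0)
    n = a.shape[0]
    indegree = a.sum(axis=0)          # column sums = number of ties received
    return float((indegree.max() - indegree).sum() / ((n - 1) ** 2))
```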


We also use the quadratic assignment procedure (QAP) to measure the association between these networks. This procedure calculates the correlation between any pair of matrices (because all matrices are valued data, the Pearson correlation coefficient is most appropriate) and produces a measure of association. The procedure then runs a series of permutations (here, n = 500) that randomly match actors, reporting the proportion of permutations in which the association is larger or smaller than the observed association. These proportions are used to determine the statistical significance of the observed association. We correlated the two matrices in two ways. First, we dichotomized the matrices to observe the correlation based entirely on whom individuals identified as part of their network, absent measures of strength. This provides a way of assessing whether the members of the networks differ, irrespective of how often individuals interact. Second, we correlated the matrices in which ties indicate strength, which provides a measure of whether the frequency of interaction with members differs between networks.
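As an illustration of the permutation logic, the following sketch computes a Pearson correlation between two square matrices and a simple QAP test. It is not UCINET's implementation, and the one-sided counting shown here is a simplification of the larger-or-smaller reporting described above.

```python
import numpy as np

def qap_correlation(x, y, n_perm=500, seed=0):
    """Pearson correlation between two square matrices with a QAP permutation test.

    Rows and columns of one matrix are permuted together, preserving its
    structure, and the observed correlation is compared with the permutation
    distribution. A sketch of the procedure, not UCINET's implementation.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    rng = np.random.default_rng(seed)
    mask = ~np.eye(x.shape[0], dtype=bool)      # ignore the diagonal (self-ties)

    def corr(a, b):
        return np.corrcoef(a[mask], b[mask])[0, 1]

    observed = corr(x, y)
    as_large = 0
    for _ in range(n_perm):
        p = rng.permutation(x.shape[0])
        if corr(x[np.ix_(p, p)], y) >= observed:
            as_large += 1
    return observed, as_large / n_perm           # proportion at least as large
```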


To answer the second research question, which focuses on providers of support for data use, we draw again on centrality as a means of identifying key actors in the data advice network. We calculate Freeman's indegree for each individual to assess consistency across networks and to delineate central from peripheral actors in the advice network. Indegree is calculated at the actor level by summing the number of actors seeking advice from that individual. Because relations are valued by the frequency of interaction, indegree is the sum of values for all incoming ties to an actor. We articulate three groups: individuals in the top 10th percentile for indegree centrality (n = 6), those from whom anyone sought advice (indegree > 0, n = 24), and those from whom no one sought advice (n = 27). We then utilized descriptive analyses to compare characteristics of each group based on role in the school as well as on responses to the following survey items: years of experience in the school and in education; pedagogy and practice scales; school culture scales; items related to perceived comfort and expertise using data; and value of particular professional development opportunities related to data use. We also use flow-betweenness as a measure of centrality to identify those who connect other advice seekers with central advice givers. These analyses provide a profile of the empirically defined experts in the school.
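The grouping described above can be sketched as follows, using a small hypothetical advice matrix of the same form as the earlier example; the 90th-percentile threshold and group labels mirror the text, while flow-betweenness (a UCINET measure) is not reimplemented here.

```python
import numpy as np

# Hypothetical valued advice matrix (rows = seekers, columns = givers),
# mirroring the structure built in the earlier sketch.
advice = np.array([
    [0, 0, 5, 0],
    [0, 0, 4, 2],
    [0, 0, 0, 0],
    [0, 0, 3, 0],
], dtype=float)

weighted_indegree = advice.sum(axis=0)        # sum of incoming tie strengths
raw_indegree = (advice > 0).sum(axis=0)       # number of distinct advice seekers

threshold = np.percentile(weighted_indegree, 90)
highly_central = weighted_indegree >= threshold             # top advice givers
other_givers = (weighted_indegree > 0) & ~highly_central    # sought out at least once
seekers_only = weighted_indegree == 0                       # never named as a source
```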


Last, we utilize a QAP regression model to explore the productivity of the advice network in producing similarity in responses to the practice and pedagogy scales (research question 3). Survey responses were used to generate attribute matrices to complement matrices measuring social networks. Attribute matrices allow nodes in a network to be grouped by a characteristic. For example, it is possible to examine relationships within and between groups, with a variable indicating group membership in the attribute data file. Attribute matrices may also be transformed into similarity or difference matrices. Such a transformation produces a measure of how similar a respondent is to (or dissimilar from) each other member of the network. Importantly, in this third analysis, we do not consider whether the advice networks function to improve practice; rather, we focus on development of shared practices (as measured by similarities/differences in the practice scale). We do not presume that measures of data practice utilized here are sensitive to quality of data use, nor are we certain that there is consensus on what constitutes good use of data. Thus, we qualify the results of the third research question in that we focus on the productivity, rather than the effectiveness, of advice networks.


We utilize the similarity transformation in UCINET to generate symmetrical similarity matrices for the scales pedagogy and practice, which produces a continuous variable capturing the absolute difference in scale scores on these two measures. These matrices therefore indicate how similar one's beliefs and practices are to those of every other member of the matrix.
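A sketch of this absolute-difference transformation follows, using hypothetical scale scores; in the study itself, the transformation was produced by UCINET's similarities routine.

```python
import numpy as np

def absolute_difference_matrix(scores):
    """Pairwise absolute differences in a scale score (e.g., practice).

    Smaller entries mean two educators report more similar practices; matrices
    of this form serve as the dependent variables in the QAP regressions.
    """
    scores = np.asarray(scores, dtype=float)
    return np.abs(scores[:, None] - scores[None, :])

# Hypothetical practice scale scores for five respondents.
practice_diff = absolute_difference_matrix([13.0, 11.0, 15.0, 12.0, 14.0])
```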


We hypothesize that a productive advice network would lead to similarities in these scales. To predict that similarity, we incorporate a set of covariate matrices. First, we include the professional matrix to control for the baseline social structure of the school. Alternatively, we consider models in which educators' role (grade level, administrator, and so on) is included, because professional relations are stronger within role categories than between them, and role more clearly distinguishes between groups than the professional network does. Second, we add as predictors matrices that capture the frequency of data advice relations. We do this in three ways. We first model similarities in terms of the frequency network itself, a simple measure of reported advice seeking. This matrix, however, does not link all actors to all other actors; that is, Actor A may seek advice from Actor B, who seeks advice from Actor C, yet the frequency matrix will not show any relation between A and C even though it is likely that C has significant but indirect influence on A. We therefore add two alternatives based on transformations of the frequency matrix. The first is a matrix of geodesic distance, the shortest path between two actors based on the presence of ties, irrespective of strength. Because the data advice network measures strength of ties as well, we can further refine the measure to capture the optimum path, defined as reachability. The strength of a path is defined as the strength of its weakest link. The optimal path between two nodes is defined as the one with the highest overall strength, which is to say, the one with the strongest weakest link. The weighted reachability matrix gives the strength of the optimal path between each pair of nodes.
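The sketch below computes both transformations, geodesic distance on the dichotomized network and reachability as the strongest-weakest-link path strength, using a Floyd-Warshall-style update; it is a conceptual illustration, not UCINET's algorithm.

```python
import numpy as np

def geodesic_and_reachability(valued_adj):
    """Shortest-path lengths (ignoring tie strength) and widest-path strengths.

    Reachability follows the 'strongest weakest link' idea: the strength of a
    path is its weakest tie, and the optimal path maximizes that value.
    """
    a = np.asarray(valued_adj, dtype=float)
    n = a.shape[0]

    # Geodesic distance on the dichotomized network.
    dist = np.where(a > 0, 1.0, np.inf)
    np.fill_diagonal(dist, 0.0)
    for k in range(n):
        dist = np.minimum(dist, dist[:, [k]] + dist[[k], :])

    # Reachability: maximize, over paths, the minimum tie strength along the path.
    reach = a.copy()
    np.fill_diagonal(reach, 0.0)
    for k in range(n):
        reach = np.maximum(reach, np.minimum(reach[:, [k]], reach[[k], :]))

    return dist, reach
```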


We utilize a double-semi-partialling multiple regression technique offered in UCINET. Like the QAP correlation procedure, QAP regression (Krackhardt, 1987) produces estimates for R-square and coefficients based on the observed data, then compares the results with those of permutations (here, n = 2,000) in which the rows and columns of the dependent matrix are randomly permuted. Reported results indicate the proportion of permutations in which coefficients as low or as high as those reported were found, with a low proportion (< .05) indicating it is unlikely that the observed relationship occurred by chance. Because our model controls for professional ties, we use a modified technique for multiple regression suggested by Dekker, Krackhardt, and Snijders (2007), referred to as double semi-partialling.
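To show the general shape of such a model, the sketch below fits ordinary least squares on the vectorized off-diagonal cells and permutes the dependent matrix; the double-semi-partialling variant used in UCINET permutes residuals instead, so this is a simplified stand-in rather than the study's procedure.

```python
import numpy as np

def qap_regression(y, xs, n_perm=2000, seed=0):
    """OLS on vectorized off-diagonal cells with a QAP permutation test on y.

    This sketch permutes the dependent matrix directly; UCINET's double
    semi-partialling (Dekker, Krackhardt, & Snijders, 2007) permutes residuals,
    which behaves better when predictors are correlated.
    """
    y = np.asarray(y, dtype=float)
    xs = [np.asarray(x, dtype=float) for x in xs]
    rng = np.random.default_rng(seed)
    n = y.shape[0]
    mask = ~np.eye(n, dtype=bool)
    X = np.column_stack([np.ones(mask.sum())] + [x[mask] for x in xs])

    def fit(dep):
        beta, *_ = np.linalg.lstsq(X, dep[mask], rcond=None)
        return beta

    observed = fit(y)
    exceed = np.zeros_like(observed)
    for _ in range(n_perm):
        p = rng.permutation(n)
        exceed += np.abs(fit(y[np.ix_(p, p)])) >= np.abs(observed)
    return observed, exceed / n_perm      # coefficients and permutation p-values
```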


RESULTS


In this section, we present the results of our analyses and organize this discussion in relation to the specified research questions.


CHARACTERISTICS OF SCHOOL NETWORK STRUCTURE


A visual representation of network structure is useful in contextualizing numeric and statistical measures. Figures 1 and 2 illustrate the professional and general data advice frequency networks, respectively. Relations are represented by arrows, with arrowheads indicating the direction of the relationship. The shape of the nodes indicates the educator's role in the school: Circles identify classroom teachers in Grades 1-5, diamonds indicate related arts teachers, triangles indicate instructional specialists such as math coaches and reading teachers, circles within squares indicate counselors or educational diagnosticians, and squares indicate administrators. The size of each node illustrates its centrality in the network, with larger nodes having greater indegree, that is, more people coming to them for advice. These figures shed light on key differences in the networks, illustrated statistically later in this article.


Figure 1. Teachers' general professional network, by role



Figure 2. Teachers' data advice network


Table 1 presents the network characteristics of, and correlation between, Allegheny Elementary's professional and advice-seeking networks. Both average degree and density suggest that there are substantial differences between the professional and data advice networks. In the professional network, ties are quite dense, with 50% of all possible ties between members of the staff present if we discount directionality (that is, if we assume that it doesn't matter whether the relationship is reciprocal). In contrast, only 24% of ties are present in the data advice network. If we account for directionality, these networks are even sparser, with a small percentage of ties being utilized for data advice. These findings are visible in Figures 1 and 2, where the professional network appears more densely connected than the data advice network.



Table 1. Network Characteristics and Pearson's r Correlation

               Density            Density              Indegree          Dichotomized    Valued Matrix
               (ignoring          (accounting for      Centralization    Correlation     Correlation
               directionality)    directionality)
Professional   .506               0.06                 16.68%            .86***          .44***
Data Advice    .24***             0.04***              35.08%

***p < .001.




Centralization of the networks differs substantially as well, with the data advice network being far more centralized than the professional network. This, too, is evident in Figures 1 and 2, where several actors appear central in the professional network while the data advice network is highly focused on actor 18.


The correlation of dichotomized matrices indicates the degree to which actors interact with the same individuals for multiple purposes. As is evident, the networks are highly correlated, which can be interpreted as an indicator that educators at Allegheny Elementary generally work within their existing and primary professional network when seeking advice related to data use, rather than seeking out advice from others with whom they interact less frequently. Professional and data advice networks are less strongly associated, however, in terms of strength of interaction; the correlation between valued matrices is .44. This suggests that although the individuals constituting the networks are highly similar, there is a difference in the frequency of interaction around general professional issues and data use.


PROVIDERS OF DATA ADVICE


We used actors' indegree centrality as a measure of how influential they were in the data advice network of the school. The mean frequency-weighted indegree across actors was 8.24, with a range of 0 to 96. Discounting frequency of interaction, the average number of people seeking advice from each individual was 2.11, with a range of 0 to 26. As discussed earlier, the data advice network is highly centralized (indegree centralization of 35%), meaning that a few individuals are particularly important in distributing advice related to data use. We therefore identified three groups to profile as part of the second research question. First, we identified highly central actors, defined as exceeding the 90th percentile threshold for frequency-weighted indegree centrality (25.4). Six individuals fit that profile (3 responded to the survey). Second, we identified the set of actors with indegree greater than zero, indicating that they were a source of advice to others in the school (n = 20; 17 responded to the survey). Last, we identified those who were not sources of advice but rather served only as seekers (n = 27; 18 responded to the survey). We present characteristics of these three groups in Tables 2 and 3.


Table 2. Role Differences by Centrality Categories

                                        Role in School
                         Core Subject         Formal Leader           Related Arts
                         Classroom Teacher    (Administrator,         Teacher                Total
                                              coach, etc.)            (Library, art, etc.)
>90th percentile
  Count                  1                    5                       0                      6
  % within role          3.6%                 29.4%                   0.0%                   11.3%
Other support givers
  Count                  12                   6                       2                      20
  % within role          42.9%                35.3%                   25.0%                  37.7%
Advice seekers
  Count                  15                   6                       6                      27
  % within role          53.6%                35.3%                   75.0%                  50.9%
Total
  % within role          100.0%               100.0%                  100.0%                 100.0%

Note: Chi-square test of statistical significance yields p < .05.



Table 3. Characteristics of School Staff Based on Centrality Categories

                              Years in   Years in                                     Still Need to   Flow
Centrality                    School     Education   Pedagogy   Practices   Comfort   Learn a Lot     Betweenness***
>90th percentile      Mean    4.67       10.00       29.67      11.00       24.0000   3.33            .0314
                      N       3          3           3          2           2         3               6
                      SD      2.082      7.937       .577       7.071       5.65685   1.528           .06041
Other support         Mean    3.76       8.47        27.88      12.75       24.3333   3.29            .1574
providers             N       17         16          17         12          15        17              20
                      SD      1.786      8.421       2.118      3.467       1.71825   1.213           .28829
Advice seekers        Mean    3.78       7.42        27.61      13.40       23.0000   3.61            .0056
                      N       18         18          18         15          17        18              27
                      SD      4.008      7.612       2.090      6.685       3.33542   1.378           .02903
Total                 Mean    3.84       8.08        27.89      12.97       23.6471   3.45            .0658
                      N       38         37          38         29          34        38              53
                      SD      3.009      7.811       2.064      5.408       2.83786   1.288           .19076

***Analysis of variance (ANOVA) test of statistical significance yields p < .001.


Those offering advice, including the most central actors, were more likely to be in nonclassroom roles and to hold formal leadership positions. The principal, four coaches, and a third-grade teacher constituted the most influential actors in the data advice network, whereas core classroom teachers (grade-level teachers) were far more likely to be advice seekers. Results also suggest that highly central actors (those from whom advice is most frequently sought) are less likely to seek advice themselves and that those from whom no advice is sought are also less inclined to seek advice. In the middle are those who both routinely seek and give advice. This creates a picture in which key actors, though not necessarily highly central in terms of indegree, are mediators of information flowing between more and less central actors in the network. We include a measure of this mediating role, termed flow-betweenness, in the analysis. Aside from role, this is the only other highly significant difference between centrality groups.


In addition, there is some indication that highly central actors had higher scores on the pedagogy scale and were more experienced than their peers, though missing data here are particularly problematic. There is also evidence that advice seekers had lower responses on pedagogy and comfort using data, as well as stronger agreement that they still need to learn a lot about data use and higher scores on practice. However, these differences are not statistically significant.


It is worth focusing here on the most central actor in the data advice network, the literacy coach, who had an indegree 2.5 times that of the next highest advice giver. This indicates that this individual serves a highly critical role in supporting data use in the school. Interview data from the larger study confirm that the literacy coach is charged by the district with training teachers to use a variety of instructional programs and that she is deployed by the principal in a number of leadership and support roles. Although these responsibilities are not specific to data use, the instructional staff clearly perceives her role to encompass support for data use.


PRODUCTIVITY OF DATA ADVICE NETWORKS


Answers to our first two research questions inform our analysis of the third: whether the data advice network can be productive in influencing beliefs about the value of data for instruction (pedagogy) as well as reported practices with data (practice). First, we know that professional and data advice networks are highly correlated in terms of whom an individual interacts with, but that the frequency and purpose of interaction are far less tightly associated. Therefore, we want to control for teachers' preexisting professional networks to distinguish between any professional interaction and those focused on data advice. Second, we know that the data advice network is highly centralized but that a set of actors mediates the flow of information from the perceived experts (i.e., highly central actors) to those who are primarily advice seekers. Based on this information, we suspect our dependent variables may be a result not just of direct interaction between actors but of indirect influence as well. Therefore, we produced several regression models for each of the dependent variables. Results are presented in Table 4.


Table 4. QAP Multiple Regression Models Predicting Responses on Pedagogy and Practice Scales (β, with standardized β in parentheses)

Practice
                 Model 1           Model 2       Model 3     Model 4           Model 5         Model 6          Model 7
R2               .007              .003          .005        .151***           .008            .009             .153*
Intercept        5.67*             5.52***       3.70***     14.2***           5.68***         4.25***          11.6*
Professional     -.25** (-.08**)                                               -.20* (-.04*)   -.18* (-.05**)   -.24** (-.08**)
Frequency                          -.38 (-.06)                                 -.23 (-.04)
Geodesic                                         .30 (.07)                                     .24 (.05)
Reachability                                                 -2.32* (-.39*)                                     -2.67** (-.38**)

Pedagogy
                 Model 1           Model 2       Model 3     Model 4           Model 5         Model 6          Model 7
R2               0                 0             0           0                 0               0                0
Intercept        2.3***            2.30***       2.18***     2.1**             2.30***         2.27***          2.28***
Professional     -.01 (-.01)                                                   -.01 (-.01)     -.01 (-.01)      -.01 (-.01)
Frequency                          -.03 (-.01)                                 -.02 (-.01)
Geodesic                                         .02 (.02)                                     .004 (.003)
Reachability                                                 .06 (.04)                                          .008 (.004)

Note: QAP = quadratic assignment procedure. Standardized intercept coefficients are 0 in all models.


We first ran models including only the professional network and then only the data advice network measures (frequency, geodesic, and reachability). As in ordinary least squares linear regression models, the R-square is interpreted as the degree to which variance in the dependent variable is explained by the independent variables. The dependent variables are measures of the difference between responses on each scale, with smaller values indicating greater similarity and larger values indicating greater difference. The independent variables are measures of strength of interaction; thus, higher values are associated with stronger interactions.


Analyses produced no meaningful explanation for the variability in the pedagogy scale (R2 = 0), indicating that the most important predictors of beliefs about data are not accounted for in this model or that there is insufficient variability in this measure within the school context. In either case, our models offer no evidence suggesting the extent to which data advice networks influence beliefs about the value of data in instruction.


We focus, then, on predicting reported data use practices. Professional network ties predict less than 1% of the variability in practice, though the effect was statistically significant and in the expected direction: A one-unit increase in the frequency of professional ties is associated with a .25 SD increase in similarity in practice. Models that incorporated direct measures of interaction (frequency) and nonoptimal measures of indirect interaction (geodesic) did not explain much of the similarity in practices (R2 < .01). However, the optimal data advice measure, reachability, explained a substantially greater proportion of the variability in practice: 15.1%. Further, an increase in reachability is associated with a decrease of .39 SD in differences in practice. When controlling for professional network relations, the model fit improves slightly to 15.3%, and the association with reachability is .38 SD.


DISCUSSION OF THE ROLE OF TEACHER NETWORKS AT ALLEGHENY


The purpose of this research has been to consider how a socially embedded conceptualization of capacity can help us to understand how data use capacity develops. Drawing on the results of our analyses, we focus the following discussion on what can be learned from the structure of data advice networks, actor position within those networks, and the productivity of those networks in producing shared practice.


DENSITY, CENTRALIZATION, AND THE ASSOCIATION BETWEEN NETWORKS


Analysis suggests that educators' professional networks and data advice networks are highly similar in terms of the individuals from whom teachers seek support (the relation), yet are less similar when evaluating the strength of relationships as measured by frequency. Further, analyses indicate that the professional network is denser than the data advice network, whereas the latter is more centralized. These findings are interesting for a number of reasons.


The finding that the actors in professional and data advice networks are generally the same suggests the power of professional networks for building capacity. That is, the individuals with whom one primarily interacts are those through whom one accesses organizational resources, including knowledge and expertise, as in this study. Conversely, resources may flow primarily through preexisting networks rather than through issue-specific networks formed on the basis of other factors such as expertise. Professional and data advice networks are less strongly associated, however, in terms of strength of interaction. This finding is intuitive: Professional interactions are not limited to data use, nor is data use part of all professional interactions. Therefore, we might expect frequencies of interaction to differ. Nonetheless, there still exists a moderate, statistically significant correlation indicating that data use may be a substantive part of teachers' professional relationships in Allegheny.


Though the composition and frequency of interaction in data advice and professional networks are correlated, centralization and density differ significantly. Again, this is a somewhat intuitive finding. First, we expect professional ties to be denser because the professional network is a more general measure of collegial relations at Allegheny, whereas the data advice network is more instrumental and limited to relations based on specific needs (here, advice on data use). As summarized by Daly and Finnigan (2009) and Daly and colleagues (2010), denser networks are associated with capacity for exchanging resources and deep implementation of reform, which is confirmed by our finding that data advice networks generally tap into the denser web of professional ties. Conversely, sparser ties (i.e., the data advice network) may be associated with the diffusion of innovation because these ties are utilized to seek new information, which is confirmed by our finding that data advice networks may be productive resources for developing shared practice. Therefore, our findings support earlier work arguing that both denser and sparser networks may be important within an organizational structure because they provide access to different kinds of information and resources (Haythornthwaite, 2001; Tenkasi & Chesmore, 2003, as cited in Daly et al., 2010).


Relatedly, we might expect the network for data use advice to be more centralized, as it is found to be at Allegheny, because expertise may not be evenly distributed across an organization; rather, there is likely to be a subset of individuals from whom advice is sought. A greater degree of centralization may indicate that educators in Allegheny perceive different degrees of expertise and seek knowledge only from relevant sources. In this case, advice seeking based on perceived expertise appears to be productive in developing shared practice, suggesting that capacity for data use can develop through a centralized instrumental network.


KEY ACTORS IN DATA ADVICE NETWORKS


Our analysis of data advice networks examines characteristics of highly central actors (those perceived as experts) in the Allegheny community, moderate advice givers, and advice seekers. Findings reveal few distinct characteristics that define each group: There were no significant differences in terms of experience, responses to pedagogy and practice scales, or comfort using data.


In an advice network, we might expect that actors with a particular level of skill or knowledge would be tapped for advice, which would explain earlier findings related to centralization. However, in the data advice network, proxies for expertise (experience and responses to the pedagogy and practice scales) revealed few differences, none of which were statistically significant. Alternatively, expertise might be signaled by whether the actor feels that he or she needs to learn a lot more about data use, or by his or her personal comfort using data. It may be that educators turn to those who seem fluent in particular activities. However, these measures did not distinguish between advice-giving and advice-seeking groups either. Thus, findings suggest that how educators in Allegheny define expertise was not observed in this study. Nonetheless, findings related to comfort and learning may also suggest an additional characteristic of the Allegheny school community. Perceived experts' belief that they still need to learn a lot about data use and their moderate level of comfort using data imply a belief that they need to continue to learn about data use; even experts are engaged in ongoing learning to improve their practice. The practice of ongoing learning is consistent with other norms and expectations that characterize the culture of Allegheny and may be a contributing factor in its exemplary performance under challenging conditions.


Although proxies for expertise failed to distinguish among advice-giving and advice-seeking groups, major differences were associated with position in the school and betweenness in the network. Highly central actors (> 90th percentile) were far more likely to hold formal leadership positions, which, absent other observable differences in expertise, suggests that a key component of the data advice network structure in Allegheny is positional authority. That is, educators are more likely to seek advice from those with formal instructional leadership positions, which included the literacy coach, math coach, reading teachers, and principal. This may be due to structural issues in how particular roles and responsibilities are assigned, with formal leaders given duties related to data use support. An alternative may be that formal leaders achieved those roles in part because of expertise not observable in existing measures, in which case positional and expert authority may coexist in highly central individuals.


A related finding is the existence of a group that both seeks and gives advice. These actors, generally classroom teachers, have a high degree of betweenness and mirror the structure that underlies coach-to-coach models often employed in professional development. Betweenness makes these individuals critical connection points between highly central actors (perceived experts) and the majority of classroom teachers who are advice seekers only, a position that can be described as that of brokers (Moolenaar, Daly, & Sleegers, 2010). These actors facilitate or constrain the distribution and flow of resources (in this case, advice on data use) throughout the organization. Given Allegheny's exemplary status as a data user, it appears that these individuals have supported the development of data use capacity. Similarly, this between group may serve a valuable function in making the work of central actors, such as the literacy coach, more manageable: if the coach alone were assigned responsibility for data use support, providing such support to approximately 40 classroom teachers would prove highly time consuming. Through interaction with the between group, support can reach advice seekers indirectly, an important aspect of the productivity of data advice networks.
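A minimal sketch of how such brokers might be identified follows, assuming that betweenness centrality is an adequate proxy for the brokering role described above. The toy network (a few advice seekers who reach the coach only through two classroom teachers) is invented for illustration and does not represent the study data.

import networkx as nx

# Toy advice network: advice seekers reach the coach only through
# two classroom teachers, who therefore occupy brokering positions.
advice = nx.DiGraph([
    ("teacher1", "teacherA"), ("teacher2", "teacherA"),
    ("teacher3", "teacherB"), ("teacher4", "teacherB"),
    ("teacherA", "coach"), ("teacherB", "coach"),
])

# Fraction of shortest directed paths that pass through each node.
betweenness = nx.betweenness_centrality(advice)
for node, score in sorted(betweenness.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.2f}")
# teacherA and teacherB score highest: they sit between advice seekers and the coach.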


PRODUCTIVITY OF ADVICE NETWORKS


Our last objective was to explore whether Allegheny's network structure was productive. That is, can advice-seeking networks be a productive way for schools and teachers to generate data use capacity? Findings reveal little explanatory power in predicting beliefs (pedagogy) about the role of data in instruction. This may indicate measurement error, or it may indicate that other factors not considered in our model (e.g., preservice preparation, professional development) are highly influential on beliefs about data use. Findings related to practice, that is, a measure of teachers' use of data during the course of instruction, are quite different. First, we find that direct interaction does not explain similarities in practice but that a measure of indirect interaction explains a substantial proportion of variance (15%) in differences in practice, even controlling for general professional relationships. This can be interpreted to mean that practice is influenced by those beyond one's immediate and direct interactions. That is, it isn't just from whom you seek advice, but also from whom they seek advice; influence is therefore exerted indirectly. In the previous discussion, we noted the significance of the between group, who linked highly central actors to advice-seeking actors; their significance is confirmed in this finding. Overall, the analyses offer empirical evidence that advice networks can be productive in influencing teachers' practice.
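For readers interested in the dyad-level logic behind this kind of analysis, the sketch below illustrates one possible setup, under the assumption that indirect interaction is represented by two-step advice paths. The data are simulated, and a real analysis of this kind would rely on MRQAP with node-permutation tests (e.g., Dekker, Krackhardt, & Snijders, 2007) rather than the naive ordinary least squares shown here, because dyadic observations are not independent.

import numpy as np

rng = np.random.default_rng(0)
n = 10

# Direct advice ties (1 = actor i seeks advice from actor j); randomly generated for illustration.
A = (rng.random((n, n)) < 0.2).astype(float)
np.fill_diagonal(A, 0)

# Indirect ties: i reaches j through at least one intermediary (a two-step advice path).
indirect = ((A @ A) > 0).astype(float)
np.fill_diagonal(indirect, 0)

# Dyadic outcome: absolute difference between actors' scores on a practice scale,
# so negative coefficients would indicate ties are associated with more similar practice.
practice = rng.normal(size=n)
diff = np.abs(practice[:, None] - practice[None, :])

mask = ~np.eye(n, dtype=bool)                 # keep off-diagonal dyads only
X = np.column_stack([np.ones(mask.sum()), A[mask], indirect[mask]])
beta, *_ = np.linalg.lstsq(X, diff[mask], rcond=None)
print("intercept, direct-tie, indirect-tie coefficients:", np.round(beta, 3))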


IMPLICATIONS FOR RESEARCH AND PRACTICE


The purpose of this research has been to explore the development of capacity for data use from the perspective that capacity is socially embedded. Through social network analysis, we are able to present information drawn from a unique case study about the nature of professional and advice networks, the characteristics of perceived experts, and the productivity of the network in terms of shared beliefs and practices. In Allegheny, teacher networks reveal a centralized structure for seeking advice on data use that operates largely within existing professional relationships. In these networks, we observed key roles for administrators and teacher leaders, but also for classroom teachers who serve as bridges between perceived experts and less connected colleagues. Ultimately, these networks generated shared practices around data use.


The uniqueness of the case and the methodology employed have limitations that qualify our findings. First, Allegheny is an exemplar in many ways, and we do not expect that relationships observed here translate directly to other, broader contexts. However, the context of this study is generally free of many factors that constrain the development of capacity for data use (e.g., lack of access to data, poor instructional leadership), providing a unique opportunity to examine the mechanism of advice networks independent of other confounding issues. Second, our response rate on the survey, while high, still resulted in missing data (n = 11). Missing data can be highly problematic in social network analysis: nonrespondents may have patterns of relations that differ from those we observed, creating the possibility that we have drawn inaccurate conclusions about Allegheny. Third, the measures used in this study could be refined and improved. The measures of practice and pedagogy have been used in a number of contexts, but extensive validation of the instruments has not been conducted; there is a need for widely available and validated measures that allow for more consistent estimation of data use behaviors. Findings also suggest that other factors that may differentiate levels of data use expertise were not captured in our survey and would be worth including in future work.


In spite of these limitations, this research has implications for both research and practice. It offers evidence that advice networks may be effective in establishing shared practices within this school community. Although it does so in a context that enjoys strong instructional leadership, structures that support data use, and a school culture reflecting professional community, we can use what we have learned about Allegheny as a foundation for exploring capacity development in other contexts. That is, under what conditions are teacher advice networks viable mechanisms for developing capacity? Given that data advice networks were built from professional networks in Allegheny, we might expect data advice networks to be sparser or more diffuse in schools with weaker ties among teachers and less social capital among colleagues, perhaps lessening the productivity of those networks. We might also anticipate that schools lacking strong leadership for data use will have different structures for seeking advice, perhaps less centralized structures resulting in more varied sets of data use practices. These speculations are informed by what we observed at Allegheny and offer important directions for future research.


Evidence provided here also offers an empirical argument for conceptualizing capacity as socially embedded and for viewing teacher networks as a powerful mechanism for developing the capacity to use data. As consensus grows about the social nature of data use, we anticipate greater attention to network analysis moving forward. Our analysis, however, does not attend to the quality of social relations or the quality of practices; specifically, we do not identify what types of relations are likely to produce improvement in data use practice. Work by Coburn and colleagues (2012) focuses on the depth and quality of interaction around mathematics reform implementation, and such an approach would be a fruitful way of advancing the findings presented here. Second, research is needed to explore the role of leadership in instrumental networks. Our analysis suggests that those with positional authority (in this case, those with formal instructional leadership positions) are positioned to exert substantial influence over practice. Whether this is by design (e.g., in the roles and responsibilities delegated to leaders) or a result of the coexistence of expert and positional authority is not observable within the scope of this project. Further, the prominence of formal instructional leaders in data advice networks may be related to the context of the school and the nature of leadership in general; other research has found varied degrees of centrality for school leaders in advice networks (Spillane & Kim, 2012).


Findings related to the characteristics of professional and data advice networks, key actors in data advice networks, and the productivity of networks may also be instructive for several aspects of policy and practice. Our findings are relevant to the design and implementation of initiatives or interventions to support data use. The socially embedded nature of the capacity to use data, and the role of social relations as a mechanism for building that capacity, suggest that efforts to improve teachers' use of data may benefit from collaborative structures that focus teachers' interactions around data, confirming findings of previous research discussed earlier. Efforts that target individual knowledge, such as traditional professional development for select teachers, could be augmented by creating structures or processes to ensure that knowledge becomes part of the flow of information within the school (e.g., peer coaching models of professional development; Joyce & Showers, 2002). Although the effectiveness of any one strategy is not known, evidence presented here suggests that interventions that tap into social ties among teachers may be highly productive.


Our research also suggests a need to invest in the professional development of school leaders. As in Allegheny, formal instructional leaders may, by position or expertise, serve as highly central actors in data advice networks, yet professional development resources are typically focused on classroom teachers. Furthermore, leadership is not limited to administrators but includes instructional specialists (e.g., coaches or reading teachers) available to support classroom-based teachers. To the extent that these leaders are influential in the advice networks of their school, it is critical that they have a deep understanding of the strategy (here, data use) for which teachers seek support.


Relatedly, our findings are useful in considering leadership strategies for the successful implementation of reform. Here, the reform is data use, and findings suggest that dense ties, instrumental networks, central actors, and between connectors may be defining aspects of capacity development. As school leaders adopt and implement reform, knowledge of the underlying professional structure and the nature of advice networks in the school could be instrumental to successful reform, as suggested by Daly and Finnigan (2009). For example, if employing a peer coaching model of professional development, leaders can capitalize on existing networks by identifying the key teachers who connect expert and advice-seeking colleagues. Similarly, leaders can focus on fostering dense ties to develop socially embedded capacity for reform, for example, by creating collaborative structures that enable meaningful interaction among teachers and thereby produce a denser network to support reform efforts.


Finally, as in all research on data use, there is a need to more explicitly connect data use to changes in teacher practice and then to improvements in student learning. The theory of action underlying data use reform efforts is that this work will ultimately improve the quality of teaching and learning, and as we move forward in research and practice, it is essential to retain this critically important focus.


Acknowledgments


This research was funded by the Spencer Foundation's Data Use and Educational Improvement Initiative. We would also like to thank Steve Borgatti (University of Kentucky) for his support in the production of this research.


Note


1. Data accessed through school profiles on the Department of Education website: http://profiles.doe.k12.de.us/SchoolProfiles/State/Default.aspx


References


Anderson, S., Leithwood, K., & Strauss, T. (2010). Leading data use in schools: Organizational conditions and practices at the school and district levels. Leadership and Policy in Schools, 9(3), 292–327.

Atteberry, A., & Bryk, A. S. (2010). Analyzing the role of social networks in school-based professional development initiatives. In A. J. Daly (Ed.), The ties of change: Social network theory and application in education (pp. 51–76). Cambridge, MA: Harvard Education Press.

Baker-Doyle, K. J., & Yoon, S. A. (2010). Making expertise transparent: Using technology to strengthen social networks in teacher professional development. In A. J. Daly (Ed.), Social network theory and educational change (pp. 115–126). Cambridge, MA: Harvard Education Press.

Blanc, S., Christman, J. B., Liu, R., Mitchell, C., Travers, E., & Bulkley, K. E. (2010). Learning to learn from data: Benchmarks and instructional communities. Peabody Journal of Education, 85(2), 205–225.

Borgatti, S. P. (2002). NetDraw: Graph visualization software. Harvard, MA: Analytic Technologies.

Borgatti, S. P., Everett, M. G., & Freeman, L. C. (2002). UCINET for Windows: Software for social network analysis. Harvard, MA: Analytic Technologies.

Borgatti, S. P., Mehra, A., Brass, D. J., & Labianca, G. (2009). Network analysis in the social sciences. Science, 323, 892–895.

Bryk, A. S., & Schneider, B. L. (2002). Trust in schools: A core resource for improvement. New York, NY: Russell Sage Foundation.

Cho, V., & Wayman, J. C. (2014). Districts' efforts for data use and computer data systems: The role of sensemaking in system use and implementation. Teachers College Record, 116(2). Retrieved from http://www.tcrecord.org/Content.asp?ContentId=17349

Coburn, C. E. (2001). Collective sensemaking about reading: How teachers mediate reading policy in their professional communities. Educational Evaluation and Policy Analysis, 23(2), 145–170.

Coburn, C. E., & Russell, J. L. (2008). District policy and teachers' social networks. Educational Evaluation and Policy Analysis, 30(3), 203–235.

Coburn, C. E., Russell, J. L., Kaufman, J. H., & Stein, M. K. (2012). Supporting sustainability: Teachers' advice networks and ambitious instructional reform. American Journal of Education, 119, 137–182.

Coburn, C. E., & Talbert, J. E. (2006). Conceptions of evidence use in school districts: Mapping the terrain. American Journal of Education, 112(4), 469–495.

Coburn, C. E., & Turner, E. O. (2011). Research on data use: A framework and analysis. Measurement: Interdisciplinary Research and Perspectives, 9(4), 173–206.

Cole, R. P., & Weinbaum, E. H. (2010). Changes in attitude: Peer influence in high school reform. In A. J. Daly (Ed.), Social network theory and educational change (pp. 77–96). Cambridge, MA: Harvard Education Press.

Corcoran, T., Fuhrman, S. H., & Belcher, C. L. (2001). The district role in instructional improvement. Phi Delta Kappan, 83(1), 78–84.

Cosner, S. (2011). Supporting the initiation and early development of evidence-based grade-level collaboration in urban elementary schools: Key roles and strategies of principals and literacy coordinators. Urban Education, 46(4), 786–827.

Daly, A. J. (2012). Data, dyads, and dynamics: Exploring data use and social networks in educational improvement. Teachers College Record, 114(11), 1–38.

Daly, A. J., & Finnigan, K. (2009). A bridge between worlds: Understanding network structure to understand change strategy. Journal of Educational Change, 11(2), 111–138.

Daly, A. J., & Finnigan, K. S. (2010). A bridge between worlds: Understanding network structure to understand change strategy. Journal of Educational Change, 11(2), 111–138.

Daly, A. J., & Finnigan, K. (2011). The ebb and flow of social network ties between district leaders under high stakes accountability. American Educational Research Journal, 48(1), 39–79.

Daly, A. J., Moolenaar, N., Bolivar, J., & Burke, P. (2010). Relationships in reform: The role of teachers' social networks. Journal of Educational Administration, 48(3), 20–49.

Data Quality Campaign. (2013). Data for action, 2013. Washington, DC: Author. Retrieved from http://dataqualitycampaign.org/files/DataForAction2013.pdf

Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-performing school systems use data to improve instruction for elementary students. Los Angeles: University of Southern California, Center on Educational Governance.

Dekker, D., Krackhardt, D., & Snijders, T. A. B. (2007). Sensitivity of MRQAP tests to collinearity and autocorrelation conditions. Psychometrika, 72, 563–581.

Farley-Ripple, E. N., & Buttram, J. (2013). Developing collaborative data use through professional learning communities: Implementation evidence from Delaware. Studies in Educational Evaluation. http://dx.doi.org/10.1016/j.stueduc.2013.09.006

Farley-Ripple, E. N., & Buttram, J. (2014). Schools' use of interim data: Practices in classrooms, teams, and schools. In A. R. Shoho, B. Barnett, & A. Bowers (Eds.), Using data in schools to inform leadership and decision making. International Research on School Leadership Series (Vol. 5, pp. 39–66). Charlotte, NC: Information Age.

Finnigan, K. S., & Daly, A. J. (2012). Mind the gap: Organizational learning and improvement in an underperforming system. American Journal of Education, 119, 41–70.

Finnigan, K. S., Daly, A. J., & Che, J. (2013). Systemwide reform in districts under pressure: The role of social networks in defining, acquiring, using, and diffusing research evidence. Journal of Educational Administration, 51(4), 476–497.

Firestone, W. A., & Gonzalez, R. A. (2007). Culture and processes affecting data use in school. In P. A. Moss (Ed.), Evidence and decision making (pp. 132–154). Malden, MA: Blackwell.

Frank, K. A., Zhao, Y., & Borman, K. (2004). Social capital and the diffusion of innovations within organizations: Application to the implementation of computer technology in schools. Sociology of Education, 77(2), 148–171.

Frank, K. A., Zhao, Y., Penuel, W. R., Ellefson, N., & Porter, S. (2011). Focus, fiddle, and friends: Experiences that transform knowledge for the implementation of innovations. Sociology of Education, 84(2), 137–156.

Fuchs, L. S., Fuchs, D., Karns, K., Hamlett, C. L., & Katz, M. (1999). Mathematics performance assessment in the classroom: Effects of teacher planning and student problem solving. American Educational Research Journal, 36, 609–646.

Halverson, R., Grigg, J., Prichett, R., & Thomas, C. (2007). The new instructional leadership: Creating data-driven instructional systems in schools. Journal of School Leadership, 17(2), 158–193.

Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J. A., & Wayman, J. C. (2009). Using student achievement data to support instructional decision making (No. NCEE 2009-4067). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Haythornthwaite, C. (2001). Tie strength and the impact of new media. In Proceedings of the 34th Annual Hawaii International Conference on System Sciences. Maui, HI.

Honig, M. I., & Coburn, C. (2008). Evidence-based decision-making in school district central offices: Toward a policy and research agenda. Educational Policy, 22(4), 578–608.

Honig, M. I., & Venkateswaran, N. (2012). School-central office relationships in evidence use: Understanding evidence use as a systems problem. American Journal of Education, 118(2), 199–222.

Hord, S. M. (1997). Professional learning communities: What are they and why are they important? Issues About Change, 6(1), 1–8.

Ikemoto, G. S., & Marsh, J. A. (2007). Cutting through the "data-driven" mantra: Different conceptions of data-driven decision making. National Society for the Study of Education Yearbook, 106(1), 105–131.

Ingram, D., Louis, K. S., & Schroeder, R. G. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106, 1258–1287.

Joyce, B. R., & Showers, B. (2002). Student achievement through staff development. Alexandria, VA: ASCD.

Kennedy, M. M. (1982). Evidence and decision. In M. M. Kennedy (Ed.), Working knowledge and other essays (pp. 59–103). Cambridge, MA: Huron Institute.

Kerr, K. A., Marsh, J. A., Ikemoto, G. S., Darilek, H., & Barney, H. (2006). Strategies to promote data use for instructional improvement: Actions, outcomes, and lessons from three urban districts. American Journal of Education, 112(4), 496–520.

Knapp, M., Copland, M., & Swinnerton, J. (2007). Understanding the promise and dynamics of data-informed leadership. In P. A. Moss (Ed.), Evidence and decision making (pp. 74–104). Malden, MA: Blackwell.

Konstantopoulos, S., Miller, S. R., & van der Ploeg, A. (2013). The impact of Indiana's system of interim assessments on mathematics and reading achievement. Educational Evaluation and Policy Analysis, 35(4), 481–499.

Krackhardt, D. (1987). Predicting with networks: Nonparametric multiple regression analysis of dyadic data. Social Networks, 10, 359–381.

Kruse, S. D., Louis, K. S., & Bryk, A. S. (1995). An emerging framework for analyzing school-based professional community. In K. S. Louis, S. Kruse, & Associates (Eds.), Professionalism and community: Perspectives on reforming urban schools (pp. 3–22). Thousand Oaks, CA: Corwin Press.

Lachat, M. A., & Smith, S. (2005). Practices that support data use in urban high schools. Journal of Education for Students Placed at Risk, 10(3), 333–350.

Lasley, T. J. (2009). Using data to make critical choices. In T. J. Kowalski & T. J. Lasley (Eds.), Handbook of data-based decision making in education (pp. 243–258). New York, NY: Routledge.

Light, D., Honey, M., Henze, J., Brunner, C., Wexler, D., Mandinach, E., & Fasca, C. (2005). Linking data and learning: The Grow Network study. New York, NY: Education Development Center, Center for Children and Technology.

Little, J. W., & Curry, M. W. (2009). Structuring talk about teaching and learning: The use of evidence in protocol-based conversation. In L. M. Earl & H. Timperley (Eds.), Professional learning conversations: Challenges in using evidence for improvement (pp. 29–42). Dordrecht, the Netherlands: Springer.

Mandinach, E. B., Friedman, J. M., & Gummer, E. S. (2015). How can schools of education help to build educators' capacity to use data? A systemic view of the issue. Teachers College Record, 117(4).

Mandinach, E. B., & Gummer, E. S. (2013). A systemic view of implementing data literacy in educator preparation programs. Educational Researcher, 42(1), 30–37.

Marsh, J. A. (2012). Interventions promoting educators' use of data: Research insights and gaps. Teachers College Record, 114(11), 1–48.

Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision making in education: Evidence from recent RAND research (OP-170). Santa Monica, CA: RAND.

May, H., & Robinson, M. A. (2007). A randomized evaluation of Ohio's Personalized Assessment Reporting System (PARS). Philadelphia, PA: Consortium for Policy Research in Education.

McLaughlin, M. W., & Talbert, J. E. (2001). Professional communities and the work of high school teaching. Chicago, IL: University of Chicago Press.

Means, B., Chen, E., DeBarger, A., & Padilla, C. (2011). Teachers' ability to use data to inform instruction: Challenges and supports. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation, and Policy Development.

Means, B., Padilla, C., DeBarger, A., & Bakia, M. (2009). Implementing data-informed decision making in schools: Teacher access, supports and use. Report prepared for U.S. Department of Education, Office of Planning, Evaluation and Policy Development. Menlo Park, CA: SRI International.

Means, B., Padilla, C., & Gallagher, L. (2010). Use of education data at the local level: From accountability to instructional improvement. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation, and Policy Development.

Moolenaar, N. M. (2012). A social network perspective on teacher collaboration in schools: Theory, methodology, and applications. American Journal of Education, 119, 7–39.

Moolenaar, N. M., Daly, A. J., & Sleegers, P. J. (2010). Occupying the principal position: Examining relationships between transformational leadership, social network position, and schools' innovative climate. Educational Administration Quarterly, 46(5), 623–670.

Moolenaar, N. M., & Sleegers, P. J. (2010). Social networks, trust, and innovation: How social relationships support trust and innovative climates in Dutch schools. In A. J. Daly (Ed.), Social network theory and educational change (pp. 97–115). Cambridge, MA: Harvard Education Press.

National Center for Education Statistics. (n.d.). About the SLDS grant program. Retrieved from https://nces.ed.gov/programs/slds/about_SLDS.asp

Nelson, T. (2008). Teachers' collaborative inquiry and growth: Should we be optimistic? Science Teacher Education, 93, 548–580.

Penuel, W. R., Riel, M. R., Krause, A., & Frank, K. A. (2009). Analyzing teachers' professional interactions in a school as social capital: A social network approach. Teachers College Record, 111(1), 124–163.

Penuel, W. R., Sun, M., Frank, K. A., & Gallagher, H. A. (2012). Using social network analysis to study how collegial interactions can augment teacher learning from external professional development. American Journal of Education, 119, 103–136.

Saunders, W. M., Goldenberg, C. N., & Gallimore, R. (2009). Increasing achievement by focusing grade-level teams on improving classroom learning: A prospective quasi-experimental study of Title I schools. American Educational Research Journal, 46(4), 1006–1033.

Schildkamp, K., & Kuiper, W. (2010). Data-informed curriculum reform: Which data, what purposes, and promoting and hindering factors. Teaching and Teacher Education, 26(3), 482–496.

Sharkey, N. S., & Murnane, R. J. (2006). Tough choices in designing a formative assessment system. American Journal of Education, 112(4), 572–578.

Spillane, J. P., Hunt, B., & Healey, K. (2009). Managing and leading elementary schools: Attending to the formal and informal organization. International Studies in Educational Administration, 37(1), 5–28.

Spillane, J. P., & Kim, C. M. (2012). An exploratory analysis of formal school leaders' positioning in instructional advice networks in elementary schools. American Journal of Education, 119, 73–102.

Stecker, P. M., Fuchs, L. S., & Fuchs, D. (2005). Using curriculum-based measurement to improve student achievement: Review of research. Psychology in the Schools, 42(8), 795–819.

Supovitz, J. A. (2002). Developing communities of instructional practice. Teachers College Record, 104(8), 1591–1626.

Supovitz, J. A., & Klein, V. (2003). Mapping a course for improved student learning: How innovative schools use student performance data to guide improvement. Philadelphia, PA: Consortium for Policy Research in Education.

Supovitz, J. A., Merrill, E., & Conger, M. (2010). Productive teacher use of student performance and related data in professional learning communities. Paper presented at the annual meeting of the American Educational Research Association, Denver, CO.

Supovitz, J. A., & Morrison, K. (2011). Does collaboration facilitate data use in schools? Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.

Tenkasi, R., & Chesmore, M. (2003). Social networks and planned organizational change. Journal of Applied Behavioral Science, 39(3), 281–300.

Vescio, V., Ross, D., & Adams, A. (2008). A review of research on the impact of professional learning communities on teacher practice and teacher learning. Teaching and Teacher Education, 24(1), 80–91.

Wayman, J. C. (2005). Involving teachers in data-based decision making: Using computer data systems to support teacher inquiry and reflection. Journal of Education for Students Placed at Risk, 10(3), 295–308.

Wayman, J. C., & Cho, V. (2007). Preparing educators to effectively use student data systems. In T. J. Kowalski & T. J. Lasley (Eds.), Handbook of data-based decision making in education (pp. 89–104). New York, NY: Routledge.

Wayman, J. C., Cho, V., & Shaw, S. (2009). Survey of educator data use. Unpublished document.

Wayman, J. C., Jimerson, J. B., & Cho, V. (2012). Organizational considerations in establishing the data-informed district. School Effectiveness and School Improvement: An International Journal of Research, Policy and Practice, 23(2), 159–178.

Wayman, J. C., & Stringfield, S. (2006). Technology-supported involvement of entire faculties in examination of student data for instructional improvement. American Journal of Education, 112(4), 549–571.

Weinbaum, E. (2009). Learning about assessment: An evaluation of a ten-state effort to build assessment capacity in high schools. Philadelphia, PA: Consortium for Policy Research in Education.

Young, V. M. (2006). Teachers' use of data: Loose coupling, agenda setting, and team norms. American Journal of Education, 112(4), 521–548.



Appendix A. Descriptive Statistics for Measures (n = 42)

Role                                                        %
Core subject classroom teacher                              52.80
Formal leader (administrator, coach, etc.)                  32.10
Related arts teacher (library, art, etc.)                   15.10

                                                            Mean     SD
Years in school                                             3.84     3.13
Years in district                                           5.53     4.51
Years in education                                          8.21     7.93
"I feel I have a lot to learn about using data."            3.44     1.29
Scale: Comfort using data                                   23.24    3.71
Scale: Pedagogy                                             27.85    2.07
Scale: Practices                                            13.81    6.68


Appendix B. Items and Reliability Measures for School Context Scales

All items used a 6-point agreement scale.

Data Practice (Wayman et al., 2009). Responses: 119; Cronbach's alpha: .917; Mean: 13.47; SD: 5.95
- I use data to assess student learning.
- I use data to diagnose student learning needs.
- I adjust my instruction based on student data.
- I use multiple data sources to understand student learning.
- I use data to plan lessons.
- I use data to set student learning goals.

Pedagogy (Wayman et al., 2009). Responses: 138; Cronbach's alpha: .893; Mean: 27.01; SD: 2.70
- Data can help educators plan instruction.
- Data can offer information about students that was not already known.
- Data can help educators know what concepts students are learning.
- Data can help educators identify learning goals for students.
- Students benefit when teacher instruction is informed by data.

Comfort. Responses: 132; Cronbach's alpha: .851; Mean: 22.94; SD: 3.63
- I am comfortable accessing data on my own.
- I am able to analyze or manipulate data to get information that I need.
- I feel confident when discussing data with others.
- I feel comfortable making curricular and instructional decisions based on data.
- I am more capable of using data than most of my colleagues.

Source: Wayman, J. C., Cho, V., & Shaw, S. (2009). Survey of educator data use. Unpublished document.





About the Author
  • Elizabeth Farley-Ripple
    University of Delaware
    ELIZABETH FARLEY-RIPPLE is an assistant professor at the University of Delaware. Her research focuses on issues of leadership, organizational improvement, and capacity building at the school and district levels. She is an experienced mixed methodologist, with expertise using large administrative data sets, multilevel models, survey research, and social network analysis as well as engaging in large-scale qualitative data collection and analysis. Dr. Farley-Ripple has published research in respected journals such as Educational Researcher, Educational Policy, the American Journal of Education, Educational Management, Administration, and Leadership, and Urban Education.
  • Joan Buttram
    University of Delaware
    JOAN BUTTRAM is the director of the Delaware Education Research and Development Center at the University of Delaware. Dr. Buttram’s research focuses on educational leadership, professional development, and school improvement. She is currently conducting research and/or evaluation studies related to high school reform, professional learning communities, multiple professional development and training programs in both K–12 and higher education, and health services for individuals with disabilities. Prior to coming to UD, she directed the Southwest Regional Educational Laboratory in Austin, Texas, for nine years.
 