
Factors Influencing the Functioning of Data Teams

by Kim Schildkamp & Cindy Poortman - 2015

Background: Data-based decision making can lead to increased student achievement; however, schools struggle with the implementation of data-based decision making. Professional development in the use of data is therefore urgently needed. However, professional development is often ineffective in terms of improving the knowledge, skills, and attitude of the receiver.

Purpose: We need a more fundamental understanding of how we can increase the effectiveness of data-use-related professional development. This study therefore focuses on the factors influencing a professional development intervention for data-based decision making: the data team procedure. Data teams are teams of teachers and school leaders who collaboratively learn how to use data, following a structured approach and guided by a facilitator from the university. Based on an extensive literature review, we developed a data use framework in which the use of data is influenced by data characteristics, school organization characteristics, and user and team characteristics.

Research Design: We conducted case studies.

Data Collection: We focused on observing in depth the factors that influence the work of the data teams and interviewing the data team members about these factors. Four data teams from six schools for upper secondary education were followed over a period of 2 years. We observed and analyzed 34 meetings and analyzed 23 interviews, combined with our field notes. Although this pilot study only permits analytical generalization of the findings, it provides more in-depth insight into the factors that enable and hinder interventions focused on supporting collaborative data use in schools.

Findings: The results show that several data characteristics (access and availability of high-quality data), school organizational characteristics (a shared goal, leadership, training and support, involvement of relevant stakeholders), and individual and team characteristics (data literacy, pedagogical content knowledge [PCK], organizational knowledge, attitude, and collaboration) influence the use of data in data teams. The results also show that these influencing factors are interrelated.

Conclusions: Schools need support in all aspects of the use of data (from formulation of a problem definition to taking action based on the data). This study can form a starting point for larger studies into the factors influencing these types of professional development interventions to ensure effective implementation and sustainability.


In a context in which schools are held more and more accountable for the education they provide, data-based decision making has become increasingly important. Our definition of data in the context of schools is information that is systematically collected and organized to represent some aspect of schooling. This definition of data is deliberately broad to include any relevant information derived from qualitative and quantitative methods of analysis (Lai & Schildkamp, 2013; Wayman, Jimerson, & Cho, 2012). Examples include assessment data, structured classroom observation data, and student survey results. Data-based decision making (in short, data use) refers to making decisions based on these data (Lai & Schildkamp, 2013; Mandinach & Honey, 2008).

In the school effectiveness literature, data use is identified as a common core characteristic of high-performing schools (Ragland, Clubine, Constable, & Smith, 2002; Schaffer, Reynolds, & Stringfield, 2012; Snipes, Doolittle, & Herlihy, 2002; Supovitz & Klein, 2003). Studies show that data-based decision making can lead to increased student achievement (Campbell & Levin, 2009; Lai, McNaughton, Timperley, & Hsiao, 2009). However, most teachers do not use data to their best effect or do not use data at all (Schildkamp & Kuiper, 2010; Schildkamp & Teddlie, 2008). Teachers' decisions are generally based on intuition and limited observations (Ingram, Louis, & Schroeder, 2004), and these decisions do not always contribute to student learning. Also, in most cases, little attention is paid in teacher training colleges to data use (Herman & Gribbons, 2001; Mandinach & Gummer, 2013). Therefore, professional development in data-based decision making is urgently needed and is essential for improving the quality of schools (Desimone, 2009). This professional development should include training with regard to how to use data and, perhaps even more important, how to connect data to the daily practice of school leaders and teachers (Black & Wiliam, 1998; Datnow, Park, & Wohlstetter, 2007; Supovitz & Klein, 2003).

However, professional development is often ineffective in terms of improving the knowledge, skills, and attitude of the receiver. We therefore need a more fundamental understanding of data-use-related professional development (Desimone, 2009). How teachers collaborate around the use of data has not been explored extensively (Datnow, Park, & Kennedy-Lewis, 2013). Moreover, a great deal still remains unknown with regard to the conditions influencing data use (interventions) (Coburn & Turner, 2012; Marsh, 2012). This study therefore focuses on the factors influencing a professional development intervention for collaborative data-based decision making: the data team procedure.

Data teams are teams of 4–6 teachers and 1–2 (assistant) school leaders who collaboratively use data to solve a certain educational problem within the school using a structured approach. The data team procedure includes a comprehensive set of guidelines and activities and support from a facilitator from the university. This external facilitator visits the data team’s school every 2–3 weeks for a meeting to work on the steps, over a period of 2 years. Collaboration around the use of data brings focus to the conversations and a sense of purpose, helps teachers to learn from each other how to use data, and allows for a fertile exchange of ideas and strategies (Datnow et al., 2013; Wayman, Midgley, & Stringfield, 2006; Wohlstetter, Datnow, & Park, 2008). Also, collaborative data use is more likely to contribute to teacher and student learning than individual data use (Chen, Heritage, & Lee, 2005; Means, Chen, DeBarger, & Padilla, 2011; Means, Padilla, & Gallagher, 2010; Wayman et al., 2006; Wayman & Stringfield, 2006).

Data teams can be seen as a form of professional development with the ultimate goal of school improvement. Although presented as a rather linear process, the data team procedure is an iterative and cyclic procedure (developed based on Earl & Katz, 2006) consisting of eight steps. The data team members go back and forth between the steps (see also Figure 1) (Schildkamp & Ehren, 2013, pp. 56–57; Schildkamp, Handelzalts, & Poortman, 2012):


Problem definition: The team decides on which educational problem and goals they want to focus their efforts. For example, if the data team decides to focus on grade retention, in this step, the first thing the team has to do is collect data on grade retention (e.g., how many grade repeaters does the school have in each grade?).


Formulating hypotheses: The team develops hypotheses (e.g., on what causes grade retention).


Data collection: The team collects data to test the hypotheses. Several types of data can be collected (e.g., assessment data, inspection reports, and examination results), both quantitative and qualitative data.


Data quality check: Are the collected data reliable and valid?


Data analysis (e.g., summarizing, calculating, comparing): This can involve simple data analyses (e.g., descriptive analyses, summarizing interview data) and more sophisticated analyses (e.g., correlational and regression analyses).


Interpretation and conclusions: If hypotheses turn out to be false, new hypotheses need to be tested. The data team needs to collect additional data (back to Step 2). If the hypotheses are correct, the team draws conclusions based on the collected data.


Implementing improvement measures: The team describes the measures that are needed to solve the problem, and the goals that go with these measures. The team makes team members responsible for implementing the actions and determines which resources are available for implementing the actions. The data team also thinks of ways to monitor the implementation of the actions, sets deadlines, and determines which data are needed to establish the effectiveness of the implemented actions.


Evaluation: Are the actions effective? Are the goals met? Are the problems solved, and is the team satisfied? To evaluate the actions, new data need to be collected. This process continues until the priorities are met and the goals have been accomplished. In that case, the team can continue with a new problem and therefore start with a new “Step 1.”


Figure 1. The data team procedure

Note: Schildkamp & Ehren, 2013, p. 56.
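The feedback loops among the eight steps above can be sketched as control flow. This is a purely illustrative Python sketch: the function parameters are hypothetical stand-ins for what are, in practice, human, team-based activities, and are not part of the original procedure.

```python
def data_team_cycle(define_problem, formulate_hypothesis, collect_data,
                    data_ok, analyze, hypothesis_confirmed,
                    implement_measures, goals_met):
    """Illustrative sketch of the iterative data team procedure.

    Each argument is a callable standing in for a team activity.
    """
    problem = define_problem()                       # Step 1: problem definition
    while True:
        hypothesis = formulate_hypothesis(problem)   # Step 2: formulating hypotheses
        data = collect_data(hypothesis)              # Step 3: data collection
        while not data_ok(data):                     # Step 4: data quality check
            data = collect_data(hypothesis)          # feedback loop: collect more data
        findings = analyze(data)                     # Step 5: data analysis
        if not hypothesis_confirmed(findings):       # Step 6: interpretation and conclusions
            continue                                 # feedback loop: back to Step 2
        implement_measures(findings)                 # Step 7: implementing improvement measures
        if goals_met(findings):                      # Step 8: evaluation
            break                                    # goals accomplished; a new "Step 1" can start
```

The nested loop for Step 4 and the `continue` at Step 6 correspond to the feedback loops the article describes: unreliable data trigger additional collection, and a rejected hypothesis sends the team back to hypothesis formulation.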


Our theory of action, illustrated in Figure 2, is based on the analyses of several data use models and frameworks (Coburn & Turner, 2011; Lai & Schildkamp, 2013; Mandinach, Honey, Light, & Brunner, 2008; Marsh, 2012; Schildkamp & Kuiper, 2010; Schildkamp & Lai, 2013). Our framework dynamically links a broad view on data and data use to enablers and barriers that influence the use of data. The framework takes into account the process of data use (based on Mandinach & Honey, 2008; Marsh, 2012), the organizational context in which data use is taking place, and the characteristics of the data and data systems available, but also that individual data users in the data team influence the process at a micro level. Although the core section of the framework implies that data use occurs as a linear, rational process, we acknowledge that data use involves a number of complex processes, conditions, and contexts that interact in complex ways. This is illustrated in Figure 2 by showing that school organizational characteristics interact with individual and team characteristics, data characteristics, and data use. The interaction between data and people, in a certain context, results in decisions with regard to what action to take. Data use involves an interpretative process in which data have to be accessed, collected, and analyzed to be turned into information, and combined with understanding and expertise to become meaningful and useful for actions (Coburn & Turner, 2011; Mandinach et al., 2008; Marsh, 2012).


First, when describing the use of data by data teams, we can look at the activities these teams undertake (see Figure 2). These activities follow along the line of the steps of the data use process described by Marsh (2012), with the exception that data teams start with a purpose in the form of a problem definition and a related goal instead of with data. Often teachers and/or school leaders have the conception that something is wrong (which may be supported by, for example, inspection data), but they do not know what exactly the problem is and how big this problem is. So they start with their conception that something is wrong, for example, in the area of mathematics, and start with the purpose of defining and solving this problem. After coming up with a problem definition based on data, hypotheses concerning the problem are formulated and data to investigate these hypotheses are accessed and collected. Data as such have no meaning. These data have to be filtered (e.g., are the data valid and reliable and, if not, additional data need to be collected and a new feedback loop is created), organized to investigate the hypothesis, and analyzed and interpreted to become information. This is all done in Steps 4, 5, and 6 of the data team procedure. Combined with stakeholder understanding and expertise, this becomes actionable knowledge. In the data team procedure, this relates to two possible actions: Either the hypothesis is incorrect and the action is to go back to formulating new hypotheses (a feedback loop is created), or the hypothesis is correct and the data team takes action based on the data. If the data team takes action based on the data, they also need to evaluate (collect new data) if their actions have led to the desired outcomes and goal; in this way, another feedback loop is created (Mandinach et al., 2008; Marsh, 2012; Marsh, Pane, & Hamilton, 2006).

Figure 2. Data use theory of action and factors influencing data use


Note: Based on Coburn & Turner, 2011; Lai & Schildkamp, 2013; Mandinach et al., 2008; Marsh, 2012, p. 4; Schildkamp & Kuiper, 2010; Schildkamp & Lai, 2013.

Second, the term depth of inquiry is relevant in studying the process of data use in data teams. Henry (2012) defined depth of inquiry as the degree to which team conversations express higher level thinking skills, such as analysis, synthesis, critique, goal setting, reflection, and monitoring. Conversations that lack depth focus on telling information, retelling, describing, and storytelling. High depth concerns data team members going through the whole data use cycle and developing new knowledge based on data, focused on taking action in their school or classroom.


Our literature review shows that the process of data use is influenced by several factors that can either enable data use or form a barrier toward effective data use. These are summarized in Table 1. First, data characteristics can become an affordance or constraint for the use of data. Schools that have access to high-quality data are more likely to show an increased level of data use (Breiter & Light, 2006; Coburn & Turner, 2011; Park & Datnow, 2009; Schildkamp & Kuiper, 2010; Wayman & Stringfield, 2006; Wohlstetter et al., 2008). The following data characteristics were identified in the literature:

Access to high-quality data (Coburn & Turner, 2011; Kerr, Marsh, Ikemoto, Darilek, & Barney, 2006; Mandinach & Honey, 2008; Marsh et al., 2006; Means et al., 2010; Schildkamp & Kuiper, 2010; Sharkey & Murnane, 2006; Wayman et al., 2006; Wayman & Stringfield 2006; Wayman, Cho, & Johnston, 2007; Wayman, Jimerson, & Cho, 2012).

Availability of multiple sources of data, not just standardized assessment data, that fit with the needs of the user (Coburn & Turner, 2011; Mandinach & Honey, 2008; Means et al., 2010; Schildkamp & Kuiper, 2010; Wayman et al., 2006, 2007, 2012).

Availability of tools and information management systems for data storage, retrieval, and analysis (Breiter & Light, 2006; Coburn & Turner, 2011; Datnow et al., 2007, 2013; Kerr et al., 2006; Mandinach & Honey, 2008; Means et al., 2010; Schildkamp & Kuiper, 2010; Sharkey & Murnane, 2006; Wayman & Stringfield, 2006; Wayman et al., 2006, 2007, 2012; Wohlstetter et al., 2008).

Moreover, data use can be enabled or constrained by certain school organizational characteristics. Our literature review pointed to the following influencing factors:

Leadership: The school leader plays an essential role in the use of data. The school leader needs to encourage, motivate, and facilitate teachers (e.g., provide them with time to use data) (Coburn & Turner, 2011; Datnow et al., 2007, 2013; Kerr et al., 2006; Leithwood, Jantzi, & McElheron-Hopkins, 2006; Levin & Datnow, 2012; Marsh, 2012; Marsh et al., 2006; Means et al., 2010; Park & Datnow, 2009; Schildkamp & Kuiper, 2010; Supovitz & Klein, 2003; Wayman & Stringfield, 2006; Wohlstetter et al., 2008; Young, 2006).

Shared goals: The school needs to have shared and measurable goals. Without clear and agreed-on goals, the effective use of data is difficult (Datnow et al., 2007, 2013; Earl & Katz, 2006; Honig & Venkateswaran, 2012; Kerr et al., 2006; Levin & Datnow, 2012; Park & Datnow, 2009; Schildkamp & Kuiper, 2010; Sharkey & Murnane, 2006; Spillane, 2012; Supovitz & Klein, 2003; Wayman & Stringfield, 2006; Wayman et al., 2007; Wohlstetter et al., 2008; Young, 2006).

Training and support: Training and support for data use either internally (by a data expert, somebody from within the school who has access to data and can help with analysis and interpretation) or externally (through workshops, on-site support from somebody outside the school) can enable data use (Breiter & Light, 2006; Coburn & Turner, 2011; Honig & Venkateswaran, 2012; Kerr et al., 2006; Mandinach & Honey, 2008; Marsh et al., 2006; Marsh, 2012; Means et al., 2010; Nelson & Slavit, 2007; Schildkamp & Kuiper, 2010; Supovitz & Klein, 2003; Wayman & Stringfield, 2006; Wohlstetter et al., 2008; Young, 2006).

Finally, individual and team characteristics can influence the use of data. Data teams consist of individual teachers who work together in a team. As stated by Daly (2012), “The interpretation and use of data for improvement take place within the individual and between educational actors, who, through social interaction co-construct and make sense of data and their use” (p. 2). The following influencing factors were identified in our literature review:

Knowledge and skills (e.g., to use data effectively, pedagogical content knowledge) (Coburn & Turner, 2011; Datnow et al., 2007; Earl & Katz, 2006; Kerr et al., 2006; Levin & Datnow, 2012; Mandinach & Honey, 2008; Marsh et al., 2006; Marsh, 2012; Schildkamp & Kuiper, 2010; Sharkey & Murnane, 2006; Spillane, 2012; Vanhoof, Van Petegem, & De Maeyer, 2009; Wohlstetter et al., 2008; Young, 2006).

Attitude and beliefs (Coburn & Turner, 2011; Datnow et al., 2007; Kerr et al., 2006; Marsh, 2012; Marsh et al., 2006; Schildkamp & Kuiper, 2010; Spillane, 2012; Vanhoof et al., 2009; Wohlstetter et al., 2008).

Collaboration of teachers regarding the use of data (Coburn & Turner, 2011; Datnow et al., 2013; Huffman & Kalnin, 2003; Marsh, 2012; Nelson & Slavit, 2007; Park & Datnow, 2009; Schildkamp & Kuiper, 2010; Wayman et al., 2006; Wohlstetter et al., 2008; Young, 2006).

Table 1. Factors Influencing Data Use in Data Teams

Main factors and subfactors

Data characteristics
Access to high-quality data
Availability of multiple sources of data
Availability of tools and information management systems

School organizational characteristics
Leadership
Shared goals
Training and support

Individual and team characteristics
Knowledge and skills
Attitude and beliefs
Collaboration

The factors that influence the use of data have been studied extensively. However, the majority of these studies concern the use of data in general and not the use of data in data teams, and these studies do not concern the factors influencing a professional development data use intervention. Therefore, the following research question guided the study that took place in secondary education in the Netherlands: Which factors influence data use in the data teams?


How do data characteristics influence the work of data teams?


How do school organizational characteristics influence the work of data teams?


How do individual and team characteristics influence the work of data teams?


To capture the detail, nuance, and patterns of interaction in data teams, we employed a micro-process perspective (see Little, 2012). Little (2012) described a micro-process approach as “zooming in” on the interactions that occur between actors. In this case, we zoomed in on the interaction between data team members and the factors that enabled or hindered the work of the data teams. Following Little (2012), this article argues for a more conceptually robust, methodologically sophisticated, and extensive program of micro-process research on data use that also anticipates the ways in which local practice both instantiates and constructs institutional and organizational structures, processes, and logics. We therefore employed a case study method (Yin, 2003), focusing on observing in depth the factors that influence the work of the data teams and interviewing the data team members about these factors. A case study methodology was used because we focused on data teams in their natural contexts, using a variety of measures to answer the research questions. Moreover, the purpose of this study was to generalize to theoretical propositions, not to populations (Yin, 2003). Applying a case study design allowed for a deep examination and rich description of the enabling and hindering factors of the data teams in their specific contexts. The observation results, together with our field notes and interview results, formed a comprehensive view of the use of data in the data teams and the factors influencing the use of data.


This study took place in the Netherlands. Dutch schools are rather autonomous. This implies freedom in choosing the religious, ideological, and pedagogical principles on which their education is based and in organizing their teaching activities. Dutch schools have considerable freedom regarding the subject matter taught and the textbooks, assessments, and instructional strategies used (Ministry of Education, Culture & Science, 2000). The Netherlands does have an inspectorate, which holds schools accountable for their education.

This study focuses on secondary education. At the end of secondary education, students have to take a national test. This is the only standardized assessment that is used by all secondary schools in the Netherlands. This final exam consists of an internal school-based assessment and an external national assessment. National examinations are school-independent and equal for each Dutch student within a track. School-internal examinations are developed by schools themselves and differ per school. Both parts of the final examination, the external and internal part, are considered to be equally relevant: For each subject, the average school-internal examination grade makes up 50% of the final examination grade, and the average central examination grade makes up the other 50%. To obtain a leaving certificate, an examinee must have scored passing marks in a specified number of subjects, such as Dutch, English, and mathematics (Schildkamp, Rekers-Mombarg, & Harms, 2012, p. 230). Other important data sources available to Dutch schools include school inspection data; school self-evaluation data; data on intake, transfer, and school leavers; student work; curriculum assessments; and student and parent questionnaire data.
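The 50/50 weighting of the Dutch final examination grade can be expressed as simple arithmetic. The sketch below is illustrative only (the function name is hypothetical, and grade scales and rounding rules are not specified in the text):

```python
def final_exam_grade(school_internal_avg, central_exam_avg):
    """Final examination grade per subject: the average school-internal
    examination grade and the average central (national) examination grade
    each count for 50%."""
    return 0.5 * school_internal_avg + 0.5 * central_exam_avg
```

For example, a school-internal average of 6.8 combined with a central examination average of 6.2 yields a final grade of 6.5.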


Four data teams from six schools for upper secondary education (one team consisted of members of three different schools) were followed over a period of 2 years (see also Schildkamp, Handelzalts, & Poortman, 2012); all their meetings were observed, their documents were collected, and their members were interviewed (see Table 2). These schools voluntarily approached the university to participate because they had heard about the university’s intention to pilot this new form of professional development. Our initial meeting was with the school leader and one or two teachers, and we talked about the topic that they wanted to investigate. The problems formulated by the school leaders, partly based on data (for example, from school inspection reports), were not yet very concrete and specific (see Table 2). For example, one school leader knew that they had several students repeating grades in the higher levels of senior secondary education, and he had been told by the Dutch inspectorate that he should work on reducing grade repetition, but he did not know exactly how many grade repeaters they had in the different classes. The school leaders in the four schools decided on the general topics (e.g., grade repetition, mathematics improvement) but left it up to the data teams to specify the problem and come up with a clear and measurable problem definition.

Table 2. Team Characteristics



Team W topic: High percentage of grade repeaters in higher levels of secondary education
Members: teacher and mentor; teacher and mentor (Arts and Music); teacher and school leader of the havo(a) department; teacher and havo coordinator; teacher and mentor; data expert*

Team B topic: Low mathematics achievement in lower levels of secondary education
Members: mentor, mathematics teacher, and department head; mentor and mathematics and science teacher (Mathematics and technique); school leader havo and vwo(b) (History; not teaching); mathematics and ICT teacher and ICT coordinator (Mathematics and informatics); data expert*

Team C topic: Low mathematics achievement in lower levels of secondary education
Members: school leader havo/vwo (Physical education; not teaching); teacher and mentor, location 1; teacher and mentor, location 1; teacher and mentor, location 3 (Mathematics and English); teacher, location 2; teacher, location 3; data expert

Team G topic: Low(er) performance of pre-vocational education students entering senior secondary education
Members: school leader havo and vwo (Music; not teaching); teacher (Physics and mathematics); teacher and teacher trainer (Religious studies and social studies); teacher (History and geography); data expert
NI: not available for interviews.

a Havo: general secondary education (after completing this track, students can continue to a university for applied sciences).

b Vwo: preuniversity education (after completing this track, students can continue to study at a university).

*The data experts from School W also worked in School B.


We collected observation data by audio-recording all data team meetings and through field notes made by the researcher during participant observation of these meetings. One of the researchers was the facilitator of the data team. Recording the meetings facilitated the micro-process approach. A total of 34 meetings, lasting between 1 and 1.5 hours, were recorded. Also, all data team participants were interviewed using an interview schedule based on the theoretical framework. The interviews concerned the way participants experienced the data teams and what factors they thought influenced their work. With regard to the influencing factors, we asked respondents to describe whether they had experienced any hindering or enabling factors in their work with the data team. Next, we asked questions about the variables described in our framework; for example, with regard to school organizational characteristics, we asked, “Can you describe the role of the school leader with regard to the data team?” With regard to data characteristics, we asked, “Can you describe how you have been collecting and using data with your data team?” With regard to individual and team characteristics, we asked, for example, “Can you describe the process of collaboration in your data team?” These 1-hour interviews, 23 in total, took place approximately 6 months after the start of the data team.


The researcher who did not facilitate the data team conducted all the data analyses. We transcribed both the observation data and the interview data verbatim. We used the program ATLAS.ti for coding the transcripts, relating the coded fragments to each other and comparing the codes of different schools and respondents within and across cases (cf. Boeije, 2002). We went through an initial round of coding using a coding scheme based on our theoretical framework, which arose from our literature review. With regard to the use of data, we checked how many steps of the data team cycle the data teams completed, whether they went through multiple feedback loops, whether the team created actionable knowledge based on the data (e.g., did they talk about possible measures that they could take based on the data?), and the depth of inquiry that was reached (“no depth,” which concerns only storytelling, retelling [known] information and personal anecdotes; “average depth,” which concerns fragments that show data team members’ basic data use and basic understanding and explanations based on data, such as “the percentage of students who pass is too low”; and “high depth,” which concerns data team members developing new knowledge based on data, focused on taking action in their classroom) (Schildkamp, Handelzalts, & Poortman, 2012). With regard to the influencing factors, we employed codes for each of the influencing factors. With regard to school organizational characteristics, for example, we used codes such as school leader support. For data characteristics, we used, for example, access to data. And for individual and team characteristics we used, for example, attitude and beliefs. This first round of coding was followed by more rounds of coding using an iterative approach. For each of the schools, a within-case analysis was conducted, followed by a cross-case analysis comparing data teams with each other. 
These extensive rounds of analyses of all the data collected led to an in-depth picture of the factors influencing the work of the four data teams.
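The coding scheme described above can be laid out as a simple data structure. This is a hypothetical sketch for illustration (the code names are stand-ins; the actual ATLAS.ti codebook may have differed):

```python
# Depth-of-inquiry levels, paraphrased from the coding scheme described in the text.
DEPTH_OF_INQUIRY = {
    "no depth": "storytelling, retelling known information, personal anecdotes",
    "average depth": "basic data use and basic explanations based on data",
    "high depth": "new knowledge developed from data, focused on taking action",
}

# Example codes for the influencing factors, grouped by framework category
# (one illustrative code per factor named in the text).
INFLUENCING_FACTOR_CODES = {
    "data characteristics": ["access to data", "multiple sources of data",
                             "tools and information management systems"],
    "school organizational characteristics": ["school leader support",
                                              "shared goals", "training and support"],
    "individual and team characteristics": ["knowledge and skills",
                                            "attitude and beliefs", "collaboration"],
}
```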


We used a systematized approach for data collection, consistent with the research questions (Poortman & Schildkamp, 2012). This means that all respondents were approached in the same way in relation to the research questions. In addition, we triangulated the observational data with the field notes and interview data. Reliability was further enhanced by audio-taping all the data team meetings and the interviews, which allowed for thorough analyses of the data by a researcher other than the facilitator. Also, the instruments were based on our theoretical framework. To promote internal validity, we observed the teams for an extended period (2 years) and applied thick description (the context and the behavior) of the results (Poortman & Schildkamp, 2012). In the results, we describe extensively how the factors identified played a role in the teams’ data use, including examples and respondent quotes (Poortman & Schildkamp, 2012). External validity was enhanced by providing case-specific and cross-case thick descriptions (also including citations of respondents) and describing the congruence with the theoretical framework (Poortman & Schildkamp, 2012). Finally, two researchers coded approximately 10% of the same transcripts. We calculated the interrater agreement and found a satisfactory Cohen’s Kappa of 0.76.
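The reported interrater agreement is Cohen’s kappa, which corrects the raw proportion of agreement for agreement expected by chance. A minimal sketch of the computation (for illustration only; this is not the authors’ actual analysis script):

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Cohen's kappa for two raters assigning categorical codes to the same items."""
    assert len(coder1) == len(coder2)
    n = len(coder1)
    # Observed agreement: proportion of items both coders labelled identically.
    p_observed = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Chance agreement: from each coder's marginal label frequencies.
    freq1, freq2 = Counter(coder1), Counter(coder2)
    p_expected = sum(freq1[c] * freq2[c] for c in freq1) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)
```

For example, if two coders agree on 3 of 4 items, with one coder assigning labels A, A, B, B and the other A, A, B, A, observed agreement is 0.75, chance agreement is 0.5, and kappa is 0.5.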


Because discussing all the cases in depth is not within the scope of this article, we present the detailed results of one case here to demonstrate how we analyzed and interpreted the data, along with summaries of the other three cases. We chose this case because this data team went through all the data use steps described in our theoretical framework and was able to solve its problem. This case is thus an exemplary case. First we describe the process of data use. Next, in line with Table 1, we describe the influencing factors: data characteristics, school organizational characteristics, and individual and team characteristics.



Team W focused on the large number of grade repeaters in the higher levels of senior secondary education. Their initial thought was that they had a problem in the third grade (age 14/15), but in Step 1, data showed that they had the most grade repetition in the fourth year (age 15/16) (on average, over 30% per year). The team went through several feedback loops during 12 official data team meetings (the team sometimes also gathered in subgroups between meetings without the presence of the facilitator or data team expert). This team started with a low depth of inquiry and, during the first meetings, mainly pursued external hypotheses in Step 2 of the data team cycle rather than hypotheses about their own functioning: for example, “our problem is caused by our school’s retention policy” and “our problem is caused by a lack of motivation of our students.” The team collected several forms of data to investigate different hypotheses (Step 3). For example, they collected data from their school registration system, and they checked the quality of these data in Step 4; in this case, the data were considered sufficiently reliable and valid. A motivation survey used in Step 3 to collect data about a motivation hypothesis showed that failing students were not less motivated than students who passed and that students in both groups needed help in planning, more feedback, and more consistent checking of homework (Step 6: interpretation and conclusions). In addition, the absenteeism data showed that a considerable part of the problem (approximately 16%) could be explained by student absenteeism. Furthermore, a curriculum coherence survey used to investigate another hypothesis, in a next feedback loop, showed that there were problems with the coherence of their curriculum. For example, the curriculum of the third-grade level was not aligned, in terms of content taught and assessments used, with the curriculum of the fourth-grade level.
The team was starting to develop new actionable knowledge based on the collected data, implying a higher level of depth of inquiry. The team came up with several measures (Step 7 of the data team cycle) addressing these issues: a new policy for student absenteeism; more regular checking of homework and more feedback to students; and setting up working groups to create a more coherent curriculum. The next year, evaluation results (Step 8) showed that the number of students repeating a class was significantly reduced to 23%.


Data Characteristics

Availability of Tools and Information Management System

Team W’s school had an information management system from which data regarding, for example, the number of failing and passing students could be extracted. In meeting 1, the data team referred to this system and the possibilities of collecting the required data for defining the problem. The quality care manager mentioned that it was “a lot of work” to extract the numbers of failing and passing students per grade for all years. Nevertheless, the necessary data were collected.

Access to High-Quality Data

In general, the team had access to high-quality data. The team collected data from their information management system, in which data are systematically registered. Further along in the process, however, the team found out that some particular data regarding student absenteeism were not registered sufficiently reliably: There were a lot of unknowns (e.g., it still needed to be checked whether an absence was authorized or unauthorized).

Availability of Multiple Sources of Data

The data team was also able to collect multiple sources of data. All the data the team collected were related to their hypotheses about what caused grade repetition. In meeting 4, results of a student survey administered a couple of years earlier were brought to the table, and the team decided they wanted to administer this survey on student learning (including motivation) again to investigate their motivation hypothesis. In the fifth meeting, the team had the results of the third-year students’ survey. The team also administered a teacher survey on curriculum coherence, obtained from the Dutch Institute for Curriculum Development, to investigate another hypothesis, and collected student absenteeism data to investigate yet another hypothesis.


School Organizational Characteristics

Shared Goal

During the meetings, the goal of the data team (decreasing grade repetition) was often discussed, but the team members did not specify by how much it should decrease. Likewise, during the interviews, all respondents mentioned the goal of decreasing grade repetition, but none mentioned by how much.


Leadership

The school leader took up different roles in the data team. Sometimes she acted as an equal member of the team. For example, she proposed several hypotheses that the teachers had not thought of beforehand, such as hypotheses regarding teachers’ own role in the classroom. Some team members perceived leadership as equally distributed among team members; according to others, however, it was the school leader who made the decisions. As of meeting 6, the school leader started to take on more of a steering role. She began to summarize and conclude more actively toward possible solutions at the level of the teachers’ own functioning:

What students are missing. Direction, motivation by the teacher, directing their planning and not expecting them to do it perfectly within two weeks, but maintaining it. Our attitude in that is very important . . . . We should agree about this together. . . . We should address each other about it too.

After the meeting, she explicitly explained to the facilitator that she was actually very impatient about the issue of the teachers relating their own functioning to the problem: “You should look at yourself more, really.” The teachers made only slow progress toward this idea.

Furthermore, the school leader supported the implementation of measures, as became clear in meeting 7. One of the teachers expressed his concern that the solutions the team came up with would not be implemented because, for example, they would be too expensive. Another teacher agreed: “Will the results really be used?” The school leader assured them that if the team proved that a solution was good and served the interests of the students, it would be implemented.

When it came to taking measures based on data, the school leader also took on an explicit role in planning and organizing, for example, with regard to the assignment on curriculum coherence for all the subject matter departments; however, the teachers also had an active role in discussing this. They thoroughly discussed the issue of giving teachers enough time to work on the assignment and agreed on how this could be achieved.

Finally, the school leader had a supporting and motivating role, as expressed by three teachers during the interviews, and was also a role model because, according to one of the teachers, she showed that she also made decisions based on data herself. As another teacher stated, “Sometimes our schedule is cleared (e.g., cancellation of lessons) so that we can participate in the data use meeting. This is special. Our school really tries not to cancel any lessons. If lessons are cancelled for this, this means that it is very important.”

Training and Support: Facilitation by the Researcher (External)

The observation results show that, overall, the facilitator from the university guided rather than directed the team and that all the members actively participated in the whole process. Throughout the process, the facilitator advised the team and supported decision making, for example, by:

Giving advice to focus on a specific grade level instead of focusing on all grade levels

Giving advice with regard to hypotheses: for example, focusing on hypotheses with regard to causes that can be dealt with by the team instead of focusing on uncontrollable factors

Directing the team toward formulating new hypotheses when previous ones had proved to be wrong

Advising what data to collect

Giving advice on how to draw conclusions from the data and how to proceed in general—for example, by stating that the next step for the team should be to synthesize all their knowledge developed about the problem of failing students and draw conclusions

Summarizing team discussions and progress made

Pointing the team to relevant literature; she brought a research report about failing students to one of the meetings to list possible causes that might also apply to this school

Furthermore, the facilitator had an essential role in analyzing the data because none of the data team members had the skills to do this. She involved the team in this process by, for example, helping decide what data were required for the next step and explaining how she would analyze the data. During the interviews, all the respondents also indicated that the support of the facilitator was needed in analyzing the results and in redirecting the process when the team members lost their focus (e.g., when they started making claims without data to back them up). After a while, the facilitator no longer had to correct the teachers. In the rare case that a teacher proposed an action not based on data, he or she was corrected by colleagues rather than by the facilitator. Finally, all the data team members indicated in the interviews that the facilitator brought a more objective perspective to the team because she did not work at the school.

Training and Support: Designated Data Expert (Internal)

The data expert played an important role in this data team, both by participating in the discussions and by collecting data from the information management system and preparing the student and teacher surveys for administration. During the interviews, the data team members expressed that the data expert provided the team with much of the data they needed and also supported the analysis and interpretation of the results.

Collaboration With and Involvement of Relevant Stakeholders

Although it was not included in our theoretical framework, we found that collaboration between data team members and colleagues from outside the data team can be essential. At meeting 7, the team reported on a meeting with their colleagues in which they had presented the activities and outcomes of the data team; their colleagues were very defensive and skeptical about the outcomes so far. The data team discussed this lack of support from colleagues outside the team as a major challenge, but acknowledged that they might not have involved their colleagues enough so far. In meeting 8, the team also briefly discussed how to communicate the data team’s plans with regard to the hypothesis about curriculum coherence; they concluded that one of the teachers would explain this to all the colleagues during a meeting. During the interviews, three team members also worried that they had not communicated sufficiently about their work with colleagues outside the data team. As one teacher stated, “We can find out in our data team that a certain hypothesis or myth is incorrect, but in the teacher lounge, these myths can still survive.”


Individual and Team Characteristics

Knowledge and Skills

The observation results show that the teachers lacked data literacy. In meeting 6, for example, the school was only able to provide raw data, and the facilitator was needed to help prepare these data for (more advanced) analyses. The observation results also point to the importance of pedagogical content knowledge (PCK) and knowledge of the school organization. The teachers brought much of their knowledge about failing students to the discussion of the problem: the trajectory from the lower to higher levels of secondary education; connections between lesson materials in this respect; students’ attitudes; their own instruction; and the number of students who fail in their own group or in the departments as a whole.

Attitude and Beliefs

Although one teacher expressed that he did not “care so much about grades,” the team generally demonstrated an interest in data. During meeting 2, for example, the team appeared to actively try to make sense of the data. They also appeared curious about the answers that further data collection might yield. At meeting 3, the data team members were still very positive about the data team approach; at the end of the meeting, some team members expressed that they had the feeling they were “getting somewhere.” The data team members stayed consistently positive about the data team approach, as was again demonstrated at meetings 10 and 11: Enthusiasm for the data team procedure was still high, and they discussed how to continue their work the next school year.

In the interviews, all data team members also expressed a positive attitude toward data use in general and toward the data team approach specifically. The team members all indicated that they appreciated the structured method. As one of the teachers stated,

It is very good to work analytically, and not just say this is the problem and this is what we are going to do. We need to look at the causes by generating hypotheses and collecting data to prove or disprove the hypotheses. I think this is very good and different from the typical intuitive educational world.

And another teacher stated, “Data are important; I am a person with a lot of hypotheses, a lot of gut feelings; I take things for granted. Some of my hypotheses might be true, but it is stronger if I can prove this with data.” One of the teachers even found the work of the data team so important that he participated in the meetings on his days off. He stated that the school leader had made it clear that their work was very important; also, he was very interested in the whole data team procedure and really wanted to learn how to apply it.


Collaboration

The teachers on the data team had worked together in the past, and all expressed how satisfied they were with the collaboration within the team. As one of the teachers stated, “People work together, appointments are kept, everyone is actively involved, people listen to each other and are not afraid to speak up. We all work together at an equal level.” One teacher expressed that he was a bit worried that the team had not collaborated sufficiently in one phase of the process, Step 6 (the interpretation and conclusions phase). He felt that the department head had done a lot of the interpretation on her own in advance, whereas this should be a more shared activity.


Ownership

Another factor not included in our theoretical framework but found in the data concerns ownership. During the first meeting, the entire team agreed that the number of failing students in their school was problematic. All the respondents indicated in the interviews that they felt ownership of the problem of grade repetition. Only one teacher indicated that she did not think grade repetition was necessarily a problem; however, she did acknowledge its importance for the school:

Students all develop at a different pace. Also, at this stage other things are more important, such as social emotional development. I don’t worry about students repeating a grade; however, I do see it as a problem for the school, because the school is judged on this topic and it also generates negative publicity.


Voluntary Participation

Another factor not included in our theoretical framework that may influence the work of data teams is whether participation is voluntary. In the interviews, when asked how they came to join the team, all the participants indicated that they had participated voluntarily.



Team C focused on the problem of low mathematics achievement in the first two grades of secondary education (age 12–14), which they all considered to be a problem. The team had 12 meetings, during which they went through several feedback loops. They engaged in an iterative process of generating hypotheses (Step 2 of the data team procedure) about what caused this problem, collecting and analyzing data, and rejecting and accepting hypotheses. Several of the team's hypotheses were rejected based on the data. For example, a previously administered motivation questionnaire showed that no relationship could be found between motivation and mathematics achievement. The level of depth of inquiry increased from meeting 6, when the team really started using data to draw conclusions about their problem, hypotheses, and possible solutions (Step 6). They were able to accept two important hypotheses. The first related to the entry level of students: Assessment data from the primary schools showed that several students had very low mathematics skills. The second hypothesis, confirmed by two assessments administered at the beginning of the school year and again after a couple of months, was that students were able to complete the assessment successfully right after they had been taught how to solve percentage and fraction assignments, but lost these skills within a couple of months. Based on these hypotheses, the team came up with two measures in Step 7 to solve this problem: (1) Introduce repetition of percentage and fraction assignments by means of short pop quizzes every lesson. (2) For the low-entry-level students, the team found several online programs that students could use to practice. After four months, the teachers administered the assessment again (Step 8: evaluation) and discovered that student achievement no longer decreased significantly between tests. The year after that, they administered the same assessment again to the same group of students, and the results remained stable.


This team had access to multiple sources of data: primary school data, motivation questionnaire data, and assessment scores. Regarding assessment scores, however, they did not have all the scores from former years. Scores for arithmetic were also available from another program within the school, although initially not all team members were aware of this. The data resulting from the tests initially constructed by the team members, intended to further determine the problem regarding fractions and percentages, were not sufficiently reliable because the team members had administered different, overly short tests at different moments in their student groups (a problem with the quality of the data). Therefore, a new test was developed. The team needed to go through different channels to get the information they needed, indicating a problem with access to high-quality data.


The team worked on a shared goal of improving the mathematics lessons. The extent of the problem was not explicitly discussed in the meetings, however.

In terms of school leadership, the school leader was often not present during data team meetings. This did not hinder the use of data in the data team, however, probably because the teachers felt supported by the school leader: She gave them time, was enthusiastic about the data team, and constantly indicated how important their work was.

In terms of training and support, the facilitator played a substantial role in explaining the data team procedure and in helping the team decide which hypotheses to study, what data to use, and what next steps to take. The facilitator's role was largest in decisions about what data to use and in the (more advanced) data analysis, whereas the team members provided much of the input regarding hypotheses and solutions. The team valued the support from the facilitator and indicated that it was necessary, for example, to keep the group from jumping to conclusions. The data expert was only present at the first meeting and appeared to play a role only in delivering the required data, although the team could turn to him for support with data collection and data analyses.

The team members came from different school locations. They sometimes mentioned how colleagues in other subjects (such as economics) handled percentage sums; however, they did not explicitly discuss communicating data team findings with colleagues outside the data team, and they did not collaborate with or involve relevant stakeholders. In meeting 12, they did report that colleagues had also implemented the repetition of fraction and percentage sums for a specific level of students.


With regard to individual characteristics, all team members possessed a certain amount of knowledge and skills with regard to data use (i.e., data literacy). They understood the concepts of reliability and validity; however, they sometimes needed help with analyzing the data and taking action based on it. The team members stated that data team members working on improving their subject also need PCK, as one teacher illustrated: “You need to be able to grasp a certain level of mathematics. You also need to have classroom experience.”

The team members all had a positive attitude regarding the use of data. Three of the teachers and the school leader felt that they were participating in the data team not only to solve this problem but also to learn how to use the data team procedure: “We need to learn the data team procedure. It is a learning process; we do not yet have the skills necessary to do this on our own, but I am confident that we will in the future.” Another teacher stated,

I have seen many interventions come and go, and I have been disappointed a couple of times. I don’t think we will find a solution that will be perfect, but the fact that participating gets me thinking is a very important effect. I already pay more attention to certain things in my classroom as a result.

The data team members also felt ownership of the problem and felt that they worked together effectively. When answering the question regarding what made their collaboration so valuable, all the teachers indicated that this had to do with the fact that they all taught the same subject and that the collaboration involved teachers from different locations. They liked hearing from each other what was going on in these locations and what problems these teachers were facing. Finally, the team participated voluntarily.



Data team B focused on low mathematics achievement in the third grade (age 14/15). In the first meeting, it became clear that the school leader considered this to be a problem but that the teachers wanted to use the data team to show the school leader that it was not. It was not until meeting 5 that the entire team finally agreed that the mathematics results were too low (passing rates varied between classes from 46% to 75%, but several classes did not meet the threshold of 30% passing). The team then decided to focus on what caused the low mathematics achievement. Discussions in this team remained at a low level of depth because they were mainly about personal experiences and opinions, and about the question of whether there actually was a problem at all. The teachers of Team B did sometimes discuss causes of their problem at the level of their own functioning, in addition to more external causes, such as “mathematics is difficult.” The team generated several hypotheses but never got to actually collecting data to investigate them, so they never completed a feedback loop. The procedure was terminated after six meetings for several reasons: changes in school leadership, the difficulty of finding time to get together, and teachers not seeing the data team as a priority.


With regard to data characteristics, although data about mathematics grades were available, availability of multiple sources of data and access to high-quality data for further study was problematic. Primary school assessment data, for example, turned out to be confidential and not available to the team for analysis. A hypothesis about the absence of both teachers and students could not be checked because it was not possible to identify absence specifically for mathematics.


Several school organizational characteristics seemed to hinder effective data use in this school. First, for a long time, the teachers and school leaders were unable to agree on a shared goal. According to the teachers, they did not have a problem, and setting an improvement goal was therefore not necessary. As the department head put it, “Math is a very difficult subject, so I do not see it as a problem that several students are failing.”

School leadership was also problematic: The relationship between the teachers and the school leaders formed a barrier to data use. Some members felt that the data team had been started by school management to blame teachers for unsatisfactory grades, and this perception seemed partly accurate. In the interview, the school leader stated that he wanted to participate to address the functioning of a specific teacher: “She does not see that she has a problem. That is understandable because she has been a teacher for so many years and nobody ever told her that she had a problem. I hope that the data will show that she has a problem that she needs to address.” The school leader was also not very active in the data team. In the interviews, he explained that this was deliberate: “I think it is their responsibility. . . . I try to make them look at their own functioning by making statements such as, ‘I would be so ashamed if I would have so many insufficient marks.’” Another issue for some of the teachers was that their work on the data team was not formally facilitated with allocated time.

In terms of training and support, the data team facilitator tried to guide the team through the data team process even though the atmosphere was sometimes rather tense. She generally summarized the team’s ideas and conclusions. She helped the team focus on the current step in the approach. All the team members indicated that they appreciated and needed the support from the facilitator as an “objective person from outside the school.” Also, they received support from a data expert who had a substantial role in collecting the data and presenting overviews of the results, mostly based on descriptive analyses regarding the problem statement. The progress of the data team was further hindered by a lack of collaboration with colleagues outside the data team.


Knowledge and skills for data use, an important individual characteristic, seemed to be present on this team. The team members, mostly math teachers, did not appear to have problems with understanding the analyses so far. However, the team did not reach the level of more advanced analyses.

The negative attitude toward the use of data hindered progress on this data team. Also, the atmosphere during the first couple of meetings was not very positive, and collaboration seemed to be an issue. Often the data team members did not let each other speak freely and interrupted each other.

Another hindering factor was that collective agreement about the problem was initially rather questionable; there appeared to be no ownership of the problem. The data team was clearly started at the initiative of school management, and several team members were obliged to participate. Teacher members questioned whether they had time, whether they all needed to be present, and whether mathematics achievement was in fact a problem. Furthermore, the math department head did not believe that they were ever going to find the causes of their problem: “This process will take too long to investigate and the problem is too complex.”



During six meetings, Team G focused on students with a prevocational education background entering the senior general secondary education track (age 16/17). The team expected that the students coming from the prevocational education track (outside the school) would do worse than the “regular” students. The team generated several hypotheses with regard to the achievement of regular students compared with students from a prevocational education track (e.g., performance in different subjects, performance in later years, graduation rates). The team also sometimes mentioned causes related to their own functioning during the meetings, such as the role of their extra lessons and of their own advice regarding sector-profile combinations. Apart from the fact that not enough data were available on the students from prevocational education (in several years, this group was too small), the team could not find many significant differences between these students and regular students. In other words, the original problem statement could not be supported by the data. The data team then continued with the problem of underperformance of students in the fourth grade.


The data expert indicated that particular data characteristics hindered the use of data on this team. Access to high-quality data was problematic: Data were not always available, or it was difficult to find the required data. The information management system did not make things easier: “Sometimes the system defines things in three different ways, and then I have to collect all these data from three different access points manually . . . ” The team used multiple sources of data about (transfer and subject) grades in several (upper level) school years, as well as data about students’ prior education. It was challenging for them to determine which data related to their hypotheses; in addition, there were too few cases to draw conclusions about most of the hypotheses.


The goal of the data team was not entirely clear to all participants. For some participants, the goal was improving the transition from the vocational education track to the general senior education track; for others, it was a more general goal: “to improve the quality of education.”

According to the team members, leadership in the form of participation by the school leader was crucial. She provided additional knowledge about the problem and supported the data team. According to one of the teachers, “It is not right to leave this all up to the teachers. The school leader needs to take decisions and make policy.” Only one of the teachers was formally given time for participation in the data team; some of the others stressed that such facilitation should be provided.

Collaboration was also an issue for the team, mainly related to time pressure. The team was not able to plan sufficient meetings, which interrupted the flow of the process and decreased the feeling of involvement; keeping everyone on the team involved remained a point of concern.

With regard to training and support, the facilitator played a substantial role in helping the team focus on the current step (e.g., the current hypothesis rather than new ones), understand the (advanced) analyses, and draw conclusions. She also summarized the interim results of the data team. Accordingly, the team members indicated that having a good researcher on the team is essential. Two data experts were available to the team and provided support mainly with data collection.


The team appeared to have little prior knowledge about using data. For example, in one of the first meetings, when the team was determining a criterion for the problem statement, a team member reacted, “Can’t we just look at the data and determine it then?” Although they all laughed about this, the criterion remained challenging to decide on. In addition, the facilitator needed to explain the results regarding significant differences extensively. The importance of PCK and organizational knowledge was also stressed, as indicated by several members: “A data team member should have sufficient knowledge on the school organization. A new teacher does not belong in a data team. . . . You need knowledge on the topic.”

The attitude of the team toward the use of data was positive. The team appeared committed to the idea of using data to accept or reject hypotheses. Although some of the hypotheses and related analyses were rather complicated, the team appeared to actively try to carry out the steps of the procedure. In addition, they were looking forward to collecting more data. They also decided to continue the project after the pilot, guided by the new problem statement.

The team appeared to feel ownership of the problem, although the decision-making process was perceived differently by the different data team members. Two teachers and the data expert indicated that it was shared to some extent but that ultimately, most decisions were taken by one teacher and the facilitator. The school leader and two other teachers felt that it was a shared decision-making process. All teachers indicated that they decided to participate in the data team voluntarily.


The results of the within-case analyses were compared and contrasted with each other. In the first section, we describe the use of data in the four teams. Next, we focus on how data characteristics influenced the process of data use in the data teams. Thereafter, we describe how school organizational characteristics influenced the use of data in the data teams, followed by the individual and team characteristics.


Table 3 shows that the use of data in the teams varied. Teams W and C were the only teams that completed all eight steps of the data team procedure, both in 12 meetings. They went through several feedback loops and created actionable knowledge based on data. For example, Team W took several measures to reduce grade repetition, and Team C took several measures to improve mathematics achievement. Both teams were able to reach high levels of depth of inquiry.

Team G was not able to complete all eight steps in the six meetings they managed to plan but went through several feedback loops with regard to defining their problem statement. They were able to create actionable knowledge: The data showed that their problem actually was not a problem (students who entered their school from a prevocational track did not do worse than their own students). The team reached average to high levels of depth.

The least successful team was Team B, which did not reach high levels of depth of inquiry and did not complete a single feedback loop. Furthermore, no actionable knowledge was created in this team. Team B quit after six meetings.

Table 3. The Process of Data Use in the Data Teams

| | Team W | Team C | Team G | Team B |
| --- | --- | --- | --- | --- |
| Number of meetings | 12 | 12 | 6 | 6 |
| Feedback loops | Several | Several | Several (regarding the problem statement) | None |
| Actionable knowledge | Several factors do not cause grade repetition; measures to reduce grade repetition | Several factors do not cause low mathematics achievement; measures to increase achievement | The problem does not exist; a new problem statement is formulated | None |
| Depth of inquiry | High | High | Average to high | Low |

All the data teams were supported by a data expert in collecting data (see Table 4). Teams W and G both used the same information management system to collect the necessary data. However, according to their data experts, a lot of work was required to generate the data the teams needed. Three teams (B, C, and G) also had problems accessing some of the data needed and had to deal with incomplete data sets.

Teams W and C seem to have been the most successful in the use of data because they were able to complete all eight steps and reached a high level of depth of inquiry. However, in terms of data characteristics, there did not seem to be notable differences between the teams. All the teams encountered some problems with regard to access and availability of data, but with the support of their data experts, they were able to overcome these problems.

Table 4. Data Characteristics Influencing Data Use in Data Teams

| | Team W | Team C | Team G | Team B |
| --- | --- | --- | --- | --- |
| Access and availability of multiple sources of data | Information management system (“a lot of work”) | Availability of most data; sometimes lack of awareness of availability; sometimes difficult to access | Information management system (“a lot of work”); accessibility and availability of data is a problem (e.g., not enough cases to draw conclusions) | Is a problem sometimes; primary school assessment data are confidential; student absenteeism is not registered properly |
| High-quality data | High-quality data; however, some data not up to date or complete | Problems with reliability of assessment data | Problems with accuracy | |

With regard to school organizational characteristics (see Table 5), all the data teams mentioned goals related to solving their specific problem. In Team G, there was some confusion about what the goal was (improving the transition from prevocational education to senior general education or improving the quality of education in general). Only in Team B was there less consensus regarding the goal of the data team. This had to do with their problem definition; some of the teachers did not feel that they had a problem. Notably, most of the goals formulated by the data teams were not very measurable (e.g., improve education), although this did not seem to hinder the data use process.

In terms of leadership, three different types can be distinguished with regard to the data teams. First, there is the encouraging, supportive, and actively participating leader, as found in the W and G data teams. Second, Team C had a leader who supported and encouraged the data team members but did not actively participate herself. Finally, the teachers of Team B experienced problems with the school leader in the data team. Teachers complained that the school leader did not facilitate them, was not very active on the team, and used data to “blame and shame” teachers. Though most of the teachers were given time by their school leader to work as part of the data team, the teams still struggled to find time to meet, especially Teams B and G. Team B even cited lack of time as one of the reasons for disbanding the team. Most of the other teachers stated that they considered the data team a priority worth making time for.

In terms of training and support, all the teams were facilitated by the researcher. All the teachers indicated that this was needed, for several reasons: to guide the process of the eight-step data team procedure; to redirect the teachers when necessary (e.g., “you are talking about a solution, but we are still at the hypotheses step”); to conduct data analyses; to teach teachers how to use data to improve education; and to bring a theoretical and scientific research background, because an objective person from outside the school is needed. Some teachers felt that after 2 years of support, they could function without a researcher. Others felt that they might always need a researcher because of the research skills required or because the team needs an objective person from outside the school to work effectively.

Also, the data teams were facilitated by a data expert. Team C never needed this person, but he was available. All the other teams needed the data expert to collect data, either to locate existing data or to provide support in collecting new data (e.g., provide support in the development of a survey).

All the teams also came to realize that it is important to involve and collaborate with relevant stakeholders at an early stage of the data use process. Team W, for example, did not involve their fellow teachers until they had their first results, which were received skeptically by their colleagues. If they had involved their colleagues early on—for example, in developing a problem statement—this might have been prevented.

When comparing more effective data teams (W and C) to less effective data teams (G and B), two factors seem to account for the difference: having a shared goal (not present in Team B, not clear in Team G) and the role of the school leader. The role of the school leader was especially problematic in School B, where the school leader was trying to use the data team to blame teachers for low-achieving students.

Table 5. School Organizational Characteristics Influencing Data Use in Data Teams

| | Team W | Team C | Team G | Team B |
| --- | --- | --- | --- | --- |
| Shared goal | Shared goal but not specific | Shared goal but not specific | Goal not clear | No shared goal for a long time |
| Leadership | School leader active and equal part of team; role model; steering teachers to take responsibility for student learning; support implementation of measures; planning and organizing of measures; facilitation in time; motivating and supportive | School leader is not always present but is seen as an equal part of the team; facilitated in time; communicates the importance of the work of the data team; motivating and supportive | School leader adds knowledge on the problem to the team; can help in implementing measures; lack of facilitation | School leader relationship with teachers is problematic; fear of using data to blame teachers; school leader not active; no facilitation |
| Training and support: facilitation by the researcher and a designated data expert | Researcher: data use support and coaching. Data expert: discussion and data collection and analyses | Researcher: data use support and coaching. Data expert: available if needed; no active participation | Researcher: data use support and coaching. Data expert: discussion and data collection and analyses | Researcher: data use support and coaching. Data expert: discussion and data collection and analyses |
| Involvement of relevant stakeholders | Collaboration with colleagues outside the team is growing but still problematic | Some collaboration with teachers outside the team | No collaboration with teachers outside the team | No collaboration with teachers outside the team |

Concerning individual and team characteristics (see Table 6), in all the data teams, it became apparent that members needed a certain amount of data literacy (e.g., what does a problem definition look like, how do you formulate a hypothesis, how do you analyze data). All the teams were guided by a researcher, but to be able to function without the researcher, they needed to learn these skills. In addition to data literacy, several data team members stated that members need to know their school (e.g., organizational knowledge) and need knowledge of the subject they teach and how to teach it (PCK).

In general, all data team members of the different schools were rather positive about the use of data in education and the data team procedure. The only really negative teachers were found in Team B. One of the teachers was very negative, and another teacher, who was not interviewed, also expressed a very negative attitude during the meetings. These teachers did not perceive that they had a problem; therefore, the use of data was insignificant and meaningless to them.

Team members of Team B also expressed that they felt that the team was not collaborating effectively. One person was doing all the work, and another person was trying to slow the whole process down. Also, during the meetings, people were not allowed to speak freely, and people constantly interrupted each other.

Furthermore, Team B was also the exception when it came to ownership. In all the other data teams, most of the members felt ownership of the problem. According to the respondents of Team B, only the school leader perceived low mathematics achievement as a problem that needed to be solved.

The least effective data team (Team B) seems to have differed on several individual and team characteristics from the other teams. The attitude of the individual data team members was rather negative, in that some of the members did not believe in the use of data. Also, collaboration within the team was problematic. Data team members seemed not to be willing or able to speak freely, and they constantly interrupted each other. There seemed to be no ownership of the problem. Finally, the defensive attitude of the data team members was probably caused by the fact that they were obliged by the school leader to participate, and they feared that they would be “blamed and shamed” based on the data.

Table 6. Individual and Team Characteristics Influencing the Use of Data in Data Teams

| | Team W | Team C | Team G | Team B |
| --- | --- | --- | --- | --- |
| Data literacy | Lack of data literacy | Data literacy to some extent | Lack of data literacy | Data literacy to some extent |
| Pedagogical content knowledge (PCK) | | | | Not mentioned |
| Knowledge on your organization (KO) | | Not mentioned | | Not mentioned |
| Involvement | | | Lack of involvement | Lack of involvement |
| Ownership | | | | No ownership |

Although this pilot study does not permit generalization beyond theoretical propositions, it provides more in-depth insight into the factors that enable and hinder interventions, focusing on supporting collaborative data use in schools. Regarding our data use theory of action, the data team intervention addresses all the different leverage points as discussed by Marsh (2012, p. 4) that are important for supporting schools in the use of data: accessing and collecting data (Step 3 from the data team procedure), filtering, organizing, and analyzing data (Steps 4 and 5), combining information with expertise and understanding to build knowledge (Step 6), knowing how to respond and taking action or adjusting one’s practice (Step 7), and assessing the effectiveness of these actions or outcomes that result (Step 8). In addition, it also supports schools in identifying problems, setting improvement goals, and formulating hypotheses regarding the problem (Steps 1 and 2).

The four data teams in our study were not equally effective in the use of data. Teams W and C were the only ones that made it through the eight steps in 2 years’ time and also were able to reach high(er) depths of inquiry. Furthermore, these teams went through several feedback loops, enhancing their learning, and they were able to construct actionable knowledge.

We found that several factors influenced the use of data in the teams. With regard to the data characteristics, in general, access to high-quality data, availability of multiple sources of data, and availability of an information management system appeared to support Team W and Team C, the more successful teams, to eliminate particular hypothesized causes and find causes for their problem and, ultimately, to design and implement measures accordingly. The less successful teams, B and G, sometimes experienced problems with accessing the data they needed.

Several school organizational characteristics also influenced data use in the data teams. First, school leadership is essential. School leader participation was important for several reasons: for support and facilitation (e.g., facilitation in time by clearing schedules for the teachers of Teams W and C); for bringing in a new perspective on the problem and data use (e.g., the school leader of Team W put forward new hypotheses); for modeling effective data use (e.g., according to the teachers of Team W, the school leader showed how she took measures based on data); for ensuring the implementation of measures (e.g., the school leader reassured teachers of Team W that measures based on data would be implemented); and for empowering teachers to use data to make decisions (e.g., the school leader of Teams W and C stressed the importance of making decisions based on data again and again). This also implies that school leaders themselves take data-informed action, that they use data to prompt questions and deliberation, and that they provide direction and support conditions that encourage teachers to ask questions, make inquiry into practice, turn to data, and take actions based on these data. This type of leadership can be called data-informed leadership (Knapp, Copland, & Swinnerton, 2007; Knapp, Swinnerton, Copland, & Monpas-Huber, 2006).

However, the school leader can also have a negative effect on the data team, as happened in Team B, where the school leader tried to use data to “shame and blame” teachers. The school leader in this school needs to build a safe culture of inquiry that supports data-based decision making and where there is trust so teachers can ask questions and use data about practice and performance without the fear of repercussions. It is crucial that teachers feel empowered by data instead of threatened (Copland, 2003; Earl & Katz, 2006; Horn & Little, 2010; Knapp et al., 2006; Means et al., 2010; Wayman & Stringfield, 2006; Young, 2006).

Also, having a shared goal is essential. The goal for Team B was not shared because the team did not really agree on the problem. This team did not consider it to be a problem that nearly half of their students failed mathematics. Other research also stresses the importance of having a measurable and shared goal for data use (Datnow et al., 2007; Earl & Katz, 2006; Kerr et al., 2006; King, 2002; Sharkey & Murnane, 2006; Wayman & Stringfield, 2006; Wayman et al., 2007; Wohlstetter et al., 2008; Young, 2006). The results of this study show that perhaps having a shared goal is more important than having a measurable goal. That most teams defined their goal as higher achievement or less grade repetition without specifying by how much did not seem to hinder their progress.

Also, support from an external researcher who taught the data team members how to use the data was essential. The importance of the role of the data expert and external facilitator can be explained, as suggested by Marsh (2012), from learning theory, which suggests that role modeling and providing learners with opportunities to discuss and reflect with each other, to practice the application of new knowledge, and to receive feedback from an expert are crucial for learning (Collins, Brown, & Holum, 1991; Lave & Wenger, 1991; Poortman, Illeris, & Nieuwenhuis, 2011).

Moreover, the results show, as also concluded by Marsh, Sloan McCombs, and Martorell (2010), that the external facilitator needed to assist in all steps of the data team procedure, not only in collecting and analyzing data but also in identifying appropriate measures based on the data. This last task might be even more difficult than learning how to analyze data because the data will never exactly tell teachers what to do; this also requires expertise and creativity (Marsh et al., 2010). The question is whether data teams can take over the role of the external facilitator in this regard or if data teams always need an external facilitator at this point in the process. However, the role of the facilitator evolved over time. In the beginning, the role of the facilitator was much larger, but after a couple of meetings, the data team participants started taking over some of the behavior of the facilitator. For example, after a couple of meetings, the researcher could intervene to a far lesser extent; teachers corrected themselves and each other when they wanted to rush through the steps and take action based on intuition and prior experience instead of data.

Regarding individual and team characteristics, data literacy knowledge and skills were lacking in almost all team members. Although complex analyses were not required, most teachers and school leaders were not able to analyze the collected data. This had to be done by the data expert and researcher. Furthermore, PCK was also mentioned by several respondents as important. Shulman (1986) stated that PCK includes subject matter content knowledge but also goes beyond it to knowledge of how to teach that subject matter. Data can help teachers to identify the conceptions and misconceptions of students, but teachers still need their PCK to determine how to alter their instruction accordingly. Mandinach (2012) referred to this as pedagogical data literacy: the ability to analyze data and, based on the data combined with PCK, take meaningful action. Based on the results of our study, we would like to add that data team members not only need to be data literate and have PCK, but they also need a substantial amount of knowledge of their school organization, for example, the rules, norms, and structures in their organization, especially if they are working on a school-level problem. An important question for further research is whether beginning teachers are able to use data effectively, because they are still developing PCK and the necessary knowledge of their school organization.

Poor collaboration between team members may have hindered the work of data Team B as well. For example, the team experienced problems coming to consensus on what caused their problem. Furthermore, the teachers from Team B were obliged to participate, which caused a lot of resistance. Finally, the teachers of Team B did not feel ownership over the problem that they had to investigate, which negatively influenced their willingness to work on this problem. Collaboration, trust, and the willingness and capability to address conflict are necessary ingredients for the use of data (Copland, 2003; Datnow et al., 2013; Horn & Little, 2010; Means et al., 2010; Nelson & Slavit, 2007; Park & Datnow, 2009). As Herman and Gribbons (2001) stated, “Combating a siege mentality and getting beyond blame and defensiveness to actions are problems that go far beyond technical and mundane aspects of data use” (p. 18).

In addition to voluntary participation and ownership of the problem under investigation, one more factor needs to be added to our theory of action based on the results of this study: collaboration with and involvement of relevant stakeholders outside the data teams at an early stage of the data team cycle. Teachers may need to rely on others outside their team to collect certain kinds of data. They might also be dependent on their colleagues for implementing certain measures; for example, in the case of Team W, curriculum coherence was a problem that needed to be addressed by all grade-level and subject-level teams. Furthermore, cross-school collaboration can also be beneficial for teacher learning. Teachers from Team C came from three different locations and really appreciated this cross-site collaboration opportunity because they valued the collective examination of data, each other’s (different) perspectives, and the focused conversations with regard to how to improve mathematics instruction. Similar findings were reported by Nelson and Slavit (2007) and Huffman and Kalnin (2003).


Summarizing, although it was based on a small pilot, our study shows how several data characteristics (access and availability of high-quality data), school organizational characteristics (a shared goal, leadership, training and support, involvement of relevant stakeholders), and individual and team characteristics (data literacy, PCK, organizational knowledge, attitude, and collaboration) influence the use of data in data teams. The results also show how these influencing factors seem to be interrelated. For example, the school leader can influence the data use attitude of participants, collaboration around the use of data can increase data literacy and PCK, and a data expert can facilitate access to the data needed. This provides important insights into how data use can be promoted in schools, regarding, for example, leadership, training and support, and knowledge and skills required for data use and making data available for teachers.

Because several studies (Copland, 2003; Datnow et al., 2013; Horn & Little, 2010; Means et al., 2010; Nelson & Slavit, 2007; Park & Datnow, 2009) show that collaboration is crucial for effective data use, we need to uncover these relationships and determine how they can impede or enable data use in teams. The results of Team B showed that the negative interaction between members within a data team can really impede the use of data. As argued by Daly (2012), there is still too little knowledge on how relationships can support and constrain the use of data. An aspect that needs further exploring, therefore, is the social interaction through which data team members co-construct and make sense of data and use data.

The results of this data team pilot are promising in the sense that in three out of four teams, the teachers really engaged in using data to improve education, and two of the teams actually succeeded in addressing their educational problem. We cannot conclude that school improvement was caused only by the work of the data team; however, it is likely that the data team contributed to the process. In addition, these three teams continued their work, and some of the data team members also started new data teams in their school. An important question for further research is whether this form of professional development is also sustainable. After 2 years of participation, have the participants gained sufficient knowledge and skills to apply data-based decision making not only in a data team but also in their daily practice (i.e., is transfer of this knowledge and these skills taking place)? Has this process, this cycle of inquiry, become part of “how things are done in this school,” and has it become an organizational routine (Spillane, 2012)? A related question is whether former data team participants are able to start and guide new data teams within their school (e.g., a train-the-trainer model) to promote data use not only by the original data team members but also throughout schools as a whole.


The authors would like to thank the Stichting Carmelcollege for making this project possible.


Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy, and Practice, 5(1), 7–74.

Boeije, H. (2002). A purposeful approach to the constant comparative method in the analysis of qualitative interviews. Quality and Quantity, 36(4), 391–409.

Breiter, A., & Light, D. (2006). Data for school improvement: Factors for designing effective information systems to support decision-making in schools. Educational Technology & Society, 9(3), 206–217.

Campbell, C., & Levin, B. (2009). Using data to support educational improvement, Educational Assessment, Evaluation and Accountability, 21(1), 47–65.

Chen, E., Heritage, M., & Lee, J. (2005). Identifying and monitoring students’ learning needs with technology. Journal of Education for Students Placed at Risk, 10(3), 309–332.

Coburn, C. E., & Turner, E. O. (2011). Research on data use: A framework and analysis. Measurement, 9, 173–206.

Coburn, C. E., & Turner, E. O. (2012). The practice of data use: An introduction. American Journal of Education, 118(2), 99–111.

Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. American Educator, 15(3), 38–46.

Copland, M. A. (2003). The Bay Area school collaborative: Building the capacity to lead. In J. Murphy & A. Datnow (Eds.), Leadership lessons from comprehensive school reform (pp. 159–184). Thousand Oaks, CA: Corwin Press.

Daly, A. J. (2012). Data, dyads, and dynamics: Exploring data use and social networks in educational improvement. Teachers College Record, 114(11), 1–38.

Datnow, A., Park, V., & Kennedy-Lewis, B. (2013). Affordances and constraints in the context of teacher collaboration for the purpose of data use. Journal of Educational Administration, 51(3), 341–362.

Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-performing school systems use data to improve instruction for elementary students. Los Angeles: Center on Educational Governance, Rossier School of Education, University of Southern California.

Desimone, L. M. (2009). Improving impact studies of teachers’ professional development: Toward better conceptualizations and measures. Educational Researcher, 38(3), 181–199.

Earl, L. M., & Katz, S. (2006). Leading schools in a data-rich world: Harnessing data for school improvement. Thousand Oaks, CA: Corwin Press.

Henry, S. F. (2012). Instructional conversations: A qualitative exploration of differences in elementary teachers’ team discussions. Cambridge, MA: Harvard University Graduate School of Education.

Herman, J. L., & Gribbons, B. (2001). Lessons learned in using data to support school inquiry and continuous improvement: Final report to the Stuart Foundation. Center for the Study of Evaluation, National Center for Research on Evaluation, Standards, and Student Testing, Graduate School of Education & Information Studies, University of California, Los Angeles.

Honig, M. I., & Venkateswaran, N. (2012). School-central office relationships in evidence use: Understanding evidence use as a systems problem. American Journal of Education, 118(2), 199–222.

Horn, I. S., & Little, J. W. (2010). Attending to problems of practice: Routines and resources for professional learning in teachers’ workplace interactions. American Educational Research Journal, 47(1), 181–217.

Huffman, D., & Kalnin, J. (2003). Collaborative inquiry to make data-based decisions in schools. Teaching and Teacher Education, 19(6), 569–580.

Ingram, D., Louis, K. S., & Schroeder, R. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258–1287.

Kerr, K. A., Marsh, J. A., Ikemoto, G. S., Darilek, H., & Barney, H. (2006). Strategies to promote data use for instructional improvements: Actions, outcomes, and lessons from three urban districts. American Journal of Education, 112, 496–520.

King, M. B. (2002). Professional development to promote schoolwide inquiry. Teaching and Teacher Education, 18(3), 243–257.

Knapp, M. S., Copland, M. A., & Swinnerton, J. A. (2007). Understanding the promise and dynamics of data-informed leadership. Yearbook of the National Society for the Study of Education, 106(1), 74–104.

Knapp, M. S., Swinnerton, J. A., Copland, M. A., & Monpas-Huber, J. (2006). Data-informed leadership in education. Seattle: Center for the Study of Teaching and Policy, University of Washington.

Lai, M. K., McNaughton, S., Timperley, H., & Hsiao, S. (2009). Sustaining continued acceleration in reading comprehension achievement following an intervention. Educational Assessment, Evaluation and Accountability, 21(1), 81–100.

Lai, M. K., & Schildkamp, K. (2013). Data-based decision making: An overview. In K. Schildkamp, M. K. Lai, & L. Earl (Eds.), Data-based decision making in education: Challenges and opportunities (pp. 9–21). Dordrecht, the Netherlands: Springer.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. New York, NY: Cambridge University Press.

Leithwood, K., Jantzi, D., & McElheron-Hopkins, C. (2006). The development and testing of a school improvement model. School Effectiveness and School Improvement, 17(4), 441–464.

Levin, J. A., & Datnow, A. (2012). The principal role in data-driven decision making: Using case-study data to develop multi-mediator models of educational reform. School Effectiveness and School Improvement, 23(2), 179–202.

Little, J. W. (2012). Understanding data use practice among teachers: The contribution of micro-process studies. American Journal of Education, 118(2), 143–166.

Mandinach, E. B. (2012). A perfect time for data use: Using data-driven decision making to inform practice. Educational Psychologist, 47(2), 71–85.

Mandinach, E. B., & Gummer, E. S. (2013). A systemic view of implementing data literacy in educator preparation. Educational Researcher, 42(1), 30–37.

Mandinach, E., & Honey, M. (Eds.). (2008). Data-driven school improvement: Linking data and learning. New York, NY: Teachers College Press.

Mandinach, E., Honey, M., Light, D., & Brunner, C. (2008). A conceptual framework for data-driven decision-making. In E. B. Mandinach & M. Honey (Eds.), Data-driven school improvement: Linking data and learning (pp. 13–31). New York, NY: Teachers College Press.

Marsh, J. A. (2012). Interventions promoting educators’ use of data: Research insights and gaps. Teachers College Record, 114(11), 1–48.

Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision making in education: Evidence from recent RAND research. Santa Monica, CA: RAND Corporation.

Marsh, J. A., Sloan McCombs, J., & Martorell, F. (2010). How instructional coaches support data-driven decision making: Policy implementation and effects in Florida middle schools. Educational Policy, 24(6), 872–907.

Means, B., Chen, E., DeBarger, A., & Padilla, L. (2011). Teachers’ ability to use data to inform instruction: Challenges and supports. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation, and Policy Development.

Means, B., Padilla, C., & Gallagher, L. (2010). Use of education data at the local level: From accountability to instructional improvement. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation, and Policy Development.

Ministry of Education, Culture & Science (Ministerie van Onderwijs, Cultuur & Wetenschappen). (2000). Wet op het onderwijstoezicht [Education supervision act]. The Hague, the Netherlands: SDU.

Nelson, T. H., & Slavit, D. (2007). Collaborative inquiry among science and mathematics teachers in the USA: Professional learning experiences through cross-grade, cross-discipline dialogue. Professional Development in Education, 33(1), 23–39.

Park, V., & Datnow, A. (2009). Co-constructing distributed leadership: District and school connections in data-driven decision making. School Leadership and Management, 29(5), 477–494.

Poortman, C. L., Illeris, K., & Nieuwenhuis, L. (2011). Apprenticeship: From learning theory to practice. Journal of Vocational Education and Training, 63(3), 267–287.

Poortman, C. L., & Schildkamp, K. (2012). Alternative quality standards in qualitative research? Quality and Quantity, 46(6), 1727–1751.

Ragland, M., Clubine, B., Constable, D., & Smith, P. A. (2002). Expecting success: A study of five high performing, high poverty schools. Washington, DC: Council of Chief State School Officers and the Charles A. Dana Center at the University of Texas at Austin.

Schaffer, E., Reynolds, D., & Stringfield, S. (2012). Sustaining turnaround at the school and district levels: The high reliability schools project at Sandfields Secondary School. Journal of Education for Students Placed at Risk, 17(1–2), 108–127.

Schildkamp, K., & Ehren, M. C. M. (2013). From “intuition” to data-based decision making in Dutch secondary schools? In K. Schildkamp, M. K. Lai, & L. Earl (Eds.), Data-based decision making in education: Challenges and opportunities (pp. 49–67). Dordrecht, the Netherlands: Springer.

Schildkamp, K., Handelzalts, A., & Poortman, C. (2012, April). Data teams for school improvement. Paper presented at the American Educational Research Association Conference, Vancouver, Canada.

Schildkamp, K., & Kuiper, W. (2010). Data informed curriculum reform: Which data, what purposes, and promoting and hindering factors. Teaching and Teacher Education, 26(3), 482–496.

Schildkamp, K., & Lai, M. K. (2013). Conclusions and a data use framework. In K. Schildkamp, M. K. Lai, & L. Earl (Eds.), Data-based decision making in education: Challenges and opportunities (pp. 177–192). Dordrecht, the Netherlands: Springer.

Schildkamp, K., Rekers-Mombarg, L., & Harms, T. (2012). Student group differences in examination results and utilization for policy and school development. School Effectiveness and School Improvement, 23(2), 229–256.

Schildkamp, K., & Teddlie, C. (2008). School performance feedback systems in the USA and in the Netherlands: A comparison. Educational Research and Evaluation, 14(3), 255–282.

Sharkey, N. S., & Murnane, R. J. (2006). Tough choices in designing a formative assessment system. American Journal of Education, 112, 572–588.

Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4–14.

Snipes, J., Doolittle, F., & Herlihy, C. (2002). Foundations for success: Case studies of how urban school systems improve student achievement. Washington, DC: MDRC and the Council of the Great City Schools.

Spillane, J. P. (2012). Data in practice: Conceptualizing the data-based decision-making phenomena. American Journal of Education, 118(2), 113–141.

Supovitz, J. A., & Klein, A. (2003). Mapping a course for improved student learning: How innovative schools systematically use performance data to guide improvement. Philadelphia: Consortium for Policy Research in Education, University of Pennsylvania Graduate School of Education.

Vanhoof, J., Van Petegem, P., & De Maeyer, S. (2009). Attitudes towards school self-evaluation. Studies in Educational Evaluation, 35, 21–28.

Wayman, J. C., Cho, V., & Johnston, M. T. (2007). The data-informed district: A district-wide evaluation of data use in the Natrona County school district. Austin: University of Texas.

Wayman, J. C., Jimerson, J. B., & Cho, V. (2012). Organizational considerations in establishing the data-informed district. School Effectiveness and School Improvement, 23(2), 159–178.

Wayman, J. C., Midgley, S., & Stringfield, S. (2006). Leadership for data-based decision making: Collaborative educator teams. In A. Danzig, K. Borman, B. Jones, & B. Wright (Eds.), New models of professional development for learner centered leadership (pp. 189–205). Mahwah, NJ: Erlbaum.

Wayman, J. C., & Stringfield, S. (2006). Data use for school improvement: School practices and research perspectives. American Journal of Education, 112, 463–468.

Wohlstetter, P., Datnow, A., & Park, V. (2008). Creating a system for data-driven decision-making: Applying the principal-agent framework. School Effectiveness and School Improvement, 19(3), 239–259.

Yin, R. K. (2003). Case study research: Design and methods. Thousand Oaks, CA: Sage Publications.

Young, V. M. (2006). Teachers’ use of data: Loose coupling, agenda setting, and team norms. American Journal of Education, 112, 521–548.

Cite This Article as: Teachers College Record, Volume 117, Number 4, 2015.
https://www.tcrecord.org ID Number: 17851

About the Author
  • Kim Schildkamp
    University of Twente
    E-mail Author
KIM SCHILDKAMP is an associate professor in the Faculty of Behavioural, Management, and Social Sciences of the University of Twente in The Netherlands. Kim's research (nationally and internationally) focuses on data-based decision making and formative assessment. She has been invited as a guest lecturer and keynote speaker at several conferences and universities, including AERA, the University of Pretoria in South Africa, and the University of Auckland in New Zealand. She has published widely on the use of (assessment) data. She is editor of the book Data-Based Decision Making in Education: Challenges and Opportunities.
  • Cindy Poortman
    University of Twente
    E-mail Author
    CINDY POORTMAN is an assistant professor in the Faculty of Behavioural, Management, and Social Sciences of the University of Twente in The Netherlands. Her research interests include professional development of teachers in professional learning communities, such as in data teams.