
Spreading the Word: Boundary Crossers Building Collective Capacity for Data Use


by Mireille D. Hubers, Cindy Poortman, Kim Schildkamp & Jules M. Pieters - 2019

Background/Context: The data team intervention was designed to support schools in using data while developing a solution to an educational problem. The participating data team members are responsible for building collective capacity within their school for using data and implementing actions related to the improvement plan. This can be challenging, because although they have gained knowledge and experience with data use and the educational problem, their colleagues who were not on the data team have not. As a result, there is a heightened risk for discontinuity between the behaviors of such colleagues and of the data team members: Colleagues will neither automatically use data nor implement the actions for improvement. These discontinuities are referred to as boundaries. To establish common ground with their colleagues, data team members need to act as boundary crossers by brokering their knowledge.

Purpose: The present study used a process view to determine how data team members acted as boundary crossers by studying what content they had brokered, the level at which they had addressed that content, and what activities they had used to cross boundaries.

Intervention: Data teams consist of six to eight educators who collaboratively learn how to use data to analyze and address an educational problem at their school. They work following a cyclical procedure.

Research Design: A longitudinal qualitative case study was conducted in four Dutch schools that implemented the data team intervention.

Data Collection and Analysis: Artifacts were collected and all team members were interviewed twice. Log files, minutes of the meetings, and progress reports were used to obtain a complete picture of boundary crossing and to provide background information. A coding scheme was used in order to determine what content was brokered, the level at which the content was addressed, and the activities used to broker the content.

Findings/Results: Findings illustrated that team members mainly brokered knowledge about the educational problem and data use as applied to the educational problem, but rarely about data use in general. Overall, content was almost exclusively addressed at the level of awareness, indicating that only basic information was brokered.

Conclusions/Recommendations: Successful boundary crossing cannot be taken for granted: Team members brokered their knowledge in ways less likely to be effective. When they receive additional support for this, they are likely to increase their team’s effectiveness in building school-wide capacity for both data use and the implementation of actions related to the improvement plan.



Data-based decision-making in education—data use, for short—has received global emphasis in recent years (Datnow, Park, & Kennedy-Lewis, 2013; Schildkamp & Kuiper, 2010). Data use refers to the collection and organization of quantitative or qualitative information representing any relevant aspect about students, their parents, teachers and/or the school, which can then be used to improve the quality of education (Lai & Schildkamp, 2013). One of the reasons for the importance of data use is that such data-based decisions are more likely to be effective than decisions based on intuition and experience (Schildkamp, Poortman, & Handelzalts, 2015). Furthermore, data can support teachers’ reflective processes and provide insight into their strengths and weaknesses, which, in turn, can be used to improve their own performance (Schildkamp & Kuiper, 2010; Visscher & Ehren, 2011). Data can also be used to establish learning goals for students, to monitor the extent to which those goals are reached, and to decide how support can best be provided (Earl & Katz, 2006). As a result, students’ achievement is likely to improve (Carlson, Borman, & Robinson, 2011).


However, despite the benefits associated with data use, most schools do not use data in their decision-making processes, or use it ineffectively (Schildkamp & Kuiper, 2010; Schildkamp & Teddlie, 2008). Therefore, several programs have been developed to support schools in the effective use of data (Coburn & Turner, 2011; Wayman, Midgley, & Stringfield, 2006). The data team intervention (Schildkamp et al., 2015) is an example of such a program, and provides the context for the present study. Teams that work according to this intervention consist of six to eight teachers and school leaders. They meet approximately twice per month under the supervision of a coach to learn how to use data to analyze and address an educational problem in their school (e.g., high grade retention rates). Typically, one data team is formed per secondary school. Besides educating the data team members in data use, the data team intervention also aims at school improvement. This includes increasing the overall level of data use by educators at the school, and implementing the planned actions for improvement to address the educational problem on which the data team members are working.


Increasing the level of data use and implementing the planned actions for improvement require capacity to be built within the entire school (Farley-Ripple & Buttram, 2015; Honig, 2006; Hopkins, 2001; Marsh, Bertrand, & Huguet, 2015). This capacity is built when data team members broker their knowledge to their colleagues outside the data team. Brokerage is a term used to identify knowledge sharing by key individuals responsible for collective capacity building. When data team members successfully broker their knowledge, it is likely to facilitate their colleagues’ participation in discussions on school-wide issues and to increase the school staff’s communication about data use and the issues those data indicate to be important (Huffman & Kalnin, 2003; Lachat & Smith, 2005). The extent to which capacity is built within the school is likely to influence the school’s sustained engagement in enacting the new practices (Atteberry & Bryk, 2010), which, in the present study, refers to the use of data and the implementation of the planned actions for improvement.


Unfortunately, capacity building in this way is a difficult task. An important reason for this is that data team members gain knowledge about data use and the educational problem through their participation in the data team, and, consequently, acquire a corresponding vocabulary. In contrast, their colleagues are unfamiliar with the activities and terminology related to this intervention (Stein & Coburn, 2008). For example, data team members have learned how to analyze their data and draw appropriate conclusions, whereas these steps are likely to be difficult for their colleagues to understand. Due to these differences in knowledge and skills between data team members and their colleagues, discontinuities in behavior around data use and addressing the educational problem will appear. This may, for example, result in colleagues not implementing the improvement measures suggested by the data team members in order to solve the problem. Another example is that data team members will be able to use data to deal with future problems, whereas other colleagues may still (only) use intuition and hunches to deal with those problems. This makes it unlikely that the colleagues will use the data and that the educational problem will be solved. Such discontinuities in behavior can be referred to as boundaries. To establish common ground with their colleagues and overcome the discontinuities in behavior, data team members need to broker their knowledge by acting as boundary crossers (Akkerman & Bakker, 2011; Hargadon, 2002).


Previous research on the process of boundary crossing is scarce. Existing studies in educational science focus on the role of teachers and district-level personnel (e.g., Honig, 2006; Wong & Anagnostopoulos, 1998), but not on boundary crossing within one school. Insight is lacking into the role of brokers who cross these boundaries, the objects they use, and how learning at this boundary takes place (Akkerman & Bakker, 2011; Bakker & Akkerman, 2014; Kubiak, 2009). Research on the intersection of learning and data use is particularly scarce (Jimerson & Wayman, 2015). For example, previous research about the data team intervention (Schildkamp & Poortman, 2015) determined the effectiveness of the intervention, but not yet how capacity for data use could be built within the entire school. However, these insights are crucial for coming to a deeper understanding of the dynamic between an intervention, such as the data team intervention, and the resulting on-the-ground responses and actions (Coburn & Turner, 2011; Marsh, 2012). The insights can then be used to further support effective and sustainable data use in education. Therefore, the present study aims to address the aforementioned gap in the literature by identifying how data team members cross boundaries between themselves and their colleagues outside the data team.


THEORETICAL FRAMEWORK


BOUNDARIES IN THE DATA TEAM CONTEXT


Schools are made up of multiple, overlapping communities of practice. Such communities are composed of several individuals who experience a common issue, problem, or goal (Wenger, McDermott, & Snyder, 2002). An example could be a group of mathematics teachers who want to provide their students with a coherent curriculum. Furthermore, these community members interact with one another with a certain frequency and have a common approach to guide their work (Wenger et al., 2002). When individuals are not involved in a certain community, it is difficult for them to “pick up the talk or tasks of an unfamiliar community, because the meanings that are invested in them are rooted in unspoken, tacit understandings that have developed over a long period of coparticipation” (Stein & Coburn, 2008, p. 588). Consequently, discontinuities can occur between the behaviors of those who participate in a community—and thus have a shared history with that specific content—and those who do not (Stein & Coburn, 2008). These discontinuities are referred to as boundaries.


One example of a school-based community of practice is a data team. In the present study, this refers to a specific intervention in which a data team consists of six to eight teachers and school leaders, who meet approximately twice a month for two years to learn how to use data to address an educational problem at their school, such as high grade retention rates. They work following a structured cyclic procedure as illustrated in Figure 1, which includes an extensive set of guidelines and activities. A coach from the university guides them through the intervention. After two years, they should be able to use data independently.


A data team defines an educational problem on which they want to focus their efforts. Next, the team formulates a hypothesis as to the cause of this problem, and collects data to investigate the hypothesis. This ensures the usefulness of the data collected. The team only collects data related to the size of the problem and to its possible causes.


Typically, one data team is formed per school. As the data team members are the only members of staff who learn about data use, this team structure implies that they are responsible for building collective capacity at their school. This capacity needs to be built for data use (e.g., using data to improve teachers’ instruction and/or to improve the overall quality of education), as well as for addressing the educational problem the data team members try to tackle (here, high grade retention rates). This is likely to be challenging, because the data team members have gained knowledge and experience with data use and the educational problem, whereas their colleagues have not. For example, data team members can talk about the reliability and validity of their data, but these are likely to be unknown concepts for their colleagues. As a result, there is a heightened risk for a discontinuity between their behaviors: Colleagues are not likely to automatically use data or to implement the planned actions for improvement by themselves. Thus, boundaries are likely to arise between the data team members and their colleagues who are not participating in the team. Fortunately, these boundaries carry the potential for learning, both for the one crossing the boundary and for the one being reached in this way (Akkerman & Bakker, 2011; Wenger, 1998).


To support the team members in brokering their knowledge to their colleagues, the data team manual includes general examples of knowledge brokerage for step 1 (defining a problem), step 6 (drawing conclusions), step 7 (designing and implementing actions for improvement), and step 8 (evaluating whether the problem has been solved). For example, step 1 includes:


Ask your colleagues who are not part of your data team for input. They might have valuable ideas that you and your team members did not yet think about. Besides that, it is important to involve them in the activities of your data team. They will implement the measures for improvement you and your team members will design later on in the process only when they are curious! You can ask for your colleagues’ input in different ways depending on what fits your school culture. Think for example about scheduling a brainstorm session, or hanging a flipchart in the teachers’ lounge. Discuss with your fellow data team members how you want to involve your colleagues.


Moreover, step 8 includes:


When you have found the cause of your problem and you have designed appropriate improvement actions, it would be nice to share your results and conclusions with your colleagues. Talk about your most important findings and write them down on worksheet 8D. You also need to determine how you will share these results with your colleagues. This can be done informally during a lunchbreak, through writing an article for your staff newsletter, or by giving a presentation.


Because such paragraphs are included in the data team manual and the data coach refers to this manual during team meetings, it can be assumed that data team members are likely to broker their knowledge at least to some extent. For example, some teams make sure that “communication with colleagues” is always part of their meetings’ agenda. We will elaborate on this in the results section.


THE PROCESS OF BOUNDARY CROSSING


As data team members are responsible for acting as boundary crossers, the present study focuses on their behavior. This does not mean that their colleagues are passive recipients. On the contrary: Boundary crossing is more likely to be effective when colleagues play an active role. However, in this specific context, they cannot initiate a boundary-crossing activity, as only data team members have the required knowledge about data use and the educational problem to do so.


To act as boundary crossers, data team members need to broker their knowledge. Brokerage is a term used to identify knowledge sharing by key individuals responsible for collective capacity building. In this specific context, such capacity needs to be built for data use as well as the educational problem, which will be clarified in the next section.


Previous research shows that several factors can influence whether or not knowledge is being shared. For example, a review study by Wang and Noe (2010) shows that characteristics of the organizational context, interpersonal and team characteristics, cultural characteristics, individual characteristics, and motivational factors can exert such influence. Moreover, previous research shows that the role of school climate, school leaders, teachers’ beliefs, sense-making, agency, and available time/money all influence the implementation of change, of which boundary crossing is an important aspect (Coburn & Talbert, 2006; Datnow, Park, & Wohlstetter, 2007; Desimone, 2002; Kurland, Peretz, & Hertz-Lazarowitz, 2010; Penuel, Fishman, Cheng, & Sabelli, 2011; Schildkamp & Kuiper, 2010; Visscher, 2002). However, insight is lacking into the role of those who act as boundary crossers, what objects they use and how they use them (Akkerman & Bakker, 2011; Bakker & Akkerman, 2014; Kubiak, 2009). Such a process view is a crucial aspect of understanding how change develops (Crossan & Apaydin, 2009), which is why the present study adopted this view to study boundary crossing. First, the content that should be addressed when acting as boundary crossers is described.


Figure 1. The data team intervention procedure (Schildkamp & Ehren, 2013, p. 56)




KNOWLEDGE CONTENT AT THE BOUNDARY


As a result of their participation in the data team, team members are expected to learn about at least three types of content related to data use. Thus, they need to broker these types of content when addressing the boundaries between themselves and their colleagues. These types of content are:


The educational problem: This includes knowledge about what turned out to be (or not to be) the cause of the educational problem the data team members are working on, and knowledge about the design and implementation of the actions for improvement. An example is that students who repeated a grade were found to do so because they lacked study skills. Such knowledge is hypothesized to provide an important contribution to solving the problem.

Data use as applied to the educational problem: This includes basic and practical knowledge about data use. Examples include knowing the extent of the problem and knowing how to calculate certain grade retention rates.

Data use in general: This includes abstract knowledge about data use. Examples include knowing when a hypothesis is testable, knowing when data are valid and reliable, and knowing how a hypothesis can be statistically tested. Such knowledge is hypothesized to facilitate a deeper understanding of the data team procedure as such, and has the potential to help team members identify needed data and draw inferences from data in appropriate ways (Coburn & Turner, 2011).
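The practical knowledge in the second category, such as knowing how to calculate a grade retention rate, comes down to simple cohort arithmetic. The sketch below is illustrative only; the function name and the numbers are assumptions for exposition, not figures or methods from the study:

```python
def grade_retention_rate(num_retained: int, num_enrolled: int) -> float:
    """Share of a cohort that repeats the grade (illustrative definition)."""
    if num_enrolled <= 0:
        raise ValueError("cohort must contain at least one student")
    return num_retained / num_enrolled

# Hypothetical cohort: 27 of 180 students repeat their grade.
print(f"{grade_retention_rate(27, 180):.1%}")  # prints "15.0%"
```

A team would compute such rates per cohort or per track and compare them against the school's target or the national average to gauge the size of the problem.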


In order to understand what knowledge data team members shared with their colleagues who were not part of the data team, the first research question was:


1.   What type of knowledge content (educational problem, data use as applied to the educational problem, data use in general) is brokered by the data team members?


LEVEL AT WHICH KNOWLEDGE CONTENT IS ADDRESSED


It is important not only to determine what knowledge is shared, but also at what level it is shared. Three such levels can be distinguished (Rogers, 2003):


Awareness level (lowest level): This includes basic information about the existence of the new practices; here, these new practices are data use and the actions to be taken for improvement.

How-to level (intermediate level): This includes all information necessary to properly engage in the new practices, for example, the steps that need to be taken to collect data about grade retention rates.

Principles level (highest level): This includes information on the functional principles underlying the new practices, for example, the statistical theories underlying data analyses. The new practices (data use and carrying out the actions for improvement) can be adopted without content being addressed at this level, but doing so increases the risk of misuse and abandonment of the new practices.


Thus, addressing knowledge at the how-to and principles level is crucial for successfully implementing new practices, but this can only be done after the level of awareness has been addressed (Rogers, 2003). Taking the aforementioned into account resulted in the second research question:


2.   At what level is knowledge brokered by the data team members (awareness, how-to, or principles)?


BOUNDARY CROSSING ACTIVITIES


Apart from looking at knowledge content and the level at which it is addressed, it is important to determine how boundaries are being crossed. Data team members can broker their knowledge to their colleagues through disseminating boundary objects, and/or through personal communication. Boundary objects are artifacts that can be used to create and maintain a common identity across different communities of practice (Star & Griesemer, 1989). This means that these artifacts relate to different communities of practice, here data team members and their colleagues, and satisfy the information needs of each group. Therefore, one of the most important characteristics of a boundary object is that its informational content is plastic enough to connect to the knowledge base of all involved parties. Moreover, these objects should address at least one of the contents relevant here (the educational problem, data use as applied to the educational problem, and/or data use in general) and be used to diminish the boundary between data team members and their colleagues. Examples of boundary objects include documents, tools, and programs (Wenger, 1998). Thus, data team members could hand out the data team manual, which explains the data-use process step by step (e.g., how do you define a hypothesis, how do you collect data). This manual is helpful for the team members, but can also be used by their colleagues who are novices regarding data use. Artifacts alone, however, are unlikely to address sufficiently the discontinuity between the behaviors of the data team members and their colleagues. One reason for this is that data team members cannot be sure whether colleagues understand the artifact as intended by them (Wenger, 1998). Such ambiguities cannot be addressed by artifacts alone.


This can be done through personal communication from the data team members to their colleagues, for example, during meetings and conversations, which is thus a better way to broker knowledge (Wenger, 1998). Such conversations are most likely to be effective when they are accompanied by boundary objects, because an object can give a less partial view of a topic, and a boundary crosser can help interpret the object and negotiate its relevance (Wenger, 1998). Furthermore, it is beneficial when colleagues have the opportunity to respond. This gives team members a better idea about their colleagues’ beliefs, knowledge, and preconceptions, which they can use to improve their strategy for knowledge brokerage and the functioning of their team. For example, if colleagues indicate they do not know what the added value of data use is, data team members could target this lack of knowledge in their upcoming brokerage activities. Moreover, when colleagues indicate that an important possible cause has been overlooked in the research activities, the data team members could decide to incorporate this information in their future activities.


However, these subtypes of personal communication are still not the best way to broker knowledge. The most fruitful subtype of personal communication that can be used to cross boundaries is to let colleagues gain as much experience with the data team and its procedure as possible, without demanding full participation in the team (Wenger, 1998). Examples include letting them observe team meetings, or simulating their participation during a workshop. Taken together, knowledge can be brokered in four ways: using boundary objects (least likely to be effective), personal communication, personal communication accompanied by an object, and providing practical experience (most likely to be effective). This results in the final research question:


3.   What boundary-crossing activities are used by the data team members (dissemination of boundary objects, personal communication with/without boundary object, providing practical experience)?


METHOD


CONTEXT


This longitudinal qualitative study took place in the Netherlands. Schools in the Netherlands are increasingly prioritizing data use, as it is the government’s policy to have at least 90% of schools using data by 2018 (Verbeek & Odenthal, 2014). In 2011–2012, the rate was only 28% of schools in primary education and 15% to 28% in secondary education (Dutch Inspectorate of Education, 2013).


Dutch schools have considerable freedom in determining what subject matter they teach; what textbooks, assessments and instructional strategies they use; and the religious or ideological beliefs to which they adhere (Kuiper, Van den Akker, Hooghoff, & Letschert, 2006; Ministry of Education, Culture and Science, 2000). However, the Dutch Inspectorate holds schools accountable for the quality of education they provide. Therefore, data team members often choose to work on an educational problem that is not only their own, but also the Inspectorate’s point of concern. Schools’ freedom in structuring the education they provide also implies that data team members have considerable freedom in designing the actions to be taken to address their problem. These actions can vary from changing instructional strategies to changing the school’s policy on repeating a grade.


The present study was part of a larger project funded by the Dutch Ministry of Education. A total of 25 secondary schools voluntarily signed up for this two-year project by sending in a preliminary general statement of a problem on which they wanted to focus their efforts. Of this group, 10 schools were selected based on their statement of a problem and their geographic location, so that various parts of the Netherlands were represented. All schools chose high grade retention rates,1 which they experienced to be a pressing issue in need of a solution, as the educational problem on which they wanted to focus their efforts. They all had little experience with data use, and were thus comparable in that regard at the beginning of the project. All data team members voluntarily participated in the team, and were explicitly instructed to share their acquired knowledge, experiences, and outcomes of their study with their colleagues. All teams were coached by the same person, who was not an author of this paper.


PARTICIPANTS


Case studies were conducted at four out of the 10 schools.2 Fairview and Jefferson were the only schools that were spread over multiple locations. It was assumed that, when there were relatively few data team members per location, boundary crossing might take different forms to reach a similar (or larger) number of colleagues (83 and 200, respectively). Therefore, these two schools were deliberately chosen. Lincoln and Oak Grove were randomly selected. At those schools, data team members had to broker their knowledge to 99 and 86 colleagues, respectively.


All four schools are confessional, which means that they incorporate religious beliefs in their education (e.g., the meanings of religious holidays are incorporated in the lessons). Fairview provides education for students in the lowest Dutch educational track.3 It has four locations; personnel from two of these worked collaboratively in one data team. A total of 543 students (ages 12–16) attend Fairview 1, and 464 students (ages 12–16) attend Fairview 2. Both numbers of students are slightly below the Dutch average for this specific school type. The data team was initially composed of four members from Fairview 1 and three members from Fairview 2. However, at the beginning of the second year, all three members from Fairview 2 were replaced by two of their colleagues.


Jefferson provides education for students in the intermediate and highest Dutch educational tracks. It has four locations, and students switch locations during their time in secondary education. The overall number of students for all four locations is 2,752 (ages 12–18), which is average for this specific school type. Initially, the data team was composed of one or more staff members at each location and one staff member from the school’s central office. However, at the beginning of the second year, three members quit the data team due to other obligations. They were replaced by a teacher from Jefferson 3, which meant that Jefferson 4 was no longer represented.


Lincoln provides education for students in all three Dutch educational tracks. A total of 1,261 students (ages 12–18) attend the school, which is slightly above the Dutch average for this specific school type. The Lincoln data team was composed of seven members. Halfway through the second year, one of the teachers left the school. No one took his place in the data team.

Oak Grove also provides education for students in all three Dutch educational tracks. A total of 1,185 students (ages 12–18) attend the school, which is slightly above the Dutch average for this specific school type. The data team was composed of six members, who participated in their team both years. See Table 1 for more information on all data team members.


INSTRUMENTS


This study aimed to determine how data team members acted as boundary crossers. To triangulate our findings and to ensure that all instances of boundary crossing were included, five instruments were used. This ensured the construct validity of the findings (Yin, 2009).


Boundary Objects


All data teams were asked to submit all artifacts they had used to share their knowledge with their colleagues. These objects were emails, articles for the staff newsletter, or presentation slides. This yielded four objects for Fairview, four for Jefferson, three for Lincoln, and three for Oak Grove. At the end of each year, the resulting repository was checked for completeness with the team members.


Interviews


All data team members were individually interviewed at the end of the year(s) during which they had participated4 (see Table 1). A semi-structured interview scheme was used to ask them how they had acted as boundary crossers. Questions included:


You have just explained what you have learned from your participation in the data team. How do you share this knowledge with your colleagues?

In general, what information should be shared with colleagues who did not participate in the data team? And how should this information be shared?

What would be the goal of sharing such information? What factors make it easy or hard to share your knowledge?


The interviews had an average duration of 30 minutes.


Table 1. Participants’ Characteristics for the Four Cases

| Name | Location | Years of experience | Function | Subject areas |
|---|---|---|---|---|
| Fairview | | | | |
| Mr. Anderson* | 1 | 20 | Teacher, school leader | Dutch |
| Mr. Johnson* | 1 | 12 | Assistant principal | |
| Ms. Smith* | 1 | 9 | Teacher | English |
| Mr. Williams* | 1 | 8 | Teacher, data expert | Science, Chemistry |
| Mr. Jones | 2 | 35 | Teacher, school leader | Mathematics |
| Ms. Lee | 2 | 10 | Teacher | History, Geography |
| Mr. Garcia | 2 | 7 | Teacher | Science, Chemistry |
| Mr. Miller | 2 | 31 | Teacher, school leader | Mathematics |
| Ms. Rodriguez | 2 | 15 | Teacher, quality manager | Fashion & Commerce, Trade & Sales, Economics |
| Jefferson | | | | |
| Mr. Baker | 3 | 6 | Teacher | Mathematics |
| Mr. Bennett* | 1 | 33 | Principal | |
| Mr. Cooper* | 2 | 17 | Teacher | Mathematics |
| Ms. Morris* | 4 | 3 | Teacher, quality manager | German |
| Ms. Price* | 1 | 33 | Support staff | |
| Ms. Russell | 3 | 8 | Teacher | Computer Science, Mathematics |
| Ms. Wood | 4 | 14 | Teacher | Physical Education |
| Lincoln | | | | |
| Ms. Brown* | – | 32 | Teacher | Music, Arts |
| Ms. Clark* | – | 13 | Support staff | |
| Ms. Harris* | – | 30 | School leader | |
| Mr. Martinez | – | 15 | Teacher | Mathematics |
| Mr. Thomas* | – | 5 | Support staff | |
| Mr. Wilson* | – | 9 | Teacher | Chemistry |
| Ms. Young* | – | 20 | Teacher | Mathematics |
| Oak Grove | | | | |
| Mr. Bailey* | – | 33 | Teacher | Philosophy |
| Ms. Davis* | – | 15 | Teacher | Dutch |
| Mr. Foster* | – | 42 | Quality manager | |
| Ms. Hill* | – | 24 | Teacher | French, Mathematics |
| Mr. Jenkins* | – | 30 | Teacher | Mathematics |
| Ms. Perry* | – | 17 | School leader | |

Note. * indicates two-year participation in the data team. Lincoln and Oak Grove each have a single location.


Log Files


The data coach from the university supervised all data teams included in the present study. Overall, the teams met 24 (Fairview), 19 (Jefferson), 21 (Lincoln), and 23 (Oak Grove) times. The coach wrote a log file after each meeting in which she reflected on how the teams talked about boundary crossing, among other things. All available log files, 81 out of 87, were included as data in this study.


Minutes and Progress Reports


All data teams described what they had done during a specific meeting in their minutes. However, minutes were not available for every meeting, as teams were not required to keep minutes. All available minutes, 69 out of 87, were included as data in the present study. Moreover, those minutes were used to tally the number of times boundary crossing was discussed during a meeting.


Data teams described in their progress reports what they had done for each step of the data team procedure. For example, they recorded what hypothesis they had studied, what data they had collected, and what remarks could be made about the reliability and validity of those data. For each of the four schools, the progress report written at the end of the second year was used, which summarized their progress throughout both years. This gave an overview of what content the data team members could have shared.


DATA ANALYSIS


Boundary Objects and Interviews


A coding scheme was developed; it is presented in Tables 2 and 3. The coding scheme encompassed the four brokerage activities that could be used (dissemination of a boundary object, personal communication with a boundary object, personal communication without a boundary object, and providing practical experience), inspired by the work of Wenger (1998). Furthermore, the scheme encompassed the three types of knowledge content that could be brokered (the educational problem, data use as applied to the educational problem, and data use in general), inspired by previous work on the data team intervention (Hubers, Poortman, Schildkamp, Pieters, & Handelzalts, 2016). Finally, the coding scheme encompassed the three levels at which knowledge could be brokered (awareness, how-to, and principles), inspired by the work of Rogers (2003). When coding the data in Atlas.ti, the boundary-crossing activity was coded first. After that, its content was coded by applying two codes per segment: one for the type of content and one for the level at which the content was brokered.


For each boundary-crossing activity, a summary was made that described the type of knowledge content and the level at which it was shared. These summaries were compared and contrasted, which facilitated within-case and cross-case analyses. Examples of types of knowledge content and quotes from the interviews were translated from Dutch into English to illustrate the results.


To determine inter-rater reliability, the second author coded 10% of the objects (two out of 14) and interviews (two out of 28). This resulted in an inter-rater reliability of .90, which is considered to be almost perfect (Eggen & Sanders, 1993).


Table 2. Coding Scheme for Activities

| Boundary-crossing activity | The data team members . . . |
| --- | --- |
| Dissemination of boundary object | use an artifact (e.g., email, staff newsletter) to share their knowledge. |
| Personal communication (without object) | share their knowledge in one-on-one or group conversations without using any objects (e.g., informal conversations during lunch breaks). |
| Personal communication (with object) | share their knowledge in one-on-one or group conversations and use artifacts in so doing (e.g., presentation slides). |
| Providing practical experience | let their colleagues gain experience without demanding full participation in the data team (e.g., a workshop). |





Table 3. Coding Scheme for Types of Knowledge Content and Levels of Information

Level

- Awareness (lowest level): Basic information. Colleagues will not yet know how to act upon the educational problem and/or use data.
- How-to (intermediate level): Information that is required to act upon the educational problem and/or use data. For example, it tells you what steps you have to take.
- Principles (highest level): Information on the working principle underlying the educational problem and/or data use. For colleagues, this information is required to engage in critical reflection and to prevent them from engaging in mindless action.

Content

Educational problem: Information regarding the educational problem without any data being presented. This can include information on the problem, the underlying causes, and the actions for improvement.

- Awareness: “The grade retention rates are too high in our school.” “Students who repeat a grade obtain significantly lower results than students who do not repeat a grade.” “One of our actions for improvement is that primary schools have to give unequivocal recommendations from now on.”
- How-to: “To solve our educational problem, counselors need to do three things: First . . . Second . . . Third . . .”
- Principles: “We found out that you need to look for a cause that is related to your own performance. This is something you can influence.”

Data use as applied to the educational problem: Basic and practical information. This can include actual data and/or data collection activities.

- Awareness: “All sorts of data about our students are available in the computer program.” “Last year, our grade retention rate was 40%.” “We wanted to study our grade retention rates. Therefore, we have posed two hypotheses. We have found significant results, which are discussed in the team.”
- How-to: “You can calculate the grade retention rates yourself. In order to do so, you . . .”
- Principles: “We found out that students who are going to repeat a grade can be identified after the first period. When they have a failing mark at that time, they increasingly lose motivation. Eventually, this causes them to repeat a grade after the fourth period.”

Data use in general: Technical and abstract information about data use. This can include the status quo on data use, information on specific analyses, reliability issues, etc.

- Awareness: “You can study problems in the school through the use of data.” “This is the current state of affairs related to data use.” “A problem is defined based on available data. One or more hypotheses are posed, and these hypotheses are studied through the use of data to see whether they can be accepted.”
- How-to: “When you have collected your interview data, you need to apply codes. To make a coding scheme, you read literature first. Second, you . . .”
- Principles: “When your data are not reliable, it will distort your data in the following manner . . .”


Log Files


The log files were used to ensure that all instances of boundary crossing were included, and for background information. First, all log files were read and all information related to knowledge brokerage was flagged and added to the description of the appropriate boundary-crossing activity. During this process, three additional instances of boundary crossing were identified (beyond those that had emerged during the coding of the boundary objects and interviews). The descriptions of these activities were coded with the aforementioned coding scheme. These activities were summarized and included in the comparison. Moreover, the background information was summarized and compared and contrasted both within the four cases and between the four cases. The researchers interpreted and discussed the resulting findings in a cyclical and iterative manner, thereby ensuring the reliability and internal validity of their findings (Poortman & Schildkamp, 2012).


Minutes and Progress Reports


The first author read all the minutes and progress reports and used this information to summarize and describe how team members worked with the data team intervention. Furthermore, a chronological comparison was made for each boundary-crossing activity between content that was shared during that activity, and the overview of content that could have been shared by the data team members at that specific point in time but was not, in fact, shared.


RESULTS


The process of boundary crossing was studied in four data teams. The within-case analysis for each data team is presented on a school-by-school basis. First, how each data team worked with the data team procedure is described, as this facilitates better understanding of the content of their boundary-crossing activities. In addition, the number of times boundary crossing was part of team members’ meetings is provided. This gives a sense of their awareness of the importance of knowledge sharing. Second, conclusions are drawn regarding the types of knowledge content that were shared, the level at which that content was addressed, and the boundary-crossing activities that were used to do so. Finally, the results of the within-case analyses are compared and contrasted with each other, and described in the cross-case analyses.


FAIRVIEW 1 AND 2


Fairview’s data team members met 24 times. They studied the grade retention rates for the lower grade levels. The first hypothesis they considered was that students’ final test scores from primary school were associated with grade retention. Their second hypothesis was that some primary schools gave poorer advice about students’ educational level for secondary education than other schools.5 The data team members collected data for both hypotheses, refined their data file in some instances, and conducted descriptive analyses. They accepted only their second hypothesis. However, they decided to delve deeper into their problem and study potential causes over which they had more influence. Therefore, they interviewed students about what they thought were the reasons for being demoted to a lower educational level (which results in an increased grade retention rate). The team members designed an interview scheme, collected the appropriate data, and developed and applied a coding scheme to their data. They drew several conclusions, including that teachers’ classroom management skills play an important role in students’ grade demotion. They developed corresponding actions for improvement, such as using a peer-observation scheme to determine which teachers lack adequate classroom management skills.


The minutes of meetings at Fairview revealed that data team members discussed the way in which they tried to inform and involve their colleagues during at least eight of their meetings. The majority of those meetings took place in the second year. Examples of how boundary crossing was reflected in their minutes include: “We need to inform our colleagues from both locations about what we are doing” and “We need to present our data to our colleagues before April 4th, and ask them to provide input for our conclusions.” During the interviews, most data team members indicated that boundaries should be crossed in order to ensure their colleagues’ cooperation in the implementation of the actions for improvement so that the educational problem would be solved. For example, Mr. Anderson said, “Really, everything comes down to communication.” Moreover, some members indicated that their colleagues should use the data team procedure as well. For example, Mr. Johnson stated, “I really hope that this [procedure] will increasingly spread throughout our school.” At Fairview, boundary crossing took place in the following manner (see also Table 4):


Content


During the interviews, the data team members explained that the results of their research should be communicated, and, if possible, what actions could be taken to solve the problem. Some team members explicitly mentioned that the way in which those results were obtained should not be shared. For example, Ms. Lee stated, “That would be incredibly boring.”


Overall, the content shared by the team members was quite concrete and detailed. For example, in an email they sent to their colleagues, they stated: “In 2011, 88% of our students at Fairview 1 were no longer on the educational track advised by their primary schools.” At both Fairview 1 and Fairview 2, they mainly shared knowledge about the educational problem and about data use as applied to the educational problem. Regarding the educational problem, they explained what problem they were studying, which hypotheses they tested, and what actions for improvement they were going to take. For example, Mr. Anderson communicated in his final presentation after the 21st meeting:


Conclusion 2: Students are not able to motivate themselves after they have obtained poor results. Possible action for improvement: Make sure that the counselor and teachers are aware of students’ results sooner, and regularly speak with students about their results.


In so doing, he did not broker guidelines to implement the actions for improvement, as the possible actions for improvement were not yet agreed upon by the data team members.


Regarding data use as applied to the educational problem, team members displayed their data and commented on the data’s reliability and validity. In doing this, they described discrete finalized activities, never an iterative process in which team members had to go back and forth (e.g., when they had to refine their data file after some data appeared to be missing). During the interview, Mr. Anderson explained that he only addressed the activities undertaken by the data team members in his presentations. He said:


Some colleagues questioned whether the data were accurate and wondered what they represented. But I told them I did not want to go into detail, and that I was only going to tell them about the activities we had conducted.


Knowledge about data use in general was shared once at Fairview 1, when team members explained the data team procedure in an email: “A problem is defined based on available data. One or more hypotheses are set up, and through the use of data, these hypotheses are checked for their accuracy.” However, they never explained how their data were analyzed, but only presented their data with the corresponding conclusions. At Fairview 2, content related to data use in general was never shared. Finally, it appeared that no boundary-crossing activities were used to cover content related to the 22nd through 24th meetings at Fairview 1 (during which measures for improvement were formed), and the 8th through 24th meetings at Fairview 2 (during which the bulk of their research was being done).


Level


As the aforementioned quotes illustrate, content was exclusively addressed at the level of awareness for both Fairview 1 and Fairview 2: Colleagues were informed about what the data team members had done, but did not receive any guidelines to act upon the educational problem or data use themselves. In addition, they did not receive any information about the underlying principles regarding the educational problem or data use.


Activities


At Fairview 1, several activities were conducted to cross boundaries. Two boundary objects were used: an email sent to colleagues with information about the data team, and a flipchart in the teachers’ lounge on which colleagues could write possible causes of the educational problem. Moreover, Mr. Anderson used slides to give three presentations to his colleagues. During the interviews, some data team members also indicated that they had shared their knowledge about the educational problem during informal conversations with their colleagues (personal communication without a boundary object), whereas others indicated they had not shared their knowledge at all. Thus, team members from Fairview 1 relied heavily on Mr. Anderson’s capacity to broker his knowledge: He was the only one who gave the presentations. In one of his presentations, he explicitly asked for input from his colleagues who were not involved in the data team.

At Fairview 2, one presentation was given as well, by Mr. Miller. Moreover, some data team members indicated that they had shared their knowledge about the educational problem during informal conversations with their colleagues (personal communication without a boundary object). Thus, team members from Fairview 2 relied heavily on Mr. Miller’s capacity to broker his knowledge. He was the only one who acted as a boundary crosser and he used personal communication with and without a boundary object to do so. However, he was only a member of the data team during the first year.


When asked how knowledge should be brokered toward their colleagues, all data team members from Fairview 1 and 2 indicated that this should be done through personal communication, as this would allow colleagues to ask questions and provide input. Team members would not use emails to broker their knowledge, as many of their colleagues would not read those. None of the team members mentioned providing practical experience as a possible activity.


When asked what factors hindered their boundary-crossing activities, most team members noted that some of their colleagues were skeptical about the use of data and showed resistance to it. Team members found it difficult to share their knowledge with these colleagues, as they were not open to data use in general and posed difficult questions to the team members. The factor that facilitated their activities was that there are already several opportunities in their work schedule, such as meetings, to broker their knowledge. So, if they were going to share their knowledge, there was ample opportunity to do so.


JEFFERSON 1 THROUGH 4


Jefferson’s data team members met 19 times. They studied their grade retention rates, but were unsure about which educational level(s) or location(s) to include. Therefore, they included all of them. Their first hypothesis was that boys repeat a grade more often than girls. Their second hypothesis was that students’ final test scores from primary school were associated with grade retention. They collected the appropriate data, refined their data file in some instances, and conducted descriptive analyses. Their second hypothesis was accepted, but only for two of the 16 departments. The first hypothesis was rejected. They decided to delve deeper into their problem and studied whether students who repeat a grade in a specific department at Jefferson 1 performed at a lower percentile than students who did not repeat a grade. They collected the appropriate data, analyzed them, and concluded that this was the case for Mathematics and English. They designed actions for improvement, such as providing extra support for low-performing students.


The minutes of meetings at Jefferson revealed that data team members discussed the way in which they tried to inform and involve their colleagues during at least nine of their meetings. Half of those meetings took place in the first year, and the other half in the second year. An example of how boundary crossing was reflected in their minutes is: “The text as written by Mr. Bennett will be placed in our staff’s newsletter. Moreover, our members from locations 2 and 3 will refer to its content also during a staff meeting.” During the interviews, most data team members indicated that boundaries should be crossed to ensure their colleagues’ cooperation in the implementation of the actions for improvement so that their educational problem would be solved. At Jefferson, boundary-crossing activities were similar for all four locations and took place in the following manner (see also Table 4):


Content


During the interviews, the data team members indicated that their findings should be communicated. Communicating the actions for improvement was not mentioned by any of the members. Overall, the content shared by the team members was quite vague and abstract. They mainly addressed content related to data use as applied to the educational problem. In doing so, they explained in general terms what activities they had conducted. For example, Ms. Morris explained in an article in their staff’s newsletter what they were going to do in the upcoming meetings:


We are going to determine the absolute number and relative percentage of students who repeat a grade or are demoted to a lower educational level. When we have collected these data, we will see if this is in fact problematic and, if so, how many locations we can study this for.


Team members never presented actual data, such as data on the extent to which the educational problem was in fact a problem, or even explained what kind of data they had collected or how they had analyzed those data. Content related to the educational problem and to data use in general was rarely addressed. The former was addressed to explain what problem and hypotheses were being studied and what conclusions were drawn. For example, one article stated: “The second hypothesis is confirmed for departments ‘vmbo’3 at Jefferson 3, and ‘vwo’ at Jefferson 1. These conclusions do not offer enough starting points to design actions for improvement.” No additional information—for example, rephrasing what this meant—was provided. Data use in general was addressed to explain in general terms what data use and the data team procedure entail. For example, one article stated: “In light of data-based decision making, we use a cyclic procedure for our research.” This content about data use in general did not include information on how to conduct research. Finally, it appeared that no boundary-crossing activities were used to cover content related to the 15th through 19th meetings, during which data team members mainly designed the actions for improvement.


Level


As the aforementioned quotes illustrate, each type of content was exclusively addressed at the level of awareness: Colleagues were informed about what the data team members had done, but colleagues did not receive any guidelines for acting upon the educational problem or using data themselves. In addition, they did not receive any information about the underlying principles regarding the educational problem or data use.


Activities


The team members disseminated four boundary objects and, during the second year, used some personal communication without a boundary object as well. Three of the objects were written by Ms. Morris, which indicates that the other members relied heavily on her capacity to share her knowledge. Input from colleagues was never explicitly requested. During the interviews, data team members added that they had also used personal communication to broker their knowledge about the educational problem to their colleagues, for example, during lunch breaks. However, this occurred rarely, and exclusively during their second year. Team members were hesitant to talk about the teams’ progress with their colleagues because they did not yet have their final results. Moreover, the coach noted in one of her log files that they were unsure about what content would be most interesting to share.


When asked how knowledge should be brokered toward their colleagues, most data team members mentioned that this can be done by email or an article in the staff’s newsletter, because everyone has access to those resources. However, the majority also indicated that this should be done through personal communications (e.g., during meetings). It did not become clear why they had not used such activities, but Ms. Price said: “I do not think I am the designated person to take the lead in this.” Mr. Baker was the only member who mentioned active participation as a possible activity, because: “You want them to be involved and support our ideas.”


During the interviews, varying factors were mentioned that hindered their boundary-crossing activities. Amongst those was that team members and their colleagues differed in their knowledge base. Most of their colleagues had little to no knowledge about data use, and team members found it difficult to explain this content to them. Another factor that hindered boundary crossing was that some information was confidential, and team members were very careful that their data could not be used to “blame and shame” their colleagues. Finally, team members mentioned that there was not enough time and there were not enough possibilities to broker their knowledge (e.g., there were no meetings where all of their colleagues were present). A few members also mentioned a facilitating factor, namely, that all colleagues acknowledged the problem with their grade retention rates and that everyone was committed to solving this problem.


LINCOLN


Lincoln’s data team members met 21 times. They studied their grade retention rates for the upper grade levels. They studied four relatively similar hypotheses: whether shortage points6 for either the core subjects (English, Dutch, and Mathematics) or the remaining subjects in either the eighth or ninth grade were associated with grade retention. They collected the appropriate data and made some refinements in order to correct small errors. They conducted Chi-square analyses and concluded for three out of their four hypotheses that shortage points were indeed associated with grade retention. The data team members wanted to delve deeper into this issue and decided to conduct qualitative research and interview students about why they thought they repeated a grade. The team members designed an interview scheme, collected the appropriate data, and developed and applied a coding scheme to their data. They drew several conclusions, such as the idea that students who repeat a grade experienced a “culture shock” when they transferred to the next grade. In their final meetings, they developed actions for improvement, such as scheduling an information session for students, which should decrease the “culture shock.”


The minutes of meetings at Lincoln revealed that data team members discussed the way in which they tried to inform and involve their colleagues during at least four of their meetings. Most of those meetings took place in the second year. The following quote provides an example of how boundary crossing was reflected in their minutes: “Next time, we need to discuss a research request made by one of our colleagues, and the workshop we would like to organize for our colleagues.” During the interviews, most data team members indicated that boundaries should be crossed to ensure their colleagues’ cooperation in the implementation of the actions for improvement so that their educational problem would be solved. Moreover, some members also indicated that the data team procedure should be brought into the school. For example, Ms. Brown said: “Just to show them: this is useful, it is fun, you gain insights and it really helps you. You should do this as well. I would want to share that.” At Lincoln, boundary crossing took place in the following manner (see also Table 4):


Content


During the interviews, the data team members indicated that they should share the procedure, conclusions, and actions for improvement with their colleagues. However, overall, the content shared by the team members was quite vague and abstract. They addressed the three types of content in relatively equal amounts. Content related to the educational problem was addressed to explain the problem on which the team members had focused their efforts (e.g., “We are not satisfied with the grade efficiency rate from the ninth grade to the 11th grade.”), which hypotheses they had posed, what conclusions they had drawn, and what actions for improvement had been designed. Content related to data use as applied to the educational problem was addressed to describe the discrete finalized activities they had undertaken. For example, Ms. Harris explained in one of her reports: “These national tests predict the proper educational level for students, and those results will be compared with the predictive value of the assessments made by the teachers.” How and why this was done remained unclear. The data team members did not provide insight into the iterative process in which they had to go back and forth, for example, when they had to refine their data file after some data appeared to be missing.


The actual data were only included in the appendix of the second-year report. Content related to data use in general was addressed to explain in general terms what concepts such as the data team procedure entail. For example, Ms. Harris described in the staff’s newsletter: “Data-based decision making is an important procedure that will lead to better decisions about actions to improve education and increase learning outcomes.” How and why this is the case was not explained. Furthermore, the data team members did not explain how they had analyzed their data, even though they said during the interviews that this would be important information for their colleagues.


Finally, it appeared that Ms. Harris wrote her last report at the end of the second year; through that report, she was able to cover content from all meetings.


Level


As the aforementioned quotes illustrate, content was almost exclusively addressed at the level of awareness: Colleagues were provided with basic information on the educational problem and data use. However, Ms. Harris explained in her final report what actions for improvement were going to be implemented, and who had to complete what tasks at which points during the implementation process. For example: “Action: counselors and data team members determine the final content for the information session. Action: counselors run the information session. Action: counselors evaluate information session.” Because counselors were told about several actions they needed to take, content related to the educational problem was addressed at the how-to level, even though it was quite nonspecific. This was the only occurrence of the how-to level. Colleagues did not receive any information on the underlying principles regarding the educational problem or data use.


Activities


The team members disseminated three boundary objects to broker their knowledge. During the first-year interviews, the data team members added that they also talked about the educational problem with their colleagues, for example, during their lunch breaks. However, they stopped doing so during their second year. Team members were hesitant to share information, as they did not yet have their final conclusions. Initially, they intended to provide a practical experience (a workshop) for their colleagues. However, this idea was not carried through. The coach noted several times in her log files that Ms. Harris was aware of the importance of communicating their results, but the other data team members were unsure about it. As a result, communicating their findings remained just a thought.


Ms. Harris wrote the boundary objects, which indicates that the team members relied heavily on her capacity to share her knowledge. Even though Ms. Harris explicitly asked for colleagues’ input, no occasion was organized at which they could provide it; colleagues had to take the initiative to send an email with such input.


Opinions about the best ways to broker knowledge differed among team members. Some indicated that this should be done through the use of boundary objects (e.g., emails or an item in the staff’s newsletter), some indicated that it should be done through personal communications, and others indicated that it should be done through a combination of objects and personal communication. For example, Ms. Brown explained about the use of artifacts: “They will not read those papers. It is not that they don’t want to, but you should see how much paper ends up in our mailboxes, and emails are flying around.”


Several factors were mentioned that hindered their knowledge sharing. Amongst those was that team members and their colleagues differed in their knowledge base. Most of their colleagues had little to no knowledge about data use, and team members found it difficult to explain this content to them. Another factor that hindered boundary crossing was that some information was confidential, and team members were very careful that their data not be used to “blame and shame” their colleagues. Finally, team members mentioned that there was not enough time and there were not enough possibilities to broker their knowledge (e.g., there were no meetings where everyone was present). For example, Ms. Baker stated:


You have some things, these are so important that you cannot decide not to do them. And then, when you run out of time, which is not wise, you only do those things [and not broker knowledge related to the data team].


The team members were not able to list factors that promoted their boundary crossing.


OAK GROVE


Oak Grove’s data team members met 23 times. They studied the grade retention rates for an upper grade level. Their first hypothesis was that students’ final test scores from primary school influenced the grade retention rates. Their second hypothesis was that students who repeat a grade have significantly lower scores for the core courses (Dutch, English, Mathematics) than students who do not repeat a grade. Their third hypothesis was that the former have a greater decline in grades for the core courses than the latter. Their fourth hypothesis was that the former have poorer study skills (e.g., text analysis) than the latter. Their final hypothesis was that first-period grades determine who will repeat a grade. For each hypothesis, data were collected and refinements were made. The data were analyzed and the data team members concluded that their second through fourth hypotheses were confirmed. They wanted to design actions for improvement related to text analysis (corresponding with their fourth hypothesis), because this outcome was most concrete. For example, they wanted to regularly administer the questionnaire about study skills that had been used to collect data for that hypothesis, to detect which students should take a course to address these skills.


The minutes of meetings at Oak Grove revealed that data team members discussed the way in which they tried to inform and involve their colleagues during at least 14 of their meetings. Half of those meetings took place in the first year, and the other half in the second year. The following quote provides an example of how boundary crossing was reflected in their minutes: “What do we need to do next year? Our way of working needs to spread towards our colleagues. . . . Also, do we need to share our knowledge with colleagues from other schools within our school board as well?” During the interviews, most data team members indicated that boundaries should be crossed to ensure their colleagues’ cooperation in the implementation of the actions for improvement so that their educational problem would be solved. Moreover, some members also indicated that the data team procedure should be brought into the school. For example, Ms. Davis stated, “Eventually, this should spread throughout our school.” At Oak Grove, boundary crossing took place in the following manner (see also Table 4):


Content


During the interviews, the data team members indicated that they should share the procedure, conclusions, and actions for improvement with their colleagues. In addition, some team members explicitly mentioned that the data analyses should not be discussed in too much depth. As Mr. Bailey said, “Almost no one is interested in that.” Overall, the content shared by the team members was quite concrete and detailed. They brokered their knowledge about all three types of content in relatively equal amounts. When they addressed content related to the educational problem, they shared what problem they were studying, what the extent of that problem was, which hypotheses they had tested, and what the underlying causes were. For example, they wrote down on their sheets: “We are not satisfied with the percentage of students that proceeds to the next grade. This decreased from 83% in 2010 to 73% in 2012.” Regarding data use as applied to the educational problem, they displayed their data and commented on their reliability and validity. In doing so, they described discrete finalized activities, never an iterative process in which team members go back and forth, as when they had to refine their data file after some data appeared to be missing. For example, the team members explained in their staff newsletter: “The data team wants to study new topics, such as students’ motivation and study skills. To do this, we are using an available online questionnaire: the VSV . . . in which students indicate how various aspects influence their education.” When they described content related to data use in general, they presented the current level of data use within their school. They provided graphs and their underlying meaning, for example: “Our teachers believe they are capable of responding to students’ individual learning needs.” They did not explain how they had analyzed their data. 
Finally, it appeared that no boundary-crossing activities were used to cover content related to the 21st through 23rd meetings.


Level


As the aforementioned quotes illustrate, team members from Oak Grove addressed the content exclusively at the level of awareness: Colleagues were informed about what the data team members had done, but colleagues did not receive any guidelines for acting upon the educational problem or using data themselves. In addition, they did not receive any information on the underlying principles regarding the educational problem or data use.


Activities


The team members used dissemination of boundary objects (two articles in the newsletter and a flipchart), and personal communication with and without boundary objects to broker their knowledge. Overall, Ms. Perry was the only team member who did not share her knowledge. She explained this by saying, “I am a school leader, and it can be perceived as threatening when I share these data.” This indicates that the rest of the members were more likely to be “visible” to their colleagues in their role as a data team member. During the interviews, team members added that they had talked to their colleagues every now and then, for example, during their lunch break or during a staff meeting (personal communication without a boundary object). However, they only did so when their colleagues asked them questions. There was an explicit opportunity for colleagues to ask such questions at least once. In addition, colleagues could write possible causes for the educational problem on a flipchart, and the data team members used this input to formulate their first hypothesis.  In the coach’s log file about the 11th meeting, she noted that the team members were struggling to determine what they could present to their colleagues, as they did not yet have their final conclusions, and how they could communicate their results without “blaming and shaming” their colleagues.


All data team members agreed that knowledge should be brokered through personal communications (e.g., during meetings). Some mentioned that boundary objects (e.g., an item in the newsletter, an email, or a flipchart) could be a helpful addition to those communications. As Ms. Davis said, “You just know that brokering your knowledge just once is not enough.” None of the team members mentioned providing a practical experience as a possible activity.


Some data team members indicated that there were no factors at play that hindered their boundary-crossing activities. Others mentioned that some information was confidential, and team members were very careful that their data not be used to “blame and shame” their colleagues. This hindered their boundary-crossing activities. Another factor that hindered them in brokering their knowledge was that there was a difference in knowledge base between team members and their colleagues. Most of their colleagues had little to no knowledge about data use, and team members found it difficult to explain this content to them. Some data team members indicated that there were no factors at play that promoted their knowledge brokerage. Others mentioned that all team members were quite well liked and respected by their colleagues. Because of that, colleagues believed them to be credible sources of information, which facilitated their boundary-crossing activities.


CROSS-CASE ANALYSIS


The process of boundary crossing was studied for four data teams: Fairview, Jefferson, Lincoln, and Oak Grove. The results of these within-case analyses were compared and contrasted with one another, as presented in Table 4.


Content


It appeared that content about the educational problem and data use as applied to the educational problem were brokered most often, whereas knowledge about data use in general was hardly ever brokered. When the data team members brokered content about the educational problem, they explained what problem and hypotheses they were studying, and what the underlying causes were. Surprisingly, the actions for improvement were not brokered in three out of four cases, as no boundary-crossing activities were used to cover content related to data team members’ final meetings. This is incongruent with their statements that boundaries should be crossed in order to ensure their colleagues’ cooperation in the implementation of the actions for improvement.


When data team members brokered knowledge about data use related to the educational problem, they displayed their data and explained the activities they had undertaken as discrete finalized events (e.g., collecting and analyzing data), never as an iterative process (e.g., collecting additional data after some appeared to be missing). When knowledge about data use in general was brokered, team members described the data team procedure in general terms. None of them explained how their data had been analyzed, but only explained what conclusions they had drawn. This is in agreement with the interview data, as most team members stated that one should not go into too much detail regarding the analysis. However, it is in direct contradiction with several data team members’ opinion that boundaries should be crossed in order for the data team procedure to spread throughout the school.


Level


Knowledge was almost exclusively addressed at the level of awareness. This means that colleagues were informed about what the data team members had done, but colleagues did not receive any guidelines for acting upon the educational problem or using data themselves. This contradicts team members’ opinions on why knowledge should be brokered; namely, in order to ensure colleagues’ participation in implementing the actions for improvement and the data team procedure itself. The only exception to the lack of using the how-to level was when Ms. Harris from Lincoln addressed this level by explaining how her colleagues should take part in implementing the actions for improvement. Knowledge was never brokered at the principles level.


Table 4. Summary of How Data Team Members Acted as Boundary Crossers

[39_22446.htm_g/00004.jpg]


Activities


Team members used several boundary-crossing activities to broker their knowledge to their colleagues. Members from Fairview and Oak Grove used the greatest diversity of activities, by using different boundary objects and personal communication with and without boundary objects. Jefferson and Lincoln’s members mainly used dissemination of boundary objects, and to a smaller degree also personal communication without boundary objects (conversations during lunch breaks). Interestingly, most team members indicated during the interviews that knowledge should not (exclusively) be brokered through boundary objects. Personal communication is, according to the team members, always required. These statements were even made by members from Jefferson and Lincoln, who almost exclusively relied on such boundary objects. No team member provided practical experience to cross boundaries. This strategy was also hardly ever mentioned during the interviews.


It appeared that several factors were at play that hindered data team members’ use of boundary-crossing activities. There was a great variation in the factors that were mentioned, but the difference in knowledge base between team members and their colleagues (most of whom had little to no knowledge about the matter) and issues of confidentiality are among the most often cited ones. Some data team members also mentioned factors that promoted knowledge brokerage (e.g., they were a credible source of information in the eyes of their colleagues, and team members experienced a commitment of their colleagues to solve the educational problem), but these differed from member to member.


Finally, it appeared that in all cases but Oak Grove, team members relied heavily on one of their members to act as the boundary crosser. Members from Fairview and Oak Grove were the only ones who gave their colleagues explicit opportunities to provide the team with input, for example, by using flipcharts or using presentation slides eliciting questions.


CONCLUSIONS AND DISCUSSION


The present study used a process view to determine how data team members acted as boundary crossers in order to build collective capacity among their colleagues regarding data use and addressing an educational problem.


Content


Regarding the content, it appeared that data team members mainly shared knowledge about the educational problem and about data use as applied to the educational problem. Content about data use in general was rarely shared. Regarding content about the educational problem, team members explained what problem and hypotheses they were studying, and what the underlying causes were. Interestingly, the planned actions for improvement were hardly ever shared, even though the majority of data team members believed they should broker their knowledge in order to ensure their colleagues’ cooperation in the implementation of the actions for improvement.


Regarding data use as applied to the educational problem, team members displayed their data and explained the data-related activities they had undertaken as discrete, finalized activities. Previous research among Fairview’s and Lincoln’s data team members indicated that they had learned especially from those instances where they had made mistakes and had to go back to a certain step, for example, to refine their hypothesis or remove errors from the data file (Hubers et al., 2016). They might not have shared these iterations because they were unaware of the importance of sharing such insights and/or because it is difficult to admit to the mistakes they made, even though these mistakes were helpful for their learning process.


Regarding data use in general, team members did not describe how they had analyzed their data, but only explained the conclusions they had drawn. The absence of content about data use in general is not surprising, and matches our previous work that showed how data team members struggled to build on their relationships with their colleagues (Hubers, Moolenaar, Schildkamp, Daly, Handelzalts, & Pieters, 2018). The reason for this is that knowledge that needs to be brokered has to have certain characteristics (Carlile, 2004). One of those is dependence, which requires that differences in knowledge have consequences. This is not the case for data use in general: Team members can use data regardless of what their colleagues are doing, which means that it has less added value for them to share this content with their colleagues. Moreover, data team members might have used data only as a tool to solve their problem and not as content that is worth learning about in and of itself, given the current accountability context in which data are often used to explain or defend certain actions or decisions (Datnow & Hubbard, 2015). Furthermore, the data team members’ capacity to establish common ground as a starting point for knowledge brokerage is likely to be an important issue here (Carlile, 2004). This is not likely to be problematic for knowledge about the educational problem and data use as applied to that problem, as both types of content are directly relevant for the everyday practice of data team members and their colleagues. In contrast, team members are likely to have more expertise related to data use in general than their colleagues, but are still likely to be struggling to master this content. This makes it challenging to establish common ground for sharing their knowledge with their colleagues. Indeed, several data team members mentioned this challenge as a factor that hindered knowledge brokerage.  


Level


It appeared that the knowledge content was almost exclusively addressed at the level of awareness. This result agrees with Rogers (2003), who stated that most efforts are usually concentrated on diffusing knowledge at this level. This practice is likely to have implications for the extent to which collective capacity is built: Colleagues are likely to be aware that an educational problem is being studied and that data are being used to do so, but neither do they know how to act upon this, nor are they likely to understand what the underlying principles are. Furthermore, previous research about teacher learning has illustrated that teachers often make superficial changes in their behavior (e.g., Coburn, 2004; Spillane, 2000). It could be that this risk of making superficial changes is especially high when knowledge is not addressed at the how-to and principles levels, as colleagues then lack the information necessary for properly using new practices (Rogers, 2003).


Activities


All four data teams used dissemination of boundary objects and personal communication without boundary objects to broker their knowledge, even though the majority of team members believed objects alone were not likely to be effective. Some team members also used personal communication with a boundary object. Providing a practical experience was never used, whereas Wenger (1998) indicated that this is likely to be the most fruitful way to cross boundaries, as colleagues can gain as much experience with the data team and its procedure as possible, without needing to fully participate in the team. When teams rely heavily on boundary objects alone, it is particularly unlikely that boundaries will be successfully crossed. One reason for this is that they cannot know for sure whether colleagues understand the objects as intended (Wenger, 1998). Such ambiguities cannot be addressed in objects. This also came across in the extent to which colleagues were provided with the opportunity to provide feedback and ask questions, as such opportunities were only explicitly provided when boundary objects were accompanied by personal communication.


Furthermore, it appeared that team members often relied on one or two individuals to act as boundary crossers, which results in an unstable system with an overdependence on those individuals’ capacity to share their knowledge. It is not necessary that all team members interact with all of their colleagues, nor that everything that is being done by individual team members is accounted for by the entire team. However, the less this is the case, the more unlikely it is that the data team members establish an actual community of practice with unitary aims and behaviors (Wenger, 1998).


BOUNDARY CROSSING


Previous research on the process of boundary crossing is scarce, and insight into the role of brokers who cross these boundaries and the objects they use is lacking (Akkerman & Bakker, 2011; Bakker & Akkerman, 2014; Kubiak, 2009), especially in the context of data use (Jimerson & Wayman, 2015). The present study matched our earlier findings that illustrated from a social network perspective that data team members have great difficulty sharing their knowledge (Hubers, Schildkamp, Poortman, & Pieters, 2017). The present study gained in-depth insight into the process of knowledge sharing. To gain this insight, a framework was used that not only specified which types of knowledge content could be brokered, but also incorporated both the level at which that content was addressed and what activities were used to broker knowledge. This approach resulted in a more comprehensive framework for boundary crossing. Applying this framework illustrated that determining how boundaries are being crossed can be a key component in understanding the extent to which school-wide capacity is built. Furthermore, it illustrated that the concept of boundary crossing encompasses much more than “just telling something” to your colleagues.


PRACTICAL IMPLICATIONS


The results of the present study indicate that data team members brokered their knowledge only to a limited extent, and in ways that are less likely to be effective. This agrees with previous research by Jimerson and Wayman (2015), who indicated that knowledge around data use was shared to a limited extent and remained within a small group of educators. Even though the present study determined the way in which boundaries were crossed and not teams’ effectiveness, their low levels of knowledge brokerage are likely to cause difficulty in building collective capacity at the school-wide level. This difficulty places them at risk for not meeting their school improvement goals of using data and solving the educational problem. Unfortunately, another study confirmed this risk (Hubers, Schildkamp, Poortman, & Pieters, 2017).


Even though the importance of building collective capacity is described in the data team manual (Schildkamp et al., 2014) and the coach refers to that section, it appears that data team members need additional support in order to act as boundary crossers. This support should be more explicitly included in the manual, but can also be addressed in a workshop. Moreover, the coach could pay more attention to capacity building. The coach can train the data team members in using data to solve a specific educational problem and in how to act as boundary crossers, to ensure the sustainability of both data use and the improvement measures in the school.


An improved version of the manual could include a general text about the importance of boundary crossing and, subsequently, explicitly address it for each step of the data team intervention. In addition, it is important to address the coherence among the different aspects of boundary crossing. This means that the type of content, the level at which that content is addressed, and the activities one could undertake to cross boundaries should be targeted simultaneously, as this approach is likely to facilitate the deepest knowledge about boundary crossing. For example, team members should be supported in selecting the appropriate content about data use in general; translating that content to the awareness, how-to, and principles levels of knowledge; and using it when designing boundary-crossing activities, especially practical experiences such as workshops. In addition, it might be helpful if data team members explicitly mention to their colleagues that they, in turn, have to “spread the word.” Moreover, it might be beneficial if data team members not only instruct their colleagues about data use (facilitating vertical expertise), but also actively involve them in knowledge co-creation activities (horizontal expertise), as this is likely to result in bigger changes in their colleagues’ teaching practice (Marsh et al., 2015). Furthermore, the importance of involving all team members in boundary-crossing activities should be addressed, and it may be highly productive to tap into their social ties (Farley-Ripple & Buttram, 2015). In addition, colleagues should be given explicit opportunities to respond, provide input, and ask questions. This information can be used to evaluate the boundary-crossing activities and to make sure that any ambiguities are addressed in the near future. This process ensures that all colleagues understand data use and the implementation of actions related to the improvement plan as intended by the data team members.
However, it is not only their colleagues who benefit when data team members broker their knowledge. Team members are likely to benefit from these activities as well, as doing so forces them to think about and discuss what they have learned so far.


Finally, these practical implications might have a wider scope than just the data use context, as previous research has illustrated that one of the biggest challenges of professional learning communities is involving other colleagues and sustaining teachers’ professional development (e.g., Harris & Jones, 2010; Van Veen, Zwart, Meirink, & Verloop, 2010). It might be that educators in other types of bottom-up and small-scale professional development programs go through a similar type of struggle in acting as boundary crossers. In that case, our results can be a starting point in providing wider support for boundary-crossing activities in professional learning communities.


LIMITATIONS


The present study had a few limitations. First, we did not interview data team members’ colleagues. Consequently, we do not know how they received the boundary-crossing activities. Even though our goal was to elaborate on the process of knowledge sharing, the perceived effectiveness of this process would be an interesting avenue for future research. Second, we were not present during team members’ presentations. Therefore, we do not know exactly what additional information was provided besides the content that was written down on the presentation slides. However, the interview data and one of the handouts from a presentation confirmed our initial findings. Third, we did not gain insight into the reasons why data team members shaped their boundary-crossing activities in the way they did. Thus, it is unclear whether data team members chose these contents and activities because they did not know what else to do, or whether there were certain factors hindering their desired strategies. However, it is evident that data team members need additional support to optimize their boundary-crossing activities.


FUTURE RESEARCH


Future research should continue to study the process of boundary crossing, both from the front end and the back end. The former could be aimed at gaining insight into how one can educate team members to act as boundary crossers and how support can best be provided. The latter can be aimed at gaining insight into the extent to which, and the way in which, actual capacity is built. For example, Rogers (2003) stated that people can be organized into five different groups based on their willingness to engage with innovations, such as “innovators” and “late majority.” It might be that those groups of people respond differently to certain content and/or boundary-crossing activities, which would have important consequences for the ways in which boundaries need to be crossed. For example, it might be that different groups of educators need to be convinced and educated through different kinds of brokerage activities.


Even though the present study gained some insight into the factors that influence knowledge sharing (e.g., fear of “blaming and shaming” and differences in knowledge base between team members and colleagues), additional efforts are needed to understand what factors matter most. Examples include how team members negotiate meaning (Beers, Boshuizen, Kirschner, & Gijselaers, 2006; Wenger, 1998), and the role of leadership and organizational culture (Wang & Noe, 2010). These inquiries could improve our insights into the mechanisms and drivers of the variable actions of team members related to boundary crossing. In addition, efforts are needed to relate the process of boundary crossing to its outcomes. For example, boundary crossing could be related to the sustainability of the data team intervention (see Hubers et al., 2017).


Even though several questions remain to be addressed, studying the process of boundary crossing can be considered a fruitful effort to understand the dynamic between an intervention and the resulting on-the-ground responses and actions (Coburn & Turner, 2011; Marsh, 2012). The present study illustrated that boundary crossing cannot be taken for granted: Team members have difficulty selecting the appropriate content, the level at which they address such content, and the activities they use to act as boundary crossers. When they receive additional support for this selection process, they are likely to increase their team’s effectiveness in building school-wide capacity for both data use and the implementation of actions related to the improvement plan.


Notes:


1. This is the percentage of students who complete their grade level within the normal time period, which is an indicator of educational quality. For example, if students enter the highest educational track and stay on that track until graduation without any delay, the rate is optimal. However, when students repeat a grade or transfer to a lower educational track, the rate drops.


2. School and participant names are pseudonyms.


3. In the Netherlands, there are three main educational tracks. The lowest track (four years, “vmbo”) prepares students for vocational education. The intermediate track (five years, “havo”) prepares students for college/higher education. The highest track (six years, “vwo”) prepares students for university education. These tracks are quite segregated, which means that most students who enter a track stay within that track until graduation.


4. Ms. Russell was never interviewed due to her leave of absence.


5. This placement advice is used in determining students’ educational track in secondary education.


6. Students receive grades from 1 to 10; a 5 is equal to one shortage point, a 4 to two, etc. Thus, these shortage points indicate how far below passing students’ grades are. The inverse of shortage points is excess points.


Acknowledgement


The authors are grateful to the Dutch Ministry of Education, Culture & Science for funding the data team project. Although government-funded, the present study was independently designed and conducted.


References


Akkerman, S. F., & Bakker, A. (2011). Boundary crossing and boundary objects. Review of Educational Research, 81(2), 132–169. doi:10.3102/0034654311404435


Atteberry, A., & Bryk, A. S. (2010). Centrality, connection, and commitment: The role of social networks in a school-based literacy initiative. In A. J. Daly (Ed.), Social network theory and educational change (pp. 51–76). Cambridge, MA: Harvard University Press.


Bakker, A., & Akkerman, S. F. (2014). Leren door boundary crossing tussen school en werk [Learning by boundary crossing between school and work]. Pedagogische Studiën, 91(1), 8–23.


Beers, P. J., Boshuizen, H. P. A., Kirschner, P. A., & Gijselaers, W. H. (2006). Common ground, complex problems and decision making. Group Decision and Negotiation, 15(6), 529–556. doi:10.1007/s10726-006-9030-1


Carlile, P. R. (2002). A pragmatic view of knowledge and boundaries: Boundary objects in new product development. Organization Science, 13(4), 442–455. doi:10.1287/orsc.13.4.442.2953


Carlson, D., Borman, G., & Robinson, M. (2011). A multistate district-level cluster randomized trial of the impact of data-driven reform on reading and mathematics achievement. Educational Evaluation and Policy Analysis, 33(3), 378–398. doi:10.3102/0162373711412765


Coburn, C. E. (2004). Beyond decoupling: Rethinking the relationship between the institutional environment and the classroom. Sociology of Education, 77(3), 211–244. doi:10.1177/003804070407700302


Coburn, C. E., & Talbert, J. E. (2006). Conceptions of evidence use in school districts: Mapping the terrain. American Journal of Education, 112, 469–495.


Coburn, C. E., & Turner, E. O. (2011). Research on data use: A framework and analysis. Measurement: Interdisciplinary Research and Perspectives, 9(4), 173–206. doi:10.1080/15366367.2011.626729


Crossan, M. M., & Apaydin, M. (2010). A multi-dimensional framework of organizational innovation: A systematic review of the literature. Journal of Management Studies, 47(6), 1154–1191. doi:10.1111/j.1467-6486.2009.00880.x


Datnow, A., & Hubbard, L. (2015). Teachers’ use of assessment data to inform instruction: Lessons from the past and prospects for the future. Teachers College Record, 117(4), 1–26.


Datnow, A., Park, V., & Kennedy-Lewis, B. (2013). Affordances and constraints in the context of teacher collaboration for the purpose of data use. Journal of Educational Administration, 51(3), 341–362. doi:10.1108/09578231311311500


Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-performing school systems use data to improve instruction for elementary students. Los Angeles, CA: Center on Educational Governance, University of Southern California.


Desimone, L. (2002). How can comprehensive school reform models be successfully implemented? Review of Educational Research, 72(3), 433–479. doi:10.3102/00346543072003433


Dutch Inspectorate of Education. (2013). De staat van het onderwijs [The state of affairs in education]. Utrecht, Netherlands: Inspectie van het Onderwijs.


Earl, L., & Katz, S. (2006). Leading schools in a data-rich world: Harnessing data for school improvement. Thousand Oaks, CA: Corwin Press.


Eggen, T. J. H. M., & Sanders, P. F. (1993). Psychometrie in de praktijk [Psychometrics in practice]. Arnhem, Netherlands: CITO.


Farley-Ripple, E., & Buttram, J. (2015). The development of capacity for data use: The role of teacher networks in an elementary school. Teachers College Record, 117(4), 1–25.


Hargadon, A. B. (2002). Brokering knowledge: Linking learning and innovation. Research in Organizational Behavior, 24, 41–85. doi:10.1016/S0191-3085(02)24003-4


Harris, A., & Jones, M. (2010). Professional learning communities and system improvement. Improving Schools, 13(2), 172–181.


Honig, M. I. (2006). Street-level bureaucracy revisited: Frontline district central-office administrators as boundary spanners in education policy implementation. Educational Evaluation and Policy Analysis, 28(4), 357–383. doi:10.3102/01623737028004357


Hopkins, D. (2001). School improvement for real. London, UK: RoutledgeFalmer.


Hubers, M. D., Moolenaar, N. M., Schildkamp, K., Daly, A. J., Handelzalts, A., & Pieters, J. M. (2018). Share and succeed: The development of knowledge sharing and brokerage in data teams’ network structures. Research Papers in Education, 33(2), 216–238. doi:10.1080/02671522.2017.1286682


Hubers, M. D., Poortman, C. L., Schildkamp, K., Pieters, J. M., & Handelzalts, A. (2016). Opening the black box: Knowledge creation in schools with a data team. Journal of Professional Capital and Community, 1(1), 41–68. doi:10.1108/JPCC-07-2015-0003


Hubers, M. D., Schildkamp, K., Poortman, C. L., & Pieters, J. M. (2017). The quest for sustained data use: Developing organizational routines. Teaching and Teacher Education, 67, 509–521. doi:10.1016/j.tate.2017.07.007


Huffman, D., & Kalnin, J. (2003). Collaborative inquiry to make data-based decisions in schools. Teaching and Teacher Education, 19(6), 569–580.


Jimerson, J. B., & Wayman, J. C. (2015). Professional learning for using data: Examining teacher needs and supports. Teachers College Record, 117(4), 1–36.


Kubiak, C. (2009). Working the interface: Brokerage and learning networks. Educational Management Administration & Leadership, 37(2), 239–256. doi:10.1177/1741143208100300


Kuiper, W., Van den Akker, J., Hooghoff, H., & Letschert, J. F. M. (2006). Curriculum policy and school practice in a European comparative perspective. In J. F. M. Letschert (Ed.), Curriculum development re-invented: Proceedings of the invitational conference on the occasion of the 30 years SLO 1975–2005 (pp. 56–77). Enschede, Netherlands: SLO.


Kurland, H., Peretz, H., & Hertz-Lazarowitz, R. (2010). Leadership style and organizational learning: The mediate effect of school vision. Journal of Educational Administration, 48(1), 7–30. doi:10.1108/09578231011015395


Lachat, M. A., & Smith, S. (2005). Practices that support data use in urban high schools. Journal of Education for Students Placed at Risk, 10(3), 333–349. doi:10.1207/s15327671espr1003_7


Lai, M. K., & Schildkamp, K. (2013). Data-based decision making: An overview. In K. Schildkamp, M. K. Lai, & L. Earl (Eds.), Data-based decision making in education: Challenges and opportunities (pp. 9–21). Dordrecht, Netherlands: Springer.


Marsh, J. A. (2012). Interventions promoting educators’ use of data: Research insights and gaps. Teachers College Record, 114(11), 1–48.


Marsh, J. A., Bertrand, M., & Huguet, A. (2015). Using data to alter instructional practice: The mediating role of coaches and professional learning communities. Teachers College Record, 117(4), 1–40.


Ministry of Education, Culture and Science. (2000). Wet op het Onderwijstoezicht [Education supervision act]. Den Haag, Netherlands: SDU.


Penuel, W. R., Fishman, B. J., Cheng, B. H., & Sabelli, N. (2011). Organizing research and development at the intersection of learning, implementation, and design. Educational Researcher, 40(7), 331–337. doi:10.3102/0013189X11421826


Poortman, C. L., & Schildkamp, K. (2012). Alternative quality standards in qualitative research? Quality and Quantity, 46(6), 1727–1751. doi:10.1007/s11135-011-9555-5


Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.


Schildkamp, K., & Ehren, M. (2013). The Netherlands: From “intuition”- to “data”-driven decision making in Dutch secondary schools? In K. Schildkamp, M. K. Lai, & L. Earl (Eds.), Data-based decision making in education: Challenges and opportunities (pp. 49–68). Dordrecht, Netherlands: Springer.


Schildkamp, K., Handelzalts, A., Poortman, C., Leusink, H., Meerdink, M., Smit, M., . . . Hubers, M. (2014). De datateam methode: Een concrete aanpak voor onderwijsverbetering [The data team procedure: A concrete approach for educational improvement]. Antwerpen, Belgium: Garant Publishers.


Schildkamp, K., & Kuiper, W. (2010). Data-informed curriculum reform: Which data, what purposes, and promoting and hindering factors. Teaching and Teacher Education, 26(3), 482–496. doi:10.1016/j.tate.2009.06.007


Schildkamp, K., & Poortman, C. L. (2015). Factors influencing the functioning of data teams. Teachers College Record, 117(4), 1–30.


Schildkamp, K., Poortman, C. L., & Handelzalts, A. (2015). Data teams for school improvement. School Effectiveness and School Improvement. Advance online publication. doi:10.1080/09243453.2015.1056192


Schildkamp, K., & Teddlie, C. (2008). School performance feedback systems in the USA and in the Netherlands: A comparison. Educational Research and Evaluation, 14(3), 255–282. doi:10.1080/13803610802048874


Spillane, J. P. (2000). Cognition and policy implementation: District policymakers and the reform of mathematics education. Cognition and Instruction, 18(2), 141–179. doi:10.1207/S1532690XCI1802_01


Star, S. L., & Griesemer, J. R. (1989). Institutional ecology, translations and boundary objects: Amateurs and professionals in Berkeley’s Museum of Vertebrate Zoology. Social Studies of Science, 19(3), 387–420. doi:10.1177/030631289019003001


Stein, M. K., & Coburn, C. E. (2008). Architectures for learning: A comparative analysis of two urban school districts. American Journal of Education, 114(4), 583–626. doi:10.1086/589315


Van Veen, K., Zwart, R., Meirink, J., & Verloop, N. (2010). Professionele ontwikkeling van leraren. Een reviewstudie naar effectieve kenmerken van professionaliseringsinterventies van leraren [Teachers’ professional development. A review of effective characteristics of teachers’ professional development interventions]. Leiden, Netherlands: ICLON/Expertisecentrum Leren van Docenten.


Verbeek, C., & Odenthal, L. (2014). Opbrengstgericht werken en onderzoeksmatig leiderschap in PO en VO [DBDM and research leadership in primary and secondary education]. In M. Krüger (Ed.), Leidinggeven aan onderzoekende scholen [Leading researching schools] (pp. 67–78). Bussum, Netherlands: Coutinho.


Visscher, A. J. (2002). A framework for studying school performance feedback systems. In A. J. Visscher & R. Coe (Eds.), School improvement through performance feedback (pp. 41–72). Lisse, Netherlands: Swets & Zeitlinger B.V.


Visscher, A., & Ehren, M. (2011). De eenvoud en complexiteit van opbrengstgericht werken. Analyse in opdracht van de kenniskamer van het ministerie van onderwijs, cultuur en wetenschap [The simplicity and complexity of data-based decision making]. Retrieved July 13, 2012, from http://www.rijksoverheid.nl/documenten-en-publicaties/rapporten/2011/07/13/de-eenvoud-en-complexiteit-van-opbrengstgericht-werken.html


Wang, S., & Noe, R. A. (2010). Knowledge sharing: A review and directions for future research. Human Resource Management Review, 20, 115–131. doi:10.1016/j.hrmr.2009.10.001


Wayman, J. C., Midgley, S., & Stringfield, S. (2006). Leadership for data-based decision making: Collaborative educator teams. In A. Danzig, K. Borman, B. Jones, & B. Wright (Eds.), Learner-centered leadership: Research, policy and practice (pp. 189–205). Mahwah, NJ: Lawrence Erlbaum.


Wenger, E. (1998). Communities of practice: Learning, meaning and identity. Cambridge, UK: Cambridge University Press.


Wenger, E., McDermott, R., & Snyder, W. (2002). Cultivating communities of practice. Boston, MA: Harvard Business School Press.


Wong, K. K., & Anagnostopoulos, D. (1998). Can integrated governance reconstruct teaching? Lessons learned from two low-performing Chicago high schools. Educational Policy, 12, 31–47. doi:10.1177/0895904898012001003


Yin, R. K. (2009). Case study research: Design and methods (4th ed.). Thousand Oaks, CA: SAGE.





Cite This Article as: Teachers College Record, Volume 121, Number 1, 2019, pp. 1–45
https://www.tcrecord.org ID Number: 22446, Date Accessed: 9/22/2021 10:05:03 AM


About the Author
  • Mireille Hubers
    University of Twente
    MIREILLE D. HUBERS, PhD, is an Assistant Professor at the University of Twente, Department of Educational Science, in the Netherlands. Her main research interests concern the antecedents and the process of sustained school improvement and sustained teachers’ professional development. She recently coauthored “Share and Succeed: The Development of Knowledge Sharing and Brokerage in Data Teams’ Network Structures” in Research Papers in Education, and “The Quest for Sustained Data Use: Developing Organizational Routines” in Teaching and Teacher Education.
  • Cindy Poortman
    University of Twente
    CINDY L. POORTMAN, PhD, is an Assistant Professor at the University of Twente, Institute for Teacher Training and Professional Development, in the Netherlands. Her main research interest concerns teachers’ professional development in teams, such as data teams and teacher design teams. She has coauthored “Understanding Teacher Design Teams: A Mixed Methods Approach to Developing a Descriptive Framework” in Teaching and Teacher Education, and “Factors Influencing the Functioning of Data Teams” in Teachers College Record.
  • Kim Schildkamp
    University of Twente
    KIM SCHILDKAMP, PhD, is an Associate Professor at the University of Twente, Institute for Teacher Training and Professional Development, the Netherlands. Her research interests include data-based decision making, formative assessment, and professional development. She has published widely in journals and edited books, and received grants, scholarships, and awards for her work on data-based decision making. She is the developer of the data team procedure, which has been implemented in schools in the Netherlands, Sweden, and England. Among her recent publications is Data-based Decision Making in Education: Challenges and Opportunities (Springer), coauthored with M. K. Lai and L. Earl.
  • Jules Pieters
    University of Twente
    JULES M. PIETERS, PhD, is a Professor Emeritus of Applied Psychology at the University of Twente. He was involved in research projects on inquiry and collaboration in teacher professional development, on co-designing of curriculum materials and learning environments by teacher design teams, and on knowledge dissemination. He has coauthored “Engaging Students: The Role of Teacher Beliefs and Interpersonal Behavior in Fostering Student Engagement in Vocational Education” in Teaching and Teacher Education.
 