
Data Use Practices for Improved Mathematics Teaching and Learning: The Importance of Productive Dissonance and Recurring Feedback Cycles

by Jolley Bruce Christman, Caroline B. Ebby & Kimberly A. Edmunds - 2016

Background: A growing number of studies argue that data use practices in schools have not sufficiently attended to teachers’ learning about students, subject matter, and instruction. The result has been changes in instructional management (e.g., student grouping, assignment of students to tutoring) rather than instructional improvement. Further, there is a paucity of research on how teachers make sense of data and their ensuing instructional actions.

Purpose: We report findings from qualitative research on an intervention designed to put teacher learning about mathematics instruction center stage in data use practices. The research sought to understand what happened as teachers made sense of data in their professional learning communities (PLCs), what changes they made in their mathematics instruction, and why they made the changes.

Research Design: The theoretical foundation for the research is situative theory, which conceptualizes teacher growth as “a process of increasing participation in the practice of teaching, and through this participation a process of becoming knowledgeable in and about teaching.” A case study approach was chosen to illuminate the complex interrelationships among intervention components and their influence on teachers: (1) between individual teacher sensemaking about data and collective sensemaking in PLCs and (2) between sensemaking and instructional changes. Additionally, case study methodology facilitates theory building grounded directly in data by providing nuanced accounts of the phenomena under study that uncover concepts and coherently relate them to one another. Teacher interpretation of data is ripe for theory building.

Findings: The case study of Ms. Walker illustrates in rich detail the developmental nature of her growth and the important roles of dissonance, collegial discussion, and productive dissonance in that process. Due to considerable progress in both her questioning strategies and her ability to build on student thinking to focus on important mathematical ideas, Ms. Walker was able to move beyond surface instructional adjustments to demonstrate substantial instructional improvement.

Conclusion/Recommendations: We argue that a fuller understanding of how teachers experience dissonance, and the supports necessary to make that dissonance productive, can enrich the design and implementation of data use practices. The research also offers an example of the contribution that microprocess studies can make to research on data use practices. We encourage researchers to attend carefully to teacher sensemaking and interrogate the concepts of dissonance and productive dissonance in future theory building about data use practices.

The Common Core State Standards in Mathematics present an ambitious vision for student learning that challenges educators across the country to make good on reforms that have shown promise for strengthening teaching practice and raising student achievement. One of the most prominent and widespread of these reforms is the use of data to inform instruction (Boudett, City, & Murnane, 2005). However, a growing number of studies point to the limitations of data use as it is currently practiced in schools (e.g., Christman et al., 2009; Datnow & Hubbard, 2015; Faria et al., 2012; Farley-Ripple & Buttram, 2014; Horn, Kane & Wilson, 2015; Olah, Lawrence, & Riggan, 2010; Shepard, 2009). One claim emerging from this research is that data use has not sufficiently attended to teachers’ learning about their students, their subject matter, or their instruction (Christman et al., 2009; Olah et al., 2010).

In this paper, we report findings from the qualitative component of a design experiment that aimed to put teachers’ professional learning center stage in discussions of data in professional learning communities (PLCs) (Supovitz, 2013). In design experiments, a research team designs, implements, and studies an innovation in an effort to improve the intervention and document its impact (Brown, 1992; Cobb, Confrey, diSessa, Lehrer, & Schauble, 2003). The innovation discussed here, called the Linking intervention, was designed to strengthen elementary teachers’ understanding of students’ mathematical thinking and help them to develop responsive instructional strategies that pressed students to engage deeply with key mathematical concepts. The multimethod research design of the larger study included an experimental component as well as qualitative research. The results of the experiment, reported elsewhere (Supovitz, 2013), indicated large effects—on the order of about one-third to one-half standard deviation in magnitude—on teachers’ subsequent instructional practice based on judgments of external raters. There were also small, but statistically significant, effects on student learning.

The purpose of the case study research reported here is to illuminate how the Linking intervention worked to effect these improvements in teachers’ practice. An important goal was to shed light on teachers’ individual and collective sensemaking because of the paucity of observational research on how teachers analyze data and the ensuing actions that they take in their classrooms (Coburn & Turner, 2012; Little, 2012). An implicit assumption underlying the practice of having teachers examine data is that teachers will reconsider what they know about their students and what they think works in their teaching. This reconsideration of practice can only occur if teachers perceive some type of inconsistency or dissonance between their beliefs and actions, variously called “disequilibrium” (Jones & Nimmo, 1999), “cognitive conflict” (Cobb, Wood, & Yackel, 1990), “teacher efficacy doubts” (Wheatley, 2002), or a “stance of deliberative uncertainty” (Ball & Cohen, 1999) in the research literature. The case of Ms. Walker presented here makes explicit her sensemaking processes and elaborates what dissonance looks and feels like. It illustrates the importance of moving through dissonance to productive dissonance in order for instructional change to occur.

From a cognitive perspective, dissonance—the state of experiencing contradictions between two beliefs or between currently held beliefs and new information—can be an essential catalyst for learning. However, when individuals confront information that challenges existing beliefs, they may also ignore or dismiss the information as not being valid instead of reflecting on and reconsidering their current beliefs in light of the new information. In order for dissonance to lead to learning and professional growth, it must be taken up in ways that lead to improvements in instructional practice rather than dismissed or absorbed into current beliefs. In our work, we refer to productive dissonance as dissonance that is taken up by teachers in ways that lead to the enactment of revised beliefs and practices in the classroom. We argue that a fuller understanding of teachers’ interpretive processes as they examine data, including how they experience dissonance and the supports necessary to make that dissonance productive, can enrich the design and implementation of data use practices in schools. We also encourage researchers to attend carefully to teacher sensemaking and interrogate the concepts of dissonance and productive dissonance in future theory building.

Below, we describe the context of the study and show how the intervention design features drew upon research on teacher learning to address the limitations in current data use models. We then present the case of Ms. Walker to highlight the interplay between interpretive processes and programmatic features in transforming data use activities into opportunities for teacher growth and learning.



The Linking intervention emerged from a university–district collaboration. The intervention was designed to support two important district initiatives: (1) district-wide implementation of professional learning communities (PLCs) in which teachers would examine student data to inform instruction and (2) improvement of elementary mathematics instruction. Our research team, consisting of researchers from the Consortium for Policy Research in Education at the University of Pennsylvania and Research for Action, an independent research organization, worked with leaders in a large suburban school district in the northeast. The district had 19 schools, including 12 elementary schools, and served approximately 11,000 predominantly middle to upper-middle class students. Approximately 81% of students were white, 9% Asian, 6% African American, and 4% Hispanic or Latino.

In the 2 years prior to the implementation of the Linking intervention, the district had invested in PLC training, providing teachers with multiple days of professional development on the DuFour model (DuFour, 2004; DuFour, DuFour, & Eaker, 2008). The district provided teachers at each grade level with common time—about 45 minutes—each week to hold PLC meetings. Teachers were expected to use their PLC time to examine data about student learning as well as discuss curriculum and students. In terms of the mathematics curriculum, for kindergarten through grade two, the district had recently begun using Investigations (Pearson Education Inc., 2008), a textbook series that reflects a student-centered, inquiry-based, and conceptually oriented approach to learning mathematics. In grades 3–5, the district used a combination of Investigations and Scott Foresman-Addison Wesley Mathematics (Pearson Education Inc., 2005), a textbook series that conveys mathematics more procedurally and reflects a transmission or direct instruction-oriented approach to mathematics instruction. When the university–district collaboration began, there were eight district-based mathematics coaches who provided support to classroom teachers across the 12 elementary schools and who were involved in the initial planning of the Linking intervention.

Teachers in grades 1–5 were recruited from 10 of the 12 elementary schools in the district to participate in the study. Many teachers were initially reluctant to participate, reflecting concerns about the use of video recording for evaluative purposes. In response, we provided detailed explanations about the measures in place that would limit access to the recordings to researchers and the teacher herself. In addition, district administrators were not present at PLC meetings where the video recordings were discussed. Grade level teams were randomly assigned to the intervention. The Linking intervention was fully implemented with the treatment group of 35 teachers in 14 PLCs across 10 elementary schools. When implementation began, district budget cuts reduced the existing mathematics support to one coach serving all 12 elementary schools. As a result of these cuts, classroom coaching, an initial part of the Linking design, was not part of the intervention.


The assumptions that underlie the widespread attention to data use are that: (1) Teachers will examine data to glean new insights about their students’ learning and reconsider their classroom practice based on these insights; (2) they will implement new instructional strategies in their classrooms based on student needs; (3) they will assess the effects of these changes and modify as needed; and (4) recurring feedback cycles of data examination and action will generate continuous improvement in teaching and learning (Black, Harrison, Lee, Marshall, & Wiliam, 2003; Boudett et al., 2005; Datnow, Park & Wohlstetter, 2007; Halverson, Pritchett, & Watson, 2007; Hamilton et al., 2009; Kerr, Marsh, Ikemoto, Darilek, & Barney, 2006; Mandinach & Honey, 2008). These assumptions are illustrated in the feedback cycle in Figure 1.

Figure 1. Data Use for Instructional Improvement


However, recent research on data use in educational contexts has identified numerous challenges to achieving the full potential of reviewing data and has raised questions about these underlying assumptions. Currently, many schools have teachers routinely review the data generated by state accountability tests and the interim or benchmark tests used by their districts to predict performance on state measures (Christman et al., 2009; Coburn & Turner, 2011; Faria et al., 2012; Farley-Ripple & Buttram, 2014). While helpful to state departments of education and school districts, the value of these accountability assessments to classroom teachers is limited. Their focus on students’ level of proficiency towards meeting grade-level standards rarely illuminates students’ understanding and misunderstanding of subject matter, thereby offering little to go on in terms of instructional implications (Christman et al., 2009; Goertz, Olah, & Riggan, 2009; Horn et al., 2015). In addition, these assessment data require teachers “to infer back” to what they did instructionally weeks or even months previously that produced their students’ performance (Supovitz, Foley, & Mishook, 2012), thereby losing the proven efficacy of shorter cycle formative assessments (Black et al., 2003; Black & Wiliam, 1998).

Many teachers also lack the knowledge and ability to analyze data on student learning to inform instruction in substantive ways (Heritage, Kim, Vendlinski, & Herman, 2009; Supovitz, Ebby, & Sirinides, 2014). These limitations have resulted in data analyses that remain superficial and often do not support processes of teachers’ collective inquiry and knowledge building that might contribute to their professional growth. Not surprisingly, such superficial analyses have often led to “strategic” changes such as expanded instruction in test-taking skills, regrouping of students for instruction, and reteaching of curriculum topics and skills—often using the same instructional strategies (Christman et al., 2009; Goertz et al., 2009; Olah et al., 2010).


In designing the Linking intervention, our team sought to address limitations in data use practices. The intervention builds on research on teacher learning and teacher growth. Over the last two decades, researchers have proposed that efforts to support teacher learning and growth need to be subject-specific and focused on pedagogical content knowledge (Ball & Bass, 2000; Desimone, 2009; Hill & Grossman, 2013; Kennedy, 1998). In mathematics, for example, teachers need to develop understanding of the content as well as how students learn that content, what pedagogical methods and representations are effective in engaging students in the content, what it means to reason mathematically, and how mathematical concepts are connected and build upon one another in the curriculum (Ball, Thames, & Phelps, 2008). Research also suggests that professional development should use tools and artifacts of practice—what we call here “classroom-embedded data”—to generate ongoing, collaborative inquiry into student thinking and instructional responses to their thinking (Ball & Cohen, 1999; Borko, 2004).

With this literature in mind, the Linking intervention included three important design elements as illustrated in Figure 2: (1) the provision of classroom-embedded data that made visible instructional practice and student mathematical thinking, (2) the enrichment of teacher interpretation of data through skilled facilitation of PLC discussions, and (3) recurring feedback cycles, each of which continued the focus on the linkages between instruction and student thinking. As we discuss below, each of these elements was designed to illuminate the linkages between student mathematical reasoning and instruction.

Figure 2. Linking Model for Data Use


Classroom-Embedded Data

Research on how teachers gain expertise in their craft points to the value of using artifacts drawn directly from practice—i.e., curriculum materials, student work, video recordings of classroom instruction—to generate productive inquiry into teaching practice and student learning (Ball & Cohen, 1999; Borko, Jacobs, Eiteljorg, & Pittman, 2008; Brophy, 2004; Cobb, McClain, Lamberg, & Dean, 2003). The Linking intervention made use of such classroom-embedded data: written feedback on mathematics lessons, student solutions to open-ended items from end-of-unit assessments, and video clips of classroom interaction. During the fall, winter, and spring, teachers received written comments about a video-recorded Investigations mathematics lesson. The lessons were chosen from units that focused on core mathematical concepts at each grade level (i.e., addition and subtraction in grades one and two; number operations and fractions in grade three; and multiplication, division, and fractions in grades four and five). The feedback was written by experienced mathematics teachers who had graduate training in mathematics education as well as training in the Instructional Quality Assessment (IQA), an established mathematics lesson observation tool that identifies the key dimensions of evidence-based mathematics teaching and learning shown in Table 1 (Boston, 2012). The IQA was used to focus on two dimensions of mathematics instruction: eliciting and responding to student thinking and pressing students towards important mathematics concepts. The feedback was sent in a private email to each teacher.

Table 1. Instructional Quality Assessment (IQA) Rubrics

Accountable Talk
- Proportion of students who participate in discussion
- Rigor and nature of teacher questions
- Asking (Teacher Press): Extent to which teachers press students to support contributions with evidence and/or reasoning
- Providing (Student Response): Extent to which students are able to support contributions with evidence and/or reasoning

Academic Rigor
- Potential of the Task: Potential of the task to engage students in rigorous thinking about challenging content
- Implementation of the Task: The level of rigor at which students engage in the task
- Student Discussion Following Task: The extent to which students show their work and explain their thinking about important mathematical content

Follow-up PLC discussions focused on two kinds of data: student work from end-of-unit assessments and video clips of teacher-student interactions from their own and their colleagues’ lessons. Teachers examined student solutions to open-ended questions that addressed the unit’s core mathematical ideas and required students to show their work. They viewed video clips chosen to focus on talk moves made by teachers to elicit and build on student mathematical reasoning about the core mathematical ideas.

Video clips provide the opportunity for teachers to examine complex classroom interactions (Brophy, 2004; Sherin, 2004) and are most effective when clips are carefully selected to highlight specific aspects of teaching (Brophy, 2004). In the case of Linking, facilitators chose clips to highlight teacher questioning and student reasoning around the mathematical concepts that were represented in the student work.

Facilitated Analysis of Data

Many schools have instituted PLCs, providing teachers with regularly scheduled common meeting time during which they collaboratively review data and plan instruction with their colleagues. There is ample research evidence to support the claim that PLCs can lead to changes in teacher practice as well as increases in student learning (Borko, 2004; Earl & Timperley, 2008; Vescio, Ross, & Adams, 2008). Yet we also know that the effectiveness of a PLC as a lever for professional growth depends on the quality of PLC activities. Several studies demonstrate that effective PLCs focus on student learning (Bolam et al., 2005; DuFour, 2004; Newmann, 1996; Phillips, 2003) and/or instructional practice (Supovitz, 2002; Supovitz & Christman, 2003) and call for greater attention to pedagogical content knowledge in PLC activities (Bausmith & Barry, 2011).

In mathematics education, current delineations of effective instruction place student thinking and student voice at the center of the teaching and learning process (Stein, Engle, Smith, & Hughes, 2008). They emphasize teaching practices such as questioning to elicit student reasoning and orchestrating discussions to build on student thinking and lead towards increasingly complex mathematical knowledge (Lampert, 2001; Nelson, 2001; Sleep, 2012). While early mathematics reform efforts emphasized helping teachers to elicit student thinking, more recent efforts focus on how to build on the student thinking to press for deeper student understanding of important mathematical concepts (Stein, Engle, Smith, & Hughes, 2008). As the Common Core State Standards are implemented, it will be increasingly important for teachers to have content-specific frameworks to generate instructional strategies that address student needs as they strive to develop rigor and more sophisticated thinking and reasoning (Bausmith & Barry, 2011; Van Driel & Berry, 2012).

In the Linking intervention, a trained facilitator focused PLC data discussions on linkages between student mathematical reasoning and instructional practices. Discussions of student work focused on strategies that students used rather than the correctness of the answer—thus offering a window into variations in students’ mathematical reasoning and the levels of sophistication in their reasoning. Discussions of video clips drew out examples of pedagogical strategies teachers used to elicit student thinking and press students towards important mathematical concepts and student responses to these strategies. These clips sometimes provided examples of effective strategies and other times, missed opportunities. Finally, discussions focused on drawing connections between teachers’ instructional strategies and student understanding. And so, like the individual written feedback on video-recorded lessons, the structured PLC discussions were designed to reinforce and promote instructional practices that supported students’ deeper understanding of mathematical concepts and more sophisticated problem-solving skills (Ebby & Oettinger, 2013).

Recurring Feedback Cycles

As shown in Figure 3, the intervention aimed to deepen teachers’ exploration of data by establishing three feedback cycles over the course of the year. These recurring cycles gave teachers multiple opportunities to examine classroom practice in relation to student engagement with mathematical reasoning and concepts. As teachers adopted new strategies, they could apply lessons learned from feedback, data analysis, and discussion of video clips to their subsequent practice. Two foci remained constant across these recurring cycles: (1) the use of the IQA sustained teachers’ attention on two dimensions of teaching practice: accountable talk (questioning practices) and academic rigor (engaging students in reasoning about important mathematics); and (2) classroom observations and PLC activities focused on a particular core content area over the year.

Figure 3. Recurring Feedback Cycles


Each cycle also contained two important opportunities for the teachers to get feedback on their practice. They received individual written feedback on their instruction after the observation, but then during the PLC they had the opportunity to discuss student work and video clips of instruction collaboratively with their colleagues. The feedback was therefore both individually targeted to the teacher’s classroom practice and also publicly shared in relation to collective grade level and curricular goals.



The theoretical foundation for our case study research is situative theory (Borko, 2004; Cobb & Bowers, 1999; Putnam & Borko, 2000), which conceptualizes teacher growth “as a process of increasing participation in the practice of teaching, and through this participation a process of becoming knowledgeable in and about teaching” (Adler, 2000, as quoted in Borko, 2004, p. 4). Situative theory locates teacher learning in the multiple situations of daily life in schools and classrooms and attends to the interplay between individual learning as well as the coconstruction of knowledge with colleagues and students (Cobb, 1994). It highlights the opportunities for teachers’ collective inquiry and “pedagogical reasoning” (Cobb et al., 2003) in their PLCs.

A situative approach prioritizes sensemaking, a neglected focus in the existing data use research, but a vital component of the Linking intervention. There have been few studies that attend to how teachers interpret data and turn it into actionable knowledge in the context of the classroom (Coburn & Turner, 2012; Little, 2012). Sensemaking theory argues that the meaning of data is not given, but rather constructed individually and also collectively (Erickson, 2004). It considers how complexly layered contexts influence interpretation of data and subsequent action. Clarke and Hollingsworth (2002) draw on situative theory to propose a nonlinear model of teacher growth that accounts for the interconnections between practice, meaning-making, and context, where enactment and reflection serve as the mechanism for change across four domains: personal (knowledge, beliefs and attitudes), practice (professional experimentation), consequence (salient outcomes), and the external domain (sources of information, stimulus, or support). A key part of the process is how the teacher evaluates the outcomes of instructional changes made in the classroom.

A situative approach also prioritizes routines and tools of practice that strengthen or constrain teachers’ analyses of data and the subsequent actions that they take. As noted earlier, data use for instructional improvement assumes iterative and recurring feedback cycles in which teachers access data, make sense of it, apply the knowledge in their classrooms, and assess the application and the response of students.

Drawing upon situative theory as well as research on data use and teacher professional learning, this study sought to address the following questions: How did the Linking innovation effect improvements in teachers’ mathematical teaching practices? What happened as teachers made sense of the Linking data in their PLCs? What changes did they make in their ensuing mathematics instruction and why?


Over the course of an academic year, our research team conducted observations of PLC meetings and interviews of teachers and facilitators. Thirty-eight meetings from 14 PLCs were observed. Included in the field note write-ups were artifacts such as chart paper notes generated from PLC discussions and samples of student work. Brief teacher interviews (n = 15) and facilitator interviews (n = 9) occurred during the school year immediately following the PLC meetings observed. At the end of the school year, we conducted in-depth interviews with 11 teachers asking them to reflect on what stood out to them about their individual feedback and their work with colleagues in their PLCs.

The data analysis proceeded in three stages. During the first stage, four members of the research team read through the corpus of PLC field notes and interview transcriptions. Our research team met to discuss emergent themes. We developed a code list and then coded the interviews and field notes, continually refining the coding scheme as necessary. During this stage, the theme of dissonance, or inconsistency between teachers’ beliefs and actions, emerged as important to teachers’ sensemaking and learning.

At this juncture, the research team decided to adopt a case study approach (Gomm, Hammersley, & Foster, 2000; Stake, 1995; Yin, 1994) for two reasons. First, case study methodology is particularly well-suited for understanding complex interrelationships (Hodkinson & Hodkinson, 2001), in this instance the relationships among intervention components and their influence on teachers, between teacher individual sensemaking about data and collective sensemaking in PLCs, between sensemaking and classroom instructional changes, and between teacher actions and student responses. Second, case study methodology can track developmental trajectories over time, here a trajectory of teacher professional growth over multiple feedback cycles during the course of an academic year (Hodkinson & Hodkinson, 2001).

Most importantly, case study methodology can facilitate theory building grounded directly in data (Glaser & Strauss, 1999; Eisenhardt, 1989) by providing richly detailed accounts of the phenomena under study that uncover concepts and coherently relate them to one another. Teacher interpretation of data is ripe for theory building.

The limitations of case study research are well known and primarily have to do with generalizability—the typicality and representativeness of the case under study (Hodkinson & Hodkinson, 2001; Eisenhardt, 1989). That limitation applies here. Nevertheless, we believe that the case of Ms. Walker is instructive because of its close attention to a teacher’s sensemaking and its contribution towards a theory of teacher sensemaking about data.

The research team chose three teachers for extended case studies. We selected Ms. Walker, a third grade teacher, because of the considerable progress in her IQA scores over the course of the year. As shown in Figure 4, there was a relatively large jump in her use of questioning strategies as measured by the Accountable Talk component of the IQA rubric between the first and second video-recorded lessons and also significant gains in Academic Rigor, or practices that build on and extend student thinking towards the important mathematical goals.

Figure 4. Growth in Ms. Walker's IQA Scores over Time in Relation to Mean IQA Scores for Treatment Group


We chose two other teachers to offer contrasting cases. One teacher made moderate progress on the Accountable Talk portion of the IQA but little or no progress on Academic Rigor, while the other made little progress on either dimension. The culture of their PLCs also provided points of contrast, both with one another and with that of Ms. Walker.

In stage two of the analysis, case studies of each of the three teachers were constructed based on the written feedback, the field notes from PLC meetings, and the end-of-year extended interview which provided the teacher’s narration of her experience in the program. The case studies traced teachers’ participation in the project over the course of the year and focused on teachers’ sensemaking through the three cycles of data review, thus providing a window into each teacher’s learning process over time.

In stage three, we used cross-case comparison to shed light on the most relevant issues under study (Stake, 1995). We could have created an account of findings that looked across the three cases and discussed relevant themes, including dissonance and productive dissonance. However, we chose to tell the story of Ms. Walker because rich information about her professional growth over time is highly relevant to understanding the role of productive dissonance and recurring feedback cycles in how teachers make sense of data and the changes that they make in their practice (Patton, 1987). Additionally, we judged this “best case” (Patton, 1987) to be the most useful for clearly communicating to readers our theorizing about teacher sensemaking.




The case study of Ms. Walker makes visible the story behind the improvements in her IQA scores over the course of the year of the Linking intervention. It illustrates in rich detail the instructional changes that she made in her classroom and why she made them, the developmental nature of her growth, and the important roles of dissonance, collegial discussion, and productive dissonance in that process. In her first observed lesson, Ms. Walker focused on mathematical procedures and answers, asking few questions about how students arrived at their solutions. In her second video-recorded lesson, in contrast, the observer noted that Ms. Walker consistently asked students for their strategies and reasoning. By the third observed lesson, Ms. Walker not only elicited student thinking but also pressed students to focus on the important mathematical ideas. In addition, over the year, Ms. Walker became increasingly curious and reflective about her own and her colleagues’ teaching. She frequently engaged in reasoning about her practice and became more articulate about mathematics instruction. Our analysis traces this development and also highlights points of dissonance in her instructional growth over the three feedback cycles.

The awakening: “I saw that she did a lot more questioning.”

In the first cycle, Ms. Walker taught a lesson on addition and subtraction in which she directed students towards standard procedures for solving problems involving money rather than encouraging strategies based on number sense and estimation as emphasized in the curriculum. The written feedback she received about the lesson highlighted these weaknesses and suggested asking more open-ended questions, focusing on number sense, and pressing students to justify their solutions. The observer wrote:

In the introductory discussion, by asking "How do you know those amounts add to $1.50?" rather than talking students directly through a set procedure, you will be able to elicit and highlight more sophisticated reasoning about number and operations. (Written Feedback, Fall)

The observer also pointed out that Ms. Walker did not consistently press students to justify their solutions and suggested: “You could ask, ‘How do you know that equals a dollar?’ rather than writing it vertically and adding each column” (Written Feedback, Fall). At the time of this first observed lesson, Ms. Walker’s classroom practice followed a direct instruction model in which she demonstrated how to solve the problems rather than allowing students to generate their own solutions. There was more emphasis on developing procedural knowledge than on rigorous mathematical thinking. The written feedback pointed out ways in which she was not getting to the conceptual level of number sense and estimation and suggested some questioning strategies to help her get there.

In the ensuing PLC meeting, the facilitator had Ms. Walker and her colleagues share student work for an addition problem on the end-of-unit assessment. She then summarized the three main strategies that emerged on chart paper: “adding tens and ones separately, using the next ten as a landmark and adjusting, and adding on tens and ones to one number.” Notably, Ms. Walker had not elicited these student strategies in her lesson. Instead, she had directed students towards the use of the standard addition algorithm. During the discussion, Ms. Walker expressed reservations about the utility of having students develop and use multiple strategies, noting that some students may become confused by the time they get to the end-of-unit assessment. The facilitator turned this comment back to the teachers, asking them “to think about how to help students solidify strategies” while they watched the video clip.
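The three charted strategies can be made concrete with a worked example; the sum 38 + 25 is ours, chosen only for illustration, not a problem from the assessment:

```latex
\begin{align*}
\text{Adding tens and ones separately:} \quad & 38 + 25 = (30 + 20) + (8 + 5) = 50 + 13 = 63\\
\text{Using the next ten as a landmark and adjusting:} \quad & 38 + 25 = (38 + 2) + (25 - 2) = 40 + 23 = 63\\
\text{Adding on tens and ones to one number:} \quad & 38 + 25 = 38 + 20 + 5 = 58 + 5 = 63
\end{align*}
```

All three reach the total through reasoning about place value rather than the column-by-column standard algorithm toward which Ms. Walker had directed her students.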

In the video clip, Ms. Walker's grade partner, Ms. Levine, was leading a discussion in which students voiced multiple strategies for solving addition and subtraction problems. The facilitator focused attention on the students’ thinking and how the teacher’s questions were engaging students in mathematical reasoning. The teachers noted that students were able to correct their own mistakes and that they had a deep understanding of the operation of subtraction. Ms. Walker commented that she saw Ms. Levine asking students for strategies, not just answers. The facilitator also pointed out how Ms. Levine re-voiced student strategies, named them, and recorded them as the discussion went along, noting that these were teaching moves that could help students solidify their understanding as well as their ability to draw on different strategies.

At this PLC meeting, Ms. Walker had an important opportunity to think about multiple solution strategies and also see her colleague handle a shared lesson in a different way than she had. The written feedback Ms. Walker received on her own instruction created a moment of dissonance. Looking at student work and watching the video clip of Ms. Levine as she taught the same concept differently intensified the dissonance, but also helped Ms. Walker envision how to address it in her own practice. In the end-of-year interview, Ms. Walker returned to this lesson, her feedback, and the video clip, noting its significance to her professional learning:

In that video clip, Ms. Levine asked the students about different ways that they came up with combinations of the money. I saw that she did a lot more questioning than I did . . . and that there was a much more lively discussion in her room about how they came up with those different ways. (Interview, Spring)

In this quote Ms. Walker explains how observing a video of a colleague opened up her vision of what she might do in her own classroom. In this instance, two Linking intervention components worked together in a positive way. The written feedback on Ms. Walker’s instruction created a sense of dissonance, and the PLC discussion focused on pedagogical practices to elicit student thinking, thus offering a bridge to productive dissonance. Having a more skilled colleague in her PLC was a critical contextual factor.

Making instructional changes: “I did not always allow them to show various examples or ideas.”

In the next video-recorded lesson, which focused on multiplication in the context of a game, there was a marked difference in Ms. Walker’s questioning strategies. She opened up the discourse in her classroom by asking students for their strategies and their reasoning. The observer offered positive feedback on Ms. Walker's questioning strategies, as well as suggestions for how to push students to think more deeply about the important mathematical concepts and generate more sophisticated strategies. The feedback explained how she could use an open array (a rectangular array with only the outer dimensions labeled) as a model to illustrate the relationship between multiplication facts. It also suggested that she could encourage students to draw upon facts that they knew to find unknown facts. The dissonance created by the feedback was now around what to do with student strategies once they were elicited.

In the follow-up PLC meeting, the discussion focused on these ideas through the examination of student work as well as the video clip. As the teachers looked at student work, the facilitator asked them to order the strategies in terms of sophistication. The discussion focused on how to move students along the learning trajectory towards more sophisticated strategies (Sztajn, Confrey, Wilson, & Edgington, 2012). The facilitator also pointed out that the open array was an important model for students both to record their thinking and to move away from more concrete (less sophisticated) counting strategies.

The facilitator then showed a video clip of Ms. Walker’s instruction in which she was eliciting multiple strategies for figuring out the total of a 4 × 9 array. The group members remarked on how Ms. Walker probed for student understanding, used wait time, and asked the students to put their hands down and think about the problem before sharing strategies. The facilitator focused the discussion on clarifying the different strategies that emerged: counting by 4s, adding 18 + 18, and using 4 × 10 to figure out 4 × 9. Notably, at this point, the discussion moved beyond acknowledging and understanding the different strategies to explore how to use an array model to transition students from less to more sophisticated strategies for multiplication. Looking at student thinking from this developmental perspective provided a framework for thinking about how to respond to and deepen student thinking once it was elicited. Overall, this experience affirmed the marked improvement in Ms. Walker's questioning strategies, while also helping her set new goals for focusing on the important mathematical ideas and supporting more sophisticated solution strategies.
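The three strategies for the 4 × 9 array can be written out in full; the strategy labels are our descriptive glosses, and the last line shows the kind of derived-fact reasoning the open array is designed to support:

```latex
\begin{align*}
\text{Skip counting by 4s:} \quad & 4,\ 8,\ 12,\ 16,\ 20,\ 24,\ 28,\ 32,\ 36\\
\text{Doubling a partial product:} \quad & 2 \times 9 = 18, \ \text{so} \ 4 \times 9 = 18 + 18 = 36\\
\text{Using a known fact:} \quad & 4 \times 9 = (4 \times 10) - 4 = 40 - 4 = 36
\end{align*}
```

Ordering these by sophistication, as the facilitator asked the teachers to do, moves from unit counting through additive reasoning to multiplicative reasoning with derived facts.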

Refining practice: “My questions became more focused.”

In the spring observation, Ms. Walker was teaching a game in which students had to find equivalent fractions and combine fractions to make a whole. The observer noted, “Your questioning strategies helped to maintain a high level of academic rigor throughout the lesson,” and gave the following example:

Your questioning strategies really allowed children to reflect on and articulate their understanding of important underlying concepts. . . . You also pushed their thinking by having a student “prove” that [a fraction] was the same as 1 whole, asking students to give an example, and following up on a correct answer with another question to push their thinking. (Written Feedback, Spring)

Ms. Walker’s practice had progressed beyond merely asking students to explain their answers. She was now asking pointed questions that pressed students to concentrate on the important mathematical ideas. At the end of the year, she reflected on this aspect of change in her own practice:

By questioning the students more—a lot of very focused but careful questions to get their minds thinking—it seemed that a lot of growth happened in terms of the way that they looked at mathematics and looked at the patterns and relationships that they saw in mathematics. It seemed like it really helped a lot of them to have a better idea from the concrete to the abstract way of looking at things. (Interview, Spring)

Ms. Walker's questioning was no longer just a means to elicit student reasoning. As she put it, she was “pushing them to think about other relationships,” and she could see how that was leading to more sophisticated strategies on the learning trajectory for multiplicative reasoning.

In the final PLC meeting, the facilitator focused the discussion of student work on evidence of students’ underlying conceptual understanding of fractions. Some of the student work was difficult to interpret and the teachers helped each other make sense of how the students may have been thinking about the problems. For example, in making sense of the way that one student had drawn out a picture to solve a partitioning problem, it became clear that the student was also drawing on a more abstract understanding of equivalence that was not immediately evident from the work shown. Later Ms. Walker remarked that it was affirming for her to see that most of her students showed evidence of understanding important concepts such as equivalence and that it was also informative to see the variety of solutions. She noted, “Some kids were more abstract, while others were more concrete still in their thinking” (Interview, Spring). Her reflections indicate that she was internalizing the learning trajectory as she interpreted her students’ reasoning. Whereas in the beginning of the year she felt that different strategies might be confusing for her students, she now saw how a variety of strategies could provide her with information about her students’ developmental progress:

Not everyone did it the same way; some kids were more abstract, while others were more concrete, still, in their thinking. So it kind of gave you a clear indication of what type of strategies the kids are still comfortable with, because a lot of times when you think about the concrete, sometimes some of us do need those visuals to help us before we get to the point of being able to show representations of something in a more abstract way. But you could really see where each one was. (Interview, Spring)

The facilitator then showed a clip from Ms. Reinhardt’s class. In the clip, a student explained how he found an expression equivalent to the fraction 4/6 by building off the concept that there are 2/12s in each sixth. He became confused as he tried to explain. Rather than probe or clarify his thinking, the teacher asked other students if they could find an easier way to find 4/6 since they did not have blocks to represent twelfths. The facilitator chose the clip as a way to discuss the important mathematical ideas in his solution strategy and also to reflect on how to more effectively probe and clarify student thinking. When Ms. Reinhardt admitted that she did not understand his strategy and looked to an external reason (his brother) for it, Ms. Walker countered with her own interpretation:

Ms. Reinhardt: I don’t understand how he got there from what we were doing in class. His brother is autistic and they talk a lot. Maybe he got it from him.

Ms. Walker: He realized 2/12 is the same as 1/6. My students who’ve taken off with equivalent fractions, they will come up with different ways [of finding equivalencies]. (PLC, Spring)
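The equivalence Ms. Walker attributed to the student can be written out; the chain below is our reconstruction of his reasoning, building off the fact that each sixth contains two twelfths:

```latex
\begin{align*}
\frac{1}{6} = \frac{2}{12} \quad \Longrightarrow \quad \frac{4}{6} = \frac{4 \times 2}{12} = \frac{8}{12}
\end{align*}
```

Seeing each sixth as two twelfths yields an equivalent expression for 4/6 through reasoning alone, without blocks to represent twelfths.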

Ms. Walker’s statement about students coming up with “different ways” signals an important change from her earlier assertion that students may get confused when they see different strategies. It also served to push her colleague to consider the mathematics of this moment in more depth. The facilitator then showed a short clip from Ms. Walker’s class in which a student offered a similar strategy for finding equivalent expressions using fractions. When Ms. Walker asked the student to explain his answer, he reasoned about the effect of halving the denominator but ended up getting confused and made a mathematical error. Ms. Walker called on another student who compared the fractions to 1/2 to explain why the expressions could not be equivalent. In this discussion the students were grappling with important mathematical ideas about fractions, using mental reasoning rather than physical models. As the observer noted in the feedback to Ms. Walker, “These ideas are central to understanding and reasoning about fractional relationships and, by letting them emerge, you really raised the level of academic rigor of the lesson.” (Written Feedback, Spring)

In the PLC discussion, Ms. Levine, who had been the model for Ms. Walker in the first PLC, remarked: “You let him explain it. Again, you want to honor the child’s thinking.” The group continued to discuss the pros and cons of letting children go on with their explanations and how to engage more students in the important ideas of the lesson. Ms. Walker remarked, “The higher thinkers need their time too. . . the girl who helped him out, she understood what he was trying to do.” Ms. Walker’s contributions during the PLC discussions indicate that she was connecting her questioning strategies to a deeper level of understanding in her students around the important mathematical ideas. The focus on student thinking in the PLC discussion of both student work and instructional practice further affirmed and strengthened this connection. Ms. Walker's reflection on the discussion in this PLC shows how she had moved from merely allowing students to share their strategies to seeing that sharing as an important way for students to solidify and extend their understanding:

I think it’s just very important . . . that sometimes the kids really have to be able to voice what they’re thinking, especially when it comes to more abstract types of concepts and thinking. . . . Sometimes we’re so focused on making sure that the children that are struggling get what they need that sometimes your higher children already know what it is that you’re teaching, and they need that extra level. (Interview, Spring)

It is also significant that the facilitator had chosen a clip of Ms. Walker's instruction to serve as a model for another teacher who had not capitalized on an opportunity to extend and deepen her students’ thinking. Ms. Walker had thus transitioned over the year from a team member with less instructional expertise to one who could serve as a model for the others. The focus on student thinking in the context of classroom instructional practice contributed not only to Ms. Walker's continued growth in expertise, as she became more knowledgeable about her own teaching and about student learning, but also to the growth of the expertise and capacity of the PLC itself.


The written feedback over the course of the year indicated that Ms. Walker’s progress was three-fold. First, she incorporated questioning strategies that elicited students’ mathematical thinking. Second, in her own words, she “gradually released” authority for mathematical thinking from herself to her students. And third, she incorporated pedagogical moves that pushed students to think more deeply about important mathematical concepts. In addition, Ms. Walker was aware that she had made these changes:

With each lesson, it seemed like my questioning and the academic rigor continued to build more and more. When I look at the [feedback from] the first lesson, the way that I questioned the students was not enough. And I did not always allow them to show various examples or ideas of what it was that they were trying to figure out. And so when I look across each lesson, it seems like my questions became more focused, and I allowed the students more time to explain or justify their reasoning. (Interview, Spring)

Ms. Walker’s growth trajectory suggests a developmental sequence in which learning to ask questions that will elicit student thinking is the initial stage. Learning to build on and extend student thinking to get to important mathematical goals—“the mathematical point” (Sleep, 2012)—represents a more sophisticated level of ambitious teaching practice. The Linking project’s recurring feedback cycles that focused on core content and key instructional dimensions played an important role in helping her move beyond eliciting and accepting multiple student strategies to home in on the important mathematical goals.

As the case study illustrates, the dissonance created by the written feedback became a lever for change. Had the feedback remained private, Ms. Walker might have ignored it; seeing and discussing her colleague’s practice in relation to that feedback made it harder to dismiss and gave it new meaning, steering the dissonance in a more productive direction. In addition, Ms. Walker had repeated opportunities to examine student thinking (both her own students’ and her colleagues’ students’) in the context of their work on end-of-unit assessments and in the classroom context in relation to instructional practices, thus crossing the boundaries of the personal, practical, and consequential domains (Clarke & Hollingsworth, 2002). Having the opportunity to view video clips of her colleagues teaching the common lessons offered her a glimpse across classroom boundaries, something that is rarely available to teachers but can help to break the norms of privacy in teaching (Lortie, 1975; Sarason, 1982). These video clips not only provided teachers with inspiration, but also gave them a common artifact from which to explore student thinking in ways that were grounded in practice (Ball & Cohen, 1990). The collaborative inquiry represents a move towards making practitioner knowledge—or knowledge generated within the particulars of classroom practice—public, open for discussion and debate (Hiebert, Gallimore, & Stigler, 2002; Lieberman & Pointer Mace, 2010).

It is also important to note that the selection and discussion of video clips centered around student engagement with an understanding of the core mathematical content. This focus on student thinking created a common link between the student work on the end-of-unit tests and the teachers’ instructional practices. Ms. Walker and her colleagues analyzed student work in relation to student reasoning and situated these strategies in a learning trajectory that described the development towards more sophisticated thinking and reasoning. This knowledge then informed and strengthened the way that they interpreted classroom video clips as they focused on instructional strategies that both elicited student thinking and pressed students towards more sophisticated understanding. As Ms. Walker reflected on her experiences over the year, she highlighted the importance of looking at video clips of her colleagues:

And honestly, I learned a lot from that clip and the others. Because although we collaborate a lot together as a grade level, it’s very rare that we see each other teach. So it was really nice to see what my grade level partners actually do in their rooms. To be able to see how a teacher who realizes a child does not understand something—and how she starts to ask those probing questions. I saw an example of how you go about that and how you help a child to understand what you’re asking them. (Interview, Spring)

Observing video clips of one another’s instruction makes teaching public so that it can be shared, critiqued, and emulated (Lieberman & Pointer Mace, 2010). A colleague’s instruction can be an external stimulus for change (the source of dissonance), provided the teacher makes sense of that instruction in a way that can be applied to her own beliefs and practices (the dissonance becomes productive). Focusing the discussion of pedagogical strategies around student thinking proved to be a critical way to help teachers draw such connections in the Linking intervention.

There were also two important dimensions of PLC capacity that supported this collective inquiry into student thinking and instructional practice and helped shape the dissonance productively: (1) social capital, manifested as interpersonal dynamics, a sense of trust, and willingness to engage in collective inquiry around student thinking and classroom practice within the PLC, and (2) human capital, or the knowledge and expertise of the individuals that made up the PLC. In Ms. Walker’s case, the quality of the teachers’ talk in the grade group suggested that the PLC was a collegial group of teachers who were accustomed to discussing their students and their classroom practice. The facilitator noted that the teachers eagerly delved into the process of looking at student work, both identifying the different strategies that students used and reflecting on what those strategies told them about students’ mathematical reasoning and their level of sophistication with understanding and applying key mathematical concepts and skills. It is likely that the openness and curiosity of her colleagues reinforced Ms. Walker’s willingness to pay attention to her feedback and try to incorporate it into her practice. The collegiality of the group varied across the PLCs in the study and sometimes became a barrier to change. In one of the other case studies, for example, problematic group dynamics undermined the opportunity for collective inquiry around student thinking and shifted the focus of the discussion from one that could have supported teacher growth into a debate over pedagogical stances.

Additionally, the presence of pedagogical expertise within her PLC allowed Ms. Walker to view more skilled instructional practice and make sense of the feedback she had received on her own instruction. The fact that by the end of the year Ms. Walker was able to serve as a pedagogical model for the third member of the PLC highlights the interplay between individual growth and growth in the capacity of the group. One of the other case studies provides a contrasting example where the teacher did not understand how to move forward from the dissonance created by the feedback on her lesson observation. Unlike Ms. Walker, she did not have the chance to see a colleague demonstrate the kind of instructional practice that was referenced in her written feedback, and she felt as though she was “left hanging.” The inquiry into classroom-based data in a collaborative context that included teacher expertise was therefore an important factor in both generating dissonance and scaffolding it into productive dissonance.


In schools across the country, time is set aside for teachers to examine data together in PLCs. However, research increasingly demonstrates that giving teachers access to data and scheduling time for them to meet is not sufficient to generate the kinds of professional learning and instructional improvement (Horn et al., 2015) that are likely to result in improved learning opportunities and deeper understanding of mathematics for students. The case of Ms. Walker highlights the importance of the interpretive process in transforming data use activities into opportunities for teacher growth and learning. In order for changes in teacher knowledge and practice to occur, the process of looking at data must include instances of disequilibrium that are scaffolded into what we call productive dissonance. As illustrated in Figure 5, in the Linking intervention this productive dissonance emerged from the ongoing facilitated collaborative analysis of classroom-embedded data that linked instruction to student learning. Furthermore, by enhancing the sensemaking step in the data use cycle, teachers like Ms. Walker were able to move beyond surface instructional changes or adjustments to demonstrate more substantial instructional improvement.

Figure 5. Enhanced Data Use Model for Instructional Improvement


The following discussion draws from the case study analysis to highlight features of the Linking intervention and the particular case of Ms. Walker that were critical in generating and sustaining her progression through dissonance to productive dissonance and ultimately to instructional change.

The Nature of the Data Matters

Ms. Walker did not merely examine static outcome data on student learning as a way to gauge the effectiveness of her instruction. She examined open-ended assessment items to identify student strategies and uncover their reasoning about specific mathematical concepts. This put student thinking at the center of data interpretation. In addition, she received written feedback about a lesson in which she and her colleagues were teaching the same concepts, and she viewed video clips of her own and her colleagues’ teaching. These data were intimately connected to her daily practice and interactions with particular students, and illuminated not only the result of student learning, but also the process. In Ms. Walker’s case, the video clip of a colleague’s classroom was the catalyst and bridge to productive dissonance. The use of video in teacher training and professional development is regaining interest (Borko et al., 2008), and some have proposed a video bank of expert teaching (Bausmith & Barry, 2011; Hiebert et al., 2002). The Linking intervention points to the value of more close-in video, which, in the case of Ms. Walker, provided an image of possibility with students, curriculum, and organizational context very similar to her own.

Tools and Processes Can Support Teacher Interpretation of Data

In this study, the feedback about instructional practice was built around two pedagogical constructs from the IQA: eliciting student reasoning and responding to students in ways that engage them with increasingly sophisticated mathematical concepts. These constructs served as the basis for PLC discussions of video clips as well, grounding the discussions in specific mathematical content and disciplinary ways of knowing and deepening teacher sensemaking. The focus on making sense of student work provided an important link between the pedagogical strategies and student learning and allowed Ms. Walker to think about her practice through the lens of student thinking.

This finding supports the argument that feedback to teachers about their classroom practice should not be generic, but rather it should focus on discipline-specific aspects of teaching practice (Hill & Grossman, 2013). When feedback is focused on one or two substantive and actionable discipline-based pedagogical concepts or strategies, teachers have something concrete to work towards in subsequent lessons. When this kind of concrete guidance is tied to the specific mathematics content under study, it can generate instructional improvement that goes beyond the surface (e.g., asking more open-ended questions) to deeper substantive change (e.g., helping students transition from repeated addition to multiplicative strategies through the use of an array). This also suggests that instructional feedback may be more effective when constructed by observers with subject matter (content and pedagogical) expertise.

Effective Use of Tools and Processes Depends Upon Strong Facilitation and Group Dynamics

A critical factor that helped Ms. Walker take up the dissonance productively and make substantive changes in her practice was the scaffolding created through the collaborative group inquiry around student learning and instructional practice. Research from an organizational learning perspective reminds us that both human and social capital are important conditions for supporting and sustaining professional growth (Coleman, 1988). It is clear from our cases that dissonance by itself does not necessarily lead to productive change; supporting the process requires both purposeful facilitation and careful attention to group composition and dynamics. Examining instructional practice in relation to student learning in a group setting can intensify the dissonance, but a supportive collegial environment, along with some expertise from colleagues and/or the facilitator, must be in place to steer it in a productive direction.

The case of Ms. Walker also illustrates the developmental nature of instructional growth in mathematics from eliciting student reasoning to knowing how to refine and shape that thinking towards important mathematical goals. This development in many ways mirrors the development in mathematics education research from first to second generation reform efforts (Stein et al., 2008). Teachers who have become proficient at eliciting student reasoning then need strong skills in analyzing that reasoning in order to press students towards deeper understanding of the concepts under study (Sleep, 2012). The Linking intervention provided teachers with actionable feedback and also opportunities to practice this kind of analysis of student reasoning with guidance from someone with expertise in this area, thus preparing them to undertake the next level of mathematical instruction in their teaching. This differs from the standard model of instructional coaching in that the feedback and practice did not just remain at the individual level. Collaborative sharing and discussion of instructional practice and student work in the PLC engendered a collective responsibility for improving student learning in relation to grade-level goals and expectations, making the dissonance that was created more difficult to dismiss.

Professional Growth Requires an Ongoing Process of Continuous Improvement Through Repeated Feedback Cycles

In addition to these three factors in the interpretive process (meaningful data, supportive tools, and expert facilitation), the presence of repeated feedback cycles is critical for instructional growth. In the case of Ms. Walker, the three feedback cycles provided the opportunity to experiment with and refine new strategies for eliciting student thinking and pressing for the mathematical point. These feedback cycles were not focused on instructional practice at a general level, as is the case with many teacher observation instruments, but rather on two key ideas about mathematical pedagogy: eliciting students’ mathematical reasoning and using questioning to press students’ thinking towards important concepts. Both ideas were kept in play over the course of the year. Teachers were given time to process these ideas with colleagues in grade-level groups and to try them out in their lessons throughout the year. It is likely that repeated video recording and written feedback motivated Ms. Walker and other teachers to try out new instructional strategies and continuously assess and refine them so that they would demonstrate improvement in subsequent observed lessons. It is also important to note that this written feedback remained separate from any formal evaluation efforts and was not shared with anyone but the teacher. Ms. Walker's case demonstrates substantial growth over the year as each cycle built upon the previous one. While we did not see this magnitude of change across all teachers, the IQA ratings of the treatment group showed significant growth over the three time points (Supovitz, 2013). The fact that other teachers were just beginning to make changes in their practice by the third feedback cycle suggests that more than three feedback cycles may be necessary to produce change across a larger group of teachers.



This article presents an analytic approach that brings together two areas of research that have typically remained separate: data use in schools and the professional growth of teachers in mathematics instruction. In doing so, it emphasizes data use practices as a context and catalyst for learning in and about teaching specific content. The case of Ms. Walker offers an example of the fruitfulness of bridging the dichotomy between data use and teacher growth. The integrated conceptual framework uncovered the significance not only of dissonance but of productive dissonance in supporting a teacher’s increased sophistication with mathematics instruction centered on student thinking and her growing expertise in pedagogical reasoning. Examining rich, classroom-embedded data that make student thinking visible, having tools and processes to support the interpretation of those data, paying attention to the role of expert facilitation and group dynamics, and providing ongoing opportunities for feedback, analysis and practice are critical factors in supporting this growth over time. These factors do not need to be in place in exactly the same way or at the same intensity as in the case of Ms. Walker; rather, her case offers important lessons about the possibilities of using data to inform instruction when data use builds on what we know about teacher learning. The factors we identified from this case can serve as important goals for researchers, leaders and practitioners to pursue when designing and implementing interventions to improve teachers’ use of data to inform instruction.

The research also offers an example of the contribution that micro-process studies can make to understanding data use practices in schools. It uses situative theory and pays close attention to how teachers make sense of data, highlighting the importance of this sensemaking in the process of instructional growth. It also sheds light on helpful tools, structures and processes—a neglected area of inquiry in education policy research (Cobb et al., 2003). Through this lens, the concepts of dissonance and productive dissonance emerged as important constructs in theorizing about teachers’ sensemaking that can enrich the design and implementation of data use practices in schools and support teachers’ instructional improvement.


This research was funded by the Spencer Foundation.


1. All names used in this article are pseudonyms.


Ball, D. L., & Bass, H. (2000). Interweaving content and pedagogy in teaching and learning to teach: Knowing and using mathematics. In J. Boaler (Ed.), Multiple perspectives on mathematics teaching and learning (pp. 84–104). Westport, CT: Ablex.

Ball, D. L., & Cohen, D. K. (1999). Developing practice, developing practitioners: Toward a practice-based theory of professional education. In G. Sykes and L. Darling-Hammond (Eds.), Teaching as the learning profession: Handbook of policy and practice (pp. 3–32). San Francisco: Jossey Bass.

Ball, D. L., Thames, M. H., & Phelps, G. (2008). Content knowledge for teaching: What makes it special? Journal of Teacher Education, 59(5), 389–407.

Bausmith, J. M., & Barry, C. (2011). Revisiting professional learning communities to increase college readiness: The importance of pedagogical content knowledge. Educational Researcher, 40(4), 175–178.

Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2003). Assessment for learning. New York: Open University Press.

Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80(2), 139–148.

Bolam, R., McMahon, A., Stoll, L., Thomas, S., Wallace, M., Greenwood, A., Hawkey, K., Ingram, M., Atkinson, A., & Smith, M. (2005). Creating and sustaining effective professional learning communities. Research Report 637. London: DfES and University of Bristol.

Borko, H. (2004). Professional development and teacher learning: Mapping the terrain. Educational Researcher, 33(8), 3–15.

Borko, H., Jacobs, J., Eiteljorg, E., & Pittman, M. E. (2008). Video as a tool for fostering productive discussions in mathematics professional development. Teaching and Teacher Education, 24, 417–436.

Boston, M. (2012). Assessing instructional quality in mathematics. The Elementary School Journal, 113(1), 76–104.

Boudett, K. P., City, E. A., & Murnane, R. J. (2005). Data wise. Cambridge, MA: Harvard Education Press.

Brophy, J. (Ed.). (2004). Advances in research on teaching: Vol. 10. Using video in teacher education. Oxford, UK: Elsevier.

Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141–178.

Christman, J. B., Neild, R. C., Bulkley, K., Blanc, S., Liu, R., Mitchell, C., & Travers, E. (2009). Making the most of interim assessment data: Lessons from Philadelphia. Philadelphia, PA: Research for Action.

Clarke, D., & Hollingsworth, H. (2002). Elaborating a model of teacher professional growth. Teaching and Teacher Education, 18(8), 947–967.

Cobb, P. (1994). Where is the mind? Constructivist and sociocultural perspectives on mathematical development. Educational Researcher, 23(7), 13–20.

Cobb, P., & Bowers, J. (1999). Cognitive and situated learning perspectives in theory and practice. Educational Researcher, 28(2), 4–15.

Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13.

Cobb, P., McClain, K., Lamberg, T., & Dean, C. (2003). Situating teachers’ instructional practices in the institutional setting of the school and district. Educational Researcher, 32(6), 13–24.

Cobb, P., Wood, T., & Yackel, E. (1990). Classrooms as learning environments for teachers and researchers. In R. B. Davis, C. A. Mayer, & N. Noddings (Eds.), Constructivist views on the teaching and learning of mathematics (pp. 125–146). Reston, VA: National Council of Teachers of Mathematics.

Coburn, C. E., & Turner, E. O. (2011). Research on data use: A framework and analysis. Measurement: Interdisciplinary Research and Perspectives, 9(4), 173–206.

Coburn, C. E., & Turner, E. O. (2012). The practice of data use: An introduction. American Journal of Education, 118(2), 99–111.

Coleman, J. S. (1988). Social capital in the creation of human capital. The American Journal of Sociology, 94(Supplement), S95–S120.

Datnow, A., & Hubbard, L. (2015). Teachers’ use of assessment data to inform instruction: Lessons from the past and prospects for the future. Teachers College Record, 117(4). ID Number: 17848.

Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-performing school systems use data to improve instruction for elementary students. Los Angeles, CA: University of Southern California, Center on Educational Governance.

Desimone, L. M. (2009). Improving impact studies of teachers’ professional development: Toward better conceptualizations and measures. Educational Researcher, 38(3), 181–199.

DuFour, R. (2004). What is a “professional learning community”? Educational Leadership, 61(8), 6–11.

DuFour, R., DuFour, R., & Eaker, R. (2008). Revisiting professional learning communities at work: New insights for improving schools. Bloomington, IN: Solution Tree.

Earl, L. M., & Timperley, H. (Eds.). (2008). Professional learning conversations: Challenges in using evidence for improvement. New York: Springer.

Ebby, C. B., & Oettinger, A. (2013). Facilitating productive discussions in professional development contexts. Paper presented at the research pre-session of the Annual Meeting of the National Council of Teachers of Mathematics, Denver, CO, April 2013.

Eisenhardt, K. M. (1989). Building theory from case study research. Academy of Management Review, 14(4), 532–550.

Erickson, F. (2004). Talk and social theory: Ecologies of speaking and listening in everyday life. Cambridge: Polity.

Faria, A., Heppen, J., Li, Y., Stachel, S., Jones, W., Sawyer, K., & Palacios, M. (2012). Charting success: Data use and student achievement in urban schools. Washington, DC: Council of the Great City Schools.

Farley-Ripple, E., & Buttram, J. L. (2014). Developing collaborative data use through professional learning communities: Early lessons from Delaware. Studies in Educational Evaluation, 42, 41–53.

Glaser, B., & Strauss, A. (1999). The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine Transaction.

Goertz, M., Oláh, L., & Riggan, M. (2009). From testing to teaching: The use of interim assessments in classroom instruction. CPRE Research Report #RR-65. Philadelphia: Consortium for Policy Research in Education.

Gomm, R., Hammersley, M., & Foster, P. (Eds.). (2000). Case study method. London: Sage.

Halverson, R., Pritchett, R. B. & Watson, J. G. (2007). Formative feedback systems and the new instructional leadership. Madison, WI: University of Wisconsin.

Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J. (2009). Using student achievement data to support instructional decision making (NCEE 2009-4067). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Heritage, M., Kim, J., Vendlinski, T., & Herman, J. (2009). From evidence to action: A seamless process in formative assessment? Educational Measurement: Issues and Practice, 28(3), 24–31.

Hiebert, J., Gallimore, R., & Stigler, J. (2002). A knowledge base for the teaching profession: What would it look like and how can we get one? Educational Researcher, 31(5), 3–15.

Hill, H., & Grossman, P. (2013). Learning from teacher observations: Challenges and opportunities posed by new teacher evaluation systems. Harvard Educational Review, 83(2), 371–384.

Hodkinson, P., & Hodkinson, H. (2001). The strengths and limitations of case study research. Paper presented at the Learning and Skills Development Agency Conference, Cambridge, UK, December 2001.

Horn, I. S., Kane, B. D., & Wilson, J. (2015). Making sense of student performance data: Data use logics and mathematics teachers’ learning opportunities. American Educational Research Journal, 52, 208–242.

Jones, E., & Nimmo, J. (1999). Collaboration, conflict, and change: Thoughts on education as provocation. Young Children, 54(1), 5–10.

Kennedy, M. (1998). Form and substance in inservice teacher education. Research Monograph, 13. National Institute for Science Education.

Kerr, K. A., Marsh, J. A., Ikemoto, G. S., Darilek, H., & Barney, H. (2006). Strategies to promote data use for instructional improvement: Actions, outcomes, and lessons from three urban districts. American Journal of Education, 112(4), 496–520.

Lampert, M. (2001). Teaching problems and the problems of teaching. New Haven: Yale University Press.

Lieberman, A., & Pointer Mace, D. (2010). Making practice public: Teacher learning in the 21st century. Journal of Teacher Education, 61(1-2), 77–88.

Little, J. W. (2012). Understanding data use practice among teachers: The contribution of micro-process studies. American Journal of Education, 118(2), 143–166.

Lortie, D. (1975). Schoolteacher. Chicago: University of Chicago Press.

Mandinach, E., & Honey, M. (Eds.). (2008). Data driven school improvement: Linking data and learning. New York: Teachers College Press.

Nelson, B. S. (2001). Constructing facilitative teaching. In T. Wood, B. S. Nelson, & J. Warfield (Eds.), Beyond classical pedagogy: Teaching elementary school mathematics (pp. 251–273). Mahwah, NJ: Erlbaum.

Newmann, F. M. (1996). Authentic achievement: Restructuring schools for intellectual quality. San Francisco: Jossey-Bass Publishers.

Oláh, L. N., Lawrence, N. R., & Riggan, M. (2010). Learning to learn from benchmark assessment data: How teachers analyze results. Peabody Journal of Education, 85(2), 226–245.

Patton, M. Q. (1987). How to use qualitative methods in evaluation. Newbury Park, CA: Sage.

Pearson Education, Inc. (2005). Scott Foresman-Addison Wesley Mathematics. Upper Saddle River, NJ: Pearson.

Pearson Education, Inc. (2008). Investigations in Number, Data and Space (2nd ed.). Upper Saddle River, NJ: Pearson.

Phillips, J. (2003). Powerful learning: Creating learning communities in urban school reform. Journal of Curriculum and Supervision, 18(3), 240–258.

Putnam, R. T., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29(1), 4–15.

Sarason, S. (1982). The culture of the school and the problem of change. Boston: Allyn & Bacon.

Shepard, L. A. (2009). Commentary: Evaluating the validity of formative and interim assessment. Educational Measurement: Issues and Practice, 28(3), 32–37.

Sherin, M. G. (2004). New perspectives on the role of video in teacher education. In J. Brophy (Ed.), Using video in teacher education (pp. 1–27). Bingley, UK: JAI Press.

Sleep, L. (2012). The work of steering instruction toward the mathematical point: A decomposition of teaching practice. American Educational Research Journal, 49(5), 935–970.

Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage.

Stein, M. K., Engle, R. A., Smith, M., & Hughes, E. K. (2008). Orchestrating productive mathematical discussions: Five practices for helping teachers move beyond show and tell. Mathematical Thinking and Learning, 10(4), 313–340.

Supovitz, J. A. (2002). Developing communities of instructional practice. Teachers College Record, 104(8), 1591–1626.

Supovitz, J. A. (2013). The linking study: An experiment to strengthen teachers' engagement with data on teaching and learning. Paper presented at the Annual Meeting of the American Educational Research Association, San Francisco, CA, April–May 2013.

Supovitz, J. A., & Christman, J. B. (2003). Developing communities of instructional practice: Lessons from Cincinnati and Philadelphia. Philadelphia: Research for Action.

Supovitz, J., Ebby, C. B., & Sirinides, P. (2014). Teacher Analysis of Student Knowledge: A measure of learning trajectory-oriented formative assessment. Retrieved from http://www.cpre.org/sites/default/files/researchreport/1446_taskreport.pdf

Supovitz, J. A., Foley, E., & Mishook, J. (2012). In search of leading indicators in education. Education Policy Analysis Archives, 20(19), 1–27.

Sztajn, P., Confrey, J., Wilson, P. H., & Edgington, C. (2012). Learning trajectory based instruction: Towards a theory of teaching. Educational Researcher, 41(5), 147–156.

Van Driel, J. H., & Berry, A. (2012). Teacher professional development focusing on pedagogical content knowledge. Educational Researcher, 41(1), 26–28.

Vescio, V., Ross, D., & Adams, A. (2008). A review of research on the impact of professional learning communities on teaching practice and student learning. Teaching and Teacher Education, 24(1), 80–91.

Wheatley, K. F. (2002). The potential benefits of teacher efficacy doubts for educational reform. Teaching and Teacher Education, 18(1), 5–22.

Yin, R. K. (1994). Case study research: Design and methods (2nd ed.). London: Sage.

Cite This Article as: Teachers College Record, Volume 118, Number 11, 2016, pp. 1–32.
https://www.tcrecord.org ID Number: 21628


About the Author
  • Jolley Christman
    Research for Action
    JOLLEY BRUCE CHRISTMAN is a founder of Research for Action, whose mission has been to conduct rigorous studies designed to provide a broad range of educational stakeholders with the information they need to improve student outcomes and strengthen schools and communities. Her research interests include professional learning communities and organizational learning, civic engagement and school reform, and the privatization of public education. Her work has been included in recent compilations, including Between Public and Private: Politics, Governance and the New Portfolio Models for Urban School Reform, American School Reform: What Works, What Fails, and Why, and The Transformation of Great American School Districts.
  • Caroline Ebby
    University of Pennsylvania
    E-mail Author
    CAROLINE B. EBBY is a senior researcher at the Consortium for Policy Research in Education (CPRE) and an adjunct associate professor of mathematics education at the Graduate School of Education at the University of Pennsylvania. Her research focuses on the use of learning trajectories and formative assessment to improve mathematics instruction. She leads the development, use, and analysis of the TASK instrument, an online assessment of how teachers interpret student work for instruction. She has published articles in Journal for Mathematics Teacher Education, Journal of Mathematical Behavior, Teaching Children Mathematics, Mathematics Teaching in the Middle School, and most recently a chapter “Conceptualizing Teachers’ Capacity for Learning Trajectory Oriented Formative Assessment in Mathematics” in J. A. Middleton, J. Cai, & S. Hwang (Eds)., Large Scale Studies in Mathematics Education (Springer, 2015).
  • Kimberly Edmunds
    Equal Measure
    E-mail Author
    KIMBERLY A. EDMUNDS is a senior consultant at Equal Measure in Philadelphia, PA, where she works on multiple evaluation projects, particularly in the areas of college and career readiness and postsecondary access and success. Stemming from her graduate studies in urban spatial analytics, she has a special interest in discovering ways to integrate geographic information systems as part of mixed-methods research that illuminates the complex relationships between educational opportunity and structural barriers, like residential segregation, concentrated poverty, and school choice systems. She is furthering her interdisciplinary interests through a graduate program in organization development and leadership. Recent publications include Edmunds, K., Pearsall, H., & Porterfield, L. (2015). Narrowing pathways? Exploring the spatial dynamics of postsecondary STEM preparation in Philadelphia, Pennsylvania. The Urban Review, 47(1), 1–25; Hartmann, T., Gao, J., Kumar, A., & Edmunds, K. (2013). A Snapshot of OST Programming in Philadelphia: An Evaluation of Six 21st Century Community Learning Center Grantees. Philadelphia: Research for Action.