Building a Conceptual Framework for Data Literacy


by Edith Gummer & Ellen B. Mandinach - 2015

Background: The increasing focus on education as an evidence-based practice requires that educators be able to use data effectively to inform their practice. At the level of classroom instructional decision making, the specific knowledge and skills teachers need to use data effectively are complex and not well characterized. Being able to characterize these requisite knowledge and skills supports the definition and measurement of data literacy. Evolving from empirical analyses, an emergent conceptual framework of knowledge and skills is proposed for the construct of data literacy for teaching. The framework is based on a domain analysis, the first step of an evidence-centered design process for data literacy. The framework is contextualized in existing research, with the objective of grounding future work on the development of instruments to measure data literacy.

Purpose: This article reports on work to develop a conceptual framework to undergird research, development, and capacity building around data literacy for teaching. The emergent framework is intended to inform discussions around data literacy so that operational definitions of the construct can continue to be refined. Without such operational definitions, measurement of progress toward teacher data literacy is not possible.

Research Design: The conceptual framework is based on a sequence of qualitative studies that sought to determine the nature of the knowledge and skills required for teachers to be considered data literate. A first study examined the ways that knowledge and skills around the use of data were characterized in practical guides, books, and manuals on data use, formative assessment, and related topics. These characteristics were integrated with definitions of data literacy elicited from experts. A second study examined the licensure and certification documents that states require of teacher candidates for their treatment of data- and assessment-related knowledge and skills. The synthesis of these studies and their components has yielded an evolving conceptual framework for a new construct: data literacy for teaching.

Conclusions: The conceptual framework described in this article reflects an evolving effort to understand what it means for teachers to be data literate—that is, what knowledge and skills teachers require to use data effectively and responsibly within an iterative inquiry cycle. The work posits that the construct comprises three interacting domains (data use for teaching, content knowledge, and pedagogical content knowledge), six components of the inquiry cycle (identify problems, frame questions, use data, transform data into information, transform information into a decision, and evaluate outcomes), and, finally, 59 elements of knowledge and skills embedded within those components. However, the complex construct requires additional discussion among policymakers, researchers, and practitioners to refine and reorganize it and to expand it beyond a cognitive focus on knowledge and skills to include beliefs/values, identity, and epistemic elements. Next steps will include structuring an ongoing discussion about the nature of the framework and moving beyond domain analysis through the evidence-centered design process to the development of a suite of instruments to measure the construct.



This article reports on the work to develop a conceptual framework to undergird research, development, and capacity building around data literacy for teaching. The framework is intended to inform the discussions around data literacy so that a refined operational definition of the construct will emerge. Without such an operational definition, data literacy cannot be measured. Data literacy is interpreted as the collection, examination, analysis, and interpretation of data to inform some sort of decision in an educational setting. Educational researchers have been evolving definitions of data literacy and developing theoretical frameworks to inform research, capacity building, and policy making. But efforts to characterize data literacy in order to inform the systematic development of measures have lagged behind research efforts that study how multiple stakeholders in education use data. Measurement of data literacy is well behind the policy literature that provides the mandate for data use at all levels of the education system, from the classroom to state and federal agencies (see, for example, Aguerrebere, 2009; Cibulka, 2013; Council for the Accreditation of Educator Preparation, 2013; CCSSO Task Force, 2012; National Council for Accreditation of Teacher Education, 2010).


The framework laid out here takes a cognitive approach to examining data literacy from the perspectives of both individual and distributed cognition, because teachers’ practices are frequently demonstrated in the collaborative inquiry around data in which they engage (U.S. Department of Education, 2009, 2010). We define data literacy for teaching as follows: Data literacy for teaching is the ability to transform information into actionable instructional knowledge and practices by collecting, analyzing, and interpreting all types of data (assessment, school climate, behavioral, snapshot, longitudinal, moment-to-moment, and so on) to help determine instructional steps. It combines an understanding of data with standards, disciplinary knowledge and practices, curricular knowledge, pedagogical content knowledge, and an understanding of how children learn.


The construct is only one aspect of a data-literate educator (Data Quality Campaign, 2014). Additional aspects of data literacy, for teachers as well as administrators, involve the ethical and responsible use of data, including appropriate considerations of the privacy of students.


Measurement of a construct of teaching practice is a complex endeavor. We argue that data literacy for teaching is made up of a domain of knowledge and skills that we are calling data use for teaching—a domain that is closely intertwined with disciplinary knowledge, pedagogical content knowledge, understanding of student development, and understanding of the context of schooling. If we try to isolate these other types of knowledge and skills, we risk making the instruments we develop artificial and no longer representative of the practice. At the other extreme, incorporating too many constructs or too complex a framework makes assessment design impossible. The conceptual framework for data literacy for teaching that we present here recognizes the importance and interconnectedness of these other domains, but it focuses on the domain of data use for teaching in order to clarify the essential elements of the larger construct. We argue that in the development of instruments, we will also need to strongly consider the domains of disciplinary knowledge and practices and of pedagogical content knowledge.


This article presents a conceptualization of data literacy for teaching in terms of a knowledge representation that characterizes the cognitive domain of data use for teaching, including the knowledge and skills, and their relationships, that contribute in part to the complex construct of data literacy for teaching (Mislevy et al., 2010). Ultimately, we are working to develop a framework that can be used to guide the research agenda and the development of measures of data literacy for teaching from within a community of practice (Shaffer, 2005). We borrow an epistemic framework (Collins & Ferguson, 1993; Shaffer et al., 2007) that identifies the knowledge, skills, values, identity, and epistemology components of a complex practice that embody the domain so that a rich set of instruments may be identified, coordinated, or developed. Knowledge and skills represent what a teacher should know and be able to do to use data to inform teaching. Values represent the beliefs that teachers hold about data use. Identity represents the ways that teachers see themselves as data users. And epistemology represents the ways in which teachers justify the warrants (or explanations) they use to associate claims and evidence as they use data to justify teaching decisions. An epistemic framework examines the ways that professionals link these components together to structure their ways of thinking and doing in a domain. Such a framework represents the initial stage of determining the validity arguments for the assessment of a complex practice. This article focuses on our efforts to articulate the knowledge and skills with which educators are likely to engage when they use data to inform teaching practices. Subsequent work will focus on identifying the values, identity, and epistemology components of data literacy for teachers.


The article starts with a brief discussion of how the theoretical and empirical literature in data use can help inform the development of a conceptual framework for data literacy for teaching. We provide a description of the current status of instruments in the area of data-driven decision making or data literacy to strongly establish the need for instrument development to inform the study of data use in education. We then describe evidence-centered design (ECD) (Mislevy, Steinberg, & Almond, 2003) and its relevance for the development of a suite of instruments to measure data literacy for teaching. We briefly summarize our efforts to develop a working or operational definition of data literacy as the first step in ECD that has culminated in the description of knowledge and skills in Mandinach, Friedman, and Gummer (2015, this issue). We present an initial knowledge representation of the multiple aspects of the complex set of knowledge and skills that contribute to data literacy as the beginning of a domain analysis, the first layer of the ECD process. We conclude the article with a discussion of how the representation of the interactions of the complex concepts indicates how we might move from domain analysis of data use for teaching through intervening steps of connecting that domain to content and pedagogical content knowledge in order to develop a suite of instruments that are supportive of, and do not constrain, research in data literacy.


GROUNDINGS FROM THE LITERATURE


THEORETICAL PERSPECTIVES


Researchers theorizing about the domain of data use provide insights into the nature of the knowledge, skills, values, identities, and epistemologies that are embedded in data literacy for teachers. For example, Ingram, Louis, and Schroeder (2004) noted that teachers in their sample of schools who most explicitly appeared to be advanced in data use were “dismissive of externally generated achievement data” and that this “is a cultural trait that teachers learn and pass on to other teachers as the ‘right way’ to think, act, and feel about the use of data” (p. 1273). This represents a value aspect of the construct of data literacy for teaching.


Spillane (2012) described a theoretical framework for data use, anchored in the organizational routines that emerge in the work practice of educators, as appropriate for investigating data use in schools. He drew on other theoretical traditions, including distributed cognition, which have significant implications for the conceptual framework laid out in this study. One of the reasons that it has been so difficult to come to a common definition of data literacy is the complex nature of the construct, which involves both individual knowledge and skills and those held by different groups that get evoked in different circumstances. Distributed cognition suggests that the knowledge, skills, values, identity, and epistemology of data literacy cannot be characterized only in terms of what an individual knows, does, and believes.


Spillane (2012) noted that routines direct researchers’ attention to the interactions among school staff, moving the study or characterization of data literacy beyond the knowledge, behavior, and actions of any one individual. “It is in these interactions that school leaders and teachers negotiate about what data are worth noticing—meriting their attention—and what these data mean, if anything, for current practice at the school and classroom levels” (p. 117). These interactions are cognitive in that they entail the ways that people perceive, process, negotiate, and ultimately use information. He indicated that “the cognitive aspects of these interactions are not just about interpreting particular pieces of data or information but also about selecting information, framing it in particular ways, and negotiating the relevance of some bits of information and not other bits” (p. 127). Spillane’s focus on socio-technical systems suggests that in characterizing the data literacy of teachers, we must attend not only to “intramental models but also to intermental models—models or representations of learning, teaching, and achievement contained in the material and abstract tools that school staff use in their interactions with one another” (p. 128). Moss (2012) supported this system perspective, arguing that “questions about individual cognition, beliefs, dispositions, or identities need to be understood in terms of how they are shaped by cumulating interactions with other people and tools over time” (p. 225).


Spillane (2012) clearly invoked the notion of identity as an important characteristic of data use as we conduct our domain analysis of data literacy for teaching. He described the competing identities of a principal and her teachers in one of his studies.


Ms. Koh sees her role as an instructional leader, a professional identity that represents a departure compared with past principals at Kosten who saw their role as preserving teachers’ professional autonomy. Veteran teachers perceive Ms. Koh’s notions about the principalship as undermining their identities as teachers, as autonomous professionals in the classroom. (p. 132)


The implications for the conceptual framework developed through our domain analysis include the notion that the cognitive perspective we develop has to include both individual and distributed cognition as well as values and identities.


Little (2012) reinforced the importance of the values and identities that teachers espouse as they engage in the use of data. She described the importance of determining the nature of the worldviews held by teachers, recognizing that the interpretation of data is the point at which issues of meaning, utility, and trustworthiness of data are invoked, echoing the perspective of Coburn, Toure, and Yamashita (2009). “Inevitably, that interpretive work—such as assigning relevance to particular data points, patterns, or trends—relies on the worldviews that teachers and other participants bring to the task” (Little, 2012, p. 161). Colyvas (2012) examined the ways in which performance metrics, such as student scores on state assessments, influence how educators consider or value their practices and identities by facilitating comparisons of one’s own practice with that of others or with external benchmarks. Performance measures focus attention and resources on some areas while obscuring attention to others, and they can support different behaviors depending on different value systems—such as a focus on improvement as opposed to optimizing the measures of a teacher’s performance through unethical practices.


Honig and Venkateswaran (2012) examined the relationships between school and central office use of data to determine the extent to which central office staff efforts contribute to school-level use of data. They highlighted the importance of considering a systemic approach to conducting research on data use in schools and districts. From the perspective of the current study, the research studies described provide a glimpse of how data literacy for teaching is similar to and different from data literacy for administration. In addition, the authors argued that a systems approach to studying data use by educators would help identify which processes and practices embedded in different system subunits and their interactions are essential for effective data use. Their synthesis also identified the elements of the educational system around which educators needed to develop understandings in order to better engage in local contexts to facilitate data use.


INSTRUMENT DEVELOPMENT AND USE STUDIES


No instruments to measure data literacy were publicly available when we started our work in data-driven decision making. Few reports that described the extent to which teachers engaged in data use, typically characterized as data-driven decision making, provided any information about how their own instruments had been developed. For example, Dembosky, Pane, Barney, and Christina (2006) surveyed teachers to determine the purposes for which they used student achievement data and identified several uses. Prior-year achievement data were used to inform current-year topics on which to spend more instructional time. Midyear district assessments served to identify areas in which students were deficient and to refocus curriculum at the classroom level. Teachers also reported that they used achievement data to differentiate instruction for small groups. Finally, teachers indicated that they used data to inform instruction for individual students, including identifying which students might benefit from pullout programs. However, no details were provided in the report about the nature of the survey items or instrument development. We could infer from this report that measuring teachers’ understandings of data use for instructional decisions was an important component of studying data use, but the report offered no particulars on how this knowledge (understanding the purpose of data use) and the related skills (such as being able to use appropriate data for the intended purpose) might be characterized in a measure or related to other aspects of data literacy.


Means, Chen, DeBarger, and Padilla (U.S. Department of Education, 2010) provided information on the data concepts and skills, as well as the items used to measure data literacy (identified as data-driven decision making), in the national Study of Education Data Systems and Decision Making. They used hypothetical data use scenarios and related questions in interviews with individual teachers and with small groups of teachers and other educators. This focus on individual and distributed cognition reflects the theoretical perspective in Spillane’s (2012) work described earlier. Five skill areas were identified in collaboration with experts in educational data systems and measurement: data location, data comprehension, data interpretation, instructional decision making, and question posing. A particular focus is on teachers’ understanding of specific data analysis concepts and skills, including probability, generalizability, and data computation and reduction. Scenarios were developed using the expert groups’ perceptions of the types of probes that would elicit the data concepts and skills, and they used tasks and items that had been reported in research or teacher preparation practice as models. The scenarios and questions were reviewed by a content expert and an assessment expert. Information about the relatively high level of interrater reliability of the rubric-based scoring of the interviews is provided in an interim report (U.S. Department of Education, 2009).


Dunn, Airola, and Garrison (2013) reported on instruments used to measure improvements in teachers’ knowledge, concerns, and efficacy in data use. The framework that they used to structure this measurement focused on the intensity of teachers’ concerns about data use and on teacher efficacy in data-driven decision making. The researchers developed three instruments to address this three-part model: the DDDM Knowledge Test, a Stages of Concern questionnaire for data-driven decision making based on the work of Hall, George, and Rutherford (1979), and the DDDM Efficacy Assessment (3D-MEA). The knowledge measure is the least developed of the three instruments, and little information is available about that aspect of the model. In Dunn and colleagues’ (2013) work, the authors presented two multiple-choice knowledge questions. One of the knowledge items addresses teachers’ understanding of validity, while the other examines the appropriate relationship between the use of test items or data and curricular decisions. No additional information is available on that instrument.


Operating from the social cognitive theory of Bandura (1997), the researchers examined teachers’ beliefs that they can use data effectively to inform decision making in the classroom, as well as their concerns or anxieties around that use (Dunn, Airola, Lo, & Garrison, 2013b). Dunn et al. (2013b) focused on classroom-level data-driven decision making (DDDM) and used the term to refer to teachers’ processes of identifying “patterns of performance that unveil students’ strengths and weaknesses relative to students’ learning goals as well as the selection and planning of instructional strategies and interventions to facilitate student achievement of learning goals” (p. 87). Through exploratory and confirmatory factor analysis, these authors identified four aspects of teachers’ use of data at the classroom level, including the ability to access data and select appropriate reports and the ability to use technology-based data systems. They also identified teachers’ skills in interpreting and evaluating findings from the analysis of data for instructional decisions and the anxiety related to engaging in data use practices. Using a 5-level Likert scale, the 3D-MEA includes 22 items that address anxiety about data-driven decision making and four types of efficacy. Initially the authors indicated that they were measuring efficacy for data identification and access, data technology use, and data analysis, interpretation, and use for instructional decision making. Further analysis indicated that the construct of efficacy for data analysis, interpretation, and use for instructional decision making was essentially two constructs, separating teachers’ beliefs in their skills in successfully analyzing and interpreting student data from their beliefs about their ability to connect data interpretation to what happens next in the classroom. These researchers also reported on the reliability of the five scales used to measure these constructs and indicated adequate levels of internal consistency within each scale. The scales were also reported to be strongly correlated with one another (Dunn et al., 2013b).
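The internal consistency that Dunn and colleagues report for each 3D-MEA scale is the kind of statistic conventionally summarized with Cronbach’s alpha. As a minimal sketch only (using invented responses rather than the authors’ data), alpha for one Likert scale could be computed as follows:

```python
# Sketch, not the authors' analysis: Cronbach's alpha as an index of internal
# consistency for a hypothetical Likert-scale instrument. Responses are invented.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """item_scores: respondents x items matrix of Likert ratings (e.g., 1-5)."""
    k = item_scores.shape[1]                         # number of items in the scale
    item_vars = item_scores.var(axis=0, ddof=1)      # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: six teachers answering four items of one efficacy scale.
scale_items = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
])
print(f"Cronbach's alpha = {cronbach_alpha(scale_items):.2f}")
```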


The work of Dunn and her collaborators in all three of the studies mentioned corroborates the findings from a qualitative study of teachers’ self-efficacy around the use of data at the classroom level, which indicate that teachers tend to possess low levels of confidence in their ability to engage in DDDM (Bettesworth, Alonzo, & Duesbery, 2008). However, much of Bettesworth and colleagues’ data were collected through a Likert-type survey for which no technical quality information exists. But an assessment framework developed from the perspective of efficacy and concerns about data use does not give us a framework that we can use to develop an adequately comprehensive suite of instruments to characterize the cognitive aspects of the knowledge and skills that make up data literacy for teaching.


Supovitz and his colleagues (Supovitz, Ebby, & Sirinides, 2013) reported on the characteristics of tasks that have been used to examine teachers’ practices of data use within the professional development of a Mathematics and Science Partnership project, the Ongoing Assessment Project (OGAP). More focused on formative assessment than on data use, the Teacher Analysis of Student Knowledge (TASK) instrument measures teachers’ abilities to analyze students’ mathematical thinking within a grade-specific content area and to formulate reasoned instructional responses. The online assessment measures teachers’ knowledge of mathematical concepts, analysis of student understanding, and mathematical pedagogical content knowledge in the form of understanding learning trajectories. The instrument also measures teachers’ understanding of instructional decision making. Teacher responses are analyzed based on the demands of the task using a set of rubrics. A technical report (Ebby, Sirinides, Supovitz, & Oettinger, 2013) grounds the development of the TASK instrument in the formative assessment literature. It also provides evidence of interrater reliability and internal consistency, as well as initial criterion validity evidence in the relationship between teachers’ scores on the TASK instrument and on the Mathematical Knowledge for Teaching (MKT) instrument (Hill, Schilling, & Ball, 2004; Schilling, Blunk, & Hill, 2007). The researchers reported strong evidence for interrater reliability and moderate to high levels of internal consistency. They reported low to moderate statistical associations between teachers’ scores on the TASK instruments and their MKT scores.


These lines of research and development illustrate the nascent state of instruments intended to measure data literacy for teaching. The focus on data literacy differs across the three studies. The work of Dunn and colleagues (2013) focused on teachers’ efficacy and affect, whereas the work of Supovitz and colleagues (2013) focused on teachers’ knowledge and skills in using formative classroom data connected to instructional decision making. Means, Chen, DeBarger, and Padilla (U.S. Department of Education, 2010) addressed a more comprehensive perspective on data literacy, identifying multiple components of data literacy and the subconcepts and skills that embody them, but they did not provide a systematic analysis that connected these components to one another, to larger domains such as content and pedagogical content knowledge, or to subdomains such as understanding of educational measurement concepts. The next section, on ECD, describes the process we have initiated to develop a more comprehensive analysis of the domain of data use for teaching.

EVIDENCE-CENTERED DESIGN


Recognizing that the field of data-driven decision making is highly complex, systemic in nature, and informed by a diverse set of stakeholders with unique perspectives, we argued that a more systematic process was needed to develop the validity arguments required to claim that we have a valid suite of instruments to measure data literacy for teaching. We need evidence that would support the inferences or judgments about the proficiency of individuals or groups of educators based on the results of the measures we are developing (American Educational Research Association, 2000). To begin the development of that validity argument, the project team turned to ECD (Mislevy & Haertel, 2006; Mislevy & Riconscente, 2005; Mislevy et al., 2003; Shaffer et al., 2007).


ECD is a process to develop the evidentiary reasoning that contributes to the validity arguments for assessment design. ECD is based on the premise that construct validity is the multifaceted set of arguments that provide the warrant that connects the claims or inferences about a performance and the evidence that is provided by the instruments, such as tests, inventories, and rubrics (Messick, 1989). Messick (1994) provided the argument for how construct validity should be developed:


A construct-centered approach would begin by asking what complex of knowledge, skills or other attributes should be assessed, presumably because they are tied to explicit or implicit objectives of instruction or are otherwise valued by society. Next, what behaviors or performances should reveal those constructs, and what tasks or situations should elicit those behaviors? Thus, the nature of the construct guides the selection or construction of relevant tasks, as well as the rational development of construct-based scoring criteria and rubrics. (p. 16)


ECD moves through a number of layers that define the work that must happen in order to structure instrument design. The domain analysis involves the identification of information about the domain of interest, in this case, the practice of data use for teaching. This includes the concepts, terminology, tools, and representational forms that people working in the domain use to characterize their work (Mislevy & Riconscente, 2005). This domain analysis identifies the ways in which knowledge is represented in the domain and the nature of the skills that are used by those actors as they engage in the practices of the domain. It represents “substantive information about the domain of interest that has direct implications for assessment: how knowledge is constructed, acquired, used, and communicated” (Mislevy & Riconscente, 2005, p. 4).


The knowledge representation that results from the domain analysis is a “physical or conceptual structure that depicts entities and relationships in some domain, in a way that can be shared among different individuals or the same individual at different points in time” (Mislevy et al., 2010, p. 4). A knowledge representation provides an embodiment of the domain in such a way that essential elements of the domain may be brought into focus to support collaborative discourse about the study of a phenomenon. This embodiment also highlights a potential weakness of a knowledge representation: what is not represented may not be studied. A complex domain such as data literacy for teaching might have multiple knowledge representations that identify different components and relationships and that serve different purposes.


The second layer of ECD involves domain modeling whereby the knowledge and skills within a domain are connected to the characteristics of task features and potential work products or observations. These include diagrams of the relationships among claims or inferences made about performance and the evidence embodied in tasks and performances connected by warrants that explain those relationships. The third layer, the conceptual assessment framework, provides information about the nature of the models of students (in our case, teachers), tasks and items, rubrics, and measurement models that support the assessment argument structure. The assessment implementation layer describes how the assessment will be implemented with the forms of the tasks, pilot test data to support evaluation procedures, and the fit to measurement models. Finally, the assessment delivery layer provides the final versions of the tasks, work products, and score structure (Mislevy & Riconscente, 2005). This is the trajectory of our work developing measures of teacher data literacy.
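One way to keep this layered structure in view as instrument development proceeds is to record it explicitly. The sketch below simply restates the five ECD layers described above as an ordered structure; the one-line purposes paraphrase this article’s prose rather than the original ECD sources.

```python
# Sketch only: the five ECD layers as an ordered structure. The brief purpose
# statements paraphrase the surrounding prose, not Mislevy and Riconscente's wording.
ECD_LAYERS = [
    ("domain analysis",
     "identify the concepts, terminology, tools, and representations of data use for teaching"),
    ("domain modeling",
     "connect knowledge and skills to task features, work products, and observations"),
    ("conceptual assessment framework",
     "specify models of teachers, tasks and items, rubrics, and measurement models"),
    ("assessment implementation",
     "build task forms, gather pilot data, and evaluate fit to measurement models"),
    ("assessment delivery",
     "finalize tasks, work products, and the score structure"),
]

for layer, purpose in ECD_LAYERS:
    print(f"{layer}: {purpose}")
```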


For a practice as complex as data literacy for teaching, a domain analysis of the knowledge and skills that characterize data use for teaching does not provide sufficient categories to characterize what people should know and be able to do to be successful practitioners. A simple identification of the cognitive areas of knowledge and skills is unlikely to capture the complexity of teachers’ use of data in practice. An epistemic frame, used as the theory of learning that supports the development of simulations and educational games (Shaffer et al., 2007), provides a richer set of categories. These include knowledge, the declarative, procedural, schematic, and strategic elements of a domain; skills, the things that people do to be effective in a domain; values, the beliefs that people hold about a domain; identity, the way that people see themselves as members of a community acting in a domain; and epistemology, the warrants that connect evidence to claims or actions within a domain (Shaffer et al., 2007). Our focus in this article is to characterize the knowledge and skills elements of the domain of data literacy for teaching. Future work will characterize the other aspects of the domain analysis that are necessary.


We have described the multiple empirical steps of the domain analysis in which we have been engaged to develop a knowledge representation of data use for teaching (Mandinach & Gummer, 2012, 2013; Mandinach et al., 2015, this issue). These steps have included examining the focus and content of professional development materials that provide educators with opportunities for developing data literacy, and examining the consensus definitions and components of data literacy elicited from a group of experts, including researchers and professional development providers (Mandinach & Gummer, 2012, 2013). We have synthesized an examination of state teacher licensure standards with the components of data literacy from the examination of professional development materials and expert conceptualizations (Mandinach et al., 2015, this issue). These analyses have identified 59 sets of knowledge and skill statements that we have organized into components, subcomponents, and elements. The sheer number of knowledge and skill statements in our domain analysis illuminates the complexity of the construct. In the next section we present an organizational framework, our representation of the multiple conceptual areas of the domain, to capture the knowledge and skills that represent data literacy for teaching, and we identify some areas of knowledge and skill that still need further refinement.


DOMAIN ANALYSIS AS A CONCEPTUAL FRAMEWORK: A KNOWLEDGE REPRESENTATION


The four diagrams that follow represent the current stage of our domain analysis of data literacy for teaching. The first diagram shows how we characterize the overarching construct of data literacy and how we organize our description of the domain of data use for teaching. The second diagram begins to unpack the domain of data use for teaching into the components, subcomponents, and elements of knowledge and skills that make up the inquiry process in instructional decision making and begins to demonstrate the relationships within the domain. Putting all the components, subcomponents, and elements in the same diagram would result in a figure too dense to consider adequately. The third and fourth diagrams demonstrate the complexity of the subcomponents and elements of two of the inquiry process components.


For the purpose of this article, we have focused on just two additional domains of knowledge and skills that teachers need in order to be considered data literate. Data literacy for teaching does not just address data use for teaching, but incorporates knowledge and skills from other broad domains of teaching, including disciplinary content and pedagogical content knowledge (Figure 1). Teachers do not use data to inform teaching without reference to the instructional goals and objectives that address the disciplinary areas they teach: reading, mathematics, social science, or science. This area is most explicitly represented in mathematics, in which multiple researchers have examined what teachers need to know about mathematics to teach (Ball, Thames, & Phelps, 2008; Confrey, Maloney, Nguyen, Mojica, & Myers, 2009; Hill, Rowan, & Ball, 2005).


Figure 1. Organization of data literacy conceptual framework




This content knowledge is integrated with an understanding of how that particular content can be translated, through curricular materials and instructional strategies, into experiences that facilitate student learning—what Shulman (1986) called pedagogical content knowledge. What Shulman left out of his early writing about pedagogical content knowledge was the information that teachers derive from multiple sources to inform their teaching, including questioning of students, classroom assessments, and the results of external tests. From our perspective, the data that teachers use include sources beyond student performance in the classroom and on assessments at the classroom, school, and state levels, encompassing demographic, attendance, and behavioral data, among others. The knowledge and skills that undergird the use of data for teaching are the focus of the third circle of the diagram.


We have needed some language to describe the complexity of the knowledge and skills that make up the domain of data use for teaching that, combined with the domains of disciplinary knowledge and practices and pedagogical content knowledge, defines data literacy for teaching. Figure 1 also shows the terminology that we are using for our conceptual framework that emerged from our original studies. The overarching construct we are examining is data literacy for teaching. The construct contains three important domains: disciplinary content knowledge and practices, pedagogical content knowledge and practices, and data use for teaching knowledge and skills. The domain of data use for teaching includes the components of the inquiry process. What the diagram does not show is what we are calling the subcomponents of the inquiry process and the elements that make up those subcomponents. These are specific knowledge and skills that are shown in more detail in Figures 2, 3, and 4.
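To make this terminology concrete, the nesting of construct, domains, components, subcomponents, and elements can be pictured as a simple hierarchical structure. The sketch below is illustrative only: it uses the domain and component names given in this article and leaves subcomponents and elements as empty placeholders rather than guessing at them.

```python
# Illustrative sketch of the framework's nesting (construct -> domains ->
# components -> subcomponents -> elements). Names are taken from this article;
# subcomponents and elements are left empty rather than invented.
data_literacy_for_teaching = {
    "disciplinary content knowledge and practices": {},
    "pedagogical content knowledge and practices": {},
    "data use for teaching": {
        "identify problems": {},                    # each component would hold its
        "frame questions": {},                      # subcomponents, and each
        "use data": {},                             # subcomponent its elements
        "transform data into information": {},
        "transform information into a decision": {},
        "evaluate outcomes": {},
    },
}

# Example: list the inquiry-cycle components within the data use domain.
for component in data_literacy_for_teaching["data use for teaching"]:
    print(component)
```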



Figure 2 presents our organization of the components, subcomponents, and some elements of the domain of data use for teaching that coalesced from our examination of data literacy professional development materials, our conferring with researchers and data literacy developers, and our examination of state licensure and certification documents (Mandinach & Gummer, 2013). The diagrams take the components, subcomponents, and elements of the domain of data use for teaching that emerged from those investigations and attempt to organize them into a conceptual framework that we can use to consider what we should target as we develop instruments. But including all the subcomponents and elements we have identified would make a single diagram too complex to describe. We have chosen to show the subcomponents and elements of the component transform information into a decision to highlight the connection to pedagogical content knowledge that is included in the domain. Figures 3 and 4 show the subcomponents and elements of the inquiry cycle components use data and transform data into information.


Figure 2 starts with teachers having to know how to identify problems, which is associated with the subcomponent of involving stakeholders. They also need to know how to frame questions, which in our analysis was not decomposed into subcomponents. As they progress through a cycle of inquiry, teachers need to know about and understand how to use data and transform data into information; the subcomponents and elements of these two components are shown in Figures 3 and 4. Teachers then need to know how to transform information into a decision. It is here that we begin to see the interconnections with pedagogical content knowledge, as we identified two subcomponents: determine next steps and understand the context for the decision. The first subcomponent, determine next steps, is associated with multiple elements as teachers monitor, diagnose, and make instructional adjustments. This final element is where the knowledge and skills in instruction start to be connected to the inquiry process. It is not clear to what extent the verbs that make up the element make instructional adjustments represent unique sets of knowledge and skills, or whether they are synonyms that could be aggregated. Finally, in the inquiry process, teachers need to evaluate outcomes and reengage in the inquiry process at the beginning. Note that we are not identifying the grain size of this inquiry process. It could be within an individual lesson, across a set of lessons, or across a whole unit of study.


Figure 2. The domain of data use for teaching




Figure 2 illuminates some aspects of the inquiry process; the components identify problems and frame questions lack subcomponents and elements. In addition, Figure 2 shows the most compelling issue that exists across the domain analysis of data use for teaching: the potential redundancy of terms that has emerged from our empirical work. This is not a trivial issue, given that verbs count in the development of assessments. It is not clear whether the subelements planning, guiding, designing, adjusting, differentiating, and individualizing instruction, associated with the element make instructional adjustments, require different types of knowledge and skills, are required at different times in the inquiry process, or are associated with different types of data. For instance, planning instruction for a whole class is different from differentiating instruction for groups of students, and both of these are different from individualizing instruction for one particular student. But it is not clear that planning instruction is different from designing instruction, and modifying instruction may not be different from adapting instruction.


Figure 3 provides an explosion of the subcomponents, elements, and subelements of the component use data. The subcomponents include identify data, generate data, understand data properties, understand data quality, and understand how to apply data. Each of these subcomponents includes multiple elements and subelements that teachers need to know and be able to do as the grain size of the categories becomes more finely identified. For example, tracing through the subcomponent identify data, it is necessary for teachers to understand the purposes of different data and understand what data are not applicable in a given context. Tracing through the subcomponent understand data quality evokes a different set of knowledge and skills that include understand problematic data and understand elements of data quality—accuracy and completeness. Generate data is another subcomponent of the use data component, and it includes understand assessment, which expands into understand statistics and psychometrics and use formative and summative assessments. The understand how to apply data subcomponent includes elements of finding, locating, accessing, or retrieving data, which requires that teachers understand how to use technologies to support data use.


Figure 3. Subcomponents, elements, and subelements for the component use data



The use data component does not appear to have any subcomponents that do not contain elements. However, it is not clear that a number of the elements would not fit just as reasonably under multiple subcomponents. For instance, the reflect on data element is associated with the understand data quality subcomponent. This element would fit just as well under the identify data or understand data properties subcomponents.


Figure 4 illustrates the ways in which the knowledge and skills associated with the component transform data into information emerged from our domain analysis. Connected closely with the use of data, these sets of knowledge and skills include teachers’ abilities to analyze data, generate hypotheses to explain data, test assumptions, and consider the impact and consequences of data interpretation. The analyze data subcomponent has some of the same sorts of elements of knowledge and skills that are found under the understand how to apply data subcomponent in the previous component of the inquiry process. These include use statistics and understand data displays and representations. Other elements include being able to assess patterns and trends and synthesize diverse data. Four subcomponents appear to address similar concepts that may encompass similar sets of knowledge and skills: summarize and explain data, interpret or make meaning of data, draw inferences and conclusions, and probe data for causality. These four, along with three additional subcomponents (test assumptions, generate hypotheses, and consider impact and consequences of data interpretation), lack additional elements.


This component transform data into information has some of the same issues as the other components of the inquiry process previously described and identifies another issue. It is not clear that all the sets of knowledge and skills have been identified. For instance, what does a teacher need to know and be able to do to generate hypotheses based on data or to test assumptions? It is also not clear whether the elements identified under analyze data are correctly assigned or whether they belong to a different subcomponent.


Figure 4. Subcomponents and elements for the component transform data into information



Figures 2, 3, and 4 make up a knowledge representation of the domain of data use for teaching that has emerged from our multiple levels of analysis. The diagrams are intended to be a vehicle for discourse because they illuminate the concepts and skills in the domain and their relationships and processes. We have sought to capture the different terms used across the multiple sources while identifying redundancies and synonyms in order to reduce them to a manageable number. The way in which we have laid out the relationships among the statements of knowledge and skill represents one arrangement that seems logical to us. But other researchers may see different connections among the constructs and a different knowledge representation. We anticipate that other researchers will have different interpretations, and we invite the discourse that such interpretations might inform. Our next steps with the domain analysis include validating the knowledge representation with multiple communities of researchers and developers who work in the data literacy field.


SPECULATIONS ABOUT THE USE OF THE CONCEPTUAL FRAMEWORK


What are the implications of the domain analysis of data use for teaching, given the intersection with disciplinary knowledge and practices and pedagogical content knowledge that we argue articulates the construct of data literacy for teaching? The importance of disciplinary knowledge and practices is evident from the beginning of the inquiry process. For teachers, knowing what content a student should know—typically represented in content standards such as the Common Core State Standards for Reading or Mathematics or the Next Generation Science Standards—is essential to identifying problems and helping to frame questions. The role of disciplinary content knowledge and practices is also evident in the use data component, given that teachers need to know what aspects of the content domain are important in order to identify and generate data. Pedagogical content knowledge comes into play as teachers transform information into decisions about how to proceed with instruction, planning, modifying, and adapting instructional strategies and events.


How might we use the knowledge representation of data use for teaching that we have constructed to inform research and development around the construct of data literacy for teaching more generally and to assist in the development of measures of teachers’ knowledge and skills? Once we refine this domain analysis, the next step in ECD will be to start the process of domain modeling, which connects the domain analysis to tasks and work products. For instance, we can use the elements understand the purposes of different data and understand what data are not applicable to design a scenario task in which we present teachers with different sets of data for a particular decision to see whether they can identify data that are appropriate for that decision. We can use the domain analysis to look back at the tasks that have been used by other researchers to see which aspects of data literacy for teaching have been the focus of those assessments and which still need to be addressed. We can also use the domain analysis to structure observation protocols of what teachers are invoking and doing as they engage in data use at various levels of instruction. Finally, we can use the elements and subcomponents of the domain analysis to structure a suite of tasks and assessment items that collectively begin to form instruments to measure data literacy for teaching.


We recognize that the construct of data literacy for teaching might be organized in multiple ways. Even the knowledge representations that we have devised might be subdivided, actions might be traced across them in multiple ways, and different sections might be activated by particular sets of behaviors by teachers as they engage in classroom practices. Our goal here is to stimulate discourse among researchers and developers in the field that will help to refine the framework and examine its utility in actual practice as programs of professional development and concurrent instrument development proceed.


Acknowledgments


The authors would like to acknowledge the contributions of Jeremy Friedman, who has served as a research associate on this work. The authors would also like to acknowledge Martin Orland for his contributions to and support on this work. Finally, the authors would like to recognize other key individuals in the field of data use who have pushed our thinking, in particular Jeff Wayman, Diana Nunnaley, and Laura Hamilton.


References


Aguerrebere, J. (2009). Panel discussion at Teachers’ Use of Data to Impact Teaching and Learning conference, Alliance for Excellent Education, Washington, DC.


American Educational Research Association. (2000). The standards for educational and psychological testing. Washington, DC: Author.


Ball, D. L., Thames, M. H., & Phelps, G. (2008). Content knowledge for teaching: What makes it special? Journal of Teacher Education, 59, 389–407.


Bandura, A. (1997). Self-efficacy: The exercise of control. New York, NY: W. H. Freeman.


Bettesworth, L. R., Alonzo, J., & Duesbery, L. (2008). Swimming in the depths: Educators’ ongoing effective use of data to guide decision making. In T. J. Kowalski & T. J. Lasley (Eds.), Handbook on data-based decision making in education (pp. 286–303). New York, NY: Routledge.


CCSSO Task Force on Teacher Preparation and Entry Into the Profession. (2012). Our responsibility, our promise: Transforming teacher preparation and entry into the profession. Washington, DC: Council of Chief State School Officers.


Cibulka, J.  (2013, August).  Leveraging the InTASC standards and progressions. Presentation made during the webinar, Next Generation Standards & Accreditation Policies for Teacher Prep. Washington, DC: Alliance for Excellent Education.


Coburn, C. E., Toure, J., & Yamashita, M. (2009). Evidence, interpretation, and persuasion: Instructional decision making in the district central office. Teachers College Record, 111(4), 1115–1161.


Collins, A., & Ferguson, W. (1993) Epistemic forms and epistemic games: Structures and strategies to guide inquiry. Educational Psychologist, 28, 25–42.


Colyvas, J. A. (2012). Performance metrics as formal structures and through the lens of social mechanisms: When do they work and how do they influence? American Journal of Education, 118(2), 167–197.


Confrey, J., Maloney, A. P., Nguyen, K. H., Mojica, G., & Myers, M. (2009). Equipartitioning/splitting as a foundation of rational number reasoning using learning trajectories. Presentation at the conference of the International Group for the Psychology of Mathematics Education, Thessaloniki, Greece.


Council for the Accreditation of Educator Preparation. (2013, August). CAEP accreditation standards. Washington, DC: Author.


Data Quality Campaign. (2014, February). Teacher data literacy: It’s about time. Washington, DC: Author.


Dembosky, J. W., Pane, J. F., Barney, H., & Christina, R. (2006). Data driven decisionmaking in southwestern Pennsylvania school districts. Santa Monica, CA: RAND Corporation.


Dunn, K. E., Airola, D. T., & Garrison, M. (2013). Concerns, knowledge, and efficacy: An application of the teacher change model to data driven decision-making professional development. Creative Education, 4, 673–682.


Dunn, K. E., Airola, D. T., Lo, W. J., & Garrison, M. (2013a). Becoming data driven: The influence of teachers’ sense of efficacy on concerns related to data-driven decision making. Journal of Experimental Education, 81, 222–241.


Dunn, K. E., Airola, D. T., Lo, W. J., & Garrison, M. (2013b).  What teachers think about what they can do with data: Development and validation of data driven decision making efficacy and anxiety inventory. Contemporary Educational Psychology, 38, 87–98.


Ebby, C., Sirinides, P., Supovitz, J., & Oettinger, A. (2013, May). Teacher analysis of student knowledge (TASK): A measure of learning-trajectory-oriented formative assessment. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA.


Hall, G. E., George, A. A., & Rutherford, W. L. (1979). Measuring stages of concerns about the innovation: A manual for use of the SoC Questionnaire. University of Texas, Austin.


Hill, H. C., Rowan, B., & Ball, D. L. (2005). Effects of teachers’ mathematical knowledge for teaching on student achievement. American Educational Research Journal, 42(2), 371–406.


Hill, H. C., Schilling, S. G., & Ball, D. L. (2004). Developing measures of teachers’ mathematics knowledge for teaching. Elementary School Journal, 105(1), 11–30.


Honig, M. I., & Venkateswaran, N. (2012). School-central office relationships in evidence use: Understanding evidence use as a systems problem. American Journal of Education, 118(2), 199–222.


Ingram, D., Louis, K. S., & Schroeder, R. G. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258–1287.


Little, J. W. (2012). Understanding data use practice among teachers: The contribution of micro-process studies. American Journal of Education, 118(2), 143–166.


Mandinach, E. B., Friedman, J. M., & Gummer, E. S. (2015). How can schools of education help to build educators’ capacity to use data? A systemic view of the issue. Teachers College Record, 117(4).


Mandinach, E. B., & Gummer, E. S. (2012). Navigating the landscape of data literacy: It IS complex. Washington, DC and Portland, OR: WestEd and Education Northwest.


Mandinach, E. B., & Gummer, E. S. (2013). Defining data literacy: A report on a convening of experts. Journal of Educational Research and Policy Studies, 13(2), 6–28.


Messick, S. J. (1989). Validity. In R. L. Linn (Ed.), Educational measurement (3rd ed., pp. 13–103). New York, NY: American Council on Education and Macmillan.


Messick, S. J. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher, 23(2), 13–23.


Mislevy, R. J., Behrens, J. T., Bennett, R. E., Demark, S. F., Frezzo, D. C., Levy, R., . . . Winters, F. I. (2010). On the roles of external knowledge representations in assessment design. Journal of Technology, Learning, and Assessment, 8(2). Retrieved from http://ejournals.bc.edu/ojs/index.php/jtla


Mislevy, R. J., & Haertel, G. D. (2006). Implications of evidence-centered design for educational testing. Educational Measurement: Issues and Practice, 25, 6–20.


Mislevy, R. J., & Riconscente, M. (2005). Evidence-centered assessment design: Layers, structures, and terminology (Principled Assessment Designs for Inquiry Technical Report 9). Menlo Park, CA: SRI International.


Mislevy, R. J., Steinberg, L. S., & Almond, R. A. (2003). On the structure of educational assessments. Measurement: Interdisciplinary Research and Perspectives, 1, 3–67.


Moss, P. A. (2012). Exploring the macro-micro dynamic in data use practice. American Journal of Education, 118(2), 223–232.


National Council for Accreditation of Teacher Education. (2010). Transforming teacher education through clinical practice: A national strategy to prepare effective teachers. Washington, DC: Author.


Schilling, S. G., Blunk, M., & Hill, H. C. (2007). Test validation and the MKT measures: Generalizations and conclusions. Measurement, 5(2–3), 118–128.


Shaffer, D. W. (2005). Epistemography and the participant structures of a professional practicum: A story behind the story of journalism 828 (WCER Working Paper No. 2005–8). University of Wisconsin-Madison, Wisconsin Center for Education Research. Retrieved from http://www.wcer.wisc.edu/publications/workingPapers/Working_Paper_No_2005_8.pdf


Shaffer, D. W., Hatfield, D., Svarovsky, G. N., Nash, P., Nulty, A., Bagley, E., . . . Mislevy, R. J. (2007). Epistemic network analysis: A prototype for 21st century assessment of learning. International Journal of Learning and Media, 1(2), 33–53.


Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4–14.


Spillane, J. P. (2012). Data in practice: Conceptualizing the data-based decision-making phenomena. American Journal of Education, 118(2), 113–141.


Supovitz, J., Ebby, C. B., & Sirinides, P. (2013). Teacher analysis of student knowledge: A measure of learning trajectory-oriented formative assessment. Consortium for Policy Research in Education. Retrieved from http://www.cpre.org/sites/default/files/researchreport/1446_taskreport.pdf


U.S. Department of Education, Office of Planning, Evaluation and Policy Development. (2009). Implementing data-informed decision making in schools: Teacher access, supports and use. Washington, DC: Author.


U.S. Department of Education, Office of Planning, Evaluation and Policy Development. (2010). Teachers’ ability to use data to inform instruction. Washington, DC: Author.




Cite This Article as: Teachers College Record, Volume 117, Number 4, 2015, pp. 1–22. https://www.tcrecord.org, ID Number: 17856

About the Author
  • Edith Gummer
    WestEd and the National Science Foundation
    EDITH S. GUMMER was a WestEd senior research scientist and a program officer at the National Science Foundation in the Education and Human Resources directorate. Her research focuses on ways to maximize the use of data to inform instructional decision-making and on policies that balance student data privacy and access to data for researchers. She is currently the director of education research and policy at the Ewing Marion Kauffman Foundation, identifying ways to sustain educational research and development through entrepreneurship.
  • Ellen Mandinach
    WestEd
    ELLEN B. MANDINACH is a senior research scientist and the director of the data for decisions initiative at WestEd. She specializes in research on data-driven decision making. She has written widely on data use at state, district, school, and classroom levels, including two books, Transforming Teaching and Learning Through Data-Driven Decision Making and Data-Driven School Improvement: Linking Data and Learning. She holds a Ph.D. in educational psychology from Stanford University.
 
Member Center
In Print
This Month's Issue

Submit
EMAIL

Twitter

RSS