Supporting Teachers’ Use of Individual Diagnostic Items


by E. Caroline Wylie & Joseph F. Ciofalo - September 05, 2008

This article addresses the concept of using single diagnostic items formatively during the course of instruction, which is the basis of a project in which 4th and 8th grade mathematics and science teachers are incorporating diagnostic items into their everyday classroom practice. In this commentary, we focus on a framework for thinking about sources of misconceptions and how that framework was used to support teachers in developing their own diagnostic items in order to generate richer evidence of student learning. The article concludes with a broader perspective on how this work can influence teaching and learning.

INTRODUCTION


Previously we outlined an approach of using individual diagnostic items as a formative assessment tool (Ciofalo & Wylie, 2006). We described how student misconceptions were used to create diagnostic items and how teachers could integrate them into their instruction. We situated these items within a larger formative assessment framework. Building on the work of Black, Harrison, Lee, Marshall, and Wiliam (2003), diagnostic items become part of an assessment for learning process in which “information about learning is evoked and then used to modify the teaching and learning activities in which teachers and students are engaged [emphasis in the original]” (p. 122).


Following that article, we have continued working with groups of teachers on the Diagnostic Items in Mathematics and Science (DIMS) project. We are now halfway through a randomized controlled trial in which we are exploring the impact of diagnostic items on student learning. We hold to the premise that the utility of a single, well-conceived diagnostic item, with answers collected simultaneously from the entire class, is worth the reliability trade-off (Wylie & Wiliam, 2006): that is, although the reliability of a single item is less than that of a longer test, the immediacy of the information returned and the ease of interpretation make the teacher’s use of the information more likely.


Reflecting on our work, we have shifted the focus from expanding the bank of items to supporting teachers’ use of those items. More recently, the emphasis has been on providing ongoing professional development to support teachers’ effective use of the diagnostic items and to enable them to be resources for each other as they learn to incorporate the items into instruction and to use the evidence to inform next steps.


During this project, the value of helping teachers become more aware of student misconceptions has been confirmed. For many teachers, misconceptions were not something they had previously considered. In written reflections, more than a third of participating teachers indicated that they had not been aware of the breadth of student misconceptions. As one 4th grade mathematics teacher noted, “I never really thought much about misconceptions before DIMS. Listening to student thinking and ideas has been great.” The DIMS items gave the teachers a way to consider students’ approaches to concepts and not just the correctness of an answer. One 8th grade science teacher said, “I enjoyed the free ride these questions granted me into the minds of my students.”


In this commentary we will discuss two related tools that we provided: a classification system for misconceptions and item-writing support. We will describe both and then conclude with a broader perspective on how this work can influence teaching and learning.


CLASSIFYING MISCONCEPTIONS


To help teachers think about misconceptions we developed a classification system. We created a set of categories by looking across the misconceptions that we had previously identified from existing research (e.g., Driver, Squires, Rushworth, & Wood-Robinson, 1994; Stavy & Tirosh, 2000), teaching resources, and experienced practitioners. We identified eight clusters of misconceptions, which are briefly described below along with some examples.


Failure to recognize the limitations of diagrams, models, and other representations: Diagrams, models, and other types of representations used in scientific and mathematical discourse and materials can be inaccurate, incomplete, or poorly scaled, or their limitations may go unstated. While a teacher may understand those limitations, students tend to interpret representations more literally. For example, some students may interpret a commonly used poster of the Solar System, with the Sun on the left and the planets arranged horizontally, to mean that Jupiter is the center of the Solar System. In mathematics, younger students rarely see triangles that do not have a horizontal base and may describe them as “upside down triangles.”

 

Going too far with abstractions, generalizations, and simplifications: Characteristics and descriptions of mathematical procedures, expressions, or concepts (and of scientific groups and objects) can be too abstract, too general, or oversimplified. For example, students need to understand when certain mathematical “truths” no longer universally apply, e.g., that addition always results in a larger number (until negative numbers are involved). In science, students need to know that not all birds fly.


Confusion with language and vocabulary: Words and phrases used in everyday communication can convey a confusing or completely different meaning in scientific or mathematical communication. For example, a student meeting the term “continental drift” for the first time may apply the everyday meaning of “drift” and, as a result, think that the continents actually float upon the oceans. Similarly, mathematical terms such as mean, plane, and point have meanings that differ from everyday language.


“Facts”: “Facts” based solely on intuition or faulty reasoning can be misleading, inaccurate, or incomplete in relation to both scientific and mathematical concepts. For example, students sometimes incorrectly think that the first digit to the right of the decimal point is the “oneths” place (trying to be symmetrical around the decimal point), or that heavier objects fall faster.


Real-world experiences and perceptions can appear contradictory to scientific learning: Everyday life and perceptual experiences can be misleading, inaccurate, or incomplete in relation to various scientific explorations, studies, or concepts. For example, the five human senses seem infallible to young students, yet human vision differs from that of certain insects, which see in the UV portion of the spectrum; it is hard to believe that sounds exist that humans cannot hear; and touch is deceptive, so wood feels “warmer” than metal in the same room.


Common sayings, beliefs, and myths: Familiar sayings, beliefs, or myths can be misleading, inaccurate, or incomplete in relation to various scientific explorations and concepts. For example, humans coexisted with (or killed off) the dinosaurs; meteors are “falling stars.”


Metaphors and analogies that might only partially explain a scientific concept: Metaphors and analogous comparisons of scientific objects or concepts with everyday terms can be misleading, inaccurate, or incomplete. For example, the following analogies are partly helpful, but each has limitations: electricity as water flowing; the heart as a pump; the eye as a camera.


Equivalencies: Various mathematical procedures or concepts are sometimes viewed incorrectly as being equivalent. For example, students may try to manipulate fractions using whole number reasoning, think that calculations involving time are the same as those using decimals, or assume that subtraction and division are commutative, like addition and multiplication.


This classification system became a resource to help teachers make sense of the many misconceptions we had presented to them and the ones their students brought to class every year. Furthermore, it helped them become more aware of additional misconceptions that we had not documented. It was also useful as a starting point for the teachers as they began thinking about writing their own diagnostic items.


ITEM-WRITING SUPPORT


We encouraged teachers to write diagnostic items for two reasons. First, the set of DIMS items, while large, does not include items for every possible instructional topic. Second, and related, we wanted teachers to develop a routine of using these items as part of regular instruction. We recognized that, while teachers ask questions frequently, writing these targeted diagnostic items is a skill that takes time to develop. Thus, we devoted time to this topic at two of the professional development sessions. We presented the teachers with a simple item-writing heuristic. The steps are outlined below.


Identify the content: The starting point is the content of current instruction and the critical knowledge, skills, or abilities that teachers want their students to grasp before building on that understanding.


Identify potential student misconceptions: We suggest that teachers use the misconception classification to think about potential student misconceptions related to the selected content. Teachers can draw from patterns of student misunderstandings or confusions that they have observed, teaching materials, or the misconceptions that we have identified.


Draft the item: One or more misconceptions form the starting point for incorrect answer choices, and the item itself is written to trigger the incorrect answers just identified. Teachers also need to generate a correct answer choice. For example, an item on decimal place value might offer the “oneths” error described earlier as one of the incorrect choices. (In some instances, particularly in mathematics when there are multiple representations for answer choices, we have found that multiple correct answers can lead to rich instructional opportunities.)


Try out the item: The teachers share drafts of their items with peers, explaining the content, context, and the kinds of misconceptions they are trying to identify through the particular item. The feedback often results in revisions. They then use the items with their students and sometimes make revisions afterwards to further clarify or improve them.


The heuristic is straightforward, but the process takes time and effort. At the end of this year, having provided examples, modeled the process, and had teachers try it, we observed that some had taken to writing items frequently, while others were more hesitant. However, several teachers told us that they were collecting information from their students in preparation for writing items in the upcoming year. One 8th grade science teacher explained, “At this moment I haven’t written diagnostic questions; instead, I am collecting my students’ words (misconceptions) to use in future questions I will create.” This is a useful strategy: listening to students to identify underlying misconceptions, with the intention of using that information to generate future items that would quickly show how many other students in the class share those misconceptions.


CONCLUSIONS


Our initial goal with the project was to support teachers’ use of diagnostic items as a tool to identify student misconceptions in the course of instruction, either before introducing new learning or as a check for understanding. The goal has been to help teachers know not just that students do not understand a particular concept, but what, specifically, they do not understand (Ciofalo & Wylie, 2006).


As time has progressed, however, we have come to see this project as more than the creation of a large bank of diagnostic items and a program of teacher professional development focused on the use of that resource. Rather, the DIMS project appeals even to teachers who have not previously thought about formative assessment. Starting with a ready-made resource has provided an entry point into the larger ideas of formative assessment: using evidence of learning to direct instruction (Leahy, Lyon, Thompson, & Wiliam, 2005). By providing teachers with a more systematic way to think about student misconceptions and student thinking, and by helping them see that item-writing is something they can do themselves, we have been able to broaden teachers’ views of student learning and support them in understanding how they can approach teaching more diagnostically.


Acknowledgements

We gratefully acknowledge the support of the Institute of Education Sciences, which made this work possible. Funding was provided under grant number R305K040051.


Any opinions expressed [in the publication] are those of the author(s) and not necessarily of the Educational Testing Service.


References


Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2003). Assessment for learning: Putting it into practice. Buckingham, UK: Open University Press.


Ciofalo, J. F., & Wylie, E. C. (2006). Using diagnostic classroom assessment: One item at a time. Teachers College Record. http://www.tcrecord.org/content.asp?contentid=12285


Driver, R. A., Squires, A., Rushworth, S. P., & Wood-Robinson, V. (1994). Making sense of secondary science: Research into children’s ideas. London and New York: Routledge Falmer.


Leahy, S., Lyon, C., Thompson, M., & Wiliam, D. (2005). Classroom assessment: Minute-by-minute and day-by-day. Educational Leadership, 63(3), 18-24.


Stavy, R., & Tirosh, D. (2000). How students (mis-)understand science and mathematics: Intuitive rules. New York: Teachers College Press.


Wylie, E. C., & Wiliam, D. (2006, April). Diagnostic questions: Is there value in just one? Paper presented at the annual meeting of the American Educational Research Association (AERA) and the National Council on Measurement in Education (NCME), San Francisco, CA.




Cite This Article as: Teachers College Record, Date Published: September 05, 2008, https://www.tcrecord.org, ID Number: 15363


About the Authors
  • E. Caroline Wylie, Educational Testing Service
    CAROLINE WYLIE is a researcher at Educational Testing Service. Her research interests include psychometric issues and assessor training for performance assessments, teacher licensure/certification, formative assessment, and the creation of sustainable, scalable professional development for teachers.
  • Joseph Ciofalo, Educational Testing Service
    JOSEPH CIOFALO is a developer/facilitator at Educational Testing Service. His educational interests include advanced teacher certification, teacher performance assessments, formative assessment, and the creation of sustainable, scalable professional development for teachers.
 