
Integrating Computer-Based Curricula in the Classroom: Lessons from a Blended Learning Intervention


by J. Cameron Anglum, Laura M. Desimone & Kirsten Lee Hill - 2020

Context. This study analyzes the implementation of a blended learning middle school mathematics intervention in a large urban school district in the northeastern United States. Blended learning models integrate online instructional tools within traditional methods of classroom instruction.

Focus of Study. As their use increases in classrooms across the country, much remains unknown about how teachers integrate blended learning strategies into their pedagogical practices and what factors, including school, teacher, and student attributes, facilitate or hinder these approaches. Our findings provide insight into how teachers integrate computer-based curricula in their classrooms, findings particularly instructive for under-resourced urban school districts.

Research Design. The study is designed as part of a within-teacher randomized controlled field trial, a design that enables direct comparisons of teacher practices between each teacher's two mathematics classrooms. To draw these comparisons, we utilize a range of detailed teacher survey data as well as rich teacher interview data.

Conclusions. We believe our findings about the choices teachers make in using software in the classroom and the barriers they face in doing so are applicable to the implementation of a wide variety of computer-based interventions in urban environments, whether they are part of curricular innovations, blended learning instructional strategies, or targeted academic interventions.



INTRODUCTION & OBJECTIVES


Blended learning models have increased markedly in popularity in K-12 settings over the past two decades (Means, Toyama, Murphy, & Baki, 2013). Sometimes referred to as a hybrid model of instruction, blended learning approaches typically combine elements of traditional classroom instruction with supplementary online tools (Osguthorpe & Graham, 2003). It is estimated that more than six million K-12 students across all 50 states participate in some type of online learning experience each year during their schooling (Picciano, 2016). Despite the increased popularity of these approaches, we know little about what key trade-offs teachers make to incorporate blended learning in the classroom, and what hinders and facilitates teachers' and students' use of such approaches (e.g., Ertmer, Ottenbreit-Leftwich, Sadik, Sendurur, & Sendurur, 2012; Wachira & Keengwe, 2011).


In this study, we examine the implementation of a blended learning intervention in middle school mathematics classrooms. Our analysis of teacher implementation surveys and student-user software statistics, combined with teacher interview data, provides a systematic and rich description of implementation, with findings generalizable to blended learning environments. This implementation study was conducted within the context of a within-teacher randomized controlled trial (RCT), funded by the Institute of Education Sciences (IES), that was designed to test the effectiveness of adaptive, computer-based perceptual learning modules (PLMs) at improving mathematics achievement for students in the sixth grade. The RCT took place in a large, diverse, and under-resourced urban school district in the northeastern United States burdened by severe budget shortfalls and extensive teacher layoffs in the years prior to and during implementation. Our implementation analysis included a sample of 43 teachers and 2,114 students who completed end-of-unit surveys and assessments, respectively.


The PLMs provided computer-based units that corresponded to two specific areas of content, fractions and linear and area measurement, covered by participating schools' mathematics curricula. The PLMs targeted these content areas because they had been previously identified as difficult for sixth-grade students (see National Mathematics Advisory Panel, 2008; see also Lehrer, Jaslow, & Curtis, 2003; Hiebert, 1981a, 1981b; Behr, Harel, Post, & Lesh, 1992).


Recent research points to teacher-level implementation variation as one of the key explanations for the success or failure of a classroom intervention (Carroll et al., 2007; Harn, Parisi, & Stoolmiller, 2013). The results of the PLM RCT showed that the intervention did have positive effects on student achievement. Our implementation study was designed to understand how and why teachers varied in their implementation of the PLM intervention, and to learn about contextual factors that could hinder or facilitate implementation. We provide evidence to identify and explain implementation variation and offer lessons that are broadly applicable to other computer-based interventions in urban settings. Specifically, our analysis aims to address the following research questions:


1. How do teachers organize instruction when they integrate a software-based intervention in their classes?

   a. What changes in instructional strategies and classroom management were fostered by the intervention?

   b. How do teachers use data generated by the software to shape classroom instruction?

2. What factors hinder or facilitate the implementation of the PLM intervention?

3. How often did students use the intervention software, and how well did they perform?

4. What are the implications of our findings for the implementation of software-based interventions as curricular complements?


LITERATURE REVIEW


Within the broader context of technology integration in the classroom, we consider the PLM intervention to be an example of a blended learning instructional approach. The aim of blended learning is "to find a harmonious balance between online access to knowledge and face-to-face human interaction" (Osguthorpe & Graham, 2003, p. 228). Blended learning typically incorporates three elements: (a) a portion of learning occurs online with at least some student control over pacing; (b) part of instruction is maintained in traditional school environments; and (c) online and traditional instruction are coordinated to provide an integrated student experience within a subject (e.g., Staker, 2011; Staker & Horn, 2012). Student pacing control and the symbiotic relationship between online and other instructional modalities are commonly emphasized as crucial blended learning components. The PLM intervention is consistent with these elements, and given the rapid expansion of blended learning in K-12 settings (Means et al., 2013), our study provides timely insights on how to effectively implement such interventions.


Figure 1 outlines our logic model for the classroom implementation of a blended learning intervention. The model focuses primarily on the mediators and moderators of implementation and their impact on both the intervention's implementation and student outcomes; each component is discussed below in the context of our research questions.



Figure 1: Logic Model of the Implementation of a Blended Learning Intervention






INSTRUCTIONAL CHOICES


While teachers must make instructional decisions in order to integrate any new practice in their classrooms, we know especially little about the tradeoffs they make when integrating blended learning strategies. Because integrating blended learning into classrooms typically involves "more than the simple combination of (face-to-face) teaching plus e-learning," teachers must make important decisions when implementing such approaches (Kerres & Witt, 2003, p. 106). For example, teachers must decide how to integrate the technology, when to interact with students (as opposed to allowing students to struggle), how to structure class time (e.g., to use peer learning or rotations), and how much traditional instruction to eliminate and when to do so (Christensen, Horn, & Staker, 2013). In developing their approach to blended learning, teachers have many options, and yet we know little about how different school and classroom contexts influence these decisions (Kerres & Witt, 2003). Moreover, implementers of blended learning methods are cautioned to avoid adopting one-size-fits-all strategies agnostic to a teacher's strengths and specific classroom environments. As Kerres and Witt (2003) noted, "It remains difficult to formulate general guidelines for the specification of blended learning arrangements. Learners, for example, evaluate communication tools and scenarios differently and according to their personal situation" (p. 111).


Even though technology is used increasingly as an instructional intervention, the relative value of introducing technology in the classroom is often debated. Zhao, Pugh, Sheldon, and Byers (2002) referred to the integration as "the messy process through which teachers struggle to negotiate a foreign and potentially disruptive innovation into their familiar environments" (p. 483). Common criticisms of the use of technology in the classroom include cost, lack of teacher preparedness, conflicts with teacher pedagogical preferences, conflicts with school norms, and conflicts with traditional subject-matter teaching methods (Hew & Brush, 2007). Grimes and Warschauer (2008) noted that these issues might be exacerbated in school environments with high counts of students of low socioeconomic status, and Windschitl and Sahl (2002) noted that technological integration in classes taught by novice teachers may pose additional challenges. Such critiques cast doubt over the implementation of one-size-fits-all approaches characteristic of certain technological interventions, including one-to-one laptop distribution, if implemented in a manner agnostic to school-, teacher-, and student-specific attributes.


Although it is clear that teachers in blended learning environments engage in decision-making processes concerning classroom management strategies and pedagogical approaches, there is limited empirical evidence concerning what tradeoffs teachers make in implementing such decisions. Similarly, there is limited evidence regarding whether teachers make these decisions systematically based on factors including classroom attributes and pedagogical preferences and how these decisions vary by school- and student-specific contexts. For example, while it is known that teachers often pursue models that rotate students through activity groups to use technology, it is less clear how teachers decide to group their students and whether groupings and activities are differentiated consistently across classroom contexts in attempts to benefit students with a range of abilities (Hew & Brush, 2007).


Staker and Horn (2012) depicted four models of blended learning: the rotation model, flex model, self-blended model, and enriched-virtual model. Only in the rotation model do teachers, either through a fixed schedule or at their discretion, maintain some level of control over student rotation through learning modalities. Within the rotation model, Staker and Horn identify four sub-models: station rotation, lab rotation, flipped classroom, and individual rotation. In these models, teachers make decisions about the amount of time students spend learning through traditional whole-class instruction, group work, individual online work, and individual paper-and-pencil work, among other forms of instruction. These models differ in the extent to which teachers direct student rotation through instructional stations and whether each station is situated within teachers' classrooms or within other areas of the school, including computer labs.


In rotation model settings, little is known about the direct instructional tradeoffs teachers make to incorporate online learning strategies. Russell, Bebell, and Higgins (2004) found that students in classrooms with one-to-one computer access engaged less frequently in large group work, small group work, pair work, and whole class discussion, compared to students with shared computer access. In contrast, Grimes and Warschauer (2008) found that technological integration in the classroom fostered improved student learning autonomy, a byproduct of the individualized learning that adaptive computer technology may offer its users. Given the increasing use and popularity of blended learning strategies, a greater body of empirical evidence regarding their implementation will help promote the effective evaluation of technology integration in the classroom. Such evidence may shed light on the tradeoffs in instructional strategies teachers make in order to implement such strategies, contextualize what is gained and lost in blended learning environments on the whole, and differentiate how specific contexts may influence such processes.


DATA USE


In the post-No Child Left Behind landscape, effective use of student data has been prioritized as a central lever to improve student learning and achievement. These data-utilization mandates exist at all levels of the school system, from national and state analyses to specific school district and classroom applications (Bakia, Means, Gallagher, Chen, & Jones, 2009). Although principals often are identified as key catalysts of data use in schools, teachers play vital roles in establishing practices that reinforce the importance of using student data to improve teacher practice (Blanc et al., 2010).


Recent work has examined how teachers in different teaching environments utilize data in their classroom practices. A 2017 RAND study on personalized learning compared a nationally representative sample of teachers to those in schools that received funding to implement personalized learning initiatives, including technology-enabled learning (Pane, Steiner, Baird, Hamilton, & Pane, 2017). The study found that a majority of nationally representative teachers use student achievement data at least a moderate amount to (1) identify topics to further emphasize (52%); (2) tailor individualized instruction (53%); and (3) modify the pace of instruction (57%). Teachers participating in a personalized learning environment, however, engaged in these practices more frequently (69%, 69%, and 66%, respectively).


Although there is great potential for teacher data use to identify where students are struggling and consequently which content areas to emphasize, many schools fail to provide the necessary time, training, and access required for teachers to effectively consume and integrate student-generated data to improve their instructional practices (Marsh, 2012). To address this shortcoming, many districts have employed a range of coaching strategies to facilitate teacher use of student learning data. Marsh, Bertrand, and Huguet (2015) found the use of coaches and professional learning communities to be associated with changes to teachers instructional delivery beyond minor changes to material or content matter. These changes often were achieved through the combination of individual comprehension (vertical expertise) and co-created understanding (horizontal expertise). Much remains unknown, however, about how teacher data use might contribute to changing how teachers organize their overall instruction. In our study, we explore how teachers use this real-time data. Specifically, we consider whether and how they use the data to shape the daily warm-ups, content review, and one-on-one work with students. If, as with many other data-use interventions, teachers decline to use such data to change their instruction, we consider why they make this decision.


BARRIERS TO TECHNOLOGICAL INTEGRATION


Prior work has shown that software-based interventions may be impeded by several key factors, including teacher, student, and school characteristics. These factors are discussed below.


Teacher-Level Factors


Teachers' prior use of and comfort with technological teaching resources, as well as teachers' existing curricula and pedagogical preferences, have been found to impact the integration of instructional technologies within existing classroom learning environments (Earle, 2002). Reviewing the literature regarding technology integration in K-12 classrooms from 1995 to 2006, Hew and Brush (2007) found resource access, teacher skills, and teacher attitudes and beliefs about technology to be the most commonly cited barriers to the kind of successful integration that occurs, for example, as part of strong blended learning environments. Mueller, Wood, Willoughby, Ross, and Specht (2008) categorized barriers to technological integration into environmental and teacher-specific characteristics, suggesting that teachers with higher levels of technological exposure and more positive attitudes about the effectiveness of technology as an instructional tool are more likely to fully and successfully integrate technological resources in their classrooms. And while they found no relationship between teaching experience and effective technology integration, they suggested that continual changes in technology may result in teachers being "perpetual novices in the process of technology integration" (p. 1524). Further, Mueller et al. argued that novice teachers may be more concerned with issues associated with classroom management and curriculum development than with technological integration in the classroom.


Student Characteristics


As resource access and teachers' technological aptitudes have improved in many settings over the past decade, research concerning prevalent barriers to classroom technology assimilation has shifted to focus on teacher beliefs about effective pedagogy and individualized student-centered learning (Ertmer et al., 2012). In this vein, student characteristics, including motivation and self-efficacy, have been found to be strong predictors of academic success in blended learning environments, requiring teachers to adapt pedagogical techniques to a diversity of student learning styles (Clayton, Blumberg, & Auld, 2010). In integrating technology into the classroom, teachers also must remain cognizant of a wide range of student prior experience with technology, due in part to inconsistent technology application by each student's prior teachers (Wachira & Keengwe, 2011). Teachers, however, often rely on preconceptions about stable student characteristics, including English language learner (ELL) and special education designations, to form low achievement expectations (Bertrand & Marsh, 2015). These low expectations also may be reinforced through teacher knowledge of prior student achievement data.


In the implementation of personalized learning strategies, including blended learning, students typically assume greater autonomy over their learning through control over pacing and self-regulation of behavior and academic progress (Bingham, 2017). Teachers and students alike, however, may be unprepared to successfully channel individual student autonomy into improved academic achievement as classroom technology use increases. In fact, recent evidence has demonstrated that student misbehavior represents a significant obstacle to effective instruction in personalized learning environments for a majority of teachers (Bingham, 2016; Pane et al., 2017). Bingham (2016) conducted a case study of an urban charter high school that attempted to implement a personalized learning curriculum through blended learning strategies. Through classroom observations, teacher and staff interviews, and document analysis, Bingham found that teachers spent a significant portion of their instructional time attempting to address student misuse of technological resources, a struggle that often led teachers to decrease their use of technological resources for instruction. In subsequent work, Bingham (2017) advised that the purposeful development of student autonomy, perhaps through structured instructional models characterized by environments of discipline and accountability, may better support the introduction of personalized learning.


School-Level Factors


School-level resource limitations persist in many urban school environments (Wachira & Keengwe, 2011). Describing an intervention in Canadian schools, Hechter and Vermette (2013) drew attention to tremendous disparities across school settings, factors particularly salient in the integration of technology in urban school settings in the United States. The authors explained, "In these diverse rural and urban settings, teachers meet the needs of their students through rich and varied technological access, at times, or the complete lack thereof, at others" (p. 75). In many schools, the quality of the technological infrastructure (e.g., internet bandwidth, sufficient availability of functioning hardware to enable flexible class scheduling), technical support, and data management capabilities remain significant barriers to effective technology use (Bailey, Schneider, & Vander Ark, 2013; Bingham et al., 2018). Finally, additional concerns regarding the implementation of blended learning interventions include insufficient professional development (Drexler, Baralt, & Dawson, 2008), new challenges in classroom management, and a lack of additional preparation time for teachers (Beaver, Hallar, Westmaas, & Englander, 2015).


Implications for Technological Integration


As the use of blended learning approaches in K-12 settings proliferates, it is critical that the field continue to increase our understanding of the trade-offs teachers make in their instructional approaches, how these tradeoffs impact student learning, and how we can best support teachers in implementing technological interventions to produce positive effects. For example, how do teachers decide whether to implement whole-class or partial-class rotations? In what ways do teachers utilize real-time data generated by classroom technological resources? What instructional adaptations and trade-offs do teachers make to accommodate the insertion of technological interventions into their existing curricula? What contextual factors hinder or facilitate teachers' use of blended learning approaches? Building on this important work, the study at hand intends to shed light on the ways in which teachers integrated online instruction with traditional instruction in a manner sensitive to their particular classroom environments and their broader context, namely their location in predominantly under-resourced urban schools.


THE PERCEPTUAL LEARNING MODULES (PLM) INTERVENTION


As indicated previously, this study was conducted within the scope of a larger within-teacher RCT conducted over two sixth-grade student cohorts during the 2013-2014 and 2014-2015 school years, respectively. Consistent with a within-teacher RCT design, to be eligible to participate in the study, each teacher was required to teach at least two classes of sixth-grade mathematics in the given school year. One class was randomly assigned to treatment status wherein the teacher implemented the PLM intervention (hereafter referred to as the PLM class). In the control class, teachers were instructed to follow standard classroom practices without incorporating any element of the PLM intervention; that is, we asked teachers to refrain from using the software in the control classrooms, from employing instructional strategies they may have gleaned from student software use in the intervention class, and from altering their pacing based on progress in the PLM classroom. Such a within-teacher design has strong internal validity because it holds constant across-teacher variation (see Agodini, Dynarski, Honey, & Levin, 2003), thereby helping to eliminate teacher characteristics that may explain study results.
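
To make the class-level assignment mechanism concrete, the following minimal Python sketch illustrates randomization within each teacher. It is an illustration under assumed inputs (the teacher and class identifiers are hypothetical), not the study's actual assignment procedure.

import random

def assign_within_teacher(teacher_classes, seed=0):
    """Randomly assign one of each teacher's two eligible classes to the
    PLM (treatment) condition; the other class serves as the control."""
    rng = random.Random(seed)  # fixed seed keeps the assignment reproducible
    assignments = {}
    for teacher_id, classes in teacher_classes.items():
        if len(classes) != 2:
            raise ValueError(f"{teacher_id}: within-teacher design requires exactly two classes")
        plm_class = rng.choice(classes)
        for c in classes:
            assignments[c] = "PLM" if c == plm_class else "control"
    return assignments

# Hypothetical example: two teachers, each with two sixth-grade math classes.
print(assign_within_teacher({"T1": ["T1-A", "T1-B"], "T2": ["T2-A", "T2-B"]}))

Because assignment occurs within teacher, any stable teacher characteristic is balanced across conditions by construction.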


In preparation for the intervention, teachers participated in professional development training sessions specific to the PLM intervention, including topics such as academic motivation, software logistics, and integration of the computer program in their classrooms. Introductory sessions in the August prior to each cohort ran for approximately 16 hours and were supplemented by three one- to two-hour sessions during the school year. Prior to implementation, research team members coordinated with teachers to identify areas in their curricula where they could supplement or replace elements of existing lesson plans with the intervention modules. Teachers were asked to direct their students to use the PLMs for 20 to 30 minutes at a time, several times a week, for 3 to 4 weeks per module (or approximately 8 to 16 hours for the entire school year). A module was defined as a set of lessons that covered a content area within one of the two curriculum topics. While researchers provided resources and guidance, such as curriculum mapping regarding PLM content areas, the final decisions regarding the strategy and timing of PLM integration in the classroom were left to teachers' individual discretion in consideration of their respective computer availability, class schedules, and curricular details, among other contextual factors. Additional technical support was offered throughout the school year to ensure access to functioning technology.


STUDY DESIGN AND SAMPLE


As described above, the PLM intervention was evaluated as a within-teacher RCT, a design that supported a straightforward contrast of teacher practices in their intervention classrooms with their respective control classrooms. Further, the nature of the computer-based intervention, with treatment-class, student-specific software logins, minimized potential treatment spillover effects in the control classes. Prior to each school year, we recruited and enrolled teachers in the intervention. Thirty-four teachers enrolled in Cohort 1 and 35 teachers enrolled in Cohort 2. A number of teachers in each enrolled cohort withdrew from the study prior to implementation for a variety of reasons, including teacher reassignment and school staff layoffs. The final teacher analysis sample consisted of 43 unique sixth-grade mathematics teachers who completed end-of-unit surveys and assessments. Teachers were spread across two cohorts; 19 teachers participated in both cohorts, and 24 teachers participated in either Cohort 1 or Cohort 2. We include teacher survey responses only from the first year of participation to avoid any bias in our analysis due to changes in teacher practices from a first to a second year of implementation. Table 1 summarizes the demographic information, educational qualifications, and professional experience of the teachers included in the final analysis.



[Table 1: Demographic information, educational qualifications, and professional experience of teachers in the analysis sample]



At the point of enrollment, each teacher's two classes were randomly assigned to either the PLM or control condition. These classes comprised 3,780 students who were assigned to classes before the study's randomization process. In addition to teachers who withdrew, a number of students transferred away from a participating class or school during the intervention. Finally, a number of students were absent or suspended from school during final assessments. The final student analysis sample consisted of 2,114 students, 1,037 PLM students and 1,077 control students, each of whom completed end-of-unit assessments. Table 2 summarizes demographic information and academic achievement of students included in the final analysis.



Table 2. Characteristics of Students in Analysis Sample

                          Cohort 1                                Cohort 2
                  Unit 1              Unit 2              Unit 1              Unit 2
                Control    PLM      Control    PLM      Control    PLM      Control    PLM

Minority         0.66      0.63      0.66      0.61      0.68      0.66      0.69      0.66
                (0.47)    (0.48)    (0.47)    (0.49)    (0.47)    (0.47)    (0.46)    (0.47)

Female           0.52      0.53      0.51      0.52      0.53      0.52      0.53      0.52
                (0.50)    (0.50)    (0.50)    (0.50)    (0.50)    (0.50)    (0.50)    (0.50)

FRPL             0.69      0.68      0.68      0.67      0.64      0.69      0.64      0.68
                (0.47)    (0.47)    (0.47)    (0.47)    (0.48)    (0.46)    (0.48)    (0.47)

IEP              0.09      0.09      0.10      0.10      0.06      0.09      0.07      0.09
                (0.29)    (0.29)    (0.30)    (0.30)    (0.25)    (0.29)    (0.26)    (0.28)

ELL              0.04      0.08*     0.04      0.07*     0.07      0.05      0.06      0.05
                (0.19)    (0.27)    (0.19)    (0.26)    (0.25)    (0.21)    (0.24)    (0.22)

Math             0.23      0.12      0.21      0.12      0.11      0.00      0.09      0.01
achievement     (0.98)    (0.99)    (0.96)    (1.01)    (1.03)    (0.96)    (1.02)    (0.93)

Reading          0.21      0.14      0.20      0.13      0.11      0.01      0.08      0.04
achievement     (0.99)    (0.97)    (0.98)    (1.00)    (1.02)    (0.94)    (1.03)    (0.93)

Students          515       498       455       427       562       539       596       592

Notes: Proportion (standard deviation) of sample characteristics reported. Minority status indicates students not identified as White or Asian. Math and reading scores are scaled as standard deviations and grand-mean centered.
* indicates a statistically significant difference at the α < 0.05 level.




IMPLEMENTATION DATA


Demographic Surveys


At the beginning of each cohort, teachers participated in an online survey to report on their demographic information, educational backgrounds, credentials and certifications, and teaching experience. Each participating teacher completed this survey once.


Implementation Surveys


At the conclusion of each unit of the intervention, after students completed the unit assessment, researchers administered a web-based implementation survey to each teacher. The implementation surveys covered a range of topics related to instruction and the intervention, including instructional strategies, hindrances to the implementation, and teacher use of intervention data. Survey items drew from previously validated surveys and went through multiple iterations of expert review and cognitive interviews (Desimone & Le Floch, 2004). In Cohort 1 (2013-14 school year), 26 of 27 teachers completed the Unit 1 survey and 22 of 23 teachers completed the Unit 2 survey. Some teachers in Cohort 1 were unable to completely implement Unit 2 and therefore were not asked to complete an implementation survey for the unit. In Cohort 2 (2014-15 school year), 11 of 11 first-time participating teachers completed the Unit 1 survey and 10 of 11 first-time participating teachers completed the Unit 2 survey. In total, across both units and cohorts, first-time participating teachers completed 69 implementation surveys.


Teacher Interviews


To supplement teacher survey responses and learn more about teacher-level implementation of the intervention, we conducted in-depth teacher interviews. Specifically, a subset of teachers in each cohort were asked to discuss their classroom implementation of the PLM intervention, including the structure and teaching strategies used in their PLM class, comparisons between their PLM and control classes, hindrances to implementation, and their perceived impact of the intervention on their students and their teaching. The interview protocols went through multiple iterations of expert review and cognitive interviews (Author, 2002). In identifying interview candidates, we chose a sample that captured maximum variation across teacher- and school-level characteristics that we hypothesized were most likely to impact implementation. Such characteristics included teacher comfort with technology, teaching experience, mathematics curriculum, school environment, organization of implementation, and cohort. Using these guidelines, 10 teachers in Cohort 1 were identified and contacted to be interviewed. Of these 10, we successfully interviewed nine teachers. In Cohort 2, we identified and contacted 12 teachers and successfully interviewed 10 teachers. In total, we interviewed 19 teachers; each of the interviews took place in May or June of the respective school year, at which point teachers had completed a majority of the PLM intervention. All interviews lasted approximately 45 minutes and were recorded and transcribed.


Student-User Data


The PLM software collected a variety of data on each student participant and each teacher, including number of sessions and problems completed, problem accuracy, and module mastery. Software data were generated by the 1,037 students who participated in the intervention's PLM classes and completed end-of-unit assessments.


ANALYTIC STRATEGY


Using survey data, we conducted descriptive analyses to describe the sample of teachers, their pedagogical practices, and classroom hindrances to the intervention's implementation. Exploiting the advantages of the within-teacher design of the intervention, we also used within-subject analysis of variance (ANOVA) to quantify tradeoffs in instructional strategies each teacher made in implementing the PLM intervention in comparison to his or her control classroom, as reported by teachers on the implementation surveys. The within-subject design enables direct comparison of each teacher's PLM and control classes (for further review, see Charness, Gneezy, & Kuhn, 2012); difficulties often associated with within-subject designs, including contamination or spillover effects, were minimized through strict restrictions regarding use of the intervention software in control classes.1 Using a mixed-methods approach (Author, 2004; Johnson & Onwuegbuzie, 2004), we supplement quantitative survey and software data analysis with qualitative interview analysis to facilitate an in-depth understanding of variation in implementation within and across teachers. Guided by our logic model, interview data were coded for instructional strategies and tradeoffs in instructional choices as well as facilitators of and barriers to implementation of the PLM intervention (see Appendix B for further detail).
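
To illustrate the within-subject comparison, the sketch below fits a one-factor repeated-measures ANOVA with teacher as the subject factor, using statsmodels; with only two conditions (PLM vs. control), this is equivalent to a paired t-test. The data frame and column names are hypothetical stand-ins for the survey measures, not the authors' actual code or data.

import pandas as pd
from statsmodels.stats.anova import AnovaRM

# One row per teacher per condition; the outcome is a teacher-reported
# percentage of class time (values here are invented for illustration).
df = pd.DataFrame({
    "teacher_id": ["T1", "T1", "T2", "T2", "T3", "T3"],
    "condition": ["PLM", "control"] * 3,
    "pct_peer_work": [10.0, 16.0, 12.0, 18.0, 8.0, 15.0],
})

# Within-subject ANOVA: each teacher contributes an observation to both cells.
result = AnovaRM(df, depvar="pct_peer_work", subject="teacher_id",
                 within=["condition"]).fit()
print(result)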


RESULTS


Organization of Instruction


Descriptive analysis of survey data revealed that teachers varied in how they elected to deliver the intervention. Half of the teachers reported implementing the PLMs with only a portion of their class at a given time rather than with all students simultaneously. We hypothesized that resource constraints, specifically a lack of computers, would motivate teacher decisions to implement partial rather than whole-class groupings. In interviews, several teachers attributed this practice to school resource constraints (e.g., a lack of computers), as we conjectured.


In addition, however, several teachers cited a variety of alternate factors that led them to implement the PLM using partial-class groupings, including pedagogical preference and student attributes such as special education or ELL designations. For instance, two teachers indicated they would have elected to use split-class instruction irrespective of their computer access, emphasizing that the PLM intervention offers a pedagogical advantage of increased opportunity for individualized student attention; that is, the teacher could work individually with some students while others were working on the computers. In these instances, the choice to use partial-class instruction for the PLMs seems to be motivated by teachers' beliefs about effective instruction and not access to resources. Noting a benefit of split-class instruction, Jennifer2 commented, "I like doing half and half because it's easier to teach like, you know, the first, you know, fourteen people, kids up front and I can circulate a little more." Another teacher noted a benefit to split-class implementation due to the student makeup of his class, particularly its high number of ELL students. Michael explained:


The technology lent itself to me to help my native speakers help my ELLs so it, it works out that not even having enough computers was actually a good thing so the ELLs could sit and watch so it's a good thing. And then my native speakers can help. When they're helping the ELL students, they actually have a good little dialogue.


Classroom qualities such as limited access to computers, harder-to-serve student populations, and teacher discretion in adapting to such circumstances are factors of implementation that may be common to technological interventions. Further, such issues may be more prevalent in under-resourced school settings such as the fiscally challenged urban school district in which the study is situated. While we do not directly compare the provision of technological resources in urban schools versus their non-urban counterparts, our findings support the forewarning by Wachira and Keengwe (2011) that shortcomings in technological resource provision may persist in urban school districts. Further, our findings illustrate how specific circumstances or contexts, such as having a substantial number of ELL students, may influence how teachers implement technology. Such differentiated teacher decision making recalls Kerres and Witt's (2003) warning that any universal guidelines for classroom technology integration likely suffer from a lack of the contextual understanding crucial to the learning of specific groups of students. Consequently, we believe these mediators of implementation warrant further investigation, particularly in contexts of diverse student bodies with varying access to technological instructional tools. On the one hand, our analysis supports the notion promoted by Wachira and Keengwe and others that technology provision in urban school districts remains incomplete. As a result, strategies aimed to improve student achievement via technological interventions must not assume that existing school resources will be sufficient for all methods of implementation. On the other hand, our results also highlight a potential adaptation teachers may make to such circumstances to accommodate specific student populations, ELL students in this context. These adaptations may serve as models for the potential of blended learning to support student learning in creative ways, such as providing opportunities for peer-to-peer learning and more individualized teacher contact, as demonstrated by the teachers in our study.


Instructional Tradeoffs


Next, exploiting the within-teacher design of the RCT, we employed within-subject ANOVAs to explore the tradeoffs in instructional techniques teachers made in order to accommodate the insertion of the PLM intervention into their lesson plans. To do so, we analyzed within-teacher variance across the PLM and control classes among seven different instructional techniques, chosen based on their prominence as fundamental and consequential ways the organization of instruction may vary (Beaver, Hallar, & Westmaas, 2014; Ertmer et al., 2012): (a) formal presentation of the material by the teacher, (b) individual seatwork of students' own choice, (c) one-on-one instruction with the teacher, (d) individual seatwork assigned by the teacher, (e) students working with their peers, (f) small group instruction with the teacher, and (g) non-PLM work on computers. On implementation surveys, teachers reported the frequency of use of each of the aforementioned instructional techniques for both their intervention and control classes as a percentage of their class time.3 Survey items concerning pedagogical strategies were presented to teachers as mutually exclusive options to separate use of the PLMs from other pedagogical techniques.4 Importantly, the only difference between the PLM and control classes was the insertion of the PLMs; thus, any differences in the use of various instructional techniques between PLM and control classes in a within-teacher RCT research design may be attributed to the intervention. As a result, we expected teachers to report less frequent use of certain instructional strategies in order to accommodate class time using the PLMs. Figure 2 displays these teacher-reported allocations of class time to the instructional techniques.



Figure 2. Tradeoffs in Instructional Techniques


 

Notes: Teacher-reported averages of time spent on instructional techniques displayed as percent of class time. **Differences between the control and intervention classes in individual work assigned by the teacher and small group instruction by the teacher were statistically significant at the α < 0.01 level.




Several noteworthy findings emerged from our analyses of comparisons of class time allocations. In comparison to their respective control classes, on average, group work with peers in PLM classes declined by six percentage points (p < .01), and teachers reduced individual student seatwork by four percentage points (p < .001). Although the overall magnitude of these differences may appear small, it is important to recall that the PLM intervention was intended only to augment narrow areas of the curriculum and did not replace entire class sessions or full sections of teachers' curricula.


Commenting on her decision making, Heather explained, "I spent the same amount of time on whole group in both groups. I did not spend the same amount of time on cooperative practice or small groups. The PLM group definitely had to compromise on elbow partner time." These insights support the findings of Russell, Bebell, and Higgins (2004), who investigated differences between one-to-one student laptop access and shared student computer access. Russell et al. found that students with more computer access worked less frequently in group peer work, an important potential byproduct of technological integration in the classroom also witnessed in the implementation of the PLM blended learning intervention. Although Russell et al. investigated an intervention utilizing slightly different technological applications, their findings concerning computer use, supported by our own, shed light on tradeoffs associated with increased student technology use. In this case, the tradeoff was decreased peer interaction.


Our findings also indicate that teachers made decisions to maintain class time typically allocated to five other instructional techniques, namely: (a) formal presentation by the teacher, (b) one-on-one instruction, (c) computer use unrelated to the intervention, (d) seatwork of a student's choice, and (e) small group instruction. Given that the PLM intervention was intended to supplant portions of teacher instruction, it was surprising to us that teachers reported that they maintained the frequency of formal presentation and one-on-one instruction in their PLM classes. This may suggest that some teachers implement technological interventions in a way that minimizes disruptions to their typical classroom routines, for example maintaining time devoted to split group rotations and one-on-one student interactions. One teacher, Amy, commented, "I'm up close and personal anyway. . . . They work at tables anyway and I usually visit every table . . . group by group anyway." Finally, teachers used the PLM intervention to supplement their allocation of class time to computer use rather than substituting the intervention for existing computer use. This lack of substitution may indicate that the teachers in our study were willing to increase student computer use on the whole rather than limiting student computer use to time on the intervention's software, an important observation about which research is lacking.


In sum, teachers adjusted their instructional techniques to accommodate use of the PLMs in some instructional domains while maintaining their established pedagogical strategies by allocating the same amount of time to other domains. Evidence of both types of teacher practices integrating the PLMs sheds light on teacher decisions crucial to understanding the nuances of the impact of the intervention and integration of technology into the classroom more generally.


Data Use


Next, we examined how teachers used data generated by the intervention's instructional software to shape their classroom instruction. In our intervention, as with other technology-based approaches (e.g., Wayman & Stringfield, 2006), teachers were provided with real-time student- and class-specific statistics, including number of problems completed, questions answered correctly and incorrectly, learning point accuracy, and module mastery.5 Teachers could access the data at the moment students used the program or anytime thereafter. Teachers also received professional development intended to guide their use of the data provided through the intervention, enabling them to utilize the data at their discretion.6 Table 3 outlines eight ways teachers were directed to report on their use of the intervention data in their classroom instruction. The first column reports survey responses on the frequency of use of activities related to software data, measured on a 1 to 5 scale, with 1 indicating never and 5 indicating every day or almost every day within a module. The second column indicates the proportion of teachers who used each strategy at least once per module.
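
The summaries in Table 3 can be computed from raw survey responses in a few lines; the sketch below, with hypothetical responses and column names, shows the mean, standard deviation, and the proportion of teachers responding at least "once for the unit" (a response of 2 or higher) on the 1-5 scale.

import pandas as pd

# Hypothetical teacher responses on the 1-5 frequency scale
# (1 = never, 5 = every day or almost every day), one column per strategy.
responses = pd.DataFrame({
    "monitor_progress": [4, 3, 3, 4, 2],
    "tailor_instruction": [2, 3, 1, 2, 3],
})

summary = pd.DataFrame({
    "mean": responses.mean(),
    "sd": responses.std(),  # sample standard deviation (ddof=1)
    "prop_at_least_once": (responses >= 2).mean(),  # share responding 2-5
})
print(summary.round(2))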



Table 3. Teacher Use of Software-Generated Data

                                                             Mean (SD)    Proportion ≥ once per module

Monitor student progress                                     3.37 (0.90)  0.89
Tailor individual instruction                                2.34 (0.99)  0.80
Identify areas to improve knowledge and/or teaching skills   2.34 (1.06)  0.73
Identify curricular and/or instructional gaps                2.25 (0.94)  0.76
Use for grading                                              2.00 (0.88)  0.68
Identify materials                                           1.86 (0.97)  0.54
Shape control class instruction                              1.80 (1.14)  0.41
Guide testing                                                1.75 (1.08)  0.39

Total survey observations                                    69

Notes: 1 = Never, 2 = Once for the unit, 3 = About once a week, 4 = Several times a week, 5 = Every day or almost every day.




Teachers reported using the software's data most frequently to monitor student progress (3.37 indicates more than once per week, on average) and, subsequently, to identify students in need of individualized attention (2.34 indicates more than once per intervention unit, on average). These findings directly support the RAND study (Pane et al., 2017), in which a majority of teachers reported using real-time data to identify areas to further emphasize and to adapt instruction to individual students. One teacher in our study, Melissa, explained:


I would have checked who's progressing and who's not progressing. . . . I go take them to the lab and I'll sit with two or three kids you know, who need, who I see from the PLM system itself, need the assistance.


This teacher identified a common theme of teachers' software data use: the identification of students in need of additional instruction. In a similar vein, Kimberly remarked:


A lot of it was to see their accuracy level and their progression through the models, because that kind of gave me an idea of who was really struggling. Because some of my students are very shy and will not tell me when they need help.


As both the survey and interview data indicate, for many teachers the data provided by the PLM software offered a means by which they could target specialized instruction to specific student segments within their classes. In addition, teachers reported using intervention student data less frequently, though more than once per intervention unit, to improve their teaching skills and their curricula. Finally, teachers reported using intervention student data least frequently (less than once per unit, on average) to shape their assessment policies. These findings strengthen the evidence base regarding teacher data use at a time when the availability and breadth of student- and classroom-level data continue to expand. While beyond the scope of this study, efforts to understand how teachers use such assessment data, versus homework, quizzes, and other metrics of student progress, would be a fruitful line of inquiry.


Hindrances to Implementation and Classroom Instruction


To explore circumstances that may have hindered the implementation of the intervention, our implementation surveys asked teachers to report on the degree to which a series of factors hindered the implementation of the intervention. These factors included variables specific to use of online instructional tools as well as general characteristics of students and classrooms that are hypothesized to be of particular importance in under-resourced school environments. First, teachers reported on their access to and comfort with online learning tools. Table 4 shows how teachers rated hindrances to their use of the software. Hindrances were measured on a Likert scale from 1 to 4, with 1 indicating the factor was not at all a hindrance to implementation and 4 indicating the factor was a great hindrance to implementation.


Approximately 50 percent of teachers reported computer access to be at least a slight hindrance in their implementation of the intervention. This finding supports the warnings of Wachira and Keengwe (2011) and Hechter and Vermette (2013) that while resource access has improved on average across school settings, urban school environments often still lack complete technological access. In our setting, several teachers with incomplete technological access employed partial-class groupings to overcome this shortcoming. Further, teachers in our study also reported other technical aspects characteristic of technology use, including internet connectivity, hardware, and software difficulties, to be mild hindrances; these findings are consistent with many other technology interventions in urban school districts, though, once again, they do not address how technological access compares with that of non-urban school districts (Hechter & Vermette, 2013).



Table 4. Hindrances to the Intervention's Implementation: Instructional Software

                                          Mean (SD)    Proportion ≥ slight hindrance

School-level resource and software factors
Computer access                           1.83 (1.03)  0.49
Internet connectivity                     1.67 (0.78)  0.51
Hardware                                  1.62 (0.97)  0.36
Slow software                             1.55 (0.76)  0.42
Software login issues                     1.38 (0.52)  0.36
Firewall                                  1.16 (0.41)  0.14

Student and teacher factors
Student attention                         1.84 (1.08)  0.46
Student tech comfort                      1.28 (0.54)  0.23
Teacher tech comfort                      1.22 (0.51)  0.17

Survey observations                       69

Notes: 1 = Not at all a hindrance, 2 = Slight hindrance, 3 = Moderate hindrance, 4 = Great hindrance.




Notably, in survey responses teachers reported that teacher and student comfort with technology were unrelated to the intervention's implementation. Mueller et al. (2008) found teacher experience and comfort with technology to be among the most important predictors of successful technological integration. Mueller et al. argue, "A teacher's judgment about his or her ability to perform actions which lead to student learning is based on past experience. It follows that a teacher's positive personal or vicarious experiences with computer technology will lead to greater integration" (p. 1526). While we expected teacher survey self-reports on comfort with technology to be related to their implementation, we found no relationship here. In interviews, however, teachers described their comfort with technology as an aid to implementation. On the whole, our findings indicate that teacher comfort with technology did not impede its integration in the classroom and might have actually aided in the delivery of the intervention. Consistent with this finding, Melissa reported, "The main pluses were the kids loved the technology. I feel comfortable with the technology. I had access to the technology."


Contrary to this finding, one novice teacher expressed frustration with online instruction. The teacher explained:


It is frustrating in the beginning as a new teacher coming into the program that PLM is a different way of thinking, of solving different math skills. So be patient because it's not traditional in what some educators have learned or taught.


Although most teachers were comfortable using technological teaching resources, the remarks from the novice teacher highlight a notable caveat to this trend. As classroom utilization of technology expands, the warnings advised by Windschitl and Sahl (2002) and Mueller et al. (2008) concerning novice teachers bear due consideration, particularly in contexts where teachers and students have less experience with technologically aided instruction.


Student and School Characteristics


The implementation surveys also asked teachers about specific student and school characteristics that may have obstructed classroom instruction on the whole. First, a majority of teachers reported that a wide range of student abilities (91%), student absenteeism (84%), and student misbehavior (75%) adversely impacted general instruction in their PLM classrooms. Table 5 reports teachers' assessments of the degree to which student characteristics impeded the intervention's implementation.



Table 5. Hindrances to Classroom Instruction: Student Characteristics

                              PLM class                         Control class
                         Mean (SD)    Proportion ≥         Mean (SD)    Proportion ≥
                                      slight hindrance                  slight hindrance

Wide range of abilities  2.88 (0.95)  0.91                 2.78 (1.00)  0.88
Absenteeism              2.51 (1.02)  0.84                 2.37 (1.07)  0.76
Lack of engagement       2.37 (1.10)  0.71                 2.29 (1.07)  0.71
Misbehavior              2.32 (1.08)  0.75                 2.32 (1.15)  0.68
Tardiness                2.12 (1.12)  0.61                 1.98 (1.11)  0.54

Survey observations      69                                69

Notes: 1 = Not at all a hindrance, 2 = Slight hindrance, 3 = Moderate hindrance, 4 = Great hindrance.




On the one hand, several of the reported hindrances may be hypothesized to be of particular detriment in intervention classrooms where students utilized instructional technology to a greater extent. Kimberly noted, "I actually am sort of anti-technology. I think that it can be useful sometimes, but I see it as more of a distraction for them. . . . They see it as like play time. They don't take it as seriously." Also speaking to the impact of student misbehavior on implementation of the intervention, another teacher, Lisa, identified strong classroom management as a key to successful computer-based instruction:


You have to have a lot of classroom management and know what's going on at all times. . . . There are times where if I wasn't managing the classroom, they could be on another website, maybe taking pictures, they could be doing anything.


On the other hand, though the difficulty in classroom management reported by teachers is consistent with prior research (e.g., Beaver et al., 2015; Bingham, 2016, 2017), one should be cautious in treating these student attributes as hindrances unique to online instruction. On average, teachers did not report (with statistical significance) any student characteristic, including wide ability ranges and behavioral issues, to be a greater hindrance to instruction in their PLM classes than in their control classes. Instead, the importance of these student characteristics should be considered for instruction in difficult, under-resourced school environments broadly rather than in instances of online instruction in particular. Moreover, some teachers strategically utilized the intervention to address wide variations in student ability by establishing peer support structures. In doing so, they used the PLM intervention as a tool for addressing a hindrance that cut across their two mathematics classrooms. Christopher noted:


There are some really, really low performing kids who . . . were really frustrated and so I had to move pretty quickly to like pairing them up. That's another reason it was clear that I had to walk around, 'cause they would shut down, they would daydream.


As mentioned previously, one teacher utilized the PLMs to assist ELL students. Michael explained, "My ELL population is increasing and getting larger. So they would pair up, they would watch the native speaker and then it was a couple of days and they would work on their own the best that they could." Finally, some teachers established ability groupings to target the intervention's instruction to specific subgroups. Angela described:


I would use it for maybe just my lower level (students) to kind of give them something else to help them to help them understand. I feel like the high kids, they didn't really need it. They knew everything anyway.


In sum, while teachers reported that certain student characteristics served as hindrances to classroom instruction in their PLM classrooms, they likewise reported similar struggles in their control classrooms. Further, certain teachers used the intervention to pair struggling students with their higher achieving peers or to establish ability groupings within their classrooms, thus channeling the intervention to directly address instructional concerns. Despite the lack of statistical significance differentiating treatment and control classrooms in terms of student-level hindrances to instruction, we believe there is practical significance in the challenges teachers faced in implementing technological interventions in challenging academic environments. Our findings support the notion that although student attributes continue to impact teacher instruction, the challenges teachers faced implementing the PLMs did not differ from the challenges of their traditional practices. In addition, certain teachers were able to utilize the PLMs to address prevalent student characteristics in their classrooms in a manner that may have prevented those characteristics from further impeding instruction.


Finally, the implementation surveys directed teachers to report on school-level characteristics that may have impacted their classroom instruction. Similar to the student characteristics explored previously, teachers did not report any school characteristic to be a greater hindrance in their PLM classes than in their control classes, suggesting the school characteristics were approximately equally problematic in both of their classrooms irrespective of the intervention. Table 6 reports school-level characteristics for both PLM and control classes.



Table 6. Hindrances to Classroom Instruction: School Characteristics

                                         PLM class                Control class
Hindrance                                Mean (SD)    Prop.       Mean (SD)    Prop.
Pressure to prepare for tests            2.64 (1.08)  0.80        2.60 (1.07)  0.80
Interruptions to class time              2.39 (0.98)  0.81        2.36 (1.00)  0.80
Lack of extra student help               2.39 (1.07)  0.76        2.39 (1.08)  0.75
Large class sizes                        2.31 (1.19)  0.64        2.27 (1.17)  0.64
Insufficient time                        2.19 (0.97)  0.73        2.19 (0.99)  0.71
Lack of teacher planning time            1.88 (0.93)  0.58        1.88 (0.93)  0.58
Lack of materials                        1.81 (0.90)  0.53        1.81 (0.91)  0.53
Pressure to teach other subjects         1.78 (1.04)  0.46        1.78 (1.04)  0.46
Change in school leadership              1.75 (1.09)  0.37        1.75 (1.09)  0.38
Lack of principal support                1.36 (0.66)  0.27        1.37 (0.69)  0.27
Lack of teacher support                  1.32 (0.66)  0.24        1.27 (0.55)  0.22
Lack of research team support*           1.05 (0.22)  0.05        1.07 (0.25)  0.07
Total survey observations                69                       69

Notes: 1 = Not at all a hindrance, 2 = Slight hindrance, 3 = Moderate hindrance, 4 = Great hindrance. "Prop." denotes the proportion of teachers rating the item at least a slight hindrance. Standard deviations appear in parentheses. *Teachers were asked to report on research team support only in Cohort 2.




Test preparation pressure, interruptions to class time, and lack of extra student help emerged as the greatest school-level hindrances to classroom instruction, with more than 75% of teachers reporting these issues to be at least a slight hindrance to their classroom instruction in both their PLM and control classes. Jason lamented, "Too many variables go on with teachers day-to-day, you know, you may have an assembly, then it's a half a day, it's just a lot." Teachers did not report that a lack of additional support from the research team, other teachers, or principals hindered their instruction. Regarding the professional development associated with the intervention, Kimberly remarked, "I thought the professional developments were really great . . . everything was very clear and laid out. They gave us time, they walked us through everything . . . it was like just the right amount of what everyone needed to do." The RCT research team provided many hours of professional development specific to the intervention, conditions that may not be as commonplace in traditional school settings, particularly in under-resourced urban school districts (Drexler et al., 2008).


Student Software Use and Performance


Within the PLM intervention, each student in the PLM classes maintained a unique software login, enabling student-specific adaptive software responses to user progress. Drawing on these data, Table 7 outlines the practical details of student use of the PLM modules. The first section of Table 7 reports classroom per-module averages of total sessions, time spent, problems completed, problem accuracy, and learning point accuracy (see Note 7). On average, students engaged with the PLMs for 8.6 hours over the duration of the school year, across a classroom average of more than 46 sessions. Students completed an average of 371 problems with 46% accuracy. Finally, students achieved learning point mastery on 55% of the subject matter subdomains within each overarching module. The second and third sections of Table 7 differentiate those students who did and did not reach mastery within each module, respectively. On the whole, the software-generated data revealed substantial variation in student use, including the number of sessions completed and problems attempted. Perhaps unsurprisingly, students who reached mastery completed more sessions, attempted more problems, and answered those problems with greater accuracy than students who did not reach module mastery. Students completed the fewest sessions and attempted the fewest problems in the module Slice and Clone 2, likely because the module built upon concepts introduced in Slice and Clone 1, wherein 57% of students reached mastery.



Table 7. PLM Implementation Characteristics by Module

                            Slice and     Slice and     Start to      Area           Total
                            Clone 1       Clone 2       End           Measurement

Classroom averages
  Classroom sessions        16.3 (8.7)    8.5 (6.1)     11.1 (6.1)    11.9 (8.1)     46.6 (21.1)
  Time spent (hours)        3.2 (1.5)     1.4 (1.0)     2.3 (1.0)     2.0 (1.2)      8.6 (3.4)
  Problems completed        125.8 (51.7)  40.6 (26.5)   57.3 (22.3)   154.4 (77.2)   371.1 (129.2)
  Problem accuracy          0.55 (0.13)   0.35 (0.18)   0.42 (0.15)   0.50 (0.13)    0.46 (0.16)
  Learning point mastery    0.62 (0.27)   0.41 (0.30)   0.51 (0.26)   0.64 (0.25)    0.55 (0.29)
  Classroom observations    63            63            58            62             63

Students who reached module mastery
  Proportion of students    0.57          0.49          0.55          0.49           0.53
  Classroom sessions        16.7 (13.8)   12.2 (11.0)   12.9 (15.3)   12.5 (10.3)    61.4 (27.4)
  Problems completed        151.0 (78.8)  71.7 (37.6)   186.0 (125.6) 77.0 (28.7)    561.7 (170.0)
  Problem accuracy          0.61 (0.17)   0.65 (0.17)   0.64 (0.16)   0.65 (0.13)    0.64 (0.16)
  Student observations      587           387           556           461            1,992

Students who did not reach module mastery
  Proportion of students    0.43          0.51          0.45          0.51           0.47
  Classroom sessions        15.5 (17.2)   6.4 (14.5)    10.7 (14.0)   8.3 (10.0)     34.9 (26.2)
  Problems completed        99.2 (99.2)   24.5 (41.7)   120.4 (125.3) 35.8 (40.5)    231.8 (153.2)
  Problem accuracy          0.35 (0.18)   0.37 (0.23)   0.44 (0.17)   0.29 (0.21)    0.36 (0.21)
  Student observations      450           406           452           472            1,781

Notes: Time spent is measured in hours. Standard deviations appear in parentheses. The modules each contain the following number of learning points and total possible questions: Slice and Clone 1: 11, 1,175; Slice and Clone 2: 8, 922; Area Measurement: 10, 1,102; Start to End: 8, 959. Classroom observations are at the teacher-by-module-by-cohort level; student observations are at the student-by-module level.
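For readers working with similar software logs, the sketch below illustrates how per-student records might be rolled up into the classroom averages and mastery panels of Table 7. The log layout and column names are assumptions for illustration; the study's actual pipeline is not reproduced here.

```python
# A sketch of aggregating hypothetical per-student PLM logs into
# Table 7-style panels; column names are assumed, not the real schema.
import pandas as pd

logs = pd.read_csv("plm_student_module_logs.csv")  # one row per student-module

metrics = ["sessions", "hours", "problems_completed",
           "problem_accuracy", "learning_point_mastery"]

# First panel: average within each classroom, then summarize classrooms
# by module, mirroring the classroom per-module averages.
per_class = logs.groupby(["classroom_id", "module"], as_index=False)[metrics].mean()
classroom_panel = per_class.groupby("module")[metrics].agg(["mean", "std"])

# Second and third panels: student-level summaries split by module mastery.
mastery_panels = logs.groupby(["module", "reached_mastery"])[
    ["sessions", "problems_completed", "problem_accuracy"]
].agg(["mean", "std"])

print(classroom_panel)
print(mastery_panels)
```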




At the end of each content unit, student math achievement was assessed in both PLM and control classrooms; there were no statistically significant differences in prior math or reading achievement between PLM and control groups in either cohort. The assessments consisted of problems drawn from state standardized assessments and the National Assessment of Educational Progress (NAEP), selected for their relevance to both PLM material and control curricula. Analysis of the efficacy of the PLM intervention, including effects on student achievement, is detailed in (citation omitted for blind review). In summary, the PLM intervention elicited gains in student achievement across a majority of units in both cohorts; the remaining units showed no statistically significant differences between PLM and control students.

 

Implications of Variation in Implementation


Our final research question concerns the implications of our findings for the implementation of software-based interventions as curricular complements, lessons broadly applicable to many instances of classroom technological integration. Our findings bear on the integration of technology in the classroom, including necessary professional development, classroom organization, tradeoffs in instructional techniques, and barriers to effective integration, particularly in under-resourced urban schools using blended learning instructional approaches.


First, we examined teacher decisions regarding the organization of instruction in implementing the intervention in their classrooms. We found that some teachers used whole-class instruction while others implemented a partial-class grouping approach. Although school resource constraints may have been a primary motivation for split-class implementation for many teachers, some teachers cited pedagogical preference or the tailoring of instruction to students with attributes such as special education or ELL designations as rationales for partial-class groupings. Such findings reinforce Kerres and Witt's (2003) notion that technological integration in blended learning environments must not pursue a one-size-fits-all implementation strategy agnostic of important school, teacher, and student attributes. Adding to this notion, we provide evidence that teachers may use technology-based interventions as opportunities to foster peer-to-peer interaction as well as individualized instruction. Likewise, another key finding is that teacher professional development concerning blended learning must present a full suite of implementation strategies to enable teachers to adapt its integration to their specific contexts. Specifically, we found that teachers differentially used partial- and whole-class implementation and varied in their use of peer work, individual work, and small-group instruction across their PLM and control classes. Thus, effective professional development should provide information about, and practice with, a variety of implementation strategies to prepare teachers for productive adaptation (Penuel, Fishman, Yamaguchi, & Gallagher, 2007).


We also identified tradeoffs teachers made in instructional activities to accommodate the intervention in their lesson plans. To implement the computer-based intervention, teachers were confronted with several key decisions regarding their methods of classroom instruction. We found that teachers reallocated time normally devoted to teacher-assigned seatwork and peer work to the intervention. Equally notable, time spent on other activities, including formal teacher presentation, one-on-one work with the teacher, and non-intervention-related computer use, was unaffected by the intervention. Many of these decisions may serve as vital lessons for the application of blended learning strategies more broadly. In intervention classes, teachers elected to alter time devoted to some typical instructional techniques to accommodate the integration of the intervention. If teachers are given broad latitude regarding the instructional techniques in their lesson plans, close attention should be devoted both to the techniques teachers cut back on to integrate online instruction in their curricula and to those techniques they maintain, so as to accurately assess what is lost pedagogically in scenarios of increased computer use.


Next, we examined how teachers used the student-user data generated by the intervention software. Eighty-four percent of teachers used the intervention's data to monitor student progress, both in terms of student progression through content areas and in terms of problem accuracy, important windows into individual and aggregate student understanding of specific academic content. Three quarters of teachers used software data to tailor individual student instruction, identify areas to improve teacher knowledge or teaching skills, and identify instructional gaps in their teaching, significant markers to improve instruction. The variety of ways teachers used software-generated data highlights the capacity of blended learning components to augment teacher knowledge and instruction. Targeted professional development, as provided through the PLM intervention, may be used to further the extent to which software-generated data inform specialized instruction, particularly for diverse student groups. In this vein, Russell, Bebell, O'Dwyer, and O'Connor (2003) advised, "With such a wide variety of technology applications available, it seems prudent to focus teacher preparation on specific types of uses rather than on familiarizing them with technology in general." Although teachers need a thorough understanding of technological tools to apply them successfully in their classrooms, effective professional development likewise should aid teachers in thinking through applications to specific contexts.
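As an illustration of the kind of routine this data use implies, the following sketch flags students whose mastery progress or accuracy lags, the sort of signal teachers described pulling from Score Reporter to target one-on-one help. The export format, column names, and thresholds are invented for illustration.

```python
# A sketch of triaging students from a hypothetical Score Reporter export;
# the file, columns, and cutoffs below are illustrative assumptions.
import pandas as pd

report = pd.read_csv("score_reporter_export.csv")  # one row per student

# Flag students for targeted help, e.g., pairing with a peer or sitting
# with the teacher, as interviewed teachers described doing.
needs_help = report[(report["lp_mastery"] < 0.4) | (report["accuracy"] < 0.35)]
print(needs_help[["student_id", "lp_mastery", "accuracy"]].to_string(index=False))
```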


Student-generated software data enabled us to examine student PLM use and, through end-of-unit assessments, how their math achievement may have changed. On average, students who achieved module mastery completed more sessions, attempted more problems, and answered problems with greater accuracy than students who did not. On the whole, PLM students outperformed control class students on the Start to End and Area Measurement modules, with standardized effect sizes ranging from 0.18 to 0.48 standard deviations. The Slice and Clone 1 and 2 modules showed mixed results. Lastly, achievement gains for PLM students persisted one year after the intervention, as measured through delayed posttests, though some losses in control material achievement were found relative to control class students. Further research into the causes of variation in student software use and subsequent achievement should attempt to explain student participation and achievement in the context of the varied school, teacher, and student characteristics of the study's participants.
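For reference, a standardized effect size of the kind reported above is a mean difference scaled by a standard deviation. The sketch below computes Cohen's d with a pooled standard deviation, one common variant; the study's exact estimator is not specified in this section, so this is illustrative only.

```python
# Cohen's d with a pooled SD: one common standardized effect size,
# offered as an illustration, not the study's actual estimator.
import numpy as np

def cohens_d(treatment, control):
    t = np.asarray(treatment, dtype=float)
    c = np.asarray(control, dtype=float)
    pooled_var = ((len(t) - 1) * t.var(ddof=1) +
                  (len(c) - 1) * c.var(ddof=1)) / (len(t) + len(c) - 2)
    return (t.mean() - c.mean()) / np.sqrt(pooled_var)

# Simulated scores for illustration only: a true shift of 0.25 SD.
rng = np.random.default_rng(0)
plm_scores = rng.normal(0.25, 1.0, size=200)
control_scores = rng.normal(0.0, 1.0, size=200)
print(round(cohens_d(plm_scores, control_scores), 2))
```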


Finally, teachers reported on the degree to which they encountered barriers both to implementation of the PLM intervention specifically and to classroom instruction more generally, including technological access and school- and student-level characteristics. Teachers most frequently identified student characteristics, including wide ability ranges, absenteeism, and behavior, as the greatest student-related hindrances to classroom instruction, though not to a greater degree than in their respective control classrooms, on average. Similarly, school-level characteristics, including test-preparation pressures, interruptions to class time, and a lack of additional student resources, were reported by roughly three quarters of teachers as barriers to classroom teaching regardless of PLM or control class. Importantly, computer and technical issues were rarely reported to hamper implementation of the PLMs. Teachers likewise conveyed that teacher and student comfort with technological resources was not a hindrance to implementation of the intervention. On the whole, schools and teachers should consider a wide assortment of school- and student-related characteristics beyond technological constraints when integrating online instruction into lesson plans and supporting teachers in their use of blended learning strategies. While students may enjoy time using technological resources, teachers employing blended learning strategies must maintain vigilant classroom-management practices in light of teacher-reported concerns about student abilities and classroom behavior.


CONCLUSIONS


Our analysis provides insights about how teachers adapt their instruction in the context of a computer-based curricular intervention. Our findings have widespread implications for interventions that integrate computer use with classroom teaching. Within the context of a software-based mathematics intervention that had positive effects on student learning, we present new evidence explaining teacher rationales for variation in implementation and the tradeoffs teachers made to adapt a blended learning instructional strategy to their respective classroom contexts. These findings have implications for how we support teachers' use of such techniques in initial professional development and beyond, and how we study their effects.


Specifically, we found that teachers organize their classrooms to use computer interventions not only on the basis of resource constraints but also to employ pedagogical techniques they believe are effective, such as peer-to-peer interaction and increased individualized attention to struggling students. Thus, professional development targeted to the use of computer interventions ought to explicitly explore these implementation alternatives, particularly in contexts where whole-class implementation may not be possible. Similarly, research on such interventions should include study of these pedagogical choices, as some of the benefits of the approaches may result from peer interaction or teacher attention rather than from the particular approach taught in the computer-based intervention. Understanding the multiple ways teachers use computer-based curricula, and how those uses impact teaching practices and student learning, is necessary to best shape teacher supports and monitoring and evaluation systems so that interventions are maximally effective in improving student learning.


In studying the barriers to implementing blended learning, we found that teachers indicated that existing student-, class-, and school-level factors, rather than elements of the intervention itself, hindered their instruction. In fact, a majority of surveyed teachers reported the lack of additional academic student support, the lack of student engagement, student absenteeism, and the wide range of student abilities (all elements prevalent in demanding urban schools) as hindrances to their classroom teaching. A key takeaway of our analysis, however, is that technology-based interventions do not present insurmountable challenges in under-resourced urban schools. The barriers teachers cite are common to most interventions, and teachers managed to implement the intervention effectively even without enough computers for all students.


As with so many other curricular and pedagogical reforms, often it is not the content or delivery method that impedes implementation; it is the same set of challenges that seem to pervade under-resourced urban schools everywhere: mixed-ability classrooms, pressure to prepare for standardized assessments, and a lack of supplemental student help, among many other factors. We consider this a major finding of our work. Technology appears to have matured enough that, despite a few unique challenges, it now represents another promising opportunity that might be realized if we can more effectively address the problems that have permeated urban education for decades. As the integration of instructional technologies becomes increasingly prevalent in K–12 settings, additional research on the implementation of such interventions will shed light on the mediating and moderating effects of student-, teacher-, and school-level circumstances, knowledge vital to a complete understanding of the settings in which blended learning interventions will achieve their greatest successes.



Notes


1. In addition, in Table 3 teachers reported that they used the intervention data to shape their control classroom instruction less than once per unit, on average.


2. All teacher names are pseudonyms.


3. Specifically, teachers were provided the following instructions and asked the following question: The next set of questions asks about how you organize instruction. You may organize instruction differently for different classes. We know that teachers use multiple strategies throughout the class; percentages do not need to add to 100%. Thinking about the (Unit 1) Time Period, on a typical day, approximately how often did students experience the following types of instruction? Participants were asked to identify the percentage of class time allocated to each activity, measured in increments of 10 percent. See Appendix B for further detail.


4. All survey items were reviewed using pilot testing and cognitive labs to ensure that teachers interpreted questions regarding tradeoffs in instructional techniques as excluding PLM work.


5. Learning point mastery measures the proportion of learning points (subdomains of each module's subject matter) for which students reach mastery; to reach learning point mastery, a student must answer four out of five questions correctly in succession. To reach module mastery, a student must reach mastery in each learning point within the module.


6. Professional development regarding the potential uses of PLM data was presented to both cohorts but emphasized to a greater extent to the second cohort of teachers. Despite this emphasis, teacher data use was found to be consistent across both cohorts.


7. Problem accuracy measures the proportion of questions answered correctly. Learning point mastery measures the proportion of learning points (subdomains of each module's subject matter) for which students reach mastery; to reach learning point mastery, a student must answer four out of five questions correctly in succession. To reach module mastery, a student must reach mastery in each learning point within the module. If a student answers several questions incorrectly and subsequently answers four out of five questions correctly in succession, that student would have demonstrated low overall accuracy but also would have demonstrated mastery on the learning point in question.
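The mastery criterion in Note 7 is mechanical enough to state in code. The sketch below gives one reading of the rule (a learning point is mastered once four of any five consecutive answers are correct, and a module is mastered when every learning point is); the function names are ours, not the software's.

```python
def learning_point_mastered(correct, window=5, needed=4):
    """One reading of Note 7's rule: mastery once the student answers
    at least `needed` of any `window` consecutive questions correctly.
    `correct` is an ordered sequence of booleans, one per question."""
    return any(sum(correct[i:i + window]) >= needed
               for i in range(max(len(correct) - window + 1, 1)))

def module_mastered(learning_points):
    """Module mastery requires mastery of every learning point (Note 7)."""
    return all(learning_points)

# Early misses followed by four-of-five correct still yield mastery, which
# is why low overall accuracy can coexist with mastery, as Note 7 explains.
assert learning_point_mastered([False, False, False, True, True, True, False, True])
```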



References


Agodini, R., Dynarski, M., Honey, M., & Levin, D. (2003). The effectiveness of educational technology: Issues and recommendations for the national study. Princeton, NJ: Mathematica Policy Research.

Alexander, R. (2001). Culture and pedagogy: International comparisons in primary education. Malden, MA: Blackwell.

Atkinson, P., Coffey, A., & Delamont, S. (2003). Key themes in qualitative research: Continuities and change. Walnut Creek, CA: AltaMira Press.

Bailey, J., Schneider, C., & Vander Ark, T. (2013). Blended learning implementation guide 2.0. Digital Shift.

Bakia, M., Means, B., Gallagher, L., Chen, E., & Jones, K. (2009). Evaluation of the Enhancing Education Through Technology program: Final report. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Policy and Program Studies Service.

Beaver, J. K., Hallar, B., & Westmaas, L. (2014). Blended learning: Defining models and examining conditions to support implementation (PERC Research Brief). Philadelphia, PA: Research for Action.

Beaver, J. K., Hallar, B., Westmaas, L., & Englander, K. (2015). Blended learning: Lessons from best practice sites and the Philadelphia context (PERC Research Brief). Philadelphia, PA: Research for Action.

Behr, M. J., Harel, G., Post, T., & Lesh, R. (1992). Rational number, ratio, and proportion. In D. A. Grouws (Ed.), Handbook of research on mathematics teaching and learning (pp. 296–333). New York: Macmillan.

Bertrand, M., & Marsh, J. A. (2015). Teachers' sensemaking of data and implications for equity. American Educational Research Journal, 52(5), 861–893.

Bingham, A. J. (2016). Drowning digitally? How disequilibrium shapes practice in a blended learning charter school. Teachers College Record, 118(1), n1.

Bingham, A. J. (2017). Personalized learning in high technology charter schools. Journal of Educational Change, 18(4), 521–549.

Bingham, A. J., Pane, J. F., Steiner, E. D., & Hamilton, L. S. (2018). Ahead of the curve: Implementation challenges in personalized learning school models. Educational Policy, 32(3), 454–489.

Blanc, S., Christman, J., Liu, R., Mitchell, C., Travers, E., & Bulkley, K. (2010). Learning to learn from data: Benchmarks and instructional communities. Peabody Journal of Education, 85, 205–225.

Carroll, C., Patterson, M., Wood, S., Booth, A., Rick, J., & Balain, S. (2007). A conceptual framework for implementation fidelity. Implementation Science, 2(1), 40.

Charness, G., Gneezy, U., & Kuhn, M. A. (2012). Experimental methods: Between-subject and within-subject design. Journal of Economic Behavior & Organization, 81(1), 1–8.

Christensen, C. M., Horn, M. B., & Staker, H. (2013). Is K–12 blended learning disruptive? An introduction to the theory of hybrids. San Mateo, CA: Clayton Christensen Institute for Disruptive Innovation.

Clayton, K., Blumberg, F., & Auld, D. P. (2010). The relationship between motivation, learning strategies and choice of environment whether traditional or including an online component. British Journal of Educational Technology, 41(3), 349–364.

Desimone, L. M., & Le Floch, K. C. (2004). Are we asking the right questions? Using cognitive interviews to improve surveys in education research. Educational Evaluation and Policy Analysis, 26(1), 1–22.

Drexler, W., Baralt, A., & Dawson, K. (2008). The Teach Web 2.0 Consortium: A tool to promote educational social networking and Web 2.0 use among educators. Educational Media International, 45(4), 271–283.

Earle, R. (2002). The integration of instructional technology into public education: Promises and challenges. ET Magazine, 42(1), 5–13.

Ertmer, P. A., Ottenbreit-Leftwich, A. T., Sadik, O., Sendurur, E., & Sendurur, P. (2012). Teacher beliefs and technology integration practices: A critical relationship. Computers & Education, 59(2), 423–435.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago, IL: Aldine.

Goetz, J., & LeCompte, M. (1984). Ethnography and qualitative design in educational research. Orlando, FL: Academic Press.

Grimes, D., & Warschauer, M. (2008). Learning with laptops: A multi-method case study. Journal of Educational Computing Research, 38(3), 305–332.

Harn, B., Parisi, D., & Stoolmiller, M. (2013). Balancing fidelity with flexibility and fit: What do we really know about fidelity of implementation in schools? Exceptional Children, 79(2), 181–193.

Hechter, R. P., & Vermette, L. A. (2013). Technology integration in K–12 science classrooms: An analysis of barriers and implications. Themes in Science and Technology Education, 6(2), 73–90.

Hew, K. F., & Brush, T. (2007). Integrating technology into K–12 teaching and learning: Current knowledge gaps and recommendations for future research. Educational Technology Research and Development, 55(3), 223–252.

Hiebert, J. (1981a). Cognitive development and learning linear measurement. Journal for Research in Mathematics Education, 12, 197–211.

Hiebert, J. (1981b). Units of measure: Results and implications from national assessment. Arithmetic Teacher, 28, 38–43.

Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14–26.

Kellman, P. J. (2002). Perceptual learning. In H. Pashler & C. R. Gallistel (Eds.), Stevens' handbook of experimental psychology (3rd ed., Vol. 3, pp. 259–299). New York: John Wiley & Sons.

Kellman, P. J., Massey, C. M., & Son, J. Y. (2010). Perceptual learning modules in mathematics: Enhancing students' pattern recognition, structure extraction, and fluency. Topics in Cognitive Science, 2(2), 285–305.

Kerres, M., & Witt, C. D. (2003). A didactical framework for the design of blended learning arrangements. Journal of Educational Media, 28(2–3), 101–113.

Lehrer, R., Jaslow, L., & Curtis, C. (2003). Developing an understanding of measurement in the elementary grades. In D. Clements & G. Bright (Eds.), Learning and teaching measurement. Reston, VA: National Council of Teachers of Mathematics.

Marsh, J. A. (2012). Interventions promoting educators' use of data: Research insights and gaps. Teachers College Record, 114(11), 1–48.

Marsh, J. A., Bertrand, M., & Huguet, A. (2015). Using data to alter instructional practice. Teachers College Record, 117(4), 1–40.

Means, B., Toyama, Y., Murphy, R., & Baki, M. (2013). The effectiveness of online and blended learning: A meta-analysis of the empirical literature. Teachers College Record, 115(3), 1–47.

Mueller, J., Wood, E., Willoughby, T., Ross, C., & Specht, J. (2008). Identifying discriminating variables between teachers who fully integrate computers and teachers with limited integration. Computers & Education, 51(4), 1523–1537.

National Mathematics Advisory Panel. (2008). Foundations for success: The final report of the National Mathematics Advisory Panel. Washington, DC: U.S. Department of Education.

Osguthorpe, R. T., & Graham, C. R. (2003). Blended learning environments: Definitions and directions. Quarterly Review of Distance Education, 4(3), 227–233.

Pane, J. F., Steiner, E. D., Baird, M. D., Hamilton, L. S., & Pane, J. D. (2017). Informing progress: Insights on personalized learning implementation and effects. Arlington, VA: RAND.

Penuel, W. R., Fishman, B. J., Yamaguchi, R., & Gallagher, L. P. (2007). What makes professional development effective? Strategies that foster curriculum implementation. American Educational Research Journal, 44(4), 921–958.

Picciano, A. G. (2016). Research in online and blended learning: New challenges, new opportunities. In C. D. Dziuban, A. G. Picciano, C. R. Graham, & P. D. Moskal (Eds.), Conducting research in online and blended learning environments: New pedagogical frontiers (pp. 1–11). New York: Routledge.

Russell, M., Bebell, D., & Higgins, J. (2004). Laptop learning: A comparison of teaching and learning in upper elementary classrooms equipped with shared carts of laptops and permanent 1:1 laptops. Journal of Educational Computing Research, 30(4), 313–330.

Russell, M., Bebell, D., O'Dwyer, L., & O'Connor, K. (2003). Examining teacher technology use: Implications for preservice and inservice teacher preparation. Journal of Teacher Education, 54(4), 297–310.

Ryan, G. W., & Bernard, H. R. (2003). Data management and analysis methods. In N. K. Denzin & Y. S. Lincoln (Eds.), Collecting and interpreting qualitative materials (2nd ed., pp. 259–309). Thousand Oaks, CA: Sage.

Staker, H. (2011). The rise of K–12 blended learning: Profiles of emerging models. San Mateo, CA: Innosight Institute.

Staker, H., & Horn, M. B. (2012). Classifying K–12 blended learning. San Mateo, CA: Innosight Institute.

Wachira, P., & Keengwe, J. (2011). Technology integration barriers: Urban school mathematics teachers' perspectives. Journal of Science Education and Technology, 20(1), 17–25.

Wayman, J. C., & Stringfield, S. (2006). Technology-supported involvement of entire faculties in examination of student data for instructional improvement. American Journal of Education, 112(4), 549–571.

Windschitl, M., & Sahl, K. (2002). Tracing teachers' use of technology in a laptop computer school: The interplay of teacher beliefs, social dynamics, and institutional culture. American Educational Research Journal, 39(1), 165–205.

Zhao, Y., Pugh, K., Sheldon, S., & Byers, J. L. (2002). Conditions for classroom technology innovations. Teachers College Record, 104(3), 482–515.



APPENDIX A: PLM INTERVENTION OVERVIEW


The central goal of the study was to estimate student achievement effects on PLM-related mathematics material. The PLM intervention involved student use of a computer program through which students engaged in math problems involving direct interaction with a variety of mathematical structures, relations, and representations. The specific PLMs used in this intervention are based on the work of cognitive scientists Philip Kellman and Chris Massey. Their research shows that perceptual learning techniques improve learning via specific emphasis on student conceptual discovery, through the recognition of mathematical patterns and structures, and on student fluency, through the repeated application of such concepts in a variety of contexts (Kellman, 2002; Kellman, Massey, & Son, 2010).



Figure A1. PLM Example: Slice and Clone 1

[Top and bottom panel screenshots of the Slice and Clone 1 interface; images not reproduced]


Notes: The Slice and Clone 1 software environment makes the structure and relations underlying fraction concepts tangible to learners by providing them with interactive on-screen tools that they can manipulate. The student's task is to start with a given quantity and use the slicing and cloning tools to create a new quantity. As shown in the top panel, students operate a slicer tool (in the upper left) to cut a continuous extent into a desired number of pieces, thus creating a base unit. As shown in the bottom panel, when they have created a successful unit, it drops down into the cloner tool (bottom left), which will iterate that unit a desired number of times and output the result. While these screenshots are static, the actual PLM is fully interactive, with customized animated feedback at every step.




The PLM intervention was divided into two academic content units. Unit 1 covered content pertaining to fractions and linear measurement, and comprised three modules: Slice and Clone 1, Slice and Clone 2, and Start to End. Unit 2, Area Measurement, covered topics in measurement. The separation of the two units was purposeful in that it reflected the organization of teachers' existing curricula. Specifically, the subject matter addressed across the Slice and Clone 1, Slice and Clone 2, and Start to End modules was represented across a broad section of the control curriculum and, as a result, the researchers created one assessment combining this material labeled Unit 1. The subject matter relating to the Area Measurement module was interwoven throughout a wide span of the control curriculum labeled Unit 2. This unit structure ensured appropriate content overlap of what was taught in the PLM and control classes. At the conclusion of each unit, students completed a researcher-constructed assessment, which included both PLM-specific material and additional non-PLM material aligned to teachers' curricula taught within the same period of implementation for each PLM module. Lastly, delayed posttests covering PLM-specific and additional non-PLM material from the implementation periods of both Unit 1 and Unit 2 were administered nearly a year after the completion of each cohort, in the spring of 2015 for the 2013–2014 cohort and in the spring of 2016 for the 2014–2015 cohort.




APPENDIX B: DATA ANALYSIS


TEACHER INTERVIEW DATA


The 19 teacher interviews we conducted contributed significantly to our research findings by providing rich evidence pertaining to variation in implementation and teacher-specific classroom contexts. Interview data both supplemented and reinforced quantitative survey analysis findings. We derived our primary coding framework for teacher interviews from our logic model (see Figure 1), research questions, and literature review (Alexander, 2001). As our analysis progressed, we broadened our thematic focus as new topics emerged from transcript review. In this manner, we employed the constant comparative method to form our interview codes while multiple readers analyzed the same transcripts to ensure consistency across analyses (Glaser & Strauss, 1967). This process enabled us to focus on two important lines of analysis: first, themes of the PLM intervention we hypothesized were likely to influence teacher implementation; and second, unexpected themes that emerged in transcript analyses. As such, the theories anchoring our logic model and research questions, along with our transcript analyses, informed the evolution of our coding framework to include emergent themes (Goetz & LeCompte, 1984).


The codes that emerged included: whole and partial class groupings, instructional strategies and tradeoffs, data use (Score Reporter), school and student attributes, resource availability, obstacles and hindrances to teaching, struggling students, and teacher and student interaction with technology. To most effectively share some of the insights we gained from the teacher interviews, we frequently cite important teacher quotes as thematic illustrations (see Atkinson, Coffey, & Delamont, 2003; Ryan & Bernard, 2003). The qualitative evidence we cite directly pertains to our research questions; any evidence we excluded fell beyond the scope of the study's analysis. As previously discussed, in identifying teachers to interview, we purposefully sought variation in teacher and school characteristics influential to the intervention's implementation. As such, our interview data reflect a range of teacher experiences, though the majority of our codes and themes of analysis were widely shared among many interview subjects.

Interview Coding Examples


The following section provides excerpts of two teacher interviews stemming from the first interview question (see protocol in the next section). These excerpts were coded as pertaining to the following codes: instructional strategies and tradeoffs, data use (Score Reporter), and whole and partial class groupings. The excerpts included in the manuscript exemplify the evidence gained from teacher interviews regarding instructional tradeoffs (or the lack thereof) and the data use employed by teachers implementing the PLMs in their classrooms.


Interview Excerpt With Amy:


Interviewer: So, when you're working with your homeroom class, how would you structure your math lesson?

Amy: So what I do is I enter the, today is a bad example, but say, for example we're doing reflection, so I will start with a do now, like I'll have a question, and then I would do the example by myself. And then they have to do it with a partner and then they have to do it independently. And then they have to do, they'll go into the book and they'll do it and then we come back as a group.

Interviewer: Okay.

Amy: So that's how I do it.

Interviewer: Like a traditional sequence.

Amy: Even with them, that's how I do it with the groups, and I sorta, you know, they have, call it here gradual release. So we do the gradual release like that.

Interviewer: Okay. Do you feel like, I mean do you notice any benefits from working with the smaller group? You said in your PLM class that you're able to work with a smaller group . . .

Amy: Hmm hmm.

Interviewer: To try to teach them, do the lesson. Is that helpful do you think for them or . . .

Amy: Not really. Well, it can be because I'm up close and personal anyway, but I don't know. I guess that was, that will be more of a question for them because, yeah, because they, they work at tables anyway and I usually visit every table in sort of, with them group by group anyway because they're in here for 135 minutes so just for the first 45, it's just dedicated to PLM. And then we go in together as group so.

Interview Excerpt With Melissa:

Interviewer: And getting at how you actually structure a typical math lesson, on a PLM day in your PLM class, how would you describe the type of activities that you do?

Melissa: Well, whatever, whatever's going on with the normal curriculum. I check that homework first.

Interviewer: Hmm hmm.

Melissa: And then I will have checked going into PLM into the, the management, I would have checked who's progressing and who's not progressing. And then normally what I do is I go take them to the lab and I'll sit with two or three kids you know, who need, who I see from the PLM system itself, need the assistance. I have them, I then basically float around the room for the rest of the time.


Teacher Interview Protocol


The following is the protocol the research team used to interview teachers:

Thank you so much for speaking with me today. We will be speaking with a small number of teachers about their experiences implementing the PLMs. This information will complement the information we collect from the surveys. You will be helping us understand what factors help you with your use of the PLM, what serves as a barrier, and how your use of the PLM has developed over time. We are also interested in your reactions to how the PLMs have influenced your instruction and student learning. I would like to tape record this interview and want to emphasize that there will be no linkage of your responses with your name or your school's name. This information will be completely confidential. No personally identifiable information will be disclosed during the study (just like in the consent form you signed). Is my tape-recording the interview okay with you? Great. Before we begin, do you have any questions or concerns? I'll start the recorder and begin the interview now.


To get started, I will ask you a few questions about the implementation of the PLM intervention:


1.

How did you structure (or organize) your class time on days when you used the PLM? (Would you please describe a typical math lesson in your treatment class, on a day when you used the PLM?)

a.

Does the whole class use the PLM at once? Why or why not?

b.

Do students typically use the PLM for the entire class period? Why or why not?

c.

Why did you decide to either do whole class or partial class instruction when you use the PLM? (Prompt on: timing constraints, computer/technological constraints, personal teaching preferences)

2.

Would you please describe a typical math lesson in your PLM class, on a day when you used the PLM?

a.

What types of activities do you do? (E.g., Cycling or stations, peer collaboration, work one-on-one with students)

b.

How do you determine how much time to allot for the PLM and other activities?

3.

We understand you may have to alter the way you organize your instruction in your PLM class on days when students are using the PLM. To help us understand these changes, I am going to ask you to compare your instruction in your PLM and Control Class. Remember, your Control Class is your math class for which students are not provided log-ins for the PLMs but they do still take the assessments. Thinking of both your PLM and Control classes, can you describe to us the trade-offs/differences in how you organize instruction in your PLM class?

a.

Time spent in group work

b.

Time spent working one-on-one with students

c.

Time spent in whole-group instruction

d.

Strategies you use with struggling learners (e.g., Special Education, ELL)

e.

Decisions about content and pedagogy (use of worksheets, material/topics you present)

f.

Decisions about pacing


Now we want to ask you about successes and challenges with implementing the PLM. We can start with the positive first.


4.

 What factors facilitated your implementation of the PLMs?

a.

Teacher comfort with technology

b.

Student comfort with technology

c.

School leadership (e.g. competing demands)

d.

Material support (e.g. planning documents)

e.

Technical support

5.

What were your biggest obstacles in implementing the intervention?

a.

Teacher comfort with technology

b.

Student comfort with technology

c.

School leadership (e.g. competing demands)

d.

Material support (e.g. planning documents)

e.

Technical support

6.

How did you handle students who struggled with the PLM, and by that we mean students who were slow to complete the modules?

a.

 How did you address the speeds at which different students worked through the modules?

b.

How did you handle students completing the modules at different times?

7.

Now, we'd like to ask you a little bit about the professional development that you received. Can you say what's helpful and what could be improved about the professional development?

8.

Now we want to know specifically about Score Reporter; did you use the PLM Score Reporter?

If they say no:

a.

Some teachers use Score Reporter and some do not. We'd like to understand why teachers choose to look at it or not. Would you explain a little bit about why you chose not to use it?

b.

Is there something that we could have done to make Score Reporter more useful to you?

If they say yes:

a.

Would you describe how often you monitored the dashboard?

i.

If teacher did not monitor the dashboard, ask why not.

ii.

If they did, ask how they decided when to monitor it.

b.

Did you use the information from Score Reporter to change your approach to instruction in your PLM class?

i.

If yes, ask: would you describe an example of how you used the information from the dashboard?

1.

For example: to focus warm-ups, target one-on-one instruction, etc.

2.

Did it change your pacing of content in your PLM class?

c.

Did the information from Score Reporter impact the content coverage or pacing of your math instruction in your Control class at all?

i.

How so?

d.

Is there something that we could have done to make Score Reporter more useful to you?

1.

We would like to understand how the PLM approach is similar or different to how you taught math before you participated in this intervention and how, if at all, the intervention influenced how you teach math.

a.

Were there any new ideas, concepts, or techniques offered, or was it pretty consistent with how you taught it before?

b.

In what ways, if at all, did the PLM change how you think about math or the strategies you use to teach it?

i.

Prompt specifically on whether this change was in the PLM, the Control class, or BOTH

c.

In what ways, if at all, did it affect how you pace your instruction or the amount of time you spent on particular topics?

i.

Prompt specifically on whether this change was in the PLM, the Control Class, or BOTH

d.

Are there any changes in how you approach mathematics instruction in general that you attribute to the professional development?

e.

Did it affect how you planned your lessons for a second year?

i.

Prompt specifically on whether this change was in the PLM, the Control Class, or BOTH

2.

What effects do you think the PLM intervention had on students this year?

a.

To what extent do you think it influenced . . .

i.

Student motivation or confidence?

ii.

student learning (the way students learn or think about math)

iii.

The classroom environment (for example: student behavior)

b.

Did certain students do better or worse with the PLMs? Why do you think that is?

3.

Some teachers find computers helpful; others may not find them as useful. Can you describe how you feel about using the computer as a tool to aid in your instruction?

a.

Prompt on: what caused changes (if applicable)

b.

Autonomy, self-efficacy, ideas about your role as the teacher

4.

What is your overall impression of the intervention?

a.

Would you choose to do it again? Why or why not?

b.

Beyond what we have already talked about today, what do you think are the strengths and weaknesses of the PLM intervention?

c.

What could have been done to facilitate your implementation of the PLM?

5.

If a teacher were just getting started on a PLM, what type of advice might you give her/him about how to be successful with it?

a.

(If clarification needed) If you were on an advisory board for a school district interested in implementing this intervention, what would you want them to consider?

6.

Is there anything else you think we should know, to understand your view of the PLM, and how it worked in your class and school?

7.

Is there anything else we could have done to support you or helped you to facilitate use of the PLM?

Are there any questions that you would like to ask me? Thank you so much for your time.


Select Implementation Survey Questions


1.

During the Block A Time Period, how did you usually organize the use of the PLMs? Participants were asked to choose from the following responses: All students used the PLMs at the same time; Only a portion of the class used the PLMs at once.


2.

The next set of questions asks about how you organize instruction. You may organize instruction differently for different classes. We know that teachers use multiple strategies throughout the class; percentages do not need to add to 100%. Thinking about the Block A Time Period, on a typical day, approximately how often did students experience the following types of instruction: (1) Formal presentation of material by you; (2) Individual seat-work assigned by you; (3) Individual seat-work of their own choice; (4) Work with peers; (5) One-on-one instruction with you; (6) Small group instruction with you; (7) Work on computers (non-PLM). Participants were asked to choose from responses listed in a drop-down menu: No time on this, Less than 10%, About 10%, About 20%, About 30% . . . About 100%.


3.

During the Block A Time Period, how frequently did you use the PLM data for each of the following activities: (1) To identify and correct gaps in curriculum and instruction; (2) To tailor instruction to students' individual needs; (3) To guide testing; (4) To identify supplemental materials; (5) To identify areas where I need to strengthen my content knowledge or teaching skills; (6) To shape my instruction in the control classroom. Participants were asked to choose from the following responses: Never, Once for the Unit, About once a week, Several times a week, Every day or almost every day.


4.

During the Block A Time Period, to what extent was each of the following factors a hindrance to your classroom teaching in your classes: (1) Student absenteeism; (2) Student tardiness; (3) Lack of student engagement; (4) Student misbehavior; (5) Insufficient class time to cover all of the unit; (6) Wide range of student abilities; (7) Large class size; (8) Class- or school-level interruptions to class time (e.g., changes in schedules, assemblies, students getting called out of class, etc.); (9) Lack of teacher planning time built into the school day; (10) Pressure to cover topics unrelated to mathematics; (11) Lack of support from the principal; (12) Lack of support from other teachers; (13) Inadequate textbooks, materials or other non-technological instructional resources; (14) Lack of school resources to provide the extra help for students who need it; (15) Pressure to cover math material outside the unit because of standardized test preparation; (16) Changes in school priorities or leadership. Participants were asked to choose from responses listed in a drop-down menu: Not at all a hindrance, Slight hindrance, Moderate hindrance, Great hindrance.


5.

During the Block A Time Period, to what extent were the following hindrances to your students' use of the PLMs? (1) Problems accessing computers for the PLMs (e.g., unable to reserve computer room, laptops missing from cart, etc.); (2) Problems with student log-in; (3) PLM software running slowly; (4) Device did not work (e.g., computer, keyboard, mouse, etc.); (5) Issues with Internet connectivity; (6) Your comfort level/expertise with technology; (7) Student comfort level/expertise with technology; (8) Students doing things other than the PLM on the computer (e.g., surfing the Internet, playing games, checking e-mail, etc.); (9) Not enough time for students to work on the PLM. Participants were asked to choose from the following responses: Not at all a hindrance, Slight hindrance, Moderate hindrance, Great hindrance.




Cite This Article as: Teachers College Record, Volume 122, Number 1, 2020, pp. 1–50. https://www.tcrecord.org, ID Number: 23086.


About the Author
  • J. Cameron Anglum
    Saint Louis University
    J. CAMERON ANGLUM is an Assistant Professor at the Saint Louis University School of Education. His research examines government policies and interventions that affect the distribution of resources in K-12 public schools, particularly effects witnessed by urban schools and those schools serving the largest shares of disadvantaged students.

    Steinberg, M., Quinn, R., Kreisman, D., & Anglum, J.C. (2016). Did Pennsylvania’s statewide school finance reform increase education spending or provide tax relief? National Tax Journal, 69(3), 545–582.

    Jargowsky, P., Wood, Z., Anglum, J.C., & Karp, D. (2016). Expanding educational opportunity in urban school districts. In S. Wachter & L. Ding (Eds.), Shared prosperity in America's communities. Philadelphia: University of Pennsylvania Press.


  • Laura Desimone
    University of Delaware
    LAURA M. DESIMONE is the Associate Dean for Research at the University of Delaware College of Education and Human Development. Her research focuses on the effects of education policy at the state, local, and classroom levels. She is especially interested in studying school and classroom implementation, and the effects of teacher learning interventions on teachers and students.

    Desimone, L. M., & Pak, K. (2017). Instructional coaching as high-quality professional development. Theory into Practice, 56(1), 3–12.

    Covay Minor, E., Desimone, L., Caines Lee, J., & Hochberg, E. D. (2016). Insights on how to shape teacher learning policy: The role of teacher content knowledge in explaining differential effects of professional development. Education Policy Analysis Archives, 24(61), n61.


  • Kirsten Hill
    Independent Scholar
    KIRSTEN LEE HILL is an education researcher and consultant. She advocates for cross-stakeholder collaboration and strives to make research relevant and accessible to, and informed by, all stakeholders in public education.

    Desimone, L., & Hill, K. (2017). Inside the black box: Examining mediators and moderators of a middle school science intervention. Educational Evaluation and Policy Analysis, 39(3), 511–536.

    Desimone, L. M., Wolford, T., & Hill, K. L. (2016). Research-practice: A practical conceptual framework. AERA Open, 2(4), 2332858416679599.


 