Understanding Differences in Instructional Quality Between High and Lower Value-Added Schools in a Large Urban District


by Thomas M. Smith, Courtney Preston, Katherine Taylor Haynes & Laura Neergaard Booker - 2015

Background/Context: High schools are under increasing pressure to move beyond just graduating students, and many high schools today continue to have low rates of student retention and learning, particularly for students from traditionally low-performing subgroups. Differential dropout rates, wherein low-income students, minorities, and English language learners leave school at higher rates than other students, only compound the problem.

Purpose/Objective: In this study, we examine differences in instructional quality between two higher and two lower value-added high schools, as measured by the Classroom Assessment Scoring System – Secondary (CLASS-S). We explore (a) differences in levels of instructional quality, (b) differences in the proportions of students taking advanced courses, and (c) differences in the way teachers think and talk about their classroom challenges.

Research Design: This is a mixed methods study that combines data from classroom observation protocols with teacher interview data. We use multilevel statistical models to address the first two research questions and emergent, inductive coding to determine commonalities within schools in how teachers implement higher-quality instructional practices.

Findings/Results: We find that the average difference in instructional quality, as measured by the CLASS-S, was small across our four case study schools and that the biggest differences were between the two higher value-added high schools. Our interview data suggest that teachers in the two higher value-added schools are more proactive about providing emotional support and preventing behavioral problems, and more intentional about attending to content and engaging students in higher-order thinking.

Conclusions/Recommendations: The lack of variation in classroom instructional practice across schools suggests the need to attend to the ways that schools support academic learning outside of the classroom. Qualitative findings signal that the quality of classroom instruction is not the only critical input to students’ learning gains when trying to identify what leads schools to place highly in value-added rankings.




High schools are under increasing pressure to move beyond just graduating students. As President Obama challenges all Americans to commit to at least one year of higher education or post-secondary training, the federal government is pushing high school reform by (1) offering incentives to states to raise standards and align policies and structures to the goal of college and career readiness, and (2) using grants to encourage partnerships between school districts, colleges and universities, community organizations, and businesses to better align academic concepts with real-world challenges. These initiatives focus on the gaps between the demands of an increasingly global economy and the current outputs of our high schools. Many high schools today continue to have low rates of student retention and learning, particularly for students from traditionally low-performing subgroups (Becker & Luthar, 2002; Cook & Evans, 2000; Lee, 2002, 2004). While racial and ethnic gaps in reading and mathematics achievement between White and Black and between White and Hispanic 17-year-old students narrowed between 1978 and the early 1990s, these gaps have remained stagnant over the last two decades. Currently, gaps between Black and Hispanic 17-year-olds and their White counterparts range from two to more than three years of learning (Rampey, Dion, & Donahue, 2009). Gaps are even wider in the senior year of high school between native English speakers and English language learners. Differential dropout rates, wherein low-income students, minorities, and English language learners leave school at higher rates than other students, only compound the problem (see Kaufman & Chapman, 2004; Snyder, Dillow, & Hoffman, 2009).


A growing consensus among practitioners and researchers around the “essential components” of successful schooling has emerged from years of research. These components include a rigorous and aligned curriculum, quality instruction, personalized learning connections, a culture of learning and professional behavior, connections to external communities, systematic use of data, systemic performance accountability, and learner-centered leadership (Goldring, Porter, Murphy, Elliott, & Cravens, 2009; Preston, Goldring, Guthrie, & Ramsey, 2012). While the components are recognizable in practice, far less is known about the ways in which educators develop, implement, integrate, and sustain them. This paper examines how one of these components, quality instruction, plays out across two higher value-added (HVA) high schools (ones making above average achievement gains and maintaining below average dropout rates for students from traditionally low-performing groups) and two lower value-added (LVA) high schools in the same large urban district. While recent research is clear on the impact of teachers on student achievement (Nye, Konstantopoulos, & Hedges, 2004; Rivkin, Hanushek, & Kain, 2005; Rockoff, Jacob, Kane, & Staiger, 2008), there is scant research examining exactly how instruction in HVA schools collectively differs from instruction in LVA schools in ways that might be leading to differences in learning rates.


We first review the literature pertaining to quality instruction—particularly at the high school level—and then introduce our methods, describe our four case study schools, and present our findings. We conclude with a discussion and implications for future research.


QUALITY INSTRUCTION: TEACHER INFLUENCES ON CLASSROOM LEARNING ENVIRONMENTS


Little empirical research links specific classroom practices to student achievement, though research has begun to link classroom observation measures to achievement. Students in classrooms with higher scores on the Classroom Assessment Scoring System – Secondary (CLASS-S) make greater academic and social gains than those in classrooms with lower CLASS-S scores, though most of this evidence comes from studies conducted at the preschool and elementary level (Curby et al., 2009; LaParo, Pianta, & Stuhlman, 2004). Positive correlations have been found between algebra end-of-course exams (EOCs) and CLASS-S domains: The highest correlation is with Classroom Organization and the lowest with Emotional Support (Bell, Gitomer, McCaffrey, Hamre, & Pianta, 2011).


Using data from Cincinnati middle school students, Kane, Taylor, Tyler, and Wooten (2010) find differences between HVA and LVA teachers on a teacher evaluation observation protocol. HVA teachers are more likely to use a broad range of content appropriate instructional strategies and convey accurate content knowledge. A similar study of effective middle school math instruction, measured by student value-added and observation, revealed a strong positive correlation between teacher value-added and expert observational ratings using the Mathematical Quality of Instruction (MQI) tool (Hill, Kapitula, & Umland, 2011). The MQI includes aspects of instruction such as how teachers respond to students’ questions and misunderstandings, content knowledge, the connectedness between classroom practice and mathematics, and use of multiple representations (Hill et al., 2008). Further, the Measures of Effective Teaching (MET) Project, using data from eight school districts in eight states, identified differences in classroom practice between higher and lower value-added teachers using five different classroom observation protocols. These differences in effectiveness held when students were randomly assigned to their classes the next school year (MET Project, 2013). These five observation protocols, including the CLASS, which is used in our study, measure a wide range of classroom competencies such as behavior management, use of assessments, teacher knowledge, student participation and engagement, and intellectual challenge (MET Project, 2012). However, the MET project does not break the observation protocols down into their specific domains and dimensions, but rather uses an aggregate score of all components for each observational protocol. We employ the CLASS-S in our study because, unlike other observation protocols, it is specific to secondary schools. 
Additionally, its three domains, Emotional Support, Organizational Support, and Instructional Support, are well aligned with the extant literature base describing quality instruction.  


As classroom learning environments and effective instructional practices vary between elementary and secondary schools (Firestone & Herriott, 1982), our literature review focuses on aspects of quality instruction at the middle and high school levels. Although our analysis focuses specifically on high schools, we include research at the middle school level because of the small research base focusing specifically on high schools. The vast majority of the extant research on effective instruction at the high school level is conceptual or takes the form of either (1) case studies describing the practices of highly effective teachers or (2) descriptive studies of programs designed to increase student achievement, particularly in math (e.g., Anderman, Andrzejewski, & Allen, 2011; Boaler & Staples, 2008). While there are many ways to define quality instruction, such models, together with the review of the literature on effective instructional practices at the high school level that follows, suggest two key strands contributing to quality instruction: classroom environment and classroom instructional practices.


The classroom environment strand includes elements of both classroom organization and classroom climate, which align to CLASS-S’s Organizational Support and Emotional Support, respectively. In a study of high school teachers whose students recognize them as supporting their engagement and motivation, Anderman et al. (2011) identify three contextual features of classrooms that support motivation and engagement: a motivational climate, students’ perceptions of social support from their teacher, and academic press. Teachers seen as supporting their students’ engagement and motivation were those who provided structure for them, cared about how they did in school, valued their input, built and maintained rapport with them, respected them by talking to them rather than at them, and listened to them (Anderman et al., 2011; Cothran, Kulinna, & Garrahy, 2003; Klem & Connell, 2004). High school students tend to value academic rewards like grades, fun yet educational experiences, and a sense of humor, fairness, accessibility, and empathy from their teachers (Muller, Katz, & Dance, 1999). Specifically, students desire teachers to “be mentors who can see the world from the student’s perspective and yet provide wise advice, direction, admonishment, and praise; thereby they would facilitate learning” (p. 317). Other important components of an emotionally supportive classroom climate include holding high expectations for all students, making content both relevant and important to real life (Boaler & Staples, 2008), and creating structures and a climate in which students can take risks and experience failure without negative consequences (Alper, Fendel, Fraser, & Resek, 1997).


Classroom organizational practices, including classroom management, constitute the other key area of the classroom environment strand. Improved classroom management skills are often cited as a major reason why beginning teachers’ effectiveness grows over the first few years of their careers (Berliner, 1988; Clotfelter, Ladd, & Vigdor, 2007; Harris & Sass, 2011; Kagan, 1992). Reviewing a number of conceptualizations of classroom management, Emmer and Stough (2001) conclude, “classroom management encompasses both establishing and maintaining order, designing effective instruction, dealing with students as a group, responding to the needs of individual students, and effectively handling the discipline and adjustment of individual students” (p. 104). Gregory and Ripski (2008) define a relational approach to classroom discipline as a teacher’s emphasis on building connections and personal relationships with students as a means to gain students’ cooperation. They find that such an approach to discipline is significantly related to high school student cooperation and negatively related to student defiance. Furthermore, middle and high school students report the most effective teachers set clear expectations and consequences for their students early in the year. They also report that students value consistency, and that teachers must strike a balance between being too strict and too lax. “The key to being a successful, strict teacher relied partially on building positive relationships with students” (Cothran et al., 2003, p. 438).


Instructional practices, aligned to CLASS-S’s Instructional Support, comprise the second strand of quality instruction. Common classroom instructional practices emerging from the literature include collaborative group work and inquiry-based learning (Staples, 2007), formative assessment (Brown, 2008), scaffolding, and introducing new concepts concretely (e.g., introducing the geometric concept of area using physical representations rather than as an abstract idea in a textbook) (Alper et al., 1997). A number of researchers (Allensworth, Correa, & Ponisciak, 2008; Cohen & Hill, 2000; Darling-Hammond, Ancess & Ort, 2002; Marzano, 2001; Wenglinsky, 2002) agree that effective teachers provide opportunities for students to learn from each other. For example, group learning opportunities were associated with Chicago students’ ACT scores: “English subject test scores were particularly high in classrooms where students regularly improve a piece of writing as a class or in partners” (Allensworth et al., 2008, p. 50).


Making instruction relevant to students is another key component of quality instructional practices: “One way of getting students to engage in their course work is to help them see that the work they do in school will prepare them for their future goals” (Allensworth et al., 2008, p. 60). Rosiek (2003) refers to this as emotional scaffolding, “teachers’ use of analogies, metaphors, and narratives to influence students’ emotional response to specific aspects of the subject matter in a way that promotes student learning” (p. 402). In practice, such “authentic” pedagogy requires instructors to incorporate short- and long-term projects into their instruction and assessment plans. Put succinctly, “the more students do real-world problems, the better the school performs” (Wenglinsky, 2004, p. 6).


In summary, the current literature base on quality instruction at the secondary level emphasizes two strands: what teachers do to enhance the learning environment in their classroom and what instructional practices leverage student learning. While the literature on effective practice at the high school level is sparser than at the middle and elementary levels, the themes of motivational support, social support, and disciplinary support come through clearly as ways to enhance the learning environment. In the instructional strand, accurate delivery of content, use of inquiry-based tasks, student collaboration, formative assessment, links to real-world situations, and encouragement of multiple representations are emphasized. Most of the studies that we found in our literature search were case studies focused on practices implemented by “effective” or “master” teachers. The teaching practices of teachers within and across core subject departments in the same school are rarely examined. Several more recent studies have examined relationships between student value-added scores and individual teachers’ practices at the middle school level.


As there is so little empirical literature detailing differences in instructional practices between teachers in HVA and LVA high schools, we seek to expand this literature base by examining differences in the quality of instruction in English/Language Arts (ELA), mathematics, and science between HVA and LVA high schools within the social and political context of a single large urban district. As prior literature suggests that there is more within-school than between-school variability in both student achievement and instructional quality (Rivkin et al., 2005; Rockoff, 2004), our purpose is to identify whether there might be differences in how classroom instructional interactions are approached by schools that are “beating the odds” (HVA) for students from traditionally low-performing groups compared to schools where the learning gains for students in these groups fall below the district average (LVA). Further, as the tracking literature suggests instructional quality is unequally distributed between students in honors/advanced courses compared to regular/on-level courses (Applebee, Langer, Nystrand, & Gamoran, 2003; Gamoran, Nystrand, Berends, & LePore, 1995; Oakes, 2005), we also focus on differences in instructional quality by track and the proportion of students who take honors/advanced classes.


If some schools have systematically more effective instructional programs than others, it is important to identify the particular domains of practice, as well as the aspects of the school organization that support them, so that they can be shared and implemented in other schools. Thus, while the data that we present cannot establish a causal link between instructional practices and school value-added, we believe that an in-depth case study approach can help us understand how observed instructional practices in high schools, as well as the ways that schools are explicitly or implicitly organizing to improve instruction, may contribute to the success of a school in value-added rankings. As many states have incorporated high school value-added measures into school and principal evaluations, pushed by federal initiatives such as Race to the Top, the Teacher Incentive Fund, and the Elementary and Secondary Education Act (ESEA), it is important to investigate how forms of classroom practice differ between HVA and LVA schools. To accomplish this goal, we explore the following research questions:


1. In what domains of instruction do HVA high schools have higher levels of instructional quality than LVA high schools, as measured by the CLASS-S? Are the instructional quality gaps between advanced and regular courses narrower in the HVA schools?

2. What are the differences between HVA and LVA schools in the proportions of students taking advanced courses?

3. Do the differences in the ways that teachers in HVA and LVA schools talk and think about their challenges in the classroom help to explain differences in school-level value-added?


METHODS


Our first task was to select a school district that had significant numbers of students from traditionally low-performing groups as well as schools that were achieving above average gains and schools that were achieving below average gains for students in these groups. Grissom, Kalogrides, and Loeb (2014) show that school-level value-added measures are positively correlated with many other measures of school performance, making value-added an attractive selection metric. Meyer (1997) makes a persuasive argument that school-level value-added is a significant advance over ways in which schools have typically been compared (such as average or median school test scores). Another advantage of school value-added over teacher value-added relates to the potential bias that tracking has on teacher value-added estimates (Harris, 2011). While internal tracking mechanisms can lead to bias in teacher value-added scores, the aggregate efficacy of a school’s tracking policy would be a desirable component of a school-level value-added measure, as the organization of the schedule and the allocation of teachers to courses should influence school performance measures.


We applied a simple value-added achievement model (VAM) to estimate the relative performance of all high schools in Florida and selected Broward County Public Schools (BCPS) as meeting these characteristics.1 The district serves large proportions of traditionally underperforming student subgroups, including low-income, minority, and English language learner (ELL) students. The student population during the 2010–2011 school year was 38% African American, 28% Hispanic, 27% White, and 7% other. In the district, 48% of students are eligible for free or reduced-price lunches and 10% are classified as ELL. Four high schools in the district—two higher performing and two lower performing—were selected for case study on the basis of the school-level estimates from the VAM analysis (the schools are described below).


In order to get a better estimate of the relative effectiveness of BCPS high schools in promoting student learning, we estimated a simple value-added achievement model of the following form:


ΔAit = βXit + ϕm + Γit + νit    (1)


ΔAit represents the achievement gain for student i in year t relative to their prior-year score in year t-1, and Xit is a vector of individual student characteristics including gender, race/ethnicity, limited English proficiency (LEP) program participation, free-lunch status, reduced-price lunch status, gifted program participation, a set of broad disability categories for students in special education, student mobility (within-year and between-year school change), pre-high-school (grade 8) attendance, and normed math and reading test scores. The term ϕm is a school-specific fixed effect. Grade-by-year indicators, Γit, are also included to account for any unmeasured grade and year influences, such as variation in the difficulty of the test. The estimated value of ϕm is the average test score gain of students at school m, conditional on observed student characteristics. It thus represents the combined effect on student learning of all school-related inputs, including teacher quality, average peer influences, instructional materials, physical facilities, and school leadership. It is analogous to the value-added often computed for individual teachers and can thus be considered a school value-added measure. Data on student gains in both math and reading over the years 2004–2005 to 2008–2009 were used to estimate the value-added model, so the estimated school effects represent the average contribution of a high school to student learning gains in either math or reading over the 2005–2006 to 2008–2009 time period, conditional on observed student characteristics. Separate analyses were conducted for math and reading, as well as for various student groups (all students, free/reduced-price lunch students, LEP students, and Black and Hispanic students).
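The structure of equation (1) can be illustrated as an ordinary least squares regression with school and grade-by-year fixed effects. The sketch below uses simulated data; the variable names and the statsmodels formula interface are our illustrative assumptions, not the study's actual estimation code.

```python
# Illustrative sketch of the school value-added model in equation (1):
# achievement gains regressed on student characteristics (X_it) plus
# school (phi_m) and grade-by-year (Gamma_it) fixed effects.
# Simulated data; not the authors' code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "school": rng.integers(0, 30, n),        # school id (phi_m)
    "grade_year": rng.integers(0, 12, n),    # grade-by-year cell (Gamma_it)
    "free_lunch": rng.integers(0, 2, n),     # one of the X_it controls
    "lep": rng.integers(0, 2, n),
    "prior_score": rng.normal(0, 1, n),      # normed prior test score
})
# Simulate gains with a true school effect so the fixed effects are recoverable.
school_effect = rng.normal(0, 0.2, 30)
df["gain"] = (school_effect[df["school"]] - 0.1 * df["free_lunch"]
              + 0.05 * df["prior_score"] + rng.normal(0, 1, n))

# OLS with school and grade-by-year fixed effects; the coefficient on each
# C(school) level is that school's conditional average gain (its value-added).
fit = smf.ols("gain ~ free_lunch + lep + prior_score "
              "+ C(school) + C(grade_year)", data=df).fit()
vam = fit.params.filter(like="C(school)")    # estimated school effects
print(vam.head())
```

Schools could then be ranked on these estimated effects; the omitted school serves as the reference category, so each coefficient is a difference from that school's conditional average gain.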


The next step was to select two higher value-added and two lower value-added schools for in-depth case study investigations. We wanted to select schools that were relatively high performing for all student groups as well as schools that were relatively ineffective for each student group. We then cross-checked that the higher-performing schools, as measured by value-added, also had graduation rates for students in traditionally low-performing subgroups that were above the district average. Charter schools and magnet schools were excluded from selection, as the choice component in the admissions process may have influenced these schools’ value-added results. Two HVA schools and two LVA schools that serve large proportions of students in traditionally low-performing subgroups were recommended to our district partners for selection as case study schools. Once the district leadership approved the list, each school’s principal was invited to participate in the study. As one of the principals declined to participate, that school was replaced with another school with a similar rank order for value-added performance and similar subgroup representation. Additional detail on the value-added models and selection criteria can be found in Sass (2012).


Researchers collected data during three week-long visits to each of the four case study high schools during the fall, winter, and spring of the 2010–2011 school year. The study was designed to identify the programs, policies, and practices that effective schools in BCPS used to coordinate the essential components into successful outcomes for students.


THE FOUR CASE STUDY SCHOOLS


Next, we describe the context of the four case study schools in which we conducted our teacher observations and interviews over the 2010–2011 school year (see Table 1 for a comparison of the four schools).


Table 1. Demographic Characteristics and Performance Indicators of Case Study High Schools

                                        LVA Schools                 HVA Schools
                                        Starling      Bayfront      Water Oak     Torreya
School characteristics
  Enrollment                            1,600–2,000   1,900–2,300   2,600–3,000   2,000–2,400
  Percent minority                      55–65%        55–65%        50–60%        65–75%
  Percent economically disadvantaged    60–70%        45–55%        30–40%        45–55%
  Percent Limited English Proficient    10–15%        5–10%         5–10%         5–10%
2010 Graduation Rate                    <80%          <80%          >85%          >85%
2011 School Grade                       C             A             B             A

Note. The state accountability rating and graduation rate were the most recent data available at the time of school selection. See http://schoolgrades.fldoe.org/pdf/1011/Guidesheet2011SchoolGrades.pdf. Demographics represent the composition of the schools at the time of our visits (2010–2011). The value-added ranks are derived from three years of school-level value-added data in math, science, and reading. The most recent year was 2009–2010.


HIGHER VALUE-ADDED (HVA) SCHOOLS


Water Oak


Water Oak was one of the two HVA schools and enrolled between 2,600 and 3,000 students2 during the 2010–2011 school year. Of those students, 30%–40% qualified for free or reduced-price lunch. Students of minority status comprised 50%–60% of the student population, and 5%–10% of its students were classified as English language learners. The school’s grade according to Florida’s grading system has changed from an A to a B3 over the past several years. Its Florida Differentiated Accountability status was Correct II.4


Torreya


Torreya was the second HVA school and enrolled approximately 2,000–2,400 students during the school year. Students eligible for free or reduced-price lunch represented 45%–55% of the student population. Racial/ethnic minorities made up the majority of the student body (65%–75%), and 5%–10% of students were English language learners. Torreya’s school grade has been an A over the last several years, and the school was categorized as Correct I5 status due to its recent success in meeting AYP criteria; it was the only case study school to fall into this category. Student admission is based on an academically nonselective lottery system in which enrollment does not need to mirror district demographics.


LOWER VALUE-ADDED (LVA) SCHOOLS


Starling


Starling was one of the LVA schools. During the 2010–2011 school year, Starling served between 1,600 and 2,000 students. Students qualifying for free or reduced-price lunch comprised approximately 60%–70% of enrollment. Between 55% and 65% of the population was of minority status, and 10%–15% of its students were classified as English language learners. The school’s grade has moved between a C and a D over the last several years, and its Differentiated Accountability status was Correct II.4


Bayfront


Bayfront was the second LVA school and had between 1,900 and 2,300 students in 2010. Students qualifying for free or reduced-price lunch made up 45%–55% of the student body. Approximately 55%–65% of the population was of minority status, and 5%–10% of its students were classified as English language learners. Its school grade has fluctuated between an A and a B over the last several years. During the 2010–2011 academic year, it was in Correct II4 status.


MEASURING QUALITY INSTRUCTION


We investigated quality of instruction through observations of classroom instruction as well as coding of interviews with school administrators, teachers, and students regarding how schools are explicitly or implicitly organizing to improve instruction.


CLASSROOM OBSERVATIONS


These classroom observations were coded using the Classroom Assessment Scoring System – Secondary (CLASS-S), developed by Pianta, Hamre, Hayes, Mintz, and LaParo (2007). The CLASS was originally designed to measure preschool and early elementary teachers’ instructional practices, but has been expanded for application to the secondary level. We chose the CLASS-S because of its alignment with the strands of quality instruction that emerge from the literature as important for student learning. CLASS-S assesses the quality of teachers’ social and instructional interactions with students as well as the intentionality and productivity evident in classroom settings, mapping well to the classroom learning environments and effective instructional practices identified in our literature review. The focus of the CLASS is on what teachers do with the materials they have and on their interactions with students. The rubric is designed to apply across curricula, lesson formats, and variation in the physical setup of the classroom. The CLASS-S measures middle and secondary teachers’ practices and instructional quality across content areas in three broad domains: (a) Emotional Support, (b) Organizational Support, and (c) Instructional Support. Each domain is organized into multiple dimensions and each dimension consists of several indicators (see Table 2). The indicators map well to the literature we reviewed above on what teachers do to enhance the learning environment in their classroom and what instructional practices leverage student learning: Emotional Support and Organizational Support parallel the two aspects of classroom learning environment—classroom climate and classroom organization, respectively—while Instructional Support encompasses the instructional practices that emerge from the literature as a component of quality instruction.


The coding scheme is designed for raters to use while watching classroom instruction, live or on video, in 20-minute segments while taking notes on the CLASS-S indicators. Raters then take 10–15 minutes to review their coding manual and assign scores to each of the 11 dimensions. CLASS-S scoring is completed immediately after each observation cycle. Coders rate each dimension as low (1, 2), mid (3, 4, 5), or high (6, 7). While the CLASS-S manual provides general scoring guidelines, it notes that “observers should view the dimensions as holistic descriptions of classrooms that fall in the low, mid, or high range.” All coders completed a CLASS observer training and were certified by completing an online reliability test before observing and coding classroom practice in our four case study schools.
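The scoring arithmetic described above (dimension ratings of 1–7, banded into low/mid/high ranges, averaged into domain scores) can be expressed in a few lines. This is an illustration of the arithmetic only, with hypothetical rating values; it is not part of the CLASS-S instrument itself.

```python
# Illustration of CLASS-S-style score aggregation: each 20-minute segment
# yields a 1-7 rating per dimension; ratings band into low (1-2),
# mid (3-5), or high (6-7), and a domain score averages its dimensions.
# Dimension names follow Table 2; the values are hypothetical.

def band(rating: int) -> str:
    """Map a 1-7 dimension rating to the low/mid/high range."""
    if not 1 <= rating <= 7:
        raise ValueError("CLASS-S ratings run from 1 to 7")
    return "low" if rating <= 2 else ("mid" if rating <= 5 else "high")

def domain_score(dimension_ratings: dict[str, int]) -> float:
    """Average the dimension ratings that make up one domain."""
    return sum(dimension_ratings.values()) / len(dimension_ratings)

# One observed segment's ratings for the Emotional Support domain
# (hypothetical values; Negative Climate is reverse-keyed in practice,
# but is kept raw here for simplicity):
emotional_support = {
    "positive_climate": 5,
    "negative_climate": 2,
    "teacher_sensitivity": 4,
    "regard_for_adolescent_perspective": 3,
}
print(domain_score(emotional_support))   # 3.5
print(band(6))                           # high
```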


We targeted 10th grade English/language arts, mathematics, and science classes in fall, winter, and spring of the 2010–2011 school year. Seventy-three teachers were observed, with between two and seventeen 20-minute segments coded for each teacher (for a total of 685 segments). Because research on tracking in high schools suggests that higher-track classes tend to have higher-quality instruction than lower-track classes, we wanted to assess whether this was occurring in our case study schools, as well as whether HVA schools “compressed” or narrowed the differences in instructional quality between their higher- and lower-track classes more than LVA schools. To increase the number of advanced classes observed in each school, a small number of additional advanced classes were sampled in ninth, 10th, and 12th grades. For example, we asked to observe a higher-track course taught by the same 10th-grade teacher whom we had already observed teaching a regular-track course, and vice versa.



Table 2. Overview of 2007 CLASS-S Dimensions, Domains, and Indicators


Domain: Emotional Support
- Positive Climate: Relationships; Positive affect; Positive communications; Respect
- Negative Climate: Negative affect; Punitive control; Disrespect
- Teacher Sensitivity: Awareness; Responsiveness to academic & social/emotional needs and cues; Effectiveness in addressing problems; Student comfort
- Regard for Adolescent Perspective: Support for student autonomy & leadership; Connections to current life; Student ideas and opinions; Meaningful peer interactions; Flexibility

Domain: Classroom Organization
- Behavior Management: Clear expectations; Proactive; Effective redirection of misbehavior; Student behavior
- Productivity: Maximizing learning time; Routines; Transitions
- Instructional Learning Formats: Learning targets/organization; Variety of modalities, strategies, and materials; Active facilitation; Effective engagement

Domain: Instructional Support
- Content Understanding: Depth of understanding; Communication of concepts and procedures; Background knowledge and misconceptions; Transmission of content knowledge and procedures
- Analysis and Problem Solving: Opportunities for higher-level thinking; Problem solving; Metacognition
- Quality of Feedback: Feedback loops; Prompting thought processes; Scaffolding; Providing information; Encouragement and affirmation

Domain: Student Engagement (no sub-dimensions)
- Active engagement; Sustained engagement

CLASS-S: ANALYTIC STRATEGY


To answer our first research question, In what domains of instruction do HVA high schools have higher levels of instructional quality than LVA high schools, and are the instructional quality gaps between advanced and regular courses narrower in the HVA schools?, we estimated differences among schools in average scores, and in the size of the gap between regular and advanced classes, using a multilevel statistical model (Raudenbush & Bryk, 2002) that adjusts for the clustering of observation segments within teachers. In these models, the dependent variable is the CLASS-S domain score and the independent variables are fixed effects for each school (Torreya is omitted and serves as the comparison school), along with controls for grade level, subject (English/Language Arts, Mathematics, or Science), and course track (advanced vs. regular). We conducted a multivariate hypothesis test to determine whether the HVA schools (Torreya and Water Oak) had higher CLASS-S domain scores than the LVA schools (Bayfront and Starling). We also include two data tables in the Appendix. The first, Appendix Table 1, combines data for the two HVA schools and the two LVA schools; the coefficient on HVA is the average difference in the CLASS-S domain score between the teachers observed in the HVA schools and the teachers observed in the LVA schools. The second, Appendix Table 2, includes the interaction between school value-added status and track (HVA X Honors); here the coefficient on HVA reflects the difference between the CLASS-S scores of teachers in regular-track classes in the two HVA schools and those of teachers in regular-track classes in the LVA schools. The controls are included to adjust for between-school differences in the number of class periods observed with these characteristics. Interactions between the school-level fixed effects and the track level of the course observed were also tested to determine whether the HVA schools have narrower track gaps than the LVA schools.
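The two specifications above can be sketched in code. The following is a minimal illustration using simulated data, with segments nested within teachers; the column names, school labels, effect sizes, and statsmodels specification are our illustrative assumptions, not the authors' actual code or data, and the paper's models additionally control for grade level, subject, and observation visit.

```python
# Sketch of the two-level models (segments nested within teachers) on
# simulated data. All names and effect sizes here are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for s_i, school in enumerate(["Torreya", "WaterOak", "Bayfront", "Starling"]):
    for t in range(12):                     # 12 teachers per school
        u = rng.normal(0, 0.4)              # teacher-level random intercept
        for seg in range(6):                # 6 coded 20-minute segments each
            honors = int(rng.random() < 0.3)
            score = 4.5 - 0.2 * s_i + 0.4 * honors + u + rng.normal(0, 0.7)
            rows.append(dict(school=school, teacher_id=f"{school}_{t}",
                             honors=honors, score=score))
df = pd.DataFrame(rows)

# Main-effects model: school fixed effects with Torreya as the omitted
# comparison category, plus the honors (track) indicator.
m1 = smf.mixedlm("score ~ C(school, Treatment('Torreya')) + honors",
                 data=df, groups=df["teacher_id"]).fit()

# Interaction model: 'honors' is now the honors/regular gap at Torreya, and
# each school-by-honors interaction is the difference in that gap relative
# to Torreya.
m2 = smf.mixedlm("score ~ C(school, Treatment('Torreya')) * honors",
                 data=df, groups=df["teacher_id"]).fit()
```

With Torreya's dummy fixed at zero, the multivariate HVA-vs.-LVA test reported in the findings amounts to a Wald test of the linear restriction b_Bayfront + b_Starling = b_WaterOak on the fitted fixed effects.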


ANALYSIS OF ADMINISTRATIVE DATA


To answer our second research question, What are the differences between HVA and LVA schools in the proportions of students taking advanced courses?, we analyzed the master schedule of each of the four case study schools for the 2010–2011 school year. We estimated the proportion of course offerings that were Advanced Placement or honors, by subject area and school.
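The calculation behind these proportions is straightforward once a master schedule is flattened into one row per course section. The sketch below is illustrative only: the toy schedule, column names, and level labels are our assumptions, not the districts' actual schedule data.

```python
# Illustrative sketch: share of course sections flagged as advanced
# (AP, honors, gifted, dual enrollment) by school and subject, computed
# from a flattened master schedule. Data and names are hypothetical.
import pandas as pd

schedule = pd.DataFrame([
    # school, subject, level
    ("Torreya",  "language arts", "honors"),
    ("Torreya",  "language arts", "regular"),
    ("Torreya",  "mathematics",   "AP"),
    ("Bayfront", "language arts", "regular"),
    ("Bayfront", "language arts", "honors"),
    ("Bayfront", "mathematics",   "regular"),
], columns=["school", "subject", "level"])

advanced_levels = {"AP", "honors", "gifted", "dual enrollment"}
schedule["advanced"] = schedule["level"].isin(advanced_levels)

# Proportion of sections that are advanced, by school and subject
shares = schedule.groupby(["school", "subject"])["advanced"].mean()
```

Because the unit here is the course section, these shares describe offerings rather than enrollments; weighting by section enrollment would give the proportion of students in advanced courses.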


ANALYSIS OF INTERVIEW DATA


To answer our third research question, To what extent are there differences in how teachers from HVA and LVA schools talk about their instructional practices?, we analyzed interview data from the teachers we observed. Sixty-seven teacher interviews, 16 to 18 in each school, were transcribed and coded for teachers’ answers to several key questions related to quality instruction,6 including: (1) What are the major challenges for improving student learning?; (2) What are you doing to address these challenges?; and (3) What are you doing to improve the quality of your instruction in your classroom? Using NVivo 9 software (Bazeley, 2007; Edhlund, 2011; QSR International, 2009), we coded the teacher interview data using emergent, inductive methods (Charmaz, 2006; Glaser & Strauss, 1967) while applying an a priori framework (Patton, 2001) that followed the three main questions above. The purpose was to identify teachers’ perceptions of their classroom challenges, what they do instructionally to address those challenges, and what strategies they are using in their classrooms to improve student learning. We then engaged in a data reduction process (Patton, 2001) to consolidate the coders’ emergent themes. We were particularly interested in commonalities within schools in how teachers were organizing to implement higher-quality instructional practices.


Table 3. Results of HLM Models Predicting Variation in CLASS-S Dimension Scores

Columns (1)–(2): Emotional Support; (3)–(4): Classroom Organization; (5)–(6): Instructional Support; (7)–(8): Student Engagement. Torreya is the comparison category for the school fixed effects.

| Predictor          | (1)           | (2)           | (3)           | (4)           | (5)           | (6)           | (7)            | (8)            |
| Bayfront           | -.27 (.20)    | -.65** (.23)  | -.38 (.24)    | -.62* (.27)   | -.46 (.25)    | -.74* (.29)   | -.12 (.26)     | -.48 (.30)     |
| Starling           | .05 (.21)     | -.08 (.24)    | -.38 (.26)    | -.36 (.29)    | -.27 (.27)    | -.29 (.31)    | -.05 (.28)     | -.22 (.31)     |
| Water Oak          | -.26 (.20)    | -.29 (.24)    | -.56* (.25)   | -.59* (.29)   | -.85** (.26)  | -1.03** (.31) | -.48 (.27)     | -.77* (.32)    |
| Honors             | .44** (.07)   | .18 (.15)     | .41*** (.09)  | .30 (.18)     | .37*** (.11)  | .13 (.22)     | .59*** (.10)   | .19 (.21)      |
| Bayfront × Honors  |               | .86*** (.23)  |               | .56* (.28)    |               | .66 (.34)     |                | .81* (.33)     |
| Starling × Honors  |               | .24 (.19)     |               | -.05 (.23)    |               | .03 (.29)     |                | .32 (.28)      |
| Water Oak × Honors |               | .08 (.21)     |               | .07 (.26)     |               | .34 (.32)     |                | .55 (.30)      |
| 9th Grade          | .06 (.14)     | -.01 (.14)    | .08 (.17)     | .04 (.17)     | .30 (.21)     | .26 (.21)     | .20 (.20)      | .18 (.20)      |
| 11th Grade         | .67** (.21)   | .63 (.21)     | -.17 (.25)    | -.18 (.25)    | .22 (.32)     | .22 (.32)     | -.45 (.30)     | -.43 (.30)     |
| 12th Grade         | -.15 (.24)    | -.33 (.25)    | -.83** (.28)  | -.91** (.29)  | -.35 (.35)    | -.35 (.37)    | -1.28*** (.33) | -1.26*** (.35) |
| Mathematics        | -.39* (.17)   | -.33 (.18)    | -.27 (.22)    | -.23 (.22)    | -.20 (.22)    | -.15 (.23)    | -.30 (.23)     | -.25 (.23)     |
| Science            | -.31 (.17)    | -.29 (.18)    | -.09 (.21)    | -.07 (.21)    | -.30 (.22)    | -.28 (.22)    | -.07 (.23)     | -.05 (.23)     |
| Visit 2            | -.02 (.06)    | -.05 (.06)    | -.02 (.07)    | -.03 (.07)    | -.16 (.09)    | -.17 (.09)    | .13 (.08)      | .11 (.08)      |
| Visit 3            | .10 (.09)     | -.03 (.10)    | -.11 (.11)    | -.22 (.13)    | -.21 (.14)    | -.37* (.16)   | .20 (.13)      | .03 (.15)      |
| Constant           | 5.28*** (.18) | 5.42*** (.20) | 5.30*** (.22) | 5.36*** (.23) | 4.67*** (.23) | 4.80*** (.25) | 4.90*** (.23)  | 5.11*** (.25)  |
| N                  | 669           | 669           | 662           | 662           | 641           | 641           | 680            | 680            |

Standard errors in parentheses

* p < .05, ** p < .01, and *** p < .001

FINDINGS


RQ1: TO WHAT EXTENT DO HVA HIGH SCHOOLS HAVE HIGHER LEVELS OF INSTRUCTIONAL QUALITY THAN LVA HIGH SCHOOLS? ARE THE INSTRUCTIONAL QUALITY GAPS BETWEEN ADVANCED AND REGULAR COURSES NARROWER IN THE HVA SCHOOLS?


Based on the research reviewed above, and given that the primary purpose of current policy efforts to improve teacher quality is to increase student learning gains, our working hypothesis was that HVA schools would have higher CLASS-S scores for Emotional Support, Classroom Organization, Instructional Support, and Student Engagement than the LVA schools. We also hypothesized that the HVA schools would have a narrower gap between higher-track classes (honors and above) and regular/on-level classes. Our HLM results, shown in Table 3 and Appendix Table 1, suggest that patterns of instructional quality do not always distinguish HVA from LVA schools. In Table 3, there are two columns of HLM estimates for each of the four CLASS-S domains reported. The first column includes fixed effects (dummy variables) for Bayfront, Starling, and Water Oak (Torreya is the comparison category). These can be interpreted as the differences in CLASS-S scores between each of these schools and Torreya. In Appendix Table 1, the coefficient on HVA is the average difference in the CLASS-S domain score between the teachers observed in the HVA schools and the teachers observed in the LVA schools.


The second column of Table 3 and each of the columns in Appendix Table 2 include an interaction between “honors” and “school” (Table 3) or “honors” and “HVA” (Appendix Table 2). In Table 3, these interaction coefficients represent the differences between the honors/regular course gap at Torreya and the corresponding gaps at Bayfront, Starling, and Water Oak; the coefficient on “honors” represents the adjusted gap for Torreya. In Appendix Table 2, the coefficient on HVA reflects the difference between the CLASS-S scores of teachers in regular-track classes in the two HVA schools and those of their counterparts in the LVA schools, while the coefficient on HVA X Honors reflects the difference in the honors/regular gap between the two HVA schools and the LVA schools.


Graphs displaying the schools’ average differences across each domain and by track are presented below. Both HVA and LVA schools had CLASS-S domain scores in the mid-range of the 1–7 point scale (low scores being 1 or 2 and high scores being 6 or 7). Contrary to expectations, Water Oak, an HVA school, tended to be at the lower end of these distributions, while Torreya, the other HVA school, tended to be at the upper end. Across all four schools, advanced/honors courses had higher average scores than regular classes (differences of about half a point, or approximately half a standard deviation). Differences by domain are detailed below.


Emotional Support


The CLASS-S Emotional Support domain includes the dimensions of Positive Climate, Negative Climate, Teacher Sensitivity, and Regard for Adolescent Perspective. Teachers at all four schools had mid-level Emotional Support scores on average in both honors and regular classes, with the honors course average approaching the high range. There were no statistically significant differences between the four schools on the Emotional Support domain score (Model 1 of Table 3), controlling for track, grade level, subject, and time of year of the observation. Correspondingly, there were no statistically significant differences between the HVA and the LVA schools (H0: βBayfront + βStarling = βWaterOak + βTorreya; χ2 = .02; p = 0.876).


Across all four schools, teachers of honors courses had, on average, significantly higher Emotional Support scores (Model 1, b = 0.44, p < .0001). When an interaction between track and school is added, results show that the average gap between honors and regular classes on the Emotional Support domain was narrower in HVA than LVA schools (χ2 = 12.27; p = 0.0005), although most of this difference was driven by the contrast between Torreya (HVA) and Bayfront (LVA) (Model 2, b = 0.86), suggesting that Emotional Support in regular classes is particularly problematic at Bayfront. Differences on the Positive Climate dimension illustrate this overall pattern. For example, while there are no statistically significant differences across the four schools in average Positive Climate, adding an interaction term between school and track shows that Bayfront has the lowest average Positive Climate score in regular classes and the largest gap between its regular and honors classes (compared to Torreya and Water Oak, results not shown).


An example of a classroom behavior that would result in a mid-level score on Positive Climate might be “the teacher and some students appear generally supportive and interested in one another, but these interactions are muted or not representative of the majority of students in the class.”7 Scores across the schools on the Regard for Adolescent Perspective dimension were similar, with scores in the mid-range and no statistically significant differences between the four schools for regular classes, but a wider gap between honors and regular classes at Bayfront than at Torreya. An example of a mid-range score in Regard for Adolescent Perspective might be “material is sometimes connected to the current experiences of adolescents and sometimes makes salient how or why the material is of value to students.”


Classroom Organization


The Classroom Organization domain includes the dimensions of Behavior Management, Productivity, and Instructional Learning Formats. While the two HVA schools did not show systematically better Classroom Organization scores than the LVA schools (H0: βBayfront + βStarling = βWaterOak + βTorreya; χ2 = 0.32; p = 0.57), Torreya (HVA) had a higher average Classroom Organization score than Water Oak, the other HVA school (Model 3, b = -0.56, p = 0.027), controlling for track, grade level, subject, and time of year of the observation. As was the case with Emotional Support, teachers of honors courses had significantly higher Classroom Organization scores, although the honors/regular gap did not differ significantly between the HVA and LVA schools (χ2 = 1.55; p = 0.213). The gap was narrower at Torreya (HVA) than at Bayfront (LVA), however (Model 4, b = 0.56). Classroom Organization in the mid-range reflects observations where “most of the time there are tasks for students, but learning time is sometimes limited by disruption and/or inefficient completion of management tasks.”


Instructional Support


The Instructional Support domain consists of the dimensions of Content Understanding, Analysis and Problem Solving, and Quality of Feedback. As with Emotional Support and Classroom Organization, teachers of honors courses had significantly higher scores than teachers of regular courses (Model 5, b = 0.37, p = 0.001), with this gap being largest at Bayfront (Model 6, b = 0.66, p = .054); there were no statistically significant gaps between the honors and regular courses in the other schools (χ2 = 1.37; df = 2; p = 0.50). The widest gaps in average scores were again between the two HVA schools, Torreya and Water Oak (Model 5, b = -0.85), while there was no difference, on average, between the Instructional Support scores of the HVA and LVA schools (H0: βBayfront + βStarling = βWaterOak + βTorreya; χ2 = 0.10; p = 0.75). This gap in scores between the two HVA schools held for each of the dimensions of Content Understanding, Analysis and Problem Solving, and Quality of Feedback (results not shown). A mid-level score on Content Understanding could reflect cases where “class discussion and materials communicate a few of the essential attributes of concepts/procedures but examples are limited in scope or not consistently provided.” A classroom scoring in the mid-range on Analysis and Problem Solving would reflect observations where “students occasionally engage in higher-order thinking through inquiry and analysis, but these episodes are brief or limited in depth.”


Student Engagement


Finally, in the area of Student Engagement there were also no statistically significant differences between the HVA and LVA schools (H0: βBayfront + βStarling = βWaterOak + βTorreya; χ2 = 0.69; p = 0.41), although Water Oak (HVA) had the lowest average score (difference between Water Oak and Torreya = -0.48, p = 0.072, Model 7), controlling for grade, subject, and time of year of the observation. As with the other domains, the gap between honors and regular courses in Student Engagement scores was widest at Bayfront (LVA)—over 1 point, or 78% of a standard deviation—compared to Torreya (HVA) (Model 8, difference between Bayfront and Torreya in the honors/regular gap = 0.81, p = 0.014).


In summary, rather than a clear distinction emerging across the dimensions of instructional quality between HVA and LVA schools, school-level averages across all four schools tended to be in the middle to low-middle range on the CLASS-S domains (around 3 or 4 on the 1–7 point scale), with the largest gaps tending to be between Torreya and Water Oak—the two HVA schools. Students enrolled in advanced courses were also more likely to receive higher-quality instruction across all of the categories, emphasizing the importance of examining the distribution of students enrolled in honors and regular classes across the four schools. This honors/regular gap was often widest at Bayfront (LVA) and narrowest at Torreya (HVA), with the gap at Starling (LVA) often similar to the gap in the two HVA schools.


Figure 1. Predicted CLASS-S emotional support score by school and track.





Figure 2. Predicted CLASS-S classroom organization score by school and track.




Figure 3. Predicted CLASS-S instructional support score by school and track.


Figure 4. Predicted CLASS-S student engagement score by school and track.



RQ2: WHAT ARE THE DIFFERENCES BETWEEN HVA AND LVA SCHOOLS IN THE PROPORTIONS OF STUDENTS TAKING ADVANCED COURSES?


Figure 5 shows the proportion of courses offered in core subjects that are classified as advanced (including honors, gifted, dual enrollment, and Advanced Placement). Courses and counts were obtained through analysis of the four case study schools’ master schedules. While one of the HVA schools, Torreya, had the highest proportion of courses at the advanced level and Starling, an LVA school, had the lowest, there was no clear distinction between the other HVA and LVA schools. Bayfront (LVA) had a proportion of advanced courses similar to Water Oak (HVA) in both social studies and language arts (56% vs. 57% and 63% vs. 62%, respectively), and to Torreya in language arts (63% vs. 63%). Distributions across course levels by school were similar for courses that were predominantly taken by 10th graders.


This distribution of course offerings parallels the CLASS-S results: HVA school Torreya had a greater proportion of students in AP and honors courses, while HVA school Water Oak had proportions of students in AP or honors courses generally similar to those of the two LVA schools. Thus, these data are not consistent with a hypothesis that HVA schools have either narrower honors gaps or a greater proportion of students in honors courses.


RQ3: ARE THERE DIFFERENCES BETWEEN HOW TEACHERS FROM HVA AND LVA SCHOOLS TALK ABOUT THEIR INSTRUCTIONAL PRACTICES?


While we hypothesized that HVA schools would provide higher-quality instruction than LVA schools, this was not the pattern we found in the CLASS-S scores. While Torreya, an HVA school, often had among the highest average scores across CLASS-S dimensions, the other HVA school, Water Oak, tended to have among the lowest. Thus, we look to the interview data to try to understand how teachers’ orientation toward instruction might differ between the HVA and LVA schools. We were particularly interested in how teachers at Water Oak talked about instruction, as they tended to have CLASS-S scores similar to or lower than those of the two LVA schools. It may be that there are differences in instructional quality between the HVA and LVA schools that are not picked up by the CLASS because of the way the measure is designed. There could also be differences in instructional supports (e.g., tutoring) or socio-emotional supports (e.g., relationships between adults and students) outside of the classroom that are not reflected in the CLASS-S scores. We were also interested in why the largest gap in instructional quality between advanced and regular courses was at Bayfront, while the gaps at Starling, the other LVA school, were not measurably different from those in the HVA schools in most cases. As noted above, most of the information on teachers’ instructional practices came from the following questions asked of the observed teachers in interviews: (1) What are the major challenges for improving student learning?; (2) What are you doing to address these challenges?; and (3) What are you doing to improve the quality of your instruction in your classroom? The following themes were the most salient across teachers in the interviews:


Figure 5. Proportion of advanced placement and honors courses offered, by school and subject.




Emotional Support


Although teachers in both HVA and LVA schools mentioned the importance of providing emotional support to students, there was more talk of specific strategies for support in both of the HVA schools. Specifically, the discussions of teachers in the HVA schools converged around four areas of Emotional Support: connections to students’ lives, a culture of respect, building relationships with students, and opportunities for students to collaborate. For example, two teachers mention preferring not to write referrals for their students, instead making an effort to find out what is causing a student’s problems, while in LVA schools, teachers report a lack of support from administration on discipline issues: “There have been too many times when I have asked for help, and have gotten a referral taken care of three weeks later. . . . There is a climate in those regular classes where they most certainly cannot learn, even the ones who want to.” An emphasis on student collaboration, group work, and shared learning emerges from teachers at Water Oak. In contrast, this convergence around specific strategies for providing emotional support was not evident in the interviews of teachers in the LVA schools. Further, the teachers in LVA schools who brought up specific themes related to emotional support tended to describe the challenges to providing the sort of emotional support that the CLASS-S coding framework rewards, rather than how they actually provide it. For example, one teacher in an LVA school mentioned that the school discourages teacher-centered instruction, which may provide little opportunity for instructional flexibility or peer interaction, but that such instruction is at times necessary because students lack necessary background knowledge.
Another reports that gaining the respect of students is difficult: “The discipline got in the way of any rapport being able to be established in the classroom.” A few teachers in the LVA schools commented that the school encourages group work, which provides opportunities for meaningful peer interaction and student autonomy, but added that it is difficult to implement because of the low academic level of some students, noting that group work is much easier with honors students. In HVA schools, such difficulties were not mentioned. At Water Oak (the HVA school with lower CLASS-S scores), a number of teachers discussed the importance of building respectful relationships with their students, but remarked that doing so can be challenging. This difficulty at Water Oak is a possible explanation for the difference in findings between the CLASS-S and interview data: while teachers prize positive, respectful relationships with students, their difficulty in fostering such relationships may be captured in lower CLASS-S Emotional Support scores. Teachers describe a changing student body, in which fewer students come to their classrooms prepared to learn the curriculum and students have fewer resources at home to support them, which teachers largely attribute to an increase in the number of students from impoverished backgrounds and to distractions from technology and social media. Similarly, they describe unmotivated students who do not complete homework, problems with absenteeism, and a lack of respect for teachers.


Behavior Management and Student Motivation


Regarding behavior management, the challenges described by teachers across LVA and HVA schools are similar: student misbehavior, distractions, and lack of respect. Participants in LVA schools, however, also described problems that were more severe, including cheating on homework, which had become so widespread that it was accepted as the norm. This is the type of problem that would have been difficult to capture through classroom observation but might affect student achievement and value-added scores, particularly because there is no dimension of the CLASS-S that captures such behavior. Participants at HVA schools described problems as less serious and discussed addressing behavior issues proactively, by instructing students on the expectations for behavior. We also found some evidence across the teacher interviews that student misbehavior broke norms of high expectations in the HVA schools, while poor behavior had come to be expected in the LVA schools. One teacher at Starling refers to this as a culture of supporting each other in not performing, while another expresses frustration with the administration’s handling of student behavior problems: “This year they said to us, ‘By the way, if a student hits you, you better make sure you are hurt, otherwise we can’t do anything to them.’” Similarly, at Bayfront, teachers describe a prevalent lack of respect for adults, and one teacher speaks of students who “take it quite with pride of how many teachers have left because of them.”


The story is similar regarding how teachers described their students’ motivation. In the interviews, teachers in both HVA and LVA schools tended to link behavior management and student motivation, although they differed in how they address motivational challenges. Multiple teachers in all schools mention lack of student motivation as their primary challenge in improving their instruction. For example, a Bayfront teacher describes her students’ approach to homework: “Kids don’t think twice about not coming in with it, or copying it right in front of you.” Among teachers in LVA schools who mentioned student motivation, however, few provided any detail regarding how they tackle this challenge. In HVA schools, teachers who mention student motivation as a challenge also tended to provide specific examples or strategies for addressing students’ lack of motivation and engaging them in instruction, although this happens more at Torreya than at Water Oak. This is consistent with the CLASS-S data, which show the highest average Behavior Management score at Torreya.


Adapt Lesson or Curriculum to Students’ Needs


The theme of adapting the lesson to students’ needs also emerged in the interview data as an element teachers considered to be important for high-quality instruction. In the LVA schools, several teachers sought to improve the quality of their instruction by researching different instructional strategies on the internet, adapting what they were doing to fit the students’ proficiency, and responding to the different modalities and strategies that students use. But they could provide no specifics as to what these strategies were or how they were implemented. The challenges of adapting instruction to students were described as ever shifting and more difficult because the students were not proficient. In HVA schools, by contrast, more teachers (six, as compared to two in the LVA schools) described adapting their lessons and instruction to student learning needs by observing and collaborating with other teachers and staying abreast of the latest instructional strategies. A variety of modalities, strategies, and materials aimed at engaging students’ interests were evident in the data from HVA schools. Similarly, when teachers in LVA schools discuss differentiating instruction, they simply mention that the school encourages it, without providing any detail or examples of how they actually practice differentiation in their classrooms. In contrast, when teachers in the HVA schools discuss differentiating instruction, they also mention that it is an encouraged practice and often a challenge, but they were also more likely to provide examples and strategies of how they put differentiation to work in their classrooms. For example, one teacher gives different assignments based on students’ instructional needs. Another incorporates multimedia demonstrations into instruction to foster student interest. 
Multiple teachers adjust their pacing to student needs, and some teachers provide independent extension activities for more advanced students to give them time to work with struggling students. This aspect of instruction is captured by the Teacher Sensitivity dimension under the Emotional Support domain of the CLASS, but is only a small piece of what is measured by this dimension. Thus, the CLASS may not have adequately picked up on differences between HVA and LVA in this area of instructional quality. Further, there may be an interaction between emotional support and instructional support in that strong emotional support may be a necessary condition for strong instructional support. Teachers in both HVA and LVA schools stress the importance of gaining students’ respect and attention in order to motivate them, while, as discussed previously, teachers in LVA schools struggle to build these respectful relationships and motivate students.


Making Vocabulary Visible


All seven mentions of word walls (a district initiative) came from LVA schools, and all 10 mentions of school-wide use of the “word of the day” came from Torreya. While vocabulary development was evident at all four schools, Torreya took an active approach rather than the passive one seen in the LVA schools, where word walls appeared to be pro forma: teacher reports included mention of colleagues merely photocopying words and stapling them to the wall, without describing how the walls are used to build students’ vocabulary.


This result is in line with broader evidence across all of the interviews conducted, suggesting that Torreya had more instructional routines in place. While teachers at all schools reported that they are required to have an agenda and/or learning objectives posted in the classroom, teachers at Torreya consistently reported school-level expectations for implementing additional routines as a regular part of instructional practice (word of the day, silent reading program, “Do Nows”). This result is reflected in the Instructional Learning Formats dimension (under the Classroom Organization domain of the CLASS), which captures whether teachers provide learning targets and a well-organized presentation of information, and on which Torreya scores highest. There was also evidence that leadership at Torreya supported these routines, based on consistent reports across those interviewed that the principal cares strongly about bell-to-bell instruction and other productivity-maximizing routines, and follows up on whether teachers are implementing them.


Emphasis on Higher-Order Thinking Skills


In LVA schools, rigor is espoused as a means to high-quality instruction, although there was little evidence in the interviews to suggest how often or in what ways rigor is enacted. Consistent with Torreya’s slightly higher scores on the Analysis and Problem Solving dimension of the CLASS-S, which captures higher-order thinking, nearly all of the concrete examples of teaching higher-order thinking skills came from teachers at Torreya, including descriptions of using open-ended questions and Socratic methods. Teachers in the LVA schools cite not using worksheets as an example of rigorous practice, but rarely provide concrete examples of rigorous instruction. The teacher interviews suggest that students at Torreya are carrying a greater share of the cognitive load, engaging in activities such as synthesizing or summarizing what they have read. As another example, every teacher at Torreya is supposed to stop instruction during seventh period so that students read for 20 minutes and then write a short response or reflection about what they have read.


In sum, the teacher interview data reveal differences between HVA and LVA schools in the way teachers talk about their instructional practices, particularly in the areas of emotional support, behavior management, and strategies for differentiating instruction. Much of this difference stemmed from the ways in which teachers in HVA schools explained how they implemented these practices, rather than just naming them or describing how student behavior issues inhibited their use. There were also similarities in the practices voiced by teachers within HVA schools. For example, teachers at HVA schools gave evidence of four areas of emotional support (real-world connections, a culture of respect, building relationships with students, and collaboration), while at LVA schools there was no such convergence of evidence. Behavior management differed as well: student misbehavior broke norms of high expectations in the HVA schools, while teachers had come to expect poor behavior in the LVA schools. Teachers' descriptions of students' motivation paralleled the differences in emotional support and behavior management, with teachers reporting that students are more highly motivated in HVA schools than in LVA schools. Finally, teachers in HVA schools described differentiating instruction as a challenging practice, but one that is encouraged, and they provided concrete examples of, and strategies for, differentiation. Such evidence was absent in interviews with teachers at LVA schools.


DISCUSSION


We went into this analysis expecting to identify aspects of instruction that were present and supported in HVA schools but absent or less supported in LVA schools. If the HVA schools were succeeding in implementing particular dimensions of instructional quality with which LVA schools continued to struggle, we would have had the basis for collaborating with the district to design an instructional intervention that could be scaled to LVA schools across the district. What we found, however, was that the average differences in instructional quality, as measured by the CLASS-S, were not very wide across our four case study schools, and that the biggest differences were between the two HVA schools. While this could be interpreted as suggesting that instructional quality has little association with student achievement gains, we see these results differently. We suspect that certain aspects of the district context, such as curriculum frameworks and pacing guides that most teachers follow with fidelity, lay the groundwork for students' exposure to the curriculum, a minimum standard for opportunity to learn. We also found that the HVA schools tend to offer more advanced courses (more so at Torreya than at Water Oak) and that advanced courses tended to score higher on all domains of the CLASS-S. While this suggests that placing students in more rigorous courses could lead to greater learning, further analysis is necessary to determine the degree to which variation in the offering of advanced courses is related to differences across schools in the prior achievement profiles of their students or in the beliefs of teachers and administrators that all students can succeed.


Our interview data also suggest that the HVA schools may take more of a "no excuses" stance, in which a student's background is not an acceptable reason for poor performance but rather a motivation for trying new strategies and working to engage students in both academic and social aspects of the school. These data suggest that teachers in the two HVA schools are more proactive about providing emotional support and preventing behavioral problems, and more intentional about attending to content and engaging students in higher-order thinking. While these data support the findings from the CLASS-S coding of classroom instruction, they aligned more with what we observed at Torreya than at Water Oak.


The lack of variation in classroom instructional practice across schools also suggests the need to attend to the ways that schools support academic learning outside of the classroom. For example, in the 2010–2011 school year, Water Oak implemented a new instructional coaching framework, tapping one of the school's instructional coaches to assemble a team of teacher leaders from across the academic departments tasked with directing the school's instructional reform efforts. In this role, the "lead instructional coach" reportedly coordinates a variety of activities, including reading pull-out programs, the school's Saturday FCAT camp, the integration of reading strategies across departments, the organization of the school's professional learning communities (PLCs), and the monitoring and collective analysis of student performance data (Cohen-Vogel & Harrison, 2013). Acknowledging both the importance of instructional leadership and the competing pressures on administrators' time (e.g., discipline, safety, facilities, operations, community partners), the school's principal articulated the need for a team focused squarely on instruction, sharing that:


I wanted to make sure that I had someone that I trust that was going to kind of lead the way, someone I could pick up the phone at any time of the day, any part of the week, pick up the phone and we could discuss curriculum if I needed to.


While it may take time for this form of instructional alignment to impact classroom instruction, it may lead to a more coherent approach to supplementary services.


Our qualitative findings can only suggest ways in which HVA schools differ from LVA schools in instructional supports not measured by the CLASS-S. Yet these findings signal that the quality of classroom instruction, as measured by the CLASS-S, is not the only critical input to students' learning gains when trying to identify what leads schools to place highly in value-added rankings. The alignment of instructional supports, the quantity and quality of supplementary services, teachers' "no excuses" attitude toward student learning, the building of positive relationships between teachers and students, and how the leadership team facilitates these efforts are all aspects of schooling that deserve more scrutiny.


Notes

1. Additional detail on the research design and context can be found in Sass (2012).

2. To mask the identity of the schools, the student enrollment rates are provided as ranges.

3. Florida high school grades are based on standardized test scores, growth on those scores, graduation rates, college readiness, and student participation and performance in accelerated curriculum.

4. Correct II status indicates that the school has a grade of at least C, has not made Adequate Yearly Progress (AYP) for more than four preceding years in a row, and met less than 80% of its AYP criteria in the previous school year.

5. Correct I status indicates that the school has a grade of at least C, has not made AYP for more than four preceding years in a row, but met at least 80% of its AYP criteria in the previous school year.

6. The disparity between the number of teachers observed (73) and the number who were interviewed (67) is attributable to data collectors’ efforts to balance the number of observations among different academic tracks (e.g., observations of AP/honors classes and regular/remedial classes) across site visits. Consequently, some teachers were selected for observation during the third visit but not interviewed.

7. Examples are taken from Pianta et al. (2007).


References


Allensworth, E., Correa, M., & Ponisciak, S. (2008). From high school to the future: ACT preparation—too much, too late. Chicago, IL: Consortium on Chicago School Research. Retrieved from http://eric.ed.gov/?id=ED501457


Alper, L., Fendel, D., Fraser, S., & Resek, D. (1997). Designing a high school mathematics curriculum for all students. American Journal of Education, 106(1), 148–178.


Anderman, L., Andrzejewski, C., & Allen, J. (2011). How do teachers support students’ motivation and learning in their classrooms? Teachers College Record, 113(5), 969–1003.


Applebee, A. N., Langer, J. A., Nystrand, M., & Gamoran, A. (2003). Discussion-based approaches to developing understanding: Classroom instruction and student performance in middle and high school English. American Educational Research Journal, 40(3), 685–730.

Bazeley, P. (2007). Qualitative data analysis with NVivo. London, England: Sage Publications Ltd.


Becker, B. E., & Luthar, S. S. (2002). Social-emotional factors affecting achievement outcomes among disadvantaged students: Closing the achievement gap. Educational Psychologist, 37(4), 197–214.


Bell, C. A., Gitomer, D. H., McCaffrey, D., Hamre, B., & Pianta, R. (2011, April). An argument approach to observation protocol validity. Paper presented at the annual conference of the American Educational Research Association, New Orleans, LA.


Berliner, D. (1988). Implications of studies on expertise in pedagogy for teacher education and evaluation. In New directions for teacher assessment (Proceedings of the 1988 ETS Invitational Conference, pp. 39–68). Princeton, NJ: Educational Testing Service.


Boaler, J., & Staples, M. (2008). Creating mathematical futures through an equitable teaching approach: The case of Railside School. Teachers College Record, 110(3), 608–645.


Brown, B. (2008). Assessment and academic identity. Teachers College Record, 110(10), 2116–2147.


Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative analysis (1st ed.). London, England: Sage Publications Ltd.


Clotfelter, C. T., Ladd, H. F., & Vigdor, J. L. (2007). Teacher credentials and student achievement: Longitudinal analysis with student fixed effects. Economics of Education Review, 26(6), 673–682.


Cohen, D., & Hill, H. (2000). Instructional policy and classroom performance: The mathematics reform in California. Teachers College Record, 102(2), 294–343.


Cohen-Vogel, L., & Harrison, C. (2013). Leading with data: Evidence from the National Center on Scaling Up Effective Schools. Leadership and Policy in Schools, 12(2), 122–145.


Cook, M., & Evans, W. N. (2000, October). Families or schools? Explaining the convergence in white and black academic performance. Journal of Labor Economics, 18, 729–754.


Cothran, D., Kulinna, P., & Garrahy, D. (2003). ‘‘This is kind of giving a secret away . . .’’: Students’ perspectives on effective class management. Teaching and Teacher Education, 19(4), 435–444.


Curby, T. W., LoCasale-Crouch, J., Konold, T. R., Pianta, R. C., Howes, C., Burchinal, M., . . .  Barbarin, O. (2009). The relations of observed pre-K classroom quality profiles to children’s achievement and social competence. Early Education and Development, 20(2), 346–372.


Darling-Hammond, L., Ancess, J., & Ort, S. W. (2002). Reinventing high school: Outcomes of the coalition campus schools project. American Educational Research Journal, 39(3), 639–673.


Edhlund, B. (2011). NVivo 9 essentials. Stallarholmen, Sweden: Form & Kunskap AB.


Emmer, E. T., & Stough, L. M. (2001). Classroom management: A critical part of educational psychology, with implications for teacher education. Educational Psychologist, 36(2), 103–112.


Firestone, W., & Herriott, R. (1982). Prescriptions for effective elementary schools don’t fit secondary schools. Educational Leadership, 40(3), 51–53.


Gamoran, A., Nystrand, M., Berends, M., & LePore, P. C. (1995). An organizational analysis of the effects of ability grouping. American Educational Research Journal, 32(4), 687–715.


Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago, IL: Aldine.


Goldring, E., Porter, A., Murphy, J., Elliott, S. N., & Cravens, X. (2009). Assessing learning-centered leadership: Connections to research, professional standards, and current practices. Leadership and Policy in Schools, 8(1), 1–36.


Gregory, A., & Ripski, M. (2008). Adolescent trust in teachers: Implications for behavior in the high school classroom. School Psychology Review, 37(3), 337–353.


Grissom, J. A., Kalogrides, D., & Loeb, S. (2014). Using student test scores to measure principal performance. Educational Evaluation and Policy Analysis, 37(1), 3–28.  


Harris, D. N. (2011). Value-added measures in education. Cambridge, MA: Harvard Education Press.


Harris, D. N., & Sass, T. R. (2011). Teacher training, teacher quality and student achievement. Journal of Public Economics, 95(7), 798–812.



Hill, H. C., Blunk, M., Charalambous, C., Lewis, J., Phelps, G. C., Sleep, L., & Ball, D. L. (2008). Mathematical knowledge for teaching and the mathematical quality of instruction: An exploratory study. Cognition and Instruction, 26, 430–511.


Hill, H. C., Kapitula, L., & Umland, K. (2011). A validity argument approach to evaluating teacher value-added scores. American Educational Research Journal, 48(3), 794–831.


Kagan, D. M. (1992). Professional growth among preservice and beginning teachers. Review of Educational Research, 62(2), 129–169.


Kane, T. J., Taylor, E. S., Tyler, J. H., & Wooten, A. L. (2010). Identifying effective classroom practices using student achievement data (NBER Working Paper No. 15803). Cambridge, MA: National Bureau of Economic Research.


Kaufman, P., & Chapman, C. (2004). Dropout rates in the United States: 2001 (NCES 2004–057), Table A-1. Data from U.S. Department of Commerce, Bureau of the Census, Current Population Survey (CPS), October Supplement, 1972–2001.


Klem, A., & Connell, J. (2004). Relationships matter: Linking teacher support to student engagement and achievement. Journal of School Health, 74(7), 262–273.


LaParo, K. M., Pianta, R. C., & Stuhlman, M. (2004). The classroom assessment scoring system: Findings from the prekindergarten year. The Elementary School Journal, 104(5), 409–426.


Lee, J. (2002). Racial and ethnic achievement gap trends: Reversing the progress toward equity. Educational Researcher, 31(1), 3–12.


Lee, J. (2004). Multiple facets of inequity in racial and ethnic achievement gaps. Peabody Journal of Education, 79(2), 51–73.


Marzano, R. J. (2001). Designing a new taxonomy of educational objectives. Thousand Oaks, CA: Corwin Press.


Measures of Effective Teaching Project. (2012). Gathering feedback for teaching: Combining high-quality observations with student surveys and achievement gains. Bill & Melinda Gates Foundation.


Measures of Effective Teaching Project. (2013). Ensuring fair and reliable measures of effective teaching: Culminating findings from the MET Project’s three-year study. Bill & Melinda Gates Foundation.


Meyer, R. (1997). Value-added indicators of school performance: A primer. Economics of Education Review, 16(3), 283–301.


Muller, C., Katz, S. R., & Dance, L. J. (1999). Investing in teaching and learning: Dynamics of the teacher-student relationship from each actor’s perspective. Urban Education, 34(3), 292–337.


Nye, B., Konstantopoulos, S., & Hedges, L. V. (2004). How large are teacher effects? Educational Evaluation and Policy Analysis, 26, 237–257.


Oakes, J. (2005). Keeping track: How schools structure inequality (2nd ed.). New Haven, CT: Yale University Press.


Patton, M. Q. (2001). Qualitative research & evaluation methods (3rd ed.). Thousand Oaks, CA: Sage.


Pianta, R. C., Hamre, B. K., Haynes, N. J., Mintz, S. L., & LaParo, K. M. (2007). Classroom assessment scoring system – Secondary Manual. Charlottesville, VA: University of Virginia.


Preston, C., Goldring, E., Guthrie, J. E., & Ramsey, R. (2012, June). Conceptualizing essential components of effective high schools. Paper presented at the Achieving Success at Scale: Research on Effective High Schools conference, Nashville, TN.


QSR International Pty Ltd. (2008). NVivo Qualitative data analysis. Melbourne, Australia: QSR International Pty Ltd.


Rampey, B. D., Dion, G. S., & Donahue, P. L. (2009). NAEP 2008 Trends in Academic Progress (NCES 2009–479). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education, Washington, DC.


Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods (2nd ed.; Advanced Quantitative Techniques in the Social Sciences Series, No. 1). Thousand Oaks, CA: Sage.


Rivkin, S. G., Hanushek, E. A., & Kain, J. F. (2005). Teachers, schools, and academic achievement. Econometrica, 73(2), 417–458.


Rockoff, J. E. (2004). The impact of individual teachers on student achievement: Evidence from panel data. American Economic Review, 94(2), 247–252.


Rockoff, J. E., Jacob, B. A., Kane, T. J., & Staiger, D. O. (2008). Can you recognize an effective teacher when you recruit one? (NBER Working Paper No. 14485). Retrieved from http://www.nber.org/papers/w14485


Rosiek, J. (2003). Emotional scaffolding: An exploration of the teacher knowledge at the intersection of student emotion and the subject matter. Journal of Teacher Education, 54(5), 399–412.


Sass, T. (2012, June). Selecting high and low-performing high schools in Broward County, Florida for analysis and treatment. Paper presented at the Achieving Success at Scale: Research on Effective High Schools conference, Nashville, TN.


Snyder, T. D., Dillow, S. A., & Hoffman, C. M. (2009). Digest of education statistics 2008 (NCES 2009-020). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.


Staples, M. (2007). Supporting whole-class collaborative inquiry in a secondary mathematics classroom. Cognition and Instruction, 25(2/3), 161–217.


Wenglinsky, H. (2002). The link between teacher classroom practices and student academic performance. Education Policy Analysis Archives, 10(2), 1–30.


Wenglinsky, H. (2004). The link between instructional practice and the racial gap in middle schools. Research on Middle Level Education, 28(1), 1–13.


APPENDIX


Table 1. Summary of HLM Analysis Predicting CLASS-S Domain Scores on Value-Added Status, Course Track, Grade, Subject, and Time of Year of Observational Visit


 

                        (1)           (2)            (3)            (4)
                     Emotional    Classroom     Instructional    Student
                      Support    Organization      Support      Engagement
                       b/se          b/se           b/se           b/se

HVA                   0.002         0.101         -0.039         -0.150
                     (0.15)        (0.18)         (0.20)         (0.19)
Honors                0.439***      0.407***       0.353**        0.588***
                     (0.07)        (0.09)         (0.11)         (0.10)
Grade
  9th                 0.043         0.064          0.263          0.185
                     (0.14)        (0.17)         (0.21)         (0.20)
  11th                0.648**      -0.204          0.158         -0.485
                     (0.21)        (0.25)         (0.32)         (0.30)
  12th               -0.179        -0.872**       -0.459         -1.331***
                     (0.24)        (0.28)         (0.35)         (0.33)
Subject
  Math               -0.408*       -0.295         -0.228         -0.318
                     (0.18)        (0.22)         (0.24)         (0.24)
  Science            -0.328~       -0.105         -0.329         -0.086
                     (0.18)        (0.22)         (0.24)         (0.23)
Observation time
  Winter             -0.020        -0.017         -0.155~         0.135
                     (0.06)        (0.07)         (0.09)         (0.08)
  Spring              0.088        -0.106         -0.217          0.200
                     (0.09)        (0.11)         (0.14)         (0.13)
Constant              5.157***      4.933***       4.318***       4.823***
                     (0.15)        (0.19)         (0.20)         (0.20)
Level 2 variance      0.336         0.526          0.569          0.556
                     (0.065)       (0.010)        (0.114)        (0.111)
Level 1 variance      0.432         0.620          0.969          0.906
                     (0.03)        (0.025)        (0.058)        (0.052)
N                     669           662            641            680
LL                   -743.511      -858.713       -964.562       -999.574

Standard errors in parentheses
~ p < .10, * p < .05, ** p < .01, *** p < .001
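The HLM specification summarized in Table 1 can be written out as a two-level model. The sketch below is ours, under stated assumptions (level-1 units are classroom observations nested within level-2 teacher/classroom units, predictors are dummy coded as listed in the table's rows, and school value-added status enters at the higher level); the notation is not reproduced from the source.

```latex
% Sketch of a two-level model consistent with Table 1 (assumed coding):
% observation i nested within teacher j; HVA enters at the upper level
% because value-added status is a characteristic of the teacher's school.
\begin{align*}
\text{Level 1:}\quad
Y_{ij} &= \beta_{0j} + \beta_{1}\,\text{Honors}_{ij}
       + \sum_{g} \beta_{2g}\,\text{Grade}_{gij}
       + \sum_{s} \beta_{3s}\,\text{Subject}_{sij}
       + \sum_{t} \beta_{4t}\,\text{Time}_{tij} + e_{ij},
\qquad e_{ij} \sim N(0,\sigma^{2}) \\
\text{Level 2:}\quad
\beta_{0j} &= \gamma_{00} + \gamma_{01}\,\text{HVA}_{j} + u_{0j},
\qquad u_{0j} \sim N(0,\tau_{00})
\end{align*}
```

Under this reading, $\sigma^{2}$ corresponds to the Level 1 variance row, $\tau_{00}$ to the Level 2 variance row, and $\gamma_{01}$ to the HVA coefficient reported in each column of the table.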


Table 2. Summary of HLM Analysis Predicting CLASS-S Domain Scores on School Value-Added Status, Course Track, Differential Effects of Course Track by School Value-Added Status, and Grade, Subject, and Time of Year of Observational Visit


                        (1)           (2)            (3)            (4)
                     Emotional    Classroom     Instructional    Student
                      Support    Organization      Support      Engagement
                       b/se          b/se           b/se           b/se

HVA                   0.224         0.178          0.032         -0.030
                     (0.16)        (0.20)         (0.22)         (0.22)
Honors                0.673***      0.488***       0.429**        0.716***
                     (0.10)        (0.13)         (0.16)         (0.15)
HVA X Honors         -0.447**      -0.156         -0.144         -0.244
                     (0.14)        (0.18)         (0.22)         (0.21)
Grade
  9th                 0.019         0.056          0.254          0.170
                     (0.14)        (0.17)         (0.21)         (0.20)
  11th                0.622**      -0.212          0.153         -0.498~
                     (0.21)        (0.25)         (0.32)         (0.30)
  12th               -0.330        -0.922**       -0.506         -1.410***
                     (0.24)        (0.28)         (0.36)         (0.34)
Subject
  Math               -0.382*       -0.286         -0.218         -0.303
                     (0.18)        (0.22)         (0.24)         (0.24)
  Science            -0.323~       -0.103         -0.325         -0.082
                     (0.18)        (0.22)         (0.24)         (0.23)
Observation time
  Winter             -0.037        -0.023         -0.161~         0.125
                     (0.06)        (0.07)         (0.09)         (0.08)
  Spring              0.052        -0.120         -0.231          0.179
                     (0.09)        (0.11)         (0.14)         (0.13)
Constant              5.053***      4.897***       4.284***       4.766***
                     (0.16)        (0.19)         (0.21)         (0.21)
Level 2 variance      0.348         0.527          0.566          0.561
                     (0.067)       (0.100)        (0.113)        (0.112)
Level 1 variance      0.424         0.619          0.969          0.903
                     (0.025)       (0.024)        (0.057)        (0.052)
N                     669           662            641            680
LL                   -738.824      -858.325       -964.350       -998.877

Standard errors in parentheses
~ p < .10, * p < .05, ** p < .01, *** p < .001





Cite This Article as: Teachers College Record, Volume 117, Number 11, 2015, p. 1–38. https://www.tcrecord.org ID Number: 18110