Expanding Instructional Horizons: A Case Study of Teacher Team-Outside Expert Partnerships


by Bradley A. Ermeling & Jessica Yarbo - 2016

Background: Despite increasing popularity and mounting evidence for teacher collaboration as a lever for school improvement, reported changes in teaching associated with collaboration are often subtle and incremental, rarely involving substantial shifts in instructional practice called for by advocates of deeper learning and next-generation standards. One reason more expansive teaching changes remain elusive is that existing “horizons of observation” constrain possibilities teacher teams consider and solutions they develop while collaborating to improve teaching and learning.

Purpose: This case study of two secondary school teacher teams explored the potential of collaborative partnerships with outside content experts (OCEs) for infusing new resources and perspectives that move beyond persistent images of classroom instruction.

Setting: The study context was the Learning Studios model from the National Commission on Teaching and America’s Future (NCTAF), in which interdisciplinary teacher teams partnered with local scientists and researchers to develop and implement yearlong project investigations with students.

Research Design: The study used a qualitative case study design, including live observations and narrative transcription of team planning and design sessions, interviews, focus groups, and analysis of web-based interactions to develop rich narrative descriptions of the two partnership cases during the 2013–2014 academic year. We also compared and analyzed prominent patterns of OCE facilitative action across the two cases.

Findings: Coding and analyses revealed several pivotal episodes of partnership interactions with clear evidence of OCE influence on teacher instructional plans. Cross-case analyses point to three OCE facilitative actions that preceded these effects—adapting expertise to local needs, following up between meetings, and judiciously applying pressure.

Conclusions: The pivotal episodes we captured provide some initial evidence to support previous researchers’ hypotheses that extended collaborative engagements can facilitate teacher learning in ways not readily achieved through traditional partnership models. The joint productive activity and depth of interaction observed in these cases opened up several opportunities to infuse knowledge and insights seldom documented in other teacher–expert studies involving loosely structured programs or short-lived externships. The two cases also provide initial evidence that outside experts can help to expand the horizons of possibilities teachers consider during instructional planning. Although these examples are instructive, we do not believe the reported effects are of sufficient magnitude or duration to be labeled “transformative shifts in practice.” We also cannot say whether these initial changes in instructional plans translated into changes in classroom teaching or improvements in student learning. These remain important questions for future research.




Advocacy for teacher collaboration has steadily increased over the last three decades, and recent studies have helped substantiate the premise that engaging educators in productive collaborative practice can lead to improvements in teaching and student achievement (Carroll, Fulton, & Doerr, 2010; Saunders, Goldenberg, & Gallimore, 2009). Despite this increasing popularity and evidence, some important caveats remain to be addressed. One of those caveats is that reported changes in teaching associated with collaboration are often subtle and seldom dramatic; they accumulate over time to slowly improve teaching practices and student achievement (Gallimore & Ermeling, 2012). Although not insignificant, these incremental changes and improvements rarely involve the kind of substantial shifts in instructional practice called for by advocates of deeper learning and next-generation standards (National Research Council, 2012).


A key reason why these more expansive changes remain elusive is that teachers construct visions of classroom practice based on deeply rooted cultural routines, selected voices and epistemologies, and preconceived notions of effective and ineffective teaching (Hiebert et al., 2003; Hiebert, Gallimore, & Stigler, 2002; Roth et al., 2006; Stigler & Hiebert, 1999). These tightly bounded “horizons of observation” constrain the possibilities that teachers consider and the solutions that teachers develop as they collaborate to improve their practice and enhance student learning (Ermeling & Graff-Ermeling, 2016; Gallimore & Ermeling, 2012; Hutchins, 1996, p. 52; Little, 2003).


In the study reported here, one possibility we explored for expanding horizons of observation was engaging teacher teams in collaborative partnerships with outside content experts (OCEs) through the Learning Studios model, developed by the National Commission on Teaching and America’s Future (NCTAF). First launched in 2009, Learning Studios (LS) are project-based learning environments in which interdisciplinary teacher teams collaborate with local scientists, researchers, and university faculty to develop and implement yearlong project investigations with students.


Although there is limited research on the effects of such extended partnerships, the frequently cited rationale for this approach is that outside experts might infuse new resources and viewpoints that help teachers expand their professional knowledge and modify enduring images of classroom instruction (Loucks-Horsley, Love, Stiles, Mundry, & Hewson, 2003; National Research Council, 1996). Leveraging the NCTAF program as a research context, this case study documented planning meeting interactions between OCEs and LS teacher teams from two secondary schools, explored perceived effects of these interactions on teachers’ instructional plans, and examined specific facilitative actions that might be important for OCEs to productively partner with teacher teams.


The following sections include a brief overview of the background literature on teacher collaboration, including caveats and limitations of collaborative contexts, a summary of the existing literature on partnerships between teacher teams and OCEs, and a description of the study design and methodology. We then present results from the study and discuss implications for research and design of teacher team–outside expert partnerships.


CONCEPTUAL FRAMEWORK


RESEARCH ON TEACHER COLLABORATION


Since first receiving national attention in the 1980s and 1990s (e.g., Cochran-Smith & Lytle, 1999; Hord, 1997; Little, 1982; Louis, Kruse, & Marks, 1996; Rosenholtz, 1989, 1991; Tharp & Gallimore, 1988), emphasis on school-based teacher collaboration has rapidly spread as a means for breaking down barriers of isolation, establishing shared goals for student learning, fostering reflective practice, and driving school and instructional improvement (DuFour, 2004; Gallimore & Ermeling, 2010; Hargreaves & Fink, 2006; Learning Forward, 2011; Lewis & Hurd, 2011; Lieberman & Miller, 2011; Louis, 2006; Supovitz, 2002). Reports and observations suggest that settings using one of the many labels for teacher collaboration—learning communities, learning teams, inquiry teams, communities of practice—vary significantly in purpose and effectiveness but are now a common feature of most schools and districts across the United States (Carroll et al., 2010; Ermeling & Gallimore, 2013; Vescio, Ross, & Adams, 2008). Numerous councils and organizations have also prioritized collaborative practice in their published standards and core propositions for the teaching profession (Council of Chief State School Officers, 2013; Learning Forward, 2011; National Board for Professional Teaching Standards, 1989/2002). This growing emphasis on collaboration is a hopeful sign that schools are becoming places of ongoing educator learning that supports student learning, as Sarason (1972) famously advocated.


In general, evidence for the effects of teacher collaboration has lagged behind its advocacy, but recent studies have helped substantiate the premise that engaging educators in productive collaborative practice can lead to improvements in teaching and student achievement. In one review of 11 empirical studies of learning communities, Vescio et al. (2008) concluded, “Although few in number, the collective results of these studies offer an unequivocal answer to the question about whether the literature supports the assumption that student learning increases when teachers participate in professional learning communities. The answer is a resounding and encouraging yes” (p. 87). In a separate review focused on secondary schools, Lomos, Hofman, and Bosker (2011) reported a wide range of relatively small but significant effect sizes and concluded that “the relationship between professional community and student achievement is positive and significant” (p. 137).


In addition, several studies recognized by Learning Forward with the 2010 Best Research Award summarized findings from a 5-year implementation of a collaborative instructional improvement framework in Title I elementary schools in which achievement rose by 41% overall and by 54% for Hispanic students. Demographically matched comparison schools, selected at the beginning of the study to serve as controls, showed no comparable gains over the same 5-year period (Gallimore, Ermeling, Saunders, & Goldenberg, 2009; Saunders et al., 2009). An external evaluation during the final year of the study revealed that schools in the experimental group, implementing the collaborative instructional improvement framework, had a sharper focus on learning goals and outcomes, stronger collective commitment, higher expectations, and attributions for student achievement more focused on teachers’ planning and instruction than on external factors (McDougall, Saunders, & Goldenberg, 2007). Numerous other studies, using a variety of qualitative and quantitative designs, have also reported positive results associated with teachers’ participation in ongoing school-based collaborative contexts (e.g., Ermeling, 2010; Louis & Marks, 1998; McLaughlin & Talbert, 2001, 2006; Stoll, Bolam, McMahon, & Thomas, 2006).


CAVEATS AND LINGERING CONCERNS FOR TEACHER COLLABORATION


Despite the popularity and growing body of evidence supporting teacher collaboration, some important caveats remain to be addressed, including insufficient teacher subject matter knowledge and pedagogical content knowledge (Bausmith & Barry, 2011; Hiebert et al., 2002; National Research Council, 1996, 2012), as well as challenges with sustainability (e.g., Ermeling & Gallimore, 2013; Fink, 2000; Goldenberg, 2004). As noted earlier, another lingering concern is that reported changes in teaching associated with collaboration are often subtle and underwhelming. They build up over time through small increments of change and cumulatively influence attitudes, beliefs, practices, and achievement (Gallimore & Ermeling, 2012; Gallimore et al., 2009). Although certainly worthy of recognition, these incremental changes and improvements seldom involve the kind of significant expansion of instructional practices emphasized in 21st-century standards for deeper learning in math, English, and science, or in initiatives focused on preparing students for careers in STEM (Groome, Rankin, & Wheary, 2001; National Research Council, 2012). For example, in one middle school that won an award for exemplary collaborative practice, a team of algebra teachers diligently worked to improve the teaching of mathematical procedures. Yet observations of this team and of teams at other secondary schools in the same project showed few signs that even high-functioning teams were “interested, willing, or able to raise the bar and move beyond teaching procedures to focus on better teaching of mathematical connections and concepts” (Gallimore & Ermeling, 2012, p. 46).


The underlying reason these more expansive and durable changes in teaching are rarely observed is that “teaching is a cultural activity” learned over time through participation in deeply rooted cultural settings and routines (Hiebert et al., 2002, 2003; Roth et al., 2006; Stigler & Hiebert, 1999, p. 11). Practicing and perfecting procedures, for example, has been identified by researchers as a pervasive cultural script for U.S. math instruction. This was, undoubtedly, how the algebra teachers at the award-winning middle school experienced math instruction throughout their own K–20 school careers. The teachers perfected the procedural teaching script well enough to see modest increases in test scores but did not recognize the limits of this approach for students’ conceptual understanding (Gallimore & Ermeling, 2012).


Hutchins (1996) elaborated on the cultural and social barriers for understanding and changing practice in his description of horizons of observation—the boundaries of what tools, interactions, and behaviors members are exposed to and imagine as possible within a given social context and community activity. Little (2003) drew on Hutchins’s terminology in her analysis of teacher discourse during meetings of high school math and English departments. She described how the insular nature of teaching restricts horizons of observation and how teacher communities’ out-of-classroom interactions can “both open up and close off opportunities” for learning and reflective practice (p. 939). Illustrating her point with specific transcript excerpts, she raised the concern “that formation of tightly bounded professional communities with their specialized language and stock of familiar classroom stories . . . might result in highly isolated and insular groups . . . replacing the isolated classroom teacher with the isolated teacher group” (p. 939). Teacher teams, then, like individual teachers, are constrained by existing horizons of observation, both in their limited access to research and expertise in content knowledge for their discipline and in their conceptions of how that content might be effectively taught.


PARTNERSHIPS BETWEEN TEACHER TEAMS AND OUTSIDE CONTENT EXPERTS


One possibility for expanding horizons of observation is to engage teacher teams in collaborative partnerships with OCEs, such as university professors, practicing researchers, industry professionals, and educational specialists. Proponents of this approach hypothesize that outside experts’ extensive content knowledge, exposure to the latest research, and practical experience in the field might afford access to resources and perspectives that could help teachers expand their professional knowledge and move beyond persistent images of traditional practice (Loucks-Horsley et al., 2003; National Research Council, 1996).


The research on teacher team–OCE partnerships is scarce, with most of the available studies focused on STEM initiatives in which teachers’ limited content expertise is a well-documented concern (National Research Council, 1996, 2012). Published findings and available evidence from these projects present mixed results. A comprehensive report on the National Science Foundation’s Math Science Partnership (MSP) by Zhang et al. (2008) found that teacher relationships with STEM experts were mostly superficial, with few examples of sustained, collaborative interactions and no significant changes in teachers’ practice. Nelson (2005) described minimal levels of engagement and knowledge negotiation in mentoring partnerships between university science graduates and elementary school teachers. A number of studies, such as Firestone and Pennell (1997) and Bell, Crawford, and Lederman (2003), reported little benefit from teacher–scientist partnerships in which external expertise was passively received and largely disconnected from the immediate context and varying needs of teachers and students. Along the same lines, a recent New Zealand study by Falloon (2013) reported that teacher–scientist partnerships were problematic and that scientists must be “prepared to penetrate deeply into the world of the classroom when undertaking any such interactions” (p. 858).


A handful of studies present a more optimistic view of partnerships that are carefully structured and deeply embedded within the K–12 context. An early study by Briscoe and Peters (1997) documented changes in elementary teachers’ content and pedagogical knowledge during a 6-month collaboration with colleagues and higher education scientists. One uniquely successful MSP project, Science Learning Through Engineering Design (SLED), also brought together STEM experts and elementary school teachers to develop a new problem-based learning curriculum (Lehman & Capobianco, 2012). In the Four Schools for Women in Engineering (WIE) program, middle school teachers indicated that partnerships contributed positively to their understanding of engineering and teaching (Erkut & Marx, 2005; Reisberg et al., 2006). Similarly, results from the California Science Project Teacher Retention Program Initiative (CSP-TRI) showed that partnerships with scientists increased middle and high school teachers’ sense of empowerment and effectiveness while helping scientists better diagnose teachers’ learning needs (Barraza-Lyons & Daughtrey, 2011). Harris Willcuts (2009) also described unique contributions of scientists to middle school teachers’ professional development.


Most recently, Brown, Bokor, Crippen, and Koroly (2014) found that high school teachers reported several positive growth opportunities related to content expert support. Teachers and scientists in the program described feelings of mutual respect, but, as in previous studies, the partnerships were characterized as “unidirectional” and limited by the lack of extended collaboration between scientists and teachers (p. 257).


In summary, the existing research on teacher team–OCE partnerships is inconclusive, with the majority of studies focused on short-lived externships, loosely structured partnerships, and participant roles largely characterized by experts “telling” and teachers “listening.” A few cases reported positive responses from teachers but left many unanswered questions about the nature and effect of productive teacher–expert interactions. More research is needed to understand the potential of extended, collaborative engagements in which OCEs’ expertise is systematically integrated into teachers’ ongoing cycles of planning, teaching, assessment, and problem solving. In particular, more studies are needed to explore the specific effects of these sustained partnerships on instructional planning, classroom practice, and student achievement.


METHOD


Given the limited descriptive knowledge base of extended teacher–expert partnerships, the principal goal of this study was to document several concrete examples of sustained, collaborative interactions between OCEs and NCTAF LS teacher teams and record any discernable influence of the partnerships on teachers’ instructional plans. The following research questions guided our investigation:


1. What are some specific ways that OCEs interface with LS teacher teams during the planning process?

2. What are the perceived effects of OCE contributions on LS teachers’ instructional plans?

3. What are the specific OCE facilitative actions that preceded any discernable effects?


RESEARCH CONTEXT


This research was situated in the context of the NCTAF LS model, which was under way with 23 schools in Maryland and five schools in New Hampshire as of June 2013. NCTAF defined LS as “creative, collaborative partnerships between classroom teachers and local STEM professionals that foster deeper learning for students and engage teachers in ongoing professional development” (National Commission on Teaching and America’s Future [NCTAF], n.d.). The model was based on the integration of three core program components: (a) interdisciplinary teacher teams, (b) forging partnerships with OCEs, (c) to engage students in problem-based learning (PBL).


NCTAF launched the 2013–2014 school year by hosting several two-day summer design sessions for groups of 12–18 schools within each participating county or district. Teams of four to six teachers from each school attended these summer sessions, which included training on problem-based learning, introductions to OCEs, and time for collaborative work. NCTAF leaders established partnerships with outside organizations that agreed to recruit members of their professional staff (e.g., scientists, engineers, university faculty) to volunteer as OCEs and participate in these LS sessions. OCEs gave presentations about resources and ideas from their professional contexts and circulated to meet with various teams. Teachers used this time to determine which OCE’s expertise might best align with the needs and goals of their group and to initiate collaborative work on specific project plans.


Across the 2013–2014 school year, NCTAF hosted four additional quarterly design sessions to facilitate project planning and extended LS–OCE collaboration. Teacher teams, with continued guidance and input from OCEs, launched each project by identifying a driving question that framed a learning challenge for students (e.g., Why is it important for us to conserve energy and consider alternative forms of energy?). Students worked collaboratively across different content areas to design projects and solve problems related to the driving question. In addition to the quarterly planning sessions, teachers maintained communication with OCEs and continued project planning during regular team meetings back at their school sites. Depending on the nature and timing of the projects, OCEs also visited school sites to work directly with students and teachers. The LS work culminated in May or June with annual district or school-site project presentations.


RESEARCH PARTICIPANTS


The research participants were 10 teachers and 2 OCEs from two secondary schools in Maryland, identified by NCTAF as strong examples of successful Learning Studios–Outside Content Expert (LS–OCE) partnerships. Both of the schools had at least one year of experience with sustained productive collaboration, as evidenced by NCTAF’s observational records of both LS team results and OCE interactions. The cases included one middle school and one high school, both located in the same urban school district, with LS team members representing a range of subject areas, including math, science, English, social studies, and technology education. All 10 teacher participants were veterans; nine of them had over 10 years of experience in the profession. A profile of each participating school and corresponding OCE is included in Table 1. We obtained informed consent from all teacher and OCE case participants and removed all names and other potential identifiers to maintain anonymity.


NCTAF originally identified four case study schools to participate, but the other middle school and high school were unable to sustain engagements with their identified OCEs. One OCE had to drop out of the program because of changes in her employer’s policy regarding school partnership commitments. The other OCE was placed on mandatory furlough during a shutdown of the U.S. government. These issues are addressed further in the discussion section of the article.


Table 1. Partnership Case Profiles


| Partnership case label | Case #1 | Case #2 |
| School (pseudonym) | Nobel High School (NHS) | Nobel Middle School (NMS) |
| Percent of students receiving free or reduced-price meals | 28% | 40% |
| Number of LS teachers | 6 | 4 |
| Years of teaching experience | Less than 5: 0; 5 to 10: 1; More than 10: 5 | Less than 5: 0; 5 to 10: 0; More than 10: 4 |
| Team member subject areas | Science, Tech Ed, English, Math | ELA, Social Studies, Science |
| Years of LS experience prior to 2013–14 | 4 | 1 |
| Big idea or theme for 2013–14 LS projects | Survival needs for extended space exploration (Tomatosphere) | Safeguarding our environment through the use of green energy |
| Driving question (project focus) during period of case study analysis | What effect does “priming” have on the germination rate, growth, and fruit yield of tomato plants? | How does color affect absorption? |
| OCE label | OCE1 | OCE2 |
| OCE years of LS experience prior to 2013–14 | 0 | 2 (1 with NMS) |
| OCE job title / area of expertise | Research Fellow: Health Science | Science Education Specialist: Environment, Earth and Soil |


DATA COLLECTION AND ANALYSIS


Data collection for this project took place during the 2013–2014 school year (June through May) using a qualitative case study design. Data included live observation and narrative transcription of team planning and design sessions, semistructured interviews with the two OCEs, and semistructured focus groups with the two LS teams. OCEs and LS team leaders also agreed to copy researchers on email correspondence and share access to web-based planning documents. See Table 2 for a data collection summary and timeline.


Table 2. Data Collection Summary and Timeline


| Data Collection | Number of Events | Timeline |
| Summer design session observations (6 hours each) | 1 x 2 cases | June 2013 |
| Quarterly design session observations (6 hours each) | 2 x 2 cases | September 2013 (Quarter 1); February 2014 (Quarter 3) |
| OCE interviews (60–90 minutes each) | 1 x 2 OCEs | March 2014 |
| LS teacher focus groups (40–60 minutes each) | 1 x 2 LS teams | March 2014 |
| Tracking work products and correspondence | N/A | Ongoing |


Design Session Observations

Based on recommendations from NCTAF, research assistants attended three of the five LS design sessions across the year to capture examples of project planning discussions at strategic intervals: summer launch (June), quarter one (September), and quarter three (February). The sessions were conducted by county, with multiple schools gathering in a large meeting space and separate tables provided for each LS team. Research assistants spent one full day (approximately 6 hours of session time) on each of these occasions sitting with assigned LS teams and using laptop computers to capture live transcriptions of dialogue and interactions among team members and OCEs. Digital recorders were also used to preserve audio content from each session, enabling researchers to fill in gaps or double-check key segments for transcription accuracy.   


Interviews and Focus Groups


Semistructured interviews and focus groups were scheduled in March, just after quarter-three design session observations. We used evidence and work products from the design sessions to construct specific questions about project lesson plans and OCE contributions to those plans. Each session was digitally recorded and transcribed. See Appendices A and B for the list of main questions used to guide the interviews and focus groups. Researchers also incorporated additional follow-up questions to probe for further detail and pursue emerging lines of inquiry.


General Coding Procedures


For the design session observations as well as interviews and focus groups, we organized the data and codes using a customized, web-based document management system that we previously designed for case study research projects. We did not use qualitative analysis software. Appendices C and D provide a list of codes and a detailed illustration of the transcription coding process. Interviews and focus groups were treated as one data set (see Appendix C), and design session observations were treated as a second data set (see Appendix D). For each data set, we used the first several transcripts to establish and calibrate a coding scheme while tracking and highlighting emerging themes. After assigning the appropriate code to a given segment of text, the researcher added comments in the margin and then copied and pasted a hyperlink of that excerpt into a master spreadsheet organized with coding categories listed as column headings and each partnership case event (transcript) listed by rows. The researcher also marked particular hyperlinks that represented uniquely illustrative excerpts for a corresponding category or theme. The final master spreadsheets and coded hyperlinks provided a readily navigable interface for analyzing patterns across each set of transcripts. The interview/focus-group master spreadsheet included 93 coded hyperlinks; the spreadsheet for the design session observations included 91 coded hyperlinks. Consensus ratings were used to negotiate discrepancies, check for reliability, and confirm coding categories for uniquely complex episodes.


Analysis of Key Episodes


We focused our analysis on design session episodes coded as “teacher uptake from OCE” to identify key episodes of planning interaction. Uptake was defined as a discernable shift in the pattern of discourse evidenced by heightened teacher interest, curiosity, or engagement in response to the OCE. We further examined each episode of uptake to distinguish between those that were intermittent or short-lived and any sustained, pivotal episodes that might represent expanded horizons of instructional possibilities (see Appendix D for examples). Only a fraction of these uptake episodes warranted further analysis. We used two criteria to classify episodes as expanding horizons: (a) we looked for evidence of teachers adopting an instructional approach from the OCE that was notably different from the approach they might otherwise have pursued for their project design or lesson plans (i.e., something beyond their existing horizons of observation); and (b) we looked for evidence that the insight or approach was directly oriented to a key learning goal or fundamental skill related to the standards and curriculum. Finally, we also analyzed the specific OCE actions that preceded these pivotal episodes of uptake and expansion.


Integrated Analysis


We used a constant comparative method to analyze interviews, focus groups, session observations, and work products for the two LS–OCE partnership cases throughout the 12-month study (Glaser & Strauss, 1967). We used triangulation to corroborate findings from these multiple data sources, across individuals, time, and settings, comparing participant self-reported perceptions with researcher observations (Denzin, 1978; Miles & Huberman, 1994). For each LS–OCE partnership, we compiled and organized key transcript excerpts and project records across all data sources into a single integrated case file and used these files to outline a more complete narrative description for each case. We also compared and analyzed prominent patterns of OCE facilitative action across the two cases.


RESULTS


Analysis of each partnership-case data set revealed multiple episodes of intermittent uptake with limited or no perceived effect, but each case also contained at least one pivotal episode of sustained interaction that demonstrated clear evidence of the OCE’s influence on teacher instructional plans. These key episodes were the focus of our analysis.


For each partnership case, we provide a detailed narrative summary of the pivotal episodes constructed from design session transcripts, field notes, correspondence, and available project records from NCTAF. We also provide teacher focus group/OCE interview descriptions of OCE contributions and summarize the perceived effect on teachers’ instructional planning.


CASE #1: NOBEL HIGH SCHOOL


The LS team at Nobel High School (NHS) included six veteran teachers responsible for English, math, science, and technology education (tech ed). The 2013–2014 school year was their fifth year participating in the NCTAF LS program and their first year partnering with OCE1. OCE1 was a research fellow at a U.S. government organization focused on health and medical research. She volunteered to assist with the program after learning about NCTAF and NHS from a coworker who had partnered with NHS the previous year.


For the 2013–2014 school year, the NHS teachers identified a program called Tomatosphere as a promising project for their interdisciplinary work with STEM and LS. Tomatosphere was a project sponsored by the Canadian Space Agency to engage students in the study of life support requirements for extended space exploration. Schools participating in the 2013–2014 project learned to design and conduct a scientific experiment with dependent and independent variables by comparing germination rates and plant growth for an experimental group of primed tomato seeds (i.e., presoaked in water) and a control group of unprimed seeds. The initial germination phase of the experiment was conducted as a “blind” test to prevent unintentional bias from influencing results. The project also included resources for cross-curricular application in areas such as nutrition, energy, weather, and environmental studies (Canadian Space Agency, n.d.).


During the June 2013 summer design session, the driving question NHS teachers recorded for the project was, “What effect does ‘priming’ have on the germination rate, growth, and fruit yield of tomato plants?” During the focus group, teachers described their selection of the Tomatosphere project as well as the origin of their partnership with OCE1.


T:

We had inquired about a program called Tomatosphere where this agency will send out seeds to schools in Canada and the United States and those seeds will have an experimental group and a control group. . . . When we sat down with [OCE1], where she came in was really the development of the program. We had decided that we wanted to move forward with the Tomatosphere project. . . . And so she jumped right in and said, well you know, she could come in and talk with our students about her work and how she goes about designing an experiment. And that lended itself directly to the Tomatosphere project. (Nobel High School, Focus Group, March 2014)


OCE1’s most notable contribution to the teachers’ planning process involved two sustained interactions during the June and September design sessions, including a lengthy email exchange between the two meetings. During the June session, teachers and OCE1 agreed that it might be interesting to connect some of her research on health and aging to questions of why lycopene or other nutrients might be beneficial, and why tomatoes, which are rich in lycopene, might be a viable crop for space travel. OCE1 agreed to do some research and see what she could locate:


T1:

I think everyone’s in agreement lycopene is a good extension to bring up. . .

OCE1: I can take a look at lycopene and what the most recent findings are in the literature in terms of health—what it’s good for—and write up a summary with citations. . . . Maybe a review article that’s not super technical but gives you kind of a broad view.

T2:  

Maybe an abstract?

OCE1: Yeah, an abstract . . . just kind of how the role lycopene plays in human health. . . . [Writes something down.] (Nobel High School, Design Session Transcript, June 2013)


While conducting research on lycopene over the summer, OCE1 discovered there was limited evidence to support the nutritional benefits of lycopene supplements. She explained in her interview comments:


OCE1:  After I left them I did a bunch of research online. . . . And so when I was looking I found a website through Mayo Clinic, and they go through and they grade the science of different supplements, and so seeing them grade science made me think about that as an in-class activity. . . . But when you get down to something like lycopene, they showed in one study that it maybe kind of did something one time, but nobody could replicate that, you know, really like there’s this one study, but it wasn't really done very well. (OCE1, Interview, March 2014)


OCE1 sent the teachers an email over the summer providing three specific scenarios for how they might approach a lesson involving the idea of supplements and health (see Appendix E). The options included: (1) ignore the role of lycopene and just focus on antioxidants in general; (2) discuss the studies pointing to limited evidence for lycopene and use them as an opportunity to engage students in critical thinking; or (3) press forward with their plan to discuss the benefits of lycopene and focus on the few available studies that demonstrated an effect. This email and corresponding list of options served as a launch point for the team’s planning discussions at the next design session in September. The following transcript excerpt captures a pivotal moment of teacher uptake that resulted from OCE1’s suggestions:


T2:  

Since most of us have introduced the Tomatosphere project design, the overall purpose to our students. . . . I was thinking maybe we could actually have you come in and they could learn like, “Why tomato?”—with the lycopene.

OCE1: Ok. So, like I said in my email, I was looking online for evidence of lycopene and human health, and unfortunately there is not very strong support. Like, there will be one study saying that it kind of helps this and another person can’t reproduce it. So the direct role of lycopene itself seems to be pretty tenuous. It doesn’t seem to have a great connection to human health. But I think that this could either be like a learning opportunity (summarizes options 1 and 3 from her email). . . . Another option is that you can use it as a critical thinking opportunity to have them like maybe look at the evidence, see what there is, and have them decide if it’s good evidence.

T2:  

I was going to go with that.

OCE1: It could be a little bit trickier, but it may be rewarding.

T2:

Well, I like both things but what I was thinking when you started talking is . . . not  just learning facts from a textbook, but actually learning how scientists actually learn the science that we teach in our classrooms. So when you just said that there’s not a whole bunch of evidence to say that lycopene is perfect . . . I thought it was good for students to see that, that’s it’s an ongoing process. . .

OCE1: Yeah . . . looking at different studies and identifying why they’re flawed or why they don’t agree with one another is also teaching the material of what evidence there is for lycopene in health but also critical thinking skills.

T2:

So I thought that maybe, I don’t know if you can do this, but give them something and then say, “Does this look like it’s reliable data?”

OCE1: So . . . maybe. I don’t know, tell me if this would work in an actual classroom. What if I went to the studies and I looked at the abstracts and . . . if it’s super heavy write a simplified format. And then provide a couple of abstracts about lycopene and let’s say prostate cancer. And I don’t know, maybe the students could read over it . . . and hold up a letter grade for how good they think the study supports it . . . and why do you think it’s a great study . . . or why do you think it’s a bad study? And you know back it up. I don’t know.

T2:  

That’s a good idea.

T3:

I like that idea.

[General nodding and “yeah” across the group]

T1:

I think it directly relates to what we’ve been talking about for our writing samples for claim, evidence, reasoning. So we’ve been recently discussing having students as a goal for the year increase their ability to write a scientific explanation. And the components of a scientific explanation are claim, evidence, reasoning. So if they can actually evaluate a simplified version of the abstract, “Does the information from the abstract match the claim?” . . . they’re processing through that filter of, “Does this evidence support this claim or not and why?” And then have them do a writing at the end. . .

T3:

OCE1 emailed us a Mayo Clinic link. . .

T2:  

So I have a question . . . I like the idea. . . . What is the best way to take this to the students so that when you come in and talk it can be a dialogue? Is it better to have the abstracts before? . . . What are you thinking?

T3:  

I’m thinking the floor as a gallery . . . so it doesn’t look like we grouped the best, the mediocre. And the students are in groups.

OCE1: I could write up the abstracts and you guys could print them out and give them to the students to read the night before so they have some time to digest it. And then I could come give a 15-minute talk about lycopene and human health or what makes a good research study solid.

T2&3:

I like what makes a good research study solid. (Nobel High School, Design Session Transcript, September 2013)


The teacher–OCE interactions and reflections around this planning episode represent a clear example of teachers expanding horizons of instructional plans as a direct result of OCE1’s contributions. After alerting teachers to oversimplified claims about the benefits of lycopene, OCE1 presented the team with a wider range of instructional options to consider that might better support their project learning goals. In the following focus group excerpts, teachers describe how their lesson plans became more focused on helping students think critically about the scientific process than would likely have been possible without OCE1’s participation (first criterion for expanding horizons). They also describe how these lesson changes directly supported important learning outcomes for students, such as understanding experimental design and evaluating scientific claims, as articulated in their state science standards and corresponding assessment indicators (second criterion for expanding horizons; Maryland State Department of Education, 1995).


T:

I’ll just say what I was talking to you about earlier, that question that I was—that discussion I had with [OCE1]. It’s making me reflect and try to figure out ways that I can connect, not just the fundamental conceptual knowledge they get of our science topic, but also like how the science is actually done. . .

T:

. . . we realized that students had a weakness when it came to the skills and processes of science, that [Maryland State Department of Education] goal and indicator specifically. . . . And so [OCE1] really set up a, you know, kind of a foundation for this project. So not necessarily in terms of the content topic, but for the foundation in experimental design. . . .(Nobel High School, Focus Group, March 2014)


For Case #1, there were no additional episodes of significant uptake recorded after the June and September design sessions. The NHS team decided to repeat the same activity with several other groups of students and was also partnering with another OCE (a biotechnology expert) during the second half of the year. As OCE1 described it, “From this point on, our conversations become far more logistical in nature, because I did the first thing and they liked it, they wanted me to do it again, and then quarter three was planning for the logistics of the science fair” (OCE1, Interview, March 2014).


CASE #2: NOBEL MIDDLE SCHOOL


The LS team at Nobel Middle School (NMS) included four veteran teachers covering science, language arts, and social studies content for Grades 6–8. The 2013–2014 school year was their second year with LS as well as their second year partnering with OCE2, a science education specialist with a broad generalist background and particular expertise in earth, soil, and the environment. Her partnership with the school originated through a position she held at a government-sponsored science education center.


During the June 2013 summer design session, the NMS teachers discussed “sustaining efficient energy,” and specifically “green energy,” as big ideas or general project themes for the school year. Initially, they discussed a project utilizing solar-powered cars to teach seventh-grade students about kinetic and potential energy but encountered a number of obstacles related to time constraints, securing necessary resources, and establishing an appropriate level of complexity for the seventh-grade level. Later in the school year, with the guidance of OCE2, they decided to switch from an emphasis on solar to thermodynamics and worked to engage students around the driving question: “How does color affect absorption?” Teachers explained:


T:

[OCE2] also helped with that because in order to do the solar project, you need to see them with STEM more than one hour of the week. So for us, we only had STEM on Thursdays. So, doing solar would not have been conducive for us to do that. So she helped us to stay with the energy concept and so we went from solar to thermodynamics. . .

T:

And so with seventh graders, some of them coming from elementary, some of them are in a middle school. So when you speak of thermodynamics, you have to think of how to break it down for them . . . she helped us to narrow it down to using colors for interior or exterior of their homes. So that is our project . . . now that the temperatures are breaking, we can go outside and actually test to see does color have an effect on temperature. (Nobel Middle School, Focus Group, March 2014)


Together with OCE2, the teachers developed detailed instructional plans to teach students about heat, absorption, reflection, refraction, and the urban heat island effect. They designed a lesson where students would develop hypotheses for which colors (white, black, yellow, and green) would absorb the most heat and which would reflect the most heat. Students would then conduct an experiment mixing water-based paint into 200 ml of water (preacclimated to the outdoor temperature) and measure both the temperature change and the differences between temperatures for each of the different colors. This set the stage for subsequent lessons in which students would learn principles of architecture and prepare to design and eventually build green buildings. For the exterior paint color of the buildings, they would use the results of their experiment and choose the color that proved most efficient in reflecting away heat. In English and social studies classes, they would also write and produce public service announcements about the project to raise awareness about energy conservation.


Throughout the planning process, OCE2 offered multiple suggestions to help teachers maximize the validity and precision of the experiment within the finite constraints of the classroom context. For example, she emphasized the importance of taking the temperature of the plain water to ensure they began the experiment with a solid baseline. She also suggested they take the temperature of each sample mixture every 2 minutes until two consecutive temperatures were within 1 degree of one another.


The most notable contribution OCE2 made to the instructional planning was helping teachers decide how to best incorporate color into the jars of water. Teachers’ initial plan was to paint the jars or place colored paper around the jars, but OCE2 helped them understand how this might limit the accuracy of the experiment for students. Below is an excerpt from that pivotal episode during the February design session:


T1:

So actually I was thinking, do you want them. . . . Cause I was wondering, would paint work best on the jar or just wrap them in construction paper?

OCE2: Well the thing is that what we’re doing is, we’re getting natural light. So that’s why it’s better to have a liquid or solid . . . because if you just put the jar . . .

T1:

Right. We have the jar with the water and then, of course, color it with construction paper or paint the jar.

OCE2: That’s why I’m saying. Instead of doing that, because it’s the sun . . . it’s going to be shining down on clear water and surrounded by color rather than the color itself.

T1:

So, the water should be colored?

OCE2: I think so. That’s why I was saying let’s melt the water color paint into the water and test that. Because it’s the actual medium we want to test, not the outer . . .

T1:

So once you add the color to the water—Okay. So then I don’t have to worry about painting the jars.

OCE2: Exactly. . . . So that's what I was thinking. If you start with air temperature water, whatever it is outside that's what we will have. . . . Then we add in the different color paint. Then we start measuring every two minutes to see what is the difference between white water, black water, red water—What is the optimal one? (Nobel Middle School, Design Session Transcript, February 2014)


During email exchanges after the session, OCE2 reiterated and further outlined specific details for the lesson and experiment (see Appendices G and H), preserving each of these points in writing for the team’s continued reference and planning.


This evidence from teacher–OCE interactions, combined with their reflective comments on the episode, provides another tangible example of LS teachers expanding horizons of instructional plans as a direct result of OCE suggestions and contributions. Compared with their initial plans prior to OCE2’s assistance, it was evident that teachers gained an expanded knowledge of design and measurement principles for teaching thermodynamics (first criterion for expanding horizons). They also gained practical knowledge for translating these ideas and principles into the immediate context of their seventh-grade classrooms. These insights were important for ensuring reliable results but also for helping teachers engage students in effective scientific practices as defined in the middle school Next Generation Science Standards under “planning and carrying out investigations” (second criterion for expanding horizons; Next Generation Science Standards, n.d., MS-PS3-4). Both the teachers and OCE2 described this planning episode in the focus group and interview discussions as instrumental in increasing the scientific validity and precision of their design for the heat absorption experiment.


T:

. . . I think we were first looking at taking colored paper to put around the jars to simulate the different colors.

T:

Right.

T:

And she said that it would be probably more effective to just color the water itself and then go from that perspective because that way we would be testing the temperature of the water. (Nobel Middle School, Focus Group, March 2014)


OCE2: Colored jars where the sun is shining directly on them wouldn’t give us a medium. It would give us the sun shining directly on the colored jars. So, it would be a different experiment than if we’re actually looking at the thermometer taking the temperature of that color. . . . So, I needed them to see that what we’re doing is actually looking at the temperature, following a set of protocols for each medium being the same. . . . Is the sun shining directly on the thermometer, or is it in the shadow of the colored jar? All of that is stuff they don’t have to worry about because the medium is there and the thermometer is going to be in the medium to take the measurement of the entire thing. . . . So, we would be getting a more consistent temperature measurement of what we’re actually trying to measure. (OCE2, Interview, March 2014)


Later in the interview, OCE2 also summarized the change process she observed with the teachers over the course of the project, specifically the episode involving measurement and temperature:


OCE2:

I think that they’re changing the way they’re thinking . . . sometimes, we have to teach how to measure something, how to measure temperature, but it’s not just let’s teach how to measure temperature. It’s . . . we’re going to have an electric bill to pay and if our electric bill was lower because we have a black roof, does that make a difference, or how does it make a difference? And so, it’s more integrative . . . students are learning a wide array of different processes through this one project. And I think the teachers themselves are starting to see that aspect of it. . . (OCE2, Interview, March 2014)


For Case #2, there were no additional significant episodes to report from the earlier design sessions in the first half of the year. This was primarily due to project delays from a shipment of solar cars that never arrived, but also due to changes in leadership roles among key NMS team members. OCE2 had limited interaction with the team until conditions stabilized in February. She explained,


I felt rather keenly that they had plans, but, you know, the cars hadn’t arrived and they were going to be doing some research. So, I let them call. I made them some resources to look at . . . like it could be a website or various places like that, but I was not specifically involved. (OCE2, Interview, March 2014)


KEY FACILITATIVE ACTIONS


Cross-case comparative analysis revealed at least three key patterns of OCE action that contributed to the sustained episodes of uptake and helped instigate teachers’ rethinking of project design and instructional plans. These patterns do not encompass all the OCE characteristics and traits we observed, or all that might be deemed necessary for successful partnerships; rather, they are three specific patterns of facilitative action brought to light by the evidence in these cases.


Adapting Expertise to Local Needs


At the outset of each pivotal episode, both OCE1 and OCE2 demonstrated a flexible mindset. Although both articulated in interviews clear ideas of possible research, expertise, or activities they might share with the teams, the first action we documented in design sessions was listening and building an understanding of teachers’ existing plans, needs, and rationale. Both OCEs stressed the importance of listening, genuinely tuning in to the needs of the group, and learning from the group’s knowledge and experience in order to effectively adapt and assist the emerging project:


OCE1: I think the number one thing is to be a good listener. I think that if I had just come in with my ideas of what I wanted to do and just barreled through, like, “Oh, well if it’s not my area, I don't want to do it. . . ” I mean originally I had thought about talking about genetics, like I had this whole idea of what we would talk about, and I think that if somebody’s not willing to listen to what the teachers want and need, and if they're not flexible, I think that things wouldn't go very far. (OCE1, Interview, March 2014)


OCE1: [Teacher A] brought up the fact that a lot of people really struggle with the idea that just because it’s a theory, doesn't mean that it isn't relevant, or that it doesn't matter. . . . And that’s one of the things that I kind of get blind to in science, is that I forget how suspicious people are of science . . . I kind of lose touch with that everyday skepticism. And so it’s good that she brought that up, because I wouldn't have even thought about that. . . (OCE1, Interview, March 2014)


OCE2: Preconceived notions . . . I think that’s one of the most important things you can avoid because, you know, what I want to do as a workforce partner may not be helpful for what the teachers are trying to accomplish. . . . Curiosity is certainly something that I think is important and a willingness to listen to what their needs are and what their ideas are. And then, synthesizing all of those ideas into a coherent whole, then begin guiding them on how they can achieve the big goal they set for themselves, or how they can truncate it into something that’s manageable. . . . If there’s a reason why I am good at helping it’s because I am pretty well-rounded. . . . Often it takes someone who can look at possibilities and remain flexible to developing it in whichever path works best for the teachers. (OCE2, Interview, March 2014)


Listening, asking questions, showing interest before proposing ideas—these were all manifestations of OCEs working to adapt and integrate their expertise to the teachers’ specific project goals and plans. This approach not only laid a foundation of trust and shared understanding but also helped OCEs gain insight into teachers’ thinking, sometimes revealing important gaps in lesson plans or a specific “blind spot” resulting from limited horizons of observation.


Following Up Between Meetings


After learning from teachers and gaining knowledge of their local context, one way that both OCE1 and OCE2 applied their expertise and contributed ideas was through diligent follow-up work between meetings. As demonstrated in her email (see Appendix E), OCE1 not only made a substantial effort to review existing literature on lycopene but also carefully outlined three specific options that might expand teachers’ horizons regarding how to approach this particular teaching opportunity. Teachers expressed appreciation for this follow-through and responsiveness:


T: . . . they have to be accessible . . . if you're going to get feedback, you have to have somebody that's going to give you feedback pretty quick turnaround. Like we’ve all been saying, if we asked OCE1 something, she was very quick to respond to us. (Nobel High School, Focus Group, March 2014)


OCE2 also provided extensive guidance through follow-up emails (see example in Appendix G) in which she carefully outlined the nuanced instructional steps for the heat absorption experiment. A few days later, she sent the team a detailed data worksheet with corresponding instructions to guide students in accurate measurement and data collection of the various water temperatures (see Appendix H). This helped preserve, in writing, the important ideas from their design session discussions about maximizing validity and precision of the scientific process.
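The stopping rule embedded in these instructions (take a reading at fixed intervals and stop once two consecutive readings agree within one degree) is the part of the protocol students can most easily misapply. As a purely illustrative sketch of that logic, assuming a hypothetical read_temperature_f sensor callable and the two-minute interval from the worksheet (our construction for this article, not part of the study materials), the criterion can be expressed in Python as follows:

    import time

    def measure_until_stable(read_temperature_f, interval_s=120, tolerance_f=1.0):
        # Record readings (deg F) at fixed intervals until two consecutive
        # readings differ by no more than tolerance_f, mirroring the worksheet
        # protocol: one reading every two minutes, stopping when two
        # consecutive measurements are within 1 degree F of one another.
        readings = [read_temperature_f()]
        while True:
            time.sleep(interval_s)  # wait between readings (two minutes)
            readings.append(read_temperature_f())
            if abs(readings[-1] - readings[-2]) <= tolerance_f:
                return readings  # last value is recorded as the final temp

    # Demonstration with a stand-in "sensor" that warms toward ambient:
    temps = iter([68.0, 72.5, 75.0, 76.8, 77.4])
    print(measure_until_stable(lambda: next(temps), interval_s=0))

In the worksheet, the same rule is applied twice per cup: once to confirm the clear water has acclimated to ambient temperature, and again after the paint is dissolved.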


Judiciously Applying Pressure


After taking time to listen and develop a shared understanding of project plans, both OCEs also looked for critical junctures to stretch teachers’ thinking. In the pivotal episodes we captured, both OCEs patiently guided their respective teams to new insights and judiciously applied pressure to expand horizons of instructional possibilities. Observations and interviews showed that OCEs had clear ideas of instructional activities that might help teachers achieve stated goals and increase scientific rigor or validity but introduced these ideas as “options to consider” through a sequence of understated facilitative moves rather than aggressively asserting opinions or overtly leveraging their authority as an outside “expert” or “researcher.”


For example, OCE1 conceived of the "grade the science" activity (see Appendix F) while studying research abstracts over the summer but chose to help teachers evaluate instructional options on their own by situating the idea within a list of three scenarios and waiting to see which option generated the most uptake. She frequently softened her tone with words like "maybe" or phrases like "tell me if this would work" to engender respect and cultivate openness, while at the same time pushing teachers to consider an alternative instructional approach. She reinforced the initial uptake with slightly more direct statements such as, "It could be a little bit trickier, but it may be rewarding." She then added a few specific points of rationale as interest was building: "Yeah . . . looking at different studies and identifying why they're flawed or why they don't agree with one another is also teaching the material of what evidence there is for lycopene in health but also critical thinking skills." Reflecting back on this key episode, OCE1 was gratified to see teachers embrace the more challenging lesson approach: "So I talked to them about it, and they were interested in pursuing the critical thinking aspect of it . . . I was excited that they wanted to pursue that track!" She also emphasized the importance of trusting teachers' knowledge and expertise: "I was willing to compromise the critical thinking aspect if they felt that students weren't up to it, or it wasn't a good use of their time. I think you have to be willing to let things go" (OCE1, Interview, March 2014).


Similarly, OCE2 located important gaps in teachers' initial design of the heat absorption experiment and identified changes that would help increase the precision and accuracy of students' measurement process. She explained, "I needed them to see that what we're doing is actually looking at the temperature, following a set of protocols for each medium being the same. . . " (OCE2, Interview, March 2014). She patiently persisted with four consecutive attempts to explain the importance of applying paint directly to the water instead of using colored jars or construction paper. OCE2's first attempt was interrupted by T1, who presumed the team already understood what OCE2 was trying to explain: "Right. We have the jar with the water and then, of course, color it with construction paper or paint the jar." Despite teachers' initial shortsightedness, OCE2 continued to explain the rationale regarding natural light and how the sun would be "shining down on clear water and surrounded by color rather than the color itself." What was particularly striking on the audio recording was how OCE2 maintained a consistent, calm tone of voice during each successive exchange and gradually guided the teacher to a new understanding (see the exchange that follows for an example where this calm tone was particularly evident).


T1: So, the water should be colored?

OCE2: I think so. That's why I was saying let's melt the water color paint into the water and test that. Because it's the actual medium we want to test, not the outer. . .

T1: So once you add the color to the water—Okay. So then I don't have to worry about painting the jars.

OCE2: Exactly. . . (Nobel Middle School, Design Session Transcript, February 2014)


These subtle and judicious applications of pressure by both OCE1 and OCE2 provided just enough stretch to help teachers grow beyond their existing visions of practice while not demanding so much as to close off communication or create resistance. Both OCEs confronted gaps without being confrontational. They intentionally and carefully pursued opportunities to help teachers improve the design of project lessons and address important learning goals. As OCE2 explained, they also showed respect for teachers’ knowledge and experience in the process: “I see that . . . in Learning Studios, my job is to take what the teachers either already know or are curious about, give them something they didn’t know, but yet are excited to know more, and then we build off each other to develop the idea—to develop the project” (OCE2, Interview, March 2014).


DISCUSSION


In this study, we investigated two cases of extended collaborative partnership between OCEs and NCTAF LS teacher teams. Our goal was to document concrete examples of sustained interactions and record any discernable influence of the partnerships on teachers’ instructional plans. The pivotal episodes we captured provide some initial evidence to support previous researchers’ hypotheses that extended collaborative engagements can facilitate teacher learning in ways not readily achieved through traditional partnership models. The joint productive activity and depth of interaction observed in these cases opened up several opportunities to infuse knowledge and insights seldom documented in other teacher–expert studies involving loosely structured programs or short-lived externships. The two cases also provide some initial evidence that outside experts can help to expand the horizons of possibilities that teachers consider during the instructional planning process.


To be clear, this is not to suggest that outside expertise is the only way teachers might generate or expand professional knowledge. We agree with Cochran-Smith and Lytle (1999) that “knowledge of practice” is generated  “when teachers treat their own classrooms and schools as sites for intentional investigation at the same time as they treat the knowledge and theory produced by others as generative material for interrogation and interpretation” (p. 250). We posit that the changes in project plans documented in these examples would be unlikely to occur without the combination of well-structured collaborative teacher inquiry (to leverage teachers’ existing craft knowledge) and well-timed, purposeful involvement of OCEs (to help teachers look beyond existing conceptions of practice). We hope that future studies might explore the specific dosage, frequency, and types of OCE contributions that are most likely to yield these desired effects.


IMPLICATIONS


The results of this study have immediate implications for NCTAF LS and other STEM partnerships, but they also present broader implications for any context or program in which outside experts partner to assist teacher teams. Program leaders should consider how the key facilitative actions highlighted in these cases might help define criteria used for OCE candidate recruitment as well as the skills emphasized in OCE mentoring and training. The facilitative actions depict a different image of OCEs than the typical profile documented in other observations throughout the partnership literature (e.g., Brown et al., 2014; Falloon, 2013). Rather than operating as “purveyors of knowledge,” the OCEs worked to adapt expertise to local contexts, made a substantial investment in detailed follow-up communication, and judiciously applied pressure to stretch horizons of practice. This required listening, flexibility, and a genuine investment in understanding teachers’ project goals, plans, and rationale. It required a generalist perspective and willingness to set aside personal areas of research interest. It also required significant time and patience as OCEs carefully selected language and strategically positioned suggestions to nudge thinking forward when they noticed opportunities to expand scientific knowledge or enhance quality of instructional plans.


Program leaders might also consider the implications of these findings for the design and structure of teacher team–OCE collaborative planning sessions. In the design sessions we observed, for example, teachers focused the majority of their time on determining “what” content and activities to cover for each subject or class period. There was limited time available in these sessions for the kinds of discussions we observed in pivotal episodes—discussions focused on dilemmas of practice and the nuances of “how” to teach this content well. One possible adjustment is to structure some portion of session time with a simple protocol for identifying specific instructional dilemma(s) related to the driving question, articulating the rationale for the selected instructional approach, and discussing details for how to implement it well. This could be combined with giving OCEs clear guidance on the kind of conversations to “look for” and foster (i.e., examples or images of “expanding horizons”) and training OCEs to help nurture or instigate discussions that focus on the nuances of teaching.


It might also be valuable for program leaders to use transcript examples like those captured in this study as training modules for OCE orientation, pointing out the facilitative actions as well as other specific factors that might contribute to "expanding horizons." Possible points of emphasis might include: learning to frame suggestions and communicate knowledge in classroom-oriented terms; expressing advice or corrections with softened tones; outlining instructional ideas in sufficient detail to capture important nuances of effective implementation; or helping teachers establish specific criteria by which to evaluate the selection of an instructional activity (e.g., not just whether it will increase student engagement but whether the lesson design will help address important learning goals). Evidence from this study suggests that, in one year of partnership interactions, each OCE might have only a few key opportunities to significantly contribute to teachers' instructional plans and conceptions of practice. This places a high priority on preparing OCEs to both foster and recognize these key opportunities and to leverage them effectively for the benefit of teachers and students.


LIMITATIONS


As with all case study research, these findings are bounded by the specific context of this implementation and case study population (Merriam, 1998). The results provide rich description of two NCTAF LS partnership cases but are not predictive of future outcomes with other schools or partnership projects. There are several other cautions and caveats.


First, we cannot speak definitively as to whether these findings are typical of all successful LS–OCE cases, or even typical of these particular teams and cases, given that we did not observe all five design session events or all interactions at the school sites. Nor did we observe their partnership interactions from previous years. However, the substantial volume of data and the systematic effort to transcribe and study the interactions, in contrast with the sporadic anecdotes that arise from participant or program leader self-reports, provided a more complete and realistic picture of the full variation of interactions in these events, as well as their potential to influence the direction of teams' instructional plans.


Second, it is important to reiterate that NCTAF had originally selected four cases to follow, but two other schools were unable to sustain engagements with the OCEs because of changes in the policies and politics of the external partner organizations. This serves as an important warning and reminder that external factors can and often do disrupt implementation, and even the most proactive programs and successful partnerships are susceptible to these challenges. Previous studies have raised similar questions about the viability and sustainability of teacher–OCE partnership models, emphasizing that "to succeed, sufficient commitment to them must come from [OCEs] and their employing organizations" (Falloon, 2013, p. 870).


Finally, although the evidence reported in this study demonstrates several clear examples of OCEs influencing teachers’ instructional plans and expanding horizons of instructional possibilities that teachers consider, we do not believe the reported effects are of sufficient magnitude or duration to be labeled as “transformative shifts in practice.” This would require extended observation over multiple years and a thorough analysis of lesson changes based on established criteria for each domain, such as the “teaching for transfer” and “deeper learning” practices outlined by the NRC (2012, p. 143). Because the scope and resources for this study focused only on the lesson planning process, we also cannot say whether these initial changes in instructional plans translated into changes in classroom teaching or improvements in student learning. These remain important questions for future research.  


References


Barraza-Lyons, K., & Daughtrey, A. (2011). The California Science Project Teacher Retention Initiative: A final report. Carrboro, NC: Center for Teaching Quality.


Bausmith, J. M., & Barry, C. (2011). Revisiting professional learning communities to increase college readiness: The importance of pedagogical content knowledge. Educational Researcher, 40(4), 175–178.


Bell, R., Blair, L., Crawford, B., & Lederman, N. (2003). Just do it? Impact of a science apprenticeship programme on high school students’ understanding of the nature of science and scientific inquiry. Journal of Research in Science Teaching, 40(5), 487–509.


Briscoe, C., & Peters, J. (1997). Teacher collaboration across and within schools: Supporting individual change in elementary science teaching. Science Education, 81, 51–65.


Brown, J. C., Bokor, J. R., Crippen, J. K., & Koroly, M. J. (2014). Translating current science into materials for high school via a scientist-teacher partnership. Journal of Science Teacher Education, 25(3), 239–262.


Canadian Space Agency. (n.d.). Tomatosphere. Retrieved from http://www.tomatosphere.org/


Carroll, T., Fulton, K., & Doerr, H. (Eds.). (2010). Team up for 21st century teaching and learning: What research and practice reveal about professional learning. Washington, DC: National Commission on Teaching and America’s Future.


Cochran-Smith, M., & Lytle, S. L. (1999). Relationships of knowledge and practice: Teacher learning in communities. Review of Research in Education, 24, 249–305.


Council of Chief State School Officers. (2013). Interstate Teacher Assessment and Support Consortium (InTASC) model core teaching standards and learning progressions for teachers 1.0. Washington, DC: Author. Retrieved from http://www.ccsso.org/Documents/2013/2013_INTASC_Learning_Progressions_for_Teachers.pdf


Denzin, N. (1978). The research act: A theoretical introduction to sociological methods. New York, NY: McGraw-Hill.


DuFour, R. (2004). What is a professional learning community? Educational Leadership, 61(8), 6–11.


Erkut, S., & Marx, F. (2005). 4 schools for WIE evaluation report. Wellesley Centers for Women. Retrieved from http://www.coe.neu.edu/Groups/stemteams/evaluation.pdf


Ermeling, B. A. (2010). Tracing the effects of teacher inquiry on classroom practice. Teaching and Teacher Education, 26(3), 377–388.


Ermeling, B. A., & Gallimore, R. (2013). Learning to be a community: Schools need adaptable models to create successful programs. Journal of Staff Development, 34(2), 42–45.


Ermeling, B., & Graff-Ermeling, G. (2016). Teaching better: Igniting and sustaining instructional improvement. Thousand Oaks, CA: Corwin Press.


Falloon, G. (2013). Forging school–scientist partnerships: A case of easier said than done? Journal of Science Education and Technology, 22(6), 858–876.


Fink, D. (2000). Good schools/real schools: Why school reform doesn’t last. New York, NY: Teachers College Press.


Firestone, W. A., & Pennell, J. R. (1997). State-initiated teacher networks: A comparison of two cases. American Educational Research Journal, 34(2), 237–268.


Gallimore, R., & Ermeling, B. A. (2010, April 14). Five keys to effective teacher learning teams. Education Week, 29(29).


Gallimore, R., & Ermeling, B. A. (2012). Why durable teaching changes are elusive and what might we do about it. Journal of Reading Recovery, 12(1), 41–53.


Gallimore, R., Ermeling, B. A., Saunders, W. M., & Goldenberg, C. (2009). Moving the learning of teaching closer to practice: Teacher education implications of school-based inquiry teams. Elementary School Journal, 109(5), 537–553.


Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory. Chicago, IL: Aldine.


Goldenberg, C. (2004). Successful school change: Creating settings to improve teaching and learning. New York, NY: Teachers College Press.


Groome, M., Rankin, J., & Wheary, J. (2001). Support, collaborate, retain: Strategies for improving the STEM teaching crisis. Triangle Coalition. Retrieved from http://www.demos.org/sites/default/files/publications/STEM_Report_Demos-NYAS.pdf


Hargreaves, A., & Fink, D. (2006). Redistributed leadership for sustainable professional learning communities. Journal of School Leadership, 16(5), 550–565.


Harris Willcuts, M. (2009). Scientist-teacher partnerships as professional development: An action research study. Richland, WA: Pacific Northwest National Laboratory (PNNL). Retrieved from http://science-ed.pnnl.gov/teachers/pdfs/pnnl-18305.pdf


Hiebert, J., Gallimore, R., Garnier, H., Givvin, K. B., Hollingsworth, H., Jacobs, J., . . . Stigler, J. W. (2003). Teaching mathematics in seven countries: Results from the TIMSS 1999 video study (NCES 2003-013). Washington, DC: U.S. Department of Education, National Center for Education Statistics.


Hiebert, J., Gallimore, R., & Stigler, J. (2002). A knowledge base for the teaching profession: What would it look like and how can we get one? Educational Researcher, 31(5), 3–15.


Hord, S. (1997). Professional learning communities: What are they and why are they important? Austin, TX: Southwest Educational Development Laboratory. Retrieved from http://www.sedl.org/change/issues/issues61/Issues_Vol6_No1_1997.pdf


Hutchins, E. (1996). Learning to navigate. In S. Chaiklin & J. Lave (Eds.), Understanding practice: Perspectives on activity and context (pp. 35–63). Cambridge, England: Cambridge University Press.


Learning Forward. (2011). Standards for professional learning. Oxford, OH: Author. Retrieved from http://learningforward.org/docs/pdf/standardsreferenceguide.pdf?sfvrsn=0


Lehman, J., & Capobianco, B. (2012, January). Creating shared instructional products for integrating engineering education in the science learning through engineering design (SLED) partnership. Paper presented at the annual meeting of the Association for Science Teacher Education, Clearwater, FL.


Lewis, C., & Hurd, J. (2011). Lesson study step by step: How teacher learning communities improve instruction. Portsmouth, NH: Heinemann.


Lieberman, A., & Miller, L. (2011). Learning communities: The starting point for professional learning is in schools and classrooms. Journal of Staff Development, 32(4), 16–20.


Little, J. W. (1982). Norms for collegiality and experimentation: Workplace conditions of school success. American Educational Research Journal, 19(3), 325–340.


Little, J. W. (2003). Inside teacher community: Representations of classroom practice. Teachers College Record, 105(6), 913–945.


Lomos, C., Hofman, R. H., & Bosker, R. J. (2011). Professional community and student achievement–a meta-analysis. School Effectiveness and School Improvement, 22(2), 121–148.


Loucks-Horsley, S. L., Love, N., Stiles, K. E., Mundry, S., & Hewson, P. W. (2003). Designing professional development for teachers of science and mathematics (2nd ed.). Thousand Oaks, CA: Corwin Press.


Louis, K. S. (2006). Changing the culture of schools: Professional community, organizational learning, and trust. Journal of School Leadership, 16(5), 477–489.


Louis, K. S., Kruse, S. D., & Marks, H. M. (1996). School wide professional community. In F. M. Newmann & Associates (Eds.), Authentic achievement: Restructuring schools for intellectual quality (pp. 179–203). San Francisco, CA: Jossey-Bass.


Louis, K. S., & Marks, H. M. (1998). Does professional community affect the classroom? Teachers’ work and student experiences in restructuring schools. American Journal of Education, 106, 532–575.


Maryland State Department of Education. (1995). Science core learning goals. Retrieved from http://www.msde.maryland.gov/NR/rdonlyres/64C18AB5-2426-4BC5-9EF4-9D5FC284294C/2452/11AReportoftheHighSchoolAssessmentTaskForce_Attch6.pdf


McDougall, D., Saunders, W. M., & Goldenberg, C. (2007). Inside the black box of school reform: Explaining the how and why of change at Getting Results schools. International Journal of Disability, Development and Education, 54, 51–89.


McLaughlin, M. W., & Talbert, J. E. (2001). Professional communities and the work of high school teaching. Chicago, IL: University of Chicago Press.


McLaughlin, M. W., & Talbert, J. E. (2006). Building school-based teacher learning communities: Professional strategies to improve student achievement. New York, NY: Teachers College Press.


Merriam, S. B. (1998). Qualitative research and case study applications in education. San Francisco, CA: Jossey-Bass.


Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis. Thousand Oaks, CA: Sage.


National Board for Professional Teaching Standards. (2002). What teachers should know and be able to do. Arlington, VA: Author. Retrieved from http://www.nbpts.org/sites/default/files/documents/certificates/what_teachers_should_know.pdf (Original work published 1989)


National Commission on Teaching and America’s Future. (n.d.) Learning studios. Retrieved from http://nctaf.org/learning-studios/


National Research Council. (1996). The role of scientists in the professional development of science teachers. Committee on Biology Teacher Inservice Programs, Board on Biology, Commission on Life Sciences. Washington, DC: National Academies Press.


National Research Council. (2012). Education for life and work: Developing transferable knowledge and skills in the 21st century (J. W. Pellegrino & M. L. Hilton, Eds.). Committee on Defining Deeper Learning and 21st Century Skills. Board on Testing and Assessment and Board on Science Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.


Nelson, T. (2005). Knowledge interactions in teacher-scientist partnerships: negotiation, consultation and rejection. Journal of Teacher Education, 56(4), 382–395.


Next Generation Science Standards. (n.d.). MS-PS3-4 Energy. Retrieved from http://www.nextgenscience.org/search-performance-expectations?tid_2%5B%5D=14&tid%5B%5D=28


Reisberg, R., Ziemer, K., Knight, M., Wong, P., Swan, A., Sumru, E., & Camesano, T. (2006). A model STEM team collaboration: Four schools for WIE. Proceedings of the 2006 WEPAN Conference. Women in Engineering Programs and Advocates Network (WEPAN). Retrieved from  https://ojs.libraries.psu.edu/index.php/wepan/article/download/58458/58146


Rosenholtz, S. J. (1989). Workplace conditions that affect teacher quality and commitment: Implications for teacher induction programs. Elementary School Journal, 89(4), 421–439.


Rosenholtz, S. J. (1991). Teachers’ workplace: The social organization of schools. New York, NY: Teachers College Press.


Roth, K. J., Druker, S. D., Garnier, H. E., Lemmens, M., Chen, C., Kawanaka, T., . . . Gallimore, R. (2006). Teaching science in five countries: Results from the TIMSS 1999 video study (NCES 2006-011). Washington, DC: National Center for Education Statistics. Retrieved from http://nces.ed.gov/timss


Sarason, S. (1972). The creation of settings and the future societies. San Francisco, CA: Jossey-Bass.


Saunders, W., Goldenberg, C., & Gallimore, R. (2009). Increasing achievement by focusing grade level teams on improving classroom learning: A prospective, quasi-experimental study of Title 1 schools. American Educational Research Journal, 46(4), 1006–1033.


Stigler, J. W., & Hiebert, J. (1999). The teaching gap: Best ideas from the world’s teachers for improving education in the classroom. New York, NY: Free Press.


Stoll, L., Bolam, R., McMahon, A., Wallace, M., & Thomas, S. (2006). Professional learning communities: A review of the literature. Journal of Educational Change, 7(4), 221–258.


Supovitz, J. A. (2002). Developing communities of instructional practice. Teachers College Record, 104(8), 1591–1626.


Tharp, R. G., & Gallimore, R. (1988). Rousing minds to life: Teaching, learning, and schooling in social context. Cambridge, England: Cambridge University Press.


Vescio, V., Ross, D., & Adams, A. (2008). A review of research on the impact of professional learning communities on teaching practice and student learning. Teaching and Teacher Education, 24(1), 80–91.


Zhang, X., Frechtling, J., McInerney, J., Nyre, G., Michie, J., Miyaoka, A., . . . Wells, J. (2008). A Year 4 RETA Report for Effect of STEM Faculty Engagement in MSP—A Longitudinal Perspective (Prepared under contract to the National Science Foundation). Rockville, MD: Westat.


Appendix A


OCE Interview Questions


1. Open with introductions. Review purpose of the study and remind participant of previously signed consent forms. Request permission to record the interview. Ask if he/she has any general questions before we begin.

2. Please think back to your design sessions this year and see if you can recall any specific interactions you had with (LS team name) that you felt were especially productive. Please try to pinpoint a specific memory and context. (Pause for time to reflect.) Please describe the context and walk me through the discussion as you remember it.

3. The big idea or central theme (LS team name) recorded as their focus for this year was (state "big idea"). As far as you recall, what inspired their selection of that idea?

4. What (if any) contribution did you make to the selection or development of the big idea?

5. The driving question (LS team name) recorded for the project was (state "driving question"). As far as you recall, what inspired their selection of that question?

6. What (if any) contribution did you make to the selection or development of that driving question?

Note: For questions 7 and 8, incorporate specific notes and exchanges researchers recorded during design session observations and probe for more detail as part of extended discussion.

7. Next, let's transition to the project lessons and specifically your contributions to the lesson(s) the team developed. What resources or materials did you provide that were essential to this project?

8. Please think back to some of the critical decisions the group made in planning the project lesson(s). In what ways (if any) did you contribute to those decisions? Please walk me through the discussion(s) and your thinking process as you remember them.

9. Can you describe for me the characteristics you believe are necessary for an effective workforce partner?

10. Is there anything else I've left out that you wanted to share with me regarding your partnership with (LS team name)?



Appendix B


LS Teacher Focus Group Questions


1. Open with introductions. Review purpose of the study and remind group of previously signed consent forms. Request permission to record the focus group. Ask if the group has any general questions before we begin.

2. Please think back to your design sessions this year and see if you can recall any specific interactions you had with (OCE's name) that were especially helpful for you. Please try to pinpoint a specific memory and context. (Pause for time to reflect.) Please describe the context and walk me through the discussion as you remember it.

3. The big idea or central theme you recorded as your focus for this year was (state "big idea"). Is that correct? What inspired your selection of that idea?

4. How did (OCE's name) contribute (if at all) to your selection or development of the big idea?

5. The driving question you recorded for the project was (state "driving question"). Is that correct? What inspired your selection of that question?

6. How did (OCE's name) contribute (if at all) to your selection and development of that driving question?

Note: For questions 7 and 8, incorporate specific notes and exchanges researchers recorded during design session observations and probe for more detail as part of extended discussion.

7. Next, let's transition to your project lessons and specifically (OCE's name) contributions to the lesson(s) your team developed. What resources or materials did (OCE's name) provide that were essential to this project?

8. Please think back to some of the critical decisions you made in planning the project lesson(s). In what ways (if any) did (OCE's name) contribute to those decisions? Please walk me through the discussion(s) as you remember it.

9. Can you describe for me the characteristics you believe are necessary for an effective workforce partner?

10. Is there anything else I've left out that you wanted to share with me regarding your partnership with (OCE's name)?


Appendix C


Sample Transcript Coding for Interviews and Focus Groups


Note. Codes are as follows: Perceived Effects = PE (none = PEn; feasible question = PEfq; feasible and valid experiment = PEfve; focus on teaching scientific thinking/process, experimental design = PEtst; connect science to real world/everyday life = PErw; strengthen connection across content = PEac); Facilitative Actions = FA (concrete examples of high interest activities = FAhia; apply pressure to improve question = FAapq; apply pressure to increase rigor = FAapr; apply pressure to increase precision = FAapp; adapt expertise to group needs = FAae; follow-up with research or ideas between meetings = FAfbm; provide access to special resources = FAasr; intentionally foster interdisciplinary connections = FAic; build on existing practice = FAbep); Expanding Horizons = EH (different instructional approach than otherwise might have pursued = EHdia; oriented to key learning goal or fundamental skill related to standards and curriculum = EHokl); General Characteristics = GC (manage logistics = GCml; listening = GCl; flexibility = GCf; accept imperfect conditions = GCaic; generalist perspective = GCgp; imaginative = GCi; genuine interest in teaching, learning = GCgi); Project Conditions = PC (NCTAF introductions = PCni; stability of partner organization = PCspo; stability of school environment = PCsse).


Example from OCE#1 Interview


Question: What (if any) contribution did you make to the selection or development of that driving question?


OCE1: . . . that was something that they already had in mind because they told me that they were working with a company that basically provided a service where they would give you seeds that had been primed in particular ways, and that they were going to kind of build things out from there (PEn).


Question: Please think back to some of the critical decisions the group made in planning the project lesson(s). In what ways (if any) did you contribute to those decisions? Please walk me through the discussion(s) and your thinking process as you remember them.


OCE1: Yeah, so when they brought up the initial idea of they can use this project across multiple classes and different disciplines, I was like okay, well, I thought about the idea of acting like what a scientist would actually do (FAae), and it would go from biology, to technology, to English and math, and the peer review process would be the English component. . . (FAic)  


Question: What was your observation of their response to that?


OCE1: I got a positive response from that. I think that a lot of people don't really know how the scientific process works, and that there is a peer review process for journals, or what a scientist would do . . . I think that by likening what [students are] doing to the scientific process (PEtst), it makes it more relatable (PErw) and more interesting. And so I think that our conversation kind of did that somewhat for the teachers.


Question: And then I have a segment here [from our observation] where you’re saying to them, “You know, I've been looking online, and unfortunately there’s not a lot of support . . . the direct role of lycopene in human health is tenuous.” And so that was part of that discussion right . . . deciding whether to gloss over the ambiguity, or have the students actually dig into it?


OCE1: Yeah, I actually sent them a big long email over the summer (FAfbm), because I didn't know anything about lycopene, and so after I left them I did a bunch of research online (FAfbm),  and . . . I could really see this going several different ways . . . I mean you can either pretend that it doesn't happen to illustrate a concept and a point . . . or you can use it as a critical thinking moment (FAapr) . . . there are practicalities, large class sizes, or maybe the kids aren’t up to that level of critical thinking yet, so I just try to tailor to whatever they think their kids can handle, because they know a lot better than I do (FAae).


Question: Can you talk a little bit more about how their thinking evolved beyond that point?


OCE1: Sure. So I talked to them about it, and they were interested in pursuing the critical thinking aspect of it (PEtst), and so it's from that idea that we came up with the "grade the science" in-class activities. . . . And so the idea was that I would include supplement research . . . and have the kids read abstracts that I've simplified for them (FAasr), and then they determine whether it's a good study or not. And lycopene was one of those . . . lycopene is one of those things that we see on a ketchup bottle and we believe. In fact, I thought there was better evidence for lycopene until I investigated it (PErw).


Question: What else do you draw on in your skill set to try to help them?


OCE1: . . . So if you want to have every single class project be perfect and you're not willing to compromise on anything, I think that would be really tough for you, because sometimes you have to compromise to get the bigger picture (GCaic). And going back to the lycopene thing, I was willing to compromise the critical thinking aspect if they felt that students weren’t up to it, or it wasn't a good use of their time (FAae), I think you have to be willing to let things go (GCf).


Example from NHS Focus Group


Question: . . . How did OCE1 contribute, if at all, to your selection or development of that [big] idea?


T: We had inquired about a program called Tomatosphere where this agency will send out seeds to schools in Canada and the United States and those seeds will have an experimental group and a control group. . . . When we sat down with [OCE1], where she came in was really the development of the program. We had decided that we wanted to move forward with the Tomatosphere project. . . . And so she jumped right in and said, well you know, she could come in and talk with our students about her work and how she goes about designing an experiment. And that lended itself directly to the Tomatosphere project. . . (PEtst)


T: I will also say that she was very flexible in that she came in and completed the presentation on experimental design as well as developed activities . . . with the, you know, average, high-quality or and lower quality scientific work. (FAae)


Question: What resources or materials did [OCE1] provide that were essential to this project?


T: She provided us with the scientific reports and the abstracts (FAasr). And she actually came and she not only provided us with them, she tweaked them so that you know, they could be applicable to high school (FAae). She did a lot with that.


T: Right. She actually developed it and we shared them across classes so they were used in the F&R class that [Miss M] teaches. They were used in [Environmental Science] class (PEac).


Question: So how did your lesson plan for this project compare to the way you would normally approach these curriculum topics and ideas in the past? Has your work with [OCE1] influenced your overall practice. . . ?


T: I’ll just say what I was talking to you about earlier, that question that I was—that discussion I had with [OCE1]. It’s making me reflect and try to figure out ways that I can connect, not just the fundamental conceptual knowledge they get of our science topic, but also like how the science is actually done. . . (EHdia)


T: Just as [Teacher S] mentioned you know, we realized that students had a weakness when it came to the skills and processes of science, that [Maryland State Department of Education] goal and indicator specifically (EHokl).  And so we focused on strengthening them and also, you know, really reiterating that part of the lesson. . . . And so [OCE1] really set up a, you know, kind of a foundation for this project. So not necessarily in terms of the content topic, but for the foundation in experimental design. . . (PEtst)


Question: Anything else that she provided . . . that was really helpful or that you wanted to share?


T: She really seemed like she genuinely enjoyed and got gratification out of working with us and she’s always there at every NCTAF or Stem design session, she’s always come in and she comes in with a smile and comes and sits with us. And she has a whole bunch of other schools that are in the group, but she always manages to spend, I feel like she spends a lot of time with us (GCgi).


T: I didn't feel like she was just assigned to do it.  Some of it may have been like okay here’s this opportunity . . . it might be something that might look good . . . on your resume or if you have to do hours of volunteer work in the community.  It seemed like it was a lot more than that for her . . . she actually took a genuine interest and concern to work with us. And when she’d come to the meetings, she’d have like either an idea, like hey what do you think about this?  Or I know you were talking about this last time, how do you feel about using this resource. . . . So I always look forward to when she was coming (GCgi).




Appendix D


Sample Transcript Coding for Design Session Observations


Note. Codes are as follows: General dialogue with OCE = GD; Reference to OCE while not present = R; OCE provides resources or demonstration = PR; OCE asks question = AQ; OCE offers advice or suggestions for project plans = OA; OCE attempts to establish good will, provide encouragement = GW; Teacher uptake from OCE = U (Short-lived, intermittent episode of uptake not warranting further analysis = slU; Sustained pivotal episode of uptake that might represent expanded horizons of instructional possibilities = sU); Expanding Horizons = EH (different instructional approach than otherwise might have pursued = EHdia; oriented to key learning goal or fundamental skill related to standards and curriculum = EHokl). Facilitative Actions = FA (concrete examples of high interest activities = FAhia; apply pressure to improve question = FAapq; apply pressure to increase rigor = FAapr; apply pressure to increase precision = FAapp; adapt expertise to group needs = FAae; follow-up with research or ideas between meetings = FAfbm; provide access to special resources = FAasr; intentionally foster interdisciplinary connections = FAic; build on existing practice = FAbep).


Example NHS Episode with “Short-Lived Uptake”


OCE1: I know that sometimes . . . you can do thoughtful planting (OA).

T2: Marigolds! (slU begins.)

OCE1: Yeah. You can plant one thing next to another thing and that will cause it to grow better and be less likely to have pests (OA).

T2: Yeah. Marigolds specifically. . . . They exude a type of. . . (slU continues).

OCE1: I can't remember if you want put tomatoes next to onions or if you want put tomatoes next to basil?

T1: Carrots? Yeah, tomatoes and a couple of different things.

OCE1: There are things you can plant with tomatoes . . . that will naturally keep the insects away. . . (slU ends)

T3: I have a question that's off-topic. Since you're from Texas and we're discussing pests, do you know how to get rid of fire ants? My grandmother was just bitten by a fire ant (GD).

OCE1: Yeah. As a kid I actually fell into a fire ant pile and was covered head to toe. It was awful. I don't know anything about organic ways to get rid of them. I just see people spreading poison, unfortunately. . . (GD).


Example NHS Episode with “Sustained Uptake”


T1: I'm working on my inside voice. . . . We have changed to do a seed germination experiment. Instead of determining the soil first . . . we're just going to do the base experiment which is seeing from our two groups . . . how many seeds germinate and how fast they germinate. So we don't need the soil for that. . . . We can still talk about Lycopene and the effects on the brain and so forth. . . (GD).

OCE1: Yeah, I feel like the soil component is independent. . . . So with germination, are you doing the plastic bag with the moist towel? (AQ.)

T1: Right. Exactly.

OCE1: Yeah, I used to do that. . . (GW).

T2: . . . Since most of us have introduced the Tomatosphere project design, the overall purpose to our students . . . I was thinking maybe we could actually have you come in and they could learn like, "Why tomato?"—with the lycopene (GD).

OCE1: Ok. So, like I said in my email (FAfbm), I was looking online for evidence of lycopene and human health, and unfortunately there is not very strong support. Like, there will be one study saying that it kind of helps this and another person can't reproduce it. So the direct role of lycopene itself seems to be pretty tenuous. It doesn't seem to have a great connection to human health. But I think that this could either be like a learning opportunity (summarizes options 1 and 3 from her email). . . Another option is that you can use it as a critical thinking opportunity to have them like maybe look at the evidence, see what there is, and have them decide if it's good evidence (OA; FAapr).

T2: I was going to go with that (sU begins; EHdia begins).

OCE1: It could be a little bit trickier, but it may be rewarding.

T2: Well, I like both things but what I was thinking when you started talking is . . . not just learning facts from a textbook, but actually learning how scientists actually learn the science that we teach in our classrooms. So when you just said that there's not a whole bunch of evidence to say that lycopene is perfect. . . I thought it was good for students to see that, that's it's an ongoing process . . . (sU continues; EHdia continues).

OCE1: Yeah . . . looking at different studies and identifying why they're flawed or why they don't agree with one another is also teaching the material of what evidence there is for lycopene in health but also critical thinking skills (OA; FAapr).

T2: So I thought that maybe, I don't know if you can do this, but give them something and then say, "Does this look like it's reliable data?" (sU continues; EHdia continues).

OCE1: So . . . maybe. I don't know, tell me if this would work in an actual classroom. What if I went to the studies and I looked at the abstracts and . . . if it's super heavy write a simplified format? And then provide a couple of abstracts about lycopene and let's say prostate cancer. And I don't know, maybe the students could read over it . . . and hold up a letter grade for how good they think the study supports it . . . and why do you think it's a great study . . . or why do you think it's a bad study? And you know back it up. I don't know. (OA; FAapr.)

T2: That's a good idea.

T3: I like that idea.

General nodding and "yeah" across the group . . . (sU continues; EHdia continues).

T1: I think it directly relates to what we've been talking about for our writing samples for claim, evidence, reasoning. So we've been recently discussing having students as a goal for the year increase their ability to write a scientific explanation. And the components of a scientific explanation are claim, evidence, reasoning. So if they can actually evaluate a simplified version of the abstract, "Does the information from the abstract match the claim?". . . they're processing through that filter of, "Does this evidence support this claim or not and why?" And then have them do a writing at the end. . . (sU continues; EHdia continues; EHokl).


Appendix E


Email Correspondence From OCE 1


On Tue, Jul 30, 2013 at 11:11 AM, OCE1 wrote:

Hi everybody!


I hope you've had a relaxing and fun summer! Since we met in June, I've been researching the role of lycopene and human health and the evidence is somewhat inconclusive. Here is a link from the Mayo clinic giving a "grade" to the studies of lycopene and the scores aren't great. Interestingly, this article came out in The Atlantic indicating that there is little evidence that vitamins and antioxidants help prevent anything, and sometimes they are associated with increased risk of cancer and death.

The point of these studies isn't that lycopene has no effect, I think the takeaway message is that isolation of one nutrient isn't the way to go. These articles underscore the importance of nutrients as part of a balanced diet (exactly the kind of thing you guys are teaching by growing and cooking the tomatoes), so I thought of three different scenarios:

1. We ignore the role of lycopene, specifically, in human health and just focus on antioxidants in general and use this as an opportunity to teach the balance of antioxidants and free radicals in human health.

2. We give the students the article from The Atlantic and discuss the complexity of recent data as a critical thinking exercise. Current studies are indicating that the problem isn't with antioxidants, but supplementing with antioxidants. For this reason, I don't think this information has to undermine what you are doing with tomatoes, but rather underscore the importance of antioxidants through a balanced diet of actual vegetables (i.e., eating tomatoes vs. taking pills). That being said, this one could be trickier to navigate in the classroom, controversial, and may be more work than it's worth.

3. We press onward and discuss the role of lycopene in human health and focus on the studies that do demonstrate an effect.

I am fine whatever way you all want to proceed and I promise fast turn-around on some PowerPoint slides to help out (if you still want that). If you would like to discuss these possibilities more I would be more than happy to Skype or do a conference call with you guys or we can continue to email back and forth. . .

Thanks and look forward to hearing from you!

OCE1



Appendix F


Grade the Science Handout


Designing a good experiment! Things to think about. . .


How many people did you study? If you only collect data from a small number of people, you may have gotten an unusual batch of people. The more people you collect data from, the more representative your conclusions will be. This is called having a large sample size.


Is there a control group? To compare the effect of a treatment, you need to have a group who hasn’t been exposed to the treatment to ensure that this treatment is responsible for the outcome!


Think about the placebo effect . . . sometimes just knowing you have been treated makes you feel better.


Did the participants in the study get to choose what group they would be in? Sometimes it isn’t possible to assign people to treatment groups (think about a smoking study, we can’t force people to become smokers so they choose their own groups). If it is ethical to assign people to groups, was this done randomly?


If you have a control group, whether the patient or the doctor knows that they are receiving the "real treatment" can influence how well they think it's working. Therefore, the best studies have the patient and the doctor "blind" to whether the drug is real or a placebo. This is called a double-blind experiment.


As you read over these studies, try to identify the independent variable (the treatment or “the cause”) and the dependent variable (the outcome or “the effect”). It’s important that everything we test in a scientific study be able to be measured, so assigning a number to the difference is crucial to making comparisons!


This should get you started, but there are many more things to think about when designing a good experiment so think creatively and always be skeptical!


Appendix G


Email Correspondence From OCE 2


On Fri, Mar 28, 2014 at 11:15 AM, OCE2 wrote:

Hello OCE2

We will have our hypothesis, we are working on the procedures. I will be going to home depot and/or lowes this weekend to get various samples of paint so, that the students can be more specific about the colors and heat absorption/reflection. We must make sure they are understanding absorption vs reflection.(the difference in temperature)

That all sounds good. Did you want me to provide the watercolor paints for them to dissolve in the water? Also, what about the clear, plastic cups, the water itself and spoons to stir the paint/water to help everything dissolve uniformly? Will you have that available? If so, how many students per group do you want to have? If it is 30 students, and we break them up into groups of five, then we will need six separate stations (with what, four cups of water [and graduated cylinders for them to pour the same amount of water with], one for black, one for white, and two for colors you/the students choose? And we will also need at least one thermometer for each group [metric?] and paper towels to clean them off, as well as some sort of stopwatch per group to help them take the temperature in consistent intervals [every two minutes until a maximum is reached where the maximum is two consistent measurements in a row]). Also, do you want to develop the data sheet or would you like me to do that? Please let me know.

Once we gather data, complete conclusion and discussion. We will be able to construct homes made of various materials. (cardboard boxes, plastic, etc.)

Yes, that all sounds good as well. I'll need to come present to the class on design concepts and architecture and teach the basics of drafting/blueprinting so they can design their buildings before they start designing or building anything.

Do you think that a chromatography lab will be beneficial here?

Sure. Would you want to do that this Tuesday?

You will be working with 30 students. (7th graders)  We have them from 7:30–1:25. How much time do you think you will need? You tell me and we will make it happen.

I am not certain because I am still not certain exactly what we will try to accomplish on this Tuesday. In my mind, we would try to do. . .

Introduction to urban heat island

temperature differentials and how to measure temperature

why it matters

green buildings

reflection/refraction

guided discussion on how they might test it

Introduction to the activity

Gathering the equipment and heading outside (it is supposed to be mostly sunny on Tuesday so that will bode well)

I would imagine the actual activity will be at least 45 minutes.

1. Set up cups.

2. Pour water into cups using the graduated cylinder.

3. Measure the temperature of the plain water until all four samples (per group) have the same temperature.

**Is it possible for the water gallons to sit outside in the sun before we head outside so that the water will be a consistent temperature and acclimated to the outdoor climate before we get outside? That will shorten the amount of time we need to be measuring the water. Or, if you want them to have more practice with the measurements themselves, I am happy to bring the water out with us.

4. Dissolve watercolor paint in each cup and stir with plastic spoons.

5. Measure initial temperature.

6. Measure temperature every two minutes of each mixture.

7. Measure temp until two consecutive measurements of each color/mixture are the same or within one degree of one another.

8. Record each set on the data sheet for further analysis.

9. Bring all equipment back to classroom.

I would imagine the above would take at least 45 minutes if we do leave the water jugs out to acclimate to the sun and another ten or so minutes if we bring the water out with us.

So, I welcome your thoughts on how long you think we will need to conduct the activities in everything.

Thank you.

OCE2


Appendix H


Optimal Roof Color Experiment Data Sheet


Names:                                                                                                                                                                                 


Date:       /        /          Starting Time:        :                (Hr:Min) AM / PM (circle one)


Sky Conditions (circle one):  Sunny   Mostly Sunny   Partly Cloudy   Partly Sunny   Mostly Cloudy   Overcast


Clear Resting Water Temperature. Pour 200 mL of water into each cup. Allow water to acclimate to ambient temperature. Take the temperature of the water by placing the thermometer into the water for 20 seconds once every two minutes until two consecutive measurements are within 1°F (one degree Fahrenheit) of one another.


Time: Hr/Min (thermometer insertion) | Sample 1 °F | Sample 2 °F | Sample 3 °F | Sample 4 °F
__:__ | ____ | ____ | ____ | ____
__:__ | ____ | ____ | ____ | ____
__:__ | ____ | ____ | ____ | ____
__:__ | ____ | ____ | ____ | ____
__:__ | ____ | ____ | ____ | ____
__:__ | ____ | ____ | ____ | ____
__:__ | ____ | ____ | ____ | ____
__:__ | ____ | ____ | ____ | ____
Final Temp. (°F) | ____ | ____ | ____ | ____


Dissolve paint samples into each cup of water. One paint sample per each cup of 200 mL of water. Take initial water temperature and then take temperature of each sample (by placing the thermometer into each sample for 20 seconds) once every two minutes until two consecutive temperatures are within 1°F of one another.


Time: Hr/Min (thermometer insertion) | Yellow Sample °F | Black Sample °F | Green Sample °F | Red Sample °F
__:__ | ____ | ____ | ____ | ____
__:__ | ____ | ____ | ____ | ____
__:__ | ____ | ____ | ____ | ____
__:__ | ____ | ____ | ____ | ____
__:__ | ____ | ____ | ____ | ____
__:__ | ____ | ____ | ____ | ____
Final Temp. (°F) | ____ | ____ | ____ | ____





Cite This Article as: Teachers College Record Volume 118 Number 2, 2016, p. 1-48
https://www.tcrecord.org ID Number: 18227, Date Accessed: 10/23/2021 8:38:08 PM


About the Author
  • Bradley Ermeling
    Pearson Research and Innovation Network
    BRADLEY A. ERMELING, Ed.D., is principal research scientist with Pearson Research and Innovation Network and member of a research team from UCLA and Stanford. His current research focuses on educator collaboration, assisting performance, and methods for facilitating teacher reflective practice.
  • Jessica Yarbo
    George Mason University
    JESSICA YARBO, MA, is a doctoral student in the Clinical Psychology program at George Mason University and has worked as a consultant for Pearson Research and Innovation Network on a variety of projects. Her current research focuses on educator effectiveness and utilizing digital technology in the classroom setting.
 