OECD’s Approach to Measuring Global Competency: Powerful Voices Shaping Education


by Susan Ledger, Michael Thier, Lucy Bailey & Christine Pitts - 2019

Background/Context: Adding global competency to the Organisation for Economic Cooperation and Development (OECD) Programme for International Student Assessment (PISA) suite heralds the world’s first large-scale attempt at gauging education systems’ development of students’ global competency. Given the contested definitions and the more than 150 extant instruments whose creators purport to measure global competency or related constructs, it is important to interrogate how influential and privileged global voices such as OECD portray and promote the construct. This new aspect of the PISA battery has the potential to reach 15-year-olds in 80 countries that account for more than 80% of the world economy.

Purpose/Focus of Study: This paper is the first in a series of policy studies, to be conducted at significant points in the implementation process, aimed at mapping OECD’s global competency measure. This initial study examines OECD’s Global Competency for an Inclusive World (GCIW) promotional document to reveal its construction of global competency within discourse and assessment design.

Research Design: The study employs an uncommon mix of interpretive and relational methods. Critical discourse analysis (CDA) captures “how” global competency is portrayed and interrogates the implications of OECD’s language use and power differentials. Social network analysis (SNA) captures “who” is influencing the policy text and examines the connectivity among the authors cited as authorities on the subject of global competency. Greene, Caracelli, and Graham’s (1989) convergence, complementarity, and contradiction framework is used to triangulate qualitative and quantitative datasets. Discussion and recommendations are framed and filtered through a policy implementation lens with five core policy threads: (a) people, (b) place, (c) philosophies, (d) processes, and (e) power, known collectively as the 5Ps (Thier, 2015).

Findings: CDA and SNA findings converge around the people, power, and places most central to OECD’s approach to measuring global competency. The CDA shows that OECD’s construction of the globally competent student reflects a rather narrow philosophical view of the term. The SNA demonstrates the power of the particular academic networks and people that have informed this construction of global competency. CDA and SNA tell complementary stories about OECD’s philosophies and processes. There were minimal contradictions between analytical methods, but the document under review seemed at odds with its own claims at times. For example, the PISA global competency measure seems to privilege particular social and economic ideologies, exercising power through its language in ways that oppose the very definition of global competency that OECD seems to espouse.

Conclusions: By investigating the policy threads (5Ps) embedded in GCIW’s production, the authors of the current study find OECD to have entertained a somewhat limited conversation in developing its definitional and measurement frameworks for assessing global competency. The ensuing critique highlights power differentials and inequalities within the GCIW document, revealing political, social, and technical issues. The current study concludes by challenging policymakers to seek a wider range of voices to inform policy directions as OECD and other influential organizations continue to refine their understanding of global competency, a 21st century imperative that is yet to be fully understood. The current study also offers recommendations, such as continuing critiques of global policy texts and measures from inception through implementation, ensuring that both implications and impacts are captured.



Premised on the growing need for skilled intercultural communicators, the Organisation for Economic Cooperation and Development (OECD) formally announced in May 2016 that the 2018 Programme for International Student Assessment (PISA) would attempt to measure global competency among more than 500,000 15-year-olds. OECD (2016) describes the global competency initiative as the world’s “first, comprehensive overview of education systems’ success in equipping young people to support the development of peaceful, diverse communities” (p. 1). Since PISA’s first administration in 2000, the triennial battery of assessments has become increasingly recognized as a report card on a national system’s ability to educate its students in reading, mathematics, and science (Breakspear, 2012). PISA scores, now armed with a global competency component, can further leverage education policy across 80 participating countries that account for more than 80% of the world economy. More than ever before, PISA advocates might position the suite of assessments as a “global yardstick for measuring success in education” (Schleicher, 2017, p. 123). We situate the current study within this journal’s ongoing critical discussion of the significance and impact of PISA, focusing on OECD’s latest contribution to an assessment that may influence international education policy (see Berliner, 2015; Labaree, 2014; Meyer, 2014; Perry & Ercikan, 2015).


The current study critiques the global competency initiative within OECD’s Education 2030 Framework. It employs disparate but complementary methods to focus on the key messages and the measure used to portray and promote global competency. We use critical discourse analysis (CDA) to critique the potential consequences of OECD’s influence on international understandings of global competency and social network analysis (SNA) to explore relations between the sources OECD consulted to arrive at its understanding of the construct. We begin our program of research in this area by examining Global Competency for an Inclusive World (GCIW; Ramos & Schleicher, 2016), OECD’s core promotional document on the topic. The rationale for interrogating GCIW rests on its importance in promoting and implementing the rollout of the new global competency assessment in the PISA suite. For the vast majority of educators and education researchers, GCIW was their first encounter with the new OECD component. In the 44-page document, OECD attempts to “help shape what young people learn for 2030” (p. 2), and the document is being used to justify the inclusion of the new measure. Given the value that global education sectors ascribe to PISA as a national measure, the document warrants scrutiny.


We recognize OECD as a potent and growing authority in education policy decisions (Meyer, 2014; Sellar & Lingard, 2014); therefore, this paper interrogates sources and possible consequences of that influence. Cautioned by Meyer’s (2014) concern about the OECD’s increasing ability to influence and reshape societies, we explore OECD’s sociopolitical stance in enacting, legitimating, and reproducing policy related to global competency. Consequently, we seek within this study to address the following research question: How does the OECD promote and portray global competency within the discourse and assessment design of the Global Competency for an Inclusive World document?


THEORETICAL FRAMEWORK


A policy implementation lens and a critical theory orientation provide the theoretical framework for this evaluative study, which follows principles of policy analysis theory (see Thier, 2015; Ball, 2015; Lingard, Martino, Rezai-Rashti, & Sellar, 2015; Vidovich, 2007). Through the 5Ps, a policy lens that Thier (2015) uses to examine people, philosophy, power, process, and place, the current study interrogates what the GCIW document reveals about how OECD portrays and promotes global competency in the 2018 PISA administration. The 5Ps help identify GCIW’s authoritative voice(s) and targeted audience(s), its philosophical approaches to operationalization in the new measure, and the implementation processes expected of schools and communities to enact requisite changes.


To some scholars, concepts such as global competency, global citizenship, or what the International Baccalaureate calls international mindedness are culturally relative. To that end, some scholars argue that such concepts are so culturally relative that they cannot be treated as measurable constructs (e.g., Abdi, 2011). Other camps of researchers, educators, and organizations have become keen on the possibility of measuring these coveted educational outcomes (see Deardorff, 2015). Correspondingly, a range of assessment tools exists in these conflicting and potentially overlapping arenas (Deardorff, 2015; Ledger, Vidovich, & O’Donoghue, 2015; Schleicher, 2017; Thier, 2017), including but not limited to the Global Competence Aptitude Assessment, Intercultural Competencies, the Intercultural Knowledge and Competence Value Rubric, the Global Competencies Inventory, the Global Perspectives Inventory, and many others. OECD’s recent commitment to measuring global competency is evidence of a changing emphasis from measuring only traditional disciplinary, curriculum-oriented subjects such as literacy and numeracy to measuring affective domains of education (Zhao, 2015). This study highlights current literature that informs the OECD position.


Therefore, the significant addition of global competency to the PISA suite requires close scrutiny to determine what governs the design and development of the assessment. The GCIW document’s political, social, cultural, and technical issues warrant careful review because any measure carrying the OECD and PISA brand has the ability to inform curriculum, pedagogy, student experience, and outcomes across a global range of diverse contexts and cultures. In order to critique the power inequities and the political, social, and cultural ideologies embedded in OECD’s GCIW document, the authors of the current study employ a critical theory orientation with a policy perspective lens. In the following section, we introduce key elements embedded in the current study’s theoretical underpinnings.


PEOPLE, POWER, PHILOSOPHIES, PROCESSES, AND PLACES


Ball (2015) argues that policy should not be conceptualized as a fixed entity, but that we should examine the trajectory of policy across contexts of influence, contexts of policy text production, contexts of practice, and outcomes. Likewise, Vidovich (2007, 2013) adds the notion of levels to capture global-to-local contexts. Drawing on both of these insights, Thier (2015) developed a framework in which the 5Ps are interdependent threads that cut across contexts and levels as policy is devised and implemented. Examining the 5Ps (people, power, philosophy, processes, and place) offers a framework for understanding significant policy enablers and constraints within the policy process.


In the context of the current study, people relates to the significant policy actors who influence OECD’s text and the design of the assessment tool, as well as the members of the audience for whom the document is intended. Casting people in this manner supports assertions from O’Donoghue (2007), who shows that people continually shape and are shaped by the dynamics of the social practices around them. Regarding power, the current study explores inequities embedded in the text and the power of language that provides agency for the OECD and, in so doing, allows individuals to put their stamps on policy texts (see Marginson & Rhoades, 2002). Considering philosophy, the current study refers to the beliefs and underpinnings embedded in the global competency discourse. Although not a new construct, notions of global competency continue to evolve and to be problematized by policy enactors who reconceptualize and reinterpret them within given contexts (Thier, 2015). Concerning processes, the current study refers to the ways in which the policy text, philosophies, and power are represented and are planned to be implemented. In terms of place, the current study refers to the local settings in which the new PISA assessment will be implemented and the global context in which the OECD policy and assessment tools are shaped. In summary, the 5Ps enable a critique of GCIW by providing an analytical tool to synthesize findings from our various methods. In the next section, we discuss how definitions and attempts to measure global competency remain contested territory.


DEFINING AND MEASURING GLOBAL COMPETENCY


Contested definitions of global competency complicate any interrogation of how this concept is portrayed and promoted (Deardorff, 2015; Ledger, Vidovich, & O’Donoghue, 2015; Singh & Jing, 2013), whether by OECD or any other powerful voice in the multifaceted global education arena. Many scholars’ conceptualizations of global competency cast wide nets, such as “the capacity and disposition to understand and act on issues of global significance” (Mansilla & Jackson, 2011, p. xxiii). Global competency, global citizenship, global mindedness, internationalization, and related social constructs have become embedded in the education policies and discourses of international organizations, national school systems, and individual classrooms (Thier, 2015). Within this global arena, the OECD has become a privileged voice joining other giants such as UNESCO, Pearson, and Springer (Junemann, Ball, & Santori, 2016). These organizations exercise worldwide influence over diverse policy contexts ranging from early childhood education through the tertiary sector and teacher preparation. As Meyer (2014) suggests, it is timely to listen closely and critically to such influential voices, in search of both harmonies and dissonance.


From a measurement standpoint, the definitional diffusion that surrounds global competency presents even greater concerns, having yielded more than 150 extant instruments whose creators purport to measure global competency or related constructs. Attempts to tap into such ambiguous psychological terrain have bedeviled measurement experts for nearly a century (Thier, 2017; Türken & Rudmin, 2013). For example, Likert (1932) developed his well-known scale to measure attitudes toward internationalism as a portion of his dissertation more than 85 years ago. Still, measurement debates swirl about the operational boundaries of dozens of potentially overlapping constructs that have been named haphazardly (see Fantini, 2009; Singh & Jing, 2013). Consequently, many unanswered questions surround the interpretations and policy implications of what OECD’s measure might tell the world about global competency (Deardorff, 2015; Ledger, Vidovich, & O’Donoghue, 2015) and globalization writ large.


GLOBALIZATION AS A CONTEXT


Depictions of the current wave of globalization present a multifaceted phenomenon influenced by profound technological advances, innovation, and policy decisions. For some scholars, globalization is about modernization and change, not a “people exercise” (Gupta, 2003, p. 353). By contrast, OECD’s focus on global competencies centers the individual learner in the discussion. This focus shifts attention to people and, potentially, to agency, rather than sustaining debates about structures and processes. In shifting the debate, OECD’s focus echoes the work of Zhao (2010), who connects interpersonal relationships to global competencies.


Meanwhile, educational ideologies, philosophies, and hegemonies trickle around the globe through attempts to internationalize testing regimes and corresponding curricula (Law, 2004; Meyer, 2014; Papastephanou, 2015). Thus, technologies that accompany globalization yield “hybridized curriculum policies being produced in different national contexts as a result of the intersection of international, national and local forces” (Winter, 2012, p. 295). Forces of globalization—such as the PISA testing regime—are, in turn, “felt” locally (Reid et al., 2009, p. 390). Therefore, it seems clear that one can conceive of PISA’s new battery of assessments as a globalizing tool that will inevitably impact policies and practices in both national and local spaces (Lingard et al., 2015; Meyer, 2014; Schleicher, 2017).


CRITICAL THEORY


Critical theory underpins this policy investigation because of its focus on power and privilege: crucial considerations given OECD’s role as a global voice in education (Meyer, 2014). Critical theory concerns itself with knowledge creation, acquisition, and communication (Scotland, 2012). The current study adheres to a sociocultural paradigm within which humans in the new millennium experience intersecting social and cultural events such as unprecedented population migrations, surging climatic issues that span national borders, and rising global terror networks, to name a few. Meanwhile, powerful global voices underwrite the rising momentum and significance of ideologies and hegemonies that advocate for international testing regimes to satisfy demands for competencies that have been dubbed essential for a good, global life in the 21st century (Ramos & Schleicher, 2016).


Attention to the relationship between language and power, and to “how” subjects are delineated and defined, is a hallmark of critical theory (Carr & Kemmis, 1986; Habermas, 1981). Language and power also underpin critical discourse analysis (Fairclough, 2010). Language typically features two levels of power: power as language and the language of power (Weiß & Schwietring, 2017). Regarding the latter, language can be instrumentalized to exercise power through persuasion (see Goethe Institute, 2017). Therefore, theory that embraces social and cultural facets, such as critical theory, is necessary to help make meaning of texts and discursive practices that contribute to the creation and reproduction of unequal power relations (Jorgensen & Phillips, 2002).


Finally, critical theory allows the authors of the current study to interrogate the OECD policy position on global competency by deconstructing and critiquing the core text and exploring binaries, which we find embedded in the 5Ps. We use people, philosophies, power, processes, and place as an organizing lens for viewing and analysing our findings (Thier, 2015). These key policy threads align with our disparate methods, critical discourse analysis (CDA) and social network analysis (SNA): analytical tools that we use to address issues of power, people, and processes impacting policy text production (Fairclough, 2010) and to acknowledge the power of language within the construction of policy documents (Ball, 2015). Framed through the 5Ps, our CDA and SNA findings enable us to capture multiple perspectives embedded in OECD’s new measure, in turn adding rigor to our critical analysis. We describe those methods in detail in the ensuing section.


METHOD


Some purists assert that researchers should not combine methodological paradigms (McMillan & Schumacher, 2006); however, mixed methods remain popular in the applied social sciences (Cameron, 2008). For the current study, we chose an uncommon mix of complementary methods, combining critical discourse analysis (CDA) and social network analysis (SNA). In so doing, this paper joins a very small number of studies across the social sciences that point to the potency of joining these two methods. For example, Ryu and Lombardi (2015, p. 76) have argued that:


Whereas CDA provides an interpretation for why and how something happens in engagement with critical reflection, SNA visualizes what is happening in relationships through the flow of available artifacts and knowledge, which is not otherwise readily discernable.


One of the clear advantages of this infrequently employed mix of methodologies is our descriptive use of the quantitative method, SNA, to explain social attributes of relationships and patterns among individuals (Wasserman, 1994). More typically, quantitative methods are used to test hypotheses inferentially. In the current study, the qualitative understandings from CDA marry more naturally with this descriptive approach to SNA.


Furthermore, the combination we employ moves beyond simple mixed methods description and critique toward strategic eclecticism (Roberts & Green, 2013), providing possible solutions and recommendations for the phenomena under investigation. The methods complement the study and one another, as they provide opportunities to explore what the OECD portrays as global competency and how it promotes the construct. By showing how our qualitative and quantitative findings converge with, complement, and contradict one another (see Greene, Caracelli, & Graham, 1989), we can interrogate and critique distinct aspects embedded in a crucial OECD policy document. The CDA component interrogates implications of OECD’s language use and assumed compatibilities or oppositions. The SNA component explores GCIW’s position within a global context where large organizations facilitate the transmission of the ideologies of particular scholars and associated groups. Consequently, we offer evidence to critique the OECD’s role in the social, cultural, and economic dimensions of marketization, globalization, and commodification of global education and competencies. In the ensuing sections, we report our research design, data source, and analytic techniques.


RESEARCH DESIGN


We used a convergent, parallel mixed methods design (Creswell & Clark, 2011), dividing our international team of scholars to conduct two concurrent, independent processes: CDA on the GCIW text and SNA on GCIW’s reference list. Each half of the team provided findings from their respective methodological strand, which the other half reviewed. After iterative reviews, we compared and merged findings, according equal emphasis to CDA and SNA during the development of meta-inferences (Teddlie & Tashakkori, 2009).


Seeking to interrogate GCIW through various types of methodological expertise and cultural/experiential backgrounds, we formed a diverse team representing research institutions in Asia, Oceania, and the Americas. We embraced CDA as an established approach to explore connections between language and power, and SNA as an evolving tool for understanding power relations between speakers of language. CDA provides insight into how language is used to shape social practice, which both reproduces and changes knowledge, identities, and social relations. CDA facilitates interrogation of the underlying assumptions and oppositions that a document establishes. SNA shows social connections of many sorts, displaying them as graphs that depict nodes (i.e., actors) and edges (i.e., ties). Ties can explain scientific developments within research organizations, businesses, and national initiatives (Abbassi, Altmann, & Hossain, 2011). We applied this principle to investigate academic connections among authors who report on global competency. Triangulating CDA and SNA thus enables a more nuanced understanding of relations between speakers, what they speak, and the potential systematic inequalities that might develop or calcify as consequences of such speech. In keeping with a typical goal of triangulation, we sought to offset each approach’s limitations and associated biases. We therefore strengthened the validity of our findings by applying these disparate methods intentionally, independently, and simultaneously to the same conceptual phenomenon, creating the potential for corroboration (Greene & McClintock, 1985).


We concluded our analysis by triangulating our data using Greene et al.’s (1989) convergence, complementarity, and contradiction framework to combine findings for the purpose of inference-making. In addition to that framework, we also followed recommendations from leading mixed methodologists (e.g., Creswell & Clark, 2011; Greene et al., 1989; Onwuegbuzie & Teddlie, 2003) to strengthen our study’s conclusions further by:


(1)

specifying an organizing lens and framework for viewing findings and drawing conclusions, i.e., Thier’s (2015) 5Ps: people, power, philosophies, processes, places;

(2)

drawing data for both methods from the same source (i.e., GCIW);

(3)

addressing the same research question with both methods;

(4)

providing a joint display that featured qualitative data (e.g., quotations) to show areas of alignment with quantitative findings (see Table 1); and

(5)

conducting iterative monthly team meetings to evaluate processes/progress and proactively address any methodological divides.



Table 1. Critical Discourse Analysis and Social Network Analysis Findings with 5P and 3C Frameworks

People, Power, and Places (Thier, 2015); Convergent (Greene et al., 1989)

Critical discourse analysis: International community of experts; OECD nation status; preference for free-market capitalism; a Western notion; assumptions about ideal citizens and their experiences, all of which conjure privilege; young people interact with “other[s]” for their own benefit.

Social network analysis: Overall importance of brokers; five dominant-culture representatives from the United States or the United Kingdom.

Philosophy and Processes (Thier, 2015); Complementary (Greene et al., 1989)

Critical discourse analysis: Global competency reacts to a changing world and does not call to change the world; assessment should feature quantitative and qualitative indicators and comprise knowledge, skills, and attitudes (no discussion of strengths, limitations, and trade-offs); vague means to develop students (teacher preparation; coursework in science, technology, sustainability, language, and history; pedagogies such as group work and interdisciplinary approaches); schools and families cause discrimination, and education can solve those ills.

Social network analysis: Voices informing OECD thinking are published in multiple modes (e.g., Deardorff, whose work had the most novel ties to other authors). Nodes 69 (Torney-Puerta) and 12 (Byram) have similar structural positions; in addition to representing the government agency community, Torney-Puerta also published with the peer review community, and Byram published with the government agency communities. Denser networks can move resources more quickly than networks with fewer ties (Daly, 2010), so one might posit that these authors’ publication modes might be more efficient at commonly initiating change or ideation if they were more shared.




DATA SOURCE


The GCIW document was the sole data source, one chosen deliberately as OECD’s starting point to promote the “new, ambitious and still experimental approach to” global competency (Ramos & Schleicher, 2016, p. 2). GCIW presents OECD’s conceptualization of requisite knowledge, skills, attitudes, values, and competencies for life in 2030. To ease understanding for readers who have not yet encountered GCIW, it may be helpful to summarise its underlying precepts for developing its concept of global competency:


(1)

Modern: Traditional disciplinary curriculum should account for knowledges and understandings germane to the 21st century.

(2)

Non-discriminatory: Skills, attitudes, and values that shape human behavior should counter discriminatory behaviors learned at schools and within families or communities.

(3)

Reflective: Learning should foster abilities for reflective processes to help one learn best.

(4)

Targeted: Each learner should strive to achieve a small set of key competencies (e.g., acting autonomously) with competency signaling the ability to mobilise knowledge, skills, attitudes, and values, enabling one to engage with and act in the world.


OECD visualizes these points and its construction of global competencies in Figure 1.



Figure 1. OECD’s 2030 Framework (work in progress as presented in GCIW).







ANALYTICAL TECHNIQUES


We summarize our qualitative, quantitative, and mixed methods techniques in Table 2, but detail our procedures for conducting CDA and SNA, as well as our approach to combining findings from those methods, in the ensuing sections.



Table 2. Analytical Techniques for Qualitative, Quantitative, and Mixed Methods

Critical discourse analysis. Purpose of use: reveal assumptions and implications of OECD’s language and assumed compatibilities or oppositions in GCIW. Key domains: definitions (how OECD describes a future world); targets (who and/or what must change for global competency to be achieved); processes (methods schools can use to promote global competency); authority (locus of control for defining and measuring global competency); measurement (how global competency can be measured).

Social network analysis. Purpose of use: explore the interconnectivity and voice embedded in the OECD. Key domains: network density (the number of ties in a network as a proportion of its total possible number of ties); communities (groups of homogeneity within a network); brokers (the potential of a node to mediate relations within the network).

Triangulation. Purpose of use: develop a sequential collection of data that allows elaboration on an idea and expands data sets to enable more inquiry. Key domains: convergence (correspondence between data sets); complementarity (data sets overlap but offer distinct insights into a phenomenon); contradiction (seeking opposing or incompatible results through iterative comparisons).


Critical Discourse Analysis (CDA). We conducted CDA to reveal assumptions and implications of OECD’s language, as well as the assumed compatibilities or oppositions that the GCIW document invokes. Fairclough (2010) points to discourse as an important form of social practice that both reproduces and changes knowledge, identities, and social relations, including power relations. Fairclough highlights three dimensions for CDA: discourse as texts, discursive practice (text production and consumption), and social practice. We focused our CDA on highlighting GCIW’s assumptions and silences regarding knowledge, identity, and relationships. We identified themes using a mixture of grounded theorizing and the CDA concepts of knowledge, identities, and assumed relationships (Fairclough, 2010). Two members of the research team identified codes independently. Their results revealed major commonalities and a few initial points of difference; following discussion among coders, those points of difference were resolved through double-coding. The agreed-upon framework features five themes that correspond to the 5Ps and guided our CDA process:


(1)

Definitions (philosophies): how OECD defines key terms, with particular attention to global competency, its conflation with other terms, and how OECD describes a future world that features global competency;

(2)

Targets (people and places): claims and assumptions about who and/or what must change for global competency to be achieved, or for global future prospects to improve;

(3)

Processes (processes): claims about methods that schools can use to promote global competency among their students;

(4)

Authority (power): claimed locus of control for defining and measuring global competency; and

(5)

Measurement (philosophies): how global competency can be measured


Social Network Analysis (SNA). We used SNA to explore the interconnectivity of voices embedded in GCIW. We coded authors whose work OECD cites in its promotional document, excluding publications that do not pertain to global competency (e.g., general topics of assessment such as rubric design or anchor-paper writing). Retaining 99 authors, we coded article–author pairs for type of publication: peer reviewed, university center/press, government agency, other journal, or popular press. If OECD cited multiple publications from an author, we aggregated that author’s publications into a single row to facilitate article-author pair analyses.
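To illustrate this coding and aggregation step, a minimal sketch in R follows; the file name, column names, and use of the dplyr package are illustrative assumptions rather than a record of our actual materials.

```r
# A minimal sketch of the coding and aggregation step described above.
# File and column names are hypothetical placeholders, not our actual data.
library(dplyr)

# Each row is one article-author pair, coded for publication mode:
# "peer_reviewed", "university_center_press", "government_agency",
# "other_journal", or "popular_press". Publications unrelated to global
# competency were already excluded during coding.
pairs <- read.csv("gciw_article_author_pairs.csv", stringsAsFactors = FALSE)

# Collapse multiple publications per author into a single row, retaining
# the set of publication modes in which that author's cited work appeared.
authors <- pairs %>%
  group_by(author) %>%
  summarise(modes = paste(sort(unique(mode)), collapse = "; "),
            n_publications = n(),
            .groups = "drop")
```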


Using R to visualize the network, we examined three descriptive aspects of the network: (a) network density; (b) communities within the network; and (c) brokers within networks. Network density reflects process in the current study, or the extent to which authors that OECD cites have previously published their global competency-focused work in similar or dissimilar publication modes. Less dense networks reflect greater heterogeneity of authors (Daly, 2010). Communities are dense groups of nodes that share attributes, creating cohesive subgroups (De Nooy, Mrvar, & Batagelj, 2011). Communities reflect place in the current study, representing forms of homogeneity that can be identified by analyzing paths between nodes as direct or indirect. Brokers reflect people in the current study, occupying crucial positions in a network by spreading or retaining information strategically as they exert control over information diffusion. Therefore, we examined how single nodes might broker between other nodes (see Burt, 2004).
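For readers interested in the mechanics, the sketch below shows one way these three descriptive aspects could be computed in R with the igraph package. The object names (e.g., an author-by-author edge list called edges) are illustrative assumptions, and the exact tie definition follows the operationalization described above, so this is a sketch rather than a reproduction of our script.

```r
# Illustrative sketch of the descriptive network measures, using igraph.
# "edges" is assumed to be a two-column data frame of author-author ties.
library(igraph)

g <- graph_from_data_frame(edges, directed = FALSE)

# (a) Network density: actualized ties as a share of possible ties
edge_density(g)

# (b) Communities: edge-betweenness community detection and its modularity
comms <- cluster_edge_betweenness(g)
modularity(comms)
membership(comms)

# (c) Broker-related measures: centrality and Burt's constraint per node
node_stats <- data.frame(
  degree      = degree(g),
  closeness   = closeness(g),
  eigenvector = eigen_centrality(g)$vector,
  betweenness = betweenness(g),
  constraint  = constraint(g)
)

# Nodes with high betweenness and low constraint are candidate brokers
head(node_stats[order(-node_stats$betweenness), ])
```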


Triangulation. After conducting CDA and SNA as independent processes, we triangulated results in accordance with Greene et al.’s (1989) purposes for mixing methods:


(1)

triangulation to detect convergence or correspondence between data sets;

(2)

complementarity, where data sets overlap but offer distinct insights into a phenomenon;

(3)

development, in which sequential collection of data sets allows elaboration of initial ideas;

(4)

initiation that seeks contradictions deliberately through iterative comparison of results; and

(5)

expansion, whereby additional data sets enable further enquiry.


At this phase of our program of research, we could neither undertake sequential analysis (No. 3) nor identify additional data sets to mine (No. 5). Instead, we compared findings elicited from the CDA and SNA according to whether they converged with, complemented, or contradicted one another. We interrogated these findings further through the theoretical lens of the five key policy threads (5Ps): people, philosophy, processes, power, and place.


FINDINGS


In this section, we report findings in response to our research question using the 5Ps as an organizational and theoretical lens. We first report themes that emerged from critical discourse analysis (CDA). Second, we report network density, communities, and brokers as descriptive findings from our social network analysis (SNA). Last, we show areas of convergence, complementarity, and contradiction arising from the mix of CDA and SNA findings.


CRITICAL DISCOURSE ANALYSIS (CDA)


The coding themes that emerged related to definitions, target audience, processes, authority, and measurement. We connect these findings to the 5Ps below and summarize them to reveal the messages that OECD portrays regarding global competency in its GCIW document.


Definitions (philosophies). GCIW defines global competency as a response to a changing world, ignoring students’ abilities to change that world. The document posits global competency as a quantifiable concept that can be tested within an hour using an assessment that comprises knowledge, skills, and attitudes. GCIW uses the image of a braided rope to demonstrate knowledge, skills, and attitudes as three intertwined domains that can produce the strength of a globally competent individual (Figure 1). The document states clearly that global competency involves more than maximizing economic gains of social change. Instead, “It will cover quantitative and qualitative indicators, including subjective well-being and quality jobs. It will ensure that the benefits of growth are fairly shared across society” (p. 1).


GCIW envisages a future world of “unprecedented challenges and opportunities” (p. 1), creating an imperative to change education such that it can prepare children for the coming world. GCIW alludes to prospective changes in technology, migration patterns, inequalities (widening), and job security (shrinking). Overall, GCIW concludes that: “If young people are to co-exist and interact with people from other faiths and countries, open and flexible attitudes, as well as the values that unite us around our common humanity, will be vital” (p. 1).


Targets (people and places). GCIW presumes that young people need to change to meet the world of the future, rather than suggesting that we should work systemically to construct alternative futures to meet the needs of the young. For example, the document initially asserts: “This generation requires new capacities” (p. 1). Thus, young people must change to fit the predicted world, eschewing a world that should be changed to fit people. GCIW acknowledges drawbacks of globalization, which is, nevertheless, assumed to be an irresistible wave of change.


Targets (people and places). GCIW offers token examples from three continents, Africa, Asia, and South America, to support its claim that global competency is important in many countries across the world. In this way, OECD claims all young people across all cultures under an umbrella of those who need to acquire the relevant competencies. Although GCIW concedes that global competency has developed as a preoccupation of Westernized nations, the document suggests that global competency resonates elsewhere, albeit under different names. For example, “Ubuntu” is claimed to be the similar term used in South Africa, but this concept is only mentioned in passing; it is tokenistic, rather than reflective of a genuine commitment to reconceptualise global competency in less Western terms. In pursuing the development of global competency, GCIW posits education as a reactive endeavor. With the predicted changes taken as fixed, education’s role is to adjust young people to change, rather than direct it. Twice in the document, OECD uses a jigsaw image to capture how globally competent individuals can fit into the world of the future, a metaphor that calls one to conform to an allotted space. In this metaphor, global competency provides the ability to act with such conformity.


Developing global competency becomes necessary, according to GCIW, “to counter the discriminatory behaviours picked up at school and in the family” (p. 2). Curiously, GCIW holds these two social institutions responsible for discrimination. Therefore, those institutions are cast as requiring change, rather than any others (for example, popular media or political discourse). Even taking the stance that global competency can be measured (a stance we pursue further in a later section on measurement), it is not clear what observers can infer from such measurement. On p. 19, GCIW suggests that its global competency measure assesses the effectiveness of curriculum and teaching methods. On p. 3, GCIW suggests the measure assesses the effectiveness of teacher training. The document does make clear that the assessment tool does not aim to measure the effectiveness with which societies are preparing young people for the predicted future, although related commentary from key OECD voices suggests that societal claims fit squarely within the bailiwick of PISA interpretations (see Schleicher, 2017).


Processes (processes). GCIW makes specific claims about the processes by which students can achieve global competency, but devotes little space to them (only 2 of 44 pages, and then at the end of the document), resulting in a necessarily superficial treatment. In its introduction, GCIW argues that curricula should “be comprehensive, interdisciplinary and responsive to an explosion of scientific and technological knowledge” (p. 1). However, this claim is neither analysed nor elaborated. The concluding section describes what teachers and education systems can do to instill global competency in students. Suggestions are as wide-ranging as offering electives in sustainable development and using groupwork as a pedagogy, whilst language and history teaching are both singled out as offering especial leverage in developing global competency. Yet the document offers no evidence that such interventions are likely to be effective. GCIW identifies teacher education as crucial for introducing this topic effectively, suggesting that teachers themselves must be globally competent before they can promote this competency amongst their students. But the document expands little on this topic.


Authority (power). GCIW makes three broad claims about authority. First, the document attributes to “the international community of experts” (p. 2) the concept of global competency and the measures to be taken to achieve it. However, the nature of such an assumed community is ill-defined. For instance, whilst this seems to be equated with the academic community of researchers, the works cited are often extremely dated and usually Western (e.g., a 1981 survey of U.S. college students is used as the basis for the assessment strategy). Second, in a self-referential move, the document describes prior consultation with OECD members—35 nations that dominate the upper ends of distributions for global economic indicators. Third, on a number of occasions, GCIW invokes the authority of the free market. For example, GCIW claims “employability” as an imperative for change.


Measurement (philosophies). GCIW pays considerably more attention to measuring global competency than promoting it. The document cites validated academic scales to measure related concepts (e.g., Global Understanding Survey and the International Civic and Citizenship Study), but relations between these different concepts remain unclear, including whether, in fact, they overlap. Furthermore, GCIW makes several claims about the measurement of global competency, including that it should examine a student’s ability to analyze cultural stereotypes through the use of case studies and critical incidents. However, the document offers no evidence that such an approach would reliably produce data to inform valid inferences. Potential problems with self-reporting are acknowledged, but not addressed. For a discussion of problems of self-reporting in measures of a related construct (global citizenship), see Thier (2017).


Items in the proposed assessment tool and scale offer further insights into OECD’s operationalization of global competency. According to GCIW, a globally competent person feels confident and happy about traveling to other countries, implying that if one hails from a background where this is not a norm, and feels apprehensive about such new experiences, one is not globally competent. The globally competent person seems to have had a range of experiences that would not necessarily be available to students from lower socioeconomic backgrounds: having savored international cuisines or interacted with people from various countries. Seemingly by definition, one who lacks access to the types of experiences that might create such confidence or comfort would somehow be globally incompetent.


Ultimately, GCIW rests on unstated assumptions about the ideal citizen and the kinds of experiences they should have had, all of which depend upon affluence and privilege. The ideal globally competent student has money to donate to charity, has a home in which they can host exchange students, has met people from many countries, and goes to a school that is able to offer exchange programs. These variables essentially describe the habitus of a global elite, making it hard to see how a child from a lower socioeconomic background and/or an attendee of a poorly funded local school could possibly score well on this scale. These findings raise a philosophical debate about the inherent binaries in OECD’s construction of global competency (e.g., rich/poor, homogeneous/heterogeneous, global/local, lifestyle/survival, travelers/non-travelers). We provide further detail of the CDA phases and findings in Table 3.



Table 3. CDA Overview of Findings

Phase 1. Identified CDA structure: Section title, page, key words, Theme 1, Theme 2, boxed information, questions/issues raised, curriculum, references.

Phase 2. Extracted key words per page and section: Section titles (11), pages (41), modules (25), key words (94); Theme 1: audience (5), targets (10), definitions (9), methods (6), authority (4), measurement (9); Theme 2: boxed information (14), questions/issues raised (39), curriculum (4), references (7).

Phase 3. Identified sections: Introduction, definition, outline, cognitive test, skills & attitudes, self-reporting, openness, values, references, annexe questionnaires, experts.

Phase 4. Identified key words: Globalisation, entrepreneurship, tension, cultural biases, GC, technology competencies, common grammar & language, cultural and gender stereotypes, complex, tangible, perspectives, knowledge, understandings, skills, dispositions, attitudes, values, dimensions of assessment, descriptive analysis, culture, change, dynamic affiliations, intersections, global awareness, world issues, case studies, performance standards, catering for country or subpopulation level, self-reporting, flexibility, empathy, openness, accountability, schools’ role to integrate globally, languages, teacher education, professional learning, racism & discrimination.

Phase 5. Coded and identified recurrent themes: Audience, targets, definitions, methods, authority, measurement.

Phase 6. Subthemes and elaboration: Globalisation, entrepreneurship, tension, cultural biases, technology competencies, common grammar & language, T&L cultural & gender stereotypes, perspectives, knowledge, understanding, skills, disposition, attitudes, values, intercultural competence, global competence, global citizenship, identity, relationships, context, culture, change, dynamic, cultural affiliations and intersections, testing knowledge & understanding, catering for country or subpopulation level information, self-reported information, proficiency in a foreign language, communication skills, responsibility, analysis, courage, stance, collective, accountability, teacher education, professional learning, racism, discrimination, integrated curriculum, cooperative opportunities.

Phase 7. Critique of boxed information and visuals: 2030 framework; 4 propositions; 7 policy questions & TALIS; perspectives on GC; Figure 1 dimensions of GC; 3 definitions of culture; 3 knowledges & skills; review of cognitive assessment; assessment in PISA; attitudes, openness, respect for cultural otherness, global mindedness, responsibility.

Phase 8. Constructs from Annexe Modules 1–25: Foreign language reading; number of foreign languages; openness & flexibility; intercultural openness; perspective taking; resilience & emotional strength; intercultural resilience; intercultural communication; self-efficacy regarding global issues; geographical cultural identity; contact with people from other countries; friends from other countries; students’ engagement with others; school climate, multicultural, egalitarian acceptance by the teacher; attitudes towards immigrants; resilience; trends in the curriculum.


SOCIAL NETWORK ANALYSIS


We used SNA to reveal who informed the way OECD portrays global competency. In this section, we first describe the author network’s density and explore the communities that exist within the author network. Second, we identify the network’s brokers as specific nodes of connectivity lying between two or more communities.


Network Density (process) and Communities (place). We provide a visual representation of our network’s density, which is also a measure of cohesion (see Figure 2). This figure depicts how authors in our sample employ publication modes. Of 2,628 possible ties, this network actualizes 657 ties between authors (25.0%); thus, connections reach one quarter of a completely linked network. We detect five communities, one per publication mode, based on the edge betweenness algorithm. The modularity value (0.52) reveals the communities to be moderately dense. To examine the structure of those communities, we report three measures of centrality (degree, closeness, and eigenvector) and average constraint scores for each community in Table 4.
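As a simple arithmetic check on the figures reported above (a sketch using only those numbers):

```r
# Density: actualized ties as a share of possible ties.
# For an undirected network of n nodes, possible ties = n * (n - 1) / 2.
657 / 2628
#> [1] 0.25
```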



Table 4. Author Centrality and Constraint Scores for the Total Network and by Community Type

Network: Degree = 18.73 (8.78); Closeness = 0.007 (0.00); Eigenvector = 0.40 (0.43); Constraint = 0.22 (0.07)

Government agency et al.: Degree = 28.71 (5.48); Closeness = 0.01 (0.00); Eigenvector = 0.45 (0.44); Constraint = 0.13 (0.02)

Peer reviewed: Degree = 13.92 (3.33); Closeness = 0.007 (0.00); Eigenvector = 0.47 (0.45); Constraint = 0.25 (0.03)

Other journal: Degree = 13.77 (2.77); Closeness = 0.007 (0.00); Eigenvector = 0.26 (0.38); Constraint = 0.25 (0.03)

University center/press: Degree = 12 (0.00); Closeness = 0.007 (0.00); Eigenvector = 0.45 (0.45); Constraint = 0.28 (0.00)

Popular press: Degree = 10 (0.00); Closeness = 0.006 (0.00); Eigenvector = 0.26 (0.39); Constraint = 0.32 (0.00)

Note. SDs in parentheses.




Figure 2. Social Network Analysis

 






Within the community of OECD-cited authors who had published via government agencies (n = 27 yellow nodes in Figure 2), we see the greatest influence on the overall network, based on three of four indicators. As a community, government agency nodes have the highest average degree (M = 28.71; SD = 5.48) and closeness (M = 0.01; SD = 0.00), the second highest eigenvector (M = 0.45; SD = 0.44), and the lowest constraint (M = 0.13; SD = 0.02). Next, the community of authors who had published via peer-reviewed journals (n = 12 green nodes) shows the second greatest influence. Although the peer-reviewed journal community has produced the highest eigenvector (M = 0.47; SD = 0.45), it has the second most influential values (i.e., higher or lower depending upon the metric) for degree (M = 13.92; SD = 3.33), closeness (M = 0.01; SD = 0.00), and constraint (M = 0.25; SD = 0.03).


The other three communities trail the government agency and peer-reviewed journal communities widely on all four measures. Other publications (n = 13 red nodes) and university/center presses (n = 10 purple nodes) each have the second-most influential score on two metrics. Popular press (n = 8 blue nodes) underperforms the other communities on all metrics, making it the least central community by far. The other publications and popular press communities differ dramatically from the other three communities in terms of eigenvector (M = 0.26, SD = 0.38); the other three metrics depict comparable relations.


Brokers (people). In this network, five nodes broker relations between communities: three authors from the government agency community, one from the peer-reviewed journal community, and one from other publications. We summarize evidence of their centrality and constraint in Table 5. Deardorff (i.e., node No. 23) scored the highest on four of five indicators, revealing her as the most influential broker. She has the highest values for degree (52), closeness (0.011), and betweenness (1,048.50), and the lowest value for constraint (0.07), a function of OECD drawing upon her publications from three communities of publication mode: government agency, other publications, and university/center presses.


The second most influential broker, Torney-Puerta (No. 69), also represents the government agency community. She finishes second to Deardorff on four indicators (degree, closeness, betweenness, and constraint); they share third place for eigenvector (0.07). Like Deardorff and Torney-Puerta, Byram (No. 12) represents the government agency community, producing the highest eigenvector (0.09), which he shares with Bennett (No. 9). Byram also shares the second-highest closeness value with Torney-Puerta (0.09). Donnelly (No. 25), from the peer-review community, is another broker whose work from university/center presses informed OECD’s model. Although Torney-Puerta and Byram primarily represent the government agency community, OECD also drew upon work from Torney-Puerta in the peer-reviewed journal community and from Byram in the popular press community. Bennett’s work that informed the OECD model represents the other publications community, for which he serves as the broker to the popular press community.


TRIANGULATION


Our data highlight convergences, complementarities, and contradictions between CDA and SNA findings regarding how the OECD signals its promotion and portrayal of global competency.


Convergence. Our qualitative and quantitative data overlap in their explicit and implicit assertions about people and power. Our CDA findings implied the existence of a community of experts and market imperatives as bases for proposed changes to what education should comprise (i.e., globalizing the student experience for the purpose of becoming competitive enough to enter a sprawling market), yet that community appeared somewhat more nebulous than economies might demand. The SNA, however, gives shape to that community by identifying brokers, confirming the existence of community groupings that are more coherent than the bazaar-like conflation of various aspects of knowledge, skills, and attitudes that GCIW presents in various ways. The SNA therefore provided increased clarity about the authority that informed GCIW, surrounding the document with an academic community that seemed more concrete than GCIW’s speculation about the priorities of a future-oriented global economy. SNA results show clearly that the OECD obeys a hierarchy that esteems government publications and articles in peer-reviewed journals over university/center and popular presses or other types of outlets. Hence, our disparate methods corroborate the authority that each invokes.



Table 5. Centrality and Constraint Indicators per Broker

Byram (12): Government agency; Degree = 37; Closeness = 0.01; Eigenvector = 0.09; Betweenness = 419.00; Constraint = 0.10

Deardorff (23): Government agency; Degree = 52; Closeness = 0.01; Eigenvector = 0.07; Betweenness = 1,048.50; Constraint = 0.07

Torney-Puerta (69): Government agency; Degree = 40; Closeness = 0.01; Eigenvector = 0.07; Betweenness = 522.00; Constraint = 0.09

Donnelly (25): Peer review; Degree = 25; Closeness = 0.01; Eigenvector = 0.06; Betweenness = 219.50; Constraint = 0.14

Bennett (9): Other journal; Degree = 23; Closeness = 0.01; Eigenvector = 0.09; Betweenness = 171.00; Constraint = 0.15



Regarding place, CDA unearthed the predominance of Westernization and OECD nation status (see Table 1), despite GCIW’s thin argument about global competency’s resonance for young people across oceans or other physical divides that create and confound nation-state borders. By revealing a core community of influential voices that appear even more tightly woven and more Western than the GCIW text states explicitly, the SNA findings shed similar light to their CDA counterparts, which cast doubt on the veracity of the cross-cultural claim. Put simply, all five brokers represent dominant ethno-cultural groups in the United Kingdom and United States. Coincidentally, but as a fitting nod to such a strong degree of sameness, two of the five brokers work in towns called Durham (Byram in the United Kingdom and Deardorff in the United States). Given that our CDA findings showed GCIW to make privileged assumptions about the ideals and experiences that would typify globally competent citizens, the fact that these two nations have caused the world’s most prevalent colonial and neo-colonial impacts seems particularly salient for understanding the international reach, or lack thereof, of OECD’s operationalization of global competency. This finding echoes Meyer (2014), who suggests that the PISA battery of tests is an exercise in soft power through communities of knowledge, affording privilege to particular social and economic ideologies. Likewise, this exercise in power through language (see Fairclough, 2010) sits in opposition to the definition of global competency that OECD offers:


the capacity to analyse global and intercultural issues critically and from multiple perspectives, to understand how differences affect perceptions, judgments, and ideas of self and others, and to engage in open, appropriate and effective interactions with others from different backgrounds on the basis of a shared respect for human dignity. (Ramos & Schleicher, 2016, p. 6; emphasis ours)


Findings from both methods suggest that OECD predicated its understanding and measure of global competency on a very limited range of perspectives and backgrounds, a mistake that runs afoul of the very purpose of educating for global competency.


Complementarity. Regarding philosophy and process, our qualitative and quantitative methods do not speak directly to one another. Instead, they broaden our understanding of the world that the OECD delineates in its official document. For instance, SNA offered nothing philosophical about how one should measure global competency, although the authors that informed the SNA certainly did. For example, dissertation research from Deardorff, the primary broker of the community of scholars that OECD consults according to SNA findings, produced the first theoretical model for measuring the related construct of intercultural competence. Deardorff (2006) argued that developing and measuring intercultural competence must use students’ attitudes as its foundation. This theoretical claim has only been subjected to one empirical test thus far (see Thier, Kim, & D’Aquillanto, 2018), and the extent to which OECD’s measurement model has taken up an attitudes-first approach is unclear. The OECD articulated its operationalization of global competency’s Knowledge and Skills domains, but left its Attitudes domain blank (see Figure 1).


Meanwhile, CDA revealed the discursive construction of the globally competent student as a member of a socioeconomic elite, with the resources to travel and to consume multicultural experiences: a young person who feels comfortable with the accoutrements of a cosmopolitan lifestyle. Complementarily, SNA showed greater influence among authors whose work touches upon a diverse array of communities, a finding that seems to support the philosophical notion that a globally competent individual is one who interacts with various communities. Moreover, the philosophical underpinnings embedded in GCIW heap the burden of responsibility on the young. For example, globally competent young people “are better equipped to build more just, peaceful, inclusive and sustainable societies through what they decide and what they do” (p. 6). Perhaps the rarity of such individuals (whether among young people or researchers) explains the document’s vague expressions of the need for quantitative and qualitative foci, as well as for knowledge-based, skill-based, and attitudinal aspects of global competency.


Furthermore, the structural positions of influential scholars such as Deardorff, Byram, and Torney-Puerta that SNA revealed seem to complement CDA findings about processes. GCIW reports only vaguely how educators might instill global competency in their students. Correspondingly, SNA findings revealed that denser networks (i.e., perhaps those for which it is harder to detect inner workings due to increased complexity) are better positioned to serve as hubs of intellectual change and/or ideation. Relatedly, CDA data situated schools as nuclei to solve societal ills such as discrimination, for which the document blames schools and families. Further positioning students as bodies in need of change rather than agents of it, the OECD document follows the SNA logic: external forces or expertise are needed to fix students in this arena. This interpretation of global competency is a far cry from student-active initiatives such as Canada’s The National Youth White Paper on Global Citizenship (2015), in which more than 1,000 students in high schools across Canada conducted a virtual town hall to define obligations, rights, responsibilities, initiatives, policies, and practices that pertain to global citizenship.
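For readers less familiar with the network vocabulary used here, the brief sketch below illustrates, using the Python networkx library and a purely hypothetical toy network (the names merely echo scholars cited in GCIW; the ties are invented, not our study’s data), how density and betweenness centrality are commonly computed. High betweenness marks the broker-like structural positions this paragraph describes.

    # Minimal sketch (hypothetical ties, not the study's citation network):
    # compute network density and flag broker-like nodes via betweenness centrality.
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([
        ("Deardorff", "Byram"),
        ("Deardorff", "Bennett"),
        ("Deardorff", "Fantini"),
        ("Byram", "Torney-Puerta"),
        ("Torney-Puerta", "Donnelly"),
        ("Bennett", "ScholarX"),    # peripheral author reached only through Bennett
        ("Donnelly", "ScholarY"),   # peripheral author reached only through Donnelly
    ])

    # Density: share of possible ties that are present (0 = no ties, 1 = complete graph)
    print("density:", round(nx.density(G), 3))

    # Betweenness centrality: how often a node sits on shortest paths between others;
    # the highest scores indicate broker positions that connect otherwise separate scholars.
    betweenness = nx.betweenness_centrality(G)
    for name, score in sorted(betweenness.items(), key=lambda kv: -kv[1]):
        print(f"{name:<15} {score:.3f}")

Applied to a real citation network, the handful of nodes with outsized betweenness would correspond to brokers of the kind our SNA identified.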


Contradiction. Tensions between CDA and SNA can offer insights into the role of educational thinkers, such as those who read this journal. The CDA revealed that the document defined education as a reactive endeavor, one that must trail along behind worldly (largely economic) factors rather than lead and influence them. It might be thought, then, that educational academics, too, must be reactive to circumstances rather than able to influence change. After all, scholars have questioned whether better PISA rankings necessarily equate to improved education, yet, in much public discourse, PISA data remain simplistically equated with improved education (Ercikan, Roth, & Asil, 2015). However, SNA findings demonstrate the significance of the community of educational thinkers (publishing mainly in peer-reviewed journals) to the network that informed the composition of GCIW. This was the lone point of contradiction that we noted between methods. GCIW states that global competency “requires numerous skills, including the ability to: communicate in more than one language; communicate appropriately and effectively with people from other cultures or countries” (p. 7). Yet, the SNA revealed that most of the cited authors/experts originate from monolingual, Western nations, particularly the United Kingdom and the United States, both of which carry a great deal of colonial baggage. Although GCIW expects teachers and learners of global competency to tap into diversity, OECD did not draw upon a diverse literature base. Thus, we conclude the paper with a discussion of our key findings and recommendations, limitations to the current study, and its implications for the OECD, as well as for the researchers and policymakers who are curious about what the supranational organization says regarding education, especially toward global competency.


DISCUSSION


The current study presents an examination of how a policy document from a large, globally significant voice portrays and promotes global competency through the lens of five policy threads (5Ps) and by employing an uncommon mix of CDA and SNA. Within the GCIW document, our CDA unveiled an OECD philosophy in which global competency is a requisite reaction to the 21st century’s ever-changing and amorphous set of international opportunities and challenges. Per the OECD, global competency entails knowledge, skills, and attitudes that can be quantified quickly based on 15 year olds’ self-reported responses to Likert-type items for which scarce validation evidence is presented. The OECD targets young people, regardless of national and/or cultural background, as subjects who must change to meet the economic and social needs of a coming reality, not actors who can shape that reality. En route to that coming reality, the OECD positions schools and families as sites of discrimination that must be unlearned. GCIW spends little time explaining the processes necessary for educators to lead their students toward the needed transformation, other than to suggest a need for educators’ global competencies as a precondition for making educative experiences relevant. For the OECD, the power to operationalize global competency rests within an international community of experts who hail from OECD nations that carry some of the past’s largest colonial footprints and retain outsized influence over present and future global dynamics. Consequently, the measurement model paints a picture of a globally competent 15 year old as one whose life is composed of acute privilege. Any construct display for this measure would depict a high-scoring student as someone whose experiences would make him or her seem like an extreme outlier in comparison to the normative respondent to the new PISA tool.


Using SNA, we found a moderately dense community of scholars contributing to OECD’s work. Only five brokers—Deardorff, Torney-Puerta, Byram, Bennett, and Donnelly—connect the work of nearly 100 scholars and seem to dominate the discussion. Correspondingly, the Greene et al. (1989) convergence, complementarity, and contradiction framework revealed a very cohesive story between our qualitative and quantitative methods. For example, despite OECD nominally espousing a version of global competency based on multiple perspectives and understanding cultural differences, both CDA and SNA findings show evidence of an OECD conversation impoverished by a limited degree of diversity of scholars, publication types, backgrounds, and viewpoints. Additionally, GCIW does not heed the scholarly advice of its key contributor (Deardorff), instead incompletely articulating the core attitudes that undergird OECD’s operationalization of global competency.


Despite such flaws observable in its key promotional document, the OECD continues to use GCIW to inform the 2018 PISA administration, which can now count global competency among the educational domains for which it can leverage profound change across nations (Lingard et al., 2015; Meyer, 2014). In a recent article on the future of PISA, one GCIW author explains the power of PISA: showing policymakers and practitioners the uppermost possibilities of education, especially for those who are willing to measure and “embrace a wider range of competencies that respond to the changing skill needs” of a modern economy (Schleicher, 2017, p. 117). However, the PISA global competency assessment is the first large-scale foray into measurement of a domain that is highly polemic, especially in countries where nationalism has been or recently became politically salient (Anderson, Thier, & Pitts, 2017). Amid uncertainty and controversy surrounding global competency, the OECD cannot afford to get this wrong.


RECOMMENDATIONS


Our findings support a strong argument to revisit and modify OECD’s global competency measure. As one example, our analysis raises concerns about how GCIW discursively constructs the globally competent student as a member of the socio-economic elite: someone able to travel internationally, accumulate a variety of cross-cultural experiences, and learn multiple languages. For a measure intended to be transferrable across at least 80 national contexts, the literature that informed it does not explore diversity in a meaningful way. The resulting ideal of the globally competent student appears to be a privileged, Westernized mash-up of James Bond, Indiana Jones, and Superman, the latest manifestation of the White Saviour narrative that has dominated Hollywood and educational discourse alike (Cammarota, 2011). Even Schleicher (2017) notes in his discussion of PISA’s future that a global measure of learning must be “accessible and relevant” to middle- and low-income countries, not just the 35 well-resourced nations that comprise the OECD (p. 117).


Problematically though, Schleicher’s description of PISA as a “yardstick” of national achievement in education speaks to the limited worldview that informed OECD’s framing of global competency. About 95% of the world’s population employs the metric system, not U.S. customary units such as yards. This seems to be an odd unforced error, a linguistic gaffe that reflects the OECD’s missed epistemological opportunity. Instead, the OECD could have expanded its understanding of global competency beyond a small group of influential brokers who share many experiential and demographic features.


In discussing the OECD’s 2015 expansion of PISA to include collaborative problem solving, Schleicher (2017) critiques the irony of the majority of assessments in present-day schools, which esteem individual achievement amid a rising tide of interdependence. Increasingly,


we rely on great collaborators and orchestrators who are able to join others in life, work and citizenship. Innovation, too, is now rarely the product of individuals working in isolation but an outcome of how we mobilise, share and link knowledge. Future tests should not disqualify students for collaborating with other test-takers, but encourage them to do so, and assess collaborative skills. (Schleicher, 2017, p. 114)


Just as the assessment tools currently available in the overwhelming majority of schools do not comport with modern needs of collaborative problem solving, we find that the OECD’s operationalization of global competency does not comport with even its own ideas of what global competency should entail. Therefore, our findings yielded five recommendations that the OECD, researchers, policymakers, and practitioners can use to improve or interpret results from the upcoming landmark experiment in measuring global competency.


Recommendation #1. Re-examine the philosophical approach to global competency, a reflective activity that would likely prompt revision of the discursive construct of the globally competent student, one awash in his own privilege. Perhaps global competency is not the ideal; scholars are beginning to confront tensions between global competency and global citizenship (see Dill, 2013; Hammond & Keating, 2017). The former might have unavoidably neoliberal implications, with too great a dependency on acquiring a globalized world view and associated knowledges, skills, and behaviors to further one’s competitive advantage in global markets. By contrast, the related notion of global citizenship might better enable students to collaboratively live, learn, and work in an increasingly interconnected world (Anderson, Thier, & Pitts, 2017).


Recommendation #2. Consider which people were and were not heard during the OECD’s initial process for operationalizing global competency. For their part, researchers should voluntarily contribute their ideas for a truly global approach to measuring global competency. Consequently, the OECD can depend upon a wider range of voices, specifically creating a space for non-Western authors to engage in debates about definitions, measurement approaches, and issues of socio-cultural diversity. Such an approach would conform to an OECD description of PISA as a “collaborative effort” (Schleicher, 2017, p. 116).


Recommendation #3. Articulate the affective domain of global competency beyond the presently limited structure of the OECD’s visual representation in Figure 1. One approach would be to surrender the power that has thus far been consolidated within a small network of brokers. Instead, solicit broadly to recruit insights from professionals in schools and key stakeholders in communities. Doing so would ensure local and global resonance of both the construct and how the OECD re-operationalizes it. Currently, the GCIW document provides insufficient examples of how schools can promote global competency. Therefore, the type of peer-to-peer exchange across the PISA global community that Schleicher (2017) endorses would enable a socially valid interpretation of global competency and a clearer pathway to lead students to develop it.


Recommendation #4. Ponder a process to shift the burden of change from student responsibility to the work of schools. Again, engaging school-based professionals for ideas about how to do so will create stronger connections between OECD, PISA, and the real work of schools. Absent this step, students will be unlikely to esteem global competency so long as educators frame the construct such that students feel burdened by change rather than empowered to drive it.


Recommendation #5. The OECD and policymakers should partner to offer pedagogical support that accounts for place. Schools and organizations need to see how the OECD’s expectations connect to their local realities through culturally diverse and appropriate examples. Otherwise, practitioners will struggle to develop young people’s knowledge, skills, and affective domains of global competency. If PISA is to “successfully cater to a larger and more diverse set of countries,” especially middle- and low-income countries, as Schleicher (2017, p. 122) predicts, schools must find utility in the tool. It seems clear that constructs such as global competency are inflected by international and interregional differences (see Goren & Yemini, 2017). The OECD has indicated a willingness to adjust PISA instruments to better contrast high-performing and low-performing students or to make contextual questionnaires more nationally relevant (Schleicher, 2017). But it has not ensured that its operationalization of global competency suits contextual differences based on local, national, or regional places.


These recommendations provide constructive solutions to overcome issues and limitations revealed in the critique. Next, we discuss limitations of the current study.


LIMITATIONS


Recognising that all research is inherently biased and typically not neutral, we endeavoured to mitigate possible limitations related to any singular methodological tradition through a mixed methods design. Therefore, this paper adds to a small, but growing number of methodological examples in which CDA and SNA combine in research designs. Our simultaneous use of these methods has helped to alleviate potential risks associated with either method’s individual bias. Furthermore, we employed Greene et al.’s (1989) triangulation process to add rigour to our uncommon design as we attempt to add confidence to our reporting of interpretative findings (see Moser, Groenewegen, & Huysman, 2013).


Still, we must interrogate the limitations of each of our methods prior to the mixing we engaged in to mitigate concerns. Regarding CDA, we follow Luke (1995) to explore two key limitations. First, our research team is primarily English-dependent, like the GCIW text. Both for the source document and for our analysis of it, this dependence creates a linguistic bias. As we attempt to show in the current study, such bias is inherently limiting, and it is especially glaring in a multinational inquiry into 15 year olds’ global competency. Again, we strongly recommend that international scholars probe further in this domain. Second, the text under investigation is replete with systematic asymmetries of power and social resources. All researchers on our team were raised and educated in prominent OECD nations, upbringings that position us both to detect and to be blind to the implications of such asymmetries and resources. The beliefs and assumptions of the researchers who interpreted the text are highly influenced by the linguistic, socio-cultural, and economic dependencies stated previously. Therefore, the research team made conscious attempts to reduce these concerns. We met regularly via Skype to explore possible biases, define terms, press upon each other’s assumptions, and elicit critical feedback.


Vis-à-vis SNA, our process for excluding non-global competency citations could have introduced error into our algorithms. However, we defend that decision based on a desire to avoid diluting the pool with irrelevant citations focused on aspects of designing assessment or curricula broadly rather than ones with global dimensions: the key thrust of this study. Additionally, we had intended to add another vector to the SNA, namely Google Scholar citation counts, to approximate prominence for each cited article. We expected this measure would add explanatory power to our model’s ability to detect and describe communities and their brokers. However, Google Scholar citation counts are not universally available; such counts existed for only 86% of our sample. Given that SNA procedures for handling data gaps remain underdeveloped when compared to other, more established statistical analyses (Huisman, 2009), we chose not to include Google Scholar citation counts as a vector in our SNA models.
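As a concrete illustration of that decision, the sketch below (hypothetical titles, counts, and threshold; not our actual data or decision rule) shows the kind of coverage check that might precede omitting an incompletely observed attribute such as citation counts from a network model.

    # Illustrative sketch only: check how completely an external attribute
    # (e.g., Google Scholar citation counts) covers the nodes of a network
    # before deciding whether to include it as a vector. Values are hypothetical.
    from typing import Dict, Optional

    def attribute_coverage(counts: Dict[str, Optional[int]]) -> float:
        """Share of nodes with a non-missing attribute value."""
        available = sum(1 for v in counts.values() if v is not None)
        return available / len(counts)

    citation_counts = {
        "Cited article A": 4200,   # hypothetical count
        "Cited article B": 9100,   # hypothetical count
        "Obscure report": None,    # no citation record found
        "Working paper": None,
    }

    coverage = attribute_coverage(citation_counts)
    print(f"coverage: {coverage:.0%}")

    # With imputation procedures for network data still underdeveloped,
    # one defensible choice is simply to omit the attribute below a threshold.
    THRESHOLD = 0.95  # arbitrary cut-off, chosen only for illustration
    if coverage < THRESHOLD:
        print("Coverage too incomplete; omit citation counts from the SNA model.")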


IMPLICATIONS


Inquiring into PISA naturally brings forth a host of implications that sprawl across educational policy, practice, and research. The opening clause of Schleicher’s (2017) article on the future of PISA explains clearly that “Tests influence policies and practices in many ways by signaling priorities for curriculum and instruction.” When an organization with a reach as global as that of the OECD establishes the content of tests with highly visible scores such as PISA, “administrators and teachers pay attention to what is tested and adapt curriculum and teaching accordingly” (p. 113). As Schleicher elucidates, PISA’s natural orientation is to slake the global thirst for big data that can drive educational policy and guide decisions about it. Comparisons between national scores fuel international and intranational conversations about what it means to be a high-performing student, school, or system. These are challenging enough conditions for domains such as literacy, numeracy, and science, upon which researchers generally agree. In uncharted territory such as global competency, the power of PISA can yield unpredictable consequences.


The moment that the OECD releases data from a PISA administration, media outlets flood with comparisons to other participating nations, often reinforcing deficit narratives about schools. Schleicher (2017) concedes that “putting education data out into the public space does not automatically change the ways in which students learn, teachers teach and schools operate”, but he laments that most policymakers and practitioners are “light years away” (p. 121) from having the requisite tools to interpret or use data well. Given the current study’s findings about what the global competency measure will and will not tell education systems about themselves relative to an ill-defined construct, mass dissemination of nation-level global competency scores for ill-equipped consumers could lead to a dangerous disconnect.


Schleicher (2017) touts a need for “collaborative consumption” of data that PISA affords, exemplifying the point by describing the rich conversations that principals and their teachers are having in Fairfax County, Virginia, United States. However, the case is an inadequate example because Fairfax is the outlier of outliers. It is the second-wealthiest county in the United States, a nation that ranks in the top 10 worldwide for gross domestic product per capita. Furthermore, Fairfax is one of the world’s most policy-informed areas. It is the sixth-most educated county in a country with the second-highest percentage of adults with bachelor’s degrees, and it is home to many affluent suburbs where Washington, DC, policymakers live. Perhaps educators and community members in Fairfax could examine PISA reports and understand the instability of estimates based on their standard errors, or why large sample sizes drive statistically significant differences in student performance between nations that might otherwise not show real differences. Perhaps they can distinguish aggregated means from the percentages of students at certain proficiency levels. Perhaps they will interrogate differences between nations whose PISA respondent pools include students who attend privately funded schools and nations that sample only from state-funded schools. Perhaps they might click a dozen times through OECD’s online documentation to seek explanations of unfamiliar constructs like collaborative problem solving or global competency. But doing so would make them very rare in comparison to a normative population of PISA result consumers.
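To make the sample-size point concrete, the minimal sketch below uses hypothetical numbers on a PISA-like score scale (and deliberately ignores PISA’s complex sampling design, replicate weights, and plausible values) to show how a trivially small gap between two country means becomes statistically significant once samples are large enough.

    # Illustrative sketch of the point above: with very large samples, a tiny
    # difference in country means turns "statistically significant" even though
    # it may be educationally meaningless. All numbers are hypothetical.
    import math

    def z_for_mean_difference(mean_a, mean_b, sd, n_a, n_b):
        """z statistic for the difference between two independent sample means."""
        se = math.sqrt(sd ** 2 / n_a + sd ** 2 / n_b)  # standard error of the difference
        return (mean_a - mean_b) / se

    sd = 100      # PISA-like score scale (standard deviation of roughly 100)
    diff = 3      # hypothetical 3-point gap between two countries' means

    for n in (500, 50_000, 500_000):  # per-country sample sizes
        z = z_for_mean_difference(500 + diff, 500, sd, n, n)
        verdict = "significant at p < .05" if abs(z) > 1.96 else "not significant"
        print(f"n per country = {n:>7,}  z = {z:5.2f}  {verdict}")

The same 3-point gap is indistinguishable from noise with a modest sample yet emerges as highly significant with a very large one, which is why country-ranking headlines need the interpretive care described above.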


The current study also has implications for further critiques of global voices in education, as outlined in our recommendations above. As a substantive consequence of the issues raised, GCIW’s lack of curricular precision seems unsurprising and well aligned to Author’s (2015) 5Ps: people, power, philosophies, processes, and places. Developing global competency can benefit students worldwide, yet as we argue in this paper, global competency as currently constructed may communicate ideologies of privilege and ultimately restrict students’ ability to effect change. Global organizations and testing regimes can transmit global ideologies and preferred social constructs, often derived from Western norms and desires regarding culture and language (see Hill, 2006; Rizvi & Lingard, 2010). Global competency is one such construct, having developed from previous notions of international mindedness, global citizenship, and cosmopolitanism into what Zhao (2010) termed global competencies. Merely being added to the PISA battery signals some degree of global recognition of the term. Although there is ambiguity surrounding the term, there is no doubt that global competency is seen as an essential 21st-century skill (Mansilla & Jackson, 2011). However, the introduction of such an international measure has been demonstrated to be a potent tool with many potential intended and unintended effects (Lingard et al., 2015).


The need for a conception of global competency that accounts for non-Western voices has intensified recently. England, the United States, and four other influential OECD nations opted out late in January 2018 (Coughlan, 2018), several months after we completed data analysis for the current study. The two nations that most palpably contributed research to inform the GCIW document joined Denmark, France, Germany, and the Netherlands in announcing their shared intention to have their students continue to sit only for the literacy, numeracy, and science batteries. Losing these highly influential and well-resourced nations raises continued questions about the perceived quality of the OECD measure and its lack of curricular precision. Popular media touts the measure as an assessment of the following dubiously conflated domains: “tolerance, cultural awareness and how well teenagers can distinguish between reliable sources of information and fake news” (Coughlan, 2018, para. 7). OECD expects further changes in which nations will or will not participate. Corresponding to our recommendations, this shift in the balance of influence among participating nations seems to create a vacuum into which non-Western and/or economically developing (i.e., non-OECD) nations can add a more nuanced understanding of global competency to a measure that certainly seems ripe for revision.


CONCLUSION


Whilst acknowledging that the OECD’s measure of global competency is in its infancy, we argue that its launch is the most important moment to influence the “trajectory” (Ball, 1994) of policy. Its introduction will not be without critics, its implementation will not be without complication, and its impact will not be without policy, practice, and research implications. We challenge the OECD, policymakers, practitioners, and researchers to expand the range of global voices that inform policy directions, particularly when considering a directive that can be as globally significant and influential as PISA’s global competency measure. Our call for further critique of the implementation process of the OECD’s global competency initiative across the 5Ps remains timely as the OECD unveils its 2018 handbook, Preparing our Youth for a Sustainable World: The OECD PISA global competency framework, to accompany the 2018 enactment of the OECD’s new global measure. It is also timely because it captures the initial wave of influential OECD nations opting out of the global competency measure, including England, the United States, Denmark, France, Germany, and the Netherlands. Curiously, some of these countries most palpably contributed to the research informing the GCIW and its global competency measure.



Notes


1. See Ekman (2015); Ruane and Lee (2016); and Ryu and Lombardi (2015).

2. For a full description of the methods used to construct these measures, please email the authors.

3. Regarding knowledge that carries clear cultural dimensions, there is considerable international variation in the number of continents that students learn about in school. In some systems, the Americas (North and South) are a single continent. In some systems, Asia and Europe are considered to be a single landmass. Some systems disregard Antarctica as an uninhabited landmass. The five rings on the Olympic flag recognize an international understanding of five continents: Africa, the Americas, Asia, Australia, and Europe.


References


Abbasi, A., Altmann, J., & Hossain, L. (2011). Identifying the effects of co-authorship networks on the performance of scholars: A correlation and regression analysis of performance measures and social network analysis measures. Journal of Informetrics, 5(4), 594–607.


Abdi, A. A. (2011). De-monoculturalizing global citizenship education: The need for multicentric intentions and practices. In L. Shultz, A. A. Abdi, & G. H. Richardson (Eds.), Global citizenship education in post-secondary institutions: Theories, practices, policies (pp. 25–39). New York: Peter Lang.


Anderson, R., Thier, M., & Pitts, C. (2017). Interpersonal and intrapersonal skill assessment alternatives: Self-reports, situational-judgment tests, and discrete-choice experiments. Learning and Individual Differences, 53, 47–60.


Ball, S. (2015). What is policy? 21 years later: Reflections on the possibilities of policy research. Discourse: Studies in the Cultural Politics of Education, 36(3). doi:10.1080/01596306.2015.1015279


Berliner, D. C. (2015). The many facets of PISA. Teachers College Record, 117(1), n1.


Breakspear, S. (2012). The policy impact of PISA: An exploration of the normative effects of international benchmarking in school system performance (OECD Education Working Papers No. 71). Paris, France: OECD. Retrieved from http://dx.doi.org/10.1787/5k9fdfqffr28-en


Burt, R. S. (2004). Structural holes and good ideas. American Journal of Sociology, 110, 349–399.


Cameron, R. (2008). Mixed methods in management research: Has the phoenix landed? Paper presented at the 22nd Annual Australian & New Zealand Academy of Management Conference, Auckland.


Cammarota, J. (2011). Blindsided by the avatar: White saviors and allies out of Hollywood and in education. Review of Education, Pedagogy, and Cultural Studies, 33(3), 242–259.


Carr, W., & Kemmis, S. (1986) Becoming critical: Education, knowledge and action research. Philadelphia: Falmer Press, p. 249.


Coughlan, S. (2018, 24 January). England and US will not take PISA tests in tolerance. BBC. Retrieved from http://www.bbc.com/news/business-42781376


Creswell, J. W., & Clark, V. L. P. (2011). Designing and conducting mixed methods research (2nd ed.). Los Angeles, CA: Sage.


Daly, A. J. (Ed.). (2010). Social network theory and educational change (Vol. 8). Cambridge, MA: Harvard Education.


Deardorff, D. K. (2006). Identification and assessment of intercultural competence as a student outcome of internationalization. Journal of Studies in International Education, 10(3), 241–266.


Deardorff, D. K. (2015). Intercultural competence: Mapping the future research agenda. International Journal of Intercultural Relations, 48, 3–5.


De Nooy, W., Mrvar, A., & Batagelj, V. (2011). Exploratory social network analysis with Pajek (Vol. 27). Cambridge, UK: Cambridge University.


Dill, J. S. (2013). The longings and limits of global citizenship education: The moral pedagogy of schooling in a cosmopolitan age (Vol. 109). New York: Routledge.


Ekman, M. (2015). Online Islamophobia and the politics of fear: Manufacturing the green scare. Ethnic and Racial Studies, 38, 1986–2002.


Ercikan, K., Roth, W. M., & Asil, M. (2015). Cautions about inferences from international assessments: The case of PISA 2009. Teachers College Record, 117(1), 1–28.


Fairclough, N. (2010). Critical discourse analysis: The critical study of language (2nd ed.). Abingdon, Oxon, UK: Taylor & Francis.


Fantini, A. E. (2009). Assessing intercultural competence. In D. K. Deardorff (Ed.), The SAGE handbook of intercultural competence (pp. 456–476). Thousand Oaks, CA: Sage.


Goethe Institute. (2017). The impact of language in a globalised world. Retrieved from http://www.goethe.de/lhr/prj/mac/msp/en1253450.htm


Goren, H., & Yemini, M. (2017). Citizenship education redefined—A systematic review of empirical studies on global citizenship education. International Journal of Educational Research, 82, 170–183.


Greene, J., & McClintock, C. (1985). Triangulation in evaluation: Design and analysis issues. Evaluation Review, 9, 523–545.


Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11, 255–274.


Gupta, N. (2003). Geopolitics of globalisation: A re-examination by comparativists. Policy Futures in Education, 1(2), 351–360.


Habermas, J. (1981). The theory of communicative action: Vol. 1. Reason and the rationalization of society (T. McCarthy, Trans.). Boston, MA: Beacon.


Hammond, C. D., & Keating, A. (2017). Global citizens or global workers? Comparing university programmes for global citizenship education in Japan and the UK. Compare: A Journal of Comparative and International Education, 1–20.


Hill, I. (2006). Do International Baccalaureate programs internationalise or globalise? International Education Journal, 7(1), 98–108.


Huisman, M. (2009). Imputation of missing network data: Some simple procedures. Journal of Social Structure, 10(1).


Jorgensen, M., & Phillips, L. (2002). Discourse analysis as theory and method. London, UK: SAGE.


Junemann, C., Ball, S. J., & Santori, D. (2016). Joined-up policy. In K. Mundy, A. Green, B. Lingard, & A. Verger (Eds.), The handbook of global education policy. Chichester, UK: John Wiley & Sons. doi:10.1002/9781118468005.ch30


Labaree, D. F. (2014). Let’s measure what no one teaches: PISA, NCLB, and the shrinking aims of education. Teachers College Record, 116(9), 1–14.


Law, J. (2004) After method: Mess in social science research. Oxon, UK: Routledge.


Ledger, S., Vidovich, L., & O’Donoghue, T. (2015). International and remote schooling: Global to local curriculum policy dynamics in Indonesia. The Asia-Pacific Education Researcher, 24(4), 695–703.


Likert, R. (1932). A technique for the measurement of attitudes. Archives of Psychology, 140, 1–55.


Lingard, B., Martino, W., Rezai-Rashti, G., & Sellar, S. (2015). Globalizing educational accountabilities. New York: Routledge.


Luke, A. (1995). Text and discourse in education: An introduction to critical discourse analysis. Review of Research in Education, 21, 3–41. doi:10.2307/1167278


Mansilla, V. B., & Jackson, A. (2011). Educating for global competency. New York: Asia Society. Retrieved from http://asiasociety.org/files/book-globalcompetence.pdf


Marginson, S., & Rhoades, G. (2002). Beyond national states, markets, and systems of higher education: A glonacal agency heuristic. Higher Education, 43(31), 281–309.


McMillan, J. H, & Schumacher, S. (2006). Research in education: Evidence-based inquiry (6th ed.). Boston: Pearson Education.


Meyer, H. D. (2014). The OECD as pivot of the emerging global educational accountability regime: How accountable are the accountants. Teachers College Record, 116(9), 1–20.


Moser, C., Groenewegen, P., & Huysman, M. (2013). Extending social network analysis with discourse analysis: Combining relational with interpretive data (Chapter 24). In The influence of technology on social network analysis and mining (LNSN 6, pp. 547–556). New York: Springer. doi:10.1007/978-3-7091-1346-2_24


The National Youth White Paper on Global Citizenship. (2015). Retrieved from http://www.takingitglobal.org/images/resources/tool/docs/Global_Citizenship.pdf


O’Donoghue, T. A. (2007). Planning your qualitative research project: An introduction to interpretivist research in education. Abingdon, Oxon, UK: Routledge.


OECD. (2016, 15 May). OECD proposes new approach to assess young people’s understanding of global issues and attitudes toward cultural diversity and tolerance. Retrieved from http://www.oecd.org/fr/social/oecd-proposes-new-approach-to-assess-young-peoples-understanding-of-global-issues-and-attitudes-toward-cultural-diversity-and-tolerance.htm


Onwuegbuzie, A. J., & Teddlie, C. (2003). A framework for analyzing data in mixed methods research. Handbook of Mixed Methods in Social and Behavioral Research, 2, 397–430.


Papastephanou, M. (2015). Thinking differently about cosmopolitanism: Theory, eccentricity, and the globalized world. Oxon, UK: Routledge.


Perry, N., & Ercikan, K. (2015). Moving beyond country rankings in international assessments: The case of PISA. Teachers College Record, 117(1), n1.


Ramos, G., & Schleicher, A. (2016). Global competency for an inclusive world. Paris, France: OECD.


Reid, J., Green, B., White, S., Cooper, M., Lock, G., & Hastings, W. (2009). Understanding complex ecologies in a changing world. Symposium on education practice and rural social space. Presented at the American Educational Research Association annual conference, Denver, Colorado.


Roberts, P., & Green, B. (2013). Researching rural place(s): On social justice and rural education. Qualitative Inquiry, 19(10), 765–774.


Rizvi, F., & Lingard, B. (2010). Globalizing education policy. London, UK: Routledge.


Ruane, R., & Lee, V. J. (2016). Analysis of discussion board interaction in an online peer mentoring site. Online Learning, 20(4), 79–99.


Ryu, S., & Lombardi, D. (2015). Coding classroom interactions for collective and individual engagement. Educational Psychologist, 50(1), 70–83.


Schleicher, A. (2017). The future of PISA. Tertium Comparationis, 23(1), 113–125.


Scotland, J. (2012). Exploring the philosophical underpinnings of research: Relating ontology and epistemology to the methodology and methods of the scientific, interpretive, and critical research paradigms. English Language Teaching, 5(9), 9–16.


Sellar, S., & Lingard, B. (2014). The OECD and the expansion of PISA: New global modes of governance in education. British Educational Research Journal, 40, 917–936.


Singh, M., & Jing, Q. (2013). 21st century international mindedness: An exploratory study of its conceptualization and assessment. University of Western Sydney.


Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research: Integrating quantitative and qualitative approaches in the social and behavioral sciences. Thousand Oaks, CA: Sage.


Thier, M. (2015). Globally speaking: Global competence. In Y. Zhao (Ed.), Counting what counts: Reframing education outcomes, pp. 113–132. Bloomington, IN: Solution Tree.


Thier, M. (2017). Curbing ignorance and apathy (across the political spectrum) through global citizenship education. Berkeley Review of Education, 7(1).


Thier, M., Kim, M., & D’Aquillanto, K. (2018). It matters how you ask: Assessing the knowledge, skills, behaviors, or dispositions of global citizenship. Presented at the American Educational Research Association’s annual meeting, New York, NY.


Türken, S., & Rudmin, F. W. (2013). On psychological effects of globalization: Development of a scale of global identity. Psychology & Society, 5(2), 63–89.


Vidovich, L. (2007). Removing policy from its pedestal: Some theoretical framings and practical possibilities. Educational Review, 59(3), 285–298.


Vidovich, L. (2013). Policy research in higher education: Theories and methods for globalising times? In J. Huisman & M. Tight (Eds.), Theory and method in higher education research (pp. 21–40). Bingley: Emerald Press.


Wasserman, S. (1994). Social network analysis: Methods and applications (Vol. 8). Cambridge, MA: Cambridge University.


Weiß, J., & Schwietring, T. (2017). The power of language: A philosophical-sociological reflection. Retrieved from http://www.goethe.de/lhr/prj/mac/msp/en1253450.htm


Winter, C. (2012). School curriculum, globalisation and the constitution of policy problems and solutions. Journal of Education Policy, 27, 295–314.


Zhao, Y. (2010). Preparing globally competent teachers: A new imperative for teacher education. Journal of Teacher Education, 61, 422–431.


Zhao, Y. (2015). Counting what counts: Reframing educational evaluation. Bloomington, IN: Solution Tree.




Cite This Article as: Teachers College Record, Volume 121, Number 8, 2019, pp. 1–40. https://www.tcrecord.org, ID Number: 22705.

About the Author
  • Susan Ledger
    Murdoch University
    E-mail Author
    SUSAN LEDGER is Associate Dean of Engagement at Murdoch University, Australia. Her research interest centers on mixed method approaches to policy, practices, and issues related to the preparation of teachers for diverse contexts: international, rural, and remote. She is currently utilizing mixed reality learning environments and avatars to prepare preservice and inservice teachers and leaders for diverse and difficult contexts, scenarios, and critical incidents. Related publications: Ledger, S. (2017). The International Baccalaureate standards and practices as reflected in literature (2009–2016). International Schools Journal, 37(1), 32–44. Petersfield, UK. Ledger, S., Vidovich, L., & O’Donoghue, T. (2016). Global to local curriculum policy processes: The enactment of the International Baccalaureate in remote schools. Policy Implications of Research in Education Series, Vol. 4, VIII, 217 p. Springer.
  • Michael Thier
    University of Oregon
    E-mail Author
    MICHAEL THIER is a research associate jointly appointed to the University of Oregon's (UO) Center for Equity Promotion and Inflexion (formerly the Educational Policy Improvement Center). With collaborators in 10 countries, he pursues three goals: (a) helping education leaders implement and measure global citizenship education (GCE) programs; (b) comparing GCE programs’ possibilities and constraints internationally/cross-culturally; and (c) discovering opportunities and conditions that enable students, especially attendees of rural and/or remote schools, to avail themselves of GCE programs. His recent publications on issues related to global citizenship can be found in journals such as Learning and Individual Differences, Psychological Assessment, and the Berkeley Review of Education.
  • Lucy Bailey
    University of Nottingham
    E-mail Author
    LUCY BAILEY is Associate Professor in the School of Education at the University of Nottingham Malaysia Campus. Her research interests include refugee education, gender in education, and international schooling. She has recently published a chapter on international schools in the Routledge International Handbook of Schools and Schooling in Asia. Related publications: Bailey, L. (2015). Reskilled and 'running ahead': Teachers in an international school talk about their work. Journal of Research in International Education, 14(1), 3–15. Bailey, L., & Ingimundardottir, G. (2015). International employability: Stakeholder attitudes at an international university in Malaysia. Journal of Teaching and Learning for Graduate Employability, 6(1), 44–55.
  • Christine Pitts
    University of Oregon
    E-mail Author
    CHRISTINE PITTS is a research scientist at NWEA in Portland, Oregon. Mrs. Pitts leads mixed methods education research studies on the efficacy and effectiveness of educational assessment and professional development programs, specifically regarding their relationship to instructional practices, teacher pedagogy, and student engagement in learning. Mrs. Pitts is a Ph.D. candidate at the University of Oregon in Educational Methodology, Policy, and Leadership. In her practice as an educator, researcher, and policy analyst, Christine prioritizes the transformation of existing education policy and practices towards a model of teaching and learning that engages communities in developing a sustainable shift for equitable systems.
 