Language-based Differences in the Literacy Performance of Bidialectal Youth
by Patriann Smith, Jehanzeb R. Cheema, Alex Kumi-Yeboah, S. Joel Warrican & Melissa L. Alleyne - 2018
Background/Context: Standard English functions as a dominant language in the English-speaking Caribbean context despite the bidialectal, bilingual, and multilingual nature of countries. Notwithstanding, Caribbean non-Standard English-speaking students continue to be administered literacy assessments that do not take into account their nonstandardized English language use. This practice inadvertently reinforces assumptions that privilege Standard English as a language of assessment (Canagarajah, 2006b; Shohamy, 2006) and that devalue certain World Englishes (Canagarajah, 2006a) in academia.
Purpose/Objective/Research Question/Focus of Study: In this study, we examined the way in which 3,184 15-year-old 9th and 10th grade Trinidadian bidialectal adolescent youth self-identified linguistically on the 2009 Programme for International Student Assessment (PISA) literacy assessment and explored their reading, math, and science literacy performance based on their self-identification as native English and non-native English speaking students.
Population/Participants/Subjects: The population included 3,184 15-year-old students, 52.3% (n = 1,666) of whom were girls and 47.7% (n = 1,518) of whom were boys. Of this population, 28.5% (n = 909) were in Grade 9 while the rest were in Grade 10 (n = 2,275); 89.7% (n = 2,856) were enrolled in public schools and 10.3% (n = 327) were enrolled in private schools; and across these groups, 97.3% (n = 3,098) identified English (i.e., Standard English) while 2.7% (n = 86) identified a language other than Standard English as their “native” language (i.e., non-Standard English).
Research Design: The statistical results in our study were based on secondary analysis of a survey-based nationally representative sample of 15-year-old students from Trinidad and Tobago. We used analysis of covariance in order to control for demographic differences and used hierarchical linear modeling to verify the robustness of our empirical findings.
Findings: The majority of students self-identified as [Standard] English speakers despite the predominant use of nonstandardized Englishes in their country. Findings showed large and significant differences between “self-identifying native” and “self-identifying non-native” speakers of English, with higher mean scores for the former group in all three assessed areas of literacy as measured in English. Self-identifying native English speakers performed significantly below the PISA 2009 OECD mean of 500 and reflected a high degree of volatility in performance. These differences persisted even after controlling for important student demographic differences such as grade, gender, school type, and indicators of socioeconomic and cultural status.
Conclusions/Recommendations: The study serves to justify the need for closer attention to the pervasive role of colonialism in the dominance of Standard English in multilingual testing (Shohamy, 2006), highlights the need for attention to bidialectal students’ performativity in World Englishes that challenge normative Standard English literacy proficiency (Canagarajah, 2006a), and requires that assumptions steeped in colonialism that underlie Standard English literacy testing on the PISA international measure be revisited if bidialectal adolescent learners are to be accurately represented on these measures in much the same manner as their monolingual and Standard English speaking counterparts.
The dominance of Standard English as a global language (Canagarajah & Said, 2011; Crystal, 1997; Johnson, 1990; Kirkpatrick, 2007; Pennycook, 2013) presents a significant challenge for adolescent youth whose nonstandardized Englishes continue to receive little acceptance within the academic arena (Alim, 2009; Castagno & Brayboy, 2008; Craig, 2006; Luke, Luke, & Graham, 2007; Siegel, 1997, 1999, 2002, 2005; Simmons-McDonald, 2004). Despite calls to make World Englishes acceptable forms of international communication (Canagarajah, 2006a; Teachers of English to Speakers of Other Languages [TESOL], 2008), globalization, internationalization, and transnationalism continue to reinforce language ideologies that reflect the prominence and power of Standard English (Crystal, 2003; New London Group, 2000; TESOL, 2008). To date, approximately 335,000,000 individuals in 99 countries speak English as a first language, and English is spoken in at least 100 additional countries, making English the third most widely spoken language in the world (Lewis, Simons, & Fennig, 2014). With millions of youth enrolled in schools globally (Organisation for Economic Co-operation and Development [OECD], 2010), the significance of Standard English literacy teaching and learning (Davies, Hamp-Lyons, & Kemp, 2003; TESOL, 2008) continues to be at the forefront of a global agenda, placing in limbo the needs of nonstandardized English speaking youth whose linguistic characteristics often become conflated with those of Standard English speaking students.
Internationally, statistics across Minority1 and Majority2 World contexts demonstrate that literacy is concerned primarily with Standard English mastery (Hill & Parry, 2014). In countries of the Minority World, such as the United States, the European Union, and Canada, learners of (Standard) English (English language learners, or ELLs) constitute the fastest growing student population (European Commission, 2012; National Clearinghouse for English Language Acquisition [NCELA], 2011; National Council of Teachers of English [NCTE], 2008; People for Education, 2013). For these learners, many of whom are elective multilinguals (Ellis, 2004), governmental mandates stipulate expectations for mastery of English as a Native Language, English as a Second Language, or English as a Foreign Language (e.g., European Commission, 2012; NCELA, 2011). Much like Minority World contexts, students from the Majority World across regions such as Africa, China, India, Latin America, South America, and the English-speaking Caribbean3 must also demonstrate proficiency in Standard English in order to perform acceptably on local and national measures of literacy (Hill & Parry, 2014; Wei & Su, 2012). The privileging of Standard English (McKay, 2005), accompanied by a neglect of local Englishes (Smith, 2016), not only fails to facilitate multilateral interaction and mutual intelligibility across hybrid language communities and cultures (Canagarajah, 2006b) but also has a detrimental impact on the performance of bidialectal youth (International Reading Association, 2012).
The English-speaking Caribbean context is one such region in which Standard English functions as a dominant language despite the bidialectal, bilingual, and multilingual nature of its countries. In this region, Caribbean nationals speak many nonstandardized Englishes, including 35 Creoles and 15 indigenous languages (Simmons-McDonald, 2006; Voice, 2011). Students possess predominantly nonstandardized English linguistic backgrounds (Warrican, 2009; Winer, 2006), and English-related Creoles and vernaculars have been said to interfere with their literacy performance (Bogle, 1997; Craig, 1999; Miller, 1989). For instance, in 2009, Caribbean bidialectal youth performed poorly on national measures of literacy and in other subject areas: only 21% of the candidates sitting the Caribbean Secondary Education Certification (CSEC) examinations achieved passing grades in five or more subject areas (e.g., math, English language arts, science, history, and social studies), and 52% either did not pass any subject or received passing grades in just one subject area (Jules, 2010). Notwithstanding, Caribbean non-Standard English speaking students continue to be administered literacy assessments that do not take into account their nonstandardized English language use. This practice inadvertently reinforces assumptions that privilege Standard English as a language of assessment (Canagarajah, 2006b; Shohamy, 2006) and that devalue certain World Englishes (Canagarajah, 2006a) in academia.
In response to these challenges, this study explores the reading, mathematics, and science literacy performance of Trinidadian adolescent bidialectal learners by focusing on their self-identification as native English and non-native English speaking students on the 2009 Programme for International Student Assessment (PISA) literacy assessment. The study further examines the extent to which assumptions underlying English dominance may inadvertently pervade international measures of literacy assessment and subjugate, by inappropriate comparison, students whose native language is a nonstandardized English (i.e., English dialect). In this study, we use the term self-identifying native English speaker to refer to students who self-reported their status as speakers of Standard English on the PISA 2009 assessment and self-identifying non-native English speaker to refer to students who self-reported their status as speakers of a language other than Standard English on the PISA 2009 assessment. We utilize the term bidialectal adolescents in keeping with sociolinguistic conceptions (e.g., Labov, 1972) as a general descriptor of Trinidadian adolescents between the ages of 12 and 18 whose primary forms of language proficiency are Trinidadian English-lexicon Creole (TEC) or Tobagonian English-lexicon Creole (TOB) and Trinidadian Standard English (TSE). A review of the literature concerning language and literacy differences in the multilingual Caribbean provides a preface for the study.
THE LINGUISTIC LANDSCAPE IN THE ANGLOPHONE CARIBBEAN
A sparse body of literature points to the linguistic challenges for English bidialectal students as they receive literacy instruction and are assessed in the Caribbean. One study in which English dialect has been considered as a function of literacy instruction focused on Jamaican learners. Researchers used an intervention to enhance first and second grade students' language awareness and self-concept, improve students' Jamaican [English-based] Creole (JC) and Standard Jamaican English (SJE) literacy skills, and enable students to develop mastery of material taught in the content areas (Bilingual Education Project [BEP]; Devonish & Carpenter, 2007). Results from the Jamaican intervention indicated that, with the exception of monolingual speakers of JC, first and second grade students encountered difficulty with bilingual delivery (i.e., JC, SJE) (Devonish & Carpenter, 2007).
Studies concerning Caribbean bidialectal learners' literacy assessment have examined students' informal writing and oral literacy assessment. Students from Grades 1 to 6 were administered phonics tests with an emphasis on the nature and production of sound patterns, literacy, phonetics, phonics, and students' ability to discriminate between consonant clusters across SJE and JC (Lacoste, 2007; Mitchell, 2007). Findings reflected students' tendencies to read below grade level and to possess knowledge of very few letter sounds (Mitchell, 2007). Moreover, primary grade speakers and readers tended to attach known JC sound systems to words requiring SJE structures, increasing proficiency with articulation and gestures of cluster patterns after engaging in repetition (Lacoste, 2007).
A study conducted by Warrican (2006) focused on participants in the third year of high school. The researcher engaged high school students in read-alouds, discussion, and silent reading of informational and fictional texts during 45-minute sessions over a period of 16 weeks. Findings from the study reflected that students lacked interest in reading, maintained negative attitudes toward reading, and engaged in self-deprecating behaviors. These behaviors counteracted efforts to enhance feelings of self-efficacy in reading and in other forms of academia (Warrican, 2006). Extending the emphasis on high school students to consider dialect variation, Millar and Warrican (2015) conducted an action research study designed to create a third space for bridging gaps between traditional and semiotic literacies and between students' home English varieties and the Standard English used in Barbadian classrooms. Findings from the study showed that the students engaged in literate activities that drew from their home Englishes in numerous ways as their roles and the roles of their teachers were redefined.
Across these studies, the examination of dialect variation in literacy appears to be most often emphasized at the elementary level and reflects an emphasis on students' performance on local assessments. In the studies focused on high school students, researchers reported qualitative findings related to students' dialects that held limited generalizability given the sample size and research design. The current study aims to focus on bidialectal adolescents' dialect identification on an international assessment via the use of a quantitative research design. The expectation is that findings will fill the gap in the research on high school students' dialects in relation to international literacy assessment and provide insights about literacy for these adolescent learners. The following discussion of the challenges that bidialectal learners in the Caribbean face with Caribbean Englishes provides a backdrop against which to understand the study.
CHALLENGES ASSOCIATED WITH CARIBBEAN ENGLISHES
Caribbean Englishes, also defined as dialects of English, are often referred to as Creoles and Patois. However, because they operate along an English/Creole continuum, these dialects are also sometimes referred to as English (Carrington, 1992). Nero (2006) asserts that, as a result, Caribbean English (CE) speakers often self-identify as Standard English speakers and lay claim to the [standard] language despite the deviation of their non-Standard English variety from Standard English. Inadvertently, CE speakers maintain an ongoing shift between the denigration and celebration of their non-Standard English varieties, reflecting what has been referred to as “attitudinal schizophrenia” (Kachru & Nelson, 2001). Winer (2006) explains that in many ways, CE speakers' Standard English ideology leads them to be intentional about using Standard English in formal settings because it promises upward mobility and social status, but they simultaneously treasure their local non-Standard English varieties in informal contexts as symbols of national identity (St. Hilaire, 2011). For example, a CE speaker from Trinidad might use TSE when speaking with friends during a structured indoor class session at school but revert to TEC when speaking with the same individuals during an unstructured outdoor session at church. By exploring how speakers of CEs in Trinidad and Tobago self-identify linguistically, this study extends the recent emphasis placed on CEs (Clachar, 2003; Pratt-Johnson, 1993; Winford, 1997; Youssef, 2001) and serves to highlight the importance of attending to CEs in extending discussions about World Englishes (Nero, 2006).
CHALLENGES IN LANGUAGE FOR TRINIDADIAN AND TOBAGONIAN BIDIALECTAL LEARNERS
Once a colony and now independent of British rule, the Caribbean twin-island country of Trinidad and Tobago considers English its official language. Yet, across the country, three linguistic varieties are used: TSE, most often used in academic settings; TEC, most often used in informal settings; and TOB, also used primarily in informal contexts (Ferreira, 2013; Government of the Republic of Trinidad and Tobago, 2010). TSE, which acts as an acrolect, is closest to what we know as Standard English (i.e., Standard American English). TEC, on the other hand, functions as a mesolect, given that its status is lower than TSE and that it deviates significantly from the Standard English language. TOB, a basilect, is the least acknowledged in status and is less structurally aligned with TSE than its TEC counterpart. Because Trinidadian and Tobagonian youth typically reside on one of the twin islands, they are primarily bidialectal, with each being exposed to two of these three linguistic varieties. Specifically, youth in Trinidad are exposed largely to the TEC mesolect and, to a lesser degree, the TSE acrolect. Similarly, youth who reside in Tobago are more often exposed to the TOB basilect and, less often, the TSE acrolect. As a result, Trinidadian and Tobagonian youth speak predominantly TEC or TOB as their primary languages (Craig, 2006; Roberts, 2007).
As of 2013, Trinidadian and Tobagonian bidialectal youth constituted 22% of the nation's population of 0.3 million youth between the ages of 10 and 24 (United Nations Population Division, 2013). As explained earlier, the majority of these youth utilize TEC or TOB most frequently, and therefore their daily language differs significantly from Standard English, both syntactically and semantically (Craig, 2006; Roberts, 2007). Notwithstanding, they often view themselves as Standard English speakers and therefore identify as native speakers of the [Standard] English acrolect, that is, the standard and accepted form of English in formal settings (Ferreira, 2013). Trinidadian and Tobagonian bidialectal youth are typically only awakened to their positioning as non-native English speakers when they encounter ideologies about the English language beyond the Caribbean or from other Standard English speakers who designate their use of English as inferior (Canagarajah, 1999; Kachru, 1986; Nero, 2006). Otherwise, oblivious to the status of TEC and TOB as, respectively, mesolect and basilect of English, but cognizant of the status that the TSE acrolect holds (Pratt-Johnson, 2006), Trinidadian bidialectal adolescents demonstrate “a rather subtle internalization of [English] monolingualism” (Williams & Carter, 2005, p. 238).
The foregoing evidence supports our assertions concerning the lack of attention to linguistic challenges faced by bidialectal English-speaking adolescents on literacy assessment measures in Trinidad and the broader English-speaking Caribbean despite the contentions associated with non-Standard English language use. Assumptions underlying the literature attest to the fact that Trinidadian adolescent bidialectal learners have a substantial motivation to be classified as Standard English speakers on measures of literacy assessment. Moreover, multiple factors, such as adolescents' perception of language and social status, appear to contribute to their self-identification as English speakers. Given that students are not asked to classify their language status on local and national measures of literacy, and considering that students are required to report their language classification on the PISA international literacy assessment measure, we surmised that exploring Trinidadian students' self-identification of language and potential interpretations of their performance in relation to this measure would provide us with a sense of how they classified themselves and the ways in which they performed based on this classification. But first, we examine the bases upon which the PISA functions in relation to the construct of literacy as a means of providing a conceptual understanding of the notion of literacy guiding assessment in this study.
THE PROGRAMME FOR INTERNATIONAL STUDENT ASSESSMENT (PISA)
The Programme for International Student Assessment (PISA; OECD, 2014) is an international literacy measure engineered by the OECD. PISA assesses the performance of 15-year-old youth in the areas of mathematics, science, and reading literacy across member and non-member OECD nations. The OECD describes PISA as a measure of students' “knowledge and skills” (para. 1). In other words, the exam is designed to determine “how well students can apply the knowledge and skills they have learned at school to real-life challenges” (para. 3). In relation to PISA, the OECD operationalizes literacy as a student's ability to “apply knowledge and skills in key subject areas and to analyze, reason and communicate effectively as they examine, interpret and solve problems” (para. 1). This definition is largely consistent with current notions of literacy as dynamic, multifaceted, and socially and culturally situated (Castek, Leu, Coiro, Gort, Henry, & Lima, 2007; Kalantzis & Cope, 2012; Moll & Greenberg, 1990; NCTE, 2008; New London Group, 2000; Street, 1995). The OECD's conception of literacy in PISA therefore deviates from traditional notions that emphasized mastery of “specific school [reading] curricula” (para. 3) and challenges the idea of literacy as a set of foundational or constrained skills (Paris, 2005).
In part, the OECD also relies on notions of disciplinary literacy for developing PISA because the assessments in mathematics literacy, science literacy, and reading literacy require students to have “sophisticated and specific kinds of literacy support for reading in content areas, or academic disciplines” (Lee & Spratley, 2010, p. 2). In order to perform well on PISA, students must possess specific kinds of background knowledge in reading, they must do so across a range of disciplines, and they must be able to navigate texts within these disciplines. Successful performance on PISA requires that adolescent youth not only be highly functional in foundational literacy skills but also possess the prerequisite skills critical for responding to discipline-specific questions (Faggella-Luby, Graner, Deshler, & Drew, 2012). In adopting such notions of literacy for PISA, the OECD acknowledges that specific language patterns accompany particular disciplines and have an impact on students' comprehension of discipline-specific content (Fang & Schleppegrell, 2010). Notwithstanding, evidence shows that these requirements place many adolescent youth at a disadvantage, given that the disciplinary texts required are often too difficult to be read independently and demand practical strategies to manage the density of information encountered (Jetton & Shanahan, 2012).
For adolescent youth to whom PISA is administered in languages that are not their home or native languages, PISA may pose a further disadvantage. On the surface, PISA appears to attend to the variations across culture implicit in the literacy assessment of multilingual youth (Leu, Castek, Coiro, Gort, Henry, & Lima, 2005); on closer inspection, however, the assessment appears to favor the literacy and language norms privileged by Minority World educational institutions and to prioritize the dominant language and cultural forms by which society is determined to operate and through which adolescents are judged to be successful (Davies et al., 2003; Delpit, 1988). While the OECD claims to consider language in its tests for robustness (2014, para. 5), it appears to privilege a context of communication in which standard language ideology and its corresponding monolithic conceptions of literacy marginalize capable multilingual students, reducing their chances of literacy success while simultaneously increasing standard language speakers' chances of success on PISA literacy measures (Canagarajah, 2006a; Lippi-Green, 1997; Ricento, 2014). If this is truly the case, we argue that it is an importantly overlooked advantage that favors students in the Minority World (e.g., United States) and that privileges standard language speakers in Majority World (e.g., Caribbean) contexts over their nonstandardized English speaking counterparts. Such incongruence raises questions about the extent to which adolescent youth are marginalized by these circumstances, the reasons why these circumstances persist, and, ultimately, which students stand to benefit from this process.
To address these questions and to challenge prevailing linguistic norms that characterize PISA, this study examines the mean differences in literacy in math, reading, and science between self-identifying native English Trinidadian and self-identifying non-native English Trinidadian speakers on the PISA 2009 assessment. The study is the first of its kind that addresses the extent to which differences prevail across the three literacies in relation to students' self-identification as language speakers in this predominantly TEC/TOB multilingual context. Our conceptual framework, presented next, describes the way in which colonialism served as a lens through which to better understand and explain the constructs in this study.
The notion of colonialism in relation to education and, specifically, language and literacy education, provided a useful lens through which to better understand discussions surrounding the ways in which the assessment of language and of education are affected by broader sociopolitical factors, factors that point to the inadvertent impact of power and privilege on youth in the educational system. According to Thompson, Warrican, and Leacock (2011), “to discuss education in the Caribbean without reference to the influence of colonialism is to deny the pervasive and co-dependent relationship of the colonizer and the colonized” (p. 61). As used in this study, colonialism refers to “the imposition of economic and political relationships within a society by another country” (p. 61). In the case of the English-speaking Caribbean, the imposers were the British, and the Caribbean functioned as the society on which these relationships were imposed. But colonialism has also functioned beyond the Caribbean, at the global level, such that hegemonic powers (e.g., British, French, Dutch imposers) inordinately influenced the curricula and educational processes of newer countries (i.e., colonized societies) across the globe. This influence largely established the norms of dominant countries' educational systems as legitimate and simultaneously required nation states (i.e., the colonized) to conform to these norms if their educational systems and practices were to be regarded as proper and legitimate (Dale, 2000).
The requirements for conformity to the dominant educational systems of the colonizing powers have been largely responsible for the current impositions on the language and literacy instruction and assessment of nation states in what is currently described as our postcolonial era (Dale, 2000). Yet, as St. Hilaire (2011) observes, the colonial past continues to be present in many postcolonial societies, creating what we know as the colonial present. This colonial present is visible in the case of the English-speaking Caribbean, where the stigma against low-status English vernaculars persists within instructional language and literacy contexts and where students, teachers, and parents are continuously oriented towards favoring standardized Englishes for assessment and instruction (St. Hilaire, 2011). Similarly, in contexts beyond the Caribbean and across the globe, the colonial present (i.e., a reflection of the colonial past) is visible in education systems that reinforce the use of standardized Englishes as the norm, based on educational systems informed by traditional hegemonies of the colonial past, even while these systems appear to advocate for literacy and language instruction and assessment aligned with a postcolonial future.
This colonial past, which remains alive and well in the colonial present by virtue of its requirement for conformity to colonial norms, explains in part why the systems of Standard English that traditionally governed literacy instruction operated primarily based on autonomous models that designated language as a system, that privileged proficiency, and that emphasized constrained skills (Brown, 2014; Canagarajah, 2006a; Paris, 2005). This notion of the colonial past remains at odds with the descriptions of a postmodern postcolonial present where literacy takes on, instead, an ideological (Street, 1995) and pragmatic (Canagarajah, 2006a) function with social and performative dimensions. The tensions between the autonomous and ideological perspectives of literacy reflect the friction visible in the gravitation of the OECD to PISA assessments, which seem to reflect the social process of literacy (i.e., ideological model) but which, at the same time, inadvertently reinforce proficiency in the structures of standardized Englishes (i.e., autonomous model) as a prerequisite for literacy. They also explain, in part, why bidialectal adolescents continue to be judged based on native or Standard English norms that limit awareness of how language discrepancies directly impact international perceptions about their literate and language capacity (Brown, 2014; Canagarajah, 2006a; Canagarajah, 2006b; Shohamy, 2006).
Thus, colonialism, and its positioning of youth within the broader educational and linguistic context, served as a lens through which to explain the ways in which the bidialectal youth in this study were positioned by virtue of their self-identification on the PISA measure.
The following research questions guided the study: (a) How did the non-native English Trinidadian youth self-identify on the PISA 2009 assessment? (b) What were the mean differences in literacy in math, reading, and science between self-identifying native English Trinidadian and self-identifying non-native English Trinidadian speakers on the PISA 2009 assessment, both before and after controlling for the influence of demographic characteristics such as gender, grade, school type, and socioeconomic status?
The data used in this study were extracted from the Trinidad and Tobago portion of the PISA 2009 student and school data files (OECD, 2010). PISA is a cross-country survey of student attitudes and perceptions, and assesses student literacy in areas such as mathematics, reading, and science. The survey is administered under the oversight of the OECD in 3-year cycles. Rather than measuring academic achievement, PISA literacy assessments aim to assess the extent to which students can apply what they have learned during their years of compulsory schooling. This differs, in part, from the standardized measure currently utilized in the Caribbean, CSEC, which focuses on academic achievement. For this reason, and in order to facilitate cross-country comparisons, the target population for this survey is a country's entire population of 15-year-old students. Sampling from this population is based on a two-stage stratified random selection process in which a representative sample of schools is selected in the first stage, followed by a random selection of students from each selected school in the second stage. This results in representative samples that allow sample-based inferences to be generalized to the target population in the entire country (OECD, 2012).
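The two-stage selection process described above can be sketched in a few lines of code. The snippet below is purely illustrative and uses a hypothetical sampling frame; operational PISA sampling additionally stratifies schools and draws them with probability proportional to size (OECD, 2012), refinements this sketch omits.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical sampling frame: school id -> roster of enrolled 15-year-old students.
frame = {f"school_{i:03d}": [f"s{i:03d}_{j:02d}" for j in range(40)]
         for i in range(200)}

def two_stage_sample(frame, n_schools, n_students_per_school):
    """Stage 1: simple random sample of schools.
    Stage 2: simple random sample of students within each selected school.
    (Real PISA sampling stratifies schools and uses probability-proportional-
    to-size selection; this sketch omits those refinements.)"""
    schools = random.sample(sorted(frame), n_schools)
    return {school: random.sample(frame[school],
                                  min(n_students_per_school, len(frame[school])))
            for school in schools}

sample = two_stage_sample(frame, n_schools=158, n_students_per_school=30)
print(len(sample))                           # 158 schools
print(sum(len(v) for v in sample.values()))  # 4740 students
```

Because every sampled student remains linked to a sampled school, sample-based inferences can be generalized to the target population, which is the property the PISA design relies on.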
SAMPLE AND PARTICIPANTS
In the PISA 2009 survey, 4,778 students from 158 schools participated from Trinidad and Tobago (OECD, 2010). Of these, 889 students belonging to Grades 7, 8, and 11 were excluded because they were not directly relevant to the purposes of our study. Of the remaining 3,889 students, not all provided complete information on all variables of interest used in analytical procedures presented later in this study, leaving 3,184 cases with complete information. Of these 3,184 students, 52.3% (n = 1,666) were girls and 47.7% (n = 1,518) were boys; 28.5% (n = 909) were in Grade 9 while the rest were in Grade 10 (n = 2,275); 89.7% (n = 2,856) were enrolled in public schools and 10.3% (n = 327) were enrolled in private schools; and across these groups, 97.3% (n = 3,098) identified English (i.e., Standard English) while 2.7% (n = 86) identified a language other than Standard English as their native language (i.e., non-Standard English). Cross tabulations of gender, grade, and school type by self-identified native language are provided in column 2 of Tables 1 and 2.
Table 1. Descriptive Statistics for Standardized Literacy Scores in Mathematics, Reading, Science, Mean School SES, by Gender, Grade, and School Type, Cross-Tabulated by Self-Identifying Native Language
Note. n = 3,184. The three literacy scores and school SES are standardized variables with M = 0 and SD = 1 for the overall sample.
Table 2. Descriptive Statistics for the Five SES Subscales, by Gender, Grade, and School Type, Cross-Tabulated by Self-Identifying Native Language (Native English, Non-Native English)
Note. n = 3,184. Cultural possessions, educational resources, and wealth are standardized variables with M = 0 and SD = 1 for the overall sample.
Dependent variables. Students included in our sample were assessed in three areas of literacy: mathematics, reading, and science. The PISA student survey data file provided literacy scores on all three subjects for each sampled student. These literacy scores are used as outcome variables in analytical models presented later in this study. The correlations among the three literacy scores exceeded .75 in our sample (see Table 3). The literacy score in each subject was based on a number of items ranging in format from multiple-choice to constructed response. Sample assessment items for each subject are available at OECD's PISA website and publications (OECD, 2006a, 2006b, 2006c, 2014). The raw literacy scores for all students in our sample ranged between 147.07 and 704.48 (M = 431.23, SD = 90.16) in mathematics, between 91.04 and 784.53 (M = 438.39, SD = 98.89) in reading, and between 34.07 and 728.96 (M = 428.52, SD = 98.29) in science. These literacy scores are standardized such that the overall mean for OECD countries is 500 (SD = 100). Thus, the three mean literacy scores of students in the Trinidad and Tobago sample were lower, and slightly less variable, than those of their counterparts in OECD countries.
Table 3. Pearson Product Moment Coefficients of Correlation for Literacy Scores in Mathematics, Reading, and Science, School SES, and SES Subscales
Note. n = 3,184. Cohen's (1992) effect size interpretation in parentheses: S = small, M = medium, L = large.
*p < .001 for all reported correlations.
Independent variables. The following factors and covariates were used as independent variables in our statistical analyses. Of these, self-identifying native language is our primary variable of interest while the remaining act as controls for demographic differences.
Self-identifying native language. This is a nominal variable with two categories, self-identifying native English and self-identifying non-native English. Since the PISA survey did not specify what these other languages were, all students who did not identify English as their native language were grouped into the same category (i.e., self-identifying non-native English speakers).
Gender. This variable captures the gender of a student. It has two values, male and female.
Grade. This variable captures a student's academic grade. In our original sample, 81.4% of the students belonged to either Grade 9 or 10. Other represented grades included Grades 7, 8, and 11. However, these were excluded from our analysis as these grades are atypical for 15-year-old students and tend to represent either accelerated students or those who repeat grades. This exclusion means that results from our tests of inference are generalizable only to 15-year-old students enrolled in Grades 9 and 10.
School type. This variable classifies a student as enrolled in either a public school or a private school.
Indicators of socioeconomic status. In order to control for socioeconomic status (SES) based differences, we used both SES at the student level and mean school SES, as both are known to be strong predictors of student performance on standardized assessments. At the individual level, SES is captured by a number of subscales that include cultural possessions at home, availability of educational resources at home, parental education, parental occupation, and family wealth.
Cultural possessions. This scale was based on three items that asked the students about items of cultural significance at their home. A sample item included, "In your home do you have books of poetry?" The response categories for this scale were 1 (yes) and 2 (no). The reliability of this scale was .49 in our sample with interitem correlations ranging from .16 to .39 (M = .25, SD = .11). The standardized value of this scale ranged from -1.97 to 1.13 (M = 0, SD = 1). Higher values on this scale represent ownership of more cultural possessions at home.
Educational resources. This scale was based on seven items that asked the students about the availability of educational resources at home. A sample item included, "In your home do you have a desk to study at?" The response categories for this scale were 1 (yes) and 2 (no). The reliability of this scale was .65 in our sample with interitem correlations ranging from .03 to .52 (M = .18, SD = .14). The standardized value of this scale ranged from -4.17 to 1.07 (M = 0, SD = 1). Higher values on this scale represent availability of more educational resources at home.
Parental education. This variable captured a student's parental education, taking the greater of the father's and mother's years of schooling. The value of this variable ranged from 3 to 16 (M = 12.40, SD = 3.09) in our sample. Higher values of this variable represent more education.
Parental occupation. This index was based on the International Socio-Economic Index of Occupational Status (ISEI) developed by Ganzeboom, de Graaf, and Treiman (1992). The value of this variable ranged from 16 to 88 (M = 46.19, SD = 15.28) in our sample. Higher values of this variable represent more prestigious occupations.
Wealth. This scale was based on 12 items that asked the students about items representative of family wealth at their home. Seven of the items had two response categories while the remaining had four. For the first set a sample item included, "In your home do you have a room of your own?" The response categories for this scale were 1 (yes) and 2 (no). For the second set a sample item included, "How many of these are there at your home: Computers?" The response categories for this scale were 1 (none), 2 (one), 3 (two), and 4 (three or more). After inverting the responses on the first set of items in order for them to have the same direction as the second set, the reliability of this scale was .71 in our sample with interitem correlations ranging from -.06 to .49 (M = .16, SD = .11). The standardized value of this scale ranged from -3.78 to 3.68 (M = 0, SD = 1) in our sample. Higher values of this variable are indicative of greater family wealth.
School SES. This school-level variable was derived from the five subscales of student-level SES. For each school in the sample, school SES was computed as the mean SES of all students sampled from that school. The standardized value of this scale ranged from -2.11 to 3.15 (M = 0, SD = 1) in our sample. Higher values of this variable are indicative of higher SES.
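The school SES derivation described above can be sketched as follows. The student values here are made up for illustration, not taken from the PISA files, and for simplicity the standardization step is applied across the two toy schools:

```python
# Sketch of the school SES variable: for each school, take the mean SES of
# its sampled students, then standardize the resulting school-level values.
import statistics

student_ses = {"school_A": [0.5, -0.2, 1.1], "school_B": [-1.0, -0.4]}

# Step 1: mean student SES per school.
school_ses = {s: statistics.mean(v) for s, v in student_ses.items()}

# Step 2: standardize so the school-level variable has M = 0, SD = 1.
vals = list(school_ses.values())
m, sd = statistics.mean(vals), statistics.stdev(vals)
school_ses_z = {s: (x - m) / sd for s, x in school_ses.items()}
```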
Analytical method. In order to test for the mean difference in mathematics, reading, and science literacy between self-identifying native and non-native speakers of English in our sample, we employed Student's independent-samples t test. By design, however, these t tests assume that, other than self-identifying native language, there is no difference between the two groups of students. This is a very strong assumption that is unlikely to hold. To see whether our results changed with the introduction of additional dimensions, we controlled for a variety of student background characteristics such as gender, grade, school type, and SES by estimating three univariate analysis of covariance (ANCOVA) models with literacy scores as outcomes, native language as our primary factor, and the aforementioned student characteristics as control variables. We excluded two-way and higher-order interactions from our estimated models because of very small and empty cell sizes.
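The logic of the ANCOVA specification can be sketched on simulated data (all names and values below are ours, not drawn from the PISA files): the literacy score is regressed on a native-language dummy plus controls, so the dummy's coefficient estimates the covariate-adjusted mean difference between the two language groups.

```python
# ANCOVA-as-regression sketch with simulated data. The native-language
# dummy's coefficient is the adjusted group difference the model reports.
import numpy as np

rng = np.random.default_rng(0)
n = 3184
native = rng.binomial(1, 0.973, n)    # 1 = self-identifying native English
grade10 = rng.binomial(1, 0.715, n)   # 1 = Grade 10, 0 = Grade 9
ses = rng.normal(0.0, 1.0, n)         # standardized SES composite

# Simulate scores with a built-in 30-point adjusted language gap.
score = 400 + 30 * native + 45 * grade10 + 50 * ses + rng.normal(0, 60, n)

X = np.column_stack([np.ones(n), native, grade10, ses])  # design matrix
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
adjusted_gap = beta[1]  # estimate of the adjusted language-group difference
```

With a sample of this size the fitted `adjusted_gap` recovers the simulated 30-point effect to within a few points, which is the sense in which the ANCOVA coefficients in Table 6 are covariate-adjusted group differences.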
We computed descriptive statistics, validated underlying model assumptions for both the independent-samples t tests and the univariate ANCOVA models, and used appropriate sampling weights in our computations. All computations were performed with SPSS 22.0. Unless otherwise noted, we used a .05 level of significance to evaluate tests of hypotheses. Observed effect sizes (Cohen's d and R2) were interpreted based on guidelines recommended by Cohen (1992). Our a priori power analysis estimated that in order to detect a medium effect size, f = .25 (Cohen, 1992), with a .05 level of significance, .95 power, and 10 numerator degrees of freedom, the required sample size for ANCOVA was n = 400, which is much smaller than our effective sample size of 3,184. Other things being equal, this is equivalent to saying that our sample size is sufficiently large to detect an effect size as small as f = .09, and that our ANCOVA models have almost .99 power to detect a medium effect when such an effect exists.
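The reported power figures can be reproduced approximately from first principles (a sketch under our own assumptions; the authors' power software is not named). Power of a fixed-effects F test is the probability that a noncentral F variate, with noncentrality λ = f²·N, exceeds the central-F critical value:

```python
# Approximate a priori power for an F test with Cohen's f, alpha = .05,
# and 10 numerator df, using scipy's noncentral F distribution.
from scipy.stats import f as f_dist, ncf

def anova_power(effect_f, n_total, df_num, alpha=0.05):
    df_den = n_total - df_num - 1              # approximate residual df
    f_crit = f_dist.ppf(1 - alpha, df_num, df_den)
    lam = effect_f ** 2 * n_total              # noncentrality parameter
    return ncf.sf(f_crit, df_num, df_den, lam)

power_at_400 = anova_power(0.25, 400, 10)      # reported required n
power_at_3184 = anova_power(0.25, 3184, 10)    # the study's effective n
```

Under these assumptions, power comes out comfortably above .9 at n = 400 and essentially 1 at n = 3,184, consistent with the reported analysis.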
For this investigation, we relied on the standardized scores for these three literacies obtained from the PISA 2009 survey. Descriptive statistics for all variables of interest included in our analysis are presented in Tables 1 and 2. Specifically, the mean standardized scores obtained by the sample in the two language groups on the three literacy measures are shown in Table 1, disaggregated by gender, grade, and school type. Pearson product moment coefficients of correlation among the three literacy outcomes and continuous predictors are presented in Table 3.
Results from independent samples t tests that evaluated the mean difference in the three literacy scores between self-identifying native and non-native speakers of English are presented in Table 4. These results suggest that there is a significant mean difference in literacy between these two groups for mathematics, t(3182) = 3.71, p < .001, reading, t(3182) = 6.28, p < .001, and science, t(3182) = 4.15, p < .001, with the unstandardized mean literacy score of self-identifying native English speakers exceeding that of self-identifying non-native speakers by 36.61 points (d = 0.41, medium effect) in mathematics, 67.62 points (d = 0.69, large effect) in reading, and 44.60 points (d = 0.45, medium effect) in science. A bar graph for the mean difference between the two language groups by literacy area based on t test results is presented in Figure 1.
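The reported effect sizes can be cross-checked directly: Cohen's d is the raw mean difference divided by a pooled SD, and with the non-native group this small, the overall-sample SDs from the descriptive statistics above are a close stand-in for the pooled SDs. The figures below are copied from the text:

```python
# Back-of-the-envelope check of Cohen's d from the reported statistics.
mean_diffs = {"math": 36.61, "reading": 67.62, "science": 44.60}
overall_sds = {"math": 90.16, "reading": 98.89, "science": 98.29}

d = {k: round(mean_diffs[k] / overall_sds[k], 2) for k in mean_diffs}
# Recovers d of about 0.41 (math), 0.68 (reading), and 0.45 (science),
# matching the reported 0.41, 0.69, and 0.45 up to the pooled-SD approximation.
```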
Table 4. Independent Samples t Test Results for Mean Difference Between Self-Identifying Native English and Self-Identifying Non-Native English Speakers for Literacy Scores in Mathematics, Reading, and Science
Note. n = 3,184. df = 3182. Two-tailed critical |t| at 5% level of significance = 1.96. Effect size interpretation based on Cohen (1992).
*p < .001 for all reported t values.
Figure 1. Mean literacy scores in mathematics, reading, and science for self-identifying native and non-native English speakers before controlling for other demographic differences. For each subject the two means are significantly different from each other
In order to see if the results of our bivariate analysis persisted after controlling for student characteristics, we estimated one ANCOVA model for each literacy outcome with self-identified native language as the main factor and student characteristics such as gender, grade, school type, and SES as control variables. Results from these ANCOVA models are summarized in Table 5 with corresponding parameter estimates presented in Table 6. These results suggest that for mathematics, the main effect of all variables except school type and parental occupation was significant; for reading, the main effect of all variables except cultural possessions was significant; and for science, the main effect of all variables except gender and school type was significant.
Table 5. ANCOVA Results for Literacy Scores in Mathematics, Reading, and Sciencea
Note. n = 3,184.
aANCOVA R2 values were .50, .49, and .47 for mathematics, reading, and science models, respectively.
Table 6. Parameter Estimates from ANCOVA for Literacy Scores in Mathematics, Reading, and Sciencea
Note. n = 3,184. b = coefficient based on standardized outcome. b' = coefficient based on unstandardized outcome.
*p < .001
aANCOVA R2 values were .50, .49, and .47 for mathematics, reading, and science models, respectively.
bReference category was private for school type, male for gender, 10 for grade, and non-native English speaker for native language.
Thus, even after controlling for student demographic differences there was a significant mean difference between self-identifying native and non-native English speakers in all three subjects, with the mean score of native English speakers exceeding that of self-identifying non-native English speakers by 25.08 points in mathematics (Means: native, 425.83; non-native, 400.74), 53.13 points in reading (Means: native, 435.97; non-native, 382.83), and 30.38 points in science (Means: native, 425.08; non-native, 394.69). The total variation in literacy scores explained by these ANCOVA models was 50% for mathematics, 49% for reading, and 47% for science. A bar graph for the mean difference between the two language groups by literacy area based on ANCOVA results is presented in Figure 2.
Figure 2. Mean literacy scores in mathematics, reading, and science for self-identifying native and non-native English speakers after controlling for other demographic differences. For each subject the two means are significantly different from each other
In addition to the significant main effect of self-identified native language, our ANCOVA results also revealed large main effects of school SES and grade. Our estimation results suggest that a 1 SD increase in the value of school SES raised mean literacy scores by as much as 55.94 points in mathematics, 51.42 points in reading, and 55.69 points in science. These coefficient estimates exceed one half of the corresponding standard deviations in these subjects. Similarly, mean scores of Grade 10 students significantly exceeded those of their Grade 9 counterparts by 39.96 points in mathematics, 49.82 points in reading, and 46.51 points in science. In addition to these large effects, the mean difference between boys and girls was also significant for mathematics and reading, with boys on average outperforming girls in mathematics by 8.19 points and girls outperforming boys in reading by 35.09 points. For school type, we observed a significant mean difference only in reading scores, with students enrolled in private schools outperforming their public school counterparts on average by 9.57 points.
HIERARCHICAL LINEAR MODELS (HLM) ANALYSIS
In order to validate our ANCOVA results, we also estimated a series of hierarchical linear models (HLM) separately for each of the three literacy outcomes. For each outcome, the first model was a null (or intercepts only) model that allowed us to separate within-school variation from between-school variation. The second model contained only student-level (Level 1) predictors. This model allowed us to estimate the proportion of total variation in the outcome variable that could be explained by Level 1 predictors. Finally, we estimated a full model containing both Level 1 and Level 2 (school-level) predictors. This model helped us further estimate the total proportion of explained variation in the outcome that was attributable to school-level predictors. Results from the second and third HLM models were combined in order to get an estimate of the total amount of variation in our outcome variables that could be explained by all the predictors taken together. These estimates of explained variation are comparable to R-square estimates from our ANCOVA models.
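What the null (intercepts-only) model buys is a split of total variance into within-school and between-school parts, whose between share is the intraclass correlation (ICC). The same split can be mimicked with a simple ANOVA-style estimator on simulated, balanced data (all values below are assumed for illustration, not estimates from the PISA files):

```python
# ANOVA-style variance decomposition, the quantity an intercepts-only HLM
# estimates: between-school vs. within-school variance, and their ratio.
import numpy as np

rng = np.random.default_rng(1)
n_schools, per_school = 158, 25
school_effects = rng.normal(0, 40, n_schools)           # between-school SD 40
scores = school_effects[:, None] + rng.normal(0, 80, (n_schools, per_school))

within_var = scores.var(axis=1, ddof=1).mean()          # pooled within-school
between_var = scores.mean(axis=1).var(ddof=1) - within_var / per_school
icc = between_var / (between_var + within_var)          # true value here: .20
```

Level 1 predictors then shrink the within-school component, and Level 2 predictors the between-school component, which is how the combined proportions of explained variation reported below are obtained.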
The difference between our ANCOVA and HLM results was small. Although the values of the test statistics and corresponding p values changed, the variation was not dramatic. For example, for math literacy all of our student-level predictors and school SES had the same signs as observed in the corresponding ANCOVA model. The only difference that we observed in the full HLM model was that cultural possessions and parental education were no longer statistically significant. We note that although these predictors were statistically significant in our ANCOVA model, even there both predictors had almost no practical significance. For instance, in our ANCOVA model a 1 SD increase in cultural possessions was associated with an increase of only 3 points in the standardized math literacy score (note: OECD, M = 500, SD = 100). Finally, we note that the effect of the three primary predictors, native language, grade, and school SES, remained both statistically and practically significant in both the HLM and ANCOVA models, with only a small difference in corresponding parameter estimates.
HLM results for our reading outcome variable were even less dramatic, with only school type now being statistically nonsignificant. Earlier, in our ANCOVA model, this predictor had exhibited marginal statistical significance. Similar results were observed in the HLM model for the science literacy outcome. For each of the three outcome variables (math, reading, and science) the three main predictors, native language, grade, and school SES, exhibited both high statistical significance, p < .001, and large effect sizes. The total amount of variation explained by the HLM models for the math, reading, and science literacy outcomes was 47%, 46%, and 43%, respectively. These estimates are comparable to 50%, 49%, and 47% from the corresponding ANCOVA models. The similarity in parameter estimates and total proportion of explained variation from our ANCOVA and HLM analyses supports the robustness of our statistical findings.
The purpose of this study was to determine Trinidadian bidialectal adolescents' language identification on the PISA 2009 assessment and to examine differences in these adolescents' performance across the PISA 2009 literacy assessments based on this self-identification, before and after controlling for influential factors. From the outset, the study showed clearly that the Trinidadian adolescents classified themselves primarily as Standard English speakers when given the choice between Standard English and non-Standard English on the PISA literacy assessment. According to the demographic information available, most adolescents who identified themselves as English speakers on the PISA 2009 were primarily speakers of Trinidadian English-lexicon Creole (TEC) and Tobagonian English-lexicon Creole (TOB), the mesolects and basilects used in the two islands, respectively. The choice of bidialectal adolescents to identify themselves as speakers of English is thus likely the result of their use of these English-based dialects. Moreover, this choice may have been influenced by Caribbean teachers (Godley, Carpenter, & Werner, 2007), Caribbean parents (Deuber & Youssef, 2007; Youssef, 2001), and administrators (Government of the Republic of Trinidad and Tobago, 2010), many of whom may have operated according to the norms of a colonial present in their acceptance of Standard English as the language of instruction in the classroom (St. Hilaire, 2011).
As Ferreira (2013) observes, Trinidadian and Tobagonian youth generally identify themselves as speakers of [standard] English, a phenomenon that could have led them to select the Standard English classification on the test. Though their linguistic varieties (i.e., TEC, TOB) differ significantly from Trinidadian Standard English (TSE) in morphology, syntax, and phonology (Ferreira, 2013), the students did not view the syntactic and semantic deviation of these English dialects as a basis for non-Standard English speaking status. In deciding to self-identify as English speakers on a literacy assessment that privileges, is based on, and appears to favor literacy proficiency in Standard English (i.e., the autonomous model) (Brown, 2014; Paris, 2005), the students implied that they possessed Standard English speaking proficiency and, therefore, Standard English literacy proficiency. While PISA does allow these learners the alternative of classifying as non-native English speakers, that designation too seems inappropriate, because it positions Trinidadian and Tobagonian youth as having no knowledge of English when this is not the case. In effect, students were presented with a dichotomy in which their actual language status fell outside both options and therefore could not be captured by the assessment. The absence on the PISA assessment of an option that would allow students to reflect their status as dialect speakers therefore seems to be a concern (OECD, 2010). There is some reason to believe that a dialect-classification option on the PISA assessment could be useful, as it could reveal whether the adolescents selected the English option out of affinity to Standard English or simply because no dialect option was offered.
Determining whether students would have selected the bidialectal option had it been available and the extent to which this self-identification can provide further insights about their literacy performance remain an area for further research.
The presence of large and significant differences between self-identifying native English and non-native English speakers, even after controlling for important student demographic differences such as grade, gender, school type, and indicators of socioeconomic and cultural status, implies that language-based performance differences are real and are unexplained by socioeconomic and other demographic differences among students. Large disparities across all literacy measures suggest that when distinctions are made between students who self-identify as English speakers and students who do not, significant variations in literacy performance persist. However, interpretations of literacy performance based on these variations may be inaccurate because of PISA's unwitting designation of these bidialectal adolescents as [standard] English speakers, which seemingly presumes that their literacy performance should be comparable to that of other Standard English speaking counterparts. Should bidialectal learners' status as non-native English speakers be considered on PISA, their literacy underperformance could more likely be accurately interpreted.
As is, our findings imply that the results from the PISA survey, though representative of students' literacy performance, do these students an injustice by comparing them to Standard English speakers across the globe whose English language proficiency is considerably different from that of Trinidadian bidialectal youth. This comparison, seemingly predicated on traditions of colonialism that inadvertently privilege students' conformity to Standard English norms, obscures the ways in which the bidialectal learners' literate and language capacity was judged by these norms (Brown, 2014; Shohamy, 2006). An opportunity on the PISA assessment for Trinidadian bidialectal learners to self-identify appropriately is more likely to allow for an accurate representation of their literacy performance that acknowledges their TEC and TOB Englishes. In doing so, the system is less likely to sanction unjust comparisons, steeped in traditions of a colonial past (St. Hilaire, 2011), between these learners and their Standard English speaking counterparts, and would be better able to identify solutions to address literacy underperformance for these youth.
In addition to the observations above, findings from the study suggested that though the bidialectal adolescents identified as English speakers, all adolescents struggled to perform in mathematics, science, and reading literacy regardless of their self-identification on the assessment measures. As shown earlier, the population of self-identifying Trinidadian adolescents, most of whom were presumably bidialectal learners, performed significantly below the OECD mean of 500, with considerable spread in performance. Their self-identifying non-native English speaking counterparts fell even further below this mean. The underperformance of these youth suggests that the disciplinary nature of the PISA literacy assessments may have played a role in mean differences. To date, much of the science, mathematics, and reading literacy assessment to which Trinidadian adolescents are exposed on national and local measures treats literacy largely as a body of knowledge rather than as the application of skills across various contexts. This tendency seems to reflect the orientation of educational systems within the Caribbean to gravitate toward colonial norms that focus on proficiency in English literacy rather than on performativity based on the social contexts in which literacy use is leveraged (Canagarajah, 2006a).
These findings raise concerns about how the literacy orientation to colonial norms in a postcolonial era (Dale, 2000; St. Hilaire, 2011) may explain gaps between the content-specific (i.e., disciplinary) literacies (Lee & Spratley, 2010; Vacca, Vacca, & Mraz, 2011) of students in the Majority World twin-island state of Trinidad and Tobago and those of students in Minority World countries. There is some evidence to suggest that in the Caribbean, the content-specific literacies assessed on the PISA 2009, which are in many ways similar to disciplinary literacy, tend not to be widely acknowledged or addressed in teacher preparation programs (Nero, 2000; Yiakoumetti & Mina, 2013). For example, surveys of Caribbean teachers revealed a limited awareness of disciplinary literacy, an unwillingness to teach reading because it is considered the purview of English teachers (Leacock & Warrican, 2006), and a lack of training to deliver disciplinary literacy instruction (Warrican & Leacock, 2007). Moreover, the literature at large indicates that specific language patterns accompany particular disciplines and have an impact on students' comprehension of discipline-specific content (Fang & Pace, 2013; Fang & Schleppegrell, 2010). When literacy is not emphasized as a social process, the language patterns that accompany specific literacies are sidelined in favor of attention to students' acquisition of English literacy proficiency (Brown, 2014; Canagarajah, 2006a); taken together, these factors could very well explain the disparities observed among self-identifying Standard English speakers.
Although the main focus of our study was on differences in language self-identification and its impact on literacy, our use of gender primarily as a control variable provided useful insights into the literacy performance of the bidialectal youth. Our results indicated that after controlling for important differences such as grade, school type, native language, and indicators of SES, adolescent males in general outperformed females in mathematics literacy, and adolescent females in general outperformed their male counterparts in reading literacy. In terms of magnitude, the mean difference in reading literacy scores was approximately four times larger than the mean difference in mathematics literacy scores, indicating that adolescent males lag significantly behind females in reading literacy. These findings are quite revealing because they contradict the general perception in the Caribbean region that males outperform females academically (Layne, Jules, Kutnick, & Layne, 2008). Moreover, the findings are in harmony with evidence from earlier studies reflecting that though adolescent females tend to outperform males in English, males have the upper hand in mathematics, and any gap between genders in science is narrow (George, Quamina-Aiyejina, Cain, & Mohammed, 2009; Salandy, Mendez, & Ragoo, n.d.). While the extent to which language played a role in this dynamic remains unknown, there may be reason to investigate its impact on Trinidadian bidialectal adolescents, both male and female. As shown by a recent study, males are beginning to lag behind in so-called male-dominated areas (such as mathematics) in the region (Ministry of Science, Technology and Tertiary Education, 2011). The ways in which self-identification of native language potentially interacts with the results of students' assessments can prove to be particularly revealing.
Despite the fact that our analytical procedures revealed several significant effects, care should be taken when generalizing these results. Our findings are applicable only to the population that is represented by our sample: viz., 15-year-old students enrolled in Grades 9 and 10 in private and public schools in Trinidad and Tobago. While our findings could be generalized to other populations with characteristics similar to those of students included in our sample, derivation of accurate conclusions based on such generalizations is not guaranteed.
This study is the first of its kind in the Caribbean region, and few if any studies have considered the positioning of non-Standard English speakers and the implications of their self-identification for the interpretation of international literacy assessments. Beyond the concerns raised in our discussion of the results, this study justifies the need for closer attention to the pervasive role of colonialism in the dominance of Standard English in multilingual testing (Shohamy, 2006). It highlights the need for attention to bidialectal students' performativity in World Englishes that challenge normative Standard English literacy proficiency (Canagarajah, 2006a), and it requires that assumptions steeped in colonialism that underlie Standard English literacy testing on the PISA international measure be revisited if bidialectal adolescent learners are to be represented on these measures as accurately as their monolingual and Standard English speaking counterparts. While our study's design did not allow us to directly determine the ways of thinking (i.e., ideologies) emanating from colonialism that specifically influenced the bidialectal learners' selection of Standard English as their native language, the lack of a bidialectal option for these students on the PISA measures reflects, in part, perceived notions in the test preparation process that legitimate Standard English as a dominant language and treat its dialect varieties as substandard enough to be omitted from the assessment (Kamwendo, 2006; Winer, 2006). Whether intentional or not, this legitimization of Standard English seems inescapable in discussions about preparation of the PISA 2009 assessment and appears directly linked to the ways in which the colonial present remains reflective of the colonial past, thus resulting in the privileging of Standard English norms over their nonstandard counterparts in these literacy measures.
As a result, Trinidadian and Tobagonian bidialectal adolescents' literacy performance was misjudged based on the results of these measures.
To address this situation more deeply, and to obtain additional insights, future research in this area can replicate our method with other bidialectal populations and with PISA literacy findings to determine if and how the self-identification of bidialectal speakers and learners potentially confounds international literacy assessment results for bidialectal and multidialectal students in other Majority World contexts such as Africa and India. Similarly, researchers may do well to examine this premise in Minority World contexts such as the United States and Canada. By acknowledging the ways in which dominant Standard English norms function in literacy assessment as well as the assumptions of colonialism that accompany them, we can better understand how bidialectal student populations are unfairly judged by these assessments. In doing so, the field stands poised to facilitate the language and literacy needs of bidialectal adolescent youth.
1. We use the population-count-based Minority World label in place of traditional labels such as First World, Global North, More Developed Nation, Advanced, More Economically Developed, and Post-Industrial Countries.
2. We use the population-count-based Majority World label in place of traditional labels such as Third World, Developing World, Global South, Underdeveloped Nations, and Less Developed Countries.
3. We use the term English-speaking Caribbean to denote countries in the Eastern and Northern Caribbean region that were, at some point in their histories, colonized by Britain and thus so affected by the use of English that their primary de facto language for education and academic purposes is Standard English. This is despite the fact that the majority of these populations speak, as a country and on a day-to-day basis, a language variety primarily derived from Standard English and, in many cases, additional language varieties based on Spanish or French.
Alim, H. S. (2009). Creating "an empire within an empire": Critical Hip Hop language pedagogies and the role of sociolinguistics. In H. S. Alim, A. Ibrahim, & A. Pennycook (Eds.), Global linguistic flows: Hip Hop cultures, youth identities and the politics of language (pp. 213–230). New York, NY: Routledge.
Bogle, M. (1997). Constructing literacy: Cultural practices in classroom encounters. Caribbean Journal of Education, 19(2), 179–190.
Brown, J. D. (2014). The future of world Englishes in language testing. Language Assessment Quarterly, 11(1), 5–26.
Canagarajah, A. S. (1999). Interrogating the native speaker fallacy: Non-linguistic roots, non-pedagogical results. In G. Braine (Ed.), Non-native educators in English language teaching (pp. 77–92). Mahwah, NJ: Lawrence Erlbaum Associates.
Canagarajah, A. S. (2006a). Changing communicative needs, revised assessment objectives: Testing English as an international language. Language Assessment Quarterly, 3(3), 229–242.
Canagarajah, A. S. (2006b). Globalization of English and changing pedagogical priorities: The post-modern turn. In B. Beaven (Ed.), IATEFL 2005 Cardiff Conference selections (pp. 15–25). Canterbury, UK: IATEFL.
Canagarajah, A. S., & Said, S. B. (2011). Linguistic imperialism. In J. Simpson (Ed.), The Routledge handbook of applied linguistics (pp. 388–400). Oxford, UK: Routledge.
Carrington, L. (1992). Caribbean English. In T. McArthur (Ed.), The Oxford companion to the English language (pp. 191–193). Oxford, UK: Oxford University Press.
Castagno, A. E., & Brayboy, B. M. J. (2008). Culturally responsive schooling for Indigenous youth: A review of the literature. Review of Educational Research, 78(4), 941–993.
Castek, J., Leu, D. J., Coiro, J., Gort, M., Henry, L. A., & Lima, C. O. (2007). Developing new literacies among multilingual learners in the elementary grades. In L. L. Parker (Ed.), Technology-mediated learning environments for young English learners: Connections in and out of school (pp. 111–113). New York, NY: Routledge.
Clachar, A. (2003). Paratactic conjunction in creole speakers' and ESL learners' academic writing. World Englishes, 22(3), 271–289.
Cohen, J. (1992). Quantitative methods in psychology: A power primer. Psychological Bulletin, 112(1), 155–159.
Craig, D. R. (1999). Teaching language and literacy: Policies and procedures for vernacular situations. Georgetown, Guyana: Education and Development Services.
Craig, D. R. (2006). Teaching language and literacy. Kingston, Jamaica: Ian Randle.
Crystal, D. (1997). English as a global language. Cambridge, UK: Cambridge University Press.
Crystal, D. (2003). English as a global language (2nd ed.). Cambridge, UK: Cambridge University Press.
Dale, R. (2000). Globalization and education: Demonstrating a common world educational culture or locating a globally structured educational agenda? Educational Theory, 50(4), 427–448.
Davies, A., Hamp-Lyons, L., & Kemp, C. (2003). Whose norms? International proficiency tests in English. World Englishes, 22(4), 571–584.
Delpit, L. (1988). The silenced dialogue: Power and pedagogy in educating other people's children. Harvard Educational Review, 58(3), 280–299.
Deuber, D., & Youssef, V. (2007). Teacher language in Trinidad: A pilot corpus study of direct and indirect Creolisms in the verb phrase. Retrieved from http://www.birmingham.ac.uk/documents/college-artslaw/corpus/conference-archives/2007/31Paper.pdf
Devonish, H., & Carpenter, K. (2007). Towards full bilingualism in education: The Jamaican bilingual primary education project. Social and Economic Studies, 56(1/2), 277–303.
Ellis, E. (2004). The invisible multilingual teacher: The contribution of language background to Australian ESL teachers' professional knowledge and beliefs. The International Journal of Multilingualism, 1(2), 90–108.
European Commission. (2012). Foreign language learning statistics. Retrieved from http://epp.eurostat.ec.europa.eu/statistics_explained/index.php/Foreign_language_learning_statistics
Faggella-Luby, M. N., Graner, P., Deshler, D. D., & Drew, S. (2012). Building a house on sand: Why disciplinary literacy is not sufficient to replace general strategies for adolescent learners who struggle. Topics in Language Disorders, 32(1), 69–84.
Fang, Z., & Pace, B. G. (2013). Teaching with challenging texts in the disciplines: Text complexity and close reading. Journal of Adolescent and Adult Literacy, 57(2), 104–108.
Fang, Z., & Schleppegrell, M. J. (2010). Disciplinary literacies across content areas: Supporting reading through functional language analysis. Journal of Adolescent & Adult Literacy, 53(7), 587–597.
Ferreira, J. S. (2013). A brief overview of the sociolinguistic history of Trinidad and Tobago. In P. M. Lewis, G. F. Simons, & C. D. Fennig (Eds.), Ethnologue: Languages of the world (17th ed.). Dallas, TX: SIL International. Retrieved from http://www.ethnologue.com
Ganzeboom, H., de Graaf, P., & Treiman, D. (1992). A standard international socio-economic index of occupational status. Social Science Research, 21, 1–56.
George, J., Quamina-Aiyejina, L., Cain, M., & Mohammed, C. (2009, June 15). Gender issues in education and intervention strategies to increase participation of boys. Retrieved from https://www.mona.uwi.edu/cop/library/gender-issues-education-and-intervention-strategies-increase-participation-boys
Godley, A. J., Carpenter, B. D., & Werner, C. A. (2007). "I'll speak in proper slang": Language ideologies in a daily editing activity. Reading Research Quarterly, 42, 100–131.
Government of the Republic of Trinidad and Tobago. (2010). Language and language education policy. Retrieved from https://www.mona.uwi.edu/cop/library/gender-issues-education-and-intervention-strategies-increase-participation-boys
Hill, C., & Parry, K. (2014). From testing to assessment: English as an international language (2nd ed.). New York, NY: Routledge.
International Reading Association. (2012). Adolescent literacy: A position statement of the International Reading Association. Retrieved from http://www.reading.org/Libraries/resources/ps1079_adolescentliteracy_rev2012.pdf
Jetton, T. L., & Shanahan, C. (2012). Adolescent literacy in the academic disciplines: General principles and practical strategies. New York, NY: Guilford Press.
Johnson, R. K. (1990). International English: Towards an acceptable, teachable target variety. World Englishes, 9(3), 301–315.
Jules, D. (2010, December 13). Rethinking education in the Caribbean. Retrieved from http://www.cxc.org/?q=media-centre/cxcs_blog
Kachru, B. B. (1986). The alchemy of English: The spread, functions, and models of non-native Englishes. Oxford, UK: Pergamon Press.
Kachru, B. B., & Nelson, C. (2001). World Englishes. In A. Burns & C. Coffin (Eds.), Analysing English in a global context (pp. 9–25). New York, NY: Routledge.
Kalantzis, M., & Cope, B. (2012). Literacies. Cambridge, UK: Cambridge University Press.
Kamwendo, G. H. (2006). No easy walk to linguistic freedom: A critique of language planning during South Africa's first decade of democracy. Nordic Journal of African Studies, 15(1), 53–70.
Kirkpatrick, A. (2007). World Englishes: Implications for international communication and English language teaching. Cambridge, UK: Cambridge University Press.
Labov, W. (1972). Language in the inner city: Studies in the Black English vernacular. Philadelphia, PA: University of Pennsylvania Press.
Lacoste, V. (2007). Modelling the sounds of Standard Jamaican English in a Grade 2 classroom. Caribbean Journal of Education, 29, 290–326.
Layne, A., Jules, V., Kutnick, P., & Layne, C. (2008). Academic achievement, pupil participation and integration of group work skills in secondary school classrooms in Trinidad and Barbados. International Journal of Educational Development, 28, 176–194.
Leacock, C. J., & Warrican, S. J. (2006). Should reading be part of mathematics? Views and practices of Barbadian secondary mathematics teachers. Institute of Education Publication Series, 2, 117–138.
Lee, C. D., & Spratley, A. (2010). Reading in the disciplines: The challenges of adolescent literacy. New York, NY: Carnegie Corporation of New York.
Leu, D. J., Jr., Castek, J., Coiro, J., Gort, M., Henry, L. A., & Lima, C. O. (2005). Developing new literacies among multilingual learners in the elementary grades. In Technology-mediated learning environments for young English learners: Connections in and out of school. Mahwah, NJ: Lawrence Erlbaum Associates.
Lewis, M. P., Simons, G. F., & Fennig, C. D. (2014). Ethnologue: Languages of the world. Retrieved from http://www.ethnologue.com
Luke, A., Luke, C., & Graham, P. (2007). Globalization, corporatism, and critical language education. International Multilingual Research Journal, 1(1), 1–13.
McKay, S. (2005). Teaching the pragmatics of English as an international language. Guidelines, 27(1), 3–9.
Millar, P., & Warrican, S. J. (2015). Constructing a third space: Positioning students' out-of-school literacies in the classroom. In P. Smith & A. Kumi-Yeboah (Eds.), Handbook of research on cross-cultural approaches to language and literacy development (pp. 87–117). Hershey, PA: IGI Global.
Miller, E. (1989). Caribbean primary education: An assessment. Caribbean Journal of Education, 16(3), 136–171.
Ministry of Science, Technology and Tertiary Education. (2011, June 13). Gender study on the factors affecting male re-entry into the post-secondary and tertiary education system: Final report. Retrieved from http://test.gov.tt/portals/0/Documents/Publications/MSTTE%20Gender%20Study%20-%20Final%20Report.pdf
Mitchell, S. A. (2007). Acquiring basic reading skills: An exploration of phonetic awareness in Jamaican primary schools. Caribbean Journal of Education, 29(2), 327–358.
Moll, L. C., & Greenberg, J. B. (1990). Creating zones of possibilities: Combining social contexts for instruction. In L. C. Moll (Ed.), Vygotsky and education: Instructional implications and applications of sociohistorical psychology (pp. 319–348). Cambridge, UK: Cambridge University Press.
National Clearinghouse for English Language Acquisition. (2011). FAQ: How many school-aged Limited English Proficient (LEP) students are there in the U.S.? Retrieved from http://www.ncela.gwu.edu/faqs/
National Council of Teachers of English. (2008). English language learners: A policy research brief produced by the National Council of Teachers of English. Retrieved from http://www.ncte.org/library/NCTEFiles/Resources/Positions/Chron0308PolicyBrief.pdf
Nero, S. J. (2000). The changing faces of English: A Caribbean perspective. TESOL Quarterly, 34(3), 483–510.
Nero, S. J. (2006). Language, identity and education of Caribbean English speakers. World Englishes, 25(3/4), 501–511.
New London Group. (2000). A pedagogy of multiliteracies: Designing social futures. In B. Cope & M. Kalantzis (Eds.), Multiliteracies: Literacy learning and the design of social futures (pp. 9–38). South Yarra, Australia: Macmillan.
Organisation for Economic Co-operation and Development. (2006a). PISA released items - mathematics. Paris, France: OECD Publishing.
Organisation for Economic Co-operation and Development. (2006b). PISA released items - reading. Paris, France: OECD Publishing.
Organisation for Economic Co-operation and Development. (2006c). PISA released items - science. Paris, France: OECD Publishing.
Organisation for Economic Co-operation and Development. (2010). PISA 2009 results: Executive summary. Paris, France: OECD Publishing.
Organisation for Economic Co-operation and Development. (2012). PISA 2009 technical report. Paris, France: OECD Publishing.
Organisation for Economic Co-operation and Development. (2014). PISA - Try the test: Sample questions. Retrieved from http://www.oecd.org/pisa/test/form/
Paris, S. G. (2005). Reinterpreting the development of reading skills. Reading Research Quarterly, 40(2), 184–202.
Pennycook, A. (2013). The cultural politics of English as an international language (2nd ed.). New York, NY: Routledge.
People for Education. (2013). Language support. Retrieved from http://www.peopleforeducation.ca/wp-content/uploads/2013/07/language-support-2013.pdf
Pratt-Johnson, Y. (1993). Curriculum for Jamaican Creole-speaking students in New York City. World Englishes, 12(2), 257–264.
Pratt-Johnson, Y. (2006). Teaching Jamaican creole-speaking students. In S. Nero (Ed.), Dialects, Englishes, Creoles, and education (pp. 119–136). Mahwah, NJ: Lawrence Erlbaum Associates.
Roberts, P. (2007). West Indians and their language (2nd ed.). Cambridge, UK: Cambridge University Press.
Salandy, A., Mendez, R., & Ragoo, B. (n.d.). Student attainment: CXC Examination 2003–2004. Retrieved from http://www.moe.gov.tt/media_pdfs/publications/Student%20Attainment%20at%20CXC.pdf
Shohamy, E. (2006). Language policy: Hidden agendas and new approaches. London, England: Routledge.
Siegel, J. (1997). Using a pidgin language in formal education: Help or hindrance? Applied Linguistics, 18(1), 86–100.
Siegel, J. (1999). Creoles and minority dialects in education: An overview. Journal of Multilingual and Multicultural Development, 20(6), 508–531.
Siegel, J. (2002). Pidgins and creoles. In R. Kaplan (Ed.), Handbook of applied linguistics (pp. 335–351). New York, NY: Oxford University Press.
Siegel, J. (2005). Literacy in pidgin and Creole languages. Current Issues in Language Planning, 6(2), 143–163.
Simmons-McDonald, H. (2004). Trends in teaching Standard English varieties to Creole and Vernacular speakers. Annual Review of Applied Linguistics, 24, 187–208.
Simmons-McDonald, H. (2006). Vernacular instruction and bi-literacy development in French Creole speakers. In H. Simmons-McDonald & I. Robertson (Eds.), Exploring the boundaries of Caribbean and Creole languages (pp. 118–146). Kingston, Jamaica: University of the West Indies Press.
Smith, P. (2016). A distinctly American opportunity: Crossing linguistic boundaries by exploring non-standardized Englishes in policy and practice. Policy Insights from the Behavioral and Brain Sciences (Sage Publications Special Volume), 3(2), 194–202.
St. Hilaire, A. (2011). Kweyol in postcolonial Saint Lucia: Globalization, language planning, and national development. Amsterdam, The Netherlands: John Benjamins.
Street, B. (1995). Social literacies. London, UK: Longman.
Teachers of English to Speakers of Other Languages. (2008). Position statement on English as a global language. Retrieved from http://www.tesol.org
Thompson, B. P., Warrican, S. J., & Leacock, C. J. (2011). Education for the future: Shaking off the shackles of colonial times. In D. A. Dunkley (Ed.), Readings in Caribbean history and culture: Breaking ground (pp. 61–86). Plymouth, UK: Lexington Books.
United Nations Population Division. (2013). World population prospects: The world's youth 2013 datasheet. Retrieved from http://www.prb.org/pdf13/youth-data-sheet-2013.pdf
Vacca, R. T., Vacca, J. L., & Mraz, M. (2011). Content area reading: Literacy and learning across the curriculum (10th ed.). New York, NY: Allyn and Bacon.
Voice. (2011, January 8). Regional linguists meet at UWI International conference on language rights and policy. Retrieved from http://www.thevoiceslu.com/local_news/2011/january/08_01_11/Regional.htm
Warrican, J. S. (2006). Promoting reading amidst repeated failure: Meeting the challenges. High School Journal, 90(1), 33–43.
Warrican, S. J., & Leacock, C. J. (2007). Concerns, views and practices relating to reading across the curriculum: Addressing the needs of secondary school teachers in St. Lucia. A report commissioned by the Ministry of Education, St. Lucia.
Warrican, J. S. (2009). Literacy development and the role of the Eastern Caribbean joint board of teacher education. Journal of Eastern Caribbean Studies, 34(2), 71–85.
Wei, R., & Su, J. (2012). The statistics of English in China. English Today, 28(3), 10–14.
Williams, V., & Carter, B-A. (2005). Arriving at a self-diagnosis of the foreign language teaching situation in Trinidad and Tobago. Caribbean Journal of Education, 27(2), 223–243.
Winer, L. (2006). Teaching English to Caribbean English Creole-speaking students in the Caribbean and North America. In S. J. Nero (Ed.), Dialects, Englishes, creoles, and education (pp. 105–118). Mahwah, NJ: Lawrence Erlbaum Associates.
Winford, D. (1997). Re-examining Caribbean English Creole continua. World Englishes, 16(2), 233–279.
Yiakoumetti, A., & Mina, M. (2013). Language choices by teachers in EFL classrooms in Cyprus: Bidialectism meets bilingualism with a call for teacher training programmes in linguistic variation. Teacher Development: An International Journal of Teachers' Professional Development, 17(2), 214–227.
Youssef, V. (2001). Age-grading in the Anglophone Creole of Tobago. World Englishes, 20(1), 29–46.