Unpacking the Strategies of Ten Highly Rated MOOCs: Implications for Engaging Students in Large Online Courses


by Khe Foon Hew - 2018

Background/Context: The advent of massive open online courses (MOOCs) has attracted much attention from educators. However, despite the high interest they generate, we still understand little about student engagement in these large-scale online courses. Existing studies tend to focus on how MOOCs affect higher education institutions in terms of faculty identity, workload, responsibilities, and policy. Other studies have mostly employed clickstream data analysis to predict student dropout or completion. Although studies such as these are useful, they fall short of explaining why participants find the activities or course engaging.

Research Questions: Unlike many previous studies, this study seeks to uncover what factors related to MOOC pedagogy or to the individual instructor may encourage or discourage student engagement. This study explores the following questions: What elements related to the course design or the teaching staff did students find enjoyable, beneficial in helping them learn the materials, or motivate them to participate in the activities? What elements did students wish to improve? What elements related to the course design or teaching staff did students find frustrating?

Participants: The sample consisted of 4,466 learners who participated in one or more of 10 highly rated MOOCs. Highly rated MOOCs were sampled because they were likely to exemplify good practice or teaching strategies.

Research Design: Qualitative research methods were used in this study. More specifically, detailed observations of the 10 MOOCs’ course features and grounded analyses of the 4,466 learners’ course review data were conducted.

Findings: Findings suggest six key factors that can engage online students and nine reasons for student disaffection. The four most frequently mentioned engagement factors were (a) problem-centric learning, (b) active learning supported by timely feedback, (c) course resources that cater to participants' learning needs or preferences, and (d) instructor attributes such as enthusiasm or humor. The two most commonly reported student disaffections across the 10 MOOCs were due to forum- and peer-related issues.

Conclusion: This article ends with five main implications that could provide practical guidelines to other instructors of large online courses. The findings may also offer tips for instructors of traditional e-learning courses. Although we cannot generalize the findings of this study to traditional e-learning courses, it is possible that at the very least, the information presented here may suggest probable solutions for traditional e-learning courses that might otherwise be overlooked.



Understanding what factors might engage online students is important for two reasons. First, whether in online learning or in traditional settings such as face-to-face instruction, student engagement is considered a necessary condition for fostering positive student behavior and learning. Previous research on face-to-face learning settings, for example, has suggested that student engagement can predict student behavior and grades (Maddox & Prinz, 2003).


Second, the use of online learning has become more prevalent among students and educators around the world, particularly with the emergence of massive open online courses (MOOCs). Many instructors are becoming increasingly interested in teaching large-scale online courses, as suggested by the growing number of MOOCs being offered every month. Data from the Open Education Europa scoreboard showed that a total of 4,277 MOOCs worldwide were available for students at the end of March 2015.


However, not all large-scale online courses are rated highly by students who participate in them. Clearly, teaching 40,000+ students online usually requires much more effort than an instructor merely videotaping lectures and posting them on the web, or hoping that students will find personal reward or interest in pursuing the MOOC. What makes some large-scale online courses more highly rated than others?


The main purpose of this study is to identify and elaborate on the factors and strategies that instructors of highly rated MOOCs employ to engage their students. Despite the high interest MOOCs generate, we still understand little about student engagement in these large-scale online courses (Anderson, Huttenlocher, Kleinberg, & Leskovec, 2014). Existing studies have mostly examined personal student-related desires for participating in MOOCs, or employed clickstream analysis to examine student activities in MOOCs or predict student dropout and completion (e.g., Anderson et al., 2014; Coetzee, Fox, Hearst, & Hartmann, 2014; Coffrin, de Barba, Corrin, & Kennedy, 2014).


Unlike many previous studies, this study seeks to uncover what factors related to MOOC design or the individual instructor may encourage or discourage student engagement. Specifically, this paper describes 10 highly rated MOOCs from different subject disciplines. Ten highly rated MOOCs were sampled because they were likely to exemplify good practice or teaching strategies. To select highly rated courses, this study relied on user reviews on CourseTalk, a user-driven MOOC rating website (Costello, 2013), which is considered the current leading and largest platform for individuals to search, discover, and review the widest variety of MOOCs (Richardson, 2014). At the time of writing, visitors to CourseTalk could peruse three categories: “Top Rated,” “Popular,” and “Upcoming.”


The “Top Rated” category listed MOOCs based on user reviews of the course content (e.g., topics, materials, activities, assignments), course instructor (e.g., availability, expertise), and course provider (e.g., interface, tools, course forum). Each participant could rate each category using 1 to 5 stars, with five stars representing the highest quality. The website would then calculate the average rating of these three categories and display it as the overall rating given by a particular participant for a particular course. The “Popular” category listed MOOCs based on how frequently the courses were searched by users. Popular MOOCs were not necessarily top rated MOOCs: not every popular course had an overall 5-star rating in terms of course quality. The “Upcoming” category listed courses that would be offered in the future.
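
To make this calculation concrete, the following is a minimal Python sketch of the overall-rating computation described above (a hypothetical reconstruction; CourseTalk's actual code, the argument names, and the rounding rule are not public and are assumed here):

    def overall_rating(content, instructor, provider):
        """Average the three 1-5 star category ratings into one overall score."""
        for stars in (content, instructor, provider):
            if not 1 <= stars <= 5:
                raise ValueError("each category rating must be 1-5 stars")
        # Rounding to one decimal place is an assumption for display purposes.
        return round((content + instructor + provider) / 3, 1)

    # Example: a participant rates content 5, instructor 4, provider 4.
    print(overall_rating(5, 4, 4))  # 4.3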


This study focuses on “Top Rated” courses instead of “Popular” courses. More specifically, this study defines a highly rated MOOC as a course that received the highest number of participant reviews and an overall 5-star rating in terms of course quality within a particular subject discipline. Although the ratings can be subjective, the provision of reviews from a wide range of informants provides some form of data triangulation that promotes credibility of the ratings.


A large-scale qualitative analysis of 4,466 participants’ reflection data was conducted. The findings of this study can guide researchers and educators in formulating pedagogical strategies to better engage students in large-scale online courses. The findings may also offer tips for instructors of traditional e-learning courses (Educause, 2013).


THEORETICAL BACKGROUND


The notion of engagement can be broadly conceived as a metaconstruct that includes behavioral, affective, and cognitive engagement (Fredricks, Blumenfeld, & Paris, 2004). Behavioral engagement refers to student participation in a course, such as doing assignments (Birch & Ladd, 1997). Affective engagement refers to students’ feelings toward the teachers, classmates, learning, and course (Skinner & Belmont, 1993). Cognitive engagement refers to students’ thinking and understanding of the subject content (Fredricks et al., 2004; Rotgans & Schmidt, 2011). It is important to note that in reality these three components are dynamically interrelated within the individual; they are not isolated entities (Fredricks et al., 2004).


Engagement is driven by motivation. One of the most frequently used theories to explain human motivation is self-determination theory (Losier, Perreault, Koestner, & Vallerand, 2001; Niemiec et al., 2006; Williams et al., 2006), which posits that individuals possess three fundamental psychological needs that move them to act or not to act—the needs for autonomy (i.e., freedom to choose), relatedness (i.e., feeling of being connected to other people), and competence (i.e., sense of mastery) (Deci & Ryan, 1985). Given the extent of the literature on self-determination theory, a comprehensive review is beyond the scope of this article.


So far, no research has been conducted in the MOOC context to show how autonomy, relatedness, and competence might influence student engagement. Nevertheless, we may potentially draw some insights from previous research studies done on traditional courses. Specifically, these studies have shown that participants high in autonomy were likely to enjoy their lessons (higher affective engagement) (Skinner, Furrer, Marchand, & Kindermann, 2008), to participate in class activities (greater behavioral engagement) (Jang, Reeve, & Deci, 2010; Skinner et al., 2008), and to demonstrate more effort in understanding the subject content (cognitive engagement) (Mahatmya, Lohman, Matjasko, & Farb, 2012; Rotgans & Schmidt, 2011). A sense of relatedness toward peers and teachers can encourage positive student feelings toward a course and motivate students to participate in course activities (Finn & Zimmer, 2012; Furrer & Skinner, 2003; Skinner & Pitzer, 2012). A sense of competence is likely to increase students' cognitive engagement (Skinner et al., 2008), as well as encourage student participation (behavioral engagement) and student interest in the course (affective engagement) (Skinner et al., 2008).


CURRENT KNOWLEDGE GAP IN MOOC RESEARCH


Scholars have begun to examine various questions and issues related to MOOCs. For example, Courtney (2013), Gore (2014), and Wu (2013) discuss the challenges and issues that MOOCs bring to academic libraries. Fox (2013) and Lombardi (2013) discuss how MOOCs affect higher education institutions in terms of faculty identity, workload, roles and responsibilities, and institutional policy. Scholars have also examined the reasons why professors decide to offer MOOCs. These reasons include a desire to gain personal rewards, such as increasing their reputation within their discipline and with the general public, or helping to secure tenure (Kolowich, 2013); a desire to increase student access to higher education worldwide (Kolowich, 2013); and a desire to experience teaching an audience of a size and diversity that no residential course could match (Mackness, Mak, & Williams, 2010; Roth, 2013).


However, as pointed out by many scholars (e.g., Anderson et al., 2014), we still understand little about student engagement, especially in the area of course pedagogy. Currently, many MOOC instructors and providers tend to apply intuitions from traditional courses and think of a student as someone who enrolls in a course and participates for credit over a period of 10–14 weeks (Anderson et al., 2014). Commonly used metrics of student engagement include the number of student video views, assignment submissions, forum posts, and completion rates or grades achieved (e.g., Anderson et al., 2014; Coetzee et al., 2014; Coffrin et al., 2014).


Although quantitative measures such as these are useful, they fall short of explaining why participants find the activity or certain segments of the course engaging. Also, an increasing number of researchers are beginning to question whether completion rates or grades received are relevant metrics for measuring student engagement in MOOCs (Haggard, 2013). Many students in a MOOC only wish to learn certain topics and are not interested in the rest of the material (Wang & Baker, 2014). These students complete only certain course activities or watch certain videos but do not finish all of the assignments (Kizilcec, Piech, & Schneider, 2013). They may still find a particular activity or topic engaging.


Recently, Alario-Hoyos, Perez-Sanagustin, Kloos, and Munoz-Merino (2014) proposed certain recommendations for engaging MOOC students. But these recommendations were drawn from the views of the course instructors. To adequately understand the reasons for individuals being engaged in MOOCs, we would need to examine the course participants' self-reported reflections. A recent review on MOOCs by Hew and Cheung (2014) suggested that students participated in these courses because they desired to learn about a new topic or update their current knowledge, were curious about MOOCs, or wanted the personal challenge of finishing a large-scale online course or collecting as many completion certificates as possible. These findings, however, deal primarily with personal student-related desires.


Instead of examining personal student-related factors, the present study aims to understand factors specifically related to MOOCs' design and the individual instructor that could engage online students. The following section will first describe the design and results of an earlier pilot work (Study 1, Hew, 2016), discuss its limitations, and then explain how these limitations informed the design of the current extended study (Study 2).


STUDY 1: EXPLORING STUDENT ENGAGEMENT IN 3 HIGHLY RATED MOOCS—A PILOT STUDY


Study 1 was an exploratory pilot study that examined three MOOCs. The following research question was examined: “What elements related to the course design or the teaching staff did students find enjoyable (i.e., affective engagement), beneficial in helping them learn the materials (i.e., cognitive engagement), or motivate them to participate in the activities (i.e., behavioral engagement)?”


Study 1 was based on a mixed methods approach combining participant observation of the MOOCs' course features and grounded analyses of 965 students' course review data posted on CourseTalk. Three highly rated MOOCs (i.e., courses that received the highest number of participant reviews and an overall 5-star rating in terms of course quality within a particular subject discipline) were selected. Course participants' reflections were analyzed using a grounded content analysis method to extract the main themes or factors that learners perceived as engaging. The process was inductive, without a priori themes being assigned to the data. Five factors were found: course resources, peer interaction, instructor availability and passion, active learning, and problem-oriented learning with clear expositions.


Students had the choice to determine which particular resources (e.g., videos, forums) or activities, challenges, and course specialisms they wished to use or pursue. Students also had the flexibility to use these resources at their own time and pace. All these elements give students a sense of autonomy, which in turn can motivate them to continue participating in the class activities, to enjoy the lessons, and to spend greater effort in understanding the subject contents (Rotgans & Schmidt, 2011; Skinner et al., 2008).


When students have opportunities to interact with their peers (e.g., ask questions and receive answers), they are more likely to feel a sense of connection to their peers (Furrer, Skinner, & Pitzer, 2014; Reeve, 2005). When instructors make themselves available to answer students’ questions or reply to students’ comments, students are more likely to feel connected to their instructors (Furrer et al., 2014; Hoffman, Richmond, Morow, & Salomone, 2002).


The use of problem-centered learning with clear explanations helps promote students’ sense of competency. The use of course resources such as additional readings, peer interaction, active learning, as well as the instructor’s availability (e.g., to answer student questions) also help students gain an understanding of the topics being taught. A sense of competency in turn can positively influence students’ behavioral, affective, and cognitive engagement (Skinner et al., 2008).


Study 1 had three limitations. First, the study examined only three MOOCs involving 965 participants. This limits the transferability of the findings to courses in other disciplines. Second, as the author continued his examination of other MOOCs, he realized that the second factor, “instructor availability and passion,” should be two separate elements. An instructor may be excited about his subject topic, but this does not necessarily mean he will make himself personally available to answer questions from the students. Third, one reviewer, while acknowledging the usefulness of the study, pointed out that it set out only to identify what made MOOC students enthusiastic, and suggested that a follow-up study be conducted to explore student disaffection in order to yield complementary information. Disaffection may be described as the occurrence of behaviors, feelings, and cognitive orientations that reflect reactions such as distraction, frustration, boredom, and helplessness (Skinner et al., 2008).


STUDY 2: A FOLLOW-UP STUDY—LESSONS FROM 10 HIGHLY RATED MOOCS


METHOD


Study 2 was carried out to examine student disaffection in MOOCs. Study 2 also examined students’ suggestions for improvement and extended the case studies to 10 highly rated MOOCs in different disciplines. Specifically, the following research question guided Study 2:


Research question 1: What elements related to the course design or the teaching staff did students find enjoyable, beneficial in helping them learn the materials, or motivate them to participate in the activities? What elements did students wish to improve?


Study 2 extends the previous study by examining the following additional question:


Research question 2: What elements related to the course design or teaching staff did students find frustrating?


To address the above questions, Study 2 employed a case study approach. The case study method is suitable when the focus of a study is to answer “how” and “why” questions (Yin, 2014). Compared to quantitative methods such as correlational analysis, the case study is a more appropriate method to investigate student perceptions of the factors or reasons that could engage or disaffect them in large online courses. The use of case study is not only an important but also a legitimate research approach in MOOC research (Chen & Chen, 2015; Liyanagunawardena, Adams, & Williams, 2013).


DATA SOURCES


Similar to Study 1, the CourseTalk rating website was used to identify 10 highly rated MOOCs across various subject disciplines for Study 2. Each of the 10 MOOCs was selected because it received the highest number of participant reviews and a 5-star rating in terms of overall course quality within a particular subject discipline. Although the engagement and disaffection findings surfaced in this study might not be generalizable to all possible MOOCs, these findings can offer readers useful insights into what strategies instructors of highly rated MOOCs have used to engage their students. Table 1 lists these courses, along with a brief description of their purpose and the number of ratings and reviews received.


Table 1. Summary of the 10 MOOCs as of February 23, 2015

Subject discipline (total no. of available courses) | Name of MOOC & university | Ratings & number of reviews | Purpose of course
Art & Design (n=267) | Design: Creation of Artifacts in Society, University of Pennsylvania | 5 stars, 207 reviews | Demonstrates design as a problem-solving process to address challenges in the real world.
Entrepreneurship (n=224) | Developing Innovative Ideas for New Companies: The First Step in Entrepreneurship, University of Maryland | 4.5 stars, 199 reviews | Explains methods and models to identify opportunities based on customer needs and develop business ideas.
Healthcare (n=95) | Epidemics – The Dynamics of Infectious Diseases, Pennsylvania State University | 5 stars, 328 reviews | Explains the mechanisms of how diseases emerge, spread, and are controlled.
History (n=344) | Greek and Roman Mythology, University of Pennsylvania | 5 stars, 85 reviews | Explores the myths of ancient Greece and Rome, and how they impact societies.
Literature (n=104) | Modern & Contemporary American Poetry, University of Pennsylvania | 5 stars, 170 reviews | Examines poetry through Socratic questioning-based close readings.
Management (n=288) | An Introduction to Operations Management, University of Pennsylvania | 5 stars, 107 reviews | Demonstrates the improvement of operations using a problem-solving approach.
Philosophy (n=203) | Think101x: The Science of Everyday Thinking, University of Queensland | 5 stars, 1,040 reviews | Explores how people think and introduces tools to help people evaluate claims and make sense of evidence.
Programming languages (n=988) | An Introduction to Interactive Programming in Python, Rice University | 5 stars, 1,896 reviews | Demonstrates interactive programming via game-building.
Psychology (n=99) | Irrational Behavior, Duke University | 5 stars, 376 reviews | Explains the reasons for people making decisions that are inconsistent with the assumptions of standard economic theory and rational decision-making.
Statistics (n=126) | 15.071x: The Analytics Edge, MIT | 5 stars, 58 reviews | Explains the use of data and analytics methods to improve a business or industry.

DATA COLLECTION AND ANALYSIS


The two main data sources for Study 2 were the researcher's observations of the 10 MOOCs' course features and the MOOC participants' reflection comments posted on CourseTalk. This is one of the largest and earliest studies to report qualitative data from more than 4,460 MOOC participants' perspectives. These participants were learners who had completed, partially completed, or dropped out of at least 1 of the 10 highly rated MOOCs.


Participants' reflections were guided by several questions built into the feedback form on CourseTalk. These questions pertained to the course instructor, the tools and activities or assignments, the community forums, the overall course design, and participants' dislikes and suggestions for improvement. Examples of the questions include: (1) What did you think of the expertise and course delivery by the instructor? (2) What did you think of the tools, assessment and community? What would you improve? (3) What did you like and dislike about the course? What would you improve?


The pilot study's (Study 1) framework of five engagement factors was used to guide the current study's initial analysis and coding. One of the original factors, “instructor availability and passion,” was later found to be rather misleading. Content analyses of the participants' reflections showed that although an instructor was excited about his subject topic, analysis of the discussion forums revealed that he did not make himself available to answer questions from the students. This suggests that “instructor availability and passion” should be two separate elements instead of one single element. New emerging categories (if any) were also allowed to come forth inductively during the coding process.


The following three examples are presented to show how the data were coded. The first example is a comment from a student in the Management MOOC. The narrative described below was coded as “engagement is fostered when learning is problem-oriented” because of the emphasis on helping students to understand or address issues or questions learners might encounter in the real world or relevant to their own lives:


This is a highly recommended MOOC - The professor gives a fair amount of insights into industry data and practices, and demonstrates how the concepts are utilized in the real-world of operations management.


The second example is a comment from a student in the Epidemics MOOC who described the “Ask-Us-Anything” feature employed by the teaching staff. This comment was coded as “engagement is fostered when the teaching staff is accessible” because of the emphasis on the instructors’ availability:


I am very pleased with how accessible the instructors are. A brilliant addition is the "Ask Us Anything" section, where students can submit questions and the professors select a set of them to answer in a video format. If only all courses could be this dynamic!


The third example is an excerpt from a narrative contributed by a student in the Epidemics MOOC. The narrative described a student response that showed disaffection and was coded as “flawed/unclear words” because the most salient aspect of the comment relates to the confusing subtitles that did not match the video lecture:


There were some unfortunate (and sometimes humorous!) bits of mis-captioning that I found distracting.


RESULTS AND DISCUSSION


The results of a comparison analysis across all 10 MOOCs are presented and discussed in relation to the two research questions. Some of the major findings include:


1. Students most preferred a problem-centric style of teaching that focused on meaning-making with relevant examples of how the principles or concepts taught can be applied in the real world.

2. Guest speakers who fit well into the flow of the main topic content were welcomed and appreciated by students.

3. Students generally desired moderately challenging course assignments rather than easy ones. Assignments that merely tested factual recall were disliked.

4. By and large, students expressed a preference for short lecture videos of not more than 10 minutes. Lecture videos that featured different settings or faces were perceived to be more engaging.

5. Interrupting the videos with questions (in-video quizzes), or segmenting the whole lecture into a series of smaller video sequences of about 6–10 minutes each and providing quizzes at the end of each sequence, helped sustain students' attention to lecture content.

6. The two most commonly reported student disaffections across the 10 MOOCs were due to forum- and peer-related issues.


Research Question 1: What Elements Related to the Course Design or the Teaching Staff Did Students Find Enjoyable, Beneficial in Helping Them Learn the Materials, or Motivate Them to Participate in the Activities? What Elements Did Students Wish to Improve?


A total of 4,565 comments were collected. Some of the comments were not coded because they were written in languages other than English (e.g., Spanish). The coded comments were placed into one of six categories: (a) problem-centric learning supported by clear explanations, (b) instructor availability, (c) instructor attributes (e.g., passion, humor), (d) peer interaction, (e) active learning with timely feedback, and (f) course resources to address participant learning needs or preferences. To determine the reliability of the coding, an independent coder, who was not involved in the study, coded a subset of 200 randomly selected comments (n = 20 from each MOOC). Intercoder agreement was 91%.
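
For readers unfamiliar with the metric, intercoder agreement of this kind can be computed as simple percent agreement between the two coders, as in the following minimal Python sketch (the comment IDs and assigned codes are invented for illustration):

    # Percent agreement between two coders over the same set of comments.
    # The comment IDs and assigned codes below are invented examples.
    coder_1 = {"c1": "problem-centric", "c2": "active learning", "c3": "course resources"}
    coder_2 = {"c1": "problem-centric", "c2": "peer interaction", "c3": "course resources"}

    agreements = sum(coder_1[c] == coder_2[c] for c in coder_1)
    print(f"Intercoder agreement: {agreements / len(coder_1):.0%}")  # 67%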


The four most frequently mentioned factors across all 10 MOOCs (see Figure 1) were (a) problem-centric learning, (b) active learning supported by timely feedback, (c) course resources that cater to participants’ learning needs or preferences, and (d) instructor attributes (e.g., enthusiasm, humor). Figure 2 shows the percent of factors found in each MOOC. Each of the factors will be described in the following paragraphs, along with the main strategies used. One or more representative examples from the MOOCs are also provided to help illustrate the strategies used.


Figure 1. The frequency of factors across all 10 MOOCs



Figure 2. Percent of factors found in each MOOC



Please note that it is beyond the scope of this study to determine which strategy works better than others. The aim of this paper is practical rather than evaluative: to highlight the specific strategies that students perceived to be engaging. Collectively, the results of these analyses yield a set of strategies that could inform future instructors of large online courses.


PROBLEM-CENTRIC LEARNING

Engagement is fostered when learning is problem-oriented, helping students to understand or address issues or questions they might encounter in the real world or in their own lives:

a. Focus on meaning-making with relevant examples and experiences (including successful and unsuccessful stories) as much as possible
b. Provide practical or interesting tips and tools as much as possible
c. Provide easy-to-understand explanations
d. Enrich learning with the perspectives of guest speakers where possible


Among the 10 MOOCs, by and large the most frequently mentioned factor that could engage students was problem-centric learning. Specifically, students most preferred a style of teaching that focused on sense-making with relevant examples of how the principles or concepts taught can be applied in the real world, and supported with clear explanations of principles or concepts. For example, the Operations Management MOOC focused on a problem-solving process that has been successfully applied across an array of industries to identify performance issues and improve operations. Real-world examples as well as industry practices were employed:


The Professor's messages were very clearly explained. The lectures were not boring; theory was always accompanied with real-life examples in order to help students understand the material easily. (Student L, Operations Management MOOC)


The Myth MOOC focused on the strategy of using ancient Greek source materials and various analytical tools such as functionalism, structuralism, and psychoanalysis to decode myths in order to explain truths about human nature and the world in which humans live, rather than giving a mere synopsis of different myths. The Analytics MOOC explained and modeled the use of data and analysis methods through real data sets including basketball statistics and tweets about Apple.


Engagement is further enhanced when the problem-centric learning is augmented with practical information, such as the useful programming tips conveyed by one of the instructors in the Python MOOC:


[One of the professors] was very keen on general Python mistakes and I have learned so much from his code style and programming tips videos. (Student K, Python MOOC)


To further enhance the overall learning experience, some MOOCs employed the strategy of inviting guest speakers to participate in some of the topics. Guest speakers helped augment the main content covered in the course by adding valuable insights to students' learning. The Poetry MOOC, for example, invited actual poets to interact with students on live webcasts. The Irrational Behavior MOOC contained optional weekly video lectures by guest speakers (behavioral scientists) to provide more in-depth explanations of some topics discussed in the course. Probably the most extensive use of guest speakers was found in the Think MOOC. While other MOOCs employed guest speakers as a form of optional material to supplement the course, the guest speakers in the Think MOOC were featured as part of the main course itself. The teaching staff of the Think MOOC traveled to different places in the world to interview more than 20 well-known experts, including a Nobel Laureate, each providing greater insights into various topics on cognition according to the expert's own area of specialty:


The guest speakers were amazing and added an even richer layer of information and understanding to the topics. (Student Z, Think MOOC).


ACTIVE LEARNING WITH FEEDBACK

Active learning can promote learner engagement:

a. Ensure close alignment between the lectures and activities
b. Require students to think or apply the content learned, instead of merely regurgitating factual information
c. When necessary, provide specially designed sandbox or simulation games for students to practice the principles taught
d. Provide guidance or feedback on students' work (e.g., instantaneous feedback, peer feedback, instructor solution)


Active learning may be defined as the use of any activity that requires students to do things (e.g., engage in discussion) and think about the things they are doing (Bonwell & Eison, 1991). Active learning strategies can enhance learner engagement and improve learner achievement (Phillips, 2005). A recent meta-analysis revealed that average examination scores improved by about 6% in active learning classes, and that students in classes with traditional lecturing were 1.5 times more likely to fail than were students in classes with active learning (Freeman et al., 2014).


All 10 MOOCs had some form of activity for students to engage with (see Figure 3). By far, the most commonly employed activity was weekly computer-graded homework or activities in the form of quizzes. These quizzes mostly contained multiple-choice questions (MCQ), although some also involved true/false (T/F) or short-answer questions. The second most frequently used activity was a peer-reviewed project or writing assignment (employed by seven MOOCs). The third most frequently used activity was a computer-graded final exam (employed by four MOOCs). The final exam was generally similar in structure to the quizzes (containing MCQ, T/F, or short-answer questions) but was conducted only at the end of the course.


Figure 3. Types of graded activities used in the MOOCs



Yet, merely introducing an activity into a lesson may fail to fulfill an important aim of active learning: to foster a deep understanding of the content learned (Wiggins & McTighe, 1998). To achieve this, an activity must (a) relate closely to the content covered in the course, (b) require students to think about or apply the content learned, and (c) be supported by feedback.


Close Alignment Between Lectures/Tutorials and Learning Activities


Every course has its own specific weekly learning objectives. These learning objectives are conveyed through the lectures or tutorials presented each week. Alignment is achieved when the learning activities require students to review or practice using skills or procedures in a way that moves them toward the learning objectives (Concepcion, 2009–10). This requires strong ties between the course lectures and quizzes, homework, or projects:


The quizzes are directly related to the content and designed to help you review the material, not to trick you. (Student W, Poetry MOOC)


Requiring Thoughtful Engagement on the Part of the Student


Use questions that require students to think or apply the concepts or principles learned, instead of relying on questions or tasks that merely test recall of factual information:


Too many courses present syntax and concepts, but don’t back them up with practical problems to let you use what you've learned. But this course actually gives you enough practice using the tools and real world data sets through the homework exercises to gain some confidence with them and remember how to use them. (Student K, Analytics MOOC)


What I most liked were the quizzes and mini-projects, which forced us to apply the lecture material and concepts right away. (Student P, Python MOOC)


When Necessary, Provide Specially Designed Sandbox or Simulation Games for Students to Practice or Understand the Principles Taught


For example, in the Epidemics MOOC, two games were developed. Vax was a game that students played as a single user, making decisions about whom to vaccinate and how to isolate and quarantine individuals when an infection broke out. MOOCdemic was played on a mobile device with other people worldwide to learn about disease spread and control. The use of simulation games helped students gain more insights into the topic:


I probably learned as much about epidemics from these two games together as I learned from the lectures, quizzes, and discussion boards. (Student H, Epidemics MOOC)


Feedback


Students should also receive feedback on their participation (Phillips, 2005). Because of the large number of participants, it is virtually impossible for an instructor to give individual feedback to students (Suen, 2014). The instructors of the 10 MOOCs therefore relied mainly on two methods: (a) autoscoring of quizzes, and (b) peer reviews. Autoscored quizzes are particularly suitable for assignments that have fixed, close-ended answers, while peer reviews are more appropriate for open-ended written assignments.
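
As a concrete illustration of autoscoring for close-ended questions, here is a minimal Python sketch (the answer key, question IDs, and scoring scheme are invented for illustration and do not reflect any particular MOOC platform):

    # A minimal autoscorer for close-ended (MCQ/true-false) quiz answers.
    # The answer key and the sample submission are invented examples.
    ANSWER_KEY = {"q1": "b", "q2": "true", "q3": "d"}

    def autoscore(submission):
        """Return the fraction of questions answered correctly."""
        correct = sum(submission.get(q) == a for q, a in ANSWER_KEY.items())
        return correct / len(ANSWER_KEY)

    print(autoscore({"q1": "b", "q2": "false", "q3": "d"}))  # 0.666... (2 of 3 correct)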


Typically, the process of peer review is as follows (Suen, 2014): First, a scoring rubric is developed for the assignment. Next, students are instructed to complete and submit the assignment online. Each submitted assignment is distributed to four or five randomly chosen fellow course mates to rate and/or provide written comments based on the scoring rubric. The mean or median rating score is then computed as the score for the assignment. Finally, the score and written comments are revealed to the original author of the assignment. Deadlines are usually used in conjunction with the peer reviews in order to hasten the feedback loop.
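
The aggregation step in this process can be sketched in a few lines of Python (the 0–10 rubric scale and the sample ratings are illustrative assumptions; as noted above, whether the mean or the median is used varies by platform):

    from statistics import mean, median

    def peer_review_score(ratings, aggregate="median"):
        """Aggregate the peer ratings for one submission into a single score."""
        if not ratings:
            raise ValueError("a submission needs at least one peer rating")
        return median(ratings) if aggregate == "median" else mean(ratings)

    # Example: five randomly assigned peers rate one essay on a 0-10 rubric.
    ratings = [7, 8, 6, 8, 9]
    print(peer_review_score(ratings))          # 8 (median)
    print(peer_review_score(ratings, "mean"))  # 7.6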


Although peer review is not perfect (see the results related to the second research question), the accuracy of peer rating is quite high. Several studies have found a high consistency between teacher and peer marks on average in traditional online courses (e.g., Bouzidi & Jaillet, 2009). Even though peer review in MOOCs may occur in a very different environment compared to a traditional course, peer review, whether it is perfect or not, is a valuable learning avenue for students (Suen, 2014).


INSTRUCTOR AVAILABILITY

Engagement is fostered when the teaching staff is accessible to answer students' questions:

a. Provide regular virtual time to interact with students (e.g., office hours)
b. Employ a professional help desk service to route student inquiries by email to a dedicated course site
c. Address the most common or most up-voted student questions
d. Where possible, provide live video-streamed sessions to enable virtual face-to-face synchronous interaction between the instructors and students (e.g., Google+ Hangout sessions)


Studies conducted in traditional online or face-to-face classrooms indicate that students want their professors to be available to, for example, respond to students' comments in online discussions or answer students' questions (e.g., Delaney, Johnson, Johnson, & Treslan, 2010). But this is no easy feat in a large-scale online course, because it is impossible for an instructor to respond to every student.


Many MOOCs rely on the help of teaching helpers to manage students' questions, which mostly take place in the course forums. These helpers come in one of two forms: teaching assistants, who are existing on-campus graduate students, or community assistants, who are students who have previously completed the particular MOOC. These teaching helpers, rather than the instructors, are the primary source of help that students in MOOCs encounter.


But there were several MOOC instructors who attempted to make themselves more available to students via one or more of the following strategies: (a) offering weekly office hours, (b) employing a professional help desk service, (c) holding weekly “answer-selected-questions” sessions, and (d) providing live video-streamed discussions. Such teacher actions help promote student-to-faculty relatedness (Hoffman et al., 2002).


The instructor of the Poetry MOOC held weekly one-hour virtual office hours in the discussion forum to interact asynchronously with his course students. This was particularly appreciated by students. Analysis of his office hour forum revealed that a total of 73 threads were created, comprising 1,009 posts and 6,042 views:


This is one of the best courses I've taken. You don't just take a course, you effectively join a community of like learners and are in maximal contact with them and the course staff. They even have office hours! (Student J, Poetry MOOC)


The Python MOOC used a professional help desk service (http://helpscout.net) to manage students’ email inquiries. Specifically, this desk service routed student help requests to a course website called “Code Clinic” so that the teaching staff could respond to them (Warren, Rixner, Greiner, & Wong, 2014):


I was able to seek help from one of the instructors via the ‘code clinic’ email system! The level of assistance was at least as good—or even better—than what I've experienced in regular online university courses. (Student D, Python MOOC)


The professional help desk service provides several useful features to help facilitate the instructors’ handling of students’ inquiries. These include: collision detection to prevent duplicate replies, an autoreply to let students know their email request has been received, autotracking of students’ requests, and customizable stock responses to common questions (Warren et al., 2014; http://helpscout.net).


Another possible method of enhancing instructor accessibility was to hold weekly “answer-selected-questions” sessions (e.g., Epidemics MOOC, Irrational Behavior MOOC). Please note that these sessions were not conducted live over the Internet. For example, the Epidemics MOOC organized an “Ask-Us-Anything” forum for students to ask the instructors any course-related questions. The instructors would discuss the most interesting or up-voted questions in a weekly video which would then be uploaded onto the course website:


The faculty took some of the best student questions every week and discussed them in a roundtable format in the Ask Us Anything forums. It was a wonderful supplement to the weekly lectures, and demonstrated that the faculty were really interested in and following students’ questions. (Student C, Epidemics MOOC)


Finally, several MOOCs offered live synchronous video-streamed online sessions (e.g., the Irrational Behavior, Myth, and Poetry MOOCs). This is probably the best possible method of providing virtual face-to-face interaction between students and instructors in a distance learning environment. Various technological tools were used to achieve this. For example, the instructor and some guest speakers in the Irrational Behavior MOOC used Google+ Hangouts On Air throughout the class to answer questions from students across the world (Lorch, 2013).


INSTRUCTOR PASSION AND HUMOR


Two particular instructor attributes stood out among the many student comments. First, while not all instructors can be great on camera, they can still engage students if they are passionate about the subject matter (Young, 2013). Demonstrating passion or enthusiasm for the subject, as well as for teaching it, was highlighted as a positive behavior, one that learners noted would draw them closer to the material being studied and inspire their interest in the topic:


I can't wait to watch the new video lectures every week, knowing that it's the start of another engaging talk by the professor. He has a contagious enthusiasm that really drew me closer to the topic. (Student U, Myth MOOC)


Second, students also appreciate an instructor's sense of humor because it helps lighten the mood, creates an enjoyable learning experience, and motivates students to learn the material:


The instructors are very good at explaining concepts and providing examples, and they inject enough humor into the lectures to keep it engaging. (Student A, Python MOOC)


COURSE RESOURCES THAT ADDRESS PARTICIPANTS’ LEARNING NEEDS OR PREFERENCES

Engagement is fostered when course resources address participants' learning needs/preferences:

Optional resources, challenges, or different course specialisms
a. Provide different challenges or course tracks, or additional resources (e.g., videos, readings, website links)
b. Provide lecture notes or lecture slides to complement the videos
c. Provide video transcripts and subtitles

Video-related design
a. Use bite-sized video clips supplemented with relevant visuals (e.g., graphics, animations) that add value to the content instead of merely repeating the content
b. Shoot video footage in different settings to add variety to the videos
c. Capture the faces of the teaching staff on video, but it is not necessary to show the faces throughout the entire video
d. Interrupt the videos with questions (in-video quizzes), or segment the whole chunk of lecture into a series of smaller video sequences of about 6–10 minutes each and provide a quiz at the end of each small video sequence


MOOC learners are very diverse in terms of nationality, worldview, culture, and prior subject knowledge (Suen, 2014). In order to cater to the majority of students, the course materials in MOOCs need to be relatively easy for students to understand. Frustration occurs if students find the content or activities too difficult to complete. This is especially evident in the case of a programming course such as the Python MOOC, because many beginning students tend to find coding difficult. Guidance was given explicitly in the lecture videos or through “fill-in-the-blanks” templates that provided an overall structure for students to follow:


I felt this class successfully found the middle ground between easy and hard, between hand-holding and throwing the student to the wolves. At times I felt the instructors gave us a bit too much help with their templates, but then I realized I was still learning the desired concepts while getting to program some pretty neat games. (Student I, Python MOOC)


However, many of the instructors of the MOOCs were also mindful of the needs of more advanced participants, or participants interested in pursuing the subject in more depth. Hence, in addition to providing extra resources, some MOOCs offered optional challenges or a different course specialism. For example, the Poetry MOOC developed an entire additional course, ModPoPlus, to supplement the main course. ModPoPlus enabled interested students to go deeper into the works of the various poets featured in the main course. The Design and Python MOOCs offered optional challenges such as the “Design Innovation Tournament” and the “Hall of Fame” programming competition. The Operations Management MOOC offered two different course tracks: an academic track (doing homework problems and an exam) and a practitioner track (completing a course project). The Irrational Behavior MOOC similarly offered two tracks, standard and distinction, to cater to different participants' needs:


The lecturer presented the content in a light weight way, but provided additional recommended readings for people who are interested to study more about the topics. The course also offers a distinction track, where students propose a solution to a problem and design an experiment to test the solution. (Student B, Irrational Behavior MOOC)


Online video featuring the instructor’s face and voice is quickly becoming a central feature in MOOCs. Previous studies have found that the inclusion of an instructor’s face provides a more intimate and personal feel compared to video instruction not including the instructor’s face (Guo, Kim, & Rubin, 2014; Kizilcec, Papadopoulos, & Sritanyaratana, 2014). Students also feel that a human face helps break up the monotony of PowerPoint slides and code screencasts (Kizilcec et al., 2014).


Even with the inclusion of the instructor's face in video instruction, one potential limitation is the tendency of students' minds to wander (Risko et al., 2012). It would therefore be very useful to know which types of video could engage students. A variety of video production styles can be identified from the 10 MOOCs (see Table 2). Specifically, the lecture video styles may be categorized using the following labels (the first four labels were adapted from Guo et al., 2014, p. 44):


a) Slides – visual presentation (e.g., PowerPoint slides, Excel spreadsheets, animations, graphics) with voice-over.

b) Code – video screencast of the instructor writing code in a command-line prompt or text editor.

c) Freehand – video of an instructor's freehand writing or sketching (either using a pen or a digital device such as a mouse) instead of computer-rendered fonts.

d) Classroom – video captured from a live classroom lecture with student participation, focusing mainly on the instructor's face.

e) Close-up – close-up shots of an instructor's face, or half-body shots standing at close range.

f) Artifact shots – close-up video shots of objects with voice-over.

g) Conversational-style – shots of instructors (usually two) having a conversation about the topic.

h) Instructor interviewing – features an instructor interviewing another expert or guest speaker on a specific course-related topic.

i) Panel discussion – video captured from live discussions involving the instructor(s) and a group of people (e.g., teaching assistants) interacting around a long table. The faces of the instructor and the teaching assistants were all captured on footage.


Table 2. Features of Video Used in the MOOCs

Name of MOOC | Video production style | Available subtitles | Video lecture length | Other video feature
Python | Tutorial/demonstration video format – mixture of: (a) close-up, (b) freehand-style explanations of code (with instructor's face shown in a small window), (c) code (with instructor's face shown in a small window) | English, Spanish, Portuguese, Chinese | 6–9 videos per week; median = 11 min, average = 11 min | Occasional in-video embedded questions that ask students to respond before continuing the video
Entrepreneur | Lecture-style video format – mixture of: (a) close-up and (b) classroom with slides (with instructor's face shown in a small window) | English, Portuguese, Spanish (some videos may include other languages such as Hebrew, Japanese, German) | 6–11 videos per week; median = 21 min, average = 19 min | Nil
Poetry | Panel discussion format using Socratic questions | English, Spanish (some videos may include other languages such as Dutch) | 7–12 videos per week, excluding miscellaneous videos and webcasts; median = 15 min, average = 15 min | Nil
Irrational Behavior | Lecture-style video format – mixture of: (a) close-up with occasional slides shown on right/left side of video and (b) slides (no face) | English, Chinese, Portuguese | 4–9 videos per week, excluding guest videos; median = 12 min, average = 12 min | Occasional in-video embedded questions
Management | Lecture-style video format – mixture of: (a) close-up and (b) slides with freehand-style explanations (no face) | English, Chinese (some videos may include other languages such as Spanish) | 3–9 videos per week; median = 9 min, average = 10 min | Occasional in-video embedded questions
Design | Lecture-style video format – mixture of: (a) close-up with occasional slides on right/left side of video, (b) freehand-style explanations (no face), and (c) artifact shots (no face) | English, Portuguese (some videos may include other languages such as Spanish, Russian, Persian) | 4–10 videos per week, excluding optional videos; median = 7 min, average = 8 min | Occasional in-video embedded questions
Epidemics | Lecture-style video format – (a) close-up with slides shown on left of video | English | 7–10 videos per week; median = 6 min, average = 6 min | Nil
Think | Lecture-style video format – mixture of: (a) conversational style in various settings and (b) instructor interviewing another expert/guest speaker | English | 4–12 videos per week; excluding uncut conversations, median = 5 min, average = 5 min | Nil
Analytics | Tutorial/demonstration video format – mixture of: (a) slides with occasional freehand-style explanations (no face), (b) close-ups used to introduce each week's topic, and (c) code (no face) | English | 14–23 videos per week; median = 4 min, average = 4 min | Nil
Myth | Lecture-style video format – (a) close-up with occasional slides on right/left side of video | English, Chinese (some videos may include other languages such as Korean, Portuguese) | 7–10 videos per week; median = 10 min, average = 10 min | Nil

Table 2 shows that a majority of the MOOCs predominantly employed the lecture-style video format (e.g., the instructor delivering a talk on particular topics). Two MOOCs utilized the tutorial or demonstration video format (e.g., problem-solving walkthrough, step-by-step coding), while one MOOC focused primarily on the panel discussion video format using Socratic questions. Regardless of the video-style format, three findings stood out.


First, by and large, students expressed a preference for short videos (typically not longer than 10 minutes) and the use of visuals such as animations that highlight important ideas to help students understand the content better:


I like the short video segments—usually 1–2 main concepts taught per a couple minute segment. (Student H, Design MOOC, median video length 7 minutes)

The videos were the perfect length, not too long and not too short. (Student X, Think MOOC, median video length 5 minutes)

Unlike other courses, there are many interesting animations in every video lecture that ensure that students are engaged and will understand what the instructors are saying. (Student M, Epidemics MOOC)


Second, lecture videos that featured different settings or instructor faces were also perceived to be more engaging:


I loved its ‘panel’ format. It offers relief from the problem of looking at the same face, with the same voice for an entire set of videos. Moreover, it’s really fun to feel like you are part of the discussion of the teacher and students in the video. (Student P, Poetry MOOC)

The teachers use video-lectures very effectively—the weekly material is chopped into short videos on specific topic, and they have different instructors with his own style of teaching which helps me to stay interested in the course. (Student B, Python MOOC)


Third, engagement is also fostered when the online videos are interpolated with short quizzes or tests that require students to answer questions about the content from the most recent lecture segment. Segmenting the whole lecture into a series of smaller video sequences of about 6–10 minutes each and providing a short quiz at the end of each sequence, or interrupting the videos with questions (in-video quizzes), can help sustain students' attention to lecture content, discourage task-irrelevant activities, and improve learning (Szpunar, Khan, & Schacter, 2013). For example:


I especially like the in-video quizzes that are auto-graded. These quizzes help minimize mind-wandering; they also test the concepts or ideas taught to help me remember and understand them better. (Student S, Python MOOC)


Finally, text still matters in the age of video lectures and tutorials. Several students remarked that it is harder to search for particular information in videos than in text:


I would have preferred an alternate written narrative as a supplement / complement to the video. Reading would be quicker to get through than watching. (Student Y, Epidemics MOOC)

I'd appreciate supporting articles in pdf that could save time compared to video lectures. (Student J, Python MOOC)


Students also like it when the instructors provide summaries of the major ideas covered in lectures:


The one thing I would have liked added to the course would have been a week by week “cheat sheet” with major topics/concepts briefly outlined on it. (Student L, Epidemics MOOC)


PEER INTERACTION

Peer interaction can promote student engagement:

a. Award student posts with marks
b. Provide an opportunity for students to interact with peer raters regarding submitted assignments
c. Organize online live sessions for the instructor and fellow students to ask questions and exchange ideas


Peer interaction provides an avenue for individuals to exchange thoughts and ideas (Dunlap, 2005). Interaction with others is an important condition that facilitates peer relatedness (Reeve, 2005). In online classes, most of the peer interactions take place in the asynchronous discussion forums (Coetzee et al., 2014; Warren et al., 2014). Yet, previous research has found that only a few students post in forums (Breslow et al., 2013). Most students are mere viewers of messages.


So how can we encourage people to interact in MOOCs? MOOCs that drew relatively more participant comments about the occurrence of peer interaction (e.g., the Myth, Poetry, Python, and Epidemics MOOCs) appeared to use one or more of the following strategies: (a) awarding marks for forum postings (e.g., the Epidemics MOOC), (b) providing an opportunity for students to interact with peer raters regarding their submitted assignments (e.g., the Poetry MOOC), and (c) organizing online live sessions for the instructor and fellow students to ask questions and exchange ideas (e.g., the Myth and Poetry MOOCs).


Students in the Epidemics MOOC were asked to participate in the forums by posting at least 10 messages. They were also awarded marks for their contributions (2 points per post). Awarding marks for student posting can serve as a starting point to foster peer interaction. This is because a certain “critical mass” of postings is a prerequisite for an online interaction to occur (Schellens et al., 2005).


According to Warren et al. (2014), peer review is an inherently social activity. By and large, students who submit their work desire feedback on their assignments and want to discuss that feedback with their reviewers. Therefore, providing an opportunity for students to interact with peer raters regarding their submitted assignments (e.g., to seek clarification) is a viable strategy for attracting participants to contribute to an online discussion. This strategy was implemented in the Poetry MOOC. Analysis of the assignment forums revealed a substantial amount of activity (628–1,267 threads per assignment, with 5–82 responses each), compared with other MOOCs (e.g., the Design MOOC) that did not employ this strategy (34 threads in the entire "Project Websites" assignment forum, with 2–33 responses each).


Several MOOCs offered live online interactive sessions to enhance peer interaction in discussing the course topics. The teaching staff of the Poetry MOOC organized weekly live webcasts in which students discussed course topics. Students from many parts of the world could join the live discussions through the telephone, Facebook, and the course forums. The instructor of the Myth MOOC offered live video "screenside chat" sessions to discuss course content. Students could participate in these sessions by posting questions or comments on the class forum or Twitter. The instructor would then answer these questions in a live video streamed on YouTube:


I think the screen side chats were a great place to hear more discussion and multiple perspectives on these topics. (Student D, Myth MOOC)


Research Question 2: What Elements Related to the Course Design or Teaching Staff Did Students Find Frustrating?


A total of 134 comments related to student disaffection were found. The participants' comments were examined using the grounded content analysis method (Strauss & Corbin, 1990), yielding nine reasons that could lead to student disaffection with online learning at scale (see Figure 4). A second coder independently coded a subset of 50 comments; intercoder agreement was 96%.


Figure 4. Student disaffection with learning at scale



The two most commonly reported student disaffections across the 10 MOOCs were due to forum- and peer-related issues. There were several frustrations about the use of forums in a MOOC, such as rude and arrogant posts:


Forum idea is a very good which gives you a chance to share information and get in touch with others BUT some people in the forum are very arrogant and oppressive with their positions, even impolite. (Student K, Epidemics MOOC)


The major forum-related disaffection was the problem of information overload, which made it difficult for students to find relevant posts to read:


Not sure how it should be improved but once the forums get busy, it is tedious to dig through to find relevant information. (Student B, Python MOOC)

Another main reason for student disaffection was the use of peer review activities. Typically, students were asked to review several (usually five) of their peers' submissions. Students were mostly frustrated with the superficial or inconsistent feedback they received from their peers:


There are a small minority who don't take peer reviews serious and just click through giving full credit even when it isn't deserved. (Student U, Python MOOC)

I understand the need for and value of peer assessments, but a couple of times the scores I received were wildly different between peer reviewers. (Student K, Python MOOC)


Other disaffections included the following:


(a) Assignment-related issues, including allowing too many quiz retake attempts, using automatic grading for writing assignments, and unclear rubrics: "There are 100 attempts to pass each quiz. And after each attempt I can see which exactly question I miss. I passed all the tests just picking random answers even without reading question. I intentionally did not open lectures after the second week" (Student Z, Epidemics MOOC); "The automatic writing assessment methodology (based on keywords, rather than content) is not really effective and I consider it a major problem" (Student U, Entrepreneur MOOC); "I don't like the grading rubric because it is so open to interpretation" (Student P, Python MOOC).

(b) A dislike of reading academic papers: "The readings were tough and then to be tested on the content was a bummer" (Student C, Behavior MOOC).

(c) Unhappiness about the claims or perceived biases in the teachings: "The course has its good point but is too heavily influenced by 'experts' who seem to hate religion and people of faith" (Student T, Think MOOC).

(d) Flawed or unclear wording or instructions in the quizzes, subtitles, or assignment deadlines: "The weekly tests are crafted in such a confusing way that avoidable mistakes are made" (Student A, Myth MOOC).

(e) Short and inflexible deadlines: "The pace of having something due every two days or so made it difficult to keep up if weekends were taken up with other things and one had to work as well" (Student L, Python MOOC).

(f) The lack of instructor feedback: "I would have preferred that the teacher provide feedback on one of the exercises we do. Peer feedback may be useful but teachers should also give their students some comments about their work" (Student Q, Design MOOC).

(g) Video style: student disaffection with lecture videos centered mainly on the Entrepreneur MOOC, primarily because of the small shot of the instructor's face and the length of the videos: "The only dislike I had was the small screen shot of the instructor. The little window down in the corner of the screen was out of proportion with the text. I would have preferred a larger screen" (Student R, Entrepreneur MOOC); "This course consists of heinously long videotaped lectures" (Student L, Entrepreneur MOOC). The median and average lengths of the lecture videos in the Entrepreneur MOOC (21 and 19 minutes, respectively) were the longest among the MOOCs studied.


DISCUSSION


In this paper I explored the factors that can promote student engagement or lead to student disaffection in large-scale online classes. Five main implications emerged from this study. First, across the different MOOCs, problem-centric learning supported by clear explanations was the most frequently described engagement factor. This lends support to the assertion that students learn more when instruction focuses on real-world tasks or problems (e.g., Mayer, 1992; Merrill, 2002). The finding implies that instructors should first and foremost emphasize problem-solving in their lessons rather than merely conveying information or demonstrating skills in a decontextualized setting. Strategies such as providing actual examples of how the principles taught apply to real life, giving practical or interesting information, and using guest speakers who fit into the flow of the main topic are particularly welcomed by students because these elements add valuable insights to their learning.


Second, despite the obligation-free nature of MOOCs (no course fees and no actual course credit for successful completion of assignments), it is interesting to note that students still desire activities or assignments that challenge them to think. The implication here is that instructors should avoid assignments that merely test factual recall. Instead, instructors should employ strategies such as asking students to apply the concepts learned to solve real-world problems. The activities or assignments should also come at varying difficulty levels that students can select, enabling them to exercise a sense of autonomy by choosing an activity that matches their ability while being stimulated to attempt more difficult tasks.


Third, among the least frequently mentioned factors that could engage students were peer interaction and instructor availability. This implies that, unlike in traditional face-to-face or online classes, students do not seem to attach much importance to peer interaction or teacher availability in large-scale online courses. Students probably accept that it is impossible for an instructor to be available to individual learners in large-scale courses. It is also likely that the anonymous nature of MOOCs, combined with job or family responsibilities, diminishes student expectations of course interaction with peers or the instructor. One possible method to promote peer interaction is to use face-to-face study groups. In probably one of the earliest studies exploring MOOC students' use of weekly face-to-face study groups, Chen and Chen (2015) found that study groups could foster a strong sense of community, provide motivation for learning, and broaden students' perspectives on the content.


Fourth, certain strategies for engaging students seem to be more appropriate for certain types of courses or subject matter. The active learning strategy of using sandboxes or simulations was employed more often in STEM (science, technology, engineering and mathematics) courses such as Python Programming and Epidemics. Sandboxes and simulations enable students to experience phenomena they would not normally experience firsthand.


Non-STEM courses more often used peer review to provide feedback. Non-STEM courses such as Entrepreneurship, Poetry, Operations Management, and Mythology relied more on open-ended written assignments or projects that required subjective judgment. Although automated essay-scoring algorithms can be used to grade essays, some students find such programs too rigid and limited because they recognize only certain keywords in the written work. Therefore, the only approach currently used to provide feedback at scale on open-ended writing or projects is peer review. Nevertheless, not all peer reviews work perfectly. The findings of Study 2 revealed that students were mainly frustrated with the superficial or inconsistent feedback they received from their peers.


Participants suggested some possible solutions to curb the problem of superficial feedback during peer review: requiring reviewers to comment at a minimum length (e.g., 200 words), which would prevent students from merely assigning a grade and leaving no qualitative feedback at all, or keeping the same group of students together throughout the peer review activities. The latter creates a sense of mutual accountability that could further help ameliorate superficial reviews.


One possible solution to the problem of inconsistent comments is to conduct rating calibration exercises prior to the actual peer review activities. One participant reported that this method was implemented in another MOOC (Introductory Physics I). Before grading the written works, peer reviewers had to complete two calibration exercises on papers that had already been graded by the instructor. Such calibration exercises can help reduce some of the discrepancies that could exist among different peer raters (Suen, 2014).
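To make the idea concrete, here is a minimal sketch of what a calibration check might look like, assuming numeric rubric scores; the papers, scores, and tolerance threshold are all hypothetical, since the study does not describe the exercises at this level of detail.

```python
# Hypothetical calibration check: before reviewing peers' work, a
# trainee grades papers the instructor has already graded, and
# passes only if every score falls close to the instructor's.

INSTRUCTOR_SCORES = {"paper_1": 8, "paper_2": 5}  # instructor-graded references
TOLERANCE = 1  # maximum allowed deviation per paper (an assumption)

def passes_calibration(trainee_scores: dict[str, int]) -> bool:
    """Return True if every calibration score is within TOLERANCE
    of the instructor's score for the same paper."""
    return all(
        abs(trainee_scores[paper] - reference) <= TOLERANCE
        for paper, reference in INSTRUCTOR_SCORES.items()
    )

print(passes_calibration({"paper_1": 7, "paper_2": 5}))  # True: close enough
print(passes_calibration({"paper_1": 4, "paper_2": 5}))  # False: grading too far off
```

A course team could require reviewers to repeat the exercise until they pass, which is one way the discrepancies among raters might be reduced before real reviews begin.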


Programming courses such as Python and Analytics more frequently used tutorial or demonstration video styles that provided step-by-step walkthroughs. These videos were usually supported with screencasts of the instructors writing code in a command-line prompt or text editor.


Fifth, what elements related to the course design or teaching staff did students find frustrating? Answers to this question are important because they could help improve the student experience of MOOCs. Content analysis of 134 comments yielded nine reasons that could lead to student disaffection with online learning at scale. Of these nine reasons, some may indeed apply to any online course. For example, dissatisfaction related to flawed or unclear wording or instructions in the course materials, short and inflexible deadlines, superficial peer feedback, and perceived biases in the teaching could arise in either traditional online courses or MOOCs. It is reasonable to expect that students would feel frustrated by unclear assignment instructions, superficial peer feedback, or overly strict deadlines, whether they are in a traditional online class of tens of students or in a large online class of thousands or tens of thousands of students.


However, unlike in traditional online classes, certain student dissatisfactions are more specific to, or exacerbated in, MOOCs. These include information overload in forums and lack of instructor feedback. Indeed, information overload was the major student dissatisfaction concerning the use of discussion forums in MOOCs. Unlike a traditional online class consisting of tens of students, the sheer number of students in a MOOC (especially a highly rated one) can easily create information overload. Information overload due to a profusion of posts makes it hard for students to search for relevant information and subsequently discourages them from sharing their opinions with other students in the forums.


Lack of instructor feedback is another dissatisfaction that is more common in MOOCs than in traditional online courses. Although most students understand that it is impossible for an instructor to be available to individual learners in large-scale courses, some students wish that the instructor, rather than a teaching assistant, would answer their questions or respond to their comments in the online forums.


In order to manage the problem of information overload, an instructor may consider using one or more of the following strategies. First, an instructor could prepopulate the forums with specific threads (Liyanagunawardena, Kennedy, & Cuffe, 2015). For example, within the “Study group” forum, an instructor may precreate threads according to students’ national backgrounds, such as Russian or Chinese. Or within the “Weekly lectures” forum, an instructor might precreate threads according to the specific types of discussion topics anticipated in the course.


The prepopulated threads would promote a closer relationship between the specific discussion forums and course topics, hence minimizing off-topic discussions. The prepopulated threads would also allow participants to more readily identify topics to join based on common interests or contexts (Liyanagunawardena et al., 2015). Furthermore, by prepopulating discussion forums with threads according to the types of discussions anticipated in the course, instructors are able to reduce the problem of students creating various threads on their own that may overlap one another. This would help further reduce the problem of information overload.
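As an illustration of this strategy, the sketch below seeds forums with precreated threads along the lines just described; the forum and thread names follow the examples in the text, while the create_thread helper is a stand-in for whatever administrative interface a given MOOC platform actually provides.

```python
# Illustrative only: seed each forum with precreated threads so that
# learners join existing topics instead of spawning overlapping ones.

SEED_THREADS = {
    "Study group": [
        "Study group: Russian-speaking learners",
        "Study group: Chinese-speaking learners",
    ],
    "Weekly lectures": [
        "Week 1: Questions about the lecture videos",
        "Week 1: Applying this week's concepts",
    ],
}

def create_thread(forum: str, title: str) -> None:
    # Stand-in for a real platform's thread-creation call.
    print(f"[{forum}] created thread: {title}")

def prepopulate(seed: dict[str, list[str]]) -> None:
    for forum, titles in seed.items():
        for title in titles:
            create_thread(forum, title)

prepopulate(SEED_THREADS)
```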


Second, more specific instructions about using or searching for information within the forums could be provided at the beginning of the course and reinforced at the beginning of each forum page. For example, to reduce repeated questions and answers, students could be reminded to search for an answer before asking a question.


Third, a MOOC forum may require participants to tag their questions. The use of tags is currently practiced by Stack Overflow, a popular and active open Q&A site. As of early August 2010, Stack Overflow had 300,000 registered users, 833,000 questions, 2.2 million answers, and a high question-answered rate of 92.6% (Mamykina, Manoim, Mittal, Hripcsak, & Hartmann, 2011). On Stack Overflow, each question can have up to five tags, since a question might relate to several topics. A tag is a label or keyword that categorizes a question with other similar questions; using the right tags makes it easy for other people to find and answer a question (Stack Overflow, 2015).
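The sketch below illustrates the mechanics of such a tagging scheme with an inverted index from tags to questions; the questions and tags are invented for illustration, and a real forum would of course back this with a database rather than in-memory lists.

```python
from collections import defaultdict

MAX_TAGS = 5  # Stack Overflow's per-question limit, as noted above

# Hypothetical forum questions, each carrying a handful of tags.
questions = [
    ("How do I reverse a list?", ["python", "lists"]),
    ("When is the Week 3 quiz due?", ["logistics", "quiz"]),
    ("What assumptions does the SIR model make?", ["modeling", "epidemics"]),
]

# Inverted index: tag -> titles of questions carrying that tag,
# so readers can jump straight to topics they care about or can answer.
index: dict[str, list[str]] = defaultdict(list)
for title, tags in questions:
    assert len(tags) <= MAX_TAGS
    for tag in tags:
        index[tag].append(title)

print(index["quiz"])  # ['When is the Week 3 quiz due?']
```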


In order to address the lack of instructor feedback in MOOCs, an instructor may consider implementing one or more of the following three suggestions. First, offer a fee-paying consultation option for individual students who wish to consult with the teaching staff. For example, the Epidemics MOOC offers on-demand assistance via a video chat (Skype) service called "Teeays." Learners can choose either the "Free Plan" or the "ExperTA Plan." The "Free Plan" is free and connects a learner to any available teaching assistant, chosen at random. The "ExperTA Plan" provides learners with the service of vetted teaching assistants who have received high ratings from other learners. In this plan, learners are charged a base fee of $4.99 plus $0.39 for every minute after the 10th minute. This scheme could be further enhanced by giving learners the option to connect, at a higher fee, to the professor or instructor who teaches the course.
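As a worked example of this pricing, the sketch below computes the cost of an "ExperTA Plan" session from the base fee and per-minute rate stated above; how partial minutes are billed is an assumption, since the text does not say.

```python
def experta_cost(session_minutes: int) -> float:
    """Cost of an 'ExperTA Plan' session: a $4.99 base fee plus
    $0.39 for every minute after the 10th. Billing of partial
    minutes is an assumption."""
    BASE_FEE = 4.99
    RATE_AFTER_TENTH_MINUTE = 0.39
    INCLUDED_MINUTES = 10
    billable = max(0, session_minutes - INCLUDED_MINUTES)
    return round(BASE_FEE + RATE_AFTER_TENTH_MINUTE * billable, 2)

# A 25-minute consultation costs $4.99 + 15 * $0.39 = $10.84.
print(experta_cost(25))  # 10.84
```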


Second, an instructor may consider randomly choosing 3–5 student assignments each week and providing feedback on them. The instructor may grade these assignments and video-record the comments. Other students could then watch the instructor's comments on these graded assignments and understand the main points the instructor is looking for.


Third, an instructor may also consider providing exemplars, as suggested by some students:

One thing that I would have liked is to see the instructor solutions to the mini-projects after the deadline to submit had expired. This would allow us to compare our solutions to the instructor's. (Student K, Python MOOC)


Exemplars such as the instructor's own sample answers or solutions to the assignments represent the concrete embodiment of standards (To & Carless, 2015) and accordingly can support student learning so that higher quality outcomes are produced (Orsmond, Merry, & Reiling, 2002). The following paragraphs describe possible solutions to students' other frustrations.


Instead of making the reading of academic papers a mandatory course requirement (students need to read these papers in order to answer the quizzes), instructors could provide handouts that summarize the main findings in layman's language. The academic papers themselves can be offered as optional resources for students who are interested in them.


Instructors should strive to present two-sided refutational messages concerning the claims made in the course because they are more convincing compared to one-sided or two-sided nonrefutational messages (Murphy, 2001). Two-sided refutational messages present both the arguments for an issue as well as its counterarguments, but one side is generally promoted while the other is refuted (Murphy, 2001).


Provide some flexibility in assignment deadlines, although this may be difficult in a course with tens of thousands of students. Instead of allowing total flexibility, an instructor may consider allowing autograded assignments such as quizzes to be completed by the end of the course rather than at the end of each week. This method was implemented in the Epidemics MOOC and received favorable comments from many students:


It is great that they did not close the quizzes and discussion forums at the end of each week, but rather made them all due by the end of the course. This allowed students with variable schedules the flexibility to keep up. (Student Q)


Another way is to implement a set number of late days, each with its own penalty, so that students are not overly penalized if they cannot meet a deadline. This scheme was employed in the Entrepreneur MOOC, where assignments turned in after the deadline received a 5%-per-day late penalty for up to 7 days, after which no credit was given.
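For illustration, here is a minimal sketch of how the Entrepreneur MOOC's stated policy (a 5% deduction per day late, for up to 7 days, then no credit) might be applied to a raw score; treating partial days as whole late days is an assumption.

```python
def apply_late_penalty(raw_score: float, days_late: int) -> float:
    """Deduct 5% of the raw score per day late, up to 7 days;
    beyond 7 days, no credit is given."""
    if days_late <= 0:
        return raw_score   # on time: no penalty
    if days_late > 7:
        return 0.0         # past the late window: no credit
    return raw_score * (1 - 0.05 * days_late)

# Two days late on a 90-point assignment: 90 * 0.90 = 81 points.
print(apply_late_penalty(90, 2))  # 81.0
```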


CONCLUSION


This study gathered and analyzed data from participants in one or more of 10 highly rated MOOCs, who provided information about the courses they took. There are four main limitations of Study 2. First, this study relied on participants' reflections guided by several questions built into the feedback form on CourseTalk. These questions pertained to the course instructor, the tools and assessments used, the community, the overall course design, and participants' dislikes and suggestions for improvement. Not all participants posted their comments according to these prompts, and it is likely that participants mentioned only the factors they perceived to be most engaging. In addition, these questions, although useful, might have restricted participants to answering only in response to the prompts; consequently, other important factors might have been missed.


Second, of the 4,466 participants' reflections on their MOOC experiences, only 134 comments spoke of dissatisfaction. Participants might have been reluctant to voice their dissatisfaction publicly on CourseTalk, or those who bothered to post reflections may largely have been individuals who liked the MOOCs; participants who were dissatisfied might not have been interested in posting reviews.


Third, it should be noted that highly rated courses are not necessarily the most effective ones. It is beyond the scope of this study to examine the causal relationship between course effectiveness (e.g., completion rate) and user ratings.


Fourth, CourseTalk did not indicate whether student comments came from students taking the MOOC for course credit or from students who were taking it for other reasons. As a result, the current study was not able to analyze how the comments from these two groups of students might differ.


Moving forward, future research could collect student questionnaire data to investigate which types of engagement components (behavioral, affective, or cognitive) are significant predictors of student learning. Student interviews could also be carried out to gain a more in-depth understanding of the specific factors and strategies that could build a sense of autonomy, relatedness, and competency. Further studies are needed to understand how and why autonomy, relatedness, and competency might lead to a positive response to MOOCs. Collectively, the findings and implications presented in this study provide practical guidelines to other instructors of large online courses. The findings may also offer tips for instructors of traditional e-learning courses. Although we cannot generalize the findings of this study to traditional e-learning courses, it is possible that, at the very least, the information presented here may suggest probable solutions for such courses that might otherwise be overlooked.


Acknowledgment


This research was supported by a grant from the Research Grants Council of Hong Kong (Project reference no: 17651516).


References


Alario-Hoyos, C., Perez-Sanagustin, M., Kloos, C. D., & Munoz-Merino, P. J. (2014). Recommendations for the design and deployment of MOOCs: Insights about the MOOC digital education of the future deployed in MiríadaX. In Proceedings of the Second International Conference on Technological Ecosystems for Enhancing Multiculturality (pp. 403–408).

Anderson, A., Huttenlocher, D., Kleinberg, J., & Leskovec, J. (2014). Engaging with massive online courses. In Proceedings of the 23rd International Conference on World Wide Web (pp. 687–698).

Birch, S., & Ladd, G. (1997). The teacher–child relationship and children’s early school adjustment. Journal of School Psychology, 35, 61–79.

Bonwell, C. C., & Eison, J. A. (1991). Active learning: Creating excitement in the classroom (ASHE-ERIC Higher Education Report no. 1). Washington, DC: George Washington University, School of Education and Human Development.

Bouzidi, L., & Jaillet, A. (2009). Can online peer assessment be trusted? Educational Technology & Society, 12(4), 257–268.

Breslow, L., Pritchard, D. E., DeBoer, J., Stump, G. S., Ho, A. D., & Seaton, D. T. (2013). Studying learning in the worldwide classroom: Research into edX's first MOOC. Research & Practice in Assessment, 8, 13–25.

Chen, Y-H., & Chen, P-J. (2015). MOOC study group: Facilitation strategies, influential factors, and student perceived gains. Computers & Education, 86, 55–70.

Coetzee, D., Fox, A., Hearst, M. A., & Hartmann, B. (2014). Should your MOOC forum use a reputation system? In Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing (pp. 1176–1187).

Coffrin, C., de Barba, P., Corrin, L., & Kennedy, G. (2014). Visualizing patterns of student engagement and performance in MOOCs. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 83–92).

Concepcion, D. W. (2009–10). Transparent alignment and integrated course design. Essays on Teaching Excellence, 21(2). Retrieved from http://podnetwork.org/content/uploads/V21-N2-Concepcion.pdf

Costello, L. (2013). CourseTalk [Blog post]. Retrieved from https://newlearningtimes.com/cms/article/897

Courtney, K. K. (2013). The MOOC syllabus blues: Strategies for MOOCs and syllabus materials. College & Research Libraries News, 74(10), 514–517.

Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human behavior. New York, NY: Plenum Press.

Delaney, J., Johnson, A., Johnson, T., & Treslan, D. (2010). Students’ perceptions of effective teaching in higher education. St John’s, NL: Distance Education and Learning Technologies. Retrieved from http://www.uwex.edu/disted/conference/Resource_library/handouts/28251_10H.pdf

Educause (2013, June 11). 7 things you should know about MOOCs II. Retrieved from https://library.educause.edu/resources/2013/6/7-things-you-should-know-about-moocs-ii

Finn, J. D., & Zimmer, K. S. (2012). Student engagement: What is it? Why does it matter? In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 97–131). New York, NY: Springer.

Fox, A. (2013). From MOOCs to SPOCs. Communications of the ACM, 56(12), 38–40.

Fredricks, J. A., Blumenfeld, P. C., & Paris, A. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74, 59–119.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. PNAS, 111(23), 8410–8415.

Furrer, C., & Skinner, E. (2003). Sense of relatedness as a factor in children's academic engagement and performance. Journal of Educational Psychology, 95, 148–162.

Furrer, C. J., Skinner, E. A., & Pitzer, J. R. (2014). The influence of teacher and peer relationships on students’ classroom engagement and everyday resilience. In D. J. Shernoff & J. Bempechat (Eds.), Engaging youth in schools: Empirically-based models to guide future innovations, National Society for the Study of Education Yearbook (Vol. 113, pp. 101–123). New York, NY: Columbia University, Teachers College.

Gore, H. (2014). Massive open online courses (MOOCs) and their impact on academic library services: Exploring the issues and challenges. New Review of Academic Librarianship, 20(1), 4–28.

Guo, P. J., Kim, J., & Rubin, R. (2014). How video production affects student engagement: An empirical study of MOOC videos. In Proceedings of the First ACM Conference on Learning @ Scale (pp. 41–50). New York, NY: ACM Press.

Haggard, S. (2013, September). The maturing of the MOOC (Research paper no. 130). Retrieved from https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/240193/13-1173-maturing-of-the-mooc.pdf

Hew, K. F. (2016). Promoting engagement in online courses: What strategies can we learn from three highly-rated MOOCs. British Journal of Educational Technology, 47(2), 320–341.

Hew, K. F., & Cheung, W. S. (2014). Students’ and instructors’ use of massive open online courses (MOOCs): Motivations and challenges. Educational Research Review, 12, 45–58.

Hoffman, M., Richmond, J., Morrow, J., & Salomone, K. (2002). Investigating sense of belonging in first year college students. Journal of College Student Retention, 4(3), 227–256.

Jang, H., Reeve, J., & Deci, E. L. (2010). Engaging students in learning activities: It is not autonomy support or structure but autonomy support and structure. Journal of Educational Psychology, 102(3), 588–600.

Kizilcec, R. F., Papadopoulos, K., & Sritanyaratana, L. (2014). Showing face in video instruction: Effects on information retention, visual attention, and affect. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2095–2102).

Kizilcec, R. F., Piech, C., & Schneider, E. (2013). Deconstructing disengagement: Analyzing learner subpopulations in massive open online courses. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (pp. 170–179). New York, NY: ACM.

Kolowich, S. (2013). The professors who make the MOOCs. Chronicle of Higher Education, 59(28), A20–A23.

Liyanagunawardena, T. R., Adams, A. A., & Williams, S. A. (2013). MOOCs: A systematic study of the published literature, 2008–2012. International Review of Research in Open and Distance Learning, 14(3), 202–227.

Liyanagunawardena, T. R., Kennedy, E., & Cuffe, P. (2015). Design patterns for promoting peer interaction in discussion forums in MOOCs. Retrieved from http://www.openeducationeuropa.eu/en/article/Design-Patterns-for-Open-Online-Teaching-and-Learning-Design-Paper-42-7

Lombardi, M. M. (2013). The inside story: Campus decision making in the wake of the latest MOOC tsunami. MERLOT Journal of Online Learning and Teaching, 9(2), 239–248.

Losier, G. F., Perreault, S., Koestner, R., & Vallerand, R. J. (2001). Examining individual differences in the internalization of political values: Validation of the self-determination scale of political motivation. Journal of Research in Personality, 35(1), 41–61.

Mackness, J., Mak, S., & Williams, R. (2010). The ideals and reality of participating in a MOOC. In L. Dirckinck-Holmfeld, V. Hodgson, C. Jones, M. de Laat, D. McConnell, & T. Ryberg (Eds.), Proceedings of the 7th International Conference on Networked Learning, 2010 (pp. 266–275). Lancaster, UK: Lancaster University.

Maddox, S. J., & Prinz, R. J. (2003). School bonding in children and adolescents: Conceptualization, assessment, and associated variables. Clinical Child and Family Psychology Review, 6, 31–49.

Mahatmya, D., Lohman, B. J., Matjasko, J. L., & Farb, A. F. (2012). Engagement across developmental periods. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 45–63). New York, NY: Springer.

Mamykina, L., Manoim, B., Mittal, M., Hripcsak, G., & Hartmann, B. (2011). Design lessons from the fastest Q&A site in the West. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2857–2866). New York, NY: ACM Press.

Mayer, R. E. (1992). Thinking, problem solving, cognition (2nd ed.). New York, NY: W.H. Freeman.

Merrill, M. D. (2002). First principles of instruction. Educational Technology Research and Development, 50(3), 43–59.

Murphy, P. K. (2001). Teaching as persuasion: A new metaphor for a new decade. Theory into Practice, 40(4), 224–227.

Niemiec, C. P., Lynch, M. F., Vansteenkiste, M., Bernstein, J., Deci, E. L., & Ryan, R. M. (2006). The antecedents and consequences of autonomous self-regulation for college: A self-determination theory perspective on socialization. Journal of Adolescence, 29, 761–775.

Orsmond, P., Merry, S., & Reiling, K. (2002). The use of exemplars and formative feedback when using student derived marking criteria in peer and self-assessment. Assessment & Evaluation in Higher Education, 27(4), 309–323.

Phillips, J. M. (2005). Strategies for active learning in online continuing education. Journal of Continuing Education in Nursing, 36(2), 77–83.

Reeve, J. (2005). Understanding motivation and emotion (4th ed.). New York, NY: John Wiley & Sons, Inc.

Richardson, A. (2014, April 17). Coursetalk and edX collaborate to integrate online course reviews platform [Press release]. Retrieved from http://www.businesswire.com/news/home/20140417005360/en#.VCohj2eSx8F

Rotgans, J. I., & Schmidt, H. G. (2011). Cognitive engagement in the problem-based learning classroom. Advances in Health Science Education, 16, 465–476.

Roth, M. S. (2013). My modern experience teaching a MOOC. Chronicle of Higher Education, 59(34), B18–21.

Skinner, E. A., & Belmont, M. J. (1993). Motivation in the classroom: Reciprocal effects of teacher behavior and student engagement across the school year. Journal of Educational Psychology, 85, 571–581.

Skinner, E. A., Furrer, C., Marchand, G., & Kindermann, T. (2008). Engagement and disaffection in the classroom: Part of a larger motivational dynamic? Journal of Educational Psychology, 100, 765–781.

Skinner, E. A., & Pitzer, J. R. (2012). Developmental dynamics of student engagement, coping, and everyday resilience. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 21–44). New York, NY: Springer.

Strauss, A., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Newbury Park, CA: Sage Publications.

Suen, H. K. (2014). Peer assessment for massive open online courses. International Review of Research in Open and Distributed Learning, 15(3). Retrieved from http://www.irrodl.org/index.php/irrodl/rt/printerFriendly/1680/2904

Szpunar, K. K., Khan, N. Y., & Schacter, D. L. (2013). Interpolated memory tests reduce mind wandering and improve learning of online lectures. PNAS, 110(16), 6313–6317.

To, J., & Carless, D. (2015). Making productive use of exemplars: Peer discussion and teacher guidance for positive transfer of strategies. Journal of Further and Higher Education, 1–19.

Wang, Y., & Baker, R. (2014). MOOC learner motivation and course completion rates. Retrieved from http://www.moocresearch.com/wp-content/uploads/2014/06/MRI-Report-WangBaker-June-2014.pdf

Warren, J., Rixner, S., Greiner, J., & Wong, S. (2014). Facilitating human interaction in an online programming course. In Proceedings of the 45th ACM Technical Symposium on Computer Science Education (pp. 665–670).

Wiggins, G., & McTighe, J. (1998). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.

Williams, G. C., McGregor, H. A., Sharp, D., Levesque, C., Kouides, R. W., Ryan, R. M., & Deci, E. L. (2006). Testing a self-determination theory intervention for motivating tobacco cessation: Supporting autonomy and competence in a clinical trial. Health Psychology, 25, 91–101.

Yin, R. K. (2014). Case study research: Design and methods (5th ed.). Thousand Oaks, CA: Sage.

Young, J. R. (2013). What professors can learn from “hard core” MOOC students. Chronicle of Higher Education, 59(37), A4.




Cite This Article as: Teachers College Record, Volume 120, Number 1, 2018, pp. 1–40.