Sunday, 24 November 2013

Liminal Participants and Skilled Orienteers: Learner Participation in a MOOC for New Lecturers



Marion Waite

Senior Lecturer and Brookes Teaching Fellow
Department of Clinical Health Care
Oxford Brookes University
Oxford OX3 0BP UK
mwaite@brookes.ac.uk

Jenny Mackness
Independent Education Consultant
Lancaster LA6 1ND UK
jenny.mackness@btopenworld.com

George Roberts
Educational Developer (E-Learning)
Oxford Centre for Staff and Learning Development
Oxford Brookes University
Oxford OX3 0BP UK
groberts@brookes.ac.uk

Elizabeth Lovegrove
Learning Technologist
Oxford Centre for Staff and Learning Development
Oxford Brookes University
Oxford OX3 0BP UK
ejlovegrove@brookes.ac.uk

Abstract

This case study explored learner participation in First Steps in Learning and Teaching in Higher Education (FSLT12), a short massive open online course (MOOC) aimed at introducing learning and teaching in higher education that was offered by Oxford Brookes University in June 2012. Both novice and experienced MOOC learners joined the course. The aim of the case study was to explore triggers for active participation. A mixed-methods approach was utilized in order to collect and analyze data from focus groups, individual interviews, participant blog posts, and a survey. The lenses of social constructivism, connectivism, and community of practice theories were used to enhance understanding of participation in FSLT12. Three main themes emerged: (1) Navigation: New participants felt overwhelmed by technical issues, multiple channels, and a perceived need to multitask, while experienced learners were judicious about planning their route; (2) Transformative learning: Ultimately, learners experienced a transformative shift, but it required reflection on practice, community support, and self-organization; (3) Reciprocal Relationships: New learners needed time to determine their audience and core community, as well as to realize mutual relationships within that community. Learners in a MOOC inhabit a liminal space. Active MOOC participants are skilled orienteers. Engaging local expertise of experienced MOOC learners and developing participatory skills in new learners is a key strategy for those who organize and facilitate MOOCs.

Keywords: massive open online course (MOOC), connectivist massive open online course (cMOOC), threshold concepts, navigation, liminality, transformative learning, participation, reciprocity

Introduction and Background

This case study explores the learner experience of participation in First Steps in Learning and Teaching in Higher Education (FSLT12), a course for new lecturers in higher education offered by Oxford Brookes University over a five-week period during May and June 2012, and one of the very first massive open online courses (MOOCs) to be run in the United Kingdom. FSLT12, which was funded by the Joint Information Systems Committee (JISC) and The Higher Education Academy (HEA) program on open educational resources, was modest in MOOC terms in that it attracted 206 participants. Nevertheless, this case study demonstrates some important issues about the experience of MOOC participation from the perspective of the learner, which may determine active participation.

Other researchers (Chamberlin & Parish, 2011; Cormier, 2010a, 2010b; McAuley, Stewart, Siemens, & Cormier, 2010; Weller, 2011) have identified potential advantages of MOOC participation, for example opportunities to build personal networks and make connections beyond the content of a course. In spite of these espoused benefits, it has been acknowledged that participation in a MOOC may be challenging and troublesome:

Participation in a MOOC is emergent, fragmented, diffuse, and diverse. It can be frustrating. It's not unlike life. (Bonnie Stewart, in her narrative introduction in McAuley et al., 2010, p. 5)

Evaluation of FSLT12

An evaluation strategy for FSLT12 was integrated with the course design. An identified goal was to gain more knowledge about the experiences of a diverse group of participants, their interaction with content and with one another during FSLT12.

The research and evaluation approach began as a general topic: to understand more about differential participation within FSLT12. Specific research questions were refined following feedback from the project critical friend (assigned by JISC/HEA) and were as follows:

1. What was the learner experience of participation in FSLT12?
2. How did learners interact with content in FSLT12?
3. How did learners interact with each other in FSLT12?

Bonnie Stewart (in her narrative introduction in McAuley et al., 2010) identifies a range of inhibiting factors for participation within a MOOC. These include not appealing to those most comfortable in a formal educational environment, lack of accreditation, lack of familiarity with the digital skills required, lack of access to the tools required for participation, lack of scaffolding and support, language barriers, lack of netiquette, technology ownership and bandwidth, time zones, volume of information, mastering content, and reading blog and discussion posts.

George Siemens (in his narrative introduction in McAuley et al., 2010) suggests that how participants engage or interact with content and each other is little-known territory. The nature of an open learning platform, which relies on connections between learners in order to autonomously aggregate, remix, repurpose, and feed forward, poses many questions about the fostering of active participation (Kop, Fournier, & Mak, 2011). Aggregation, remixing, repurposing, and feeding forward were explained by Stephen Downes in his introduction to the Change11 MOOC (see Downes, 2011). Aggregation refers to selecting the content of a MOOC that is most relevant to the individual participant, remixing refers to keeping a personal record of everything the individual has accessed within a MOOC, repurposing consists of the individual participant creating some content of their own, and feeding forward involves sharing that content with others.

There is a need for more evidence to guide the implementation of MOOCs and for more evaluations of the educational benefits (Bujack, Paul, & Sandulli, 2012). This case study aims to add to a growing body of knowledge in this domain. The intended audience is higher education lecturers, educational developers, and related support staff, in addition to the international community who have a pedagogical and research interest in MOOCs.

MOOC Principles and FSLT12 Design

A MOOC differs from a formal online course in that it is open and distributed within Internet social spaces. The term "MOOC" was coined in 2008 by Dave Cormier as a description of George Siemens and Stephen Downes' open online course at the University of Manitoba, Connectivism and Connective Knowledge 2008 (CCK08), which attracted over 2,200 participants ("Massive Open Online Course," 2012).

The underlying philosophy was a notion of connectedness, underpinned by creating and maintaining a personal network of connections and by interacting with and creating content. Importantly, individuals choose at what level they might participate. There are four underlying principles, namely autonomy, diversity, openness, and interactivity (Downes, 2009). The design of FSLT12 attempted to reflect these principles. The course design team consisted of a senior educational developer who leads a Postgraduate Certificate in Teaching and Learning in Higher Education, an experienced lecturer in nursing who was seconded to the team, a learning technologist, and an independent education consultant who was an experienced MOOC participant and facilitator. A common thread shared within the team was considerable prior experience as online distance learning tutors who were motivated to learn how a connectivist approach to course design might inform their ongoing online teaching practices.

The FSLT12 curriculum was based on a short, non-accredited, three-day, on-campus course offered by Oxford Brookes University as an introduction to teaching and learning in higher education, and was underpinned by the U.K. Professional Standards Framework for Teaching and Supporting Learning in Higher Education (HEA, 2011). FSLT12 ran for a period of six weeks, and each week featured a specific theme from the First Steps curriculum: supporting learning, reflective practice, teaching groups, feedback, lecturing, and evaluation.

FSLT12 was free of charge for all participants; 20 free assessed places were offered on a first-come, first-served basis. Assessment, which was non-accredited, consisted of a reflective statement, participation within an annotated collaborative bibliography activity, and preparation and showcasing of a 10-minute microteaching activity during the final synchronous sessions. Participants who completed the assessment were awarded a certificate of attendance. The delivery of FSLT12 was considered to be a pilot for a potential accredited postgraduate module, which will continue to run in the future as a MOOC.

Openness

A unique intention of running FSLT12 as a MOOC was to introduce the concept of open academic practice to the target audience by offering a traditional course within an open online platform. All of the course material and interaction was openly available on the public web (with the exception of a discussion forum set up for assessed participants). The learning platform consisted of an open WordPress site and an open Moodle virtual learning environment for prepared resources and discussion forums. Five synchronous audiographic sessions were offered within a Blackboard Collaborate environment, featuring an introduction to the course from the facilitators as well as three sessions led by guest speakers: Etienne and Beverly Wenger-Trayner, Frances Bell, and Dave White. Directions to the synchronous classroom were posted on the open WordPress site, which served as an entrance portal to the course. This meant that formal enrollment was not necessary to attend those sessions, to view the recordings afterwards, to access any of the learning materials, or to read the discussion forums. The only form of course participation that required registration was posting on the discussion forums.

Autonomy

Openness offered participants a wide variety of ways to engage with the course and freedom to set their own patterns of participation. Casting the course as a MOOC explicitly gave participants permission to engage in the ways and quantities they wished, with no requirement to be involved in all aspects of the course. Participants were also empowered to choose their own pathways through the available material, dipping in for particular topics or materials, skipping others, and, as the course materials all remain available, engaging with materials at the time and in the order of their choosing. This autonomy operated on several different levels and meant that it was impossible to trace all of the shades of course engagement and quantify how much each was used. Different amounts of participation were possible, from reading a single tweet up to extensive activity in all discussion forums and live sessions, and at all levels in between. The preferences and interests of each individual participant or observer mean that engagement with the course will look different for each person.

The team anticipated that many of the target audience (new lecturers) might be MOOC neophytes, so scaffolding strategies to enable interactions with early adopters were an important course design consideration. A range of relevant open educational resources (OER) was therefore developed or harvested prior to the start of FSLT12, and integrated into the WordPress site under the heading "Tutorials," with the specific purpose of presenting guides to acquiring participatory skills.

Diversity

FSLT12 was targeted at new lecturers, Ph.D. students who teach, and people moving into higher education from other sectors. The team anticipated diversity within this group. A significant proportion of the course participants did fit those categories. There were people who were completely new to higher education and to learning online, but who had chosen the course because it was more convenient than its face-to-face equivalent. Others were experienced MOOC participants, and experienced lecturers, interested in refreshing their practice, or sharing their expertise.

There were 206 official registrants for FSLT12, which may be considered small in MOOC terms; but given that FSLT12 had a specific course topic and was delivered during a year in which many other MOOCs were offered, this may not be surprising. Downes (2013a) suggests that a concept of massive may not necessarily be conceived as "engaging many numbers of people," but in "the design elements that make educating many people possible."

FSLT12 attracted higher numbers than the face-to-face equivalent (where numbers average 20 to 30) and participants were more geographically diverse, from 24 different countries including Australia, Canada, India, and South Africa, as well as many European countries and the United States. Downes (2013a) makes the point that when a course is massive in its design elements, it should not create bottlenecks. This means that it should not rely on the teacher to give feedback to the learner but enable many one-to-one interactions so that feedback may come from a range of sources.

An additional pedagogical approach to the MOOC has emerged, which is based on a different perspective that focuses on a single open platform, instruction, learning outcomes, self-directed learning, and course content. Udacity, which was founded by Sebastian Thrun in 2011, is an example of this. This phenomenon has conflated the understanding of MOOCs; the emphasis on "the massive" within these courses is on the scale of numbers, sometimes in excess of 150,000 participants. These are referred to as xMOOCs, whereas connectivist-influenced MOOCs are known as cMOOCs (Sloep, 2012).

Interactivity/Connectedness

Support for interaction was provided through the aggregation of blogs in the WordPress site, through centralized discussion forums in Moodle, and through a suggested Twitter hashtag (#fslt), which was used extensively. These support mechanisms allowed users to easily follow conversations about the course and to connect to others discussing similar issues, without the need to otherwise engage with the course. In addition, participants were encouraged to interact and connect in locations of their own choosing, resulting, for example, in a small group of participants meeting face-to-face.

Reflecting on a Review of the Literature

A review of the current literature has highlighted that there is limited emergent evidence on how participants interact with content and each other within a MOOC. Apparent themes are: multitasking, benefits and risks of openness, digital identity, and communities of practice.

Multitasking is a required skill for active participation. Levy's (2010) account of personal experience as a new MOOC learner in Personal Learning Environments Networks and Knowledge (PLENK 2010) suggested that organizing a personal learning network was very complex, and that time management skills were a key factor in overcoming this; in addition, a significant level of personal motivation was required in order to sustain participation. A lot of the learning happened in back channels, but multitasking was required in order to achieve this. MOOC organizers can facilitate some of this by providing daily newsletters or summaries of participant activity.

There are benefits and risks associated with openness; the fact that access is enabled for a diverse range of participants means that the less experienced can benefit from expertise beyond the realms of a normal course (Kikkas, Laanpere, & Põldoja, 2011). MOOCs have been described as a model of "digital practice" as they can develop the skills of individuals so that they can participate within a digital economy (McAuley et al., 2010). Weller (2011) suggests that access to participation in education may be widened by MOOCs, as there are fewer prerequisites than in formal education. Risks include the need for prerequisite critical literacies such as the ability to learn autonomously and maintain a level of presence within a MOOC (Kop et al., 2011). Participation in a MOOC represents engagement in norms of digital interactions, which may present a good learning opportunity but can be challenging for those new to these expectations (McAuley et al., 2010). The size of a MOOC cohort can be overwhelming and it can be challenging for the learner to make sense of many voices (Chamberlin & Parish, 2011). The learner is provided with more global connections than a traditional course, exposure to diverse views, and an abundance of resources and sharing of experiential knowledge, but is more likely to participate if there is potential for credit (Chamberlin & Parish, 2011).

Participation can be either enhanced or inhibited by learner digital identity – for example, the difference between having a confident voice within digital worlds compared to feeling that you have nothing worthwhile to contribute, and the role of participant as knowledge builder may feel alien to some learners (McAuley et al., 2010).

The work of Hrastinski (2008, 2009) suggests that there are varying perspectives on how participation within online communities of practice can be conceptualized. This includes levels of participation from accessing the online environment to taking part and joining in a dialogue, the perspective of learners, and quantitative measures such as numbers of postings, lengths of postings, and time spent reading postings. Hrastinski makes the point that online learning participation is a complex and evolving process for the learner, which involves taking part and building relations with others and involves a number of activities and feelings such as thinking, feeling, belonging, and communicating; counting numbers alone is an insufficient measure of participation. Lurking is seen as a legitimate level of participation and an activity that shows potential for more active participation. This is an important consideration for exploration of participation in a MOOC and, in particular, the identification of triggers or processes that transform lurking to active participation.

Theoretical Frameworks

Although cMOOCs are considered by some to be underpinned by connectivist principles (Cormier, 2010a, 2010b) it would appear that the development of MOOCs has been influenced by a combination of theoretical perspectives (Bujack et al., 2012; Kop, 2011; Levy, 2010) such as constructivism, connectivism, and the concept of communities of practice. Hrastinski (2008) suggests that research frameworks for high-level conceptions of online learner participation include a variety of options and identifies constructivism and communities of practice as appropriate frameworks. Sfard (1998) argues that within educational research there can be a case for consideration of more than one theory for learning because each perspective can offer exclusive benefits. The rationale for this is argued within this case study, which was informed by constructivism, connectivism, and communities of practice.

Constructivism

The concept of social constructivism has been influenced by Vygotsky's (1978) social learning theory, which placed prominence on collaborative learning. The learner constructs and interprets knowledge actively, and practice, prior knowledge, and experience are viewed as an important platform for ongoing learning. The design of the environment is constructed around authentic and experiential tasks such as those designed into the curriculum of FSLT12. Within a study exploring learner experience of participation in a MOOC, constructivism acknowledges the diversity of participants and aligns with the emphasis on creating and interpreting knowledge within a MOOC. In terms of interaction with content within a MOOC, constructivism embraces the purposes of learner activities such as aggregation, remixing, repurposing, and feeding forward, and includes social interaction with other participants in order to collaborate within learning activities. Constructivism, on the other hand, may not align with the role of a MOOC facilitator, who is less prominent than in a normal course, as the emphasis in a MOOC is on autonomous and peer learning. The potential number of participants within a MOOC may also limit a role for a central facilitator.

Connectivism

Connectivism as a learning theory (Downes, 2012; Siemens, 2005) is based on the premise that learning theories should reflect current social environments and, in particular, the influence of technology and the diverse informal settings in which modern learning takes place. Siemens (2005) critiques constructivism for its emphasis on knowledge as resting with the individual; the use of technology has altered this premise, as knowledge is now viewed as residing within systems. Connectivism puts an emphasis on lifelong learning and also integrates chaos and complexity as important concepts, to prepare learners for working in a more unpredictable world and to support the development of relevant skills in self-organization and decision making. The theory is that learners will create their own personal networks by making connections with people and ideas across a distributed digitally networked world. Learners will also be enabled to identify their own connections to support their ongoing learning. The pursuit of current and up-to-date knowledge is a desirable outcome. Bell (2011) has critiqued connectivism as an incomplete learning theory, in that it may be insufficient to explain what has worked well in practice. However, the theory may be helpful in exploring how participants interact with content and each other in a MOOC, especially if the topic relates to continuing professional development (CPD), as in FSLT12.

Communities of Practice

Wenger (1998) has presented key ideas about what it means to participate within a community of practice. He suggests that it is a complex, active process of participation and mutual recognition. Within any given community of practice this will lead to personal identity building. Participation within a community of practice leads to a perspective on the knowledge, which is considered to be valuable and effective for that community. Hrastinski (2008) has identified that community of practice theory conceptualizes the complexity of participation and the association of participation with learning.

Wenger, Trayner, and de Laat (2011) differentiate between a community and a network and do not necessarily view participation as collaboration because participation can include all types of relations – for example, conflict and competition. The theory also takes into account legitimate peripheral participation and the idea that new learners within a community of practice may be viewed as apprentices and take on more active roles as they are drawn into the community.

A community of practice is seen as a learning partnership among people who find it useful to learn from and with each other about a particular domain, whereas a network is seen as a group of relations, personal influences, and connections between individuals. This is a useful lens to consider how participants have interacted with each other in a MOOC where many networks may be evident: What is the relationship between community and network, and which ones are the most significant for which participants? It is also a useful lens for exploring learner participation within a MOOC and self-evaluation of participation, as well as to identify strategies that may encourage active participation. Wenger (1998) never intended communities of practice to be a standalone theory. In this case study, community of practice theory is used to complement the theories of constructivism and connectivism as an overall theoretical framework.

Methods

The methodology was a case study that employed a mixed methods approach. Bryman (2012) describes trends in social research and in particular the expansion of mixed methods research as a strategy to provide a more complete picture of a phenomenon. A MOOC has potential to offer researchers useful and comprehensive data and in particular digitally archived records of participant activity and interactions. A quantitative analysis of these was used to illustrate trends in participation, which were explored in more depth with a course evaluation questionnaire, followed up with focus groups, individual interviews, and a survey of participatory experience. The aim was to provide rigor within the case study by what Bryman describes as completeness in addressing the specific research questions, and to provide triangulation of the data, which helps to identify important connections to provide a detailed account of what counts as participation within a MOOC.

The Oxford Brookes University Research Ethics Committee granted ethical approval for the collection and analysis of primary research data from FSLT12. Participants were recruited via the main WordPress site by posting an invitation to participate in research, which included a link to a participant information sheet, consent form, and course evaluation questionnaire. Participant data was anonymized and de-identified, and permission was sought to use direct quotes from research data such as personal blogs, interviews, and survey research data.

Sample

The focus of attention within the case study was FSLT12 participants, so in theory anyone who registered or participated within any element of FSLT12 could be regarded as what Bryman (2012) describes as the unit of analysis for sampling. However, in a case study making use of mixed methods, there will be a number of units within the case.

A quantitative analysis of numbers participating in specific areas of distributed open learning spaces of a MOOC potentially includes everyone who visits that learning space. Unless the space is tagged with a course identifier it may be impossible to fully estimate participation quantitatively. In FSLT12 it was possible to identify everyone who registered for the Moodle site and those who actively participated in identifiable distributed course spaces (Table 1).

Table 1. Level of participant activity in FSLT12

Actively participated within identifiable FSLT12 distributed spaces (a)
Posted to Moodle discussion forums (minimum of one post)
Aggregated blogs at course WordPress site
Attended Blackboard Collaborate synchronous session 1
Attended Blackboard Collaborate synchronous session 2
Attended Blackboard Collaborate synchronous session 3
Attended Blackboard Collaborate synchronous session 4

(a) Moodle discussion forums, aggregated personal blog, or attended a synchronous session.

Data Collection and Analysis

Just under 20% (n = 41) of registrants completed the course evaluation questionnaire via the link from the WordPress site. Respondents were given the option of remaining anonymous, so it was impossible to identify the representativeness of the sample; however, it was clear that diverse participants experienced the course differently. This helped to identify a range of themes (see the Appendix to this paper) that were used to promote discussion during subsequent focus group interviews, which all registrants were invited to attend in Blackboard Collaborate. Two of these were carried out, each with an attendance of eight participants who represented a relatively diverse cross-section of members of FSLT12 in terms of experience of teaching in higher education, prior MOOC participation, international setting, and undertaking an assessed or non-assessed pathway. Active participants who fit the description of the specific target audience (new lecturers in higher education who were undertaking the assessment pathway) were identified as a purposive sample and invited to attend individual semi-structured interviews; four of these were carried out. In order to reach a wider sweep of FSLT12 registrants (in other words, to include the views of those who had not been visibly active), a survey questionnaire focused on individual participation was emailed to every course registrant. Response rates were relatively low at 13.6% (n = 28), but over half of the respondents said that they had either read or commented on other participant blogs. Twenty-four participants aggregated their blogs at the FSLT12 WordPress site; these were found to be a rich source of reflective commentary on course experience and were considered a source of relevant research data. This was a complex sampling strategy, which aimed to capture data taking into account the diverse ways in which individuals chose to participate within FSLT12 (Table 2).

Table 2. Data collection for the case study – sample sources and sizes

Course evaluation questionnaire, link posted on course site
Quantitative measures of individual active participation in distributed spaces and FSLT12 discussion forums
Focus group interviews x 2 in synchronous audiographic platform
Thematic analysis of aggregated blogs
Survey questionnaire on experience of participation e-mailed to all FSLT12 registrants

The quantitative data did not yield statistically significant results. However, in mixed methods research it is recommended that sources of data are integrated (Bryman, 2012). Data collected from the course evaluation questionnaire, focus group interviews, individual interviews, survey, and aggregated blog posts were thematically analyzed (see the Appendix) with reference to the research questions defined earlier.

The overall thematic analysis (see the Appendix) was returned to the focus group and interview research participants for member checking and verification of participatory behavior themes. During the process of FSLT12 many participants expressed an interest in research and evaluation of MOOCs, and the team felt it appropriate to be transparent about research procedures. Given the diversity of FSLT12 some participants were experienced researchers; this provided a good opportunity to review trustworthiness of the data analysis. The collected research data was also peer-reviewed by the authors who independently organized the many sources into themes and made comparisons in order to reach an overall consistent analysis.

The Learner Experience of Participation in FSLT12

Behaviors that respondents identified as participatory within FSLT12 were observing others, reading, discussion forum posting, blogging, and completion of assessment activities. Specific strategies to maintain individual participation were keeping the MOOC browser window open all the time and making a determined and deliberate attempt to participate as much as possible.

During the focus group interviews diverse identities emerged as participants spontaneously referred to themselves as "newbies" or "vets" in relation to MOOC participation. One participant described the roles as "fresh and green versus lots of MOOC experience ... such a range" (Focus group). These roles appeared to be mutually recognized.

In evaluating the learner experience of participation this dichotomy was used as a reference point to compare and contrast participatory experiences between novices and those who are experienced. Three main themes emerged: navigation, transformational shift, and scaffolding by those who were more experienced. These are discussed in turn below.

Navigation. Different routes and approaches were taken; novices felt initially overwhelmed by technical issues, multiple channels, and a need to be able to multitask, which required too many initial participatory skills incorporating "listening, reading and discussions all at the same time" (Course evaluation questionnaire).

The Moodle site was seen as "delivering too much content all at once" (Course evaluation questionnaire). The number of discussion forum posts in Moodle was highest during the opening week. An experienced participant in a blog post called this "MOOC syndrome": typical of enthusiastic starts within MOOCs, which quickly tail off. Some participants felt that FSLT12 was trying to do too much at once for a short course. The appearance of a "Welcome" discussion topic and a "Week 0" in advance of the advertised start date meant that people who had been following developments more closely were already active and engaged in discussion by the time the advertised start date arrived; some participants therefore had the sense of arriving in the midst of ongoing activity. Conversely, some experienced participants felt very able to navigate the multiple channels. For example, one stated that he/she "appreciated more structure than other MOOCs [he/she had] participated in" (Course evaluation questionnaire).

Experienced participants also appeared to be more judicious about planning their route and their level of participation, as exemplified by the following quote: "This is my sixth MOOC and I chose my level of participation" (Course evaluation questionnaire). The fact that novices encountered navigation difficulties in an online environment is not unexpected and corresponds with the findings of other authors (Fini, 2009; Kop, 2011).

Transformational shift. Although novices initially felt overwhelmed, some, especially those who were undertaking the assessment route, remained active throughout the MOOC. A transformative shift was evident on realization that it is not necessary to join everything: "Initial feeling of great pressure to participate...realized eventually that there was choice about this." (Interview 1)

Reaching this point required time and reflection. One participant commented during an interview: "Getting over that initial concern ... I realized that it was OK to observe, but that required effort, I spent a whole evening opening all of the different folders and channels, needed to immerse myself" (Interview 2). Another participant wrote: "A core moment in online participating is that when you begin to contribute after only following, observing or consuming" (Experienced participant blog).

The synchronous sessions were core moments for such transformative shifts:

"The first synchronous session was an important milestone in getting to the point where I realized I didn't have to respond to everything. When I was hearing voices and when there was time to have a conversation with others involved in similar activities." (Focus group)

Some novices viewed assessment as a lever to promote a transformational shift in identity. One of them reflected as follows: "My identity as a teacher has shifted contextually, my curiosity has been stimulated about digital literacies and developing them amongst learners and which ones are required for effective teaching" (Interview 1). This contrasted with experiences of some participants, who expressed a view that "not being assessed made it less meaningful to participate" (Survey).

Connectivist-influenced MOOCs are designed for uncertainty, and this can have an impact on participant learning experiences and sense of personal identity: "A MOOC allows me to play with uncertainty and depending on how my day is going this can be scary or liberating" (Discussion forum). Some participants recognized the effect on their identity and how this might change over time in unexpected ways: "It changed my learner identities from confidence and experienced to not having a clue and feeling like a novice" (Focus group).

Scaffolding by "experienced participants." Some registrants of FSLT12 who did not actively participate cited personal circumstances that impacted on time and commitment, but others said: "I didn't know how to participate" (Survey), and "I did try but I was intimidated by the depth of the posts and my newness to higher education" (Survey).

Through their expert lens, experienced participants recognized that it is challenging to engage MOOC newcomers and that an important skill is supporting novices to cope with multi-channel working. One participant observed that "those that are lost are missing out on an important new literacy" (Focus group).

Assessed participants, especially novices, felt supported by the FSLT12 infrastructure and by experienced participant expertise. Responses gathered during the focus group interviews suggested further potential for peer facilitation and support of novices. It was suggested that a core group of experienced volunteers could take responsibility for supporting individual groups within a cohort of novices, moderating back channels and building community at a micro level. Some of the novices reflected that they could have taken more opportunities to seek feedback and ask questions of experts. For instance, one participant said: "This was a good opportunity to participate with teachers who have been teaching for longer ... I should have realized that it was OK to ask them for help" (Interview 1).

There is a clear opportunity to establish informal networks for mentoring and potential for experienced participants to serve as support volunteers for future practice.

Learner Interaction with Content in FSLT12

Motivation to join FSLT12 was prompted by the subject matter; this was indicated on registrant signup records and applied to novices and experienced participants alike. The content of FSLT12 was identified as a clear framework for continuing professional development (CPD) and reflection.

Reflective practice. Content that focused on reflective practice, including learning activities and assessment tasks, appeared to prompt reflection. This was typical of all active participants, and particular evidence of it was found within participant blogs. For example, one participant wrote:

"My goal for my work in the FSLT MOOC is to understand and practice critical reflection more effectively and to learn how to create a space where my students can develop their own. I am grateful for the opportunity to join the FSLT community." (Experienced participant blog)

Another wrote: "The content has encouraged my participation at a more thoughtful level and I have modelled reflection by using a blog" (Novice participant blog).

The reflective content of FSLT12 was viewed as a facilitative tool to stimulate personal reflections on teaching and learning and to develop personal reflective skills:

"In the previous post I described briefly what some of the experience was like as a teacher of Media Studies. I am aware that I have not provide a 'critical analysis' from a 'number of perspectives' – such as my own, the learners involved, my colleagues and indeed, the literature." (Novice participant blog)

Learner Interaction with One Another

Survey responses suggested that active participants used the following channels to interact with each other: posting to discussion forums, responding to blog posts, attending synchronous sessions, and a face-to-face meeting entitled "MOOCup," which had been arranged via Twitter.

Two main themes emerged from the thematic analysis of all of the data: making sense of community for novices and experts, and reciprocal relationships. These are discussed below.

Making sense of community for novices and experts. Making sense of the community was seen as a skill required to "sort out who these experts and the audience are" (Interview 1). Becoming part of the community was viewed as an important confidence-builder and an opportunity to observe participatory behaviors. All interview participants were new to online teaching, and they found it useful to observe how forums, blogs, and social media could be used for digital interaction and to see their use in teaching and learning practice. In the words of one participant: "This opened my eyes as a teacher" (Interview 2).

The synchronous sessions and individual blogs were important participatory areas for FSLT12, but the main thrust of observed participant interaction in FSLT12 occurred in the discussion forums. This is shown in Table 2 and evidenced in the following quote: "In the MOOC forums there is quite a lively discussion this week (as there was last week), which got me thinking. One of the things that I've been pondering is the mode of participation" (Experienced participant blog). Of the 67 active participants, 52 posted on the forums.

Reciprocal relationships. The potential for reciprocity between diverse participants as a catalyst for active engagement gradually became apparent. Novice participants developed a sense of an established core FSLT12 community after a few weeks. One participant remarked: "Interacting with experts has demonstrated different styles for pedagogy for online learning and different approaches for using technology in teaching" (Interview 3).

Not everyone registered the openness of the online environment as a significant issue. For example, one interview participant had not been aware that the presentation of the microteaching assessment was open access. However, the potential for interactions in many spaces was regarded as an advantage for gaining rich feedback from many sources:

"The interactions between my peers and the facilitators were great, there was a lot of sharing of best practices and lived experiences, and I took a lot of things away; both things that I can immediately put into practice, and things that I need to read up on :-)." (Experienced participant blog)

Focus group interviewees identified the diversity of other participants as an important aspect of learning. The international context and the professional experiences of individuals provided a positive challenge for peer observation, an evident outcome of the microteaching assessment. The experienced were impressed with the novices and vice versa, which prompted rich and dialogic reciprocal feedback:
"There is just nothing else quite like social, networked learning to get your brain jazzed up about something. It's about risk, exploration, learning from others who are as wildly different and deeply the same as you are, to try and fail and try some more in a safe and welcoming environment." (Experienced participant blog)

Experienced participants' blog posts demonstrated reflection on a variety of personal MOOC experiences and the strategies they employ to locate themselves within the main mode of interaction in an individual MOOC: "It seems to me, that my own personal strategy is 'biggest bang for the buck' – so wherever there are more people, that's where I participate" (Experienced participant blog).

The digital skills of experienced participants enabled them to evaluate the impact of their own contribution and how individuals are interacting with their artifacts. For example: "Google Analytics gives me information about visits to my blog during the last month" (Experienced participant blog).

Discussion

Threshold concepts have been described as akin to the opening up of a portal of understanding of previously unknown knowledge (Meyer & Land, 2003). They are a useful lens to explore why active participation led to learning for some in FSLT12.

One of the important features of a threshold concept is transformation, which means that once knowledge is understood an aspect of a practice or discipline will be transformed, irreversibly, and the learner will have crossed conceptual boundaries and have a changed discourse or identity. This is preceded by troublesome knowledge (Perkins, 2006) for the learner, where knowledge can be counterintuitive, incoherent, or alien. Another feature of a threshold concept is liminality; Meyer and Land (2003) describe this as "a suspended state of partial understanding or stuck place" (p. 10).

The theory is that as learners transcend steps in their understanding or attempt to gain mastery in their discipline or practice they will oscillate between new and old understandings. Cousin (2006) provides the example of the adolescent who will swing back and forth between child- and adult-like behaviors. Once a learner enters a liminal space they are engaged within a context for mastery. The liminal space may provide a sense of self-achievement, but may also be troublesome as personal identity shifts in an attempt to reach new understanding as old ways of doing and thinking about things are discarded, representing an ontological shift.

Learning in a cMOOC takes place over distributed platforms and in an abundance of information that many learners find overwhelming. Learners therefore need to develop skills of finding relevant information and become adept at filtering, picking and choosing information relevant to personal learning. High levels of critical capability (Kop & Bouchard, 2011) are required in addition to appropriate technical skills. This represents troublesome knowledge in a cMOOC.

Liminality was observed in FSLT12 in novice attempts to make sense of MOOC participation and to observe and practice the skills and behaviors associated with active participation. Reflection and time appeared to be important factors in facilitating this. Personal learning became transformed at the moments in FSLT12 when learners recognized the skills required for active participation through interacting with content and with other learners. This could be viewed as a threshold concept and appeared significant when learners described identity shifts.

The liminal space is contrasted with the pre-liminal space, where the learner remains completely static within their understanding (Meyer & Land, 2003). This could be said to correspond with those registrants for FSLT12 who stated in the survey questionnaire that they did not know how to participate; this is a significant issue for facilitators and providers of MOOCs to address. In addition, threshold concept theory suggests that once a threshold of understanding is crossed, it is hard for the learner to truly recall the lack of understanding of a former pre-liminal or liminal phase. Although this case study demonstrated some reciprocity between novice and experienced participants, it might be challenging for the experienced to realize or remember how difficult MOOC participation may be for those new to learning within distributed open environments.

Downes (2013a) maintains that connectivist MOOCs can enable learners to genuinely participate in the culture of a discipline by fostering the ability to traverse networks, requiring learners to do something quite different from a traditional course. He gives the example of the difference between learning about physics and joining a community of physicists. This demonstrates a link between connectivism and communities of practice, and also with the work of Cousin (2006) on threshold concepts, where mastery of a discipline is exemplified by the difference between learning about economics and thinking like an economist. FSLT12 provided opportunities for novices to participate in the culture of teaching and learning in higher education by allowing participants the chance to move from learning about teaching to acting and thinking like a teacher.

Implications for Practice and Further Research

As a small group of online teachers, the team was interested in the outcomes of this case study in order to influence ongoing teaching practice. Anderson and Dron (2011) argue that high-quality online learning exploits a range of pedagogies. Social constructivism was evident in FSLT12, where the facilitators developed content to stimulate discussion and placed high emphasis on dialogue in forums. As Anderson and Dron point out, the social constructivist approach is not conducive to scaling to high numbers. They suggest that a good starting point for connectivism is to expose learners to networks and provide opportunities to gain self-determination in cognitive skills for learning, making connections of their own through opportunities for interaction. In this case study, liminality was seen as an issue potentially impacting upon this; however, experienced participants to some extent provided good role models and facilitated connections with novices. This suggests that there are benefits to opening up online courses to enable learning to go beyond the faculty or the institution in order to maximize diversity. More knowledge is needed, however, to develop models for active participation within MOOCs, in particular models that make the most of experienced and novice interactions. It is interesting to note that Anderson and Dron describe the scalability of connectivism as low to medium. It is arguable whether FSLT12 was numerically massive, given that only 206 people registered. Conole (2013) suggests that with the proliferation of MOOCs, components of specific learning design will determine the degree of scalability of a MOOC. Downes (2013b) suggests that "Dunbar's number" of 150 should be the minimum number of active participants to qualify as a "massive" course; by this measure FSLT12 was massive for Week 0 only, when 206 people registered on the Moodle site. More important is that FSLT12 was prepared for massiveness and was designed according to cMOOC principles.

Rodriguez's (2012) review of MOOCs estimates active participation at less than 10% in four cMOOCs of varying registrant numbers (556 to 2,700). By comparison, an active participation rate of 32% in FSLT12 (see Table 1) may be significant but would be hard to generalize beyond this case study. Watters (2012) points out the high dropout rates of MOOCs and the high numbers of readers/observers/lurkers, while Rodriguez asserts that a cMOOC offers the potential learner choice about how and when to participate. This makes it difficult to account for participants who do not visibly interact but may be actively lurking, and the usefulness of counting numbers is questionable. Emerging research methodologies in technology-enabled learning such as learning analytics (Long & Siemens, 2011; Siemens, 2012) present alternative approaches for analyzing multiple and diverse interactions within a MOOC.

The authors observe that MOOC offerings are evolving rapidly with the hybridization of many cMOOC and xMOOC philosophies (Roberts, Waite, Lovegrove, & Mackness, 2013). Hybridization may lead to further knowledge about MOOC participation. In conducting this case study the authors found clear evidence of multi-MOOC participation: some experienced participants were engaged concurrently in other MOOCs and openly compared their experiences, participatory behaviors, and exposure to differing pedagogical philosophies across MOOCs. The existence of a MOOC community who deliberately expose themselves to many MOOC experiences makes them an interesting group for longitudinal research, which could provide further insight into MOOC participation than a single case study can.

A hybrid MOOC called E-Learning and Digital Cultures (EDCMOOC) was recently offered by academics from another U.K. university, the University of Edinburgh. One of the facilitators noted (Sinclair, 2013, para. 3) that this MOOC was "stimulating a liminal state for many students," suggesting that the MOOC itself is a liminal space and a site where threshold concepts might be encountered. This resonates with the assertion made in this paper that participation within a MOOC is a threshold concept in and of itself.

Conclusion

This case study uncovered themes (see the Appendix) that were common to some active participants and that derived from more than one source of data. The most frequently occurring themes were transformation of identity, reciprocal relationships, and reflective practice. Novice participants expressed initial uncertainties, and experienced participants observed the difficulties the novices were experiencing. Reflective practice was important for active participants as an aspect of learning and a tool for self-expression.

The case study reported in this paper adds to the findings of earlier research on learner participation within MOOCs as discussed in the literature review, especially with regard to navigation across multiple channels. The course design of FSLT12 placed participants in an area of troublesome knowledge and therefore in a liminal space. The opportunity to develop reflective practice skills while in this space was an important factor in dealing with threshold concepts and led to transformative learning. Active participation is the vehicle through which these two concepts can be experienced. Navigation is an underpinning requirement, but of particular consequence in this case study was that the experienced participant presence highlighted both the difficulties the novices were experiencing and triggers for active participation. More could have been made of experienced expertise to support and orientate registrants new to the MOOC experience.

References

Anderson, T., & Dron, J. (2011). Three generations of distance education pedagogy. The International Review of Research in Open and Distance Learning, 12(3), 80-97. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/890/1663

Bell, F. (2011). Connectivism: Its place in theory-informed research and innovation in technology-enabled learning. The International Review of Research in Open and Distance Learning,12(3), 98-118. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/902/1664

Bujack, K. R., Paul, M. A., & Sandulli, F. D. (2012, July). The evolving university: Disruptive change and institutional innovation. Paper prepared for the panel on "Future of Universities in a Global Context" at the XXII World Congress of Political Science, Madrid, Spain. Retrieved from http://c21u.gatech.edu/sites/default/files/IPSA%202012%20Paper.pdf

Bryman, A. (2012). Social research methods (4th ed.). Oxford, UK: Oxford University Press.

Chamberlin, L., & Parish, T. (2011). MOOCs: Massive open online courses or massive and obtuse courses? eLearn Magazine, 2011(8). doi:10.1145/2016016.2016017

Conole, G. (2013, May 25). A new classification for MOOCs [Web log post]. Retrieved from http://www.e4innovation.com/?p=727

Cormier, D. [dave cormier]. (2010a, December 1). Success in a MOOC [Video file]. Retrieved from http://www.youtube.com/watch?v=r8avYQ5ZqM0

Cormier, D. [dave cormier]. (2010b, December 8). What is a MOOC? [Video file]. Retrieved from http://www.youtube.com/watch?v=eW3gMGqcZQc

Cousin, G. (2006). Threshold concepts, troublesome knowledge and emotional capital: An exploration into learning about others. In J. H. F. Meyer & R. Land (Eds.), Overcoming barriers to student understanding: Threshold concepts and troublesome knowledge (pp. 134-147). Abingdon, UK: Routledge.

Downes, S. (2009). Beyond management: The personal learning environment. Keynote presentation delivered at the World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009, Honolulu, HI. Available from EdITLib Digital Library. (32242)

Downes, S. (2011). How this course works. Retrieved from http://change.mooc.ca/how.htm

Downes, S. (2012). Connectivism and connective knowledge: Essays on meaning and learning networks. Moncton, Canada: Author. Retrieved from http://www.downes.ca/files/books/Connective_Knowledge-19May2012.pdf

Downes, S. (2013a, May 30). MOOC – The resurgence of community in online learning [Web log post]. Retrieved from http://halfanhour.blogspot.co.uk/2013/05/mooc-resurgence-of-community-in-online.html

Downes, S. (2013b, January 17). What makes a MOOC massive? [Web log post]. Retrieved from http://halfanhour.blogspot.co.uk/2013/01/what-makes-mooc-massive.html

Fini, A. (2009). The technological dimension of a massive open online course: The case of the CCK08 course tools. The International Review of Research in Open and Distance Learning, 10(5). Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/643/1402

The Higher Education Academy. (2011). The UK Professional Standards Framework for teaching and supporting learning in higher education. York, UK: Author. Retrieved from http://www.heacademy.ac.uk/assets/documents/ukpsf/ukpsf.pdf

Hrastinski, S. (2008). What is online learner participation? A literature review. Computers & Education, 51(4), 1755-1765. doi:10.1016/j.compedu.2008.05.005

Hrastinski, S. (2009). A theory of online learning as online participation. Computers & Education, 52(1), 78-82. doi:10.1016/j.compedu.2008.06.009

Kikkas, K., Laanpere, M., & Põldoja, H. (2011). Open courses: The next big thing in eLearning? In A. Rospigliosi (Ed.), Proceedings of the 10th European Conference on e-Learning (ECEL 2011) (pp. 370-376). Reading, UK: Academic Conferences.

Kop, R. (2011). The challenges to connectivist learning on open online networks: Learning experiences during a massive open online course. The International Review of Research in Open and Distance Learning, 12(3), 19-38. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/882/1689

Kop, R., & Bouchard, P. (2011). The role of adult educators in the age of social media. In M. Thomas (Ed.), Digital education: Opportunities for social collaboration (pp. 61-80). New York, NY: Palgrave Macmillan.

Kop, R., Fournier, H., & Mak, J. S. F. (2011). A pedagogy of abundance or a pedagogy to support human beings? Participant support on massive open online courses. The International Review of Research in Open and Distance Learning, 12(7), 74-93. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/1041/2025

Levy, D. (2010). Lessons learned from participating in a massive open online course. In Y. Eshet-Alkalai, A. Caspi, S. Eden, N. Geri, & Y. Yair (Eds.), Learning in the technological era: Proceedings of the Chais Conference on Instructional Technologies Research 2011 (pp. 31-36). Ra'anana, Israel: The Open University of Israel. Retrieved from http://www.openu.ac.il/research_center/chais2011/download/f-levyd-94_eng.pdf

Long, P., & Siemens, G. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, 46(5), 31-40. Retrieved from http://www.educause.edu/ero/article/penetrating-fog-analytics-learning-and-education

Massive open online course. (2012). In Wikipedia. Retrieved February 1, 2012, from http://en.wikipedia.org/wiki/MOOC

Meyer, J. H. F., & Land, R. (2003). Threshold concepts and troublesome knowledge: Linkages to ways of thinking and practising within the disciplines. Edinburgh, UK: Enhancing Teaching–Learning Environments in Undergraduate Courses Project, Universities of Edinburgh, Coventry, and Durham. Retrieved from http://www.etl.tla.ed.ac.uk/docs/ETLreport4.pdf

McAuley, A., Stewart, B., Siemens, G., & Cormier, D. (2010). The MOOC model for digital practice. Charlottetown, Canada: University of Prince Edward Island. Retrieved from http://www.elearnspace.org/Articles/MOOC_Final.pdf

Perkins, D. (2006). Constructivism and troublesome knowledge. In J. H. F. Meyer & R. Land (Eds.), Overcoming barriers to student understanding: Threshold concepts and troublesome knowledge (pp. 33-47). Abingdon, UK: Routledge.

Roberts, G., Waite, M., Lovegrove, E. J., & Mackness, J. (2013). x v c: Hybridity in through and about MOOCs. In Creating a virtuous circle: Proceedings of OER13. Milton Keynes, UK: The Open University, Support Centre for Open Resources in Education. Retrieved from https://www.medev.ac.uk/oer13/file/79/9/

Rodriguez, C. O. (2012). MOOCs and the Al-Stanford like courses: Two successful and distinct course formats for massive open online courses. European Journal of Open, Distance and E-Learning, 2012(2). Retrieved from http://www.eurodl.org/?p=archives&year=2012&halfyear=2&article=516

Siemens, G. (2005). Connectivism: A learning theory for the digital age. International Journal of Instructional Technology and Distance Learning, 2(1), 3-10. Retrieved from http://www.itdl.org/Journal/Jan_05/article01.htm

Siemens, G. (2012). Learning analytics: Envisioning a research discipline and a domain of practice. In S. Buckingham Shum, D. Gašević, & R. Ferguson (Eds.), Proceedings of the Second International Conference on Learning Analytics and Knowledge (pp. 4-8). New York, NY: Association for Computing Machinery. doi:10.1145/2330601.2330605

Sinclair, C. (2013). "Maybe it's working": #edmooc in Week 2 [Web log post]. Retrieved from http://edcmoocteam.wordpress.com/2013/02/08/maybe-its-working-edcmooc-in-week-2/

Sfard, A. (1998). On two metaphors for learning, and the dangers of choosing just one. Educational Researcher, 27(2), 4-13. doi:10.3102/0013189X027002004

Sloep, P. (2012). On two kinds of MOOCs [Web log post]. Retrieved from http://pbsloep.blogspot.co.uk/2012/06/on-two-kinds-of-moocs.html

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.

Watters, A. (2012, December 3). Top ed-tech trends of 2012: MOOCs [Web log post]. Retrieved from http://www.hackeducation.com/2012/12/03/top-ed-tech-trends-of-2012-moocs/

Weller, M. (2011). The digital scholar: How technology is transforming scholarly practice. London, UK: Bloomsbury Academic. doi:10.5040/978184966627

Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge, UK: Cambridge University Press.

Wenger, E., Trayner, B., & de Laat, M. (2011). Promoting and assessing value creation in communities and networks: A conceptual framework. Heerlen, The Netherlands: Open University of the Netherlands, Ruud de Moor Centrum. Retrieved from http://www.wenger-trayner.com/wp-content/uploads/2011/12/11-04-Wenger_Trayner_DeLaat_Value_creation.pdf

Appendix: Themes Emerging from the Analysis of the Various Data Sources

Course evaluation questionnaire
Overwhelming vs. well organized
Professional values/course identity
Sensemaking of MOOC learning experience
Reflection/reflective practice
Threshold concepts/transformative moments
Making sense of MOOC community

Survey focused on participation
Differential views on participation
Perceived lack of prerequisite skills


Wrapping a MOOC: Student Perceptions of an Experiment in Blended Learning

Derek O. Bruff

Director, Center for Teaching
Senior Lecturer, Department of Mathematics
Vanderbilt University
Nashville, TN 37235 USA
derek.bruff@vanderbilt.edu

Douglas H. Fisher
Associate Professor of Computer Science and of Computer Engineering
Department of Electrical Engineering and Computer Science
Vanderbilt University
Nashville, TN 37235 USA
douglas.h.fisher@vanderbilt.edu

Kathryn E. McEwen
Graduate Assistant, Center for Teaching
Doctoral Candidate – German, Department of Germanic and Slavic Languages
Vanderbilt University
Nashville, TN 37235 USA
kathryn.e.mcewen@vanderbilt.edu

Blaine E. Smith
Doctoral Candidate – Language, Literacy, and Culture
Department of Teaching and Learning
Vanderbilt University
Nashville, TN 37235 USA
blaine.smith@vanderbilt.edu

Abstract

Although massive open online courses (MOOCs) are seen to be, and are in fact designed to be, stand-alone online courses, their introduction to the higher education landscape has expanded the space of possibilities for blended course designs (those that combine online and face-to-face learning experiences). Instead of replacing courses at higher education institutions, could MOOCs enhance those courses? This paper reports one such exploration, in which a Stanford University Machine Learning MOOC was integrated into a graduate course in machine learning at Vanderbilt University during the Fall 2012 semester. The blended course design, which leveraged a MOOC course and platform for lecturing, grading, and discussion, enabled the Vanderbilt instructor to lead an overload course in a topic much desired by students. The study shows that while students regarded some elements of the course positively, they had concerns about the coupling of online and in-class components of this particular blended course design. Analysis of student and instructor reflections on the course suggests dimensions for characterizing blended course designs that incorporate MOOCs, either in whole or in part. Given the reported challenges in this case study of integrating a MOOC in its entirety in an on-campus course, the paper advocates for more complex forms of blended learning in which course materials are drawn from multiple MOOCs, as well as from other online sources.

Keywords: massive open online course (MOOC), blended learning, online learning, wrapper, flipped classroom, course cohesion, subject coupling, task coupling, local learning communities, global learning communities, course customization

Introduction

Technology continues to transform education in traditional and online settings (Baldwin, 1998), as the recent proliferation of massive open online courses (MOOCs) demonstrates (Guthrie, 2012; Mangan, 2012; Pappano, 2012). Although MOOCs are seen to be, and in fact are designed to be, standalone online courses (Hill, 2012), their introduction to the higher education landscape has expanded the space for possible blended or hybrid course designs (those that combine online and face-to-face learning experiences). Creating a blended course that incorporates another instructor's MOOC simplifies the blended course design problem in some respects, by fixing the online component of the blended course, while allowing the blended course instructor to shape the in-class components. However, fitting in-class modules into an existing MOOC in a way that optimizes student engagement, satisfaction, and ultimately learning, can be challenging.

This paper reports a case study of a blended graduate course in machine learning at Vanderbilt University in Fall 2012, which incorporated a Stanford University MOOC. It reports student perceptions of the blended course and identifies elements of the blended course design that the authors think are responsible for these perceptions. Drawing on these findings, the paper suggests a number of design considerations of potential interest to instructors wishing to build blended learning experiences around MOOCs or to integrate online and face-to-face components of blended courses more generally. Although the blended course in this study adopted the entirety of one particular MOOC, the paper suggests that other customizations may well be both possible and desirable, particularly those that select from and mix multiple MOOC sources.

Background

Blended Learning

Blended or hybrid approaches to teaching integrate face-to-face (offline) instruction with online materials, creating what can be a flexible and effective model for instruction (Aycock, Garnham, & Kaleta, 2002; Bowen, Chingos, Lack, & Nygren, 2012; Hill, 2012). By leveraging online modes of content delivery outside of class time, blended courses can free face-to-face sessions for instructor feedback, applications, and interaction (Aycock et al., 2002; Hill, 2012). Indeed, a 2010 meta-analysis prepared by the United States Department of Education reports that in recent experimental and quasi-experimental studies, blended instruction has been found to be more effective than either face-to-face or fully online instruction (Means, Toyama, Murphy, Bakia, & Jones, 2010). However, caveats are in order, as the meta-analysis notes "it was the combination of elements in the treatment conditions (which was likely to have included additional learning time and material as well as additional opportunities for collaboration) that produced the observed learning advantages" (Means et al., 2010, p. xviii).

As Aycock et al. (2002) outline, blended approaches demonstrate wide variations not only in the distribution of face-to-face and online time, but also in course design, which reflect and accommodate differences in teaching style and course content. Although there is no "standard" approach to blended courses, they often involve a rigorous, time-intensive redesign of traditional face-to-face courses to fully integrate face-to-face and online learning (Aycock et al., 2002; Stone & Perumean-Chaney, 2011). Students' work online must be made clearly relevant to their work in the classroom, just as the face-to-face sessions must draw on and apply the online materials (Babb, Stewart, & Johnson, 2010; Gilbert & Flores-Zambada, 2011; Toth, Amrein-Beardsley, & Foulger, 2010). Building on this research, this paper will argue, based on the authors' case study, that the degree and type of coupling between online and face-to-face components is an important dimension along which blended courses can be varied.

MOOCs and Blended Learning

Despite variations in format, the "traditional" blended course assumes a common designer of both face-to-face and online learning: namely, the on-campus instructor(s) (Aycock et al., 2002; Gilbert & Flores-Zambada, 2011; Rodriguez & Anicete, 2010). For example, in one version of what is often called a "flipped" or "inverted" classroom (Lage, Platt, & Treglia, 2000), students gain first exposure to course content through online video lectures created by their instructor, then explore that content more deeply during class through active learning exercises also designed by their instructor (Talbert, 2012). Although some versions of the flipped classroom involve materials created by others, such as the use of textbooks for the pre-class first exposure to content (Mazur, 2009), the blend of online and face-to-face learning activities in such courses is designed by the students' on-campus instructor.

MOOCs present a new option for blended course design. Instead of "flipping" one's course by producing online lecture videos or leveraging textbooks, instructors can "wrap" their courses around existing MOOCs (Caulfield, 2012a; Fisher, 2012; Koller, 2012; Mangan, 2012; Shirky, 2012). In this approach, students in an on-campus course are asked to participate in part or in whole in a MOOC hosted at another institution, with the local instructor supplementing that online learning experience with face-to-face classroom interactions. Since MOOCs are designed externally and intended to function as stand-alone courses (Hill, 2012), incorporation of a MOOC in a blended learning experience constrains the face-to-face instructor's course design decisions: the online component is relatively fixed, and only the in-class component can be varied. The online component is, however, only relatively fixed because the instructor of the wrapper can always choose to use only parts of the MOOC, a possibility that the authors return to later in discussing customization around more than one MOOC and other online content.

The challenges posed by "wrapping" a course around a MOOC are not unlike those posed by incorporating a textbook, authored by another, into a course. However, given the variety and interactivity of learning experiences available on most MOOCs – lecture videos, automatically graded quizzes, discussion forums – the use of externally hosted MOOCs in blended courses involves design questions not raised by the use of textbooks. These are the questions explored in the current case study.

Methods

Instructional Context
The setting for the present case study (Yin, 2003) was a graduate-level course on machine learning taught at Vanderbilt University, a research university, by co-author Fisher. The Machine Learning graduate course was typically offered only every other year by the computer science program. Due to demand for the course from another graduate academic program of the University, a special section of the course was run in Fall 2012, an "off" year, as an overload course for Fisher. As a result, many of the 10 students in the course were from outside computer science, though all were graduate students with some computing sophistication, and all were new to machine learning. In order to maintain a sustainable workload across all his courses, Fisher decided to draw on some of the educational resources provided by MOOCs for this course, building on his experience incorporating open educational resources (Wiley & Gurrell, 2009), such as online lecture videos created by other faculty, into previous courses (Fisher, 2012). In fact, in an earlier Spring 2012 Machine Learning course, Fisher had used online lectures by Stanford professor Andrew Ng, director of the Stanford Artificial Intelligence Lab and co-founder of Coursera.

Students in the Fall 2012 course, however, were asked to go well beyond simply watching Ng's online lectures; they actually enrolled in and were required to complete Ng's Machine Learning MOOC on the Coursera platform. This involved watching lecture videos, completing quizzes and programming assignments, and, optionally, participating in discussion forums. Students were asked to take screenshots of their submitted quizzes and programming assignments and send those to Fisher, allowing that work to contribute to the students' grades in the Vanderbilt course.

The start of the 10-week Stanford MOOC happened to coincide with the beginning of the Vanderbilt semester, one of the reasons Fisher chose to use it as part of his course. However, there were topics in machine learning not addressed by the MOOC that were of potential use to students in their research at Vanderbilt. Thus, students were also assigned additional readings, which were discussed in weekly face-to-face class sessions led by Fisher. Where the MOOC provided an introduction to some classic and widely used methods of machine learning, the readings were journal papers, consisting of both recent and seminal research in the field, chosen to build on the topics covered in the MOOC and to introduce other important areas of machine learning. During the final four weeks of the semester, after the MOOC ended, students worked individually on projects of their own design, receiving guidance and feedback from Fisher and each other during the remaining in-class meetings. Student projects of this sort had been components of previous offerings of Fisher's Vanderbilt Machine Learning course; it was the first 10 weeks of the 14-week semester that differed in the Fall 2012 offering.

Figure 1 shows the rough layout of machine learning topics throughout the Fall 2012 course. The left column gives the topics covered by journal readings, which were the focus of weekly in-class discussions; the right column lists the online video topics from Andrew Ng's MOOC. The one-week offset in the readings (left column) was intended to allow Ng's MOOC videos to first introduce concepts before they were expanded upon in the readings. Arrows between the columns indicate some of the dominant conceptual correspondences between the readings and MOOC videos, though synergies along these lines could only be realized at relatively high levels of abstraction and not at a "nuts-and-bolts" level. Fisher drew on his experience with these same videos in the Spring 2012 Machine Learning course in arriving at this sequencing for the Fall 2012 course.


Figure 1. Topics covered in the wrapper course by readings (left) and MOOC (right)

In many cases no correspondences are shown, sometimes because the connections are pervasive rather than absent. For example, though no correspondences are shown for Week 6 of the MOOC lectures on experimental evaluation, this material had relevance across nearly all in-class discussions and online topics; in fact, the readings and their discussions presaged and reflected on this material throughout the course. Similarly, Week 7 of the in-class discussions on projects also referenced material throughout both online and in-class topics. In contrast, some in-class topics, such as relational learning, inductive logic programming, and knowledge-biased learning, simply had no strong linkage – certainly not at the nuts-and-bolts level – with MOOC topics, though in some cases the contrasts suggested by paradigmatic differences were a topic of discussion.

Although the topic references listed in Figure 1 are course specific, the point the figure conveys, that the readings for in-class discussion were selected with some care, has broad application. The readings also reflect Fisher's priority to include certain topics that, although they had little or no substantive connection to the online topics, he thought important, particularly in the Vanderbilt context. And in all cases, even the linkages that did exist were treated at a relatively high level of abstraction.

Fisher describes this course structure as a "wrapper" approach, a term adopted from the machine learning research literature, referring to an algorithm that is wrapped around another in order to extract the most salient features from the environment, and therefore to improve overall learning. With this approach in mind, he "wrapped" his on-campus course around the Machine Learning MOOC offered on the Coursera platform. The in-class lectures and low-stakes homework assignments Fisher provided in previous offerings of this course were replaced by the MOOC's online lecture videos, automatically graded quizzes, and programming assignments. Doing so enabled Fisher to use class time differently, focusing it more on interactive discussions and more challenging material. This structure is a version of the flipped classroom referenced above. In Fisher's case, the MOOC played the role of the earlier video lectures or textbook, providing students with a structured introduction to some of the course content.
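The machine-learning sense of "wrapper" that the term borrows can be illustrated with a minimal sketch of wrapper feature selection: a greedy search is wrapped around an inner learner, and candidate feature subsets are scored by that learner's own performance. The toy data and the simple nearest-centroid learner below are illustrative assumptions, not anything used in the course.

```python
# Sketch of "wrapper" feature selection: a greedy forward search is
# wrapped around an inner learner, and candidate feature subsets are
# scored by the learner's own leave-one-out accuracy.

def centroid_accuracy(X, y, features):
    """Leave-one-out accuracy of a nearest-centroid classifier
    restricted to the given feature indices."""
    if not features:
        return 0.0
    correct = 0
    for i in range(len(X)):
        # Centroid of each class, computed without example i.
        centroids = {}
        for label in set(y):
            rows = [X[j] for j in range(len(X)) if j != i and y[j] == label]
            centroids[label] = [sum(r[f] for r in rows) / len(rows)
                                for f in features]
        xi = [X[i][f] for f in features]
        pred = min(centroids,
                   key=lambda c: sum((a - b) ** 2
                                     for a, b in zip(xi, centroids[c])))
        correct += pred == y[i]
    return correct / len(X)

def wrapper_select(X, y, n_features):
    """Greedy forward selection 'wrapped' around the inner learner."""
    selected, remaining = [], set(range(n_features))
    while remaining:
        best = max(remaining,
                   key=lambda f: centroid_accuracy(X, y, selected + [f]))
        if centroid_accuracy(X, y, selected + [best]) <= \
           centroid_accuracy(X, y, selected):
            break  # no candidate feature improves the inner learner
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy data: feature 0 separates the classes; feature 1 is noise.
X = [[0.0, 5.0], [0.2, 1.0], [0.1, 9.0],
     [1.0, 2.0], [1.2, 8.0], [0.9, 4.0]]
y = [0, 0, 0, 1, 1, 1]
print(wrapper_select(X, y, n_features=2))  # prints [0]
```

On this toy data the search keeps the discriminative feature (index 0) and stops there, since adding the noisy feature cannot improve the inner learner's leave-one-out accuracy; in the same spirit, Fisher's in-class "wrapper" selected the most salient material to surround the fixed MOOC core.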

Data Collection and Analysis

In order to explore student experiences learning in this wrapped course, a focus group was conducted with the students during one of the weekly class sessions just after the MOOC ended. The focus group, with all 10 students participating, was conducted during the first half hour of class that day. Students were informed that the focus group was part of a research project exploring hybrid teaching models and that their instructor was interested in their feedback on the course. The focus group was audio recorded and transcribed for later analysis. An informal and de-identified summary of the student remarks was shared with Fisher shortly after the focus group, before the end of the semester.

Later in the semester, students were asked to complete the standard end-of-course evaluation forms used widely at the University. Students' responses to two holistic, Likert-scale questions on these forms are discussed below. Additionally, a few weeks after the course had concluded (after winter break), students were asked by Fisher to complete a post-course survey, which was designed by the researchers to further explore some of the themes that emerged from the focus group. The survey consisted of 14 Likert-scale questions and three open-ended questions, and was taken by the students anonymously. Only five of the 10 students in the course completed the survey, yielding a 50% response rate.

Qualitative data analysis for this study involved the constant comparative method (Strauss & Corbin, 1998) and the development of case studies (Yin, 2003). During the initial phase, the transcripts of the focus group and students' responses to the open-ended survey questions underwent line-by-line coding in order to establish categories and subcategories related to students' experiences and views of the Machine Learning class and the wrapper approach. These overarching themes were triangulated (Strauss & Corbin, 1998) with Fisher's perspective as the course instructor. During this iterative process, the researchers met regularly to discuss the emergent categories, refine themes, and connect ideas.

Findings

Overall, students responded enthusiastically to the wrapper approach in the Machine Learning course. They described Ng's lecture videos as effectively designed, clearly presented, and informative, and they described the MOOC as generally useful for self-paced learning. The students did not engage actively in the online community of peer learners created through the MOOC, preferring to interact with the local learning community provided by the on-campus component of the course. Although their overall response to the wrapper approach was positive, students pointed to challenges in integrating the online and face-to-face components of the course. Student perspectives on these issues are described in the following section.

Value of Self-Paced Learning

According to students, the major advantage of the MOOC over a traditional lecture-based course was its greater flexibility, customization, and accessibility, which students saw as encouraging structured self-paced learning.

Students valued the flexibility offered through the MOOC, which allowed them to watch the weekly video lectures at their own pace and on their own schedule. As one student described:

"I really, really like the absorbing information on your own time at your own speed, and through this sort of video format with someone that you know is a really good lecturer, has really carefully prepared these topics, and I think that's much more efficient [than traditional lectures]." (Focus group transcript, November 14, 2012)

Along with finding "being able to [watch videos] on your own schedule" to be "very valuable," students also reported that the videos' shorter length, typically between five and 15 minutes, helped them to keep their attention focused and to better digest the lecture content (Focus group transcript, November 14, 2012).

Various features of the online platform also allowed students to customize the way in which they viewed lecture videos, which they found to be more efficient and conducive to learning. For example, one student explained how the variable viewing speed, captions, and embedded quizzes helped to "make [Coursera] a wonderful learning experience":

"I love the way Coursera is set up, and that you can kind of set your own schedules, watch it when you want. In addition to being able to watch at 2X [double speed], you also have captions throughout, so 2X plus captions makes it really easy to understand what's going on. And they also have questions based throughout the videos, and quizzes and homework assignments through there also, to totally keep you fully integrated with what's going on, and makes it a wonderful learning experience." (Focus group transcript, November 14, 2012)

Another student described the MOOC lecture videos as "basically the best thing ever," having watched the online lectures at "twice the speed," which helped the student "stay focused" and "feel like [he/she] got a lot more out of the material" (Focus group transcript, November 14, 2012). In addition to being able to speed up video playback and customize features, students also found the almost immediate feedback on quizzes and programming assignments to be helpful. One student explained that this "instant feedback" allowed him/her to gauge his/her understanding and "make changes" accordingly.

Although students believed the flexibility of the MOOC's self-paced environment to be effective, they also found it a challenge to stay on schedule. One student explained, "You have to be very disciplined to make sure you're keeping up on the material. If not, you'll find that you're trying to play catch-up a lot of times" (Focus group transcript, November 14, 2012). Another student, however, found that the self-paced environment enabled him/her to work ahead. Despite these differences, students described the face-to-face sessions with Fisher as helping to keep them on track with the online material.

Local vs. Global Learning Communities

Although students participated regularly in the Machine Learning MOOC to complete and submit assignments – for example, the programming assignments and quizzes, also submitted to Professor Fisher – they did not actively participate in either the Coursera discussion forums or the study groups formed online. Students cited time constraints as the main reason for not participating more actively in the online discussion forums. Instead, they used the discussion boards to check for course errata or to quickly troubleshoot questions or problems, but tended to ask questions among their local peers.

Additionally, students found the discussion boards helpful for solving problems they encountered, including sharing strategies and solutions pertaining to those problems. Although no students described posting a question on the discussion boards, students did describe the forums as useful for learning about "other people who were having the same problem" and applying their solutions to the problem. One student reported, "I knew that if I was stuck on something, thousands of other students were trying to do the same thing. In all cases I could find my specific questions in the online forums" (Survey response, January 19, 2013). Another student affirmed, "Whenever I had trouble on an assignment, I could almost always just go to the forum and look at the answers provided by people who had already run into similar stumbling blocks" (Survey response, January 17, 2013).

Instead of utilizing the online discussion boards, students preferred to ask questions about and discuss course content during the face-to-face class sessions. As one student explained in the focus group:

"I think when I had a question, I tended to ask the other people in here, before I would probably ask it on the discussion board. I mean, me and [another student] would talk before class about some of the material review." (Focus group transcript, November 14, 2012)

Students liked the structure of the wrapper format because it opened up space for productive class discussions related to the content. They also described in-class discussions as valuable for generating new ideas and new research projects. As one student described in the focus group:

"One of the things I liked is that, since you did the lecture material at home whenever you had time for it, it saved the class time for discussion. And so we didn't always discuss stuff that was exactly following along with the course, but whenever we did, I found that a lot more helpful." (Focus group transcript, November 14, 2012)

Another student echoed a similar belief: "So, I really liked doing that [online content] sort of outside [of class], and then coming in and sort of like taking all of the knowledge that supposedly you sort of download into your brain and apply." A third student expressed a view of class time as "it's more like you ask questions, you learn." (Focus group transcript, November 14, 2012)

Interestingly, three of the five responses to the online survey question asking how to improve the course suggested even more discussions of the MOOC material during the face-to-face class meetings. One student explained that discussions were valuable because they facilitated "instant feedback from [the] instructor and classmates," (Survey response, January 28, 2013) and another explained,

"I would recommend that you discuss more of the Coursera material in the class. I don't think you should give a repeat lecture of the material, but rather spend some time talking about the methods presented and the main ideas of the methods. This was done in our class to a degree, but I would like even more discussion from Coursera." (Survey response, January 19, 2013)

In the focus group, students also suggested more in-class discussion of the material presented in the MOOC. These suggestions included "short discussion for the first 15 minutes of class to ask questions or consolidate ideas before moving on" and more "discussion of applications" of the online content (Focus group transcript, November 14, 2012).

Misalignment between Face-to-Face and Online Components

According to students, one challenge in this offering of the on-campus Machine Learning course was that the topics covered in class did not always line up with the material covered in the video lectures on a week-to-week basis. Students said that they would have preferred a greater degree of alignment between the online and on-campus offerings, so that the in-class material would more directly address, and expand upon, the topics covered online. As one student explained in the focus group:

"I felt like the topics we covered in class – because we'd read some like outside papers – they didn't line up very well with a lot of the online material. I mean, not that it wasn't valuable stuff, but it seemed kind of disjointed to me." (Focus group transcript, November 14, 2012)

The misalignment was particularly problematic for students in terms of the research papers discussed in class. One student commented that the information in the papers was presented in a "less structured format" than the information in the MOOC materials, making the papers seem "less accessible." However, as another student pointed out, the research papers required a "different kind of learning" than the highly structured video lectures. And as that student described, although the papers raised more questions than the online lecture material, the face-to-face sessions provided a space for discussion.

Students emphasized that they were new to machine learning and reported feeling ill-prepared and lacking the context to adequately understand the papers. As beginners, they would have preferred to read papers more directly connected to the video lectures and material covered online. They suggested supplementing the reading with review articles assigned before each paper, or with an outline or key points to guide the assigned reading. Even though there was a consensus among students that the papers were challenging, they described the reading in terms of the application of knowledge, an exercise in "Can you get something from it?" – that is, the real-life negotiation of meaning. And, in fact, one student reported learning to read machine-learning papers over the course of the seminar, despite the challenges:

"One thing I was just going to say about the papers is for me, they were kind of a bitter pill to swallow. I went through and I read these papers, but looking back, I'm glad I did it because I feel like I can go to a machine learning paper, and I can read it, and I won't be as intimidated by it because I've kind of struggled through it all semester reading these things. So, I feel like I'm in a better place now than I was before I started this course." (Focus group transcript, November 14, 2012)

Student Perceptions of the Instructors

Valuing both Ng's and Fisher's contributions, students viewed each as having different roles in the Machine Learning course. Overall, students perceived Ng, a "world-renowned researcher and teacher," as the lead lecturer of the course and explained that they found his teaching style to be effective (Focus group transcript, November 14, 2012). Students explained that "he did a really good job with the course" (Focus group transcript, November 14, 2012). Specifically, one student pointed out: "Andrew Ng does a great job of teaching the skills necessary and highlighting potential problems" (Survey response, January 17, 2013).

In contrast, students perceived Fisher's role in the face-to-face sessions as that of a "facilitator." They described him as following up on their work in the MOOC, explaining concepts and providing background for the papers, and leading class discussions. In the focus group, one student explained:

"I thought he [Fisher] was a facilitator, and he would try to facilitate discussions. He would introduce papers for us to read, and then just kind of follow up and make sure we're doing the Coursera stuff by having us submit everything to him each week. That's the word that comes to my mind." (Focus group transcript, November 14, 2012)

Another student built on this comment about Fisher's role as facilitator:

"He did a really good job in facilitating the discussion of the research papers, I thought. And he made sure that everybody talked, even when we didn't want to. He would tease something out of us to get us to talk about the paper and what we thought or what we didn't understand." (Focus group transcript, November 14, 2012)

As noted above, students in the course were asked to complete the University's standard end-of-semester course evaluation. Six of 10 students responded to the two holistic questions: "Give an overall rating of the instructor" and "Give an overall rating of the course." Each question had an average response of 4.17 (on a 5-point scale – 3 being average, 4 being very good, and 5 being excellent), with a standard deviation of 0.68. These ratings were comparable to Fisher's Spring 2012 Machine Learning course, in which students viewed Ng's lectures, but the rest of the MOOC was not used. Before 2012 (Spring and Fall), the last offering of the Machine Learning course by Fisher (or any instructor) was in the Spring of 2006. That offering, occurring well before MOOCs were available, was taught using a more traditional face-to-face approach. The average end-of-semester ratings of instructor and course in 2006 were 3.83 (standard deviation: 0.89) and 3.66 (standard deviation: 1.11), respectively, with six of six students responding. These data, although based on small sample sizes, indicate that the hybrid course of 2012 was somewhat better received than the more traditionally taught course of 2006. While suggestive only, the increase in means and the decrease in standard deviations (from 2006 to 2012) are measures that warrant continued tracking in future wrapper courses.

Discussion and Conclusion

While these numbers are too small to support strong conclusions on the efficacy of the authors' wrapper design, the experience is suggestive and can guide research going forward. In this section, after reviewing key observations from the case study, the beginnings of a categorization scheme are introduced for the kinds of couplings that can arise between the online and in-class components of a blended course. The present study is then framed with this nascent categorization, and an argument is put forward that customizing a wrapper around parts of multiple MOOCs – and other online resources – can leverage the advantages of MOOC platforms while softening the design constraints that stem from adopting a MOOC en masse.

MOOCs as Learning Resources

It is clear that the students in this Machine Learning course found the online lecture videos provided by the MOOC to be useful, in both content and form. Less experienced students (say, first-year undergraduates) might not find online lecture videos, with their lack of instructor-student interaction, as useful; in this teaching context, however, the online lectures were a valuable resource for the students.

Interestingly, "outsourcing" the lecture component of the course to the MOOC instructor did not diminish the students' view of the on-campus instructor as an effective teacher. They noted that Fisher's role was changed from a lecturer to a facilitator, but the students had an overall positive view towards Fisher and Ng. This is perhaps not surprising given the greater (graduate) experience level of the students in this course. It is also possible that Fisher's inclusion of research papers he selected helped students see him as an expert, even if he was not fulfilling that role in the traditional way of lecturing. Less experienced students (again, consider first-year undergraduates) might not place as much value on the facilitator role taken by the on-campus instructor.

Moreover, it is possible that having two instructors, with different points of view on the course content, helped the students better understand debates within the discipline. Again, expert differences could be challenging for underclassmen and/or students in fields where a single "right" answer is not typically mandated, such as literature or history; however, they could also provide a useful tool for helping students move from what Kuhn (1992, pp. 167-168) describes as "absolutist" or "relativist" modes of thought to more "evaluative" modes as they grapple with the observation that experts in a field can and do disagree. Though there were no overt disagreements between the instructors on the material covered, Fisher's inclusion of material unrelated to the MOOC coverage probably reflected his different prioritization of material; differing prioritizations among machine learning researchers and practitioners were themselves a topic of some in-class discussion.

Also valuable, at least for some students, were the discussion forums provided within the MOOC. Their use of the online learning community, albeit limited to selective reading and no posting, points to the value of the "M" and the "C" in the acronym MOOC: with thousands of other students ("massive") working through the same material at the same time ("course"), it was highly likely that any difficulty encountered by one of Fisher's students had already been raised and addressed on the forums.

Although the present case study's results indicate that MOOCs can serve as useful learning resources as part of a blended course, student comments in the study also point to the design challenges involved in wrapping a face-to-face course around a MOOC. The misalignment that students perceived between the online and face-to-face components of the wrapped course speaks to the recommendation that students' work online must be made clearly relevant to their work in the classroom, and vice versa (Babb et al., 2010; Gilbert & Flores-Zambada, 2011; Toth et al., 2010). This design challenge seems particularly difficult when building a blended course around a MOOC, given that the online component is relatively fixed, potentially inviting a schism between the online and face-to-face components of the course. This design challenge is explored in the following subsections.

Coupling between Online and In-Class Components

The authors' study suggests that hybrid courses are characterized and distinguished by the coupling that occurs between the online and face-to-face components, as well as the cohesion of the hybrid course in total. Coupling refers to the kinds and extent of dependency between online and in-class components of a hybrid course, whereas cohesion refers to the relatedness of the course content overall.

There was a relatively low degree of coupling (or loose coupling) in Fisher's course, by the instructor's design, and to the apparent dissatisfaction of some students. The factors behind the low-coupling design were: (1) that there were material and skills the on-site instructor wanted to cover that were not covered by the MOOC, with limited synergies possible; (2) that the on-site instructor, a machine learning expert himself, felt that the MOOC modules were excellent and self-contained; (3) that those modules were certainly within the grasp of graduate students taking the course (indeed, class assessments confirmed this); and (4) that the instructor needed to maintain a sustainable workload (this was an overload class), and greater coupling generally requires greater time and effort (Aycock et al., 2002). Under (1), the skills at issue are those of reading and understanding journal papers published in the literature, skills in which graduate students must become practiced.

The coupling that did exist in the course involved limited in-class discussion of MOOC material as it related to some of the readings. These interactions, in which part of a class session was used to synthesize across readings and MOOC lectures, were perhaps closest to a traditional flipped classroom. The authors call this subject coupling, because subject matter is shared across the online and face-to-face components of a course. While most periodic (e.g., weekly) assessments were done through the MOOC, Fisher did give a weekly quiz on the readings, and in some cases these quizzes would draw upon both reading and video material (e.g., students received an in-class quiz that asked them to combine concepts from the reading on regression trees with the MOOC material on multivariate regression).
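To make the flavor of such a combined exercise concrete, the following hypothetical sketch fits both a regression tree and a multivariate linear regression to the same synthetic data. The dataset, model settings, and comparison are illustrative assumptions, not material drawn from the actual course or its quizzes.

```python
# Hypothetical illustration (not from the course): comparing a regression
# tree with a multivariate linear regression on the same data, the two
# topics the in-class quiz asked students to combine.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 2))                         # two features
y = X[:, 0] ** 2 + 0.5 * X[:, 1] + rng.normal(0, 0.1, 200)    # nonlinear target

tree = DecisionTreeRegressor(max_depth=4).fit(X, y)
linear = LinearRegression().fit(X, y)

# A piecewise-constant tree can approximate the x0**2 curvature that a
# single global linear model cannot, so its training error is lower here.
mse_tree = mean_squared_error(y, tree.predict(X))
mse_linear = mean_squared_error(y, linear.predict(X))
print(f"tree MSE: {mse_tree:.3f}, linear MSE: {mse_linear:.3f}")
```

A quiz question in this spirit might ask students to explain why the two models' errors differ, connecting the reading on trees with the MOOC's treatment of regression.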

The instructor also intended that final projects by students would draw on methods presented through the MOOC and the readings, and that the final projects would also allow students to practice research methodologies, which they were learning about through both readings and the MOOC. This is an example of what the authors call task coupling, since online and face-to-face components contribute to the completion of a task, typically by learning and applying complementary subject content and/or skills. (These categories, subject and task, are not intended to be mutually exclusive; indeed, much of what would be characterized as active and experiential learning would involve both types.)

Implications of Coupling for Student Satisfaction

Student feedback suggests that students would have liked stronger subject coupling between the MOOC and face-to-face components, in which the MOOC material is reviewed in class. It remains an open question whether students would have been equally satisfied with task coupling, if only it had been distributed throughout the semester (e.g., regular in-class activities in which students applied what they learned in the MOOC) rather than reserved for the project at the end of the semester.

The previous offering of the Machine Learning course, in the Spring 2012 semester, offers a contrast here. As noted earlier, during that semester Fisher incorporated most of the online lecture videos from the Stanford MOOC into the course, but did not require students to participate in the MOOC itself. Indeed, the MOOC was not offered at that time; only the archived lecture videos were available. (This points to another design constraint in wrapping a course around an external MOOC: the times at which the MOOC is offered might not align well with the local academic calendar.) The Spring 2012 offering of the course had higher subject and task coupling, since the online videos were addressed more directly during class (including through quizzes) and the student projects were started earlier in the semester. Even though students' end-of-semester ratings for the Spring and Fall 2012 course offerings were comparable, none of the Spring students mentioned any kind of misalignment between the online and face-to-face components of the course. This experience offers some evidence for what seems a natural conclusion: that higher coupling results in fewer student concerns about low coupling.

Despite some student discomfort with it, the low-coupling approach resulted in a very satisfactory class as measured by end-of-semester evaluations. Nonetheless, creative approaches to course redesigns that more tightly couple the online and in-class components are certainly of interest. Below, plans are described for a blended approach that draws on multiple online resources, including MOOCs, which may have positive implications for coupling.

Cohesion and Coupling

The low coupling design of the Fall 2012 Machine Learning wrapper contrasts with the holistically designed, highly coupled, blended pre-MOOC courses surveyed earlier. As noted, the instructor of a wrapper can craft in-class activities that more significantly connect with the existing MOOC, through subject, task, and other forms of coupling; however, learning and teaching goals and scope necessarily guide and constrain course design.

In addition to coupling between online and face-to-face components, the content cohesion of a blended course, or of any course for that matter, must also be addressed. Indeed, the degree of course cohesion will influence the degree and types of coupling that are most natural within a course. Fisher's Machine Learning course was a survey course, designed to cover a wide range of machine learning methods, many of which are quite disparate. As such, the course cohesion was low, as compared to, say, Ng's stand-alone MOOC, in which topic choice and scaffolding created a strong sense of synergy. Thus, in blended environments there can be not only low coupling between the online and face-to-face components of a course, as in Fisher's wrapper, but also low coupling between the face-to-face modules, or between the online modules. Indeed, to the extent that there exists, in wrapper courses, a difference in the degree of scaffolding provided for materials in the online and the on-campus components, it is possible that additional scaffolding of materials in the on-campus classroom could increase students' perception of cohesion, even without incorporating additional forms of coupling.

Furthermore, the Fall 2012 wrapper combined a graduate seminar course (which included journal readings) delivered through the in-class component with a more structured, lecture-based course delivered through the online component. Including both components in one course is important in the authors' setting, but the very different modalities for learning may have been more responsible for perceptions of schism than the hybrid online and face-to-face structure per se. Moreover, student perceptions of schism may have been magnified because "the two faces" (Professors Fisher and Ng) of the two respective course components were different. An open question is whether students would have been as concerned with schism (between content) in a traditional, one-instructor, entirely face-to-face, low-cohesion survey course as they were in the authors' wrapper version of a survey. Generally, student expectations regarding a wrapper should be characterized and addressed, because having multiple instructors in multiple modalities is outside the experience of most students.

Finally, this paper has introduced only the barest categorization scheme for coupling and cohesion, but the authors expect that this can be usefully expanded and deepened. For example, subject coupling (across or within modalities) may be broken down into repetitive (or reinforcing) treatment of the same material, or treatment connected by prerequisite relationships, with some material building on other material. Likewise, forms of task coupling might be further distinguished into modules addressing complementary subject content and/or complementary skill content.

Customization and Other Future Work

The Fall 2012 wrapper used the totality of the Stanford MOOC, but customization strategies might choose to wrap a course around only part of a MOOC, or more ambitiously, to wrap a course around parts of multiple MOOCs. The possibilities for exploring this latter type of customization continue to emerge as the Coursera platform continues to expand; for example, the platform hosts the lectures of a second Machine Learning MOOC (Domingos, 2013) from the University of Washington. While this course has some intersections with the Stanford MOOC, it also intersects with the additional content that Fisher included in his wrapper. A next step would be to design a wrapper around two or more MOOCs, with the instructor selecting and mixing lectures and assignments from each MOOC, as well as using other online content, some perhaps even produced by the instructor of the wrapper. This is an exciting possibility, which does not require that a MOOC be adopted in its entirety, as is. Currently this process of selection and mixing is technically easy, and such use cases may further drive MOOC providers to design for piecemeal use, accelerating customization and a co-evolution of online and blended course designs. Indeed, some recent ideas in this area highlight the possibilities opened by "mixable" elements (Caulfield, 2012b). In any case, it is expected that a process of mixing online resources will require greater attention to course design, perhaps resulting in certain kinds of coupling between online and in-class components, thus reducing perceptions of schism.

In general, the focus of this paper has been the design of in-class components to complement existing online components (a MOOC in this case study). However, the more typical perspective of designing online components to complement in-class components is equally important. In the case where the online components are MOOCs, however, the question becomes novel: how should MOOCs be designed to best take advantage of in-class component designs? More generally, how can MOOCs best be designed to leverage differently designed local learning communities?

Finally, drawing on the findings of earlier studies (Aycock et al., 2002; Mehaffy, 2012), the authors believe that greater customization will also lead to a greater realization among instructors that they are members of instructional communities, further promoting open opportunities for collegiality and collaboration among instructors and across disciplines. As members of such a community, the authors expect that in many cases, instructors of wrapper courses will create and add content to the world's repository as well, which may in turn be picked up by students outside the wrapper. While Fisher did not create and contribute video himself for the Fall 2012 Machine Learning course, he has done so for other courses, and for his graduate course in artificial intelligence, he requires students to create and post content. Fisher (2012) reports that students taking a MOOC in artificial intelligence hosted by another institution visited his YouTube channel for clarification on some concepts, posting a link to the channel on the MOOC discussion board, which brought still other visitors.

While such data are anecdotal, they suggest the fascinating possibility of characterizing student and faculty interactions beyond any single MOOC, including interactions across MOOCs and across media. Longitudinal studies for understanding the nature, extent, and evolution of ad hoc communities, perhaps MOOC-centered but not restricted to a single MOOC, would undoubtedly require data mining across larger spheres of Web interactions than is currently easy to do. Nonetheless, the possibilities for understanding and leveraging student patterns in seeking remedial and advanced material, instructor incentives for creating and posting material, and the movement of people between student and teacher roles are exciting and within current technical abilities, if only the data could be accessed.

References

Aycock, A., Garnham, C., & Kaleta, R. (2002). Lessons learned from the hybrid course project. Teaching with Technology Today, 8(6). Retrieved from http://www.uwsa.edu/ttt/articles/garnham2.htm

Babb, S., Stewart, C., & Johnson, R. (2010). Constructing communication in blended learning environments: Students' perceptions of good practice in hybrid courses. MERLOT Journal of Online Learning and Teaching, 6(4), 735-753. Retrieved from http://jolt.merlot.org/vol6no4/babb_1210.htm

Baldwin, R. G. (1998). Technology's impact on faculty life and work. New Directions for Teaching and Learning, 76, 7-21. doi:10.1002/tl.7601

Bowen, W. G., Chingos, M. M., Lack, K. A., & Nygren, T. I. (2012). Interactive learning online at public universities: Evidence from randomized trials. New York, NY: Ithaka S+R. Retrieved from http://www.sr.ithaka.org/sites/all/modules/contrib/pubdlcnt/pubdlcnt.php?file=http://www.sr.ithaka.org/sites/default/files/reports/sr-ithaka-interactive-learning-online-at-public-universities.pdf&nid=464

Caulfield, M. (2012a, November 9). How Coursera could walk the talk about MOOC-wrapping [Web log post]. Retrieved from http://mikecaulfield.com/2012/11/09/how-coursera-could-walk-the-talk-about-mooc-wrapping/

Caulfield, M. (2012b, December 11). Threads and the wrappable MOOC [Web log post]. Retrieved from http://www.hapgood.us/2012/12/11/threads-and-the-wrappable-mooc/

Domingos, P. (2013). Machine learning. Retrieved July 25, 2013, from https://www.coursera.org/course/machlearning

Fisher, D. H. (2012, November 6). Warming up to MOOC's [Web log post]. Retrieved from http://www.chronicle.com/blogs/profhacker/warming-up-to-moocs/44022

Gilbert, J. A., & Flores-Zambada, R. (2011). Development and implementation of a "blended" teaching course environment. MERLOT Journal of Online Learning and Teaching, 7(2), 244-260. Retrieved from http://jolt.merlot.org/vol7no2/gilbert_0611.htm

Guthrie, K. M. (2012). Barriers to the adoption of online learning systems. EDUCAUSE Review, 47(4), 50-51. Retrieved from http://www.educause.edu/ero/article/barriers-adoption-online-learning-systems

Hill, P. (2012, November 1). Online educational delivery models: A descriptive view. EDUCAUSE Review, 47(6), 84-97. Retrieved from http://www.educause.edu/ero/article/online-educational-delivery-models-descriptive-view

Koller, D. (2012, November 7). How online courses can form a basis for on-campus teaching. Forbes. Retrieved from http://www.forbes.com/sites/coursera/2012/11/07/how-online-courses-can-form-a-basis-for-on-campus-teaching/

Kuhn, D. (1992). Thinking as argument. Harvard Educational Review, 62(2), 155-178.

Lage, M. J., Platt, G. J., & Treglia, M. (2000). Inverting the classroom: A gateway to creating an inclusive learning environment. The Journal of Economic Education, 31(1), 30-43. doi:10.1080/00220480009596759

Mangan, K. (2012, October 1). Massive excitement about online courses. The Chronicle of Higher Education. Retrieved from http://www.chronicle.com/article/Massive-Excitement-About/134678/

Mazur, E. (2009). Farewell, lecture? Science, 323(5910), 50-51. doi:10.1126/science.1168927

Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. Retrieved from http://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf

Mehaffy, G. L. (2012). Challenge and change. EDUCAUSE Review, 47(5). Retrieved from http://www.educause.edu/ero/article/challenge-and-change

Pappano, L. (2012, November 2). The year of the MOOC. The New York Times, ED26. Retrieved from http://www.nytimes.com/2012/11/04/education/edlife/massive-open-online-courses-are-multiplying-at-a-rapid-pace.html

Rodriguez, M. A., & Anicete, R. C. R. (2010). Students' views of a mixed hybrid ecology course. MERLOT Journal of Online Learning and Teaching, 6(4), 791-798. Retrieved from http://jolt.merlot.org/vol6no4/rodriguez_1210.htm

Shirky, C. (2012, November 12). Napster, Udacity, and the academy [Web log post]. Retrieved from http://www.shirky.com/weblog/2012/11/napster-udacity-and-the-academy/

Stone, M. T., & Perumean-Chaney, S. (2011). The benefits of online teaching for traditional classroom pedagogy: A case study for improving face-to-face instruction. MERLOT Journal of Online Learning and Teaching, 7(3), 393-400. Retrieved from http://jolt.merlot.org/vol7no3/stone_0911.htm

Strauss, A. L., & Corbin, J. M. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks, CA: Sage.

Talbert, R. (2012, July 13). Screencasting for the inverted proofs class [Web log post]. Retrieved from http://www.chronicle.com/blognetwork/castingoutnines/2012/07/13/screencasting-for-the-inverted-proofs-class/

Toth, M. J., Amrein-Beardsley, A., & Foulger, T. S. (2010). Changing delivery methods, changing practices: Exploring instructional practices in face-to-face and hybrid courses. MERLOT Journal of Online Learning and Teaching, 6(3), 617-633. Retrieved from http://jolt.merlot.org/vol6no3/toth_0910.htm

Wiley, D. A., & Gurrell, S. (2009). A decade of development ... . Open Learning: The Journal of Open and Distance Learning, 24(1), 11-21. doi:10.1080/02680510802627746

Yin, R. K. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage.
