Open, closed…and everything in between!

Reading through the survey answers to the question “What does open mean to you?”, I was struck by the perspective offered by an anonymous librarian, which got me thinking about ‘open’ and ‘closed’ as the extreme ends of a spectrum. Are there degrees of openness?

Open and closed can be thought of as binary positions, but I think it’s more accurate to see it as a continuum. There are degrees of openness. A college might be called open if it accepts everyone who applies, but tuition and the applications process are still hurdles that close it off to some. Some courses call themselves open but are only open to registrants, and closed to anyone else. When it comes to open education or open learning, I think an open-mindedness on the part of course facilitators is important, so that they’re open to learners establishing their own goals and having a hand in defining their own learning paths and assessments – open outcomes/open assessment.

An anonymous librarian’s answer to a preliminary survey for the P2P University course, on the item “What does ‘open’ mean to you?”. Audio recording by Marie-Laure Le Guen, under CC BY 3.0.

In higher education, MOOCs are often regarded as the quintessential example of openness. Indeed, on the surface, participation in a massive open online course requires little more than a reliable Internet connection, an awareness that the course is offered and a bit of time on one’s hands. In practice though, being open isn’t just about uploading a couple of video lectures and letting students grapple with the material on their own.

All MOOCs self-identify as open but they clearly do not apply the same yardstick. To measure how much openness goes into a particular MOOC, we would need a multidimensional framework that would help us determine the type and the degree of openness realised in a given course.

I see several dimensions to consider, which I would formulate along these lines:

1. Open enrollment

This is currently the minimum requirement for any MOOC: anyone with an Internet connection should be able to join, regardless of their location, educational background, professional credentials, financial status, etc.

Open enrollment usually means that there are no restrictions as to the number of students joining, which explains the massive scale of some recent courses. An Artificial Intelligence class offered by Stanford in 2012 saw over 58,000 people [pdf] sign up!

Some courses, such as Why Open?, have nevertheless enforced a limit on the number of participants in an effort to keep the pace of the discussion forums manageable. The irony wasn’t lost on Terry Elliott:

This looks like a worthwhile August project and I would love to be a part of it, but I am struck by the delicious irony that a course called “Why Open?” is already closed 😉

Is it enough to allow anybody to sign up at no cost to claim complete openness in terms of enrollment? Well, not exactly. Setting aside the obvious issue of Internet access and cost, the following barriers to enrollment may still exist:

  • Language: the vast majority of MOOCs are offered in English, facilitated or taught by English speakers, and thus end up being dominated by native, or at least proficient, English speakers. Instruction, discussions and assessments generally take place in English, which puts some potential students at a great disadvantage. I would therefore rate a multilingual course as more open than a monolingual one. The practical implications aren’t simple, but it is a fact worth noting.
  • Self-censorship: Even though the platform is meant to be open to all, potential learners may feel intimidated by the academic qualifications of other participants or the technical terms used in the introduction page and decide that they are not welcome because they do not fit into the culture of the virtual university.

It appears that even on an issue seemingly as straightforward as open enrollment, degrees of openness emerge.

2. Open participation

Which aspects of the course can participants shape according to their individual or group needs? The answer to this question will determine the course’s level of openness-as-participation.

This would involve checking whether students are able to define their own learning goals and paths, whether they are expected to create their own learning materials or just use those provided by an external authority, whether they can ‘come and go’ as they wish or are bound by certain rules. It is mostly about who has control over what happens in the course.

Caveats:

  • It is often taken for granted that the tools used in MOOCs are mastered by all the learners. It isn’t always the case and, in the absence of appropriate scaffolding, the lack of technological literacy is going to be a barrier to participation.
  • Low connection speed and firewalls are major hurdles in many parts of the world, and this digital divide is most acutely felt in the case of learners trying to follow video-based lectures. I’ve experienced this myself time and time again!

According to this framework, a course is all the more open as it gives participants leeway to forge their own paths and provides an appropriate technical infrastructure that takes everyone’s needs into account – including, for instance, alternatives to video lectures and tutorials to help bridge the technological literacy gap.

3. Open resources

What are participants allowed to do with the course materials? Is the software infrastructure open source?

For a MOOC to qualify as fully open in this regard, it would have to release its course material under an open license and use open source software. Here, the restrictions imposed by the chosen license define the degree of openness: to take extreme examples, a course relying only on copyrighted material sits farthest on the closed end of the spectrum, while one using exclusively works from the public domain sits at the most open position.

This is a side of openness often ignored by the bigger xMOOC players.
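As a rough illustration of that spectrum, here is a minimal sketch in Python that scores a course’s resources according to the licenses attached to its materials. The numeric values and the averaging are assumptions made up for the example, not an established metric; they simply encode the intuition that all-rights-reserved material sits at the closed end and public domain works at the open end.

```python
# A rough, illustrative ordering of license families from most closed to
# most open. The scores are made-up values for this sketch, not a standard.
LICENSE_OPENNESS = {
    "All rights reserved": 0.0,
    "CC BY-NC-ND": 0.2,
    "CC BY-NC": 0.4,
    "CC BY-SA": 0.7,
    "CC BY": 0.9,
    "Public domain / CC0": 1.0,
}

def resources_openness(licenses):
    """Average the openness scores of the licenses used in a course."""
    scores = [LICENSE_OPENNESS[lic] for lic in licenses]
    return sum(scores) / len(scores)

# A course mixing openly licensed and copyrighted material lands in between.
print(resources_openness(["CC BY", "CC BY-SA", "All rights reserved"]))  # ~0.53
```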

4. Open assessment

Since the motivations for joining a MOOC typically range from improving job prospects to simple curiosity about the subject matter, it would not make sense to look at assessment through a single lens. In an open course, learners are free to set their own achievement goals, so the conditions of assessment should be flexible enough to accommodate the diversity of student expectations.

What would open assessment look like then? Pretty much anything that makes sense to the learner will work. Pragmatically, it comes down to a choice between grading (robot grading, peer assessment…), portfolio-based assessment or no formal assessment at all. Some students are pushing for access to formal university accreditation.

As usual, it would be fantastic to hear from you on this. Any feedback? Ideas on how to visualise these four dimensions?
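To get the ball rolling, here is a minimal sketch of one possible visualisation: a radar chart with one axis per dimension, drawn with Python and matplotlib. The scores are invented examples for two imaginary courses, not measurements of any real MOOC.

```python
# A minimal sketch of the four openness dimensions as a radar chart.
# The scores below are made-up examples on a 0-1 scale.
import numpy as np
import matplotlib.pyplot as plt

dimensions = ["Open enrollment", "Open participation",
              "Open resources", "Open assessment"]
courses = {
    "Course A": [0.9, 0.3, 0.2, 0.4],
    "Course B": [0.7, 0.8, 0.9, 0.7],
}

# One angle per dimension; repeat the first point to close each polygon.
angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
for name, scores in courses.items():
    values = scores + scores[:1]
    ax.plot(angles, values, label=name)
    ax.fill(angles, values, alpha=0.1)

ax.set_xticks(angles[:-1])
ax.set_xticklabels(dimensions)
ax.set_ylim(0, 1)
ax.legend(loc="lower right")
plt.show()
```

A chart like this makes it easy to see at a glance that a course can score high on one dimension (say, enrollment) while remaining quite closed on another (say, resources).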


References

Daniel, J. (2012). Making Sense of MOOCs: Musings in a Maze of Myth, Paradox and Possibility (pdf). Accessed 2013-08-05.

Breslow, L., Pritchard, D. E., De Boer, J., Stump, G. S., Ho, A. D. & Seaton, D. T. (2013). Studying Learning in the Worldwide Classroom: Research into edX’s First MOOC. Research & Practice in Assessment, 8(3), 13-25. Accessed 2013-08-05.


7 thoughts on “Open, closed…and everything in between!”

  1. Hi,

    Interesting post (notice you include appropriate scaffolding as one of the potential barriers to openness…)

    It looks like, from the data, that the language barrier you identify is fairly significant. Lots of MOOCs specifically tout the provision of educational opportunities to those who can least afford them, in terms of the global divide of wealth and poverty, as a key motivation. But the data seems to show that participants are, overwhelmingly, from rich, English speaking countries.

    High-GDP English-speaking countries typically top the list, then high-GDP non-English-speaking countries. Low-GDP non-English-speaking countries typically trail in at fractions of a percent.

    Looks like the language barrier is an issue, as is access to reliable technology.

    I think the continuum idea is interesting. For example, some universities running openish MOOCs are saying they want to reuse their resources. So, for things like multiple-choice question (MCQ) tests, they might keep them proprietary, so that when a student encounters them on the course, it is for the first time.

    This would seem useful, because the student will value the experience more (it’s incredibly annoying to have to take a test you have already done: no real learning tends to happen, and you feel trapped, and a little powerless…), and the university has to invest fewer scarce resources (they won’t need to redo all their MCQs for each course). “Practically open” might be a description here…

    In terms of assessment, a sole reliance on peer review can be problematic. Coursera courses trying to come to terms with this are hitting a few brick walls. Trolling, or other undesirable activity on the part of peers, might be an issue.

    Furthermore, well-designed automated feedback is in some cases better than peer feedback, and expert feedback is superior, in many cases, to both (I’m reminded of my institution using peer feedback to inform our research proposals for the coming year. I engaged in the peer feedback actively, carefully, and enthusiastically. My proposal failed, and the expert feedback pointed out around 25 issues that the peer had missed…).

    Feedback doesn’t have to be either/or. It can, of course, be a mix… But it might, due to time and resource constraints, be a balancing act. Again, practically open, which could necessitate constraints on student choice. The more expert feedback that is desired, the more closed, in some respects, the curriculum may need to be. The more automated, again, the more closed aspects of it are likely to be (developing 100 MCQs with good feedback that are well aligned, and useful, is a huge task. Developing 500 is probably beyond the resources of most institutions who are not doing it for profit).

    • Hi Keith,

      Thank you for reading this post and taking time to comment as well!

      Scaffolding

      notice you include appropriate scaffolding as one of the potential barriers to openness…

      Where did I say that? *confused*

      Language barrier

      Lots of MOOCs specifically tout the provision of educational opportunities to those who can least afford them, in terms of the global divide of wealth and poverty, as a key motivation. But the data seems to show that participants are, overwhelmingly, from rich, English speaking countries.

      Would you have any links to the data you refer to? So far, I’ve only had access to studies based on one course (cf. EdX study above) and I’m not aware of more extensive studies that could demonstrate general trends on Coursera or EdX for example. Maybe I’m not looking in the right place?

      I’ve seen great translation and localization efforts championed by students on a couple of Coursera courses (e.g. subtitling of video lectures, support groups in Spanish and French) but this is bound to remain marginal until their content is openly licensed.

      This said, things seem to be slowly improving: Coursera currently offers 390 courses in English, 11 in Spanish, 10 in French, 5 in Chinese, 1 in Arabic, 1 in German and 1 in Italian. See screenshot below (taken from https://www.coursera.org/courses):
      [Screenshot: Coursera website, 9 August 2013]

      Assessment

      In terms of assessment, a sole reliance on peer review can be problematic. Coursera courses trying to come to terms with this are hitting a few brick walls. Trolling, or other undesirable activity on the part of peers, might be an issue.

      I’ve also noticed that unless there’s already a bond between students taking the same course, many learners will have a hard time accepting or trusting peer review. Last year’s discussion forum on the gamification course (#gamification12) was rife with rants about grades, lack of serious feedback by peers and more general criticism about peer grading. Audrey Watters wrote a great post about her concerns regarding peer assessment.

      The more expert feedback that is desired, the more closed, in some respects, the curriculum may need to be. The more automated, again, the more closed aspects of it are likely to be (developing 100 MCQs with good feedback that are well aligned, and useful, is a huge task. Developing 500 is probably beyond the resources of most institutions who are not doing it for profit).

      The reason online learning platforms tried to develop assessment solutions is precisely because it was impractical to provide expert feedback to a large number of students. There just aren’t enough professors to go around!

      Does this reflection call for a redesign of online courses? Maybe. It’s interesting to look at tensions between massive enrollment and open processes.

  2. “Self-censorship: Even though the platform is meant to be open to all, potential learners may feel intimidated by the academic qualifications of other participants or the technical terms used in the introduction page and decide that they are not welcome because they do not fit into the culture of the virtual university.”

    and later, you talk about the absence of appropriate scaffolding for users who may not have the tech savviness

    “It is often taken for granted that the tools used in MOOCs are mastered by all the learners. It isn’t always the case and, in the absence of appropriate scaffolding, the lack of technological literacy is going to be a barrier to participation.”

    I took both of these to be a reference to scaffolding, and the absence of it as a possible barrier.

    I can dig out the reference for the demographic spread… though it’ll take a while. I’ve read a lot recently, and my memory is fried. Plus, I’m getting the flu. The data is probably about a year old at this stage, and the game is changing so fast that it may not be so accurate. But I’d be confident it’s still generally true.

    The scalability of instruction and feedback is an ongoing discussion. But it does seem clear that good feedback is key – to students’ motivation, to the value they assign to courses, and, curiously, to long-term retention and transfer of knowledge too. Really good automated feedback can handle some of this: you design the questions to discover and teach to common and less common misconceptions; you identify the threshold and troublesome concepts and develop around them; you have a good idea of who will be doing the quizzes and what that means – here culture, educational histories, prior knowledge and language issues all come into play; all feedback is meaningful and informing; you carefully decide between formative feedback, where you are going to reteach things, and summative, where you are assessing for a grade; you align your quizzes carefully with the tasks you, or students, need to achieve; you design empathetically, from a student’s-eye perspective; you reward both process-oriented learners (this is what I learned) and goal-oriented learners (this is the achievement I unlocked); and you are clear about why people are doing the quizzes, and how the process will enable them to achieve their goals.
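    As a minimal sketch of what one of those items could look like in practice – the field names and structure below are illustrative assumptions, not any platform’s actual format – a multiple-choice question can carry targeted formative feedback for every option:

    ```python
    # A minimal, hypothetical representation of a multiple-choice item that
    # attaches formative feedback to every option, so a wrong answer can be
    # met with a response aimed at the misconception it usually reveals.
    mcq_item = {
        "question": "Which Creative Commons license allows commercial reuse "
                    "without requiring share-alike?",
        "options": [
            {"text": "CC BY", "correct": True,
             "feedback": "Right: attribution is the only condition."},
            {"text": "CC BY-SA", "correct": False,
             "feedback": "Close, but share-alike means derivatives must carry the same license."},
            {"text": "CC BY-NC", "correct": False,
             "feedback": "The non-commercial clause rules out commercial reuse."},
        ],
        "purpose": "formative",  # as opposed to "summative": reteach rather than grade
    }

    def respond(item, choice_index):
        """Return the targeted feedback for the option a learner picked."""
        return item["options"][choice_index]["feedback"]

    print(respond(mcq_item, 1))
    ```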

    The community aspect is key – and something I often don’t address. But meaningful interaction is a function of, and prerequisite for, meaningful peer feedback. I think you are exactly right. How invested we are in the community is a good guide to how meaningful our interaction with it will be…

    Laurillard’s Conversational Framework is one possible partial solution to the scalability issue, and centralises peer feedback, instructor feedback, as well as learner reflection, and works on the premise that this is all visible – so it’s posted to a blog, or LMS, or forum, or over social media.

    There’s lots to suggest that if you want peer feedback to work, you need expert feedback, as a model, as a reward, and as a driver of value, utility and effort from the students.

    Scalability may be to do with balance here. Or to put it another way, the lower down the competency scale you go, the more valuable expert feedback is, the more it contributes to efficiency of learning, and the more it makes participation likely.

  3. Great post, Laila! I am reminded of my first and still favourite MOOC experience, ETMOOC (Educational Technology and Media MOOC: http://etmooc.org), which @wiltwhatman and I both took earlier this year, because it was open in many of the ways you note, though it also fell short in many of the ways you note that MOOCs can – language barriers, self-censorship, lack of scaffolding, and low bandwidth (this was a problem for participating in synchronous video sessions). I can’t help but think about these things in relation to this P2PU course as well, and think that there is much more we could do next time.

    I think we need to skip the discussion forums (not many are using them so far anyway) and just use blogs next time. And have a few people who are willing to help out adding in blog URLs to the blog hub manually, so it’s not a lot of time for one person to do hundreds. Then we wouldn’t have to cap enrollment.

    We could also do more with encouraging learners to take their own paths. This course is fairly structured – we have specific tasks to do each week that don’t necessarily encourage people to do their own things with the material. We tried to write some of the blog post suggestions so they’d be open to whatever people want to do, but it’s still fairly directive overall. We could do more to encourage people to do what they want, when they want, and not feel like they have to do what we’re saying. That’s a tough balance; some structure is needed, but too much is problematic, I think.

    And another idea would be to allow people to award each other badges, rather than just facilitators awarding badges for particular things done in the course. That would be more participatory. And/or participants collaborating on designing badges and what they should be for, then being able to give them out, rather than just us doing that all beforehand.

    Thanks for providing some helpful ideas for next time!

    • Hi Christina,

      You’re welcome, though you came up with all the ideas yourself! I’m going to copy your feedback to ‘Why Open’ into the pad so everybody can comment on it.

      May I suggest a call for volunteers to manually enter the blog URLs? I’m sure several people in our course would have been happy to help out.

      Let me also say that I don’t believe *any* course will ever be ideally open on all levels. However, it’s important to think about ways of extending openness in directions that might not be obvious to everyone: language barriers and low bandwidth will not necessarily be part of most facilitators’ reality, yet they will be for potential learners.

      In this regard, the P2PU platform is doing great on providing opportunities for bridge persons to emerge. Because the source code is open and tutorials on how to design a course are available and openly licensed, it’s rather easy for someone to either create a similar platform in their language or to use the existing infrastructure to offer a class corresponding to the needs of their community. Even if someone just takes the initiative to translate a blueprint for making courses, it can be taken up later on by someone else who’s not proficient in English but could use the ideas!

      I’m bent on writing a full post on language issues in MOOCs because there’s a lot to say. Plus, I really care 😉

      • Good idea for volunteers for blog hub enrolment. We’re good for now b/c we’ve got two people, but at first I thought it was just going to be me, and I was really nervous about getting like 100 blogs all at once to enter. The problem was that different blog platforms have different RSS feed URLs, and so you have to do them one by one and add the right RSS feed address to the main blog URL. I decided not to ask people to try to find the RSS feed url for their own blogs…that to me was just one step too far for people totally new to blogging, which may be some of our participants!
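        For what it’s worth, here is a tiny sketch of how a feed URL could be guessed from a blog’s home address using common platform defaults – the paths below are typical conventions, not guarantees, so self-hosted or unusual blogs would still need a manual check:

        ```python
        # A small sketch that guesses a blog's RSS feed URL from common
        # platform defaults. Heuristic only: real blogs vary, and unknown
        # platforms fall through to None for a manual check.
        from urllib.parse import urlparse

        DEFAULT_FEED_PATHS = {
            "wordpress.com": "/feed/",
            "blogspot.com": "/feeds/posts/default",
            "tumblr.com": "/rss",
        }

        def guess_feed_url(blog_url):
            host = urlparse(blog_url).netloc
            for domain, path in DEFAULT_FEED_PATHS.items():
                if host.endswith(domain):
                    return blog_url.rstrip("/") + path
            return None  # self-hosted or unknown platform: check by hand

        print(guess_feed_url("https://example.wordpress.com"))
        ```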

        At any rate, turns out we only have about 20 blogs on the hub, even though there are nearly 60 people enrolled, and they’re coming in slowly rather than all at once, so it’s completely manageable. But I was worried before the course started!

        So for next time, wiltwhatman, you are on the hook for helping! (only if you have time, of course!)
