OpenFiction [Blog]

Why the “open” in MOOC matters

Posted in Evaluation, Mechanical MOOC, MIT OpenCourseWare, MOOC, open education by scarsonmsm on March 4, 2013

Jeff Young has a great piece in the Chronicle of Higher Ed today about what he calls the “bandwidth divide” and how most MOOCs require learners to have persistent high-speed internet access.  When we created the Mechanical MOOC course, we built it on existing open resources mostly because we thought it was the most efficient and cost-effective way to do it: by leveraging the investments already made in creating MIT OpenCourseWare, OpenStudy and Codecademy.

We realized very quickly that a lot of additional flexibility came with leveraging these resources.  Because they were from mature projects focused on openly sharing their resources and functionality, they had developed alternate modes of delivery to address bandwidth issues:

  • the 6.189 course used an open textbook that was downloadable
  • the 6.189 course materials (assignments, notes) themselves could be downloaded in a single zip file
  • the 6.00SC videos used were downloadable from iTunes U and the Internet Archive
  • OpenStudy was launching a beta mobile interface just as the course kicked off

And our learners downloaded the materials in large numbers:

[Chart: downloads of the course materials]

Beyond that, we were able to also leverage the deep investments made in translating these open resources.  The text is available in a dozen languages, and the course materials have been translated into Chinese.  By building our course on open resources, we saved money and leveraged the work that these projects have already put into reaching audiences working without persistent internet or in other languages.  A win-win-win.

Who really cares about MOOC completion?

Posted in Evaluation, Mechanical MOOC, MOOC by scarsonmsm on February 28, 2013

This is a really interesting graphic on MOOC completion, showing rates ranging from 20% to 2%.  The data points the researcher (Katy Jordan at OU UK) pulls out are:

  • How big is the typical MOOC? – while an enrollment of 180,000 is often cited as the largest MOOC so far, 50,000 students enrolled is a much more typical MOOC size.
  • How many students complete courses? – completion rates can approach 20%, although most MOOCs have completion rates of less than 10%.
  • What factors might affect completion rate? – the way that the course is assessed may affect completion rates; the completion rates of courses which use automatic grading range from 4.6% to 19.2%, while the rates for courses which use peer grading range from 0.7% to 10.7%. This may present a greater challenge for teaching MOOCs in certain subjects.
  • Do more students drop out if courses are longer? – there does not appear to be a negative correlation between course length and completion rate, which is interesting as you might expect fewer students to ‘keep going’ and complete longer courses.

It’s great to see some data on completion rates, and this will certainly stir up more debate on the topic.

But one issue not addressed in the current discussion is: who really cares about MOOC completion?  Certainly the groups offering them do, and educational researchers do.  It’s a fair guess that many non-profit funders do as well.  Interestingly, though, some of the data coming out of the Mechanical MOOC Python course suggest that in the absence of extrinsic carrots like credit or certificates, learners may not.

In the eighth and final week of the class, we asked the 5,775 learners who signed up for the first iteration of the Python course a series of end-of-course questions; we received 21 partial and 61 complete responses. Assuming a survey response rate of 3% (typical of what we see for MIT OCW surveys) to 5% (really good for an OCW survey), that would suggest a rough engaged population of learners (that is, learners still reading the e-mails we were sending out to structure the course) of between 2,733 and 1,640 people during the last week of the course.*
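To make the arithmetic explicit, here is a minimal Python sketch of that estimate (the response counts come from the survey above; the 3% and 5% response rates are the stated assumptions, not measured values):

# Back-of-the-envelope estimate of the still-engaged population:
# scale total survey responses by an assumed survey response rate.
partial_responses = 21
complete_responses = 61
total_responses = partial_responses + complete_responses  # 82

for response_rate in (0.03, 0.05):  # typical vs. really good OCW survey response rate
    engaged = total_responses / response_rate
    print(f"At a {response_rate:.0%} response rate: ~{engaged:,.0f} engaged learners")

# At a 3% response rate: ~2,733 engaged learners
# At a 5% response rate: ~1,640 engaged learners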

One question asked which was the last week of the course out of the eight they had completed.  Here’s the response:

[Chart: last week of the course completed, as reported by survey respondents]

At the point of the survey, midway through the eighth week, 12.1% indicated they had completed the course and 13.8% had completed week 7.  If we assume 25% attrition among those who had completed week 7, maybe 10.4 percentage points of that 13.8% would be expected to finish the course.  So in very rough numbers, 20.5% of the survey respondents might be expected to finish.

Apply that number to the estimated engaged population of learners above, and we get very rough numbers of estimated completers: 560 to 336, or 9.7% to 5.8%, somewhere in the mid-to-low range of the MOOCs out there, which might be expected, since we weren’t offering a certificate or other incentive for finishing.  Now there are plenty of places to take issue with the above numbers, and since our course setup doesn’t have a solid way of counting course completers, this really should be taken for the back-of-the-envelope analysis it is.  But…
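(Before getting to that: for completeness, here is the same sort of minimal Python sketch for the completer arithmetic, taking the rough 20.5% expected-finisher share and the sign-up count above as given.)

# Rough completer estimate: apply the ~20.5% expected-finisher share of
# survey respondents to the engaged-population range estimated earlier,
# then express the result as a share of the 5,775 total sign-ups.
signups = 5775
expected_finish_share = 0.205   # rough combined estimate from the survey responses
engaged_range = (2733, 1640)    # from the 3% / 5% response-rate assumptions

for engaged in engaged_range:
    completers = engaged * expected_finish_share
    print(f"~{completers:.0f} completers, or {completers / signups:.1%} of sign-ups")

# ~560 completers, or 9.7% of sign-ups
# ~336 completers, or 5.8% of sign-ups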

What is really interesting to me here is the distribution of learners across the weeks completed.  There is a large cohort of students (68.9% of respondents) that reports most recently completing weeks 4-7, which is to say they progressed significantly through the course but most of them were not positioned to finish the course “on schedule.”

How do they feel about this?  Apparently pretty good.  Granted, the n’s are painfully small here, but when asked how successful they felt they were in the portions of the class they completed, most reported being completely or mostly successful:

[Chart: self-reported success, by last week completed]

Further, when asked whether they felt prepared for further study based on what they had learned so far in the class, they likewise largely responded that they were very or somewhat prepared:

[Chart: self-reported preparedness for further study]

The data’s a little thin, yes, but this would seem to at least suggest that while MOOC providers and higher education commentators wring their hands about the completion rates of MOOCs, the learners may not really care that much.  If they are learning for the sake of learning, they may be quite content to fit in what learning they can given the constraints of their lives and be happy with wherever they finish up.

There’s a great deal of excitement (and fear) over whether MOOCs will replace parts of the current higher education system, but right now I suspect most of the activity with MOOCs (as has been the case with OER more generally) is in extending educational opportunity beyond the current higher education system.  If this is the case, we may need some better metric for understanding student success and satisfaction than completion rate.

* Correlating data point:  The week 8 assignment e-mail recorded 1,929 opens through our e-mail system.

MIT OpenCourseWare 2011 year-end numbers

Posted in Evaluation, MIT OpenCourseWare, OpenCourseWare, Video by scarsonmsm on January 11, 2012

Another good year for MIT OpenCourseWare.  The big story here is the tremendous jump in YouTube numbers.  This, plus the 12 million iTunes U downloads and the redistribution we are getting through Chinese sites like 163.com (from which we get no reporting), means that a lot of OCW activity is now coming through our video redistribution.

• 18.6 million visits (+1.2 million over last year): 10.2 million repeat visits, 8.4 million new visits
• 9.8 million visitors (+200K)
• 1.92 visits per visitor (+0.10)
• 101.4 million page views (+3.1 million)
• 5.42 pages per visit (-0.21)
• 1.8 million zip files of course content downloaded (-0.1 million)
• 11.4 million YouTube views (+4.1 million)
• 12 million iTunes downloads (+0.2 million)
• 317K visits from the MIT community (+42K)
• 361K visits referred by StumbleUpon (-85K); 183K by EducationPortal (+23K); 178K by Facebook (+100K); 157K by Reddit (-15K); 117K by Wikipedia (+5K); 111K by YouTube (+16K)
• 33% of visitors used Firefox (-5%); 26% used Chrome (+11%); 23% used IE (-11%); 14% used Safari (+4%)
• 2.37% used iOS (+1.57%)

Lies, damn lies and…

…new statistics.

Just completed the 2011 evaluation summary.  Hope as always to follow with a more detailed report, but for now, this gives a general idea of directions and trends.

The most interesting thing in here for me is the increase in the percentage of students (up to 45% from 42%), making them now the largest constituency instead of self-learners (at 42%, down from 43%).  These are margin-of-error-ish changes, but interesting nonetheless.  It could be a result of the time of year we did the survey, or it could indicate more people returning to school in a tough economy; lots of possible explanations.

It’s also interesting that the primary student use is now complementing materials from an enrolled course (up to 45% from 39%) instead of learning outside the scope of formally enrolled coursework (down from 44% to 40%).  This may indicate that more students are coming from undergraduate and community colleges, as it lines up more with past measures of usage scenarios at that level, but I’ll have to dig deeper to see if that holds.

Dig in yourself, and feel free to ask questions!

A record month for MIT OpenCourseWare – 1.7 M visits

Posted in Evaluation, MIT OpenCourseWare, Open Educational Resources, OpenCourseWare, web metrics by scarsonmsm on November 1, 2011

October is traditionally OCW’s annual high-water mark for traffic, and last month was no disappointment in that regard.  The site received a record 1,733,198 visits from 1,026,004 unique visitors.  This eclipses the previous high of 1,602,561/1,015,112 from August last year, and is a 12.4%/12.8% increase over last October.

A few more October numbers:

  • Average visits per day: 55,909
  • Page views:  8.8 M
  • Top course: 6.00 Intro to Computer Science and Programming – 104,096 visits

It’s great to see continued momentum as we swing into a new school year.

Class size: 10,000

The OpenStudy group for 6.00 Introduction to Computer Science and Programming has now eclipsed 10,000 participants. That’s a seriously big class. But what impresses me even more is the depth of interaction in some of the discussions, like this one.

Other groups are also growing at an impressive rate. 18.01 Single Variable Calculus has nearly 8,500 participants. 21F.101 Chinese I has over 2,000. Several others are closing in on a thousand participants, and all but one of the recently introduced OCW Scholar groups have attracted participants in the hundreds.

I cringe as I write this

…as I really have a lot on my plate.  But I’m not sure which statement David would disagree with:

• Educational resources are, on balance, beneficial to those who have access to them.
• Being “open” doesn’t diminish the value of “educational resources.”
• Obtaining permission to publish under full copyright is as expensive as publishing under an open license.
• The capacity open licenses provide for translation of OER into other languages, which has extended access to millions, is itself sufficient benefit to justify their use.

I feel like we are going in circles again. Notice how that link points to a previous link as well…

OCW and OER: Who’s in?

In preparation for the upcoming OCW Consortium meeting, I’ve been surveying the OCW/OER landscape and have come up with what is (for me, anyway) a bit of a startling observation: the number of universities in the 2010 US News World’s Best Universities list that have significant OCW/OER programs underway. By my count, it’s 9 of the top 25 and 15 of the top 50.

This is in no way to downplay the importance of the hundreds of other universities worldwide that are sharing their materials as well. The less resourced universities that serve larger and less prepared student populations provide some of the most valuable materials, but if you want a measure of where the world’s leading universities are in their thinking about OCW and OER, this is worth considering.

Here’s who I see in the list:

Harvard
Yale
MIT*
Oxford*
Stanford
Michigan*
Johns Hopkins*
University of Tokyo*
Kyoto*
UC Berkeley*
Carnegie Mellon
Ecole Polytechnique*
NYU
Seoul National*
Osaka*

* Members of the OCW Consortium.

OCW Scholar, month 1

Posted in Evaluation, MIT OpenCourseWare, open education, Open Educational Resources, OpenCourseWare by scarsonmsm on February 17, 2011

While we haven’t yet conducted a survey to gather detailed feedback about the new OCW Scholar courses, we do have one month of analytics now, and the results look pretty good. Collectively, the five courses received just shy of 95,000 visits in their first month, and all are in the top 30 courses on the site for the period January 12 to February 12, 2011. The top course, 18.01SC, received just under 28,000 of those visits.

All the user feedback we’ve gotten has been positive as well, along the lines of this comment from a student (location unknown): “This is a great help for my incoming local engineering licensure exam. I kind of sped past these topics during college. So when i saw the videos on 8.02 the thoroughness of explanation just amazed me . I think I am beginning to enjoy engineering again :)”

More data as it becomes available.

Failure to define success, part 2

I’ve had my share of existential moments, but this is one I didn’t realize I was having. Or rather, one my profession was having. I like Taylor’s book for the most part and think it serves as a useful examination of the field, but I do think it misses on a couple of fronts, which I will discuss in later posts.

But for now, the soundbite: Ira Fuchs’s quote, “If you take away OCW completely, I’m not sure that higher education would be noticeably different.” Sure, especially US higher education. The same could be said of Wikipedia. And once again I am filled with the sense that, as a movement, we are failing to adequately define success and so leaving ourselves open to having others define it for us.

When OCW was announced, I think there were many out there who hoped it would provide the leverage to break away from the artisan model of teaching to something that was more scalable. There seem to be two varieties of this hope: one, that faculty around the world would just pick up and use MIT’s curriculum, saving time and improving quality in one fell swoop (the “dirty underwear” model); two, that OCW would repeat Wikipedia’s success and that teachers around the world would collaborate on one “killer app” curriculum. A third variety that emerges in Taylor’s book is that online resources might supplant live teachers entirely: the OLI model.

All three, I think, grow from a view of education that holds it to be essentially knowledge transfer, and that there ought to be one “best” way to do it, measurable and precise. Education, at least for me, is intensely local and personal: learning how to learn. I won’t dwell on this; plenty of people have spoken more intelligently and articulately on the issue. Comments like Ira’s, I think, express the frustration of revolutionaries expecting a revolution.

OCW by its nature, though, reinforces the artisanal model of education by providing an example of one of the best artisanal communities of educators in the world hard at work. When we were first going to faculty and encouraging OCW participation, one of the constant refrains we heard was that MIT’s materials were designed for MIT students and likely weren’t going to be appropriate for most people out there. Not that they were necessarily too high-level, just that they were created for a specific community working within specific conditions.

However, OCW materials do give educators a window into how the MIT faculty community operates and how it crafts educational experiences, and other craftsmen and women around the world can draw inspiration and resources from OCW as they create their own educational experiences. But this is not the kind of activity you see writ large on the face of institutions. Nor does it change the fundamental model.

Large parts of the OCW story also take place outside the walls of institutions, offering educational opportunity to people who previously had none, and Ira’s comment completely ignores this. OCW has the potential to impact a great many lives, and appears at some level to have already done so for hundreds of thousands. But this is a difficult story to document and tell, one not measured in pre-tests and post-tests.

Which doesn’t mean we shouldn’t be doing it; it just means we have a lot of hard work ahead of us. This, again, is not a post about Ira Fuchs’s comment; it’s a post about our own failures to make the case. To define success. To share what we know about the ways OCW is making a difference around the world. Ira’s right: we haven’t noticeably changed higher education. But we have noticeably changed lives all around the world, and we need to be getting that message out there.