With profound gratitude for the opportunities I’ve been given at MIT, I want to share that I have accepted a new position as Operations Director for the OPENPediatrics program at Boston Children’s Hospital. My last day at MIT will be March 31st.
I’ve always felt that my work at OCW might be a once-in-a-lifetime opportunity to make a profound difference in the lives of people worldwide, and I am humbled to have found another opportunity to have such an impact. Every year, more than 10 million children die of preventable causes, and OPENPediatrics (http://openpediatrics.org) seeks to address this challenge, applying the principles of open sharing and scalable education that animate OCW and MITx to improve the care of critically ill children on a global scale.
While I am excited by this new opportunity, I am sad to part ways with the many friends and colleagues who mean so much to me. I will spend much of the next few years wondering (and maybe occasionally even asking) how the ODL and OCWC teams would have handled situations I will face.
I’m also sad to be unable to join my MIT colleagues in the engaging work that awaits ODL in the next few years. Amid the uncertainty of the shifting higher education landscape and the organizational changes at MIT, I have total confidence in the amazing people brought together under the ODL banner. I have no doubt that they will all do as they have always done–transform the way we think about the intersection of education and digital technologies, and how those technologies can be used to make ours a better world.
I’m optimistic my new position will allow me to remain engaged in the open education community, and will regardless keep in touch with my friends at MIT and the OCWC. Thank you again to the friends and colleagues who have made my work at OCW, the OCW Consortium, and the Office of Digital Learning such a wonderful experience.
Coursera debates future of monetization – The Daily Pennsylvanian
Reports from the actual event indicate the discussion is really not centered on monetization, but the article sure is:
“I think the excitement surrounding this conference, and around Coursera in general, shows that the profits will come, even if they may not have come yet,” said Law School professor Edward Rock, who serves as Penn’s director of open course initiatives. “When you have a product like these courses that represents an increase in quality and a reduction in cost, it’s bound to make money.”
At Penn, Rock said, about 60 percent of revenue earned from individual courses goes directly to faculty members.
“If a course turns out to be a bestseller, there will be significant revenues that flow to faculty members,” he added. “It’s something that professors think about and care about, because they’re putting a huge amount of time into developing these courses.”
“If you look at similar ventures, the same questions came up there. How’s Google going to get money from searches, how’s Facebook going to get money from hitting a like button?” [Penn mathematics and engineering professor Robert Ghrist] said. “Once you have an interested customer base, then you have something to work with.”
From the Udacity Legal page:
Udacity hereby grants you a license in and to the Educational Content under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License (http://creativecommons.org/licenses/by-nc-nd/3.0/ and successor locations for such license) (the “CC License”), provided that, in each case, the Educational Content is specifically marked as being subject to the CC License. As used herein, “Educational Content” means the educational materials made available to you through the Online Courses, including such on-line lectures, speeches, video lessons, quizzes, presentation materials, homework assignments, programming assignments, code samples, and other educational materials and tools, but, in any event, specifically excluding any Secure Testing Materials. Such Educational Content will be considered the “Work” under the terms of the CC License. “Secure Testing Materials” refers to any exams or other testing materials that are used for certification purposes.
Not sure how I missed that. The ND is limiting and raises many questions, but hey, it’s a step in the right direction.
Jeff Young has a great piece in the Chronicle of Higher Ed today about what he calls the “bandwidth divide” and how most MOOCs require learners to have persistent high-speed internet access. When we created the Mechanical MOOC course, we built it on existing open resources mostly because we thought it was the most efficient and cost-effective way to do it–by leveraging the investments already made in creating MIT OpenCourseWare, OpenStudy and Codecademy.
We realized very quickly that a lot of additional flexibility came with leveraging these resources. Because they were from mature projects focused on openly sharing their resources and functionality, they had developed alternate modes of delivery to address bandwidth issues:
- the 6.189 course used an open textbook that was downloadable
- the 6.189 course materials (assignments, notes) themselves could be downloaded in a single zip file
- the 6.00SC videos used were downloadable from iTunes U and the Internet Archive
- OpenStudy was launching a beta mobile interface just as the course kicked off
And our learners downloaded the materials in large numbers:
Beyond that, we were able to also leverage the deep investments made in translating these open resources. The text is available in a dozen languages, and the course materials have been translated into Chinese. By building our course on open resources, we saved money and leveraged the work that these projects have already put into reaching audiences working without persistent internet or in other languages. A win-win-win.
- How big is the typical MOOC? – while an enrollment of 180,000 is often cited as the largest MOOC so far, 50,000 students enrolled is a much more typical MOOC size.
- How many students complete courses? – completion rates can approach 20%, although most MOOCs have completion rates of less than 10%.
- What factors might affect completion rate? – the way that the course is assessed may affect completion rates; the completion rates of courses which use automatic grading range from 4.6% to 19.2%, while the rates for courses which use peer grading range from 0.7% to 10.7%. This may present a greater challenge for teaching MOOCs in certain subjects.
- Do more students drop out if courses are longer? – there does not appear to be a negative correlation between course length and completion rate, which is interesting as you might expect fewer students to ‘keep going’ and complete longer courses.
It’s great to see some data on completion rates, and this will certainly stir up more debate on the topic.
But one issue not addressed in the current discussion is who really cares about MOOC completion? Certainly the groups offering them do, and educational researchers do. A fair guess is that many non-profit funders do as well. Interestingly, though, some of the data coming out of the Mechanical MOOC Python course suggest that in the absence of extrinsic carrots like credit or certificates, learners may not.
In the eighth and final week of the class, we asked the 5,775 learners who signed up for the first iteration of the Python course a series of end-of-course questions; we received 21 partial and 61 complete responses. Assuming a survey response rate of between 3% (typical of what we see for MIT OCW surveys) and 5% (really good for an OCW survey), that would suggest a rough engaged population of learners (that is, those still reading the e-mails we were sending out to structure the course) of between 1,640 and 2,733 people during the last week of the course.*
One question asked which of the course’s eight weeks was the last they had completed. Here’s the response:
At the point of the survey, midway through the eighth week, 12.1% of respondents indicated they had completed the course and 13.8% had completed week 7. If we assume 25% attrition among those who had completed week 7, maybe 10.4 of those 13.8 percentage points would be expected to finish the course. So in very rough numbers, 22.5% of the survey respondents might be expected to finish.
Apply that number to the estimated engaged population of learners above, and we get very rough numbers of estimated completers: roughly 614 – 368, or 10.6% – 6.4% of the sign-ups–somewhere in the mid to low range of MOOCs out there, which might be expected, since we weren’t offering a certificate or other incentive for finishing. Now there are plenty of places to take issue with the above numbers, and since our course setup doesn’t have a solid way of counting course completers, this really should be taken for the back-of-the-envelope analysis it is. But…
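For anyone who wants to poke at the estimate, the back-of-the-envelope arithmetic can be reproduced in a few lines of Python. The 3–5% response rates and the 25% attrition figure are the assumptions stated above, not measured quantities:

```python
# Back-of-the-envelope estimate of Mechanical MOOC Python course completers.
# The response rates and the 25% attrition figure are assumptions from the post.

signups = 5775                      # learners enrolled in the first run
responses = 21 + 61                 # partial + complete survey responses

# Estimated "engaged" population, assuming a 3% (typical) to 5% (really good)
# survey response rate among learners still reading the course e-mails.
engaged_high = responses / 0.03
engaged_low = responses / 0.05

# Expected finishers among respondents: the 12.1% already done, plus 75% of
# the 13.8% who had completed week 7 (i.e., assuming 25% attrition).
finish_rate = 0.121 + 0.75 * 0.138  # ~22.5% of respondents

completers_high = finish_rate * engaged_high
completers_low = finish_rate * engaged_low

print(round(completers_high), round(completers_low))
print(f"{completers_high / signups:.1%}", f"{completers_low / signups:.1%}")
```

Swap in different response-rate or attrition assumptions and the completer estimate moves around quite a bit, which is exactly why this should be read as a rough sketch rather than a measurement.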
What is really interesting to me here is the distribution of learners across the weeks completed. There is a large cohort of students (68.9% of respondents) that reports most recently completing weeks 4-7, which is to say they progressed significantly through the course but most of them were not positioned to finish the course “on schedule.”
How do they feel about this? Apparently pretty good. Granted the n’s are painfully small here, but if you ask how successful they felt they were in the portions of the class they completed, most report being completely or mostly successful:
Further, if you ask whether they feel prepared for further study based on what they had learned so far in the class, they likewise responded largely that they were very or somewhat prepared:
The data’s a little thin, yes, but this would seem to at least suggest that while MOOC providers and higher education commentators wring their hands about the completion rates of MOOCs, the learners may not really care that much. If they are learning for the sake of learning, they may be quite content to fit in what learning they can given the constraints of their lives and be happy with wherever they finish up.
There’s a great deal of excitement (and fear) over whether MOOCs will replace parts of the current higher education system, but right now I suspect most of the activity with MOOCs (as has been the case with OER more generally) is in extending educational opportunity beyond the current higher education system. If this is the case, we may need some better metric for understanding student success and satisfaction than completion rate.
* Correlating data point: The week 8 assignment e-mail recorded 1,929 opens through our e-mail system.
My first job out of grad school was as the coordinator of Emerson College’s adult degree program. Without a doubt, the highlight of the job was the daily contact with the amazingly motivated and persistent adult students who were overcoming tremendous challenges to complete their degrees. Their enthusiasm for learning and pride of accomplishment was absolutely infectious.
There are lots of great things about my current gig for sure, but close contact with motivated learners has not been one of them–until now. We are a few days away from the official start of our Mechanical MOOC Python course, and already I am thoroughly enjoying the opportunity to interact with the learners participating in the course.
The enthusiasm they show, the extent to which they are already working ahead through the material and helping each other, and the Twitter stream and blog posts about the course all bring back that feeling of really helping people to do something they care about.
No doubt that OCW does this too, but my daily connection to it is abstract, often discussed in numbers. Nice to reconnect more firmly with the people.
I’m working with a group of testers to run through the initial draft of the course sequence for the upcoming Mechanical MOOC Intro to Python course, and I have to say, I am really loving the unplatform aspects of it. I live in one of the more wired cities in the US, and I still spend a fair amount of my time outside of WiFi range. I tried to complete the Udacity Stats course this summer, but one of the challenges was that I always had to be connected. My biggest blocks of free time are during my train commute, when theoretically I have wireless service (from AT&T) but practically I have at best spotty cell coverage (from AT&T). This meant no working on the Stats course during the ride.
Because the Mechanical MOOC depends on existing open content outside of an enforced platform, I have other options. MIT OpenCourseWare helpfully provides a course download option, so I have the 6.189 course installed locally. The text for the course is an open resource downloadable as a PDF. The videos from 6.00 are available through iTunes U, so accessible offline on both my laptop and phone. As an added bonus, OpenStudy just released a mobile interface, so I can even ask and answer questions without a WiFi connection. Codecademy even seems to be functional on my iPhone at some level, though I doubt I’ll try to complete those lessons on that platform.
By not creating and enforcing a single platform, the Mechanical MOOC gives up the opportunity to harvest lots of tightly integrated data about the learners, but it allows us to take advantage of all the hard work that the content and community providers have put into making their environment accessible and inviting. Hopefully this model is going to allow us to meet the learners where they live.
Here’s a paragraph from my review of Taylor Walsh’s book Unlocking the Gates. The review was published in the Continuing Higher Education Review, Vol. 76, 2012. Walsh’s book reviews a number of the early online courseware efforts, including Fathom.com, MIT OpenCourseWare, Carnegie Mellon’s Open Learning Initiative and India’s National Program on Technology Enhanced Learning (NPTEL).
What is consistent for me between these projects and the subsequent MOOCs at Stanford and MIT is that they are all in one way or another institutional answers to the question MIT president Charles Vest posed in 2000 to the committee that ultimately recommended MIT OpenCourseWare: How will the Internet change education, and what should our university do about it? That charge has echoed throughout the open-education community in the last decade as schools continue to grapple with these fundamental issues, and with the emergence of the newest generation of open online offerings, MOOCs, these questions take on increasing urgency.
What is a “MOOC”?
A massive open online course. They’re the latest rage in online learning. OK, they’ve actually been around a while in a variety of forms: the first was a free-for-all approach with little central control, where learners co-create the learning experience (“cMOOCs”); the more recent variety is much more like a traditional online class (“xMOOCs”). You can read more about them at http://en.wikipedia.org/wiki/Massive_open_online_course
In both cases, lots and lots of people get together to learn online. These courses are scalable because of peer learning environments that allow the learners to support each other, and because of assessment engines that automate feedback. Typically, participants number in the thousands, though some recent examples have included more than 100,000 initial participants.
OK, what is a “mechanical” MOOC?
Well, with previous MOOCs, there’s still been a professor who offers the course. Our course has no instructor. Our theory is that online learning tools have become robust enough that, with a light amount of coordination, learners can move through them together and support each other’s learning without a central authority.
We are establishing a mailing list that will coordinate learner activities across a selection of online tools, letting you know when class activities are taking place and where to go to participate.
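The coordination is deliberately simple: a fixed schedule of e-mails pointing learners at existing sites. A minimal sketch of that idea might look like the following; the start date, schedule entries, and function name are all illustrative, not the actual system:

```python
from datetime import date, timedelta

# Illustrative sketch of a "mechanical" coordinator: a fixed weekly schedule
# of messages directing learners to existing open sites. All specifics here
# (dates, schedule text, names) are hypothetical.

START = date(2012, 10, 15)  # hypothetical course start date
SCHEDULE = [
    "Week 1: read Ch. 1 of the open textbook; try the Codecademy basics",
    "Week 2: watch the 6.00 lectures; post questions on OpenStudy",
]

def email_for(today):
    """Return the message due this week, or None if the course is over."""
    week = (today - START).days // 7
    if 0 <= week < len(SCHEDULE):
        return SCHEDULE[week]
    return None

print(email_for(START + timedelta(days=8)))  # the second week's message
```

The point of the design is that the “instructor” is just a clock and a mailing list; all of the actual teaching lives in the linked sites and in the learners themselves.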
Why would you create a Mechanical MOOC?
We have a theory about MOOCs as they exist today. The first version of MOOCs–the cMOOCs–we think, are a little too unstructured for many learners, casting them into an unbounded environment of blogs, wikis, RSS feeds and other web technologies that are more than many learners can or want to manage.
On the other hand, the new strain of MOOCs–the xMOOCs–offered out of major universities and their spinoffs all seem to be competing to create the killer platform, and we have doubts that this can–or should–be done successfully. Usually, when sites try to do it all, they end up not doing much of it very well.
The lesson of open education in the past 10 years seems to be that the components of education—content, community and assessment—can be unbundled, and that sites can focus on providing one aspect of education very well. So we are combining three “best-of-breed” sites to create an offering that we think is as good or better than other approaches.
Is this course competing with the Stanfords and MITs of the world?
No, this is an experiment to test our theory about the current MOOCs. Whatever comes out of it will be a very different learning experience than either the cMOOCs or xMOOCs. It will hopefully be more structured than the former and less structured than the latter.
It will certainly not be a neat and polished environment where all the pieces are custom-created to fit together neatly. But on the other hand, we hope to bring together the best of what’s already out there without having to build anything from scratch–a significant cost advantage, and a model that will empower many more open education projects to experiment with MOOC-like offerings.
What course are you offering?
The first course will be called “A gentle introduction to Python” and will be, well, a gentle introduction to Python programming.
Who is offering the class?
A group of leading open education sites are involved, including Peer 2 Peer University, OpenStudy, Codecademy, and MIT OpenCourseWare. Peer 2 Peer University is managing the mailing list.
MIT is participating. Is this an MITx offering? A competing program?
Neither. MIT OpenCourseWare supports all experiments involving their content that are consistent with the mission and spirit of the program, and this is one of them. We all have a lot to learn about how open learning takes place, and the more data points the better. This MOOC will not offer an MITx certificate.
How big will this Mechanical MOOC be?
We don’t know, but we’re confident it can be very big. These sites already serve thousands and in some cases millions of users, so we can handle whatever may come. But we’re ok if it’s small also. Our concern is less about getting huge numbers in the front end, and more about delivering a good learning experience for everyone who participates.
How can I get to know others who are studying?
OpenStudy will provide a forum where all learners can interact in one big study group, so that’s a great place to start. We’re also offering the opportunity for learners to be assigned to groups of ten, so that you can work more closely with a more limited cohort.
Where do I sign up?
Sign up for the mailing list at http://mechanicalmooc.org/. You’ll also have to register eventually for the OpenStudy site and Codecademy, but this can be done as the course progresses, so no worries.
Do I get a certificate?
Nope, but Codecademy offers badges and OpenStudy has SmartScore, so you’ll get recognition of your work there. One of our long-term goals for Mechanical MOOC is to figure out how recognition works in this approach. NOTE: This MOOC will not offer an MITx certificate.
What good are the badges?
They are a shorthand for sharing your informal educational achievements on the Web, and a lot of smart people, including the good folks at Mozilla, are working hard to figure out how to make them more meaningful.
Can I use other sites and services with this course?
Absolutely. We encourage participants to bring in other tools, self-organize, and share what they are doing with the rest of the community. We’re trying to learn here as well.
If there is one single innovation driving the xMOOC* phenomenon, it’s the emergence of scalable automated assessments. The ability to provide feedback to thousands of students at once is a big part of what makes these courses scalable. A robust peer learning community is another aspect of this, but one for a later discussion. Anyway, to my non-technical self, there are three predominant flavors of these automated assessments:
• “Check yourself” kinds of quiz questions, often randomized in some way to try to control for cheating. These seem so far to be the assessments in the Udacity course I am taking.
• Simulations such as the circuitry sandbox used for 6.002x, which allow for open-ended manipulation of variables. While these kinds of assessments are more sophisticated, the underlying technologies seem to be more one-off and to require more development effort than the “check yourself” tools.
• True adaptive learning environments along the lines of those used by Carnegie Mellon’s Open Learning Initiative. I know OLI is not usually discussed in the xMOOC conversation, but everything I understand about the program indicates they should be. These seem to be a whole nother level of complicated above simulations.
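To make the first flavor concrete, here’s a toy sketch of a randomized, auto-graded “check yourself” question. It’s my own illustration, not any platform’s actual implementation: a per-learner seed varies the numbers (one simple way to control for answer-sharing), and grading is an instant comparison, which is what makes this style of assessment scale:

```python
import random

# Toy "check yourself" question: parameterized per learner and auto-graded.
# All names here are illustrative, not any real platform's API.

def make_question(seed):
    rng = random.Random(seed)  # per-learner seed randomizes the numbers
    a, b = rng.randint(2, 9), rng.randint(2, 9)
    return f"What is {a} * {b}?", a * b

def grade(answer, submission):
    return submission == answer  # instant, scalable feedback

prompt, answer = make_question(seed="learner-42")
print(prompt)
print(grade(answer, answer))  # a correct submission grades True
```

The same seed always regenerates the same question, so the grader never needs to store anything per learner–a design choice that keeps this flavor cheap compared to simulations or adaptive engines.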
That’s my assessment of the assessments. Would love to hear others’ takes.
* So I am adopting Stephen Downes’ conventions: xMOOC for the Coursera/Udacity/MITx variant, and cMOOC for the original connectivist model.