Failure to define success
I guess the thing I’m feeling most strongly out of the most recent exchange over the issue of cost savings and OCW is my own failure to get more data out there about how OCW generates benefit. I have several years’ worth of survey data that we simply haven’t had the time to package up neatly and publish, and if we aren’t showing how it works, I guess we can’t complain when others attempt to define success for us.
So this is in small measure an attempt to insert some facts into the speculation. David is currently postulating that cost savings from adopting OER are the key benefit, and that if those savings aren’t generated, the movement has failed. The data below is from a 2008 survey of OCW users with more than 5,000 self-selected respondents. Educators were 9% of respondents, so we are already looking at a small portion of overall traffic. How does this subsection of the audience use the content?
As it turns out, fewer than a third are using it in a way that would require direct adoption (20.2% incorporating materials + 7.9% developing curriculum). The other three modes of use—personal learning, learning new teaching methods, and finding reference materials for students—have nothing whatsoever to do with adoption of materials in the way David describes (although pointing students to reference materials might be done via linking).
We can look even more granularly at these modes of use and the benefits they produce. So, for example, personal learning, which is 30.5% of educator use:
As can be seen, educators use the site largely to learn new material, either within or outside their field, and secondarily to refresh their knowledge of the basics. What are the benefits in doing this?
Mostly making them better teachers, it would seem, through the availability of better information and the motivation it provides.
What about learning new teaching methods, the second most prevalent mode of use in Table 1 at 22.9%?
Primarily they learn new methods for themselves, and secondarily for their wider community of educators. What benefits does this activity generate?
I’m sure this will be interpreted in some circles as promoting backwardness, but interestingly, most educators learning teaching methods from the site believe they become better lecturers, and only secondarily learn to make instruction more interactive or project-based.
OK, onto the issue that has David’s attention, incorporation of materials at 20.2% of educator use. What does this actually look like?
No surprise, most faculty are incorporating materials into an existing course, which means the formats, approaches, language, etc. all have to be a pretty good match for direct adoption. Interestingly, the second most prevalent way is looking for ideas on how to design a course, which may or may not be the kind of direct adaptation David is interested in. The last mode, adoption for a new course, seems to me to be the clearest path for adoption in the way David describes it. And the benefits?
Time savings fall pretty low here, and cost savings even lower. So the benefit David is focusing on is 37% (24.9% time + 12.1% cost) of 20.2% (the share of educators incorporating materials) of 9% (educators’ share of respondents) of the benefit we’ve identified. That works out to less than 1% of the total. If this is indeed the key benefit of OCW, we are truly in trouble.
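For anyone who wants to check my arithmetic, the compounding above is just a product of the three survey shares. A quick back-of-envelope sketch (the numbers are the survey figures already cited in this post; nothing new is assumed):

```python
# Back-of-envelope check of the "less than 1%" claim above.
educator_share = 0.09            # educators as a share of all survey respondents
incorporation_share = 0.202      # educators who are incorporating materials
time_cost_benefit = 0.249 + 0.121  # time savings + cost savings = 37% of benefits cited

fraction_of_benefit = educator_share * incorporation_share * time_cost_benefit
print(f"{fraction_of_benefit:.4f}")  # 0.0067, i.e. roughly two-thirds of one percent
```

So even crediting time savings and cost savings together, the benefit David highlights accounts for well under 1% of the benefit reported by this audience.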
Let’s complete the data set with the student reference use (15.1% of educator use):
Largely, educators appear to use the materials to help students better learn concepts in the class or to study more advanced topics, and less for remedial work. How do they characterize the benefits?
Increased student learning and increased student motivation appear to be the key benefits here.
Finally, curriculum development, at 7.9% of overall educator use:
Here is an even split between existing and new curriculum. And the benefits?
Cost was not addressed directly in this instance (the possible benefits were generated through an analysis of open-ended questions about benefit in previous surveys). Even generously assuming all “other” responses to be cost- or time-savings related, that is still only 5.9%.
So scanning the benefits in all of the tables above, most of the responses have to do with increases in the quality of instruction or learning, or with student and faculty motivation. There are two ways to make a system more efficient: make it cost less or increase the output. I don’t see a lot of evidence in this that OCW can make education cheaper (though open texts and open access journals may), but I see lots of evidence that it can help us all get more out of the investments we do make in education. And this analysis does not even take into consideration the benefits generated for students, who are 42% of our audience.
Of course there are things that could be said about the data: it’s self-selected, the benefits were preselected and incorporate bias, and it represents what is rather than what should be (if OCW were more adoptable, more people would adopt it). All of these are likely true to an extent, and they show just how difficult it is to evaluate a resource like OCW. But I doubt their cumulative effect is enough to change the picture dramatically.
Why is this important enough to spill so many pixels over? Because selling chickens as a source of milk will disappoint the customer. If institutional administrators, Department of Education staff, grantors, donors, and other sources of support are brought on board with the cost-savings argument, and those savings fail to materialize, the movement will lose support. For formal education, it’s very important that we actively promote OCW’s ability to increase the quality of education through transparency and open publication, and also look for cost savings where they emerge. If we are going to do OpenCourseWare, we ought to do it for what it does—increases the quality of education at our institutions and provides educational opportunity for millions—not for what we wish it did.
Anyway, in future posts, I’ll share parallel data for students and independent learners. But more immediately, I have a follow-on case study to this educator data that illustrates how “adoption” is not simply plug-and-play or a matter of a few localizations, and why the impact is more about quality and less about money.