Introduction to Psychology is the 6th of seven courses OCW will publish this year specifically to meet the needs of independent learners.
CAMBRIDGE, MA, July 31, 2012 — MIT OpenCourseWare has released a new version of 9.00 Introduction to Psychology in the innovative OCW Scholar format designed for independent learners. This course presents a scientific overview of how the mind works, and applies that knowledge to contemporary debates around topics like nature versus nurture, free will, consciousness, human differences, the self, and society.
“I hope site visitors come away with an appreciation of just how amazing people are,” says Professor John Gabrieli, who developed the course. “I hope the course makes you think about yourself and your friends in a different way than you ever did before.”
Gabrieli, a renowned expert in the field of learning and memory, has used brain imaging technology combined with behavioral testing to map abstract concepts like memory, thought, and emotion to specific regions of the brain. Gabrieli’s research has significantly advanced our understanding of how learning and memory are organized in the mind. Some of his most recent research has provided insights into key aspects of autism, dyslexia, and visual memory. Gabrieli has also received numerous awards for his teaching, including the Dean’s Award for Distinguished Teaching at Stanford University in 2001.
MIT’s original version of 9.00 Introduction to Psychology from 2004 has received more than 650,000 visits. The new Scholar version provides visitors to the OCW site with an even more robust learning experience.
OCW Scholar courses represent a new approach to OCW publication. MIT faculty, staff and students work closely with the OCW team to structure the course materials for independent learners. These courses offer more materials than typical OCW courses and include new custom-created content. The Introduction to Psychology course provides a complete learning experience for independent learners, including lecture videos, reading assignments from a free online textbook and detailed notes from another book, interactive quizzes for each session, discussion content to elaborate key concepts, online resources for further study, review questions, and exams with solution keys.
The first five of a planned 15 OCW Scholar courses were launched by MIT OpenCourseWare in January 2011, and have collectively received more than 800,000 visits since launch. The initial OCW Scholar courses included Classical Mechanics, Electricity and Magnetism, Solid State Chemistry, Single Variable Calculus, and Multivariable Calculus.
Seven OCW Scholar courses were published in 2012. Linear Algebra, Differential Equations, Principles of Microeconomics, and Introduction to Electrical Engineering and Computer Science were published earlier this year. Fundamentals of Biology, Introduction to Psychology, and Introduction to Computer Science and Programming were published this past month. OCW Scholar courses are published on the OCW site with the support of the Stanton Foundation.
About MIT OpenCourseWare
MIT OpenCourseWare makes the materials used in teaching most of MIT’s undergraduate and graduate courses—more than 2,100 in all—available on the Web, free of charge, to any user in the world. OCW receives an average of 1.75 million web site visits per month from more than 215 countries and territories worldwide. To date, more than 125 million individuals have accessed OCW materials. MIT OpenCourseWare is supported by donations from site visitors, grants and corporate sponsorship, including underwriting from our Next Decade Alliance sponsors Dow Chemical, Lockheed Martin and MathWorks.
About John Gabrieli
John Gabrieli is the director of the Athinoula A. Martinos Imaging Center at the McGovern Institute. He is an Investigator at the Institute, with faculty appointments in the Department of Brain and Cognitive Sciences and the Harvard-MIT Division of Health Sciences and Technology, where he holds the Grover Hermann Professorship. He also co-directs the MIT Clinical Research Center and is Associate Director of the Athinoula A. Martinos Center for Biomedical Imaging, MGH/MIT, located at Massachusetts General Hospital. Prior to joining MIT, he spent 14 years at Stanford University in the Department of Psychology and Neurosciences Program. Since 1990, he has served as Visiting Professor in the Department of Neurological Sciences at Rush-Presbyterian-St. Luke’s Hospital and Rush Medical College. He received a Ph.D. in Behavioral Neuroscience from the MIT Department of Brain and Cognitive Sciences in 1987 and a B.A. in English from Yale University in 1978.
About the Stanton Foundation
The Stanton Foundation was created by Frank Stanton, who is widely regarded as one of the greatest executives in the history of electronic communications. During his 25 years as president of CBS, he turned a lesser-known radio network into a broadcasting powerhouse. Stanton made many historic contributions to the industry and to the society it served. In 1960, he initiated the first televised presidential debates—the famous Nixon-Kennedy “Great Debates”—which required a special Act of Congress before they could proceed. He also spearheaded the creation of the first coast-to-coast broadcasting system, allowing CBS to become the first network to present a news event live across the continental United States, a speech by President Truman at the opening of the Japanese Peace Conference in San Francisco. Frank Stanton was the commencement speaker at MIT in 1961.
MIT OpenCourseWare Selected One of Best Free Reference Web Sites for 2012 by American Library Association
OCW honored alongside other rich online reference resources including Google Art Project and World Databank
CAMBRIDGE, MA, July 23, 2012 — MIT OpenCourseWare (OCW) has been selected as one of the “Best Free Reference Web Sites” for 2012 by a division of the American Library Association (ALA). The award is part of an annual series initiated by the MARS: Emerging Technologies in Reference Section of the Reference and User Services Association (RUSA) of the ALA to recognize outstanding reference sites on the World Wide Web. The MIT OpenCourseWare site is one of 26 web sites recognized this year by a committee of member librarians from across the United States. Selection criteria include the quality, depth, usefulness, and uniqueness of the content, as well as the ease of accessing the information. MARS noted that OCW content was “amazingly rich” and “a great resource for self-improvement and for college students who would like extra guidance…in parallel courses.”
Other notable recipients of this year’s award include the Google Art Project, an interactive experience that brings together thousands of works of art from hundreds of museums; the Khan Academy, which offers free educational content for K-12 subjects; the World Databank, the World Bank’s statistical database on the economic and financial health of countries; the National Jukebox, which makes available over 10,000 song recordings from 1901-1925 held by the Library of Congress; Common Sense Media, which provides ratings and detailed information for parents about the suitability of movies, books and video games for children; the Federal Bureau of Investigation’s Vault, an open database of declassified FBI records; and Emory University’s Trans-Atlantic Slave Trade Database, containing detailed information on 35,000 slaving voyages that occurred between the 16th and 18th centuries.
“We are honored to be recognized by the American Library Association’s emerging technologies group,” says Cecilia d’Oliveira, MIT OpenCourseWare’s Executive Director. “Universal access to high quality information is a vision that we both share, and this award helps raise awareness about the importance of the open education movement.”
The American Library Association established the MARS: Emerging Technologies in Reference Section in 1978 to track important developments in the use of technology for library reference services. MARS is charged with researching and representing the interests of those concerned with attaining the highest possible quality in planning, developing, managing, teaching, or conducting all forms of computer-based reference information services in libraries. The section works to explore the impact of new technologies on users, services and collections, and to educate and prepare library personnel for new developments, emerging trends, and best practices in library reference. This was the 14th year of the award; a full index of all websites that have won the award is available from the American Library Association.
About MIT OpenCourseWare
MIT OpenCourseWare makes the materials used in the majority of MIT’s undergraduate and graduate courses—more than 2,100 in all—available on the Web, free of charge, to any user in the world. OCW receives an average of 1.75 million web site visits per month from more than 215 countries and territories worldwide. To date, more than 125 million individuals have accessed OCW materials. MIT OpenCourseWare is supported by donations from site visitors, grants and corporate sponsorship including our Next Decade Alliance sponsors Ab Initio Software Corporation, Lockheed Martin, MathWorks and Dow Chemical.
If there is one single innovation driving the xMOOC* phenomenon, it’s the emergence of scalable automated assessments. The ability to provide feedback to thousands of students at once is a big part of what makes these courses scalable. A robust peer learning community is another aspect, but one for a later discussion. Anyway, to my non-technical self, there seem to be three predominant flavors of these automated assessments:
• “Check yourself” kinds of quiz questions, often randomized in some way to try to control for cheating. So far, these seem to be the assessments in the Udacity course I am taking.
• Simulations such as the circuitry sandbox used for 6.002x, which allow for open-ended manipulation of variables. While these kinds of assessments are more sophisticated, the underlying technologies seem to be more one-off and to require more development effort than the “check yourself” tools.
• True adaptive learning environments along the lines of those used by Carnegie Mellon’s Open Learning Initiative. I know OLI is not usually part of the xMOOC conversation, but everything I understand about the program indicates it should be. These seem to be a whole nother level of complicated above simulations.
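To make the first flavor concrete: a “check yourself” question can be generated from a random seed, so each learner sees slightly different numbers, and graded instantly with no human in the loop. This is a hypothetical sketch of the idea, not any platform’s actual implementation; the function names and the arithmetic question are my own invention.

```python
import random

def make_question(seed):
    """Generate a randomized arithmetic question so each learner sees
    a slightly different problem (a simple anti-cheating measure)."""
    rng = random.Random(seed)  # seeding makes the question reproducible per learner
    a, b = rng.randint(2, 9), rng.randint(2, 9)
    return f"What is {a} * {b}?", a * b

def grade(submitted, answer):
    """Instant feedback: the whole point is that this scales to
    thousands of students with no grader in the loop."""
    return "You got it right!" if submitted == answer else "You got it wrong."

prompt, answer = make_question(seed=42)
print(prompt)
print(grade(answer, answer))      # feedback for a correct submission
print(grade(answer + 1, answer))  # feedback for an incorrect submission
```

The cost asymmetry in the list above follows directly: a checker like this is a few lines per question type, while a simulation sandbox or an adaptive routing engine is a bespoke engineering project per field.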
That’s my assessment of the assessments. Would love to hear others’ takes.
* I am adopting Stephen Downes’ conventions: xMOOC for the Coursera/Udacity/MITx variant, and cMOOC for the original connectivist model.
From the latest breathless NYTimes article on MOOCs:
Coursera does not pay the universities, and the universities do not pay Coursera, but both incur substantial costs. Contracts provide that if a revenue stream emerges, the company and the universities will share it.
Although MOOCs will have to be self-sustaining some day — whether by charging students for credentials or premium services or by charging corporate recruiters for access to the best students — Ms. Koller and university officials said that was not a pressing concern.
I suppose it’s ironic for me to be saying this–given my line of work–but I have concerns about the business model for MOOCs. Not the “build it and they will come” aspect, because I’ve made a good living off of that proposition, but more the cost of the building. I’m a little afraid that MOOC content is largely going to fall into the sour spot between OpenCourseWare content, which is really cheap to produce, and truly useful adaptive learning content, which is really costly to produce.
OCW content costs in the single-digit thousands to tens of thousands of dollars per course to produce; MIT’s model, which is fairly costly, runs about $10K per course. Most other OCWs out there spend far less, often as little as $3K per course. Of course, for that investment you largely get the static PDFs that were used in classrooms, rather than online courses. Creating an OCW course with video costs about $5K to $25K, depending on the recording setup and production value.
I just completed a review of Taylor Walsh’s Unlocking the Gates, which chronicles–among other projects–Carnegie Mellon’s Open Learning Initiative, one of the few online course efforts with significant science behind it. In that profile, Taylor reports the OLI team doesn’t believe they’ll get their cost per course down to anything less than about $500K.
Now I would guess that the cost per course for some entrants into the MOOC space has been in the $500K range, especially those with more robust automated feedback, but I doubt many of the more recent entrants will spend that much. So let’s say it’s more in the range of $250K, an order of magnitude more expensive than even the priciest OCW video courses. Assuming that doesn’t buy you particularly robust automated feedback, is the cost jump worth it? Is it sustainable?
I’ve been taking the Udacity statistics course, and at least through the first unit, the automated feedback consists of a clever integration of boxes for entering answers to questions directly into the video screen. There is no scaffolding or support to help you if you are stuck on a question, and no adaptive elements (that I can discern) to route you through topics depending on your skill level. The delta between the little box that says “You got it right!” or “You got it wrong.” and the answer on a chalkboard or PDF is not that big, but I’m guessing the cost to get there is.
And the cost to get from there to real adaptive learning is likewise a big jump, and not one that I think many schools will be willing to make. So my guess is that many of the universities that are joining Coursera are either investing too much in their course materials or too little. Too little, and it’s not clear the experience will be worth it to students to pay for; too much and the schools may not be able to recover costs.
In a system where nobody–the content provider, the platform provider, or the student–has any obligations to anyone else, the barrier to entry is low–as it is with OCW–and the barrier to exit is also very low. This means they are not likely to convert more than a very small fraction of students to paying customers, and the easiest option for participating schools will be a quick exit.
Will be interesting to watch.
One of the issues I’ve been thinking about as I watch the development of massively scalable courses (I continue to resist calling them MOOCs) is whether or not we are repeating the problems of learning management systems in the new course platforms.
My experience with learning management systems is that because they are tasked with doing so many different things, they don’t do any one thing particularly well. In part this is an issue of development burden–even a big, well-resourced team is hard pressed to keep up on the development of the full suite of tools that educators want to use. The second issue is one of nimbleness. It’s much harder in a big system like an LMS to throw out code and start from scratch on a particular piece of functionality–there’s just too much legacy commitment.
I’m not primarily (or maybe even secondarily) a technical guy, so there may be new approaches to putting platforms together that will mitigate the second issue, but as far as I can see, the first issue only gets worse when it comes to massively scalable courses. Why? The key development that is allowing courses to scale in any meaningful way right now is the new generation of automated assessments such as the circuitry sandbox used for 6.002x.
That tool, as reported in the Globe yesterday, has been under development for many years. The problem is that creating an automated assessment tool of similar complexity in a different field is likely to be a similarly complex undertaking. Building a program that spans a wide range of subjects in a meaningfully scalable way means a similar investment in each field.
In addition to the burden of creating assessments across a range of fields, the massively scalable course platforms are also going to have to create content and learning communities as well, adding to the development burden. An alternative to this that I see is for learners to build experiences by pulling together complementary individual projects. We collaborate, for instance, with OpenStudy to add interactive opportunities to MIT OpenCourseWare content. There’s no reason learners couldn’t choose to use Peer 2 Peer University for the same purpose. And increasingly, there are automated feedback tools such as those at Codecademy emerging that can provide a robust experience.
The most compelling part of the massively scalable course value proposition right now, beyond the learning opportunity itself, is the possibility of credentialing. But it’s not yet clear what the market value of those credentials will be, how much reputational capital participating universities are willing to burn in these efforts, or how efforts like the Mozilla Open Badges Infrastructure will impact credentialing.