Great talk this morning at MIT:
Using Big Data to discover tacit knowledge & improve learning
Ken Koedinger, Professor of Human-Computer Interaction & Psychology, Carnegie Mellon University
CMU Director of LearnLab, Pittsburgh Science of Learning Center
Recent PowerPoint (from a similar talk) with good notes:
“Why is it that science & tech have not improved education as they have medicine and transportation? A root cause is that we, the general public, educators, policy makers, do not fully appreciate the complexity of learning and instruction. Learning is really much more complex than our conscious experience of it would suggest.
Our over-estimated sense that we understand our own learning leads us astray in making educational decisions — it yields a tendency toward the quick fix or one size fits all solution. … The good news is that there is so much S&T can do to better understand learning and to greatly improve instruction. We first need to accept that we do not know what we know!”
Introduced by Lori Breslow, Director of the Teaching and Learning Laboratory at MIT. Ken’s lecture was recommended to her as one of the best lectures about online learning; it was offered as part of the MIT DUET Seminar Series.
Full notes in PDF format, including some pretty bad photos (but taking notes on the iPad & inserting photos into them is so fun!).
Most of what we know is tacit > instruction based on our intuitions about learning is flawed (great argument-overview slide; see if I can get a copy).
Chick sexing – experts can sex chicks, but can’t explain how, & it takes a while to learn (Biederman & Shiffrar, “Sexing Day-Old Chicks”); a half-page of instruction improved the learning curve. E.g. we know English, but we don’t know what we know. Experts can describe less than 30% of what they know > major design implications. Cognitive Task Analysis (Lee 2004 meta-analysis) improves instruction.
Teachers don’t know what they know (e.g. story problem, word problem, equation) – math teachers (& the rest of us) think the story problem is hardest for students, but the equation is (for beginning algebra students). They have trouble with the symbolic language of the equation. >> Expert Blind Spot – not their fault, but problematic for instructional design. (Difficulty Factors Assessment)
From the textbook model to an inductive support model. Algebra Cognitive Tutor: interactive support for learning by doing – within an activity: authentic problems, feedback within complex solutions, challenging questions, personalized instruction (specific to their need at the moment); between activities: progress tracking & individualized selection of the next problem.
Model tracing (like AI plan recognition): represent all correct paths; everything else is an error. Knowledge tracing: assess knowledge growth, driving activity selection & pacing. Use the data you are collecting to measure the effectiveness of the model. Some verified results (some null too); 600k students use it per year, ~80 mins per week. (Used in Algebra, Chemistry, English, games.)
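To make the “knowledge tracing” idea concrete, here is a minimal sketch (mine, not code from the talk or from the Cognitive Tutor itself) of the standard Bayesian Knowledge Tracing update: after each observed step, the estimated probability that the student knows the skill is revised, and that estimate is what can drive problem selection. The slip/guess/learn parameter values below are made up for illustration.

```python
def bkt_update(p_known, correct, p_slip=0.1, p_guess=0.2, p_learn=0.15):
    """Return the updated P(skill known) after one observed attempt."""
    if correct:
        # P(known | correct answer): knew it and didn't slip, vs. guessed
        evidence = p_known * (1 - p_slip)
        posterior = evidence / (evidence + (1 - p_known) * p_guess)
    else:
        # P(known | incorrect answer): knew it but slipped, vs. didn't know
        evidence = p_known * p_slip
        posterior = evidence / (evidence + (1 - p_known) * (1 - p_guess))
    # Account for the chance of learning on this practice opportunity
    return posterior + (1 - posterior) * p_learn

# Example: a student starts at P(known) = 0.3 and answers right, wrong, right
p = 0.3
for outcome in (True, False, True):
    p = bkt_update(p, outcome)
    print(round(p, 3))
```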
EdTech + wide use = Research in practice
Studies run for a couple of weeks, each designed to test a specific item/change.
Interaction data is surprisingly revealing (Worcester Polytechnic; a toy prediction sketch follows the list below):
- accurately predict MCAS scores
- detect student work ethic, engagement
- discover better models of what’s hard to learn
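As an entirely made-up illustration of the kind of prediction mentioned above: a few interaction features per student (hint requests, error rate, time per step) fed to a simple classifier to predict an external test outcome. The feature names, numbers, and labels are invented; the actual Worcester Polytechnic models are far richer than this.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [hint requests per problem, error rate, median seconds per step]
X = np.array([
    [0.2, 0.10, 25],
    [1.5, 0.40, 60],
    [0.4, 0.15, 30],
    [2.0, 0.55, 75],
    [0.1, 0.05, 20],
    [1.2, 0.35, 55],
])
# 1 = passed the external exam, 0 = did not (toy labels)
y = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)
# Predicted pass probability for a new student's interaction profile
print(model.predict_proba([[0.8, 0.25, 40]])[0, 1])
```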
Analysis of the Open Learning Initiative data set from a stats course. In algebra, the real challenge is learning to model the problem, not solving the equations.
“Sciences of the Artificial”: the task, not the cognitive process, drives the learning. The inherent difficulty of the task is not obvious, so instruction is misguided.
Need to design the instruction to get at the underlying tacit knowledge: Learning Factors Analysis (a sketch of its core model follows the results below).
- Traditional college course => 100 hours, ~3% learning gain
- Adaptive data-driven course => <50 hours, ~18% learning gain
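For what Learning Factors Analysis actually fits, here is a hedged sketch (my illustration, not from the talk) of the Additive Factor Model at its core: the log-odds of getting a step correct is the student’s proficiency plus the skill’s easiness plus a per-skill learning rate times the number of practice opportunities so far. Parameter values are illustrative only.

```python
import math

def afm_p_correct(theta_student, beta_skill, gamma_skill, opportunities):
    """P(correct) under the Additive Factor Model for one student/skill."""
    logit = theta_student + beta_skill + gamma_skill * opportunities
    return 1 / (1 + math.exp(-logit))

# A hard skill (negative easiness) that is learned fairly quickly (high gamma):
for opp in range(6):
    p = afm_p_correct(theta_student=0.2, beta_skill=-1.0,
                      gamma_skill=0.4, opportunities=opp)
    print(opp, round(p, 2))
```

In Learning Factors Analysis, alternative labelings of which skill each step exercises are compared by how well a model like this fits the observed learning curves, which is how hidden difficulties (like “modeling the problem” in algebra) get discovered.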
Experts in the domain/field need to agree in order to make this all possible.