I am a researcher and consultant in cognitive science and online education. My academic work consists of cognitive science research to understand how people learn – through experiments or A/B tests, construction of assessments, and statistical modeling. My practical work consists of consulting to improve and evaluate learning from online educational resources – like EdX videos, K-12 text lessons, and Khan Academy's interactive math exercises.
In this blend of scientific research and application, I draw on theory and methodology from my own research; syntheses of scientific findings from cognitive science, education, and behavior-change research; reviews of evidence-based best practices for teaching and learning; practical experience as a statistics tutor; evaluations of educational-technology products and authoring tools for e-learning; and experience as a science and technology mentor for startups in a Haas Business School entrepreneurship class.
I am currently director of the Learning, Education And Research Network (LEARN) and a member of the Project for Education Research that Scales (PERTS) at Stanford, and I will receive my PhD from UC Berkeley's Psychology Department in May 2013 (where I worked with Tania Lombrozo and Tom Griffiths).
My work on online educational resources allows me to combine basic cognitive science research – using experiments and computational modeling to understand how people learn – with applied research that improves education for real students and designs evidence-based educational web applications. I focus on how answering questions and generating explanations guide people's learning, and on applying learning principles from cognitive science – for example, how adding question and explanation prompts to online videos and interactive exercises enhances understanding and helps people acquire new learning strategies. I am currently running studies that investigate how to promote students' construction of explanations when solving mathematics exercises on www.khanacademy.org (here is an example of a prototype), how to measurably increase students' motivation and grades by improving web lessons that teach students that intelligence is malleable, and how to change metacognitive habits and behaviors in MOOCs through Electronic Performance Support Tools.
My technology enthusiasms include documenting useful software – for online research, collaboration, and knowledge management – and reviewing and teaching people about online education software and programming skills that support rapid authoring of pedagogically sophisticated instruction containing "in vivo" experiments. I also do ed-tech entrepreneurship and consulting, drawing on and synthesizing my own research, and am interested in how the Internet can be used to more broadly disseminate scientific insights about effective pedagogy and technology to both education and industry.
Below is a brief summary of my research, with more information on the research overview page. This website also has pages with my Academic Papers, my CV and brief Resume, my General Talks about applying cognitive science to improve online education, and a hyperlinked list of papers providing the Research-Based Learning Principles my general talks are based on, synthesized from the literature on cognitive science and education. Feel free to contact me with questions or suggestions at joseph_williams AT berkeley DOT edu.
Online Education Research
Carrying out experiments in the context of online education can support rigorous research, because it allows substantial experimental control – random assignment to conditions, precise specification of what is manipulated, and quantifiable measures of learning. At the same time, research that explores learning processes with an eye toward their enhancement possesses a great deal of ecological validity, and permits iterative improvement of online educational environments. Instead of doing lab experiments and then following a costly process to extend the results to a physical classroom, online education allows for in vivo studies: Experimental and control conditions are simply different instructional strategies, stimuli are educational materials students use every day, and dependent measures are formal assessments.
Bridging laboratory research and actual practice has always been challenging, but online education presents an unprecedented context in which to do this.
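To make the in-vivo setup above concrete, here is a minimal sketch of how such an online experiment might be wired up: students are deterministically randomized to one of two instructional strategies, and the dependent measure is simply their score on the course's own assessment. All names (conditions, user IDs, scores) are hypothetical illustrations, not code from any actual platform.

```python
import hashlib

# Experimental and control conditions are just different instructional
# strategies (hypothetical labels for illustration).
CONDITIONS = ["explanation_prompt", "no_prompt_control"]

def assign_condition(user_id: str, experiment: str) -> str:
    """Deterministic random assignment: hash the (experiment, user) pair so
    each student always sees the same condition on every visit."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return CONDITIONS[int(digest, 16) % len(CONDITIONS)]

def mean_score(scores):
    """Dependent measure: average score on the course's formal assessment."""
    return sum(scores) / len(scores)

# Hypothetical assessment results collected per condition.
results = {
    "explanation_prompt": [0.82, 0.75, 0.90],
    "no_prompt_control": [0.70, 0.66, 0.74],
}
effect = mean_score(results["explanation_prompt"]) - mean_score(results["no_prompt_control"])
```

The hashing trick is one common way to get stable random assignment without storing an assignment table; in practice one would also log exposures and run a proper statistical test on the assessment scores rather than just comparing means.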
Explanation & Learning Research
Generating explanations has been shown to improve learning (e.g. Fonseca & Chi, 2011), and has great promise as a tool in online education. Instructors provide guidance through the questions they ask, while learners still construct the knowledge themselves, and can learn even without feedback. Beyond educational settings, people constantly wonder "why?", such as why objects and people belong to certain categories or why others behave the way they do. Generating and evaluating explanations guides an individual's causal reasoning, categorization, and property induction, promotes learning and transfer in educational settings, and drives conceptual development in children.
My research has proposed a subsumptive constraints account: explaining "why?" drives people to seek underlying generalizations – to understand how the fact or observation being explained could be anticipated as an instance of a broader pattern. For example, explaining why 2 x 6 = 12 invokes the principle that multiplication is repeated addition, and an explanation like "John is a teacher because he's a caring person" appeals to a regularity – that caring people are more likely to become teachers. I have evidence for four predictions of this account:

1. Counterintuitively, explanation's subsumptive constraint can impair learning when it promotes the use of misleading patterns (Williams, Lombrozo, & Rehder, 2013).

2. Seeking explanations does not simply boost pattern discovery, but particularly promotes the discovery of broad, unifying patterns that account for a range of facts (Williams & Lombrozo, 2010).

3. Explaining increases learners' consultation of their prior knowledge to identify and privilege those patterns that prior beliefs suggest are likely to generalize to novel contexts (Williams & Lombrozo, 2013).

4. Explaining is at times necessary for learning from anomalous observations (those that conflict with prior beliefs), as it drives people towards broader generalizations (Williams, Walker, & Lombrozo, 2012).

These results have been extended to children as young as five (Walker, Williams, Lombrozo & Gopnik, 2012).