Cognitive Load Theory – Is it just a Load?

by Gary Woodill on August 5, 2010

Note: This item has been cross-posted to Workplace Learning Today (Aug. 6, 2010), where I also blog.

Recently, the well-known author Jane Bozarth wrote an article, “Brain Bandwidth – Cognitive Load Theory and Instructional Design,” in Learning Solutions Magazine. The article’s essential argument is that “there is only so much new information the brain can process at one time.” This idea is called Cognitive Load Theory, and it is central to the work of such prominent researchers as Richard Mayer and Ruth Clark. One of the main pieces of evidence for the theory is a 1956 article by George Miller, who “suggested that the largest number of discrete pieces of information the brain could manage was seven, plus or minus 2.”

Stephen Downes, in OLDaily, points to Jane’s article, but makes this critical comment: “I think cognitive load theory misrepresents how we acquire and store information. It supposes that information is atomic and symbolic, like a string of numbers.”

I have to agree with Stephen on this point.

In fact, the idea of a limited capacity of the brain to memorize a list of items goes way back to Hermann Ebbinghaus, a philosophy instructor at the University of Berlin in the 1880s. How he came up with this idea is documented in Frank Smith’s 1998 critique of learning theory, The Book of Learning and Forgetting, published by Teachers College Press. I quote at some length:

How could anyone make comparisons on any aspect of learning when people are so different, especially in the two things that make learning possible for anyone (according to the classic point of view) – interest and past experience? In the revealing language of science, interest and past experiences “contaminate” experiments and “invalidate” results. People who have a great interest in the topic or activity, and who have had a greater experience of it, are bound to learn more. And they ruin experiments. What experiments need is a method of control… so that the learning task is fundamentally the same for everyone.

…This was Ebbinghaus’s world-changing revelation: if you want to study how people learn without the involvement of interest and past experience — study how they learn nonsense. By definition, no one is interested in anything that makes no sense to them, and by definition, nothing in past experience can help anyone learn nonsense.

Ebbinghaus invented the nonsense syllable, a staple of psychological research ever since. He also described “the learning curve”, the observation that the ability to memorize nonsense syllables drops off at around 10 items, and “the forgetting curve”, the observation that memory for most of the nonsense syllables fades within a few hours.

As Kurt Danziger (one of my professors at York University in the late 1960s) points out in his book Naming the Mind: how psychology found its language (Sage, 1997), Ebbinghaus “defined memory in terms of the work of memorizing and not in terms of the experience of remembering. In this context ‘learning’ was used as a synonym for memorizing, and experimental investigations were designed to answer questions about the relative efficiency of different techniques of learning.” In other words, before Ebbinghaus, the word learning had several meanings in psychological, biological, and philosophical writings, but after him, at least in North American psychological literature, learning became synonymous with memorizing.

Rereading George Miller’s original article shows that he was talking about a limited kind of task – the ability to discriminate among different audio tones (also a nonsense task). Beyond about seven different tones, people start to make many more mistakes. But he also suggests several ways of overcoming this seeming limit on working memory:

It seems that by adding more dimensions and requiring crude, binary, yes-no judgments on each attribute we can extend the span of absolute judgment from seven to at least 150. Judging from our everyday behavior, the limit is probably in the thousands, if indeed there is a limit. In my opinion, we cannot go on compounding dimensions indefinitely. I suspect that there is also a span of perceptual dimensionality and that this span is somewhere in the neighborhood of ten, but I must add at once that there is no objective evidence to support this suspicion.
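As a rough gloss on the arithmetic in that passage (my own illustration, not Miller’s derivation): if each of n stimulus attributes supports only a crude binary, yes-no judgment, then n such attributes can distinguish at most 2^n combinations, and seven or eight attributes already land in the neighborhood of his figure of 150:

\[
\underbrace{2 \times 2 \times \cdots \times 2}_{n\ \text{binary judgments}} \;=\; 2^{n},
\qquad 2^{7} = 128, \qquad 2^{8} = 256.
\]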

Miller’s frank speculation hardly seems to be hard-nosed science, yet his article is routinely cited as “research” supporting “evidence-based learning.” We need to examine our concepts carefully and critically, and move away from research into nonsense as the basis of our instructional designs. (GW)

Nuts and Bolts: Brain Bandwidth – Cognitive Load Theory and Instructional Design | Learning Solutions Magazine | Jane Bozarth | 3 August 2010
