Posts

The Debunker Club is hosting a social get-together on Tuesday, May 21st, at the ATD Conference in Washington, DC.

2 to 3 PM, during the Ice Cream break!

All are welcome!

RSVP and details here: https://www.evite.com/event/01D5RFTEEI35E4LKKEPJME62P5C7MY/rsvp

 

On November 2nd, 2017, the Debunker Club sponsored a one-hour Twitter debate using the hashtag #DebunkDebate. We had a wonderful, cacophonous dialogue in typical Twitter-chat fashion.

The file below contains all the tweets from the debate.

Download Great 70-20-10 Debate Tweet Stream

 

Also, Cara North posted a prettier version here.

 


Myth:

To become a preeminent expert, it takes 10,000 hours of practice.

Description:

Ever since Malcolm Gladwell’s book Outliers, which has sold north of 2 million copies, the “10,000 Hour Rule” has become well known in the learning and education fields. While Gladwell popularized the meme, the idea clearly did not originate with him.

Nevertheless, it is Gladwell’s synopsis that has traveled through the information universe. Here are some of the major parameters of Gladwell’s description:

  • It takes everyone, in every field, 10,000 hours to become a preeminent expert.

 

Strength of Evidence Against

Gladwell is certainly correct that expertise requires intense long-term practice. However, the notion that it always takes 10,000 hours is certainly wrong. As researchers Anders Ericsson and Robert Pool have written in their book, Peak: Secrets from the New Science of Expertise, “Unfortunately, this rule — which is the only thing that many people today know about the effects of practice — is wrong in several ways. (It is also correct in one important way, which I will get to shortly.)” (p. 110).

  • Some experts take longer, others less time, to reach expertise.
  • Different fields require different amounts of practice.
  • Not all practice is created equal. It is only “deliberate practice” that enables expertise.
  • Not all people who practice for 10,000 hours will become experts.
  • To become an expert, it does take intensive, intentional, well-designed practice over many years.

Note: Deliberate practice “involves constantly pushing oneself beyond one’s comfort zone, following training activities designed by an expert to develop specific abilities, and using feedback to identify weaknesses and work on them.” From Ericsson and Pool article in Salon.

Notes on Deliberate Practice

While the deliberate-practice notion has received widespread research support, it should not be interpreted to suggest that deliberate practice is all you need to become good. Nor should it be interpreted to mean that deliberate practice has the same impact in every field of endeavor. Indeed, a recent meta-analysis suggested that deliberate practice may be more potent in some fields than in others.

 

Debunking Resources — Text-Based Web Pages

 

Debunking Resources — Audio Podcasts

 

 

One never knows what might happen when one declares something to the world and asks for volunteers. In May, The Debunker Club raised the flag and declared June 2015 to be DEBUNK LEARNING STYLES MONTH. With only a few days left in our first such effort, we've seen many tweets, much cheerleading, and likely many personal reflections. We've also seen members and others post their myth-busting efforts on blogs, LinkedIn, Scoop It, Pinterest, etc.

Here's a short list (THANKS TO THE DEBUNKERS!):

If you've seen other debunking efforts, please post here (in the COMMENTS below), AND at our sightings page.

 

Myth:

Discovery learning (also known as problem-based learning, inquiry learning, experiential learning and constructivist learning) hypothesizes that people learn best in an unguided or minimally-guided environment. That is, they learn best, NOT when they are presented with essential information, but when they discover or construct essential information for themselves.

 

Description:

A popular premise for this myth is that learning to solve problems is of utmost importance (which is indisputable) and that in order to achieve this goal we must use problem solving as the primary instructional method (which is not only disputable, but also misguided). Another, related premise is that the discovery of new facts and relationships through exploration and experimentation is of utmost importance in science and that in order to educate scientific thinkers we must use discovery learning also as an instructional method.

The discovery-learning myth has been adopted all around the world, by both educators and laypeople, primarily because discovery learning sounds so logical (in theory). Unfortunately, there is an incredibly large corpus of research showing that (1) minimally guided methods tax the learner’s cognitive resources to such a large extent that learning is impeded, (2) solving problems in a domain requires first and foremost knowledge of/in that domain, (3) solving problems without the necessary prerequisite domain knowledge is difficult if not impossible and often leads to warped/twisted ‘knowledge’, misconceptions, and poor/weak problem-solving approaches, and (4) while inquiry is a fundamental method of scientists (because a scientist is someone who ‘knows’ and who is in search of new knowledge), it is not a good learning method for most learners because most learners are not ‘junior scientists’. They simply don’t know enough to do good inquiry.

 

Strength of Evidence Against

The strength of evidence against the use of discovery learning is very strong. To put it simply, using minimally guided approaches does not lead to effective or efficient learning. Moreover, it does not lead to better problem solving or learning to solve problems. While it may be true that some learners eventually learn through minimally-guided instruction, the bottom line is that when designing instruction for learning, there are far better learning methods to use than discovery learning.

The use of discovery-learning methods completely ignores human working memory limitations (Kirschner, Sweller, & Clark, 2006; Sweller, 1988, 1999; van Merriënboer & Sweller, 2005, 2010) and any instructional procedure that ignores the structures that constitute human cognitive architecture will not be effective. The consequences of requiring novice learners to search for problem solutions using a limited working memory appear to be routinely ignored.

For novice learners, discovery learning should never be the primary instructional method employed, though it might be a long-term goal—preparing learners to handle increasingly difficult problems. Effective educational methods should carefully and gradually help learners move towards this goal. First, learning designs should help learners gain knowledge about the learning domain, because new relationships can only be discovered when you know enough to know what to look for. Second, such methods should help learners develop skills and cognitive strategies for systematically exploring and experimenting in the domain, using the rules-of-thumb that are useful in that particular domain. And finally, such methods should provide support and guidance during the discovery process, and only decrease support and guidance as learners gain more expertise and can actually discover new insights and/or connections on their own (van Merriënboer & Kirschner, 2007).

There are situations where discovery or problem solving can be used, and that is when the learner has gained a good deal of experience in a topic area. This is what Kalyuga calls the expertise reversal effect, in which an approach that works well for experts does not work at all, or is even harmful, for novices, and vice versa. However, most learners are just that: learners (i.e., novices in an area) with little or no prior knowledge or experience. The mistake here is assuming that learners will learn key problem-solving skills on their own when presented with problems early in the learning process. What happens, as the research clearly demonstrates, is that most learners flounder.

Overall, the evidence shows that learning and instructional professionals should NOT use discovery-based learning methods as a way to design learning experiences, except perhaps in the rare instances where the learners are highly experienced with the targeted topic. Minimal instructional guidance leads to minimal learning (Kirschner, Sweller, & Clark, 2006).

 

Debunking Resources — Text-Based Web Pages

 

Debunking Resources — Videos

 

Debunking Resources — Newspapers & Magazines

  • None that we know of…

 

Debunking Resources — Peer-Reviewed Scientific Articles

  • Alfieri, L., Brooks, P. J., Aldrich, N. J., & Tenenbaum, H. R. (2011). Does discovery-based instruction enhance learning? Journal of Educational Psychology, 103(1), 1-18.
  • Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86.
  • Klahr, D. & Nigam, M. (2004). The equivalence of learning paths in early science instruction: Effects of direct instruction and discovery learning. Psychological Science, 15, 661-667.
  • Mayer, R. E. (2004). Should There Be a Three-Strikes Rule Against Pure Discovery Learning? American Psychologist, 59(1), 14-19.

 

Debunking Resources – Supporting Research

  • Kalyuga, S. (2007). Expertise reversal effect and its implications for learner-tailored instruction. Educational Psychology Review, 19, 509–539.
  • Kalyuga, S., Ayres, P., Chandler, P., & Sweller, J. (2003). The expertise reversal effect. Educational Psychologist, 38, 23-31.
  • Kirschner, P. A. (1992). Epistemology, practical work, and academic skills in science education. Science and Education, 1, 273-299.
  • Kirschner, P. A. (2009). Epistemology or pedagogy, that is the question. In S. Tobias & T. M. Duffy (Eds.), Constructivist instruction: Success or failure? (pp. 144-157). New York: Routledge.
  • Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12, 257-285.
  • Sweller, J. (1999). Instructional design in technical areas. Camberwell, Australia: ACER Press.
  • Van Merriënboer, J. J. G., & Sweller, J. (2005). Cognitive load theory and complex learning: Recent developments and future directions. Educational Psychology Review, 17, 147-177.
  • Van Merriënboer, J. J. G., & Sweller, J. (2010). Cognitive load theory in health professional education: Design principles and strategies. Medical Education, 44, 85-93.

Special thanks to Paul Kirschner for helping with this post.

 

Myths:

There are many: that learning objectives presented to learners (a) must follow Mager’s recommendation to include three separate parts (i.e., performance, conditions, criteria), (b) can use words that are not salient in the learning material, (c) can be presented long before learners encounter the learning material, (d) must use action verbs (and cannot use the word “understand”), and (e) must always be presented to learners before they begin instruction.

Description:

Presenting learning objectives to learners at the beginning of a lesson has become de rigueur in both the education and training worlds. Unfortunately, there is much confusion about why this is done and how to do it effectively. Research on goal-setting has found general benefits, but such goal-setting effects have never been tested with regard to learning objectives.

The research that has been done on learning objectives has shown that presenting learners with learning objectives produces benefits because it helps learners focus attention on the targeted aspects of the learning material (Rothkopf & Billington, 1979). To be more specific, if a learning objective targets Concept X, then learners are more likely to pay attention to aspects of the learning material that are relevant to Concept X, and are less likely to pay attention to aspects of the learning material not relevant to Concept X.

Given that attention is what drives the benefits of learning objectives, several common practices can be called into question. For one thing, it doesn’t help learners to present them with a three-part learning objective — it simply distracts them from focusing on the main points. As Hamilton (1985, p. 78) wrote, “[An instructional] objective that generally identifies the information to be learned in the text will produce robust effects. Including other information (per Mager’s, 1962, definition) will not significantly help and it may hinder the effects of the objectives.”

Another common practice is writing learning objectives with very general wording (for example, “You will learn how to champion a change effort”). Unfortunately, research has shown that specifically-worded learning objectives produce effects while generally-worded learning objectives produce zero or weak effects (Rothkopf & Kaplan, 1972; Britton, Glynn, Muth, & Penland, 1985). The words in the learning objective have to be salient and they have to be words that will be encountered in the learning material.

There is no need to use action verbs in learning objectives presented to learners. It is okay to use the word “understand” as well. The action verbs don’t help guide attention and the word “understand” doesn’t distract.

Learning objectives, when viewed by learners, are integrated into long-term memory. Just like all memory traces, they fade with time. Therefore, learning objectives presented to learners too far from the time when the targeted concepts are encountered are unlikely to trigger attentional processing. For example, Kaplan (1974) found that learning objectives interspersed throughout learning material were more effective than learning objectives presented at the beginning.

Finally, given that we know that learning objectives presented to learners create their advantages by helping learners pay extra attention to the targeted information in the learning materials, we must conclude that presenting learning objectives to learners is not strictly necessary. That is, we don’t have to present them to learners because we have other ways to guide learner attention to critical information.

Why has there been so much confusion? In the training-and-development field the biggest problem is that we have confused objectives for learners with objectives for learning professionals. Where learning professionals need objectives that focus on behaviors, conditions, and criteria, learners gain a real advantage when the learning objectives help guide attention. Unfortunately, somewhere a long time ago we got the idea that our learning objectives should be used for both learners and learning professionals. To help disambiguate this, Thalheimer (2006) suggested calling learning objectives that were presented to learners “focusing objectives” because they helped learners focus on the targeted information.

Strength of Evidence Against

The strength of evidence against the use of Mager-like objectives is very strong, as evidenced in Hamilton’s research review. The strength of evidence for the importance of word specificity is also strong. The evidence for the importance of keeping the objectives close in time to the subsequent learning material is suggestive, but not many studies have covered this.

 

Debunking Resources — Text-Based Web Pages

 

Debunking Resources — Videos

 

Debunking Resources — Newspapers & Magazines

  • None that we know of…

 

Debunking Resources — Peer-Reviewed Scientific Articles

  • Britton, Glynn, Muth, & Penland (1985). Instructional objectives in text: Managing the reader’s attention. Journal of Reading Behavior, 17, 101-113.
  • Hamilton, R. J. (1985). A framework for the evaluation of the effectiveness of adjunct questions and objectives. Review of Educational Research, 55, 47-85.
  • Kaplan, R. (1974). Effects of learning prose with part versus whole presentations of instructional objectives. Journal of Educational Psychology, 66, 448-456.

  • Mager, R. (1962). Preparing Instructional Objectives. Palo Alto, CA: Fearon Publishers.
  • Rothkopf, E. Z., & Billington, M. J. (1979). Goal-guided learning from text: Inferring a descriptive processing model from inspection times and eye movements. Journal of Educational Psychology, 71(3), 310-327.
  • Rothkopf, E. Z., & Kaplan, R. (1972). Exploration of the effect of density and specificity of instructional objectives on learning from text. Journal of Educational Psychology, 63, 295-302.

 

Myth:

People forget at predictable rates regardless of other factors. For example, “People forget 40% of what they learned in 20 minutes and 77% of what they learned in six days.” “People forget 90% after one month.” “People forget 50-80% of what they’ve learned after one day and 97-98% after a month.”

Description:

There are many examples of claims that people forget X amount in Y time. These claims are all over the place, but each asserts a universal truism without regard to the knowledge level of the learner, the content of the learning concepts, the emotional salience of the knowledge or skill, the type of knowledge or skill, or the learning methods employed. These claims are often framed in terms of a forgetting curve. The most widespread claims are based on Hermann Ebbinghaus’s work from the 1800s.

Strength of Evidence Against

The strength of evidence against the claim of predictable forgetting rates is overwhelming and conclusive. Any perusal of the scientific research will find examples of forgetting curves that vary widely. Indeed, learning research that measures learners more than once is designed with the very premise that forgetting rates will vary; otherwise, why would the study be done in the first place? People certainly forget, but they forget at different rates depending on many factors.
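To see why a single universal forgetting rate is implausible, consider the simple exponential model often used to approximate Ebbinghaus-style forgetting curves, R(t) = e^(−t/s), where the stability s summarizes all the factors the myth ignores. The sketch below is illustrative only; the stability values are hypothetical and not drawn from any study:

```python
import math

def retention(t_days: float, stability: float) -> float:
    """Exponential forgetting model R(t) = exp(-t / s).

    `stability` (s, in days) stands in for everything that varies
    across learners and material: prior knowledge, emotional salience,
    type of skill, and the learning methods employed.
    """
    return math.exp(-t_days / stability)

# Hypothetical stabilities for two very different learning situations.
for label, s in [("weakly learned nonsense syllables", 0.5),
                 ("well-practiced job skill", 30.0)]:
    r1, r30 = retention(1, s), retention(30, s)
    print(f"{label}: {r1:.0%} retained after 1 day, {r30:.0%} after 30 days")
```

Even within this one toy model, the "percent forgotten after one day" swings from a few percent to the vast majority of the material depending on s, which is exactly why any claim of a fixed forgetting percentage cannot be universally true.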

 

Debunking Resources — Text-Based Web Pages

 

Debunking Resources — Videos

  • None that we know of…

 

Debunking Resources — Newspapers & Magazines

  • None that we know of…

 

Debunking Resources — Peer-Reviewed Scientific Articles as Examples

  • Corazzini, L. L., Thinus-Blanc, C., Nesa, M.-P., Geminiani, G. C., & Péruch, P. (2008). Differentiated forgetting rates of spatial knowledge in humans in the absence of repeated testing. Memory, 16(7), 678-688.
  • Wheeler, M. A., Ewers, M., & Buonanno, J. F. (2003). Different rates of forgetting following study versus test trials. Memory, 11(6), 571-580.

 

Debunking Resources — Research Reviews

  • None that we know of…

 


Myth:

By diagnosing learners based on their learning styles, and then using that diagnosis to guide instruction, learning will be improved.

Description:

Probably today’s most ubiquitous learning myth is that people have different learning styles and that these learning styles can be diagnosed and used in learning design to create more effective learning interventions. This myth has resonated and spread throughout the world’s learning-professional community probably because it hints at an idea that seems sensible — that people learn differently. Unfortunately, there are dozens and dozens of ways to separate people by type, so it’s hard to know which distinctions to use for which learner, for which topics, for which situations. More importantly, the research evidence shows clearly that using learning styles in designing/deploying learning does not reliably improve learning results.

Strength of Evidence Against

The strength of evidence against the use of learning styles is very strong. To put it simply, using learning styles to design or deploy learning is not likely to lead to improved learning effectiveness. While it may be true that learners have different learning preferences, those preferences are not likely to be a good guide for learning. The bottom line is that when we design learning, there are far better heuristics to use than learning styles.

Even in terms of taking learners’ individual differences into account, there are better guideposts. For example, probably the most important individual difference is learner knowledge of the specific concepts being taught. Good instructors know that one of the most critical things they can do is to diagnose their learners’ conceptual understanding, and deliver instruction appropriate to their level of understanding.

Despite scientific research reviews that debunk learning styles, research is still being done to support the learning-styles idea. As Furnham (2012) wrote: “The application of, and research into, learning styles and approaches is clearly alive and well” (p. 77). For example, a recent research review of fifty-one scientific studies looked at whether learning styles might be an effective way to design adaptive e-learning systems (Truong, 2015).

  • Truong, H. M. (2015). Integrating learning styles and adaptive e-learning system: Current developments, problems and opportunities. Computers in Human Behavior. Advance online publication. http://dx.doi.org/10.1016/j.chb.2015.02.014

The weight of evidence at this time suggests that learning professionals should avoid using learning styles as a way to design their learning events. Still, research has not put the last nail in the coffin of learning styles. Future research may reveal specific instances where learning-style methods work. Similarly, learning preferences may be found to have long-term motivational effects.

Debunking Resources — Text-Based Web Pages

 

Debunking Resources — Videos

 

Debunking Resources — Newspapers & Magazines

 

Debunking Resources — Peer-Reviewed Scientific Articles

  • Klitmøller, J. (2015). Review of the methods and findings in the Dunn and Dunn learning styles model research on perceptual preferences. Nordic Psychology, 67(1), 2-26. http://dx.doi.org/10.1080/19012276.2014.997783
  • Pashler, H., McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning styles: Concepts and evidence. Psychological Science in the Public Interest, 9(3), 105-119.
  • Rohrer, D., & Pashler, H. (2012). Learning styles: Where’s the evidence? Medical Education, 46(7), 634-635.
  • Willingham, D. T., Hughes, E. M., & Dobolyi, D. G. (2015). The scientific status of learning styles theories. Teaching of Psychology, 42(3), 266-271. http://dx.doi.org/10.1177/0098628315589505

 

Debunking Resources — Research Reviews

 

Myth-Supporting or More-Neutral Research Reviews

  • Furnham, A. (2012). Learning styles and approaches to learning. In K. R. Harris, S. Graham, T. Urdan, S. Graham, J. M. Royer, & M. Zeidner (Eds.), APA handbooks in psychology. APA educational psychology handbook, Vol. 2. Individual differences and cultural and contextual factors (pp. 59-81). doi:10.1037/13274-003.


Myth:

People remember 10% of what they hear, 20% of what they read, 30% of what they see, and so on (and variants thereof, including when these numbers are placed on Edgar Dale’s Cone of Experience).

Description:

One of the most ubiquitous learning myths is that people remember a certain percentage of what they have learned depending on the perceptual modality or activity they engaged in to learn. So for example, it has been claimed that people remember 10% of what they read, 20% of what they hear, 30% of what they see, 50% of what they see and do, 70% of what they say, and 80% of what they do and say. There are many, many variants of these numbers, but they are all untrue and misleading.

Note that Edgar Dale never used numbers on his Cone of Experience. Moreover, he saw his model as one that described reality, not as one to guide the design of learning.

Strength of Evidence Against

The strength of evidence against the percentages is extremely strong; to the point that there is virtually zero chance that these numbers are correct. Moreover, there are far better resources that can be used to guide learning design than these bogus percentages, which even if they were correct, would not be granular enough to effectively guide learning-design decisions.

Debunking Resources — Text-Based Web Pages

 

Debunking Resources — Videos

  • None that we know of…

Debunking Resources — Newspapers & Magazines

  • None that we know of…

Debunking Resources — Scientific Articles

 

  • Subramony, D., Molenda, M., Betrus, A., and Thalheimer, W. (2014). The Mythical Retention Chart and the Corruption of Dale’s Cone of Experience. Educational Technology, Nov/Dec 2014, 54(6), 6-16.
  • Subramony, D., Molenda, M., Betrus, A., and Thalheimer, W. (2014). Previous Attempts to Debunk the Mythical Retention Chart and Corrupted Dale’s Cone. Educational Technology, Nov/Dec 2014, 54(6), 17-21.
  • Subramony, D., Molenda, M., Betrus, A., and Thalheimer, W. (2014). The Good, the Bad, and the Ugly: A Bibliographic Essay on the Corrupted Cone. Educational Technology, Nov/Dec 2014, 54(6), 22-31.
  • Subramony, D., Molenda, M., Betrus, A., and Thalheimer, W. (2014). Timeline of the Mythical Retention Chart and Corrupted Dale’s Cone. Educational Technology, Nov/Dec 2014, 54(6), 31-34.
  • Jackson, J. (2016). Myths of Active Learning: Edgar Dale and the Cone of Experience. Journal of the Human Anatomy and Physiology Society, 20(2), 51-53.