Discovery Learning: A Journey of Discovery (Updated)

"Discovery-based learning doesn't work!" shouted the middle-aged high school mathematics teacher. Looking back at my twenty-eight-year-old self, I was brimming with edtech know-how and expertise. Working with a lab full of educators, I was quite sure he was wrong. I'd facilitated plenty of "discovery-based" workshops where we celebrated what adults knew, created opportunities for them to learn more, and then shared it.

Updated (2 mins after publishing this): I added links to Jerome Bruner's definition of discovery learning and an image that prompted the "It's not obvious" section below.

But he didn't care. In the end, I had to invite him to leave my workshop. For over 22 years, I believed he was wrong. Sure, he could have handled it better. Instead of being an angry middle-aged guy (I know the feeling, since I occasionally am one) whose cheese had just been moved, he could have approached the workshop differently. But I realize now that after teaching secondary math all day, working through a "discovery approach" to technology may not have been what he was looking forward to.

It's NOT Obvious

This isn't obvious to most teachers. I'm sure there are people who get it right out of the starting gate, but there's so much pop research out there that it's easy to fall for it. I'm leery of pundits who say, "This is obvious; teachers should know this already." Well, if you didn't get it in college, and no one covered it in inservices (ugh), then how were you supposed to learn the latest research?

It's easy to get into an argument on Twitter. What's harder is explaining what you think. Picking an argument, though, wasn't my intent when I responded to @EduCelebrity while going to dinner and picking up essentials. In fact, my tweet in response was a bit off the cuff. Dean Shareski decided it merited a serious response, and I agree. To get you caught up, here's the original series of tweets; you can sense my mounting irritation at trying to respond coherently via my smartphone keyboard.

Before I go any further, let me say that Dean's remark is on target. He makes the point that ANY high effect size instructional strategy, as originally outlined by John Hattie in Visible Learning (2009), needs to be applied with care.

Direct instruction (.59) is a valuable strategy that is useful when introducing students to new ideas. Problem-based learning (.35 effect size) and discovery-based teaching (.21 effect size) are best used when students need to apply what they have learned and conceptualized to novel situations. Use them too early, and you don't get the results you want.

Let's explore that a bit more.

A Quick Aside

In this blog entry, I'm going to lump inquiry-based learning, PBL, and discovery learning into the same family. Jerome Bruner describes discovery learning in this way:

"Discovery learning is an inquiry-based, constructivist learning theory that takes place in problem solving situations where the learner draws on his or her own past experience and existing knowledge to discover facts and relationships and new truths to be learned[1]. Students interact with the world by exploring and manipulating objects, wrestling with questions and controversies, or performing experiments. 
As a result, students may be more likely to remember concepts and knowledge discovered on their own (in contrast to a transmissionist model)[2]. Models that are based upon discovery learning include: guided discovery, problem-based learning, simulation-based learning, case-based learning, incidental learning, among others.
Now, let me say that constructivist learning theory is what I was introduced to and using when I started blending technology into my classroom activities. I'd like to point out the following perspective, which is a bit of a shocker. Sit down before you read it.
"Constructivism too often is seen in terms of student-centred inquiry learning, problem-based learning, and task-based learning, and common jargon words include “authentic”, “discovery”, and “intrinsically motivated learning.” 
The role of the constructivist teacher is claimed to be more of a facilitation to provide opportunities for individual students to acquire knowledge and construct meaning through their own activities, and through discussion, reflection and the sharing of ideas with other learners with minimal corrective intervention.... 
These kinds of statements are almost directly opposite to the successful recipe for teaching and learning..." (Source: John Hattie, Visible Learning as cited).

Wow, consider that perspective. Constructivist learning approaches are, and let me emphasize Hattie's words here:

...almost directly opposite to the successful recipe for teaching and learning.
That's a controversial statement. Whether you agree with it or not depends on how much of Hattie's meta-analyses and research you have read. One thing is clear to me, however. Educational technology based on constructivist approaches is IN DECLINE because it does not yield the same results as evidence-based strategies. That's not to say constructivist approaches won't result in some learning; most strategies do have some effect. But they fall short of what is needed in today's classrooms. I tend to agree with Hattie's points below:
"We have no right to teach in a way that leads to students gaining less than d=.40 within a year....One of the more difficult tasks is to convince teachers to change their methods of teaching. So many adopt one method and vary it throughout their career."

When Is As Important As What

When you use a strategy is as important as what strategy you use. That's why Dean's response (that is, "It depends") is on target. Here's a quick overview from an entry I wrote for another blog:

High-effect size strategies fall into several categories. Understanding these may assist you in selecting one.
Approaches that help students’ surface-level learning may not work for deep learning. This is true in reverse (deep learning strategies may not work for surface learning).
Match the right approach with the appropriate phase of learning. – Hattie, Fisher, and Frey (Visible Learning for Mathematics, 2017)
These are organizational categories for educators, but they are more than that to students, for whom they serve as stages of learning. Click each strategy to learn more about it.
As you can see, discovery-based teaching (.21) is not even on the list above. That's because, according to John Hattie's meta-analyses (see the research studies that make them up), discovery-based teaching doesn't even meet the minimum threshold. That threshold? An effect size of .40 represents a normal year's growth in one school year. The higher the effect size, the greater the likelihood that student growth will be accelerated beyond a normal school year.
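A quick aside on what that number actually is: Hattie's effect size is Cohen's d, the difference between two group means divided by a pooled standard deviation. Here's a minimal sketch in Python (the scores below are invented for illustration, not drawn from any study):

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a = sum(group_a) / n_a
    mean_b = sum(group_b) / n_b
    # Sample variances (n - 1 in the denominator)
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (n_a - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (n_b - 1)
    # Pool the two variances, weighted by degrees of freedom
    pooled_sd = math.sqrt(((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical post-test scores: an intervention class vs. a comparison class
intervention = [78, 82, 85, 90, 74, 88, 81, 79]
comparison = [70, 75, 80, 72, 68, 77, 74, 71]

d = cohens_d(intervention, comparison)
```

In this made-up case d comes out well above .40, so the (fictional) intervention would clear Hattie's threshold; a d of .21, by contrast, means the mean difference is only about a fifth of a standard deviation.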

Fortunately, we can distinguish between discovery-based teaching and problem-solving teaching (.68). Let's review a bit:

Discovery based teaching (.21) is described in this way:
A practice in which students formulate clear, testable hypotheses, which they then test in a laboratory or workshop setting through direct experience. Often equated to project-based teaching or play-based teaching.
Problem-based learning (.35) is seen in this way:
In problem-based learning scenarios, students often act in groups and decide what they need to learn to resolve a particular problem or question, while teachers act as facilitators. It usually involves real-world problems to promote student learning of concepts and principles as opposed to direct presentation of facts and concepts. The aim is also to promote critical thinking skills, problem-solving abilities, and communication skills.
Problem-solving teaching (.68) is defined in this way:
Problem-solving involves learning to solve a problem that one does not already know how to solve, and can also involve teaching specific, subject-area focused strategies for attempting to solve such problems.
Considering effect size, you're going to want to use problem-solving teaching (.68); you get a better result from it. However, this .68 strategy is weakened when used during surface learning (that is, when introducing students to concepts) or deep learning (when students work to gain a deeper conceptual understanding).

But use problem-solving teaching when transfer learning is called for, and you have a strategy that accelerates student growth.

Wait, Are You Sure?

Dean makes another point, one I have wrestled with in my gradual acceptance of Hattie's work: we must try to ascertain whether the research behind one approach is stronger than the research behind another.

Direct instruction (.59) is not lecture. It is much more than that, and it does its job well as a surface learning strategy. Is it appropriate for ALL lessons and students? No, of course not.

Students must be introduced to new concepts in a variety of ways. There are many ways to achieve that (as the list of surface learning strategies above shows).

One of my favorite portions of the Visible Learning for Literacy book is this section on page 37:
[One of the authors] recalls working with seasoned teachers to develop new knowledge and skills about problem-based learning (PBL). She held workshops, engaged in professional reading and discussions, and hosted a professional learning community focused on the practice. Yet time and again, the effort fizzled as teachers said it didn't work. They blamed their students' existing knowledge, lack of motivation, and inability to engage in self-directed learning.... 
Yet problem-based learning can work, under the right conditions. However--and this is critical--it isn't particularly effective when students don't yet possess the knowledge, skills, and dispositions needed to engage in an inquiry-driven investigation about a topic. 
In other words, the timing is off. 
PBL is better for deepening knowledge, but not for the initial surface learning needed in advance of such study.
One of the interesting points here is that when done later, at the right time, DBT and PBL become "problem-solving teaching." The authors point out:
We need to match what works to accelerate student learning, then implement it at the right time. . .the evidence is that when PBL is used early in the learning cycle, before students have had sufficient experience with learning the declarative and procedural knowledge needed, the effect size is very low: .15. This is surface-level knowledge, and they just aren't equipped with enough knowledge to pursue inquiry. 
But when problem-solving teaching is employed, the effect size skyrockets. Unlike conventional PBL, where the problem is presented to students in advance of knowledge acquisition, problem-solving teaching is deployed when students are already deepening their knowledge.
That makes for powerful learning at the right time.

Hands-On Experiences

John Colgan jumped into the conversation. He points out the following:

I don't disagree that an engaging anticipatory set is a great way to start off a lesson.

Back to the Past

When I think back to my "discovery learning" workshop, which I have foisted on adult learners ever since to great reviews, I am a bit disappointed. I wish I had known more about the research behind the strategies that Hattie highlighted in 2009. Unfortunately, I didn't. "The time wasted," I complained to a colleague a few months ago as I continued my studies of Hattie's work and its underlying research. "If only I had known."

The truth is, that math teacher tried to tell me. He was right. I was wrong. He will never get the benefit of my new insight, my enlightenment. That aside, @EduCelebrity, when I wrote you that tweet, I did it with a bit of hilarity and foolishness in my heart. Thanks for keeping it funny.

P.S. Hattie Controversy

I did say something similar to what Pᴀᴛᴛʏ Kᴏʟᴏᴅɴɪᴄᴋɪ Eᴅ.D. (@DrKnicki) pointed out:
Yea, Hattie's work, specifically the statistics are flawed. Data from original studies were used incorrectly. The concepts behind the book, awesome, execution, garbage 😬
However, as I explore each of the strategies that work (e.g., reciprocal teaching), it's easy to see the research underlying the effect size. I'm grateful for Corwin's Visible Learning MetaX online database because it makes it easy to find the underlying research. While some may make the same point as Dr. Kolodnicki, I find the research convincing.

Everything posted on Miguel Guhlin's blogs/wikis are his personal opinion and do not necessarily represent the views of his employer(s) or its clients. Read Full Disclosure


Dean Shareski said…
Thanks for writing this. I didn't really mean for you to get that worked up about it, but have been concerned a bit over Hattie's seemingly unquestioned references. You outline some important distinctions here.
