Inquiry vs. Explicit: Is there even a difference?

I teach physics via explicit instruction, right?

My physics classes generally follow an ‘I do’, ‘You do’ format. We do weekly tests and, insofar as I can manage it, everything that I ask students to do is highly structured and knowledge focussed. I think it’s fair to label this ‘explicit instruction’ and, as such, I’ve often felt a sense of validation when reading articles like this one by Paul Kirschner, which claims that PISA data demonstrate that inquiry-based instruction is no match for explicit instruction. Kirschner also suggests that, despite clear evidence, PISA doesn’t want to accept its own conclusions:

“Although this distinction is very clearly shown in the figure, the PISA report is very hesitant with regards to interpreting these results. It seems as if the authors can’t bear the thought to dismiss the enquiry-learning ideology. Pathetic and unfair!”

But recently I did some reading that rocked my explicit instruction boat. The reading was in preparation for an Education Research Reading Room podcast episode with Sharon Chen in Taiwan. Entitled Inquiry Teaching and Learning: Forms, Approaches, and Embedded Views Within and Across Cultures, the paper compared inquiry-based instruction in German, Australian, and Taiwanese primary science classrooms.

All was going just fine until I came across the following sentence:

‘We could argue inquiry learning occurred not only in students’ peer dialogues and in teacher elicited dialogues with constructive activities, but in teacher guided instructional dialogues as well…’ (p. 118)

What the heck? ‘Teacher-led inquiry-based instruction’… I thought that was an oxymoron. Does that mean that I teach by inquiry?

I decided to ask my students. First, I gave them two definitions:

Explicit Instruction

    • The teacher decides the learning intentions and success criteria, makes them clear to the students, demonstrates them by modeling, evaluates if students understand what they have been told by checking for understanding, and re-tells them what they have been told by tying it all together with closure (adapted from Hattie, 2009, p. 206)

Inquiry based Instruction

    • The teacher provides opportunities for students to hypothesize, to explain, to interpret, and to clarify ideas; draws upon students’ interests and engages them in activities that support the building of their knowledge; and uses structured questions and representations (diagrams, animations, live demonstrations, experiments) to assist students to learn. (adapted from Chen & Tytler, 2017, p. 118).

Then I gave each student one of these (one spectrum per student, with the axes reversed on every other copy to cancel out any erroneous association of ‘right is better’, or the opposite).

[Image: the explicit–inquiry teaching spectrum handed out to students]

The verdict? This is a 10-point scale and (after accounting for the flipped axes) the average rating of my physics teaching was only 0.56 points to the explicit side of centre! I’m a fence sitter (or my students are?). Fitting, seeing as the spectrum looks like a fence. I digress.
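For the numerically inclined, the ‘accounting for the flipped axes’ step is just a mirroring before averaging. Here’s a minimal sketch in Python with made-up ratings (not my students’ actual data):

```python
# Hypothetical ratings on a 0-10 scale (5 = centre of the spectrum).
# On 'normal' sheets, 10 = fully explicit; on 'flipped' sheets, 0 = fully explicit.
normal_ratings = [7, 6, 5, 8]   # higher = more explicit
flipped_ratings = [4, 3, 5, 2]  # lower = more explicit

# Mirror everything onto a common scale of 'points toward the explicit side of centre'.
explicit_lean = [r - 5 for r in normal_ratings] + [5 - r for r in flipped_ratings]

# A positive mean indicates a lean toward the explicit side.
average_lean = sum(explicit_lean) / len(explicit_lean)
print(average_lean)
```

With these made-up numbers the class leans 1.5 points to the explicit side; my actual result was 0.56.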

Both my teaching and the definitions were obviously not as clear cut as I had thought. It was time to go definition hunting.

First Stop, PISA

All the while I had Kirschner’s article firmly in my mind, so PISA seemed like the logical place to go to explore definitions. Below is the image (red boxes added in Kirschner’s article) that he featured as evidence for the conclusion that inquiry-based instruction is baloney (image from the 2015 PISA results, p. 228).

[Image: the PISA figure from Kirschner’s article comparing teacher-directed and inquiry-based science instruction]

So I clearly needed to find out exactly what these ‘teacher-directed science instruction’ and ‘inquiry-based science instruction’ were. The following comes from page 63:

PISA asked students how frequently (“never or almost never”, “some lessons”, “many lessons” or “every lesson or almost every lesson”) the following events happen in their science lessons:

Then, for teacher-directed instruction (often thought of as synonymous with explicit instruction, i.e., ‘sage on stage’ rather than ‘guide on side’) (p. 63):

“The teacher explains scientific ideas”; “A whole class discussion takes place with the teacher”; “The teacher discusses our questions”; and “The teacher demonstrates an idea”.

For inquiry-based instruction…(pg. 69)

“Students are given opportunities to explain their ideas”; “Students spend time in the laboratory doing practical experiments”; “Students are required to argue about science questions”; “Students are asked to draw conclusions from an experiment they have conducted”; “The teacher explains how a science idea can be applied to a number of different phenomena”; “Students are allowed to design their own experiments”; “There is a class debate about investigations”; “The teacher clearly explains the relevance of science concepts to our lives”; and “Students are asked to do an investigation to test ideas”.

Looking at these broadly, the main difference between them is the kind of distinction that the layperson would hold, and that I held prior to Chen and Tytler’s paper: inquiry-based instruction is more student-directed, in contrast to explicit instruction being more teacher-directed. Additionally, inquiry-based instruction is more frequently related to ‘our lives’.

This definition wasn’t as complex as I’d hoped. Time to dig a little deeper.

Stop 2, Furtak’s Meta-analysis

In Chen and Tytler’s paper they referenced Erin Furtak’s 2012 Experimental and quasi-experimental studies of inquiry-based science teaching: A meta-analysis. I jumped into this paper and found a much more nuanced approach. Furtak presented two dimensions of inquiry-learning, cognitive and guidance, with each split into different domains. I’ve summarised the framework in the following image:

[Image: summary of Furtak’s framework: the guidance and cognitive dimensions of inquiry]

Here’s a basic summary of each dimension and domain:

  • Guidance Dimension
    • This is what we’re all familiar with and Furtak explains it with the diagram below, with ‘Teacher-guided inquiry’ added in the middle.

[Image: Furtak’s guidance spectrum, with ‘Teacher-guided inquiry’ in the middle]

  • Cognitive Dimension, made up of
    • Conceptual domain
      • Furnishing students with an understanding of the ‘concepts’ of science. This can be thought of broadly as knowledge, and the relationships between various bits of information.
    • Epistemic domain
      • Exploring with students, ‘How do scientists know when something is a fact or not?’
    • Procedural domain
      • Exploring with students, ‘What kind of things do scientists do to help them find out things about the world?’
    • Social domain
      • Exploring with students, ‘How do scientists communicate with each other, and with the wider world, in order to advance and communicate science?’

Fundamentally, Furtak and colleagues argue that most of the time, when discussing ‘inquiry’, we’re talking about what instruction looks like in the classroom, i.e., the guidance dimension. What she suggests matters more is what’s actually going on in students’ heads (the cognitive dimension). This is a classic surface vs. deep structure error: we make assessments of what’s going on based on what we see on the surface instead of what’s going on underneath (for more on this, see Chi, Feltovich, and Glaser on how expert and novice physicists classify problems differently, based upon either surface or deep structure features).

We’re so good at making this mistake, in fact, that of the 37 papers that Furtak and colleagues examined, they found that:

many of the experimental studies performed in  [the decade during which inquiry was the main focus of science education reform, 1996-2006,] did not actually study inquiry-based teaching and learning per se, but rather contrasted different forms of instructional scaffolds that did not substantively change the ways in which students engaged in the domains of inquiry (pg. 323)

Said another way, from a cognitive perspective, in 13 out of the 37 papers (that’s 35%), there was no difference between the way that the control group (unchanged instruction) and the ‘inquiry’ group treated the content!!!

Does that mean that when people argue about ‘inquiry’ vs. ‘explicit’ instruction, 35% of the time they’re not actually arguing about any difference at all?

Maybe.

After sifting through the (limited number of remaining) studies, Furtak and colleagues made the following suggestion:

“the evidence from these studies suggests that teacher-led inquiry lessons have a larger effect on student learning than those that are student led.”

So, it turns out the oxymoron of ‘teacher-led inquiry’ is actually a pretty effective method of instruction. Go figure.

What I took from this paper was that the ‘inquiry’ in ‘inquiry-based instruction’ isn’t actually about who’s leading the class, the students or the teachers, it’s about what’s going on in students’ heads. As Willingham aptly puts it, ‘Review each lesson plan in terms of what the student is likely to think about. This sentence may represent the most general and useful idea that cognitive psychology can offer teachers’  (2009, p. 61).

More evidence?

This finding, that ‘teacher-led inquiry’ is the most effective method, is somewhat corroborated by recent research by McKinsey & Company suggesting that learning is maximised when instruction ‘combines teacher-directed instruction in most to all classes and inquiry-based learning in some.’ This research, interestingly, also utilised PISA data, and therefore also the PISA definitions.

[Image: McKinsey chart on combining teacher-directed and inquiry-based instruction]

To me this also relates to the ‘expertise reversal effect’ from cognitive science (see Kalyuga, Ayres, Chandler, and Sweller, 2003). That is, as learners gain expertise in a field, more guided forms of instruction (such as explicit instruction) become less effective, and are surpassed in effectiveness by less guided forms of instruction. I spoke to Professor Andrew Martin about this in a recent podcast where we explored the more and less guided spectrum of instruction in a heap of detail.

And here’s the kicker: if we look back up to the Kirschner-referenced PISA image (the one with the red boxes), we see that the item sitting directly above the ‘Teacher-directed’ criteria is one entitled ‘Adaptive instruction’, which PISA defined as follows (p. 66):

“The teacher adapts the lesson to my class’s needs and knowledge”; “The teacher provides individual help when a student has difficulties understanding a topic or task”; and “The teacher changes the structure of the lesson on a topic that most students find difficult to understand”.

This sounds a lot like ascertaining a student’s level of expertise in a given domain and then providing support accordingly, i.e., more or less guidance (why haven’t we been talking about this definition more?!?)

Reflecting back on my own classroom

As I thought about my physics classes’ rating of my teaching, I re-visited my lesson plans to see if any of what I do would fit into the category of ‘teacher-led inquiry’. Maybe this does:

[Image: a physics image used as a teacher-led inquiry prompt]

An image that I leave students to think about and discuss, to try to work out the answer (but not for more than about 2-3 minutes).

Maybe some of these questions do too:

      • What falls faster, a feather or a bowling ball?
      • How do rockets fly in space?
      • Based upon this analysis, how would one calculate the impulse from a force vs. time graph? (This followed on from a discussion of: ‘Use dimensional analysis to determine how impulse is related to force!’)
      • What is energy?
      • If you want to drive your car to the shops… where does the energy come from (and through which forms does it change)?

But really… who cares?

More than anything, this (unfinished) exploration into the distinction between inquiry and explicit instruction has left me questioning to what degree such ‘definitions’ are even helpful in discussions about teaching and learning. What do we even achieve by taking sides, or by trying to put our praxis into a box?

Instead of asking each other ‘Do you teach by inquiry?’ or ‘Are you trad or prog?’, I think that a much more helpful set of questions could be something like:

  • When you did that activity in class today, what did you hope that students would be thinking about?
  • What did you hope the students could do by the end of today’s lesson that they couldn’t do at the start?
  • Why did you choose to ask that question/call on that student at that time?
  • How did your assessment of your students’ expertise in this domain influence your choice of activities today? (except in fewer words)
  • What do you think are the strengths and weaknesses of the way that you chose to check for understanding at the end of today’s class?
  • When are you going to re-visit this content, and how are you going to re-visit it, to ensure that students retain the key points?
    • and, following any of these, ‘Why?’, ‘Why?’, then, ‘Why?’.

Epilogue: I watched some ‘teacher-led inquiry’ science lessons in Taiwan…

Whilst I was in Taiwan recently I went and watched the teacher, Pauline, who was featured in Chen and Tytler’s paper as an example of inquiry-based science teaching in Taiwan (podcast here). I watched her run four ‘teacher-led inquiry’ based lessons on solids, liquids, and gases with year 5 and 6 students… and it was FANTASTIC! Maybe I’ll share a blog post about it some day, but for now, I thought I’d share the following brief notes.

Basic lesson format:
1. Teacher-led re-cap of content covered previously (which related directly to the experiment).
2. Teacher tells students how to do the experiment.
3. Teacher tells students exactly what she wants them to look at and think about whilst the experiment is taking place (what changes between before and after the heating of the popcorn/chocolate/egg?).
4. Students do the experiment.
5. Teacher runs a class discussion about what happened, and why!
6. Students eat the food.

This was all in 40 minutes mind you, and the classroom was left spotless by the students too, even though they were making popcorn, melting chocolate, and frying eggs atop (old-school) Bunsen burners!

Students were engaged, they were expressing their ideas, they were using subject-specific vocabulary and making connections to prior-learning. Pauline had high expectations of them and was rigorously questioning them on the concepts and terminology that she wanted them to be learning.

And whilst discussing and reflecting upon these 4 lessons with Pauline, we barely even used the word ‘inquiry’ ; )


Edit: I enjoyed Greg Ashman’s critique of this post here, to which I replied here.

TOT026: Critique of Hattie, this one’s for the ‘uncool’ kids, + more Twitter takeaways

Teacher Ollie’s Takeaways is a weekly-ish post (and sometimes a podcast!) bringing together some of the fascinating things that Ollie read throughout the past week-ish. Find all past Teacher Ollie’s Takeaways here

THIS. IS. MIND. BLOWING. Massive critique of Hattie

Is most published research wrong?

An article to boost the confidence of the ‘uncool’ kids at school

Book Summary: Accessible Mathematics

Massive resource, more exploring to do

Mindfulness. The research flips and flops…

Review of PBL, and what are the important ingredients

World experts on procrastination

Teacher experience correlates with effectiveness… with details on conditions

Educational inequality consistent over time in the U.K.

Sicko quick reference guide on actions to take in the classroom

I’m REALLY excited about this homework program! (student self-quizzing)

Critique of TLAC

How to win teachers over and make them open to student feedback surveys

Diff kinds of feedback… what are they?

Supporting students to refine their thinking

TOT025: 10000 rule=baloney, legendary advice for students, + more Twitter takeaways


Eat your heart out Ericsson. 10,000 hour rule is baloney

Promoting metacognition and study planning. A (reportedly) effective approach

Some methodological issues in this one. They try to do a bit too much in one study. But still interesting.

Here’s one to share with your graduating students

Resilience, rights, and respectful relationships learning materials

What the heck do mathematicians do? (TED talk)

TOT024: Inflexible knowledge and the ambitions of children, + more Twitter takeaways


‘The talk’ with those teenage kids. A how to

A review of effective marking practices. Whaddaweneedtoknow?

Some important stuff on effect sizes

The ambitions of children…

Putting evidence to work. Evidence in early career teacher education

Goldmine of (video) interviews with Robert Bjork

NECESSARY READING on ‘inflexible’ knowledge. Daniel Willingham

Most influential books ever written. CHECK OUT

TOT023: Retrieval practice video, charter school article for liberals, + more Twitter takeaways


More on retrieval practice

Drilling key content knowledge is imperative. Here’s a number of (quite regimented) ways of doing it.

Stop Berating Black and Brown Parents Over Charters

Nice one for liberals (us liberals?) to read.

http://www.natebowling.com/a-teachers-evolving-mind/2017/10/4/stop-berating-black-and-brown-parents-over-charters-or-give-your-twitter-fingers-a-rest

Desmos is a powerful tool. And Bryn is your man

More that challenges growth mindset

Daniel Willingham on reading and listening comprehension

Education in Singapore. Insightful thread

The future (or lack thereof) of male teachers

(Fails to mention the unreliability of extrapolation)

TOT022: Data on major determinants of student success, + more Twitter takeaways


Lesson starters… the German way

Check out the podcast that this article linked to here.

STEM+E=STEME. The ‘E’ stands for ‘ethics’

Large scale data analysis gives insights into major determinants of student achievement

Conclusions:
-Finding 1: Having the right mindsets matters much more than socioeconomic background.
-Finding 2: Students who receive a blend of teacher-directed and inquiry-based instruction have the best outcomes.

… This is good stuff. Read it!

Evidence never enough

Working memory depletion effects

(Good to read in tandem with my interview with John Sweller)

TOT021: Synthetic phonics for newbies, impact of teacher gender + more Twitter takeaways



Fractions… in context

A concise definition…

Critique of Jo Boaler’s critique

Synthetic phonics for newbies

The research on teacher gender

TOT020: Great summary of CLT, sdts developing definitions + more Twitter takeaways


Building relationships with your students

Something I’ve been thinking about a little bit recently. Got some hot tips from mates on how to do this…
1. Use the roll call to greet each student (I added to this, get them to say one word that describes how they’re feeling)
2. Decide on ‘the thing’ that you want students to know about your life. By sharing a little about this they’ll feel you’ve opened up.(E.g., my yr 5 teacher liked Kylie Minogue, and we all knew it!)
3. Use time in the yard to get to know your students.

Bonza summary of CLT

Inspiring oracy program

Would need to compare incoming students with those who have been there for a while in order to get a feel for how much of a diff this school is making, but on the face of it, seems like a pretty good program. Inspiring anyway.

What works in the classroom?

Choosing the right knowledge organiser

Scaffolding students to create definitions. A how to…

If you’re an english teacher… this is worth checking out

ERRR #011. Sharon Chen on International Comparisons of Inquiry Teaching

Listen to all past episodes of the ERRR podcast here.

Professor Sharon Chen received her Ph.D. from the Ohio State University and is now Professor of Education at the National Taiwan Normal University. She also currently serves as the Convener in the Discipline of Education for the Department of Humanities and Social Sciences at the Ministry of Science and Technology (1/2015-12/2017).
Professor Chen actively participates in many academic activities and serves on the editorial boards of TSSCI journals. She receives research funding from the Ministry of Science and Technology every year and has received numerous NSC Annual Research Awards. In recent years she has also been in charge of policy-related projects funded by the Ministry of Education in Taiwan. Sharon has studied, taught, and written widely in three principal areas: curriculum reform issues, qualitative research methodology, and teacher professional development. In 2009 she received the NTNU Outstanding Teacher Award and in 2012 the NTNU Distinguished Professorship Award.

We discussed two of Sharon’s papers in this episode of the ERRR. The first, entitled Implications for Cross-Cultural Comparative Studies of Teaching and Learning, discusses the usual forms that cross-cultural studies take, and how some of Sharon’s research has differed. We spent the majority of the interview discussing the second of Sharon’s papers, entitled Inquiry Teaching and Learning: Forms, Approaches, and Embedded Views Within and Across Cultures. In this second paper Sharon compared the approach of three primary classrooms to teaching science by inquiry, contrasting classrooms from Germany, Taiwan, and Australia.

Reading these papers, as well as doing some classroom visits in Taiwan, really challenged some of my pre-conceived ideas about what the word ‘inquiry’ can mean in science education. To complement this podcast I’m currently also working on a blog post about my changing understanding of this term, so watch out at ollielovell.com for that one.

In other recent and exciting news, Cameron Malcher from the Teachers’ Education Review podcast has just set up AEON, the Australian Educators’ Online Network, as a one-stop shop for Aussie educators to find a whole host of Australian-based education podcasts all in one place. Check out aeon.net.au for more info.

Links mentioned during the interview

John Sweller Interview 9: CLT – misconceptions and future directions

This is part of a series of blogs detailing a discussion that I had with John Sweller in mid 2017. See all parts of this series on this page

OL: What do you think is the biggest misconception about cognitive load theory that people have that you would really like to clear up?

JS: It’s only been very recently that people started taking notice of Cognitive Load Theory. For decades I put papers out there and it was like putting them into outer-space, you know, they disappeared into the ether! So, the issue of misconceptions in cognitive load theory didn’t arise. I guess the most common one is that really all I’m talking about is: ‘Don’t give students too much work to do at once’. That’s not really what I mean. It’s true; don’t give them so much work that they can’t get through it all. But that’s not what Cognitive Load Theory is about. What Cognitive Load Theory is about is you can teach the same stuff by reducing working memory load or by increasing working memory load and the issue is, how do you decrease it? And the whole purpose of decreasing it is so you can give them more information that is important. Cognitive Load Theory does not say: ‘Don’t teach them very much’. Cognitive load theory says: ‘Teach them as much as you possibly can because it’s important in any advanced industrial commercial society. But teach them in such a way that they can take it in without overwhelming working memory’.

OL: What are you most excited about in terms of where CLT is going at the moment?

JS: It changes whenever we have a brand new area, and in the last few months we have a brand new area. There are no publications I can point you to yet because there’s nothing out there, but we’ve got experimental data. We’ve always assumed that working memory capacity was essentially fixed. The only thing that changes is what’s in long-term memory. If you’ve got a lot of information in long-term memory, bring it into working memory and you’ve got a huge increase in working memory. But other than that, working memory is fixed. It has become clear recently that we have what we call working memory resource depletion effects. That means, if you’ve been using your working memory heavily for a while, especially in a particular area, then, as you would have experienced yourself, your working memory keeps getting narrower and narrower and narrower, and after a while it just about disappears. You may need to rest. You may even need a rest until the following day, with sleep in between. That means that after rest, your working memory comes back. And we’re getting some data on that now. (Since this interview was conducted, some research has been published on this. See Greg Ashman’s summary of the article here, and the original article here.)

OL: That’s interesting. And that relates to something else I’ve heard about. I’m sure you’ve heard about cognitive bandwidth? But do you know the work of Sendhil Mullainathan? They did this really interesting experiment (see the paper here) on the impacts of financial stress on working memory. They ran it in a mall in New Jersey, with two experimental conditions. In the first, they said: “Ok John, imagine that you just crashed your car and it’s going to cost $1500 to fix”, and in the other condition they said: “John, imagine you just crashed your car, it’s going to cost $150 to fix.” And then after they’d done that, they got you to complete some cognitive tasks, such as Raven’s Matrices.

JS: Oh, I think I see where this is going.

OL: What they found was that in those who weren’t under financial stress, the $150 vs. $1500 made no difference, but for those who were under financial stress, it made a big difference. So, for those with less money, their brain suddenly decided to process in its subconscious: “Oh, how am I going to do this? Where’s the money going to come from?’ And it had a big effect. So it’s kind of similar. But the other study Mullainathan did,  was they tested the working memory, or performance on Raven’s matrices, of farmers in developing countries. And they tested them before harvest, when they were still waiting and unsure of whether their crop was going to come to fruition, and then after, when they had the money. And they saw big impacts of that as well. So there’s all this stuff in the back of our heads that’s actually using up our cognitive bandwidth (working memory) and we’re just not aware of it at all.

JS: And it may not be going on in the back of our heads. It might be going on right in the front of our heads. Two or three years ago I had an academic from Canada, Kris Fraser, who came on sabbatical to visit me. She was a medico, and she was looking at emotion, and she got some really interesting results. She tested medical students practicing on plastic models. One group had to learn to give treatment for whatever the condition was but, during practice, the patient died. The other group learned to give exactly the same treatment, but in this case the patient lived and recovered. What they then did was look at how much the students had learned, and what they found was that the people whose model had died had learned less.

OL: Yeah, because they were all stressing out about how they’d killed someone.

JS: Exactly. So, as I was saying before when we were talking about motivation, you know, these things shouldn’t be mixed up. At that time I began to think, and this still seems valid: ‘Well, maybe motivation and emotion can be connected in this way?’. I obviously still haven’t decided whether these factors can be related.

OL: It’s occupying a working memory slot, or a number of slots, of the seven-plus-or-minus-two slots available.

JS: Exactly. If you’re worrying about the idea: ‘My patient died’. You’re not learning. If you’re worried about: ‘How do I afford 1500 dollars’, you’re not going to learn as much.

OL: Or if you’re worrying about: ‘If I don’t do well on this test I’m not going to be able to get into the uni course I want to do, etc. etc.’.

JS: Tell me about it!

OL: Well, thank you for your time today John.

JS: Oh good! It’s good talking to you!

You might also like to check out:

Education Research Reading Room Podcast Episode 9 with Andrew Martin on Load Reduction Instruction, Motivation and Engagement, also available on iTunes.

All posts in this series:

  1. Worked Examples – What’s the role of students recording their thinking?
  2. Can we teach problem solving?
  3. What’s the difference between the goal-free effect and minimally guided instruction?
  4. Biologically primary and biologically secondary knowledge
  5. Motivation, what’s CLT got to do with it?
  6. Productive Failure – Kapur (What does Sweller think about it?)
  7. How do we measure cognitive load?
  8. Can we teach collaboration?
  9. CLT – misconceptions and future directions