
My teaching and learning experiences: detailing my trials, tribulations and opinions

Inquiry vs. Explicit: Is there even a difference?



I teach physics via explicit instruction, right?

My physics classes generally follow an ‘I do’, ‘You do’ format. We do weekly tests and, insofar as I can manage it, everything that I ask students to do is highly structured and knowledge-focussed. I think it’s fair to label this ‘explicit instruction’ and, as such, I’ve often felt a sense of validation when reading articles like this one by Paul Kirschner, which claims that PISA data demonstrate that inquiry-based instruction is no match for explicit instruction. Kirschner also suggests that, despite clear evidence, PISA doesn’t want to accept its own conclusions:

“Although this distinction is very clearly shown in the figure, the PISA report is very hesitant with regards to interpreting these results. It seems as if the authors can’t bear the thought to dismiss the enquiry-learning ideology. Pathetic and unfair!”

But recently I did some reading that rocked my explicit instruction boat. The reading was in preparation for an Education Research Reading Room podcast episode with Sharon Chen in Taiwan. Entitled Inquiry Teaching and Learning: Forms, Approaches, and Embedded Views Within and Across Cultures, the paper compared inquiry-based instruction in German, Australian, and Taiwanese primary science classrooms.

All was going just fine until I came across the following sentence:

‘We could argue inquiry learning occurred not only in students’ peer dialogues and in teacher elicited dialogues with constructive activities, but in teacher guided instructional dialogues as well…’ (p. 118)

What the heck? ‘Teacher-led inquiry-based instruction’… I thought that was an oxymoron. Does that mean that I teach by inquiry?

I decided to ask my students. First, I gave them two definitions:

Explicit Instruction

    • The teacher decides the learning intentions and success criteria, makes them clear to the students, demonstrates them by modeling, evaluates if students understand what they have been told by checking for understanding, and re-tells them what they have been told by tying it all together with closure (adapted from Hattie, 2009, p. 206)

Inquiry based Instruction

    • The teacher provides opportunities for students to hypothesize, to explain, to interpret, and to clarify ideas; draws upon students’ interests and engages them in activities that support the building of their knowledge; and uses structured questions and representations (diagrams, animations, live demonstrations, experiments) to assist students to learn. (adapted from Chen & Tytler, 2017, p. 118).

Then I gave each student one of these (one spectrum each, with the axes reversed on every other spectrum to cancel out any erroneous association of ‘right is better’, or the opposite).

[Image: the explicit–inquiry teaching spectrum handout]

The verdict? It’s a 10-point scale and (after accounting for the flipped axes) the average rating of my physics teaching was only 0.56 points to the explicit side of centre! I’m a fence-sitter (or my students are?). Fitting, seeing as the spectrum looks like a fence. I digress.
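(For the data-inclined: the arithmetic behind that figure is simple enough to sketch in a few lines. The code and numbers below are purely illustrative, made up for this example rather than taken from my actual class data.)

```python
# Sketch of averaging spectrum ratings when every other sheet has its
# axes reversed (after normalising: 0 = fully inquiry, 10 = fully explicit).
def average_rating(ratings, flipped):
    """ratings: 0-10 scores; flipped: parallel flags marking the sheets
    whose axes were printed in reverse."""
    normalised = [10 - r if f else r for r, f in zip(ratings, flipped)]
    mean = sum(normalised) / len(normalised)
    return mean - 5  # positive = explicit side of centre

# Made-up responses, purely to show the mechanics:
print(average_rating([7, 4, 6, 3, 8, 5], [False, True, False, True, False, True]))  # 1.5
```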

Both my teaching and the definitions were obviously not as clear cut as I had thought. It was time to go definition hunting.

First Stop, PISA

All the while I had Kirschner’s article firmly in my mind, so PISA seemed like the logical place to go to explore definitions. Below is the image (red boxes in Kirschner’s article) that he featured as evidence for the conclusion that inquiry-based instruction is baloney (image from the 2015 PISA results, p. 228).

[Image: the PISA figure featured in Kirschner’s article]

So I clearly needed to find out exactly what these ‘teacher-directed science instruction’ and ‘inquiry-based science instruction’ were. The following comes from page 63:

PISA asked students how frequently (“never or almost never”, “some lessons”, “many lessons” or “every lesson or almost every lesson”) the following events happen in their science lessons:

Then, for teacher-directed instruction (often thought of as synonymous with explicit instruction, i.e., ‘sage on stage’ rather than ‘guide on side’) (p. 63):

“The teacher explains scientific ideas”; “A whole class discussion takes place with the teacher”; “The teacher discusses our questions”; and “The teacher demonstrates an idea”.

For inquiry-based instruction… (p. 69):

“Students are given opportunities to explain their ideas”; “Students spend time in the laboratory doing practical experiments”; “Students are required to argue about science questions”; “Students are asked to draw conclusions from an experiment they have conducted”; “The teacher explains how a science idea can be applied to a number of different phenomena”; “Students are allowed to design their own experiments”; “There is a class debate about investigations”; “The teacher clearly explains the relevance of science concepts to our lives”; and “Students are asked to do an investigation to test ideas”.

Looking at these broadly, the main difference between them is the kind of distinction that the layperson would hold, and the distinction that I held prior to Chen and Tytler’s paper: inquiry-based is more student-directed, which is in contrast to explicit being more teacher-directed.  Additionally, inquiry-based instruction is more frequently related to ‘our lives’.

This definition wasn’t as complex as I’d hoped. Time to dig a little deeper.

Stop 2, Furtak’s Meta-analysis

In Chen and Tytler’s paper they referenced Erin Furtak’s 2012 Experimental and quasi-experimental studies of inquiry-based science teaching: A meta-analysis. I jumped into this paper and found a much more nuanced approach. Furtak presented two dimensions of inquiry-learning, cognitive and guidance, with each split into different domains. I’ve summarised the framework in the following image:

[Image: summary of Furtak’s framework]

Here’s a basic summary of each dimension and domain:

  • Guidance Dimension
    • This is what we’re all familiar with and Furtak explains it with the diagram below, with ‘Teacher-guided inquiry’ added in the middle.

[Image: Furtak’s guidance spectrum diagram]

  • Cognitive Dimension, made up of
    • Conceptual domain
      • Furnishing students with an understanding of the ‘concepts’ of science. This can be thought of broadly as knowledge of, and the relationships between, various bits of information.
    • Epistemic domain
      • Exploring with students, ‘How do scientists know when something is a fact or not?’
    • Procedural domain
      • Exploring with students, ‘What kind of things do scientists do to help them find out things about the world?’
    • Social domain
      • Exploring with students, ‘How do scientists communicate with each other, and with the wider world, in order to advance and communicate science?’

Fundamentally, Furtak and colleagues argue that most of the time, when discussing ‘inquiry’, we’re talking about what instruction looks like in the classroom, i.e., the guidance dimension. What she suggests matters more is what’s actually going on in students’ heads (the cognitive dimension). This is a classic surface vs. deep structure error, where we make assessments based on what we see on the surface instead of what’s going on underneath (for more on this, see Chi, Feltovich, and Glaser on how expert and novice physicists classify questions differently based upon either surface or structural features).

We’re so good at making this mistake, in fact, that of the 37 papers that Furtak and colleagues examined, they found that:

many of the experimental studies performed in [the decade during which inquiry was the main focus of science education reform, 1996-2006] did not actually study inquiry-based teaching and learning per se, but rather contrasted different forms of instructional scaffolds that did not substantively change the ways in which students engaged in the domains of inquiry (p. 323)

Said another way, from a cognitive perspective, in 13 out of the 37 papers (that’s 35%), there was no difference between the way that the control group (unchanged instruction) and the ‘inquiry’ group treated the content!!!

Does that mean that when people argue about ‘inquiry’ vs. ‘explicit’ instruction, 35% of the time they’re not actually arguing about any difference at all?


After sifting through the (limited number of remaining) studies, Furtak and colleagues made the following suggestion:

“the evidence from these studies suggests that teacher-led inquiry lessons have a larger effect on student learning than those that are student led.”

So it turns out that the oxymoron of ‘teacher-led inquiry’ is actually a pretty effective method of instruction. Go figure.

What I took from this paper was that the ‘inquiry’ in ‘inquiry-based instruction’ isn’t actually about who’s leading the class (the students or the teacher); it’s about what’s going on in students’ heads. As Willingham aptly puts it, ‘Review each lesson plan in terms of what the student is likely to think about. This sentence may represent the most general and useful idea that cognitive psychology can offer teachers’ (2009, p. 61).

More evidence?

This finding, that ‘teacher-led inquiry’ is the most effective method, is somewhat corroborated by recent research by McKinsey & Company that suggests that learning is maximised when instruction ‘combines teacher-directed instruction in most to all classes and inquiry-based learning in some.’ This research, interestingly, also utilised PISA data, and therefore the PISA definitions above.

[Image: McKinsey chart on teacher-directed and inquiry-based instruction]

To me this also relates to the ‘expertise reversal effect’ from cognitive science (see Kalyuga, Ayres, Chandler, and Sweller, 2003). That is, as learners gain expertise in a field, more guided forms of instruction (such as explicit instruction) become less effective, and are surpassed in effectiveness by less guided forms of instruction. I spoke to Professor Andrew Martin about this in a recent podcast where we explored the more and less guided spectrum of instruction in a heap of detail.

And here’s the kicker: if we look back up to the Kirschner-referenced PISA image (the one with the red boxes), we see that the category sitting directly above the ‘Teacher-directed’ criteria is ‘Adaptive instruction’, which PISA defined as follows (p. 66):

“The teacher adapts the lesson to my class’s needs and knowledge”; “The teacher provides individual help when a student has difficulties understanding a topic or task”; and “The teacher changes the structure of the lesson on a topic that most students find difficult to understand”.

This sounds a lot like ascertaining a student’s level of expertise in a given domain and then providing support accordingly, i.e., more or less guidance (why haven’t we been talking about this definition more?!?)

Reflecting back on my own classroom

As I thought about my physics classes’ rating of my teaching, I re-visited my lesson plans to see if any of what I do would fit into the category of ‘teacher-led inquiry’. Maybe this does:
[Image: physics discussion prompt]

An image that I leave students to think about and discuss, to try to work out the answer (but not for more than about 2-3 mins).

Maybe some of these questions do too:

      • What falls faster, a feather or a bowling ball?
      • How do rockets fly in space?
      • Based upon this analysis, how would one calculate the impulse from a force vs. time graph? (This followed on from a discussion of: ‘Use dimensional analysis to determine how impulse is related to force!’)
      • What is energy?
      • If you want to drive your car to the shops… where does the energy come from (and through which forms does it change)?

But really… who cares?

More than anything, this (unfinished) exploration into the distinction between inquiry and explicit instruction has left me questioning to what degree such ‘definitions’ are even helpful in discussions about teaching and learning. What do we achieve by taking sides, or by trying to put our praxis into a box, anyway?

Instead of asking each other ‘Do you teach by inquiry?’ or ‘Are you trad or prog?’, I think that a much more helpful set of questions could be something like:

  • When you did that activity in class today, what did you hope that students would be thinking about?
  • What did you hope the students could do by the end of today’s lesson that they couldn’t do at the start?
  • Why did you choose to ask that question/call on that student at that time?
  • How did your assessment of your students’ expertise in this domain influence your choice of activities today? (though in fewer words)
  • What do you think are the strengths and weaknesses of the way that you chose to check for understanding at the end of today’s class?
  • When are you going to re-visit this content, and how are you going to re-visit it, to ensure that students retain the key points?
    • and, following any of these, ‘Why?’, ‘Why?’, then, ‘Why?’.

Epilogue: I watched some ‘teacher-led inquiry’ science lessons in Taiwan…

Whilst I was in Taiwan recently I went and watched the teacher, Pauline, who was featured in Chen and Tytler’s paper as an example of inquiry-based science teaching in Taiwan (podcast here). I watched her run four ‘teacher-led inquiry’ lessons on solids, liquids, and gases with year 5 and 6 students… and it was FANTASTIC! Maybe I’ll share a blog post about it some day, but for now, I thought I’d share the following brief notes.

Basic lesson format:

  1. Teacher-led re-cap of content covered previously (which related directly to the experiment)
  2. Teacher tells students how to do the experiment
  3. Teacher tells students exactly what she wants them to look at and think about whilst the experiment is taking place (what changes between before and after the heating of the popcorn/chocolate/egg?)
  4. Students do the experiment
  5. Teacher runs a class discussion about what happened, and why!
  6. Students eat the food

This was all in 40 minutes mind you, and the classroom was left spotless by the students too, even though they were making popcorn, melting chocolate, and frying eggs atop (old-school) bunsen burners!

Students were engaged, they were expressing their ideas, they were using subject-specific vocabulary and making connections to prior-learning. Pauline had high expectations of them and was rigorously questioning them on the concepts and terminology that she wanted them to be learning.

And whilst discussing and reflecting upon these 4 lessons with Pauline, we barely even used the word ‘inquiry’ ; )



Edit: I enjoyed Greg Ashman’s critique of this post here, to which I replied here.

Present new material in small steps with student practice after each step: How’s it look?


The second recommendation in Rosenshine’s ‘Principles of Instruction’ is “Present new material in small steps with student practice after each step”. The basis for this recommendation is the fact that working memory is limited and, for learning to occur, it’s important to avoid overloading it. But that isn’t the focus of this post. In this post I just want to share what ‘new material in small steps with student practice after each step’ can look like in the classroom.

As a rule of thumb, the longer a teacher talks for, the more likely they are to be delivering enough information to overload their students’ working memory. As I reflected upon this point, prompted by Craig Barton’s recent in-depth interview with Kris Boulton, I found myself thinking, ‘I wonder how long I talk for?’ It was time to collect some data.

Next lesson I split my notebook into three columns, ‘explain’, ‘student work’, and ‘check solution’ (I always teach my maths lessons in an ‘I do’ then ‘You do’ format, then go over the solutions as a class), then I got to recording! First class I got distracted and fell off the timing bandwagon (first half of the page), but second class I remembered to stay on task, and that whole class (90 mins) is recorded in the image below (red box).

To set the scene, I wanted students to be able to answer the exam question presented by the end of the lesson. This required them to be able to go from a transition diagram and an initial state matrix to the result after multiple periods with or without the addition of extra units each period, as well as determining the result of such transitions ‘in the long run’, and working backwards in such a relation. I split this up into the following sub-steps for the purposes of instruction.

  • Constructing a transition matrix from a transition diagram
  • Applying a transition diagram to interpret a transition
  • Applying a transition matrix to interpret change after one transition
  • Understanding transition matrices as recurrence relations (and results after multiple periods with a formula)
  • ‘In the long term’: steady-state solutions to transition matrices
  • Results after multiple periods (using brute force, that means with a calculator)
  • Transition matrix modelling when the total number of units changes
  • Working backwards in matrix multiplications


The astute observer will note that the total time adds up to about 60 mins. The additional time was taken up with approx. 20 mins of revising previous content and 10 mins talking about an upcoming assessment and doing a ‘brain break’.

Below is the lesson as I presented it, with the timing for each segment added in italics. (Images weren’t in the original, as students had all questions in front of them; I added them for readers here.)

I found it really valuable to look at the timing of my lessons in this level of detail. I’d love to know if it’s prompted any similar reflections for you.


Rosenshine, B. (2012). Principles of Instruction: Research-Based Strategies That All Teachers Should Know. American Educator, 36(1), 12.


Teacher checklists: How to never forget an easy but important task again! [Guide with Photos]

Forgetting to take the marking home for the weekend, forgetting to print off a sheet before class, and forgetting to send a follow-up email about a student as promised. Many of the tasks that constitute a teacher’s working week are just like these ones… easy, but important.

A couple of months ago, in a particularly hectic teaching period, I forgot three such ‘easy but important’ tasks over the course of two weeks. Each time I stuffed up I felt increasingly frustrated, and by the third occurrence it was clear that I needed a system to militate against this ever happening again. This post is about that system.

Around the time of my streak of forgetfulness, Harry Fletcher-Wood tweeted an article on checklists. I’d been thinking about developing a checklist system for my teaching for a while, and this article, along with my recent bouts of forgetfulness, was the perfect prompt to revisit the idea. I bought Harry’s book, Ticked Off, read much of it, and appreciated the lists that he’d put together covering everything from pre-term actions, to lesson planning, to formative assessment. It offered some great tips and tricks on how to effectively make and use checklists (some of which I refer to in the following), but it wasn’t exactly what I was looking for. I wanted more of a system that would help me deal with the everyday myriad of simple but important tasks on a teacher’s plate. I had a long hard think about the kind of functionality that I wanted from such a checklist system, and here are the criteria that I came up with:

  • Digital, on both my phone and laptop
  • Enables recurring tasks: things that I know I have to do every week, such as ‘plan progress check’ or ‘compile detention list’
  • Simple layout that enables me to logically file my lists
  • Has some sort of alerts functionality that prompts me to look at it
  • No bells and whistles, animations, or anything else annoying that’s going to get in the way; i.e., simple yet effective

The search began, and I was pretty excited when I managed to come across this online checklist/to-do list comparison spreadsheet!

I sorted by various columns and came up with a list of about three checklist apps that seemed to fit the bill. I played with each of them over the course of three days and settled on one that seemed to offer everything that I wanted: Wunderlist. (I’m not receiving anything from Wunderlist or any other parties for this post.)

It’s taken me a while to optimise how I use the app, but I thought it worthwhile to share in detail exactly how I do it, in case anyone out there is facing some of the same challenges that I was.

[Image: screenshot of my five Wunderlist lists]

The first question I asked myself was ‘What lists do I need?’. In Harry’s book he makes the excellent point that a checklist is only useful if you pause at relevant times to check it. He calls these moments ‘pause points’. I thought about my day, and my lessons, and tried to figure out which times could be my ‘pause points’. I worked out that for general tasks the ideal times for me to pause would be 8am, as soon as I arrive at school, and 3pm, as soon as classes have finished for the day. From this I devised a set of 5 lists, as pictured.

The lists are numbered 1.1, 1.2, etc., because Wunderlist has the option to ‘sort alphabetically’, and numbering the lists like this means that they stay in the same order all of the time. Each of these lists is recurring and has an alert. The morning list sends me a notification at 8am, and the afternoon (or ‘arvo’ in Australian) list sends me a notification at 3pm. The number at the end of each list refers to the week of term, and once I’ve completed a checklist for the week, I change that number to the following week (Wunderlist does allow you to tick off the whole list but, as these are recurring lists, as soon as you tick it off it pops back up). The beauty of these recurring lists is that they allow me to easily remember the tasks associated with, for example, Monday morning, that I have to do every week. Pictured right is what my Monday morning always looks like.

If there’s an additional task that I need to do on Monday morning I’ll add it in the ‘Add a subtask’ section, then, if I don’t need to do it again next week, I’ll delete it once done rather than checking off. Checking it off means that it’ll recur again next week.

There’s one other set of lists that I use: my ‘in class’ lists. At the moment I’ve got three classes, Y11 Physics and two Y12 Further Maths classes. I have a list for each class, and I figured that the only reliable ‘pause point’ I could count on in these classes was prior to class, especially whilst I’m in the pause-point habit-building phase. Below is a screenshot of what my whole list looks like (note the asterisks at the start of the MAFA, MAFB and PH12 lists that keep them at the top when I sort alphabetically).

[Image: screenshot of my full Wunderlist list structure]

In the above you’ll also note the ‘Sunday before school’ list at the bottom. Pretty self-explanatory. We’re between terms at the moment and I still haven’t finished all those tasks, which is why it’s still down as Week 11 whereas all the other lists are down as Week 1 of next term.
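As an aside for the programmers out there: the prefix tricks work because of how plain character-by-character sorting treats those characters. A quick, purely illustrative check (the list names approximate those in my screenshots):

```python
# Illustrative only: why the numeric prefixes (and the asterisks on the
# class lists) keep Wunderlist's alphabetical sort in the intended order.
lists = ["1.2 Arvo Wk 1", "*PH12", "1.1 Morning Wk 1",
         "*MAFA", "*MAFB", "2. Sunday before school"]

# In ASCII, '*' (42) sorts before '1' (49), so the class lists stay on
# top, and the numbered lists follow in a stable order after them.
print(sorted(lists))
```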

Now, I tried the pause point at the start of the lesson for a while, but often I didn’t want to front-load the class with all the information in one go, so what I’ve started doing is transferring the checklist from Wunderlist to the board. This means that it’s right there in my face whenever I refer back to my lesson contents (the list of concepts we’ll be working through during a lesson, which I use in every class), and it makes it much easier for me to remember to check it off. I find that it’s working really well (it’s also good because checking Wunderlist in class can look a lot like checking text messages!). Before the students arrive (when possible) I go into the class and write up my contents and checklist on the board, then they arrive and we get stuck in. I deal with the checklist items as we work through the lesson, and I’m building the habit of checking the checklist about 5 mins before the end of the lesson too, to double-check that I haven’t forgotten anything. Here’s what my list looks like when transferred to the board (top right of board).


I held off posting about this checklist system for a while because I really wanted to embed it into my practice and make sure that it was a useful and functional system doing everything that I want it to do. I now feel that it’s well embedded and it’s already been a life saver on a number of occasions.

If you adopt this system, or make any variations to it that you think significantly add to its usefulness I’d love to hear about them in the comments or on twitter (@ollie_lovell).

The battle for deliberate practice

It’s exam time and your students are preparing. You’re going around the class, observing how students are studying and, shock horror, they are re-reading and re-writing their notes. The notes are literally going from one notebook to another notebook without going through their brains in-between. As a teacher this is one of the most frustrating things for me to see, and recently I’ve been on a bit of a war path to try to stop it.

This is a short post to celebrate some of my students doing deliberate practice. This year I’ve been stressing the importance of students re-doing questions (as opposed to just re-reading them).

Pictured right is how Ericsson describes deliberate practice (p. 367):

[Image: Ericsson’s description of deliberate practice]


The way that I’ve advocated for this is to ask students to: 1. Identify questions that they got incorrect in our weekly tests, 2. Get a book or another sheet of paper and cover the answer, 3. Re-do the question, 4. Slide the book/piece of paper down and check, 5. Re-do again if they got it wrong, 6. Re-do again a few hours/days later to consolidate.

I’ve felt like a bit of a broken record but then, today, I had my day made when walking around the class I saw these two students!



To attend to the motivational segment of the task, I knocked up this sheet, which I gave to students at the start of today’s revision session.

Deliberate practice for the win. Just wanted to celebrate. Hopefully it pays off in their exam.

How do we know what to put on the weekly quiz?

I’ve really enjoyed working my way through the three blog posts (1, 2, 3) in Brian Penfound’s Journey to Interleaved Practice series recently. They detail how, prompted by a discussion with the Learning Scientists, Brian has been incorporating interleaving into his integral calculus class.

One particular instrument got me excited: an Excel spreadsheet that can be used to interleave questions when you’re planning both lessons and quizzes (see the blank version here (edit: the Learning Scientists have just released a new version here) and Brian’s version here). Here’s a screenshot to give you a taster.

[Image: screenshot of the interleaving spreadsheet]

Being the focussed (and sometimes obsessed) learning strategist that I am, I really loved this idea. But it got me thinking: is this better than what I’m already doing? Should I adapt my current practice to incorporate this approach?

I’ve written about my assessment and feedback process before here, in which I talk about the weekly quizzes that I give students and how they incorporate content from the previous three weeks. This means that students see content for a month in a row (in the teaching week, then in the three weeks after that), then they’ll see it in the unit test (a maximum of 4 weeks later, as each topic is approx. 8 weeks long), then in the mid-year practice exam, then in the end-of-year exam.

I wanted to take the opportunity to share how I actually choose which questions to put on these weekly tests (or ‘Progress Checks’ (PC) as they’re called in my classes).

Each week I run the PC, students self-mark in class immediately after, then I collect up the PCs. I keep them overnight and return them to students the next day (for two of my classes; the third class waits 3 days due to timetabling), and in the meantime I enter the marks into my gradebook. When I return the PCs to students (I do this once they’ve settled into some question work), I carry around a little notebook and have a mini-conference with each student. The questions I ask are generally:

“How do you feel you went?”

“What did you get wrong?”

“What mistake did you make?”

“How much prep did you do for this Progress Check?”

And finally:

“Which question numbers did you get wrong?”

From that, I collate the following.

[Image: collated list of hard questions by student, de-identified]

(Any student who doesn’t demonstrate that they prepared for the PC gets a detention, which I also note on this sheet.)

I then take a photo of this and store it with the progress check itself, like so.

[Image: photo of the question log stored with the Progress Check]

Then, when it comes time to write the next week’s PC, I feed in the questions that were answered incorrectly (or variations thereof), as well as new content, in addition to other concepts from the previous 3 weeks that I think are also important to touch on again.
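In pseudocode-ish Python, that selection logic might look something like the sketch below. To be clear, the function and all of its names are my own illustration of the process, not an actual tool that I use:

```python
# Rough sketch of assembling next week's Progress Check: this week's new
# content, variations of the most-missed questions, and spot-checks from
# each of the previous three weeks. All names here are hypothetical.
def build_progress_check(week, new_questions, missed_log, topic_bank):
    """missed_log: {question_id: times_missed}, collated from the
    mini-conferences. topic_bank: {week_number: [question_ids]}."""
    pc = list(new_questions)  # new content for this week
    # Variations of the most frequently missed questions (top three)
    pc += sorted(missed_log, key=missed_log.get, reverse=True)[:3]
    # One spot-check from each of the previous three weeks
    for w in range(week - 3, week):
        if topic_bank.get(w):
            pc.append(topic_bank[w][0])
    return pc
```

In practice the ‘variations thereof’ step is the human part of the job: you don’t re-use the question verbatim, you write a near-neighbour of it.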

I was really excited by the Excel approach, but I’m still very attached to the adaptive approach that I’m using. Perhaps the optimum lies somewhere in-between: using both a more complex structure than ‘the last 3 weeks’ (such as is offered by the Excel spreadsheet), plus some element of adaptability to the questions and concepts that students are clearly struggling with.

An opportunity for further exploration!

Assessment feedback: Processes to ensure that students think!

We know that ‘memory is the residue of thought’ (Daniel Willingham) and that, in order for our students to learn, they must actively think about the content to be learnt. This allows the content to occupy their working memory for long enough, and to become anchored to sufficient elements in their long-term memory, to trigger a change in long-term memory, one of the well-respected definitions of ‘learning’ (Paul Kirschner).

One of the arenas of teaching in which this can be most challenging is that of feedback delivery to students. Dylan Wiliam sums it up well in the following quote (which I came across thanks to Alfie Kohn).

Note: The original quote is “When students receive both scores and comments, the first thing they look at is their score, and the second thing they look at is…someone else’s score”, and can be found here (beware the paywall). 

The challenge is, then, how do we give feedback to our students in a way that encourages them to actively think about their mistakes, and helps them to do better next time?

In the following I’ll share how I give feedback to students in two contexts. The first is on low stakes assessments that I carry out in my own classroom, the second is on major assessment pieces that contribute towards their final unit mark.

Assessment Feedback on weekly Progress Checks.

Before we dive in I’ll just paint a picture of how my weekly ‘Progress Checks’ fit into my teaching and learning cycle, and how each of these elements is informed by education theory.

At the start of each week students are provided with a list of ‘weekly questions’. They know that the teaching during the week will teach them how to answer these questions. Questions are to be aligned with what we want students to be able to do (curriculum and exams) (Backwards Design). Students are provided with worked solutions to all questions at the time of question distribution (The worked example effect). The only homework at this stage of the cycle is for students to ensure that they can do the weekly questions.

‘Progress Checks’ (mini tests, max 15 minutes) are held weekly (Testing Effect). Each Progress Check includes content from the previous three weeks, which means that students see the main concepts from each week for about a month (Distributed Practice). These PCs are low-stakes for year 11 students (they contribute 10% to the final overall mark) and are simply used to inform teachers and students of student progress in year 12 (where assessment protocols are more specifically defined).
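As a sketch, the rolling three-week window can be expressed in a few lines of code (the function name and structure here are mine, purely illustrative, not part of any real tooling I use):

```python
# Illustrative sketch of the rolling Progress Check window
# (hypothetical helper, not actual classroom tooling).

def pc_weeks(current_week, window=3):
    """Return the weeks whose content appears on this week's PC:
    the current week plus up to `window` previous weeks."""
    start = max(1, current_week - window)
    return list(range(start, current_week + 1))

# A concept taught in week 5 appears on the PCs of weeks 5 through 8,
# so students revisit it for roughly a month (distributed practice).
```

So, for example, `pc_weeks(5)` gives `[2, 3, 4, 5]`: week 5’s PC retests the main concepts from weeks 2, 3 and 4 alongside the current week’s material.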

Edit: Here’s a new post on how I use student responses to these PCs to construct the next PCs. 

When designing the progress checks I had two main goals: 1) Ensure that students extract as much learning as possible from these weekly tests, 2) Make sure that marking them didn’t take up hours of my time. The following process is what I came up with.

Straight after the PC I get students to clear their desks, hand them a red pen, and do a think-aloud for the whole PC while they mark their own papers. This is great because it’s immediate feedback and self-marking (see Dylan Wiliam’s black box paper), and it allows me to model the thinking of a (relative) expert and to be really clear about what students will and won’t receive marks for. Following this, any student who didn’t attain 100% on the Progress Check chooses one question that they got incorrect and reflects on it using four prompts: 1) What was the question?, 2) Which concept did it address?, 3) What did you get wrong?, 4) What will you do next time?

Here are some examples of student self-marked progress checks and accompanying PC reflections from the same students (both from my Y11 physics class). Note: Photos of reflections are submitted via email and I use Gmail filters to auto-file these emails by class.

Brandon PC

Note how this student was made aware of consequential (follow-through) marks on question 1.

Here’s the PC reflection from this same student (based upon question 2).

B PC ref

Here’s another student’s self-marked Progress Check.


And the associated reflection.

Screen Shot 2017-04-11 at 7.18.54 am

Screen Shot 2017-04-11 at 7.19.47 am

Students are recognised and congratulated by the whole class if they get 100% on their Progress Checks, and one student from each class wins the ‘Best PC Reflection of the Week’ award. This lets me project their reflection onto the board and point out what was good about it: highlighting an ideal example for the rest of the class, celebrating students’ successes, rewarding effort, and framing mistakes as learning opportunities.

I think this process achieves my two main goals pretty well. Clearly these PCs form an integral learning opportunity, and in sum it only takes me about 7 minutes per class per week to enter PC marks into my gradebook.

Assessment Feedback on Mandated Assessment Tasks.

There are times when, as teachers, we need to knuckle down and mark a bunch of work. For me this is the case with school assessed coursework (SACs), which contributes to my students’ end of year study scores. I was faced with the challenge of making feedback on such a test as beneficial to my students’ learning as the PC feedback process is. Here’s what I worked out.

  1. On test day, students receive their test in a plastic sheet and unstapled.
  2. At the start of the test, students are told to put their name at the top of every sheet.
  3. At the end of the test I take all of the papers straight to the photocopier and, before marking, photocopy the unmarked papers.
  4. I mark the originals. (Though the photocopying takes some time, I think this process makes marking faster in the end because a) I can group all the page 1s together (and so on) and mark one page at a time, which is better for moderation too, and b) I write minimal written feedback because I know what’s coming next…)
  5. In the next lesson I hand out students’ photocopied versions and I go through the solutions with the whole class. This means that students are still marking their own papers and still concentrating on all the answers.
  6. Once they’ve marked their own papers I hand them back their marked original (without a final mark on it, just totals at the bottom of each page), they identify any discrepancies between my marking and their marking, then we discuss and come to an agreement. This also prompts me to be more explicit about my marking scheme as I’m being held to account by the students.

In Closing

I’ve already asked students for feedback on the Progress Checks through whole-class surveys. The consensus is that they really appreciate them, and they also like the modelling of the solutions and the self-marking. Putting together this post prompted me to contact my students and ask for feedback on the self-marking process for their photocopied mandated assessment task. I’ll finish this post with a few comments that students said they’d be happy for me to share; they also provide some great feedback to me for next time.

I’d love any reflections that readers have on the efficacy of these processes and how they could potentially be improved.

From the keyboards of some of my students (3 males, 3 females, 5 from Y12, one from Y11).

Screen Shot 2017-04-12 at 9.09.34 am

Screen Shot 2017-04-19 at 9.23.09 amScreen Shot 2017-04-12 at 9.17.38 am Screen Shot 2017-04-12 at 9.06.27 amScreen Shot 2017-04-12 at 9.08.26 am

Screen Shot 2017-04-13 at 11.32.22 am


A fellow maths teacher from another school in Melbourne, Wendy, tried out this method with a couple of modifications. I thought the modifications were really creative, and I think they offer another approach that could work really well. Here’s what Wendy said.

Hey Ollie,

I used your strategy today of photocopying students’ SACs and having them self-correct. The kids responded so well!

Beyond them asking lots of questions and being highly engaged, those I got feedback from were really positive, saying that it made them look at their work more closely than they would if I just gave them an already corrected test, that they understood how the marking scheme worked (and saw a perfect solution), and that they liked seeing why they got the mark they did and having ‘prewarning’ of their mark.

Thanks heaps for sharing the approach.
A couple of small changes I made were:
  • I stapled the test originally, then just cut the corner, copied the papers, and restapled. It was very quick and could be done after the test without having to put each test in a plastic pocket.
  • I gave the students one copy of the solutions between two. Almost all kids scored above 50% and most around the 70% mark, and I didn’t want them to have to sit through solutions they already had.

If you have thoughts/comments on these changes I’d love to hear them.

Thanks again!


Find references to all theories cited (in brackets) here.

My attempt at an evidence informed student feedback form.

Seeing as my students have to endure my presence, instructions, and bad jokes for 3 hours and 45 minutes each week, I figure the least I can do is give them an opportunity to tell me how I can make this task a little easier for them. In my first year of teaching I knocked together the form below. I’ve used it for a year now and it’s been really helpful. In particular, it’s helped me to bring more celebration into my classroom, with many students over the past year indicating that they want their successes to be celebrated more (usually with lollies!).
Screen Shot 2017-04-01 at 6.27.41 pm

This has been great, but as I’ve moved into my role as head of senior maths this year it’s prompted me to think more strategically about student feedback, and the role it can play in my own, and my team’s professional development.

No feedback form is going to tell a teacher, or a team leader, everything they need to know in terms of ‘Where am I going? How am I going? Where to next?’, but I’ve been feeling more and more as though these forms have a key role to play in helping teachers to spot gaps, and in motivating and inspiring us to improve our praxis.

I was really happy with the willingness of my team to roll out the above form (obviously with ‘Ollie’ changed to their individual names) in their own classes, and the insights gained were very illuminating. But coupling these feedback forms with my own observations provided an even bigger insight for me: it surprised me just how differently students (novices when it comes to principles of instruction) and I (a relative expert) view what happens in a classroom.

From this it’s become more apparent to me that if I want student feedback to more effectively drive my own professional development, I need to start asking better and more targeted questions that will allow me to see exactly where my teaching is excelling, and where I’m falling short.

So, here’s a first draft of the new feedback questions (which I’ll eventually turn into a Google form). I’ve based it on the Sutton Trust’s report What makes great teaching? Review of the underpinning research, headed up by Robert Coe. I’ve used the first four of the six “common components suggested by research that teachers should consider when assessing teaching quality” (p. 2). These are the components rated as having ‘strong’ or ‘moderate’ evidence of impact on student outcomes, and they’re also the components with observable outcomes in the classroom (5 and 6 are ‘Teacher Beliefs’ and ‘Professional Behaviours’, which encapsulate practices like reflecting on praxis and collaborating with colleagues).

For each of the following I’ll get students to rate the sentence from 1 (strongly disagree) to 5 (strongly agree), in the hope that this will give me a better idea of how students interpret the various components of my teaching and teacher disposition.

I’ll also add a question at the end along the lines of ‘Is there anything else you’d like to add?’.

I’ve numbered the Qs to make it easy for people to comment on them on Twitter. This is a working document, and today is the second day of our two-week Easter break. I’m keen to perfect this as much as possible prior to Term 2. Please have a read; I’d love your thoughts and feedback : )


Link to Twitter discussion here.

Edit: A copy of the live form can now be viewed at:

Four (of the six) components of great teaching (Coe et al., 2014): Questions.
1. (Pedagogical) content knowledge (Strong evidence of impact on student outcomes)

Student friendly language: Knowledge of the subject and how to teach it.

The most effective teachers have deep knowledge of the subjects they teach, and when teachers’ knowledge falls below a certain level it is a significant impediment to students’ learning. As well as a strong understanding of the material being taught, teachers must also understand the ways students think about the content, be able to evaluate the thinking behind students’ own methods, and identify students’ common misconceptions.

1.1 This teacher has a deep understanding of the maths that they teach you. They really ‘know their stuff’.

1.2 This teacher has a good understanding of how students learn. They really ‘know how to teach’.


If you have any comments on this teacher’s knowledge of the content and how to teach it, please write them below.

2. Quality of instruction (Strong evidence of impact on student outcomes)

Student friendly language: Quality of instruction

Includes elements such as effective questioning and use of assessment by teachers. Specific practices, like reviewing previous learning, providing model responses for students, giving adequate time for practice to embed skills securely and progressively introducing new learning (scaffolding) are also elements of high quality instruction.


2.1 This teacher clearly communicates to students what they need to be able to do, and how to do it.

2.2 This teacher asks good questions of the class. Their questions test our understanding and help us to better understand too.

2.3 This teacher gives us enough time to practice in class.

2.4 The different parts of this teacher’s lessons are clear. Students know what they should be doing at different times throughout this teacher’s lessons.

2.5 The way that this teacher assesses us helps both us and them to know where we’re at, what we do and don’t know, and what we need to work more on.

2.6 This teacher spends enough time revisiting previous content in class that we don’t forget it.

If you have any comments on the quality of this teacher’s instruction, please write them below.

3. Classroom climate (Moderate evidence of impact on student outcomes)

Student friendly language: Classroom Atmosphere and Student Relations

Covers quality of interactions between teachers and students, and teacher expectations: the need to create a classroom that is constantly demanding more, but still recognising students’ self-worth. It also involves attributing student success to effort rather than ability and valuing resilience to failure (grit).


3.1 Students in this teacher’s class feel academically safe. That is, they don’t feel they’ll be picked on (by teacher or students) if they get something wrong.

3.2 Students in this teacher’s class feel socially safe. That is, this teacher promotes cooperation and support between students.

3.3 Even if I don’t get a top score, if I try my best I know that this teacher will appreciate my hard work.

3.4 This teacher cares about every student in their class.

3.5 This teacher has high expectations of us and what we can achieve.

If you have any comments on the atmosphere of this teacher’s classroom, or their student relations, please write them below.

4. Classroom management (Moderate evidence of impact on student outcomes)

Student friendly language: Classroom Management

A teacher’s abilities to make efficient use of lesson time, to coordinate classroom resources and space, and to manage students’ behaviour with clear rules that are consistently enforced, are all relevant to maximising the learning that can take place. These environmental factors are necessary for good learning rather than its direct components.


4.1 This teacher manages the class’s behaviour well so that we can maximise our time spent learning.

4.2 There are clear rules and consequences in this teacher’s class.

4.3 This teacher is consistent in applying their rules.

4.4 The rules and consequences in this teacher’s class are fair and reasonable, and they help to support our learning.

4.5 Students work hard in this teacher’s class.

If you have any comments on this teacher’s classroom management, please write them below.

Final Open-ended Question

If you have any further comments or questions in relation to this teacher, please feel free to share them below.
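Once responses come in, the 1–5 ratings could be averaged per component with a short script. This is a hypothetical sketch only: the question IDs, component names, and ratings below are illustrative, not real survey data.

```python
# Hypothetical aggregation of 1-5 Likert ratings per teaching component.
# All data below is made up for illustration.
from statistics import mean

# Question ID -> list of student ratings (illustrative).
responses = {
    "1.1": [5, 4, 5],
    "2.3": [3, 4, 2],
}

# Which component each question belongs to (illustrative mapping).
component_of = {
    "1.1": "Content knowledge",
    "2.3": "Quality of instruction",
}

def component_means(responses, component_of):
    """Average all ratings within each component, rounded to 2 d.p."""
    grouped = {}
    for q, ratings in responses.items():
        grouped.setdefault(component_of[q], []).extend(ratings)
    return {c: round(mean(r), 2) for c, r in grouped.items()}
```

With the sample data above, `component_means(responses, component_of)` returns a per-component average that makes it easy to spot which of the four components students rate lowest.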



Working towards a more evidence informed Professional Development Review process.

My school is currently reviewing our PDR process. As the new head of senior maths, I see this as a really crucial time to step up and try to bring some things to the table that will ensure that, as a team, the senior maths teachers are teaching in an evidence-informed fashion.

I’m posting now, prior to submitting final ideas to our college, in order to share some thoughts and hopefully open up a discussion with others so that I can improve and optimise this process.

In partnership with my colleagues, we’ve brought in a whole new instructional process this year at our senior college. At the moment we’re working on bedding it down, and having input into the PDR process means ensuring that we’re all being asked by leadership to provide evidence for instructional practices that we actually think are going to contribute to student learning.

I’ve drafted the document below as a list of things that I myself would like to be measured against and I’m looking to take this to our maths team meeting soon to see if there’s anything that the team would like to add or subtract as we make our submission to leadership. (Hover over the top right of the doc to open in a new page).

I’d love any thoughts or comments on what I’ve put together thus far and how it can be improved.

Note: The ‘goals’ across the top come from our pre-existing PDR process. They’re non-negotiable so each of the elements I’ve included below will fit under those three goal headings (I’ll work out which goes where later, they’re each broad enough that alignment shouldn’t be an issue).

Note 2: SIM stands for ‘Sunshine Instructional Model’, we have a pre-established instructional model so I’ve just highlighted the main points that I think map really well onto that.

Any thoughts or comments gratefully received : )


Edit: I have replaced the original version with the most recent version, as attached below.

What’s my function? Derivative game to deepen learning

Edit: This activity follows a broadly constructivist approach. Since running this activity last year, and in light of the evidence, my view on constructivist teaching methods has changed. If I were to use this activity again, I’d change the order. That is, I wouldn’t run it as a way to try to get students to discover the embedded concepts, instead I’d run it as a way to consolidate the knowledge of concepts previously taught via direct instruction. I leave old posts like this on my blog because I think it’s valuable to have a record of how my views and approaches to teaching have changed over time, and continue to change. 

At the recent Educational Changemakers 2016 conference (EC16) I came across the work of Shane Loader and the Empowering Local Learners project. After an engaging chat at the post-conference drinks, Shane was generous enough to share with me some of the top blogs that he follows as well as a link to the Empowering Local Learners (ELL) project that he’s currently working on. Whilst browsing the site I found a great lesson plan on surface area and volume that gave me an idea for my Methods class. The activity: ‘What’s my function?’.

The previous few lessons had introduced the derivative function. Students were able to calculate the derivative of a polynomial both from first principles and from ‘the rule’. The goal of this lesson was to scaffold students to make the link between the zeroes of a derivative function and the turning points of its parent/primitive function.
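To make that target link concrete, here’s a worked example of my own (not one from the lesson itself):

```latex
f(x) = x^3 - 3x \quad\Rightarrow\quad f'(x) = 3x^2 - 3 = 3(x - 1)(x + 1)
```

Setting f′(x) = 0 gives x = ±1, and these are exactly the x-coordinates of the turning points of f: a local maximum at (−1, 2) and a local minimum at (1, −2).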

I split my (luxuriously small) class into two groups and gave each group one of the following instruction sheets.

Students were excited! They were straight onto Desmos to explore their functions.

Students using desmos to check their functions and derivatives
Then they jumped into asking their questions. Here’s how the discussion went…

first functions and derivatives questions and answers

The students were taking the activity really seriously, and being very competitive. Due to their deep consideration of the questions they were going to ask, they were really taking their time, and we were fast approaching the end of the period. With about 15-20 minutes to go I took advantage of my teacher privilege with the following change to the rules (I recorded the following in class).

Here’s the chat that followed… (The team on the left knew their original function was a cubic; the other team knew they were looking for a quartic.)

function and derivative questions and answers

The team on the left got the hint and jumped straight into asking for the zeroes of the derivative to find the turning points. The team on the right tried a similar thing, but only on their second turn, and they were playing copy-cat so didn’t quite know why they were asking the question. Results? Here’s what was produced…

What's my function, results

By this time we were up against the end of the period, so I took a photo of the functions to prompt discussion at the start of the following lesson.

In the following period, I asked one basic question of the team who drew the function on the bottom right: “Why did the other team ask for the zeroes of your derivative function?”. A long discussion ensued, but we managed to boil it down to the following spaced repetition card…

Screen Shot 2016-09-15 at 7.44.41 pm

I guess I could have just told them that at the start, but that would have taken out the joy and excitement of making the connection themselves*. We’ll see if it helped it stick… term break starts tomorrow!


*Many readers will already be familiar with Paul Lockhart’s ‘A Mathematician’s Lament‘ on this point.