Category Archives: *Cognitive: Science of Learning


Present new material in small steps with student practice after each step: How’s it look?


The second recommendation in Rosenshine’s ‘Principles of Instruction’ is “Present new material in small steps with student practice after each step”. The basis for this recommendation is that working memory is limited and that, for learning to occur, it’s important to avoid overloading it. But that isn’t the focus of this post. In this post I just want to share what ‘new material in small steps with student practice after each step’ can look like in the classroom.

As a rule of thumb, the longer a teacher talks, the more likely they are to deliver enough information to overload their students’ working memory. As I reflected upon this point, prompted by Craig Barton’s recent in-depth interview with Kris Boulton, I found myself thinking, ‘I wonder how long I talk for?’ It was time to collect some data.

Next lesson I split my notebook into three columns: ‘explain’, ‘student work’, and ‘check solution’ (I always teach my maths lessons in an ‘I do’ then ‘you do’ format, then go over the solutions as a class), and I got to recording! In the first class I got distracted and fell off the timing bandwagon (first half of the page), but in the second class I remembered to stay on task, and that whole class (90 mins) is recorded in the image below (red box).

To set the scene, I wanted students to be able to answer the exam question presented by the end of the lesson. This required them to be able to go from a transition diagram and an initial state matrix to the result after multiple periods, with or without the addition of extra units each period, as well as to determine the result of such transitions ‘in the long run’, and to work backwards through such a relation. I split this up into the following sub-steps for the purposes of instruction.

  • Constructing a transition matrix from a transition diagram
  • Applying a transition diagram to interpret a transition
  • Applying a transition matrix to interpret change after one transition
  • Understanding transition matrices as recurrence relations, and finding results after multiple periods with a formula (see the sketch after this list)
  • ‘In the long term’: steady state solutions to transition matrices
  • Results after multiple periods (brute force, i.e. with a calculator)
  • Transition matrix modelling when the total number of units changes
  • Working backwards in matrix multiplications
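
For readers who haven’t met this content, here’s a sketch of the standard formulation that these sub-steps build towards (my notation here, not the original lesson materials): with transition matrix T, state matrix S_n after n periods, and B extra units added each period,

```latex
S_{n+1} = T\,S_n + B
\qquad \text{and, when } B = 0, \qquad
S_n = T^{\,n} S_0 .
```

The ‘in the long term’ step asks for the steady state S satisfying S = TS (+ B), and ‘working backwards’ amounts to S_n = T^{-1}(S_{n+1} - B), assuming T is invertible.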

[Image: my notebook timing record (red box)]

The astute observer will note that the total time adds up to about 60 mins. The additional time was taken up with approx. 20 mins of revising previous content and 10 mins talking about an upcoming assessment and doing a ‘brain break’.

Below is the lesson as I presented it, with the timing for each segment added in italics. (Images weren’t in the original, as students had all questions in front of them; I’ve added them for readers here.)

I found it really valuable to look at the timing of my lessons in this level of detail. I’d love to know if it’s prompted any similar reflections for you.

References:

Rosenshine, B. (2012). Principles of Instruction: Research-Based Strategies That All Teachers Should Know. American Educator, 36(1), 12.

 

Assessment feedback: Processes to ensure that students think!

We know that ‘memory is the residue of thought’ (Daniel Willingham) and that, in order for our students to learn, they must actively think about the content to be learnt. This allows the content to occupy their working memory for long enough, and to become anchored to sufficient elements in their long-term memory, to trigger a change in long-term memory, which is one of the well-respected definitions of ‘learning’ (Paul Kirschner).

One of the arenas of teaching in which this can be most challenging is feedback delivery to students. Dylan Wiliam sums it up well in the quote below (which I came across thanks to Alfie Kohn).

Note: The original quote is “When students receive both scores and comments, the first thing they look at is their score, and the second thing they look at is…someone else’s score”, and can be found here (beware the paywall). 

The challenge is, then, how do we give feedback to our students in a way that encourages them to actively think about their mistakes, and helps them to do better next time?

In the following I’ll share how I give feedback to students in two contexts. The first is on low stakes assessments that I carry out in my own classroom, the second is on major assessment pieces that contribute towards their final unit mark.

Assessment Feedback on Weekly Progress Checks.

Before we dive in I’ll just paint a picture of how my weekly ‘Progress Checks’ fit into my teaching and learning cycle, and how each of these elements is informed by education theory.

At the start of each week students are provided with a list of ‘weekly questions’. They know that the teaching during the week will teach them how to answer these questions. Questions are to be aligned with what we want students to be able to do (curriculum and exams) (Backwards Design). Students are provided with worked solutions to all questions at the time of question distribution (The worked example effect). The only homework at this stage of the cycle is for students to ensure that they can do the weekly questions.

‘Progress Checks’ (mini tests, max 15 minutes) are held weekly (Testing Effect). Progress Checks include content from the previous three weeks, which means that students see the main concepts from each week for a month (Distributed Practice). These PCs are low-stakes for year 11 students (they contribute 10% to the final overall mark); in year 12, where assessment protocols are more specifically defined, they are simply used to inform teachers and students of student progress.

Edit: Here’s a new post on how I use student responses to these PCs to construct the next PCs. 

When designing the progress checks I had two main goals: 1) ensure that students extract as much learning as possible from these weekly tests, and 2) make sure that marking them doesn’t take up hours of my time. The following process is what I came up with.

Straight after the PC I get students to clear their desks, hand them a red pen, and do a think-aloud for the whole PC while they mark their own papers. This is great because it’s immediate feedback and self-marking (see Dylan Wiliam’s black box paper), and it allows me to model the thinking of a (relative) expert and to be really clear about what students will and won’t receive marks for. Following this, any student who didn’t attain 100% on the progress check chooses one question that they got incorrect and does a reflection on it based on four questions: 1) What was the question? 2) Which concept did this address? 3) What did you get wrong? 4) What will you do next time?

Here are some examples of student self-marked progress checks and accompanying PC reflections from the same students (both from my Y11 physics class). Note: Photos of reflections are submitted via email and I use Gmail filters to auto-file these emails by class.

[Image: a student’s self-marked Progress Check]

Note how this student was made aware of consequential, or ‘follow-through’, marks on question 1.

Here’s the PC reflection from this same student (based upon question 2).

[Image: the same student’s PC reflection]

Here’s another student’s self-marked Progress Check.

[Image: another student’s self-marked Progress Check]

And the associated reflection.

[Images: the associated PC reflection]

Students are recognised and congratulated by the whole class if they get 100% on their progress checks, as well as one student from each class winning the ‘Best PC Reflection of the Week’ award. This allows me to project their reflection onto the board and point out what was good about it, highlighting an ideal example to the rest of the class, celebrating students’ successes, rewarding students for effort, and framing mistakes as learning opportunities.

I think that this process achieves my main two goals pretty well. Clearly these PCs form an integral learning opportunity, and in sum it only takes me about 7 minutes per class per week to enter PC marks into my gradebook.

Assessment Feedback on Mandated Assessment Tasks.

There are times when, as teachers, we need to knuckle down and mark a bunch of work. For me this is the case with school assessed coursework (SACs), which contributes to my students’ end of year study scores. I was faced with the challenge of making feedback on such a test as beneficial to my students’ learning as the PC feedback process is. Here’s what I worked out.

  1. On test day, students receive their test in a plastic sheet and unstapled.
  2. At the start of the test, students are told to put their name at the top of every sheet.
  3. At the end of the test I take all of the papers straight to the photocopier and, before marking, photocopy the unmarked papers.
  4. I mark the originals. (Though the photocopying takes some time, I think that in the end this process makes marking faster because a) I can group all the page 1s together (etc.) and mark one page at a time, which is better for moderation too, and b) I write minimal written feedback, because I know what’s coming next…)
  5. In the next lesson I hand out students’ photocopied versions and I go through the solutions with the whole class. This means that students are still marking their own papers and still concentrating on all the answers.
  6. Once they’ve marked their own papers I hand them back their marked original (without a final mark on it, just totals at the bottom of each page), they identify any discrepancies between my marking and their marking, then we discuss and come to an agreement. This also prompts me to be more explicit about my marking scheme as I’m being held to account by the students.

In Closing

I’ve already asked students for feedback on the progress checks through whole class surveys. The consensus is that they really appreciate them, and that they like the modelling of the solutions and the self-marking too. Happily, putting together this post prompted me to contact my students and ask for feedback on the self-marking process for their photocopied mandated assessment task. I’ll finish this post with a few comments that students said they’d be happy for me to share. They also provide some great feedback to me for next time.

I’d love any reflections that readers have on the efficacy of these processes and how they could potentially be improved.

From the keyboards of some of my students (3 males, 3 females, 5 from Y12, one from Y11).

[Images: student feedback comments]

Edit:

A fellow maths teacher from another school in Melbourne, Wendy, tried out this method with a couple of modifications. I thought the modifications were really creative, and I think they offer another approach that could work really well. Here’s what Wendy said.

Hey Ollie,

I used your strategy today with photocopying students’ sacs and having them self correct. The kids responded so well!

Beyond them asking lots of questions and being highly engaged, those that I got feedback from were really positive saying how it made them look at their work more closely than they would if I just gave them an already corrected test, understood how the marking scheme worked (and seeing a perfect solution) and they liked that they could see why they got the mark they did and had ‘prewarning’ of their mark.

Thanks heaps for sharing the approach.
A couple of small changes I made were:
  • I stapled the test originally then just cut the corner, copied them and then restapled. It was very quick and could be done after the test without having to put each test in a plastic pocket
  • I gave the students one copy of the solutions between two. Almost all kids scored above 50% and most around the 70% mark, and I didn’t want them to have to sit through solutions they already had.

If you have thoughts/comments on these changes I’d love to hear them.

Thanks again!

References

Find references to all theories cited (in brackets) here.

Sweller’s Goal Free Effect… giving it a go.

Thanks to a recent tweet by Dylan Wiliam, and a great article that it linked to by Michael Pershan, I gained a fuller understanding of a cognitive effect that I’ve been exploring recently (see this paper), the ‘goal free effect’.

Discovered by John Sweller, it essentially posits that explicitly trying to solve a problem can result in a lot of ‘attention’ or ‘working memory’ (see here for a discussion of which term to use) being expended in the search process, limiting (or eliminating) the working memory available for ‘learning’ from the actual task. The result is that the problem gets solved, but the problem solver fails to make any generalisations from the solution and won’t necessarily be able to do it again in future.

It doesn’t come across as a particularly complex theory, but what I’ve been trying to work out is how to make it work in a classroom. I read Pershan’s post but was keen to know more about linking the goal free approach to the explicit learning intentions that the teacher has for the lesson (we discuss that here if you’d like a bit more detail).

Sometimes it takes trying something out to get your head around it, and I was determined to do so. This week I encountered a question that I wanted my students to be able to solve, and I thought the goal free effect might be relevant. Here’s the question (see part b):

[Image: the exam question used with the goal free effect]

I recognised that there was a danger here. This was a relatively open question and I anticipated that several of my students would find it difficult. I could see many of them just staring at the table without making connections. Then, after some work time and a few prompts, I’d show a solution (or they’d find it themselves in the resource) but, because they’d been so solution focussed along the way, they’d just write the provided solution down and try to memorise it (the provided solution focussed only on the trend for ages 19 and under), failing to see all of the associations that they could have pointed out in the table.

What I did instead was try out Sweller’s theory.

I clipped out the table and showed it by itself on the whiteboard with my projector. I then asked ‘Look at this table… What can you tell me by looking at it? Do you notice any patterns?’

I also gave the following hint: ‘Focus on one row ( ← a row goes like this → ) at a time’.

We then shared as a class, and it was an incredibly rich discussion. What I hadn’t anticipated was how asking such a question reduced the barrier to participation for students. I had students point out the patterns for each of the age groups, but I also had one student say ‘The years go up in 10s’ and another similarly volunteer that ‘The years all end in 6’. This was in addition to associations being found between the year of first marriage and age of first marriage for each of the age categories in the table.

I then gave each student a half sheet of A4 paper and got them to put their association into words (I’d identified from the discussion that students were struggling to put their thoughts into formal mathematical terms, so I wanted them to make these descriptions less transient by eliciting a written response) and collected up these bits. I read some out and, as a group, we identified what it was that made the strong ones strong. I hadn’t anticipated this at all, but we ended up making a template for answering such questions. Here it is:

[Image: the template from the goal free effect activity]

For me this was an incredible experience. We’d made it all the way from an open question to a generalisation, and scaffolded literacy along the way too (I work in a very low SES school with a large English as an Additional Language student base; literacy needs are a constant in all classes), something I’d failed to anticipate in my planning.

In carrying out this activity I managed to get a much deeper understanding of how the goal free effect can work, and how it can be tied into a generalisation directly in line with my learning intention for this segment of the lesson (FYI, the explicit learning intention was for them to be able to identify associations from a two-way contingency table then describe the association and back up their claim with data from the table). In future cases, especially when there’s a lot going on in a diagram (see ‘split attention effect’ on bottom left of page 6 in this paper) I’ll definitely have the goal free effect in the back of my mind as one option in my teacher toolbox.

 

If students can’t be little scientists, can PSTs be little teachers?

When I read the above tweet I made a connection. A lot of people have been writing recently (and not-so-recently too) about the fact that trying to teach students science by the scientific method doesn’t work because novices approach problems in different ways to experts. Novices don’t have the same background knowledge as their expert counterparts, meaning that they don’t have sufficient info in their long-term memory to evaluate complex problems and are essentially rendered ineffective in complex situations due to overloaded working memory.

But have we applied this to our teacher training too? Is it reasonable to expect a pre-service teacher to comprehend and apply the science of learning in a complex classroom that requires them to simultaneously apply content knowledge, pedagogical knowledge, and pedagogical content knowledge?

I’m thinking about what load reduction instruction would look like for pre-service teachers. Surely we’d need to “Present new material in small steps with student practice after each step”, as Rosenshine tells us.

A good lesson is made up of a balanced confluence of clear instruction and searching questions, teacher direction and independent work time (etc.), with key transition points at the junctions between each of these facets of the lesson. Maybe micro-teaching is the gateway for novice teachers to master these skills, with an expert teacher guiding and holding the other elements of the class as the novice focusses on one at a time.

Food for thought, and something I’ll be keeping in mind when I take on my first pre-service teacher later this year.

Lessons from Myanmar: Cartesian co-ordinates and Fruit Salad

On November the 8th, whilst millions of excited Burmese voters headed to the polls, I arrived bleary-eyed and hungry from an overnight bus ride in Myanmar’s second largest city, Mandalay. Along with 9 other pre-service teachers and our two group leaders, I was about to start a two week teaching placement at the Phaung Daw Oo Monastic School.

This post is about what I learned from that two week placement. I’ve chosen to present it through the ‘case study’ of adapting a lesson on Cartesian co-ordinates from the Australian to the Burmese context. I feel that the process of adaptation provides a good framework for me to discuss and explore my two main learnings from the trip, one on the use of language in the classroom, and the other on modelling.

The genesis for this lesson was in August of 2014 when I was lucky enough to attend a Dan Meyer workshop at the Love Learning Conference in Sydney. Dan introduced attendees to a ‘fruitful’ approach to introducing Cartesian co-ordinates to students. Almost exactly a year after that first workshop, I used a very similar approach in the first lesson of my second placement with a class of year 9 students.

The following video shows how I did it in Aus[1].

Fruit Slideshow (in case you’d like to use all or part of it in your class)

Students appeared really engaged in this approach. They loved the debates about who was better at describing things, and about which fruit was tastier or ‘easier’ than the other. Upon leaving one student remarked, “That was a really good lesson, sir.”

When I arrived in Myanmar and my associate teacher, Thanta, told me that it would be good if I could do some lessons on Cartesian co-ordinates (she called them rectangular co-ordinates), this introduction seemed an obvious choice[2]. The task became: how to adapt it to the local context? There were three main challenges that I anticipated.

  1. Would students understand the concept of ‘rating scales’?
  2. Very limited English
  3. No electricity

Addressing challenge #1 – Rating scales:

This wasn’t too hard. It required me to pace out the ‘Tasty’ spectrum (I altered it to go from -5 to 5 to make it clearer on the blackboard), miming delicious at one end (rubbing my stomach and making contented chewing sounds) vs. disgusting at the other (pretending to be sick), then doing this a few more times. That was the easy part…

Addressing challenge #2 – Very limited English:

The first big lesson for me from this placement came through the challenge of surmounting the language barrier. To give you an idea of the level of English competency of my classes, if I asked ‘Please get your books out and write this down’, about half of the class would understand the instruction and begin, and the other half would copy the half that understood. This is where I was able to explore, in context, the concept of CLIL (Content and Language Integrated Learning). What CLIL means in a maths class is that the goal of every lesson isn’t just to teach maths concepts, but to simultaneously teach maths concepts as well as the language required to understand and talk about them.

Here’s what my lesson plan for the Myanmar fruit lesson looked like…

The green box down the bottom was the text I put on the board to help structure the lesson (note: it was way too ambitious; we only made it to ‘rectangular co-ordinate systems’).

The orange box up top represents the sentence that I wrote on the board to introduce the idea of a ‘scale’ to students.

You’ll note I changed the y-axis title from ‘easy’ to ‘easy to eat’ to reduce ambiguity.

More broadly, I adopted the Q:/A: format that permeates the lesson plan as a way to teach students that I wanted them to answer in full sentences (related to the CLIL approach). They were quite strong with their numbers; for example, if I asked ‘How tasty is a pineapple?’ they would be quite confident saying ‘minus 5[3]’, but saying ‘On a tasty scale of -5 to 5, I think a pineapple is -5[4]’ was a serious challenge for all but a few of them. Through this challenge, and the placement more generally, I became aware of the importance of scaffolding the language needed to express the ideas. As can be seen, this whole lesson plan is structured around supporting the students to employ a few basic sentences, inclusive of key vocabulary, to communicate the mathematical ideas. For me this was a real revelation and an approach that I’ll definitely be taking into classes in future. In the past I’ve expected students to be able to replicate the language that I use, and to intuitively employ the relevant metalanguage, but this just isn’t realistic. Only by reflecting back on my Australia-based placements through this new CLIL lens have I been able to understand how much of a barrier language was for my students there, and how much more a language-conscious approach would have helped them. Every question a student answers is an opportunity to encourage them to employ your discipline-specific metalanguage.

Addressing challenge #3 – No electricity (and foreign fruit):

Pictures help. Here’s how I tackled this one…

[Images of the blackboard activity. Photographers: Gabriella Sabatino and Kira Clarke]

Printed-out tropical fruits, with the English names on them, and a bit of elephant snot (blu-tac) did the trick. If you look closely at the board you can see the sentences from the lesson plan’s green and red boxes there.

And here’s the PDF of the fruit, in case you ever find yourself teaching co-ordinate systems in a tropical area.

This brings me to my second big lesson from the placement: modelling. The electronic approach that I used in the Aussie context was great for a few reasons: it was clear and easy for students to see, it was quick and enabled me to move through the lesson efficiently, and it was dynamic, allowing quick transitions between tasks. But it had one major flaw: my axes just ‘appeared’ on the screen. This has ramifications for board and book work more generally.

Traditionally my board work has been pretty atrocious. Focussed on the clarity of my digital presentations, I’d often free-draw my axes and scribble working all over the board in whatever spots I found free. In the words of one of our team leaders, ‘Everything you do is modelling’. I hadn’t thought about this before, and hadn’t noticed the intrinsic contradiction between my unstructured board work and my expectations for students to be neat with their book work. It also meant passing up opportunities to discuss some of the key elements of tasks like drawing axes: deciding how far apart to set your numbers and leaving space for axis and chart titles.

The thing that really drove this modelling lesson home for me was the language barrier. There was no way I could rely on scribbling something up on the board and then verbally explaining to students the key points and the ‘don’t forgets’. That would just be met with blank faces (as it was in my first lesson…). Against the backdrop of this linguistic challenge, the importance of modelling was made undeniably obvious to me.

Summary of key lessons:

This post has really been a combination of two. One on what I have found to be an engaging approach to introducing Cartesian co-ordinates to students, and the other on two key lessons that I learned through the process of adapting this approach for a group of students in Myanmar.

The first key lesson was the importance of CLIL (Content and Language Integrated Learning). In the same way that this blog post has introduced the term CLIL to readers (I’m assuming it’s new for at least some readers) at the same time as introducing the concept itself, I will in future strive to better scaffold and explicitly teach the linguistic skills required by my students, in conjunction with the teaching of concepts.

Secondly, I’ve come to appreciate the value and importance of clear modelling in the classroom. This is important for all students, but especially for those who have English as an additional language or auditory processing challenges[5].

In a recent speech that I gave at the Australian College of Educators Media Awards, I spoke about how stepping into different realms can often bring the most important lessons and opportunities for innovation. For me, this Myanmar teaching placement brought this assertion home more profoundly than I could have expected. The chance to teach in such a different setting, and with such a supportive team[6], was the catalyst needed to bring me back to basics and strengthen some of the core foundations of my teaching praxis. I was able to consolidate many of the lessons learned in this first year of my Masters, and I look forward to continuing this fascinating journey through my teaching and research project next year.

 

For Dan’s very short mention of the fruit salad activity, check out this post (scroll down to ‘Anyway. Part 1’). His posts on personality co-ordinates and this one on co-ordinate battleships are also definitely worth a look!

A big thank you to fellow Maths teacher Dot Yung for her edits and suggestions during the writing of this blog post : ) 


[1] I recognise that using the binaries of ‘boys’ and ‘girls’ could be exclusionary to some more gender diverse students. In future I will choose non-gendered categories for such an exercise. Check out the work of the Safe Schools Coalition for more on this.

[2] I only adapted the core ‘fruit salad’ element of the lesson and not the dot related introduction.

[3] I would usually teach students to say ‘negative five’ but being there for only 2 weeks and knowing they’d return to ‘minus 5’ after my departure I didn’t bother to insist on the change of terminology.

[4] Pineapples are DEFINITELY not -5 on a tasty scale of -5 to 5… Unless they’re not that ripe, in which case I find that they give me ulcers…

[5] If you’re keen to experiment with the importance of modelling in your own classroom, I came up with the idea of challenging teachers to teach a lesson in silence! Please let me know if you decide to take this challenge on!!

[6] Every class that I taught was observed by another teacher who gave me feedback and suggestions for improvement. This post has underplayed the importance of the ‘instructional rounds’ approach taken by the team, and the impact that that had on all of our teaching. The opportunity to have my own lesson critiqued as well as analyse other teachers’ lessons, and hear what was seen through other people’s eyes, was invaluable beyond my expectations. This instructional rounds approach is a process that I hope to learn more about and do more in future. As a starting point I plan to read the article: Ensuring Instruction Changes: Evidence Based Teaching–How Can Lesson Study Inform Coaching, Instructional Rounds and Learning Walks?

 

Tying Together Backwards Design, Self Marking and Criterion Referencing for effective teaching

You may have heard of the concept of Backwards Design before. Often associated with Grant Wiggins, the concept basically states: ‘Work out what you want your students to know, then design your lessons with that end in mind.’ Pretty simple, and self-explanatory. But the question I had on my recent placement was ‘How can I do this practically, and how can I effectively guide students through the learning process and offer useful feedback along the way?’

My ‘design brief’ (as specified by the Maths department) was to use the year 9 Australian Curriculum mathematics textbook and, over 4 weeks, cover all of the content in the chapter on Linear Equations and Algebra. But I was keen to make this more explicit for students.

Backwards Design

I had been inspired by Sarah Hagan’s work on ‘I Can’ sheets, so I wanted to tie these sheets in with the idea of backwards design.

I surveyed the chapter, looking for a good way to organise this information, and found that each of the worked examples taught a unique competency (or an important but incremental advancement of a previously taught competency), and that they were well labelled. Here’s an example:

[Image: worked example from the textbook]

(Source: textbook)

So I took these competencies and constructed an ‘I can’ sheet that I distributed to students on day 1 of my placement. The ‘I can’ sentence from the above example was ‘I can substitute values into expressions and evaluate’.

In the above ‘I can’ sheet, the ‘My pg.’ column was designed as a place for students to write the page number in their book that corresponds to that ‘I can’ statement. For example, if they stuck these two sheets into their book and began numbering straight after them, then the first lesson, covering example 1 (E1), would be on page 1, so they would write a ‘1’ in the ‘My pg.’ column. The ‘Key points’ space was for students to write their own key points at the end of every lesson. The Tinycc link took students to videos of the content that I made (I hope to upload these videos to YouTube in the near future).

So that was the backwards design, now it was time to tie it into assessment.

I put together a pre-test for the unit. This pre-test assessed students on E1 to E11, inclusive*, and each of the questions was numbered accordingly (as can be seen below). The test was administered by my mentor prior to my actual placement, and I was able to pick the tests up and mark them to get an idea of where each of the students was at even before I entered my teaching role. I didn’t return these tests to students, and told them explicitly: ‘The pre-test was about me working out where you guys are at the moment and helps me identify misconceptions so I can directly address them when I introduce the content. Your mark doesn’t matter; what matters is what the test told me about how each of you is currently thinking about this maths.’

Now, the goal of the backwards design was to show students where we were going in terms of content, and to give them a clear roadmap of the stops along the way. The goal of assessment was to help me (pre-test) and them (mid-unit test) to see how they were tracking along this path.

My placement was 4 weeks long, and at the 2 week mark students were given a mid-unit test. Little did they know that this mid-unit test was EXACTLY THE SAME as the pre-unit test! (One of the benefits of not returning their pre-test ; ) This gave me a perfect picture of how each of them had progressed.

But, as mentioned, I was very keen to help the students become evaluators of their own learning. To do this, I got them to self-mark. Here’s the process…

Criterion Referencing and Self Marking

I printed out the mid-unit tests (which I referred to as the ‘mid-unit checkup’) double sided on a single A4 sheet. This allowed me to give students the sheets, then collect them up and run them as a batch through the photocopier, giving me both their original and a photocopied version. I then marked the photocopied version so that I had the marks myself and, in the following lesson, handed back the unmarked original to students, stapled together with two other sheets.

The first sheet stapled behind the original mid-unit test was a set of clearly worked solutions, the second sheet was the following…

For each of the students I filled out the ‘Based on pre-test’ column with either a tick, arrow, or cross (feel free to read the instructions on the above sheet so that this makes sense), and I then encouraged students to fill out the ‘Based on mid-unit check’ column in the same fashion.

[Image: the competency tracking sheet]

This did a couple of things. Firstly, it enabled students to link the competencies to questions, reinforcing what metalanguage such as ‘common factors’ and ‘collecting like terms’ meant. Secondly, it showed them the areas that they still needed to work on. Equally importantly, it showed students how they had progressed; they were able to say things like ‘Great, on the pre-test I couldn’t simplify by collecting like terms, but now I can!’, which was a real plus.

The thing that I really appreciated was the ability to generate a graph of class average results from the pre- and mid-unit tests. I then showed this to the class.

[Image: graph of class average results on the pre-test and mid-unit test]

This graph really excited students. They were able to celebrate the progress that they had collectively made, and to identify what they still needed to work on. Students even noticed that for E1, which was ‘write algebraic expressions for word problems’, the class had actually gone backwards! This prompted conversation, allowed us to talk about how this was the first thing we covered in the unit, and provided an opportunity to re-visit the concept of the forgetting curve that I’d introduced them to earlier.
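
If you’d like to generate a similar chart yourself, here’s a minimal sketch of how it could be done with matplotlib (the competency labels and averages below are illustrative, not my students’ data):

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative data only: class average (%) for each competency.
competencies = ["E1", "E2", "E3", "E4", "E5"]
pre_avg = [55, 30, 42, 25, 38]
mid_avg = [48, 65, 70, 60, 72]

x = np.arange(len(competencies))
width = 0.4

# Paired bars make the pre/mid comparison easy to read per competency.
plt.bar(x - width / 2, pre_avg, width, label="Pre-test")
plt.bar(x + width / 2, mid_avg, width, label="Mid-unit test")
plt.xticks(x, competencies)
plt.ylabel("Class average (%)")
plt.legend()
plt.tight_layout()
plt.show()
```

(The dip in E1 in the made-up data mirrors the ‘gone backwards’ observation above.)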

Reflections

I was really happy with how this approach came off. I saw marked increases in engagement from students. Most importantly, they totally ‘got it’. Student feedback suggested that students gained a greater understanding of where they were and what they needed to work on. Here is what some of the students had to say about it (from a feedback form that I handed out on my final day of placement).

Next Time…

Next time I would like to improve upon this method by keeping the students’ ‘I Can…’ sheets all in one place, and preferably in a digital form, so that they can’t get lost and both students and I can access them from home. I’m thinking Google Sheets for this, but I’ll continue to consider options. This would allow them to take greater charge and track their own progress through the formative, mid-unit, and summative assessments.

I’m happy with how this approach went and I look forward to refining this approach when I’m next in the classroom.

The image below shows the class distribution of scores in the pre- and post-tests. I calculated my ‘effect size’ based on this information and was very pleased with the result : )

[Image: class score distributions for the pre- and post-tests, with effect size]
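
For anyone curious about the arithmetic, here’s a minimal sketch of the usual pre/post effect size calculation (Cohen’s d with a pooled standard deviation; the score lists are made up for illustration):

```python
from statistics import mean, stdev

def effect_size(pre, post):
    """Difference in means divided by the pooled standard deviation (Cohen's d)."""
    pooled_sd = ((stdev(pre) ** 2 + stdev(post) ** 2) / 2) ** 0.5
    return (mean(post) - mean(pre)) / pooled_sd

pre_scores = [12, 15, 9, 18, 11, 14]    # illustrative scores only
post_scores = [19, 22, 15, 25, 17, 21]
print(f"Effect size: {effect_size(pre_scores, post_scores):.2f}")
```

An effect size of 0.4 or above is often quoted (following Hattie) as representing a worthwhile year’s progress, though that benchmark has its critics.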

*I now think that it would be worth considering giving students a pre-test that contained ALL of the content from the unit (i.e., Example 1 to Example 21). This is because: 1) it would have given them a better idea of the answer to the question ‘Where are we going?’; 2) it would have given some of the students opportunities to problem solve and try to work out for themselves how to do it; and 3) I recently attended a lecture by Patrick Griffin in which he talked about how students shouldn’t EVER be getting 100% on tests, because that doesn’t actually give you accurate information on where they are up to; it only tells you what level of achievement they can perform above! Some students did get very close to 100% on this pre-test. But it is important to acknowledge that I hadn’t had the time to build a ‘have a go’ culture with these students, so some were even reluctant to attempt the pre-test, not quite understanding why they should be tested on something they hadn’t even been taught yet. Making the pre-test longer and harder by including all content from the unit could have been overwhelming for some students.

Metacognition: Can it help students problem solve?

I was recently doing a little reading into metacognition and began to wonder if it could be used as a tool in the maths classroom to help students, particularly with problem solving. I got hunting in the research and found the following paper.

Mevarech, Z. R., & Kramarski, B. (1997). IMPROVE: A multidimensional method for teaching mathematics in heterogeneous classrooms. American Educational Research Journal, 34(2), 365–394.

I got a lot out of it, and thought some others might like to hear how the IMPROVE method works.

This study was done in the late 90s in Israel, to test a modified teaching model based on the incorporation of three elements that aren’t always seen in the classroom:

  • Metacognitive training
  • Cooperative learning
  • Systematic provision of feedback-corrective-enrichment

I’ll expand on each of these a little below, then talk about the results of using them in tandem (as was done by Mevarech and Kramarski).

Metacognitive Training

[Image: ‘a picture is worth a thousand words’ (source: http://upload.wikimedia.org/wikipedia/commons/d/d2/A_picture_is_worth_a_thousand_words.jpg)]

I like to think about metacognition as stepping back from a situation and asking ‘How is my brain reacting to the stimulus here? And how would I like it to react?’ The IMPROVE method got students to begin to do this by introducing three new kinds of question to the mathematics classroom. These questions were made into cards and passed around when problems got challenging. They were:

Comprehension Questions: What’s the problem actually saying? Students were asked to ‘read the problem aloud, describe concepts in their own words, and discuss what the concepts meant or into which category the problem could be classified’ (p. 374).

Connection Questions: “How does this question relate to things that you’ve seen before?”

Strategic Questions: All about how you’re going to attack a problem. Ask “What strategy/tactic/principle can be used in order to solve this problem?”, “Why” and “How will you carry this out?” (p. 376)

The idea of these questions was to help students differentiate between equivalent problems (questions with the same structure and the same ‘story context’), similar problems (different structure but the same ‘story context’), isomorphic problems (same structure, different ‘story context’), and unrelated problems (very little in common). These categories are from Weaver and Kintsch (1992); the terminology wasn’t taught to students.

Cooperative Learning

The method used followed Brown and Palincsar’s (1989) approach, in which students were put into teams of four: one high, two middle, and one low achieving student (p. 377). As students progressed, teams were changed to maintain this structure. The paper states at this point that a question-answering technique based on that of Marx and Walsh (1988) was used, following a brief teacher exposition of approximately 5 minutes. I plan to further explore this questioning technique.

Feedback-Corrective-Enrichment

At the end of every 10 or so lessons (constituting a unit), students took a formative test to check their comprehension of the unit’s main ideas. Tests were based on the work of Bloom (1976). Students who didn’t achieve ‘mastery’ (taken as 80% correct) were given extra support to solidify the basics; students who did went on to enrichment activities. Essentially, a form of differentiation.

The Studies

The paper detailed two studies. The first included 247 year 7 students split into an experimental (n=99) and a control (n=148) group; the second consisted of 265 students (experimental n=164, control n=101). The first study was completed in a region of heterogeneous classrooms (i.e., students weren’t split into classes based on ability, or ‘tracked’; these classes spanned more than 5 ability years), whilst the second was undertaken in a district where tracking was the norm. The second study was run in order to see whether the IMPROVE method, applied for an entire year, would yield results as encouraging as those seen over the shorter period in Study 1, as well as to expand the topics to which the method was applied.

Study 1 applied IMPROVE to the topics of rational numbers, identification of rational numbers on the number axis, operations with rational numbers, order of operations, and the basic laws of mathematics operations.

Study 2 applied IMPROVE to the topics of numerals and rational numbers, variables and algebraic expressions, substitutions in algebraic expressions, linear equations with one variable, converting words into symbols, and using equations to solve problems of different kinds.

Tests were composed of computational questions (25 items) and reasoning questions with no computational requirements (11 items). The reasoning questions were marked with the following progressive scheme:

  • 1 point: justification by use of one specific example
  • 2 points: reference to a mathematical law, but imprecise
  • 3 points: mathematical law referenced correctly, but the conflict incompletely resolved
  • 4 points: question completely resolved, with correct reference to relevant mathematical laws

Based on pre-intervention test scores students were classified as low, middle or high achieving with pre and post test results compared within these groups.

Results

[Table: Study 1 results (p. 381)]

Study 1: No difference existed between the control and experimental groups prior to the intervention, but IMPROVE students significantly outperformed those in the control group post-intervention. Overall mean scores were 68.03 (control) vs. 74.72 (treatment) post-intervention (p<0.05), with mean scores on the reasoning component 53.15 (control) vs. 62.56 (treatment). Improvements were seen at all achievement levels.

[Tables: Study 2 results (p. 384)]

Study 2: As with Study 1, mean scores for the experimental group increased significantly more than those of the control group, with two important points to note. Firstly, only the gains for the high achievers group were statistically significant, with the medium achievers group being mildly significant (p = 0.052). Low achieving treatment students outscored low achieving control students in all cases, but this result wasn’t statistically significant. Secondly, these trends held for all topics except ‘operations with algebraic expressions’. It was suggested that this was because this unit required more practice than other units; being a more procedural topic, the benefits of metacognitive approaches weren’t as impactful.

Discussion

It’s clear that the IMPROVE intervention aided student achievement. It increased their ability to draw on prior knowledge to solve problems and to establish mathematical mental schemata that increased their ease of access to this prior knowledge. One challenge with this study (as outlined by Mevarech and Kramarski themselves) was that the three elements (metacognitive training, co-operative learning, and feedback-corrective-enrichment) were all applied simultaneously, making it impossible to distinguish how much each contributed to the observed effects. Another question surrounds how this method appeared to facilitate gains proportional to students’ starting points, with higher achievers improving relatively more than middle and low achievers.

The authors suggest the program was successful in the following ways. It:

  • made it necessary for participants to use formal mathematical language accurately
  • made students aware that problems can be solved in multiple ways
  • encouraged students to see different aspects of problems
  • gave high achievers opportunities to articulate and further develop their thinking processes, at the same time as letting lower achievers see these thinking processes modelled

Post-intervention, the IMPROVE method was implemented in all classes of all the schools in which the trials were performed.

Notes: I initially found this article via: Schneider, W., & Artelt, C. (2010). Metacognition and mathematics education. ZDM, 42(2), 149–161. doi:10.1007/s11858-010-0240-2. Schneider and Artelt’s article also outlined various other metacognitive training strategies that have been trialled in different classrooms. I chose to focus on Mevarech and Kramarski’s IMPROVE model here as it was a rigorous study and was cross-referenced in several other papers.

IMPROVE is an acronym for: Introducing new concepts, Metacognitive questioning, Practicing, Reviewing and reducing difficulties, Obtaining mastery, Verification, and Enrichment.

References:

Brown, A., & Palincsar, A. (1989). Guided cooperative learning: An individual knowledge acquisition. In L. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 393-451). Hillsdale, NJ: Erlbaum.

Marx, R. W., & Walsh, J. (1988). Learning from academic tasks. The Elementary School Journal, 88, 207–219.

Mevarech, Z. R., & Kramarski, B. (1997). IMPROVE: A multidimensional method for teaching mathematics in heterogeneous classrooms. American Educational Research Journal, 34(2), 365–394.

Weaver, C. A., III, & Kintsch, W. (1992). Enhancing students’ comprehension of the conceptual structure of algebra word problems. Journal of Educational Psychology, 84, 419–428.

 

Dealing with Test Anxiety: Avoidance, Acceptance and White Bears.

Have you ever heard of the white bear intelligence test? Whoever thinks of a white bear the least is the smartest. So, let’s try it out:

[Image: a white bear]

The test starts now: Don’t think of a white bear…

I told you not to think of a white bear!  Ok, so you thought of a white bear. But now you really have to stop thinking about a white bear, the more you think about it the dumber you are. Just suppress the thought of a white bear so there is absolutely no image of a white bear in your head.

I said DON’T THINK ABOUT A WHITE BEAR, this really isn’t looking good for your intelligence score…

Obviously this isn’t a very good test of intelligence, so why are we thinking about white bears and trying to suppress these thoughts? Because this is an exercise used by Senay, Cetinkaya and Usak (2012) to explore acceptance of test-anxiety-related thoughts as a means of helping students to improve their test performance.

For many people, anxiety about tests is one of the main factors that reduces test performance. This occurs because anxious thoughts such as “I’m no good at maths” or “I’m going to fail” or “this is too hard” occupy space in working memory. This reduces the cognitive processing power that’s available to be allocated to actually doing the test (Ashcraft & Kirk, 2001). The default coping strategy for many students is to try to suppress these anxious thoughts, but what can often happen is that (as with the white bear whom we met above) the thoughts just keep on popping up, and sometimes trying to suppress them can just increase their prevalence!

[Image: the vicious cycle of test anxiety]

There’s a compounding factor at play here too, and that’s the fact that students often realise that these anxious thoughts are compromising their performance. This adds extra pressure on them to suppress these thoughts (pressure that I tried to simulate above by suggesting that the white bear exercise was in fact an intelligence test [but I probably didn’t fool you]) and can lead to the vicious cycle pictured above.

Note: it can in fact be more like a vicious spiral, with the student getting more and more stressed as the test goes on… but I didn’t know how to make a spiral in Microsoft SmartArt Graphics.

Senay, Cetinkaya and Usak (2012) wanted to test ‘acceptance’ as a technique to help students deal with test-related anxiety. They took 87 college freshmen, both male and female, who were taking an intro-to-psychology class, and performed the intervention immediately prior to a class test. They split the participants into 4 groups: a control group (told to just do the exam as they normally would), a group who received 10 minutes of training in anxiety avoidance techniques*, a group who received 10 minutes of training in anxiety acceptance techniques**, and a group who received training in both.

*i.e., avoid the things that are likely to produce anxiety for as long as you can. In this case the main technique spoken about was to pass on any difficult questions and come back to them once all of the easy questions were completed.

**Here the students were told: 1) don’t try to suppress anxious thoughts (at this point the white bear example was invoked to prove that suppression doesn’t actually work); 2) don’t pass judgement on whether or not the anxious thoughts are justified (e.g., ‘Am I having this thought because I actually am dumb?’ This equates to realising that the ‘White Bear Intelligence Test’ is in fact not an intelligence test); 3) see anxious thoughts as something that will naturally pass through a person’s mind, and that you don’t have to do anything about. From time to time, everyone thinks about white bears!

I’m keen to emphasise that this intervention was only 10 minutes long and took place immediately prior to the test, which makes the results even more interesting!

[Table: test results by group (p. 423)]

Of course it was checked that there wasn’t any bias present in the groups prior to the training (i.e., all groups had a similar distribution of ‘anxious’ and ‘not-so-anxious-ish’ people) and all that jazz. In the end, this is what came out in the wash (see the table above, from p. 423).

All 3 treatment groups did (statistically) significantly better than the control group!

The authors also looked at test scores as a function of how frequently the test strategies were employed. This was measured by asking participants to rate, on a scale from 1 to 7, how often they used coping techniques (7 being very frequently). This revealed an interesting result.

[Image: test scores vs. frequency of technique use]

Essentially, the more often the treatment participants employed the techniques, the more successful they were in the test (a correlation). Conversely, more frequent use of techniques by the control group (techniques of their own choosing) was correlated with lower exam scores. This was likely because it was simply an indication that they were having more anxious thoughts, which were not being effectively dealt with and were thus compromising their performance.

Also interesting to note is that there was no statistically significant difference between the results of the 3 treatment groups. The authors suggested that this could have been due to a ceiling effect whereby maximum returns to technique were reached by either of the strategies used in isolation (acceptance or avoidance). Thus, using strategies in combination didn’t yield any significant improvements above the use of either of them individually.

So, in conclusion, you can help your students to improve their test performance by letting them know that they can skip hard questions and come back to them later, and by telling them that it’s OK and normal to have anxious thoughts. “When you have anxious thoughts you can just think to yourself ‘how interesting, an anxious thought, oh well, that’s normal’ and continue on with your test.” I’m amazed that just a 10 minute intervention had statistically significant results!

I would be interested to see the effects of longer term acceptance strategy training, such as meditation, on an individual’s ability to deal with anxious thoughts. I’m personally really enjoying using the Headspace app at the moment to do daily meditation. And I do feel that an approach of ‘seeing my thoughts as passing cars on the road, there’s no need to get picked up and taken away by them, just watch them pass’ has really helped me to be more positive and let negative emotions go more easily since I started the training a few weeks ago  :)

References:

Ashcraft, M. H., & Kirk, E. P. (2001). The Relationships among working memory, math anxiety, and performance. Journal of Experimental Psychology: General, 130, 224–237.

Senay, I., Cetinkaya, M., & Usak, M. (2012). Accepting test-anxiety-related thoughts increases academic performance among undergraduate students. Psihologija, 45(4), 417–432.

Goal Setting: How a 2 hour goal setting exercise can facilitate long term success (with downloadable worksheet)

Out of my recent summary of Richard Shell’s Springboard: Launching Your Personal Search for Success came several things that I want to explore further. One of them was goal setting.

Here’s the tantalising intro to goal setting that Richard gave us in his book. (Kindle location 3819)

“In a notable study of academic achievement, researchers randomly selected college students who were struggling with their grades and conducted a simple intervention. Half the students were given a two-hour, web-based, goal-setting tutorial.

The program led students through a five-step process to conceive, frame, and write out specific personal goals related to their future, followed by a three-step tutorial to help them lay out detailed strategies for how they would achieve the goals they had set. (control group did a personality/aptitude test). At the end of the following semester, the researchers reviewed the academic performance…. Four months later, however, the grades of the group that had received the goal tutorial had risen, on average, from 2.2 to 2.9, while the other group’s grades rose only from 2.2 to 2.3. In addition, the members of the goal-tutorial group carried heavier course loads and felt better about themselves and their academic performance. This simple intervention, in short, had materially improved the chances for these students to graduate on time and with a new, more positive attitude.”

Well, that sounds like a pretty good use of 2 hours! I had to look into it in more detail.

Morisano, D., Hirsh, J. B., & Peterson, J. B. (2010). Setting, elaborating, and reflecting on personal goals improves academic performance. Journal of Applied Psychology, 95(2), 255–264. (See the appendix of the paper for more detail on the 8 step process that I outline below, and the bottom of this page for a downloadable goal setting worksheet.)

So, what were these 8 steps?

1. Vision: Get students to free-write about a) their ideal future, b) qualities they admire in others, c) things they could do better, d) their school and career futures, e) things they would like to learn more about, and f) habits they would like to improve.
2. Label: Label the main ideas and concepts that came out of the visioning process. Take a few of these (7 to 8 in the study) and write about what a successful outcome would actually look like if realised. Ensure that each labelled goal is clear and specific.
3. Prioritise: Prioritise the goals from step 2. Detail specific reasons for pursuing each goal and consider its attainability within a self-specified timeframe. (Attainability matters because expectation of success has a large influence on motivation.)
4. Impact: Ask students to write about the impact that attaining each goal would have on their life. This exercise provides further motivation. (As an interesting aside on impact, check out what Dan Gilbert has to say about the Impact Bias.)
5. Chunk: “Many things which cannot be overcome when they stand together yield themselves up when taken little by little.” -Quintus Sertorius. This step is about getting students to break their goals up into bite-sized pieces/subgoals and to construct concrete strategies for achieving each of them.
6. Obstacles: Encourage students to identify likely obstacles to each subgoal and think of strategies to overcome them.
7. Cap: Students cap each subgoal, i.e. define what it will look like for each subgoal to be achieved. This is about benchmarking goal attainment to help keep students focused by aiding them in monitoring their own progress.
8. Committed: Students evaluate the degree to which they are committed to achieving each goal. This is about the student forming a personal contract with themselves to strive for the goals that they have defined. At the end of the exercise all documents were emailed to students for their reference*.
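To make the structure of the 8 steps a little more concrete, here’s a minimal sketch in Python of how a worksheet’s outputs could be captured as data. This is purely my own illustration (the study used free-written text, and every name below is my invention, not anything from the paper): each goal carries its label, reasons, impact, subgoals with obstacles and ‘caps’, and a commitment rating.

```python
# A toy sketch of the 8-step worksheet as a data structure.
# All names here are hypothetical; nothing below comes from the study itself.
from dataclasses import dataclass, field

@dataclass
class Subgoal:
    description: str       # Step 5 (Chunk): a bite-sized piece of the goal
    obstacles: list[str]   # Step 6 (Obstacles): likely obstacles...
    strategies: list[str]  # ...and strategies to overcome them
    cap: str               # Step 7 (Cap): what "achieved" will look like

@dataclass
class Goal:
    label: str                 # Step 2 (Label): clear, specific label
    success_looks_like: str    # Step 2: the successful outcome, written out
    reasons: list[str]         # Step 3 (Prioritise): why pursue this goal
    timeframe: str             # Step 3: self-specified timeframe
    impact: str                # Step 4 (Impact): effect on the student's life
    subgoals: list[Subgoal] = field(default_factory=list)
    commitment: int = 0        # Step 8 (Committed): self-rated, say 1-10

def prioritise(goals: list[Goal]) -> list[Goal]:
    """Step 3: one simple way to order goals - by self-rated commitment."""
    return sorted(goals, key=lambda g: g.commitment, reverse=True)
```

The Vision step (step 1) feeds the free-writing that these fields are distilled from, so it deliberately has no field of its own here.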


*A possible improvement on this would be to get students to schedule emails to themselves (you could use Boomerang, or set the goals up as events with reminders in Google Calendar) so that they are reminded on a weekly (or so) basis of their goals and their goal-accomplishment timeline.
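As one illustration of the calendar option, the sketch below (my own, not tied to Boomerang or to Google Calendar’s API) writes a standard iCalendar (.ics) file containing a weekly-recurring “review my goals” event; Google Calendar and most other calendar apps can import such a file. The event text and start time are placeholders.

```python
# Sketch: generate an .ics file with a weekly-recurring goal-review reminder.
# The summary, description, and start time below are placeholder values.
from datetime import datetime, timezone

def weekly_goal_reminder(summary: str, description: str, first: datetime) -> str:
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    return "\r\n".join([
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//goal-setting-worksheet//EN",
        "BEGIN:VEVENT",
        f"UID:goal-review-{stamp}@example.com",
        f"DTSTAMP:{stamp}",
        f"DTSTART:{first.strftime('%Y%m%dT%H%M%SZ')}",  # first occurrence (UTC)
        "RRULE:FREQ=WEEKLY",                            # then repeat weekly
        f"SUMMARY:{summary}",
        f"DESCRIPTION:{description}",
        "END:VEVENT",
        "END:VCALENDAR",
    ])

ics = weekly_goal_reminder(
    "Review my goals",
    "Re-read my goal worksheet and check progress against each cap.",
    datetime(2025, 1, 6, 9, 0, tzinfo=timezone.utc),
)
with open("goal_reminders.ics", "w") as f:
    f.write(ics)
```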

Good luck using the worksheet below, in your classroom or your life, to help your students (or yourself) live and learn better :)

Downloadable goal setting worksheet.

______________________________________________________________________________________________

Mnemonic

For those who would like to be able to remember this 8 step process without referring to this article again, please consider the following mnemonic:

Note: when I say ‘a friend’ or ‘a brand’, make sure you envision a specific friend and a specific brand, etc.

You’re looking into the eyes of a good friend (Vision). The camera shot zooms out and you notice that they’re wearing a new pair of glasses from a new brand (Label). You ask them how they chose this new label and they admit that it was hard to prioritise (Prioritise). Suddenly, the glasses fall off your friend’s nose and impact the ground (Impact). They’ve fallen into two big chunks (Chunk). You decide to walk to the hardware store to try to fix them, but your friend, who is now missing their glasses, has trouble avoiding the telephone poles on the way (Obstacles). You decide that it’s best to get them a hard hat to keep them safe on the journey (Cap). The hard-hat salesperson asks you out, but you have to tell them that you’re already committed (Committed).

 

Are Facts more Important than Critical Thinking?

‘Factual Knowledge Precedes Skill’

This is the ‘guiding principle’ in chapter 2 of Daniel Willingham’s Why Students Don’t Like School. I’ll start by pointing out that the title of Willingham’s book is a bit misleading; the subtitle, ‘A cognitive scientist answers questions about how the mind works and what it means for the classroom’, gives readers a much better idea of the book’s content.

Willingham’s book is an excellent overview of 7 crucial cognitive principles that are of great value to anyone interested in teaching and learning. In fact, the book in large part inspired this set of posts (of which this is the first), and I’ll be going over each principle in detail in the coming weeks.

Of all of the lessons in the book, this one was for me the most profound. Why? For many years I was of the mind that “we are wasting our time teaching kids facts at school; what we need to be teaching them is how to learn and how to think critically!” Whilst I still firmly believe that learning how to learn and critical thinking are… critical, this chapter helped me to realise that:

“Data from the last thirty years lead to a conclusion that is not scientifically challengeable: thinking well requires knowing facts, and that’s true not simply because you need something to think about. The very processes that teachers care about most—critical thinking processes such as reasoning and problem solving—are intimately intertwined with factual knowledge that is stored in long-term memory (not just found in the environment).” (Kindle location 552)

…It’s all to do with working memory. Here’s (my elaborated version of) “just about the simplest model of the mind possible” that Willingham introduces in chapter 1. Let’s talk through it.

This is what your mind looks like

When we begin to solve a problem, three ‘windows’ of the mind are engaged: the environment, working memory, and long-term memory. The environment is where the question is posed; it’s also where we have access to other information, like YouTube clips, formula sheets, the working of the kid sitting next to us, and so on. Long-term memory is where we store all of the stuff that we’ve already learned. Working memory is where the processing happens.

So when we solve a problem we can draw both from our long-term memory and from the environment to come up with the solution. If that were all there is to it, then theoretically we should be able to solve any problem, since we have access to a seemingly limitless amount of information. But there’s a catch: your working memory only has about 7 slots. 7 precious slots with which you can work*.

The reason why long-term memory (knowing stuff) is so important is that by remembering stuff you can compress many individual pieces of information and concepts (represented above as pink blocks) in such a way that they only take up one slot in working memory (i.e. 1 blue block = many pink blocks). This process is called chunking, and it frees up working memory space for additional information and processing, facilitating higher-order and more complex thinking. This has important implications for teaching/learning techniques such as the use of formula sheets and scaffolding.

* (7 plus or minus 2 slots covers the majority of the population)
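To make the slot arithmetic tangible, here’s a toy sketch in Python. It’s a deliberately crude illustration rather than a cognitive model, and the particular facts and groupings are my own invention: a ‘novice’ holds each fact in its own working-memory slot, while an ‘expert’ who has chunked related facts into long-term memory holds one slot per chunk.

```python
# Toy illustration of chunking (not a cognitive model): count how many
# working-memory slots a problem's facts occupy with and without chunking.
SLOTS = 7  # the classic "7 plus or minus 2" working-memory estimate

# Hypothetical facts needed for a problem, grouped into learned chunks.
chunks = {
    "reading & everyday knowledge": ["how to read", "what a restaurant is"],
    "quadratics": ["c^2 means quadratic", "quadratics have a gradient",
                   "turning point is where gradient = 0"],
    "calculus mechanics": ["how to differentiate", "set R' = 0",
                           "algebra to isolate c"],
}

novice_load = sum(len(facts) for facts in chunks.values())  # one slot per fact
expert_load = len(chunks)                                   # one slot per chunk

print(f"novice: {novice_load} of {SLOTS} slots -> overloaded: {novice_load > SLOTS}")
print(f"expert: {expert_load} of {SLOTS} slots -> overloaded: {expert_load > SLOTS}")
```

With these (made-up) numbers the novice needs 8 slots and is overloaded, while the expert needs only 3 and has spare capacity for actual problem solving.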

For an example of this please see the bottom of this page.

This information has completely changed my view of what it means to ‘learn’ something…

“I don’t have to memorise anything because I can just put it onto my formula sheet.”

This was my mindset throughout the majority of my undergraduate degree in physics. See the picture below for an example of one such sheet (which we were permitted to take into exams).

Final Cheat Sheet Photo, page 1

After reading this chapter of Willingham’s book, I now better understand why I found some parts of my degree as challenging as I did. My ‘I’ll just put it onto my cheat sheet’ mentality was actually preventing me from taking my physics to the next level. The 7-slot limit of my working memory was being overwhelmed: I hadn’t memorised important facts and information sufficiently to ‘chunk’ them, which limited my ability to combine concepts in creative ways to solve problems. This conclusion has opened my eyes to the importance of storing things in long-term memory, and from now on I’ll be making a more concerted effort to use programs such as Anki to do just that! (It’s also the reason why I’m changing my Wot-I-Got blog post format and will be introducing more mnemonics to help readers/myself better remember blog post content in future.)

Conclusion

So, are facts more important than critical thinking? Well… it’s more that facts are a precursor to critical thinking. Knowing facts frees up the processing power of your brain to analyse new information as it comes in.

But this isn’t the only reason why learning facts is super important. Another reason is that knowledge is like money: the more you have, the easier it is to get more. This is the topic of the next post in this series (coming soon).

If you liked this post you can sign up for all of my posts on learning, teaching and living to be delivered straight to your email inbox (absolute maximum of 1 email per week. Don’t worry, I hate spam too!)

An example of how we’re limited by our 7 slots.

Let’s consider the importance of knowing stuff with an example.

Q: If the nightly revenue of a restaurant is represented by R = -20c² + 200c + 1920 (where c is the number of customers per night), use calculus to find the maximum nightly revenue.

Without being too exhaustive, let’s list some of the things that someone would need to know to answer this question (think of each numbered item as a pink block).

  1. How to read
  2. What a restaurant is (etc., etc., with the really obvious stuff)
  3. What revenue is
  4. That c² means the equation is a quadratic
  5. That a quadratic equation has a gradient
  6. That the turning point of a quadratic is where the gradient = 0
  7. How to take a derivative
  8. How to set the derivative, R′, equal to 0
  9. Basic algebra to isolate c once you’ve set R′ equal to 0
  10. That this value of c is the number of customers that would generate maximum revenue
  11. That the R equation relates the number of customers to the revenue associated with that many customers
  12. That you can sub c back into the R equation to find the maximum nightly revenue possible

Now, to me that looks like more than 7 pink blocks. For pretty much all students we can assume that they have combined pink blocks 1 and 2 (i.e. all the obvious stuff) into a single blue block, but beyond that it’s still clear that other stuff must be ‘known’ for them to successfully complete the problem, especially if one of their 7 working memory slots is being taken up by an “I can’t do this, I’m confused” mantra.

From the above it’s hopefully clear that, for a student to successfully solve this problem, they must have stored at minimum 6 of the above pieces of information in their long-term memory.
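For completeness, here’s a quick check of the worked answer using Python’s sympy library. This is just my own sketch of steps 7 to 12 above; doing them by hand would of course give the same result.

```python
# Worked solution to the revenue question, checked with sympy.
from sympy import symbols, diff, solve

c = symbols('c')
R = -20*c**2 + 200*c + 1920      # nightly revenue as a function of customers

R_prime = diff(R, c)              # step 7: R' = -40c + 200
critical = solve(R_prime, c)      # steps 8-9: solve R' = 0  ->  c = 5
max_revenue = R.subs(c, critical[0])  # step 12: sub c back into R

print(critical, max_revenue)      # [5] 2420
```

So the maximum nightly revenue of 2420 occurs at 5 customers per night, exactly the chain of ‘known stuff’ that the 12 pink blocks above describe.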