Tag Archives: Educational Papers

Present new material in small steps with student practice after each step: How’s it look?


The second recommendation in Rosenshine’s ‘Principles of Instruction’ is “Present new material in small steps with student practice after each step”. The basis for this recommendation is that working memory is limited and, for learning to occur, it’s important to avoid overloading it. But that isn’t the focus of this post. Here I just wanted to share what ‘new material in small steps with student practice after each step’ can look like in the classroom.

As a rule of thumb, the longer a teacher talks, the more likely they are to deliver enough information to overload their students’ working memory. As I reflected on this point, prompted by Craig Barton’s recent in-depth interview with Kris Boulton, I found myself thinking, ‘I wonder how long I talk for?’ It was time to collect some data.

Next lesson I split my notebook into three columns, ‘explain’, ‘student work’, and ‘check solution’ (I always teach my maths lessons in an ‘I do’ then ‘you do’ format, then go over the solutions as a class), then I got to recording! With the first class I got distracted and fell off the timing bandwagon (first half of the page), but with the second class I stayed on task, and that whole class (90 mins) is recorded in the image below (red box).

To set the scene, I wanted students to be able to answer the exam question presented by the end of the lesson. This required them to be able to go from a transition diagram and an initial state matrix to the result after multiple periods, with or without the addition of extra units each period, as well as determining the result of such transitions ‘in the long run’ and working backwards in such relations. I split this up into the following sub-steps for the purposes of instruction.

  • Constructing a transition matrix from a transition diagram
  • Applying a transition diagram to interpret a transition
  • Applying a transition matrix to interpret change after one transition
  • Understanding transition matrices as recurrence relations (and results after multiple periods with a formula)
  • ‘In the long term’: steady state solutions to transition matrices
  • Results after multiple periods (by brute force, i.e. with a calculator)
  • Transition matrix modelling when the total number of units changes
  • Working backwards in matrix multiplications
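For readers who like to see the maths concretely, the recurrence behind these sub-steps can be sketched in a few lines of NumPy. This is a made-up illustration, not the matrices from my lesson: the transition matrix, initial state and added units below are hypothetical numbers chosen so the behaviour is easy to follow.

```python
import numpy as np

# Hypothetical 2-state transition matrix (each column sums to 1):
# entry T[i, j] is the proportion moving from state j to state i each period.
T = np.array([[0.8, 0.3],
              [0.2, 0.7]])

S = np.array([[100.0],   # initial state matrix S0
              [200.0]])
B = np.array([[10.0],    # extra units added to each state every period
              [5.0]])

# Result after 3 periods via the recurrence S_{n+1} = T @ S_n + B
for _ in range(3):
    S = T @ S + B

# 'In the long run' (no extra units added): repeatedly applying T
# converges to the steady state, which can also be found directly as
# the eigenvector of T with eigenvalue 1, scaled to the total units.
S_steady = np.array([[100.0], [200.0]])
for _ in range(200):
    S_steady = T @ S_steady
```

Working backwards in such a relation is then a matter of solving the matrix equation the other way, e.g. recovering a previous state with the inverse of T.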


The astute observer will note that the total time adds up to about 60 mins. The additional time was taken up with approx. 20 mins of revising previous content and 10 mins talking about an upcoming assessment and doing a ‘brain break’.

Below is the lesson as I presented it, with the timing for each segment added in italics. (Images weren’t in the original as students had all questions in front of them; I added them for readers here.)

I found it really valuable to look at the timing of my lessons in this level of detail. I’d love to know if it’s prompted any similar reflections for you.


Rosenshine, B. (2012). Principles of Instruction: Research-Based Strategies That All Teachers Should Know. American Educator, 36(1), 12.


Metacognition: Can it help students problem solve?

I was recently doing a little reading into metacognition and began to wonder if it could be used as a tool in the Maths classroom to help students, particularly with problem solving. I went hunting in the research and found the following paper.

Mevarech, Z. R., & Kramarski, B. (1997). IMPROVE: A multidimensional method for teaching mathematics in heterogeneous classrooms. American Educational Research Journal, 34(2), 365–394.

I got a lot out of it, and thought some others might like to hear how the IMPROVE method works.

This study was done in the late 1990s in Israel. It tested a modified teaching model based on the incorporation of three elements that aren’t always seen in the classroom:

  • Metacognitive training
  • Cooperative learning
  • Systematic provision of feedback-corrective-enrichment

I’ll expand on each of these a little below, then talk about the results of using them in tandem (as was done by Mevarech and Kramarski).

Metacognitive Training



I like to think about metacognition as stepping back from a situation and asking ‘How is my brain reacting to the stimulus here? And how would I like it to react?’. The IMPROVE method got students to begin to do this by introducing three new questions to the mathematics classroom. These questions were made into cards and passed around when problems got challenging. The questions were:

Comprehension Questions: What’s the problem actually saying? Students were asked to ‘read the problem aloud, describe concepts in their own words, and discuss what the concepts meant or into which category the problem could be classified’ (p. 374).

Connection Questions: “How does this question relate to things that you’ve seen before?”

Strategic Questions: All about how you’re going to attack a problem. Ask “What strategy/tactic/principle can be used in order to solve this problem?”, “Why?” and “How will you carry this out?” (p. 376).

The idea of these questions was to help students differentiate between equivalent problems (Qs with the same structure and ‘story context’), similar problems (different structure but same ‘story context’), isomorphic problems (same structure, different ‘story context’), and unrelated problems (very little in common). These categories are from Weaver and Kintsch (1992), and the terminology wasn’t taught to students.

Cooperative Learning

The method followed that of Brown and Palincsar (1989), in which students were put into teams of four: one high, two middle, and one low achieving student (p. 377). As students progressed, teams were changed to maintain this structure. The paper states that a question-answering technique based on that of Marx and Walsh (1988) was used following a brief teacher exposition of approximately 5 minutes. I plan to further explore this questioning technique.


At the end of every 10 or so lessons (constituting a unit), students took a formative test to check their comprehension of the unit’s main ideas. Tests were based on the work of Bloom (1976). Students who didn’t achieve ‘mastery’ (taken as 80% correct) were given extra support to solidify the basics; students who did went on to enrichment activities. Essentially a form of differentiation.

The Studies

The paper detailed two studies. The first included 247 year 7 students split into an experimental (n=99) and control (n=148) group, and the second consisted of 265 students (experimental n=164, control n=101). The first study was completed in a region of heterogeneous classrooms (ie: students weren’t split into classes based on ability, or ‘tracked’; these classes spanned more than 5 ability years) whilst the second was undertaken in a district where ‘tracking’ was the norm. The second study was run to see if the IMPROVE method, applied for an entire year, would yield encouraging results as it had over the shorter period of Study 1, as well as to expand the topics to which the method was applied.

Study 1 applied IMPROVE to the topics of rational numbers, identification of rational numbers on the number axis, operations with rational numbers, order of operations, and the basic laws of mathematics operations.

Study 2 applied IMPROVE to the topics of numerals and rational numbers, variables and algebraic expressions, substitutions in algebraic expressions, linear equations with one variable, converting words into symbols, and using equations to solve problems of different kinds.

Tests were composed of computational questions (25 items) and reasoning questions with no computational requirements (11 items). The reasoning questions were marked with the following progressive scheme:

  • 1 point: justification by use of one specific example
  • 2 points: reference to a mathematical law, but imprecise
  • 3 points: mathematical law referenced correctly, but conflict incompletely resolved
  • 4 points: question completely resolved with correct reference to relevant mathematical laws

Based on pre-intervention test scores students were classified as low, middle or high achieving with pre and post test results compared within these groups.


(Results table, p. 381)

Study 1: No difference existed between the control and experimental groups prior to the intervention, but IMPROVE students significantly outperformed the control group post-intervention. Overall mean scores were 68.03 (control) vs. 74.72 (treatment) post-intervention (p<0.05), with mean scores on the reasoning component 53.15 (control) vs. 62.56 (treatment). Improvements were seen at all achievement levels.

(Results tables, p. 384)

Study 2: As with Study 1, mean scores for the experimental group increased significantly more than those for the control group, with two important points to note. Firstly, only the gains for the high achievers were statistically significant, with the medium achievers’ gains being marginally significant (p = 0.052). Low achieving treatment students outscored low achieving control students in all cases, but this result wasn’t statistically significant. Secondly, these trends held for all topics except ‘operations with algebraic expressions’. It was suggested that this unit required more practice than the others; being a more procedural topic, the benefits of metacognitive approaches weren’t as impactful.


It’s clear that the IMPROVE intervention aided student achievement. It increased students’ ability to draw on prior knowledge to solve problems and to establish mathematical mental schemata that ease access to this prior knowledge. One challenge with this study (as outlined by Mevarech and Kramarski themselves) was that the three elements, metacognitive training, co-operative learning, and feedback-corrective-enrichment, were all applied simultaneously, making it impossible to distinguish how much each contributed to the observed effects. Another open question is why the method appeared to produce gains proportional to students’ starting points, with higher achievers improving relatively more than middle and low achievers.

The authors suggest the program was successful in the following ways. It:

  • made it necessary for participants to use formal mathematical language accurately
  • made students aware that problems can be solved in multiple ways
  • encouraged students to see different aspects of problems
  • gave high achievers opportunities to articulate and further develop their thinking processes, at the same time letting lower achievers see these thinking processes modelled

Post-intervention, the IMPROVE method was implemented in all classes of all schools in which the trials were performed.

Notes: I initially found this article via: Schneider, W., & Artelt, C. (2010). Metacognition and mathematics education. ZDM, 42(2), 149–161. doi:10.1007/s11858-010-0240-2. Schneider and Artelt’s article also outlined various other metacognitive training strategies that have been trialled in different classrooms. I chose to focus on Mevarech and Kramarski’s IMPROVE model here as it was a rigorous study and was also cross-referenced in several other papers.

IMPROVE is an acronym for: Introducing new concepts, Metacognitive questioning, Practicing, Reviewing and reducing difficulties, Obtaining mastery, Verification, and Enrichment.


Brown, A., & Palincsar, A. (1989). Guided cooperative learning: An individual knowledge acquisition. In L. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 393-451). Hillsdale, NJ: Erlbaum.

Marx, R. W., & Walsh, J. (1988). Learning from academic tasks. The Elementary School Journal, 88, 207–219.

Mevarech, Z. R., & Kramarski, B. (1997). IMPROVE: A multidimensional method for teaching mathematics in heterogeneous classrooms. American Educational Research Journal, 34(2), 365–394.

Weaver, C. A., III, & Kintsch, W. (1992). Enhancing students’ comprehension of the conceptual structure of algebra word problems. Journal of Educational Psychology, 84, 419–428.


Dealing with Test Anxiety: Avoidance, Acceptance and White Bears.

Have you ever heard of the white bear intelligence test? Whoever thinks of a white bear the least is the smartest. So, let’s try it out:


The test starts now: Don’t think of a white bear…

I told you not to think of a white bear!  Ok, so you thought of a white bear. But now you really have to stop thinking about a white bear, the more you think about it the dumber you are. Just suppress the thought of a white bear so there is absolutely no image of a white bear in your head.

I said DON’T THINK ABOUT A WHITE BEAR, this really isn’t looking good for your intelligence score…

Obviously this isn’t a very good test of intelligence, so why are we thinking about white bears and trying to suppress these thoughts? Because this is an exercise used by Senay, Cetinkaya and Usak (2012) to explore acceptance of test-anxiety-related thoughts as a means of helping students to improve their test performance.

For many people, anxiety about tests is one of the main factors that reduces test performance. This occurs because anxious thoughts such as “I’m no good at maths” or “I’m going to fail” or “this is too hard” occupy space in working memory. This reduces the cognitive processing power that’s available to be allocated to actually doing the test (Ashcraft & Kirk, 2001). The default coping strategy for many students is to try to suppress these anxious thoughts, but what can often happen is that (as with the white bear whom we met above) the thoughts just keep on popping up, and sometimes trying to suppress them can just increase their prevalence!

There’s a compounding factor at play here too: students often realise that these anxious thoughts are compromising their performance. This adds extra pressure on them to suppress these thoughts (pressure that I tried to simulate above by suggesting that the white bear exercise was in fact an intelligence test [but I probably didn’t fool you]) and can lead to the vicious cycle pictured to the right.

note: It can in fact be more like a vicious spiral, with the student getting more and more stressed as the test goes on… but I didn’t know how to make a spiral in Microsoft SmartArt Graphics.

Senay, Cetinkaya and Usak (2012) wanted to test ‘acceptance’ as a technique to help students deal with test-related anxiety. They took 87 college freshmen, both male and female, who were doing an intro-to-psychology class, and performed the intervention immediately prior to a class test. They split the participants into 4 groups: a control group (told to just do the exam as they normally would), a group who had 10 minutes of training on anxiety avoidance techniques*, a group who had 10 minutes of training on anxiety acceptance techniques**, and a group who received training in both.

*ie: avoid the things that are likely to produce anxiety for as long as you can. In this case the main technique spoken about was to pass any difficult questions and come back to them once all of the easy questions were completed.

**Here the students were told to: 1. not try to suppress anxious thoughts (at this point the white bear example was invoked to show that suppression doesn’t actually work); 2. not pass judgement on whether or not their anxious thoughts were justified (eg: ‘Am I having this thought because I actually am dumb?’ This equates to realising that the “White Bear Intelligence Test” is in fact not an intelligence test); 3. see anxious thoughts as something that will naturally pass through a person’s mind, and that they don’t have to do anything about them. From time to time, everyone thinks about white bears!

I’m keen to emphasise that this intervention was only 10 minutes long and took place immediately prior to the test; this makes the results even more interesting!


Of course, it was checked that there wasn’t any bias present in the groups prior to the training (ie: all groups had a similar distribution of ‘anxious’ and ‘not-so-anxious’ people) and all that jazz, and in the end, this is what came out in the wash (see right, from p. 423).

All 3 treatment groups did (statistically) significantly better than the control group!

The authors also looked at test scores as a function of how frequently the coping strategies were employed, measured by asking the participants to rate, on a scale from 1 to 7, how often they used coping techniques (7 being very frequently). This revealed an interesting result.

Essentially, the more often the treatment participants employed the techniques, the more successful they were in the test (a correlation). Conversely, more frequent use of techniques by the control group (techniques of their own choosing) was correlated with lower exam scores. This was likely because it simply indicated that they were having more anxious thoughts, which were not being effectively dealt with and were thus compromising their performance.

Also interesting to note is that there was no statistically significant difference between the results of the 3 treatment groups. The authors suggested that this could have been due to a ceiling effect whereby maximum returns to technique were reached by either of the strategies used in isolation (acceptance or avoidance). Thus, using strategies in combination didn’t yield any significant improvements above the use of either of them individually.

So, in conclusion, you can help your students to improve their test performance by letting them know that they can skip hard questions and come back to them later, and by telling them that it’s ok and normal to have anxious thoughts. “When you have anxious thoughts you can just think to yourself ‘how interesting, an anxious thought, oh well, that’s normal’ and continue on with your test.” I’m amazed that just a 10 minute intervention had statistically significant results!

I would be interested to see the effects of longer term acceptance strategy training, such as meditation, on an individual’s ability to deal with anxious thoughts. I’m personally really enjoying using the Headspace app at the moment to do daily meditation. And I do feel that an approach of ‘seeing my thoughts as passing cars on the road, there’s no need to get picked up and taken away by them, just watch them pass’ has really helped me to be more positive and let negative emotions go more easily since I started the training a few weeks ago  :)


Ashcraft, M. H., & Kirk, E. P. (2001). The Relationships among working memory, math anxiety, and performance. Journal of Experimental Psychology: General, 130, 224–237.

Senay, I., Cetinkaya, M. and Usak, M. (2012). Accepting test-anxiety-related thoughts increases academic performance among undergraduate students. Psihologija, 45(4), pp.417–432.




Goal Setting: How a 2 hour goal setting exercise can facilitate long term success (with downloadable worksheet)

Out of my recent summary of Richard Shell’s Springboard: Launching Your Personal Search for Success came several things that I want to explore further. One of them was goal setting.

Here’s the tantalising intro to goal setting that Richard gave us in his book. (Kindle location 3819)

“In a notable study of academic achievement, researchers randomly selected college students who were struggling with their grades and conducted a simple intervention. Half the students were given a two-hour, web-based, goal-setting tutorial.

The program led students through a five-step process to conceive, frame, and write out specific personal goals related to their future, followed by a three-step tutorial to help them lay out detailed strategies for how they would achieve the goals they had set. (control group did a personality/aptitude test). At the end of the following semester, the researchers reviewed the academic performance…. Four months later, however, the grades of the group that had received the goal tutorial had risen, on average, from 2.2 to 2.9, while the other group’s grades rose only from 2.2 to 2.3. In addition, the members of the goal-tutorial group carried heavier course loads and felt better about themselves and their academic performance. This simple intervention, in short, had materially improved the chances for these students to graduate on time and with a new, more positive attitude.”

Well, that sounds like a pretty good use of 2 hours! I had to look into it in more detail.

Morisano, D., Hirsh, J. B., & Peterson, J. B. (2010). Setting, elaborating, and reflecting on personal goals improves academic performance. Journal of Applied Psychology, 95(2), 255–264. (See the appendix of the paper for more detail on the 8 step process that I outline below, and see the bottom of this page for a downloadable worksheet on goal setting.)

So, what were these 8 steps?

1. Vision: Get students to free-write about a) their ideal future, b) qualities they admire in others, c) things they could do better, d) their school and career futures, e) things they would like to learn more about, f) habits they would like to improve.
2. Label: Label the main ideas and concepts that came out of the visioning process. Take a few of these (7 to 8 in the study) and write about what a successful outcome would actually look like if realised. Ensure that each labelled goal is clear and specific.
3. Prioritise: Prioritise the goals from step 2. Detail specific reasons for the pursuit of each goal and consider the attainability of each goal within a self-specified timeframe. (Attainability is considered because success-expectation has a large influence on motivation.)
4. Impact: Ask students to write about the impact that attaining each of the goals would have on their life. This exercise provides further motivation for students. (As an interesting aside on impact, check out what Dan Gilbert has to say about Impact Bias.)
5. Chunk: “Many things which cannot be overcome when they stand together yield themselves up when taken little by little.” (Quintus Sertorius). This step is about getting students to break their goals up into bite-sized pieces/subgoals and constructing concrete strategies for achieving each of them.
6. Obstacles: Encourage students to identify likely obstacles to each subgoal and think of strategies to overcome these.
7. Cap: Students cap each subgoal, ie: define what it will look like for each subgoal to be achieved. This is about benchmarking goal attainment, keeping students focused by aiding them in monitoring their own progress.
8. Committed: Students evaluate the degree to which they are committed to achieving each goal. This is about the student forming a personal contract with themselves to strive for the goals that they have defined. At the end of the exercise all documents were emailed to students for their reference*.


*A possible improvement would be to get students to schedule emails to themselves (they could use Boomerang, or set up events with reminders in Google Calendar) so that they are reminded on a weekly (or so) basis of their goals and their goal accomplishment timeline.
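If you wanted to set those recurring reminders up without a third-party service, one option is to generate a small iCalendar (.ics) file that a student can import into Google Calendar or any other calendar app. This is just a hypothetical sketch of that idea (the event title, date and helper name are mine, not from the study):

```python
from datetime import date

def weekly_goal_reminder(summary: str, start: date, weeks: int) -> str:
    """Build a minimal iCalendar string containing one all-day event
    that recurs weekly for the given number of weeks, with a display
    alarm attached as the reminder."""
    stamp = start.strftime("%Y%m%d")
    return "\r\n".join([
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//goal-setting-worksheet//EN",
        "BEGIN:VEVENT",
        f"DTSTART;VALUE=DATE:{stamp}",
        f"RRULE:FREQ=WEEKLY;COUNT={weeks}",
        f"SUMMARY:{summary}",
        "BEGIN:VALARM",
        "ACTION:DISPLAY",
        "DESCRIPTION:Review your goals and timeline",
        "TRIGGER:-PT15M",
        "END:VALARM",
        "END:VEVENT",
        "END:VCALENDAR",
    ])

# e.g. a 16-week (one semester) run of weekly goal check-ins
ics = weekly_goal_reminder("Review my semester goals", date(2024, 1, 8), weeks=16)
```

Saving the returned string as goals.ics and importing it gives the student a semester’s worth of weekly nudges without any ongoing admin.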

Good luck using the below worksheet in your classroom or life to help your students or yourself live and learn better  :)

Downloadable goal setting worksheet.



For those who would like to be able to remember this 8 step process without referring to this article again, please consider the following mnemonic:

note: When I say ‘a friend’ or ‘a brand’, make sure you envision a specific friend and a specific brand, etc.

You’re looking into the eyes of a good friend (Vision). The camera shot zooms out and you notice that they’re wearing a new pair of glasses from a new brand (Label). You ask them how they chose this new label and they admit that it was hard to prioritise (Prioritise). Suddenly, the glasses fall off your friend’s nose and impact the ground (Impact). They’ve fallen into two big chunks (Chunk). You decide to walk to the hardware store to try and fix them, but your friend, who is missing their glasses, has trouble avoiding the telephone poles on the way (Obstacles). You decide that it’s best to get them a hard hat to keep them safe on the journey (Cap). The hard hat salesperson asks you out, but you have to tell them that you’re already committed (Committed).


Learning in the Fast Lane – Suzy Pepper Rollins, Book Summary

This is an experimental post format. I’m using a story as a memory device to generate a solid ‘memory anchor’ on which to attach the following information. Hopefully the content of this article will stick in your head better than it would if it was just in text format!

I came across Learning in the Fast Lane when I attended an online webinar with the author, Suzy Pepper Rollins (read about that webinar here).  I got so much out of the hour that I thought I’d make the time investment to read her whole book.

No regrets.

Here’s what I got…

The LITFL methodology consists of 6 steps that the book walks you through. Here’s an image and associated short story that I’ve put together to help me remember the methodology.

(Image: a memory-anchor illustration of the six LITFL steps.)

So, this is the LITFL methodology.

  1. Generate Curiosity: “curiosity killed the cat”. A cat walks into a room
  2. Map Learning Goals: “the cat sat on the mat” The cat sits down on a mat, it’s one of those map-mats that kids sometimes play on
  3. Scaffold: The kids on the mat are building stuff
  4. Vocabulary: As you look closer, they’re building a taxi rank (cabs are in vogue… vogue-cab-ulary ; )
  5. Apply:  One of the kids applies pressure to the cat’s tail!
  6. Feedback: A parent comes in and provides some feedback to that child!

Now look back up at the picture and link all of the concepts to the images, play the story over in your mind’s eye, and see if you can recall all 6 steps with ease.

Here’s those same points in Suzy’s words.

  1. Generate thinking, purpose, relevance and curiosity
  2. Clearly articulate learning goals and expectations
  3. Scaffold and practice pre-requisite skills
  4. Introduce and practice key vocabulary
  5. Apply the new concept to a task
  6. Regularly assess and provide feedback (ie: formative assessment)

Chapter Layout


In Chapter 1 Suzy outlines this methodology and each of the chapters thereafter delves into detail about each of these elements, and more.

This is one of those books where it’s obvious that the author actually thought about what it would be like to use their book as a resource. Let’s take Chapter 5 (on vocab) as an example. Each chapter begins with a justification of why that chapter exists. Suzy tells us the following at the beginning of Chapter 5 (numbers refer to Kindle locations; information paraphrased):

  • 1157: 3-year-olds from welfare families typically have 70% of the vocab of children living in working-class homes (Hart & Risley, 1995)
  • 1164: kids in grades 4–12 who score at the 50th percentile know 6,000 more words than 25th percentilers (Nagy & Herman, 1984)
  • 1184: students need multiple exposures (typically six) to new words to be able to grasp, retain, and use them (Jenkins et al., 1984)
  • 1194: there is a strong correlation between vocabulary knowledge and reading comprehension (Vacca & Vacca, 2002)
  • 1204: students have just a 7 percent chance of understanding new words from dense text (Swanborn & de Glopper, 1999)
  • 1220: all students who received direct vocab instruction outperformed those who didn’t (Nagy & Townsend, 2012)

Great, now we know that vocab matters! Suzy then goes on to the section ‘Strategies to Develop Strong Vocabularies’ and lists 9 different methods of introducing new vocab. She also lets us know that learning with pictures is 37% more effective than just learning definitions (that’s why I included pics at the start of this blog post!). My favourite of these 9 methods is the TIP (a poster with Term, Information, and Picture on it), which I wrote a bit more about here.

The chapter concludes with a “Checklist for vocabulary development” to ensure that you’re on track and for quick reference.

Every chapter is like this: it covers the why, how and what in a way that’s both practical and engaging. I got a lot out of this book and will continue to use it as a resource. I loved getting the whole picture from a front-to-back read, but I think it would also be great as a quick reference guide for the educator who’s looking for ‘apply in class tomorrow’ kinds of ideas.

See below for my summary notes. There are a lot of them; it was a super info-dense book and excellently referenced. Good stuff!

note: numbers refer to Kindle locations; click the image at top right to make the display bigger on another page.




What makes university graduates thrive post-graduation? – Gallup Poll

What makes a good college education?

“Gallup recently did a study of college graduates to gauge how engaged they are with their work and whether they are thriving in the world. In the past, most studies centered on how much college graduates earned compared to peers without degrees.

If a student “reported with strong affirmatives that they worked on a long term project (at least a semester), had an internship where they could apply skills, and were very engaged in an extracurricular”, then “he or she was three times as likely to be engaged at work.

The survey found that students who felt supported — that their professors cared about them as individuals, that professors made them want to learn, that they had a mentor — were three times more likely to thrive than those who did not feel supported. Only 14 percent of college graduates answered that all three of those qualities were present in their college experience.” – Direct quote from a Mindshift blog piece.

The report can be downloaded in full here.