Tag Archives: Formative Assessment

Assessment feedback: Processes to ensure that students think!

We know that ‘Memory is the residue of thought’ (Daniel Willingham) and that, in order for our students to learn, they must actively think about the content to be learnt. This thinking keeps the content in working memory for long enough, and anchors it to sufficient elements in long-term memory, to trigger a change in long-term memory, which is one of the well-respected definitions of ‘learning’ (Paul Kirschner).

One of the arenas of teaching in which this can be most challenging is that of feedback delivery to students. Dylan Wiliam sums it up well in the following quote (which I came across thanks to Alfie Kohn).

Note: The original quote is “When students receive both scores and comments, the first thing they look at is their score, and the second thing they look at is…someone else’s score”, and can be found here (beware the paywall). 

The challenge is, then, how do we give feedback to our students in a way that encourages them to actively think about their mistakes, and helps them to do better next time?

In the following I’ll share how I give feedback to students in two contexts. The first is on low stakes assessments that I carry out in my own classroom, the second is on major assessment pieces that contribute towards their final unit mark.

Assessment Feedback on weekly Progress Checks.

Before we dive in I’ll just paint a picture of how my weekly ‘Progress Checks’ fit into my teaching and learning cycle, and how each of these elements is informed by education theory.

At the start of each week students are provided with a list of ‘weekly questions’. They know that the teaching during the week will teach them how to answer these questions. Questions are to be aligned with what we want students to be able to do (curriculum and exams) (Backwards Design). Students are provided with worked solutions to all questions at the time of question distribution (The worked example effect). The only homework at this stage of the cycle is for students to ensure that they can do the weekly questions.

‘Progress Checks’ (mini tests, max 15 minutes) are held weekly (Testing Effect). Progress checks include content from the previous three weeks, which means that students see the main concepts from each week for a month (Distributed Practice). These PCs are low stakes for Year 11 students (contributing 10% to their final overall mark), and in Year 12 (where assessment protocols are more specifically defined) they are used simply to inform teachers and students of student progress.

Edit: Here’s a new post on how I use student responses to these PCs to construct the next PCs. 

When designing the progress checks I had two main goals: 1) Ensure that students extract as much learning as possible from these weekly tests, 2) Make sure that marking them didn’t take up hours of my time. The following process is what I came up with.

Straight after the PC I get students to clear their desks, I hand them a red pen, and I do a think-aloud for the whole PC and get them to mark their own papers. This is great because it’s immediate feedback and self-marking (see Dylan Wiliam’s black box paper), and it allows me to model the thinking of a (relative) expert and to be really clear about what students will and won’t receive marks for. Following this, any student who didn’t attain 100% on the progress check chooses one question that they got incorrect and does a reflection on it, based on four questions: 1) What was the question? 2) Which concept did this address? 3) What did you get wrong? 4) What will you do next time?

Here are some examples of student self-marked progress checks and accompanying PC reflections from the same students (both from my Y11 physics class). Note: Photos of reflections are submitted via email and I use Gmail filters to auto-file these emails by class.

Brandon PC

Note how this student was made aware of consequential (follow-through) marks on question 1.

Here’s the PC reflection from this same student (based upon question 2).

B PC ref

Here’s another student’s self-marked Progress Check.

R PC

And the associated reflection.

(screenshots of the student’s reflection)

Students are recognised and congratulated by the whole class if they get 100% on their progress checks, and one student from each class wins the ‘Best PC Reflection of the Week’ award. This allows me to project their reflection onto the board and point out what was good about it: highlighting an ideal example to the rest of the class, celebrating students’ successes, rewarding effort, and framing mistakes as learning opportunities.

I think that this process achieves my main two goals pretty well. Clearly these PCs form an integral learning opportunity, and in sum it only takes me about 7 minutes per class per week to enter PC marks into my gradebook.

Assessment Feedback on Mandated Assessment Tasks.

There are times when we, as teachers, need to knuckle down and mark a bunch of work. For me this is the case with school-assessed coursework (SACs), which contributes to my students’ end-of-year study scores. I was faced with the challenge of making feedback on such a test as beneficial to my students’ learning as the PC feedback process is. Here’s what I worked out.

  1. On test day, students receive their test in a plastic sheet and unstapled.
  2. At the start of the test, students are told to put their name at the top of every sheet.
  3. At the end of the test I take all of the papers straight to the photocopier and, before marking, photocopy the unmarked papers.
  4. I mark the originals. (Though the photocopying takes some time, I think this process ultimately makes marking faster because a) I can group all of the page 1s together (etc.) and mark one page at a time, which is better for moderation too, and b) I write minimal written feedback because I know what’s coming next…)
  5. In the next lesson I hand out students’ photocopied versions and I go through the solutions with the whole class. This means that students are still marking their own papers and still concentrating on all the answers.
  6. Once they’ve marked their own papers I hand them back their marked original (without a final mark on it, just totals at the bottom of each page), they identify any discrepancies between my marking and their marking, then we discuss and come to an agreement. This also prompts me to be more explicit about my marking scheme as I’m being held to account by the students.

In Closing

I’ve already asked students for feedback on the progress checks through whole-class surveys. The consensus is that they really appreciate them, and that they like the modelling of the solutions and the self-marking too. Putting together this post also prompted me to contact my students and ask for feedback on the self-marking process for their photocopied mandated assessment task. I’ll finish this post with a few comments that students said they’d be happy for me to share. They also provide some great feedback to me for next time.

I’d love any reflections that readers have on the efficacy of these processes and how they could potentially be improved.

From the keyboards of some of my students (3 males, 3 females, 5 from Y12, one from Y11).

(screenshots of student comments)

Edit:

A fellow maths teacher from another school in Melbourne, Wendy, tried out this method with a couple of modifications. I thought the modifications were really creative, and they offer another approach that could work really well. Here’s what Wendy said.

Hey Ollie,

I used your strategy today with photocopying students’ SACs and having them self-correct. The kids responded so well!

Beyond them asking lots of questions and being highly engaged, those that I got feedback from were really positive, saying how it made them look at their work more closely than they would have if I had just given them an already-corrected test, how they understood how the marking scheme worked (and saw a perfect solution), and how they liked that they could see why they got the mark they did and had ‘prewarning’ of their mark.

Thanks heaps for sharing the approach.
A couple of small changes I made were
  • I stapled the test originally then just cut the corner, copied them and then restapled. It was very quick and could be done after the test without having to put each test in a plastic pocket
  • I gave the students one copy of the solutions between two. Almost all kids scored above 50% and most around the 70% mark, and I didn’t want them to have to sit through solutions they already had.

If you have thoughts/comments on these changes I’d love to hear them.

Thanks again!

References

Find references to all theories cited (in brackets) here.

WHY CAN’T THEY REMEMBER THIS FROM LAST YEAR??? Help students remember key information: Spaced Repetition Software (SRS).

This post is one of a series detailing my current mathematics lesson rhythm and routine. This one outlines how I use spaced repetition software (SRS) at the start of my lessons to help students to remember key information. There is a video of me teaching with SRS at the bottom of this post. 

Thinking back to my own time at school, I distinctly remember one challenge in particular: studying mathematics in discrete topics (or units) made it really hard for me to remember the relevant concepts when it was time to revisit that branch of mathematics again, sometimes over a year later.

Through my post-schooling forays into language learning in particular, I have come across some research backing up those schoolboy intuitions.

What I was feeling was the effect of a cognitive phenomenon called the ‘forgetting curve’ (Ebbinghaus, 1913). The forgetting curve (pictured below) is a graph that approximates the rate at which an individual will forget a given unit of information.

the forgetting curve

(image source: https://www.flickr.com/photos/suzymushu/3411344554)

In the late 1800s, a German chap by the name of Hermann Ebbinghaus constructed the first forgetting curve by trying to memorise nonsense syllables (such as “WID” and “ZOF”) and then testing himself at regular intervals, rating his level of accuracy, then plotting these points out on a graph.

Hermann Ebbinghaus

(Old mate Ebbinghaus: https://commons.wikimedia.org/wiki/File:Ebbinghaus2.jpg)

As can be seen in the picture of the forgetting curve, if we want to remember something, we need to be reminded about it at regular intervals*. The good news is that the more times we’re reminded about it, the longer the interval until we need to be reminded about it again!

*(The necessity of reviewing a unit of information at regular intervals obviously depends on what the unit of info is, how it relates to your prior knowledge, and how emotionally charged the memory is. For example, it’s highly unlikely you’ll ever forget your first kiss! Ebbinghaus’ original forgetting curve is, however, a great approximation for units of info like words in a foreign language, or terms such as ‘perimeter’ or ‘circumference’.)
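The forgetting curve can be sketched with a toy model. The exponential form R = e^(−t/S) (retention R after t days, with a ‘stability’ S that grows on each review) is a common simplification of Ebbinghaus’ data, not his exact equation, and the doubling of stability per review below is purely an illustrative assumption.

```python
import math

def retention(t_days, stability):
    """Fraction of material still remembered after t_days,
    using the simplified forgetting curve R = e^(-t/S)."""
    return math.exp(-t_days / stability)

# Without review, retention decays quickly:
for t in (1, 7, 30):
    print(f"day {t:2}: {retention(t, stability=5):.0%} retained")

# Each successful review strengthens the memory (here, stability
# doubles per review, an illustrative assumption), flattening the curve.
stability = 5
for _ in range(3):
    stability *= 2
print(f"after 3 reviews, day-30 retention: {retention(30, stability):.0%}")
```

The key qualitative behaviour matches the picture above: each review flattens the curve, so the next reminder can come later.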

Such a curve has important implications for teaching and learning. If we want a student to remember the basics of trigonometry when we come around to the topic again a year later (e.g., basic terminology, sum of the angles in a triangle, etc), we had better ensure that several times between now (time of teaching) and next year, they get reminders at key intervals.

The basic idea underlying this reminding-at-intervals is the spacing of repetition. We all know that it isn’t a good idea to cram your study, but a recent meta-analysis by Carpenter, Cepeda, Rohrer, Kang and Pashler (2012) brought together research on the actual effectiveness of spacing repetition. The following excerpt details the results from just one of the studies cited in their meta-analysis.

the benefit of spacing repetition

(Carpenter et al., 2012, p. 371)

This is all well and good as a concept, but how can we do it in practice? There are literally hundreds of new words and concepts that a student is expected to grasp in a year. Is it realistic for a teacher to keep track of each of these terms and ideas, and remind students of all of them at periodic intervals?

I’m hoping that the answer is yes.

In 2014 I set myself the challenge to learn Mandarin Chinese in a year. As I delved deeper and deeper into effective learning methods,  I came across spaced repetition software (SRS). SRS is a program of digital flash cards (you can make them yourself, or download pre-made decks) that, based on self-ratings, uses an algorithm to calculate the optimum time to review each given unit of information. It is essentially plotting your forgetting curve and reminding you of that piece of information just before you forget!
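Anki’s scheduler is derived from SuperMemo’s SM-2 algorithm. The sketch below is a heavily simplified illustration of the idea (a self-rating adjusts an ‘ease’ factor that stretches the next review interval), not Anki’s actual code, and the constants are SM-2-flavoured assumptions.

```python
from dataclasses import dataclass

@dataclass
class Card:
    interval: float = 1.0   # days until the next review
    ease: float = 2.5       # growth factor, adjusted by self-rating

def review(card, quality):
    """Update a card after a self-rated review.
    quality: 0 (forgot) to 5 (perfect), as in SM-2."""
    if quality < 3:
        card.interval = 1.0          # lapse: see it again tomorrow
    else:
        # good answers nudge ease up, shaky ones nudge it down
        card.ease = max(1.3, card.ease + 0.1 - (5 - quality) * 0.08)
        card.interval *= card.ease   # push the next review further out
    return card.interval

card = Card()
schedule = [review(card, quality=4) for _ in range(4)]
print([round(d, 1) for d in schedule])  # intervals grow with each success
```

The effect is exactly the ‘reminded just before you forget’ behaviour described above: every successful review multiplies the gap until the next one.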

This software has notably been used to great success by polyglots such as Scott Young (who learnt four languages to a very high standard in one year) and Benny Lewis (a very famous polyglot). It definitely helped me: with the help of the SRS program that I use, Anki, I was able to reach my goal and achieve a conversational level of Mandarin within a year. These days I use it to remember a whole host of things: from people’s names, to new English words, to the countries of the world. I currently have a little over 3000 digital flash cards in my review ‘circulation’, and keeping on top of all this info takes only 10 to 15 minutes of my time per day. Here’s a snapshot of my study statistics from the last month.

Anki statistics

(my personal spaced repetition data from the past month)

I was really keen to bring this incredibly powerful tool into the classroom to try to help my students to overcome the memory challenges that I, myself, faced as a student. So I did!

Since I started teaching at the start of this year, I’ve been using an SRS program (Anki) in all of my classes. We use it at the start of every lesson, and I call students’ names with the use of coloured pop-sticks, a method that I’ve written about previously.

The result?

It’s hard to comment on the long-term effects as it’s still early days, but student feedback has been good. For example, on the end-of-Term-1 feedback form that I handed out to students, many of them made positive comments.

But hey, I thought that the most helpful thing would be to give readers some eyes into my classroom to see exactly how it plays out. With my students’ permission, I’m sharing below a clip from my VCAL (Victorian Certificate of Applied Learning) numeracy class. For a bit of context, VCAL is a program designed for students who are planning to explore post-secondary pathways into vocational training. I have students who want to be nurses and flight attendants, and many of them aspire to a position in the military. What you see below is a classic beginning-of-lesson episode. One of the students (Sharnee) is in charge of the pop-sticks, pulling out student names, and the other students are sitting (with varying degrees of focus), considering what their answer would be, then answering if their name is called. I’ve found that the students enjoy the routine, and it adds a game-show feel to the start of the class. Hopefully this little clip gives you a bit of a glimpse into how Anki works, and how I feel it can help my students to overcome one of the challenges that I myself faced in school.

References:

Carpenter, S. K., Cepeda, N. J., Rohrer, D., Kang, S. H. K., & Pashler, H. (2012). Using Spacing to Enhance Diverse Forms of Learning: Review of Recent Research and Implications for Instruction. Educational Psychology Review, 24(3), 369–378. http://doi.org/10.1007/s10648-012-9205-z

Ebbinghaus, H. (1913). Memory: A contribution to experimental psychology, (3).


MAV Conference 2015 Highlights

Selected notes and titbits from this year’s MAV conference :)

Matt Skoss

Hungarian sorting dance

Can use this for decimal examples?

(assume this is a cube, phone stretched the pic…)

(Extension cube from above still to come…)

Matt says to check out Brilliant for heaps of awesome maths problems!!!

You can find Matt Skoss’ conference notes here.

Peter Sullivan and Caroline Brown: Turning engaging tasks into robust learning

Question: What’s bigger, 2/3 or 201/301?

A few that I was thinking about: What’s bigger, 4/5 or 444/555? What’s bigger, 3/4 or 306/408?

Extension: is a/b always < (a+1)/(b+1)??? (always true, sometimes true, never true???)
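For the extension above: cross-multiplying shows a/b < (a+1)/(b+1) exactly when a(b+1) < b(a+1), i.e. when a < b (for positive a and b). So the answer is ‘sometimes true’: true for proper fractions, false otherwise. A quick brute-force check (my own, not from the session) confirms this:

```python
from fractions import Fraction

def adding_one_increases(a, b):
    """Is a/b < (a+1)/(b+1)?  (Exact rational comparison.)"""
    return Fraction(a, b) < Fraction(a + 1, b + 1)

# True for every proper fraction (0 < a < b) ...
assert all(adding_one_increases(a, b)
           for b in range(2, 50) for a in range(1, b))

# ... but false once a >= b: 5/3 > 6/4, and 7/7 == 8/8.
assert not adding_one_increases(5, 3)
assert not adding_one_increases(7, 7)
print("sometimes true: it holds exactly when a < b")
```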

Question: ‘I wasn’t paying attention in class but I heard the teacher say ‘a turning point is at (2, -3).’ What could the function be? (then, think of another one)

Enabling prompt: change the turning point to (0, -3)

Question:  I saw 10 legs under the farm gate. Draw which animals I might have seen there.

Another Q…

Peter’s Slideshow:Peter Sullivan-Turning engaging mathematics classroom experiences into robust learning

Caroline’s Slideshow: MAV secondary 2015 fractions-Caroline Brown

Yvonne, Jodie and Thao from Sunshine

The award winning Maths program at Sunshine.

The weekly lesson breakdown.

Keep the kids informed about their progress “You started 6 steps behind the other students in the state, and you’re catching up.”

Differentiation

Tasks look similar, but they are different

Students learn how to select task for themselves

In it for the long game. Supporting students to make the right choices for themselves!

Challenge with worded problems (Reciprocal Teaching)

Check out the amazing resources from Sunshine. See more on Sunshine’s numeracy program. They also recommended NAPLAN as a great place to source questions from.

The Steps of Reciprocal Teaching

  • Predict
    • Recognise key words and use them as cues to determine the area of maths that they’re looking at.
  • Clarify
    • Re-read the question. Get to a point where they’re comprehending the text. Identify vocab that they don’t know and extract the key info they think they’ll need to solve it.
  • Big Question (added on top of literacy approaches)
    • Recognise and articulate the main problems.
  • Solve
    • Solve and check your answer.
  • Reflection
    • Talk about how the problem was solved. What did you learn that you’ll be able to use in the future?
    • (This is the section that students often struggle with the most!)
      • High expectations are the key here!
      • Note: They used to ask students to write a reflection based on the learning intention but found that that was too difficult for students.
    • They’ve thought that they should encourage students to make a glossary!!!
  • ‘It’s a bit smoke and mirrors…’
    • The effort goes in prior to the lesson to ensure that the questions are quality and pitched at the correct level.
    • Expecting students to do about 3 problems in a session BUT if it’s a very challenging question sometimes they’ll work on one problem for two lessons!!!
  • Cool tech: ‘Plickers’!!! plickers.com (matt-laminate them!!!)

(the Sunshine reciprocal teaching worded problem page looks like this…)

Sunshine’s Reciprocal Teaching Sheet (Also available on their website).

Yvonne’s tip: ‘A quick acid test to see how good a teacher is: give them a set of questions and see if they ask for the solutions.’

Scaffolding Numeracy in the Middle Years approach

They group the students. But it’s ok because there are 9 groups and it’s hard for students to work out which group they’re in.

Lots of teachers. Teachers have 2 groups each (each in a different zone). A teacher will take a lower zone group and a higher zone group. It’s all about the bigger classes, because that means there are more people in the groups and you’ve got the free teachers : )

Fluency (Speedy Maths)

Flash cards, time them, keep track of their times so they can track their improvement.

Students do graphs so they can see their own learning progress.

You can make multiplication sheets in Excel with a random number generator!
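The same random-number idea (e.g. a grid of RANDBETWEEN cells in Excel) can be sketched in Python. `multiplication_sheet` is a hypothetical helper of my own, not part of the Sunshine program:

```python
import random

def multiplication_sheet(n_questions=20, lo=2, hi=12, seed=None):
    """Generate (question, answer) pairs for a timed fluency drill,
    mirroring an Excel sheet built with RANDBETWEEN."""
    rng = random.Random(seed)
    sheet = []
    for _ in range(n_questions):
        a, b = rng.randint(lo, hi), rng.randint(lo, hi)
        sheet.append((f"{a} x {b} =", a * b))
    return sheet

# Print the questions only; keep the answers aside for marking.
for question, _answer in multiplication_sheet(5, seed=42):
    print(question)
```

Fixing the seed reproduces the same sheet, which is handy when each student needs an identical timed drill.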

Growth Mindset

No topic tests

Progress is valued above all else

Data Driven

On demand testing (4 tests per year, 2 number and 2 general)

Wendy Taylor and Sabine Partington’s session

Question from another punter: Take any ‘L’ shape and, without doing any calculations, determine a way to cut it exactly in half with one line (I don’t know the answer to this yet!)

Allan Thomson, Maths on OneNote

  • Students can have their own pages
  • com
  • Check out
    • Onenotecentral (on twitter)
    • OneNote Toolkit for Teachers

Some selected tweets and extras

This blog post by Dr Nic on Engaging students in learning statistics using The Islands was brought to my attention by James Dann.

(Regarding the tweet below: sorry Sara McKee, I spelt your name incorrectly!)

Keep an eye out for Sara’s upcoming paper:  ‘Using teacher capacity to measure improvement in key elements of teachers’ mathematical pedagogical content knowledge.’

(screenshots of tweets from the conference)


Tying Together Backwards Design, Self Marking and Criterion Referencing for effective teaching

You may have heard of the concept of Backwards Design before. Often associated with Grant Wiggins, the concept basically states: ‘Work out what you want your students to know, then design your lessons with that end in mind.’ Pretty simple, and self-explanatory. But the question I had on my recent placement was: how can I do this practically, and how can I effectively guide students through the learning process and offer useful feedback along the way?

My ‘design brief’ (as specified by the Maths department) was to use the year 9 Australian Curriculum mathematics textbook and, over 4 weeks, cover all of the content in the chapter on Linear Equations and Algebra. But I was keen to make this more explicit for students.

Backwards Design

I had been inspired by Sarah Hagan’s work on ‘I Can’ sheets so wanted to tie in these ‘I Can’ sheets with the idea of backwards design.

I surveyed the chapter, looking for a good way to organise this information, and found that each of the worked examples taught a unique competency (or an important but incremental advancement of a pre-taught competency), and that they were well labelled. Here’s an example:

(Source, textbook)

So I took these competencies and constructed an ‘I can’ sheet that I distributed to students on day 1 of my placement. The ‘I can’ sentence from the above example was ‘I can substitute values into expressions and evaluate’.

In the above ‘I can’ sheet, the ‘My pg.’ column was designed as a place for students to write the page number in their book that corresponds to that ‘I can’ statement. For example, if they stuck these two sheets into their book and began numbering straight after them, then the first lesson, covering example 1 (E1), would be on page 1, so they would write a ‘1’ in the ‘My pg.’ column. The ‘Key points’ space was for students to write their own key points at the end of every lesson. The Tinycc link took students to videos of the content that I made (I hope to upload these videos to YouTube in the near future).

So that was the backwards design, now it was time to tie it into assessment.

I put together a pre-test for the unit. This pre-test assessed students from E1 to E11, inclusive*. And each of the questions was numbered to do so (as can be seen below). This test was administered by my mentor prior to my actual placement and I was able to pick the tests up and mark them to get an idea of where each of the students was at even before I entered my teaching role. I didn’t return these tests to students and told them explicitly ‘The pre-test was about me working out where you guys are at the moment and helps me identify misconceptions so I can directly address them when I introduce the content. Your mark doesn’t matter, what matters is what the test told me about how each of you are currently thinking about this maths’.

Now, the goal of the backwards design was to show students where we were going in terms of content, and give them a clear roadmap of the stops along the way. The goal of assessment was to help me (pre-test) and them (mid-unit test) to see how they were tracking along this path.
My placement was 4 weeks long, and at the 2-week mark students were given a mid-unit test. Little did they know that this mid-unit test was EXACTLY THE SAME as the pre-test! (one of the benefits of not returning their pre-tests ; ) This gave me a perfect picture of how each of them had progressed.

But, as mentioned, I was very keen to help the students become evaluators of their own learning. To do this, I got them to self-mark. Here’s the process…

Criterion Referencing and Self Marking

I printed out the mid-unit tests (which I referred to as the ‘mid-unit checkup’) double sided on an A4 sheet (one sheet). What this allowed me to do was give students the sheets then collect them up and run them as a batch through the photocopier, giving me both their original and a photocopied version. I then marked the photocopied version so that I had the marks myself and, in the following lesson, I handed back the unmarked original to students, stapled together to two other sheets.

The first sheet stapled behind the original mid-unit test was a set of clearly worked solutions, the second sheet was the following…

For each of the students I filled out the ‘Based on pre-test’ column with either a tick, arrow, or cross (feel free to read instructions on the above sheet so that this makes sense), I then encouraged students to fill out the ‘Based on mid-unit check’ column in the same fashion.
(screenshot of the competency checklist sheet)
This did a couple of things. Firstly, it enabled students to link the competencies to questions, reinforcing what metalanguage such as ‘common factors’ and ‘collecting like terms’ meant. Secondly, it showed them the areas that they still needed to work on. Equally importantly, it showed students how they had progressed; they were able to say things like ‘Great, on the pre-test I couldn’t simplify by collecting like terms, but now I can!’, which was a real plus.
The thing that I really appreciated was the ability to generate a graph of class average results from the pre and mid-unit tests. I then showed this to the class.
(graph of class average results on the pre- and mid-unit tests)

This graph really excited students. They were able to celebrate the progress that they had collectively made, and to identify what they still needed to work on. Students even noticed that for E1, which was ‘write algebraic expressions for word problems’, the class had actually gone backwards! This prompted conversation, allowed us to talk about how this was the first thing we covered in the unit, and provided an opportunity to revisit the concept of the forgetting curve that I’d introduced them to earlier.

Reflections

I was really happy with how this approach came off. I saw marked increases in engagement from students. Most importantly, they totally ‘got it’. Student feedback alluded to the fact that students gained a greater understanding of where they were, and what they needed to work on. Here is what some of the students had to say about it (from a feedback form that I handed out on my final day of placement).

Next Time…

Next time I would like to improve upon this method by keeping the students’ ‘I Can…’ sheets all in one place, preferably in digital form, so that they can’t get lost and both the students and I can access them from home. I’m thinking Google Sheets for this, but I’ll continue to consider options. This would allow students to take greater charge and track their own progress through the formative, mid-unit, and summative assessments.

I’m happy with how this approach went and I look forward to refining this approach when I’m next in the classroom.

The image below shows the class distribution of scores in the pre- and post-tests. I calculated my ‘effect size’ based on this information and was very pleased with the result : )

First Placement Effect Size
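One common way to compute an effect size from pre- and post-test scores is Cohen’s d: the difference in means divided by the pooled standard deviation. The sketch below uses made-up scores for illustration, not my class’s actual data:

```python
import statistics

def cohens_d(pre, post):
    """Cohen's d: (mean difference) / pooled standard deviation."""
    n1, n2 = len(pre), len(post)
    s1, s2 = statistics.stdev(pre), statistics.stdev(post)
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(post) - statistics.mean(pre)) / pooled

# Illustrative (made-up) scores out of 20:
pre  = [6, 8, 9, 11, 12, 13, 14]
post = [10, 12, 13, 15, 16, 17, 18]
print(f"effect size d = {cohens_d(pre, post):.2f}")
```

A rough rule of thumb (after Cohen) reads d ≈ 0.2 as small, 0.5 as medium, and 0.8 as large.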

*I now think that it would be worth giving students a pre-test containing ALL of the content from the unit (i.e., Example 1 to Example 21). This is because: 1) it would have given them a better answer to the question ‘where are we going?’; 2) it would have given some of the students opportunities to problem-solve and work out for themselves how to do it; and 3) I recently attended a lecture by Patrick Griffin in which he talked about how students shouldn’t EVER get 100% on tests, because that doesn’t actually give you accurate information on where they are up to; it only tells you what level of achievement they can perform above! Some students did get very close to 100% on this pre-test. It is important to acknowledge, though, that I hadn’t yet had time to build a culture of ‘have a go’ with these students, so some were reluctant even to attempt the pre-test, not quite understanding why they should be tested on something they hadn’t yet been taught. Making the pre-test longer and harder by including all content from the unit could have been overwhelming for some students.

One approach to introducing cold calling to the classroom

After reading Glen Pearsall’s Top 10 Strategic Questions for Teachers (a concise, free ebook) recently, I was really inspired to introduce cold calling into the classroom. I had also wanted to incorporate spaced repetition into my class, so I thought it would be a good opportunity to kill two birds with one stone. (Note: I’ve just started at my teacher-training placement school. I’ve taught 3 lessons there in previous weeks, but today was the first day of a month taking two year 9 maths classes full time.)

The way I thought I’d implement cold calling, using pop sticks with the students’ names on them, was inspired by watching Dylan Wiliam’s ‘The Classroom Experiment‘, but I was afraid of making kids feel too put on the spot by the questioning. How could I scaffold their participation?

The first student question I wanted to pre-empt was, ‘Why are we doing this spaced repetition stuff?’ (I didn’t actually use that term, I’ve called it ‘Micro-revision’). I addressed that by showing them the following clip.

I asked students what they’d learned from the clip, and if anything surprised them. I then mentioned that at the start of every class from now on we’ll be doing ‘micro revision’, 3 questions to jog our memory on previous topics.

Then, to address fear of making mistakes and a feeling of being put on the spot, I showed this clip from Jo Boaler.

I then said, ‘Remember those coloured pop sticks you wrote your names on the other day?’ (Last Friday I had asked them to write their names on coloured pop sticks and used them as exit cards.) ‘Well, for each question on the micro-revisions, I’m going to be using the sticks to select someone to share their thinking with the class.’ I then flashed up the following slide.

Then we got stuck in, and I revealed the first 3 micro-revision questions (I also added a challenge question in case any of the students were streaming ahead; I stated that I wouldn’t be going through that one with the whole class). I watched the class and walked around to gauge when most of them had done about as much as they were going to.

I had a few questions planned: 1) What is this question asking us to do? What does it relate to that we’ve learned before? (more on metacognitive questioning and practices here); 2) Why did you do that? i.e., basically getting them to justify their mathematical thinking. I also encouraged them to use correct terminology. Then… the moment of truth!

The first student I asked said "I didn't do it". I replied, "That's ok. If you were going to do it now, how would you do it?", and he went right ahead and solved it in front of the class (him talking, me writing on the board). The second student was really happy and excited to share as well, and the third student even used the terms 'numerator' and 'denominator' in their answer, which blew me away! I'll also note that I was careful not to say 'What's the answer?' but instead used the two questions outlined above along with 'Can you share with us how you approached this question?'. The emphasis was on sharing approaches, not on answers.

In closing, I was really happy with how the whole exercise came off, and I look forward to keeping up with the micro-revisions. I've also just had the thought of making Fridays 'Feedback Fridays', where I'll give them five questions in the micro-revision and ask them all to hand in their answers so I can get a clearer picture of how each student is doing (that's 'Formative Feedback Fridays' in teacher lingo ; ), and to write on the back of the sheet any feedback that they have on my teaching.

 

Metacognition: Can it help students problem solve?

I was recently doing a little reading into metacognition and began to wonder if it could be used as a tool in the Maths classroom to help students, particularly with problem solving. I went hunting in the research and found the following paper.

Mevarech, Z. R., & Kramarski, B. (1997). IMPROVE: A multidimensional method for teaching mathematics in heterogeneous classrooms. American Educational Research Journal, 34(2), 365-394.

I got a lot out of it, and thought some others might like to hear how the IMPROVE method works.

This study was done in the late 1990s in Israel to test a modified teaching model based on the incorporation of three elements that aren't always seen in the classroom:

  • metacognitive training
  • cooperative learning
  • systematic provision of feedback-corrective-enrichment.

I’ll expand on each of these a little below, then talk about the results of using them in tandem (as was done by Mevarech and Kramarski).

Metacognitive Training


I like to think about metacognition as stepping back from a situation and asking 'How is my brain reacting to the stimulus here? And how would I like it to react?'. The IMPROVE method got students to begin to do this by introducing three new kinds of questions to the mathematics classroom. These questions were made into cards and passed around when problems got challenging. The questions were:

Comprehension Questions: What's the problem actually saying? Students were asked to 'read the problem aloud, describe concepts in their own words, and discuss what the concepts meant or into which category the problem could be classified' (p. 374).

Connection Questions: “How does this question relate to things that you’ve seen before?”

Strategic Questions: All about how you're going to attack a problem. Ask "What strategy/tactic/principle can be used in order to solve this problem?", "Why?", and "How will you carry this out?" (p. 376).

The idea of these questions was to help students differentiate between equivalent problems (same structure and same 'story context'), similar problems (different structure but same 'story context'), isomorphic problems (same structure, different 'story context'), and unrelated problems (very little in common). These categories are from Weaver and Kintsch (1992), though the terminology wasn't taught to students.

Cooperative Learning

The method followed Brown and Palincsar (1989): students were put into teams of four, comprising 1 high, 2 middle and 1 low achieving student (p. 377). As students progressed, teams were changed to maintain this structure. The paper also states that a question-answering technique based on Marx and Walsh (1988) was used following a brief teacher exposition of approximately 5 minutes. I plan to further explore this questioning technique.

Feedback-Corrective-Enrichment

At the end of each unit (roughly 10 lessons) students took a formative test to check their comprehension of the unit's main ideas. Tests were based on the work of Bloom (1976). Students who didn't achieve 'mastery' (taken as 80% correct) were given extra support to solidify the basics, while students who did went on to enrichment activities. Essentially a form of differentiation.
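The feedback-corrective-enrichment routing is simple enough to express directly. A minimal sketch (the names and scores are invented) of the 80% mastery split described above:

```python
MASTERY_THRESHOLD = 0.8  # 'mastery' taken as 80% correct, per the study

def route_students(scores, threshold=MASTERY_THRESHOLD):
    """Split {student: fraction_correct} into corrective and enrichment groups."""
    corrective = [s for s, frac in scores.items() if frac < threshold]
    enrichment = [s for s, frac in scores.items() if frac >= threshold]
    return corrective, enrichment

scores = {"Noa": 0.65, "Eli": 0.9, "Tamar": 0.8, "Dan": 0.45}
corrective, enrichment = route_students(scores)
# Noa and Dan get extra support on the basics; Eli and Tamar move to enrichment.
```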

The Studies

The paper detailed 2 studies. The first included 247 year 7 students split into an experimental (n=99) and a control (n=148) group, and the second consisted of 265 students (experimental n=164, control n=101). The first study was completed in a region of heterogeneous classrooms (ie: students weren't 'tracked' into classes based on ability; these classes spanned more than 5 ability years) whilst the second was undertaken in a district where tracking was the norm. The second study was undertaken to see whether the IMPROVE method, applied for an entire year, would yield results as encouraging as it did over the shorter period of Study 1, as well as to expand the topics to which the method was applied.

Study 1 applied IMPROVE to the topics of rational numbers, identification of rational numbers on the number axis, operations with rational numbers, order of operations, and the basic laws of mathematics operations.

Study 2 applied IMPROVE to the topics of numerals and rational numbers, variables and algebraic expressions, substitutions in algebraic expressions, linear equations with one variable, converting words into symbols, and using equations to solve problems of different kinds.

Tests were composed of computational questions (25 items) and reasoning questions with no computational requirements (11 items). The reasoning questions were marked on the following progressive scheme: 1 point: justification by use of one specific example; 2 points: reference to a mathematical law, but imprecise; 3 points: mathematical law referenced correctly, but conflict incompletely resolved; 4 points: question completely resolved with correct reference to the relevant mathematical laws.
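The reasoning rubric maps cleanly to code. A minimal sketch (the level labels are my own shorthand for the four descriptors above):

```python
# Hypothetical encoding of the study's 4-level rubric for reasoning items.
RUBRIC = {
    "specific_example_only": 1,        # justified by one specific example
    "law_referenced_imprecisely": 2,   # cites a mathematical law, imprecisely
    "law_correct_conflict_unresolved": 3,
    "fully_resolved": 4,               # complete, correct reference to laws
}

def reasoning_score(item_levels):
    """Total a student's reasoning items, each judged against the rubric."""
    return sum(RUBRIC[level] for level in item_levels)

# One invented student: 5 complete answers, 6 imprecise ones (of the 11 items).
levels = ["fully_resolved"] * 5 + ["law_referenced_imprecisely"] * 6
total = reasoning_score(levels)  # 5*4 + 6*2 = 32 out of a possible 44
```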

Based on pre-intervention test scores students were classified as low, middle or high achieving with pre and post test results compared within these groups.

Results

[Results table, p. 381]

Study 1: No difference existed between the control and experimental groups prior to the intervention, but IMPROVE students significantly outperformed the control group post-intervention. Overall mean scores were 68.03 (control) vs. 74.72 (treatment) post-intervention (p<0.05), with mean scores on the reasoning component of 53.15 (control) vs. 62.56 (treatment). Improvements were seen at all achievement levels.

[Results tables, p. 384]

Study 2: As with Study 1, mean scores for the experimental group increased significantly more than those of the control group, with 2 important points to note. Firstly, only the gains in the high achievers group were statistically significant, with the medium achievers group marginally significant (p=0.052). Low achieving treatment scores exceeded low achieving control scores in all cases, but this result wasn't statistically significant. Secondly, these trends held for all topics except 'operations with algebraic expressions'. It was suggested that this unit required more practice than the others; being a more procedural topic, the benefits of metacognitive approaches weren't as impactful.

Discussion

It’s clear that the IMPROVE intervention aided in student achievement. It increased their ability to draw on prior knowledge to solve problems and to establish mathematical mental schemata to increase their ease of access to this prior knowledge. One challenges with this study (as outlined by Mevarech and Kramarski themselves) was that the three elements; metacognitive training, co-operative learning, and feedback-corrective-enrichment were all applied simultaneously making it impossible to distinguish which of these was contributing by how much to the observed effects. Another question surrounds how this method appeared to facilitate gains to students proportional to their ability starting point, with higher achievers improving relatively more than middle and low achievers.

The authors suggest the program was successful in the following ways; it:

  • made it necessary for participants to use formal mathematical language accurately
  • made students aware that problems can be solved in multiple ways
  • encouraged students to see different aspects of problems
  • gave high achievers opportunities to articulate and further develop their thinking processes, at the same time letting lower achievers see these thinking processes modelled.

Post-intervention, the IMPROVE method was implemented in all classes of all the schools in which the trials were performed.

Notes: I initially found this article via Schneider, W., & Artelt, C. (2010). Metacognition and mathematics education. ZDM, 42(2), 149-161. doi:10.1007/s11858-010-0240-2. Schneider and Artelt's article also outlines various other metacognitive training strategies that have been trialled in different classrooms. I chose to focus on Mevarech and Kramarski's IMPROVE model here as it was a rigorous study and was cross-referenced in several other papers.

IMPROVE is an acronym for: Introducing new concepts, Metacognitive questioning, Practicing, Reviewing and reducing difficulties, Obtaining mastery, Verification, and Enrichment.

References:

Brown, A., & Palincsar, A. (1989). Guided cooperative learning: An individual knowledge acquisition. In L. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 393-451). Hillsdale, NJ: Erlbaum.

Marx, R. W., & Walsh, J. (1988). Learning from academic tasks. The Elementary School Journal, 88, 207-219.

Mevarech, Z. R., & Kramarski, B. (1997). IMPROVE: A multidimensional method for teaching mathematics in heterogeneous classrooms. American Educational Research Journal, 34(2), 365-394.

Weaver, C. A., III, & Kintsch, W. (1992). Enhancing students' comprehension of the conceptual structure of algebra word problems. Journal of Educational Psychology, 84, 419-428.

 

Learning in the Fast Lane-Suzy Pepper Rollins, Book Summary

This is an experimental post format: I'm using a story as a memory device to generate a solid 'memory anchor' on which to attach the following information. Hopefully the content of this article will stick in your head better than it would in plain text!

I came across Learning in the Fast Lane when I attended an online webinar with the author, Suzy Pepper Rollins (read about that webinar here).  I got so much out of the hour that I thought I’d make the time investment to read her whole book.

No regrets.

Here’s what I got. ..

The LITFL methodology consists of 6 steps that the book walks you through. Here's an image and an associated short story that I've put together to help me remember the methodology.

[Image: the LITFL memory story]

So, this is the LITFL methodology.

  1. Generate Curiosity: "curiosity killed the cat". A cat walks into a room.
  2. Map Learning Goals: "the cat sat on the mat". The cat sits down on a mat; it's one of those map-mats that kids sometimes play on.
  3. Scaffold: The kids on the mat are building stuff.
  4. Vocabulary: As you look closer, they're building a taxi rank (cabs are in vogue… vogue-cab-ulary ; ).
  5. Apply: One of the kids applies pressure to the cat's tail!
  6. Feedback: A parent comes in and provides some feedback to that child!

Now look back up at the picture, link all of the concepts to the images, play the story over in your mind's eye, and see if you can recall all 6 steps with ease.

Here’s those same points in Suzy’s words.

  1. Generate thinking, purpose, relevance and curiosity
  2. Clearly articulate learning goals and expectations
  3. Scaffold and practice pre-requisite skills
  4. Introduce and practice key vocabulary
  5. Apply the new concept to a task
  6. Regularly assess and provide feedback (ie: formative assessment)

Chapter Layout

LITFL cover

In Chapter 1 Suzy outlines this methodology, and each chapter thereafter delves into detail about one of these elements, and more.

This is one of those books where it's obvious that the author actually thought about what it would be like to use the book as a resource. Let's take Chapter 5 (on vocab) as an example. Each chapter begins with a justification of why that chapter exists. Suzy tells us the following at the beginning of Chapter 5 (numbers refer to Kindle locations; information paraphrased):

  • 1157: 3-year-olds from welfare families typically have 70% of the vocab of children living in working-class homes (Hart & Risley, 1995)
  • 1164: kids in grades 4–12 who score at the 50th percentile know 6,000 more words than those at the 25th percentile (Nagy & Herman, 1984)
  • 1184: students need multiple exposures (typically six) to new words to be able to grasp, retain, and use them (Jenkins et al., 1984)
  • 1194: there is a strong correlation between vocabulary knowledge and reading comprehension (Vacca & Vacca, 2002)
  • 1204: students have just a 7 percent chance of understanding new words from dense text (Swanborn & de Glopper, 1999)
  • 1220: all students who received direct vocab instruction outperformed those who didn't (Nagy & Townsend, 2012)

Great, now we know that vocab matters! Suzy then goes on to the section 'Strategies to Develop Strong Vocabularies' and lists 9 different methods of introducing new vocab. She also lets us know that learning with pictures is 37% more effective than just learning definitions (that's why I included pics at the start of this blog post!). My favourite of these 9 methods is the TIP (a poster with Term, Information, Picture on it), which I wrote a bit more about here.

The chapter concludes with a “Checklist for vocabulary development” to ensure that you’re on track and for quick reference.

Every chapter is like this: it covers the why, how and what in a way that's both practical and engaging. I got a lot out of this book and will continue to use it as a resource. I loved getting the whole picture from a front-to-back read, but I think it would also be great as a quick reference guide for the educator who's looking for 'apply in class tomorrow' kinds of ideas.

See below for my summary notes. There are a lot of them; it was a super info-dense book and excellently referenced. Good stuff!

Note: numbers refer to Kindle locations. Click the image top right to enlarge the display in another page.

 

 

 

Key take-homes from webinar with Suzy Pepper Rollins-Learning in the Fast Lane

This post is part of an ongoing series entitled "Wot-I-Got". The series acts as a way for me to share wot-I-got out of a book or presentation and whet your appetite for enquiry. It also forces me to finish books that I start, and then to review and summarise my conference notes!

This morning I woke before my alarm at 4:43am (Australian Eastern Standard Time), eager to jump into the ASCD webinar on Learning in the Fast Lane with Suzy Pepper Rollins. Here are some things that Suzy suggested could be done in classrooms, and some of the ideas that her techniques sparked in my mind.

[update: My summary of the complete book is now available here]

Standards Walls: Suzy suggested using standards walls. A standards wall is a poster or space on the wall outlining all of the learning goals for a class. But it isn't just an A4 printout: it's a big colourful poster that's constantly evolving and edited by students. It acts as a reference for you to say "this is what we learnt last week, and this is what we'll focus on today", and it provides a context for each lesson, building connections between the numerous concepts in a course. Standards walls also provide a framework for assessment and feedback by ensuring that both are in line with the learning goals.

TIP chart

TIPS: Based on the fact that it takes the average student 6 exposures to learn a word, Suzy suggests a chart with three columns (Term, Information, Picture) to help students learn new words. This is better than just a vocab chart because "there's a 35% increase in retention of words if there's a picture". Again, this can be an evolving chart that's stuck on the wall and acts as a reference for students as you move through content.

An idea sparked: I thought that instead of just adding new terms to the TIP as you introduce them in class you could award prizes or rewards to students who interject your teaching with “TIP” when you use a new term that they haven’t heard before. A teacher could deliberately place certain new terms throughout the lesson and introduce the concept with something like “today you will hear 5 new terms, prizes for anyone who shouts ‘TIP’ first when I say each term for the first time”.

Scaffolding for Rigor: Scaffolding is, in Suzy's words, "plugging holes in the boat whilst moving forward". The basic idea is that learning today depends on learning in the past, and one of the biggest challenges students and teachers face when covering new content is missing prior knowledge. "Scaffolding" refers to asking yourself prior to the class "My students could master this concept if only they knew…", then establishing ways to plug these knowledge holes in the lesson. Scaffolding devices suggested were bookmarks (students make bookmarks with times tables or the like), posters on the wall, sticky notes, and, one really original example, a number line posted on the ceiling! I must admit I would have loved to hear about more scaffolding devices for Math particularly and, unfortunately, the webinar didn't have time to get to my question of "How do we scaffold for holes that are really big?" eg: the student still can't do algebra!

Success Starters: Suzy mentioned that in How the Brain Learns Mathematics, David Sousa suggests that students mostly remember what's covered in the first few minutes and the last few minutes of a class. A "Success Starter" is like a warmup, but content focused, to take advantage of those vital learning minutes at the start of a class. Students can never take in all of the information they encounter throughout the day, so they are constantly selectively deleting information, (consciously or unconsciously) asking themselves questions like "What's this got to do with me?" and "Will I have a good chance of being successful at learning this?". Some examples of Success Starters are doing a survey related to the class content, or getting students to do an "Alphabet Brainstorm": trying to think of something that starts with A, B, C, D, etc. (you can restrict the range to keep the task shorter) that's related to one of the concepts you'll be covering during the day. This also links to Suzy's suggestion that one of the biggest determinants of success in class is prior knowledge; alphabet brainstorms really tap into that prior knowledge and share it around with the wider class.

An idea sparked: I thought that a good success starter could be to have different tasks or questions on different tables, and get students to walk around the room, look at the questions, and decide to sit at the table with the question they'd most like to address, or the piece of information they would most like to learn. Alternatively, you could run a pop quiz 5 minutes into the lesson, but don't call it a pop quiz, call it a "peep quiz": the students actually have the answers to the quiz sitting on their tables when they walk into the room, and they have 5 minutes to peep at the answers before they do the quiz. This could work well in a summary class, where the remainder of the class could be used to focus on points the students struggled with in the peep quiz.

See what they’re doing minute-by-minute, Formative Assessment: Many readers will be familiar with formative assesment so I’ll just point out the techniques that were suggested in the webinar:

  • Bow Tie: Students work on a task together on a big 'bow tie' where the centre acts as a space for consensus answers (see pic)
  • Sorts: Getting students to sort information (often presented on small cards) to promote discussion about categories and aid comprehension
  • Cubes: Teachers can make 'cubes' with questions (or components of questions) on them, and students roll the cubes to see which question to do next (or to make up a question, eg: multiply the number on the first cube by the number on the second)

An Idea sparked: I’ve been thinking about employing Flubaroo in the classroom for a while but I had 2 ideas of how to make this a bit smoother.

Firstly: Get students to enter their answers into the Google Docs solutions page only at the end of the test, giving them 3 minutes or so to do this. This has multiple benefits:

  • students won’t get distracted by their computer during the test
  • their writing will have to be tidy so that they can see what they did
  • this forces them to look back over their working and answers
  • It would help to avoid them googling answers (due to the limited time)
  • The teacher could sit at the back of the class and watch screens for this short period of time

Secondly: Double tests in one class! Test students once, apportion some level of value to the first test, then use the feedback from that first test to instantly motivate them to study hard and fast for a second test (this may be better for tests whose marks aren't recorded, as we don't want to put too much pressure on students). Furthermore, the info from test 1 could allow a teacher to match up students who got an answer right with students who could do with a little help (for the questions where there is a balance of correct and incorrect answers). For the questions where lots of students need help, the teacher could go over the process on the board for the whole class' benefit.
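That matching idea can be sketched in a few lines. This is a hypothetical illustration (the question labels, student names and the 50% cutoff are my own choices): for each question, either pair correct with incorrect answerers, or flag the question for whole-class revision when too few got it right.

```python
def plan_question_followup(results, whole_class_cutoff=0.5):
    """For each question, pair students who got it right with those who
    didn't; if too few got it right, flag it for the board instead.

    results: {question: {student: True/False}}
    """
    plan = {}
    for q, answers in results.items():
        right = [s for s, ok in answers.items() if ok]
        wrong = [s for s, ok in answers.items() if not ok]
        if len(right) / len(answers) < whole_class_cutoff:
            plan[q] = ("board", wrong)       # go over it with the whole class
        else:
            pairs = list(zip(right, wrong))  # helper/helped pairs
            plan[q] = ("pairs", pairs)
    return plan

results = {
    "Q1": {"Ava": True, "Ben": False, "Chloe": True, "Dev": False},
    "Q2": {"Ava": False, "Ben": False, "Chloe": True, "Dev": False},
}
plan = plan_question_followup(results)
# Q1: half got it right -> pair Ava with Ben, Chloe with Dev.
# Q2: only 1 of 4 right -> whole-class revision on the board.
```

Note that `zip` simply drops any surplus students when the right/wrong groups are unequal; a real version would decide what to do with the leftovers.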

Self-Efficacy Development: It was suggested that a teacher should consider the following when trying to build self-efficacy among students:

  • the level of difficulty of the task
  • the value of the task (perceived and real)
  • incorporating choices and social interactions into learning
  • compelling openers that build success (see "Success Starters" above)
  • safety for mistakes (try to foster a culture of sharing in the classroom)
  • short term goals (keep learning goals relevant and immediate)
  • ongoing, quick feedback (see "See what they're doing" above)

Building self-efficacy among students remains one of the biggest and most important challenges for teachers. Unfortunately, with the 4 or so minutes that we had at the end of the webinar to cover this point, we weren't able to go into much depth.

Summary: I found the webinar extremely insightful, and Suzy's enthusiasm and obvious experiences of success were really inspiring. I found the Learning in the Fast Lane approach to be a useful collection of practical suggestions for the classroom, and I particularly liked how teachers can pick and choose which elements of the approach to implement and which to leave out. It isn't an "all or nothing" model.

A great summary of the collection of articles that forms the pedagogical basis for the Learning in the Fast Lane model can be found here.

A podcast with Suzy Pepper Rollins can be found here.