TOT012: Knowledge Management, Hiring, + more Twitter Takeaways

Teacher Ollie’s Takeaways is a weekly post (and sometimes a podcast!) bringing together all of the fascinating things that Ollie read throughout the week! Find all past posts of Teacher Ollie’s Takeaways here

A great summary of Cognitive Load Theory

For those interested in CLT, I’ve found no better simplified account of it than this whole website by Michael Pershan. Here’s an excerpt or two.

From the page: The Difference Between Solving a Problem and Learning Some Math From It

If problem solving was ineffective for learning to win a simple game, then it would likewise be trouble for learning something more complex, such as an algebraic procedure. Sweller designed experiments that allowed him to observe novices attempting to solve mathematics problems. He saw the same thing: beginners chose “search” strategies that drew attention away from the sorts of observations that might lead to obtaining a more powerful strategy. If teachers wanted to foster expertise, they would need techniques to circumvent these learning-killing search strategies.

To discover a pattern or a rule, one needs to look away from the goals and their present progress, and instead turn to work in the past. What moves have you already tried? Which combinations of moves work particularly well together? Which angles in a diagram, when derived, help you calculate other angles? By eliminating a single, clear goal for participants to fixate on, participants were free to notice patterns in their past moves. (And if there was a gap between their current status and a goal? They could discard the goal and choose another, instead of working backwards to derive it.) This freedom to think about the past is precisely what is needed for discovering useful, expert-like shortcuts. Sweller’s results showed that these discoveries did, in fact, take place more frequently when problems were given with nonspecific goals. Therefore, nonspecific goals were better for learning than conventional problems.

Worked examples are not problems – they are explanations of how a problem is correctly solved. Goal-free problems function by eliminating means-end search, instead drawing participants’ attention to their past successes.

In another series of experiments, Sweller carefully tested this idea. His results confirmed the hypothesis: the quality of learning was the same whether students learned via worked examples or self-discovered solutions. The major difference was time – problem solving took a lot of it! Worked examples took far less time. In this sense, explanations were more efficient than discovery.

From the page: The Invention of Cognitive Load Theory

“There can be only one ultimate goal,” he wrote, “the generation of new, useful instructional techniques.” Goldman may be right — CLT cannot explain learning in general — but that’s not its purpose. The purpose of CLT, for Sweller, was inventing new teaching techniques.

The best article I’ve ever read on knowledge management within schools

This article by Harry Fletcher-Wood suggests a tangible template that experienced teachers can use to sketch out key information, such as student misconceptions, horizon knowledge (how current learning relates to future learning), and key sequencing, to support knowledge management within a school. Here’s what Harry says about it.

More powerfully, I think a template like this can draw on and collate the collective wisdom of teams of teachers. Lesson plans and powerpoints rarely travel well: collections of representations and misconceptions will: teachers can easily use a good representation, no matter what their teaching style or context. A collection of good representations is transferable between different contexts, in the way that a lesson plan is not. Much of this knowledge is tacit, held in the heads of experienced teachers, passed on by word of mouth and implicit in resources. Collaboratively constructing such planning documents could also be a productive way to share knowledge within departments.

Quick Tip for Leading a Team

And here’s the key takeaway for me.

You need to remember, it isn’t your job to lead each item. The more others take the lead, the more you will be working as a team rather than as a group of individuals that are doing what they are told.

How much do different types of teacher training cost? (plus, info on dropout rates)

Great publication entitled .

The Do’s and Don’ts of effective and efficient marking

Hot tip: The students should spend more time reviewing the feedback than you do writing it!

Nel Noddings… What is caring anyway?

It turns out that, for an act to count as caring, the cared-for has to interpret it as caring. Interesting… this has implications for looking after those who are struggling with mental health issues in particular. I like how it sets up a kind of society in which I know my autonomy will be preserved.

Understanding the types of evidence in Ed Research

Dylan Wiliam Treasure Trove of info!

And there are lots of videos too! This list is awesome, and so many of them are super short, really easy to digest : )

Four questions to ask yourself at the start of any initiative

In this post, Mark Enser suggests that we need to develop systems and culture in tandem to achieve sustainable change.

And here are the four questions:

1. What is the purpose? What culture are we trying to achieve through this? What impact are we hoping for?
2. How will it be supported? What structures will we put in place to achieve this?
3. What will be the success criteria? Set in advance please! How will we know it has been successful when we evaluate it?
4. How does it fit in the time budget? Where is the time coming from? Most school leaders say that they feel their teachers are already working as hard as they can – so what are you taking out to make room for this?


The ultimate question to ask when lesson/unit planning

I think that any maths educators would enjoy reading this piece in full!

6 Edtech tools to explore in 2017

Systematic review of mindfulness interventions

Conclusion: “The findings show that MBIs in schools had a small positive effect on cognitive outcomes and socioemotional outcomes, but did not improve behavior or academic achievement. There was little heterogeneity for all outcomes, apart from behavioral outcomes, suggesting that the interventions produced similar results across studies on cognitive, socioemotional, and academic outcomes, despite the interventions being quite diverse. Overall, Brandy Maynard and colleagues found a lack of support at post-test to indicate that the positive effects on cognitive and socioemotional outcomes then translate into positive outcomes on behavior and academic achievement.”

Restorative justice questions

For more on this, listen to episode 6 of the ERRR podcast!

The ultimate guide to conducting school interviews

In this series of posts, David Didau brings psychology to bear on the teacher interview process. How do our unconscious biases skew our selections, and what can we do to get around this challenge?

Daniel Kahneman offers some useful suggestions in Thinking, Fast and Slow:

If you are serious about hiring the best possible person for the job, this is what you should do. First, select a few traits that are prerequisites for success in this position (technical proficiency, engaging personality, reliability, and so on). Don’t overdo it – six dimensions is a good number. The traits you choose should be as independent as possible from each other, and you should feel that you can assess them reliably by asking a few factual questions. Next, make a list of those questions for each trait and think about how you will score it, say on a 1 – 5 scale. You should have an idea of what you will call ‘very weak’ or ‘very strong’. (p. 232)
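Kahneman’s procedure is mechanical by design: score each trait independently, then simply add the scores. Here’s a minimal sketch of that scheme in Python; the trait names and the example candidate are hypothetical, not from Kahneman or Didau.

```python
# A minimal sketch of Kahneman's structured-interview scoring scheme.
# Trait names and candidate scores are hypothetical examples.
TRAITS = ["technical proficiency", "engaging personality", "reliability"]

def total_score(scores):
    """Sum independent per-trait scores, each rated 1-5 via factual questions."""
    for trait in TRAITS:
        if not 1 <= scores[trait] <= 5:
            raise ValueError(f"{trait!r} must be scored 1-5")
    return sum(scores[t] for t in TRAITS)

candidate = {"technical proficiency": 4, "engaging personality": 3, "reliability": 5}
print(total_score(candidate))  # 12
```

The crucial discipline, on Kahneman’s account, is committing in advance to hire the candidate with the highest total rather than overriding the numbers with intuition.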

Didau offers just such a list for hiring teachers in blog post 2. Check it out!

ERRR Podcast #005. Pamela Snow, Phonics + How can we get the real story from students?

Listen to all past episodes of the ERRR podcast here.

In this episode of the Education Research Reading Room we were lucky enough to have as our guest Professor Pamela Snow.

Pamela is a registered psychologist, having qualified originally in speech pathology. Her research has been funded by nationally competitive schemes such as the ARC Discovery Program, ARC Linkage Program, and the Criminology Research Council, and spans various aspects of risk in childhood and adolescence, in particular:

-the oral language skills of high-risk young people (youth offenders and those in the state care system), and the role of oral language competence as an academic and mental health protective factor in childhood and adolescence;
-applying evidence in the language-to-literacy transition in the early years of school;
-linguistic aspects of investigative interviewing with children / adolescents as witnesses, suspects, victims in criminal investigations;

Pamela has taught a wide range of undergraduate health professionals, and also has experience in postgraduate teacher education. She has research links with the education, welfare and justice sectors, and her research has been published in a wide range of international journals. She is frequently called upon to address education, health, welfare, and forensic audiences.

Pamela’s Twitter handle is @PamelaSnow2 and her blog The Snow Report can be found at http://pamelasnow.blogspot.com.au/

This month we have two articles from Pamela. Guidelines for teachers to elicit detailed and accurate narrative accounts from children and The way we teach most children to read sets them up to fail. The first article is a truly gripping piece on how to talk to students in a way that makes them feel comfortable and willing to share what’s happening at home (when appropriate) or what happened following an incident at school. The second article is a concise and valuable overview of the current landscape of effective literacy instruction.

Links mentioned in the podcast:

Links from the intro/outro.

Links from the body of the interview.

How do we know what to put on the weekly quiz?

I’ve really enjoyed working my way through Brian Penfound’s three blog posts (1, 2, 3) in his Journey to Interleaved Practice series recently. They detail how, prompted by a discussion with the Learning Scientists, Brian has been incorporating interleaving into his integral calculus class.

One particular instrument got me excited: an Excel spreadsheet that can be used to interleave questions when you’re planning both lessons and quizzes (see the blank version here (edit: the Learning Scientists just released a new version here) and Brian’s version here). Here’s a screenshot to give you a taster.

[Screenshot of the interleaving spreadsheet]

Being the focussed (and sometimes obsessed) learning strategist that I am, I really loved this idea. But it got me thinking, is this better than what I’m already doing? Should I adapt my current practice to incorporate this approach?

I’ve written about my assessment and feedback process before here, in which I talk about the weekly quizzes that I give students, and how they incorporate content from the previous three weeks. This means that students see content for a month in a row (in the teaching week, then in the three weeks after that), then in the unit test (a maximum of 4 weeks later, as each topic is approx. 8 weeks long), then in the mid-year practice exam, then in the end-of-year exam.

I wanted to take the opportunity to share how I actually choose which questions to put on these weekly tests (or ‘Progress Checks’ (PC) as they’re called in my classes).

Each week I run the PC, students self-mark in class immediately afterwards, then I collect up the PCs. I keep them overnight and return them to students the next day (for two of my classes; the third class waits 3 days due to timetabling), and in the meantime I enter the marks into my gradebook. When I return the PCs to students (I do this once they’ve settled into some question work), I carry around a little notebook and have a mini-conference with each student. The questions I ask are generally:

“How do you feel you went?”

“What did you get wrong?”

“What mistake did you make?”

“How much prep did you do for this Progress Check?”

And finally,

“Which question numbers did you get wrong?”

From that, I collate the following.

[Image: MAFPCW5 collated list of questions students got wrong, de-identified]

(Any student who doesn’t demonstrate that they prepared for the PC gets a detention, which I also note on this sheet.)

I then take a photo of this and store it with the progress check itself, like so.

[Photo: the collated notes stored with the progress check]

Then, when it comes time to write the next week’s PC, I feed in the questions that were answered incorrectly (or variations thereof), as well as new content, in addition to other concepts from the previous 3 weeks that I think are also important to touch on again.

I was really excited by the excel approach, but I’m still very attached to the adaptive approach that I’m using. Perhaps the optimum would lie somewhere in-between, using both a more-complex structure than ‘the last 3 weeks’ (such as is offered by the excel spreadsheet), plus some element of adaptability to the questions and concepts that students are clearly struggling with.
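The hybrid approach gestured at above could be sketched in code: take the spaced ‘last three weeks’ window, then give priority to variations of questions students got wrong. A minimal sketch, with hypothetical data structures and question names rather than my actual system:

```python
# Sketch of a hybrid quiz builder: spaced review of the previous three weeks
# plus adaptive re-inclusion of (variations of) missed questions.
# All names and data are hypothetical illustrations.

def build_quiz(current_week, questions_by_week, missed_last_week, size=6):
    """Select questions for this week's Progress Check."""
    pool = []
    # Spaced component: concepts from the previous three weeks.
    for week in range(max(1, current_week - 3), current_week):
        pool.extend(questions_by_week.get(week, []))
    # Adaptive component: variations of missed questions go in first.
    quiz = list(missed_last_week)
    for q in pool:
        if q not in quiz and len(quiz) < size:
            quiz.append(q)
    return quiz[:size]

questions_by_week = {1: ["Q1a", "Q1b"], 2: ["Q2a", "Q2b"], 3: ["Q3a", "Q3b"]}
print(build_quiz(4, questions_by_week, missed_last_week=["Q2b-variant"]))
# → ['Q2b-variant', 'Q1a', 'Q1b', 'Q2a', 'Q2b', 'Q3a']
```

A fuller version would replace the fixed three-week window with the more complex interleaving schedule from the spreadsheet, but the priority-to-missed-questions idea stays the same.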

An opportunity for further exploration!

TOT011: Teaching Curiosity, Pre-questioning + more Twitter Takeaways

Teacher Ollie’s Takeaways is a weekly post (and sometimes a podcast!) bringing together all of the fascinating things that Ollie read throughout the week! Find all past posts of Teacher Ollie’s Takeaways here

How do we teach Curiosity?

In this blog post Michael Fordham writes that we can’t teach curiosity in the abstract; we need to teach students things that they can then be curious about.

Here’s an excerpt.

On this line, when I say that I want to cultivate the curiosity of my pupils, what I am in practice saying is that I want them to be curious about a greater range of things. I want to bring more parts of our reality into the realm of their experience. I cannot make them more or less curious per se: what I can do is give them more things to be curious about.

This is why memories are so important to me. A pupil who has remembered some of the things I taught her about neoclassical architecture is more likely to be curious about a building built in that style. Indeed, she may well be more likely to be curious about a building not built in that style. Another pupil who remembers something I taught him about the causes of cholera in the nineteenth century might have his ears prick up when he hears about an outbreak, or reads about it somewhere else. This is in part what I think people mean when they say that knowledge breeds more knowledge.

Should we use pre-questions?

This is a fantastic article detailing a study by Carpenter and Toftness that explores whether or not we should ask pre-questions. Here’s what it found.

  1. The benefit of pre-questioning prior to reading is that it improves students’ retention of the information that was asked about.
  2. The cost of pre-questioning prior to reading is that it reduces students’ retention of information that wasn’t asked about.
  3. Interestingly, when pre-questioning before a video, we see a boost in retention of both pre-questioned and non-pre-questioned information.

So why is this?

The authors suggest that it could be because, when an individual is reading, it’s easier for them to skip over other information and focus on the pre-questioned material. When watching a video, the effect is instead for students to simply focus harder the whole time.

Podcast with Michaela Head of Mathematics, Dani Quinn

Well worth a listen, I’ll leave it at that.

The more a teacher knows about how to teach their subject, the more they should use direct instruction

In this post, Greg Ashman outlines how the work of Agodini and colleagues pitted two constructivist-based approaches against two direct instruction approaches to middle years maths instruction (in an RCT). A recent analysis of their results by Eric Taylor found that, for teachers who scored lower on a ‘Mathematical Knowledge for Teaching’ test (i.e., PCK), there was less difference between the outcomes of the constructivist and the DI methods. Ashman explains this as follows.

In a program where the teacher has to stand up and actually teach maths, their maths skills matter, but when the students have to figure things out for themselves then the more skilled teachers have no way of making use of their greater skill level.

And from this, Ashman makes the following suggestions.

  1. Primary teachers must pass a maths skills test if they are to teach mathematics (schools could perhaps reorganise so that maths was taught by specialists to get around the problem of getting all teachers to this level)
  2. Primary teachers who lack maths skills should be given training in this area
  3. Explicit programs for teaching maths should be adopted in primary schools

How to rebut an argument with style

With name-calling at the lowest rung of the disagreement hierarchy, we move up through Ad Hominem, Responding to Tone, Contradiction, Counterargument, and Refutation, and conclude with Refuting the Central Point. A relevant article in these times of online jousting.

Why do some Immigrants Excel in Science?

The study by Marcos Rangel reported in the article found that a particular set of characteristics was associated with immigrant students doing particularly well in science. The article states:

Bacolod and Rangel subdivided the immigrants in two ways. First, whether they arrived in early childhood, before age 10. Second, whether their native language was linguistically close to English — say, German — or less similar — say, Vietnamese. Most linguists agree that these two factors have a dramatic impact on someone’s chances of becoming perfectly fluent in a second language…

…among the subset of immigrants who attended college, the ones who arrived later and from more linguistically distinct places — think the Vietnamese teen, not the German toddler — were many times more likely to major in a STEM field.

The authors argue that this is simply specialisation, suggesting: “If it were just as easy for me to write with my left hand as with my right, I would be using both. But no, I specialize.” So, in many ways, it appears to be a very rational decision. For those learning a second language later in life, the greatest chance at success is to focus on an area where a potential language differential is less likely to be an Achilles heel.

Hey teacher, are you really as good as you think at explaining things?

In this post, Ben Newmark details his somewhat humbling journey to clearer explanations for his students, and the role that videotaping himself played in this journey.

To remember: the phrase ‘illusory superiority’, coined in 1991 by Van Yperen. We tend to overestimate our abilities in relation to others.

Assessment feedback: Processes to ensure that students think!

We know that ‘Memory is the residue of thought’ (Daniel Willingham) and that, in order for our students to learn, they must actively think about the content to be learnt. This allows the content to occupy their working memory for long enough, and to become anchored to sufficient elements in their long-term memory, to trigger a change in long-term memory, one of the well-respected definitions of ‘learning’ (Paul Kirschner).

One of the arenas of teaching in which this can be most challenging is delivering feedback to students. Dylan Wiliam sums it up well in the following quote (which I came across thanks to Alfie Kohn).

Note: The original quote is “When students receive both scores and comments, the first thing they look at is their score, and the second thing they look at is…someone else’s score”, and can be found here (beware the paywall). 

The challenge is, then, how do we give feedback to our students in a way that encourages them to actively think about their mistakes, and helps them to do better next time?

In the following I’ll share how I give feedback to students in two contexts. The first is on low stakes assessments that I carry out in my own classroom, the second is on major assessment pieces that contribute towards their final unit mark.

Assessment Feedback on weekly Progress Checks.

Before we dive in I’ll just paint a picture of how my weekly ‘Progress Checks’ fit into my teaching and learning cycle, and how each of these elements is informed by education theory.

At the start of each week students are provided with a list of ‘weekly questions’. They know that the teaching during the week will teach them how to answer these questions. Questions are to be aligned with what we want students to be able to do (curriculum and exams) (Backwards Design). Students are provided with worked solutions to all questions at the time of question distribution (The worked example effect). The only homework at this stage of the cycle is for students to ensure that they can do the weekly questions.

‘Progress Checks’ (mini-tests, max 15 minutes) are held weekly (Testing Effect). Progress Checks include content from the previous three weeks, which means that students see the main concepts from each week for a month (Distributed Practice). These PCs are low-stakes for year 11 students (contributing 10% to their final overall mark), and in year 12 (where assessment protocols are more specifically defined) they are simply used to inform teachers and students of student progress.

Edit: Here’s a new post on how I use student responses to these PCs to construct the next PCs. 

When designing the progress checks I had two main goals: 1) Ensure that students extract as much learning as possible from these weekly tests, 2) Make sure that marking them didn’t take up hours of my time. The following process is what I came up with.

Straight after the PC I get students to clear their desks, I hand them a red pen, and I do a think-aloud for the whole PC while they mark their own papers. This is great because it’s immediate feedback and self-marking (see Dylan Wiliam’s black box paper), and it allows me to model the thinking of a (relative) expert and to be really clear about what students will and won’t receive marks for. Following this, any student who didn’t attain 100% on the progress check chooses one question that they got incorrect and does a reflection on it based on 4 questions: 1) What was the question? 2) Which concept did this address? 3) What did you get wrong? 4) What will you do next time?

Here are some examples of student self-marked progress checks and accompanying PC reflections from the same students (both from my Y11 physics class). Note: Photos of reflections are submitted via email and I use Gmail filters to auto-file these emails by class.

[Image: a student’s self-marked progress check]

Note how this student was made aware of consequential (follow-through) marks on question 1.

Here’s the PC reflection from this same student (based upon question 2).

[Image: the same student’s PC reflection]

Here’s another student’s self-marked Progress Check.

[Image: the self-marked Progress Check]

And the associated reflection.

[Images: the associated PC reflection]

Students are recognised and congratulated by the whole class if they get 100% on their progress checks, as well as one student from each class winning the ‘Best PC Reflection of the Week’ award. This allows me to project their reflection onto the board and point out what was good about it, highlighting an ideal example to the rest of the class, celebrating students’ successes, rewarding students for effort, and framing mistakes as learning opportunities.

I think that this process achieves my main two goals pretty well. Clearly these PCs form an integral learning opportunity, and in sum it only takes me about 7 minutes per class per week to enter PC marks into my gradebook.

Assessment Feedback on Mandated Assessment Tasks.

There are times when, as teachers, we need to knuckle down and mark a bunch of work. For me this is the case with school-assessed coursework (SACs), which contributes to my students’ end-of-year study scores. I was faced with the challenge of making feedback on such a test as beneficial to my students’ learning as the PC feedback process is; here’s what I worked out.

  1. On test day, students receive their test in a plastic sheet and unstapled.
  2. At the start of the test, students are told to put their name at the top of every sheet.
  3. At the end of the test I take all of the papers straight to the photocopier and, before marking, photocopy the unmarked papers.
  4. I mark the originals (Though the photocopying takes some time I think that in the end this process makes marking faster because, a) I can group all page 1s together (etc) and mark one page at a time (this is better for moderation too) and b) because I write minimal written feedback because I know what’s coming next…)
  5. In the next lesson I hand out students’ photocopied versions and I go through the solutions with the whole class. This means that students are still marking their own papers and still concentrating on all the answers.
  6. Once they’ve marked their own papers I hand them back their marked original (without a final mark on it, just totals at the bottom of each page), they identify any discrepancies between my marking and their marking, then we discuss and come to an agreement. This also prompts me to be more explicit about my marking scheme as I’m being held to account by the students.

In Closing

I’ve already asked students for feedback on the progress checks through whole-class surveys. The consensus is that they really appreciate them, and they like the modelling of the solutions and the self-marking too. One good thing about putting together this post is that it prompted me to contact my students and ask for feedback on the self-marking process for their photocopied mandated assessment task. I’ll finish this post with a few comments that students said they’d be happy for me to share. It also provides some great feedback to me for next time.

I’d love any reflections that readers have on the efficacy of these processes and how they could potentially be improved.

From the keyboards of some of my students (3 males, 3 females, 5 from Y12, one from Y11).

[Screenshots of student feedback comments]

Edit:

A fellow maths teacher from another school in Melbourne, Wendy, tried out this method with a couple of modifications. I thought the modifications were really creative, and I think they offer another approach that could work really well. Here’s what Wendy said.

Hey Ollie,

I used your strategy today with photocopying students’ sacs and having them self correct. The kids responded so well!

Beyond them asking lots of questions and being highly engaged, those that I got feedback from were really positive saying how it made them look at their work more closely than they would if I just gave them an already corrected test, understood how the marking scheme worked (and seeing a perfect solution) and they liked that they could see why they got the mark they did and had ‘prewarning’ of their mark.

Thanks heaps for sharing the approach.
A couple of small changes I made were
  • I stapled the test as usual, then just cut the corner, copied the papers, and restapled them. It was very quick and could be done after the test without having to put each test in a plastic pocket
  • I gave the students one copy of the solutions between two. Almost all kids scored above 50% and most around the 70% mark, and I didn’t want them to have to sit through solutions they already had.

If you have thoughts/comments on these changes I’d love to hear them.

Thanks again!

References

Find references to all theories cited (in brackets) here.

TOT010: The limits of ‘evidence based’ education + more Twitter Takeaways

Teacher Ollie’s Takeaways is a weekly post (and sometimes a podcast!) bringing together all of the fascinating things that Ollie read throughout the week! Find all past posts of Teacher Ollie’s Takeaways here

Culturally Sensitive Instruction of English

Really enjoyed this episode of the Cult of Pedagogy podcast. It can be a challenge to know how to correct the culturally-based idiosyncrasies in our students’ speech in a culturally-sensitive way. Jennifer Gonzalez and Dena Simmons discuss how to do this with both respect and finesse. Well worth a listen!

Challenging the fallacious Fact/Value divide in Education Research

I’m naturally a very quantitatively driven guy. I seem to be drawn to numerical metrics of success, sometimes missing the forest for the trees. Something I’ve been exploring a lot recently is the assumptions underlying much educational research. Here’s just one of the blog posts that I’ve found stimulating in this space in the last few weeks. I’ll hopefully blog more about this in the not too distant future.


On the ‘Fact/Value’ false dichotomy. 

The one side asserts the importance of facts and thinks you cannot argue rationally using evidence about values so excludes them from science, the other side asserts the importance of values and agrees that these cannot be put on a proper research footing so exclude themselves from science. But what if the claim introduced so casually by Hume nearly 300 years ago is simply wrong? What if we can derive values from an investigation of the facts?

And values are always present

Values enter into research when we select what to look at, when we decide how to look at it and when we interpret the meaning of what we think we see (Standish, 2001). So values are always implicit behind ‘experimental designs’.

Double loop learning!


My suggestion from this example is that what appears to many researchers as an unbridgeable divide between facts and values within educational research is perhaps better understood as the real difference in quality and temporality of these two intertwined research loops. On the one hand focused empirical research within a theoretical framework that embeds values and on the other hand the rather larger and rather longer dialogue of theory that can question and develop the assumptions of research frames. Both loops can be seen as united in a larger conception of science as collective ‘Wissenschaft’ or shared knowledge. Both call upon evidence from experience and both make arguments that seek to persuade. While research findings from the smaller loops of empirical science based on narrow frameworks of assumptions can seem to progress faster and be more certain for a while than the findings of the larger borderless transdisciplinary shared enquiry of humanity this is an illusion because in fact the cogency of the assumptions behind narrow research programmes depend upon justifications that can only be made within the larger dialogue.

This boils down to the fact that we need to ask ourselves… ‘more efficient at what?’


And here’s another quote from a Schoenfeld article on the same topic!

Do Comprehensive Schools Reduce Social Mobility?

Just a paper that I thought some readers might like to check out on this subject!

Boliver, V., & Swift, A. (2011). Do comprehensive schools reduce social mobility? The British Journal of Sociology, 62(1), 89–110.

Maybe the source of the PISA discrepancy is in large part due to paper- vs. computer-based implementation!?

 

A Behaviour Management Guide for School Leaders

Google Drive Tools for Teachers

Addressing issues of cultural and political sensitivity in the classroom

This article is well worth a read! Here are some of my favourite quotes…

“When it feels more partisan, we walk more of a tightrope. For the ‘alt-right,’ I didn’t feel we had to walk a tightrope,” said Leslie, who viewed teaching about the alt-right as akin to teaching about the KKK. Racism ought to be a non-partisan subject, she said.

Learning about the alt-right, for example, is a lesson in political literacy. Teachers should not ask students to decide whether the alt-right is a good thing, but they can teach how it came about and how it has affected the political system, Hess said.

 

ERRR Podcast #004. Paul Weldon, Teacher Supply and Demand, and Out of Field Teaching

Listen to all past episodes of the ERRR podcast here.


Paul Weldon is a Senior Research Fellow with the Australian Council for Educational Research. He works on multiple different educational research programs and is commonly involved in program evaluation and the design, delivery and analysis of surveys. Through his work on the Staff in Australia’s Schools (SiAS) surveys in 2010 and 2013, Paul developed a particular interest in the teacher workforce. He was the lead writer of the most recent Victorian Teacher Supply and Demand Report, and led the recent AEU Victoria workload survey.

In this episode we talk to Paul about his two papers, The Teacher workforce in Australia: Supply, demand and data issues and Out-of-field teaching in Australian secondary schools. This episode’s discussion includes an in-depth examination of the oft-quoted claim that ‘30% of teachers leave within the first 3 years and 50% within the first 5’ in relation to retention of early career teachers, the landscape of teacher supply and demand out to 2020, as well as what the distribution of out of field teaching in Australia says about how we value our out of field teachers.

Links mentioned in the podcast:

 

Australian Policy Online: ‘a research database and alert service providing free access to full-text research reports and papers, statistics and other resources essential for public policy development and implementation in Australia, New Zealand and beyond.’

My attempt at an evidence informed student feedback form.

Seeing as my students have to endure my presence, instructions, and bad jokes for 3 hours and 45 minutes each week, I figure the least I can do is give them an opportunity to tell me how I can make this task a little easier for them. In my first year of teaching I knocked together the below form. I’ve used it for a year now and it’s been really helpful to date. In particular, it’s helped me to bring more celebration into my classroom, with many students over the past year indicating that they want their successes to be celebrated more (usually with lollies!). 
[Screenshot: the original student feedback form]

This has been great, but as I’ve moved into my role as head of senior maths this year it’s prompted me to think more strategically about student feedback, and the role it can play in my own, and my team’s professional development.

No feedback form is going to tell a teacher, or a team leader, everything they need to know in terms of ‘Where am I going? How am I going? Where to next?’, but I’ve been feeling more and more as though these forms do have a key role to play in helping teachers to spot gaps, and motivating and inspiring us to improve our praxis.

I was really happy with the willingness of my team to roll out the above form (obviously with ‘Ollie’ changed to their individual names) in their own classes, and the insights gained were very illuminating. But coupling these feedback forms with my own observations provided an even bigger insight for me. It surprised me just how differently students (novices when it comes to principles of instruction) and I (a relative expert) view what happens in a classroom.

From this it’s become more apparent to me that if I want student feedback to more effectively drive my own professional development, I need to start asking better and more targeted questions that will allow me to see exactly where my teaching is excelling, and where I’m falling short.

So, here’s a first draft of the new feedback questions (which I’ll eventually turn into a Google Form). I’ve based it off the Sutton Trust’s article What makes great teaching? Review of the underpinning research, headed up by Robert Coe. I’ve used the first four of the six “common components suggested by research that teachers should consider when assessing teaching quality” (p. 2). These are the components rated as having ‘strong’ or ‘moderate’ evidence of impact on student outcomes, and they’re also the components with observable outcomes in the classroom (5 and 6 are ‘Teacher Beliefs’ and ‘Professional Behaviours’, which encapsulate practices like reflecting on praxis and collaborating with colleagues).

For each of the following I’ll get students to rate the sentence from 1, strongly disagree, to 5, strongly agree, in the hope that this will give me a better idea of how students interpret the various components of my teaching and teacher disposition.

I’ll also add a question at the end along the lines of ‘Is there anything else you’d like to add?’.

I’ve numbered the Qs to make it easy for people to make comments about them on twitter. This is a working document and today is the second day of our 2 week Easter break. I’m keen to perfect this as much as possible prior to Term 2. Please have a read and I’d love your thoughts and feedback  : )

Ollie.

Link to Twitter discussion here.

Edit: A copy of the live form can now be viewed at: http://tiny.cc/copyscstudentfeedback

Four (of the six) components of great teaching (Coe et al., 2014): Questions
1. (Pedagogical) content knowledge (Strong evidence of impact on student outcomes)

Student friendly language: Knowledge of the subject and how to teach it.

The most effective teachers have deep knowledge of the subjects they teach, and when teachers’ knowledge falls below a certain level it is a significant impediment to students’ learning. As well as a strong understanding of the material being taught, teachers must also understand the ways students think about the content, be able to evaluate the thinking behind students’ own methods, and identify students’ common misconceptions.

1.1 This teacher has a deep understanding of the maths that they teach you. They really ‘know their stuff’.

1.2 This teacher has a good understanding of how students learn. They really ‘know how to teach’.

 

If you have any comments on this teacher’s knowledge of the content and how to teach it, please write them below.

2. Quality of instruction (Strong evidence of impact on student outcomes)

Student friendly language: Quality of instruction

Includes elements such as effective questioning and use of assessment by teachers. Specific practices, like reviewing previous learning, providing model responses for students, giving adequate time for practice to embed skills securely and progressively introducing new learning (scaffolding) are also elements of high quality instruction.

 

2.1 This teacher clearly communicates to students what they need to be able to do, and how to do it.

2.2 This teacher asks good questions of the class. Their questions test our understanding and help us to better understand too.

2.3 This teacher gives us enough time to practice in class.

2.4 The different parts of this teacher’s lessons are clear. Students know what they should be doing at different times throughout this teacher’s lessons.

2.5 The way that this teacher assesses us helps both us and them to know where we’re at, what we do and don’t know, and what we need to work more on.

2.6 This teacher spends enough time revisiting previous content in class that we don’t forget it.

If you have any comments on the quality of this teacher’s instruction, please write them below.

3. Classroom climate (Moderate evidence of impact on student outcomes)

Student friendly language: Classroom Atmosphere and Student Relations

Covers quality of interactions between teachers and students, and teacher expectations: the need to create a classroom that is constantly demanding more, but still recognising students’ self-worth. It also involves attributing student success to effort rather than ability and valuing resilience to failure (grit).

 

3.1 Students in this teacher’s class feel academically safe. That is, they don’t feel they’ll be picked on (by teacher or students) if they get something wrong.

3.2 Students in this teacher’s class feel socially safe. That is, this teacher promotes cooperation and support between students.

3.3 Even if I don’t get a top score, if I try my best I know that this teacher will appreciate my hard work.

3.4 This teacher cares about every student in their class.

3.5 This teacher has high expectations of us and what we can achieve.

If you have any comments on the atmosphere of this teacher’s classroom, or their student relations, please write them below.

4. Classroom management (Moderate evidence of impact on student outcomes)

Student friendly language: Classroom Management

A teacher’s abilities to make efficient use of lesson time, to coordinate classroom resources and space, and to manage students’ behaviour with clear rules that are consistently enforced, are all relevant to maximising the learning that can take place. These environmental factors are necessary for good learning rather than its direct components.

 

4.1 This teacher manages the class’ behaviour well so that we can maximise our time spent learning.

4.2 There are clear rules and consequences in this teacher’s class.

4.3 This teacher is consistent in applying their rules.

4.4 The rules and consequences in this teacher’s class are fair and reasonable, and they help to support our learning.

4.5 Students work hard in this teacher’s class.

If you have any comments on this teacher’s classroom management, please write them below.

Final Open-ended Questions

If you have any further comments or questions in relation to this teacher, please feel free to share them below.
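As an aside, once responses start coming in (say, exported from the eventual Google Form as a spreadsheet), the 1–5 ratings can be averaged per component to spot gaps at a glance. Here’s a minimal sketch in Python; the response data and the mapping of question codes to components are invented for illustration, not real survey results:

```python
# Average 1-5 Likert ratings per component of great teaching (Coe et al., 2014).
# The responses below are made up for illustration; real data would come from a
# Google Form export (one dict per student, mapping question code -> rating).

responses = [
    {"1.1": 5, "1.2": 4, "2.1": 4, "2.3": 2, "3.3": 5, "4.2": 3},
    {"1.1": 4, "1.2": 5, "2.1": 3, "2.3": 2, "3.3": 4, "4.2": 4},
    {"1.1": 5, "1.2": 4, "2.1": 5, "2.3": 3, "3.3": 5, "4.2": 3},
]

# Question codes map onto the four components via their leading digit
# (e.g. "2.3" belongs to component 2, Quality of instruction).
components = {
    "1": "Content knowledge",
    "2": "Quality of instruction",
    "3": "Classroom climate",
    "4": "Classroom management",
}

def component_averages(responses):
    """Return {component name: mean rating} across all students and questions."""
    totals = {}  # component digit -> (sum of ratings, count of ratings)
    for student in responses:
        for code, rating in student.items():
            digit = code.split(".")[0]
            s, c = totals.get(digit, (0, 0))
            totals[digit] = (s + rating, c + 1)
    return {components[d]: round(s / c, 2) for d, (s, c) in totals.items()}

# Components printed from weakest to strongest, so gaps jump out first.
for name, avg in sorted(component_averages(responses).items(), key=lambda kv: kv[1]):
    print(f"{name}: {avg}")
```

With the invented data above, Quality of instruction comes out lowest, which would suggest that’s where to focus first. Averaging by question code rather than by component would work the same way, just with the full code as the key.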

 

 

TOT009: What makes good PD + more twitter takeaways.

Teacher Ollie’s Takeaways is a weekly post bringing together all of the fascinating things that Ollie read throughout the week! Find all past posts of Teacher Ollie’s Takeaways here

Astrophysicists and feminism

A great post, prompted by a meme shared for International Women’s Day, on how young women aspiring to be astrophysicists is great, but so is little girls aspiring to be princesses…

What makes a good PD?

Turns out that almost all professional development for teachers fails; that is, it doesn’t have any measurable impact on student learning (great citations for this in this article). In the face of this, should we give up on PD altogether? In this article @HfFletcherWood tells us some of the keys to good PD.

PISA and Technology in the Classroom

20 good youtube channels for Maths Teachers

The back and forth on explicit instruction

If you want to hear leaders in their field engaging in the constructivism vs. explicit instruction debate, the articles linked to in the comments of this article are a fantastic place to start. I’m working my way through them at the moment.

The performance of partially selective schools in England

Do partially selective schools improve results for students? Here’s a moderate scale study suggesting partially selective schools maybe don’t have such beneficial effects for those who attend…

Philosophy For Children. Effective or not?

Philosophy for Children is a program that aims to teach students how to think philosophically, and to improve oracy skills, and communication more broadly. Here’s a study attesting to its efficacy, see replies to this tweet for an alternative view…

The Mighty, A website highlighting the writing of Mighty People

Eloquent argument against the same old ‘new education’ assumptions

Tom Bennett argues against a new film that rips on our educational system. The film trots out all the usual ‘stifles creativity’ and ‘rote learning’ tropes. Great reply from Tom Bennett.

What to do when your child stares at another child with a disability?

Great post from Daniel Willingham. Hot tip: ensure it’s a social interaction. Follow the link for more.

Trump’s policies in perspective

Just because…