Nosmo King – There’s nothing worse than an ex-prog teacher.

H. Vernon Watson (1886–1952) was a popular English variety artist. He toured the music halls before World War I but remained relatively obscure until the 1920s, when he became increasingly popular despite his terrible routines, his often misogynistic jokes and the fact that he blacked up, taking to the stage under the name Nosmo King.

We had it all along. Smoke. 

Until recently, I assumed Nosmo King was a relatively innocent (non-blacked-up) character whom my dad often referred to when we were on a fun-packed family holiday back in the mid-eighties. A typical Wells reference to Nosmo King began with an hour-long hunt around a drizzly Cornish tin-mining town or a Welsh coal-mining village for a pub which:

  1. Was willing to allow children in
  2. Served food for lunch
  3. Had a tiny area of about three tables with little plastic signs declaring it a “No Smoking” zone

“Oh look, it’s Nosmo King,” Dad would hilariously declare on spotting the signs. We would all reluctantly mumble a laugh, before ordering a round of crinkly cheese sandwiches with cress and blue Panda Pop garnish. This relief would be short-lived, however, as the smoke mysteriously drifted across from the surrounding areas of the pub and Dad began the ritual complaining of the worst kind of anti-smoker: the ex-smoker.

Rockin’ the Suburbs

It would be difficult, these days, to find a pub that didn’t do at least two of the three things on that list. Numbers one and two now make fairly sound business sense, whilst number three could end up in a legal case. It would also be rare for a performer to take the decision to black up on stage or screen as the practice is widely held to be racist; it is mainly associated with derogatory portrayals of racial stereotypes.

Go ahead you can laugh all you want, but I’ve got my philosophy. 

These two examples demonstrate that, as individuals, groups and societies, our attitudes and our resulting actions can change over time – we’d hope for the better, though quite often this is not the case. In this old post, I outlined how my philosophy relating to education has changed over the past few years; I am an ex-progressive. I thought that post would get things off my chest. However, I feel the need to speak out again as the yellowing smoke of those denying the need for debate has come creeping once more towards my now-traditional No Smoking zone of traditional education. It’s even making a muddled mess of my metaphors.

I’m missing the war (ba ba ba ba)

The first example of this occurred in a conversation on Twitter in which a headteacher claimed that the teachers in their school weren’t interested in the debate. This headteacher went on to say that they wouldn’t be wasting their time trying to explain the debate to their teachers as taking an interest in the debate might encourage teachers to take sides and prevent them from being reflective and open minded.

Fighting the battle of who could care less

The second, similar point of view came from the #Nobestwayoverall thread on Twitter. The argument is that there is no single way of teaching that has been proven to be the best methodology in all contexts. In this line of argument, traditionalism is a set of teaching strategies which can sit alongside any other. It is a pragmatic view as it means teachers should be free to select the best way of teaching for their context.

I don’t want to take on this view in its entirety here, but rather focus on a more specific claim from the same thread – that acceptance of the #nobestwayoverall viewpoint would end the ‘trad’ and ‘everyone else’ Twitter war for good. 

I don’t get many things right the first time. In fact, I am told that a lot. 

The issue I have with these two stances is that they seem to imply that the debate between traditionalists and progressives is primarily about the processes of education rather than its priorities. This is, perhaps, because many of the, admittedly very useful, books and blogs which have been written about the debate (some of which can be found at the end of this post) have set out the differences in a manner which pits the processes one by one, head to head.

If you switch this and focus on the differences in the priorities of the two ideologies rather than the processes, I think you can make more sense of the longevity of the debate. Apologies if any of the following seems caricatured. If you feel there are inaccuracies, then please let me know, but keep in mind that it is very difficult to capture the contrasts in two sets of beliefs that various people have defined in different ways over the years.

In essence, traditionalists see the purpose of education as being the passing on of a body of factual and procedural knowledge (a tradition). The priority is the tradition because retaining it and passing it on are seen as being beneficial to society and individual children now and, therefore, society and adults in the future. Authority lies with the tradition and the individuals and institutions who pass it on – teachers and schools. Children are expected to behave in a way which is respectful towards the tradition as well as these individuals and institutions.

The progressive school of thought pits itself against this sequence of priorities. It is harder to define as it is, I think, a broader “church.” The educational priority of progressives is the development of the child. For some this means the development of the child as an individual. For others it relates to the development of the child as a part of society. For most, it is about preparing children for their lives after school. As a result of the child coming first, the curriculum and pedagogy are built around the children being taught. In many ways, this results in a symbiotic relationship between the curriculum and the pedagogy, as the process being used to learn can be as important as, or more important than, the material or knowledge which is being manipulated in the process. 

To exemplify this (and, again, I risk caricature here), a progressive teacher might want their students to improve their critical thinking. They will spend time selecting materials to use, but likely more time on the methods they will use (the activity or activities), as it is the practice of critical thinking which they want the students to have and develop more than the retention of the content in the materials. Students will learn about the topic of the materials, though this will be a by-product of them learning how to be critical thinkers. In contrast, a traditional teacher will carefully select the knowledge which they wish students to learn and retain. Becoming a critical thinker will likely be, in their minds, a by-product of learning the knowledge – the more knowledge their students retain, the more critical they can be when they encounter others’ viewpoints relating to that topic in the future. Thus, progressives are not unconcerned with knowledge and traditionalists are not uncaring toward children. It is just that, in terms of education, their priorities are different. 

From this dichotomous set of priorities and purposes stem a set of associated methodologies rather than the other way round. 

Come pick me up. I’ve landed. 

So, where does this leave us with the views highlighted earlier on? 

In the case of #Nobestwayoverall it is difficult to see that the hashtag will bring an end to the debate as it is primarily about what “best” means rather than the ways we teach – though some people have tried to make it about this. If we can’t agree on a definition for the purposes and priorities of education, then the empirical evidence offered by science in its different forms, used by one side or the other, is not going to help us to agree as to the methods we should use.

In terms of the other line of argument, it seems to me that it is easier to achieve the purpose a leader has in mind for their school if that sense of purpose is shared as broadly as possible by the people who are working within the school. If the purpose is changeable or the priorities are debatable, then perhaps a pragmatic approach is required. Perhaps the fixedness of the curriculum and teaching methodology matters less in these circumstances, though I doubt it. If there is a clarity and steadfastness to the purpose, then a more fixed set of methods is likely to be shared too and, one would think, the purpose is more likely to be achieved. 

It seems to me that a shared sense of purpose, curriculum and methodology is most likely the best way. But then I’m an ex-progressive: the worst kind of teacher. 

 

Dory says…

“Some activities, such as playing pop music in pop music groups, solving crossword puzzles and folk dancing have no standard training approaches. Whatever methods there are seem slapdash and produce unpredictable results. Other activities, like classical music performance, mathematics and ballet are blessed with highly developed, broadly accepted training methods. If one follows these methods carefully and diligently, one will almost surely become an expert.” Peak by Anders Ericsson and Robert Pool

Saturday mornings in the Wells household begin with a manic game of hunt the swimming kit before a trip to the local leisure centre for my son’s swimming lesson. 

Both of our kids are making their way through the stages of Aqualetes, based on the nationally recognised Swim England Learn to Swim programme. As you watch the sessions unfold over time, you can see the way everything has been carefully sequenced – from the way the instructors get children used to having water in their hair through doggy paddle, breaststroke, back and front crawl to developing the children’s knowledge of survival strategies. I’m still not quite convinced by butterfly or backwards sculling, but the rest all makes sense. 

Last week, I watched as one of the teachers called a boy back to the side of the pool, re-explained the correct leg movement for breaststroke, showing him with her arms, then gave him two more opportunities to show her that he could do it correctly. The boy set off the first time and you could tell from the reduction in his speed and slight awkwardness in his movement that he was really thinking carefully about the corrections he’d been coached to make. His legs were moving better, but the front half of his body was submerging more between each stroke. This corrective stage was affecting his fluency but he was trying to do exactly as he’d been told. The second time through, his performance improved. It wasn’t, by any means, perfect but it was more fluid and resembled breaststroke more closely. This was Stage 5 swimming and he was moving closer to Stage 5 performance. 

“You’ve been struggling to make things right. That’s how a superhero learns to fly.” The Script. 

Watching this expert swimming teacher and her pupil reminded me of a couple of other interactions I’d seen recently on Twitter about Direct Instruction and scripted lessons. If you’re uncertain about what Direct Instruction is, then this is worth reading. In essence, it’s a carefully sequenced, carefully scripted approach to teaching which is designed to ensure mastery of the content covered. The programmes, written by experts, have been tested and reworked based on a set of core principles. They are designed for students who, based on assessments, are currently at a very similar level in terms of the knowledge gaps they address. They are designed to maximise the impact of the interactions between teacher and students. Here’s a page of scripted material from Corrective Reading, available from McGraw Hill. 

Direct Instruction was road-tested in the most extensive educational experiment ever conducted: Project Follow Through. Details of how this was done can be found here.

The objections to Direct Instruction on Twitter seem to come in two main forms. The first is that the Direct Instruction programmes are likely to be detrimental to students as their scripted nature ignores the fact that students are human beings who don’t talk in scripts – scripting is dehumanising, opponents argue, and students aren’t all the same. The second is that scripting deprofessionalises teachers. In particular, there is a concern that new teachers will not learn how to plan for themselves or how to adapt to the needs of different learners. 

We run two Direct Instruction programmes at our school and are planning on implementing a third next year. Though I haven’t personally taught one, I’ve observed a number of sessions and I’d disagree with both of the issues raised above, though I can partially understand why people might raise them. 

The students who are learning through these programmes need the structured support which they receive from the teacher, the materials and the method of delivery, even more so than the boy in the pool last Saturday. They need explanations which have been planned with precision; they need their teacher to ask questions which elicit responses from them, either individually or chorally, in order for that teacher to intervene and correct swiftly where necessary; and they need each session to build on the last with application of new material as well as knowledge that has previously been mastered. All students need these things, but these students have gaps in their knowledge which need closing quickly, efficiently and in a humane way. Some of the critics of Direct Instruction seem to imply that it is a robotic process. When I walk past the classrooms where these sessions take place, it seems far from robotic. These classrooms are places of joy because the students taking part are motivated by the fact that they are building their knowledge together from one session to the next. 

The teacher leading the session loves it too and certainly doesn’t feel de-skilled by the process of using the script. They can see, over time, the students retaining more knowledge and able to apply this in more skilled ways. The feedback they get from the students, because of the way the scripts are designed, is more frequent and interventions or corrections can be made immediately due to the clarity of the questioning. 

So, should we be concerned for new teachers? Well, when I look back on my very first lesson ten years ago, I think not. I was going to be teaching my mentor’s top set Year 9 group for a one-off lesson on persuasive speaking. I must have spent the week beforehand preparing myself. I recorded, on videotape, Thursday’s episode of Question Time. Amongst the panel members that week were Richard Branson, Theresa May, presumably a model of poor speech, and Margaret Beckett. I spent ages and ages watching a short clip, rewinding again and again and again until I’d pulled together worksheets on a rickety word processor for each of the speakers. Different students would focus on different members of the panel and then feed back to other students in jigsaw groups. 

Genius and a worthwhile use of a new teacher’s time? 

No. 

I thought it was ok and it probably wasn’t a car crash. But “probably not a car crash” is not a good model of a lesson for students, and it is not a good model for teacher training either. As a result of all the time I’d spent on resourcing, I’d spent almost no time on the precision of my explanation of what makes excellent panel debate or on the concision of the questions I’d ask afterwards. I also had no sense of how this lesson would have fitted into the whole curriculum and, because I was so busy focusing on whether I could pause the VHS at the precise moment I wanted, I had no time to focus on the basics of classroom management. I had a limited mental model of what expert teaching could be like and I’d certainly not practised teaching using someone else’s model – a decent model mapped out or scripted by an expert. I was basically sinking in style rather than swimming through substance.  

Though I’m not advocating their use for every lesson, what Direct Instruction scripts offer is a way to rapidly and efficiently build students’ knowledge whilst also benefiting trainee and expert teachers. 

As an added bonus, I’ve yet to see any DI materials featuring faltering politicians.

Litteranguage – Part 1

Before you read the rest of this blog, take a few minutes to go off, get yourself a drink and consider these questions:

  • What do you think should be the purpose of an English Language GCSE?
  • What do you think is the purpose of the current English Language GCSE?
  • To what extent is the current English Language GCSE fit for purpose?

I literally want you to not read on – at least not until you’ve had a think anyway…

Stop right now. Thank you very much.

…So, I hope you’ve had a good drink as well as a bit of time to consider. Now, about those questions. Let’s mull over some possible responses to the first one.

My suspicion is that your answer fell into or across one or more of the categories below as they’re an amalgamation of responses which I received to the same question on Twitter.

The purpose of the English Language GCSE is to:

  1. Act as a governmental lever to ensure schools cover specific curriculum content
  2. Establish a clear set of “skills” which students should develop during their time in Years 10 and 11.
  3. Define the knowledge that students should be able to recall at and beyond the end of Year 11, ensuring that students are exposed to the best that has been thought and said in order for them to get “cleverer.”
  4. Assess mastery of a proportion of the content of the National Curriculum for English at Key Stage 4.
  5. Gauge the “progress” which individuals or cohorts of students have made between Key Stage 2 and Key Stage 4.
  6. Function as a qualification that demonstrates to employers a certain level of proficiency in language use.
  7. Prepare pupils for the study of English Language at a higher level.

Too much of something is bad enough. 

What’s interesting is the potential for conflict which exists between these aims. You may favour one or more of these purposes over another. You may think that one of these purposes should play no role in the design of a qualification which most children take at age 16. You may think it’s possible to do all of these things through a single qualification. You may even want to suggest an alternative or additional purpose which has no connection with any of these. If you do, I’d be interested to know what you’ve come up with, so please do get in contact.

Now though, I’d like to explore each aim in turn, thinking about how it ties in with the current GCSE and whether the qualification as it stands is fit for purpose. In the rest of this post, I’ll look at numbers 1-3, Part 2 will explore 4-6 and the final post will examine number 7 before going on to tentatively suggest some possible improvements.

Tell me what you want, what you really, really want.

As academies and free schools no longer have to follow the National Curriculum and there are currently no Key Stage 3 SATs tests, the GCSE qualification is the main lever by which the government can attempt to ensure curriculum content coverage.

This lever is used to make some content less important and some more significant.

For example, the National Curriculum for Key Stage 4 says that “Pupils should be taught to understand and use the conventions for discussion and debate.” In the English Language GCSE, however, students are only assessed through an individual presentation with follow-up questions. Moreover, the fact that this assessment no longer contributes to the final grade, but instead is allocated a grade in its own right (albeit one that doesn’t feed into the school’s data set), has impacted on the way many schools approach the spoken language aspect of the qualification. There will be schools which focus on the teaching of debate and discussion, but, due to their disappearance from the assessment model, I would expect the numbers to be fewer than had previously been the case.

There are similar examples in other aspects of the curriculum. In writing, the National Curriculum establishes the expectation that teachers will teach students how to “make notes, draft and write, including using information provided by others [e.g. writing a letter from key points provided; drawing on and using information from a presentation].” If this act of note making and transformation fed into a question in the GCSE paper, then it would be covered extensively by English departments across the country. As it is, I’d guess, in comparison with other aspects of the curriculum, it’s barely touched upon.

In contrast, “seeking evidence in the text to support a point of view, including justifying inferences with evidence” is assessed a great deal. This is required for every single reading question in both AQA papers.

One reason this happens is that there’s a document which acts, in some ways, as a filter between the National Curriculum and the GCSE Specifications – the English Language GCSE Subject Content and Assessment Objectives.

According to the gov.uk website, these “GCSE subject content publications [set] out the knowledge, understanding and skills common to all GCSE specifications.” I knew something like this existed, but have only read it properly as part of the process of writing this blog.

In English, this is the document which establishes that “All texts in the examination will be ‘unseen’, that is, students will not have studied the examination texts during the course.” It sets out that these have to be “challenging texts from the 19th, 20th and 21st centuries.” It shifted English language exams away from media studies by declaring, “Texts that are essentially transient, such as instant news feeds, must not be included.” Like it or not, the government have clear leverage over the content of the GCSE in English Language as well as some leverage over the style of the papers. If you favour this purpose over all of the others outlined at the start of this blog, you could argue that the GCSE is fit for purpose.

It’s also worth keeping in mind what might happen if we removed government leverage altogether and removed the English Language GCSE. In some schools, what is assessed is what is taught. Were the English Language GCSE to be removed in favour of assessing reading and writing through other subjects, then the focus on coherence, clarity and accuracy in communication would, in my view, fade. 

Swing it, shake it, move it, make it, who do you think you are?

I’ve also never taken a look at the documents for other subjects before. Here are the content and assessment objectives for science, maths and religious studies. When you compare them, one thing that’s likely to immediately strike you is the relative brevity of the English document. In fact, there are very few which are shorter (drama and computer science are two examples). Some might argue that this is due to the nature of each subject – that science is primarily a knowledge-based subject whilst the English curriculum leans towards the development of a set of skills. The science content document explains that its “first section explains the main ways in which working scientifically should be developed and assessed…The second section sets out the key ideas and subject contents for the biology, chemistry and physics components of combined science.” To exemplify this, one of the scientific ways of working is “presenting observations and other data using appropriate methods” whilst one of the scientific ideas which students need to know from biology is that “life processes depend on molecules whose structure is related to their function.” This structuring of skills and knowledge isn’t present in the English Language document. Rather, we have a list of skills like “critical reading and comprehension…summary and synthesis…and producing clear and coherent texts.”

There are clear differences between science and English and the way the document is structured to focus on skills is beneficial if you believe purpose number 2 in the list at the start of this blog post to be most important. However, this structuring leads to two key issues in terms of the English GCSE. The first relates to teaching and the second to the design of the exam papers.

Firstly, it would be difficult to argue that the skills listed are not worth developing over time in terms of our students’ ability to communicate. There is, though, a body of knowledge which needs to lie behind this. In terms of English, this specifically includes punctuation, vocabulary, spelling, word classes, grammar, figurative language and rhetoric. There is little point, I think, in getting students to apply a set of skills if they do not possess a body of knowledge to apply them with. In addition, due to the unseen nature of the texts, if they are not to be disadvantaged, the reading section of the exam also requires students to have a broad knowledge of history, geography, science – a general knowledge. It is possible, to a degree, to improve performance in exam questions by practising exam questions. If we want students to improve as communicators, then we need to think more widely, addressing gaps in factual and procedural knowledge. To some extent, the exam can be a pot-luck test rather than a test of students’ knowledge and application of English language. Ensuring that students develop the knowledge which is hidden behind the subject content document for English Language requires great teaching across the whole of the curriculum and, likely, a restructuring of the English curriculum.

Secondly, the weighting of the content document towards skills, and the fact that the final say in the document is given to the Assessment Objectives, have led exam boards (most notably AQA) to directly focus each of the questions in their exam papers on a limited range of the Assessment Objectives. AQA have also designed their papers so that the structure and question stems will be almost identical year on year. The result of this is a set of questions which is mechanical in nature and responses which, I’d predict, will become increasingly mechanical over time in order to address aspects of the mark scheme. The skills that students will be developing with this style of GCSE will be the skill of responding to a specific question type, rather than the skills of reading and writing. If we genuinely want students to develop skills and apply their knowledge effectively, then we may have to accept a less standardised form of exam paper. For this reason, I’d argue that the current model of GCSE in English Language is not fit for purposes 2 and 3 in our list.

 

Literature is not Clickbait

Over the last fifteen years or so of English teaching, I’ve read my fair share of analytical responses to literature. I have just over a hundred Year 10 mock papers sitting in a box, calling to me to befriend them so that they can be marked this weekend. They will all be my friends by Tuesday once they’re marked – not quite so much on Sunday or Monday whilst I’m marking them though. 

Amongst these papers there will likely appear the phrase “this makes the reader want to read on.” It is a phrase which I have seen many times before and it is a phrase which I will not befriend. 

Yesterday on Twitter, I posted the following: “Seriously, who has ever told a child that a writer does something to make the reader want to read on? Where does that come from?” The initial response was as I’d expected. A number of other English teachers retweeted what I’d written, presumably because they felt the same frustration I did. Then came a challenge. 

Some people began to question whether I was suggesting it was a stupid way to respond to a text. It was claimed that writers, with some exceptions, do write with the intention of their readers continuing through their texts until they finish them. Writers, they argued, have commercial interests in their texts being read to the end. 

This is true to an extent. There are relatively few writers, I’d expect, who would want their books, articles or poems to sit in the bargain bucket gathering dust and decreasing in value. So, why does the phrase “makes us want to read on” bug me so much?

  1. It is banal. 
  2. If the texts we are putting in front of our students are more than mere clickbait, then there is so much more that could be said about them. 
  3. It doesn’t take much to extend the thinking, and the expression, of children who say or write this as a response. It just requires one more question, some time to think and some time to write or rewrite. Why does it make you want to read on? How does it get you to read on? What does it make you feel or think that makes you want to continue feeling that way? All kinds of questions. 
  4. Modelling how to enhance analytical writing from this point is not difficult and will help students to see there is so much more to say. 
  5. A child who struggles to express much more than this as a response to literature is more likely to struggle to craft their writing effectively. 

My frustration at the statement was, therefore, a frustration at a system which has allowed too many of our students to reach the age of sixteen unable or unwilling to respond to literature in a way that sees beyond its commercial quality. There is beauty, ugliness, peculiarity, familiarity, truth, illusion, fear, contentment, tragedy, comedy and a whole spectrum of other emotions, sensations, traditions and thoughts to be found in literature. Supporting children to move beyond the banal should help them to see much of this. 

Pedagogical Content Knowledge in English

“…the question of what teachers should understand if they wish to teach a domain responsibly is no simple challenge. In the field of English teaching, where canons are under question and “consensus” is more frequently misspelled than accomplished, the problem of teacher knowledge is daunting.”

In her paper ‘Knowing, Believing and the Teaching of English’, quoted above, Pamela Grossman outlines just some of the key challenges faced by those who try to define the knowledge English teachers require. In essence, they are that:

  • There are numerous ways of dividing up the English curriculum. For example, some argue it can be split into linguistics, literature and composition whilst others would divide it into reading, writing, speaking and listening.
  • English, particularly reading, is an interpretive domain and there are many interpretive schools of thought. There is therefore a question about the number of standpoints which teachers should be able to take. There is also a pedagogic question about whether it is a teacher’s role to tease out the interpretations from students or to know them all themselves.
  • The history of literature is as sprawling as history itself. How extensively should teachers know the impact of contexts on the texts being studied?
  • Our memories and understanding of how we developed procedural knowledge in writing and reading is buried deep, so expressing this to novices in a useful way is challenging.

There are many more issues than this. In each case, though, the most important element in finding a solution is ensuring clarity in the English curriculum.

Teachers require this clarity in order to know what their students ought to know, so that they can also know how to pre-empt and address issues when their students don’t know what it is they should know as well as they should know it. Clarity is vital, isn’t it?

The issues Grossman outlines in English teaching impact on our understanding of what Shulman terms pedagogical content knowledge (PCK).

PCK is a form of practical knowledge that is used by teachers to guide their decisions and actions in subject focused classrooms. This type of practical knowledge requires, amongst other things:

1. Knowledge of the relationship between content and students – a grasp of the common conceptions, misconceptions, and difficulties that students encounter when learning particular content.

2. Knowledge of the relationship between content and the curriculum – how to sequence and structure academic content to maximise the impact of direct teaching to students.

3. Knowledge of the relationship between content and teaching – the specific teaching strategies, pedagogical techniques and ways of representing, modelling and explaining knowledge that can be used to address students’ learning needs in particular classroom circumstances with specific content.

I’d like to look at each of these in turn with a specific focus on PCK in English.

As a starting point, I think it would be useful to begin a list of the key realisations students need to have in the field of English. Where students haven’t had these realisations, misconceptions emerge. Over time, I’d like to build up a resource a little like this from the AAAS in science.

Realisations for students of English language, literature and composing texts:

Spellings 

That the following words are not the same:

  • Your and you’re
  • Their, there and they’re
  • Two, to and too
  • Practice and practise
  • Bought and brought
  • It’s and its
  • Desert and dessert
  • Dryer and drier
  • Chose and choose
  • Lose and loose

That some verbs have the same sound as nouns but with a different spelling and meaning.

Grammar – Words

That nouns aren’t simply people, places and things.

That an adverb is a word or phrase that modifies the meaning of an adjective, verb, or other adverb, expressing manner, place, time, or degree. Not all adverbs end in -ly and not all words ending in -ly are adverbs.

That verbs aren’t simply “doing words.”

That adjectives aren’t simply “describing words.”

That a word can have different functions, dependent on the context in which it is used.

Grammar – Sentences

That the central components of a sentence are subjects and verbs, not full stops and capital letters.

That the placement of a full stop or comma in a text isn’t defined by the need to breathe.

That simple sentences are not just short sentences.

That complex sentences aren’t simply long or just filled with complex information.

That writers will break syntactical conventions for specific reasons.

That we don’t use a modal verb, like should, with “of” because “of” is not a verb, it is a preposition. Some people write “should of” because when they speak, they say “should’ve.”

That auxiliary verbs can also function as main verbs.

That subjects and verbs in sentences need to be in agreement. For example, we write I/he/she/it was but we/they/you were.

That pronouns can function as sentence subjects.

Reading for understanding

That, unless you are willing to read in your own spare time, your chances of developing your knowledge and understanding of the world, as well as the worlds presented by the writers you encounter in class, are limited; you won’t grow intellectually.

That many writers will bother to spend a great deal of time considering lexical and syntactic choices.

That (chains of) words can be used by writers as symbols.

That writers do not necessarily write with the same voice they use for speaking.

That the views of a narrator are not necessarily the same as the views of the writer.

That there can be more than one narrator of a story.

That we can’t always trust what we’re told by the narrator of a story.

That writers organise their texts for both clarity and influence.

That writers do not always understand or appreciate the significance of their stories.

That writers do not always endorse everything that happens in their narratives.

That bad people can write good stories and good people can write bad stories.

That the narrative may not be the most important part of a narrative text.

That your interpretation of (part of) a text may not be the only one.

That, though there are often multiple possible interpretations of a literary text, there are also wrong interpretations.

That writers don’t start from scratch every time. They draw on ideas from other people and they adapt and combine forms and often use archetypal characters.

That some writers use the plot and characters in their texts as a way of commenting on society or providing a moral message.

That writers, particularly of non-fiction, often consider the nature and scale of their audience in deciding how to write rather than just setting off.

That the act of writing for a public audience is done to influence thoughts, feelings or actions.

That characters in fiction books aren’t the same as real people.

That writers sometimes deliberately use cliche.

That a narrative is still fictional, even if its context is real.

Reading – Poetry

That great poets consciously select forms which are almost invariably linked to the content and/or meaning of their poems.

That lyrical poems are not the same thing as lyrics for songs or raps.

That stressed syllables are louder than unstressed syllables.

That words that look similar do not always rhyme.

That poets miss out letters to change metrical patterns.

That some words no longer rhyme due to language change.

That poems are often part of a sequence or collection.

That ‘ing’ is not the part of the word that rhymes.

Composition – General

That it is better to carefully consider the precise words you want to use than include as many words as you can.

That the language you’re using may not be standard English.

That what we write about and how we write is influenced by the world around us as well as who we are writing for.

That some people will judge you based on the way you speak and write.

Composition – Creative

That adding ellipses at the end of (part of) a story doesn’t increase the tension and is an unnecessary way of highlighting a cliffhanger.

That ‘said’ is often the most appropriate verb to describe speech.

Composition – Analytical

That all writers want their audience to read on so there are more interesting things you could say about the effects of their language choices.

That a text implies rather than infers, and a reader infers rather than implies.

That rhetorical questions have a more specific purpose than to make the reader think.

That a quotation is a group of words taken from a written or a spoken text.

That you do not have to agree with the writer in order to be able to see their point of view.

That, even though you find a text dull, it can still have literary or wider cultural value.

Composition – Rhetorical

That rhetorical questions are not questions that don’t require an answer, that there are multiple types of rhetorical question and that each type has a subtly different function.

That repetition is not a rhetorical method in and of itself.

Composition – Structure

That a change in paragraph marks a shift in time, place, topic, point of view or speaker in dialogue but that writers sometimes break these conventions for effect.

That the plot of a story doesn’t always have to be told in chronological order.

 

Integrating Assessment, Knowledge and Practice

“If we want pupils to develop a certain skill, we have to break that skill down into its component parts and help pupils to acquire the underlying mental model [to see in their mind what it should look like in practice]. Similarly, when developing assessments for formative purposes we need to break down the skills and tasks that feature in summative assessments into tasks that will give us valid feedback about how pupils are progressing towards that end goal.”

(Daisy Christodoulou – Making Good Progress)

“Deliberate practice develops skills other people have already figured out how to do and for which there are effective training methods.

  • It takes you out of your comfort zone. 
  • It involves well defined, specific goals. 
  • It requires full attention. 
  • It involves feedback and modification. 
  • It requires a focus on specific aspects of procedures.”

(Summarised from Anders Ericsson – Peak)

Have you ever tried connecting two hoses together while the water’s flowing? It’s a tricky, splashy business that’s easy to get wrong. There’s a pivotal point in education where you can say the same of curriculum design and assessment. 

Swindon Academy’s curriculum design follows a mastery model based on the following set of key principles:

[Image: Swindon Academy Curriculum Model – the key principles referred to above]

Our Curriculum Leaders and the teachers in their teams have worked hard to develop curriculum overviews and schemes of learning which reflect these principles, often drawing on the best, publicly available resources. In some faculties and in some key stages, this work is further advanced than in others. Whatever stage the teams are at in this development though, they would agree that there is more to do, especially with the recent introduction of new GCSE and A-Level specifications and changes to vocational courses.

Over the course of last half term, I met with each of the Curriculum Leaders to review the Swindon Academy Teaching and Learning Model.

[Image: Codification document summarising the Swindon Academy Teaching and Learning Model]

These discussions confirmed that the model still effectively summarises what we would want to occur in classrooms on a day to day basis as well as over time. Two areas of the model came up consistently in discussion as needing re-development though:

  1. The assessment element no longer describes current feedback practices which now vary from faculty to faculty due to the introduction of faculty feedback policies.
  2. Prep (homework) needs to feature more prominently to establish clearer expectations and reflect its importance in developing students’ independence.

Alongside this review, I’ve been reading a range of research relating to cognitive science as well as educational texts on assessment, including the books Driven by Data by Paul Bambrick-Santoyo and Making Good Progress by Daisy Christodoulou, and blogs by Phil Stock, Heather Fearn and Joe Kirby. These have made me consider, in a different light, how we could tighten up our assessment model so that we:

  1. Know that the assessment systems which are being used are as reliable as we can make them.
  2. Have a shared understanding of the range of valid inferences we can draw from the data provided by these systems.
  3. Ensure that we maximise the impact of these systems, without damaging their reliability.
  4. Continue to increase the level to which students take responsibility for their own progress.

The remainder of this blog is taken from a paper designed to kick start this process. It is divided into two elements: a teacher element and a student element. The first focuses on curriculum and assessment design whilst the second looks at the use of knowledge organisers and self-testing as prep.

The teacher element:

We’ve found, in introducing a number of Doug Lemov’s Teach Like a Champion strategies, that it’s useful to have a shared vocabulary so that we can have efficient and effective conversations about teaching. This should also be the case with assessment practices.

Key definitions:

The following terms will be, I think, key to developing our shared understanding of assessment practices:

Domain:

The domain is the entirety of the knowledge from which an exam/assessment could draw to test a student’s understanding/ability. At Key Stage 4 and 5, this is defined by the specification, though there are also elements of knowledge from previous Key Stages which aren’t listed in specifications but that still form part of the domain.

Sample:

The sample indicates the parts of the domain which are assessed in a specific task or exam. It’s rare we’d assess the whole of a domain as the assessment would be overly cumbersome. Well designed assessments are carefully thought through. Samples should represent the domain effectively so that valid inferences can be made based on the data gained from the assessment.

Validity:

The validity of an assessment relates to how useful it is in allowing us to make the inferences we’d wish to draw from it. “A test may provide good support for one inference, but weak support for another.” (Koretz D, Measuring Up) We do not describe a test itself as valid or invalid, but rather the inferences which we draw from it.

Reliability 

If an assessment is reliable, it would “show little inconsistency between one measurement and the next.” (Christodoulou)

Test reliability can be affected by:

Sampling:

  • Most tests don’t directly measure a whole domain; they only sample from it as the domain is too big. If the sample is too narrow, the assessment can become unreliable.
  • If the sample is always the same, teachers will strategically teach to the test to seemingly improve student performance.

Marking:

  • Different markers may apply a mark scheme rubric differently.
  • One marker’s standards may fluctuate during a marking period.
  • Teachers can consciously or subconsciously be biased towards individuals or groups of students.

Students:

  • Performance on a particular day can vary between the start and end of a test.
  • Students perform differently due to illness, time of day, whether they have eaten, or the emotional impact of life experiences.

Difficulty model

In this form of assessment, students answer a series of questions of increasing difficulty. A high jump competition or a GCSE Maths exam paper are good examples of this model.

Quality model

Here, students perform a range of tasks and the marker judges how well they have performed, most often in relation to a set of criteria. Figure skating competitions and English and history GCSE papers use this model.

General issues which Christodoulou identifies with the most common assessment models:

  • A focus on the teaching and assessment of generic skills can lead to teachers paying insufficient attention to the knowledge required as a foundation for those skills. For example, vocabulary, number bonds, times tables, historical chronologies or relevant, subject specific facts can be overlooked in favour of how to evaluate or problem solve.
  • Generic skill teaching makes deliberate practice far more challenging as it focuses on larger scale success as opposed to fine grained assessment and training. For example, formative assessment in sport may take place during a match rather than a drill. Here, the teacher may miss an issue which a student has with a specific aspect of the sport and then not address it.
  • Using only exam questions for assessment, especially though not exclusively for subjects whose exams are based on the quality model, can hide weaknesses which are at a smaller scale.

Specific issues which Christodoulou identifies with ongoing descriptor assessment and exam based tests:

Limitations with using descriptor based assessments to formatively assess:

  • Descriptors can be vague or unspecific.
  • Using assessment descriptors to give feedback can be unhelpful as they describe performance rather than explain how to improve.
  • Descriptors focus on performance rather than long term learning.

Limitations with using descriptor based assessments to summatively assess:

  • Tasks are often not taken in the same conditions by all students which makes assessment less reliable.
  • Descriptors are interpreted differently by different markers.
  • Judgement based on descriptors is subject to bias.

Limitations with using exam based assessments to formatively assess:

  • By their nature, these tests have to sample from a wider domain so we cannot identify precise areas of strength and weakness for students.
  • As questions become more challenging, it also becomes more difficult to identify which aspects of a question students did well or badly in.
  • Exams are designed to provide grades and grades aren’t sensitive enough to measure progress in individual lessons.

Limitations with using exam based assessments to summatively assess:

  • If we use exam formats and grades too often with students we can end up teaching to the short term rather than the longer term.
  • All students need to take the assessments in the same conditions to secure levels of reliability.

Assessment Solutions: 

Having established these issues, Christodoulou suggests the following principles for effective formative and summative assessment:

Formative assessment principles:

  1. The tasks/questions set need to allow teachers/students to easily identify issues and next steps. In particular, if you teach a subject which is normally assessed through the quality model in exams, it is worth considering a more fine grained testing approach to assess formatively.
  2. The process needs to include repetition to build to mastery otherwise the formative assessment won’t have the desired impact.
  3. Once material has been mastered, students need to be required to frequently retrieve key learning from their long term memories.
  4. Formative assessment should be recorded as raw marks as this makes it easiest to track from one lesson to the next.

Summative Assessment Principles:

  1. Summative assessments should be taken in standardised conditions and marked in a way which maximises reliability.
  2. They should cover a representative sample of a significant domain.
  3. Scaled scores are more reliable than raw marks for summative assessment.
  4. Enough time should pass between summative assessments for students to make worthwhile improvements.

What are our next steps for the teacher element?[1]

  1. Ensure the curriculum is effectively mapped out and sequenced, establishing the factual and procedural knowledge which students will learn. Divide the knowledge from the curriculum into that which students need in the long term and that which students need for a specific unit. Ensure the bulk of curriculum and prep/revision time is spent on students focusing on retaining the most important knowledge. Build space into the curriculum to assess retention of knowledge from previous units which students need in the long term.
  2. Establish when students will be assessed both summatively (whole Academy calendar) and formatively (faculty curriculum overviews). As far as possible, this should take into consideration: the completion of teaching all elements, enough time between teaching and testing for revision and to suggest that our inferences are based on learning rather than performance.
  3. Ensure that the purpose of each assessment is clear to all involved in its design, delivery, marking and provision of feedback. The format of the test should enable the function to be achieved. It should also ensure that the inferences drawn from the results are as valid as possible. The main purposes of our summative assessments include re-streaming students, reporting to parents, establishing attainment and progress over time in teaching groups and cohorts of students to report to governors. A key question for you here is whether your summative assessments are reliable enough to enable you to validly infer that certain students are working at “age related expectations” in your subject. Formative assessments should be used to identify potential gaps in knowledge, misconceptions or deficiencies in ability that can be subsequently addressed.
  4. Design assessments aligned with this timing and purpose. Using Christodoulou’s principles for summative and formative assessments will help here. Over time, two separate banks could be built up: one of summative and one of formative assessment tasks. For summative assessment, it’s also worth asking yourself the following questions, based on those found in Bambrick-Santoyo’s book Driven by Data. Do assessments in each year:
    • Address the same standard of skill/content as the end of Key Stage assessment?
    • Match the end of Key Stage assessment in format?
    • Enable students to move beyond that year’s content/skill level?
    • Reassess previously taught content which is necessary to retain until the end of the Key Stage?
  5. Trial the use of comparative judgements in subjects where a substantial proportion of assessment uses the quality model.
  6. Preview assessment tasks to ensure that:
    • Questions don’t provide clues as to the answer.
    • Questions are actually testing that students have learned or can apply the knowledge you wanted rather than something else.
    • Questions are worded accurately and any unnecessary information is removed.
  7. Review assessments after use to establish whether they provided you with information that enabled you to make the inferences you wished. Make amendments to assessment items, where required, if they are to be reused in the future.
  8. Standardise the conditions in which summative assessments take place and the ways in which they are marked.
  9. Ensure that, where data from assessments is used to make key decisions, the data is sufficiently reliable. For example, when moving students between sets, data from more than one assessment should be used.
  10. Develop the teaching and learning review which forms part of each teacher’s CPD Booklet to ensure that teachers have action plans in place to address gaps in attainment.
  11. Establish procedures for Curriculum Leaders to review and summarise teachers’ action plans, sharing them with their Line Managers for quality assurance.

The Student Element. 

Over the past two years, a number of our faculties have been trialling the use of knowledge organisers and low stakes testing or quizzing as part of the process of curriculum design. Different models have emerged, sometimes with different purposes and using different frameworks. We want to make the use of knowledge organisers, self-testing and flashcards a core part of our students’ prep across subjects.

In order to secure the highest impact of this work, we need to evaluate the models currently in use to generate a set of shared principles and uses for these tools. We need to be sensibly consistent in our approach, keeping in mind the differences between the subjects that we teach. There are certainly potential benefits to the use of both knowledge organisers and quizzing, but we need to ensure these are harnessed effectively in each subject area.

Why should we bother with quizzing and knowledge organisers? Aren’t they just fads?

The term knowledge organiser could be a fad, but the idea of organising knowledge into schemas certainly is not; it has been going on for centuries.

As subject specialists who have carefully mapped our curriculum from Key Stage 3 to Key Stage 5, we would be wise to look for the most effective methods of ensuring that students retain as much as possible of the knowledge we teach them from one year to the next and, of course, into their lives beyond school.

On a more pragmatic level, in order to support our students to do well in the new GCSE qualifications, we need to help them develop methods for retaining knowledge in the longer term. These qualifications are more demanding: they require students to retain knowledge for longer, as they are based increasingly on terminal examinations rather than coursework, and they ask more of students in terms of problem solving.

Even if it weren’t for this though, over the course of the last century, hundreds of cognitive science studies have ranked practice testing as one of the most effective methods of improving the retention of information and procedures in the long term memory. “In 2013, five cognitive scientists (Dunlosky, Rawson, Marsh, Nathan, Willingham 2013) collated hundreds of such studies and showed that practice testing has a higher utility for retention and learning than many other study techniques.”

The table below is taken from John Dunlosky’s “Strengthening the Student Toolkit”. In this paper, he argues that, “while some [study] strategies are broadly applicable, like practice testing and distributed practice, others do not provide much – if any – bang for the buck.” Low-stakes practice testing is one of the most effective study methods.

[Table: the relative utility of study strategies, from Dunlosky’s “Strengthening the Student Toolkit”]

Alongside this sits Cognitive Load Theory and the work of John Sweller. Our teaching and learning handbook outlines the idea that our working memories have limited capacity, coping with only approximately 7 ± 2 items of information. Once we go beyond these limits, our thinking processes become bogged down. These ideas have been refined over the last couple of decades into a set of instructional principles called Cognitive Load Theory. In their book, “Efficiency in Learning”, Sweller et al argue that, “Taken together, the research on segmenting content tells us that:

  • Learning is more efficient when supporting knowledge, such as facts and concepts, is taught separately from main lesson content.
  • Teaching of process stages should be preceded by teaching the names and functions of components in the process.
  • Teaching of task steps should be segmented from teaching of supporting knowledge such as the reasons for the steps and/or concepts associated with the steps.”

Well-designed knowledge organisers or schemas and effective self-testing could therefore be useful in terms of reducing the cognitive load on our students when they are applying knowledge in performance, production or problem solving.

Knowledge Organisers

In a blog post entitled “Knowledge Organisers: Fit for Purpose?”, Heather Fearn describes how she looked at a wide range of knowledge organisers and found that there was often confusion over their purpose, which made the documents muddled in design. As a result, they were confusing for students to use. She identifies three valid purposes:

  • A curriculum mapping tool for the teacher
  • A reference point for the pupil
  • A revision tool for the pupil and possibly parents

Given that we have Schemes of Learning for teachers to make use of and text books for students as a wider reference resource, I believe a useful definition of a knowledge organiser at Swindon Academy would be:

A structured, single A4 sheet which students, teachers and parents can use to create low stakes practice quizzes. The sheet identifies the raw knowledge which needs to be recalled swiftly in order to be successful within the assessment for a specific unit. This could include: 

  • Definitions of terms, concepts or key ideas
  • Components of a process
  • People/Characters involved in a chronology
  • Processes/Chronologies/Narrative summaries
  • The steps in procedures

Use the following to check the formatting of your knowledge organisers.

  • Colour code knowledge which will be required beyond the end of the unit and knowledge which is only required in the medium term.
  • Number each item in each section to enable easy self-testing.
  • Embolden the absolute key words so that peer markers of quizzes can check they have been used in answers.
  • If you have to write more than one sentence, consider your phrasing. This will make your own explanations clearer and more efficient when you speak.
  • Don’t have too many sections/categories – four or five are probably sufficient.
  • If including images, ensure these are the same format as those you will use in your actual lessons.
  • Spellcheck your knowledge organiser.
  • Don’t include questions or ‘thoughts to consider’.
  • If it isn’t essential it shouldn’t be there.

Self-testing. 

In his blog, “One Scientific Insight for Curriculum Reform”, Joe Kirby of Michaela Community School poses the question: “what’s the optimal format and frequency of low-stakes testing or retrieval practice?” He cites various research papers from Roediger et al. In terms of format, he maintains that “Applied research suggests [well designed] multiple-choice questions are as effective as short-answer questions. The latest research study is as recent as March 2014, so this is a fast-evolving field, and one to keep an eye on.” With regard to frequency, he adds, shorter and more frequent quizzes outperform longer and less frequent ones. However, current research suggests that the impact on our long term memory is maximised if this testing is spaced and interwoven.

He then goes on to summarise the work of a number of cognitive psychologists from the book “Make It Stick” in the following set of principles for self-testing:

  • Use frequent quizzing: testing interrupts forgetting
  • Roll forward into each successive quiz questions on work from the previous term.
  • Design quizzing to reach back to concepts and learning covered earlier in the term, so retrieval practice continues and learning is cumulative.
  • Frequent low-stakes quizzing in class helps the teacher verify that students are in fact learning as well as they appear to be and reveal the areas where extra attention is needed.
  • Cumulative quizzing is powerful for consolidating learning and concepts from one stage of a course into new material encountered later.
  • Simply including one retrieval practice test in a class yields a large improvement in final exam scores, and gains continue to increase as the frequency of testing increases.
  • Effortful retrieval makes for stronger learning and retention. The greater the effort to retrieve learning, provided that you succeed, the more learning is strengthened by retrieval.
  • In virtually all areas of learning, you build better mastery when you use testing as a tool.
  • One of the best habits to instill in a learner is regular self-quizzing.

What are our next steps for the student element?

  1. Design knowledge organisers which fit the definition above for Schemes of Learning in Years 7-9.
  2. Use the checklist above to review the knowledge organisers.
  3. Devise self-tests or drills which could be used to assess students’ retention of the knowledge. This should include:
  • Completion of a blanked out timeline
  • Matching definitions and key terms
  • Labelling key diagrams from the organiser
  • Answering questions based on the knowledge organiser
  • A crossword with definitions from the organiser as the clues
  • Translation exercises for MFL using vocabulary from the organiser
  • Short answer questions and multiple choice questions based on the knowledge from the organiser
  4. Generate a prep schedule for students for self-testing of the sections of each knowledge organiser. In the first week, students will produce flashcards based on the organiser and, in future weeks, students will use Look And Say And Cover And Write And Check (LASACAWAC) or an online quizzing platform for a specific proportion of their prep each week.
  5. Ensure knowledge organisers are stuck in to each prep book.
  6. Train students in how to use their knowledge organisers.
  7. Ensure that, as students move through the key stage, they are frequently testing themselves and being assessed in class on knowledge from previous units which they require at the end of the key stage.
  8. Add the schedule to E-Praise.

[1] Some of the following are taken from Phil Stock’s sequence of blog posts “Principles of Assessment.”

 

Write the Theme Tune, Sing the Theme Tune

Write the Theme Tune

One of my favourite blog posts from last year was The Exam Essay Question and How To Avoid Answering Them from Mark Roberts, in which he proposes six possibly controversial principles for approaching exam essays.

  • Know which quotations you’ll use before you go into the exam.
  • Know which parts of the quotations you’ll analyse.
  • Know what the content of that analysis will be.
  • Know how to fit that analysis to just about any task.
  • Know how to twist the question to your own ends.
  • Have a full essay ready to reproduce so that your planning time is used fitting this essay to the question rather than starting from scratch.

“Making your way in the world today takes everything you’ve got”

With the increased level of challenge in the new GCSEs it can sometimes, with some groups, feel as if there is such a volume of knowledge which needs to be retained that getting them to wade through the exam will be a Herculean task. I like the way Mark’s post provides an efficient approach to preparing for a content heavy, terminal, closed book GCSE exam in English Literature. I also couldn’t help but smile at the way the post reflects my own experience of studying for A-Levels in three essay based subjects: English Literature, History and Politics.

We spent most of Year 12 (or Lower Sixth in old money) covering the content for each module. Year 13 (Upper Sixth pre-decimalisation) was largely spent doing timed practice questions. By Christmas, I’d realised there were only so many question topics which were likely to come up, and these could be grouped. As long as you’d learnt the right content and developed a strategy for making that content relevant to the questions, you could score highly in all three subjects. I planned out generic essay outlines which I could manipulate, and deliberately practised crafting them into full responses using as many past paper questions as I could find. You reach a point, in doing this, where you are lifting chunks of memorised paragraphs from one essay, tweaking a few words or popping in key words from the question, and dropping them into a new essay.

“Now this is the story all about how my life got flipped, turned upside down.”

Since reading Mark’s post, I’ve been working on a strategy for the AQA English Literature papers – specifically the Shakespeare, 19th Century Novel and Modern Text. First of all, this involved looking closely at the kinds of questions which will come up. Those about Macbeth on the AQA paper will always be based on an extract. In the sample materials, these extracts are, on average, about twenty lines in length. Students will be asked to write about a particular feature of the extract and then link this to other parts of the play. All of the questions I’ve encountered can be categorised into one of the following groups: character, theme or combination.

Character Questions:

  • Starting with this speech, explain how Shakespeare presents Macbeth
  • Starting with this extract, explain how Shakespeare presents Lady Macbeth
  • Starting with this extract, explain how Shakespeare presents the witches

Theme Questions:

  • Starting with this soliloquy, explain how Shakespeare presents ambition
  • Starting with this speech, explain how Shakespeare presents the supernatural

Combination Questions:

  • Starting with this extract, explain how Shakespeare presents the effects of the supernatural on Macbeth
  • Starting with this speech, explain how Shakespeare presents Lady Macbeth as a powerful woman

It’s also worth noting four aspects of the Level 6 descriptors in the mark scheme.

Students need to use “judicious” quotations. In practice, this means the quotations need to be relevant to the specific point the student is making at that moment in their essay, as well as short and embedded in the line of their argument. This will require memorisation as well as practice in using the quotations.

They need to write an “exploratory” response. This means they need to know a range of interpretations of at least some of the quotations they memorise so that they can weave them into their response.

They’re required to craft a “conceptualised” answer, meaning they need to have a clear thread of themes and ideas running through their response. If the question is thematic, this is relatively easy. If the question focuses on a character, it is more challenging. What they need to do in this case, is consider the way Shakespeare uses the character(s) as constructs to impact on our thinking about the themes.

They have to make detailed links between the task, text and context. As a result, they’ll need to have in their memories a range of contextual knowledge which is directly linked to the themes, quotations, and analytical points they’ve revised.

Before I outline what I’ve come up with in terms of a strategy, I have to emphasise that students shouldn’t and can’t get away with this if they don’t have a sound knowledge of the texts (the plot, the characters, the context) already and if they aren’t taught and don’t know how to twist the material to suit the question. If these things have been taught and retained, then I think it could feasibly work.

“What’ll I do when you are far away and skies are blue? What’ll I do?”

So, using Macbeth as an example, I’ll break the strategy down into three stages:

  • Before the exam
  • Planning in the exam
  • Crafting the response

The rest of this post will look at the first two of these stages and the next will look at crafting the response.

“I don’t wanna wait til our lives will be over”

Before the exam:

Having studied the texts, read through various revision guides and looked at the sample papers and other documents produced by the exam board and other teachers, I’ve created four groupings of themes (Fear vs Courage, Ambition vs Acceptance, Supernatural vs Natural order and Truth vs Illusion). Each grouping contains a number of synonyms and antonyms. There is no way I’ve covered all possible themes here, but there are enough to ensure that students could feasibly respond well to the questions likely to come up in a GCSE exam on Macbeth if they memorise these, as I hope you’ll see when we get to the planning phase.

In the run-up to the examination, to focus students’ revision, I’ve created lists of quotations linked to each of these themed groups. These have been added to the Quizlet app, which students can access, and we’ve also printed them off as packs of flashcards. Specific words or phrases have been deliberately removed from the quotations and placed on the reverse of the flashcards so that students, during their revision, are memorising these words and their word classes, or phrases and their connected literary terms. Grouping the quotations in this way is intended to support the students in learning them as clusters. Each quotation is also linked to a specific character, in case the task in the exam is character based rather than thematic.
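To make the cloze-deletion idea concrete, here is a rough sketch (the function name and card fields are my own illustrative assumptions; in practice we built the cards by hand in Quizlet): the front shows the quotation with the key word blanked out, and the back gives the word together with its word class.

```python
# Hypothetical sketch of a cloze-deletion flashcard: the front shows the
# quotation with the key word blanked; the back gives the word and its class.
# Theme and character tags support learning the quotations in clusters.

def make_flashcard(quotation, key_word, word_class, theme, character):
    front = quotation.replace(key_word, "_" * len(key_word))
    back = f"{key_word} ({word_class})"
    return {"front": front, "back": back,
            "theme": theme, "character": character}

card = make_flashcard(
    "Stars, hide your fires; let not light see my black and deep desires",
    key_word="desires",
    word_class="noun",
    theme="Ambition vs Acceptance",
    character="Macbeth",
)
print(card["front"])
```

Tagging each card with a theme group and a character means the same deck can be filtered either way, depending on whether the exam task turns out to be thematic or character based.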

We’ll work with students on modelling how to make use of these quotations in their responses, adding to the flashcards with analysis of the quotations which they can memorise too. The thinking behind this links to this piece by Andy Tharby on teaching interpretations of literature as facts.

“Don’t know about the future, that’s anybody’s guess. Ain’t no good reason for getting all depressed”

Planning:

Even if they know the text and the extract well and have done plenty of practice questions, students can’t know exactly what question they’ll be confronted with in the exam. Of the fifty minutes we’re encouraging students to spend on the Macbeth question, we’re suggesting that about ten to fifteen minutes should go on annotating the extract and planning their response. I’ve developed the following steps to success for the planning phase, using the KAP acronym (whose origin I can no longer trace). Having a strategy is important in terms of keeping a clear head in the exam itself.

Steps to success:

Step 1: KAP The question

  • Find the Key focus of the question.
  • Annotate the extract using the FAST annotation method – jot down the key theme words beside the extract, then find and annotate key quotations in the extract which link both to the key focus of the question and to these themes.
  • Plan the four Points you’ll make in your answer.
  1. Each of these should link to the question and could link to one of the FAST themes (Fear, Ambition, Supernatural, Truth or their acronyms).
  2. Remember to think about Shakespeare in all your points so that you stay focused on the writer.
  3. Look to include the aspects of context from each of the FAST sections you choose.

Step 2: Decide on the quotations which you’ll use to support your points – at least one from the extract and at least one from the FAST theme lists.

So far, our experience is that the process has led to students producing plans which are much more focused on the question, more systematic and more likely to lead to a thematic or conceptualised approach. In particular, there are fewer annotations which treat characters as if they were real people. There is a risk that the FAST approach could reduce the text to just these four themes. The intention is that they open up a gateway to a wider approach to the text but that, pragmatically, two months prior to the exam, students need to focus their attention on a process which will make them most successful.

In the next post, I’ll go through the crafting process we’re using and share a few sample responses. In the meantime, here’s the Macbeth Planning pack we’ve shared with students and the Macbeth Flashcards from Quizlet.