Litteranguage – Part 1

Before you read the rest of this blog, take a few minutes to go off, get yourself a drink and consider these questions:

  • What do you think should be the purpose of an English Language GCSE?
  • What do you think is the purpose of the current English Language GCSE?
  • To what extent is the current English Language GCSE fit for purpose?

I literally want you to not read on – at least not until you’ve had a think anyway…

Stop right now. Thank you very much.

…So, I hope you’ve had a good drink as well as a bit of time to consider. Now, about those questions. Let’s mull over some possible responses to the first one.

My suspicion is that your answer fell into or across one or more of the categories below as they’re an amalgamation of responses which I received to the same question on Twitter.

The purpose of the English Language GCSE is to:

  1. Act as a governmental lever to ensure schools cover specific curriculum content
  2. Establish a clear set of “skills” which students should develop during their time in Years 10 and 11.
  3. Define the knowledge that students should be able to recall at and beyond the end of Year 11, ensuring that students are exposed to the best that has been thought and said in order for them to get “cleverer.”
  4. Assess mastery of a proportion of the content of the National Curriculum for English at Key Stage 4.
  5. Gauge the “progress” which individuals or cohorts of students have made between Key Stage 2 and Key Stage 4.
  6. Function as a qualification that demonstrates to employers a certain level of proficiency in language use.
  7. Prepare pupils for the study of English Language at a higher level.

Too much of something is bad enough. 

What’s interesting is the potential for conflict which exists between these aims. You may favour one or more of these purposes over the others. You may think that one of these purposes should play no role in the design of a qualification which most children take at age 16. You may think it’s possible to do all of these things through a single qualification. You may even want to suggest an alternative or additional purpose which has no connection with any of these. If you do, I’d be interested to know what you’ve come up with, so please do get in contact.

Now though, I’d like to explore each aim in turn, thinking about how it ties in with the current GCSE and whether the qualification as it stands is fit for purpose. In the rest of this post, I’ll look at numbers 1-3; Parts 2 and 3 will explore 4 and 5, whilst the final post will examine numbers 6 and 7 before going on to tentatively suggest some possible improvements.

Tell me what you want, what you really, really want.

As academies and free schools no longer have to follow the National Curriculum and there are currently no Key Stage 3 SATs tests, the GCSE qualification is the main lever by which the government can attempt to ensure curriculum content coverage.

This lever is used to make some content less important and some more significant.

For example, the National Curriculum for Key Stage 4 says that “Pupils should be taught to understand and use the conventions for discussion and debate.” In the English Language GCSE, however, students are only assessed through an individual presentation with follow-up questions. Moreover, the fact that this assessment no longer contributes to the final grade, but instead is allocated a grade in its own right (albeit one that doesn’t feed into the school’s data set), has impacted on the way many schools approach the spoken language aspect of the qualification. There will be schools which focus on the teaching of debate and discussion, but, due to their disappearance from the assessment model, I would expect the numbers to be fewer than had previously been the case.

There are similar examples in other aspects of the curriculum. In writing, the National Curriculum establishes the expectation that teachers will teach students how to “make notes, draft and write, including using information provided by others [e.g. writing a letter from key points provided; drawing on and using information from a presentation].” If this act of note making and transformation fed into a question in the GCSE paper, then it would be covered extensively by English departments across the country. As it is, I’d guess that, in comparison with other aspects of the curriculum, it’s barely touched upon.

In contrast, “seeking evidence in the text to support a point of view, including justifying inferences with evidence” is assessed a great deal. This is required for every single reading question in both AQA papers.

One reason this happens is that there’s a document which acts, in some ways, as a filter between the National Curriculum and the GCSE Specifications – the English Language GCSE Subject Content and Assessment Objectives.

According to the website, these “GCSE subject content publications [set] out the knowledge, understanding and skills common to all GCSE specifications.” I knew something like this existed, but have only read it properly as part of the process of writing this blog.

In English, this is the document which establishes that “All texts in the examination will be ‘unseen’, that is, students will not have studied the examination texts during the course.” It sets out that these have to be “challenging texts from the 19th, 20th and 21st centuries.” It shifted English language exams away from media studies by declaring, “Texts that are essentially transient, such as instant news feeds, must not be included.” Like it or not, the government have clear leverage over the content of the GCSE in English Language as well as some leverage over the style of the papers. If you favour this purpose over all of the others outlined at the start of this blog, you could argue that the GCSE is fit for purpose.

It’s also worth keeping in mind what might happen if we removed government leverage altogether by scrapping the English Language GCSE. In some schools, what is assessed is what is taught. Were the English Language GCSE to be removed in favour of assessing reading and writing through other subjects, then the focus on coherence, clarity and accuracy in communication would, in my view, fade.

Swing it, shake it, move it, make it, who do you think you are?

I’ve also never taken a look at the documents for other subjects before. Here are the content and assessment objectives for science, maths and religious studies. When you compare them, one thing that’s likely to immediately strike you is the relative brevity of the English document. In fact, there are very few which are shorter (drama and computer science are two examples). Some might argue that this is due to the nature of each subject – that science is primarily a knowledge-based subject whilst the English curriculum leans towards the development of a set of skills. The science content document explains that its “first section explains the main ways in which working scientifically should be developed and assessed…The second section sets out the key ideas and subject contents for the biology, chemistry and physics components of combined science.” To exemplify this, one of the scientific ways of working is “presenting observations and other data using appropriate methods” whilst one of the scientific ideas which students need to know from biology is that “life processes depend on molecules whose structure is related to their function.” This structuring of skills and knowledge isn’t present in the English Language document. Rather, we have a list of skills like “critical reading and comprehension…summary and synthesis…and producing clear and coherent texts.”

There are clear differences between science and English and the way the document is structured to focus on skills is beneficial if you believe purpose number 2 in the list at the start of this blog post to be most important. However, this structuring leads to two key issues in terms of the English GCSE. The first relates to teaching and the second to the design of the exam papers.

Firstly, it would be difficult to argue that the skills listed are not worth developing over time in terms of our students’ ability to communicate. There is, though, a body of knowledge which needs to lie behind this. In English, this specifically includes punctuation, vocabulary, spelling, word classes, grammar, figurative language and rhetoric. There is little point, I think, in getting students to apply a set of skills if they do not possess a body of knowledge to apply them to. In addition, due to the unseen nature of the texts, the reading section of the exam also requires students to have a broad knowledge of history, geography and science – a general knowledge – if they are not to be disadvantaged. It is possible, to a degree, to improve performance in exam questions by practising exam questions. If we want students to improve as communicators, though, we need to think more widely, addressing gaps in factual and procedural knowledge. To some extent, the exam can be a pot luck test rather than a test of students’ knowledge and application of English language. Ensuring that students develop the knowledge which is hidden behind the subject content document for English Language requires great teaching across the whole of the curriculum and, likely, a restructuring of the English curriculum.

Secondly, the weighting of the content document towards skills, and the fact that the final say in the document is given to the Assessment Objectives, have led exam boards (most notably AQA) to focus each of the questions in their exam papers directly on a limited range of the Assessment Objectives. AQA have also designed their papers so that the structure and question stems will be almost identical year on year. The result of this is a set of questions which is mechanical in nature and responses which, I’d predict, will over time become increasingly mechanical in order to address aspects of the mark scheme. The skills that students will be developing with this style of GCSE will be the skill of responding to a specific question type, rather than the skills of reading and writing. If we genuinely want students to develop skills and apply their knowledge effectively, then we may have to accept a less standardised form of exam paper. For this reason, I’d argue that the current model of GCSE in English Language is not fit for purposes 2 and 3 in our list.


Literature is not Clickbait

Over the last fifteen years or so of English teaching, I’ve read my fair share of analytical responses to literature. I have just over a hundred Year 10 mock papers sat in a box, calling to me to befriend them so that they can be marked this weekend. They will all be my friends by Tuesday once they’re marked – not quite so much on Sunday or Monday whilst I’m marking them though.

Amongst these papers there will likely appear the phrase “this makes the reader want to read on.” It is a phrase which I have seen many times before and it is a phrase which I will not befriend. 

Yesterday on Twitter, I posted the following: “Seriously, who has ever told a child that a writer does something to make the reader want to read on? Where does that come from?” The initial response was as I’d expected. A number of other English teachers retweeted what I’d written, presumably as they felt the same frustration I did. Then came a challenge.

Some people began to question whether I was suggesting it was a stupid way to respond to a text. It was claimed that writers, with some exceptions, do write with the intention of their readers continuing through their texts until they finish them. Writers, they argued, have commercial interests in their texts being read to the end.

This is true to an extent. There are relatively few writers, I’d expect, who would want their books, articles or poems to sit in the bargain bucket gathering dust and decreasing in value. So, why does the phrase “makes us want to read on” bug me so much?

  1. It is banal. 
  2. If the texts we are putting in front of our students are more than mere clickbait, then there is so much more that could be said about them. 
  3. It doesn’t take much to extend the thinking and expression of children who say or write this as a response. It just requires one more question, some time to think and some time to write or rewrite. Why does it make you want to read on? How does it get you to read on? What does it make you feel or think that makes you want to continue feeling that way? All kinds of questions. 
  4. Modelling how to enhance analytical writing from this point is not difficult and will help students to see there is so much more to say. 
  5. A child who struggles to express much more than this as a response to literature is more likely to struggle to craft their writing effectively. 

My frustration at the statement was, therefore, a frustration at a system which has allowed too many of our students to reach the age of sixteen unable or unwilling to respond to literature in a way that sees beyond its commercial quality. There is beauty, ugliness, peculiarity, familiarity, truth, illusion, fear, contentment, tragedy, comedy and a whole spectrum of other emotions, sensations, traditions and thoughts to be found in literature. Supporting children to move beyond the banal should help them to see much of this. 

Pedagogical Content Knowledge in English

“…the question of what teachers should understand if they wish to teach a domain responsibly is no simple challenge. In the field of English teaching, where canons are under question and “consensus” is more frequently misspelled than accomplished, the problem of teacher knowledge is daunting.”

In her paper ‘Knowing, Believing and the Teaching of English’, quoted above, Pamela Grossman outlines just some of the key challenges faced by those who try to define the knowledge English teachers require. In essence, they are that:

  • There are numerous ways of dividing up the English curriculum. For example, some argue it can be split into linguistics, literature and composition whilst others would divide it into reading, writing, speaking and listening.
  • English, particularly reading, is an interpretive domain and there are many interpretive schools of thought. There is therefore a question about the number of standpoints which teachers should be able to take. There is also a pedagogic question about whether it is a teacher’s role to tease out interpretations from students or to know them all themselves.
  • The history of literature is as sprawling as history itself. How extensively should teachers know the impact of contexts on the texts being studied?
  • Our memories and understanding of how we developed procedural knowledge in writing and reading is buried deep, so expressing this to novices in a useful way is challenging.

There are many more issues than this. In each case, though, the most important element in finding a solution is ensuring clarity in the English curriculum.

Teachers require this clarity in order to know what their students ought to know, so that they can also know how to preempt and address issues when their students don’t know what it is they should know as well as they should know it. Clarity is vital, isn’t it?

The issues Grossman outlines in English teaching impact on our understanding of what Shulman terms pedagogical content knowledge (PCK).

PCK is a form of practical knowledge that is used by teachers to guide their decisions and actions in subject focused classrooms. This type of practical knowledge requires, amongst other things:

1. Knowledge of the relationship between content and students – a grasp of the common conceptions, misconceptions, and difficulties that students encounter when learning particular content.

2. Knowledge of the relationship between content and the curriculum – how to sequence and structure academic content to maximise the impact of direct teaching to students.

3. Knowledge of the relationship between content and teaching – the specific teaching strategies, pedagogical techniques and ways of representing, modelling and explaining knowledge that can be used to address students’ learning needs in particular classroom circumstances with specific content.

I’d like to look at each of these in turn with a specific focus on PCK in English.

As a starting point, I think it would be useful to begin a list of the key realisations students need to have in the field of English. Where students haven’t had these realisations, misconceptions emerge. Over time, I’d like to build up a resource a little like this from the AAAS in science.

Realisations for students of English language, literature and composing texts:


That the following words are not the same:

  • Your and you’re
  • Their, there and they’re
  • Two, to and too
  • Practice and practise
  • Bought and brought
  • It’s and its
  • Desert and dessert
  • Dryer and drier
  • Chose and choose
  • Lose and loose

That some verbs have the same sound as nouns but with a different spelling and meaning.

Grammar – Words

That nouns aren’t simply people, places and things.

That an adverb is a word or phrase that modifies the meaning of an adjective, verb, or other adverb, expressing manner, place, time, or degree. Not all adverbs end in -ly and not all words ending in -ly are adverbs.

That verbs aren’t simply “doing words.”

That adjectives aren’t simply “describing words.”

That a word can have different functions, dependent on the context in which it is used.

Grammar – Sentences

That subjects and verbs are the central components of a sentence and not full stops and capital letters.

That the placement of a full stop or comma in a text isn’t defined by the need to breathe.

That simple sentences are not just short sentences.

That complex sentences aren’t simply long or just filled with complex information.

That writers will break syntactical conventions for specific reasons.

That we don’t use a modal verb, like should, with “of” because “of” is not a verb, it is a preposition. Some people write “should of” because when they speak, they say “should’ve.”

That auxiliary verbs can also function as main verbs.

That subjects and verbs in sentences need to be in agreement. For example, we write I/he/she/it was but we/they/you were.

That pronouns can function as sentence subjects.

Reading for understanding

That, unless you are willing to read in your own spare time, your chances of developing your knowledge and understanding of the world, as well as the world presented by the writers you encounter in class, are limited; you won’t grow intellectually.

That many writers will bother to spend a great deal of time considering lexical and syntactic choices.

That (chains of) words can be used by writers as symbols.

That writers do not necessarily write with the same voice they use for speaking.

That the views of a narrator are not necessarily the same as the views of the writer.

That there can be more than one narrator of a story.

That we can’t always trust what we’re told by the narrator of a story.

That writers organise their texts for both clarity and influence.

That writers do not always understand or appreciate the significance of their stories.

That writers do not always endorse everything that happens in their narratives.

That bad people can write good stories and good people can write bad stories.

That the narrative may not be the most important part of a narrative text.

That your interpretation of (part of) a text may not be the only one.

That, though there are often multiple possible interpretations of a literary text, there are also wrong interpretations.

That writers don’t start from scratch every time. They draw on ideas from other people and they adapt and combine forms and often use archetypal characters.

That some writers use the plot and characters in their texts as a way of commenting on society or providing a moral message.

That writers, particularly of non-fiction, often consider the nature and scale of their audience in deciding how to write rather than just setting off.

That the act of writing for a public audience is done to influence thoughts, feelings or actions.

That characters in fiction books aren’t the same as real people.

That writers sometimes deliberately use cliche.

That a narrative is still fictional, even if its context is real.

Reading – Poetry

That great poets consciously select forms which are almost invariably linked to the content and/or meaning of their poems.

That lyrical poems are not the same thing as lyrics for songs or raps.

That stressed syllables are louder than unstressed syllables.

That words that look similar may not rhyme.

That poets miss out letters to change metrical patterns.

That some words no longer rhyme due to language change.

That poems are often part of a sequence or collection.

That ‘ing’ is not the part of the word that rhymes.

Composition – General

That it is better to carefully consider the precise words you want to use than to include as many words as you can.

That the language you’re using may not be standard English.

That what we write about and how we write is influenced by the world around us as well as who we are writing for.

That some people will judge you based on the way you speak and write.

Composition – Creative

That adding ellipses at the end of (part of) a story doesn’t increase the tension and is an unnecessary way of highlighting a cliffhanger.

That ‘said’ is often the most appropriate verb to describe speech.

Composition – Analytical

That all writers want their audience to read on so there are more interesting things you could say about the effects of their language choices.

That a text implies rather than infers, and a reader infers rather than implies from a text.

That rhetorical questions have a more specific purpose than to make the reader think.

That a quotation is a group of words taken from a written or a spoken text.

That you do not have to agree with the writer in order to be able to see their point of view.

That, even though you find a text dull, it can still have literary or wider cultural value.

Composition – Rhetorical

That rhetorical questions are not questions that don’t require an answer, that there are multiple types of rhetorical question and that each type has a subtly different function.

That repetition is not a rhetorical method in and of itself.

Composition – Structure

That a change in paragraph marks a shift in time, place, topic, point of view or speaker in dialogue but that writers sometimes break these conventions for effect.

That the plot of a story doesn’t always have to be told in chronological order.


Integrating Assessment, Knowledge and Practice

“If we want pupils to develop a certain skill, we have to break that skill down into its component parts and help pupils to acquire the underlying mental model [to see in their mind what it should look like in practice]. Similarly, when developing assessments for formative purposes we need to break down the skills and tasks that feature in summative assessments into tasks that will give us valid feedback about how pupils are progressing towards that end goal.”

(Daisy Christodoulou – Making Good Progress)

“Deliberate practice develops skills other people have already figured out how to do and for which there are effective training methods.

  • It takes you out of your comfort zone. 
  • It involves well defined, specific goals. 
  • It requires full attention. 
  • It involves feedback and modification. 
  • It requires a focus on specific aspects of procedures.”

(Summarised from Anders Ericsson – Peak)

Have you ever tried connecting two hoses together while the water’s flowing? It’s a tricky, splashy business that’s easy to get wrong. The same can be said of the point in education where curriculum design and assessment join. 

Swindon Academy’s curriculum design follows a mastery model based on the following set of key principles:

Curriculum Model

Our Curriculum Leaders and the teachers in their teams have worked hard to develop curriculum overviews and schemes of learning which reflect these principles, often drawing on the best, publicly available resources. In some faculties and in some key stages, this work is further advanced than in others. Whatever stage the teams are at in this development though, they would agree that there is more to do, especially with the recent introduction of new GCSE and A-Level specifications and changes to vocational courses.

Over the course of last half term, I met with each of the Curriculum Leaders to review the Swindon Academy Teaching and Learning Model.

Codification Document

These discussions confirmed that the model still effectively summarises what we would want to occur in classrooms on a day to day basis as well as over time. Two areas of the model came up consistently in discussion as needing re-development though:

  1. The assessment element no longer describes current feedback practices which now vary from faculty to faculty due to the introduction of faculty feedback policies.
  2. Prep (homework) needs to feature more prominently to establish clearer expectations and reflect its importance in developing students’ independence.

Alongside this review, I’ve been reading a range of research relating to cognitive science as well as educational texts on assessment, including the books Driven by Data by Paul Bambrick-Santoyo and Making Good Progress by Daisy Christodoulou, and blogs by Phil Stock, Heather Fearn and Joe Kirby. These have made me consider, in a different light, how we could tighten up our assessment model so that we:

  1. Know that the assessment systems which are being used are as reliable as we can make them.
  2. Have a shared understanding of the range of valid inferences we can draw from the data provided by these systems.
  3. Ensure that we maximise the impact of these systems, without damaging their reliability.
  4. Continue to increase the level to which students take responsibility for their own progress.

The remainder of this blog is taken from a paper designed to kick start this process. It is divided into two elements: a teacher element and a student element. The first focuses on curriculum and assessment design whilst the second looks at the use of knowledge organisers and self-testing as prep.

The teacher element:

We’ve found, in introducing a number of Doug Lemov’s Teach Like a Champion strategies, that it’s useful to have a shared vocabulary so that we can have efficient and effective conversations about teaching. This should also be the case with assessment practices.

Key definitions:

The following terms will be, I think, key to developing our shared understanding of assessment practices:


The domain is the entirety of the knowledge from which an exam/assessment could draw to test a student’s understanding/ability. At Key Stage 4 and 5, this is defined by the specification, though there are also elements of knowledge from previous Key Stages which aren’t listed in specifications but that still form part of the domain.


The sample indicates the parts of the domain which are assessed in a specific task or exam. It’s rare we’d assess the whole of a domain as the assessment would be overly cumbersome. Well designed assessments are carefully thought through. Samples should represent the domain effectively so that valid inferences can be made based on the data gained from the assessment.


The validity of an assessment relates to how useful it is in allowing us to make the inferences we’d wish to draw from it. “A test may provide good support for one inference, but weak support for another.” (Koretz D, Measuring Up) We do not describe a test itself as valid or invalid, but rather the inferences which we draw from it.


If an assessment is reliable, it would “show little inconsistency between one measurement and the next.” (Christodoulou)

Test reliability can be affected by:

Sampling:

  • Most tests don’t directly measure a whole domain; they only sample from it as the domain is too big. If the sample is too narrow, the assessment can become unreliable.
  • If the sample is always the same, teachers will strategically teach to the test to seemingly improve student performance.

Marking:

  • Different markers may apply a mark scheme rubric differently.
  • One marker’s standards may fluctuate during a marking period.
  • Teachers can consciously or subconsciously be biased towards individuals or groups of students.

Student factors:

  • Performance on a particular day can vary between the start and end of a test.
  • Students perform differently due to illness, time of day, whether they have eaten, or the emotional impact of life experiences.

Difficulty model

In this form of assessment, students answer a series of questions of increasing difficulty. A high jump competition and a GCSE Maths exam paper are good examples of this model.

Quality model

Here, students perform a range of tasks and the marker judges how well they have performed, most often in relation to a set of criteria. Figure skating competitions and English and history GCSE papers use this model.

General issues which Christodoulou identifies with the most common assessment models:

  • A focus on the teaching and assessment of generic skills can lead to teachers paying insufficient attention to the knowledge required as a foundation for those skills. For example, vocabulary, number bonds, times tables, historical chronologies or relevant, subject specific facts can be overlooked in favour of how to evaluate or problem solve.
  • Generic skill teaching makes deliberate practice far more challenging as it focuses on larger scale success as opposed to fine grained assessment and training. For example, formative assessment in sport may take place during a match rather than a drill. Here, the teacher may miss an issue which a student has with a specific aspect of the sport and then not address it.
  • Using only exam questions for assessment, especially though not exclusively for subjects whose exams are based on the quality model, can hide weaknesses which are at a smaller scale.

Specific issues which Christodoulou identifies with ongoing descriptor assessment and exam based tests:

Limitations with using descriptor based assessments to formatively assess:

  • Descriptors can be vague or unspecific.
  • Using assessment descriptors to feed back can be unhelpful as they describe performance rather than explain how to improve.
  • Descriptors focus on performance rather than long term learning.

Limitations with using descriptor based assessments to summatively assess:

  • Tasks are often not taken in the same conditions by all students which makes assessment less reliable.
  • Descriptors are interpreted differently by different markers.
  • Judgement based on descriptors is subject to bias.

Limitations with using exam based assessments to formatively assess:

  • By their nature, these tests have to sample from a wider domain so we cannot identify precise areas of strength and weakness for students.
  • As questions become more challenging, it also becomes more difficult to identify which aspects of the question students did well or badly in.
  • Exams are designed to provide grades and grades aren’t sensitive enough to measure progress in individual lessons.

Limitations with using exam based assessments to summatively assess:

  • If we use exam formats and grades too often with students we can end up teaching to the short term rather than the longer term.
  • All students need to take the assessments in the same conditions to secure levels of reliability.

Assessment Solutions: 

Having established these issues, Christodoulou suggests the following principles for effective formative and summative assessment:

Formative assessment principles:

  1. The tasks/questions set need to allow teachers/students to easily identify issues and next steps. In particular, if you teach a subject which is normally assessed in exams through the quality model, it is worth considering a more fine-grained testing approach for formative assessment.
  2. The process needs to include repetition to build to mastery, otherwise the formative assessment won’t have the desired impact.
  3. Once material has been mastered, students need to be required to frequently retrieve key learning from their long-term memories.
  4. Formative assessment should be recorded as raw marks as this makes it easiest to track from one lesson to the next.

Summative Assessment Principles:

  1. Summative assessments should be taken in standardised conditions and marked in a way which maximises reliability.
  2. They should cover a representative sample of a significant domain.
  3. Scaled scores are more reliable than raw marks for summative assessment.
  4. Enough time should pass between summative assessments for students to make worthwhile improvements.

What are our next steps for the teacher element?[1]

  1. Ensure the curriculum is effectively mapped out and sequenced, establishing the factual and procedural knowledge which students will learn. Divide the knowledge from the curriculum into that which students need in the long term and that which students need for a specific unit. Ensure the bulk of curriculum and prep/revision time is spent on students focusing on retaining the most important knowledge. Build space into the curriculum to assess retention of knowledge from previous units which students need in the long term.
  2. Establish when students will be assessed both summatively (whole Academy calendar) and formatively (faculty curriculum overviews). As far as possible, this should take into consideration the completion of teaching of all elements, and enough time between teaching and testing both for revision and to ensure that our inferences are based on learning rather than performance.
  3. Ensure that the purpose of each assessment is clear to all involved in its design, delivery, marking and provision of feedback. The format of the test should enable the function to be achieved. It should also ensure that the inferences drawn from the results are as valid as possible. The main purposes of our summative assessments include re-streaming students, reporting to parents, establishing attainment and progress over time in teaching groups and cohorts of students to report to governors. A key question for you here is whether your summative assessments are reliable enough to enable you to validly infer that certain students are working at “age related expectations” in your subject. Formative assessments should be used to identify potential gaps in knowledge, misconceptions or deficiencies in ability that can be subsequently addressed.
  4. Design assessments aligned with this timing and purpose. Using Christodoulou’s principles for summative and formative assessments will help here. Over time, two separate banks could be built up: one of summative and one of formative assessment tasks. For summative assessment, it’s also worth asking yourself the following questions, based on those found in Paul Bambrick-Santoyo’s book Driven by Data. Do assessments in each year:
    • Address the same standard of skill/content as the end of Key Stage assessment?
    • Match the end of Key Stage assessment in format?
    • Enable students to move beyond that year’s content/skill level?
    • Reassess previously taught content which is necessary to retain until the end of the Key Stage?
  5. Trial the use of comparative judgement in subjects where a substantial proportion of assessment uses the quality model.
  6. Preview assessment tasks to ensure that:
    • Questions don’t provide clues as to the answer.
    • Questions are actually testing that students have learned or can apply the knowledge you wanted rather than something else.
    • Questions are worded accurately and any unnecessary information is removed.
  7. Review assessments after use to establish whether they provided you with information that enabled you to make the inferences you wished. Make amendments to assessment items, where required, if they are to be reused in the future.
  8. Standardise the conditions in which summative assessments take place and the ways in which they are marked.
  9. Ensure that, where data from assessments is used to make key decisions, the data is sufficiently reliable. For example, when moving students between sets, data from more than one assessment should be used.
  10. Develop the teaching and learning review which forms part of each teacher’s CPD Booklet to ensure that teachers have action plans in place to address gaps in attainment.
  11. Establish procedures for Curriculum Leaders to review and summarise teachers’ action plans, sharing them with their Line Managers for quality assurance.

The Student Element. 

Over the past two years, a number of our faculties have been trialling the use of knowledge organisers and low stakes testing or quizzing as part of the process of curriculum design. Different models have emerged, sometimes with different purposes and using different frameworks. We want to make the use of knowledge organisers, self-testing and flashcards a core part of our students’ prep across subjects.

In order to secure the highest impact of this work, we need to evaluate the models currently in use to generate a set of shared principles and uses for these tools. We need to be sensibly consistent in our approach, keeping in mind the differences between the subjects that we teach. There are certainly potential benefits to the use of both knowledge organisers and quizzing, but we need to ensure these are harnessed effectively in each subject area.

Why should we bother with quizzing and knowledge organisers? Aren’t they just fads?

The term knowledge organiser could be a fad, but the idea of organising knowledge into schemas is certainly not as it has been going on for centuries.

As subject specialists, having carefully mapped our curriculum through from Key Stage 3 to Key Stage 5, it would be both wise and desirable to look for the most effective methods to ensure that students retain as much of the knowledge we are teaching them from one year to the next and, of course, into their lives beyond school. 

On a more pragmatic level, in order to support our students to do well with the new GCSE qualifications, we need to help them develop methods for retaining knowledge in the longer term. These qualifications are now more demanding. They require students to retain knowledge longer as they are based increasingly on terminal examinations rather than coursework and they ask more of them in terms of problem solving.

Even if it weren’t for this though, over the course of the last century, hundreds of cognitive science studies have ranked practice testing as one of the most effective methods of improving the retention of information and procedures in the long-term memory. “In 2013, five cognitive scientists (Dunlosky, Rawson, Marsh, Nathan and Willingham, 2013) collated hundreds of such studies and showed that practice testing has a higher utility for retention and learning than many other study techniques.”

The table below is taken from John Dunlosky’s “Strengthening the Student Toolbox”. In this paper, he argues that, “while some [study] strategies are broadly applicable, like practice testing and distributed practice, others do not provide much – if any – bang for the buck.” Low-stakes practice testing is one of the most effective study methods.


Alongside this sits Cognitive Load Theory and the work of John Sweller. Our teaching and learning handbook outlines the idea that our working memories have limited capacity, coping with only approximately 7 ± 2 items of information. Once we go beyond these limits, our thinking processes become bogged down. These ideas have been refined over the last couple of decades into a set of instructional principles called Cognitive Load Theory. In their book “Efficiency in Learning”, Sweller et al argue that, “Taken together, the research on segmenting content tells us that:

  • Learning is more efficient when supporting knowledge, such as facts and concepts, is taught separately from main lesson content.
  • Teaching of process stages should be preceded by teaching the names and functions of components in the process.
  • Teaching of task steps should be segmented from teaching of supporting knowledge such as the reasons for the steps and/or concepts associated with the steps.”

Well-designed knowledge organisers or schemas and effective self-testing could therefore be useful in terms of reducing the cognitive load on our students when they are applying knowledge in performance, production or problem solving.

Knowledge Organisers

In a blog post entitled, “Knowledge Organisers: Fit for Purpose?” Heather Fearn describes how she looked at lots of examples of knowledge organisers and found that often there was a confusion over their purpose which caused the documents to be muddled in design. As a result, they were confusing for students to use. She identifies three valid purposes:

  • A curriculum mapping tool for the teacher
  • A reference point for the pupil
  • A revision tool for the pupil and possibly parents

Given that we have Schemes of Learning for teachers to make use of and textbooks for students as a wider reference resource, I believe a useful definition of a knowledge organiser at Swindon Academy would be:

A structured, single A4 sheet which students, teachers and parents can use to create low stakes practice quizzes. The sheet identifies the raw knowledge which needs to be recalled swiftly in order to be successful within the assessment for a specific unit. This could include: 

  • Definitions of terms, concepts or key ideas
  • Components of a process
  • People/Characters involved in a chronology
  • Processes/Chronologies/Narrative summaries
  • The steps in procedures

Use the following to check the formatting of your knowledge organisers.

  • Colour code knowledge which will be required beyond the end of the unit and knowledge which is only required in the medium term.
  • Number each item in each section to enable easy self-testing.
  • Embolden the absolute key words so that peer markers of quizzes can check they have been used in answers.
  • If you have to write more than one sentence, consider your phrasing. This will make your own explanations clearer and more efficient when you speak.
  • Don’t have too many sections/categories – four or five are probably sufficient.
  • If including images, ensure these are the same format as those you will use in your actual lessons.
  • Spellcheck your knowledge organiser.
  • Don’t include questions or ‘thoughts to consider’.
  • If it isn’t essential it shouldn’t be there.


In his blog, “One Scientific Insight for Curriculum Reform” Joe Kirby of Michaela Community School poses the question: “what’s the optimal format and frequency of low-stakes testing or retrieval practice?” He cites various research papers from Roediger et al. In terms of format, he maintains that “Applied research suggests [well designed] multiple-choice questions are as effective as short-answer questions. The latest research study is as recent as March 2014, so this is a fast-evolving field, and one to keep an eye on.” With regards to frequency, he adds that shorter, more frequent quizzes outperform longer, less frequent ones. However, current research suggests that the impact on long-term memory is maximised if this testing is spaced and interwoven.

He then goes on to summarise the work of a number of cognitive psychologists from the book “Make It Stick” in the following set of principles for self-testing:

  • Use frequent quizzing: testing interrupts forgetting.
  • Roll questions on work from the previous term forward into each successive quiz.
  • Design quizzing to reach back to concepts and learning covered earlier in the term, so retrieval practice continues and learning is cumulative.
  • Frequent low-stakes quizzing in class helps the teacher verify that students are in fact learning as well as they appear to be and reveals the areas where extra attention is needed.
  • Cumulative quizzing is powerful for consolidating learning and concepts from one stage of a course into new material encountered later.
  • Simply including one retrieval practice test in a class yields a large improvement in final exam scores, and gains continue to increase as the frequency of testing increases.
  • Effortful retrieval makes for stronger learning and retention. The greater the effort to retrieve learning, provided that you succeed, the more learning is strengthened by retrieval.
  • In virtually all areas of learning, you build better mastery when you use testing as a tool.
  • One of the best habits to instil in a learner is regular self-quizzing.

What are our next steps for the student element?

  1. Design knowledge organisers which fit the definition above for Schemes of Learning in Years 7-9.
  2. Use the checklist above to review the knowledge organisers.
  3. Devise self-tests or drills which could be used to assess students’ retention of the knowledge. These could include:
    • Completion of a blanked-out timeline
    • Matching definitions and key terms
    • Labelling key diagrams from the organiser
    • Answering questions based on the knowledge organiser
    • A crossword with definitions from the organiser as the clues
    • Translation exercises for MFL using vocabulary from the organiser
    • Short answer questions and multiple choice questions based on the knowledge from the organiser
  4. Generate a prep schedule for students for self-testing of the sections of each knowledge organiser. In the first week, students will produce flashcards based on the organiser; in future weeks, students will use Look And Say And Cover And Write And Check (LASACAWAC) or an online quizzing platform for a specific proportion of their prep each week.
  5. Ensure knowledge organisers are stuck into each prep book.
  6. Train students in how to use their knowledge organisers.
  7. Ensure that, as students move through the key stage, they are frequently testing themselves and being assessed in class on knowledge from previous units which they require at the end of the key stage.
  8. Add the schedule to E-Praise.

[1] Some of the following are taken from Phil Stock’s sequence of blog posts “Principles of Assessment.”


Write the Theme Tune, Sing the Theme Tune

Write the Theme Tune

One of my favourite blog posts from last year was The Exam Essay Question and How To Avoid Answering Them from Mark Roberts, in which he proposes six possibly controversial principles for approaching exam essays.

  • Know which quotations you’ll use before you go into the exam.
  • Know which parts of the quotations you’ll analyse.
  • Know what the content of that analysis will be.
  • Know how to fit that analysis to just about any task.
  • Know how to twist the question to your own ends.
  • Have a full essay ready to reproduce so that your planning time is used fitting this essay to the question rather than starting from scratch.

“Making your way in the world today takes everything you’ve got”

With the increased level of challenge in the new GCSEs it can sometimes, with some groups, feel as if there is such a volume of knowledge which needs to be retained that getting them to wade through the exam will be a Herculean task. I like the way Mark’s post provides an efficient approach to preparing for a content heavy, terminal, closed book GCSE exam in English Literature. I also couldn’t help but smile at the way the post reflects my own experience of studying for A-Levels in three essay based subjects: English Literature, History and Politics.

We spent most of Year 12 (or Lower Sixth in old money) covering the content for each module. Year 13 (Upper Sixth pre-decimalisation) was largely spent doing timed practice questions. By Christmas, I’d realised there were only so many question topics which were likely to come up and these could be grouped. As long as you’d learnt the right content and developed a strategy for making that content relevant to the questions, then you could score highly in all three subjects. I planned out generic essay outlines which I could manipulate and deliberately practised crafting these into full responses with as many past paper questions as I could. You reach a point, in doing this, where you are lifting chunks of memorised paragraphs from one essay, tweaking a few words or popping in key words from the question and dropping them into a new essay.

“Now this is the story all about how my life got flipped, turned upside down.”

Since reading Mark’s post, I’ve been working on a strategy for the AQA English Literature papers – specifically the Shakespeare, 19th Century Novel and Modern Text. First of all, this involved looking closely at the kinds of questions which will come up. Those about Macbeth on the AQA paper will always be based on an extract. In the sample materials, these extracts are, on average, about twenty lines in length. Students will be asked to write about a particular feature of the extract and then link this to other parts of the play. All of the questions I’ve encountered can be categorised into one of the following groups: character, theme or combination.

Character Questions:

  • Starting with this speech, explain how Shakespeare presents Macbeth
  • Starting with this extract, explain how Shakespeare presents Lady Macbeth
  • Starting with this extract, explain how Shakespeare presents the witches

Theme Questions:

  • Starting with this soliloquy, explain how Shakespeare presents ambition
  • Starting with this speech, explain how Shakespeare presents the supernatural

Combination Questions:

  • Starting with this extract, explain how Shakespeare presents the effects of the supernatural on Macbeth
  • Starting with this speech, explain how Shakespeare presents Lady Macbeth as a powerful woman

It’s also worth noting four aspects of the Level 6 descriptors in the mark scheme.

Students need to use “judicious” quotations. In practice, this means they need to be relevant to the specific point the student is making at that moment in their essay as well as short and embedded in the line of their argument. This will require memorisation as well as practice in using the quotations.

They need to write an “exploratory” response. This means they need to know a range of interpretations of at least some of the quotations they memorise so that they can weave them into their response.

They’re required to craft a “conceptualised” answer, meaning they need to have a clear thread of themes and ideas running through their response. If the question is thematic, this is relatively easy. If the question focuses on a character, it is more challenging. What they need to do in this case, is consider the way Shakespeare uses the character(s) as constructs to impact on our thinking about the themes.

They have to make detailed links between the task, text and context. As a result, they’ll need to have in their memories a range of contextual knowledge which is directly linked to the themes, quotations, and analytical points they’ve revised.

Before I outline what I’ve come up with in terms of a strategy, I have to emphasise that students shouldn’t and can’t get away with this if they don’t have a sound knowledge of the texts (the plot, the characters, the context) already and if they aren’t taught and don’t know how to twist the material to suit the question. If these things have been taught and retained, then I think it could feasibly work.

“What’ll I do when you are far away and skies are blue? What’ll I do?”

So, using Macbeth as an example, I’ll break the strategy down into three stages:

  • Before the exam
  • Planning in the exam
  • Crafting the response

The rest of this post will look at the first two of these stages and the next will look at crafting the response.

“I don’t wanna wait til our lives will be over”

Before the exam:

Having studied the texts, read through various revision guides and looked at the sample papers and other documents produced by the exam board and other teachers, I’ve created four groupings of themes (Fear vs Courage, Ambition vs Acceptance, Supernatural vs Natural order and Truth vs Illusion). Each grouping contains a number of synonyms and antonyms. There is no way I’ve covered all possible themes here, but there are enough to ensure that students could feasibly respond well to questions which are likely to come up in a GCSE exam about Macbeth if they memorise these, as I hope you’ll see when we get onto the planning phase.

In the run-up to the examination, to focus students’ revision, I’ve created lists of quotations linked to each of these themed groups. These have then been added to the Quizlet app, which students can access. We’ve also printed them off as flashcards in packs. Specific words or phrases have been deliberately removed from the quotations and placed on the reverse of the flashcards so that students, during their revision, are memorising these words and their word classes, or phrases and their connected literary terms. Grouping the quotations in this way is intended to support the students in learning them as clusters. Each quotation is linked to a specific character too, in case the task in the exam is character based instead of thematic.

We’ll work with students on modelling how to make use of these quotations in their responses, adding to the flashcards with analysis of the quotations which they can memorise too. The thinking behind this links to this piece by Andy Tharby on teaching interpretations of literature as facts.

“Don’t know about the future, that’s anybody’s guess. Ain’t no good reason for getting all depressed”


Even if they know the text and the extract well and they’ve done plenty of practice questions, students can’t know exactly which question they’ll be confronted with in the exam. Of the fifty minutes we’re encouraging students to use for the Macbeth question, we’re suggesting that about ten to fifteen minutes should be spent annotating the extract and planning their ideas. I’ve developed the following steps to success for the planning phase, using the KAP acronym, whose origin I’ve been unable to trace. Having a strategy is important in terms of keeping a clear head in the exam itself.

Steps to success:

Step 1: KAP The question

  • Find the Key focus of the question.
  • Annotate the extract using the FAST annotation method – jot down the key theme words beside the extract, then find and annotate key quotations in the extract which link both to the key focus of the question and to these themes.
  • Plan the four Points you’ll make in your answer.
    1. Each of these should link to the question and could link to one of the FAST themes (Fear, Ambition, Supernatural, Truth or their antonyms).
    2. Remember to think about Shakespeare in all your points so that you stay focused on the writer.
    3. Look to include aspects of context from each of the FAST sections you choose.

Step 2: Decide on the quotations which you’ll use to support your points – at least one from the extract and at least one from the FAST theme lists.

So far, our experience is that the process has led to students producing plans which are much more focused on the question, more systematic and more likely to lead to a thematic or conceptualised response. In particular, there are fewer annotations which treat characters as if they are real people. There is a risk here that the FAST approach could reduce the text to just these four themes. The intention is that these open up the gateway to a wider approach to the text but that, pragmatically, two months prior to the exam, students need to focus their attention on a process which will make them most successful.

In the next post, I’ll go through the crafting process we’re using and share a few sample responses. In the meantime, here are:

Closer than close

In the summer of last year, I came across a paper and some articles by Daniel Willingham in the Washington Post which triggered a nasty flashback.

A nasty flashback – the heart of darkness:

Back during my second year of teaching, I was asked to work with a Local Authority Literacy Consultant. We would trial some of the National Literacy Strategy materials on guided reading – rarely used at the time and still rarely used now in secondary schools. She was a very kind and well-meaning, experienced teacher, keen to support someone early in their career with a top set Year 7 group who were eager to be stretched.

The focus of these sessions was to be on developing the students’ “reading strategies.”

The advisor brought in some posters about skimming, scanning, empathising, questioning, predicting, highlighting, inference, deduction and ‘reading backwards and forwards’ (which was and probably still is a real thing).

It was 2002. It was the future. The posters included some ‘state of the clip art’ images of bright pink and bizarrely orange faced children who were reading books.

We planned together how we’d revolve each of our guided sessions around one of these skills. Our core text would be the novel Holes, by Louis Sachar. I bought into all of this. It was the future.

If you tolerate this…

We worked our way through team teaching some skimming lessons. They went okay, though I now suspect many of our top set 11-year-olds were probably wondering why we were modelling such a straightforward process in such great detail. We weren’t stretching them, but they tolerated us.

We worked our way through team teaching some scanning lessons. A bit better, but most of these students could scan for evidence in the level of text we were looking at already. Even in the easy peasy days of ‘the noughties’ the Key Stage 2 SATS were at least a little bit challenging and, it turns out, Holes was already part of the EYFS curriculum – high expectations. Again though, we were tolerated.

Then we worked our way through a lesson on close reading. I was teaching close reading. I thought I was teaching close reading. We looked at an extract about the warden who paints her fingernails with poisoned varnish, then attacks another character with them. We gave the students a number of quotations from the passage. We gave them a question: How does Sachar present the warden in this extract? We gave them some stock phrases they could use: this suggests, this implies, this heightens the impression that, this escalates the idea that, this conveys. We modelled how to use these. Then the students tried it in pairs. Then independently. Again, we were tolerated. I thought I was going great guns.

The following week we tried it again with some non-fiction. We were moving into the realms of close reading. It all fell flat. I hadn’t been teaching close reading. I didn’t really appreciate why back then and I still feel I’m working on getting my head round it.

You Oughta Know

So, what did this ‘nightmare’-inducing paper by Daniel Willingham say? Well, here’s the full text and here’s a brief summary:

There is a correlation between listening comprehension and reading comprehension. However, the differences between listening and reading, particularly the demands of decoding letter strings, make reading comprehension more complex than listening comprehension.

When we listen, speakers can periodically check our comprehension, either by reading our non-verbal cues or by questioning us. Writers of texts can’t do this as they aren’t generally present when we’re reading and can’t amend what they’ve written to suit us if we are confused. Equally, when listening, we can ask questions of the person creating the spoken text. When we read, we can only ask questions of the text, go back and read it again or seek answers within ourselves.

Limits in decoding skills, vocabulary and subject knowledge therefore act as barriers to comprehension.

Three overarching routines are, as a consequence, important:

  • Monitoring your own comprehension to decide when you need to re-read a text
  • Making links between the information in different sentences
  • Making links between the text and what you already know

There are numerous studies of strategies which support these routines. However, of the 481 studies of reading comprehension strategies reviewed, only sixteen fulfilled both of the following criteria:

1. They had been peer reviewed

2. They showed a causal relationship between the strategy and the improvement

Only eight of these sixteen strategies “appear to have a firm scientific basis for concluding that they improve comprehension in normal readers.” Just two of these have been studied in enough depth to provide an effect size. There is an issue, though: the effect sizes reported are inflated because the tests were designed by the researchers themselves, and experimenters tend to use texts and questions which are well suited to their strategies performing well. When independent tests were used, the effect sizes were smaller. Despite this, these two strategies have a significant effect size: question generation (0.36) and multiple strategy instruction (0.32). Most research has been into individual strategies rather than comparing the strategies or unpicking which strategies might be best for which students. There is also little evidence of strategies having any impact before 3rd grade (Year 4): when students’ working memories are focused on decoding, there will be little space left for comprehension strategies.

Willingham’s view, which he’s expressed in a range of publications since, is that, based on the evidence he’s seen, “Reading strategy programs that were relatively short (around six sessions) were no more or less effective than longer programs that included as many as 50 sessions.”

Willingham describes reading strategies as tricks rather than skills. They are shortcuts to a surface understanding of a text. They have an impact – though it’s unnecessary to spend weeks practising them. However, as every text is different, comprehending each text actually requires the reader to hold the vocabulary and the background knowledge to unlock that text.

Where do we go now? Where do we go?

Every other year or so the makers of shaving razors produce an updated version of their product.

The risk here is that I suggest a way forward, then keep adding extra reading razors to enable students to read close, closer than close, closer than you could ever imagine. If I do this, apologies. This is my current best attempt and it’s heavily influenced by Reading Reconsidered by Doug Lemov.

Firstly, let’s go back to the holes in my Holes lessons. At the time, I thought the reason these worked, whilst the subsequent non-fiction lessons failed, was that the novel was pitched at the right level and the non-fiction was pitched too high. Perhaps counterintuitively, I now think the opposite. There was less to say about the Holes extracts than the non-fiction; the students weren’t learning any new vocabulary from the section of Holes which would help them in the short or the long term; and they were unused to having to struggle with texts. So, when it came to the non-fiction, with its odd ways, the tricks they’d learnt weren’t enough.

Instead, I now believe the focus needs to be on:

1. Exposing students to challenging texts on a wide range of topics in order to increase their background knowledge

2. Selecting texts which exemplify excellence and basing your planning around these

3. Implicitly and explicitly teaching tier two vocabulary and subject specific terminology

4. Increasing the amount students are thinking and writing about both the content and crafting of texts

This is how we do it:

Each Key Stage 3 lesson will begin with a ten minute Fluency Fix session. This will incorporate:

  • Explicit vocabulary teaching using a mixture of tier two words which relate to the core text and other texts that students will be studying that half term.
  • The learning of quotations from the core text to be used in the end of term exam.

Over the coming term in Year 7, alongside studying Beowulf, students will have a weekly close reading lesson and a writing lesson. This term’s close reading sessions will focus on narratives. In subsequent terms, we’ll move on to non-fiction. For the next six weeks, we’ve chosen extracts from the following texts and put them in this Wild Adventures Anthology:

  • Heart of Darkness – a narrative featuring a journey through challenging terrain
  • Lord of the Flies – a narrative featuring two contrasting characters in a challenging landscape
  • Treasure Island – a narrative featuring a threatening character
  • Jaws – a narrative featuring a threatening creature
  • Witch Child – a narrative featuring a wonderful discovery
  • Metamorphosis – a narrative featuring an increasingly desperate situation

We’ve selected these as they incorporate a range of complex vocabulary. They’re also great for exploring both language and structure.

In order to try to maximise the impact of these text choices, we’ve prepared teaching scripts in the style of those in Reading Reconsidered. We’ve used the comments function on Word to highlight where questions should be used to draw out meaning from the text. Here’s the document for the Heart of Darkness session. These scripts incorporate a number of readings of the text – conveying to students that they shouldn’t expect to understand the text on their first reading. We’ve mixed the following types of reading:

1. Contiguous reading – working through the text from start to finish

2. Leapfrog reading – jumping through the text to explore a specific image, theme or character

3. Line by line reading – analysing a part of the text in great detail

During the first reading, opportunities are taken for implicit vocabulary teaching.

Following on from this, the other readings feature text dependent questions, moving from establishing the literal meaning of parts of the text to analysing the deeper meaning of the language or structure used by the writer.

We’ve used this grid from the Reading Reconsidered training to devise these questions. This was a real eye-opener for me as I’ve had a tendency in the past to jump to analysis too quickly, before students have understood the literal meaning. We’ve also built in some of the question types Andy Tharby has provided in this post on an approach to improving analysis. Finally, in some of the more analytical questions, we’ve used more tentative language to open up a culture of error and exploration.

I’ll keep you posted with how much closer this gets our students.

Fluency Fix – An Approach to Vocabulary Teaching

Last year, we introduced a Word of the Week programme during tutor time. As you’d expect, systematically introducing only one word a week across the whole academy during tutor time had a very limited impact on the quality of students’ writing and reading. Having said this, it did raise the profile of this aspect of literacy with all staff and students and it enabled us to try out some of the strategies from Isabel Beck’s work in her books, Bringing Words to Life and Creating Robust Vocabulary. These have helped us to think through and begin to implement a new programme which we’re calling Fluency Fix. 

Beck’s principles are outlined in this post on the Word of the Week programme. These blogs from Josie Mingay, David Didau and Doug Lemov are great reads about methodologies for explicitly teaching vocabulary.

Particularly important in influencing our planning for the new programme was Josie’s reminder of Graham Nuttall’s three conditions for effective processing:

  • Strength – multiple exposures to new information (at least 3 or 4 within a limited time) are essential in order to embed knowledge
  • Depth – ensuring students think ‘hard’ about new information so as not to allow it to just hover on the surface, instead challenging learners to wrestle with new ideas and concepts to ensure they are deeply rooted
  • Elaboration – providing opportunities for learners to make connections and associations with previously acquired knowledge, in order for this to ‘latch’ onto something

I don’t want to spend long on theory here though, as the intention of this post is to introduce the Fluency Fix programme, seek peer critique and invite other teachers or English departments to become involved in its development if they wish.

Fluency Fix introduces students to five tier two words at the start of each week.

We’re currently piloting it in Year 11, initially focusing on abstract nouns, verbs and adjectives relating to emotions. We’ve begun with these because, in addition to the general importance of broadening students’ vocabulary, these words will pragmatically help students to respond as a character in Question 1 of the iGCSE English paper and to communicate their emotional response to language in both Question 2 and the unseen poetry question in their Literature exam.

When we introduce the programme into other year groups, we will combine these kinds of words with tier two words identified in the texts the students are covering as part of the curriculum.

The process occurs in six steps at present. Each stage has a common framework so that students become familiar with the process and only need to focus on developing their knowledge of the new vocabulary rather than what to do. Below is a description of each stage, the framework and an example.

Stage one is an introduction of the week’s words, focusing on familiarity with the definitions, pronunciation, graphemes, morphemes and other methods of memorising the spellings.

Fabulous Five – Session 1 Framework

Session 1 Aggravation-Optimism

Stage two focuses on developing memories of the meaning of each word. It comprises a cloze exercise based on a short passage which uses all five of the words, followed by a comprehension question about the impression given of a character or event as a result of the use of those words.

Fabulous Five – Session 2 Framework

Session 2 Aggravation-Optimism

Stage three requires students to apply their developing knowledge of the meanings of the words. They answer a range of questions, incorporating the words (in different forms) into full sentence answers.

Fabulous Five Session 3 Framework

Session 3 Aggravation-Optimism

Stage four involves students writing an extended, directed piece, using all five of the words.

Fabulous Five Session 4 Framework

Session 4 Aggravation-Optimism

A further exposure occurs through a weekly spelling test of the words.

Fabulous Five Homework Frame

Homework Aggravation-Optimism

As we’ve moved through the weeks, we’ve been weaving words from previous weeks into these exposures so as to increase the likelihood of students retaining the words in their long-term memories. We’ve also been looking into how we can best utilise online tools like Memrise and Quizlet, as Andy Tharby discusses here. Finally, we’ve set the expectation that students use these words in their speech and writing, embedding the vocabulary through more frequent usage.

I’d be really interested, first of all, in what you think of this approach to vocabulary teaching and the frameworks we’ve developed. Do you have amendments you’d suggest or tweaks you think we should make? Should we introduce further steps, or do you have other frameworks you think would enhance our work? Lastly, if you like the way this is heading and would be introducing it, or something very similar, in your faculty, would you be interested in sharing the workload of setting it all up across five year groups on a Dropbox or Google shared drive? Let me know on Twitter (@NSMWells) or via e-mail (Nick.Wells@Swindon-Academy.Org).