Integrating Assessment, Knowledge and Practice

“If we want pupils to develop a certain skill, we have to break that skill down into its component parts and help pupils to acquire the underlying mental model [to see in their mind what it should look like in practice]. Similarly, when developing assessments for formative purposes we need to break down the skills and tasks that feature in summative assessments into tasks that will give us valid feedback about how pupils are progressing towards that end goal.”

(Daisy Christodoulou – Making Good Progress)

“Deliberate practice develops skills other people have already figured out how to do and for which there are effective training methods.

  • It takes you out of your comfort zone. 
  • It involves well defined, specific goals. 
  • It requires full attention. 
  • It involves feedback and modification. 
  • It requires a focus on specific aspects of procedures.”

(Summarised from Anders Ericsson – Peak)

Have you ever tried connecting two hoses together while the water’s flowing? It’s a tricky, splashy business that’s easy to get wrong. You could say much the same of the point in education where curriculum design meets assessment.

Swindon Academy’s curriculum design follows a mastery model based on the following set of key principles:

Curriculum Model

Our Curriculum Leaders and the teachers in their teams have worked hard to develop curriculum overviews and schemes of learning which reflect these principles, often drawing on the best publicly available resources. In some faculties and key stages, this work is further advanced than in others. Whatever stage the teams have reached in this development, though, they would agree that there is more to do, especially with the recent introduction of new GCSE and A-Level specifications and changes to vocational courses.

Over the course of last half term, I met with each of the Curriculum Leaders to review the Swindon Academy Teaching and Learning Model.

Codification Document

These discussions confirmed that the model still effectively summarises what we would want to occur in classrooms on a day-to-day basis as well as over time. Two areas of the model came up consistently in discussion as needing redevelopment, though:

  1. The assessment element no longer describes current feedback practices, which now vary from faculty to faculty due to the introduction of faculty feedback policies.
  2. Prep (homework) needs to feature more prominently to establish clearer expectations and reflect its importance in developing students’ independence.

Alongside this review, I’ve been reading a range of research relating to cognitive science, as well as educational texts on assessment, including the books Driven by Data by Paul Bambrick-Santoyo and Making Good Progress by Daisy Christodoulou, and blogs by Phil Stock, Heather Fearn and Joe Kirby. These have made me consider, in a different light, how we could tighten up our assessment model so that we:

  1. Know that the assessment systems which are being used are as reliable as we can make them.
  2. Have a shared understanding of the range of valid inferences we can draw from the data provided by these systems.
  3. Ensure that we maximise the impact of these systems, without damaging their reliability.
  4. Continue to increase the level to which students take responsibility for their own progress.

The remainder of this blog is taken from a paper designed to kick-start this process. It is divided into two elements: a teacher element and a student element. The first focuses on curriculum and assessment design, whilst the second looks at the use of knowledge organisers and self-testing as prep.

The teacher element:

We’ve found, in introducing a number of Doug Lemov’s Teach Like a Champion strategies, that it’s useful to have a shared vocabulary so that we can have efficient and effective conversations about teaching. This should also be the case with assessment practices.

Key definitions:

The following terms will be, I think, key to developing our shared understanding of assessment practices:

Domain:

The domain is the entirety of the knowledge from which an exam/assessment could draw to test a student’s understanding/ability. At Key Stages 4 and 5, this is defined by the specification, though there are also elements of knowledge from previous Key Stages which aren’t listed in specifications but still form part of the domain.

Sample:

The sample is made up of the parts of the domain which are assessed in a specific task or exam. It’s rare that we’d assess the whole of a domain, as the assessment would become overly cumbersome. Well-designed assessments are carefully thought through: the sample should represent the domain effectively so that valid inferences can be made from the data the assessment provides.

Validity:

The validity of an assessment relates to how useful it is in allowing us to make the inferences we wish to draw from it. “A test may provide good support for one inference, but weak support for another.” (Koretz, Measuring Up) We do not describe a test itself as valid or invalid, but rather the inferences which we draw from it.

Reliability:

If an assessment is reliable, it would “show little inconsistency between one measurement and the next.” (Christodoulou)

Test reliability can be affected by:

Sampling:

  • Most tests don’t directly measure a whole domain; they only sample from it as the domain is too big. If the sample is too narrow, the assessment can become unreliable.
  • If the sample is always the same, teachers will strategically teach to the test to seemingly improve student performance.

Marking:

  • Different markers may apply the same mark scheme differently.
  • One marker’s standards may fluctuate during a marking period.
  • Teachers can consciously or subconsciously be biased towards individuals or groups of students.

Students:

  • Performance on a particular day can vary between the start and end of a test.
  • Students perform differently due to illness, time of day, whether they have eaten, or the emotional impact of life experiences.

Difficulty model:

In this form of assessment, students answer a series of questions of increasing difficulty. A high jump competition or a GCSE Maths exam paper is a good example of this model.

Quality model:

Here, students perform a range of tasks and the marker judges how well they have performed, most often in relation to a set of criteria. Figure skating competitions and English and history GCSE papers use this model.

General issues which Christodoulou identifies with the most common assessment models:

  • A focus on the teaching and assessment of generic skills can lead to teachers paying insufficient attention to the knowledge required as a foundation for those skills. For example, vocabulary, number bonds, times tables, historical chronologies or relevant subject-specific facts can be overlooked in favour of teaching how to evaluate or solve problems.
  • Generic skill teaching makes deliberate practice far more challenging, as it focuses on larger-scale success as opposed to fine-grained assessment and training. For example, formative assessment in sport may take place during a match rather than a drill. Here, the teacher may miss an issue which a student has with a specific aspect of the sport and then fail to address it.
  • Using only exam questions for assessment, especially though not exclusively in subjects whose exams are based on the quality model, can hide smaller-scale weaknesses.

Specific issues which Christodoulou identifies with ongoing descriptor assessment and exam based tests:

Limitations with using descriptor based assessments to formatively assess:

  • Descriptors can be vague or unspecific.
  • Using assessment descriptors to give feedback can be unhelpful as they describe performance rather than explain how to improve.
  • Descriptors focus on performance rather than long term learning.

Limitations with using descriptor based assessments to summatively assess:

  • Tasks are often not taken in the same conditions by all students, which makes assessment less reliable.
  • Descriptors are interpreted differently by different markers.
  • Judgement based on descriptors is subject to bias.

Limitations with using exam based assessments to formatively assess:

  • By their nature, these tests have to sample from a wider domain, so we cannot identify precise areas of strength and weakness for students.
  • As questions become more challenging, it also becomes harder to identify which aspects of a question students did well or badly on.
  • Exams are designed to provide grades and grades aren’t sensitive enough to measure progress in individual lessons.

Limitations with using exam based assessments to summatively assess:

  • If we use exam formats and grades too often with students, we can end up teaching to the short term rather than the longer term.
  • All students need to take the assessments in the same conditions to secure levels of reliability.

Assessment Solutions: 

Having established these issues, Christodoulou suggests the following principles for effective formative and summative assessment:

Formative assessment principles:

  1. The tasks/questions set need to allow teachers and students to easily identify issues and next steps. In particular, if you teach a subject which is normally assessed in exams through the quality model, it is worth considering a more fine-grained testing approach for formative assessment.
  2. The process needs to include repetition to build towards mastery; otherwise the formative assessment won’t have the desired impact.
  3. Once material has been mastered, students need to be required to frequently retrieve key learning from their long-term memories.
  4. Formative assessment should be recorded as raw marks, as this makes it easiest to track progress from one lesson to the next.

Summative Assessment Principles:

  1. Summative assessments should be taken in standardised conditions and marked in a way which maximises reliability.
  2. They should cover a representative sample of a significant domain.
  3. Scaled scores are more reliable than raw marks for summative assessment (a minimal sketch of one scaling approach follows this list).
  4. Enough time should pass between summative assessments for students to make worthwhile improvements.
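To make the third principle above concrete, here is a minimal sketch of one way raw marks can be placed on a common scale: standardising each paper’s marks against its cohort and mapping them onto a fixed mean and spread. The function name and the 100 ± 15 scale are illustrative assumptions only; real scaled scores (for example in national curriculum tests) are produced through formal test equating rather than simple z-scores.

```python
from statistics import mean, stdev

def scale_scores(raw_marks, target_mean=100, target_sd=15):
    """Map raw marks onto a common scale (here, mean 100, SD 15).

    Raw marks from different papers aren't directly comparable because
    papers differ in difficulty and in total marks available, so each
    mark is first standardised against its own cohort. Illustrative
    only: awarding bodies use anchor items and equating, not z-scores.
    """
    m, sd = mean(raw_marks), stdev(raw_marks)
    return [round(target_mean + target_sd * (mark - m) / sd) for mark in raw_marks]

# Two papers of different difficulty: the raw marks can't be compared directly,
# but the scaled scores place both cohorts on the same 100 +/- 15 scale.
hard_paper = [12, 18, 25, 31, 34]    # out of 40
easier_paper = [38, 45, 52, 60, 65]  # out of 80
print(scale_scores(hard_paper))
print(scale_scores(easier_paper))
```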

What are our next steps for the teacher element?[1]

  1. Ensure the curriculum is effectively mapped out and sequenced, establishing the factual and procedural knowledge which students will learn. Divide the knowledge from the curriculum into that which students need in the long term and that which students need for a specific unit. Ensure the bulk of curriculum and prep/revision time is spent on students focusing on retaining the most important knowledge. Build space into the curriculum to assess retention of knowledge from previous units which students need in the long term.
  2. Establish when students will be assessed both summatively (whole Academy calendar) and formatively (faculty curriculum overviews). As far as possible, this should take into consideration the completion of teaching of all elements, and enough time between teaching and testing both for revision and to give us confidence that our inferences are based on learning rather than performance.
  3. Ensure that the purpose of each assessment is clear to all involved in its design, delivery, marking and provision of feedback. The format of the test should enable the function to be achieved. It should also ensure that the inferences drawn from the results are as valid as possible. The main purposes of our summative assessments include re-streaming students, reporting to parents, establishing attainment and progress over time in teaching groups and cohorts of students to report to governors. A key question for you here is whether your summative assessments are reliable enough to enable you to validly infer that certain students are working at “age related expectations” in your subject. Formative assessments should be used to identify potential gaps in knowledge, misconceptions or deficiencies in ability that can be subsequently addressed.
  4. Design assessments aligned with this timing and purpose. Using Christodoulou’s principles for summative and formative assessments will help here. Over time, two separate banks could be built up: one of summative and one of formative assessment tasks. For summative assessment, it’s also worth asking yourself the following questions, based on those found in Bambrick-Santoyo’s book Driven by Data. Do assessments in each year:
    • Address the same standard of skill/content as the end of Key Stage assessment?
    • Match the end of Key Stage assessment in format?
    • Enable students to move beyond that year’s content/skill level?
    • Reassess previously taught content which is necessary to retain until the end of the Key Stage?
  5. Trial the use of comparative judgement in subjects where a substantial proportion of assessment uses the quality model.
  6. Preview assessment tasks to ensure that:
    • Questions don’t provide clues as to the answer.
    • Questions actually test that students have learned or can apply the knowledge you wanted, rather than something else.
    • Questions are worded accurately and any unnecessary information is removed.
  7. Review assessments after use to establish whether they provided you with information that enabled you to make the inferences you wished. Make amendments to assessment items, where required, if they are to be reused in the future.
  8. Standardise the conditions in which summative assessments take place and the ways in which they are marked.
  9. Ensure that, where data from assessments is used to make key decisions, the data is sufficiently reliable. For example, when moving students between sets, use data from more than one assessment.
  10. Develop the teaching and learning review which forms part of each teacher’s CPD Booklet to ensure that teachers have action plans in place to address gaps in attainment.
  11. Establish procedures for Curriculum Leaders to review and summarise teachers’ action plans, sharing them with their Line Managers for quality assurance.

The student element:

Over the past two years, a number of our faculties have been trialling the use of knowledge organisers and low-stakes testing or quizzing as part of the process of curriculum design. Different models have emerged, sometimes with different purposes and using different frameworks. We want to make the use of knowledge organisers, self-testing and flashcards a core part of our students’ prep across subjects.

In order to secure the highest impact of this work, we need to evaluate the models currently in use to generate a set of shared principles and uses for these tools. We need to be sensibly consistent in our approach, keeping in mind the differences between the subjects that we teach. There are certainly potential benefits to the use of both knowledge organisers and quizzing, but we need to ensure these are harnessed effectively in each subject area.

Why should we bother with quizzing and knowledge organisers? Aren’t they just fads?

The term knowledge organiser could be a fad, but the idea of organising knowledge into schemas certainly is not: it has been going on for centuries.

As subject specialists, having carefully mapped our curriculum through from Key Stage 3 to Key Stage 5, it would be both wise and desirable to look for the most effective methods to ensure that students retain as much of the knowledge we are teaching them from one year to the next and, of course, into their lives beyond school. 

On a more pragmatic level, in order to support our students to do well with the new GCSE qualifications, we need to help them develop methods for retaining knowledge in the longer term. These qualifications are now more demanding. They require students to retain knowledge for longer, as they are based increasingly on terminal examinations rather than coursework, and they ask more of students in terms of problem solving.

Even if it weren’t for this, though, over the course of the last century hundreds of cognitive science studies have ranked practice testing as one of the most effective methods of improving the retention of information and procedures in long-term memory. “In 2013, five cognitive scientists (Dunlosky, Rawson, Marsh, Nathan, Willingham 2013) collated hundreds of such studies and showed that practice testing has a higher utility for retention and learning than many other study techniques.”

The table below is taken from John Dunlosky’s “Strengthening the Student Toolkit”. In this paper, he argues that, “while some [study] strategies are broadly applicable, like practice testing and distributed practice, others do not provide much – if any – bang for the buck.” Low-stakes practice testing is one of the most effective study methods.

Dunlosky

Alongside this sits Cognitive Load Theory and the work of John Sweller. Our teaching and learning handbook outlines the idea that our working memories have limited capacity, coping with only approximately 7 +/- 2 items of information. Once we go beyond these limits, our thinking processes become bogged down. These ideas have been refined over the last couple of decades into a set of instructional principles called Cognitive Load Theory. In their book “Efficiency in Learning”, Sweller et al. argue that, “Taken together, the research on segmenting content tells us that:

  • Learning is more efficient when supporting knowledge, such as facts and concepts, is taught separately from main lesson content.
  • Teaching of process stages should be preceded by teaching the names and functions of components in the process. 
  • Teaching of task steps should be segmented from teaching of supporting knowledge such as the reasons for the steps and/or concepts associated with the steps.”

Well-designed knowledge organisers or schemas and effective self-testing could therefore be useful in reducing the cognitive load on our students when they are applying knowledge in performance, production or problem solving.

Knowledge Organisers

In a blog post entitled “Knowledge Organisers: Fit for Purpose?”, Heather Fearn describes how she looked at lots of examples of knowledge organisers and found that there was often confusion over their purpose, which caused the documents to be muddled in design. As a result, they were confusing for students to use. She identifies three valid purposes:

  • A curriculum mapping tool for the teacher
  • A reference point for the pupil
  • A revision tool for the pupil and possibly parents

Given that we have Schemes of Learning for teachers to make use of and textbooks for students as a wider reference resource, I believe a useful definition of a knowledge organiser at Swindon Academy would be:

A structured, single A4 sheet which students, teachers and parents can use to create low-stakes practice quizzes. The sheet identifies the raw knowledge which needs to be recalled swiftly in order to be successful in the assessment for a specific unit. This could include: 

  • Definitions of terms, concepts or key ideas
  • Components of a process
  • People/Characters involved in a chronology
  • Processes/Chronologies/Narrative summaries
  • The steps in procedures

Use the following to check the formatting of your knowledge organisers.

  • Colour code knowledge which will be required beyond the end of the unit and knowledge which is only required in the medium term.
  • Number each item in each section to enable easy self-testing.
  • Embolden the absolute key words so that peer markers of quizzes can check they have been used in answers.
  • If an item needs more than one sentence, reconsider your phrasing; tightening it will also make your own spoken explanations clearer and more efficient.
  • Don’t have too many sections/categories – four or five are probably sufficient.
  • If including images, ensure these are the same format as those you will use in your actual lessons.
  • Spellcheck your knowledge organiser.
  • Don’t include questions or ‘thoughts to consider’.
  • If it isn’t essential it shouldn’t be there.

Self-testing. 

In his blog post “One Scientific Insight for Curriculum Reform”, Joe Kirby of Michaela Community School poses the question: “what’s the optimal format and frequency of low-stakes testing or retrieval practice?” He cites various research papers from Roediger et al. In terms of format, he maintains that “Applied research suggests [well designed] multiple-choice questions are as effective as short-answer questions. The latest research study is as recent as March 2014, so this is a fast-evolving field, and one to keep an eye on.” With regard to frequency, he adds that shorter, more frequent quizzes outperform longer, less frequent ones. However, current research suggests that the impact on long-term memory is maximised if this testing is spaced and interleaved.

He then goes on to summarise the work of a number of cognitive psychologists from the book “Make It Stick” in the following set of principles for self-testing:

  • Use frequent quizzing: testing interrupts forgetting.
  • Roll questions on work from the previous term forward into each successive quiz.
  • Design quizzing to reach back to concepts and learning covered earlier in the term, so retrieval practice continues and learning is cumulative.
  • Frequent low-stakes quizzing in class helps the teacher verify that students are in fact learning as well as they appear to be and reveals the areas where extra attention is needed.
  • Cumulative quizzing is powerful for consolidating learning and carrying concepts forward from one stage of a course into new material encountered later.
  • Simply including one retrieval practice test in a class yields a large improvement in final exam scores, and gains continue to increase as the frequency of testing increases.
  • Effortful retrieval makes for stronger learning and retention. The greater the effort to retrieve learning, provided that you succeed, the more learning is strengthened by retrieval.
  • In virtually all areas of learning, you build better mastery when you use testing as a tool.
  • One of the best habits to instil in a learner is regular self-quizzing.
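As an illustration of the cumulative, rolled-forward quizzing these principles describe, the sketch below assembles a weekly quiz from a growing bank of knowledge organiser items, mixing current-unit questions with items from earlier units. The unit names, data structure and 60/40 split are hypothetical: a sketch of the idea rather than a prescribed format.

```python
import random

# Hypothetical bank: each unit of the knowledge organiser maps to numbered items.
knowledge_bank = {
    "Unit 1: Cells":          ["nucleus", "mitochondria", "cell membrane", "ribosome"],
    "Unit 2: Respiration":    ["aerobic equation", "anaerobic equation", "uses of ATP"],
    "Unit 3: Photosynthesis": ["word equation", "limiting factors", "role of chlorophyll"],
}

def build_weekly_quiz(bank, current_unit, n_questions=8, current_share=0.6):
    """Mix current-unit items with rolled-forward items from earlier units,
    so that retrieval practice stays cumulative and interrupts forgetting."""
    current_items = list(bank[current_unit])
    previous_items = [item for unit, items in bank.items()
                      if unit != current_unit for item in items]
    n_current = min(len(current_items), round(n_questions * current_share))
    n_previous = min(len(previous_items), n_questions - n_current)
    quiz = random.sample(current_items, n_current) + random.sample(previous_items, n_previous)
    random.shuffle(quiz)
    return quiz

print(build_weekly_quiz(knowledge_bank, "Unit 3: Photosynthesis"))
```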

What are our next steps for the student element?

  1. Design knowledge organisers which fit the definition above for Schemes of Learning in Years 7-9.
  2. Use the checklist above to review the knowledge organisers.
  3. Devise self-tests or drills which could be used to assess students’ retention of the knowledge. This should include:
  • Completion of a blanked out timeline
  • Matching definitions and key terms
  • Labeling key diagrams from the organiser
  • Answering questions based on the knowledge organiser
  • A crossword with definitions from the organiser as the clues
  • Translation exercises for MFL using vocabulary from the organiser
  • Short answer questions and multiple choice questions based on the knowledge from the organiser
  4. Generate a prep schedule for students for self-testing of the sections of each knowledge organiser. In the first week, students will produce flashcards based on the organiser; in future weeks, students will use Look And Say And Cover And Write And Check (LASACAWAC) or an online quizzing platform for a specific proportion of their prep each week.
  5. Ensure knowledge organisers are stuck into each prep book.
  6. Train students in how to use their knowledge organisers.
  7. Ensure that, as students move through the key stage, they are frequently testing themselves and being assessed in class on knowledge from previous units which they will need at the end of the key stage.
  8. Add the schedule to E-Praise.

[1] Some of the following are taken from Phil Stock’s sequence of blog posts “Principles of Assessment.”

 

Write the Theme Tune, Sing the Theme Tune

Write the Theme Tune

One of my favourite blog posts from last year was The Exam Essay Question and How To Avoid Answering Them by Mark Roberts, in which he proposes six possibly controversial principles for approaching exam essays.

  • Know which quotations you’ll use before you go into the exam.
  • Know which parts of the quotations you’ll analyse.
  • Know what the content of that analysis will be.
  • Know how to fit that analysis to just about any task.
  • Know how to twist the question to your own ends.
  • Have a full essay ready to reproduce so that your planning time is spent fitting this essay to the question rather than starting from scratch.

“Making your way in the world today takes everything you’ve got”

With the increased level of challenge in the new GCSEs, it can sometimes, with some groups, feel as if there is such a volume of knowledge which needs to be retained that getting them to wade through the exam will be a Herculean task. I like the way Mark’s post provides an efficient approach to preparing for a content-heavy, terminal, closed-book GCSE exam in English Literature. I also couldn’t help but smile at the way the post reflects my own experience of studying for A-Levels in three essay-based subjects: English Literature, History and Politics.

We spent most of Year 12 (or Lower Sixth in old money) covering the content for each module. Year 13 (Upper Sixth pre-decimalisation) was largely spent doing timed practice questions. By Christmas, I’d realised there were only so many question topics which were likely to come up and these could be grouped. As long as you’d learnt the right content and developed a strategy for making that content relevant to the questions, then you could score highly in all three subjects. I planned out generic essay outlines which I could manipulate and deliberately practiced crafting these into full responses with as many past paper questions as I could. You reach a point, in doing this, where you are lifting chunks of memorised paragraphs from one essay, tweaking a few words or popping in key words from the question and dropping them into a new essay.

“Now this is the story all about how my life got flipped, turned upside down.”

Since reading Mark’s post, I’ve been working on a strategy for the AQA English Literature papers – specifically the Shakespeare, 19th Century Novel and Modern Text questions. First of all, this involved looking closely at the kinds of questions which will come up. Those about Macbeth on the AQA paper will always be based on an extract. In the sample materials, these extracts are, on average, about twenty lines in length. Students will be asked to write about a particular feature of the extract and then link this to other parts of the play. All of the questions I’ve encountered can be categorised into one of the following groups: character, theme or combination.

Character Questions:

  • Starting with this speech, explain how Shakespeare presents Macbeth.
  • Starting with this extract, explain how Shakespeare presents Lady Macbeth.
  • Starting with this extract, explain how Shakespeare presents the witches.

Theme Questions:

  • Starting with this soliloquy, explain how Shakespeare presents ambition.
  • Starting with this speech, explain how Shakespeare presents the supernatural.

Combination Questions:

  • Starting with this extract, explain how Shakespeare presents the effects of the supernatural on Macbeth.
  • Starting with this speech, explain how Shakespeare presents Lady Macbeth as a powerful woman.

It’s also worth noting four aspects of the Level 6 descriptors in the mark scheme.

Students need to use “judicious” quotations. In practice, this means the quotations need to be relevant to the specific point the student is making at that moment in their essay, as well as short and embedded in the line of their argument. This will require memorisation as well as practice in using the quotations.

They need to write an “exploratory” response. This means they need to know a range of interpretations of at least some of the quotations they memorise so that they can weave them into their response.

They’re required to craft a “conceptualised” answer, meaning they need to have a clear thread of themes and ideas running through their response. If the question is thematic, this is relatively easy. If the question focuses on a character, it is more challenging. What they need to do in this case is consider the way Shakespeare uses the character(s) as constructs to shape our thinking about the themes.

They have to make detailed links between the task, text and context. As a result, they’ll need to have in their memories a range of contextual knowledge which is directly linked to the themes, quotations, and analytical points they’ve revised.

Before I outline what I’ve come up with in terms of a strategy, I have to emphasise that students shouldn’t and can’t get away with this if they don’t have a sound knowledge of the texts (the plot, the characters, the context) already and if they aren’t taught and don’t know how to twist the material to suit the question. If these things have been taught and retained, then I think it could feasibly work.

“What’ll I do when you are far away and skies are blue? What’ll I do?”

So, using Macbeth as an example, I’ll break the strategy down into three stages:

  • Before the exam
  • Planning in the exam
  • Crafting the response

The rest of this post will look at the first two of these stages and the next will look at crafting the response.

“I don’t wanna wait til our lives will be over”

Before the exam:

Having studied the texts, read through various revision guides and looked at the sample papers and other documents produced by the exam board and other teachers, I’ve created four groupings of themes (Fear vs Courage, Ambition vs Acceptance, Supernatural vs Natural Order and Truth vs Illusion). Each grouping contains a number of synonyms and antonyms. There is no way I’ve covered all possible themes here, but there are enough to ensure that students could feasibly respond well to the questions which are likely to come up in a GCSE exam about Macbeth if they memorise these, as I hope you’ll see when we get onto the planning phase.

In the run-up to the examination, to focus students’ revision, I’ve created lists of quotations linked to each of these themed groups. These have then been added to the Quizlet app, which students can access. We’ve also printed them off as flashcards in packs. Specific words or phrases have been deliberately removed from the quotations and placed on the reverse of the flashcards so that, during their revision, students are memorising these words and their word classes, or these phrases and their connected literary terms. Grouping the quotations in this way is intended to support the students in learning them as clusters. Each quotation is linked to a specific character too, in case the task in the exam is character-based rather than thematic.
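To show how the blanked-out flashcards described above might be generated in bulk, here is a minimal sketch. The quotation bank, the asterisk convention for marking the words to remove, and the card format are all hypothetical; the real packs were produced by hand and in Quizlet.

```python
import re

# Hypothetical bank: key words to blank out are marked with *asterisks*,
# and each quotation is tagged with its FAST theme and speaker.
quotations = [
    {"theme": "Ambition", "character": "Macbeth",
     "text": "I have no spur to prick the sides of my intent, but only *vaulting* ambition"},
    {"theme": "Supernatural", "character": "Lady Macbeth",
     "text": "Come, you *spirits* that tend on mortal thoughts, *unsex* me here"},
]

def make_flashcards(quotation):
    """Return (front, back) pairs: the front shows the quotation with one
    key word blanked out; the back is the missing word to be memorised."""
    keywords = re.findall(r"\*(\w+)\*", quotation["text"])
    clean = quotation["text"].replace("*", "")
    cards = []
    for word in keywords:
        front = clean.replace(word, "_" * len(word), 1)
        cards.append((f"[{quotation['theme']} / {quotation['character']}] {front}", word))
    return cards

for q in quotations:
    for front, back in make_flashcards(q):
        print(front, "->", back)
```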

We’ll work with students on modelling how to make use of these quotations in their responses, adding to the flashcards with analysis of the quotations which they can memorise too. The thinking behind this links to this piece by Andy Tharby on teaching interpretations of literature as facts.

“Don’t know about the future, that’s anybody’s guess. Ain’t no good reason for getting all depressed”

Planning:

Even if they know the text and the extract well and they’ve done plenty of practice questions, students just can’t know what the actual question they’ll be confronted with in the exam will be. Of the fifty minutes we’re encouraging students to spend on the Macbeth question, we’re suggesting that about ten to fifteen minutes should go on annotating the extract, preparing their thoughts and planning their ideas. I’ve developed the following steps to success for the planning phase, using the KAP acronym, whose origin I can no longer trace. Having a strategy is important in terms of keeping a clear head in the exam itself.

Steps to success:

Step 1: KAP The question

  • Find the Key focus of the question.
  • Annotate the extract using the FAST annotation method – jot down the key theme words beside the extract, then find and annotate key quotations in the extract which link both to the key focus of the question and to these themes.
  • Plan the four Points you’ll make in your answer.
  1. Each of these should link to the question and could link to one of the FAST themes (Fear, Ambition, Supernatural, Truth or their antonyms).
  2. Remember to think about Shakespeare in all your points so that you stay focused on the writer.
  3. Look to include the aspects of context from each of the FAST sections you choose.

Step 2: Decide on the quotations which you’ll use to support your points – at least one from the extract and at least one from the FAST theme lists.

So far, our experience is that the process has led to students producing plans which are much more focused on the question, more systematic and more likely to lead to a thematic or conceptualised response. In particular, there are fewer annotations which treat characters as if they are real people. There is a risk here that the FAST approach could reduce the text to just these four themes. The intention is that they open up the gateway to a wider approach to the text, but that, pragmatically, two months prior to the exam, students need to focus their attention on a process which will make them most successful.

In the next post, I’ll go through the crafting process we’re using and share a few sample responses. In the meantime, here’s the Macbeth Planning pack we’ve shared with students and the Macbeth Flashcards from Quizlet.

Closer than close

In the summer of last year I came across a paper and some articles by Daniel Willingham in the Washington Post which brought to mind a nasty flashback.

A nasty flashback – the heart of darkness:

Back during my second year of teaching, I was asked to work with a Local Authority Literacy Consultant. We would trial some of the National Literacy Strategy materials on guided reading – rarely used at the time and still rarely used now in secondary schools. She was a very kind and well-meaning, experienced teacher, keen to support someone early in their career with a top set Year 7 group who were eager to be stretched.

The focus of these sessions was to be on developing the students’ “reading strategies.”

The advisor brought in some posters about skimming, scanning, empathising, questioning, predicting, highlighting, inference, deduction and ‘reading backwards and forwards’ (which was, and probably still is, a real thing).

It was 2002. It was the future. The posters included some ‘state of the clip art’ images of bright pink and bizarrely orange-faced children who were reading books.


We planned together how we’d revolve each of our guided sessions around one of these skills. Our core text would be the novel Holes, by Louis Sachar. I bought into all of this. It was the future.

If you tolerate this…

We worked our way through team teaching some skimming lessons. They went OK, though I now suspect many of our top-set 11-year-olds were probably wondering why we were modeling such a straightforward process in such great detail. We weren’t stretching them, but they tolerated us.

We worked our way through team teaching some scanning lessons. A bit better, but most of these students could scan for evidence in the level of text we were looking at already. Even in the easy peasy days of ‘the noughties’ the Key Stage 2 SATS were at least a little bit challenging and, it turns out, Holes was already part of the EYFS curriculum – high expectations. Again though, we were tolerated.

Then we worked our way through a lesson on close reading. I was teaching close reading. I thought I was teaching close reading. We looked at an extract about the warden who paints her fingernails with poisoned varnish, then attacks another character with them. We gave the students a number of quotations from the passage. We gave them a question: How does Sachar present the warden in this extract? We gave them some stock phrases they could use: this suggests, this implies, this heightens the impression that, this escalates the idea that, this conveys. We modeled how to use these. Then the students tried it in pairs. Then independently. Again, we were tolerated. I thought I was going great guns.

The following week we tried it again with some non-fiction. We were moving into the realms of close reading. It all fell flat. I hadn’t been teaching close reading. I didn’t really appreciate why back then and I still feel I’m working on getting my head round it.

You Oughta Know

So, what did this ‘nightmare’-inducing paper by Daniel Willingham say? Well, here’s the full text and here’s a brief summary:

There is a correlation between listening comprehension and reading comprehension. However, the differences between listening and reading, particularly the demands of decoding letter strings, make reading comprehension more complex than listening comprehension.

When we are listening, speakers can periodically check our comprehension, either by reading our non-verbal cues or by questioning us. Writers of texts can’t do this: they aren’t generally present when we’re reading their texts and can’t amend what they’ve written to suit us if we are confused. Equally, when listening, we can ask questions of the person creating the spoken text. When we read, we can only ask questions of the text, go back and read the text again, or seek answers within ourselves.

Limits in decoding skills, vocabulary and subject knowledge therefore act as barriers to comprehension.

Three overarching routines are, as a consequence, important:

  • Monitoring your own comprehension to decide when you need to re-read a text
  • Making links between the information in different sentences
  • Making links between the text and what you already know

There are numerous studies of strategies which support these routines. However, out of 481 studies of reading comprehension strategies, only sixteen fulfilled both of the following criteria:

1. They had been peer reviewed

2. They showed a causal relationship between the strategy and the improvement

Only eight of these sixteen strategies “appear to have a firm scientific basis for concluding that they improve comprehension in normal readers.” Just two of these have been studied in enough depth to provide an effect size. There is an issue here: the effect sizes reported by the original testers are inflated because the tests were designed by the researchers themselves, who tend to use texts and questions well suited to their strategies performing well; when independent tests were used, the effect sizes were smaller. Despite this, these two strategies have a significant effect size: question generation (0.36) and multiple strategy instruction (0.32). Most research has been into individual strategies rather than comparing strategies or unpicking which strategies might be best for which students. There is also little evidence of strategies having any impact before 3rd grade (Year 4). When students’ working memories are focused on decoding, there will be little space left for comprehension strategies.
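For readers less familiar with effect sizes: the figures quoted are standardised mean differences. Assuming they are reported as Cohen’s d (the usual convention in this literature, though Willingham’s paper should be checked for the exact measure), the calculation is:

$$
d = \frac{\bar{x}_{\text{strategy}} - \bar{x}_{\text{control}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
$$

So an effect size of 0.36 for question generation would mean that, on average, students taught that strategy scored roughly a third of a standard deviation higher on the comprehension measure than those who weren’t.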

Willingham’s view, which he has expressed in a range of publications since, is that, based on the evidence he’s seen, “Reading strategy programs that were relatively short (around six sessions) were no more or less effective than longer programs that included as many as 50 sessions.”

Willingham describes reading strategies as tricks rather than skills. They are shortcuts to a surface understanding of a text. They have impact – though it’s unnecessary to spend weeks practicing them. However, as every text is different, comprehending each text actually requires the reader to hold the vocabulary and the background knowledge to unlock that text.

Where do we go now? Where do we go?

Every other year or so the makers of shaving razors produce an updated version of their product.


The risk here is that I suggest a way forward, then keep adding extra reading razors to enable students to read close, closer than close, closer than you could ever imagine. If I do this, apologies. This is my current best attempt and it’s heavily influenced by Reading Reconsidered by Doug Lemov.

Firstly, let’s go back to the holes in my Holes lessons. At the time, I thought that the reason these lessons worked, whilst the subsequent non-fiction lessons failed, was that the novel was pitched at the right level and the non-fiction was pitched too high. Perhaps counterintuitively, I now think the opposite. There was less to say about the Holes extracts than about the non-fiction; the students weren’t learning any new vocabulary from the section of Holes which would help them in the short or the long term; and they were unused to having to struggle with texts. So, when it came to the non-fiction, with its odd ways, the tricks they’d learnt weren’t enough.

Instead, I now believe the focus needs to be on:

1. Exposing students to challenging texts on a wide range of topics in order to increase their background knowledge

2. Selecting texts which exemplify excellence and basing your planning around these

3. Implicitly and explicitly teaching tier two vocabulary and subject specific terminology

4. Increasing the amount students are thinking and writing about both the content and crafting of texts

This is how we do it:

Each Key Stage 3 lesson will begin with a ten minute Fluency Fix session. This will incorporate:

  • Explicit vocabulary teaching using a mixture of tier two words which relate to the core text and other texts that students will be studying that half term.
  • The learning of quotations from the core text to be used in the end of term exam.

Over the coming term in Year 7, alongside studying Beowulf, students will have a weekly close reading lesson and a writing lesson. This term’s close reading sessions will focus on narratives. In subsequent terms, we’ll move on to non-fiction. For the next six weeks, we’ve chosen extracts from the following texts and put them in this Wild Adventures Anthology:

  • Heart of Darkness – a narrative featuring a journey through challenging terrain
  • Lord of the Flies – a narrative featuring two contrasting characters in a challenging landscape
  • Treasure Island – a narrative featuring a threatening character
  • Jaws – a narrative featuring a threatening creature
  • Witch Child – a narrative featuring a wonderful discovery
  • Metamorphosis – a narrative featuring an increasingly desperate situation

We’ve selected these as they incorporate a range of complex vocabulary. They’re also great for exploring both language and structure.

In order to try to maximize the impact of these text choices, we’ve prepared teaching scripts in the style of those in Reading Reconsidered. We’ve used the comments function on Word to highlight where questions should be used to draw out meaning from the text. Here’s the document for the Heart of Darkness session. These scripts incorporate a number of readings of the text – conveying to students that they shouldn’t expect to understand the text on their first reading. We’ve mixed the following types of reading:

1. Contiguous reading – working through the text from start to finish

2. Leapfrog reading – jumping through the text to explore a specific image, theme, character

3. Line by line reading – analysing a part of the text in great detail

During the first reading, opportunities are taken for implicit vocabulary teaching.

Following on from this, the other readings feature text dependent questions, moving from establishing the literal meaning of parts of the text to analysing the deeper meaning of the language or structure used by the writer.

We’ve used this grid from the Reading Reconsidered training to devise these questions. This was a real eye-opener for me as I’ve had a tendency in the past to jump to analysis too quickly, before students have understood the literal meaning. We’ve also built in some of the question types Andy Tharby has provided in this post on an approach to improving analysis. Finally, in some of the more analytical questions, we’ve used more tentative language to open up a culture of error and exploration.

I’ll keep you posted with how much closer this gets our students.

Fluency Fix – An Approach to Vocabulary Teaching

Last year, we introduced a Word of the Week programme during tutor time. As you’d expect, systematically introducing only one word a week across the whole academy during tutor time had a very limited impact on the quality of students’ writing and reading. Having said this, it did raise the profile of this aspect of literacy with all staff and students and it enabled us to try out some of the strategies from Isabel Beck’s work in her books, Bringing Words to Life and Creating Robust Vocabulary. These have helped us to think through and begin to implement a new programme which we’re calling Fluency Fix. 

Beck’s principles are outlined in this post on the Word of the Week programme. These blogs from Josie Mingay, David Didau and Doug Lemov are great reads about methodologies for explicitly teaching vocabulary.

Particularly important in influencing our planning for the new programme was Josie’s reminder of Graham Nuthall’s three conditions leading to effective processing:

  • Strength – multiple exposures to new information (at least 3 or 4 within a limited time) are essential in order to embed knowledge
  • Depth – ensuring students think ‘hard’ about new information so as not to allow it to just hover on the surface, instead challenging learners to wrestle with new ideas and concepts to ensure they are deeply rooted
  • Elaboration – providing opportunities for learners to make connections and associations with previously acquired knowledge, in order for this to ‘latch’ onto something

I don’t want to spend long on theory here though as the intention of this post is to introduce the Fabulous Five Programme, seek peer critique and invite other teachers or English departments to become involved in its development if they wish.

Fluency Fix introduces students to five tier two words at the start of each week.

We’re piloting it in Year 11 at the moment and are initially focusing on abstract nouns, verbs or adjectives relating to emotions. We’ve begun with these because, in addition to believing in the importance of broadening the students’ vocabulary generally, pragmatically these words will help the students in responding as a character in Question 1 of the iGCSE English paper and in communicating their emotional response to language in both Question 2 and the unseen poetry question in their Literature exam.

When we introduce the programme into other year groups, we will combine these kinds of words with tier two words identified in the texts the students are covering as part of the curriculum.

The process occurs in six steps at present. Each stage has a common framework so that students become familiar with the process and only need to focus on developing their knowledge of the new vocabulary rather than what to do. Below is a description of each stage, the framework and an example.

Stage one is an introduction of the week’s words, focusing on familiarity with the definitions, pronunciation, graphemes, morphemes and other methods of memorising the spellings.

Fabulous Five – Session 1 Framework

Session 1 Aggravation-Optimism

Stage two focuses on developing memories of the meaning of the word. It is a cloze exercise incorporating a short passage which uses all five of the words and a comprehension question about the impression given of a character or event as a result of the use of the words.

Fabulous Five – Session 2 Framework

Session 2 Aggravation-Optimism

Stage three requires students to apply their developing knowledge of the meanings of the words. They answer a range of questions, incorporating the words (in different forms) into full sentence answers.

Fabulous Five Session 3 Framework

Session 3 Aggravation-Optimism

Stage four involves students writing an extended, directed piece, using all five of the words.

Fabulous Five Session 4 Framework

Session 4 Aggravation-Optimism

A further exposure occurs through a weekly spelling test of the words.

Fabulous Five Homework Frame

Homework Aggravation-Optimism

As we’ve moved through the weeks, we’ve been weaving words from previous weeks into these exposures so as to increase the likelihood of students retaining the words in their long-term memories. We’ve also been looking into how we can best utilise online tools like Memrise and Quizlet, as Andy Tharby discusses here. Finally, we’ve set the expectation that students use these words in their speech and writing, to embed the vocabulary through more frequent usage.

I’d be really interested, first of all, in what you think of this approach to vocabulary teaching and the frameworks we’ve developed. Do you have amendments you’d suggest or tweaks you think we should make? Should we introduce further steps, or do you have other frameworks you think would enhance our work? Lastly, if you like the way this is heading and would be introducing it or something very similar in your faculty, would you be interested in sharing the workload of setting it all up across five year groups on a Dropbox or Google shared drive? Let me know on Twitter (@NSMWells) or via e-mail (Nick.Wells@Swindon-Academy.Org).

Reviewing the situation. 

‘”There are a good many books, are there not, my boy?” said Mr. Brownlow, observing the curiosity with which Oliver surveyed the shelves that reached from the floor to the ceiling.

“A great number, sir,” replied Oliver; “I never saw so many.”

“You shall read them if you behave well,” said the old gentleman kindly; “and you will like that, better than looking at the outsides…”‘

In my previous post, I mentioned I’d been on the Reading Reconsidered training, led by Doug Lemov, Erica Woolway and Maggie Johnson. The two days were packed full of challenge: challenging practices and challenging practises. 

Nobody ever writes “Six things I know about…” or “Six interesting facts about…” or “Six of the greatest….of all time” posts. Five’s doing alright. Ten’s overexposed. Six gets a bad press. So, in the spirit of reconsidering and reviewing situations, here are my “Six Things I Reconsidered about Reading” whilst on the training. 

1: Read-Write-Discuss-Revise

Early during day one, Doug made a staggering confession. After recounting a significant part of the plot of Shakespeare’s The Winter’s Tale, he revealed that as a university undergraduate he had (be prepared for a shock) gone to the seminar focusing on this play without having done the required reading. 

Apart from shattering all my illusions about Doug, the real revelation here was that the pattern we often use in English lessons, from the primary classroom all the way through to the university seminar room, read-discuss-write, reduces levels of student accountability. We read, we talk about texts (though often the issues in the texts rather than the texts themselves), then we write about them. This means it’s possible for some students to understand the gist of a text, take a back seat during the discussion phase and then make a decent fist of writing about the text because of what they’ve picked up from other people in the room. This is OK if you’re preparing students for coursework, but a really bad move for exam classes. 

I recall fairly vividly the rumor circulating during my own first year at university that there was a set of detailed plot summaries on the third floor of the library for all of the core texts. These were probably the most well read texts amongst my fellow students. I may have used them once. 

Doug was keen to point out that sometimes read-discuss-write and other similar sequences have their place, but in terms of increasing the ratio of thinking and participation in classrooms, shifting to read-write-discuss-revise is a good move as students are having to think for themselves before gleaning ideas from others. 

2: Plan from the text

It may seem unsurprising to people outside the world of English teaching, but everything we did over the two days was about drawing implicit and explicit meaning from texts and everything was rooted in the texts. 

When you look in many English text books, there are superfluous activities which can detract from reading the texts themselves – activities designed to lead to empathy with a character, activities designed to help students consider the relevance of social issues, woolly and fluffy activities which pad out lessons in the worst possible way. Make a Facebook page for Macbeth and Lady Macbeth, write a letter from Beowulf to his Danish pen friend, hold a conference call between Eva Smith and the other women who were protesting about the conditions at Mr Birling’s factory with success criteria including the use of AFOREST devices. 

The reading strategies we experienced and watched videos of during the training were the opposite of this. All meaning was drawn from the texts being read. All planning was done with the texts as the starting point. 

3: Read aloud and rehearse reading aloud – Control the Game

We spent about twenty minutes preparing for and then practicing reading aloud with a class using the Teach Like a Champion technique, Control the Game. It seems rather stupid now to type these words out as it’s such a key part of my job but I’ve never, even in my training year, practiced reading aloud with other teachers and had feedback. Ridiculous, isn’t it? Modeling reading ourselves and hearing students reading whilst preempting or addressing decoding issues, fluency issues or issues with expression are at the foundations of what we do. 

I think there’s an assumption that these are so basic that anyone can do them. Almost anyone can do them, but it’s only when you really  consider them and practice them that you can maximize their impact. If we want, as English teachers, to be experts then it’s worth working together on these aspects of our work. This is where we could make real marginal gains. 

4: Implicitly and explicitly teach vocabulary 

I’ve written previously about explicit vocabulary teaching here. Despite having read Isabel Beck’s work on vocabulary, I haven’t set aside the time to fully think through implicit vocabulary teaching  – teaching new and challenging words just prior to or during the study of the text. 

Having tried this since the training, I’ve found that the first challenge is selecting the words to focus on in the text the students are exploring. Beck suggests we should select words which:

  • Don’t feature regularly in oral communication
  • Aren’t domain specific
  • Aren’t text specific and therefore will be revisited in reading or could be revisited in writing

The Reading Reconsidered team advise we then work out which of these words we will:

  • Work on the pronunciation of (as, once this is cracked, understanding will follow)
  • Provide a definition for
  • Provide a definition for and opportunities to practice
  • Selectively neglect 

Again, these may seem like easy decisions on the surface and for a one off lesson. However, if you begin to think about which words you’ll focus on from each text in order to make connections between texts on your curriculum, it requires a much deeper level of planning. 

5: Questions – move from questions about explicit to implicit meanings

One of the most joyous moments of the two days was experiencing Maggie Johnson modeling a close reading session. It was inspiring. 

We focused on this short extract from Steinbeck’s The Grapes of Wrath. 

“To the red country and part of the gray country of Oklahoma, the last rains came gently, and they did not cut the scarred earth. The plows crossed and recrossed the rivulet marks. The last rains lifted the corn quickly and scattered weed colonies and grass along the sides of the roads so that the gray country and the dark red country began to disappear under a green cover. In the last part of May the sky grew pale and the clouds that had hung in high puffs for so long in the spring were dissipated. The sun flared down on the growing corn day after day until a line of brown spread along the edge of each green bayonet. The clouds appeared, and went away, and in a while they did not try any more. The weeds grew darker green to protect themselves, and they did not spread any more. The surface of the earth crusted, a thin hard crust, and as the sky became pale, so the earth became pale, pink in the red country and white in the gray country.

In the water-cut gullies the earth dusted down in dry little streams. Gophers and ant lions started small avalanches. And as the sharp sun struck day after day, the leaves of the young corn became less stiff and erect; they bent in a curve at first, and then, as the central ribs of strength grew weak, each leaf tilted downward. Then it was June, and the sun shone more fiercely. The brown lines on the corn leaves widened and moved in on the central ribs. The weeds frayed and edged back toward their roots. The air was thin and the sky more pale; and every day the earth paled.”

What Maggie did was to combine the Control the Game reading we’d tried out previously with a “Leapfrog Read” where we jumped to the various references to the sun and a “Line by Line” read focusing on explicit and implicit meanings of some of the figurative language in the text. Though she pointed us to specific parts of the text, all of the thinking was ours and we were often made to write or think before discussing. Kris Boulton asks here whether this is common in English classrooms and I’d say that parts of it are, but I’ve never seen them done so expertly. 

The biggest thing I’ll take away from this session though is the shift between questions about explicit meaning and implicit meaning. It’s made me rethink and tighten up the way I structure questions about texts already. 

6: Consider implications for teachers outside of the English department

There are obvious implications of Reading Reconsidered for English faculties. They should be fairly clear for most other subjects too. However, I’ll redirect you to Kris Boulton’s blog as it takes you through his response to a question I asked him about subjects in which the relevance may be less obvious. 

There are a good many books in a good many subjects. Our students should read widely across these subjects and the strategies offered by Reading Reconsidered do, I believe, offer a way of moving deeply into the texts between the covers. 

Reconsider Yourself at Home

“What an excellent example of the power of dress, young Oliver Twist was! Wrapped in the blanket which had hitherto formed his only covering, he might have been the child of a nobleman or a beggar; it would have been hard for the haughtiest stranger to have assigned him his proper station in society. But now that he was enveloped in the old calico robes which had grown yellow in the same service, he was badged and ticketed, and fell into his place at once — a parish child — the orphan of a workhouse — the humble, half-starved drudge — to be cuffed and buffeted through the world — despised by all, and pitied by none.”

Only a few paragraphs into his novel, Oliver Twist, Dickens establishes his protagonist as representative of those children who, through the circumstances of their birth, the state of Victorian society and the treatment of others, were destined for a life of economic, social and cultural poverty. 

But our Olivers, our Olivias, our Olgas and our Omars, they’re alright aren’t they? We have higher expectations now. We have more schools than ever before which are judged to be good and outstanding by Ofsted. The government says. Even Sir Michael Wilshaw says – and he definitely has high expectations. 

Yet, according to Barnardo’s, the British children’s charity, “There are currently 3.7 million children living in poverty in the UK. That’s over a quarter of all children.” Worryingly, on the same page of Barnardo’s website, they tell us:

  • “Only 48 per cent of 5 year olds entitled to free school meals have a good level of development at the end of their reception year, compared to 65 per cent of all other pupils.  
  • Less than half of pupils entitled to free school meals (just 34 per cent) achieve 5 GCSEs at C or above, including English and Maths, this compares to 61 per cent of pupils who are not eligible.”

So, at both ends of our system of compulsory schooling, statistically there are still significant educational inequalities. Though many of these children won’t be bound by these statistics, in too many cases the underlying issues affect students as they move through adulthood and linger into old age. Although the numbers are fewer, too many, like Oliver, are still left “badged and ticketed” to fall into their “place” in society. 

In their 2013 paper for ASCL, “What is Preventing Social Mobility? A Review of the Evidence,” Francis and Wong identify the following two factors as playing a key role in generating the attainment and opportunity gaps between advantaged and disadvantaged students in the UK. 

1) The high level of educational and social segregation in our system.

2) The ability of those with greater financial and social capital to use it to secure advantage for their children.

Within these two areas, Francis and Wong list:

  • School (particularly teacher) quality and dis/advantage. 
  • Educational segregation through private and selective schools and through setting or streaming within schools. 
  • Identity and self-fulfilling prophecies. 
  • Curriculum. 
  • Work experience and school to work routes. 
  • Access to higher education. 

Addressing these issues requires action at system, school and teacher levels. 

Last week, I had the privilege of spending two days training with the Teach Like a Champion team – Doug Lemov, Erica Woolway and Maggie Johnson – as well as welcoming Doug to Swindon Academy, the school where I teach. I remember, on first reading Teach Like a Champion, being seriously impressed by the analysis Doug and his team had carried out in terms of what the most effective teachers in their school systems were doing to secure rapid progress, particularly for students from deprived backgrounds. If you’re unfamiliar with his work, this video is a useful starting point. 

When we were beginning to develop our teaching model at Swindon Academy, it struck us that there would be great benefit in having a shared language of teaching which we could use with both staff, during their coaching and CPD sessions, and students during their lessons. We could also see that the strategies tied in exceedingly well with the mastery curriculum model we were moving towards. Most importantly, we agreed with Lemov’s message that the strategies he identifies in the book are a toolkit to draw from, rather than a prescriptive list of methods which must be used robotically and unthinkingly in every lesson. Teachers must, whilst aligning themselves with the vision and ethos of their school, be seen as professional thinkers who make choices about the best strategies to support their students to make progress. 

I think this also links to what Doug alludes to in this post about school leaders making professional judgements in terms of what’s going to be of greatest benefit to the students of their school at that point on their developmental journey. It’s also why I’m really proud of what Doug says in the video at the end of this post about what he saw happening in our school. 

I’ll be writing another post soon about the ways in which we’re looking to weave in the strategies from Reading Reconsidered with the same sense of mission – not one born out of the pity which Dickens suggests people could/should have for Oliver, but rather one based on a belief in the benefits of engaging all of our students – whatever their background – in the richness of the English language and the wonder of English literature. 

The Higher You Build Your Barriers – Analyse This 2


In the previous post in this sequence, I established the premise that, in the literature classroom, reading is essentially an intellectual, emotional and/or behavioural reaction to text(s) and that, when we’re teaching students to study literature, we’re teaching them factual and/or procedural knowledge which will enable them to more successfully communicate these reactions.

Now I want to look at the potential barriers to students communicating a knowledgeable reaction in the form of an analytical piece of writing at KS3 and beyond. As this is such a huge topic, my aim, in this instance, is to categorise these barriers rather than list every possible permutation. I also don’t intend to explore any solutions here just yet. Instead, I’ll be saving these for a later post in this sequence. To help structure my thinking I’ll be splitting the issues up into two core categories.

Hands up please if you think I’ve missed something. 

Text based barriers relating to:

  • The mechanics of reading 
  • Emotional impact 
  • Behavioural impact 
  • Intellectual impact

Task based barriers relating to:

  • Question type
  • Mark scheme
  • Specimen exemplar responses from exam boards

Text based barriers:


At the most fundamental level, this set of barriers includes students having gaps in their phonic knowledge on arrival at secondary school, not having reached fluency in their decoding skills and making little use of expression or variation of tone in their reading. If these more basic problems are still lingering at the end of Key Stage 2, then they clearly need to be addressed early on during Key Stage 3 for students to make any sense of the more complex literary texts they’ll encounter during their GCSE years. 

Beyond the foundations, these mechanical barriers also encompass limited levels of perseverance with potentially unfamiliar or archaic language; the possibility that students may be reading a text written in a language in which they are not yet proficient; and difficulties caused by the complex syntactic sequencing often used in poetry and some (particularly older) prose texts. 

Barriers relating to students’ emotional, behavioural and intellectual reactions can, of course, be caused by a range of specific educational needs which make, for example, empathising with characters in a text or cognitively processing a text’s meaning much more challenging. 

In addition, some students have limited vocabulary with which to either comprehend or express subtly different feelings or actions. Comprehension and communication of comprehension can also be stifled if students don’t know much about the themes or the concepts which a text focuses upon. 

A lack of exposure to a range of cultural, social or emotional experiences can inhibit empathy with the narrator or character(s), preventing or limiting an emotional reaction. Conversely, students can be unwilling to open up about emotions or actions as a result of past social experiences, such as mockery at the hands of their peers. 

When students have little knowledge relating to the possible impacts of choices of form, structure and figurative or rhetorical language, it can prevent their reaction from going beyond the emotional or behavioural. This issue can also restrict their ability to express why they or others may have had these reactions to a text in the first place.

A lack of knowledge linked to social, historical and cultural contexts can prevent students from expressing how or why texts are characteristic of their time, or how they break away from traditions or conventions. It can also prevent students from understanding why characters act in ways which, because of cultural differences, differ from how the students would act themselves. 

Task based barriers:


As English teachers we are, in the majority of cases, graduates of English literature and/or language degree courses. Consequently, I’m sure we’d all like to think, we have a clear sense of how analytical writing should be structured and crafted. The ideal in our minds, most likely, takes the form of an academic essay – start to finish. 

A number of the examination questions which students have to answer for the latest GCSE exam specifications, though, require them to write something more like a mini-essay or ‘essay-let.’ This is in part because of the wording of the questions themselves and in part because of the time students are given to respond in the exams. 

At times, therefore, I think there is a mismatch between what we have in mind in terms of structuring academic writing and what is required for a successful response from the students in the form of a high grade. This is more the case in English Language than in English Literature, but I believe the issue exists in both qualifications. 

To exemplify this, in AQA’s GCSE Specimen Literature Paper 2, students have to complete this question:

In both ‘Poem to my Sister’ and ‘To a Daughter Leaving Home’ the speakers describe feelings about watching someone they love grow up. What are the similarities and/or differences between the ways the poets present those feelings?

Students’ responses to this question are worth a maximum of eight marks from a paper worth 94 marks in total. The time allocated for the paper is 2 1/4 hours. If you were to allocate the same proportion of time to each question as the proportion of the overall marks it is worth, then this question should take just under 12 minutes. The two unseen poems referenced in this question are nowhere near as rich in language or as structurally dense as Ozymandias, by Shelley, or Exposure, by Owen (two of the pre-studied poems the same paper may also include a question about), and the students will have read the first poem in order to answer the previous question. Even so, 12 minutes seems very little to respond to a question which could, if more time were given, potentially lead into a full analytical, comparative essay. 
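
For transparency, the proportional-time calculation behind that figure is sketched below; it assumes, purely for illustration, that a candidate splits the 135 minutes of the paper strictly in line with the marks on offer, which real candidates rarely manage.

$$\text{time for the question} = \frac{8}{94} \times 135 \text{ minutes} \approx 11.5 \text{ minutes}$$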

The time allocation for this question means students will inevitably produce a less than full response to these relatively simplistic poems. One wonders, therefore, whether it’s a worthwhile task or whether it’s actually been included in the paper to fulfill a government requirement, especially as it’s likely to lead to the teaching of a more simplistic form of response. 

Potentially exacerbating the issue of which structure to use for each question are the bullet points which exam boards provide in some of their tasks. These are, for the most part, designed to support students with basic prompts relating to the content of their responses. However, they can actually act as a barrier if students use them as a guide to structuring and organising their answer. 

In the same specimen AQA paper as the poetry question we’ve just looked at, students are assessed on their knowledge of a modern prose or drama text. One of the options for the question relating to An Inspector Calls is:

How and why does Sheila change in An Inspector Calls?

Write about:

  • How Sheila responds to her family and the Inspector. 
  • How Priestley presents Sheila by the way he writes. 

This question is odd for a number of reasons. Firstly, of the twenty-four questions in this section of the paper, this is one of only four that don’t mention the writer’s name. What’s stranger still is that this is the only question which, prior to the bullet points, treats the events of the text as if they’re reality and the character as a real person rather than a literary construct. The vast majority of questions in this paper begin with the stem, “How does (insert the writer’s name) present/explore…?” The three other questions which don’t mention the writer of the text ask about the importance of a particular feature, theme or character. 

The reason this matters is that a student could answer the question about Sheila perfectly well, within the terms of the question itself, by giving a narrative-based response, but they would then be penalized against the mark scheme and the bullet points because they would be less likely to have discussed the effects of the writer’s choices. I may be wrong, but I think this would restrict them to level one of the mark scheme and no higher than five marks. 

Protecting students against this is presumably why the bullet points have been included. They’re designed to remind students of the other aspects of the mark scheme, but it’s plausible that this reminder comes too late. It’s also quite possible that the bullet points in the Sheila question could actively promote a way of thinking which risks taking students further away from the original question. The first bullet could quite feasibly lead students to discuss Sheila’s separate responses to the other members of her family, without making these relevant by linking each back to the way it results in the changes to her world view or sense of morality. The second bullet point finally suggests to students that they should view Sheila as a literary construct, crafted by Priestley. However, there is no reference to changes, alterations or shifts in her character in this last bullet point, and my concern is that this creates an unnecessary barrier to students crafting an effective response. The question itself has prompted one way of thinking and therefore writing. The bullet points suggest a different approach. 

The question itself should lead towards success within the terms of the mark scheme. In this, and other cases, it does not. 

One reason why teachers revert to teaching a PEE/PEEL/PEEZ style structure as a basic form for structuring the parts of a response is that it feels as though you can shuffle the different parts of the acronym around to address the different parts of the mark scheme. This, you might think, guards against a dodgy question like the one about Sheila. 

There are four Assessment Objectives covered in the English Literature qualification:

AO1 – Read, understand and respond to texts:

  • Maintain a critical style and develop an informed, personal response.
  • Use textual references, including quotations, to support and illustrate interpretations. 

AO2 – Analyse the language, form and structure used by a writer to create meanings and effects, using relevant subject terminology where appropriate. 

AO3 – Show understanding of the relationships between texts and the contexts in which they were written. 

AO4 – Use a range of vocabulary and sentence structures for clarity, purpose and effect with accurate spelling and punctuation. 

PEE covers some of this, but not all. In the next post in this sequence, I’m going to look at where PEE comes from and pull together some responses to literature from people who’ve not been taught such a structured approach, to see what they do.