Martin Galway is an English Teaching and Learning Adviser for Herts for Learning


Earlier this week, I promised on Twitter to write a blog on why learning about grammar can be a wonderful, liberating – and life-changing – thing. I still intend to write that blog; it’s next on my list. Before I start banging my grammar drum too loudly, though, I have to address some rather troubling pictures from this year’s grammar, punctuation and spelling test, shared on Twitter over the past couple of days. One such picture is this:

[Image] Question 9: marking fronted adverbials

This might not seem too bothersome until we know that it was not credited in the marking process.  On appeal, it received this response:

[Image] Review response setting out why the response above was not credited

I am not given to knee-jerk reactions around grammar teaching and testing, but this bothered me. I’ll explain why shortly. As is often the case on Twitter, debate ensued and another example came to light. Here it is:

[Image] Question 30: commas to clarify meaning or avoid ambiguity in writing

Again, not credited. Again, for insufficiently resembling a comma. Now I was properly bothered. The first question is essentially testing whether the student is able to demarcate a fronted adverbial. That’s not just my opinion – it is clearly stated in the test materials. According to the mark scheme, question 9 (the first example) falls under test domain G5.6b – here it is in the test framework (the document that frames the whole test development process):

[Image] Test framework descriptor for domain G5.6b

Note that there are no qualifying comments here in relation to the nature of the mark itself.

The second example bothered me more. Where the first is relatively simple and tests what is essentially year 4 knowledge, the second is unquestionably more complex: there is more to do, physically and mentally, with more marks to place. The fact that they are all exactly where they should be tends – to my mind at least – to suggest this child knows exactly what they are doing. Here is the description of the domain that this second question is testing:

[Image] Test framework descriptor for the domain tested by question 30

So we have a year 5 curricular requirement, with the additional level of complexity of using commas to manipulate the same group of words to carry entirely different meanings. How great is that? A simple change of marks and the meaning alters entirely. And what’s this? The student has done just that, under some degree of pressure. Bravo, youngster!

Only…no. Credit withheld. Your mark is insufficiently commaesque (new word – patent pending). And why? Apparently, it does not look like a comma. Some might argue that this is self-evident, that a comma has a particular ‘look’ – a right-leaning minor slash, perhaps – and that’s that. But that is not how testing should work. Testing should aim to establish exactly the degree to which an item or items of knowledge have been understood and retained. The knowledge here is less about the geometric properties of a comma and more about how the use of a comma serves the meaning of part or all of a sentence. The curriculum and the test frameworks set out the required knowledge. Knowledge is important, and assessing whether our students have retained core content is a central concern. In designing tests and exams, we have a duty to ensure that this required knowledge is clearly defined and properly measured. Parameters of what is and isn’t acceptable should be as unequivocal as can reasonably be expected. I’m just not at all sure that this is the case here.

It’s really important to note that there are no qualifying comments in either of the test domain descriptors as to what a “proper” comma looks like – the test domains are squarely focused on determining whether a child can mark a fronted adverbial with a comma and whether they can use commas appropriately to ensure that the meaning of a sentence is clear. To my mind, both candidates have done that. The commas are certainly in the right place. The question seems to be: are they commas?

The only real “style of punctuation” guidance I could find in the test materials is set out below. The middle box refers to what is acceptable and creditworthy; the right-hand box sets out what is deemed not worthy of credit:

[Image] Punctuation guidance from the test materials

There often seems to be a danger in relation to primary GPS of assuming that testable elements are clear cut and have a well-defined right/wrong divide. This guidance suggests otherwise. There’s room for inference here, in the same way that there is room for inference in the pupil responses above. I infer that they know what they are doing. I infer that these are unambiguous marks – even in light of this guidance. Consider the example given – are we in the realm of an upside-down question mark?

It is also worth noting that this guidance was not available to teachers until the week after the tests had been administered. An experienced year 6 teacher, aware of these requirements, may (and I stress may) have thought to themselves, “Hmm, what does a clear, unambiguous mark look like? Is this going to be one of those otherwise unforeseen pitfalls that rob my children of their rightful credit? Right – I’ll factor that into my test prep sessions and make sure I model a clear, unambiguous comma. I’m going to go for the ‘standing on its tail, leaning to the right, tadpole model’. That’ll do it. I’ll especially make sure those pesky left-handers do the right thing.”

[Author’s note: I count myself amongst those devilish left-handed miscreants]

Let’s just consider that last paragraph. In addition to the Interim Framework, not to mention the extensive prescribed elements for grammar, punctuation and spelling that stretch across the really rather large primary English curriculum, do we really want rubrics that set out the accepted dimensions and orientation of punctuation marks?

Incidentally – yet related to this question – this particular turn of events had some foreshadowing back in the previous academic year. I think we can agree that we seem able to generate myths faster than Ofsted are able to bust them. One particular myth sprang forth from the especially mythic period of the launch of the Interim Teacher Assessment Framework exemplification materials. Understandably, teachers were trying to make sense of what age-relatedness looked like in terms of writing: a tricky task at the best of times. At one point, someone shared a rumour that, under the statutory requirements, inverted commas (good old speech marks) literally had to look like “sixty-sixes and ninety-nines.” I remember rolling my eyes and thinking, “What are we doing to ourselves?” Yet here we are – schools facing a system that suggests a punctuation mark has to meet especially strict criteria. I would understand if the “commas” in the photos above were floating distinctly above the line, apostrophe-like, or if they had more of a whiff of full stop about them… but they don’t. I really do not think it takes a particularly generous mind to see them for what they are: deliberately deployed marks that shepherd the elements of a sentence – marshal them so that their meaning is clear. Surely this is what grammar teaching, learning and assessment is all about? If not, why bother?

For the sake of transparency, I must declare that I have sat on panels that review both KS1 and KS2 GPS tests. It’s essential that those of us who have taught the content and administered these tests are involved in this process. It’s a rigorous, long-term process that seeks to determine one thing above all else: does the question – as written – assess what needs to be assessed, as set out in the testing framework? Does it do so fairly and in line with all of the relevant published materials? This is far harder to do than I think anyone involved in drafting the (extensive) primary curriculum for GPS ever envisaged. Once again, grammar is nowhere near as cut and dried as many people like to think.

My question now, in the light of these shared photos, is this: do the marking protocols (including, more problematically, the additional guidance for markers that evolves across the marking process and is not transparently available to schools) properly measure the expected/desired learning, or are they unduly subject to subjective (in this case, oddly geometric) concerns? Are they, perhaps, vulnerable to swift decisions that are not filtered through several rounds of panel-based scrutiny geared towards preserving validity and reliability?

So a question, then – not a judgement. I would dearly love to know the answer.

As to my blog on why grammar can – and should – be a wonderful passage to wider enjoyment of English, it’s on its way and I shall link it here as soon as it is ready.

Excerpts from test materials:

Contains public sector information licensed under the Open Government Licence v3.0

www.nationalarchives.gov.uk/doc/open-government-licence