Herts for Learning

Blogs for Learning

Expectations for handwriting: you’re write to be confused!

Sabrina Wright is a Teaching and Learning Adviser for English at Herts for Learning.

Following on from my last blog, where I unpicked the KS1 exemplification materials and moderation guidance, I felt the urge to spend a little of my time considering the handwriting element of the Interim Teacher Assessment Frameworks (ITAFs) against the National Curriculum (NC) expectations. I have recently had a number of colleagues – teachers and SLs – ask me to clarify what exactly the ITAF statement ‘using the diagonal and horizontal strokes needed to join letters in some of their writing’ means. The question they have asked is, ‘Does this statement mean that children are expected to join or not?’ At first I thought this was very simple, but whilst writing this blog I have realised that this is not the case. I’ve unpicked the range of guidance out there and will try to clarify which, at this time, we might want to be paying most attention to.

The sensible place, in my mind, to begin was with the NC.  For Y2 it states the following (statutory):

Pupils should be taught to:

Start using some of the diagonal and horizontal strokes needed to join letters and understand which letters, when adjacent to one another, are best left unjoined

At first glance, the NC doesn’t seem to give us much more information than the ITAF does, but if you continue to read on in the notes and guidance (non-statutory) it says, ‘Pupils should revise and practise correct letter formation frequently. They should be taught to write with a joined style as soon as they can form letters securely with the correct orientation.’    Given that throughout KS1 children have been focusing on forming letters correctly, this to me would imply that although it is not an expectation that every Y2 child will be joining their handwriting, there will be children who are beginning to join in Y2.  In addition to this, the Y3/4 notes and guidance (non-statutory) state that, ‘Pupils should be using joined handwriting throughout their independent writing’.  Surely if that is the expectation in Y3/4 then at Y2 children should be beginning to join their letters? It may not be joined throughout all of their independent writing and may not be 100% accurate, but I think there would be some evidence of letters being joined.  

However, if you wanted to take the NC statement very literally, it could be interpreted as meaning that a child is only required to demonstrate the strokes needed to join writing e.g. if children have been taught to end their letter formations with the required flicks, that would then evolve into joining at a later stage.  How you interpret the NC statements is likely to influence your view of what the ITAF requirements are.

So what does the ITAF have to offer?

As in 2016, pupils can be awarded ‘working towards the expected standard’ (WTS) or ‘working at the expected standard’ (EXS) even if their writing does not evidence one, or more than one, of the handwriting statements. However, to be awarded the ‘greater depth standard’ (GDS) at the end of KS1, pupils must meet all of the statements relating to handwriting from both the EXS and GDS standards.

Working at the expected standard states the following:

  • using the diagonal and horizontal strokes needed to join letters in some of their writing
  • writing capital letters and digits of the correct size, orientation and relationship to one another and to lower case letters
  • using spacing between words that reflects the size of the letters

In the GDS it states that a child should be ‘using the diagonal and horizontal strokes needed to join letters in most of their writing’. Therefore, the only difference between this statement and the EXS statement is the word ‘most’. Remember that the qualifier ‘most’ is defined in the guidance and ITAF documents as ‘the statement is generally met with only occasional errors’. Again, if you take the above statements literally, you could interpret them as meaning that a child is only required to demonstrate the strokes needed to join writing, e.g. if children have been taught to end their letter formations with the required flicks, that would evolve into joining at a later stage.

Straight back to the exemplification materials with my magnifying glass! The detail for these documents states that, ‘If teachers are confident in their judgements, they do not need to refer to the exemplification materials. The exemplification materials are there to help teachers make their judgements where they want additional guidance.’ So in theory, I could completely ignore what they demonstrate and go with the literal interpretation of the statement. However, I just couldn’t do that, I’m afraid. So here goes… if you look at the EXS exemplification materials, the child does not join at all. It is completely printed – not a single tick in the check-list table provided at the back. Those bullets are not required for EXS anyway, so that is fine. I’m happy with that.

However, in the GDS exemplification materials, in the first piece of evidence about 50% of the words are accurately joined and 50% have some letters joined; in the remaining pieces of evidence there is much greater accuracy throughout. The comment on the first piece of evidence does in fact say that ‘diagonal and horizontal strokes are used to join some letters’, but it is ticked in the GDS table as meeting the expectation. All the other pieces in the collection say joined handwriting is used ‘consistently’ and are also ticked in the table. Does this therefore mean that pupils working at GDS should be joining?

We can definitely conclude that in order to be judged as EXS, a child could show absolutely no signs of beginning to join.  Unfortunately, the GDS evidence muddies the water a little.  The short answer to our question is that this is still a little unknown, and could definitely be down to interpretation.

As we know, the ITAFs do not include full coverage of the content of the NC, as they focus on key aspects for assessment purposes only. However, I’ve always felt that the GDS ITAF statements and exemplification materials exemplify the NC expectation for Y2 fully, whereas the EXS ones don’t. With this in mind, I would like to think that at GDS pupils should be expected to begin to join their letters, just as I would like to interpret the NC expectation.

To put it bluntly: if you want your children to scrape by, then interpreting the handwriting statements as requiring only the strokes needed to join – not the joins themselves – is fine. It would almost be the minimum expectation at GDS, and I cannot blame anyone for taking that view, particularly if a child is on the borderline of EXS and GDS. In reality, though, I think it’s unlikely that you’ll see children producing writing that demonstrates the strokes without joining, when it’s actually much easier to start joining. If you want to push them further, expecting them to begin joining letters in Y2 would be just grand!


One other thing to consider is what evidence of handwriting at greater depth would look like. Whether or not it is joined is down to your interpretation, but we do have some further guidance as to what evidence is acceptable for pupils working at GDS in the recent ‘Teacher assessment moderation: requirements for key stage 1’. It states that, ‘handwriting books or handwriting exercises can provide evidence of pupils’ independent application of handwriting. However, there must be evidence that all handwriting statements are met in some pieces of independent writing.’ Again, consider the qualifier ‘some’, which is defined as ‘the skill/knowledge is starting to be acquired, and is demonstrated correctly on occasion, but is not consistent or frequent’, and apply it to the pieces below.
Would you award me GDS or EXS?

[Image: handwriting-1 – letters are not joined but have the lead-ins and flicks]
[Image: handwriting-2 – two different styles, completely joined]
[Image: handwriting-3 – GDS exemplification piece 1]
[Image: handwriting-4 – GDS exemplification piece 2]

RAISEonline Brain Teasers part 2

Ben Fuller, Lead Assessment Adviser at Herts for Learning

[Image: raise-pupil-groups-2]

And so to the second part in this series (of undefined length – might turn into a box-set) of RAISEonline Brain teasers. If you missed part 1, it’s here. You might also find this a useful discussion about a key difference between the unvalidated and the validated KS2 data.

This post features 2 frequently (ish) asked questions, together with answers.

Q1. Why do the numbers of pupils in the 3 prior attainment groups not add up to the total number of pupils in the cohort?

(For example, in the image above, the 3 figures that I have encircled in blue show that this cohort had 10 pupils in the ‘Low’ prior attainment group, 26 in the ‘Middle’ and 12 in the ‘High’. 10+26+12 = 48 pupils. But the total cohort is shown as 58. So 10 pupils are missing.)

A: The missing pupils will be children who have no measure of prior attainment, so they cannot be allocated to a prior attainment group. For example, maybe they were not in the country at the previous key stage. Or perhaps a teacher assessment of ‘A’ was submitted at the previous key stage (which would be the case if the child had been absent for a large amount of time, making it impossible to determine a teacher assessment level).

Q2. Why do the numbers of pupils in the 3 prior attainment groups shown in RAISEonline differ from the numbers shown in Inspection Dashboard?

[Image: inspectiondashboard-pupil-groups-2]

Compare the Inspection Dashboard image above with the RAISEonline image at the top. These 2 images are from the same school, same data-set (KS2 Reading outcomes).

Why does Inspection Dashboard show prior attainment group sizes of 12, 27 and 9 pupils in low, middle and high groups respectively, whereas RAISEonline shows groups of 10, 26 and 12?

A: The difference is because Inspection Dashboard is grouping children according to their prior attainment in that same subject (i.e. in this case, reading) whereas RAISEonline groups the children according to their overall prior attainment from the previous key stage. (If looking at KS2 data, the prior attainment is based on children’s KS1 attainment in reading, writing and maths – but with maths given equal weighting to reading & writing combined).

When looking at prior attainment by individual subject, categorising the pupils is fairly straightforward – Level 3s are ‘High’, Level 2s are ‘Middle’, Level 1s and below are ‘Low’.

When using the ‘Overall’ prior attainment, an Average Point Score of 18 or higher is ‘High’, 12-17.9 is ‘Middle’, below 12 is ‘Low’.

So, in the example shown here, there are 9 children in the Reading High prior attainment group, i.e. they achieved level 3 at KS1. But there are 12 children in the overall High group shown in RAISEonline – meaning 3 extra children whose level in reading was below a level 3, but whose overall APS is at least 18 – most likely because they achieved level 3 in maths.
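The two grouping rules above can be sketched in a few lines of Python. This is purely illustrative: the function names are mine, and the point-score values for KS1 levels (e.g. 15 points for level 2b, 21 for level 3) follow the old KS1 APS conventions, which you should treat as an assumption rather than something stated here.

```python
def overall_aps(reading_pts, writing_pts, maths_pts):
    """Overall KS1 prior attainment: maths weighted equal to reading & writing combined."""
    return (reading_pts + writing_pts + 2 * maths_pts) / 4

def overall_group(aps):
    """RAISEonline-style grouping by overall APS."""
    if aps >= 18:
        return "High"
    if aps >= 12:
        return "Middle"
    return "Low"

def subject_group(level):
    """Inspection Dashboard-style grouping by the level in a single subject."""
    if level >= 3:
        return "High"
    if level >= 2:
        return "Middle"
    return "Low"

# A child with level 2b in reading (15 pts) and writing (15 pts) but level 3
# in maths (21 pts) is 'Middle' for reading, yet 'High' overall, because
# (15 + 15 + 2 * 21) / 4 = 18.0.
```

This reproduces the situation described above: a child can appear in the overall ‘High’ group without being a level 3 reader, most likely on the strength of level 3 maths.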

To really unpick what is going on, look at the pupil level data (Pupil List in RAISEonline – or look in your own internal management information system) to see how children have been categorised.

Arguably, the Inspection Dashboard way of doing things makes more sense (in my opinion) – and this is the more significant of the two documents when it comes to how Ofsted use the data pre-inspection.

Why are there these differences between the two documents?

Afraid I can’t answer that one…

NB – when looking at the Progress elements in Inspection Dashboard, the pupil groupings are by overall prior attainment group, not by individual subject. All of the above relates to the Attainment data.

That (probably) concludes my blogging for this term. But more brain teasers to follow in the New Year. I hope you can all cope with the antici…


Reflecting on the new ‘higher standards’ at Key Stages 1 and 2

Clare Hodgson, Assessment Adviser at Herts for Learning

[Image: specs]

Succumbing to the inevitable, I have recently acquired, at great expense, a pair of varifocal glasses. I find that I have to hold my head at a fractionally lower angle, as I walk, in order to see clearly. Even so, I am still struggling to adjust. I’m told it will take time.

In a similar way, I am still struggling to adjust to the ramifications and implications of the first year of KS1 and KS2 results, using the new Assessment frameworks aligned with the new National Curriculum. Continue reading “Reflecting on the new ‘higher standards’ at Key Stages 1 and 2”

KS2 Performance Tables (with an added surprise)

Ben Fuller, Lead Assessment Adviser at Herts for Learning

Yesterday saw the release of the KS2 Performance Tables (based on validated data). You can find the figures for any school in England here.

This means that anyone can look up your school and see inspiring data such as this:

[Image: progress-chart]

To the casual glancer, this chart might appear to suggest that this particular school has achieved progress scores somewhere around the median. But beware, that middle section covers around 60% of schools, so what the image above actually shows is data that could be anywhere between the 21st and 80th percentiles.

The greater surprise, though, in exploring the validated data is that an unexpected methodological change has taken place since the unvalidated data appeared in RAISEonline. This change applies to one very specific group of pupils: those who were entered into the tests (reading and maths) but failed to score enough marks to be awarded a scaled score.

In the unvalidated data, these children were excluded from the progress data (but included in attainment). However, where children were not entered for the test because they were working below its standard, their Pre-Key Stage teacher assessment was used instead, and those children were included in the progress measure. This seemed counter-intuitive, as it set up a strange incentive for schools to enter children for a test in which they were clearly unable to achieve a score.

Here’s the change: now those children have been included – provided the teacher assessment is one of the Pre Key Stage standards (PKG, PKF or PKE). If you had children who took the test and didn’t achieve a scaled score, and the teacher assessment was either PKG, PKF or PKE, your progress score will almost certainly have gone down.

If the teacher assessment for such children was HNM (Has Not Met the standard) then those children are still excluded from the measure – so the progress score should be unaffected. (This is a strange anomaly in the system. It would make more sense to me in such cases to award the same score to HNM that is used for PKG (79 points) rather than remove such a child from the progress measure altogether.)
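The inclusion rule, as I understand it from the validated data, can be summarised in a short sketch (the function name is mine, and this is a simplification rather than the DfE’s actual code):

```python
def included_in_ks2_progress(scaled_score, ta_code):
    """Whether a pupil counts in the validated KS2 progress measure.

    scaled_score: the test scaled score, or None if no score was awarded.
    ta_code: the statutory teacher assessment code.
    """
    if scaled_score is not None:
        # Achieved a scaled score: included as normal.
        return True
    # No scaled score awarded: included only if the teacher assessment is
    # a Pre-Key Stage standard (PKG is worth 79 points in the measure).
    # HNM pupils are excluded from the measure altogether.
    return ta_code in {"PKG", "PKF", "PKE"}
```

So a pupil who sat the test, scored no scaled score and was assessed as PKG now drags the progress score down, while an otherwise identical HNM pupil simply disappears from the calculation.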

So, if you had children who sat the KS2 tests but did not achieve a scaled score – check your validated data progress scores on the Performance Tables site. They might be different to the figures you have already been looking at in RAISEonline and Inspection Dashboard. (Both of these documents will be updated to the validated data at some point in the Spring.)

The intricacies of the KS2 progress model are very well explained in this excellent blog by James Pembroke (aka ‘sigplus’). Thanks James for bringing my attention to this methodological change via the medium of Twitter!


RAISEonline Brain Teasers Part 1

Ben Fuller, Lead Assessment Adviser at Herts for Learning

Over the last half-term, my email inbox has noticed a bit of a rise in the number of queries along the lines of “I’m not sure I get page x of the new RAISEonline report – can you help?” or “What is this particular table telling me?”

I thought it might be helpful, with the permission of the enquirers, to share some of these brain teasers, along with my responses, as the chances are many others might have been wondering similar things about their own data (but perhaps were too afraid to ask!) Continue reading “RAISEonline Brain Teasers Part 1”

Just Give them a Grade – Sound Advice from the Minister?

Ben Fuller is Lead Assessment Adviser at Herts for Learning

Yesterday, our Schools Minister Nick Gibb said that teachers could save time and workload by, instead of producing in-depth marking of children’s work, just writing a grade on each piece. We do of course all want to find ways to make marking and feedback less time-consuming and more impactful, but this suggestion of using grades as part of the day-to-day process of formative assessment demonstrates a tragic vacuum of understanding about the purpose of feedback.

Continue reading “Just Give them a Grade – Sound Advice from the Minister?”

Unpicking KS2 Progress Scores ahead of Friday’s RAISEonline release

Ben Fuller is Lead Assessment Adviser at Herts for Learning

This Friday our eager anticipation will be over and the new-look RAISEonline reports, showing the 2016 unvalidated data for Key Stages 1 and 2, will be released. (Interactive reports available from Friday 21st October; Summary reports available from the following Tuesday.) Information has already been provided explaining the new-look tables and charts we are going to see.

[Image: ks2-progress-scores – Progress in RAISEonline]

Continue reading “Unpicking KS2 Progress Scores ahead of Friday’s RAISEonline release”

Primary assessment: reflection and feed-forward

Ben Fuller is Lead Assessment Adviser at Herts for Learning

Welcome to the inaugural blog post from the Herts for Learning Assessment team. The aim of this blog is to periodically bring you important updates, ideas and suggestions in the world of school assessment.

I will start with some brief reflections on 2015/16, which has certainly been an interesting year in statutory assessment, with new approaches to the ways in which pupil performance has been measured and evaluated at the ends of Key Stages 1, 2, 4 and 5, as well as ongoing developments in the debate around Reception baseline assessment.

In this post I will focus on the primary phase, where teachers in Years 2 and 6 this year had to contend with new tougher tests and a new system for teacher assessment, based on the Interim Teacher Assessment Frameworks (‘ITAFs’) – which use what has been referred to as a “secure fit” (rather than “best fit”) system.  (Personally, I prefer to call it a “must have everything” approach, as I think it an unusual use of the word ‘secure’).

Continue reading “Primary assessment: reflection and feed-forward”
