Herts for Learning

Blogs for Learning



RAISEonline Brain Teasers part 2

Ben Fuller, Lead Assessment Adviser at Herts for Learning


And so to the second part in this series (of undefined length – it might yet turn into a box-set) of RAISEonline Brain Teasers. If you missed part 1, it’s here. You might also find this discussion about a key difference between the unvalidated and the validated KS2 data useful.

This post features 2 frequently (ish) asked questions, together with answers.

Q1. Why do the numbers of pupils in the 3 prior attainment groups not add up to the total number of pupils in the cohort?

(For example, in the image above, the 3 figures that I have encircled in blue show that this cohort had 10 pupils in the ‘Low’ prior attainment group, 26 in the ‘Middle’ and 12 in the ‘High’. 10+26+12 = 48 pupils. But the total cohort is shown as 58. So 10 pupils are missing.)

A: The missing pupils will be children who have no measure of prior attainment, so they cannot be allocated to a prior attainment group. For example, maybe they were not in the country at the previous key stage. Or perhaps a teacher assessment of ‘A’ was submitted at the previous key stage (which would be the case if the child had been absent for a large amount of time, making it impossible to determine a teacher assessment level).

Q2. Why do the numbers of pupils in the 3 prior attainment groups shown in RAISEonline differ from the numbers shown in Inspection Dashboard?


Compare the Inspection Dashboard image above with the RAISEonline image at the top. These 2 images are from the same school, same data-set (KS2 Reading outcomes).

Why does Inspection Dashboard show prior attainment group sizes of 12, 27 and 9 pupils in low, middle and high groups respectively, whereas RAISEonline shows groups of 10, 26 and 12?

A: The difference is because Inspection Dashboard is grouping children according to their prior attainment in that same subject (i.e. in this case, reading) whereas RAISEonline groups the children according to their overall prior attainment from the previous key stage. (If looking at KS2 data, the prior attainment is based on children’s KS1 attainment in reading, writing and maths – but with maths given equal weighting to reading & writing combined).

When looking at prior attainment by individual subject, categorising the pupils is fairly straightforward – Level 3s are ‘High’, Level 2s are ‘Middle’, Level 1s and below are ‘Low’.

When using the ‘Overall’ prior attainment, an Average Point Score of 18 or higher is ‘High’, 12-17.9 is ‘Middle’, below 12 is ‘Low’.

So, in the example shown here, there are 9 children in the Reading High prior attainment group, i.e. they achieved level 3 at KS1. But there are 12 children in the overall High group shown in RAISEonline – meaning 3 extra children whose level in reading was below a level 3, but whose overall APS is at least 18 – most likely because they achieved level 3 in maths.
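To make the two grouping rules concrete, here is a minimal Python sketch. The KS1 point scores used in the example (Level 2b = 15 points, Level 3 = 21 points) and the exact form of the weighting formula are my assumptions based on the description above, not an official specification.

```python
def subject_group(level):
    """Group by prior attainment in a single subject (Inspection Dashboard approach)."""
    if level >= 3:
        return "High"
    if level >= 2:
        return "Middle"
    return "Low"


def overall_group(aps):
    """Group by overall KS1 Average Point Score (RAISEonline approach)."""
    if aps >= 18:
        return "High"
    if aps >= 12:
        return "Middle"
    return "Low"


def ks1_aps(reading_pts, writing_pts, maths_pts):
    """Overall KS1 APS, with maths weighted equal to reading & writing combined."""
    return 0.5 * maths_pts + 0.25 * reading_pts + 0.25 * writing_pts


# A child with Level 2b in reading and writing (15 points each) but Level 3
# in maths (21 points) is 'Middle' for reading alone, yet 'High' overall:
aps = ks1_aps(15, 15, 21)                    # 18.0
print(subject_group(2), overall_group(aps))  # Middle High
```

This is exactly the kind of child who sits in the Middle group on the Inspection Dashboard reading page but in the High group in RAISEonline.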

To really unpick what is going on, look at the pupil level data (Pupil List in RAISEonline – or look in your own internal management information system) to see how children have been categorised.

The Inspection Dashboard way of doing things arguably makes more sense (in my opinion, at least) – and the Dashboard is the more significant of the two documents in terms of how Ofsted use the data pre-inspection.

Why are there these differences between the two documents?

Afraid I can’t answer that one…

NB – when looking at the Progress elements in Inspection Dashboard, the pupil groupings are by overall prior attainment group, not by individual subject. All of the above relates to the Attainment data.

That (probably) concludes my blogging for this term. But more brain teasers to follow in the New Year. I hope you can all cope with the antici…






KS2 Performance Tables (with an added surprise)

Ben Fuller, Lead Assessment Adviser at Herts for Learning

Yesterday saw the release of the KS2 Performance Tables (based on validated data). You can find the figures for any school in England here.

This means that anyone can look up your school and see inspiring data such as this:


To the casual glancer, this chart might appear to suggest that this particular school has achieved progress scores somewhere around the median. But beware, that middle section covers around 60% of schools, so what the image above actually shows is data that could be anywhere between the 21st and 80th percentiles.

The greater surprise in exploring the validated data, though, is that an unexpected methodological change has taken place since the unvalidated data appeared in RAISEonline. This change applies to one very specific group of pupils: those who were entered into the tests (reading and maths) but failed to score enough marks to be awarded a scaled score.

In the unvalidated data, these children were excluded from the progress data (though included in attainment). However, where children were not entered into a test because they were working below its standard, their Pre-Key Stage teacher assessment was used instead, and those children were included in the progress measure. This seemed counter-intuitive, as it set up a strange incentive for schools to enter children into a test in which they clearly could not achieve a score.

Here’s the change: now those children have been included – provided the teacher assessment is one of the Pre Key Stage standards (PKG, PKF or PKE). If you had children who took the test and didn’t achieve a scaled score, and the teacher assessment was either PKG, PKF or PKE, your progress score will almost certainly have gone down.

If the teacher assessment for such children was HNM (Has Not Met the standard) then those children are still excluded from the measure – so the progress score should be unaffected. (This is a strange anomaly in the system. It would make more sense to me in such cases to award the same score to HNM that is used for PKG (79 points) rather than remove such a child from the progress measure altogether.)
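As I understand it, the validated inclusion rule can be sketched as follows. The function name and its inputs are mine, purely for illustration of the logic described above:

```python
def included_in_progress(scaled_score, teacher_assessment):
    """Sketch: does a pupil count in the validated KS2 progress measure?

    scaled_score is None where the pupil sat the test but did not score
    enough marks to be awarded one (or was not entered at all).
    """
    if scaled_score is not None:
        return True   # has a scaled score, so counts as normal
    if teacher_assessment in {"PKG", "PKF", "PKE"}:
        return True   # validated data now includes the Pre-Key-Stage standards
    if teacher_assessment == "HNM":
        return False  # still excluded – the anomaly described above
    return False      # no usable outcome at all


print(included_in_progress(None, "PKG"))  # True
print(included_in_progress(None, "HNM"))  # False
```

Under the unvalidated methodology, the PKG/PKF/PKE branch applied only to pupils not entered for the test; the validated change extends it to pupils who sat the test without achieving a scaled score.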

So, if you had children who sat the KS2 tests but did not achieve a scaled score – check your validated data progress scores on the Performance Tables site. They might be different to the figures you have already been looking at in RAISEonline and Inspection Dashboard. (Both of these documents will be updated to the validated data at some point in the Spring.)

The intricacies of the KS2 progress model are very well explained in this excellent blog by James Pembroke (aka ‘sigplus’). Thanks James for bringing my attention to this methodological change via the medium of Twitter!




RAISEonline Brain Teasers Part 1

Ben Fuller, Lead Assessment Adviser at Herts for Learning

Over the last half-term, my email inbox has noticed a bit of a rise in the number of queries along the lines of “I’m not sure I get page x of the new RAISEonline report – can you help?” or “What is this particular table telling me?”

I thought it might be helpful, with the permission of the enquirers, to share some of these brain teasers, along with my responses, as the chances are many others might have been wondering similar things about their own data (but perhaps were too afraid to ask!)

Unpicking KS2 Progress Scores ahead of Friday’s RAISEonline release

Ben Fuller is Lead Assessment Adviser at Herts for Learning

This Friday our eager anticipation will be over and the new-look RAISEonline reports, showing the 2016 unvalidated data for Key Stages 1 and 2, will be released. (Interactive reports available from Friday 21st October; Summary reports available from the following Tuesday.) Information has already been provided explaining the new-look tables and charts we are going to see.

Progress in RAISEonline

