Herts for Learning

Blogs for Learning



It’s the end of RAISEonline as we know it (and I feel fine)

A few quick thoughts and observations from me, now that I’ve had a little bit of time to have a look around the beta version of Analyse School Performance, the replacement for RAISEonline.

No big surprises in the way data is displayed – it combines elements of the graphical displays used in the Compare School Performance site with some of the key tables from the current RAISEonline report. And Key Stage 2 scatterplots are still there (who doesn’t love a scatterplot?) albeit without the option to change the x-axis from overall prior attainment to subject-specific prior attainment (which is a shame, as I rather liked that).

One curiosity, though, is a decision that has been made about the way headline progress scores are categorised into one of five groups. Some of you may recall the old ‘Data Dashboard’, which colour-coded results according to quintiles (i.e. where 20% of schools nationally would fall into each band).

When Compare School Performance was released, it was interesting to note that, instead of using equal bands (quintiles), the central band (labelled ‘Average’) represented 60% of all schools nationally, with the outer bands representing the top and bottom 10% and 20% of schools (see left-hand side of the image below). This does make sense, because results are so closely bunched around the middle that identifying a middle 20% is over-precise. A school result at the 39th percentile would really not be that different to the median, so to move it up a category would over-emphasise a very small difference. (Likewise, a school at the 61st percentile would not be that different to the median, so to move it down a category would seem harsh.) Going for a “middle 60%” of schools alleviates this problem.

The curiosity, though, is that a different banding is in place in ‘Analyse School Performance’ (see right-hand side of the image below).

[Image: the same school’s headline progress scores as displayed in Compare School Performance (left) and Analyse School Performance (right)]

You will note that the ‘Average’ band now represents 40% of schools, the next one up (‘Above national average’) represents about 25% and, to be in the top band (‘Well above national average’), you have to be in the top 5% of schools rather than the top 10%. NB the images above relate to the same school. Note that this school’s reading score has moved up from the middle category (‘average’) to the second (‘above average’).

Differences exist at the bottom end too, so that a school with a result between the 71st and 80th percentiles would drop from the average category (in CSP) to the below average category (in ASP).

Or, to put it another way, the image below shows the 2 different approaches to classifying results into 5 bands against a percentile rank scale:

[Image: the CSP and ASP approaches to classifying results into 5 bands, shown against a percentile rank scale]

Admittedly, the vast majority of schools will be unaffected (I did have to try out quite a few examples before I hit the jackpot and found one to illustrate my point), but nonetheless it is a curious difference and no rationale behind it is given.
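To pin down the comparison, here is a minimal Python sketch of the two banding schemes. The cut-offs are my own inference from the charts and examples above (treating percentile rank 1 as the highest-performing school), and the bottom two ASP boundaries are assumed to mirror the top two – the exact published thresholds may differ:

```python
def csp_band(percentile_rank):
    """Compare School Performance bands, inferred as 10 / 10 / 60 / 10 / 10.

    percentile_rank: 1 = highest-performing school nationally, 100 = lowest.
    """
    if percentile_rank <= 10:
        return "Well above average"
    if percentile_rank <= 20:
        return "Above average"
    if percentile_rank <= 80:
        return "Average"          # the middle 60% of schools
    if percentile_rank <= 90:
        return "Below average"
    return "Well below average"


def asp_band(percentile_rank):
    """Analyse School Performance bands, inferred as 5 / 25 / 40 / 25 / 5.

    The top two boundaries come from the post (top 5%, next ~25%, middle 40%);
    the bottom two are an assumption of symmetry.
    """
    if percentile_rank <= 5:
        return "Well above average"
    if percentile_rank <= 30:
        return "Above average"
    if percentile_rank <= 70:
        return "Average"          # only the middle 40% of schools
    if percentile_rank <= 95:
        return "Below average"
    return "Well below average"
```

On these assumed boundaries, a school at the 75th percentile sits in ‘Average’ under CSP but drops to ‘Below average’ under ASP, matching the example above.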

If you are a school leader and you have not yet had an opportunity to explore your data in Analyse School Performance, I recommend you do so while this beta version is up. The opportunity is currently there to provide feedback on the site in advance of its launch with 2017 data. So far I haven’t hit too many snags, although the training video that was released does say that data can be exported to PDF, and I can’t find this functionality anywhere. But otherwise, nothing to get too alarmed about just yet.

Enjoy the half-term break. 🙂

Ben Fuller, Lead Assessment Adviser, Herts for Learning Ltd.


RAISEonline Brain Teasers part 2

Ben Fuller, Lead Assessment Adviser at Herts for Learning


And so to the second part in this series (of undefined length – it might turn into a box set) of RAISEonline Brain Teasers. If you missed part 1, it’s here. You might also find this discussion about a key difference between the unvalidated and the validated KS2 data useful.

This post features 2 frequently (ish) asked questions, together with answers.

Q1. Why do the numbers of pupils in the 3 prior attainment groups not add up to the total number of pupils in the cohort?

(For example, in the image above, the 3 figures that I have encircled in blue show that this cohort had 10 pupils in the ‘Low’ prior attainment group, 26 in the ‘Middle’ and 12 in the ‘High’. 10+26+12 = 48 pupils. But the total cohort is shown as 58. So 10 pupils are missing.)

A: The missing pupils will be children who have no measure of prior attainment, so they cannot be allocated to a prior attainment group. For example, maybe they were not in the country at the previous key stage. Or perhaps a teacher assessment of ‘A’ was submitted at the previous key stage (which would be the case if the child had been absent for a large amount of time, making it impossible to determine a teacher assessment level).

Q2. Why do the numbers of pupils in the 3 prior attainment groups shown in RAISEonline differ from the numbers shown in Inspection Dashboard?


Compare the Inspection Dashboard image above with the RAISEonline image at the top. These 2 images are from the same school, same data-set (KS2 Reading outcomes).

Why does Inspection Dashboard show prior attainment group sizes of 12, 27 and 9 pupils in low, middle and high groups respectively, whereas RAISEonline shows groups of 10, 26 and 12?

A: The difference is because Inspection Dashboard is grouping children according to their prior attainment in that same subject (i.e. in this case, reading) whereas RAISEonline groups the children according to their overall prior attainment from the previous key stage. (If looking at KS2 data, the prior attainment is based on children’s KS1 attainment in reading, writing and maths – but with maths given equal weighting to reading & writing combined).

When looking at prior attainment by individual subject, categorising the pupils is fairly straightforward – Level 3s are ‘High’, Level 2s are ‘Middle’, Level 1s and below are ‘Low’.

When using the ‘Overall’ prior attainment, an Average Point Score of 18 or higher is ‘High’, 12-17.9 is ‘Middle’, below 12 is ‘Low’.

So, in the example shown here, there are 9 children in the Reading High prior attainment group, i.e. they achieved level 3 at KS1. But there are 12 children in the overall High group shown in RAISEonline – meaning 3 extra children whose level in reading was below a level 3, but whose overall APS is at least 18 – most likely because they achieved level 3 in maths.
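As a rough sketch, the two grouping rules could be expressed as follows. Note that the KS1 point values (W = 3, L1 = 9, L2C = 13, L2B = 15, L2A = 17, L3 = 21) are the standard KS1 point-score scale and are my addition – the post above only states the APS thresholds:

```python
# Standard KS1 point scores per teacher assessment level (an assumption here;
# the post itself only gives the APS thresholds of 12 and 18).
KS1_POINTS = {"W": 3, "1": 9, "2C": 13, "2B": 15, "2A": 17, "3": 21}


def subject_group(level):
    """Prior attainment group by individual subject (the Inspection Dashboard
    approach): Level 3 = High, Level 2 = Middle, Level 1 and below = Low."""
    if level == "3":
        return "High"
    if level in ("2A", "2B", "2C"):
        return "Middle"
    return "Low"


def overall_group(reading, writing, maths):
    """Overall prior attainment group (the RAISEonline approach): KS1 APS with
    maths weighted equal to reading and writing combined, then
    >= 18 is High, 12-17.9 is Middle, below 12 is Low."""
    aps = (KS1_POINTS[reading] + KS1_POINTS[writing] + 2 * KS1_POINTS[maths]) / 4
    if aps >= 18:
        return "High"
    if aps >= 12:
        return "Middle"
    return "Low"
```

For example, a child with Level 2B in reading and writing but Level 3 in maths is ‘Middle’ for reading in Inspection Dashboard, yet has an APS of (15 + 15 + 2 × 21) / 4 = 18.0, so counts as ‘High’ overall in RAISEonline – exactly the kind of child accounting for the discrepancy above.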

To really unpick what is going on, look at the pupil level data (Pupil List in RAISEonline – or look in your own internal management information system) to see how children have been categorised.

In my opinion, the Inspection Dashboard way of doing things makes more sense – and the dashboard is the more significant of the two documents when it comes to how Ofsted use the data pre-inspection.

Why are there these differences between the two documents?

Afraid I can’t answer that one…

NB – when looking at the Progress elements in Inspection Dashboard, the pupil groupings are by overall prior attainment group, not by individual subject. All of the above relates to the Attainment data.

That (probably) concludes my blogging for this term. But more brain teasers to follow in the New Year. I hope you can all cope with the antici…






KS2 Performance Tables (with an added surprise)

Ben Fuller, Lead Assessment Adviser at Herts for Learning

Yesterday saw the release of the KS2 Performance Tables (based on validated data). You can find the figures for any school in England here.

This means that anyone can look up your school and see inspiring data such as this:


To the casual glancer, this chart might appear to suggest that this particular school has achieved progress scores somewhere around the median. But beware, that middle section covers around 60% of schools, so what the image above actually shows is data that could be anywhere between the 21st and 80th percentiles.

The greater surprise, though, in exploring the validated data is that an unexpected methodological change has taken place since the unvalidated data appeared in RAISEonline. This change applies to one very specific group of pupils – those who were entered into the tests (reading and maths) and who failed to score enough marks to be awarded a scaled score.

In the unvalidated data, these children were excluded from the progress data (but included in attainment). (However, where children were not entered into the test because they were working below its standard, their Pre-Key Stage teacher assessment was used instead and those children were included in the progress measure. This seemed counter-intuitive, as it set up a strange incentive for schools to enter children into a test in which they clearly could not achieve a score.)

Here’s the change: those children are now included – provided the teacher assessment is one of the Pre-Key Stage standards (PKG, PKF or PKE). If you had children who took the test and didn’t achieve a scaled score, and whose teacher assessment was PKG, PKF or PKE, your progress score will almost certainly have gone down.

If the teacher assessment for such children was HNM (Has Not Met the standard) then those children are still excluded from the measure – so the progress score should be unaffected. (This is a strange anomaly in the system. It would make more sense to me in such cases to award the same score to HNM that is used for PKG (79 points) rather than remove such a child from the progress measure altogether.)
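The inclusion rule for the validated data, as described above, might be sketched like this. Only the 79 points for PKG comes from the text; the PKF and PKE values below are illustrative assumptions:

```python
# Nominal KS2 scores for the Pre-Key Stage standards. The post gives 79 points
# for PKG; the PKF and PKE values are illustrative assumptions only.
PRE_KEY_STAGE_POINTS = {"PKG": 79, "PKF": 77, "PKE": 75}


def progress_inclusion(scaled_score, teacher_assessment):
    """Return (included_in_progress, score_used) for a pupil who sat a KS2 test,
    following the validated-data rule described in the post."""
    if scaled_score is not None:
        # Awarded a scaled score: included as before.
        return True, scaled_score
    if teacher_assessment in PRE_KEY_STAGE_POINTS:
        # Took the test, no scaled score, but a Pre-Key Stage teacher
        # assessment: NOW included, using the nominal score.
        return True, PRE_KEY_STAGE_POINTS[teacher_assessment]
    # HNM (or anything else): still excluded from the progress measure.
    return False, None
```

So a test-taker with no scaled score and a PKG assessment now contributes 79 points to the school’s progress calculation, while an HNM child still contributes nothing – the anomaly noted above.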

So, if you had children who sat the KS2 tests but did not achieve a scaled score – check your validated data progress scores on the Performance Tables site. They might be different to the figures you have already been looking at in RAISEonline and Inspection Dashboard. (Both of these documents will be updated to the validated data at some point in the Spring.)

The intricacies of the KS2 progress model are very well explained in this excellent blog by James Pembroke (aka ‘sigplus’). Thanks James for bringing my attention to this methodological change via the medium of Twitter!




RAISEonline Brain Teasers Part 1

Ben Fuller, Lead Assessment Adviser at Herts for Learning

Over the last half-term, my email inbox has noticed a bit of a rise in the number of queries along the lines of “I’m not sure I get page x of the new RAISEonline report – can you help?” or “What is this particular table telling me?”

I thought it might be helpful, with the permission of the enquirers, to share some of these brain teasers, along with my responses, as the chances are many others might have been wondering similar things about their own data (but perhaps were too afraid to ask!)

Unpicking KS2 Progress Scores ahead of Friday’s RAISEonline release

Ben Fuller is Lead Assessment Adviser at Herts for Learning

This Friday our eager anticipation will be over and the new-look RAISEonline reports, showing the 2016 unvalidated data for Key Stages 1 and 2, will be released. (Interactive reports available from Friday 21st October; Summary reports available from the following Tuesday.) Information has already been provided explaining the new-look tables and charts we are going to see.

Progress in RAISEonline

