A few quick thoughts and observations from me, now that I’ve had a little time to look around the beta version of Analyse School Performance, the replacement for RAISEonline.

No big surprises in the way data is displayed – it combines elements of the graphical displays used in the Compare School Performance site with some of the key tables from the current RAISEonline report. And Key Stage 2 scatterplots are still there (who doesn’t love a scatterplot?) albeit without the option to change the x-axis from overall prior attainment to subject-specific prior attainment (which is a shame, as I rather liked that).

One curiosity, though, is a decision regarding the way headline progress scores are categorised into one of five bands. Some of you may recall the old ‘Data Dashboard’, which colour-coded results by quintile (i.e. 20% of schools nationally fell into each band).

When Compare School Performance was released, it was interesting to note that, instead of using equal bands (quintiles), the central band (labelled ‘Average’) represented 60% of all schools nationally, with the outer bands representing the top and bottom 10% and 20% of schools (see the left-hand side of the image below). This makes sense because results are so closely bunched around the middle that identifying a middle 20% is over-precise. A school at the 39th percentile is really not that different from the median, so moving it up a category would over-emphasise a very small difference. (Likewise, a school at the 61st percentile is not that different from the median, so moving it down a category would seem harsh.) Going for a “middle 60%” of schools alleviates this problem.
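To make that argument concrete, here is a minimal Python sketch comparing equal quintile bands (as on the old Data Dashboard) with the 10/20/60/20/10 split used by Compare School Performance. It assumes a percentile rank scale where 1 is the best-performing school nationally and 100 the worst; the cut-points are my reading of the figures quoted here, not official DfE definitions.

```python
def quintile_band(pr):
    """Equal 20% bands, as on the old Data Dashboard (assumed cut-points)."""
    if pr <= 20: return "Well above average"
    if pr <= 40: return "Above average"
    if pr <= 60: return "Average"
    if pr <= 80: return "Below average"
    return "Well below average"

def csp_band(pr):
    """10/20/60/20/10 split, as on Compare School Performance (assumed)."""
    if pr <= 10: return "Well above average"
    if pr <= 20: return "Above average"
    if pr <= 80: return "Average"
    if pr <= 90: return "Below average"
    return "Well below average"

# A school at the 39th percentile sits only a whisker above the median,
# yet the quintile scheme promotes it a whole band; CSP keeps it 'Average'.
print(quintile_band(39), "|", csp_band(39))   # Above average | Average
print(quintile_band(50), "|", csp_band(50))   # Average | Average
```

The point of the middle-60% band falls straight out: the 39th and 50th percentiles land in the same CSP band, as you would hope for two near-identical results.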

The curiosity is that a different banding is in place in ‘Analyse School Performance’ (see the right-hand side of the image below).

[Image: the same school’s progress scores as banded in Compare School Performance (left) and Analyse School Performance (right)]

You will note that the ‘Average’ band now represents 40% of schools, the next one up (‘Above national average’) represents about 25%, and, to be in the top band (‘Well above national average’), you have to be in the top 5% of schools rather than the top 10%. NB the images above relate to the same school: its reading score has moved up from the middle category (‘average’) to the second (‘above average’).

Differences exist at the bottom end too, so that a school with a result between the 71st and 80th percentiles would drop from the average category (in CSP) to the below average category (in ASP).
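Putting the two sets of boundaries side by side, here is a hedged Python sketch of both schemes. The cut-points are inferred from the figures quoted above (CSP: 10/20/60/20/10; ASP: top 5%, roughly 25%, middle 40%), again on a percentile rank scale where 1 is the best school nationally; the symmetry of the bottom ASP bands is my own assumption, since the post’s source data does not state them exactly.

```python
# (upper percentile-rank bound, label) for each band, best schools first.
CSP_BANDS = [          # 10/20/60/20/10 split (inferred)
    (10, "Well above average"),
    (20, "Above average"),
    (80, "Average"),
    (90, "Below average"),
    (100, "Well below average"),
]

ASP_BANDS = [          # 5/25/40/25/5 split (bottom symmetry assumed)
    (5, "Well above average"),
    (30, "Above average"),
    (70, "Average"),
    (95, "Below average"),
    (100, "Well below average"),
]

def band(percentile_rank, scheme):
    """Return the band label for a 1-100 percentile rank under a scheme."""
    for upper, label in scheme:
        if percentile_rank <= upper:
            return label
    raise ValueError("percentile rank must be between 1 and 100")

# The 25th percentile moves up a band in ASP; the 75th drops down one.
print(band(25, CSP_BANDS), "|", band(25, ASP_BANDS))  # Average | Above average
print(band(75, CSP_BANDS), "|", band(75, ASP_BANDS))  # Average | Below average
```

On these assumed boundaries, only schools in roughly the 6th–10th, 21st–30th, 71st–80th and 91st–95th percentile ranges change band between the two sites, which would explain why affected examples took some finding.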

Or, to put it another way, the image below shows the two different approaches to classifying results into five bands against a percentile rank scale:

[Image: CSP and ASP band boundaries compared on a percentile rank scale]

Admittedly, the vast majority of schools will be unaffected (I had to try quite a few examples before I hit the jackpot and found one that illustrates the point), but it is nonetheless a curious difference, and no rationale is given for it.

If you are a school leader and have not yet had an opportunity to explore your data in Analyse School Performance, I recommend doing so while this beta version is up. There is currently an opportunity to provide feedback on the site ahead of its launch with 2017 data. So far I haven’t hit too many snags, although the training video that was released says data can be exported to PDF, and I can’t find that functionality anywhere. But otherwise, nothing to get too alarmed about just yet.

Enjoy the half-term break. 🙂

Ben Fuller, Lead Assessment Adviser, Herts for Learning Ltd.