David Cook is the Lead Teaching and Learning Adviser for Primary Mathematics at Herts for Learning

Recently, he collated everything he and the team had found after analysing the scripts of both the KS1 and KS2 tests, fed this back to Hertfordshire subject leaders, and then gathered their experiences together. This blog reflects that presentation and subsequent work with subject leaders.

*Effective subject leadership – Improving teaching*

The key priority for all mathematics subject leaders is to improve teaching and learning even further.

Ofsted have identified that, generally, subject leaders have improved their strategies for monitoring and evaluation. But on their visits they see many plans that are light on the specific actions that will improve teaching; in particular, they point to the missing leadership of high-quality professional development. The questions on the slide are offered to help subject leaders consider their priorities for the year ahead and the actions they have identified, or still need to identify.

*KS2 Pupil outcomes 2016 – Lessons learnt and implications*

Comparisons of pupil outcomes from last year are difficult as the assessment process is now different.

Leaders are advised to evaluate their pupils’ performance by comparison to Local Authority and National outcomes. Mathematics leaders are also advised to compare outcomes across strands with their English colleague, looking at the proportions of pupils who attained the expected standard in reading, writing and mathematics, to see if this reveals any focuses for further action.

*Pupil progress*

A new measure of pupil progress has been created. See the DfE document ‘Primary School Accountability in 2016’ (page 9+) for an explanation of this new value added measure.

The testing format comprised three papers with a maximum total of 110 marks. This ‘raw score’ was then converted to a scaled score (80–120). A score of 100 or more indicated a pupil had attained the expected standard in the test.

Pupils needed to score at least 69 of the 110 marks to pass this threshold. In 2015, pupils needed to score at least 46/100 to be awarded a level 4. It is clear that the new testing structure is more challenging, and this is reflected in the drop in the national figure (70% attaining the expected standard).
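The jump in demand can be seen in the proportion of marks each threshold required. A minimal sketch of that arithmetic, using only the figures quoted above (the raw-to-scaled conversion itself is a published DfE lookup table, not a formula):

```python
# Proportion of available marks needed to reach each pass threshold.
# Figures are those quoted in the text.
def threshold_fraction(raw_needed: int, max_marks: int) -> float:
    return raw_needed / max_marks

frac_2016 = threshold_fraction(69, 110)  # expected standard, 2016
frac_2015 = threshold_fraction(46, 100)  # level 4, 2015

print(f"2016: {frac_2016:.0%} of marks needed")  # → 63% of marks needed
print(f"2015: {frac_2015:.0%} of marks needed")  # → 46% of marks needed
```

On this crude measure, pupils in 2016 needed roughly 17 percentage points more of the available marks than in 2015 to reach the comparable threshold.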

*Higher standard?*

Over the summer, the DfE created a new threshold measure. To attain this in mathematics, a pupil had to reach a raw score of 89/110 marks, which translated into a scaled score of 110. A ‘Higher Standard’ equates to a scaled score of 110+, which will remain the same each year; the raw score needed to achieve it may change. The DfE have indicated that no more than about 1/5 of pupils nationally will attain this. Clearly, the proportion of pupils in any one school can exceed this.

This higher standard is more challenging to achieve than the previous level 5 if you consider the nature of the suite of tests. In 2015, pupils had to score 79/100 to be awarded a level 5. Nationally, 41% of pupils achieved this.
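Putting the two 2016 thresholds together, the banding can be sketched as a simple lookup. This is purely illustrative (the threshold values are those quoted above; `band_2016` is my name, not a DfE term, and the full raw-to-scaled mapping is a published lookup table, so only the band boundaries are modelled):

```python
# Illustrative banding of a 2016 KS2 mathematics raw score using the two
# thresholds quoted in the text: raw 69 -> scaled 100 (expected standard)
# and raw 89 -> scaled 110 (higher standard).
def band_2016(raw_score: int) -> str:
    if raw_score >= 89:
        return "higher standard (scaled 110+)"
    if raw_score >= 69:
        return "expected standard (scaled 100+)"
    return "below the expected standard"

print(band_2016(70))  # → expected standard (scaled 100+)
print(band_2016(95))  # → higher standard (scaled 110+)
```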

*Analysing outcomes*

It is interesting to note that paper 1 increased in difficulty as it progressed: the earlier questions were generally easier. You can see the proportions of questions drawn from each year group. The structure of the paper meant that pupils could get off to a good start, with eight of the first eleven marks testing year 3 knowledge. This would have helped pupils’ confidence and success.

Take a look at the ‘Mathematics – paper 1 arithmetic analysis’ document that identifies each question’s coded domain (year group and domain it tests) and the national success percentages. Nationally, the average percentage of correct marks was 79% in paper 1.

Analysis of your pupils’ responses in the test is available in RAISEonline.

This will help you compare your pupils’ success against the national averages. But be careful to not over analyse this and ensure that your interpretation is valid.

It would be worth looking at some of the pupils’ scripts to see which strategies and methods they used to complete the calculations.

Caution: coding from the tests is somewhat simplified and does not always reflect the mathematics required.

The nationally identified weaker areas are interesting to note here. Operating with fractions and decimals is clearly a common area of development. But, I would urge caution here too. The coding is somewhat simplified and suggests that only calculations involving the most friendly of numbers would be solved using a mental strategy.

For example, consider question 7 below:

89,994 + 7,643

**5C2** add and subtract whole numbers with more than 4 digits, including using formal written methods (columnar addition and subtraction)

I would hope that year 6 pupils who have good number sense and are nimble with their numbers would be able to adjust this calculation (to ‘tidy it up’) by adding 6 to the 89,994 and altering the 7,643 to 7,637. This results in the easier calculation 90,000 + 7,637, reducing potential error points.
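The ‘tidy up’ adjustment is a compensation strategy: whatever is added to one addend is subtracted from the other, so the total is unchanged. A minimal sketch (the function name `compensate` is mine, for illustration):

```python
# Compensation ('tidy up') strategy for addition: round one addend up to
# the next friendly multiple and subtract the same adjustment from the
# other addend, leaving the total unchanged.
def compensate(a: int, b: int, friendly: int = 1000) -> tuple[int, int]:
    adjustment = (-a) % friendly  # e.g. 89,994 needs +6 to reach 90,000
    return a + adjustment, b - adjustment

a2, b2 = compensate(89_994, 7_643)
print(a2, b2)   # → 90000 7637
print(a2 + b2)  # → 97637, the same total as 89,994 + 7,643
```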

As subject leader, consider the extent to which pupils across your school are being supported to develop the security of these skills. Do you have a plan to do this?

This is why the HfL mathematics team are focusing upon the development of mental fluency.

Are your pupils nimble with number? Want to know more? Check out our brand new video: https://www.hertsforlearning.co.uk/news/mental-fluency-mathematics

It is interesting to note the difference in structure from paper 1.

Papers 2 and 3 aren’t organised in increasing difficulty. In paper 3, a greater proportion of marks are drawn from year 4, and two more marks from years 5 and 6. This accounts for the slight difference in pitch between papers 2 and 3, and for the difference in national average performance: 65% on paper 2 and 62% on paper 3.

Analysis of these papers, and comparison to previous years, indicates that there is no discernible difference in pitch: the mathematics in 2016 is no harder than it was in 2015! What is more challenging, though, is the nature of the questions. Pupils really do have to grapple, and use a wider range of working-mathematically skills, to be successful. Questions are offered in various representations, so pupils have to interpret both the content and the context and then identify the most effective way forward. Pupils’ learning has to be secure and deep. In this way, the tests are well designed: they reflect what we want for our pupils, though at the moment we still need to work hard to ensure we really develop these strong mathematical habits.

Be careful when interpreting the nationally identified weaker areas. For example, it is possible to read measurement as a weaker area and then put significant plans in place to improve this aspect of teaching. But look at the examples from paper 2, below, which are ‘coded’ as measurement questions. I suggest measurement provides the context, and a brilliant one too, for calculation.

In question 11, pupils do need to understand the proportional relationship between g and kg, but thereafter it is a calculation. Question 15 can be viewed as an exploration of two interval scales.

So again, time spent looking at some pupils’ scripts (how they responded, which strategies they used and which questions were left unanswered) may yield valuable information going forward. Help pupils to ‘notice’ and to build upon what they know.

Make sure that your curriculum offer develops pupils’ representations – the models, pictures and jottings that help them visualise, access and make sense of a problem.

Take a look at the ‘stick and triangle question’ below! *We* may interpret it as a two-dimensional representation of a pan balance with different sized objects that have equivalent mass…but do your pupils?

*Teacher assessment in 2017*

The interim teacher assessment framework remains exactly the same for 2017. *All *criteria must be met before a pupil can be awarded the expected standard at key stage 1 and 2, and greater depth at key stage 1. *It is not ‘best fit’.*

How are you using assessment to help pupils move smoothly on the trajectory to end of KS2 success?

In Hertfordshire, the percentage of Year 6 pupils awarded the ‘expected standard’ through teacher assessment was 10 percentage points higher than in the test, a similar picture to the national one. This is because the pitch of the test is higher than that of the ITAFs.

So consider your trajectory to the end of KS2. To be confident of test success, teachers in each year group will have to assess the quality of learning against the end-of-year statements rather than coverage: too many schools are still using a tick-box approach with too few indicators, without considering the depth of learning necessary to achieve security. Pupils who achieve age-related expectations are able to work as confident mathematicians; how do your teachers ensure this is what is assessed? Pupils judged against relatively few indicators often still have significant fragility, especially where coverage is assessed over learning. They may be judged as achieving the expected standard using the eight ITAF criteria but may lack the flexibility to grapple with some of the questions in the test. Lower trajectories of attainment indicate the need to intervene to accelerate learning quickly and effectively.

If this is the case for your year 6 pupils, then also consider your trajectories to age-related expectations in other year groups and how robust assessment of learning is.

*KS1 Assessments*

Pupil outcomes at the end of KS1 are formed by teacher assessment which is informed by the test outcomes. As with KS2, comparisons to 2015 attainment are difficult. Leaders, though, are urged to compare against local and National averages.

National figures broken down by gender: 72% of boys and 74% of girls were working at the expected standard, whereas 20% of boys and 16% of girls were working at greater depth.

It is interesting to note the attainment gender gap at the ‘expected’ standard, which is reversed at the ‘greater depth’ standard.

No national information is available to help leaders analyse their pupils’ performance in the test, and there are no commonly identified strengths and weaknesses. In recent cluster sessions, over 100 leaders were asked to share their reflections and analysis from the context of their own school. These have been collated and summarised below. They may not necessarily reflect your position but are offered for your consideration.

- Some leaders indicated that teacher assessment and test outcomes were quite closely matched. Others indicated that some pupils were predicted to perform better than they actually did in the test.
- The tests were felt to be more difficult as pupils were not able to use apparatus. Some pupils found paper 1 particularly tricky as a result.
- Pupils have responded very positively to school wide approaches to develop their pictorial representations and this has worked best when it has linked to the manipulatives. The ‘CPA’ approach adopted throughout the year has supported pupils well.
- Leaders noted increased anxiety and a heightened awareness among pupils that they were ‘taking a test’. Some pupils found it difficult to complete the questions in the time allocated.
- Some leaders indicated that many pupils found it difficult to identify the correct skill to use in response to a question. Other leaders suggested that the careful introduction and strategic use of question-style activities helped pupils know how and where to record their responses. Helping pupils ‘interpret’ the picture in similarly presented questions in lessons helped them develop more secure interpretation skills.
- Although there was no threshold for awarding greater depth in the test, there was a noticeable correlation between confident completion of the tests and pupils being awarded greater depth in teacher assessment.

**What do the papers tell us?**

Although comparisons to any national averages are not possible, it is interesting to look at the nature of the papers and style of questioning.

Paper 1 (arithmetic) was 25 questions in length, and the vast majority of questions were drawn from year 2 content. The maths team analysed the questions to see if it was possible to identify patterns in the strategies we would hope pupils used. It is interesting to note that pupils’ ability to use facts and friendly numbers contributes significantly to success.

Similarly, in paper 2 (31 questions and 35 marks), securing pupils’ number knowledge and their understanding of the operations would support them well.

The question below is an example of this.

So the implication for school leaders here is to ensure that these essential key skills are both given high priority in curriculum design and, the trickier part, actually secured.

Subject leaders were asked to consider these key reflection questions.

References

2016 key stage 2 mathematics Paper 2 and 3 – reasoning © Crown copyright and Crown information 2016