Selection has negative effect on pupils who aren’t selected, official data shows

Janet Downs

Selective schools, Theresa May believes, are the best way to lift disadvantaged high-achieving pupils out of poverty.  Those who oppose selection point out that it may benefit the chosen few but has a negative effect on the unselected majority.

Department for Education (DfE) data backs this up.  Non-selective schools in selective areas perform worse than all other types of secondary school.  Their average Attainment 8 score (42.1) is the lowest while average Progress 8 score (-0.14) is ‘statistically below the national average’.  

Non-selective schools in non-selective areas (aka comprehensive schools) had an average Attainment 8 score of 46.5, and Progress 8 was at the national average.  This shows that, on average, pupils in comprehensive schools made the progress expected of them based on their achievement at the end of primary school (Key Stage 2, or KS2).

Selective schools achieved the highest Attainment 8 score (69.3) while Progress 8 was ‘statistically significantly above the national average’. 

DfE number-crunchers say ‘the difference in attainment can be explained by the prior attainment of each school type.’  As would be expected, the vast majority of grammar school pupils had a better-than-expected level of attainment at the end of primary school.  In comprehensive areas, 41% of pupils had above-average prior attainment.  In selective areas, just 30.3% of pupils in non-selective schools were high achievers at the end of KS2.

Non-selective schools in selective areas also had the highest proportion (15.9%) of pupils who achieved below the KS2 expected level.  This contrasts with 13% in comprehensive schools. And none, of course, in selective schools.

The data shows that pupils who achieved above the expected KS2 level achieved better results at grammar schools than equally high-achieving pupils in non-selective schools in their areas.  Grammar pupils on average also did better than previous high achievers at comprehensive schools in comprehensive areas.

This is likely to be seized upon by grammar school supporters as proof that selection is necessary.  But it should not be forgotten that this higher-than-average attainment by a minority of specially chosen children is offset by the below-average attainment of the majority who are not chosen.

Official data shows that in areas where schools are fully comprehensive ALL pupils are, on average, likely to make the progress expected of them based on prior achievement.  Their average attainment at age 16 may be lower than if they were in grammar schools, but we have to ask ourselves if this matters in the long run.  Sutton Trust research*, remember, found that students from comprehensive schools outperformed their equally-qualified peers from both independent and grammar schools in degree results at the most academically selective universities.

If May truly supports opportunity for all, then she should dump her misguided attachment to grammar schools.  They might work for the few, but they don’t work for the many.


*The research was done in 2009.  It’s now getting rather out-of-date.  Perhaps the Sutton Trust could find out whether this finding still holds.

Comments

John Bajina
Thu, 25/01/2018 - 16:29

We in Bucks (a fully selective LA) have to live with the consequences of selection. In very real terms, our secondary moderns (where 11+ 'failures' are sent) then have to work twice as hard to help pupils get over the sense of failure.


Janet Downs
Fri, 26/01/2018 - 08:27

I agree - I worked for nearly 20 years in non-selective schools (aka secondary moderns) in a highly-selective area.  Some pupils were crushed by their 'failure'.  Others used it as a badge of honour - they would say, 'Don't expect us to do anything, we're the thickos'.

The majority, fortunately, were in between.  But there was no doubting the prejudice against them locally.  This even extended to the teachers.  There was a definite hierarchy - grammar schools good; secondary moderns inferior.


Tom Perry
Fri, 26/01/2018 - 12:33

Janet Downs
Sat, 27/01/2018 - 10:13

Tom - thanks for these links.  Is the answer to ditch Progress 8 altogether as Tom Sherrington wrote in Schools Week last year?   

If so, is there a better way to judge the Value Added by a particular school?

My concerns (as a non-statistician) are that P8 also appears to be biased against schools where the intake is skewed towards the previous low-achievement end or has been creamed of previous high achievers (ie the secondary moderns).  Pupils in these schools are less likely to take 8 GCSEs in the 'correct' subjects, for example.  This also affects UTCs and Studio Schools, which have particular curricula leading to exams which aren't counted in the P8 measure.  It's also crazy for UTCs and Studio Schools to be included at all, because they don't take pupils until the start of KS4, yet P8 is measured from the end of KS2.

And is it true that a jump from grade B to A counts more than the jump from G to F?  I read that somewhere but can't find it now.  If so, then this also introduces a bias towards grammar schools and supposedly comprehensive schools where the intake is skewed to the previous high-achieving end.


Tom Perry
Sat, 27/01/2018 - 11:19

Hi Janet - I'm delighted you responded.

I'm inclined against scrapping P8 entirely. I think we need to do several things (I know others have advocated similar): a) make the full adjustment for prior attainment, as I write about in my Schools Week piece; b) provide both a value-added measure (accounting only for prior attainment) and a contextualised value-added measure (FSM and EAL especially need accounting for), making sure both are prominent and figure when judging school performance (I think either one on its own is potentially misleading); c) create multi-year (perhaps 3-year) averages of the measures and shift the focus towards them; d) recognise the limitations of all school value-added measures and accept that there will always be a lot of noise relative to signal. The first three are fairly simple policy fixes, at least from a technical point of view. The last is harder, but policy options include displaying prominent 'health warnings' about the measures (confidence intervals are not sufficient), issuing guidance, and discouraging too much weight being placed on what they tell us, especially around high-stakes decisions. A lot of the issues are made more serious by the weight people place on the measures when judging school performance.
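To make the VA/CVA distinction concrete, here is a minimal sketch on simulated data (the variable names, coefficients and data are all hypothetical illustrations, not real DfE figures or the DfE's actual methodology): plain value-added is the residual from regressing a KS4 outcome on KS2 prior attainment alone, and contextualised value-added adds FSM and EAL indicators to the regression.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated pupil data (hypothetical, for illustration only)
n = 500
ks2 = rng.normal(100, 10, n)      # prior attainment at end of primary
fsm = rng.integers(0, 2, n)       # free school meals indicator
eal = rng.integers(0, 2, n)       # English as additional language indicator
ks4 = 0.5 * ks2 - 3.0 * fsm + 1.5 * eal + rng.normal(0, 5, n)

def residuals(y, X):
    """Residuals from ordinary least squares of y on X (intercept included)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

va = residuals(ks4, ks2.reshape(-1, 1))                 # value-added
cva = residuals(ks4, np.column_stack([ks2, fsm, eal]))  # contextualised VA

# Both measures average to (approximately) zero by construction, so a
# school's score is always read relative to the national average.
print(round(va.mean(), 10), round(cva.mean(), 10))      # → 0.0 0.0
```

A school-level score would then just be the mean residual over that school's pupils; the choice between VA and CVA is the choice of which pupil characteristics the measure controls for.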

In terms of better ways of judging the performance of a particular school - I think we need broad professional judgements drawing on a range of evidence (including performance data) and with understanding of the context, in a system which encourages challenge and appropriate checks/balances. P8 and its successors are useful as a headline monitoring tool to identify schools with markedly good or bad results given their intake and as a way of keeping a level of accountability and bounding judgements of school performance to stop them being too detached from reality. To some extent all of this already happens - but there are a lot of misconceptions out there and it is a question of balance and emphasis.

You are right about the exam scale point (see here: http://www.aqa.org.uk/about-us/what-we-do/policy/gcse-and-a-level-change...). But this is a bit of a red herring when it comes to progress measures. The key thing for Progress is the link between KS2 and KS4. The KS4 scale issues do not affect whether the measures are 'fair', in the sense of pupils of all attainment levels having an equal chance of positive/negative value-added (notwithstanding my SW piece point). They do mean that differences in progress in some areas carry more or less weight depending on how able the children are. The cost as well as the reward from pupils doing better/worse at KS4 changes, but the average Progress 8 score is still zero.
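A toy illustration of that zero-average point (a simplified assumed model, not the DfE's actual calculation): if each pupil's progress score is their own result minus the cohort average, stretching the point scale changes individual scores but the average stays zero by construction.

```python
# Toy model (assumed, simplified): progress = own score minus cohort mean,
# so the cohort mean of progress is zero whatever point scale is used.
def progress_scores(scores):
    mean = sum(scores) / len(scores)
    return [s - mean for s in scores]

scores = [40, 50, 60]            # hypothetical Attainment 8 points
p = progress_scores(scores)
print(p)                         # → [-10.0, 0.0, 10.0]

# Stretching the top of the scale (e.g. rewarding a B-to-A jump more)
# changes the size of individual progress scores...
stretched = [40, 50, 75]
q = progress_scores(stretched)
print(q)                         # → [-15.0, -5.0, 20.0]

# ...but the average progress is still exactly zero either way.
print(sum(p), sum(q))            # → 0.0 0.0
```

The real measure compares each pupil against others with the same KS2 prior attainment rather than the whole cohort, but the same mean-zero logic applies.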

What's your take on all of this? I think I am more sure of the policy points - it is harder to get a feel for the mood and impressions of use/interpretation on the ground.


Janet Downs
Sat, 27/01/2018 - 12:27

Tom - thanks for your detailed response.  I can see how the technical issues could be solved but is there any reason why pupils should have to take 8 GCSEs?  Most other countries don't expect their pupils to take so many tests at age 16.  If tests are taken at all they're usually used to decide post-16 progression and not to judge schools.

I would like to see the emphasis moved from 16 to 18.  Not from an accountability point of view, but to promote the idea that it's age 18 which matters, not high-stakes exams at 16.  The Coalition was right to say that exams at 16 needed reforming - the tragedy is that it reformed them in the wrong way, by not moving to graduation at 18 via multiple routes.

A few years ago, the OECD said there was too much emphasis on GCSEs (see here where you can find the OECD recommendations about school accountability in England).  It welcomed Contextual Value Added as a step in the right direction, but the DfE then abolished it.  The Government and Ofsted are welded to a 'no excuses' mantra and would likely be averse to taking disadvantage into account.  But it's a global phenomenon that all pupils (advantaged and disadvantaged) do less well than expected in schools where there are a large number of disadvantaged pupils.  The converse is also true: all pupils do better than expected in schools where there are a large number of advantaged pupils (see here).



Tom Perry
Sat, 27/01/2018 - 14:19

I think you are right about the need to rethink assessment to put 18 at the end point, although I have not looked into the issue in any detail. It seems more of a historical hangover than something we would choose if we had a blank sheet of paper. English policy makers do seem overly keen on driving the system using assessment. Along with Ofsted, our current approach to assessment makes our system a particularly 'high-pressure' one by international standards (http://www.tandfonline.com/doi/abs/10.1080/09243453.2014.927369). I don't know whether value-added measures make the situation better or worse - I worry they provide rhetorical support for the 'no excuses' culture (by supposing that non-school differences are taken into account).

As you say, the evidence on the influence of disadvantage is really clear, and I too worry about the 'no excuses' culture (but I also worry about an excuses culture...). I was concerned that the CVA measure was convoluted and lacked transparency, and that in its attempt to be a perfectly fair measure it just exposed that such a thing is impossible. I think once one has taken FSM and EAL into account, the improvements are marginal and come at the cost of a lot of complexity. Part of my thinking behind wanting both VA and CVA was my awareness that there is no perfect measure. We just need informative indicators to work with which allow us to look at different aspects of performance. I am not even sure that bundling everything together in best 8/Attainment 8 is informative (rather than a reflection of policy makers' desire for headline measures to judge schools).


Janet Downs
Mon, 29/01/2018 - 09:12

Perhaps the answer is NOT to try to reduce judgements about school effectiveness to a simple number which is then used for league tables.  That's unlikely with English politicians at the moment - education is a useful project for 'improvement' which results in endless reforms. 

I admire state schools which say 'stuff league tables' and continue to enter pupils for the exams they feel are most appropriate.  These are, of course, very few in number, as it's likely to be career suicide.  Ironically, it's grammar schools which are more likely to do this.  There's a grammar (see John's comment) recorded in the league tables as 'under-performing' because its iGCSEs weren't included in official data.  And my local grammar was listed as the worst-performing secondary in the area a few years ago because it used (and continues to use) iGCSEs.  Nobody took any notice.  The double irony is that if one of the area's secondary moderns had been judged worst-performing for exactly the same reason (ie use of iGCSEs), nobody would have believed that explanation.  They would have taken the assessment at face value.


John Bajina
Sat, 27/01/2018 - 13:49

With the move to education (or training) for children aged 16+, graduation at 18 makes a lot of sense. It would motivate children to continue studying for longer, giving them a better chance of passing exams. They would have at least two more years to decide on their future direction.
Contextual Value Added has been yo-yoing under Gove and after. At first in Bucks, Ofsted and the local LA refused to look at CVA. It was only when the true nature and inequalities of selection were realised that Ofsted and the LA re-started looking at CVA.
It certainly is a Bucks phenomenon that all pupils do less well than expected in our secondary modern schools. Half of Bucks' secondary schools are rated Requires Improvement or below.
Lastly, for your interest, see (note the Royal Grammar School rated as under-performing): http://www.bucksfreepress.co.uk/news/15899686.REVEALED__Which_Bucks_scho...


Janet Downs
Sat, 27/01/2018 - 13:58

John - the non-inclusion of iGCSEs in league tables leads to the sort of nonsense being experienced by the Royal Grammar.  My local grammar also uses iGCSEs.  When the Coalition came to power, the DfE said iGCSEs would be included in performance tables to produce a level playing field between state schools and the independent schools which heavily used iGCSEs.  Then it changed its mind in 2014 (probably because it feared a wholesale abandonment of 'reformed' GCSEs in favour of iGCSEs, as I pointed out here).


