Monitoring School Performance: A Statistical Critique of England's ‘Expected Progress’ Approach with Comparison to Multilevel ‘Value-Added’ Models
Since 1992, the UK Government has published ‘school league tables’ summarizing the average educational ‘attainment’ and ‘progress’ made by pupils in each state-funded secondary school in England. In 2011 the Government made ‘expected progress’, a ‘value-table’ approach, its new headline measure of school progress. In this talk we analyze the data underlying the latest 2013 tables to statistically critique expected progress and compare it with the multilevel ‘value-added’ modeling approach. We find expected progress to be severely wanting. First, it perversely incentivizes schools to concentrate their efforts on pupils who are borderline in terms of making expected progress. Second, it exhibits an illogical upward saw-tooth relationship with prior attainment, severely biasing it in favor of schools with high-prior-attaining intakes. Third, it makes no attempt to quantify and communicate the statistical uncertainty involved in measuring school progress. In contrast, we describe and illustrate how the multilevel value-added modeling approach can go a long way toward addressing these statistical problems while providing various other advantages over expected progress. However, many concerns remain when making quantitative comparisons between schools. We therefore urge that the results of all such analyses be interpreted more cautiously than is currently the case. While we focus on England's experience of school league tables, the statistical issues we discuss are relevant to the US and other education systems where school and teacher data are routinely published for school improvement, accountability, and choice purposes.
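To make the contrast concrete, the following is a minimal illustrative sketch (not the talk's actual analysis) on simulated pupil data. It compares a threshold-based score in the spirit of ‘expected progress’ (the percentage of pupils whose progress clears a fixed cut-off) with a single-level analogue of the ‘value-added’ approach (mean residual from a regression of current on prior attainment). All variable names, the threshold, and the data-generating values are hypothetical; a full multilevel model would additionally include school random effects.

```python
# Hedged sketch: threshold-based 'expected progress'-style score vs a
# simple regression-based 'value-added'-style score on simulated data.
import numpy as np

rng = np.random.default_rng(42)

n_pupils = 1000
# Standardized prior attainment (KS2-like) and later attainment (KS4-like);
# the coefficients here are arbitrary illustrative choices.
prior = rng.normal(0.0, 1.0, n_pupils)
current = 0.7 * prior + rng.normal(0.0, 0.5, n_pupils)

# 'Expected progress' style: a binary indicator of whether each pupil's
# progress clears a fixed threshold. Pupils far above or below the cut-off
# contribute nothing at the margin -- only borderline pupils move the score.
threshold = 0.0  # hypothetical cut-off on the progress scale
progress = current - prior
pct_expected = 100.0 * np.mean(progress >= threshold)

# 'Value-added' style: residual from a regression of current on prior
# attainment, so every pupil's distance from expectation counts. A school's
# score would be the mean residual over its pupils (here, one notional school).
X = np.column_stack([np.ones(n_pupils), prior])
beta, *_ = np.linalg.lstsq(X, current, rcond=None)
value_added = current - X @ beta

print(f"% making 'expected progress': {pct_expected:.1f}")
print(f"mean value-added residual:    {value_added.mean():+.4f}")
```

Because the threshold measure discards how far each pupil sits from the cut-off, it creates the borderline-pupil incentive criticized above, whereas the regression residual uses the full attainment scale.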
George Leckie is a Senior Lecturer in Social Statistics at the Centre for Multilevel Modelling (CMM) and Graduate School of Education, University of Bristol, UK. He also holds a status-only Associate Professor position at the Graduate Department of Applied Psychology and Human Development, University of Toronto, Canada. His methodological interests are in the application and dissemination of multilevel and other latent variable models to analyze educational and social science data. His substantive interests include: school performance indicators and their associated publication in league tables; value-added models (VAMs) for measuring school and teacher effects on student achievement; and modeling rater effects on test scoring. He currently holds an ESRC Future Research Leaders grant and has previously been a co-investigator on several other ESRC grants. His research is published in various international journals, including: Journal of the Royal Statistical Society, Series A; Journal of Educational and Behavioral Statistics; Journal of Educational Measurement; and Journal of Statistical Software. He has taught multilevel modeling short courses across Europe, Australia, and the US.