Monthly Archives: December 2015

G2 – Grammar schools and Educational Apartheid

The fiftieth anniversary of Anthony Crosland’s Circular 10/65, in which the Labour minister decreed that Local Authorities should plan for comprehensive education, passed by unnoticed. Yet it was not, as often thought, the end of the grammar schools. Indeed, by allowing a local option the way was left open for the continuation of state-funded grammar schools in Tory Local Authorities, and some grammar schools went independent as Grant Maintained status became an insecure option. This article looks at what has happened in one Local Authority area – Lincolnshire – and should be read alongside Margaret Morris’s assessment of the history of post-1944 education in the Theory section.

Educational Apartheid in Lincolnshire: selective education as a catalyst for driving inequalities.

It is a popular misconception that secondary modern schools went away. In Lincolnshire we retained this type of secondary school, designed for the majority of students – those not in the so-called top 25% ability range identified by the 11-plus. To confuse matters still further, most secondary modern schools are now academies; some offer A-levels while others don’t, and grammar schools offer A-levels but are selective. Lincolnshire does not have a comprehensive education system: its school structures are a patchwork. Secondary modern schools and grammar schools maintain the 11-plus status quo, while academies complicate matters further.

Sadly, the comprehensive schools of the 1950s and 1960s never reached South Lincolnshire. I went to a “red brick” secondary modern school in Louth while my better-off counterparts attended the local grammar school, whose history and traditions go back to at least 1548, supported by the Church and local guilds. On leaving school in 1976 I was conscious that university wasn’t an option. None of my peers left school to go to university, because we didn’t have a sixth form, which meant there were limited opportunities to combine O-levels with CSEs and no opportunity to do A-levels. There were, and still are, inequalities within Louth that are symptomatic of selective education dividing social class. There are still demarcations across housing and income as to which schools serve particular parts of town.

Inequality has become so embedded in our culture that no one speaks out. Each year children are divided into sheep and goats at the 11-plus for transition into secondary schools, and we turn the other cheek. Grammar school supporters try to justify their system, so we are faced with unfounded claims such as “there’s no difference between schools selecting students and setting within schools”. In my opinion, having one’s own children rejected by this system seems like child abuse – it is totalising and brutal. Children’s friendships are torn apart. Rejection at the 11-plus hurts everyone around the child. It damages community cohesion.

My observations are based on my own experiences and those of my children, their friends, parents and grandparents. I am also speaking out for those teachers who I know are oppressed by the system.

Local context

In 2001 I moved to the coastal part of Lincolnshire within the district of East Lindsey, to a seaside town between Skegness and Mablethorpe. Our area suffers from 40% child povertyi ii and multiple inequalities that are exacerbated by selective education.

In 2013, in a TES article called “Waiting for a Sea Change”iii, Emma Hadley, executive principal of an academy group in Skegness, estimated that about 30% of students at a primary academy lived in caravans. She explained that the seasonality of employment means that at Skegness Academy (an all-ability school with a sixth form) about half of the students in year 11 joined the school after year 7 and 45% were eligible for the pupil premium.

School wars or a “Coastal Challenge?”

In his rebuttal of the evidence from the Sutton Trust, which showed that grammar schools take far fewer children in receipt of free school meals than other state schoolsiv, Robert McCartney QC, chairperson of the National Grammar Schools Association (NGSA), said:

Many, many parents from deprived areas, including what is generally called the dependency classes, are essentially not particularly interested in any form of academic education.

It would be ridiculous to say that parents are not interested in education, or that schools cannot compensate for some of the surrounding poverty and inequality, but it would be equally crass to give schools the target of overcoming the link between social background and educational achievement and then punish them for failing. In my experience of my own children’s education, schools can and do make a difference, but only within the limits of what political parties are prepared to invest in deprived areas. The language of “low-ability”, “chaotic” and “lacking resilience to accept disappointment”, from those who should know better, has deflected scrutiny and the responsibility for educating every child free from coercion and stress, so that the powers that be can protect the remnants of an outdated education system that supports one child to the detriment of several others under the rhetoric of “parent choice”. The point is that the structure of selective education limits achievement and social integration. However aspirational parents may be, grammar schools don’t want their children unless they pass the 11-plus, which is ironic given that the pressure on school places is likely to move to secondary schools.

In light of Nicky Morgan, the Secretary of State for Education, approving a Kent grammar school’s expansion, Ian Widdows, founder of the National Association of Secondary Modern Schools (NASM), defends the successes of secondary modern schools in Schools Weekv. However, the point is that secondary modern schools are for “failures” and are seen as such. It is common parlance: fail the 11-plus, go to a secondary modern. Meanwhile, the elephant in the room for “coasting schools” (those rated as inadequate) is that growing up poor affects child development and a child’s readiness to learnvi, and, by extension, the knowledge needed to pass 11-plus tests, get good SATs results and achieve benchmark GCSEsvii. For me, the obvious solution is to end the 11-plus and establish local school partnerships working at the heart of local culture.

Back to reality: the effects of child poverty, the 11-plus and lack of investment in our area have not been addressed. The secondary modern school in Mablethorpe has suffered, and 60% of parents have chosen not to send their children to it; the school is now earmarked for closureviii. I think school closures might become more commonplace if grammar schools are permitted to become academy sponsors in Multi-academy Trusts and then seek to break away from weaker schools such as the one in Mablethorpe, which is in a federation with Louth. I strongly feel that if grammar schools are to become sponsors in Multi-academy Trusts they should be willing to work much more locally, to save weaker schools from closure and to prevent children being bussed for miles – we could call this “The Coastal Challenge”.

Post-16 education also presents a problem on the Lincolnshire coast. Grammar schools provide some of the nearest sixth forms for A-levels but if a pupil fails to get good GCSEs, given our isolated location, they are likely to face a considerable journey and costs to get to a college that might not provide suitable courses.

Cuts to the local authority’s budget are likely to be exacerbated by selection. Transport is provided to schools within two Designated Transport Areas: one with free, non-means-tested transport to grammar schools, the other with concessionary, means-tested transport to non-selective schools. To qualify for transport, the school must be more than approximately three miles from home. But if your child fails the 11-plus and your catchment school happens to be coasting, so you have to send them elsewhere, you will have to pay, even if the better alternative school is located next to the nearest grammar school.

At post-16, better-off students leave grammar school if they don’t get the grades but can afford to drive to college. Meanwhile, in light of the abolition of the Education Maintenance Allowance, poorer students have to make do with whatever concessions colleges are still able to offer.

In summary, the notion of educational apartheid should not be understated. I think that middle-class professionals whose children fail the 11-plus should make common cause with working-class and unemployed parents whose children also fail.

Alan Gurbutt, parent, former school governor (SEN) and member of Comprehensive Future’s steering group, 2015

i ‘Stark Child Poverty Figures in Mablethorpe and Sutton on Sea Are Revealed’, 2013, http://www.louthleader.co.uk/news/local/stark-child-poverty-figures-in-mablethorpe-and-sutton-on-sea-are-revealed-1-4833216, accessed 17 December 2015.

ii ‘Lincoln, Boston and Skegness Named as Most Deprived Areas in the Country’, Lincolnshire Echo, http://www.lincolnshireecho.co.uk/Lincoln-Boston-Skegness-named-deprived-areas/story-28011239-detail/story.html, 2015, accessed 17 December 2015.

iii I. Barker, ‘Waiting for a Sea Change’, TES, 29 March 2013, https://www.tes.com/article.aspx?storycode=6326724, accessed 18 December 2015.

iv S. Malik, ‘Free School Meal Pupils Outnumbered 4:1 by Privately Educated at Grammars’, The Guardian, 8 November 2013, sec. Education, http://www.theguardian.com/education/2013/nov/08/grammar-schools-admit-more-privately-educated-children, accessed 17 December 2015.

v ‘Secondary Moderns Must Have a Voice, Too | Schools Week’, http://schoolsweek.co.uk/secondary-moderns-must-have-a-voice-too/, accessed 17 December 2015.

vi B. Whitener, ‘Income Levels Affect the Structure of a Child’s Brain, NIH-Funded Study Shows’, (23 April 2015).

vii ‘Narrowing the Gap in Deprived Areas of Lincolnshire’, (2010), http://archive.c4eo.org.uk/narrowingthegap/files/ntg_lincolnshire.pdf, accessed 18 December 2015.

viii ‘Consultation | Monks’ Dyke Tennyson College’, https://www.mdtc.co/consultation/, accessed 17 December 2015.

T4 Unresolved issues of the PISA OECD tables.

The OECD PISA tables of international performance now dominate world education and are treated as infallible by media and politicians inside the Westminster Bubble and beyond – politicians across the globe now seem to regard the data as having biblical status. The next tranche of the three-yearly studies is due in late 2016. While PISA has some uses, other surveys, particularly the TIMSS and PIRLS studies, may be more valuable. Whatever the role of international surveys, they should be treated with great scepticism: caveat emptor should apply. It does not, and so there are considerable risks in using these data; any data-driven approach risks doing major damage to children and education by turning schools ever more effectively into exam factories. This article, written at the time of the last published PISA tables, sounds a note of caution that will need to be developed as 2016 unfolds. TF.

———————————————————————————————————————————————–

Educational Policy: what can PISA tell us?

Harvey Goldstein

For over a decade OECD has been promoting its Programme for International Student Assessment (PISA) as a robust means of comparing the performance of educational systems in a range of countries. The latest results of tests on 15 year olds will be published early in December and the British government, along with many others in Europe and elsewhere, will be bracing themselves for news about their relative position in the international league tables. What has often been termed ‘PISA Shock’, or more accurately ‘PISA Panic’, has accompanied past releases and politicians of all persuasions, in many countries, have used the ‘evidence’ about movements up and down the tables to justify changes to their own educational curriculums or assessment systems. So Finland, which consistently comes towards the top, has been held up as a model to follow: if you come from the right you are likely to emphasise the ‘formality’ of the curriculum to justify ‘traditional’ curriculum approaches, and if you hail from the left you can find yourself pointing to the comprehensive nature of the Finnish system to justify reinstating comprehensivisation in England. The reality, of course, is that we simply do not know what characteristics of the Finnish system may be responsible for its performance, nor indeed, whether we should take much notice of these comparisons, given the weaknesses that I shall point out.

I don’t want to go into detail about the technical controversies that surround the PISA data, except to say that there is a growing literature pointing out that it offers a vastly oversimplified view of what counts as performance in reading, maths and science. Research shows that countries cannot be ranked unequivocally along a single scale and that they differ along many dimensions. Thus, in a comparison of France and England, my colleagues and I were able to show that different factors were at work in each system. This is further complicated by the way the systems are differently structured, with up to a third of pupils in French schools repeating a year at some stage, compared to very few in England.

There is good evidence that the process of translating the PISA tests from one language to another is problematic so that there is no assurance that the ‘same things’ are being assessed in different educational systems. Detailed analysis of the International Adult Literacy Survey has shown how much translation can depend upon context and in many cases that it is virtually impossible to achieve comparability of difficulty for translated items. PISA does in fact attempt to eliminate items that appear to be very discrepant in terms of how pupils respond to them in different countries. The problem with this, however, is that this will tend to leave you with a kind of ‘lowest common denominator’ set of items that fails to reflect the unique characters associated with different educational systems.

Most importantly, PISA is a one off ‘cross-sectional’ snapshot where each 15 year old pupil in the sample is tested at one point of time. No attempt is made (except in a few isolated countries) to relate pupil test scores to earlier test scores so that progress through the educational system can be studied. This is a severe handicap when it comes to making any ‘causal’ inferences about reasons for country differences, and in particular comparing educational systems in terms of how much they progress over time given their attainments when they start school. Often known as ‘value added’ analysis, this provides a much more secure basis for making any kind of causal attribution. OECD has in the past refused to implement any kind of ‘longitudinal’ linking of data across time for pupils, although this may be changing.

PISA still talks about using the data to inform policymakers about which educational policies may be best. Yet OECD itself points out that PISA is designed to measure not merely the results of different curricula but to make a more general statement about the performance of fifteen-year-olds, and that such performance will be influenced by many factors outside the educational system as such, including economic and cultural ones.

It is also worth pointing out that researchers who are interested in evaluating PISA claims by reanalysing the data, are severely handicapped by the fact that, apart from a small handful, it is impossible to obtain details of the tasks that are given to the pupils. These are kept ‘secure’ because, OECD argues, they may be reused for purposes of attempting to make comparisons across time. This is, in fact, a rather feeble excuse and not a procedure that is adopted in other large scale repeated surveys of performance. It offends against openness and freedom of information, and obstructs users of the data from properly understanding the nature of the results and what they actually refer to. Again, OECD has been resistant to moving on this issue.

So, given all these caveats, is there anything that PISA can tell us that will justify the expense of the studies and the effort that goes into their use? The answer is perhaps a qualified yes. The efforts that have gone into studying translational issues have given insights into the difficulties of this and provided pointers to the reservations which need to be borne in mind when interpreting the results. This is not something highlighted by OECD since it would somewhat detract from the need to provide simple country rankings, but nevertheless could be valuable. The extensiveness of the data collected, including background socio-economic characteristic of the pupils and information about curriculum and schools, is impressive, and with the addition of longitudinal follow-up data could be quite valuable. What is needed, however, is a change of focus by both OECD and the governments that sign up to PISA. As a suitably enhanced research exercise devoted to understanding how different educational systems function, what are the unique characteristics of each one and how far it may be legitimate to assign any differences to particular system features, PISA has some justification. If its major function is to produce country league tables, however, it is uninformative, misleading, very expensive and difficult to justify.

The best thing to do when the results are published would be for policymakers to shrug their shoulders, ignore the simplistic comparisons that the media will undoubtedly make, and try to work towards making PISA, and other similar studies, such as TIMSS, more useful and better value for money.

Further Reading

Arffman, I. (2013). “Problems and issues in translating international educational achievement tests”. Educational Measurement: Issues and Practice, vol. 32, pp. 2–14.

Goldstein, H. (2004). “International comparisons of student attainment: some issues arising from the PISA study”. Assessment in Education, vol. 11, no. 3, pp. 319–330.

Harvey Goldstein University of Bristol November 2013
