This morning the news is filled with coverage of a Daily Telegraph story which uncovered exam boards tipping off teachers as to which questions they should prepare their students for, and key phrases they should use.
Michael Gove, the Education Secretary, has ordered a probe into the claims, and Ofqual has released a statement which – rather tellingly – appears to state only that they’re interested in it.
The thing is, I – along with every secondary school and A Level teacher, every student who has passed through the system in the last decade or so, and by the sounds of it Ofqual as well – have known this has been going on for years. The schools have known it’s going on. The examiners know it’s going on. No-one can reasonably expect that the exam boards didn’t also know it was going on.
The exam board Edexcel has stated:
“Examiners’ contracts specifically state that no discussion of the content of future exam questions should ever take place. Any breach of this clear contractual obligation is something we would take extremely seriously, and act on.”
The problem with this statement is that these seminars are laid on by the exam boards themselves. Teachers are encouraged to go along. In fact, in the past (possibly even the present – I’m no longer a teacher so can’t confirm) it was compulsory for schools to send people to these sessions. The original purpose was simply to show classroom teachers how coursework should be graded, so that a level of consistency could be achieved across the country.
These sessions grew to cover all aspects of the curriculum and examination system. Teachers have advised students for decades to look at past papers, and exam boards are notorious for being quite lazy and recycling the same questions. It therefore becomes possible to predict which areas of the syllabus are likely to come up without ever talking to an examiner.
What examiners have been caught doing is adding extra information to this. They have been not only specifying which questions would come up, but also – and more importantly – revealing what keywords examiners are looking for. Schools which pay for their teachers to attend these seminars would be utterly disgusted if examiners didn’t “discuss the content of future exam questions” – not in specific detail, but at least in terms of the styles of answer they are expecting. That is how the examination system has been set up; it is the way exam boards examine; and it is the information schools need to jump through the hoops and achieve the performance improvements that the government and schools inspectorate demand.
This keyword issue is absolutely central to the entire problem. Examiners mark according to a strict scheme, essentially scouring an answer for these key phrases. The system is therefore strongly biased in favour of students who have both excellent English skills and teachers who drill these answers. Worse, the key phrases have little to do with understanding a subject. They make marking easier, but they turn the exam into a test of exam technique rather than subject knowledge. Since when was understanding of a subject well measured by the use of key phrases which even teachers of the subject have to be coached in using?
This isn’t news. Teachers and students alike have been complaining about it for years. The exam boards like to keep their lives easy, though, so nothing changed. Actually, that’s not true. Things did change: they went further and further down the line of key-phrase answering. We are now at the point where exams which supposedly require long answers can be aced by someone who writes down three or four key phrases.
Things went so far down this line that the only way schools could guarantee student success was to train their teachers to train students to answer these types of question in precisely the way that examiners would reward with marks.
I’ve talked about grade inflation before, and this is a very easy mechanism (alongside the grade boundary manipulation I discussed before) by which to achieve the ridiculous system we have now, whereby more students are achieving top grades, while being less and less likely to actually understand a subject.
Universities are now struggling with a population of students who arrive largely unable to study a subject without asking “will this be on the exam?” or “how would I phrase that in an answer?”. Many of them genuinely struggle (and even formally complain) after their first-year exams, when they discover that questions can be open-ended, or that they’re not provided with specimen answers.
We have an education system which measures entirely the wrong things, in a way which strongly favours those who are good at learning by rote, or who have teachers that focus on the likely exam questions, rather than teaching the subject.
The most disgusting thing about this is that it’s likely to be the individual examiners who get the blame, while Ofqual, the exam boards and the National Curriculum remain largely unchanged.
- So my first prediction is that a report will be produced, damning the system. Some people will lose their jobs, and changes will be made to some of the exam structures.
However, nothing will really change very much. I assume that teaching to specific exam questions will be discouraged. That will result in students answering questions differently, and on the whole less closely to the exam board mark scheme. That should lead to a marginal reduction in scores. However:
- My second prediction is that this won’t be reflected in A Level or GCSE results, which will continue to improve year on year regardless of what happens.
If marks continue to go up despite a sudden and unplanned change in how subjects are taught and examined – a change which requires students to prepare less specifically across a broader range of topics – then the protestations that grade inflation is not artificial lose any credibility they might still have had. Because the UK GCSE and A Level curricula and examination systems are completely broken.