ALL London branch
Last update: 27/05/2012
Useful points for discussing / analysing exam data
This page has sample statements which could be used in the context of analysing examination results, e.g. to address requirements to explain pupil progress.
Version 1: if the school takes a ‘prior attainment’ approach
In the context of the national comparability of grades in MFL compared with other subjects (see ALL statement appendix), my pupils make progress commensurate with their prior attainment: the average grade obtained by my pupils is half a grade below the average of their other subjects.
Version 2: if the school takes a ‘relative performance’ approach
My results are consistent with the national average difference in performance between MFL and other subjects (0.5 of a grade), i.e. my pupils gain on average half a grade lower in MFL than in their other subjects.
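The "relative performance" calculation above can be sketched in a few lines of code. This is only an illustration: the grade-point scale and the pupil data below are invented for the example, not taken from any real cohort or from the ALL statements.

```python
# Sketch of the "relative performance" calculation: for each pupil, compare
# the language grade with the mean of their other subjects, then average.
# The grade-point scale and pupil data are invented for illustration.

GRADE_POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1}

# Each pupil: (grade in the language, grades in their other subjects)
pupils = [
    ("C", ["B", "B", "C"]),
    ("B", ["A", "B", "B"]),
    ("D", ["C", "C", "D"]),
]

def average_difference(pupils):
    """Mean of (language grade minus mean of other grades), in grade points."""
    diffs = []
    for mfl_grade, others in pupils:
        other_mean = sum(GRADE_POINTS[g] for g in others) / len(others)
        diffs.append(GRADE_POINTS[mfl_grade] - other_mean)
    return sum(diffs) / len(diffs)

print(round(average_difference(pupils), 2))  # prints -0.56
```

A result of around -0.5 would correspond to the national pattern described above: the language grade sits about half a grade below the pupils' average in their other subjects.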
Arguments and counter-arguments
Below is a quotation from the CEM report which touches on some of the issues often raised in the continuing debate about standards, grades and difficulty. This debate often erupts in August when the exam results are published: a rise in the percentage of students getting grade A, or grades A*-C etc., is interpreted as evidence either of a rise in standards or of "dumbing down". Confusion between norm-referencing and criterion-referencing will always lead to unresolvable argument. Language itself is full of interpretation and connotation, and we need to be sensitive to these. Equally, one person's "standards" are another's "pedantry".
Robert Coe, the report's author, said in a TES interview that it was unlikely that these factors [better teaching, motivation, time allocation] explained all of the differences.
It is worth considering some of the issues through example and counter-example, or through analysis over time. In each case where it is being argued that a particular reason explains why language grades are more "severely graded", it is also worth enquiring whether any quantitative analysis shows that the factor in question systematically leads to a subject being more severely graded. Inevitably, there is none! The interpretations in the following paragraphs are more subjective and personal, but there is an underlying analysis.
Time: It is argued that languages get less time than XXX (any of: i) English, Maths or other subjects, together with the issue of time at KS3; ii) what the QCA assumed; iii) what they get in other countries).
Counter-arguments: subjects such as BS get no time at KS3, and virtually all subjects get the same time at KS4. German and Spanish are more often than not a second language, start later and get less time, yet the gradings for those subjects are virtually identical to French. The disparity in grading goes back to before the QCA, and it is unclear what evidence was used to match time to criteria. In other countries, time for non-English languages is often similar across the ability range to ours, and teachers of those languages despair about the lack of interest in their subject.
Teaching and Learning (QCA example - Jim Knight's response). It is argued that the problem lies in the quality of teaching and learning in ML. Response:
There is a false logic in the QCA's position of implying that you should tackle teaching and learning INSTEAD of tackling severe grading (with its implicit assumption of criticising ML teachers relative to teachers of other subjects). The key point is that BOTH teaching and learning AND severe grading need to be tackled. ALL subjects should be focusing continuously on improving teaching and learning. Languages is uniquely disadvantaged because of severe grading at GCSE.
Urdu (QCA example): To bring in Urdu as an apparent comparator is surprising, given the number of native speakers taking the exam and the minimal number of students who will have been taught Urdu from scratch in school.