ALL London branch
Last update: 27/05/2012

Useful points for discussing / analysing exam data

This page has sample statements which could be used in the context of analysing examination results, e.g.:
Analysing results

To address requirements to explain pupil progress.

Version 1: if the school takes a 'prior attainment' approach
In the context of the national comparability of grades in MFL compared with other subjects (see ALL statement appendix), my pupils make progress commensurate with their prior attainment, i.e. the average grade obtained by my pupils is half a grade below the average of their other subjects.

Version 2: if the school takes a 'relative performance' approach
Compared with the national average difference in performance between MFL and other subjects (0.5 of a grade), my results are consistent with this, i.e. my pupils gain on average half a grade lower than in their other subjects.

Arguments and counter-arguments

Below is a quotation from the CEM report which raises some of the issues often aired in the continuing debate about standards, grades and difficulty. This debate often erupts in August when the exam results are published: a rise in the percentage of students getting grade A, or grades A*-C, etc. is interpreted as evidence either of a rise in standards or of "dumbing down". Confusion between norm-referencing and criterion-referencing will always lead to unresolvable argument. Language itself is full of interpretation and connotation, and we need to be sensitive to these. Equally, one person's "standards" are another's "pedantry".
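The half-grade comparison used in the sample statements above can be sketched as a simple calculation. This is a minimal illustration only: the grade-point scale (A* = 8 down to G = 1) and the pupil data are assumed for the example, not taken from this page or from any national dataset.

```python
# Sketch of the 'relative performance' comparison: for each pupil, take the
# difference between their MFL grade points and the mean points of their other
# subjects, then average across the cohort.
# The grade-point scale and the sample pupils below are assumptions for
# illustration, not data from this page.

GRADE_POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1}

def mfl_relative_performance(pupils, mfl_subject="French"):
    """Average of (MFL grade points - mean grade points in other subjects)."""
    diffs = []
    for grades in pupils:
        mfl = GRADE_POINTS[grades[mfl_subject]]
        others = [GRADE_POINTS[g] for subj, g in grades.items() if subj != mfl_subject]
        diffs.append(mfl - sum(others) / len(others))
    return sum(diffs) / len(diffs)

# Hypothetical cohort of three pupils:
pupils = [
    {"French": "C", "English": "B", "Maths": "B"},  # 5 - 6.0 = -1.0
    {"French": "B", "English": "B", "Maths": "A"},  # 6 - 6.5 = -0.5
    {"French": "A", "English": "A", "Maths": "A"},  # 7 - 7.0 =  0.0
]
print(mfl_relative_performance(pupils))  # -0.5: half a grade below other subjects
```

A negative result of about -0.5 would match the national pattern described in the statements: pupils averaging half a grade lower in the language than in their other subjects.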
Robert Coe, the report's author, said in a TES interview that it was unlikely that these factors [better teaching, motivation, time allocation] explained all of the differences. It is worth considering some of the issues through example and counter-example, or through analysis over time. In each case where it is argued that a particular reason explains why language grades are more "severely graded", it is also worth asking whether any quantitative analysis shows that reason leading to a particular subject being systematically "severely graded". Inevitably, there is none! The interpretations in the following paragraphs are more subjective and personal, but there is an underlying analysis.

Time
It is argued that languages get less time than:
i) English, Maths or other subjects (plus the issue of time at KS3);
ii) what the QCA assumed;
iii) what they get in other countries.
Counter-arguments: subjects such as BS get no time at KS3; virtually all subjects get the same time at KS4; German and Spanish are more often than not a second language, start later and get less time, and yet the gradings for those subjects are virtually identical to French. The disparity in grading goes back to before the QCA, and it is unclear what evidence was used to match time to criteria. In other countries, time for non-English languages is often similar across the ability range to ours, and non-English language teachers in other countries despair about the lack of interest in their subject.

Aptitude
It is argued that you need a particular aptitude to succeed at languages. This would make languages similar to music, art and drama, but, significantly, those subjects actually appear on the other side of the "difficulty" spectrum to languages. Also, studies such as CEM's showed a lower correlation between subjects such as music and art and the "mainstream" (which endorses the "aptitude" argument for them), whereas MFL shows a good fit (the same as History).
This effectively undermines the aptitude argument as a reason for "severe grading" in MFL.

Motivation / being an "option"
It is argued that if pupils are choosing a subject, they will do so because they wish to study it and will be better at it. This is where the situation with MFL over the last four years offers rich pickings for analysis, as the subject has moved from being "nearly compulsory" to "often optional". It is often a local decision how optional MFL has become and over what timescale. Appendix 1 offers a simple analysis showing that the exam boards have not acted consistently in what is a very difficult time to make judgements on standards. It is not surprising that complaints about fluctuations and inconsistencies in grading have exploded as a result. The very fact that some exam boards are making some adjustments in some languages undermines the argument whichever way round it is being put!

Teaching and learning (QCA example - Jim Knight's response)
It is argued that the problem lies in the quality of teaching and learning in ML. Response: there is a false logic in the QCA's position of implying that you should tackle teaching and learning INSTEAD of tackling severe grading (with its implicit assumption of criticising ML teachers relative to teachers of other subjects). The key point is that BOTH teaching and learning AND severe grading need to be tackled. ALL subjects should be focusing continuously on improving teaching and learning. Languages are uniquely disadvantaged because of severe grading at GCSE.

Urdu (QCA example)
To bring in Urdu as an apparent comparator is surprising, given the number of native speakers taking the exam and the minimal number of students who will have been taught Urdu from scratch in school.

Links to press articles: