Written evidence submitted by Amanda Pike


I am an experienced secondary mathematics teacher with a 1st Class Mathematics degree and have spent many years teaching in the London Borough of Bromley and more recently tutoring GCSE and A level mathematics students.

The planned assessment scheme is seriously flawed, and the abuses it permits, and even requires, will result in able students losing places on A Level courses or at universities, or, worse still, being accepted onto courses where their prior attainment would have fallen below the required levels but an inflated prediction masks that fact. This will cause resentment amongst students, a loss of credibility for the teaching system and a waste of talent for the nation.

Based on my experience and insight into how teaching in many schools in my borough is actually conducted, I will explain these concerns. I also have suggestions to make this process fairer for the students and, in the long term, to provide an opportunity to ensure higher standards in teaching and greater reliability of outcomes between and within schools.

GCSE concerns

As a tutor I now observe great inconsistencies in standards of GCSE marking and assessment between schools. Of the at least 20 schools in my Bromley borough on which I have a view, I would suggest that only 3 set regular written homework from which any useful grade can be ascertained. In line with more recent policies, the students may receive ticks, occasionally a useful comment and perhaps a colour (Red/Amber/Green). (Independent schools generally provide the most detail.) The majority of local state schools, including grammar schools, rely on online services such as MyMaths, which only assess final-answer accuracy and have no means of inspecting methods. Many schools rely on self- or peer assessment, where the students are the experts!

There is a huge variety in the standard of summative mock assessments. Most schools require all students to take the full 3 GCSE papers or 2 IGCSE papers, usually the most recent. Some do not, and instead create their own bespoke papers, leaving out topics not yet covered, or, worst of all, set and extrapolate from only 2 of the 3 papers, either because of timetabling constraints or, worse still, to lessen marking workload. Some produce topic-level analysis for each student, but these schools are rare.

Marking is frequently inaccurate, and teachers rarely seem interested when students suggest they have been deprived of method marks. Often, because the mathematical knowledge of many teachers is weak, they fail to recognise an alternative but correct approach to a question which would have been acknowledged in the full mark scheme.

Many schools have begun to teach higher-ability students content for the AQA Further Maths GCSE. Very few schools have made significant progress through the additional content. Many have not even given their students a mock exam. How will teacher assessment be reasonably applied to this qualification?

Some recent mocks, particularly those completed just before the announcement, were marked by the students themselves. There were many errors in this rushed process (usually under-marking), and the students then took the scripts home. Any scores recorded by staff in this situation would be unreliable.

Teacher retention issues mean that it is rare for GCSE students to have been taught by the same teacher for even 2 years. There may be 10 maths classes in a large comprehensive, and many teachers are inexperienced. How will the exam board trust their evidence?

I would suggest that these are not isolated events.

Much of my time as a tutor is spent providing the additional learning, both content and techniques, missing in maths teaching today. I always feel confident that the majority of my students will go on to exceed teacher expectations. How will this progress be factored in for the next few months?

I would suggest that any process to accept teacher-predicted grades should demand some evidence from the school of their assessment and marking policy, and award the most generous interpretations to those schools that can clearly demonstrate physical evidence of good practice, comprehensive and challenging mock exams, and a past history of providing accurate predictions for their students at various ability levels. How have marking and assessment been monitored? How did previous years’ predictions relate to actual outcomes? What practices (e.g. Saturday/Easter/revision support offers) could have influenced improvements from mock outcomes? Schools should also demonstrate attendance levels at these stages and the content delivered at each ability level.


A Level concerns


The same inconsistencies apply to A level. It will be a very difficult task indeed to predict Maths and Further Maths grades for students, not least because the vast majority begin the course with at least a level 7 at GCSE and, as we can see from last year’s grade boundaries, many found the final papers very demanding indeed.

My particular concern is that schools will inevitably simply predict a grade for a student which is as close as possible to their required UCAS offer, where relevant. If they do not, there will be a significant number of complaints and appeals. Resitting the exam will be the only recourse, but many students will have given up on further study now and may have limited access to support later in the year. This seems wrong.

How will students resitting the year obtain a predicted grade if they are not in a school or college setting but are simply being tutored?

Many have ceased studying seriously now and may arrive at university unprepared, with a further impact on their studies. Surely a delayed university start date, allowing the opportunity for students to prove themselves in some form of simplified, late public exam, would be fairer?

Again, there is a huge variety in the standards applied to summative assessments and mocks. It is particularly evident that students struggle to retain applied content from year 12 and have performed very poorly indeed in their year 13 mocks. Many schools lack experienced teachers to deliver mechanics in particular, and this seems especially evident for Further Maths. Yet schools are predicting A* for students who have failed to exceed a B in their maths mock exams. Surely this should not happen?

Many schools have yet to finish teaching the full content. Few state comprehensives have covered the most challenging integration topics or completed the teaching of applications. Perhaps an exam in the autumn term, including some pure year 12 and 13 topics, would be a sensible option to consider, ensuring that university-bound students are motivated over the coming months and ready for the next stage of their education. Given recent government indications, it is probable that the university term may be delayed, giving you this opportunity for meaningful assessment. Together with a teacher-generated prediction, this would provide some differentiation for university administrators.

Again, I would suggest that schools should have to demonstrate the details of the content of their assessments and marking. I think you may find it difficult to find many schools that could supply students’ marked homework demonstrating constructive feedback and grading. The fantastic detailed solutions provided by, for example, Edexcel mean that many teachers rely on students to work independently and correct their own work (often copied), which does not give an accurate picture of student abilities. The use of mini whiteboards for AfL (assessment for learning) seems rarely embedded, particularly with inexperienced teachers.

Hopefully the pandemic we are experiencing now will be an isolated event, but I would suggest that in future schools should ensure that a student evidence portfolio is kept (as many more subjective subjects require), and perhaps some unused invigilators could spend time assessing these evidence requirements.

Many students will have worked hard and deserve the most aspirational and supportive interpretation of their final grade. Some, however, either because of weak teaching standards or a deterioration in their effort, will not have made good progress in these demanding courses. My worry in this instance is that they may be awarded a grade that does not reflect their ability, or indeed their knowledge, and arrive at university completely unprepared for what is to come.

I obviously care very much that we improve the standard of maths education in this country, as the recent changes to examinations have aspired to do. It would be a great shame, as further collateral damage from this pandemic, if the exam boards were to relax their standards.

I would be happy to contribute further in any way that you consider helpful.


September 2020