Science and Technology Committee

Oral evidence: Practical science in schools, HC 1270
Monday 12 May 2014

Ordered by the House of Commons to be published on 12 May 2014.

Written evidence from witnesses:

       Ofqual

Members present: Mr Andrew Miller (Chair); Jim Dowd; Stephen Metcalfe; Stephen Mosley; Graham Stringer

Questions 1-113

Witnesses: Professor Julia Buckingham, Chair, SCORE, Dr Sarah Main, Chief Executive Officer, Campaign for Science and Engineering, Professor Ian Haines, Executive Secretary, UK Deans of Science, and Malcolm Trobe, Deputy General Secretary, Association of School and College Leaders, gave evidence.

Q1   Chair: Can I welcome our witnesses this afternoon? There is rather a lot going on in the House at the moment, so some of my colleagues will be popping in and out. No discourtesy is intended; it is just the nature of this place.

As you know, we previously did a piece of work on science practicals. Given recent statements from Ofqual, we thought this was an appropriate time to follow up and invite a number of people to answer a few questions. Just for the record, I would be grateful if the four of you would introduce yourselves.

Professor Buckingham: I am Julia Buckingham. I am here as the chair of SCORE, and my day job is Vice-Chancellor of Brunel University.

Professor Haines: I am Ian Haines. I am the Executive Secretary of UK Deans of Science.

Dr Main: I am Sarah Main. I am Director of the Campaign for Science and Engineering.

Malcolm Trobe: I am Malcolm Trobe. I am Deputy General Secretary of the Association of School and College Leaders, and a former secondary school head teacher.

 

Q2   Chair: I just want to get on record the reasons why we are here today from your point of view. Can you explain to us what it is about practical experiments that is so vital for science education?

Professor Buckingham: I can try. It is very important to understand that the sciences are practical subjects, in the way that art and music are practical subjects. We take science forward by testing ideas, developing hypotheses, doing practical experiments and evaluating that information, so it is an integral part of the whole science process. In teaching science, it is critically important not just to teach students how to manipulate equipment, how to take measurements and so on, but to get them to think scientifically and understand how we go about analysing things, and how we evaluate evidence and take information forward.

Practical science is something that really engages and enthuses young people. They get incredibly excited about it; it appeals to their curiosity and their ability for inquiry, and it is very often that which drives them to take science further. At a time when we are desperate to get more people to do science and go into science, it is a critical part of ensuring that we have a pipeline of enthusiastic young people to go into science.

Professor Haines: It has almost all been said. If you talk to well-known scientists and listen to Nobel prize-winners, they always say there were two things that caused them to become scientists. One is the excellent teacher they had at some point, and the other is playing with science. If it were not for the fact that part of science is almost magical, there would not be the sale of so many chemistry sets, electronic sets and all the other things.

 

Q3   Jim Dowd: You hear this all the time: people always remember a good teacher who inspired them to go into things. You said that a good science teacher can encourage people to take up science and scientific subjects. What about evidence of the reverse?

Professor Haines: This is one of the problems, isn’t it? I have always tried to encourage people to ask the other question, which is, “Why didn’t you go into science?” Those people are very difficult to get hold of at a later stage. One meets people who always say, “I was never very good at science,” but when you ask why, it seems to connect with the fact that at some stage they had difficulty with the mathematics of science in chemistry and physics. Quite often, they seemed more comfortable with the biological side—I am now talking about older people to a significant extent—which they found somewhat more descriptive. Good biology is not descriptive in any sense at all. I think that is the big issue. Another issue has been about good science laboratories and good technical support in schools, as well as the question of good teaching.

Dr Main: The Ofsted report “Maintaining Curiosity” in 2013 asked what motivated people to do science and what turned them off, and practicals were precisely the key element in that. It showed very clearly, from talking to students, that the experience of experimenting, discovering and asking your own questions really enthused students, and the lack of it turned them off.

Malcolm Trobe: I support what Ian said about mathematics. Mathematics is extremely important. There is a lot of negativity there. It seems to be acceptable for parents often to say, “I’m not very good at maths.” Schools are working very hard to change that culture. Maths sits behind a lot of the science, particularly in physics and chemistry, and there is an awful lot of statistics in biology. Most of the key points have been made. Practical work is at the heart of science—of science teaching. There is the old phrase “I do, and I understand.” Practically doing things really enhances understanding. There are a lot of competitive subjects out there for young people nowadays; there is the excitement of media and other areas, and science teachers have to take on the challenge of making science very interesting for young people.

 

Q4   Chair: Professor Buckingham, you made comparisons with music as an example. It seems to me that it would be a pretty odd music curriculum that did not have any practical testing in it. Is that your view of science practicals?

Professor Buckingham: Yes, absolutely. Practical is an integral part of science, and in assessing science we need to be assessing both practical and theoretical knowledge and understanding.

 

Q5   Chair: Are there any logical reasons you can see that would justify the practicals not being part of the final assessment?

Professor Buckingham: No, because we need an assessment of the person’s ability to do science, and doing practical science is an integral part of doing science.

Dr Main: To draw on the music analogy, I clearly remember having to do a written practical paper, of the kind in these proposals, to draw the light path through a microscope with reflecting mirrors and so on. That kind of information has its place, but only in the same way that having an understanding of how pressing the keys on a piano makes the hammers hit the strings has its place in helping you to be a pianist. It is helpful additional information, but in no way does it help you experience what it is like to sit at a microscope, twiddle the wheels, mount a slide without breaking the glass and try to focus on the wretched thing you are looking at. It is very different.

Chair: Does everybody else agree with that? Okay.

 

Q6   Graham Stringer: How would you describe the state of practical science in schools at the present time? Is it good, indifferent, poor?

Professor Buckingham: It is probably very varied. SCORE did a project last year which looked at the provision of practical equipment and the investment in consumables for science. The data were quite alarming. A very high proportion of schools did not have sufficient equipment of the type that the SCORE panel felt they should have. Very worryingly, the investment in consumables in secondary schools ranged from 9p per student per year to about £31 per student per year, so the experience must be extraordinarily varied across the piece. The data for primary schools were even more depressing.

Malcolm Trobe: I agree with that. I think you will find varied practice. You only have to look at the state and range of school buildings as well to know that there are a significant number of new schools with well-fitted laboratories, but also schools which have poor laboratory provision. There is also the technician side and the ability to provide technical support for science, so funding does come into it.

It is interesting to note that there is no financial advantage to schools in getting students to do A-levels in scientific subjects, because effectively all subjects are funded at the same level, although obviously it costs significantly more to resource science laboratories and equipment than to run traditional classroom subjects, if I am allowed to use that term: mathematics, history, English literature and so on.

 

Q7   Graham Stringer: That is very interesting, because universities have more funding per student for science subjects. I had not realised that; it is a very interesting point. It is a long time since I did A-level physics and chemistry. You got a paper; you were asked to do an experiment and write it up, and it was marked. It looked like the rest of the A-level paper. Can you describe the difference between the paper I did more than 40 years ago and the situation now, and what Ofqual is proposing, and the advantages and disadvantages of those three systems?

Malcolm Trobe: I taught A-level chemistry until about five years ago. I taught Salters chemistry, where the A-level assessment was by project, in which students had to research and develop a project.

 

Q8   Graham Stringer: Over a period of weeks, or months.

Malcolm Trobe: Over a period of weeks. They were set a title; they had to do the research, and come up with a practical investigation and a work plan for doing it. They would then go on to do the practical work involved. They would have to amend what they were doing; obviously, things do not always work to plan. They would amend it and look at their results and the data, and analyse the data. They would then look to draw conclusions from that, evaluate the effectiveness of their methodology and write it out in a full report, so it would be an effective full-scale practical investigation. That is for the A-level. In AS, it was a much more routine set of practicals, looking at basic manipulative, measurement and observational skills.

 

Q9   Graham Stringer: What are the advantages of that compared with a two and a half to three-hour invigilated exam, as it was when I did it, and compared with what Ofqual are proposing at the moment? What you are proposing may be exciting and very enjoyable to do, but it is fiddleable, isn’t it?

Malcolm Trobe: We have to look at what is best in terms of motivating and developing student skills within the context of a reliable and robust assessment scheme. All of the work is moderated externally, although the results are being looked at and the practical handling techniques are being assessed by the teachers. One would expect teachers to examine those skills and relate them directly to the marking criteria, which are there. On the motivational aspect of doing that work and the sense of achievement youngsters can get, I have had students at university come back to say how good their feedback from their university tutors has been about their skill base, and about their ability to work practically in context and go on to develop their own projects.

Dr Main: Many examinations in schools are internally assessed, with moderation from examining boards, so that is not unusual. There are plenty of ways to make sure that it is robust. Julia mentioned the range of practical teaching and examination on offer. Some of the best-case scenarios are the types of extended project qualifications you are talking about, which are used in the technical baccalaureate and in some examining boards at A-level. They discriminate very well between all the students, and they are assessed by teachers and moderated by the examining board. I think that is an excellent model to look at; it works well. There are other models. Some of the concern raised has been about the bunching of marks when practicals are done in schools. I believe, and the organisations I represent, which have written to me, also believe, that the solution to that is to improve the assessment of those practicals, not the new proposal, which in the view of many people does not address that problem.

Professor Haines: You were talking about a three-hour practical exam. You were fortunate, because when I did my A-level, even longer ago than that, the Welsh Joint Education Committee made you do it all day. That is a way of doing it. The great argument against a one-off practical exam is that, if the student makes a mistake on that day, it is written off and that is the end of it. Although there are advantages to a one-off practical exam, it is better if it can be done in a different way. I am convinced that there is a way of setting criteria and assessments such that you can grade the student’s result rather than give them a pass/fail, for example, but you will probably want to move on to that particular issue later.

Professor Buckingham: I would support all that has been said. I am very supportive of the concept of an extended project, because it really does encourage students to think and learn scientifically, and that is what I hope we are trying to achieve. We also need to bear in mind that, whether we like it or not, assessment drives learning and teaching, so however we design our assessment is going to determine the way our teachers teach and our students learn. For me, assessing them in a truly scientific way is absolutely critically important, and when they come to university that is what we are looking for.

 

Q10   Stephen Metcalfe: Is it possible to get a good grade in science A-level at the moment without being proficient at practical science?

Professor Haines: In one word, yes.

Professor Buckingham: Yes.

Professor Haines: It is very interesting. The UK Deans of Science had our spring residential meeting last week and we discussed this issue, because people knew I was coming to this meeting. Somebody, as academics usually do, threw out the question: what if you had somebody applying for a place and they had straight As at A-level in the sciences but they’d failed the practical? I confess that initially there was a very long silence; nobody wanted to say what they would do. The vast majority of them then started to hint that they would not worry about the practical. As we discussed it further, there was a simple reason. They said exactly what has already been said today, which is that very frequently students come without any real skill in any practical technique at all. At the moment, people are accepting them at a point where they probably are a fail on the basis of any sensible criteria, so nothing would have changed really. That is not to say that they don’t also emphasise the importance of practical work and the fact that students should be able to achieve certain levels before they come to university.

 

Q11   Stephen Metcalfe: That is the motivation that lies behind trying to improve the current situation. There is perhaps not enough assessment or registering of the practical teaching of science. Is your problem with what is being proposed by Ofqual the decoupling of the two grades, as opposed to them being combined in some way?

Professor Buckingham: Decoupling is a real concern, because, if universities or employers do not understand it and do not use it, there is a very real danger that it will not be seen as valuable by either teachers or students and we will see a progressive downgrading of the amount of practical taught in schools. Particularly worrying, as I mentioned earlier, is that we have evidence that the provision of practical facilities and consumables is very variable across the piece. Inevitably, it is the less good schools that have the lesser facilities. We are all trying to widen participation. This could of course have a very adverse effect on widening participation.

There is also concern that, while we are potentially devaluing practical, our international competitors are looking to increase it. We need to think very hard about how our young people are going to compare on the international side. As we said earlier, practical is something that enthuses students; it gets them to want to do science, so if the practical is devalued and decreased, we will have fewer people wanting to do science, and that is a real risk.

 

Q12   Stephen Metcalfe: Do you think this would devalue the practical aspect more than it does now when, as you said, you can get a good qualification without having any practical skills?

Professor Buckingham: I have concerns that the separated practical mark will not be taken into account by universities or employers; they may not understand it. If it is not graded, it is also very difficult to know whether someone is, frankly, totally useless or absolutely brilliant.

Malcolm Trobe: What we are identifying is a significant risk. It has already been said that assessment and, to a certain extent, accountability measures drive what goes on in the classroom and the laboratory. If you are not putting into the assessment a good, clear assessment of practical abilities, potentially it is going to be downgraded in the eyes of teachers and students as well. Students are often very clear in their minds what they need to do to achieve the grades they need to enter higher education, and they focus on that. Unless you are putting in a practical emphasis, there is an inherent risk effectively of reducing the amount of practical work.

We are not talking just about universities; we are also talking about the needs of British industry. When a lot of work was done under the previous Administration on the science diploma, I was very interested to hear what a number of companies—Wellcome, AstraZeneca—were saying about the dearth of higher-level technicians in the science industry and the need to ensure that we recruit, perhaps not at degree level but, moving on from level 3 qualifications such as A-level, that higher-level technical side of science for which people need a significant level of practical skills. There is an inherent risk related to that as well as for universities.

 

Q13   Stephen Metcalfe: Maybe I am not listening properly, but I am struggling to see how the proposals are worse than what we have at the moment, and why they would be ignored. You at least have a separate mark for practical, so it has to be taught. Why is that worse than the situation we have at the moment where you might not have done any practical?

Dr Main: My understanding is that for most examining boards about 20% to 30% of the overall A-level grade is in practical assessment. I think reflections on how well that prepares students for university have already been given, but there is practical assessment in A-levels. My concern, which I think is broadly shared, is that because schools are motivated by assessment criteria and league tables, for example, if they are responding to pressure to perform on league tables, whether that is progression to A-level or whatever else, they will allocate their limited resources to the areas of schooling that will give them the best return for those resources. My grave concern is that, by uncoupling practicals from A-levels, resources will be diverted away from those practicals and will be put primarily into the things that result in A-level grades.

 

Q14   Stephen Metcalfe: Presumably, you raised all these concerns as part of the consultation process. Do you feel you have not been listened to, or that those conducting the consultation did not understand your concerns?

Professor Haines: I think that the response of Ofqual to the consultation was made very clear. They did understand exactly what was being said. I think they have presented a solution, but it is not the solution that most people would prefer.

If I may come back to the question you asked earlier, at least at the moment students are trying to get a very good grade in practical. We could get into the issue of the mechanism by which they get it, which is often inappropriate because they are trained, if you like, to do just one thing and get it even more right until they get a high mark. This system will be in danger of generating something where there is no challenge to the student, except to get a pass—nothing better than a pass. One has a feeling that, regardless of how much or how little extra practical there might be as a result of the other parts of the proposals, there will just be a tick in a box to say they have done that—no need to try to do it better, no need for them to compete with one another or with themselves to get a better and better grade.

Professor Buckingham: As you say, we have a situation at the moment which, in the view of many of us, is far from ideal. We are moving into a situation which, certainly in the view of SCORE and my colleagues along the table, is equally, if not more, undesirable. We in SCORE would have liked considerably more time to see a new system properly piloted and evaluated and compared with other systems, both nationally and internationally, before we leap forward into a completely new way of assessing, which is going to impact the lives of a large number of young people and could have much wider consequences for society.

Malcolm Trobe: I support that.

Dr Main: We represent quite a broad spectrum of the science and engineering community. It is hard for me to convey to you the vehemence with which many people have written to me with the answer “no” to your question; they felt they weren’t heard. The response from our poll gives some indication of the range of positive and negative comments about the proposal. It is fairly self-evident that a large cross-section of the community did not support the proposal as it was presented. One fundamental reason for that, as Julia said, is that there is no evidence of what effect this proposal will have. It has not been trialled or tested; there is no evidence to support it as a new system, or of what impact it will have.

Professor Haines: We agree with Mr Gove’s idea of making A-levels more ambitious and stretching the most able. It is impossible to see how a pass/fail system, wherever it is, is going to stretch the most able.

 

Q15   Jim Dowd: Professor Haines, you were moving towards this in the replies you just gave Stephen. If we are to promote practical skills in science and encourage them, and Ofqual’s approach does not meet with your approval—you will not be surprised to know that public administration is bedevilled by people telling you what not to do, but there is a dearth of people telling you what you should do in a practical sense—what would be an optimum approach to the issue?

Professor Haines: I can conceive of a menu of practical work that all students would be expected to do, with some of them taken out to be assessed, in a fixed period of time—time-controlled—with appropriate assessment criteria, so the students have several chances to get things right over several different practicals. I do not believe it is impossible, with the right training of whoever does the assessing, and the right work on the criteria for assessment, to produce something that can be applied across the piece without the problem we have at the moment of questioning whether every teacher in every school has carried out the assessment in the right way. I can think of one or two trivial examples—I am a chemist, so I think of them in terms of chemistry—that could be done in that way, where it would be quite clear what the criteria were for a pass, a grade E, D, C, B, A, or however it was decided it would be done, or a number, which would probably be the way one would do it if the result was integrated with a single A-level practical grade.

Dr Main: The key word would be “pilot.” You could pilot a few ideas on how to improve the system. It would probably be sensible to try to uncouple a few factors: one is the quality and breadth of the practical experience students have, and the support and training required for teachers to deliver that. The other one is the validity and reliability of assessment. The third one is the issue of grading, and whether you grade it pass/fail or give a grade as part of an A-level. It would probably be broadly welcomed by the education system and the science and engineering community if we tried to improve the practical experience, but did it on a small scale, perhaps with individual schools, or working with different examining boards, which already have different methods, to work out which method will lead to the greatest experience of practicals going forward. I would suggest that delaying implementation by a year to 2016 instead of 2015 might give time for that to happen, and would also realign the changes with other GCSE and A-level changes for 2016.

Earlier, we mentioned extended project qualifications, which are used quite broadly in the technical baccalaureate and other vocational qualifications. They are also used by some examining bodies within A-levels, so even just looking at the data available at the moment about what works for different examining boards and the different methods they use would be a starting place.

Malcolm Trobe: The speed of change is an issue. Had we had another year we would have been able to do that work. There is international experience. New Zealand has a similar approach to project-based work in terms of their assessment of science. Even though this decision has been made, we hope that Ofqual would now move into a phase of looking at and piloting how assessment could be done better, and how it could be reintegrated, we hope, into a full assessment grade.

Professor Buckingham: We would support that very strongly. It would be well worth looking at what happens in the IB, where practical contributes 20% of the assessment and much of it is done by an extended project. Many universities are certainly very fond of the IB. Wellcome and Gatsby are at the moment doing a lot of research into practical, and SCORE is planning a project over the next year to 18 months in which we will look at teacher-led assessment.

 

Q16   Jim Dowd: If it is just a question of the pace at which this is pursued, I assume you must agree that the direction of travel is acceptable, because if you are going in the wrong direction you really do not want to do it this fast, or indeed at any speed at all, do you?

Professor Buckingham: We agree that there is a need for change, and SCORE has certainly very much welcomed the review of A-levels. What we are concerned about is the nature of the practical assessment and how we can have the best possible assessment for our young people, and for UK plc. We are not convinced that the separation of the practical grade is the right way forward.

 

Q17   Jim Dowd: But isn’t that rather like every party, every year—every sensible individual—saying they want the population to be as healthy as possible?

Professor Buckingham: No, it is not. If we are having a change, in my opinion, it should be properly evaluated and piloted before it is introduced.

 

Q18   Jim Dowd: The private sector only caters for a small minority of students in this country, but it is my observation that they tend to major on issues like science and use as a selling point the fact that they have well-equipped laboratories in particular. Are state schools adequately resourced and positioned to provide a decent practical experience for A-level students?

Malcolm Trobe: There is significant variation in the funding of state schools around the country. Obviously, the level of funding depends on the area your constituency is in. The average level of funding for a 16 to 18 year-old is about £4,600. I think in the independent sector it is between £13,500 and £14,500 per student. That makes a significant difference in terms of their ability to fund practical work in science, and resource it through laboratories. As I said earlier, there is significant variation across the country. A lot of new schools with excellent facilities have been built, but there are also a number of schools with very poor facilities.

 

Q19   Jim Dowd: Mr Trobe, you mentioned New Zealand. Who across the international spectrum, or the world, is getting this particular approach right? Who is doing best?

Dr Main: It is interesting that in this whole investigation the community, as represented here by a handful of us, is being challenged to come up with new proposals. I would have thought that the onus should rest on Ofqual and the examining boards to come up with well-evidenced proposals and for them to look at the international community. Some people have written to me and said there are good examples in the United States and South Korea, particularly regarding geology and fieldwork, but I do not feel that some people having written to me in the last week is sufficient to provide evidence to your Committee. This is part of what Julia has been referring to. To design something better than what is there at the moment will require at least some assessment of what is out there in the UK and internationally. It seems to me that that sits with Ofqual.

Professor Buckingham: There was a very good paper written towards the end of last year by Abrahams, Reiss and Sharpe of the Universities of York and Leeds and the Institute of Education, which addresses that question very well and reviews the international spectrum. It is quite long, but it is certainly worth looking at.

 

Q20   Chair: Were there any particular countries that led the field?

Professor Buckingham: It is clear that a lot of countries, particularly ones that do well in the PISA league, are placing great emphasis on practical. It is difficult to tease out particular ones.

 

Q21   Chair: How do they measure their practicals?

Professor Buckingham: They measure it in many different ways, but it is integral to the exams.

 

Q22   Chair: I see; it is integral to the grading system.

Malcolm Trobe: For example, in Australia, all the states have different systems of assessing, so it requires a detailed study of exactly what all the various countries are doing. New Zealand was one that I picked out from Reiss’s study; it has a broadly project-based assessment model, but from what I can understand, it is highly successful in that country.

 

Q23   Chair: If there is no international comparison being made by Ofqual to justify their position, who is actually backing the Ofqual proposition? Can anyone answer that?

Dr Main: The data put out with the response indicated that some of the positive support for the proposal had come from the examining bodies, but I understand that two of the five examining bodies were not supportive. My understanding of what Ofqual have reported is that just over half of the current examining bodies are supportive. Throughout this, the question in my mind is who gains from the new system? Who is the winner? I do not think that at the moment it is the students or the science and engineering community in the workplace or in academia, so I wonder who it is.

Malcolm Trobe: We all accept that the qualifications system must be reliable and robust, but it is interesting to note that the Ofqual report recognises that the majority of respondents were in favour of keeping everything under one grade. Ofqual have moved against the majority of the respondents.

 

Q24   Stephen Metcalfe: I think you agree that there is a need for some change, and the motivation behind these proposals is good and proper. I hear what you say about delays, grading and pilots. I am always concerned about pilots, because students get only one opportunity to do the exam, and if you find that the pilot has not worked, you have potentially disadvantaged those students, so we have to get this right. My first question was going to be whether you are sure that what is being proposed will not achieve the desired result. You have probably answered that pretty comprehensively.

Dr Main: I would just interject that no one can be sure because there is no evidence; no one knows what will happen. It feels to me like a lurch.

 

Q25   Stephen Metcalfe: Your feeling is that it will not deliver improved practical teaching of science, but that is a gut feeling rather than an evidence-based conclusion, because they cannot evidence that it will.

Dr Main: I am very concerned about it.

Professor Haines: If we come back to the business of stretching the most able, which I think on this side of the table we all agree with, it is very difficult to see how this will do it. Ofqual or others may come back and say, “Ah, yes, but the written examination is going to be such that part of it examines students’ ability to look at the results of practical and explain practical.” That does not in itself show whether a student is good or bad at practical work; it shows only that they are good or bad at a certain aspect of applying that practical work to the understanding of science. It patently is not set to stretch the most able candidates, and the UK Deans of Science feel that is the biggest issue here.

Malcolm Trobe: I think we are basing it on a judgment of risk. What you are getting is our judgment—also my judgment—in terms of the risk of the impact on teaching and young people and the risk to the provision of high-quality scientists at all levels.

 

Q26   Stephen Metcalfe: If these proposals go ahead as they are, unchanged, who will be responsible for making them work? Which group of people is the most important?

Malcolm Trobe: Teachers.

Professor Haines: They are the ones we rely on to make the best of whatever job they are handed. And I should add the technical staff who support them.

Professor Buckingham: And the governors who are responsible for ensuring that schools are properly resourced—or that the resources are allocated properly.

Malcolm Trobe: We have to look at the accountability system, because that puts a significant amount of pressure on teachers.

 

Q27   Stephen Metcalfe: Who should be examining the examiners—examining that accountability system? Who should be in charge of that? Who should be conducting the assessment of the impact this would have?

Dr Main: I have already said that Ofqual and the DfE are responsible for the proposal they are putting forward. I would say that makes them responsible for monitoring how effective it is.

 

Q28   Stephen Metcalfe: If you could say one final thing about this issue for the record, what would it be? What message would you like to send clearly to Ofqual and the Minister, both of whom are going to appear before the Committee this afternoon?

Dr Main: I would say that the proposal is a huge gamble with the lives of individual students in this cohort, and with the future science and technology work force. We are making the assumption that uncoupling from A-level grades will lead to diversion of resources away from practicals because of pressure on schools. I cannot put my hand on my heart and say I know what the outcome will be, but I think that assumption is fair and reasonable in most people’s eyes; it is likely to lead to the diversion of resources away from practicals. If so, that will have the greatest impact on the schools and students who are most stretched for resources. I think that is a terrible shame. It also goes against the Government’s current ambition to increase our science and engineering work force, the Your Life campaign, launched last week to increase the diversity of students taking science and engineering subjects, and Ofsted’s own report, “Maintaining Curiosity”, which says very clearly how important practicals are at GCSE and A-level for keeping students interested in science.

Professor Buckingham: I would reiterate every word of that, and also say that it is a policy without any evidence to support it.

Professor Haines: Remember that interesting, exciting, practical work in the right sort of laboratory stimulates interest in science, and the opposite kind of practical work does the exact reverse.

Malcolm Trobe: Think again. It is not an irreversible decision. Go out, do some more research and come back with a better answer.

Chair: That sounds like some of my school reports: “Could do better.” Thank you very much for your attendance this afternoon.

 

Examination of Witnesses

 

Witnesses: Dennis Opposs, Director of Standards, Ofqual, Glenys Stacey, Chief Regulator, Ofqual, and Janet Holloway, Head of Reform, Ofqual, gave evidence.

Q29   Chair: Good afternoon. Could I welcome you here this afternoon, and thank you for sitting in on the earlier session, which I think helps to inform some of the exchanges. For the record, I would be grateful if the three of you would introduce yourselves.

Glenys Stacey: My name is Glenys Stacey and I am the Chief Regulator at Ofqual.

Dennis Opposs: I am Dennis Opposs. I am Director of Standards at Ofqual, and in the past I taught science in schools.

Janet Holloway: I am Janet Holloway, Head of Reform at Ofqual, and I also used to teach science in schools.

 

Q30   Chair: Out of interest, can I ask whether any of you read our previous report?

Glenys Stacey: For my part—I am sure my colleagues will speak for themselves—I read your 2011 report very carefully. Indeed, Dennis Opposs gave evidence to you at that time.

 

Q31   Chair: Did any of our recommendations influence your thinking?

Glenys Stacey: Yes, they did. If you recollect, in recommendations 10 and 15 in particular, you expressed concern about the need for experimentation and lab skills to be properly tested. You proposed that students’ understanding of the experimental process should be examined properly, and we had very real regard to that.

 

Q32   Chair: Because my information is largely based on media reporting of what you actually said, and my instant reaction when I read one report was, “This is daft,” I want to give you a clear opportunity to convince me that it was not daft. I am firmly of the view that practical knowledge of how science works is equally as valuable as theoretical, yet it seems that the proposals to decouple, as you heard earlier, will, in the eyes of many, not achieve the goal you are seeking. Can you convince me that it is not daft?

Glenys Stacey: Thank you for giving me the opportunity. I will give it my best shot. We are making changes to A-levels, the sciences in particular, first to embed the new content agreed by Government. If you have a chance to look at that, you will see that the new curriculum requires greater emphasis on testing and experimentation by exam, but there is also a list for each of the science subjects of a dozen or so practical skills that must be taught and assessed. If you like, we are putting into effect regulating for a more specific—

 

Q33   Chair: Can I just get it clear in my mind? Assessed beyond pass or fail? Grade?

Glenys Stacey: Not necessarily.

 

Q34   Chair: Why not? This is the bit I fail to understand.

Glenys Stacey: If I can explain what we are doing, we are taking that content and embedding it in a qualification—A-level—and in assessment arrangements that we judge are best, or most likely, to produce valid and reliable outcomes and also, as far as possible, the right educational outcomes for those students, so students are more likely to go to university with the right range of practical bench skills that universities themselves through research have said they most value. What we have been wrestling with is not that these skills are not important or not central to science; it is how best to regulate for the design of an assessment that is most likely to achieve the valid outcomes everyone wants, and also most likely to ensure that most students study a greater range of practical skills than—frankly—they do at the moment.

 

Q35   Chair: Do you think this approach will result in schools and parents putting greater or lesser weight on science education?

Glenys Stacey: I do not think it will result in lesser weight being put on science education. I say that for several reasons. One is that 15% of the marks in the examinations will be allocated in each of the exam boards to the student’s understanding of experimentation in its wider sense, including data analysis and experimentation design. Frankly, it will no longer be possible for students to obtain the best grades in any of the science subjects unless they have sufficient understanding and experience of experimentation. Parents will, hopefully, regard that well.

It also means that the grade from the examinations will be more valid and reliable than the current ones. It will not be obfuscated, confused or made less valid or reliable because of any particular practices that happen at the moment around experimentation and the assessment of it by non-exam techniques.

 

Q36   Chair: If you have a student who is heading for good grades—high grades—and their parents and teachers see that they will get a high grade come what may, surely they are going to put less emphasis on the practical skills, because all they are going to do is pass or fail.

Glenys Stacey: Students will, as I have said, only be able to achieve the highest grades in these qualifications if they have sufficient experience of experimentation in every sense. They are also required in each of the science subjects to do a minimum of a dozen determined practical skills. They will have to do those; there is no avoiding it. We will require a record of those things. We will require live moderation over the two years of the qualification from exam boards in a targeted way. We will require evidence per student that each of those practical skills has been undertaken and assessed. It is not that the practical skills in our proposal are optional for students; they are required. Therefore, if I were a young scientist with ambitions to study at university, I would be looking forward to a good deal more experimentation than I experience at the moment.

 

Q37   Chair: You heard the professor from the UK Deans of Science say that this will not stretch students. How can you convince him—or me—that that is not true?

Glenys Stacey: First, what has beleaguered this debate for many months—we started the consultation in October last year—is that, although there is a common view and agreement that things do not work well at the moment, none of the solutions in play at the moment, including the extended project, can be said to work sufficiently well across the vast majority of our schools. One of the issues we have all grappled with is that there is no immediate happy solution.

Secondly, we are looking to promote the right educational outcomes. We are looking to promote more science work and more bench work in schools, basically making it more difficult to avoid, should a teacher, school or student wish to do so. There is now a basic requirement in relation to a dozen skills. If you are a well-motivated teacher, an enthusiastic scientist, surely the way of doing that is to integrate your science teaching, so you are demonstrating and teaching students to learn these practical skills and be at ease with them through experimentation. We will be making sure that that happens through moderation and in discussions with Ofsted through the checks and balances they can bring to bear.

 

Q38   Stephen Metcalfe: As I said to the earlier panel, we all agree there is a need for change and that the motivation behind the changes is right. We have the right reasons, but when you went out and consulted about decoupling, the majority of respondents disagreed with your view. Can you tell us who supported the proposals?

Glenys Stacey: Yes, I can. First, some teachers at the chalk face supported them. One or two eminent scientists recognised the intractable issues with assessment; Jon Osborne, for example, was one of those who did that. Indeed, the Abrahams and Reiss research that Julia quoted earlier recognises the inherent problems with indirect, as opposed to direct, assessment of skills. I think there was a mixed picture. Many leading scientists prefer solutions such as the extended project, but we have the difficult task of squaring the promotion of science with manageability in the vast majority of schools, not simply a few, and also of being sufficiently sure of valid and reliable outcomes. We are trying to get three things—three boxes ticked, if you like: to promote good science and good learning, and for it to be manageable in the vast majority of schools. Remember that some of them—OCR, for example—have centres with 300 students doing chemistry, so the extended project gets very problematic. We are also looking for sufficiently valid and reliable outcomes.

Often the engagements we have had recognise the problem and the need to promote good science thinking, but the solutions get much thinner when you are looking at whether it is manageable across the range of centres and schools we are talking about and whether it will produce sufficiently valid assessment outcomes. That is really where the problems lie.

 

Q39   Stephen Metcalfe: What I draw from that is that all the major scientific organisations who are opposing these changes are looking at this from a different angle from you. Is that right? They want a different outcome from the one you are trying to achieve.

Glenys Stacey: That is right. We all have a common interest in ensuring that practical work is at the heart of science teaching, and that we develop people’s interest, enthusiasm and ability in science and in bench skills. There is no distance between us there at all.

I suspect that we are in a slightly different position from others because we are looking across the evidence base and what we have seen by way of different solutions. We obviously have to focus on issues such as lack of discrimination in some of these assessments and manageability. These are our statutory requirements; we need to do that. I think it is right to say that a good number of the leading science groups—for example, SCORE, Wellcome and Gatsby—all presented to us different proposed possibilities for solutions and combinations. We set those out for you in our memorandum to the Committee. The solution that we have come up with takes a good number of those things—for example, log books and increased testing, experimentation and examination—but one or two of the things proposed are, I suspect, the real rubbing points. One is the extended project. We cannot see that it is manageable or valid across all schools, so we did not go for it in the end. The big issue, which I quite understand, is that we have determined to take practical skills out of the main grade, but to get them assessed as pass or fail. I quite understand that that is a significant challenge and change. I believe we have set out the reasons why we have done that: the accountability pressures and the experience we now have, across a wide range of schools, of the mark-chasing and grade-chasing that happen when the practical is within the assessment. We are trying to liberate it—to take the shackles off it—to enable proper science teaching, freed of the notion that you chase grades. Frankly, the evidence is that you chase grades even in the extended project, because you are assessing an extended project by a record of it—an indirect assessment which Reiss and Abrahams argue against, as do we. Good teachers know not only how to teach science but which words students should use in their assessment to make sure they tick the boxes for the mark scheme.

I think there is a common understanding of the issue, and a common requirement and desire to liberate science teachers to enable them to teach our budding science students well, but when it comes to how best to do that, we are very keen to ensure that we have good valid outcomes. We did not think that some of the solutions presented to us would do that sufficiently well.

 

Q40   Stephen Metcalfe: You heard what the previous panel said when I asked them what message they would like to send to you. There is obviously huge concern about the changes. They talk of danger, risk and potential damage to the current cohort—no evidence; think again. Regardless of whether or not you think you are right and they are wrong, or vice versa, would it not be better for science overall if there was more unanimity on this issue and you could find some common ground where you did agree? At the moment, it strikes me that we are damaging the perception of science and the way it is taught, whatever the outcome.

Glenys Stacey: One of the heartening things about the debate in the last month or two has been the willingness of even those science groups that react most adversely to our decisions to work with us to make sure we can implement these proposals well and, indeed, to evaluate them as they run. We welcome that wholeheartedly.

It is important for us now to regulate well, and promote as far as possible the outcomes we all want to see. For example, it was unknown to those commenting on our consultation, until we agreed it recently, that we will require all exam boards to dedicate 15% of their marks to experimentation. Unlike in the past, when there was a little bit of freedom of interpretation, we are really buttoning this down across all exam boards. We have a strengthened accreditation process that will check that in action. We require exam boards to set out their assessment strategies as well, which will show us how they intend to do that. For our part, we can regulate well and keep right on top of this.

For the practical skills requirements, we are for the first time ever going to require exam boards to adopt a common approach. One of the issues in the past has been the freedoms that exam boards have enjoyed to interpret these requirements. It will no longer be so. There will be a common approach to checking, moderation and setting out those requirements. We have an opportunity. They are not coming in until first teaching in September 2015, so we are very keen to talk to science teachers—and, yes, the interested science representative groups—to make sure we design and agree the checks and balances so that this really does happen.

Just as a small thing, we have taken the logbook proposal. How do we make that logbook really integrated into science teaching? In my day, you had to write out your science experiment, draw it and explain it in a way that another scientist could replicate it. That has been a bit lost, and we have an opportunity to get the logbook right. The checks and balances can really help. The success will be in the detail.

We also have an opportunity to pilot, as was discussed earlier. We will work with exam boards to pilot these arrangements to make sure we can get them as right as possible, but there is more work to be done, not just by us but by others, to prepare schools for teaching practical science to the extent that they are now required to do. The equipment issue, which you were exploring earlier, is one that we need to pick up, but not all of these things are solely within our gift. We need to be talking with others to make sure that both equipment and timetabling work as well as possible. It is not simply our controls over exam boards and schools.

 

Q41   Stephen Metcalfe: What would success look like? What would be your definition of a successful change? How long will you give it before you decide whether it has been successful in achieving the results you wanted, or that you would think again?

Glenys Stacey: I would like to be clear that we are committed to achieving better educational outcomes in science A-levels. That is what we are looking for. We will be doing all that we can to make sure we achieve those. Like all assessment, this is a relatively long game. For us, better outcomes would, for example, in time be feedback from higher education that those getting to university have a wider range of practical skills than they have at the moment. I was very struck by John Holman saying that one of his chemistry students said that she had done only two practicals in the two years she had studied, and some of the other students agreed. That feedback loop is really important, but we need to let these run. In the meantime, we will be making sure that they run to best effect while they are running, so that is about the assessment strategies being right; it is about accrediting qualifications only to the right standards, with good moderation checks and the data analysis that we can do. For example, what correlation do you expect to see between the pass/fail result and the exam grades? We will be right on top of that.

We will be working with Ofsted to see what checks and balances and live information they can give us, and, yes, we are very interested in what research we can run live to see how these things work in practice. We are quite a young regulatory body, but we have learned some hard lessons. One of them is getting out to schools, which I personally do a lot. Incidentally, I can tell you I have had enough experience of sitting in A-level science labs with teachers and students together telling me that the current experience is simply stultifying. I hope that over time when we get out to schools we hear a better story.

 

Q42   Jim Dowd: I think I now have a clear impression of the power structure of Ofqual, but I would be grateful if you could explain, as briefly as you can please, why you need both a chief regulator and a chief executive.

Glenys Stacey: I am the chief regulator and the chief executive; I am one and the same. I was employed as the chief executive, but already by statute it was determined that the chief executive would become the chief regulator accounting to Parliament.

 

Q43   Jim Dowd: Right. The biog I had in front of me did not make that clear. It said that you had moved from being chief executive to chief regulator.

Glenys Stacey: Only because the statute required it.

 

Q44   Jim Dowd: That’s fine. Your proposals were very prescriptive. There was no menu of options; just one approach was adduced and sent out into the world. Was this the only route—the only solution—you looked at?

Glenys Stacey: We looked at a range of options and the evidence we had of how the range of approaches tried in the past had stood up. None of them stood up sufficiently well for us to promote them as a solution. The guidance we follow for structuring our consultations is that where we have a preferred solution we should state it, so we did.

 

Q45   Jim Dowd: On the decoupling, as it has been called this afternoon, where was the evidence that separating the practical from the theoretical—although there is a heavy overlay of theoretical on the practical—would lead to more and better teaching of science in our schools?

Glenys Stacey: In coming to that decision, two particular things influenced us. One is the evidence that under the current arrangements, where those assessments contribute to the grade, distortion occurs. Forgive me if I get this wrong, but, for example, in AQA chemistry, which I think is the biggest provider, there are 40 marks for practical assessment. The most common marks are 38, 39 and 40. Of those, the most common mark is 40. That suggests to us, particularly when we set it against the range of marks in the examination, that the practical marks do not correlate sufficiently with examination performance. The evidence was, first, that a good number of so-called tried-and-tested approaches were not working in practice. Secondly, we had over 800 responses to our consultation, and there were a good number of independent responses from teachers.

 

Q46   Jim Dowd: How do they break down, roughly, for and against your approach?

Glenys Stacey: I think 52% were against decoupling, as you call it, but some of those were proposing written examination, so it was not a straightforward “against” on every aspect of what we were proposing. Coming back to the teacher evidence, there were some telling submissions from very experienced and well-placed teachers about what is happening in schools. It is very persuasive, and so is being out there seeing it.

 

Q47   Jim Dowd: The Minister for Science and I were both on a Committee together. He said that because somebody said this went too far and somebody else said it did not go far enough, he felt it was right. I asked him—Hansard records it—whether his approach to matters was that, if nobody agreed with him, he must be right. Is it your approach?

Glenys Stacey: I do not think it is Ofqual’s approach at all. These decisions, as perhaps you know, are made by the Ofqual board, which in the main is seeking to make evidence-based decisions. The board has on it experienced teachers and heads, as well as others. We are always trying to make the best possible decisions. On occasions, that has meant that the board made decisions which are deeply unpopular. Why is that? It is because we are where we are with the way qualifications and examinations are designed at the moment, and to get them to a place where they best suit their purpose and produce valid outcomes we often challenge the way things are done in schools.

 

Q48   Jim Dowd: How much ministerial engagement and involvement was there? How closely did you work with Ministers before deciding to pursue this path?

Glenys Stacey: We did not seek Ministers’ opinions or advice.

Jim Dowd: Very wise.

Glenys Stacey: This is an independent regulator.

 

Q49   Jim Dowd: It is all your own work.

Glenys Stacey: It is.

 

Q50   Jim Dowd: There was no other prompting, other than the general proposition.

Glenys Stacey: The content for these A-levels is set by Government. We are looking at how best to embed that content in these A-levels. You will have seen the list of practical skills, so we have to take that content in the round and say, “Is that good enough in demand?” and all the rest of it. When we have it, we have to design the best qualification and assessment rules that go with it. That is the relationship between us. Dennis, do you want to say anything on that?

 

Q51   Jim Dowd: As you have taken all the trouble to come here, you are welcome to speak.

Dennis Opposs: One thing it is worth being very clear about is what we are separating out. It is not really separating science theory from science practical. We have said that a lot of science practical is now going to be embedded in the syllabuses and has to be taught. Some of that practical—certainly skills like planning investigations and data analysis—is going to be assessed in the exam. There are a small number of skills—the manipulative skills—that you can assess only if you are a teacher observing a student doing them. Those are the ones that can be reported separately. The way our proposal has been characterised in places is rather inaccurate in that sense. It is not that all the practical skills have been separated out and are reported separately.

 

Q52   Chair: You said you were hoping that this approach was going to be evidence-based. You heard us ask the previous panel about international comparisons. Have you any international comparisons to bring into the equation?

Dennis Opposs: We have drawn on some evidence. There was a piece of work done by Ofqual, published two years ago, which was a study of four A-level subjects, including chemistry, where we compared what went on in A-levels with what went on in 20 or so jurisdictions around the world.

 

Q53   Chair: Were these jurisdictions our competitor countries or just random countries?

Dennis Opposs: They were the sorts of countries that are often referred to as higher performing. Because it included chemistry, some of that evidence was about practical skills. Some of that was then drawn into the paper by Abrahams, Reiss and Sharpe, which was referred to by the earlier witnesses, who looked more widely at the state of practical work and practical assessment across the world. They found that some of the countries that do very well in the PISA international tests—they quoted China, Singapore, New Zealand and Finland—all made much more use of direct assessment of practical skills, as they referred to it, which we are saying is the separately reported part. They make more use of that than England and some other countries—for example, Australia or Scotland. There was some evidence to suggest that we ought to be doing more of that.

 

Q54   Chair: Is there a country that fits into that category that is using the methodology you are describing?

Dennis Opposs: According to that paper, in China, there is separate assessment and reporting of that direct assessment of practical skills, so there is one.

 

Q55   Chair: But, in China, there isn’t a single universal education system. You are selectively choosing parts of China. Which parts?

Glenys Stacey: It is fair to say that we are not choosing China.

Dennis Opposs: It is the case that the only part of China that has had its results reported in the PISA analysis is Shanghai.

 

Q56   Chair: That is a successful and important part of China, but it is not the whole of China.

Dennis Opposs: I cannot be certain because it is not my paper, but I think they are referring to something that is used more broadly inside China, so it was not just part of PISA.

 

Q57   Chair: To the best of your knowledge, there are some reports that parts of China adopt this approach. United States? Australia?

Dennis Opposs: No, not as far as I know.

 

Q58   Chair: Anywhere else—Germany?

Dennis Opposs: One of the messages from the work is that, when you look around the world at the way different countries do it, there is no single dominant approach to this. There is a huge variety of different approaches.

Glenys Stacey: You mentioned Australia. As Malcolm Trobe said earlier, each state in Australia does it differently. There are very significant differences in the amount of examined and non-examined assessment, for example. The equivalent body to us in Australia spends a lot of its time trying to reconcile those things within the states of Australia, let alone comparing itself internationally. One important feature here, which is not so dominant in the vast majority of other jurisdictions, and which we all have regard to, is the accountability framework that our schools and colleges are judged by. It is quite unusual.

 

Q59   Graham Stringer: You heard the previous witnesses say that this is a big gamble. What would it take for you to change your minds?

Glenys Stacey: First, we want to see these qualifications run, and we want to put every effort into making sure that the checks, balances and controls are right. Indeed, part of that involves piloting with exam boards. If that piloting found these arrangements to be unworkable, we would be putting up our hands and saying so, but that is not our expectation.

Secondly, if in practice these qualifications do not produce the educational outcomes we aim for, and if we have evidence over time that students are not going to university with the right range of practical skills, we would be looking to advise Government of that, and look to our own arrangements for the design of qualifications. It is the same for any key state qualification. If the evidence is that it is not working well enough, our job is to address that and to make proposals to improve it, and that is what we would do.

 

Q60   Graham Stringer: Can you tell us a little more about piloting? Are you going to introduce it across all exam boards and the whole country, or will there be a small number of students involved to start with?

Glenys Stacey: Could I ask my colleague Janet to speak to this?

Janet Holloway: The proposal is that, when implemented nationally, the arrangements will operate in the same way across all the exam boards, so there will not be an area of competition or of differing practice; there will be consistent practice. The piloting would take place with a small but representative group of centres on behalf of all the examination boards.

 

Q61   Graham Stringer: I am not understanding something. Everybody is doing it the same.

Janet Holloway: Yes.

 

Q62   Graham Stringer: But there is a small number of pilots. Does that mean you are doing a more detailed assessment on them? Usually, a pilot is taking so many people and measuring that against what went on previously, but you are not saying that, are you?

Janet Holloway: I am saying there would be a pilot in advance of the teaching starting, and it would be refining the assessment criteria that will be used by all the exam boards, reviewing the operational arrangements to ensure that they are manageable and identifying any unintended consequences that might arise.

 

Q63   Chair: It is not a pilot; it is a piece of testing.

Janet Holloway: I think “trial” might be a better way of putting it.

 

Q64   Chair: You are going to trial it with the objective of taking out any glitches before rolling it out in its entirety.

Janet Holloway: Yes.

 

Q65   Graham Stringer: Can you tell the Committee what the schedule is for that trial pilot and the 100% roll-out?

Janet Holloway: The schedule is being worked on at the moment and discussed between the exam boards. We do not yet have it, but we will be able to share it with you. The intention is that the trial takes place during the autumn term.

 

Q66   Chair: The coming autumn term.

Janet Holloway: This coming autumn term, and any refinements will be implemented, so that final assessment criteria and arrangements are with the centres no later than spring next year.

 

Q67   Graham Stringer: Would you be doing 100% roll-out, if you were satisfied with the experiment, in the autumn term of 2015?

Janet Holloway: Yes.

 

Q68   Graham Stringer: That is clear. You mentioned previously that you were very bothered about the clustering of results right at the top: 90% or 100%. Is that evidence of excellent students, or is it evidence of malpractice and fiddling?

Glenys Stacey: I hesitate to answer. The plain fact is that, going out to schools, what we are told by students and teachers alike does not suggest that students are enjoying a wonderful experience and gaining the wide range of practical and experimentation skills that they aspire to. It suggests to us that the current arrangements narrow teaching, because schools can to a large extent predict the nature of the assessment that is coming. There are only a limited number of tasks that can be delivered in national standardised tests at more or less the same time. It suggests there is a narrowing of teaching, which no one is comfortable with, and a focus on a small number of tasks and refining performance in those tasks, and also making sure that performance by students is recorded in ways that mean optimum marks are achieved. That is what it suggests.

 

Q69   Graham Stringer: It does not quite answer the question whether teachers are fiddling with it.

Glenys Stacey: I think teachers are in an invidious position.

 

Q70   Chair: You use the word “malpractice” in your document. Where is this malpractice? Set it out; show us the evidence.

Glenys Stacey: There is evidence of malpractice in the administration of A-level science practical. For example, from memory, last summer about 750 AQA students were given estimated grades in relation to their chemistry results, because of known and reported malpractice. That is a worrying number of students who are going to university short-changed in the grade they are given. The thing to bear in mind is that that is only the malpractice that has been reported and is known. We will not always be aware of every incident. Malpractice is at the most hard-edged end of the spectrum, where you are doing something that is plainly wrong. The difficult territory many teachers are in is where they are trying to balance the tension between doing the best for their student and getting the best results. It is not always one and the same thing.

 

Q71   Graham Stringer: It seems to me from what I have heard this afternoon, putting malpractice on one side for a second—there is malpractice—that the incentive for teachers is to teach to the exam and, therefore, pupils are not being taught experimental techniques properly, whereas the previous witnesses were saying that the new system would downgrade experimental science because it was not part of the A-level. Can you help me and the Committee square that circle, because there is a direct and absolute conflict of view about this?

Glenys Stacey: It is difficult. When we press harder to understand the concerns, there is general acceptance that the current arrangements simply do not sufficiently promote experimentation and practical skills. The evidence for that is plain. We know from research Gatsby have done that higher education is able to identify skills that are not sufficiently well developed. The way this thing works in practice at the moment does not do what it was designed to do sufficiently well. That is the first commonly agreed position. Secondly, where there is a greater range of views, it is very difficult to find a way of promoting, encouraging, describing and then validly assessing the practical experimentation skills that we all look for in these students. We have not been able to square that circle, and nothing has been presented to us that will enable the proper teaching of those practical skills, and a valid, robust and manageable assessment that can contribute to the grade. That is the nub of it.

 

Q72   Graham Stringer: We have also heard from both panels that in some schools sixth forms are under-resourced when it comes to experiments. How can you differentiate between under-resourcing that can lead to students going to university without the necessary skills and your criticism that this is teaching to exams? How do you separate those?

Glenys Stacey: I am not a science teacher; my colleagues have been and perhaps they can comment as well. From what I have seen in schools, if you know that your students are likely to be assessed on a range of six to eight tasks, you only need to resource up for that, so presumably that is what you would be doing.

Under the new proposals, we have a minimum of a dozen different practical skills. Just sticking with chemistry, which has been the flavour today, they range from titration using a burette or a pipette through to the use of paper or gas chromatography. In the list of skills now you know you have at least those dozen skills and can see pretty well the equipment and kit you will need, and presumably you can thump the table in the senior management team as head of science and be very plain that, if you want to aspire to the best grades and good practical skills, this is the kit that is going to be required. In a way, it puts teachers of science in a stronger position both to identify a greater range of equipment that they will require and to require it. Dennis may have a contribution to make.

Dennis Opposs: The only point I would add is that for the first time we have criteria which will govern the new syllabuses. We have had criteria for many years, but these are the first science criteria in which the skills are actually spelt out. In the past, it was not absolutely clear what practical skills people were looking for. That will now be clear in the syllabuses.

In addition—I am not sure this was mentioned at all in the previous session—there will be a requirement in each syllabus for 12 particular experiments to be conducted, which relates to what Glenys was saying. Far more will be required up front—schools will know that if they are teaching a particular subject that is what they have to teach—in a way that was not there before. In that sense, they should be more aware of what they will need to teach in terms of practical science, and similarly they will know how they have to equip themselves to do that.

 

Q73   Graham Stringer: Your evidence base for wanting to change is really analysis of the results of the practical tests at the present time. If virtually all students who got grade C chemistry A-level passed the practical exam, would that mean you thought that the new system had failed? Will the new system require more people to be employed in the examining bodies?

Glenys Stacey: First, the success or fail measure is really the feedback from higher education. This is to be a valid qualification. We will want the feedback from higher education to tell us whether or not students are coming to university better equipped than they were. If there is a strange correlation between grades obtained in the examination and the results of the practical skills assessment, we will want to understand what lies behind it, in the same way that we now interrogate data and information on outcomes and identify anomalous correlations. If there is not a good correlation, we will be looking to understand why, and there will be a range of reasons.

 

Q74   Graham Stringer: What about the number of people employed by examination boards? More or fewer?

Glenys Stacey: We are still working with exam boards to agree the common arrangements for checking. We will not compromise on those arrangements, so if they require greater expenditure by exam boards, so be it.

Chair: Thank you very much for attending this afternoon.

Glenys Stacey: Thank you for giving us the opportunity.

 

 

Examination of Witness

 

Witness: Elizabeth Truss MP, Parliamentary Under-Secretary of State for Education and Childcare, Department for Education, gave evidence.

Q75   Chair: Minister, thank you very much for coming this afternoon. Apologies that we have slightly overrun; I hope you found it useful listening to some of the exchanges. I understand that procedurally we are likely to have a vote at about 20 to or quarter to, so perhaps we can rattle straight on, if you don’t mind. First, can you tell us what your aspirations are for A-level science and convince us that it is not just targeted at high achievers?

Elizabeth Truss: My aspirations are for many more students to study A-level science. In particular, we have just launched a campaign called Your Life, with leading entrepreneurs and businesses, to get more students studying physics, where there is a very low number of girls at the moment, and also studying chemistry and biology. We all know that science is an increasing part of many jobs and opportunities, and if young people do not have the opportunities to study those subjects, it is likely that they will be closing doors that are potentially open to them.

From the evidence I have just heard from Ofqual, as well as the evidence from other people who have appeared in front of the Committee, everybody believes that practical science is one of the main ways of preparing students for university and work after university and going straight into the workplace, whether through an apprenticeship or directly into employment, but also motivating students to carry on with science. The Ofsted report last autumn on curiosity in science was very convincing that the best way of motivating young people is through practical experiments.

 

Q76   Chair: This we totally agree on—no dissent whatsoever from any of our witnesses today. Under the system that we appear to be presented with, we could end up in a position where you could get an A-grade science result without having passed your practical. Is that right?

Elizabeth Truss: The majority of students studying A-levels go on to university. Universities, who are very keen to see better practical experience and are at the moment complaining that students arrive at university without it, will surely be looking at whether the students have passed the 12 practical examinations that take place. There is a very strong incentive for schools and students to make sure that those students have the relevant practical experience. Indeed, it is considerably more practical experience than students will be getting under the current system.

 

Q77   Chair: Have you had any direct discussions with the deans of science education on this proposition?

Elizabeth Truss: I have not spoken to the deans of science education. I spoke to various scientists at universities, while we were drawing up the GCSE plans, and I have also been talking to the organisations responsible for drawing up the content through ALCAB.

 

Q78   Chair: We are particularly engaged in the issue of Ofqual’s current proposition about the measuring and marking of science. Do you think it would be a good idea if you had some formal discussions with the deans before giving the green light to Ofqual to go ahead?

Elizabeth Truss: Ultimately, it is Ofqual’s decision. Ofqual have to make a decision about the best way to assess the reliability and validity of the examinations for which they are the independent regulator. As a Government Minister, it would not be right for me to put pressure on Ofqual if they felt that was the best way of assessing practical examinations. I would concur with what I have heard from Ofqual, though, which is that the current system is clearly not delivering what we all want, so to me there is a clear case for change.

 

Q79   Chair: Again, we are on common ground. [Interruption.] Minister, are you in a position to come back for 20 minutes?

Elizabeth Truss: Yes.

Chair: Perhaps we can suspend the sitting for a maximum of 15 minutes.

Sitting suspended for a Division in the House.

On resuming—

Q80   Chair: Minister, you were responding to my question, which was testing whether it was appropriate to have a system that could, if our understanding is correct, end up with someone getting a grade A who had nevertheless not passed their practical examination. Is that theoretically possible? Is it desirable?

Elizabeth Truss: It is theoretically possible at the moment to pass an A-level without having the practical experience.

 

Q81   Chair: That does not sound like a good idea, does it?

Elizabeth Truss: I would say that it is now much clearer. A university looking at a candidate who has, let’s say, done A-level chemistry and physics will be able to see how the student has done in the examination, including the part that refers to understanding how practicals work. They will also have a grade for whether or not that student has achieved what they need in the 12 practical experiments, so it will be very transparent to a university or an employer whether or not the student has those practical skills. At the moment, that transparency is not available.

 

Q82   Chair: But they could not get an A* today without passing them.

Elizabeth Truss: I would need to look at the grading structure. That is potentially possible, but the point is that universities and employers are the people who are saying that students do not have developed practical skills. They would presumably insist that students had passed those practical examinations to get on their courses; otherwise, what they are saying would not carry much weight.

 

Q83   Stephen Mosley: Ofqual announced the results of the consultation last month. How much involvement did you have in that consultation?

Elizabeth Truss: I have had a number of discussions with Ofqual to understand their rationale. Ultimately, it is Ofqual’s decision about how examinations are regulated, so it is their decision about the best way to ensure validity and reliability.

 

Q84   Stephen Mosley: Was there any push at all from Ministers—yourself or other Ministers—either to do the consultation or to go along with the outcome of that consultation?

Elizabeth Truss: I agree with the decision that Ofqual has made.

 

Q85   Stephen Mosley: One of the issues we heard about is that the Department for Education currently does not have a chief scientific adviser. Without a chief scientific adviser, who within the Department was responsible for looking at the proposals and deciding what your position was?

Elizabeth Truss: Ultimately, what we have done with A-levels is put them under the authority of ALCAB, which determines the content of A-levels. In the short term, it was carried out by Professor Mark Smith in the review of science A-levels. The longer-term proposal is that it sits under ALCAB committees, which will determine whether the content is keeping in line with the latest research in universities, but there is a different discussion about the best content for A-levels—it is very clear that there is a lot of practical experience required in the A-level—and the way it is assessed. The way it is assessed ultimately is down to Ofqual. It is quite a technical decision about validity and reliability and how you assess—is there something a bit funny about this microphone? It echoes.

Stephen Mosley: There is a bit of an echo, isn’t there?

Chair: There is an echo coming from somewhere.

Elizabeth Truss: I feel like there is a ghost in the room.

Chair: It’s your bottle, Stephen.

Stephen Mosley: Is it me? There’s an echo when I speak as well, isn’t there? Keep going.

Elizabeth Truss: I will carry on.

The decision on content is ultimately for the university-appointed bodies to make in the long term, to make sure that standards are high. That is a promise we made when we came into office. We wanted universities to be back in the driving seat as far as A-level content was concerned, but in terms of the way it is assessed, that is a professional decision for Ofqual to make.

We have to recognise that over the years the examination system has become a high-stakes system. Things that might have been possible 30 or 40 years ago in terms of practical examinations are now under great pressure through the school accountability system. That means that Ofqual has to make very rigorous decisions and ensure that those exams are ultimately fair at the end of the course. That is why I think they are best placed to do that.

 

Q86   Stephen Mosley: Are you planning to get a new chief scientific adviser, and when?

Elizabeth Truss: That is a very interesting point, and I will take it back to the Department.

 

Q87   Chair: In the absence of a CSA, do you think it would be sensible for the Department to have a discussion on this issue with the overall Government chief scientist, Mark Walport?

Elizabeth Truss: Absolutely. I will provide you with a list of all the people I have met in discussions about the future of GCSEs and A-levels. I have met a lot of scientists.

 

Q88   Chair: Specifically on practicals, I know that Mark Walport has strong views.

Elizabeth Truss: I certainly will meet him on that subject.

I know this specific inquiry is about the A-level, but we cannot as a Government do everything through the accountability system. The accountability system is good at ensuring standards, but it does not dictate everything that is done in the classroom, and it can’t. We cannot micro-manage classrooms through the accountability system. What I am looking at as Education Minister is how we create the right environment, so that teachers and schools do as much practical work in science as possible. There are other avenues we should be looking at, because through high-stakes accountability, it is very difficult to incentivise that type of behaviour.

I am very pleased that Ofqual decided there should be 12 practical assessments, but to my mind that is just the beginning and not the end. We need a change in culture in the classroom. We already see the very best teachers and the very best schools, who are passionate about science, doing that. My question is how we spread it much more widely. I do not think the accountability system, particularly the performance tables, is the best route to do that.

 

Q89   Graham Stringer: I take it from your answers to Stephen that you think the proposals are good.

Elizabeth Truss: Yes.

 

Q90   Graham Stringer: Given that you think that, were you concerned about the high level and quality of the opposition to the proposals to decouple practical examinations from the final grade?

Elizabeth Truss: Of course we listen to all representations. The people who have raised concerns are, absolutely, scientific experts; Ofqual are assessment experts, so there is always a balance to be struck. What we are talking about is the interaction between schools and teaching and the accountability system and examinations, and we need experts on all sides of that argument working together. I am very keen to work with the deans—the Chair just mentioned them—and organisations like SCORE on how we can improve practical science in our schools, but I do not think that the primary route for that should be the high-stakes examination system. I think there are other routes. For example, the Government fund science learning centres that are doing a lot to put on good examples of practical work that can be used in schools. They will be working around the new specifications from the awarding organisations to encourage best practice.

A lot of this is about school leadership as well. We are now moving to a school-led system where head teachers have much more flexibility to decide what happens in their school with the curriculum. For example, they have the flexibility to expand the amount of timetable time for science. One of the issues raised in the Ofsted report is that, often where schools do not have good-quality practicals, it is because science has been squeezed in the timetable, and maybe there are not the quality and expert science teachers needed to lead those practical examinations. I want to work with those organisations to focus on how we improve the quality of practice. The accountability and examinations system is quite a blunt instrument for trying to do this.

Perhaps I may quote from the Ofsted report about maintaining curiosity in science. This is about key stages 3 and 4, but I think it is very interesting. In terms of teaching how science works under the current system, the finding of the report was: “The majority of the schools visited planned for this, with teachers wanting to use investigative science to teach the content; in general, most did this well at Key Stage 3.” Key stage 3 is ages 11 to 14 before students start GCSE. “However, as students moved to Key Stage 4, preparing for GCSE assessment tasks became all-consuming in many science departments, leading to an ‘atomistic’ approach to teaching the various skills required. These were without a particular purpose other than learning the skill as an end in itself.”

What Ofsted are saying essentially is that at key stage 3, which is quite a long way from the examination, good practical skills were being taught, but by the time of key stage 4, where there are specific practical tasks to be prepared for, it became an end in itself. I would caution against trying to use the exam system to drive all practical teaching of science in schools, because it tends to become more of a teach to the test exercise. As Ofqual said, no one has put forward any way of preventing that from happening.

I think a lot of the voices that have been raised are concerned about the future of practical science, as am I. It is absolutely true that we do not have enough practical science in our schools. I also think that too often lessons in science and maths are not brought to life, and students do not understand why they are relevant in the workplace or in the world at large, so we need to do more to bring science to life and to teach it practically. I am not convinced that further changes to the exam system will deliver that without perverse consequences.

 

Q91   Graham Stringer: This was not a view I came to the Committee meeting with, but to me there appears to be a complete difference of view. What some of the academics we heard from say is that the new system devalues practical science by removing it from the exams, because students value exams more than anything else—that is what they focus on—as against the view of Ofqual that the current system devalues practical science because it is just about teaching to the exam. How do you as the Minister responsible, who wants to get the best science teaching, on which we are all agreed, judge between those two points of view? Is there any level of opposition that would have made you change your mind?

Elizabeth Truss: As you say, the outcome everyone wants is the same. We want high-quality practical science. We have a school system that has been very focused on high-stakes accountability and getting students to a certain level in examinations. Sir Michael Wilshaw and the Secretary of State are very clear that we want schools and teachers to be educating the whole child, to develop a wide range of skills, whether those are practical science skills, the ability to present well, the ability to debate, or the resilience and social skills students will need when they leave school and are in the world of work or university.

I put it to you that not all of those skills can be tested in exams. If you try to do that, you get to a stage where you are micro-managing a school to a huge level. The best education systems in the world have a combination of strong accountability on a few key matters, but a lot of autonomy for schools to develop in the way they see fit. Part of the issue here is about culture and school leadership. It is about a culture of valuing practical science. As a local MP, when I go into schools, I want to see what the school is doing to encourage students to take science subjects at A-level.

We know from the Ofsted evidence that schools that do practical experiments right through primary and secondary school get better outcomes. It is in a school’s interests to do practical experiments and have good practical science teaching, just as it is in a school’s interests to hire high-quality teachers. If students are exposed to high-quality practical work, they are more likely to take science A-levels, which is very beneficial to those students. I think there is a case to be made to school leaders about how to change the culture of science, which is what I am keen to work on with the organisations you refer to. We have just done some focus grouping for the Department for Education on why students are not taking physics A-level. At the moment, only 2% of girls and 7% of boys do physics A-level, which is a pretty appalling figure. The words that came back were: “dull,” “boring,” “male,” “irrelevant,” “glasses” and things of that ilk. It seems to me that the issue with science practicals is part of a broader problem about the perception of science in schools, and the way it is being taught in some schools. The Ofsted report lays out very well what the best schools and teachers are doing. The question is: how do we achieve a culture shift from just teaching the content in a dry way to teaching it in a very lively way? It is about practical application and also linking it to developments in the world.

 

Q92   Stephen Metcalfe: Following on from Graham’s questioning, there is a lot of very serious opposition to this, whether it is SCORE or CaSE or Lord Winston, who publicly castigated the proposals. Do you think that is a communications problem on the part of Ofqual in getting the message across to the wider scientific community about their thinking and their proposals? It cannot be helpful or good for science in general to have learned societies and scientific experts disagreeing on the way we are going to start students off on a potential scientific career.

Elizabeth Truss: Maybe there is some confusion about what we have at the moment and what is proposed. I understand Lord Winston’s concerns. Ofqual have been looking at that and have written to Lord Winston, if I am not mistaken. As the Minister responsible, I am very happy to engage. We are engaging the organisations, many of which are raising this issue, in our general process and campaign, and also linking together the Government’s initiatives on science and maths in a very coherent way, so that from September schools will be able to access high-quality science ambassadors, CPD and excellent teaching resources. This is about schools having access to the best research on what sort of science teaching works and the best resources in terms of CPD materials. That is the way we can help raise the game. Maybe there is an issue about the communication process.

 

Q93   Chair: Minister, on all of those issues we are totally with you. We specifically want you to drill down and deal with the measurement of science practicals.

Elizabeth Truss: Perhaps there is not enough appreciation that the existing system has not delivered in terms of—

 

Q94   Chair: With respect, I appreciate that you were not here at the beginning of the session, but all the organisations we heard in our first session, including SCORE and a representative of the deans of science faculties, disagreed with you.

Elizabeth Truss: But they do not say the current system is delivering high quality.

 

Q95   Chair: Everyone agreed with that starting point, but this is about the recommendations from Ofqual.

Elizabeth Truss: I have not seen a workable alternative proposal that will deliver reliability and validity and what those organisations want. If there were such a proposal out there, I am sure Ofqual would have considered it, but the problem is that I do not think there is one. There is no perfect answer for creating a perfect accountability system, and I do not think there is an alternative out there that is better than what Ofqual have proposed.

 

Q96   Stephen Metcalfe: On what have you based that decision? Who is supporting the proposals as they are currently? Have you taken advice from the chief scientific adviser? You said that you will meet the chief scientific adviser. Does that mean you have not met him yet?

Elizabeth Truss: I have not met him on this issue anyway.

 

Q97   Stephen Metcalfe: Who is supporting it, and why did you side with that side of the argument, as opposed to those who are saying there might be another way?

Elizabeth Truss: Ofqual are the organisation responsible for the decision on this.

 

Q98   Stephen Metcalfe: I realise that, but you must have a view.

Elizabeth Truss: I talk to Ofqual about who they have spoken to and what evidence has been presented. To me, the case that Ofqual makes is overwhelming: there is no alternative proposal that delivers a better result than this. I also want to shift the energies of the many august organisations to how we improve practice in the classroom, rather than focusing on something to which I do not think there is a perfect solution. In the past, when there were practical examinations monitored by awarding organisations and exam boards, that was before we had a high-stakes accountability system. It is very difficult to deliver a consistent practical exam across the country, as I think the chief executive of Ofqual has outlined to you already.

Everybody has the same aspiration to improve practical science. In the exam and accountability system as it is now, we have gone as far as we can, and we should be looking at other avenues for improving practical science in schools. I would be very interested to see what the Committee says about this. If somebody has a fantastic proposal to make this work in a way that does not just mean teaching to the test—let’s face it; that is what is happening under the current system—I, and I am sure Ofqual, will be very interested to hear it.

 

Q99   Stephen Metcalfe: You think those learned organisations and other experts are overstating the case when they say this is a huge gamble; there is danger; it is a downgrade, not an upgrade; there is no evidence; and you should think again. They are just overstating the case, and you are convinced that this is the right way forward to benefit students. Presumably, that’s what we’re all after.

Elizabeth Truss: I think it is not correct to say that the current system is working very well.

 

Q100   Stephen Metcalfe: No one is saying that.

Elizabeth Truss: But no one is proposing an alternative apart from the alternative proposed by Ofqual, so there is no particularly positive alternative being proposed. It is possible to object to things, but it is hard to see what should be done instead if there is no alternative. It is like what people say about democracy.

 

Q101   Stephen Mosley: Assuming all this goes ahead, how will you know whether or not it has worked?

Elizabeth Truss: I think we will know if it has worked by the quality of students’ practical abilities once they leave school. That is how we will know it has worked.

 

Q102   Stephen Mosley: Will there be any measure of that? Who will be doing the measuring, and how will we know?

Elizabeth Truss: At the moment, we have a lot of feedback from universities saying that the quality is not good enough, and students do not have those practical capabilities. I hope that we will hear more positive feedback from universities.

 

Q103   Stephen Mosley: Would Ofsted have any role in it? Would they be looking at how practicals are taught at A-level?

Elizabeth Truss: Ofsted clearly have a duty to make sure there is a broad and balanced curriculum taught in schools and also that there are high-quality outcomes. One of the things I highlighted earlier is that Ofsted have already said that high-quality outcomes are linked to good practical science. You cannot get good outcomes, and you do not get a lot of students going on to study the subject at A-level and enthused about science, without having that high-quality practical education. It is hard to see how a school would be judged to be doing extremely well if it was not getting those outcomes, which generally come from having high-quality practical work.

 

Q104   Stephen Mosley: You said earlier that sometimes league tables are not the best way of measuring the quality of practical work. Will the separate grades that are given be reflected in those league tables at all?

Elizabeth Truss: They will not.

 

Q105   Stephen Mosley: They won’t at all. What incentive will there be for schools?

Elizabeth Truss: The point I was making earlier was that you cannot measure everything through league tables; you can only measure quite narrow things. We want students leaving school with a high-quality broad education, and you cannot measure all those things. If you try to, schools and students get overwhelmed with measurements.

 

Q106   Stephen Mosley: To a large extent, I think league tables indicate where something is not working, don’t they?

Elizabeth Truss: They do.

 

Q107   Stephen Mosley: At the moment, there has been a big push on the league tables, and they are seen as an incentive. Schools see their position on the league table as something to be valued and be proud about, and something that encourages them to push pupils to achieve their best. How will you incentivise them to do practical science if it is not reflected in the league tables?

Elizabeth Truss: At 18, destinations are a key measure for schools. Are students getting into good courses at university as a result of their schooling or their experience at college? Universities want to see high-quality practical science skills. If students are not doing the 12 practicals to a decent standard, they will not get that grade and those places at university, so it is indirectly reflected through the league tables. I think that at 18 schools are judged on where students go. That is a major consideration for students and parents when they look at a school. At 16, it is based much more on the pure performance data. The new Progress 8 accountability measure puts a lot of weight on science. There are special slots reserved in the performance tables for science, and that will encourage schools to invest more time in science and maths education. Certainly, that is the feedback we are getting from schools that are preparing for the new curriculum and accountability system.

One of the issues about A-level science is whether students are being prepared for it through the school system. A lot of the feedback from the Ofsted report is that they are doing experiments in big groups. Students are not necessarily getting the individual practical experience they need before they go on to A-level. With so few students going on to do A-level science, particularly physics, there is an issue considerably before the age of 16. Even if there was a perfect way of assessing science practicals in exams at 16 and 18, my question mark would be: what about the rest of the school system? What about primary school science? What about science for 11, 12 and 13 year-olds? These are the ages at which students often fall in love with science and decide to be inspired by science. I just warn against focusing too much on the end exams at the expense of what is going on through a child’s school career.

 

Q108   Stephen Metcalfe: On falling in love with science, SCORE has done a report that says there is a link between having a good experience in science and the amount of investment made in facilities within a school. Presumably, you agree—do you agree with that? I must not put words into your mouth.

Elizabeth Truss: It is difficult. It is definitely the experience students have—a combination of subject-specialist expert teachers, which is very important, and high-quality practical experience. It is obviously easier to achieve a high-quality practical experience if the school has good facilities. That does not mean it is impossible to do practical experiments with more meagre facilities—schools can think creatively about that—but all other things being equal, good-quality facilities are important.

 

Q109   Stephen Metcalfe: Combining those, a good inspirational or quality teacher and good practical teaching leads to a good experience, and a good experience of learning science is more likely to take you on to studying it at a higher level, or even to pursuing it as a career.

Elizabeth Truss: Yes.

 

Q110   Stephen Metcalfe: If you want to improve satisfaction with science overall and improve that experience, might it be better to spend money on facilities within schools to improve the ability to get a better experience, rather than reorganising exams?

Elizabeth Truss: What we are doing is giving school leaders much more flexibility over how they spend their resources. Science facilities are one of those things. It depends on what facilities exist at the moment, but obviously they are useful resources for schools to have.

One of the things school leaders do is look holistically; they need to be investing in a high-quality teacher workforce—probably the quality of the teacher is the most important thing. Without somebody with the right subject expertise, understanding and inspiration, it is very difficult to deliver a high-quality science programme. They need to give curriculum time to science. One of the bits of evidence from Ofqual is that sometimes triple science gets squeezed into a double science slot. That is a big problem if you want to give time to do practicals, so it is about people time and physical resources. Leaders of schools are best placed to decide the balance between those things. It is about how they prioritise science within the school. We know that some schools prioritise science, and other schools are what I call science deserts—the 49% of state schools where no girl goes on to do A-level physics, for example. How do we shine a light on that and encourage schools to use facilities like the science learning centres that can provide them with CPD support and advice? How do we encourage schools to take part in some of the programmes that the Department funds, like the Stimulating Physics Network or the STEM ambassadors? There is some brilliant science practice out there. The question is how we bring everybody up to that level.

 

Q111   Stephen Metcalfe: I am looking at a chart which shows that a difference of under £10 in per capita spend within a school corresponds to the difference between being “very dissatisfied” and “very satisfied” with the way science subjects are taught—£10 per head is a relatively small amount. Has the Department—forgive me if you have; I am asking from a point of ignorance—made any specific funding commitments to individual schools to get them to focus on improving the way science is taught? You have lots of good programmes, which I understand, but do you have a specific, targeted fund to help schools?

Elizabeth Truss: No. They are not directed at specific schools; generally, they are funding national programmes. The whole point about our school-led system is that head teachers need to make the best decisions about what is effective in terms of their schools and resources. Those may vary hugely according to what is available locally. For example, schools in my constituency work closely with the John Innes Centre, which is a life sciences centre nearby. They can leverage that. Others will have other local opportunities. We want to give flexibility, but we provide a lot of support to schools.

One of my main objectives over the next year or so is to make sure all schools are aware that they can participate in those programmes and benefit from them, but it is about school leaders prioritising science. When I go into schools the question I ask, and all MPs should ask, is, “What are you doing to get more students to carry on studying science? What are your practical facilities like?” Every leader has decisions about where they spend their budget.

 

Q112   Stephen Metcalfe: Agreed. One of the things that comes out from our reports—we keep looking at how we can improve science education generally—is that there are great extremes in the way schools engage. We have not quite got to the root of whether it is down to leadership, lack of opportunity locally or knowledge, but it is quite a big issue. I am sure we all want better teaching of science in our schools and more people who understand its benefits. We need a bit more stick and carrot that comes down centrally to try to get more people to take up the opportunities. When I write to all my schools to talk about Primary Engineer or STEM ambassadors, those that I think are already good are the ones who take up the opportunities, and those that I think are struggling are the ones who never take up the opportunities. It is a leadership issue.

Elizabeth Truss: Quite often, they will also be the schools that are taking up opportunities for music hubs or the Mandarin network. It is about general leadership as well as specifically science. We are looking at some interesting cases of schools that are generally good but have poor performance in terms of the number of students who are going on to study science A-levels. How do we fill those positions? One of the ideas behind the maths and physics chairs programmes, which are co-funded by corporates, is to create a supply of physicists with PhDs who can go into those schools and help fill some of the gaps. Underlying all of this is the fact that scientists who have done university degrees are in such great demand that, in getting them to go into teaching, we are competing with the City, engineering firms and biotech firms. There is a whole range of employment opportunities for those individuals, which is why we need to grow the pipeline, so that we can get inspirational teachers in our schools.

 

Q113   Chair: There are lots of areas of agreement. Indeed, much of what you said about inspiring teachers is covered in an article I wrote a month ago for Tribune in which, by the way, you get a mention—I wonder whether your press office has picked it up. In your remarks, you said that in your view there was no alternative but the Ofqual proposition. You recognise that there are groups of people you need to engage with. Of course, these proceedings are broadcast, and I take it that, if the experts listening to this have alternatives, you would welcome them into your office to discuss them. Is that correct?

Elizabeth Truss: I certainly would. I would caveat it by saying that Ofqual are the ultimate decision makers; that is why they were set up as an independent regulator, to make sure that examinations are valid and reliable. I would also like to have a broader discussion with those stakeholders about the other ways in which we can encourage better take-up of practical science in schools. It is an incredibly important issue. We can work with all of those interested parties and leading experts within the scientific community. As a country we can do better on this, and I welcome the Committee putting such focus on it.

Chair: Thank you very much for your attendance this afternoon.

 
