As the whole planet finds itself in the midst of the Covid-19 pandemic, it has become increasingly clear that the public needs to be equipped with the skills to make sense of scientific knowledge. There are now daily press conferences in which scientific information is presented to the public. The definition of a virus might previously have been buried in a textbook or a GCSE question, but now there is a health emergency in which all citizens need to be scientifically literate to make sense of what is going on around them and to take appropriate measures to protect themselves. The Covid-19 pandemic is demanding that the public, possibly more than ever in our lifetimes, understands how scientists do science.
‘The Covid-19 pandemic demands that the public understands how scientists do science.’
How do scientists do science? Does school science represent what scientists do? A fairly typical depiction in school of how science is done involves the so-called ‘scientific method’, which is described as a process through which scientists produce robust evidence by applying procedures such as experimentation and observation. The story goes that scientists begin with a question they want to answer. They then design an experiment and, by carefully manipulating independent variables and measuring dependent variables, they produce findings that help them answer the question.
However, such a step-wise and linear description of the scientific method is simplistic and hardly a realistic representation of how scientists actually do science. Rather, scientists engage in a wide array of methods, some involving hypothesis testing and others, such as observational studies, involving no manipulation of variables at all (Erduran & Dagher, 2014).
Consider the current investigations around Covid-19 infections. Some data are collected on how the virus might be influencing a patient’s breathing over a period of time. Such observation is based simply on the recording of parameters; there is no manipulation of variables in the sense of an experimental design. Likewise, these data might sometimes be subjected to hypothesis testing about the correlation between incubation period and extent of lung disease, without having been part of an experiment – an example of non-manipulative hypothesis testing. Eventually, scientists will carry out randomised controlled trials in which a drug is treated as a variable in interventions that also include control groups to test for the placebo effect. The important point is that all these different approaches are essential to the conduct of science, and that there is no one single method but rather a diversity of scientific methods.
When we look at how students are assessed on practical science at GCSE level, we see a disproportionate representation of different scientific methods in the examination questions (Cullinane, Erduran, & Wooding, 2019). Despite decades of reform in the assessment of practical science in England at GCSE, these assessments, because of their high stakes, tend to promote a narrow view of the scientific method whereby students carry out practical work that is formulaic and more of a hoop-jumping exercise – very far from the ways in which scientists are currently working to address key challenges such as the Covid-19 emergency. As a result, we may be unintentionally failing to inspire our young people, some of whom will be the next generation of scientists, about the creative and exciting ways in which practical science is actually carried out.

Curriculum and assessment intentions have been hard to square, and the endorsement approach now taken at A-level might be a better way forward (Childs & Baird, 2020). Building upon this recent review by Childs and Baird, in a three-year project jointly funded by the Wellcome Trust, Gatsby Foundation and Royal Society, we have designed and evaluated assessments that incorporate a diversity of scientific methods. We hope that by enriching the experiences of secondary students in understanding how scientific methods work, we will ensure that future citizens as well as scientists are better equipped to make sense of issues, such as the Covid-19 pandemic, that emerge in their everyday lives.
This blog is based on the article ‘General Certificate of Secondary Education (GCSE) and the assessment of science practical work: An historical review of assessment policy’ by Ann Childs and Jo‐Anne Baird, published in the Curriculum Journal on an open-access basis.
Sibel Erduran is the principal investigator of Project Calibrate, which provided the context for collaboration with Ann Childs and Jo-Anne Baird and led to this blog post. Project Calibrate is jointly funded by the Wellcome Trust, Gatsby Foundation and Royal Society (grant number: 209659/Z/17/Z).
Childs, A., & Baird, J. (2020). General Certificate of Secondary Education (GCSE) and the assessment of science practical work: An historical review of assessment policy. Curriculum Journal. Advance online publication. https://onlinelibrary.wiley.com/doi/full/10.1002/curj.20
Cullinane, A., Erduran, S., & Wooding, S. J. (2019). Investigating the diversity of scientific methods in high-stakes chemistry examinations in England. International Journal of Science Education, 41(16). https://doi.org/10.1080/09500693.2019.1666216
Erduran, S., & Dagher, Z. (2014). Reconceptualizing the nature of science for science education: Scientific knowledge, practices and other family categories. Dordrecht: Springer.