Blog post

Using evidence-informed resources to accelerate learning

Flávia Schechtman Belham

Research in the cognitive sciences is constantly unveiling ways to enhance memory and learning. Nevertheless, as previously noted on the BERA Blog (Younie 2017), findings from scientific investigation do not always reach schools.

I have recently visited a number of secondary schools and found that some teachers are incredibly knowledgeable about science findings and actively adapt them to their practice. Some teachers also blog about effective learning strategies and propose creative ideas to incorporate them in class. Similarly, initiatives such as the Education Endowment Foundation’s Research Schools Network and the Chartered College of Teaching’s Impact publication aim to build bridges between cognitive science research and practice in schools.

However, at an institutional level, schools are generally not aware of, and do not teach according to, these effective learning strategies. For example, academic studies have shown that frequent low-stakes tests spaced out over time lead to better retention than highlighting or rereading notes. Similarly, combining verbal and non-verbal information facilitates understanding compared with a words-only approach (for a recent review of these and other techniques, see Weinstein et al 2018). Despite this evidence, surveys show that few students use these strategies when revising for exams (Karpicke et al 2009; Wilks 2017; Sumeracki 2018). The techniques commonly used in schools can create a long feedback loop for the student and extended marking time for teachers – sometimes over 11 hours per week (Ward 2016).

To help disseminate and facilitate the use of active learning techniques in secondary schools, we at Seneca Learning have developed new online learning software. This software is composed of short learning sessions consisting of a few concept modules followed by a large number and variety of question modules. An algorithm tracks each student’s progress and personalises how and when different pieces of information are presented and tested. The platform is engaging, and full of images and videos related to the to-be-revised content.
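The scheduling idea behind this kind of personalisation can be illustrated with a minimal sketch. The Python fragment below implements a simple Leitner-style spaced-repetition queue – an illustrative assumption only, not Seneca's actual algorithm: items answered correctly are promoted to a higher box and reviewed at longer intervals, while items answered incorrectly drop back and return sooner.

```python
from dataclasses import dataclass

# Review intervals (in sessions) for each Leitner box: items in higher
# boxes have been answered correctly more often and reappear less often.
INTERVALS = [1, 2, 4, 8]

@dataclass
class Item:
    prompt: str
    box: int = 0   # current Leitner box (0 = least well known)
    due: int = 0   # session number at which the item is next due

def review(item: Item, correct: bool, session: int) -> None:
    """Promote the item one box on a correct answer, reset it to box 0
    otherwise, and schedule its next review accordingly."""
    if correct:
        item.box = min(item.box + 1, len(INTERVALS) - 1)
    else:
        item.box = 0
    item.due = session + INTERVALS[item.box]

def due_items(items: list[Item], session: int) -> list[Item]:
    """Return the items scheduled for review in the given session."""
    return [it for it in items if it.due <= session]
```

For instance, an item answered correctly in session 0 moves to box 1 and becomes due again at session 2, while an item answered incorrectly stays in box 0 and returns in the very next session.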

To test the efficacy of these new learning resources, a randomised controlled trial with 1,120 school pupils was conducted (for the full report, see Feddern et al 2018). The students were divided into three groups.

  • The first group used a textbook to study part of the GCSE biology content in one 40-minute session.
  • The second group used a tablet to study the same content across two 20-minute sessions.
  • The third group studied the same content using our learning software, also across two 20-minute sessions.

An exam before the learning sessions showed no prior differences between the groups. An exam after the sessions, in contrast, showed that the marks of the group using the new learning software were twice those of the group using the textbook, and 60 per cent higher than those of the group using the tablet.

Our results also revealed that, in general, pupils in selective schools performed better than pupils in non-selective schools. This pattern is consistent with a recent report (Social Mobility Commission 2017) that revealed striking differences in school performance between students entitled to free school meals and their peers. Nevertheless, the improvement in marks with the learning software, compared with the other two groups, was proportionally the same in selective and non-selective schools. The software's potential to accelerate the learning of students from different socioeconomic backgrounds is strengthened by the fact that all GCSE-related resources are exam-board specific and provided free of charge.

Of course, the learning software is in its infancy. More GCSE subjects will be added, and new question formats will be developed. A future avenue for investigation is how the software benefits students with specific characteristics, such as those with special educational needs, those eligible for the pupil premium, and non-native English speakers. To investigate these questions, we have partnered with a few schools in England. Together, we will test new features of the platform and compare their effects across different groups of pupils. The results of these studies will not only help us to improve the learning software, but will also provide valuable information to schools about their student bodies.

In conclusion, although the importance of complementing teaching with findings from the cognitive sciences is widely acknowledged, many factors prevent students and teachers from doing so. Adopting a free, user-friendly online platform tested to scientific standards seems a good step towards more evidence-informed assessment practice in schools. Alongside the initiatives mentioned above, evidence-informed education can help tackle the issue of social mobility in the UK.


Feddern L, Belham F S, Wilks S (2018) ‘Retrieval, interleaving, spacing and visual cues as ways to improve independent learning outcomes at scale’, Impact 2: 33–36.

Karpicke J D, Butler A C and Roediger H L III (2009) ‘Metacognitive strategies in student learning: Do students practise retrieval when they study on their own?’, Memory 17(4): 471–479.

Social Mobility Commission (2017) Time for change: An assessment of government policies on social mobility 1997–2017, London: Social Mobility Commission.

Sumeracki M (2018) ‘Do students utilize effective learning strategies?’, The Learning Scientists blog, February 2018.

Ward H (2016) ‘Workload: Tens of thousands of teachers spend more than 11 hours marking every week’, Times Educational Supplement, 18 April 2016.

Weinstein Y, Madan C R and Sumeracki M A (2018) ‘Teaching the science of learning’, Cognitive Research: Principles and Implications 3(2).

Wilks S (2017) ‘Are learning techniques regressively distributed?’, blog, 1 December 2017.

Younie S (2017) ‘Evidence-informed practice – much talk and little action’, BERA Blog, 30 October 2017.