Blog post | Part of special issue: The playful academic

Playful interventions: Measuring impact on learning

Steph Comley, University of Exeter

Figure 1: PMM tracking individuals’ understanding of ‘playfulness and me’ over the course of several workshops and a field trip. Red text was added before any workshops were undertaken.


One of the main barriers to integrating playful or games-based learning into education can be the perceived difficulty of measuring and demonstrating impact on learning. During my doctoral research on the use of games-based activities to support collaborative learning, I was frequently asked questions such as ‘How will you know the students have learnt anything?’, or ‘Can you prove a playful intervention has supported achievement of learning outcomes?’ Seven years later, I am still being asked similar questions, except now I can confidently offer some answers!

Incorporating playful activities can bring students benefits that may not be directly linked to, or captured in, learning outcomes, but that still contribute to a more authentic and joyful experience. Even so, when it comes to creating space in the curriculum for playful activities, encouraging students to take part, or convincing programme leads to get on board, it helps to be able to point to how a play-based activity will impact on learning.

Measuring impact on learning is often achieved through an analysis of grades awarded for coursework or exams. Where more qualitative data is used, there is often an expectation that specific terminology or a taxonomy will be present, for example Bloom’s taxonomy and its associated verbs (Krathwohl, 2002).

Academics and researchers usually share an understanding of this educational language – indeed, referring to a particular taxonomy has many benefits when planning learning experiences and articulating outcomes. Yet we often extrapolate students’ comments to fit this taxonomy, design feedback forms around it or, worse, prompt students to use specific terminology when completing a survey or taking part in a focus group.

For a true evaluation of impact on learning we need to enable students to give feedback and express themselves in their own language, free of imposed ideologies and ‘educator speak’. Ideally, we would also allow students to contribute in a way that lets them express themselves clearly, and this may not always be in sentences or via a tick on a Likert scale.

‘For a true evaluation of impact on learning we need to enable students to give feedback and express themselves in their own language, free of imposed ideologies and “educator speak”.’

One method that allows for this personalisation of data is Personal Meaning Mapping (PMM). A fairly recent data collection and analysis tool, PMM was first used by Falk et al. (1998) in the context of museum visitors. It was developed to evaluate a museum exhibition’s impact on visitors’ understanding quickly and without significant participant instruction. While similar to concept mapping, there is no requirement for the respondent to use specific terminology or have existing knowledge. Additionally, the rubrics used to score concept maps are based on predetermined language and ideas, transposing the researcher’s reality onto that of the participants.

Personal Meaning Mapping can be used in much the same way as a traditional ‘brainstorming’ exercise (Comley, 2020). It requires a simple prompt that students can then use as an anchor to which to attach words, pictures or phrases. The exercise is then repeated in a way that distinguishes each round of additions, for example using a different-coloured pen, and the map can be revisited as many times as needed over any time frame. The resulting map can highlight changes in depth of understanding, the building of concepts and how students link different concepts to one another. These changes in the breadth, depth and extent of knowledge can be scored, resulting in data that offers insight into the impact on learning.

As well as providing a longitudinal method for collecting evidence of progression and of the impact of play-based interventions on learning, the process of making a Personal Meaning Map can itself become part of the learning journey. It gives students an opportunity to reflect on and evaluate their own understanding and learning, without the pressure of revision or examination conditions.

Various non-traditional learning activities have proved problematic when it comes to assessing impact on learning: collaborative group work, peer evaluation and play-based learning, to name a few. Academics (and students) can be apprehensive about, or even reluctant to engage with, alternative modes of assessment. Communicating the idea that assessment is being used for learning rather than simply as ‘assessment for measurement’ (Cartney, 2012, p. 61) can help to reassure and highlight the benefits of using a mixed-methods approach to measuring impact on learning.

So, if, like me, you frequently end up in discussions as to how we can demonstrate the effectiveness of playful interventions, this methodology may be part of the answer or at least provide some inspiration!


References

Cartney, P. (2012). Exploring the use of peer assessment as a vehicle for closing the gap between feedback given and feedback used. In S. Hatzipanagos & R. Rochon (Eds.), Approaches to assessment that enhance learning in higher education (pp. 61–74). Routledge.

Comley, S. R. (2020). Games-based techniques and collaborative learning between arts students in higher education [Doctoral thesis, University of the Arts London and Falmouth University]. https://ualresearchonline.arts.ac.uk/id/eprint/15547/

Falk, J. H., Moussouri, T., & Coulson, D. (1998). The effect of visitors’ agendas on museum learning. Curator: The Museum Journal, 41(2), 107–119. https://doi.org/10.1111/j.2151-6952.1998.tb00822.x

Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory Into Practice, 41(4), 212–218. https://www.depauw.edu/files/resources/krathwohl.pdf