
Can we really trust socioeconomic measures used in PISA tests?

Pallavi A Banerjee, Senior Lecturer at University of Exeter, and Nurullah Eryilmaz, PhD Student at University of Bath

The Organisation for Economic Co-operation and Development (OECD) conducts the Programme for International Student Assessment (PISA), a study of 15-year-old students’ academic achievement. To determine participants’ socioeconomic position, the study utilises a specially developed measure based on students’ self-reports of their parents’ education and occupations and of household possessions. This information feeds into the conclusions of the PISA reports and into policy recommendations aimed at reducing achievement gaps for socially disadvantaged students in schools. However, controversy surrounds the validity and consistency of the socioeconomic status measurements across the nations included in the PISA data.

The common socioeconomic status (SES) scale in the dataset is called the index of economic, social and cultural status (ESCS). The ESCS is a measure of a student’s access to family resources (financial, social, cultural and human capital), which determine the family’s or household’s social position. More concretely, ESCS is a composite score derived from three indicators: parental education, highest parental occupation, and household possessions (including books at home).
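
By way of illustration only, a composite of this kind can be approximated by standardising each component and averaging them. The sketch below does just that; it is not the OECD’s actual ESCS procedure (which involves additional steps, such as scaling the possessions items), and the column names used here are hypothetical.

```python
import pandas as pd

def composite_ses(df: pd.DataFrame,
                  components=("paredu_years", "hisei", "homepos")) -> pd.Series:
    """Approximate a composite SES score by z-standardising each component
    and taking the mean of the available components.
    Illustrative sketch only: PISA's ESCS is built with additional steps
    (e.g. handling of missing components, scaling of possessions items)."""
    cols = list(components)
    z = (df[cols] - df[cols].mean()) / df[cols].std()
    return z.mean(axis=1, skipna=True)

# Hypothetical data: parental education in years, highest parental
# occupational status, and a household possessions score.
students = pd.DataFrame({
    "paredu_years": [12, 16, 10, 18],
    "hisei": [45, 70, 30, 85],
    "homepos": [-0.2, 0.8, -1.1, 1.3],
})
students["escs_approx"] = composite_ses(students)
print(students)
```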

In addition to the common SES scale, PISA collects responses to three country-specific questions on wealth. These questions vary across countries and usually take binary yes/no responses (see Annex E of the 2018 technical report). In this blog, we discuss the reliability of the data obtained from these questions and whether they really reflect socioeconomic status in each country.

If the country-specific questions are fit for purpose, their association with the common socioeconomic index should be high. The purpose of our study was to examine the reliability and cross-national comparability of the socioeconomic disadvantage metrics included in the PISA data. To do this, we worked with two measures in the student survey: the common index derived from the international questionnaire items (ESCS) and a new variable we constructed from the country-specific items in Annex E (SES_LOCAL); a simplified version of this check is sketched below. Again, if these local items really reflect each country’s national (local) indicators, the correlation between SES_LOCAL and the ESCS index should be high. Instead, we found a weak correlation between these variables, and our findings demonstrate that these local and national items do not accurately reflect the local indicators of each country. For a more detailed discussion see our paper (Banerjee & Eryilmaz, 2022); we will also be presenting our findings at the forthcoming BERA conference 2022.

‘We found a weak correlation between these variables, and our findings demonstrate that these local and national items do not accurately reflect the local indicators of each country.’
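
As a rough sketch of this kind of check, one could sum the three yes/no country-specific items into a simple score and correlate it with ESCS within each country. The column names below are hypothetical placeholders, and the sketch ignores survey weights and the more careful construction of the local measure used in the published study.

```python
import pandas as pd

# Hypothetical column names: "cnt" for the country code, "escs" for the
# common index, and item_1..item_3 for the three country-specific
# yes/no wealth questions coded as 1 = yes, 0 = no.
LOCAL_ITEMS = ["item_1", "item_2", "item_3"]

def within_country_correlations(df: pd.DataFrame) -> pd.Series:
    """Correlate a simple sum score of the country-specific items with
    ESCS separately for each country (Pearson r). Illustrative only."""
    df = df.copy()
    df["ses_local"] = df[LOCAL_ITEMS].sum(axis=1)
    return df.groupby("cnt").apply(lambda g: g["ses_local"].corr(g["escs"]))

# Toy example with two made-up countries.
toy = pd.DataFrame({
    "cnt": ["AAA"] * 4 + ["BBB"] * 4,
    "escs": [-1.0, -0.2, 0.5, 1.3, -0.8, 0.0, 0.4, 1.1],
    "item_1": [0, 0, 1, 1, 1, 0, 1, 1],
    "item_2": [0, 1, 1, 1, 0, 0, 0, 1],
    "item_3": [0, 0, 0, 1, 0, 1, 1, 1],
})
print(within_country_correlations(toy))
```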

There are certain known issues with the ESCS index, and these have been highlighted elsewhere in the literature (see Eryilmaz et al., 2020). In particular, it is difficult to establish that its component elements function equivalently when compared across countries. Our aim was not to produce a comparable construct across countries, or to compare countries with each other using these items or a derived construct. Instead, we make our inferences within each country, assessing whether the socioeconomic measures used by that country correlate with the common socioeconomic measures in PISA.

For the country-specific questions, information provided by each country’s department or ministry of education is utilised to compile the list of material possessions used in PISA. To represent socioeconomic status in each participating country, experts from the department or ministry of education nominate the three country-specific items for the questionnaire. However, these questions remain problematic in many countries.

Moreover, not only are the country-specific and common measures misaligned, but there are also problems with the way the data is collected and with the response options used as SES indicators in the PISA questionnaires. We also think that when students are asked to provide information about parental education and parental income, they may not always be aware of this kind of information, which adds further complexity to the data collected.

Despite the difficulties outlined above, socioeconomic status is arguably the most commonly utilised variable, after student achievement, in reports and secondary analyses of PISA data. When preparing SES questions, special attention should be paid to the distinctive circumstances of each country. What works in each educational environment and setting remains a significant problem and will continue to receive considerable attention. To address this concern in the UK context, eligibility for free school meals or the Index of Multiple Deprivation might be more trustworthy metrics that more precisely reflect socioeconomic status. More reliable measures are also available from official sources in other countries.

Pallavi Amitava Banerjee and Nurullah Eryilmaz will be presenting findings from this study at the BERA Annual Conference in the session, Globalisation and its impact on local education models, on 6 September 2022 (14:00–15:30). Further details are available in the conference programme.


References

Banerjee, P., & Eryilmaz, N. (2022). How reliable are the socioeconomic measures used in PISA data? Paper presented at the BERA Conference 2022, Liverpool, UK.

Eryilmaz, N., Rivera-Gutiérrez, M., & Sandoval-Hernández, A. (2020). ¿Los países que participan en PISA deberían interpretar por igual el ambiente socioeconómico? Un enfoque de medición de invariancia [Should different countries participating in PISA interpret socioeconomic background in the same way? A measurement invariance approach]. Revista Iberoamericana de Educación, 84(1), 109–133. https://doi.org/10.35362/rie8413981
