As nations face the outcomes of their own national testing, and of international assessments such as PISA and TIMSS, policy makers and politicians look for someone to blame year after year. Is the blame fair? The global stage is fraught with comparisons between countries. The Organisation for Economic Co-operation and Development (OECD), instrumental in providing public data about schools and leadership, fuels school systems’ expectations, which are passed on to school leaders. By 2020, 79 school systems had signed up to the OECD’s testing of 15-year-olds’ skills and knowledge. The OECD has built on past successes and continues to be granted authority by leaders of school systems as a key expert and resource for evidence-based education policies in member countries (Morgan & Shahjahan, 2014; Pons, 2017). Member countries pay for their membership in order to access its resources. The OECD uses reports of data from the Programme for International Student Assessment (PISA) to make recommendations to countries and jurisdictions, shaping their policy directions (Breakspear, 2012), sometimes with less scrutiny than is warranted (Sachse & Haag, 2017).
By 2020, 37 member countries belonged to the OECD, with the 79 school systems taking part in the PISA survey (OECD, 2020). This testing allows the OECD to make recommendations in three areas: public policy issues in preparing young people for life; literacy, in the ways that students apply their knowledge and skills in key learning areas; and lifelong learning, with students measured not only in their reading, mathematics and science literacy but also asked about their self-beliefs (retrieved from https://www.oecd.org/pisa/pisafaq/). There is a global trend ‘in which national policies are increasingly often debated through appeals to models and policy advice promulgated by international organisations’ (Rautalin, Alasuutari, & Vento, 2019, p. 500), the OECD being cited as one of them.
The OECD operates through soft power, exercising ‘cognitive’ and ‘normative’ governance (Sellar & Lingard, 2013). Joseph Nye of Harvard University developed the concept of soft power to describe a way to ‘attract and co-opt’ rather than to use force (hard power) (Nye, 2012). Cognitive governance asserts its function through the agreed values of the member nations. While normative governance, described as peer pressure, is perceived as vague (Woodward, 2009), this organic governance may hold the most influence because it ‘challenges and changes the mindsets’ of member nations’ people (Sellar & Lingard, 2013, p. 715).
To be fair, the OECD simply provides data sets, and data sets do not speak for themselves. The power rests with those who interpret the data. National governments analyse and interpret the data for their own purposes. Falling nation-wide results often provide the impetus for countries to leverage OECD data as an instrument for change, and public discourse about the initiatives drawn from such data has been evident in member countries’ schools.
Of import for school leaders is that Australian national and state policy makers are influenced by initiatives such as the OECD’s PISA data to compare and contrast Australia with other countries (Lingard & Sellar, 2016). As is occurring across nations, these comparisons and contrasts influence the directions of school systems and jurisdictions. What is problematic is that OECD member countries are diverse and do not start from equal baselines. For example, a country such as Finland and a jurisdiction such as Shanghai have largely homogeneous school communities. Comparing their data with that of Canadian schools, or of metropolitan Australian schools with high levels of cultural diversity, diminishes the validity of the data as a basis for informed, evidence-based educational directions.
One example of policy makers and school system leaders being influenced by OECD data is the pedagogical leadership capabilities agenda for principals. An OECD publication on the evaluation of school leaders advised on the ways head teachers (principals) should be appraised in terms of ‘fostering pedagogical leadership in schools’ (OECD, 2013). Not surprisingly, the priorities that school system leaders now give to certain areas of leadership have followed suit. Notably in Australia, pedagogical leadership and evidence-based leadership have been privileged, with increasingly rigid appraisal processes. Principals are appraised on their instructional and evidence-based leadership knowledge and skills, along with outcomes such as their students’ performance results. Possibly because of the OECD’s soft power through data and recommendations, school systems are adopting new leadership roles to support principals, such as a reformed notion of the ‘Instructional Leader’ and ‘Leaders of Pedagogy’.
Instrument Confusion, Pressure Remains
Across OECD member countries, national education reforms are often drawn from OECD data, and media sources have become highly skilled at tracking the triennial PISA cycle and presenting its data, together with the OECD’s assertions, to the public. In Australia, for example, these media assertions often disregard the incompatibility between the National Assessment Program – Literacy and Numeracy (NAPLAN) test, which assesses basic skills, and the PISA survey, which assesses higher-order application of skills (Baroutsis & Lingard, 2017; Lingard & Sellar, 2013), as well as other systemic factors that influence PISA data (Sellar & Lingard, 2013, p. 723). As global studies increase both in number and in the sectors they cover, policy makers will employ them more heavily as benchmarks for comparative rankings and leverage. The main point here is that employing OECD data to compare countries, and to compare Australian student performance data from cycle to cycle, is questionable given the student diversity of member countries and Australia’s current preponderance of drilling basic skills (NAPLAN) rather than higher-order application of skills (PISA).
However, as a counterpoint, the OECD’s report on the 2018 PISA results offers its own challenge to leaders of school systems:
…in over half of the PISA-participating countries and economies, principals of disadvantaged schools were significantly more likely than those of advantaged schools to report that their school’s capacity to provide instruction is hindered by a lack or inadequacy of educational material; and in 31 countries and economies, principals of disadvantaged schools were more likely than those of advantaged ones to report that a lack of teaching staff hinders instruction. In these systems, students face a double disadvantage: one that comes from their home background and another that is created by the school system. There can be numerous reasons why some students perform better than others, but those performance differences should never be related to the social background of students and schools (OECD, 2020).
This is a sobering finding. No umbrage can be taken at the source here (the OECD) when the greater concern is that inequitable actions by school systems and their leaders may diminish the life outcomes of children and young people.
Baroutsis, A., & Lingard, B. (2017). Counting and comparing school performance: an analysis of media coverage of PISA in Australia, 2000–2014. Journal of Education Policy, 32(4), 432-449.
Breakspear, S. (2012). The policy impact of PISA: An exploration of the normative effects of international benchmarking in school system performance. OECD Education Working Papers, No. 71. Paris: OECD Publishing.
Lingard, B., & Sellar, S. (2013). ‘Catalyst data’: Perverse systemic effects of audit and accountability in Australian schooling. Journal of Education Policy, 28(5), 634-656.
Lingard, B., & Sellar, S. (2016). The changing organizational and global significance of the OECD’s education work. In The Handbook of Global Education Policy (p. 357).
Morgan, C., & Shahjahan, R. A. (2014). The legitimation of OECD’s global educational governance: examining PISA and AHELO test production. Comparative Education, 50(2), 192-205.
Nye, J. (2012). China’s soft power deficit: To catch up, its politics must unleash the many talents of its civil society. The Wall Street Journal. Retrieved from http://online.wsj.com/article/SB10001424052702304451104577389923098678842.html?mod=ITP_opinion_0
OECD. (2013). Synergies for better learning: An international perspective on evaluation and assessment. Paris: OECD. Retrieved from http://www.oecd.org/edu/school/Synergies%20for%20Better%20Learning_Summary.pdf
OECD. (2020). PISA 2018 results (Volume IV). Retrieved from https://www.oecd-ilibrary.org/docserver/48ebd1ba-en.pdf?expires=1598248952&id=id&accname=guest&checksum=123DB209E4384E4CD9DF2A2949D4FA06
Pons, X. (2017). Fifteen years of research on PISA effects on education governance: A critical review. European Journal of Education, 52(2), 131-144.
Rautalin, M., Alasuutari, P., & Vento, E. (2019). Globalisation of education policies: does PISA have an effect? Journal of Education Policy, 34(4), 500-522.
Sachse, K. A., & Haag, N. (2017). Standard errors for national trends in international large-scale assessments in the case of cross-national differential item functioning. Applied Measurement in Education, 30(2), 102-116.
Sellar, S., & Lingard, B. (2013). The OECD and global governance in education. Journal of Education Policy, 28(5), 710-725.
Woodward, R. (2009). The Organisation for Economic Co-operation and Development (OECD). London: Routledge.