While it is unclear what influence international organisations such as the OECD have on governing education, interest in such influences is growing (Morgan & Shahjahan, 2014), particularly in empirical comparisons with specific countries and jurisdictions, such as Finland, Singapore and Shanghai (Morgan & Shahjahan, 2014). The OECD has built on past successes and continues to ‘gain authority as an expert and resource for evidence based education policy’ (Morgan & Shahjahan, 2014, p. 194). As Woodward (2009) describes, the OECD operates through soft power[1] and through ‘cognitive’ and ‘normative’ governance. Cognitive governance functions through the agreed values of the member nations. Normative governance, described as peer pressure, is perceived as vague (Woodward, 2009), yet it may hold the most influence because it ‘challenges and changes the mindsets’ of the member people (Sellar & Lingard, 2013, p. 715). This is important because the OECD may influence the mindsets of federal policy makers, who may in turn influence school system leaders, the key regulating authorities for teachers and students in Australian schools.
The OECD uses reports of the data from the Program for International Student Assessment[2] (PISA) to make recommendations to countries and jurisdictions, with certain effects on their policy directions (Breakspear, 2012). By 2015, more than 70 countries had taken part in the PISA survey, which has allowed the OECD to track progress and examine three areas: public policy issues in preparing young people for life; literacy in the ways that students apply their knowledge and skills in key learning areas; and lifelong learning, with students measured not only on their reading, mathematics and science literacy but also asked about their self-beliefs (retrieved from https://www.oecd.org/pisa/pisafaq/). Importantly, the paper ‘Beyond PISA 2015: A longer-term strategy of PISA’ (OECD, 2016) explains that the PISA assessment is a tool to enable governments to review their education systems. Of importance for this study is that national and state policy makers draw on initiatives such as the OECD’s PISA data to compare and contrast Australia with other countries (Lingard & Sellar, 2016). These comparisons and contrasts are likely to influence the directions of school systems and jurisdictions in Australia, as is occurring elsewhere.
Empirical research studies have compared international curriculum systems, using OECD reports from various jurisdictions (Creese, Gonzalez, & Isaacs, 2016). There is a recognition that international organisations contribute to the construction and continuation of evidence-based cultures, which, as Pereyra, Kotthoff, and Cowen (2011) assert, legitimises comparative data being employed as a tool to govern education. In essence, national policy makers now rely heavily on this comparative data to guide their educational directions (Breakspear, 2012; Morgan & Shahjahan, 2014). It is possible that using evidence-based cultures as a governing tool has become normalised, and this possibility was considered in this inquiry.
Interestingly, a 2013 OECD publication on the evaluation of school leaders advised how head teachers (principals) should be appraised in terms of ‘fostering pedagogical leadership in schools’ (OECD, 2013). The priorities that school system leaders give to certain areas of leadership, such as pedagogical leadership and evidence-based leadership, can result in principals being evaluated on their students’ performance results, which may affect their ongoing tenure.
Table 1.1 provides an overview of the largest OECD-based studies in education. The table illustrates that most age groups were assessed in some form by these studies.
Table 1.1
‘Beyond PISA 2015: A longer-term Strategy of PISA’ (Adapted from Schleicher, 2013, p. 8)
Study | Age | Subject areas | Sources of context information[3] | Frequency | Global coverage |
OECD PISA | 15 | Reading; Mathematics; Science; Collaborative problem solving (2015); Problem solving (2012); Financial literacy | Students; Parents (optional); Teachers (optional); School principals | Every 3 years since 2000 | OECD countries: 34; non-OECD participants: 40 (PISA 2009) |
OECD PIAAC[4] | 16–65 | Literacy; Numeracy; Reading components; Problem solving in technology-rich environments | The individuals who are assessed | Frequency to be decided | OECD countries: 24; non-OECD participants: 2 (PIAAC 2011) |
OECD TALIS[5] | Teachers of lower secondary education | Learning environment and working conditions of teachers | Teachers; School principals | 5 years between first 2 cycles | OECD countries: 16; non-OECD participants: 7 (TALIS 2008) |
OECD AHELO[6] | University students at the end of their B.A. program | Generic skills common to all university students (e.g., critical thinking); Skills specific to economics and engineering | Students; Faculties; Institutions | Feasibility study carried out in 2012 | Institutions from 17 countries participated in the feasibility study |
Some scholars assert that the PISA program has helped to normalise the use of comparative data in education on the global stage (Sahlberg, 2011). Countries seek to understand why students in the top-performing education systems, such as Finland and Shanghai, perform so well in PISA testing. Others assert that bodies such as the OECD fuel national educational reforms, which are kept in the public eye by the media across the OECD’s triennial cycle (Bagshaw & Smith, 2016). These media assertions often disregard the incompatibility between the NAPLAN test (skills) and the PISA survey (applications of skills) (Lingard & Sellar, 2013) and other system impact factors on PISA data (Sellar & Lingard, 2013, p. 723), such as multicultural student demographics. As global studies increase both in number and in sectors, they are often adopted by policy makers as benchmarks for comparative rankings, although at times the validity of such comparisons may be questioned due to the impact factors at play.
The implication is that when national and state policy makers compare Australia’s data with those of other countries, the trickle-down effect from policy makers to school system leaders is likely to form part of principals’ experiences of regulated, assessment-focused accountability.
References
Bagshaw, E., & Smith, A. (2016, March 25). Education policy not adding up: OECD asks what’s wrong with Australia’s schools? Sydney Morning Herald. Retrieved from http://www.smh.com.au/national/education/education-policy-not-adding-up-oecd-asks-whats-wrong-with-australias-schools-20160323-gnpno9
Breakspear, S. (2012). The policy impact of PISA: An exploration of the normative effects of international benchmarking in school system performance (OECD Education Working Papers, No. 71). Paris: OECD Publishing.
Creese, B., Gonzalez, A., & Isaacs, T. (2016). Comparing international curriculum systems: the international instructional systems study. The Curriculum Journal, 27(1), 5-23.
Lingard, B., & Sellar, S. (2013). ‘Catalyst data’: Perverse systemic effects of audit and accountability in Australian schooling. Journal of Education Policy, 28(5), 634-656.
Lingard, B., & Sellar, S. (2016). The Changing Organizational and Global Significance of the OECD’s Education Work. Handbook of Global Education Policy, 357.
Morgan, C., & Shahjahan, R. A. (2014). The legitimation of OECD’s global educational governance: examining PISA and AHELO test production. Comparative Education, 50(2), 192-205.
Nye, J. (2012). China’s soft power deficit: To catch up, its politics must unleash the many talents of its civil society. The Wall Street Journal.
OECD. (2013). Synergies for better learning: An international perspective on evaluation and assessment. Paris: OECD. Retrieved 29 October 2016, from http://www.oecd.org/edu/school/Synergies%20for%20Better%20Learning_Summary.pdf
OECD. (2016). Better policies for better lives. Retrieved 8 April 2016, from http://www.oecd.org/about/
Pereyra, M. A., Kotthoff, H., & Cowen, R. (2011). PISA under examination. Springer.
Sahlberg, P. (2011). Finnish lessons: What can the world learn from educational change in Finland? New York: Teachers College Press.
Schleicher, A. (2013). Beyond PISA 2015: A longer-term strategy of PISA. Paris: OECD. Retrieved from http://www.oecd.org/callsfortenders/ANNEX.
Sellar, S., & Lingard, B. (2013). The OECD and global governance in education. Journal of Education Policy, 28(5), 710-725.
Woodward, R. (2009). The Organisation for Economic Co-operation and Development (OECD). London: Routledge.
[1] Joseph Nye of Harvard University developed this concept to describe a way to ‘attract and co-opt’, rather than use force (hard power) (Nye, 2012).
[2] More than 70 countries and economies, including OECD members, test 15-year-olds’ skills and knowledge through PISA (OECD, 2016).
[3] Sources of context information refers to who is assessed and/or where the assessment takes place.
[4] Program for the International Assessment of Adult Competencies
[5] Teaching and Learning International Survey
[6] Assessment of Higher Education Learning Outcomes