Global Actors’ Impact on School Leaders’ Work

As nations face the outcomes of their own national testing, and that of PISA and TIMSS, policy makers and politicians look for someone to blame, year after year. Is the blame fair? The global stage is fraught with comparisons between countries. The Organisation for Economic Co-operation and Development (OECD), instrumental in providing public data about schools and leadership, fuels school systems’ expectations, which are passed on to school leaders. In 2020, 79 school systems had signed up with the OECD to have their 15-year-olds’ skills and knowledge tested. The OECD has built on past successes and continues to be given authority by leaders of school systems as a key expert and resource for evidence-based education policies in member countries (Morgan & Shahjahan, 2014; Pons, 2017). Member countries pay for their membership and, in return, access its resources. The OECD uses reports of data from the Programme for International Student Assessment (PISA) to make recommendations to countries and jurisdictions, shaping their policy directions (Breakspear, 2012), sometimes with less scrutiny than is warranted (Sachse & Haag, 2017).

OECD – Soft Power?

By 2020, the OECD comprised 37 member countries, with 79 school systems taking part in the PISA survey (OECD, 2020). This testing allows the OECD to make recommendations in three areas: public policy issues in preparing young people for life; literacy in the ways that students apply their knowledge and skills in key learning areas; and lifelong learning, with students measured not only on their reading, mathematics and science literacy but also asked about their self-beliefs (retrieved from https://www.oecd.org/pisa/pisafaq/). There is a global trend ‘in which national policies are increasingly often debated through appeals to models and policy advice promulgated by international organisations’ (Rautalin, Alasuutari, & Vento, 2019, p. 500), with the OECD cited as one such organisation.

The OECD operates through soft power, with ‘cognitive’ and ‘normative’ governance (Sellar & Lingard, 2013). Joseph Nye of Harvard University developed the concept of soft power to describe a way to ‘attract and co-opt’ rather than use force (hard power) (Nye, 2012). Cognitive governance asserts its function through the agreed values of the member nations. Normative governance, described as peer pressure, is perceived as being vague (Woodward, 2009), yet this organic governance may hold the most influence because it ‘challenges and changes the mindsets’ of people in member nations (Sellar & Lingard, 2013, p. 715).

Data Don’t Speak

To be fair, the OECD simply provides data sets, and data sets do not speak for themselves; the power rests with those who interpret them. National governments analyse and interpret the data for their own purposes. Falling nation-wide results often provide the impetus for countries to use OECD data as a lever for change, and public discourse about initiatives drawn from such data has been evident across member countries’ schools.

Unequal Playing Fields

Of import for school leaders is that Australian national and state policy makers are influenced by initiatives such as the OECD’s PISA data to compare and contrast Australia with other countries (Lingard & Sellar, 2016). As is occurring across nations, these comparisons influence the directions of school systems and jurisdictions. What is problematic is that OECD member countries are diverse and their base platforms are not equal. For example, a country such as Finland and a jurisdiction such as Shanghai are largely homogeneous school communities. Comparing their data with those of Canadian schools, or of metropolitan Australian schools with high levels of cultural diversity, diminishes the validity of using the data to set informed, evidence-based educational directions.

Soft to Hard Power

One example of policy makers and school system leaders being influenced by OECD data has been the pedagogical leadership capabilities agenda for principals. A 2013 OECD publication on the evaluation of school leaders advised how head teachers (principals) should be appraised in terms of ‘fostering pedagogical leadership in schools’ (OECD, 2013). Not surprisingly, what has followed is a reshaping of the priorities that school system leaders give to certain areas of leadership. Notably in Australia, pedagogical leadership and evidence-based leadership have been privileged, with increasingly rigid appraisal processes. Principals are appraised on their instructional and evidence-based leadership knowledge and skills, along with outcomes such as their students’ performance results. Possibly because of the OECD’s soft power through data and recommendations, school systems are adopting new leadership roles to support principals, such as a reformed notion of the ‘Instructional Leader’ and ‘Leaders of Pedagogy’.

Instrument Confusion, Pressure Remains

Across OECD member countries, national education reforms are often drawn from OECD data, and media outlets have become highly skilled at tracking and presenting data and assertions from each OECD triennial cycle to the public. In Australia, for example, these media assertions often disregard the incompatibility between the National Assessment Program – Literacy and Numeracy (NAPLAN) test (basic skills) and the PISA survey (higher-order application of skills) (Baroutsis & Lingard, 2017; Lingard & Sellar, 2013), as well as other system factors that affect PISA data (Sellar & Lingard, 2013, p. 723). As global studies increase both in number and across sectors, they will be employed more heavily by policy makers as benchmarks for comparative rankings and leverage. The main point here is that employing OECD data to compare countries, and to compare Australian student performance data from cycle to cycle, is questionable given the student diversity of member countries and Australia’s current emphasis on drilling basic skills (NAPLAN) rather than higher-order application of skills (PISA).

OECD Data Show Inequities

However, as a counterpoint, the data reported by the OECD in its report on the PISA 2018 results provide sobering news to leaders of school systems:

…in over half of the PISA-participating countries and economies, principals of disadvantaged schools were significantly more likely than those of advantaged schools to report that their school’s capacity to provide instruction is hindered by a lack or inadequacy of educational material; and in 31 countries and economies, principals of disadvantaged schools were more likely than those of advantaged ones to report that a lack of teaching staff hinders instruction. In these systems, students face a double disadvantage: one that comes from their home background and another that is created by the school system. There can be numerous reasons why some students perform better than others, but those performance differences should never be related to the social background of students and schools (OECD, 2020).

This is a confronting finding. No umbrage can be taken with the source here (the OECD) when the greater concern is that inequitable actions by school leaders may diminish the life outcomes of children and young people.

References

Baroutsis, A., & Lingard, B. (2017). Counting and comparing school performance: an analysis of media coverage of PISA in Australia, 2000–2014. Journal of Education Policy, 32(4), 432-449.

Breakspear, S. (2012). The Policy Impact of PISA: An Exploration of the Normative Effects of International Benchmarking in School System Performance. OECD Education Working Papers, No. 71. OECD Publishing (NJ1).

Lingard, B., & Sellar, S. (2013). ‘Catalyst data’: Perverse systemic effects of audit and accountability in Australian schooling. Journal of Education Policy, 28(5), 634-656.

Lingard, B., & Sellar, S. (2016). The Changing Organizational and Global Significance of the OECD’s Education Work. Handbook of Global Education Policy, 357.

Morgan, C., & Shahjahan, R. A. (2014). The legitimation of OECD’s global educational governance: examining PISA and AHELO test production. Comparative Education, 50(2), 192-205.

Nye, J. (2012). China’s soft power deficit: To catch up, its politics must unleash the many talents of its civil society. The Wall Street Journal. Retrieved from http://online.wsj.com/article/SB10001424052702304451104577389923098678842.html?mod=ITP_opinion_0

OECD. (2013). Synergies for better learning: an international perspective on evaluation and assessment, Paris: OECD. Retrieved from http://www.oecd.org/edu/school/Synergies%20for%20Better%20Learning_Summary.pdf

OECD. (2020). PISA 2018 Results (Volume IV). Retrieved from https://www.oecd-ilibrary.org/docserver/48ebd1ba-en.pdf

Pons, X. (2017). Fifteen years of research on PISA effects on education governance: A critical review. European Journal of Education, 52(2), 131-144.

Rautalin, M., Alasuutari, P., & Vento, E. (2019). Globalisation of education policies: does PISA have an effect? Journal of Education Policy, 34(4), 500-522.

Sachse, K. A., & Haag, N. (2017). Standard errors for national trends in international large-scale assessments in the case of cross-national differential item functioning. Applied Measurement in Education, 30(2), 102-116.

Sellar, S., & Lingard, B. (2013). The OECD and global governance in education. Journal of Education Policy, 28(5), 710-725.

Woodward, R. (2009). The Organisation for Economic Co-operation and Development (OECD). London: Routledge.

A reflexive tool: School leaders managing high-velocity events

‘A leader creates human/social alternities by telling a compelling story about what is or will be, about what should be or about what should (or could) be done, about one or the other’ (p. 259)… It is the leader’s stories that mediate for all those who would follow, an alternative way of being, doing, knowing, having or saying in the world (Thayer, 1988, p. 260)

This reflexive tool draws on the stories you may have told, or your observations of other leaders, in helping communities come to terms with complex situations. Its purpose is to enable school leaders and aspirant leaders to reflect upon an event, experience or episode that holds, or has held, contradictions or complexities. Salicru (2018) describes events such as these as high velocity: events or episodes that hold ambiguity or contradictions, where your normal, routine approaches no longer work. Below is a set of questions drawing on several theoretical platforms: sensemaking (Weick, 1995), sensegiving (Gioia & Chittipeddi, 1991) and the Theory of Planned Behaviour (Ajzen, 1991). Reflecting on these questions could help the leader make sense of, and make decisions about, how to move forward in response to the event. The concept named in parentheses at the end of each question points to its theoretical premise.

Q1: Describe the event/episode or experience you wish to explore (keep it brief, I suggest no more than 50 words).

Questions 2–8 refer to the strategies leaders use to make sense of events for themselves. They are cognitive acts (sensemaking).

Q2: What support from the groups in your school community would you have/did you have for your anticipated response(s) to the event(s)/experience(s)? How relevant are/will the responses be for the community? (Social context)

Q3: What enhancements or threats could there be to your own sense of self as a leader in the event(s)/experience(s)? (Personal identity)

Q4: How has/have your past experience(s) influenced you in making sense of the current event(s)/experience(s)? (Retrospect)

Q5: What cues could help/helped you shore up your initial hunch(es)? Were there any contradictions or confusions from the cues as the event(s)/experience(s) unfolded? (Salient cues)

Q6: How would it be/was it possible to place some boundaries around the event(s)/experience(s) to keep pace with the flow? If not, what may have prevented you from doing so? (Ongoing projects)

Q7: What stories or metaphors would help/have helped you explain the event(s)/experience(s)? (Plausibility)

Q8: What kind of statements or declarations could you make or have made to ‘test the waters’ to see if your explanations of the event(s)/experience(s) were suitable? (Enactment)

Questions 9–11 ask about the strategies leaders adopt when enacting their sensemaking, called sensegiving.

Q9: After you have made sense of (envisioned) what is/was happening for yourself, how do/did you convey this sense to the community? Storytelling? Using metaphors? Allegories? Drawing on the past? Persuasive acts, e.g. mantras such as ‘We’ve got this!’? (Signalling)

Q10: In response to your signalling, what signs were community members giving to demonstrate they were engaging in their own sensemaking? (Re-visioning)

Q11: Did you incorporate or adjust the community members’ sensegiving into your own sensemaking? If so, what did you do? Did you articulate this to the community? If so, what was your understanding of how the community responded to this last phase? (Energising)

Questions 12–14 are based on the Theory of Planned Behaviour (Ajzen, 1991), with Question 15 being an invitation to action.

Q12: In your judgment, how favourable (or unfavourable) are your possible actions? You may wish to write down the possible actions and note which ones would be favourable to you and which ones would not. (Attitude)

Q13: Whom will you consider or need to consider when deciding to act? (Subjective norm)

Q14: With what ease or difficulty do you perceive your ability to carry out your desired action? What past experiences influence your intentions? What anticipated challenges will influence your intentions? (Perceived behavioural control)

Q15: Review your responses. Enter into a reflexive mode of cognition.

Step 1: What feelings or thoughts come to mind about your responses? (Observing your own observing)

Step 2: You are invited to construct a model/framework that maps how you would act, or how you acted, as a leader in this situation. If this was a past event, ask yourself: what would you do differently now, knowing the impact of these three processes? What declarations could you make as a leader for future complex or contradictory situations?

References

Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179-211.

Gioia, D. A., & Chittipeddi, K. (1991). Sensemaking and sensegiving in strategic change initiation. Strategic Management Journal, 12(6), 433-448.

Salicru, S. (2018). Storytelling as a leadership practice for sensemaking to drive change in times of turbulence and high velocity. Journal of Leadership, Accountability and Ethics, 15(2).

Thayer, L. (1988). Leadership/communication: A critical review and a modest proposal. In G. M. Goldhaber & G. A. Barnett (Eds.), Handbook of organizational communication (pp. 231-263). Norwood, NJ: Ablex.

Weick, K. (1995). Sensemaking in Organizations. Thousand Oaks, CA: SAGE Publications, Inc.