Wednesday, November 26, 2008

It is not personal and it is not simple


Promoting change from outside a system is fraught with difficulties for at least two main reasons.


Firstly, it is NOT NECESSARILY PERSONAL, but those who are responsible for the current situation naturally tend to experience proposals for change as (implied and) personal criticism. Our identity is closely linked to what we do, especially if we adopt the erroneous assumption that we are in control. Those responsible may be in charge but they are almost certainly not in control. Any situation is very largely the result of its history and prevailing culture. History and culture both enable and constrain what is possible. History includes factors well outside the immediate situation. Culture is reflected in the patterns of 'how we do things around here' and these are not easily changed. They have to be continually constructed and reconstructed. It is much easier to reconstruct the familiar than it is to construct something new.


Secondly, it is NOT SIMPLE. Changing a complex situation is never a simple endeavour. At best, those in charge may be able to moderate the direction in which things are moving. Attempting simply to change to a different steady state is unrealistic. Being able to articulate such a state (as policy attempts to do) is not the same as causing the state to exist. When complex ideas are summarised they can sound simple and easy. The truth can be very different. Bureaucrats, proponents and the media have a real dilemma in this regard. They need to get the message across quickly and easily, but the key understandings may be complex and very difficult (perhaps impossible) to articulate briefly in simple terms.


I am involved in a classic example. An article in today's 'The Examiner' (local newspaper) has the headline "Ex-principal slams huge bureaucracy". In a conversation with the journalist I certainly criticised the thinking behind how the bureaucracy currently operates. But this thinking is the result of historical and cultural factors. The thinking is not isolated to the Tasmanian education bureaucracy - indeed it is almost universal. I was not aware that I 'slammed' the size of the bureaucracy. Proposing that a bureaucracy should be larger or smaller is usually a simplistic approach and therefore needs to be considered carefully. For what reasons might the bureaucracy be larger or smaller? What value would such a change add to the effectiveness of schools? At what cost (money, opportunity...)? On the other hand, it is true that the larger the bureaucracy, the more officers there are to intervene (for better or worse) in what schools do. "Ex-principal questions bureaucracy" may have been a more valid headline.


And I did not 'slam' the people who work in the bureaucracy. I have worked with a large number of them over many years and I know the majority to be competent dedicated professionals, albeit working in difficult (perhaps impossible) circumstances. They are expected to 'be in control' and they are expected to implement 'simple' responses to the complex situations at hand. Like everyone else, they are caught in the middle. The impossibility of these terms of reference frequently results in simply requiring compliance, regardless of the best interests of those involved. In NSW, professional development for Principals is called 'compliance training'... at least they are explicit!!


The impact of the bureaucracy on the day to day operation of Tasmanian schools is certainly one of my major concerns. The last decade has seen continual intervention in the areas of system structure, curriculum, assessment and reporting. The cost has been huge in terms of time, energy, money, disruption, distraction, dislocation, disaffection, loss of knowledge and loss of social capital... The benefits are far less certain (see change and improvement). And the less certain the outcomes, the more likely the interventions will continue and increase.


Very few people apply for principal positions these days. Could this be a significant indicator of the poor health of the system? If so, then it ‘slams’ the current situation much more powerfully than I could, or would want to. It is important to start with a sound understanding of the current reality (good, bad or indifferent). As one of my mentors used to say, "There is a simple answer to every question and it is usually wrong".

Monday, November 17, 2008

Changing complex systems

It is best to think in terms of aspects of systems rather than systems per se. Indeed, systems may not really exist, but under certain conditions it is reasonable to think of a situation as being or containing 'systems'. That is, while systems thinking may be valid, systems themselves may be products of our thinking: we can treat certain aspects of a situation 'as if' they were systems.
In the Cynefin framework, Dave Snowden suggests five types of 'systems' according to the relationships between cause and effect in the phenomena occurring in the context.
  • Ordered (predictable)
    • Simple - cause and effect are widely known and understood
    • Complicated - cause and effect are knowable (with expert assistance)
  • Unordered (unpredictable)
    • Complex - retrospective coherence may be discernible
    • Chaotic - no coherence
  • Disordered (or perhaps undifferentiated)
In addition the framework provides guidance as to the appropriate change strategy according to the cause and effect relationships involved.
While most human activity is largely complex or chaotic there are times and places where it may be valid to respond 'as if' the situation is a complicated system or even a simple system. This is largely dependent on the consistency of human activity and interactivity.
Such consistency may emerge in complex systems from two main factors in the situation:
  • attractors such as shared values, purposes... around which activity and interactivity continue and emerge (change)
  • boundaries such as rules and policies, together with knowledge, artefacts... that constrain the activity and interactivity
Attractors and boundaries are sometimes confused or in conflict. Frequently, boundaries are used to control and to extract compliance, while attractors are used to inspire and motivate. Unfortunately, having established a (personal) image of what is to be achieved, it is common for management to see its initiatives as 'leadership' and the situation as an 'ordered system':
  • either simple: "If only people would do as expected..."
  • or perhaps complicated: "The experts know what needs to be done...."
The flaw in such thinking is that people cannot be simply instructed. Rather each one needs to (re)construct their knowledge, activities and arrangements in order to act 'as expected' or to do "what needs to be done". And this endeavour is enabled or constrained by attractors and boundaries both within the situation and elsewhere. They contribute to (enhance/constrain) the capacity of the system.

Thus it is wise to consider the matter of attractors and boundaries carefully in any change process. Simple attractors are potentially very powerful in fostering the emergence of new approaches and greater effectiveness and efficiency. At the same time, boundaries may reveal the true purposes of the organisation and in so doing completely override the attractors, eg, Tasmanian education's "Student at the Center" (intended attractor) has been annihilated by the Department's focus on structure, curriculum, assessment and reporting policies (actual boundaries).

Thursday, November 13, 2008

Fallout from School Report Cards

It looks like the whole school report card thing will be a fizzer. I certainly hope so.

Nothing in the local paper except for my article. It is so easy to discredit the process, as at least one Principal has done. Each school can cite many instances of nonsense from the reports, hence the need for sense-making to be applied to the data (see my previous posts). For example, in one school the Staff Attendance was deemed to be "Trend Down" largely as the result of a staff member with cancer. No reasonable person would accept that this data accurately reflected a decline in school performance.

My recommendation would be to promote as little interest as possible (at least one other principal has adopted this strategy). And this is not simply to avoid the difficulties of the School Report Cards. Rather, it enables the school to devote its energies to the real task... dealing with the everyday things that are impeding the achievement of success and well-being for all. That is, genuinely placing the 'student at the centre'.

The main outcome of the school report cards process is likely to be significant underlying damage to the working relationship between schools and the Department (and Government). This seems to be part of a very confused notion of 'Learning Services' that has emerged from the current Department structure and arrangements... it combines resourcing, supervision and compliance enforcement as well as professional learning... This complex mix of centrally controlled interactions with schools, based on various 'carrots and sticks', is of concern in terms of its impact on
  • the effectiveness of the schools
  • the long term interactions between schools and the Department (and Government) and
  • (psychological) OH&S for Principals (and staff), as reflected in the very small number of applicants for Principal positions
The OH&S issue arises from the fact that Principals frequently try to absorb the tension between
  • the demands of the system / government, and
  • the needs of the school and its people.
This phenomenon has been verified in research across the world. See also my previous posting, 'Schooling is NOT a service'.

Saturday, November 8, 2008

Schooling is NOT a service

In recent years the Tasmanian Department of Education has adopted a 'service' orientation. It has sections such as Learning Services - the section that supports, directs and supervises schools and colleges; and School Performance Services that monitors and reports on various aspects of schools. And then there is Adult and Community Learning Services, and so on.

Providing facilities, staff, other resources associated with education, and even programs, may well be deemed to be services. However this does not mean that education, especially in the form of schooling, is a service for at least two reasons:
  • Firstly, schooling is compulsory, whereas in service industries the clients of a service choose whether or not to receive it
  • Secondly, education (the aim of schooling) is not simply the result of the services being provided - the 'recipient' is also a major contributor (perhaps the major contributor)
While clean offices and tattoos are frequently produced by service providers with minimal contributions from their clients, the same cannot be said for schooling. At its educational best, schooling is a highly complex and collaborative endeavour involving much more than the programs (services) provided by the teacher on behalf of the teacher's employer.

If this is so, then it is time to revisit the service-oriented organisational culture that has been adopted by the Tasmanian Education Department. There are huge implications for authority and responsibility, leadership and supervision, change management, policy making, innovation... This means that better working relationships, shared knowledge and understanding schools as purposeful communities are the keys to school improvement.

Thursday, November 6, 2008

Improve the improvement process

The recent Tasmanian School Report Cards attempted to accurately communicate school improvement but failed on at least three points:

  • The community read the reports as being measures of performance (see the previous posting), and thus
  • Excellent performance was hidden by results of "Trend Down" when a performance indicator had declined slightly (a fraction of a percent)
  • And the reports focused on 'measurable' items without fully explaining
    • That the improvements were based on measures of different cohorts, eg, this year's Kinder group is different from last year's, however
    • That the different cohorts were presumably assumed to be equivalent (which is highly unlikely in small cohorts such as staff (Staff Attendance) and Kinder (School Readiness))
    • The statistical limitations arising from small sample sizes in many schools, particularly small schools
    • Why the measures were individually and collectively valid for inclusion in the school report card (staff attendance?)
    • Why the things measured were sufficiently significant to be included (staff attendance?)
    • The quality (how current, comprehensive, and complete) of the data and the limitations on the data available
      • Staff attendance was measured as a percentage of total staff attendance in only two successive years.
      • Presumably a single staff member with an emerging chronic health problem could 'cause' a significant decrease in staff attendance (see the sketch after this list).
    • The highly specific (narrow) nature of the data used in some measures.
      • Readiness for school was only measured for late Kinder students, whereas readiness for school is an ongoing, daily issue in relation to some students
    • The interaction of most of the indicators
      • attendance, retention, literacy, numeracy, student satisfaction, parent satisfaction and readiness for school all interact, each reinforcing the positive or negative effects that emerge for individual students.
    • What other measures were not included (and perhaps why)
      • The Report Cards did not include any information on the schools' (improving?) provision for students with special needs, disabilities, disorders... in the cohorts being reported.
      • Similarly, the schools' provision for students with behaviours of concern, and for families in distress (thus requiring support), was not reported, yet these are some of the major constraints on schools and on student and staff success and well-being.
      • The need to deal with problematic student behaviour is such that it determines aspects of the actual organisation of many schools. It may also consume a large proportion of the resources available ... resources that could be used to provide higher quality education.
      • Certainly problematic student behaviour is far more significant than staff attendance in every school with which I am familiar. And some limited data is available in this area. Why was it not included?
      • The report cards did not contain any contextual information related to, say, the percentages of students with additional needs (behavioural or special needs)
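
To make the small-cohort problem concrete, here is a minimal sketch in Python (all figures are invented for illustration; this is not the Department's actual data or method) of how one staff member's extended illness can swing a whole-school attendance percentage:

    # Hypothetical illustration: staff attendance in a small school.
    # All numbers are invented; this is not the Department's method.
    STAFF = 20              # total staff in a small school
    WORK_DAYS = 200         # working days in the year
    TOTAL_DAYS = STAFF * WORK_DAYS

    def attendance_rate(days_absent):
        """Staff attendance as a percentage of possible staff-days."""
        return 100 * (TOTAL_DAYS - days_absent) / TOTAL_DAYS

    # Year 1: routine absences only (120 staff-days across all staff).
    year1 = attendance_rate(120)

    # Year 2: the same routine absences, plus one staff member away
    # for 60 days with a serious illness.
    year2 = attendance_rate(120 + 60)

    print(f"Year 1: {year1:.1f}%  Year 2: {year2:.1f}%")
    # Year 1: 97.0%  Year 2: 95.5%

A drop of 1.5 percentage points, driven entirely by one person's health, is more than enough to flip this indicator to "Trend Down" in a small school.
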
Since the School Report Cards will become the direct focus of initiatives to achieve real improvements, it is important that the data reported is comprehensive, valid and useful in relation to the overall success of the school, its students, staff and community. With questions over the current form (language, content...) of the Report Cards and the possibility that key data has been omitted, the next step must be to learn from this experience and act quickly on what that learning reveals.

There is clearly a need to improve the improvement process. By doing so, the system will model the very actions and strategies that it is hoping to promote in its schools. If it fails to do so, it runs the risk of alienating the very people upon whom it is dependent for achieving the improvements it desires.

Useful starting points for improving the improvement process might include
  • Collating, summarising and reporting the same data at various departmental levels: cluster, learning service, whole of system
  • Inviting schools to respond by reporting how, and to what extent, they make sense of their own report cards. As one recent correspondent wrote:

    "We spent some time y'day on our school report, personally I don't think they're going to be a big deal. It's too hard to draw worthwhile conclusion about your own school from them, let alone any real comparisons with other schools."

Wednesday, November 5, 2008

The performance-improvement trap

The Tasmanian School Report Cards strategy may have fallen into the performance-improvement trap in several ways.
The message communicated is always the message received. The Department/Government has attempted to report on school improvement, but the School Report Cards have been widely received as reporting on school performance. This makes sense, since student report cards have always reported student performance (albeit with some comments on improvement or otherwise).
The relationship between performance and improvement is an interesting one. While performance and improvement are directly related there are some subtleties requiring attention:
  • measures of performance are used to calculate improvement (or otherwise)
  • measures of improvement do not indicate actual performance
  • isolated measures of performance do not indicate improvement
  • similar levels of performance may be part of very different degrees of improvement
  • similar degrees of improvement may be part of very different levels of performance
  • the ease of improvement is (generally) inversely proportional to the level of performance
  • too few measures of performance may not provide valid indications of improvement
  • improvement is usually less likely (and more difficult) with better performances
  • a perfect performance will result in either no improvement or deterioration
  • poor performances are often very easy to improve
  • to understand performance and improvement one first needs to understand variation
    • there is always some variation in any system
    • some variation is a result of the system
    • some variation comes from outside the system
    • variation in performance may not be an indication of improvement or deterioration at all... just variation (see the sketch after this list)
  • one also needs to understand both change and improvement
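
On the variation point, a minimal sketch in Python (fifty imaginary schools, invented numbers) shows how ordinary random variation alone generates apparent 'trends':

    # Hypothetical illustration: chance dressed up as 'trends'.
    # 50 imaginary schools, all with the SAME true performance;
    # each year's measurement simply varies randomly around it.
    import random

    random.seed(1)

    TRUE_RATE = 90.0    # every school's real, unchanging performance
    NOISE = 1.5         # random year-to-year measurement variation

    up = down = 0
    for school in range(50):
        year1 = random.gauss(TRUE_RATE, NOISE)
        year2 = random.gauss(TRUE_RATE, NOISE)
        if year2 > year1:
            up += 1
        else:
            down += 1

    print(f"'Trend Up': {up} schools  'Trend Down': {down} schools")

Roughly half the schools appear to 'improve' and half to 'decline', although nothing about any school has actually changed.
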
Reporting performance is personal for those involved. In "The Greening of America", Charles Reich suggested that to assess another person is 'an act of violence'. Misreporting or misrepresenting performance is even more an 'act of violence'. The damage may be done to individuals, to their confidence in themselves and each other, and to their relationships.

The current School Report Cards strategy has already done considerable 'violence' in some schools. In its present form the costs involved are likely to greatly exceed the value added.

Repairing the damage done will not be simple. First impressions tend to last, and some of the information contained in the Report Cards is clearly invalid or simply not relevant for particular students and their families. For example, I understand that a Kinder student who cannot stand on one leg for 10 seconds is deemed not ready for school. Does this mean that a student with cerebral palsy will never be ready for school? A school with significant improvement in an area may be rated "Excellent" while performing much less well than another school whose performance in the same area has declined slightly and is hence rated "Trend Down" (say, one school rising from 60 to 70 per cent while another slips from 95 to 94.5). The impression given by the Report Card contradicts its intent. The gold medal goes to the athlete who wins the race, not the one who achieves their personal best.

But simply correcting or discrediting the School Reports is a low level response strategy. It will be important to raise the level of the conversation around this matter. Quality data that is comprehensive, current, accurate and valid will be required but it will not be sufficient. And it is not yet clear that the data in the recent School Improvement Reports meets these criteria.


Fundamentally, it is about constructive, collaborative change management in order to achieve ongoing and sustainable improvement. Now there's a challenge!!

Tuesday, November 4, 2008

School performance - everyone's responsibility (no bystanders)

The recent Tasmanian School Improvement Report Cards are definitely about school performance. They were published online by the School Performance Services unit of the Department of Education.

And they were effectively league tables. The report cards were published in local newspapers as tables that facilitated comparisons between schools. This immediately confirmed the fears of many people involved with schools and contradicted the Minister's claims that publishing the report cards would not result in league tables.

School improvement means improved school performance. But 'school performance' is actually the performance of all those involved: staff, students, their families, the community, the related professions, academia, the Department and the Government. It takes everyone working (and learning) together to make a school great.


Great schools help those involved to meet many of their needs, especially needs related to learning and being members of a community. In this sense, schools are best understood as purposeful communities in their own right. They are not 'numeracy and literacy factories' even though literacy and numeracy are very important.

In purposeful communities,

  • members interact on the basis of shared purposes (derived from shared values)
  • the community includes all those involved
  • roles provide some useful structure
  • everyday working relationships are the foundation for achievement
  • members contribute according to their respective capacities
  • responsibility and authority are dynamic
  • the community is connected beyond its immediate locality

I find the ideas implied by the School Improvement Reports somewhat confusing. In fact, I have been trying to analyse how 'schools' are being understood in this context. Who or what is a school? Is it a 'factory'? An institution? Is it the school staff? Is it the Department in a particular locality?


Measures such as staff, student, and parent satisfaction and staff attendance included in the School Improvement Report may imply that ‘the school’ is the Principal. But this is inconsistent with other measures such as student attendance, early school readiness and reporting to parents. School readiness and attendance are primarily outcomes of the family and reporting is highly prescribed by governments. I wonder if this inconsistency could be contributing to the current low levels of interest in principal positions?


The major educational initiatives in recent years have been focused on the structure of the Department, the curriculum, and assessment and reporting. It has been a long time (last century, in fact) since in-depth consideration was given to the nature of schools. This is not surprising given the current dominance of psychological thinking in both education and management. The nature and performance of schools also need to be considered in sociological terms - something sadly lacking in the current context.


What next? Clarifying the above issues will be a major challenge and a genuine opportunity for real gains. Failing to accept the challenge is likely to result in even greater polarisation of positions.

It would be easy to overlook the implications of using data extensively. Using data can be a double-edged sword. Data does not have any meaning in its own right. The task is to construct useful knowledge from valid data, and this means:

  • checking concepts and assumptions (see above)
  • understanding the current cultural and historical context in relation to the data
  • using these to make sense of the data in order to construct the knowledge required for
  • developing responses (actions and arrangements) that are likely to achieve sustainable improvements

And there are no guarantees. The task of improving school performance is not an engineering task: cause and effect are not often consistent over time and place. Similarly, solutions may not be directly connected to the causes of problems. Contrary to everyday thinking, what works well in one school may not be all that useful in another. School Report Cards may be dramatically different, but it is not always clear which school deserves the greater recognition for its actual achievements ('performance').


School improvement can only be achieved one school at a time, and it takes everyone involved, working together, to see that it happens. No-one can be a bystander.