Article Information

Authors: Richard J. Stirzaker¹, Dirk J. Roux², Harry C. Biggs³

Affiliations: ¹CSIRO Land and Water, Canberra, Australia; ²South African National Parks, George, South Africa; ³South African National Parks, Skukuza, South Africa

Correspondence to: Richard Stirzaker
Email: richard.stirzaker@csiro.au
Postal address: PO Box 1666, ACT 2601, Australia

Dates: Received: 04 June 2010; Accepted: 28 Nov. 2010; Published: 13 May 2011

How to cite this article: Stirzaker, R.J., Roux, D.J. & Biggs, H.C., 2011, ‘Learning to bridge the gap between adaptive management and organisational culture’, Koedoe 53(2), Art. #1007, 6 pages. doi:10.4102/koedoe.v53i2.1007

Copyright Notice: © 2011. The Authors. Licensee: OpenJournals Publishing. This work is licensed under the Creative Commons Attribution License.

ISSN: 0075-6458 (print); ISSN: 2071-0791 (online)

Learning to bridge the gap between adaptive management and organisational culture

Abstract

Adaptive management is the problem-solving approach of choice proposed for complex and multistakeholder environments, which are, at best, only partly predictable. We discuss the implications of this approach for scientists, who have to overcome certain entrained behaviour patterns in order to participate effectively in an adaptive management process. The challenge does not end there. Scientists and managers soon discover that an adaptive management approach not only challenges conventional scientific and management behaviour but also clashes with contemporary organisational culture. We explore the shortcomings and requirements of organisations with regard to enabling adaptive management. Our overall conclusion relates to whether organisations are learning-centred or not. Do we continue to filter out unfamiliar information which does not fit our world view and avoid situations where we might fail, or do we use new and challenging situations to reframe the question and prepare ourselves for continued learning?

Conservation implications: For an organisation to embrace adaptive management effectively, its managers and scientists may first have to adapt their own beliefs regarding their respective roles. Instead of seeking certainty for guiding decisions, managers and scientists should acknowledge a degree of uncertainty inherent to complex social and ecological systems and seek to learn from the patterns emerging from every decision and action. The required organisational culture is one of ongoing and purposeful learning with all relevant stakeholders. Such a learning culture is often talked about but rarely practised in the organisational environment.

Introduction

Pursuing the life sciences is as much a calling as it is a career. We are drawn to a science course at university through some combination of a fascination with the living world and a desire to use natural resources wisely. The renowned biologist Edward O. Wilson says that at age eighteen he saw science as the study of ants, frogs and snakes and a wonderful excuse to stay outdoors (Wilson 1998). For many of us, the objective view of the world that science provides appealed more than the literature and history classes we took at school. During university-level practicals, students have to solve problems to which objective answers exist.
For example, how much chitin does a crab shell contain? Or, what is the identity of the bacteria in the broth? What is the change in momentum when two bodies collide? The tutor already knows the correct answer and, if the experiment is performed properly, the students should get it right too. A poem, by contrast, may have several different interpretations, all of which can appear acceptable. The scientific method gives one correct result, which is independent of the observer.

Modern science has built this predictive capability on four essential components. Firstly, one states a clear hypothesis about cause and effect that is testable by experiment. Secondly, one designs an experiment that tests a prediction emanating from the hypothesis in a controlled environment. Thirdly, one replicates the experiment to show that the observations are not the result of a chance event. Fourthly, the work is documented and subjected to peer review before the new knowledge can serve as a building block for further investigations. The ability to tease apart a system into its constituent components and study it systematically allows scientists to infer cause and effect. Such reductionism is said to be the primary and essential activity of scientific research (Wilson 1998).

At the outset of this essay, we offer a very brief description of organisational culture as it relates to science, our interest in adaptive management, and why we foresee an uneasy relationship. Culture consists of bundles of shared norms, which are behaviours common to a group. Norms give a group a sense of cohesion and protection against undesirable change, but can simultaneously cause a group to resist new, potentially beneficial ideas (Ehrlich & Levin 2005). Many scientists share a culture centred on the robust methodology that underpins their craft: the production of reliable information and the ability to make predictions, quantify uncertainty and expose error. Scientific thinking has produced knowledge and products that have contributed enormously to economic development and social upliftment over the last 300 years.

Scientists not exposed to further study of the humanities, or even an introductory course on the philosophy of science, are often surprised by growing criticism of the very norms most of them regard as self-evident. Even if it were valid to dismiss some of these critics as postmodern deconstructionists, scientists still have to heed a call from within their own community about the need to think differently about contemporary issues facing society (Lubchenco 1998; Steffen, Crutzen & McNeill 2007; Walker & Salt 2006). For example, Ulanowicz (2009:93) writes that ‘it is indeed feasible to march directly into the jaws of oblivion on the tacit assumptions that support conventional science’. Holling (1995:29) warns of the ‘pathology’ that emanates from scientists’ belief that they fully understand cause and effect – that success in managing one target variable in isolation leads to ‘less resilient and more vulnerable ecosystems, more rigid and unresponsive management agencies, and more dependent societies’. It is not our purpose here to provide a critique of reductionist science; suffice it to say that we sympathise with the view of Ulanowicz (2009) that biology and ecology are not entirely reducible to physics. Life scientists cannot fully explain the world from the bottom up and are therefore interested in finding approaches to managing ecological systems that accept and act on this understanding.
Moreover, since human systems interact with biological systems in diverse ways, we propose that adaptive management is the problem-solving approach of choice for environments that are highly changeable, heterogeneous and often unpredictable, and that usually involve multistakeholder interests. But we also anticipate that its implementation will frequently conflict with aspects of our own science culture and that of the organisations where we work.

Science and certainty

Modern society’s knowledge of the physical sciences is so well developed in many areas that we put our total trust in its products. Most of us expect an aeroplane to get us safely to our destination even if we do not understand all the intricacies involved in the process. One reason for this certainty is that we can be reasonably sure about the boundaries of the system. We believe that the engineers, pilots and air traffic controllers know which factors they need to understand well, and which they can ignore.

The biological sciences present more of a challenge because the boundaries of a system are not always clear. An experiment is performed under ‘controlled’ conditions, yet it is impossible to exclude all factors extraneous to the hypothesis owing to practical constraints often associated with larger-scale experiments. Not all of us can fumigate small mangrove islands to study the re-colonisation process (Simberloff & Wilson 1969). Statistical techniques provide an objective means for identifying cause and effect in controlled experiments, and authors of scientific publications know that reviewers start to object when one extrapolates findings too far from the experimental conditions under which the data were collected. The new knowledge holds only when other factors are excluded; that is, scientists’ claims are valid within certain defined boundary conditions. By speculating outside the narrow range of the measured data or the conditions in which they were collected, one can no longer claim the same certainty. In other words, our knowledge is obtained by framing or constraining the problem.

What does this mean for understanding cause and effect in biological or ecological research? It is tempting to believe that the problem of boundary conditions can be overcome just by doing more experiments until all the combinations and permutations have been exhausted. This view is problematic. As already discussed, replicating experiments under controlled conditions is extremely difficult at ecosystem level, and has even less relevance for the action research of the social disciplines (Rogers 2006). But the problem may lie even deeper. Walters and Holling (1990) state that our ecological knowledge is not only incomplete but also elusive, particularly when we consider the range of values held by different groups in society and the political constraints to action. Gallopin et al. (2001) refer to a degree of ‘irreducible uncertainty’ associated with complex social–ecological systems.

Complex and ‘knowable’ systems

In order to test the claim of ‘irreducible uncertainty’, one needs to differentiate between different types of problem and be able to infer cause and effect accordingly. Snowden (2002) categorises problems into those that may be difficult or complicated but are ultimately ‘knowable’ through the reductionist scientific method, and problems that are complex and always characterised by an inherent degree of uncertainty. For example, we can categorise an aeroplane as a complicated but knowable machine.
Although it is made of thousands of different parts, the function of each in relation to another is understood. Apart from the most extreme conditions, aeroplanes behave in a predictable way, and we trust them with our lives. Human systems tend to be complex: the people who make up the system change, and the way they relate to one another is highly context dependent and therefore not entirely predictable. Snowden gives the following example:

Consider what happens in an organisation when a rumour of reorganisation surfaces: the complex human system starts to mutate and change in unknowable ways; new patterns form in anticipation of the event. On the other hand, if you walk up to an aircraft with a box of tools in your hand, nothing changes. (Snowden 2005:105)

One can accept the notion of an aeroplane being complicated but knowable, while a human system is very unpredictable, but what about biology and ecology? Surely biological systems are subject to natural laws that give the system some form of deterministic behaviour? Ulanowicz (2009) cautions that the idea of tight cause and effect in open systems should be set aside in favour of Popper’s view of ‘propensities’. A propensity downgrades cause and effect to a more general likelihood of one factor influencing another. More importantly, it is the combined effect of several propensities acting together on the whole system that facilitates unique and sometimes surprising behaviour of a specific system.

In Table 1 we distinguish between problems that are complicated but knowable and those that are complex (informed partly by Snowden & Boone 2007). The complexity of the problems in the right-hand column of Table 1 is due to feedbacks, thresholds and, often, nonlinear interactions within the system, together with lags and cross-scale effects. Such factors combine to give the system a degree of uncertainty. An example of feedback is the reinforcement of terrestrial warming as ice sheets melt, because there is less ice to reflect the sun’s energy. An example of a threshold effect is the rapid switch from savannah vegetation to woodland thicket once perennial grasses that support fire reach a critically low density (Walker & Meyers 2004). Nonlinear interaction means that a small change in one factor can have a big effect somewhere else. When there are multiple such interactions it becomes practically impossible to keep track of all the causal relationships.

TABLE 1: Some characteristics of knowable and complex problems.

Complex systems also have so-called emergent properties. An analogy might be ‘team spirit’. The statistical attributes of each member of the team may be known, but the joint interaction of the team sets up a dynamic that strongly affects how the team performs as a unit. In other words, there are mutually beneficial interactions between players that give the team its unique character. If we disassemble the team to study each player’s attributes in detail, the team spirit, or emergent property, disappears.

Adaptive management

Feedbacks, thresholds, multiple nonlinear interactions, lags, chance events and emergent properties contribute to a general uncertainty about cause and effect, and, consequently, about the impact of our management actions. This realisation led to the development of the field of adaptive management (Holling 1978; Lee 1993; Walters 1986). Its fundamental premise is that the puzzle of a social–ecological system can never be fully solved by studying the pieces.
We have to use real-life management of the system as a whole and turn it into an experiment by asking the right questions, implementing decisions, collecting the right data and learning from the experience. The emphasis is on formulating an explicit mental model, however imperfect, and then acting accordingly by managing and monitoring to see how our understanding can be improved as we gain further insight into the system.

Furthermore, many of the ecological problems we face are as much controversies over values as disputes about cause and effect. This challenges the positivist view of science, which regards science as the principal producer of reliable knowledge that should be passed on to those with a management responsibility (Ziman 2000). Broader society now demands that its local and experiential knowledge, as well as its values, be considered in management plans. Therefore, when both the facts are uncertain and their interpretation is contested, we need an approach that can integrate knowledge from different sources and treat management activities as experiments from which we can learn.

Adaptive management is a way of getting around the dilemma of delaying decisions until we fully understand all the potential consequences of our actions. The act of management is itself an experiment, but clearly not in the traditional sense of controls and replication. To distinguish adaptive management from simple trial and error, considerable effort should be put into integrating existing information from different disciplines and perspectives. Appropriate models should be used to frame the questions, eliminate the least likely solutions and identify the knowledge gaps (Stankey, Clark & Bormann 2005; Walters 1997).

Monitoring is a central issue. Adaptive management needs an intellectual paper trail to show that reasoning underlies the actions – we cannot learn without this (Lee 1993; Venter et al. 2008). The conceptual framework, whether represented by a simple diagram or a sophisticated model, should be matched by the amount of effort put into monitoring. Unfortunately, monitoring is much more expensive than modelling, which can create tension between researchers and funders or managers. Often the true value of monitoring will be reaped only by the next generation of scientists, a problem for those responsible for paying the bills now (Walters 1997). At the same time, it is the responsibility of scientists to identify the variables that are most likely to be indicative of system behaviour. This may involve identifying integral measures that remove much of the ‘noise’ and variables that warn in advance of an approaching threshold (Stirzaker et al. 2010). We simply cannot measure everything.

According to Stankey et al. (2005), the principles of adaptive management are widely acclaimed, but remain more an ideal than a demonstrated reality. One of the several reasons Rogers, Roux and Biggs (2000) cite for this is that the new way of operating does not fit the old organisational culture, with its authoritarian structure. Moreover, when investigating large multidisciplinary problems we get overloaded with information and often experience ‘turf protection’ among scientists and between scientists and managers. This raises a new question: if issues that we once saw as ‘knowable’ are in fact complex and demand a radically different problem-solving approach, do we also need to think through the ways our organisations operate?
Adaptive management and organisational culture

Scientists responding to the challenge of living with more uncertainty can find their organisations moving in the opposite direction. The conventional view of curiosity-driven research that leads to new findings, beneficial applications and tools for the improvement of human welfare (Ravetz 2004) has given way to formal methods of planning and accountability. Science no longer has special status in a government’s budget. The case for investment in science must be carefully argued with explicit costs and benefits, with timelines to show what will be delivered when and by whom.

Organisational culture and adaptive management are likely to clash, at least initially, on several fronts. Firstly, scientists are expected to produce the knowledge that managers need to make informed decisions. A focus on inherent unpredictability seems to be undermining the foundation of this social contract. Secondly, organisations spend considerable time streamlining their portfolio of work into ‘manageable’ units aligned to corporate goals, whereas adaptive management can be a messy web of relationships encompassing scientific, social and political perspectives. Thirdly, adaptive management requires us to be open to learning things that may be counter to the way we normally operate. It requires a level of flexibility that challenges the way that things have always been done.

As scientific organisations strain under the pressure to adopt a more overt ‘business principles’ approach, there is a greater focus on specifying the outputs of programmes well in advance and minimising risk at all stages of the project. Reporting on project deliverables and milestones is required on shorter timescales, which makes it easier to follow the contract than to follow up surprises. Whereas accountability and risk management are obviously important, rigid management systems run counter to the nature of adaptive management. Organisations certainly need to balance order with creativity, but the desire for certainty and control can overwhelm the desire to nurture the flexible learning approach required for adaptive management. According to Wheatley (2005), organisations should resist the notion that there is some optimum structure that will deliver results:

If a system becomes too homogenous, it becomes vulnerable to environmental shifts. If one form is dominant, and that form no longer works when the environment shifts, the entire system can collapse. … If leaders fail to encourage diverse ways of doing things, they destroy the system’s capacity to adapt. (Wheatley 2005:78)

If, for example, we lose the balance between flexibility and responsibility and opt for an overly planned and rigid system, we may find we spend more time interacting with the organisation and less with the real world. At the other end of the spectrum, the organisation may become too chaotic when everyone follows their own plans. People do not learn from their mistakes and bad behaviour is not brought to account. Both extremes are counter-productive. Wheatley (2003:39) writes: ’If a system has too much order, it atrophies and dies. Yet if it lives in chaos, it has no memory.’

When science is seen as just another business, goals such as improving efficiency are accepted without question. This sounds sensible at first, but there is evidence that targeting efficiency as the prime goal can destroy the very thing we are trying to manage (Walker & Salt 2006).
Rogers et al. (2000) cite the example of a ‘command-and-control’ approach to managing water resources. What starts out as efficient delivery of services easily spills over into exploitation when the focus is too narrow and agencies take too long to respond to feedbacks. Cilliers (2006:109) argues that the value of organisations lies in their ability to be ’stable enough not to be buffeted around by every fluctuation, [and] … flexible enough to be able to adapt when necessary’. He advises against a culture where ’speed is linked with efficiency, and has become a virtue in itself’. Our real identity is forged when we are able to reflect adequately on our experience and, based on that reflection, to resist certain change (Cilliers 2006). Yet, when the pressure is on to deliver results too quickly, we are more likely to react than to reflect.

Learning

If an organisation is going to embrace adaptive management, it will have to learn to do things differently. Stankey et al. (2005) propose that we normally learn by accumulating new facts, but our understanding moves ahead in leaps when we ask new questions and see the old facts in a new light. This is critical for adaptive management. Instead of filtering out information that is unfamiliar or does not fit our world view, or avoiding situations where we might fail, we use challenging situations to reframe the question and prepare ourselves for learning new things. Garvin (1993) defines a learning organisation as one ‘skilled at creating, acquiring, and transferring knowledge, and at modifying its behaviour to reflect new knowledge and insights’. Modifying behaviour is difficult and should involve three overlapping phases:

• Cognitive: Members of the organisation are exposed to new ideas, expand their knowledge, and begin to think differently.
• Behavioural: Employees begin to internalise new knowledge and alter their behaviour (as manifested, for example, in the use of new vocabulary).
• Performance improvement: Changes in behaviour lead to measurable improvements in results or outcomes.

Thus far we have dealt with the cognitive aspects of adaptive management: the idea that we should distinguish complex systems from those that may be difficult but are ultimately knowable (Table 1). We have also addressed the early behavioural changes: adopting a new language that goes beyond the use of new vocabulary to determine what the new concepts really mean in a specific context and how they can inform our approach to science. The third step – applying the new concepts in the real world – is, however, the major stumbling block. Of course, if there is no performance improvement the ‘new’ behaviour will be challenged and will most likely be replaced by the next wave of ideas, or the organisation will default to its old ways.

Learning is the mechanism through which we change our individual and collective understanding of our world. New knowledge enables us to respond differently to new circumstances and challenges. The rate and relevance of our learning will, in effect, determine our ability to respond to external changes effectively. In this sense, learning proficiency relates to what we should learn about (and what we should forget), who we should learn with, and how we should learn. To accommodate new knowledge, previous learning and beliefs sometimes have to be left behind. Selectively ‘forgetting’ outdated knowledge is referred to as unlearning (Becker 2005). However, unlearning may not be a straightforward or easily manageable activity.
Individuals (often unknowingly) protect existing knowledge by actively disregarding conflicting information (Lyndon 1989). It appears that more recently acquired knowledge is easier to relinquish than knowledge that was acquired and reinforced over an extended period of time. Experts may be especially susceptible to ‘trained incapacity’ (Miller & Morris 1999): the more someone’s knowledge is shaped by learning within a defined field, the harder it becomes to engage with knowledge that emerges from other fields.

Environmental issues inevitably imply the involvement of multiple stakeholders. Therefore, life scientists need to be prepared to learn together from diverse sources (Keen, Brown & Dyball 2005; Pahl-Wostl & Hare 2004). We should not merely settle for compromise but strive for a consensus that can distribute the benefits and costs of our interventions in a way that is (1) equitable and (2) within the ecological limits (Rogers 2006). Compromise involves trading off conflicting demands against those who hold a contrary position, creating winners and losers. The approach of seeking consensus is about moving beyond the problem and ‘developing a set of shared values that guide future decision making’ (Rogers 2006).

Ongoing learning is uncomfortable. It is much easier to believe we already know most of what we need to know. If we feel overwhelmed by new information, our dominant learning mode is reactive and we tend to reinforce pre-established knowledge and frames of reference. Scientists need to perceive their working environments as safe in order to envisage alternative futures and to learn along new and dynamic trajectories towards such futures (Senge et al. 2005). Furthermore, learning with other parties who may hold very different world views requires empathy and humility. To be empathetic means to consider different perspectives and assumptions, temporarily suspending our own in the process, so that we can inquire into the reasons for people’s views (Senge et al. 1999). In this sense, humility means acknowledging that the knowledge base in any given field is too vast for a single person to master. Even the expert’s knowledge is only a partial reflection of what is known. However, by combining one’s partial knowledge with that of others, one can, in practice, use more knowledge than one’s own (Wenger 2005).

Conclusion

Successful adaptive management in a multistakeholder context rests on three pillars: the ability to form a robust, shared conceptualisation; the ability to monitor key variables that will shed light upon this conceptualisation; and the ability to learn from the experience. If any of these is compromised, the structure will collapse. It is easy to be so enamoured with the conceptualisation of the problem that we fail to invest in thorough monitoring, and equally easy to keep collecting data without knowing how the knowledge will be used. The test is: are we still learning and can we document our learning journey (Venter et al. 2008)? Learning is never quick or easy, and involves travelling along detours and going down blind alleys. When disillusioned, scientists should avoid the trap of falling back into the old pattern of over-promising and under-delivering as they proffer ‘silver bullet’ solutions for complex problems to those who control the purse strings. Similarly, organisations should resist defaulting to command-and-control systems that appear to have delivered some certainty in the past.
Organisations will have to find and foster the champions of adaptive learning, including the visionary activist, the respected integrator and the rebel bureaucrat (Gunderson, Holling & Light 1995). There will need to be a shift from the view that benefits come from holding power or withholding information and ideas, to one where benefits come from sharing, with clear incentives to reflect this. Learning together rather than competing against one another is absolutely central. Science culture has been forged in a competitive environment – competition for the best ideas to secure limited funding, competition for space in top journals, and even a league table of citation metrics that purport to show how useful our work has been. Learning should involve exploring, discovering, reflecting, listening and sharing frustrations and surprises. Managers, scientists and stakeholders need to see themselves as part of the same community, where benefits and risks are shared within the context of a shared vision. Rogers and Breen leave us with the core challenge:

Perhaps the most important lesson ecologists should learn is not to enter the new social theatre as ’experts’ (Ludwig 2001), but as co-learners, interactive players seeking consensus on stage. For some ecologists, and for ecology as a science, this transition will certainly be difficult. We will judge success by a shift from research outputs that impress peers to outcomes that allow society to better respond to environmental challenges. (Rogers & Breen 2003:50)

References

Becker, K., 2005, ‘Individual and organisational unlearning: directions for future research’, International Journal of Organisational Behaviour 9, 659–670.

Cilliers, P., 2006, ‘On the importance of a certain slowness’, Emergence: Complexity and Organizations 8(3), 106–113.

Ehrlich, P.R. & Levin, S.A., 2005, ‘The evolution of norms’, PLoS Biology 3, 943–948. doi:10.1371/journal.pbio.0030194, PMid:15941355, PMCid:PMC1149491

Gallopin, G.C., Funtowicz, S., O’Connor, M. & Ravetz, J., 2001, ‘Science for the twenty first century: from social contract to the scientific core’, International Social Science Journal 168, 219–229. doi:10.1111/1468-2451.00311

Garvin, D.A., 1993, ‘Building a learning organisation’, Harvard Business Review July–August, 78–91.

Gunderson, L.H., Holling, C.S. & Light, S.S., 1995, ‘Barriers broken and bridges built: a synthesis’, in L.H. Gunderson, C.S. Holling & S.S. Light (eds.), Barriers and bridges to the renewal of ecosystems and institutions, pp. 489–532, Columbia University Press, New York.

Holling, C.S., 1978, Adaptive environmental assessment and management, John Wiley, London.

Holling, C.S., 1995, ‘What barriers? What bridges?’, in L.H. Gunderson, C.S. Holling & S.S. Light (eds.), Barriers and bridges to the renewal of ecosystems and institutions, pp. 3–34, Columbia University Press, New York.

Keen, M., Brown, V.A. & Dyball, R., 2005, ‘Social learning: a new approach to environmental management’, in M. Keen, V.A. Brown & R. Dyball (eds.), Social learning in environmental management: Towards a sustainable future, pp. 1–21, Earthscan, London.

Lee, K.N., 1993, Compass and gyroscope: Integrating science and politics for the environment, Island Press, London.

Lubchenco, J., 1998, ‘Entering the century of the environment: A new social contract for science’, Science 279, 491–496. doi:10.1126/science.279.5350.491

Ludwig, D., 2001, ‘The era of management is over’, Ecosystems 4, 758–764. doi:10.1007/s10021-001-0044-x
Lyndon, H., 1989, ‘I did it my way! An introduction to “old way/new way” methodology’, Australasian Journal of Special Education 13, 32–37. doi:10.1080/1030011890130107

Miller, W.L. & Morris, L., 1999, Fourth generation R&D: managing knowledge, technology, and innovation, John Wiley & Sons, New York.

Pahl-Wostl, C. & Hare, M., 2004, ‘Processes of social learning in integrated resources management’, Journal of Community and Applied Social Psychology 14, 193–206. doi:10.1002/casp.774

Ravetz, J., 2004, ‘The post-normal science of precaution’, Futures 36, 347–357. doi:10.1016/S0016-3287(03)00160-5

Rogers, K.H., 2006, ‘The real river management challenge: Integrating scientists, stakeholders and service agencies’, River Research and Applications 22, 269–280. doi:10.1002/rra.910

Rogers, K.H. & Breen, C.M., 2003, ‘The ecology-policy interface’, Frontiers in Ecology and the Environment 1, 49–50. doi:10.1890/1540-9295(2003)001[0049:TEPI]2.0.CO;2

Rogers, K., Roux, D. & Biggs, H., 2000, ‘Challenges for catchment management agencies: Lessons from bureaucracies, business and resource management’, Water SA 26, 505–511.

Senge, P., Kleiner, A., Roberts, C., Ross, R. & Smith, B., 1999, A fifth discipline resource: the dance of change, Nicholas Brealey, London.

Senge, P., Scharmer, C.O., Jaworski, J. & Flowers, B.S., 2005, Presence: exploring profound change in people, organizations and society, Nicholas Brealey, London.

Simberloff, D. & Wilson, E.O., 1969, ‘Experimental zoogeography of islands: The colonization of empty islands’, Ecology 50, 278–296. doi:10.2307/1934856

Snowden, D., 2002, ‘Complex acts of knowing: paradox and descriptive self-awareness’, Journal of Knowledge Management 6, 100–111. doi:10.1108/13673270210424639

Snowden, D.J. & Boone, M.E., 2007, ‘A leader’s framework for decision-making’, Harvard Business Review November, 69–76.

Stankey, G.H., Clark, R.N. & Bormann, B.T., 2005, Adaptive management of natural resources: theory, concepts, and management institutions, Department of Agriculture, Forest Service, Pacific Northwest Research Station, Portland, OR.

Steffen, W., Crutzen, P.J. & McNeill, J.R., 2007, ‘The Anthropocene: are humans now overwhelming the great forces of nature?’, Ambio 36, 614–621. doi:10.1579/0044-7447(2007)36[614:TAAHNO]2.0.CO;2

Stirzaker, R.J., Biggs, H.C., Roux, D.J. & Cilliers, P., 2010, ‘Requisite simplicities to help negotiate complex problems’, Ambio 39, 600–607. doi:10.1007/s13280-010-0075-7, PMid:21141779

Ulanowicz, R.E., 2009, A third window: Natural life beyond Newton and Darwin, Templeton Foundation Press, Indiana University.

Venter, F.J., Naiman, R.J., Biggs, H.C. & Pienaar, D.J., 2008, ‘The evolution of conservation management philosophy: Science, environmental change and social adjustments in Kruger National Park’, Ecosystems 11, 173–192. doi:10.1007/s10021-007-9116-x

Walker, B.H. & Meyers, J.A., 2004, ‘Thresholds in ecological and social–ecological systems: a developing database’, Ecology and Society 9(2), viewed 18 February 2011, from http://www.ecologyandsociety.org/vol9/iss2/art3

Walker, B.H. & Salt, D.A., 2006, Resilience thinking: sustaining ecosystems and people in a changing world, Island Press, Washington DC.

Walters, C., 1986, Adaptive management of renewable resources, Macmillan, New York.

Walters, C., 1997, ‘Challenges in adaptive management of riparian and coastal ecosystems’, Conservation Ecology 1(2), 1, viewed 01 November 2010, from http://www.consecol.org/vol1/iss2/art1/
Walters, C.J. & Holling, C.S., 1990, ‘Large-scale management experiments and learning by doing’, Ecology 71, 2060–2068. doi:10.2307/1938620

Wenger, E., 2005, Learning for a small planet: a research agenda, viewed 15 October 2010, from http://www.ewenger.com/

Wheatley, M.J., 2005, Finding our way: Leadership for an uncertain time, Berrett-Koehler, San Francisco.

Wilson, E.O., 1998, Consilience: the unity of knowledge, Vintage Books, New York.

Ziman, J., 2000, Real science: What it is, and what it means, Cambridge University Press, Cambridge. doi:10.1017/CBO9780511541391