Decision Support and Algorithmic Area Prioritization

Two years later, I arrived at the University of Texas with the intent of developing and testing protocols for systematic conservation planning. Our first decision support software tool was ResNet which incorporated methods that had been used to identify the ways in which the conservation area network of Quebec could be improved (Sarakinos et al. 2001). The recommendations were partly implemented insofar as some of the new areas we prioritized for conservation were designated for that purpose. However, the extent to which our proposals were explicitly used has never been clear to me. The likely scenario is that they were made part of the recommendations developed by The Nature Conservancy, which had provided us with much of our data in return for our results. However, there was no explicit acknowledgment of systematic conservation planning by the Quebec authorities.

ResNet was originally developed jointly with Anshu Aggarwal who had worked with me at Boston University in the early 1990s and had continued to help in software development for various research projects. In Texas, Justin Garson, a philosophy graduate student, was responsible for many extensions and revisions. Garson also worked on developing a suite of other software decision tools for biodiversity conservation (Sarkar et al. 2005). Another philosophy graduate student, Chris Pappas, made further improvements to these decision tools. Trevon Fuller, who began as a philosophy graduate student but later switched to biology, developed software to optimize spatial connectivity between conservation areas (Fuller and Sarkar 2006).

Meanwhile, Margules and I continued our collaboration to write the first textbook of systematic conservation planning (Margules and Sarkar 2007). Our methods were adopted and used by the laboratory of Victor Sánchez-Cordero at the Instituto de Biología of the Universidad Nacional Autónoma de México (UNAM). Around this time Margules left CSIRO to head the Asia-Pacific Division of Conservation International (CI), which led to the possibility that our methodologies would find use in optimizing conservation decisions in the field, something that had, at best, only been partly achieved in Quebec. The planning exercises in México as well as the CI-sponsored one from Indonesia (the case study of this chapter) are important because they provide feedback from explicit attempts at using philosophically-based decision theory in practical conservation contexts.

Values and Multiple Criteria

The software decision support tools that we and others were developing implemented algorithms to solve complex computational problems. As I have pointed out elsewhere (Sarkar 2012a), much of the theoretical work in this part of conservation biology at the time consisted of algorithm design. In the 1990s, the algorithms that had been developed by conservation biologists were largely restricted to attempts to identify the smallest possible area for conservation that would ensure adequate protection for biodiversity. There were two versions of this problem. Both required that quantitative targets be set for each biodiversity feature, such as a species or ecosystem, to be conserved. The first, the “minimum area” problem, asks that all such features be included, up to their targets, in a set of conservation areas covering as small a prioritized area as possible. The second, the “maximum representation” problem, asks that the features be represented as fully as possible (up to their targets) within a fixed budget constraint. It turned out that the first problem was much easier to solve than the second. ResNet resolved typical data sets in a matter of seconds, as did many other decision support tools for systematic conservation planning (including C-Plan, Marxan, Target, and WorldMap).
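The minimum area problem can be illustrated with a simple greedy heuristic: repeatedly select the candidate cell that contributes most toward unmet targets. This is only a minimal sketch under assumed toy data structures, not ResNet's actual algorithm; the function and variable names are illustrative.

```python
def greedy_minimum_area(cells, targets):
    """Select cells until every feature meets its target.

    cells: dict mapping cell id -> {feature: amount present in that cell}
    targets: dict mapping feature -> required total amount
    Returns the selected cell ids and the amounts achieved per feature.
    """
    selected = set()
    achieved = {f: 0.0 for f in targets}
    remaining = dict(cells)
    while any(achieved[f] < targets[f] for f in targets):
        # Score each unselected cell by its contribution to unmet targets.
        def gain(cid):
            return sum(min(remaining[cid].get(f, 0.0), targets[f] - achieved[f])
                       for f in targets if achieved[f] < targets[f])
        best = max(remaining, key=gain)
        if gain(best) <= 0:
            break  # targets unattainable with the remaining cells
        selected.add(best)
        for f, amt in remaining[best].items():
            if f in achieved:
                achieved[f] += amt
        del remaining[best]
    return selected, achieved
```

The maximum representation problem replaces the open-ended loop with a budget constraint and must then decide how to trade features off against one another, which is precisely the ambiguity discussed below.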

Between 1999 and 2010, I routinely taught systematic conservation planning both in Integrative Biology and in environmental philosophy courses at the University of Texas. In these classes, the conceptual framework as well as the methodologies of systematic conservation planning were subjected to relentless philosophical scrutiny. This attention led to three interesting innovations in how we conceived of the methodology of systematic conservation planning. First, students were quick to point out that quantitative targets for the inclusion of biodiversity features such as species were arbitrary insofar as they had no credible basis in science. Should 10 percent of the habitat of a species be conserved? Or 15 percent? There was no ecological criterion that decided such choices. (This problem was also used by conservation biologists to criticize systematic conservation planning [e.g., Soulé and Sanjayan 1998].)

Class discussions led to the realization that these targets reflected normative societal judgments about acceptable risk and were very similar to judgments about how to categorize risk for species, namely, when they should be labeled as “endangered,” “threatened,” and so on. These are normative value judgments about what risk we, as a society, find acceptable and to what extent. Our response in developing software was twofold. We enabled, and encouraged, the exploration of a variety of possible target sets. And we were explicit in noting that the choice of targets should be made through deliberation by the stakeholders making conservation decisions. As time went on, I began emphasizing that we were devising decision support tools, not decision making tools. In an introductory text on environmental philosophy that I published at the time (Sarkar 2012b) I tried to bring these conceptual problems to the attention of philosophers. The point I tried to emphasize is that philosophy of science has much to contribute to the construction of a satisfactory framework for systematic conservation planning.

Second, though we had realized that the maximum representation problem was computationally more complicated than the minimum area problem, we had implicitly thought of the two as “conversely” related to each other, that is, in the terminology of computer science, as “dual” problems. We now came to realize that this was not the case, because of the ambiguity in the phrase “as much of the features.” For instance, should we maximize the number of features that met their targets? Or the extent to which all of them met their targets? There was no good reason to treat any one of these formulations as necessarily the correct one for all contexts. Our software began to offer multiple options. The ResNet family of programs was not sophisticated enough to address all these options. A new approach to software was needed, and Pappas was the first to propose that we turn to a new family of metaheuristic algorithms, most notably tabu search. (Tabu search explores the space of candidate solutions through local moves while maintaining a short-term memory, the tabu list, of recently made moves that may not be reversed; this prevents cycling and helps the search escape local optima.)
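The mechanics of tabu search can be sketched schematically for the area-selection setting. The following is an illustrative toy, not ConsNet's implementation: a "move" flips a single cell in or out of the candidate solution, recently flipped cells are tabu for a fixed number of iterations, and the objective function shown (heavily penalizing target shortfalls, then minimizing area) is one assumed formulation among the many options discussed above.

```python
import random

def shortfall_objective(selected, cells, targets):
    """Higher is better: penalize unmet targets heavily, then prefer less area."""
    achieved = {f: sum(cells[c].get(f, 0.0) for c in selected) for f in targets}
    shortfall = sum(max(0.0, targets[f] - achieved[f]) for f in targets)
    return -(shortfall * 100.0 + len(selected))

def tabu_search(cells, targets, objective, iterations=200, tabu_tenure=5, seed=0):
    """Schematic tabu search over subsets of candidate cells."""
    rng = random.Random(seed)
    current = set(rng.sample(sorted(cells), k=max(1, len(cells) // 2)))
    best, best_score = set(current), objective(current, cells, targets)
    tabu = {}  # cell id -> iteration until which flipping it is forbidden
    for it in range(iterations):
        candidates = []
        for cid in cells:
            if tabu.get(cid, -1) >= it:
                continue  # this move is currently tabu
            neighbor = current ^ {cid}  # flip the cell in or out
            candidates.append((objective(neighbor, cells, targets), cid, neighbor))
        if not candidates:
            continue  # all moves tabu; wait for tenures to expire
        score, cid, neighbor = max(candidates)  # best admissible move, even if worse
        current = neighbor
        tabu[cid] = it + tabu_tenure
        if score > best_score:
            best, best_score = set(neighbor), score
    return best, best_score
```

Because the best admissible neighbor is accepted even when it is worse than the current solution, the search can leave local optima, while the tabu list keeps it from immediately undoing the move.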

The third problem was much more serious; for me, it brought into question much of the work we had been doing in systematic conservation planning, which had originally appealed to me because of my dissatisfaction with Soulé’s framework for conservation biology, especially his treatment of normative questions by rejecting human values as irrelevant or illegitimate. Yet we had not broached these values at all in any of the protocols for conservation decisions we had developed. This sense of failure was aggravated by the realization that, if any specialty should be particularly adept at the kind of normative analysis that was being called for, it should be philosophy.

Once again, I turned to the literature on decision theory. By 2002 my laboratory had begun a systematic review of the relevant methodologies that were scattered across the economics, operations research, and philosophical literatures, often classified under acronyms such as MCDM (multiple criteria decision making). The problem now was the wide variety of methods that were available. Working with me, a philosophy graduate student published a critical review with recommendations for use by conservation biologists (Moffett and Sarkar 2006). Garson and I developed a protocol in which ResNet would be used to generate a portfolio of scenarios, all of which were adequate for biodiversity representation. We then subjected these scenarios to multi-criteria analysis using Dominance, that is, retaining only non-dominated (or Pareto-optimal) scenarios. When this strategy still left many scenarios as acceptable, we recommended deliberation (Sarkar and Garson 2004; Sarkar 2012b). The philosophical problems raised by complex decisions led me to recommend rational deliberation among stakeholders to the fullest possible extent before the use of formal methods.
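The Dominance step has a precise meaning: a scenario is retained only if no other scenario is at least as good on every criterion and strictly better on at least one. A minimal sketch, assuming scenarios are scored on criteria that are all to be minimized (the criterion names in the example are illustrative):

```python
def dominates(a, b):
    """True if scenario a dominates b: no worse on every criterion and
    strictly better on at least one (all criteria minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_filter(scenarios):
    """Retain only the non-dominated (Pareto-optimal) scenarios.

    scenarios: dict mapping scenario name -> tuple of criterion values.
    """
    return {name: vals for name, vals in scenarios.items()
            if not any(dominates(other, vals)
                       for oname, other in scenarios.items() if oname != name)}
```

Because Dominance only eliminates scenarios that are worse in every respect, it typically leaves many incomparable scenarios standing, which is why deliberation or a further method such as MAVT is needed afterward.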

With regard to techniques of multi-criteria analysis, Jim Dyer at the University of Texas convinced me that the only reasonable one to use beyond Dominance is multi-attribute value theory (MAVT). This was the only method that was fully consistent with standard utility theory. But its use also required the satisfaction of some subtle conditions on how problems must be formulated. It became clear that trained decision analysts would be needed to advise decision makers in the field. We began large training exercises. Some of the largest were held at UNAM in Mexico City in 2007 and 2008, and several groups in Mexico became the first to use these methods. Meanwhile, Michael Ciarleglio, an applied mathematics graduate student at the University of Texas, developed ConsNet, a software package based on tabu search that supported multi-criteria decisions (Ciarleglio et al. 2009a). This was the software package that we subsequently used in both Mexico and the Indonesian case that will be discussed next. Developing the package and elucidating a protocol for it required collaboration among economists, mathematicians, and computer scientists, besides philosophers and ecologists; this is what it took to get our work into the field.
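In its simplest additive form, MAVT scores each scenario as a weighted sum of single-attribute value functions, each rescaled to a common range. The sketch below assumes that form and linear value functions; the attribute names and weights are hypothetical. The additive form is valid only when attributes are mutually preferentially independent, one of the subtle formulation conditions alluded to above.

```python
def linear_value(worst, best):
    """Return a single-attribute value function mapping the raw measurement
    linearly onto [0, 1], with `worst` -> 0 and `best` -> 1."""
    return lambda x: (x - worst) / (best - worst)

def additive_mavt(attributes, weights, value_functions):
    """Additive multi-attribute value: v(x) = sum_i w_i * v_i(x_i).

    attributes: dict mapping attribute name -> raw measurement for a scenario
    weights: dict of importance weights (should sum to 1)
    value_functions: dict of single-attribute value functions on [0, 1]
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[a] * value_functions[a](x) for a, x in attributes.items())
```

Scenarios surviving the Dominance filter can then be ranked by their aggregate value; eliciting defensible weights and value functions from stakeholders is exactly where the trained decision analysts mentioned above come in.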
