Keywords
monitoring, optimal learning, value of information, adaptive management
Start Date
1-7-2010 12:00 AM
Abstract
Environmental monitoring programs can improve management over time, but generally require that correspondingly less time and money be put into direct restoration efforts such as revegetation or dam removal. Thus, budget constraints compel environmental managers to make difficult decisions regarding the allocation of scarce funds and personnel between environmental monitoring and environmental restoration. Among other factors, the best allocation of resources between monitoring and restoration—or, more generally, learning and doing—will depend on the quality of information available from a monitoring program. This paper demonstrates the application of the partially observable Markov decision process (POMDP) as a framework for investigating the optimal intensity of monitoring given stochastic state dynamics and imperfect observations on state variables. Specifically, the paper addresses the problem of choosing among a set of available monitoring protocols that differ in their costs and the type of information they provide. An empirical application of the model to erosion control in California watersheds demonstrates the utility of the resulting decision policy as well as limitations to the approach.
Decision Support for Environmental Monitoring and Restoration: Application of the Partially Observable Markov Decision Process
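The core computation behind a POMDP-based monitoring framework of the kind the abstract describes is the Bayesian belief update, in which monitoring protocols with different accuracies shift the manager's belief about the unobserved state by different amounts. The sketch below is illustrative only and is not taken from the paper: the two erosion states, the transition matrix, and the two hypothetical protocols (a cheap visual survey versus a costlier instrumented survey) are assumptions chosen to show the mechanism.

```python
def update_belief(belief, transition, observation, obs):
    """Bayes filter: b'(s') is proportional to O(obs | s') * sum_s T(s' | s) * b(s)."""
    n = len(belief)
    # Predict the next-state distribution under the transition model.
    predicted = [sum(transition[s][s2] * belief[s] for s in range(n))
                 for s2 in range(n)]
    # Weight by the likelihood of the observation, then normalize.
    unnorm = [observation[s2][obs] * predicted[s2] for s2 in range(n)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Two hypothetical states of an erosion site: 0 = stable, 1 = eroding.
T = [[0.9, 0.1],   # a stable site tends to stay stable
     [0.2, 0.8]]   # an eroding site tends to keep eroding

# Two hypothetical monitoring protocols that differ in accuracy (and,
# implicitly, cost): rows are true states, columns are observations.
O_cheap = [[0.7, 0.3],
           [0.3, 0.7]]     # 70% accurate visual survey
O_costly = [[0.95, 0.05],
            [0.05, 0.95]]  # 95% accurate instrumented survey

b0 = [0.5, 0.5]  # initially uninformed belief

# Both protocols return the observation "eroding" (obs = 1); the more
# accurate protocol moves the belief much further toward the eroding state.
b_cheap = update_belief(b0, T, O_cheap, 1)
b_costly = update_belief(b0, T, O_costly, 1)
```

In a full POMDP solution, the value of the sharper posterior from the costly protocol would be weighed against its price when choosing among protocols, which is the trade-off between learning and doing that the paper analyzes.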