Abstract
Many problems in the management of renewable natural resources are extremum problems: we wish, for example, to maximize fish yield from a lake or tree yield from a forest, or to minimize insect pest survival. Such problems are handled better by dynamic programming than by classical analysis, because of the large number and the complex nature of the constraints imposed on such systems. However, a priori arguments and analysis of biological time series show that such renewable natural resources do not constitute Markov processes, since state changes from t to t + 1 depend on the state at t − 1 as well as the state at t. Therefore, before making a decision about the optimal strategy at t, we must explore the future consequences of the strategy at t + 1 and t + 2. This paper reports computer experiments on strategy evaluation procedures, using dynamic programming and modified dynamic programming with two-stage “look-ahead”. The data used to develop the model came from 60 years of observations on weather conditions and insect pest populations at Magdeburg, Germany. The general conclusion from this work is that the selection of the most appropriate strategy for a biological management problem should be determined by the structure of the problem. Sometimes “one-stage look-ahead” gave the lowest value for the criterion function, and sometimes “two-stage look-ahead” was optimal. Both types of programming always yielded lower cumulative defoliation than the method now used: killing as many pests as feasible whenever “pest” densities are reached.
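The comparison described above can be sketched in miniature. The model below is purely illustrative, not the paper's: the dynamics `step`, the cost function, the coefficients, and the "pest density" threshold are all assumptions chosen only to exhibit a second-order (non-Markovian) recurrence, in which the next density depends on the states at both t and t − 1, and to contrast a threshold-triggered spraying rule with one-stage and two-stage look-ahead policies.

```python
def step(x_t, x_prev, spray):
    """Hypothetical second-order dynamics: growth depends on the
    density at t AND at t - 1 (coefficients are assumptions)."""
    growth = 1.6 * x_t - 0.5 * x_prev
    return max(growth * (0.3 if spray else 1.0), 0.0)

def cost(x, spray):
    """Stage cost: defoliation proportional to density, plus a
    fixed spraying cost (both assumed)."""
    return x + (2.0 if spray else 0.0)

def lookahead_policy(x_t, x_prev, depth):
    """Pick the action minimizing total cost over `depth` stages,
    by exhaustive search of the spray / no-spray decision tree."""
    def rec(x, xp, d):
        if d == 0:
            return 0.0, False
        best = None
        for a in (False, True):
            xn = step(x, xp, a)
            future, _ = rec(xn, x, d - 1)
            c = cost(xn, a) + future
            if best is None or c < best[0]:
                best = (c, a)
        return best
    return rec(x_t, x_prev, depth)[1]

def simulate(policy, horizon=10, x0=5.0):
    """Run a policy forward and accumulate total cost (defoliation
    plus spraying) over the horizon."""
    x_prev, x = x0, x0
    total = 0.0
    for _ in range(horizon):
        a = policy(x, x_prev)
        x_prev, x = x, step(x, x_prev, a)
        total += cost(x, a)
    return total

# Threshold rule: spray whenever an (assumed) "pest" density is reached.
greedy    = lambda x, xp: x > 8.0
one_stage = lambda x, xp: lookahead_policy(x, xp, 1)
two_stage = lambda x, xp: lookahead_policy(x, xp, 2)
```

With these assumed dynamics, both look-ahead policies spray early at low cost while the threshold rule waits until the population is large, so `simulate(one_stage)` and `simulate(two_stage)` come out well below `simulate(greedy)`, mirroring the abstract's qualitative conclusion; which look-ahead depth wins can depend on the structure of the dynamics.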
