Abstract
Water balances, modelled from 150 years of temperature and precipitation data, were used together with long-term fire-history data to assess the effect of climate variability on fire occurrence, which had been determined in a separate study. The water balance shifted from consistently positive effective precipitation (precipitation minus potential evapotranspiration) during the nineteenth century to a twentieth-century balance in which precipitation roughly equalled potential evapotranspiration. Droughts during the 1890s and 1930s were characterized by a negative water balance. Fire-season precipitation was particularly low in the 1890s. Analyses of soil storage and effective precipitation showed that fires tended to occur during decades of high moisture deficit and in dry years within otherwise moister decades. Where fire occurrence was not well predicted by the annual water balance, it was sometimes predicted by the fire-season (March-June and October-November) water balance. Annual water balances of fire years showed higher deficits than those of non-fire years. The empirical relationship between fire history and long-term water balance provides a crude basis for predicting the changes in wildfire regimes to be expected with climate change. In the absence of fire suppression, fire frequency is predicted to increase by 10-25% during the twentieth century as a consequence of a more negative water balance.
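As a rough illustration of the annual versus fire-season comparison described above, the sketch below sums monthly effective precipitation (precipitation minus potential evapotranspiration) over either the whole year or the fire-season months. The monthly values and the assumption that a PET series is already available are hypothetical; the abstract does not state which PET model or soil-storage accounting the study used.

```python
# Illustrative sketch of the water-balance bookkeeping described above.
# The monthly data layout and PET series are assumptions for demonstration.

FIRE_SEASON_MONTHS = {3, 4, 5, 6, 10, 11}  # March-June and October-November

def water_balance(monthly_precip, monthly_pet, months=None):
    """Effective precipitation (P - PET, in mm), summed over the given months.

    monthly_precip, monthly_pet: dicts mapping month number (1-12) to mm.
    months: restrict the sum to these months (default: the whole year).
    """
    months = months or range(1, 13)
    return sum(monthly_precip[m] - monthly_pet[m] for m in months)

# Hypothetical year of monthly totals (mm):
precip = {m: v for m, v in enumerate(
    [20, 25, 35, 55, 80, 95, 90, 85, 70, 50, 35, 25], start=1)}
pet = {m: v for m, v in enumerate(
    [5, 8, 25, 60, 100, 120, 130, 115, 75, 40, 15, 6], start=1)}

annual = water_balance(precip, pet)                      # annual balance
season = water_balance(precip, pet, FIRE_SEASON_MONTHS)  # fire-season balance
print(f"annual: {annual:+.0f} mm, fire season: {season:+.0f} mm")
# -> annual: -34 mm, fire season: -10 mm (a deficit year in this toy example)
```

A year can show a near-neutral annual balance while the fire-season window is in deficit, which is why the fire-season balance sometimes predicted fire occurrence when the annual balance did not.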