Abstract
Nonlinear inverse problems are usually solved with linearized techniques that depend strongly on the accuracy of initial estimates of the model parameters. With linearization, objective functions can be minimized efficiently, but the risk of local rather than global optimization can be severe. I address the problem that arises in nonlinear inversion when no good initial guess of the model parameters can be made. The fully nonlinear approach presented is rooted in statistical mechanics. Although a large nonlinear problem might appear computationally intractable without linearization, reformulation of the same problem into smaller, interdependent parts can lead to tractable computation while preserving nonlinearities. I formulate inversion as a problem of Bayesian estimation, in which the prior probability distribution is the Gibbs distribution of statistical mechanics. Solutions are then obtained by maximizing the posterior probability of the model parameters. Optimization is performed with a Monte Carlo technique that was originally introduced to simulate the statistical mechanics of systems in equilibrium. The technique is applied to residual statics estimation when statics are unusually large and data are contaminated by noise. Poorly picked correlations ("cycle skips" or "leg jumps") appear as local minima of the objective function, but global optimization is successfully performed. Further applications to deconvolution and velocity estimation are proposed.
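To make the abstract's optimization strategy concrete, the following is a minimal illustrative sketch of a Metropolis-style Monte Carlo search with a cooling temperature, the class of equilibrium-sampling technique the abstract refers to. The objective function, proposal distribution, and cooling schedule below are invented for illustration and are not taken from the paper; the multimodal test function stands in for an objective with "cycle skip" local minima.

```python
# Illustrative sketch (not the paper's implementation): Metropolis-style
# Monte Carlo minimization with gradual cooling. Uphill moves are accepted
# with the Gibbs probability exp(-delta/T), which lets the search escape
# local minima that would trap a purely downhill (linearized) method.
import math
import random

def metropolis_minimize(f, x0, steps=20000, t0=2.0, t_min=1e-3, seed=0):
    """Minimize f(x) over a scalar x by Metropolis sampling with cooling."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    for k in range(steps):
        t = max(t_min, t0 * (1.0 - k / steps))   # simple linear cooling (assumed schedule)
        x_new = x + rng.gauss(0.0, 0.5)          # random Gaussian perturbation
        f_new = f(x_new)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann/Gibbs probability exp(-(f_new - fx) / t).
        if f_new < fx or rng.random() < math.exp(-(f_new - fx) / t):
            x, fx = x_new, f_new
            if fx < best_f:
                best_x, best_f = x, fx
    return best_x, best_f

# A 1-D objective with many local minima, loosely analogous to an
# objective plagued by cycle skips (this function is purely illustrative):
f = lambda x: (x - 1.0) ** 2 + 2.0 * math.cos(5.0 * x)
x_star, f_star = metropolis_minimize(f, x0=8.0)
```

Starting far from the solution (`x0=8.0`), the sampler can still reach a deep minimum because early high-temperature steps tolerate increases in the objective; as the temperature falls, the search settles into a basin, mirroring the abstract's claim that global optimization succeeds where linearized methods would stop at the nearest local minimum.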