Abstract
The lattice distortion that occurs at the surface of a monatomic metal is investigated by considering a simple cubic model with a Morse-potential interaction between pairs of atoms. Effective nearest-neighbor harmonic coupling constants are estimated for the following cases: (a) an atom in an infinite monatomic cubic crystal; (b) an atom in the surface layer of a semi-infinite monatomic cubic crystal, taking account of lattice distortion and considering vibrations both parallel and perpendicular to the surface; and (c) an atom on the surface, vibrating perpendicular to the surface. From these results the relative magnitudes of the Debye–Waller factor are estimated for the various cases. In contrast to purely harmonic-force models, this model yields a Debye–Waller factor for vibrations parallel to the surface that is greater than the bulk value. Comparison with the results of other models also suggests that surface lattice distortion reduces the Debye–Waller factor from the bulk value by about as much as does the creation of the surface itself when the attendant lattice distortion is ignored.
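For reference, a minimal sketch of the standard definitions behind the quantities named above; the abstract does not specify the paper's parameterization, so $D$, $a$, and $r_0$ here are the conventional Morse parameters rather than the paper's own notation. The pair interaction and the effective harmonic coupling constant obtained from its curvature at the equilibrium separation are

$$
V(r) = D\left[e^{-2a(r - r_0)} - 2\,e^{-a(r - r_0)}\right],
\qquad
k \equiv V''(r_0) = 2a^2 D ,
$$

and the Debye–Waller factor attenuates the scattered intensity as

$$
e^{-2W}, \qquad 2W = \left\langle (\mathbf{q}\cdot\mathbf{u})^2 \right\rangle ,
$$

where $\mathbf{q}$ is the momentum transfer and $\mathbf{u}$ the atomic displacement. A stiffer effective coupling gives a smaller mean-square displacement and hence a larger Debye–Waller factor, which is the sense in which the coupling constants of cases (a)–(c) translate into the comparisons quoted above.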