Effects of Vegetation Cover on the Microwave Radiometric Sensitivity to Soil Moisture

Abstract
The reduction in sensitivity of the microwave brightness temperature to soil moisture content due to vegetation cover is analyzed using airborne observations made at 1.4 and 5 GHz. The data were acquired during six flights in 1978 over a test site near Colby, Kansas. The test site consisted of bare soil, wheat stubble, and fully mature corn fields. The results for corn indicate that the radiometric sensitivity to soil moisture, S, decreases in magnitude with increasing frequency and with increasing angle of incidence (relative to nadir). The sensitivity reduction factor, defined in terms of the radiometric sensitivities for canopy-covered and bare-soil conditions as Y = 1 - S_can/S_s, was found to be equal to 0.65 for normal incidence at 1.4 GHz, increasing to 0.89 at 5 GHz. These results confirm previous conclusions that the presence of vegetation cover may pose a serious problem for soil moisture detection with passive microwave sensors.
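The reduction factor defined in the abstract can be illustrated with a minimal sketch; the sensitivity values below are hypothetical placeholders (in K per percent soil moisture), not data from the paper, chosen only so that the result matches the reported Y = 0.65 at 1.4 GHz.

```python
def reduction_factor(s_canopy: float, s_bare: float) -> float:
    """Sensitivity reduction factor Y = 1 - S_can / S_s.

    s_canopy: radiometric sensitivity over the vegetation canopy
    s_bare:   radiometric sensitivity over bare soil
    Both are slopes of brightness temperature vs. soil moisture,
    typically negative (brightness temperature falls as moisture rises).
    """
    return 1.0 - s_canopy / s_bare

# Hypothetical sensitivities: Y = 1 - (-0.875 / -2.5) = 0.65
y = reduction_factor(-0.875, -2.5)
print(y)
```

Y = 0 would mean the canopy has no effect on sensitivity; Y = 1 would mean the canopy completely masks the soil moisture signal.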
