The image normalization process aims to remove radiometric differences between multitemporal images that are due to non-surface factors. Accurate normalization is essential for image processing procedures that use multi-date imagery, such as change detection. Linear regression using temporally invariant targets is a widely accepted method for normalization. However, except for the criteria for selecting target points, there is no standard method for conducting this important procedure. This paper proposes a standardized radiometric normalization method for detecting and deleting outliers and obtaining the optimal linear equation for a given set of target points. The method consists of a linear regression model and a studentized residual method for outlier determination. Standardized decision criteria such as R² and the confidence range for the t-test are discussed and investigated, as are the issues of band selection and normalization target size. Four variants of the method are tested here, using a pair of Landsat TM images 10 years apart and corresponding training sets and accuracy assessment data. As a result, a standardized computation procedure is proposed, which uses band-by-band linear regression, single-pixel targets, and a very conservative 99 percent confidence interval for determining outliers.
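The procedure summarized above (a band-by-band linear regression over invariant targets, with studentized residuals tested against a 99 percent confidence interval to delete outliers) can be sketched as follows. This is a minimal illustration assuming NumPy and SciPy; the function name `normalize_band`, the one-outlier-at-a-time deletion loop, and the internally studentized residual formula are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np
from scipy import stats


def normalize_band(ref, tgt, alpha=0.01):
    """Fit ref = slope * tgt + intercept over invariant-target pixels,
    iteratively deleting outliers flagged by studentized residuals.

    Illustrative sketch: deletes the single worst point per iteration
    until no studentized residual exceeds the two-sided t critical
    value at confidence 1 - alpha (99 percent for alpha = 0.01).
    """
    x, y = np.asarray(tgt, float), np.asarray(ref, float)
    keep = np.ones(x.size, dtype=bool)
    while True:
        xs, ys = x[keep], y[keep]
        n = xs.size
        slope, intercept = np.polyfit(xs, ys, 1)
        resid = ys - (slope * xs + intercept)
        # Leverage h_ii of each point in simple linear regression.
        dx = xs - xs.mean()
        h = 1.0 / n + dx**2 / (dx**2).sum()
        # Internally studentized residuals: e_i / (s * sqrt(1 - h_ii)).
        s = np.sqrt((resid**2).sum() / (n - 2))
        r = resid / (s * np.sqrt(1.0 - h))
        tcrit = stats.t.ppf(1.0 - alpha / 2.0, n - 3)
        worst = np.argmax(np.abs(r))
        if np.abs(r[worst]) <= tcrit:
            return slope, intercept, keep
        # Map position within the kept subset back to the full index.
        keep[np.flatnonzero(keep)[worst]] = False
```

Applying the fitted `slope` and `intercept` to every pixel of the subject band then brings it into the radiometric scale of the reference band; the conservative 99 percent cutoff keeps all but clearly aberrant targets in the regression.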
Number of pages: 9
Journal: Photogrammetric Engineering and Remote Sensing
Publication status: Published - February 2000
All Science Journal Classification (ASJC) codes
- Computers in Earth Sciences