Evaluation of land cover classification based on multispectral versus pansharpened Landsat ETM+ imagery
Land cover data generated from satellite images are widely used in many real-world applications, such as natural resource management, forest type mapping, hydrological modeling, crop monitoring, regional planning, transportation planning, and public information services. Moreover, land cover data are one of the primary inputs to many geospatial models. In South-East Asian cities, where houses are interspersed with small trees, bare land, and grassland, settlements are difficult to detect in multispectral Landsat ETM+ images because the 30 × 30 m spatial resolution is likely to capture a variety of land cover types within each pixel, particularly in urban areas. Although other medium-resolution multispectral satellites, such as ALOS, SPOT, and IRS, have higher spatial resolution than Landsat ETM+, it is sometimes difficult to extract built-up or human settlement areas from their imagery because they lack shortwave infrared bands, which are very useful for distinguishing between soil and vegetation. In this article, we generated land cover data from both Landsat ETM+ multispectral and pansharpened images by applying the same training areas but using their different spectral properties. We compared the two classified images visually, spectrally, and spatially. Our results showed that 65% of the total area had similar land cover and 35% had dissimilar land cover. Although dense urban areas, forest, agricultural land, and water were almost the same in the two classified images, sparse urban areas and grassland were quite different. Much of the sparse urban area was detected using the pansharpened classified imagery. This is important in South-East Asian cities, where many houses are mixed with trees or grassland. Accurate delineation of human settlement areas plays a critical role in population estimation, socio-economic studies, disaster management, and regional development planning.
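The abstract does not name the pansharpening algorithm used, but the general idea it describes (injecting the 15 m ETM+ panchromatic detail into the 30 m multispectral bands before classification) can be illustrated with a minimal sketch. The Brovey transform below is one common approach, shown here purely as an assumed example with synthetic NumPy arrays; it is not necessarily the authors' method, and it presumes the multispectral bands have already been co-registered and resampled to the panchromatic grid.

```python
import numpy as np

def brovey_pansharpen(ms, pan):
    """Sketch of Brovey-transform pansharpening (assumed example).

    ms  : (bands, H, W) multispectral stack, already resampled to the
          panchromatic pixel grid (e.g. 30 m ETM+ bands at 15 m).
    pan : (H, W) panchromatic band on the same grid.

    Each band is scaled by the ratio of the panchromatic value to the
    mean of the multispectral bands, injecting fine spatial detail
    while roughly preserving the relative spectral balance.
    """
    ms = ms.astype(np.float64)
    pan = pan.astype(np.float64)
    intensity = ms.mean(axis=0)          # per-pixel band average, (H, W)
    # Guard against division by zero over dark pixels (water, shadow).
    ratio = np.divide(pan, intensity,
                      out=np.ones_like(pan),
                      where=intensity > 0)
    return ms * ratio                    # broadcasts (bands, H, W) * (H, W)

# Toy illustration: a uniform 3-band scene whose pan band is twice as
# bright simply doubles every band value.
ms = np.full((3, 2, 2), 10.0)
pan = np.full((2, 2), 20.0)
sharpened = brovey_pansharpen(ms, pan)   # every value becomes 20.0
```

In a real workflow the sharpened stack, not the raw 30 m bands, would then be fed to the supervised classifier alongside the same training areas, which is the comparison the article evaluates.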
GIScience & Remote Sensing, 50(4), 458-472, 2013-08
Taylor & Francis