A Watershed-Canny Based Approach for Building Footprint Extraction from Very High Resolution Optical Image
DOI: 10.23977/acsat.2017.1023

Author(s)
Saidi Faycal, Chen Jie

Corresponding Author
Faycal Saidi
ABSTRACT
Advanced very high resolution (VHR) sensors achieve sub-meter resolution, which offers the opportunity for fine-grained analysis of man-made structures. In this paper, we present a method for extracting 2-D building footprints from VHR optical scenes. The data set is a WorldView-2 (0.5 m) image covering an urban area of San Francisco. Our main idea is to combine an edge-based detector, region-based segmentation, and non-building masks. In the first step, the Canny operator is used to extract edges from the optical image, and morphological operations remove small edge fragments. In the second step, the Watershed transform segments the optical image, and morphological operations remove small regions. In the third step, we compute two non-building masks, a vegetation mask and a shadow mask, which are first applied to filter the Watershed segmentation result. In the fourth step, we translate the Canny contour image in both directions, compute the correlation coefficient between the Canny contour and the filtered Watershed contour for each translation, and take as the best translation parameters those that maximize the correlation coefficient. In the last step, the matched Canny edge image is combined separately with: 1) the vegetation mask, 2) the shadow mask, and 3) the filtered Watershed segmentation; the three results are then combined (logical "or") to obtain the final building footprint. The results demonstrate that our approach performs well and improves boundary extraction compared with Watershed segmentation alone.
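
For concreteness, the sketch below assembles the five steps with NumPy, SciPy, and scikit-image. Every parameter here is an assumption for illustration rather than a value from the paper: the band inputs (a panchromatic/gray band plus red and near-infrared bands for an NDVI vegetation mask), the Canny and morphology settings, the watershed marker rule, the NDVI and shadow thresholds, the exact combination rule in the last step, and the ±10-pixel translation search window.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import canny
from skimage.filters import sobel
from skimage.morphology import remove_small_objects
from skimage.segmentation import watershed, find_boundaries

def building_footprints(gray, red, nir, max_shift=10):
    """gray, red, nir: 2-D float arrays scaled to [0, 1]."""
    # Step 1: Canny edges, with small edge fragments removed.
    edges = canny(gray, sigma=2.0)
    edges = remove_small_objects(edges, min_size=50)

    # Step 2: Watershed segmentation on the image gradient; small
    # regions could likewise be pruned with remove_small_objects.
    gradient = sobel(gray)
    markers, _ = ndi.label(gradient < 0.05)   # assumed marker rule
    segments = watershed(gradient, markers)
    ws_contour = find_boundaries(segments)

    # Step 3: non-building masks (NDVI vegetation, low-intensity shadow),
    # applied first to filter the Watershed contour.
    ndvi = (nir - red) / (nir + red + 1e-6)
    vegetation = ndvi > 0.3                   # assumed threshold
    shadow = gray < 0.15                      # assumed threshold
    ws_contour &= ~(vegetation | shadow)

    # Step 4: align the Canny contour to the filtered Watershed contour
    # by maximizing the correlation coefficient over integer shifts.
    best_r, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(edges, (dy, dx), axis=(0, 1))
            r = np.corrcoef(shifted.ravel(), ws_contour.ravel())[0, 1]
            if r > best_r:
                best_r, best_shift = r, (dy, dx)
    matched = np.roll(edges, best_shift, axis=(0, 1))

    # Step 5: combine the matched edges with each mask and with the
    # filtered segmentation, then OR the three partial results.
    return ((matched & ~vegetation) | (matched & ~shadow)
            | (matched & ws_contour))
```

The exhaustive translation search mirrors the abstract's description of testing each translation value; phase correlation would recover the same shift faster, but the brute-force loop keeps the correspondence with the text explicit.
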
KEYWORDS
Building Footprint Extraction, VHR Optical Image, Canny Edge Detector, Watershed Segmentation.