22 July 2010

Image processing: Detecting and monitoring change

Brad Skelton
CTO, ERDAS
Brad.Skelton@erdas.com


Technological innovations in image processing can be looked at from both the algorithmic and computational sides. On the algorithm side, automated feature extraction, while still the Holy Grail, is continuing to improve significantly. On the computational side, the availability of inexpensive massive multicore systems is changing the approach to algorithm implementation.

Multisource data fusion and multicore systems
Multisource data fusion, coupled with improved segmentation, object-based metrics and Bayesian classification, can now produce high-quality building footprint extraction (for example) that requires very little manual update or correction. Traditional high-resolution optical sources can be combined with SAR imagery and LIDAR data to provide a much richer set of cues. For example, the addition of LIDAR improves the ability to distinguish between a flat roof and a driveway, while SAR data can detect material differences that would not be visible in optical images. Working in object space after segmentation provides cues based on shape, such as roundness or squareness, and enables the use of proximity between objects as a further cue, such as the presence of shadows next to buildings.
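To make the object-space idea concrete, here is a minimal sketch of the kind of shape cues mentioned above (roundness and rectangularity) computed for one segmented region. The function name and the pixel-set representation are illustrative assumptions, not ERDAS APIs; real systems compute many more metrics per object.

```python
import math

def shape_cues(pixels):
    """Compute simple object-space shape metrics for a segmented region.

    `pixels` is a set of (row, col) tuples belonging to one object.
    Returns (roundness, rectangularity): roundness is 4*pi*A / P^2
    (1.0 for a perfect circle), rectangularity is area divided by
    bounding-box area (1.0 for an axis-aligned rectangle).
    """
    area = len(pixels)
    # Perimeter estimate: count pixel edges that border the background.
    perimeter = sum(
        (r + dr, c + dc) not in pixels
        for r, c in pixels
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
    )
    rows = [r for r, _ in pixels]
    cols = [c for _, c in pixels]
    bbox = (max(rows) - min(rows) + 1) * (max(cols) - min(cols) + 1)
    roundness = 4 * math.pi * area / perimeter ** 2
    return roundness, area / bbox

# A 4x4 square region: perfectly "rectangular", moderately "round".
square = {(r, c) for r in range(4) for c in range(4)}
roundness, rect = shape_cues(square)
```

A classifier would then combine cues like these with spectral, LIDAR height and SAR evidence to decide, say, "building" versus "paved surface".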


As the 18-month cycle of doubling CPU clock speed has essentially stopped, with most processors running at around 3 GHz today, increasing the number of cores in the CPU is now the way to increase speed. While most machines now come with dual- or quad-core CPUs, and eight cores and higher are coming, graphics processing units (GPUs) have gone well beyond that: today one can buy a 448-core GPU for about $350. Couple this with the adoption of OpenCL, a standard language for multicore computing on GPUs, and you have a significant change for image processing. Image processing lends itself well to parallel processing, and performance can improve by 10 to 100 times on these systems.

Automated feature extraction
Automated feature extraction advancement is driven by the need to reduce cost and improve accuracy and repeatability. These improvements are enabled by the wider availability of multisource data, along with improved computational capability. The improvements in GPUs for general-purpose applications have been driven by the gaming market, whose demand for ever-increasing realism requires high core counts.

Detecting and measuring change
Detecting and measuring change is (and has been) the real driving application of these technologies. Monitoring change for administrative, emergency and defense applications is at the top of the list. Take, for example, monitoring oil spills, and not just major disasters like that in the Gulf of Mexico. Shipping throughout the world results in oil spills on a routine basis, and there is a need to detect, measure and track these spills. SAR data is particularly useful for the detection, and automated feature extraction can be used to measure and track the change.
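The "measure" part of detect, measure and track can be sketched with the most basic form of change detection: differencing two co-registered images of the same scene and thresholding the difference. This is a deliberately simplified illustration with made-up names and values; operational SAR oil-slick detection works on backscatter statistics and much more sophisticated models.

```python
def detect_change(before, after, threshold=30):
    """Flag pixels whose value changed by more than `threshold` between
    two co-registered single-band images (each a list of rows).

    Returns a binary change mask and the changed-area fraction, a crude
    measure of how much of the scene has changed between the two dates.
    """
    mask = [
        [1 if abs(a - b) > threshold else 0 for a, b in zip(arow, brow)]
        for arow, brow in zip(after, before)
    ]
    changed = sum(map(sum, mask))
    total = len(mask) * len(mask[0])
    return mask, changed / total

# Two synthetic 8x8 acquisitions; simulate a dark slick in the second one.
before = [[100] * 8 for _ in range(8)]
after = [row[:] for row in before]
for r in range(2, 4):
    for c in range(1, 5):
        after[r][c] = 40
mask, fraction = detect_change(before, after)
```

Run over successive acquisitions, the same mask-and-measure step is what lets a monitoring system track whether a detected spill is growing, shrinking or drifting.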

Many of the tools in ERDAS IMAGINE, LPS and ERDAS APOLLO leverage multicore CPUs. In the future, we will also include support for massively multicore GPUs.
