05/2006

UAB satellite image improvement technology

Fused image obtained using the WiSpeR method

Satellites silently observe the natural phenomena that occur on our planet. Nevertheless, the images produced by these mechanical eyes are not as reliable as we would wish. To alleviate the myopia from which satellites suffer, a team from the UAB has proposed an image fusion method called WiSpeR.

Looking at the Earth from the sky makes it possible to follow a great number of phenomena, both natural and man-made, and to study our environment. Many scientific disciplines take advantage of this tool, such as geology, oceanography, ecology and meteorology, as well as economic and social applications such as environmental monitoring, land registration and even fishing.

In recent decades a large amount of data, basically images, has been collected by a variety of sensors, which has resulted in great heterogeneity in the characteristics of these data. One of the first goals of the scientific community was to extract the maximum amount of information from these data, taking into account the specific properties of each type. One of the most common approaches has been the use of sensors that offer images with high spatial resolution, that is, images with great detail and visual clarity; however, technical restrictions in the construction of these sensors meant that such images could only be obtained in black and white. Conversely, when colour images are sought, the same technical restrictions mean that the spatial resolution of the sensors is lower than that of their panchromatic predecessors. Therefore, many applications seek to combine the properties offered by some sensors with those offered by others, in this case the high spatial resolution of certain sensors with the colour offered by others. This problem is known as data fusion.

Let us suppose that sensor A gives us images in black and white, called panchromatic images, with a high spatial resolution, and that sensor B gives us a colour image, called a multispectral image, but with a lower spatial resolution. What is desired is the ideal image that a fictional sensor would obtain, combining the spatial resolution of sensor A with the colours of sensor B. Many techniques have been proposed over the years to solve this problem: methods based on IHS colour decomposition, and others based on mathematical functions called wavelets. But all these methods used only the information actually present in the data, ignoring the physical characteristics of the sensors that had collected it. Situations therefore arose where details that appeared only in the infrared were added to images captured in visible light. This led to results that presented a certain degree of degradation in the information contained in the image.
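To illustrate what a classical component-substitution fusion of this kind does, here is a minimal sketch in Python. The function name and the approximation of intensity as the band mean are our own illustrative choices, not the authors' implementation; real IHS methods use a proper colour-space transform.

```python
import numpy as np

def ihs_pansharpen(ms, pan):
    """Illustrative component-substitution pan-sharpening (hypothetical
    helper, not the published method).

    ms  : (H, W, 3) low-resolution colour image, already resampled
          to the grid of the panchromatic image
    pan : (H, W) high-resolution panchromatic image
    """
    # Approximate the intensity component as the mean of the bands.
    intensity = ms.mean(axis=2)
    # The spatial detail missing from the colour image is the difference
    # between the panchromatic image and that intensity.
    detail = pan - intensity
    # Inject the same detail into every band (component substitution).
    fused = ms + detail[..., None]
    return np.clip(fused, 0.0, 1.0)
```

Injecting identical detail into every band is precisely what alters colours when the panchromatic sensor sees wavelengths (such as infrared) that a given colour band does not.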

In the current work, the authors define a method of fusing images, which they have called WiSpeR, and which for the first time takes into account the physical characteristics (basically the spectral response) of the sensors that captured the images. Figure 1 shows a colour image captured by a multispectral sensor. It has a low spatial resolution: we can hardly make out the field in the centre of the image. Figure 2 shows a black and white image captured by a different sensor, where we can clearly see this field, as well as many other details, such as the central lines on the road. Figure 3 shows the fusion obtained by a classical method based on the IHS transformation, where we can see the details of Figure 2, but this time in colour. The problem with this image is that many of the colours are altered, as we can clearly see in some of the forest trees, which appear blue. On the other hand, Figure 4 shows the result obtained by the method defined in this work, where practically all the colours of each detail are respected.
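The key idea of using the spectral response can be sketched as follows: weight the detail injected into each band by how much that band overlaps the spectral range of the panchromatic sensor. This is a simplified illustration in the spirit of WiSpeR; the weights, function name and mean-based detail extraction here are purely illustrative, whereas the published algorithm is wavelet-based and derives its factors from the actual sensor response curves.

```python
import numpy as np

def spectrally_weighted_fusion(ms, pan, weights):
    """Illustrative spectral-response-weighted detail injection
    (hypothetical helper, not the published WiSpeR implementation).

    ms      : (H, W, B) multispectral image resampled to the pan grid
    pan     : (H, W) panchromatic image
    weights : (B,) per-band factors, e.g. the overlap between each
              band's spectral response and the panchromatic response
    """
    intensity = ms.mean(axis=2)
    detail = pan - intensity  # spatial detail to inject
    # Bands the panchromatic sensor actually "sees" receive more detail;
    # bands outside its spectral range receive little or none, which
    # avoids contaminating them with information they never recorded.
    fused = ms + detail[..., None] * np.asarray(weights)
    return np.clip(fused, 0.0, 1.0)
```

With a weight of zero, a band is left untouched, which is how this scheme avoids the colour degradation of methods that inject the same detail everywhere.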

Figure 1. Low-resolution colour image captured by a first sensor.

Figure 2. High-resolution black and white image captured by a second sensor.

Figure 3. Fused image obtained by a classical method based on IHS decomposition.

Figure 4. Fused image obtained by the WiSpeR method defined by the authors in this work.

Xavier Otazu

Universitat Autònoma de Barcelona

References

Article: Otazu, X.; González-Audícana, M.; Fors, O.; Núñez, J., "Introduction of sensor spectral response into image fusion methods. Application to wavelet-based methods", IEEE Transactions on Geoscience and Remote Sensing, 43(10): 2376-2385, October 2005.
