Experimental results of a variable resolution image registration algorithm based on NCC
1. Introduction
In many countries around the world, about 70% of water resources are used for agricultural irrigation each year. Precisely controlled irrigation can significantly reduce the waste of irrigation water while increasing plant productivity. Automated sensing of plant water status via non-destructive techniques plays a central role in the development of such irrigation control systems. Plant canopy temperature is a good indicator of plant water status: when plants experience water stress, their temperature increases. A novel approach to irrigation scheduling, and thus to potential water savings, is therefore to monitor plant canopy temperature and relate it to the plant water status. Recent research in agriculture indicates that plant water status can be monitored if the canopy temperature distribution of the plant is known (Jones, 1999a,b; Jones and Leinonen, 2003; Guilioni et al., 2008; Grant et al., 2007; Wheaton et al., 2007). Plant water status information can be obtained via the computation of the crop water stress index (CWSI) (Jones, 1999a). This index offers great potential for an automated irrigation control system in which the plant canopy temperature distribution is acquired via thermal imaging. Such a system is expected to optimize irrigation water usage and to maintain plant health in real time, thus increasing the productivity of limited water resources.
Typically, the measurement data of an infrared (IR) thermography sensing system consist of a reference optical image and an IR image. The optical image is obtained with a normal digital camera at the same location as the IR image and provides a true view of the IR image scene, from which the area of interest (e.g., plant leaves rather than ground or sky) can be identified. To quantify plant water stress, the CWSI is calculated from the canopy temperature and the temperatures of dry and wet reference surfaces. These temperatures can be estimated once the temperature distribution of the canopy leaf area is obtained. However, the exact canopy temperature cannot be acquired from the optical image or the IR image alone: the first step is to automatically register the IR image with the optical image, then extract the temperatures of the regions of interest, and finally estimate the canopy temperature. Image registration between the optical and IR images is therefore vitally necessary, and for automatic determination of plant canopy temperature, the detection of the overlap area between the pair of IR and optical images plays a central role in an automatically controlled irrigation program. Preliminary results from recent work (Wang et al., 2010a,b) indicate that the accuracy of the canopy temperature distribution estimation, used for the evaluation of plant water stress, strongly depends on the accuracy of the optical and IR image registration.
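Once the canopy temperature and the wet and dry reference temperatures are available, the CWSI computation itself is simple. A minimal sketch, using the conventional normalized-temperature definition of CWSI (the function name is ours, not from the cited papers):

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop water stress index: 0 when the canopy is as cool as the wet
    reference (fully watered), 1 when it is as warm as the dry reference
    (fully stressed)."""
    if t_dry == t_wet:
        raise ValueError("wet and dry reference temperatures must differ")
    return (t_canopy - t_wet) / (t_dry - t_wet)
```

For example, a canopy at 30 °C between references of 25 °C (wet) and 35 °C (dry) yields a CWSI of 0.5, i.e. moderate stress.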
Algorithms for image registration are numerous and widely available, for example cross correlation (Tsai & Lin, 2003), mutual information (Maes et al., 1997; Klein et al., 2007), correlation ratio (Roche et al., 1998) and SIFT-based methods (Chen & Tian, 2009). There are also fast algorithms, such as an automatic registration approach based on point features (Yu et al., 2008), an automatic image registration algorithm with sub-pixel accuracy (Althof et al., 1997) and registration using importance sampling (Bhagalia et al., 2009). Image registration has been applied across a range of domains, for example medical image registration to assist clinical diagnosis, and remote sensing image fusion for multi-band images.
Although many effective image registration algorithms exist, further algorithm refinement is required in order to align images from different sources and formats. The major difficulties that arise when intensity images are used for registration are listed below:
- Images are taken by different sensors, possibly at different view angles and different times.
- In the overlap area the intensities of the two images can be quite different; therefore, approaches based on image intensity are unlikely to achieve satisfactory registration.
- Apart from some similarity in overall structure, it is difficult to identify consistent feature points in both images in popular feature spaces via an automatic registration method, such as the scale invariant feature transform (SIFT).
Image registration algorithms fundamentally fall into two main categories: area-based methods and feature-based methods. The SIFT method is perhaps the most representative of the feature-based automatic methods. The SIFT implementation finds distinctive points that are invariant to location, scale and rotation, and robust to affine transformations (changes in scale, rotation, shear and position) and to changes in illumination, for images from the same source or the same type of sensor. When this is the case, the algorithm is particularly effective. For our application, however, the success rate of SIFT is less than 10%. Fig. 1 demonstrates that SIFT-based methods cannot find enough matching key points, so the registration process is not effective. The main reason is that the objects of interest are not rigid in this kind of application.
Solutions based on area correlation techniques seem more applicable to the problem presented in Fig. 1. Exceptions, however, are those that use intensity (or colour) dependent functions as similarity measures, such as Fourier-transform-type and mutual-information-type approaches.
Instead of using the cross correlation coefficient, Huttenlocher et al. (1993) used the Hausdorff distance as a similarity measure to register binary images that are the output of an edge detector. To deal with the problem of multimodality medical image registration, Maes et al. (1997) proposed a method that applies mutual information (MI) to measure the statistical dependence, or information redundancy, between the image intensities of corresponding pixels in both images. Using the correlation ratio (CR) as the similarity measure, Roche et al. (1998) proposed a CR method that assumes the pixel-pair intensities of two registered images are functionally dependent. These area-based methods were summarized in Lau et al. (2001).
A variety of image registration techniques and algorithms were tested with regard to their fast realization (Yang et al., 2009). Although SIFT-based methods work well in many applications, they are usually applied to rigid objects and to image pairs from the same source. Here the main objects in the images are leaves, which move over time, so even an image pair of the same scene can exhibit different imaging characteristics.
Yang et al. (2009) suggested that, because of the existing differences between optical and infrared images, an algorithm should first extract the edges of the images and then apply an image registration method to the edge images. Because the standard normalized cross correlation (NCC) method is slow, a variable resolution method based on NCC may be the best choice. Experiments show that the variable resolution NCC method not only gives equally good registration results but also runs fast, taking one fortieth of the time of the standard cross correlation method (Yang et al., 2009). Among several image registration methods, such as NMI (normalized mutual information), CR (correlation ratio) and NCC, the variable resolution NCC has the best overall performance.
The chapter is organized as follows: Section 2 describes the algorithm and gives its flow diagram; Section 3 presents the implementation of the algorithm; the experimental results are given in Section 4, followed by the discussion in Section 5 and the conclusion in Section 6.
2. Variable resolution algorithm based on normalized cross correlation
By monitoring the plant canopy temperature and the temperatures of wet and dry leaves, it is possible to estimate the underlying plant water stress status and therefore to intelligently control the related irrigation process. Fig. 2 illustrates a typical plant irrigation strategy, where the acquisition of plant water status information (inside the dotted box) plays a critical role in the optimization of plant productivity and water usage. As a key link in the nondestructive plant water status monitoring process, the image registration method plays an important role. Via registration of the optical and IR image pair, the plant canopy temperature can be acquired. Then, using the expected maximum value method, the wet and dry leaf temperatures can be determined via fusion of the optical and IR images (Wang et al., 2010a). Colour plays an important role in identifying the canopy and the wet and dry leaves, as green sunlit leaves transpire most of the plant's water. Thus, the green leaf areas are correlated with their corresponding IR image data.
To register an IR image to a reference optical image, its location and rotation angle must be obtained. When the IR image is wholly overlapped by the reference optical image, it can be assumed that the spatial resolution of the two images is the same. A flow diagram of the variable resolution algorithm based on normalized cross correlation is shown in Fig. 3.
Because the algorithm needs to calculate the correlation coefficient pixel by pixel, the normalized cross correlation (NCC) method is computationally expensive. To address the slow running speed, several methods can be considered, for example the Sequential Similarity Detection Algorithm (SSDA), the Bit Plane Correlation Algorithm (BPCA) and the Variable Resolution Correlation Algorithm (VRCA). Here, the variable resolution correlation algorithm is used for acceleration. The key steps of the variable resolution algorithm are: a) reduce the image resolution, b) roughly register the lower resolution images, and c) register the images more accurately around the candidate positions in the full resolution images.
Suppose ImO and ImIR stand for the optical and IR images respectively, with the IR image of size M×N; f is the image zoom factor; ImOl and ImIRl stand for the corresponding lower resolution images (the IR one of size Ml×Nl, where Ml = M/f and Nl = N/f); and ImOe, ImIRe, ImOle, ImIRle are the edge images of the corresponding images. The normalized cross correlation can then be expressed as

R(u, v) = Σm Σn [ImOe(u+m, v+n) − μOe(u, v)]·[ImIRe(m, n) − μIRe] / √( Σm Σn [ImOe(u+m, v+n) − μOe(u, v)]² · Σm Σn [ImIRe(m, n) − μIRe]² ),   (1)

where

μOe(u, v) = (1/(M·N)) Σm Σn ImOe(u+m, v+n)   (2)

and

μIRe = (1/(M·N)) Σm Σn ImIRe(m, n)   (3)

are the mean of the M×N sub-image of ImOe at offset (u, v) and the mean of ImIRe; all sums run over m = 1, …, M and n = 1, …, N.
In our case, f is set to 2, and the search strategy in the lower resolution layer is to perform one rough registration step for every 2×2 pixels. The search strategy in the original resolution layer is to perform a pixel-by-pixel registration in the neighbourhood of the rough registration position, spanning 4 pixels in each direction.
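The two-stage search described above can be sketched in code, assuming pure translation, equal spatial resolution and precomputed edge images (all function and variable names below are illustrative, not from the chapter):

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross correlation between two equally sized arrays (Eq. 1)."""
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return 0.0 if denom == 0 else float((a * b).sum() / denom)

def variable_resolution_register(opt_edge, ir_edge, f=2, refine=4):
    """Coarse-to-fine translation search: rough NCC on downsampled edge
    images, then pixel-by-pixel refinement around the scaled coarse match."""
    # Stage 1: coarse search on f-times downsampled edge images.
    opt_lo, ir_lo = opt_edge[::f, ::f], ir_edge[::f, ::f]
    h, w = ir_lo.shape
    best, best_uv = -2.0, (0, 0)
    for u in range(opt_lo.shape[0] - h + 1):
        for v in range(opt_lo.shape[1] - w + 1):
            r = ncc(opt_lo[u:u + h, v:v + w], ir_lo)
            if r > best:
                best, best_uv = r, (u, v)
    # Stage 2: refine in full resolution within +/- `refine` pixels.
    H, W = ir_edge.shape
    u0, v0 = best_uv[0] * f, best_uv[1] * f
    best, best_uv = -2.0, (u0, v0)
    for u in range(max(0, u0 - refine), min(opt_edge.shape[0] - H, u0 + refine) + 1):
        for v in range(max(0, v0 - refine), min(opt_edge.shape[1] - W, v0 + refine) + 1):
            r = ncc(opt_edge[u:u + H, v:v + W], ir_edge)
            if r > best:
                best, best_uv = r, (u, v)
    return best_uv, best
```

The coarse loop visits only one position per f×f block, so with f = 2 it evaluates roughly a quarter as many positions on images a quarter the size, which is where the speed-up comes from.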
3. Implementation of the variable resolution algorithm
The first stage of the implementation is edge extraction, which plays an important role in registration between different image sources. There are many methods for edge detection and extraction, such as the Prewitt, Sobel and Roberts operators and the Canny algorithm. As mentioned above, the infrared image differs from the reference optical image (see Fig. 4; the resolution of the optical image is 2848×2136 pixels, that of the IR image 320×240 pixels), having a lower resolution and less detail; the edge processing therefore needs to preserve the edge information, which is the common feature of infrared and optical images of the same scene. Here the edge operation is based on modified Sobel operators (four operators rather than the usual two horizontal and vertical ones), which can be described as follows:
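As an illustration, a four-direction Sobel edge extraction can be sketched as below. The two diagonal kernels are a common choice for four-direction Sobel filtering and are our assumption; the chapter's exact coefficients may differ:

```python
import numpy as np

# Four 3x3 Sobel-style kernels: horizontal, vertical and two diagonals.
KERNELS = [
    np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]]),   # horizontal edges
    np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]),   # vertical edges
    np.array([[0, 1, 2], [-1, 0, 1], [-2, -1, 0]]),   # 45-degree edges
    np.array([[-2, -1, 0], [-1, 0, 1], [0, 1, 2]]),   # 135-degree edges
]

def filter2_same(img, k):
    """3x3 'same' filtering (cross-correlation) with zero padding."""
    p = np.pad(img, 1)
    out = np.zeros_like(img, dtype=float)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def sobel4_edges(img):
    """Edge magnitude taken as the maximum response over the four directions."""
    responses = [np.abs(filter2_same(img.astype(float), k)) for k in KERNELS]
    return np.maximum.reduce(responses)
```

Taking the maximum over directions makes the edge map respond to edges of any orientation, which matters because leaf contours in the canopy have no preferred direction.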
Convolving the raw image Im with each of these four operators yields the directional edge responses

Gk = Im ∗ Sk,  k = 1, 2, 3, 4,   (4)

where Sk denotes the k-th operator and ∗ denotes two-dimensional convolution. The edge image Ime is then formed from the magnitudes of the four responses, e.g. Ime(x, y) = maxk |Gk(x, y)|.
After edge detection, the edge images ImOe, ImIRe, ImOle and ImIRle are obtained, and the low resolution pair is processed with the NCC algorithm for the rough registration. To allow for possible disturbances, instead of keeping only the single best match, the K positions with the highest correlation coefficients in the low resolution correlation matrix Rl(u, v) are kept as candidate points,

{(ui, vi), i = 1, …, K},  Rl(u1, v1) ≥ Rl(u2, v2) ≥ … ≥ Rl(uK, vK),   (5)

where Rl(u, v) is computed between ImOle and ImIRle via (1). The candidate points are then mapped back to the full resolution images as (f·ui, f·vi) and refined by a pixel-by-pixel NCC search over the neighbouring positions; the position with the maximum full resolution correlation coefficient is taken as the registration location.
The final step is the estimation of the rotation angle. In our case, the permitted rotation angle θ ranges from −10° to 10°. The range is divided into 200 small angles; for each angle the correlation coefficient is calculated, and the angle θ* with the maximum correlation value is selected.
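This exhaustive angle scan can be sketched as follows. A simplified nearest-neighbour rotation is used purely for illustration, and all helper names are ours, not from the chapter:

```python
import numpy as np

def rotate_nn(img, theta_deg):
    """Nearest-neighbour rotation about the image centre (illustrative helper)."""
    t = np.deg2rad(theta_deg)
    c, s = np.cos(t), np.sin(t)
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    # Inverse mapping: sample the source at the back-rotated coordinates.
    sy = np.rint(c * (ys - cy) + s * (xs - cx) + cy).astype(int)
    sx = np.rint(-s * (ys - cy) + c * (xs - cx) + cx).astype(int)
    valid = (sy >= 0) & (sy < h) & (sx >= 0) & (sx < w)
    out = np.zeros_like(img, dtype=float)
    out[valid] = img[sy[valid], sx[valid]]
    return out

def ncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    d = np.sqrt((a * a).sum() * (b * b).sum())
    return 0.0 if d == 0 else float((a * b).sum() / d)

def best_rotation(opt_patch, ir_img, lo=-10.0, hi=10.0, steps=200):
    """Scan [lo, hi] degrees in `steps` increments; keep the max-NCC angle."""
    angles = np.linspace(lo, hi, steps + 1)
    scores = [ncc(opt_patch, rotate_nn(ir_img, a)) for a in angles]
    k = int(np.argmax(scores))
    return float(angles[k]), float(scores[k])
```

With 200 increments over 20 degrees, the angular resolution is 0.1°, which bounds the residual rotation error of the registration.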
4. Experimental results
The above algorithm was tested on many image pairs. As shown in Fig. 5, all the results are successful within tolerable errors. To compare running times, the experiment was performed with both the variable resolution algorithm and the standard NCC algorithm under identical conditions; the results are shown in Table 1. The results illustrate that the variable resolution algorithm achieves the same acceptable registration quality as the standard NCC but at much higher speed, requiring less than one fortieth of the running time of the standard NCC algorithm. Because the true registration positions are not known, the results are compared against those of manual registration. To further verify the variable resolution method, the NMI and CR algorithms were tested concurrently. The results indicate that the NMI method gives almost the same consistent results as the proposed algorithm, but the results of the CR method are poor. Under the variable resolution scheme, both the NMI and CR methods require a higher computational load; for example, the running time of the NMI algorithm is several times longer than that of the fast NCC one. If the computational load is disregarded, the NMI method may be an alternative method for this application.
Table 1. Comparison of the standard NCC and variable resolution algorithms.

| Method | Number of image pairs | Success number | Running time per pair (average) | Registration error (max, min, average) |
| Standard NCC | 20 | 20 | 610 s | (6, 0, 2.6) |
| Variable resolution algorithm | 20 | 20 | 15 s | (6, 0, 2.6) |
In the NMI method the correlation matrix RNMI(u, v) is calculated by

RNMI(u, v) = (Huv + HIR) / Huv,IR,   (6)

with the marginal entropies Huv = −Σi Puv(i) log Puv(i) and HIR = −Σj PIR(j) log PIR(j), and the joint entropy Huv,IR = −Σi Σj POIRuv(i, j) log POIRuv(i, j),
where POIRuv(i,j) is the joint probability that the intensities of Imuv and ImIR are at levels i and j respectively. Puv and PIR are the marginal probability of the images Imuv and ImIR. Imuv is the (u,v)th sub-image of optical image ImO with the same size as the infrared image ImIR. These probabilities can be computed from the normalized joint and marginal intensity histograms.
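A compact way to compute such an NMI score for one image pair, using histogram-based probability estimates, is sketched below (the bin count is an illustrative assumption):

```python
import numpy as np

def nmi(a, b, bins=32):
    """Normalized mutual information (H(A) + H(B)) / H(A, B), with the
    joint and marginal probabilities estimated from a 2-D histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = joint / joint.sum()          # joint probability table
    px = p.sum(axis=1)               # marginal of image a
    py = p.sum(axis=0)               # marginal of image b

    def entropy(q):
        q = q[q > 0]                 # 0 * log 0 := 0
        return -(q * np.log(q)).sum()

    return (entropy(px) + entropy(py)) / entropy(p.ravel())
```

The score reaches 2 for identical images (joint entropy equals the marginal entropy) and approaches 1 for statistically independent images, which is why it can serve as a registration similarity measure.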
In the CR method the correlation matrix RCR(u, v) is calculated by

RCR(u, v) = 1 − (1/σ²) Σi Puv(i) σi²,   (7)

where

σi² = Σj j² POIRuv(i, j)/Puv(i) − mi²,  mi = Σj j POIRuv(i, j)/Puv(i),   (8)

and

σ² = Σj j² PIR(j) − m²,  m = Σj j PIR(j).   (9)
All summations in (8) and (9) are taken over image intensity space.
Table 2. Comparison of the variable resolution, NMI and CR algorithms.

| Method | Number of image pairs | Success number | Registration error (max, min, average) |
| Variable resolution algorithm | 10 | 10 | (6, 0, 2.6) |
| NMI algorithm | 10 | 8 | (9, 0, 2.7) |
| CR algorithm | 10 | 0 | Null |
All the tests were done under the condition f = 2, using the Sobel operator to extract the image edges. Comparative experiments with the NMI (normalized mutual information) and CR (correlation ratio) algorithms have also been done (Fig. 6, Table 2). From these figures and tables, we can see that the variable resolution algorithm based on normalized cross correlation is the best choice. Fig. 7 and Table 3 show that the variable resolution algorithm based on Sobel edge operators is slightly inferior to the Canny-edge-based algorithm, but it is a good compromise, with easier implementation and higher running efficiency.
Table 3. Registration results with different edge extraction methods.

| Edge extraction type | Number of successful registrations | Registration location error (maximum, minimum, average) |
| Sobel-based edge | 10 | (6, 0, 2.8) |
| Canny edge extraction | 10 | (6, 0, 2.6) |
| Roberts-based edge | 10 | (9, 0, 3.2) |
5. Discussion
Our experimental results suggest that the NMI and CR algorithms are unsuitable for registration of images from different sources, because these two methods depend on the intensity distributions of the images, which differ between the input images, while the proposed algorithm relies on the edge features, which are common to both images and independent of the image intensities. Image registration methods based on SIFT are also unsuitable for our application because common key points are rare; the main reason is that the key points depend largely on the image intensities.
With regard to computational efficiency, although the images are only downsampled by a factor of 2, the computational load is significantly reduced. The variable resolution algorithm runs about 40 times faster than the standard pixel-by-pixel cross correlation method without degradation of the registration performance.
As demonstrated by the experimental results, the fast NCC implementation achieves the same registration accuracy as the standard NCC implementation. Once an image pair is registered, the plant water status information can be estimated via the method in Wang et al. (2010a), where it is shown that the accuracy of the IR and optical image registration can have a significant influence on the canopy temperature estimation. Fig. 8 illustrates the experimental results presented in Wang et al. (2010a), which verify that the trend of the computed CWSI is consistent with an alternative plant water stress indicator, the SWP data.
6. Conclusion
This chapter presented a fast algorithm for automatic registration of optical and infrared images. It reduces the computation considerably by using a variable resolution scheme without sacrificing registration performance, and it can also be used in other applications. As mentioned above, image registration is only the first stage in the process of estimating the canopy temperature. We are developing methods to obtain more accurate registration and to construct an algorithm that allows for more variation in the background images.
It is our aim to continue research in this field towards a decision support system to deliver real time automated irrigation control based on CWSI where the ‘plant is the sensor’.
References
- 1. Jones, H. G. (1999a). Use of thermography for quantitative studies of spatial and temporal variation of stomatal conductance over leaf surfaces. Plant, Cell and Environment, 22(9), 1043-1055.
- 2. Jones, H. G. (1999b). Use of infrared thermometry for estimation of stomatal conductance as a possible aid to irrigation scheduling. Agricultural and Forest Meteorology, 95(3), 139-149.
- 3. Jones, H. G. & Leinonen, I. (2003). Thermal imaging for the study of plant water relations. Journal of Agricultural Meteorology, 59(3), 205-217.
- 4. Guilioni, L., Jones, H. G., Leinonen, I. & Lhomme, J. P. (2008). On the relationships between stomatal resistance and leaf temperatures in thermography. Agricultural and Forest Meteorology, 148(11), 1908-1912.
- 5. Grant, O., Tronina, L., Jones, H. G. & Chaves, M. M. (2007). Exploring thermal imaging variables for the detection of stress responses in grapevine under different irrigation regimes. Journal of Experimental Botany, 58(4), 815-825.
- 6. Wheaton, A. D., Cooley, N., Dunn, G., Goodwin, I. & Needs, S. (2007). Evaluation of infrared thermography to determine the crop water status of Cabernet Sauvignon grapevines. Proceedings of the 13th Australian Wine Industry Technical Conference, Adelaide, 28 July-2 August 2007.
- 7. Wang, X. Z., Yang, W. P., Wheaton, A. D., Cooley, N. & Moran, B. (2010a). Automated canopy temperature estimation via infrared thermography: A first step towards automated plant water stress monitoring. Computers and Electronics in Agriculture, 73(1), 74-83.
- 8. Wang, X. Z., Yang, W. P., Wheaton, A. D., Cooley, N. & Moran, B. (2010b). Efficient registration of optical and IR images for automatic plant water stress assessment. Computers and Electronics in Agriculture, 74(2), 230-237.
- 9. Tsai, D. M. & Lin, C. T. (2003). Fast normalized cross correlation for defect detection. Pattern Recognition Letters, 24(15), 2625-2631.
- 10. Roche, A., Malandain, G. et al. (1998). The correlation ratio as a new similarity measure for multimodal image registration. Medical Image Computing and Computer-Assisted Intervention (MICCAI'98), LNCS 1496, 1115-1124, Boston, USA, October 1998.
- 11. Klein, S., Staring, M. & Pluim, J. P. W. (2007). Evaluation of optimization methods for nonrigid medical image registration using mutual information and B-splines. IEEE Transactions on Image Processing, 16(12), 2879-2890.
- 12. Chen, J. & Tian, J. (2009). Real-time multi-modal rigid registration based on a novel symmetric-SIFT descriptor. Progress in Natural Science, 19(5), 643-651.
- 13. Yang, W. P., Wang, X. Z. et al. (2009). Automatic optical and IR image fusion for plant water stress analysis. Proceedings of the 12th International Conference on Information Fusion, 1053-1059, Seattle, 6-9 July 2009.
- 14. Yu, L., Zhang, D. R. & Holden, E. J. (2008). A fast and fully automatic registration approach based on point features for multi-source remote-sensing images. Computers & Geosciences, 34(7), 838-848.
- 15. Althof, R. J., Wind, M. G. J. & Dobbins, J. T. (1997). A rapid and automatic image registration algorithm with subpixel accuracy. IEEE Transactions on Medical Imaging, 16(3), 308-316.
- 16. Bhagalia, R., Fessler, J. A. & Kim, B. (2009). Accelerated nonrigid intensity-based image registration using importance sampling. IEEE Transactions on Medical Imaging, 28(8), 1208-1216.
- 17. Huttenlocher, D. P., Klanderman, G. A. & Rucklidge, W. J. (1993). Comparing images using the Hausdorff distance. IEEE Transactions on Pattern Analysis and Machine Intelligence, 15(9), 850-863.
- 18. Maes, F., Collignon, A., Vandermeulen, D. et al. (1997). Multimodality image registration by maximisation of mutual information. IEEE Transactions on Medical Imaging, 16(2), 187-198.
- 19. Lau, Y. H., Braun, M. & Hutton, B. F. (2001). Non-rigid image registration using a median-filtered coarse-to-fine displacement field and a symmetric correlation ratio. Physics in Medicine and Biology, 46(4), 1297-1319.