Open access

Automatic Optical and Infrared Image Registration for Plant Water Stress Sensing

Written By

Weiping Yang, Zhilong Zhang, Xuezhi Wang, Bill Moran, Ashley Wheaton and Nicola Cooley

Submitted: 15 October 2010 Published: 24 June 2011

DOI: 10.5772/16813

From the Edited Volume

Image Fusion and Its Applications

Edited by Yufeng Zheng


1. Introduction

In many countries around the world, roughly 70% of water consumption each year goes to agricultural irrigation. Precise irrigation control can significantly reduce the waste of irrigation water while increasing plant productivity. Automated sensing of plant water status via non-destructive techniques plays a central role in the development of such irrigation control systems. Plant canopy temperature is a good indicator of plant water status: when plants experience water stress, their temperature increases. A novel approach to irrigation scheduling, and thus to potential water savings, is therefore to monitor plant canopy temperature and relate it to the plant's water status. Recent research in agriculture indicates that plant water status may be monitored if the canopy temperature distribution of the plant is known (Jones, 1999a,b; Jones and Leinonen, 2003; Guilioni et al., 2008; Grant et al., 2007; Wheaton et al., 2007). Plant water status information can be obtained via the computation of the crop water stress index (CWSI) (Jones, 1999a). This index offers great potential for an automated irrigation control system in which the plant canopy temperature distribution is acquired via thermal imaging. Such a system is expected to optimize irrigation water usage and to maintain plant health in real time, thus increasing the productivity of limited water resources.
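Given the canopy temperature and the wet and dry reference temperatures, the CWSI itself is a simple ratio. A minimal sketch (the function name and the example temperatures are illustrative, not from the chapter):

```python
def crop_water_stress_index(t_canopy, t_wet, t_dry):
    """CWSI after Jones (1999a): 0 for a fully transpiring (unstressed)
    canopy, 1 for a non-transpiring (fully stressed) canopy."""
    if t_dry <= t_wet:
        raise ValueError("dry reference must be warmer than wet reference")
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Example: a canopy at 28 degC, between wet (24 degC) and dry (32 degC)
# reference surfaces, sits halfway along the stress scale.
print(crop_water_stress_index(28.0, 24.0, 32.0))  # 0.5
```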

Typically, the measurement data of an infrared (IR) thermography sensing system consist of a reference optical image and an IR image. The optical image is obtained with an ordinary digital camera at the same location as the IR image and provides a true view of the IR image scene, from which the area of interest (e.g., plant leaves rather than ground or sky) can be identified. The optical image thus allows the underlying plant canopy of interest to be reliably identified. To quantify plant water stress, the CWSI is calculated from the canopy temperature together with the temperatures of dry and wet reference surfaces; these temperatures can be estimated once the temperature distribution of the canopy leaf area is obtained. Clearly, for automatic determination of plant canopy temperature, detection of the overlap area between the IR and optical image pair plays a central role in an automatically controlled irrigation program. Preliminary results from recent work (Wang et al., 2010a,b) indicate that the accuracy of canopy temperature distribution estimation, used to evaluate plant water stress, depends strongly on the accuracy of optical and IR image registration. Obtaining the canopy temperature with a non-destructive imaging technique is challenging: the first step is to automatically register the infrared image with the optical image, the second is to extract the temperature of the regions of interest, and only then can the canopy temperature be estimated. The exact canopy temperature cannot be acquired from the optical image or the IR image alone; registration between the optical and IR images is therefore essential.

Algorithms for image registration are numerous and widely available, for example cross correlation (Tsai & Lin, 2003), mutual information (Maes et al., 1997; Klein et al., 2007), correlation ratio (Roche et al., 1998), and SIFT-based methods (Chen & Tian, 2009). There are also fast algorithms, such as an automatic registration approach based on point features (Yu et al., 2008), an automatic image registration algorithm with sub-pixel accuracy (Althof et al., 1997), and registration accelerated by importance sampling (Bhagalia et al., 2009). Image registration has been applied across a range of domains, for example medical image registration to assist clinical diagnosis, and remote sensing image fusion for multi-band images.

Although many effective image registration algorithms exist, further algorithm refinement is required to align images from different sources and formats. The major difficulties that arise when such intensity images are used for registration are listed below:

  • Images are taken by different sensors and possibly at different view angles and different times.

  • In the overlap area the intensities of the two images can be quite different. Therefore, approaches that rely on image intensity are unlikely to achieve satisfactory registration.

  • Apart from some similarity in overall structure, it is difficult for an automatic registration method to identify consistent feature points in both images in popular feature spaces, such as that of the scale invariant feature transform (SIFT).

Image registration algorithms fall fundamentally into two main categories, i.e., area-based methods and feature-based methods. The SIFT method is perhaps the most representative feature-based automatic approach. The SIFT implementation finds distinctive points that are invariant to location, scale and rotation, and robust to affine transformations (changes in scale, rotation, shear, and position) and to changes in illumination, for images from the same source or the same type of sensor. When this is the case, the algorithm is particularly effective. For our application, however, the success rate of SIFT is less than 10%. Fig 1 shows that the SIFT-based methods cannot find enough matching key points, so the registration process is not effective. The main reason is that the objects of interest are not rigid in this kind of registration application.

Solutions based on area correlation techniques seem more applicable to the problem presented in Fig 1. Exceptions, however, are those which use intensity (or colour) dependent functions as similarity measures, such as Fourier transform and mutual information type approaches.

Figure 1.

Results of the SIFT-based registration algorithm (no matching key points): (a) 5906 key points found in the optical image, (b) 447 key points found in the IR image, (c) 5666 key points found in the optical image, (d) 605 key points found in the IR image

Instead of a cross correlation coefficient, Huttenlocher et al. (1993) used the Hausdorff distance as a similarity measure to register binary images produced by an edge detector. To deal with the problem of multimodality medical image registration, Maes et al. (1997) proposed a method which applies mutual information (MI) to measure the statistical dependence, or information redundancy, between the image intensities of corresponding pixels in the two images. Using the correlation ratio (CR) as the similarity measure, Roche et al. (1998) proposed a method which assumes that the pixel-pair intensities of two registered images are functionally dependent. These area-based methods are summarized in Lau et al. (2001).

A variety of image registration techniques and algorithms were tested with regard to their fast realization (Yang et al., 2009). Although SIFT-based methods perform well in many applications, they are usually applied to rigid objects and to image pairs from the same source. Here the main objects in the images are leaves, which move over time, so even images of the same scene (IR and optical) exhibit different imaging characteristics.

Yang et al. (2009) suggested that, because of the differences between optical and infrared images, an algorithm should first extract edges from the images and then apply a registration method to the edge images. Because the standard normalized cross correlation (NCC) method is slow, a variable resolution method based on normalized cross correlation may be the best choice. Experiments show that the variable resolution normalized cross correlation method achieves equally good registration results at much higher speed, taking about one fortieth of the time of the standard cross correlation method (Yang et al., 2009). Among several image registration methods, including NMI (normalized mutual information), CR (correlation ratio) and NCC, variable resolution normalized cross correlation has the best overall performance.

The chapter is organized as follows: Section 2 describes the algorithm and gives its flow diagram; Section 3 covers the implementation of the algorithm; the experimental results are presented in Section 4, followed by the discussion in Section 5 and the conclusion in Section 6.


2. Variable resolution algorithm based on normalized cross correlation

By monitoring plant canopy temperature and the temperatures of wet and dry reference leaves, it is possible to estimate the underlying plant water stress status and, therefore, to intelligently control the related irrigation process. Fig. 2 illustrates a typical plant irrigation strategy, where the plant water status information acquired (inside the dotted box) plays a critical role in the optimization of plant productivity and water usage. As a key link in non-destructive plant water status monitoring, the image registration method plays an important role: via registration of the optical and IR image pair, the plant canopy temperature can be acquired. The wet and dry leaf temperatures can then be determined, using the expected maximum value method, via fusion of the optical and IR images (Wang et al., 2010a). The colour of the canopy and of the wet and dry leaves plays an important role, as green sunlit leaves transpire most of the plant's water; thus, the green leaf areas are correlated with their corresponding IR image data.

Figure 2.

Illustration of irrigation actuation via a plant based sensor

To register an IR image to a reference optical image, its location and rotation angle must be determined. When the IR image is wholly contained within the reference optical image, it can be assumed that the spatial resolution of the two images is the same. A flow diagram of the variable resolution algorithm based on normalized cross correlation is shown in Fig 3.

Because the algorithm needs to calculate the correlation coefficient pixel by pixel, the normalized cross correlation (NCC) method is computationally expensive. To address the slow running speed, several methods can be considered, for example the Sequential Similarity Detection Algorithm (SSDA), the Bit Plane Correlation Algorithm (BPCA), and the Variable Resolution Correlation Algorithm (VRCA). Here, the variable resolution correlation algorithm is used for acceleration. The key steps of the variable resolution algorithm are: a) reduce the image resolution, b) roughly register the lower resolution images, and c) register the images more accurately around the candidate positions in the full resolution images.

Suppose ImO and ImIR (of size M×N) denote the optical and IR images respectively, f is the image zoom factor, and ImOl and ImIRl (of size Mh×Nh) denote the corresponding lower resolution images; accordingly, ImOe, ImIRe, ImOle and ImIRle are the edge images of the corresponding images. The normalized cross correlation can then be expressed as:

Figure 3.

Flow diagram of variable resolution algorithm based on NCC

$$\rho_l(u,v)=\frac{\sum_{i=0}^{M_h}\sum_{j=0}^{N_h} Im_{Ole}^{u,v}(i,j)\,Im_{IRle}(i,j)}{\sigma_{Ol}^{u,v}\,\sigma_{IRl}}\tag{1}$$

$$\rho(k,l)=\frac{\sum_{i=0}^{M}\sum_{j=0}^{N} Im_{Oe}^{k,l}(i,j)\,Im_{IRe}(i,j)}{\sigma_{O}^{k,l}\,\sigma_{IR}}\tag{2}$$

where ρ_l(u,v) is the cross correlation coefficient calculated from the lower resolution edge image pair Im_Ole and Im_IRle; (u,v) is the coordinate index into the optical image Im_Ole; Im_Ole^{u,v} is the sub-image of Im_Ole located at (u,v), with the same size as Im_IRle; and σ_Ol^{u,v} and σ_IRl are the standard deviations of the corresponding images. Similarly, ρ(k,l) is the cross correlation coefficient calculated from the full resolution edge image pair Im_Oe and Im_IRe; (k,l) is the coordinate index into the optical image Im_Oe; Im_Oe^{k,l} is the sub-image of Im_Oe located at (k,l), with the same size as Im_IRe; and σ_O^{k,l} and σ_IR are the standard deviations of the corresponding images.

In our case, f is set to 2, and the search strategy in the lower resolution layer is to perform one rough registration evaluation for every 2×2 block of pixels. The search strategy in the original resolution layer is to perform pixel-by-pixel registration in the neighbourhood of the rough registration position, spanning 4 pixels in each direction.
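The coarse-to-fine search just described can be sketched as follows. This is a minimal NumPy illustration, not the authors' code: the array names are made up, downsampling is done by simple subsampling, the loops are left unvectorized for clarity, and in the real system the search runs on edge images rather than raw intensities.

```python
import numpy as np

def ncc(window, template):
    """Normalized cross correlation between two equal-sized arrays."""
    w = window - window.mean()
    t = template - template.mean()
    denom = np.sqrt((w * w).sum() * (t * t).sum())
    return float((w * t).sum() / denom) if denom > 0 else 0.0

def coarse_to_fine_register(ref, tpl, f=2, radius=4):
    """Coarse-to-fine NCC search: (a) downsample both images by f,
    (b) rough search on the low-resolution pair, (c) refine within
    +/- radius pixels at full resolution around the rough position."""
    ref_l, tpl_l = ref[::f, ::f], tpl[::f, ::f]          # (a) reduce resolution
    Mh, Nh = tpl_l.shape
    best, (u0, v0) = -2.0, (0, 0)
    for u in range(ref_l.shape[0] - Mh + 1):             # (b) rough search
        for v in range(ref_l.shape[1] - Nh + 1):
            r = ncc(ref_l[u:u + Mh, v:v + Nh], tpl_l)
            if r > best:
                best, (u0, v0) = r, (u, v)
    M, N = tpl.shape
    best, pos = -2.0, (u0 * f, v0 * f)
    k_lo, k_hi = max(0, u0 * f - radius), min(ref.shape[0] - M, u0 * f + radius)
    l_lo, l_hi = max(0, v0 * f - radius), min(ref.shape[1] - N, v0 * f + radius)
    for k in range(k_lo, k_hi + 1):                      # (c) full-resolution refinement
        for l in range(l_lo, l_hi + 1):
            r = ncc(ref[k:k + M, l:l + N], tpl)
            if r > best:
                best, pos = r, (k, l)
    return pos, best
```

The rough pass evaluates one candidate per f×f block, so its cost drops by roughly a factor of f⁴ relative to an exhaustive full-resolution scan, and the refinement only touches a (2·radius+1)² neighbourhood.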


3. Algorithm implementation of the variable resolution algorithm

Figure 4.

One image pair: (a) an optical image, (b) an IR image

The first stage of the algorithm implementation is edge extraction, which plays an important role in registration between different image sources. There are many methods for edge detection and extraction, such as the Prewitt, Sobel and Roberts operators and the Canny algorithm. As mentioned above, the infrared image differs from the reference optical image (see Fig 4; the resolution of the optical image is 2848×2136 pixels, while that of the IR image is 320×240 pixels), having lower resolution and less detail. The edge processing therefore needs to preserve the edge information, which is the feature common to the infrared and optical images of the same scene. Here the edge operation is based on modified Sobel operators (four operators, unlike the usual two horizontal and vertical ones), which can be described as follows:

$$S_1=\begin{bmatrix}-1 & 0 & 1\\ -2 & 0 & 2\\ -1 & 0 & 1\end{bmatrix},\quad S_2=\begin{bmatrix}1 & 2 & 1\\ 0 & 0 & 0\\ -1 & -2 & -1\end{bmatrix},\quad S_3=\begin{bmatrix}2 & 1 & 0\\ 1 & 0 & -1\\ 0 & -1 & -2\end{bmatrix},\quad S_4=\begin{bmatrix}0 & 1 & 2\\ -1 & 0 & 1\\ -2 & -1 & 0\end{bmatrix}\tag{3}$$

Convolving the raw image with these four operators gives

$$E_k(i,j)=\sum_{m=0}^{2}\sum_{n=0}^{2} Im(i+m-1,\,j+n-1)\,S_k(m,n),\quad k=1,2,3,4\tag{4}$$

$$E(i,j)=\max_k E_k(i,j)\tag{5}$$

where E(i,j) is the edge value at the point (i,j) of the image Im.
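The four-direction edge extraction amounts to four 3×3 convolutions followed by a pointwise maximum over the directional responses. A compact sketch (the kernel signs follow the standard Sobel convention, since they are lost in the printed matrices, and the function name is illustrative):

```python
import numpy as np

# The four directional Sobel kernels: vertical, horizontal, and the
# two diagonals (signs per the standard Sobel convention).
S = [
    np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]),
    np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]]),
    np.array([[2, 1, 0], [1, 0, -1], [0, -1, -2]]),
    np.array([[0, 1, 2], [-1, 0, 1], [-2, -1, 0]]),
]

def sobel4_edges(im):
    """E(i,j) = max_k E_k(i,j) over the four directional responses,
    computed on the interior pixels of `im` (output is 2 pixels
    smaller in each dimension, avoiding border handling)."""
    im = im.astype(float)
    H, W = im.shape
    E = np.full((H - 2, W - 2), -np.inf)
    for Sk in S:
        Ek = np.zeros((H - 2, W - 2))
        for m in range(3):
            for n in range(3):
                # Shifted-slice form of the 3x3 convolution sum.
                Ek += Sk[m, n] * im[m:m + H - 2, n:n + W - 2]
        E = np.maximum(E, Ek)
    return E
```

For a vertical step edge, only the vertical kernel responds strongly (with weight 1+2+1 = 4 across the step), which is exactly what the pointwise maximum picks out.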

After edge detection, the edge images Im_Oe, Im_IRe, Im_Ole and Im_IRle are processed with the NCC algorithm to perform the rough registration. To guard against spurious correlation peaks, we retain several points with the highest correlation coefficients, denoted {(u_k, v_k)}.

Given the candidate points {(u_k, v_k)}, accurate registration is then performed in a small neighbouring area of the full resolution images. Here we adopt a weighting coefficient c to ensure that the final point with the maximum combined correlation coefficient is the optimal one.

$$\rho_k = c\,\rho_l(u_k,v_k)+(1-c)\,\rho(u_k,v_k)\tag{6}$$

$$\rho_{k^*}=\max_k \rho_k\tag{7}$$

where (u_k, v_k) is the point position in the full resolution image Im_Oe, ρ(u_k, v_k) is the correlation coefficient of the accurate registration, and ρ_l(u_k, v_k) is the correlation coefficient of the rough registration. Through this step the registration position (u_{k*}, v_{k*}) is obtained.
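The candidate ranking of Eqs. (6) and (7) is a straightforward weighted maximum. A minimal sketch, where the value c = 0.5 and the three candidate score pairs are made up for illustration:

```python
def combined_score(rho_low, rho_full, c=0.5):
    # Eq. (6): weighted sum of the rough (low-resolution) and accurate
    # (full-resolution) correlation coefficients for one candidate.
    return c * rho_low + (1.0 - c) * rho_full

# Hypothetical candidates as (rho_l, rho) pairs; note the second one
# wins on the combined score even though its rough score is lowest.
candidates = [(0.82, 0.79), (0.78, 0.90), (0.85, 0.70)]
scores = [combined_score(rl, rf) for rl, rf in candidates]
best_k = max(range(len(scores)), key=scores.__getitem__)  # Eq. (7)
print(best_k)  # 1
```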

The final step is the estimation of the rotation angle. In our case, the permitted rotation angle θ ranges from −10° to 10°. The range is divided into 200 small angles; the correlation coefficient is calculated for each angle, and the angle θ* with the maximum correlation value is selected.


4. Experiment results

The above algorithm was tested on many image pairs. As shown in Fig 5, all the results are successful with tolerable errors. To compare running times, the experiment was performed with both the variable resolution algorithm and the standard NCC algorithm under identical conditions; the results are shown in Table 1. They illustrate that the variable resolution algorithm achieves the same acceptable registration quality as the standard NCC but much faster, requiring less than one fortieth of the running time of the standard NCC algorithm. Because the true registration positions are unknown, the results are compared against those of manual registration. To further verify the variable resolution method, the NMI and CR algorithms were tested concurrently. The results indicate that the NMI method gives results almost as consistent as those of the proposed algorithm, whereas the results of the CR method are poor. Under the variable resolution scheme, both the NMI and CR methods carry a higher computational load; for example, the running time of the NMI algorithm is several times longer than that of the fast NCC one. If computational cost is disregarded, the NMI method may be an alternative for this application.

| Method | Number of image pairs | Success number | Running time per pair (average) | Registration error (max, min, average) |
|---|---|---|---|---|
| Standard NCC | 20 | 20 | 610 s | (6, 0, 2.6) |
| Variable resolution algorithm | 20 | 20 | 15 s | (6, 0, 2.6) |

Table 1.

Experiment results of variable resolution algorithm based on NCC

Figure 5.

Image pairs and their registration results: (a) image pair 1 (optical left, IR right), (b) registration result of image pair 1 (287, 299, 0.2), (c) image pair 2 (optical left, IR right), (d) registration result of image pair 2 (343, 166, 0)

In the NMI method the correlation matrix RNMI(u,v) is calculated by

$$R_{NMI}(u,v)=\frac{\sum_i\big(P_{uv}(i)\log P_{uv}(i)+P_{IR}(i)\log P_{IR}(i)\big)}{\sum_{i,j}P_{OIR}^{uv}(i,j)\log P_{OIR}^{uv}(i,j)}\tag{8}$$

where P_OIR^{uv}(i,j) is the joint probability that the intensities of Im_uv and Im_IR are at levels i and j respectively; P_uv and P_IR are the marginal probabilities of the images Im_uv and Im_IR; and Im_uv is the (u,v)th sub-image of the optical image Im_O, with the same size as the infrared image Im_IR. These probabilities can be computed from the normalized joint and marginal intensity histograms.
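Since the numerator and denominator of Eq. (8) are the (negated) marginal and joint entropies, R_NMI is the familiar ratio (H(A)+H(B))/H(A,B). A histogram-based sketch (function name and bin count are illustrative; the real method evaluates this for every candidate sub-image Im_uv):

```python
import numpy as np

def nmi(a, b, bins=32):
    """Normalized mutual information, Eq. (8): (H(A)+H(B)) / H(A,B),
    estimated from the joint intensity histogram of two equal-sized
    images. Equals 2 for identical images, ~1 for independent ones."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p_ab = joint / joint.sum()                    # joint probability P_OIR
    p_a, p_b = p_ab.sum(axis=1), p_ab.sum(axis=0) # marginals P_uv, P_IR

    def entropy(p):
        p = p[p > 0]                              # 0 log 0 := 0
        return -np.sum(p * np.log(p))

    return (entropy(p_a) + entropy(p_b)) / entropy(p_ab)
```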

In the CR method the correlation matrix RCR(u,v) is calculated by

$$R_{CR}(u,v)=1-\frac{1}{Var_{IR}}\sum_i Var_{uv}(i)\,P_{uv}(i)\tag{9}$$

where

$$Var_{IR}=\sum_i i^2\,P_{IR}(i)-\Big(\sum_i i\,P_{IR}(i)\Big)^2\tag{10}$$

and

$$Var_{uv}(i)=\frac{1}{P_{uv}(i)}\sum_j j^2\,P_{OIR}^{uv}(i,j)-\Big(\frac{1}{P_{uv}(i)}\sum_j j\,P_{OIR}^{uv}(i,j)\Big)^2\tag{11}$$

All summations in (8) and (9) are taken over image intensity space.
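Equations (9)-(11) say the correlation ratio is 1 minus the expected conditional variance of the IR intensity given the optical intensity, normalized by the total IR variance. A standard histogram-bin estimator as a sketch (function name and bin count are illustrative, not the authors' implementation):

```python
import numpy as np

def correlation_ratio(opt, ir, bins=32):
    """Correlation ratio, Eqs. (9)-(11): close to 1 when the IR
    intensity is (approximately) a function of the optical intensity,
    close to 0 when the two are unrelated."""
    # Assign each optical pixel to an intensity bin (the index i).
    edges = np.histogram_bin_edges(opt.ravel(), bins=bins)
    labels = np.digitize(opt.ravel(), edges[1:-1])   # 0 .. bins-1
    y = ir.ravel().astype(float)
    total_var = y.var()                              # Eq. (10)
    if total_var == 0.0:
        return 0.0
    cond_var = 0.0
    for lab in np.unique(labels):
        yk = y[labels == lab]
        cond_var += yk.var() * (yk.size / y.size)    # Var_uv(i) * P_uv(i)
    return 1.0 - cond_var / total_var                # Eq. (9)
```

Note the asymmetry of the measure: it tests whether IR intensity is predictable from optical intensity, not the reverse, which is one reason it can fail when no such functional dependence exists between the modalities.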

| Method | Number of image pairs | Success* number | Registration error (max, min, average) |
|---|---|---|---|
| Variable resolution algorithm | 10 | 10 | (6, 0, 2.6) |
| NMI algorithm | 10 | 8 | (9, 0, 2.7) |
| CR algorithm | 10 | 0 | Null |

Table 2.

Experimental results of several different algorithms. Remark: * a registration is counted as successful when its position error is less than 10 pixels.

Figure 6.

Image pairs and their registration results using different algorithms: (a) image pair 1 (optical left, IR right), (b) registration result of the variable resolution algorithm (299, 120, 0.8), (c) registration result of the NMI algorithm (304, 125, −0.5), (d) registration result of the CR algorithm (318, 257, 0.5), (e) image pair 2 (optical left, IR right), (f) registration result of the variable resolution algorithm (364, 153, 1.5), (g) registration result of the NMI algorithm (362, 157, 1.0), (h) registration result of the CR algorithm (283, 93, 3.9)

All tests were run with f = 2 and with the Sobel operators for edge extraction. Comparative experiments with the NMI (normalized mutual information) and CR (correlation ratio) algorithms were also performed (Fig 6, Table 2). From these figures and tables, the variable resolution algorithm based on normalized cross correlation is clearly the best choice. Fig 7 and Table 3 show that the variable resolution algorithm based on the Sobel edge operators is slightly inferior to the Canny-based algorithm, but it is a good compromise, with easier implementation and higher running efficiency.

Figure 7.

Tests of the variable resolution algorithm with different edge extraction methods: (a) optical raw image (left) and infrared raw image (right), (b) edges extracted with the Canny algorithm (left) and registration result (266, 254, −0.4), (c) edges extracted with the Roberts operators (left) and registration result (265, 253, 1.8), (d) edges extracted with the Sobel operators (left) and registration result (265, 256, 1.1)

| Edge extraction method | Number of successful registrations | Registration location error (max, min, average) |
|---|---|---|
| Sobel-based edges | 10 | (6, 0, 2.8) |
| Canny edge extraction | 10 | (6, 0, 2.6) |
| Roberts-based edges | 10 | (9, 0, 3.2) |

Table 3.

Image registration results based on different edge extraction methods (10 image pairs)


5. Discussion

Our experimental results suggest that the NMI and CR algorithms are unsuitable for registering images from different sources, because both methods depend on the intensity distributions of the images, which differ between the two inputs, whereas the proposed algorithm relies on edge features, which are common to both images and independent of the image intensities. Registration methods based on SIFT are also unsuitable for our application because common key points are rare; the main reason is that the key points depend heavily on the image intensities.

With regard to computational efficiency, although the images are only downsampled by a factor of 2, the computational load is significantly reduced. The variable resolution algorithm runs about 40 times faster than the standard pixel-by-pixel cross correlation method without any degradation of registration performance.

Figure 8.

Comparison of linear regression of CWSI calculated using different methods and conditions versus stem water potential (SWP)

As demonstrated in the experimental results, the fast NCC implementation achieves the same registration accuracy as the standard NCC implementation. Once an image pair is registered, the plant water status information can be estimated via the method of Wang et al. (2010a), where it is shown that the accuracy of the IR and optical image registration can have a significant influence on the canopy temperature estimation. Fig 8 shows the experimental results presented in Wang et al. (2010a), which verify that the trend of the computed CWSI is consistent with an alternative plant water stress indicator, stem water potential (SWP).


6. Conclusion

This chapter presented a fast algorithm for automatic optical and infrared image registration. It reduces the computation considerably by using a variable resolution scheme without sacrificing registration performance, and it can also be used in other applications. As mentioned above, image registration is only the first stage in the process of estimating the canopy temperature. We are developing methods to obtain more accurate registration and to construct an algorithm suited to greater variation in the background images.

It is our aim to continue research in this field towards a decision support system to deliver real time automated irrigation control based on CWSI where the ‘plant is the sensor’.

References

  1. Jones, H. G. (1999a). Use of thermography for quantitative studies of spatial and temporal variation of stomatal conductance over leaf surfaces. Plant, Cell and Environment, 22(9), (September 1999), 1043-1055, ISSN 1365-3040
  2. Jones, H. G. (1999b). Use of infrared thermometry for estimation of stomatal conductance as a possible aid to irrigation scheduling. Agricultural and Forest Meteorology, 95(3), (June 1999), 139-149, ISSN 1461-9563
  3. Jones, H. G. & Leinonen, I. (2003). Thermal imaging for the study of plant water relations. Journal of Agricultural Meteorology, 59(3), (September 2003), 205-217, ISSN 0021-8588
  4. Guilioni, L., Jones, H. G., Leinonen, I. & Lhomme, J. P. (2008). On the relationships between stomatal resistance and leaf temperatures in thermography. Agricultural and Forest Meteorology, 148(11), (October 2008), 1908-1912, ISSN 1461-9563
  5. Grant, O., Tronina, L., Jones, H. G. & Chaves, M. M. (2007). Exploring thermal imaging variables for the detection of stress responses in grapevine under different irrigation regimes. Journal of Experimental Botany, 58(4), (March 2007), 815-825, ISSN 0022-0957
  6. Wheaton, A. D., Cooley, N., Dunn, G., Goodwin, I. & Needs, S. (2007). Evaluation of infrared thermography to determine the crop water status of Cabernet Sauvignon grapevines. Proceedings of the 13th Australian Wine Industry Technical Conference, Adelaide, 28 July - 2 August 2007.
  7. Wang, X. Z., Yang, W. P., Wheaton, A. D., Cooley, N. & Moran, B. (2010a). Automated canopy temperature estimation via infrared thermography: A first step towards automated plant water stress monitoring. Computers and Electronics in Agriculture, 73(1), (July 2010), 74-83, ISSN 0168-1699
  8. Wang, X. Z., Yang, W. P., Wheaton, A. D., Cooley, N. & Moran, B. (2010b). Efficient registration of optical and IR images for automatic plant water stress assessment. Computers and Electronics in Agriculture, 74(2), (November 2010), 230-237, ISSN 0168-1699
  9. Tsai, D. M. & Lin, C. T. (2003). Fast normalized cross correlation for defect detection. Pattern Recognition Letters, 24(15), (November 2003), 2625-2631, ISSN 0167-8655
  10. Roche, A., Malandain, G., et al. (1998). The correlation ratio as a new similarity measure for multimodal image registration. Medical Image Computing and Computer-Assisted Intervention (MICCAI'98), 1496, 1115-1124, Boston, USA, October 1998.
  11. Klein, S., Staring, M. & Pluim, J. P. W. (2007). Evaluation of optimization methods for nonrigid medical image registration using mutual information and B-splines. IEEE Transactions on Image Processing, 16(12), (December 2007), 2879-2890, ISSN 1057-7149
  12. Chen, J. & Tian, J. (2009). Real-time multi-modal rigid registration based on a novel symmetric-SIFT descriptor. Progress in Natural Science, 19(5), (May 2009), 643-651, ISSN 1002-0071
  13. Yang, W. P., Wang, X. Z., et al. (2009). Automatic optical and IR image fusion for plant water stress analysis. Proceedings of the 12th International Conference on Information Fusion, 1053-1059, Seattle, 6-9 July 2009.
  14. Yu, L., Zhang, D. R. & Holden, E. J. (2008). A fast and fully automatic registration approach based on point features for multi-source remote-sensing images. Computers & Geosciences, 34(7), (July 2008), 838-848, ISSN 0098-3004
  15. Althof, R. J., Wind, M. G. J. & Dobbins, J. T. (1997). A rapid and automatic image registration algorithm with subpixel accuracy. IEEE Transactions on Medical Imaging, 16(3), (June 1997), 308-316, ISSN 0278-0062
  16. Bhagalia, R., Fessler, J. A. & Kim, B. (2009). Accelerated nonrigid intensity-based image registration using importance sampling. IEEE Transactions on Medical Imaging, 28(8), (August 2009), 1208-1216, ISSN 0278-0062
  17. Huttenlocher, D. P., Klanderman, G. A. & Rucklidge, W. J. (1993). Comparing images using the Hausdorff distance. IEEE Transactions on Pattern Analysis and Machine Intelligence, 15(9), (September 1993), 850-863, ISSN 0162-8828
  18. Maes, F., Collignon, A., Vandermeulen, D., et al. (1997). Multimodality image registration by maximisation of mutual information. IEEE Transactions on Medical Imaging, 16(2), (April 1997), 187-198, ISSN 0278-0062
  19. Lau, Y. H., Braun, M. & Hutton, B. F. (2001). Non-rigid image registration using a median-filtered coarse-to-fine displacement field and a symmetric correlation ratio. Physics in Medicine and Biology, 46(4), (April 2001), 1297-1319, ISSN 0031-9155
