Original paper
Efficient registration of optical and IR images for automatic plant water stress assessment

https://doi.org/10.1016/j.compag.2010.08.004

Abstract

Automatic registration of optical and IR images is a crucial step towards constructing an automated irrigation control system in which plant water information is sensed via thermal imaging. The scene of the IR image is assumed to be completely included in the optical image, and the alignment between the common scene in the two images may involve translation and rotation by a small angle, though a small scale difference may also be present. This automatic registration of data from two quite different, non-rigid imaging regimes presents several challenges which cannot be overcome using common image processing techniques. In this paper, a fully automatic image registration algorithm for the alignment of optical and IR image pairs is described, in which Pearson's cross-correlation between a pair of images serves as the similarity measure. A computationally efficient algorithm is designed and packaged as a software application. This work provides an intervention-free process for extracting plant water stress information which can be fed into an automated irrigation scheduling program. The proposed algorithm is justified by comparing its registration performance with that of other candidate algorithms on several experimental data collections. Our results demonstrate the effectiveness of the proposed algorithm and the efficiency of its application to the registration of IR and optical images.

Research highlights

▶ An algorithm for automatic alignment of optical and IR images for plant canopy temperature data. ▶ A computationally efficient, robust software implementation enables online data processing for the analysis of plant water stress in real time.

Introduction

Recent research in agriculture indicates that plant water status may be monitored if the canopy temperature distribution of the plant is known (Jones, 1999a, Jones, 1999b, Jones and Leinonen, 2003, Wheaton et al., 2007). Plant water status information can be obtained via the computation of the crop water stress index (CWSI) (Jones, 1999a). This index offers great potential as the basis for an automated irrigation control system in which the plant canopy temperature distribution is acquired via thermal imaging. Such a system is expected to optimize irrigation water usage and maintain plant health in real time, and thus increase the productivity of limited water resources.

The Melbourne School of Land and Environment at the University of Melbourne is currently conducting a research program to estimate spatial and temporal variation in water status of grapevines using data collected from remotely sensed infrared (IR) and optical digital images (Wheaton et al., 2007). The ultimate research goal is to design an automatically controlled irrigation system using CWSI via non-destructive IR thermography and automated measurement processing (Jones, 1999a, Jones, 2004).

Typically, the measurement data of the IR thermography sensing system consist of a reference optical image and an IR image. The optical image is obtained using a digital camera and is taken at the same location as the IR image; it allows the underlying plant canopy of interest to be reliably identified. In the process of plant water stress analysis, the value of CWSI is calculated from the canopy temperature and the dry and wet reference temperatures. These temperatures can be estimated once the temperature distribution of the canopy leaf area is obtained. Clearly, the automatic detection of the overlapping area between the pair of IR and optical images plays a central role in plant canopy temperature acquisition, and it has been identified as one of the main issues for the automatically controlled irrigation program. Preliminary results from recent work (Wang et al., 2010) indicate that the accuracy of the canopy temperature distribution estimate, which is central to the evaluation of plant water stress, strongly depends on the accuracy of the optical and IR image registration.
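For orientation, the CWSI used throughout this line of work is commonly defined from the canopy temperature and the wet and dry reference temperatures. The form below is the standard thermal-imaging definition associated with Jones (1999a), quoted here as background rather than reproduced from the present paper:

```latex
\mathrm{CWSI} \;=\; \frac{T_{\mathrm{canopy}} - T_{\mathrm{wet}}}{T_{\mathrm{dry}} - T_{\mathrm{wet}}},
\qquad 0 \le \mathrm{CWSI} \le 1,
```

where values near 0 correspond to a well-watered, freely transpiring canopy and values near 1 to a severely water-stressed canopy.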

The major difficulties which arise due to this type of image registration are listed below.

  • Image data are obtained using different sensors, which results in different view angles and image collection times. The pair of optical and IR images is therefore generally not matched exactly.

  • Even where the image areas do overlap, the intensities of the two images can be quite different. Approaches that rely on image intensity are therefore unlikely to yield a satisfactory registration result.

  • Apart from some similarity in overall structure, there is almost no commonality between the pair of images in the popular feature spaces. This suggests that feature-based approaches, such as the scale-invariant feature transform (SIFT), may not be suitable for this type of image registration.

Obviously, the quality of the optical and IR image registration has a direct impact on the accuracy of canopy temperature estimation and thus on the reliability of plant water status prediction. Fig. 1 shows an example of an optical and IR image pair taken of a scene of Cabernet Sauvignon grapevines. The pictures were taken at different times and from different view angles. For this pair of images, one can only identify a "most similar", rather than an exact, area of the optical image (left) that matches the IR image (right).

Various image registration techniques and algorithms are available in the literature; see Zitová and Flusser (2003) and references therein. The approaches fall fundamentally into two main categories, i.e., area-based methods and feature-based methods.

The SIFT method is perhaps the most representative of the feature-based automatic methods. The SIFT implementation is able to find distinctive key points that are invariant to location, scale and rotation. They are also robust to affine transformations, that is, changes in scale, rotation, shear and position, as well as to changes in illumination, for images from the same source or the same type of sensor (Lowe, 1999). The algorithm is particularly effective when images are collected from the same source, or similar types of sensor, and contain rigid objects. However, we noted that in the underlying application the SIFT success rate is less than 10%; in most cases there is simply no SIFT point at all.
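To make the point above concrete, a minimal cross-modal matching attempt with OpenCV's SIFT implementation is sketched below. The function calls follow the current opencv-python API, the file names are placeholders, and the 0.75 ratio threshold is Lowe's usual suggestion; none of this is taken from the paper itself.

```python
import cv2

# Illustrative file names; the actual data set is not part of this excerpt.
optical = cv2.imread("optical.jpg", cv2.IMREAD_GRAYSCALE)
ir = cv2.imread("ir.png", cv2.IMREAD_GRAYSCALE)

# Detect SIFT key points and descriptors in both modalities.
sift = cv2.SIFT_create()
kp_opt, des_opt = sift.detectAndCompute(optical, None)
kp_ir, des_ir = sift.detectAndCompute(ir, None)

# Lowe's ratio test; on optical/IR canopy pairs very few matches, if any,
# typically survive, which is the failure mode noted in the text.
good = []
if des_opt is not None and des_ir is not None:
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    for pair in matcher.knnMatch(des_ir, des_opt, k=2):
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])
print("surviving SIFT matches:", len(good))
```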

Solutions based on area-correlation techniques (Pratt, 1974) seem better suited to the problem at hand, except for those which use intensity- (or colour-) dependent functions as similarity measures, such as the Fourier-transform (Liu et al., 2006) and mutual-information (Viola and Wells, 1997) approaches. The maximum correlation coefficient detection method was initially proposed by Anuta (1969) and later extended by Pratt (1974), who considered the correlation in a feature space. Instead of using the cross-correlation coefficient, Huttenlocher et al. (1993) used the Hausdorff distance as the similarity measure to register binary images produced by an edge detector. To deal with the problem of multimodality in medical image registration, Maes et al. (1997) proposed a method which applies mutual information (MI) to measure the statistical dependence, or information redundancy, between the image intensities of corresponding voxels in the two images. The normalised mutual information (NMI) approach was proposed as a voxel-based similarity measure for the registration of MRI images (Rueckert et al., 1999) and is insensitive to intensity changes resulting from contrast enhancement. Registration is achieved by minimizing a cost function which combines the cost associated with the smoothness of the transformation and the cost associated with image similarity. The effectiveness of this method was further demonstrated by Klein (2007). Using the correlation ratio (CR) as the similarity measure, Roche et al. (1998) proposed a method which assumes that the voxel-pair intensities of two registered images are functionally dependent. These area-based methods are summarized by Lau et al. (2001).

In this paper, we consider the registration problem between an optical image and an IR image via area-based registration techniques. In each pair of collected images, the scene of the IR image is completely included in the reference optical image. Affine transformations between the pair of images may involve translation, rotation by a small angle, and/or a small scaling difference. An automatic cross-correlation (ACC) algorithm based on the fundamental Pearson's correlation is proposed, and a computationally efficient implementation is designed. This work provides an intervention-free measurement process for plant water status assessment, such as the one described in Wang et al. (2010), which can be embedded into an automated irrigation scheduling system for smart irrigation control. The proposed algorithm is justified by comparing its registration accuracy with that of other potential candidates using real data collections. Our results demonstrate the effectiveness of the proposed ACC algorithm and the efficiency of its application to the registration of IR and optical images.

A short version of this work was presented in Yang et al. (2009). Compared to the earlier work, in this paper we have refined the ACC algorithm both in theory and implementation. Issues with computational complexity and robustness are addressed. In addition, more experimental results for the algorithm performance and comparison are provided.

The rest of the paper is organized as follows. The problem description is given in the next section. We present the automatic cross-correlation (ACC) image registration approach in detail in Section 3. Alternative methods under consideration are discussed in Section 4. In Section 5, computational issues of the ACC algorithm are addressed and a computationally enhanced implementation is described. Experimental results and discussions are given in Section 6, followed by the conclusion.

Section snippets

Problem description

Let Fo and FIR denote the related optical and IR images, respectively. In the application at hand, the pair of images are always taken from the same location, which reasonably validates the following assumptions:

  1. The scene of FIR is completely within that of Fo. In other words, there is an area in the optical image Fo where the scene of the IR image FIR is approximately matched.

  2. The disparity between the optical and IR cameras is small compared with their distance from the object.

  3. The

Automatic cross-correlation alignment algorithm

The ACC algorithm developed in this work is based on the principle of Pearson's cross-correlation method (Anuta, 1969). As shown in Fig. 2, the input image pair Fo and FIR are first converted into grayscale images, and an edge-detection filter is then applied to obtain the filtered edge images fo and fIR, respectively. This operation removes the colour and intensity information, which differ greatly between the pair of input images.

Therefore, the correlation of the pair of images
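A minimal, translation-only sketch of this grayscale/edge/Pearson-correlation pipeline is given below. The Sobel gradient magnitude stands in for whichever edge detector the authors actually use, and the exhaustive sliding-window search is an illustrative assumption rather than the packaged implementation.

```python
import numpy as np
from scipy import ndimage

def edge_map(gray):
    """Gradient-magnitude edge image (Sobel); the paper's exact filter may differ."""
    g = gray.astype(float)
    return np.hypot(ndimage.sobel(g, axis=1), ndimage.sobel(g, axis=0))

def pearson(a, b):
    """Pearson correlation coefficient between two equally sized arrays."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def best_translation(f_o, f_ir):
    """Slide the IR edge map over the optical edge map and return the offset
    with the highest Pearson correlation (brute-force translation search)."""
    H, W = f_o.shape
    h, w = f_ir.shape
    best_score, best_offset = -1.0, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            score = pearson(f_o[r:r + h, c:c + w], f_ir)
            if score > best_score:
                best_score, best_offset = score, (r, c)
    return best_score, best_offset
```

A small rotation or scale search, as allowed for in the problem description, could be layered on top by resampling the IR edge map before each translation sweep.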

Alternative auto-registration methods

Apart from the proposed ACC algorithm, a couple of area-based approaches that depend only weakly on intensity changes between the input images, in particular the NMI and CR methods, may be considered as alternative candidates for the underlying image registration problem. Both NMI and CR have been applied successfully to medical image registration (Lau et al., 2001). Therefore, in our studies we also implement these algorithms and compare their performance with that of the proposed ACC algorithm.

It
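As a point of reference for the NMI measure mentioned above, a minimal histogram-based estimate of NMI = (H(A) + H(B)) / H(A, B) is sketched below. The bin count and the absence of any interpolation or smoothing are illustrative assumptions, not the settings used in this study; a CR estimate can be built from the same joint histogram.

```python
import numpy as np

def normalised_mutual_information(a, b, bins=64):
    """NMI = (H(A) + H(B)) / H(A, B), estimated from a joint intensity histogram
    of the overlapping region of two images; the bin count is an assumption."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -(p * np.log(p)).sum()

    return (entropy(px) + entropy(py)) / entropy(pxy.ravel())
```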

Efficient ACC implementation

While the ACC algorithm has satisfactory alignment performance, it also involves a large computational overhead. Typically, a brute-force search is required to compute the cross-correlation coefficient matrix R, which can generate a huge computational load when large images are involved, even when the ACC is run with a modified approach (Tsai and Lin, 2003) that uses sum-tables to reduce the data dimension and thus save computation. It is important, therefore, to consider a computational
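Only the correlation numerator genuinely requires a sliding-window product; the per-window means and energies in Pearson's denominator can be pre-computed once with sum-tables, which is the essence of the speed-up attributed to Tsai and Lin (2003). The sketch below illustrates that pre-computation under our own assumptions; it is not the packaged software described in the paper.

```python
import numpy as np

def window_sums(image, h, w):
    """Sum of pixel values over every h-by-w window, via a 2-D cumulative sum
    (sum-table). Each window sum then costs O(1) instead of O(h*w)."""
    s = np.cumsum(np.cumsum(image.astype(float), axis=0), axis=1)
    s = np.pad(s, ((1, 0), (1, 0)))  # prepend a zero row and column
    return s[h:, w:] - s[:-h, w:] - s[h:, :-w] + s[:-h, :-w]

def window_mean_and_energy(f_o, h, w):
    """Per-window mean and centred energy of the optical edge image, i.e. the
    denominator terms needed to normalise the correlation at every offset."""
    n = h * w
    s1 = window_sums(f_o, h, w)
    s2 = window_sums(f_o * f_o, h, w)
    return s1 / n, s2 - s1 * s1 / n
```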

Experiment results and discussions

The experiment has two parts. First, we compare the registration performance of the proposed ACC method with that of the alternative NMI and CR methods. Registration performance is measured in terms of root mean square (RMS) error and registration success rate. The ground truth in the experiment was "obtained" via visual examination conducted by an expert. Second, we present evidence to demonstrate that the proposed software implementation of the ACC algorithm is able to achieve an

Conclusion

The requirement for plant water stress analysis via thermal imaging gives rise to the need for the registration of optical and IR images. This particular registration problem poses several challenges to conventional image registration techniques, since no consistent common feature or exact match can be found between the input images. In this paper, an automatic optical and IR image alignment algorithm (ACC) is developed for the application of plant water stress analysis. The algorithm uses the

