
Normalization of Infrared Facial Images under Variant Ambient Temperatures

Written By

Yu Lu, Jucheng Yang, Shiqian Wu, Zhijun Fang and Zhihua Xie

Submitted: 14 November 2010 Published: 09 August 2011

DOI: 10.5772/21502

From the Edited Volume

Advanced Biometric Technologies

Edited by Girija Chetty and Jucheng Yang


1. Introduction

Face recognition, being straightforward, passive and non-invasive compared with other biometrics such as fingerprint recognition (Yang et al., 2006; Yang & Park, 2008a; Yang & Park, 2008b), has a natural place in biometric technology and computer vision. Currently, most research on face recognition focuses on visual images. The reason is obvious: the sensors are cheap and visual images are widely used. The key problem in visual face recognition is coping with the large variations in appearance caused by both intrinsic conditions (pose, expression, hairstyle, etc.) and extrinsic conditions (illumination, imaging system, etc.). It is difficult to find unique characteristics for each face, and it is accordingly not easy to develop a reliable face recognition system using visual images.

Infrared face recognition, being light-independent and not vulnerable to changes in facial skin, expression and posture, can avoid or alleviate these drawbacks of face recognition in visible light. Several methods based on thermal images have been proposed for infrared face recognition in the last decade (Buddharaju et al., 2004; Chen et al., 2005; Kong et al., 2005; Wu et al., 2005A). It should be highlighted, however, that infrared images, which characterize the thermal patterns of human skin, are affected by ambient temperature as well as by psychological and physiological conditions. As a result, recognition systems based on thermal images achieve high performance when the test and training images are captured at the same ambient temperature, while performance is poor if the test and training samples are collected under different temperatures (time-lapse data).

To improve the performance on time-lapse data, it is important to normalize the training and test images. Linear gray-level transformation and histogram equalization are two common methods for image normalization. However, these approaches alter the gray levels that represent the skin temperature, so the resulting thermal images lose their physical meaning. Therefore, neither method is suitable for the normalization of infrared facial images.

In this chapter, we present a novel study on the normalization of infrared facial images, especially those affected by variant ambient temperatures. Three normalization methods are proposed to eliminate the effect of varying ambient temperature. The experimental results show that the proposed methods increase the robustness of an infrared face recognition system and greatly improve its performance on time-lapse data.

The organization of the chapter is as follows. In Section 2, the effect of ambient temperatures on thermal images is analyzed. Three normalization methods are presented in Section 3. An improved Normalized Cross Correlation (NCC) for similarity measurement is proposed in Section 4. A variety of experiments on normalized images are reported in Section 5, and conclusions are drawn in Section 6.


2. Effect of ambient temperatures on facial thermal patterns

Prokoski et al. (1992) indicated that humans are homoiotherms, capable of maintaining a constant temperature under different surroundings. However, it should be highlighted that so-called "homoiothermy" only refers to the approximately constant temperature deep in the body (i.e., the core temperature), whereas the skin temperature distribution fluctuates with ambient temperature and changes from time to time, as shown in Houdas & Ring (1982), Guyton & Hall (1996) and Jones & Plassmann (2002). An infrared camera can only capture this apparent temperature, not the deep temperature. Houdas & Ring (1982) pointed out that variations in facial thermograms result not only from external conditions, such as environmental temperature and imaging conditions, but also from internal conditions, such as physiological and psychological state. Socolinsky & Selinger (2004A, 2004B) also studied such variations. It is therefore necessary to learn how the thermal patterns vary under different conditions.

Figure 1.

An image divided into blocks. (a) Original image. (b) Block representation

Wu et al. (2005A, 2007) have shown that variations in ambient temperature significantly change the thermal characteristics of faces and accordingly affect recognition performance. Wilder et al. (1996) indicated that the effect of ambient temperature on thermal images is essentially equivalent to that of external light on visible images. We therefore only consider the effect of ambient temperature on thermal images in the following analysis.

To study the characteristics of thermal images under different ambient temperatures Te, a sequence of images is captured and then divided into non-overlapping blocks, as shown in Figure 1. The mean temperature of each block can then be computed.
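
As a concrete illustration of this blocking step, the following Python sketch divides a thermal image, assumed to be stored as a 2D NumPy array of temperatures in ºC, into non-overlapping blocks and returns the mean temperature of each block; the 4×3 grid and the function name are our illustrative choices, not taken from the original implementation.

```python
import numpy as np

def block_means(thermal_img, n_rows=4, n_cols=3):
    """Divide a thermal image (2D array of temperatures in deg C) into
    non-overlapping blocks and return the mean temperature of each block."""
    h, w = thermal_img.shape
    bh, bw = h // n_rows, w // n_cols            # block height and width
    means = np.empty((n_rows, n_cols))
    for i in range(n_rows):
        for j in range(n_cols):
            block = thermal_img[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            means[i, j] = block.mean()
    return means

# Example: an 80x60 geometrically normalized face divided into 4x3 blocks
face = 30.0 + 5.0 * np.random.rand(80, 60)       # placeholder temperatures
print(block_means(face))
```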

Table 1 shows the mean temperature in each block for different people at the same ambient temperature. It can be seen that different people have different thermal distributions. Table 2 gives the mean skin temperature in each block for the same person: the temperature of every block increases as the ambient temperature increases. This reveals that, for the same person, the skin temperature variations follow the same tendency as the ambient temperature variations. Likewise, the skin temperature variations of different people follow the same tendency when the ambient temperature changes.

Block Image 1 Image 2 Image 3 Image 4
1 30.62ºC 31.46ºC 30.26ºC 31.32ºC
2 31.85ºC 32.21ºC 31.15ºC 32.59ºC
3 34.46ºC 34.73ºC 34.32ºC 34.05ºC
4 33.58ºC 33.24ºC 33.35ºC 33.17ºC
5 35.02ºC 34.53ºC 35.04ºC 34.96ºC
6 33.43ºC 33.74ºC 33.50ºC 33.49ºC
7 33.34ºC 32.83ºC 33.58ºC 32.97ºC
8 35.02ºC 34.56ºC 34.61ºC 33.63ºC
9 33.62ºC 33.35ºC 33.45ºC 32.83ºC
10 32.14ºC 32.83ºC 32.89ºC 32.75ºC
11 34.47ºC 34.11ºC 34.39ºC 34.40ºC
12 32.41ºC 32.67ºC 33.06ºC 33.01ºC

Table 1.

Skin temperature mean in each block of different people when the ambient temperature is Te = 25.9ºC

Block/Te Te1=24ºC Te2=25.9ºC Te3=28ºC Te4=28.5ºC
1 29.74ºC 31.71ºC 32.96ºC 33.12ºC
2 30.91ºC 31.88ºC 33.37ºC 33.58ºC
3 30.22ºC 30.66ºC 33.53ºC 33.65ºC
4 31.98ºC 32.97ºC 34.04ºC 34.18ºC
5 34.51ºC 34.52ºC 35.38ºC 35.52ºC
6 32.35ºC 33.03ºC 34.39ºC 35.69ºC
7 33.06ºC 33.44ºC 34.81ºC 34.96ºC
8 34.25ºC 34.25ºC 35.41ºC 35.63ºC
9 33.21ºC 33.29ºC 35.06ºC 35.29ºC
10 32.62ºC 32.67ºC 34.17ºC 34.45ºC
11 34.25ºC 34.42ºC 35.49ºC 35.85ºC
12 30.94ºC 31.93ºC 34.09ºC 34.48ºC

Table 2.

Skin temperature mean in each block of the same person under different ambient temperatures, where Te is the ambient temperature

Moreover, as the ambient temperature increases, the skin temperature variations of the individual blocks differ. In order to study the relation between the skin temperature variation of each block and the ambient temperature, the maximum and minimum values in the thermal images are analyzed.

Figure 2.

Maximum and minimum values in thermal facial images captured under different ambient temperatures

As shown in Figure 2, ten thermal images were collected at ambient temperatures of 26ºC and 28.5ºC respectively. The maximal and minimal temperatures in the thermal images are relatively stable, and both increase when the ambient temperature rises from 26ºC to 28.5ºC. It is worth noting that the variation of the maxima is smaller than that of the minima, which implies that the temperature variations differ from block to block.

Figure 3.

Thermal images in different environments and their subtraction

In Figure 3, the input image (a) and the reference image (b) (thermal images of the same person) were captured at ambient temperatures of 26ºC and 28.5ºC and geometrically normalized to size 80×60. The background areas are set to 0.000001ºC. Image (c) is the subtraction of image (a) from image (b). According to the characteristics of thermal images, a brighter (white) point has a higher gray value, while a dark point has a lower gray value. By observing the images captured under different ambient temperatures, the reference image and their difference image, we find:

  1. The temperatures of the forehead and nose are higher than those of the facial edge and hair. Moreover, the higher the skin temperature, the smaller its variation when the ambient temperature changes;

  2. Within a thermal image, the difference in skin temperature between different facial parts is about 10ºC;

  3. Due to the effect of breathing, the temperature fluctuation in the mouth region is larger than in any other part.


3. Normalization of infrared facial images

Wilder et al. (1996) illustrated that the effect of ambient temperature on thermal images is essentially equivalent to that of external light on visible images. Therefore, analogous to the methods used to handle illumination in visible face recognition, the methods for eliminating ambient temperature variations can be divided into three kinds: methods based on transformation, on invariant features, and on a temperature model.

In order to eliminate the effect of ambient temperature on infrared imaging and improve robustness and recognition performance, we convert an image captured at an unknown ambient temperature into the reference ambient temperature. This is the so-called normalization of infrared images, which is the key issue addressed in this section.

Traditional infrared image normalization methods are classified as linear gray-level transform methods (Chen et al., 2005; Zhang et al., 2005) and equalization methods (Song et al., 2008). The former expand the range of gray levels to a larger dynamic range through linear mappings, aiming to strengthen the detail characteristics of the images; the equalization methods likewise aim to enhance the information of interest. Neither is suitable for infrared image normalization, because both change the distribution of facial temperatures. Kakuta et al. (2002) proposed a method for infrared image normalization that converts infrared images by subtracting skin temperatures, as shown in Figure 4.

This method can convert IR images of the foot, chest, hand and forearm obtained under various thermal environments into images under a reference environment. Every point in the image receives the same offset. However, as indicated in Section 2, the offset of each point of a face is different, so such processing leads to incorrect temperature variations. Therefore, different normalization methods are needed for infrared facial images. Three methods are introduced below.

3.1. Normalization based on transformation (TN)

Figure 4.

Conversion of infrared images (Kakuta, et al., 2002)

Normalization of infrared facial images based on transformation aims to reduce the effect of ambient temperature by means of a transformation. The procedure, based on blocking and the least squares method, is explained in detail in Wu et al. (2010a). First, the images collected under different ambient temperatures are divided into block images; the changes of ambient temperature and the corresponding changes of facial temperature are obtained and fitted to a function by the least squares method. Then each block image is normalized by the relevant function, so that the whole image is normalized to the reference condition. The specific steps are as follows.

  1. Obtain q images captured at each of i different ambient temperatures T_e1, T_e2, ..., T_ei and divide them into blocks;

  2. For each image containing m × n blocks, calculate the means T_1, T_2, ..., T_q of the q reference images and their overall mean T_mean. In the same way, obtain the means T_i1, T_i2, ..., T_iq of the q images at each of the i ambient temperatures and their overall mean T_imean;

  3. For the j-th block, the ambient temperature variations $t = |T_e - T_{ei}|$ and the corresponding facial temperature variations $T_{mean} - T_{imean}$ are fitted by least squares to obtain a function $y_j = f_j(t)$. Each block of a test image is then normalized as
$g_j' = g_j - f_j(t)$   E1

In formula (1), t is the variation of the ambient temperature and $f_j(t)$ is the corresponding variation of the facial temperature. $g_j$ represents the j-th block of an image g captured under a different ambient temperature, while $g_j'$ is the block after temperature normalization.

  4. Each block is normalized by the same process. In this way, a thermal image captured at an unknown ambient temperature can be converted to the reference environmental temperature, as sketched in the code below.
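
The following Python sketch outlines steps 1-4 under the assumption of a linear per-block fit $f_j(t) = a_j t + b_j$; the linear model, the function names and the row-major block ordering are our illustrative choices rather than the exact implementation of Wu et al. (2010a).

```python
import numpy as np

def fit_block_models(delta_te, delta_face):
    """Fit, for each block j, a linear function f_j(t) = a_j*t + b_j by least
    squares.  delta_te[i] is the ambient temperature change of sample i and
    delta_face[i, j] is the corresponding change of the mean facial
    temperature of block j.  Returns an array of (a_j, b_j) coefficients."""
    t = np.asarray(delta_te, dtype=float)
    A = np.column_stack([t, np.ones_like(t)])              # design matrix for a*t + b
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(delta_face, dtype=float), rcond=None)
    return coeffs.T                                        # shape (n_blocks, 2)

def normalize_tn(test_img, t, coeffs, n_rows=4, n_cols=3):
    """Formula E1: subtract the fitted facial temperature change f_j(t) from
    each block of the test image, t being the ambient temperature change
    between the test condition and the reference condition."""
    out = test_img.astype(float).copy()
    h, w = test_img.shape
    bh, bw = h // n_rows, w // n_cols
    for j, (a, b) in enumerate(coeffs):                    # blocks in row-major order
        r, c = divmod(j, n_cols)
        out[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw] -= a * t + b
    return out
```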

3.2. Normalization based on invariable features (IFN)

Principal Component Analysis (PCA) is one of the most successful feature extraction methods and is frequently applied to various image processing tasks. The magnitude of each eigenvalue corresponds to the variance accounted for by the corresponding component. For example, discarding principal components with small eigenvalues is a common technique for eliminating small random noise, while eliminating the three components with the largest eigenvalues has been theoretically justified, from the point of view of spherical harmonics, as a way to handle illumination variations (Ramamoorthi, 2002). In addition, the traditional PCA algorithm operates directly on the whole image and obtains global features, which are vulnerable to ambient temperature as well as to psychological and physiological factors. Such a global procedure cannot capture enough information because the local parts of the face differ considerably, so it is necessary to extract more detailed local features. Xie et al. (2009) proposed a weighted block-PCA and FLD infrared face recognition method based on blood perfusion images, which highlighted the contribution of local features to recognition and performed well when the test and training samples were captured at the same environmental temperature.

Temperature normalization using block-PCA is proposed in this section (Lu et al., 2010). In order to take full advantage of both the global information and the local characteristics of facial images, the images are partitioned into blocks. PCA is then performed on each block, and the component corresponding to the largest eigenvalue of each block is discarded to eliminate the ambient effect.

3.2.1. PCA feature extraction

The main idea of the PCA algorithm is to describe the samples with a small number of features, reducing the dimension of the feature space while retaining as much of the identifying information as possible. The steps are briefly described below.

3.2.1.1. PCA algorithm

Assume that a training set $X = \{x_1, x_2, \ldots, x_N\}$ of N samples is given, arranged as a $u \times N$ matrix, where $x_i$ is the column vector obtained by concatenating the rows of the i-th image matrix and u is the number of pixels.

The average face of the training set is as follows:

$\bar{M} = \frac{1}{N} \sum_{i=1}^{N} x_i$   E2

The distance of each face from the average face is as follows:

$M_i = x_i - \bar{M}, \quad i = 1, 2, \ldots, N$   E3

Let $M = (M_1, M_2, \ldots, M_N)$; the corresponding covariance matrix is $S = M \times M^T$. The next task is to select the v largest eigenvalues and their corresponding eigenvectors to form a projection space $W_{opt}$. The original image vector of dimension u is then projected onto the space of dimension v, where u > v. The feature vector after projection is as follows:

$y_k = W_{opt}^T \times x_k, \quad k = 1, 2, \ldots, N$   E4

There are usually two methods to determine the number of principal components, i.e., the number v of retained eigenvalues of the covariance matrix.

The first method uses a pre-specified information compression ratio $\eta$ ($\eta \le 1$):

$\eta = \frac{\lambda_1 + \lambda_2 + \lambda_3 + \cdots + \lambda_v}{\lambda_1 + \lambda_2 + \lambda_3 + \cdots + \lambda_N}$   E5

where $\lambda_1 \ge \lambda_2 \ge \lambda_3 \ge \cdots \ge \lambda_v \ge \cdots \ge \lambda_N$ are the eigenvalues of S.

The second, commonly used, method directly retains the N − 1 largest eigenvalues, i.e., v = N − 1.
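
For reference, the PCA training and projection steps (formulas E2-E5) can be sketched in Python as follows; working with the small N×N matrix $M^T M$ instead of the u×u covariance matrix is a standard trick when u is much larger than N, and the function names are ours.

```python
import numpy as np

def pca_train(X, eta=0.9):
    """PCA on a training set X (u x N, one image vector per column).
    Keeps the v leading eigenvectors whose eigenvalues reach the
    information compression ratio eta (formula E5)."""
    u, N = X.shape
    mean_face = X.mean(axis=1, keepdims=True)     # E2: average face
    M = X - mean_face                             # E3: distances to the average face
    evals, evecs = np.linalg.eigh(M.T @ M)        # eigen-decomposition of the N x N matrix
    order = np.argsort(evals)[::-1]               # sort by decreasing eigenvalue
    evals, evecs = evals[order], evecs[:, order]
    v = int(np.searchsorted(np.cumsum(evals) / evals.sum(), eta)) + 1
    W = M @ evecs[:, :v]                          # map back to image space
    W /= np.linalg.norm(W, axis=0)                # orthonormal columns: projection space W_opt
    return mean_face, W

def pca_project(W, mean_face, x):
    """Formula E4: project an image vector x (length u) onto the PCA space."""
    return W.T @ (x.reshape(-1, 1) - mean_face)
```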

3.2.1.2. Improved PCA algorithm

Although the eigenvalues and corresponding eigenvectors produced by PCA account for all the differences between the images, PCA cannot distinguish whether these differences are caused by the faces themselves or by external factors. If the test and training samples are not collected at the same time, the ambient temperature as well as psychological and physiological factors will strongly affect the infrared images at capture time, and some of the variations caused by these factors will appear in the principal components. In addition, each principal component represents a kind of image feature: some of these features pertain to the subjects depicted in the thermal images, while others represent irrelevant sources of image variation. Thus, it is natural to manipulate the principal components to reduce the effects of ambient temperature and of psychological and physiological factors on infrared images.

After the number of principal components has been determined, eliminating the three components with the largest eigenvalues has been used to handle illumination variations (Ramamoorthi, 2002). Although the effect of ambient temperature on thermal images is essentially equivalent to that of external light on visible images, there is no guarantee that the three components with the largest eigenvalues of thermal images contain only information that is useless for identification; these components may also contain feature information that distinguishes images of different classes. Therefore, a method that calculates the standard deviation of each principal component is proposed to estimate the effect of ambient temperature and of psychological and physiological factors on each principal component. The procedure is described as follows:

After the projection matrix is obtained, ten test samples of the same person, taken as an example, are projected onto it, mapping the high-dimensional images to the low-dimensional space. The standard deviation of each principal component over these test samples is then calculated:

$\sigma = [53.1, 21.1, 31, 30, 35.2, 44.2, 83.1, 97.9, 49.1, 17.1, \ldots]$   E6

It can be seen that the standard deviations of the first, 7th and 8th principal components are very large compared with the other standard deviations among the first ten, which means that the image information in these three components changes much more strongly with the variations of ambient temperature and of psychological and physiological factors. These factors account for the differences between test and training samples collected at different times. The standard deviations of the second and third principal components are smaller than the other standard deviations among the first ten, so useful feature information for identification would be lost if these two components were discarded. For a human face, the energies of the first three and first five components are about 90% and 94% respectively, and the remaining principal components contain less information than the preceding ones. Therefore, instead of discarding the three components with the largest eigenvalues, only the component with the largest eigenvalue is discarded in this section to reduce the effects of ambient temperature and of psychological and physiological factors on infrared images.

First, the v largest eigenvalues and their corresponding eigenvectors are obtained from the covariance matrix S. Then the eigenvector corresponding to the largest eigenvalue is removed, and the remaining v − 1 eigenvectors constitute a new projection space $W_{new}$. In this way, the original u-dimensional image vectors projected onto the (v − 1)-dimensional space have better separability. The feature vector after projection is as follows:

$y_k = W_{new}^T \times x_k, \quad k = 1, 2, \ldots, N$   E7
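
Continuing the sketch above, forming $W_{new}$ amounts to dropping the first column of the projection matrix, since its columns are ordered by decreasing eigenvalue; the helper name is ours.

```python
def discard_leading_component(W):
    """Formula E7: remove the eigenvector associated with the largest
    eigenvalue (first column) to obtain the new projection space W_new."""
    return W[:, 1:]

# Usage: y_k = discard_leading_component(W).T @ (x_k - mean_face), so the
# projection ignores the component that varies most with ambient temperature
# and with physiological and psychological state.
```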

3.2.2. Feature extraction by block-PCA

The traditional PCA algorithm is performed on the original images directly, so the features obtained are global. For IR images, only parts of the facial temperature distribution change markedly when the ambient temperature changes. The extracted global features therefore tend to emphasize the parts that change markedly and to neglect other information that represents the image classes.

Unlike the effect of uniform illumination on a visible recognition system, the effect of ambient temperature on infrared images is a global process: the whole face changes with the variation of the ambient temperature. In fact, different parts of an IR image vary differently as the ambient temperature changes, so removing the largest principal component of the whole image does not primarily eliminate the effect of the environment on the local thermal patterns.

Therefore, block-PCA is proposed to extract local features. Assume that a set $X = \{x_1, x_2, \ldots, x_N\}$ of N training samples is given. Each image is divided into m × n blocks and the size of each sub-block is p × q.

$x = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1n} \\ x_{21} & x_{22} & \cdots & x_{2n} \\ \vdots & \vdots & & \vdots \\ x_{m1} & x_{m2} & \cdots & x_{mn} \end{bmatrix}$   E8

Using the information compression ratio, the eigenvectors of each block image are obtained, and the eigenvector with the largest eigenvalue of each block is discarded. The remaining eigenvectors of each block are combined into a new feature vector used for feature extraction.

The new combined feature vector reflects not only the global features of the images but also the local features. Moreover, it reduces the effects of ambient temperature and of physiological and psychological factors, which enhances the robustness of the recognition system.
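
A possible block-PCA sketch, reusing the pca_train and pca_project helpers from the PCA sketch above; the 3×4 block grid and the function name are our illustrative choices.

```python
import numpy as np

def block_pca_features(train_imgs, test_img, m=3, n=4, eta=0.9):
    """Split each image into m x n sub-blocks, run PCA on every block position
    across the training set, drop the component with the largest eigenvalue
    of each block, and concatenate the remaining projections of the test
    image into a single feature vector."""
    h, w = train_imgs[0].shape
    bh, bw = h // m, w // n
    features = []
    for i in range(m):
        for j in range(n):
            # Stack this block position across all training images as columns
            blocks = np.stack([img[i*bh:(i+1)*bh, j*bw:(j+1)*bw].ravel()
                               for img in train_imgs], axis=1)
            mean_b, W = pca_train(blocks, eta)    # from the PCA sketch above
            W_new = W[:, 1:]                      # discard the largest component
            t = test_img[i*bh:(i+1)*bh, j*bw:(j+1)*bw].ravel()
            features.append(pca_project(W_new, mean_b, t).ravel())
    return np.concatenate(features)
```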

3.3. Normalization based on temperature model (TMN)

3.3.1. 0-1 normalization

As mentioned in Section 2, the higher the skin temperature, the smaller its variation when the ambient temperature changes. Therefore, every point in a thermal image can be given a weight coefficient representing its own temperature variation: a high temperature corresponds to a small weight coefficient, while a low temperature point has a high weight coefficient. A weighted linear normalization method for infrared images is therefore proposed in this section (Wu et al., 2010b). The weight coefficient of each point of the face is first obtained through an improved 0-1 standardization; each coefficient is then used for temperature normalization.

Assume a sample $x = (x_1, x_2, \ldots, x_m)$, where each $x_k$ is the skin temperature at one point of the face.

$x_k \mapsto f(x_k) = \frac{x_{\max} - x_k}{x_{\max} - x_{\min}} = \bar{\omega}_k$   E9

where $\bar{\omega}_k$ is the corresponding weight coefficient when the ambient temperature changes, $x_{\max} = \max(x) = \max(x_1, x_2, \ldots, x_m)$ and $x_{\min} = \min(x) = \min(x_1, x_2, \ldots, x_m)$.

As shown in formula (9), a point $x_k$ with high skin temperature has a high gray value, which results in a small weight coefficient, and vice versa.

3.3.2. Temperature normalization based on maximin model

Suppose two thermal images $f(x, y)$ and $g(x, y)$ are captured at different ambient temperatures, where $f(x, y)$ is collected at the reference temperature $T_{e1}$ while $g(x, y)$ is gathered at an unknown temperature $T_{e2}$. State parameters are extracted from $g(x, y)$ and the ambient temperature $T_{e2}$ is estimated. The difference in ambient temperature between the two thermal images, $\Delta T = T_{e1} - T_{e2}$, is then calculated, and each point of the test image is processed as follows:

$x_k' = \begin{cases} x_k + \dfrac{x_{\max} - x_k}{x_{\max} - x_{\min}} |\Delta T|, & \text{if } T_{e2} \le T_{e1} \\ x_k - \dfrac{x_{\max} - x_k}{x_{\max} - x_{\min}} |\Delta T|, & \text{otherwise} \end{cases}$   E10

The result obtained by the maximin model is consistent with the analysis in Section 2.
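
A compact Python sketch of the TMN method, combining the 0-1 weights of formula (9) with the maximin update of formula (10); the sign convention follows our reading of E10, and the optional face mask, used to exclude the near-zero background pixels, is our addition.

```python
import numpy as np

def normalize_tmn(test_img, te_test, te_ref, mask=None):
    """Weighted maximin normalization (formulas E9-E10): every facial point is
    shifted by |dT| scaled with a 0-1 weight that is small for hot points
    (nose, forehead) and large for cool points (cheeks, facial edge)."""
    x = test_img.astype(float)
    out = x.copy()
    region = mask if mask is not None else np.ones_like(x, dtype=bool)
    vals = x[region]
    x_max, x_min = vals.max(), vals.min()
    w = (x_max - vals) / (x_max - x_min)             # E9: weight coefficients in [0, 1]
    dT = te_ref - te_test                            # reference minus test ambient temperature
    out[region] = vals + np.sign(dT) * w * abs(dT)   # E10: add if the test scene was cooler
    return out
```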


4. Improved Normalized Cross Correlation for similarity measurement

The purpose of temperature normalization is to convert images captured under different environmental temperatures into images as if collected at the reference temperature. After temperature normalization, the images have a similar ambient temperature and temperature distribution. Therefore, the similarity between a test image and the reference image can be used to assess the performance of the proposed normalization methods.

Normalized Cross Correlation (NCC) is defined as follows:

$NCC = \dfrac{\sum_{i=1}^{m} \sum_{j=1}^{n} S(i, j) \times T(i, j)}{\sqrt{\sum_{i=1}^{m} \sum_{j=1}^{n} S(i, j)^2} \times \sqrt{\sum_{i=1}^{m} \sum_{j=1}^{n} T(i, j)^2}}$   E11

where S is the reference image and T is the test image, both of size m×n. Obviously, a higher NCC value indicates that the two images are more similar; NCC equals 1 when the two images are identical.
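
For reference, formula E11 can be transcribed directly in Python (array names are ours):

```python
import numpy as np

def ncc(S, T):
    """Pixelwise Normalized Cross Correlation (formula E11) between two images
    of equal size; returns 1.0 when the images are identical."""
    s, t = S.astype(float).ravel(), T.astype(float).ravel()
    return float(s @ t / (np.linalg.norm(s) * np.linalg.norm(t)))
```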

Note that the NCC measure is pixelwise, which requires exact pixel-to-pixel correspondence between the images. Since faces are moving 3D objects subject to rotation and deformation, exact one-to-one correspondence cannot be achieved even when the two face images are well segmented: the point $I(x, y)$ corresponding to the nose in the test image is not at the same location as the nose in the reference image. Therefore, the traditional NCC is not an accurate measure for faces, and an improved NCC is proposed to evaluate the normalization performance.

Assume S and T are the normalized reference image and the normalized test image, respectively. Each image is divided into p × q blocks and M denotes the mean of a block. The improved NCC is defined as follows:

$NCC_{new} = \dfrac{\sum_{i=1}^{p} \sum_{j=1}^{q} (M_S(i, j) - T_{e1}) \times (M_T(i, j) - T_{e1})}{\sqrt{\sum_{i=1}^{p} \sum_{j=1}^{q} (M_S(i, j) - T_{e1})^2} \times \sqrt{\sum_{i=1}^{p} \sum_{j=1}^{q} (M_T(i, j) - T_{e1})^2}}$   E12
$NCC_{nor} = \dfrac{\sum_{i=1}^{p} \sum_{j=1}^{q} (M_S(i, j) - T_{e1}) \times (M_{T_{nor}}(i, j) - T_{e1})}{\sqrt{\sum_{i=1}^{p} \sum_{j=1}^{q} (M_S(i, j) - T_{e1})^2} \times \sqrt{\sum_{i=1}^{p} \sum_{j=1}^{q} (M_{T_{nor}}(i, j) - T_{e1})^2}}$   E13

where $M_S(i, j)$ is the mean temperature of one block, $T_{e1}$ is the ambient temperature at which the reference image S was captured, and $T_{nor}$ denotes the test image before temperature normalization. If $NCC_{nor} - NCC_{new} \le 0$, the normalized test image has a temperature distribution more similar to that of the reference image; this also shows that the temperature normalization method can effectively reduce the effect of the external temperature on a thermal facial recognition system.
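
A sketch of the improved, block-based NCC of formulas E12-E13; the 4×3 block grid and the helper names are illustrative choices.

```python
import numpy as np

def improved_ncc(S, T, te1, p=4, q=3):
    """Block-based NCC: compare mean block temperatures offset by the
    reference ambient temperature te1, so that the measure tolerates small
    misalignments between the two face images."""
    def block_means(img):
        h, w = img.shape
        bh, bw = h // p, w // q
        return np.array([[img[i*bh:(i+1)*bh, j*bw:(j+1)*bw].mean()
                          for j in range(q)] for i in range(p)])
    a = block_means(S) - te1
    b = block_means(T) - te1
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

# Evaluating improved_ncc with the test image before and after temperature
# normalization indicates whether normalization brought it closer to the
# reference image.
```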


5. Experimental results and discussions

5.1. Infrared database

Currently, there is no international standard infrared facial database stored in temperature units. We therefore captured infrared images with a ThermoVision A40 camera made by FLIR Systems Inc. The training database comprises 500 thermal images of 50 individuals, carefully collected under similar conditions: an air-conditioned environment with a temperature of about 25.6~26.3ºC. Each person stood at a distance of about 1 meter in front of the camera. The ten templates per person are: 2 in frontal view, 2 in up view, 2 in down view, 2 in left view, and 2 in right view. As glass is opaque to long-wavelength infrared, subjects were required to remove their eyeglasses during database collection. The original resolution of each image is 240×320; the size becomes 80×60 after face detection and geometric normalization, as illustrated in Figure 5 and Figure 6.

Time-lapse data were collected over a period of one to six months. In total 85 people were involved, some of whom were included among the training subjects and some not. This time the subjects were allowed to wear eyeglasses. The data were acquired either in an air-conditioned room in the morning (24.5~24.8ºC), afternoon (25.4~26.3ºC) and at night (25.4~25.8ºC), or outdoors, where the ambient temperature was 29.3~29.5ºC. It is noted that these data were collected on the condition that the subjects were not sweating. The total number of test images is 1780.

Figure 5.

Original image and its geometrically normalized image. (a) Original image. (b) Geometrically normalized image

Figure 6.

Some samples after geometrical normalization

5.2. NCC experimental results

In Figure 7, image (c) is the normalized image. The test image and its normalized image look so similar that it is difficult to judge the effect of the proposed normalization method visually.

We can see from Figure 8 that the NCC after the TMN method is higher than the original NCC without normalization.

Figure 7.

(a) Training image. (b) Test image. (c) Normalized image using the TN method

Figure 8.

NCC of training image with the original test images and their normalized images using TMN method

Further, we verify the performance of normalization via the recognition rate. In our experiments, linear least squares and polynomial least squares models are applied to examine the influence of different fitting curves (Wu, S. Q. et al., 2010a). After temperature normalization, the thermal images are converted into the simplified blood perfusion domain to obtain stable biological features (Wu et al., 2007).

$\omega = \dfrac{\varepsilon \sigma (T^4 - T_e^4)}{\alpha c_b (T_a - T)}$   E14

where T is the temperature in the image, $T_a$ and $T_e$ are the core temperature and the ambient temperature respectively, $\omega$ is the blood perfusion, and $\varepsilon$, $\sigma$, $\alpha$ and $c_b$ are four constants.
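
A sketch of the conversion to the simplified blood perfusion domain (formula E14); apart from the Stefan-Boltzmann constant, the constant values below (skin emissivity, core temperature, α and c_b) are placeholder assumptions, not the values used in the cited work.

```python
import numpy as np

def blood_perfusion(T_img, T_e, T_a=37.0, eps=0.98, sigma=5.67e-8,
                    alpha=1.0, c_b=1.0):
    """Formula E14: omega = eps*sigma*(T^4 - T_e^4) / (alpha*c_b*(T_a - T)).
    T_img is the thermal image and T_e the ambient temperature, both in deg C;
    temperatures are converted to Kelvin before applying the model.
    eps (emissivity), T_a (core temperature), alpha and c_b are placeholders."""
    T = np.asarray(T_img, dtype=float) + 273.15
    Te = T_e + 273.15
    Ta = T_a + 273.15
    return eps * sigma * (T**4 - Te**4) / (alpha * c_b * (Ta - T))
```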

To extract features, PCA and the discrete wavelet transform (DWT) (Wu, S. Q. et al., 2009) are chosen to test the effect of normalization. A 3NN classifier with the Euclidean distance is used for classification and recognition. The experimental results are shown in Table 3.

5.3. Experimental results using TN method

Table 3 shows that normalization with block size 80×60 (i.e., the whole image as a single block) does not improve the performance, because the skin temperature variations differ across facial parts. The normalization methods using small blocks perform better, no matter which feature extraction method is chosen. Moreover, normalization using polynomial least squares achieves better performance than that using linear least squares. It is also notable that a smaller block size results in higher performance, but the computation increases with the number of blocks.

Feature extraction method Block size Pre-normalization Linear normalization Polynomial normalization
PCA 80×60 56.97% 56.97% 56.97%
PCA 40×20 56.97% 59.39% 60.61%
PCA 20×20 56.97% 63.64% 70.30%
DWT 80×60 49.70% 49.70% 49.70%
DWT 40×20 49.70% 53.94% 56.46%
DWT 20×20 49.70% 55.76% 63.64%

Table 3.

Recognition rates before and after normalization

We also use 2D linear discriminant analysis (2DLDA) (Wu, S. Q. et al., 2009), DWT+PCA (Wu, S. Q. et al., 2009) and PCA+LDA (Xie, Z. H. et al., 2009) for feature extraction. The recognition rates with and without normalization are shown in Table 4.

Feature extraction method Block size Pre-normalization Polynomial normalization
2DLDA 20×20 61.82% 71.52%
DWT+PCA 20×20 50.30% 57.58%
PCA+LDA 20×20 55.76% 61.82%

Table 4.

Recognition rate with/without normalization

As shown in Table 4, no matter which feature extraction method is chosen, the thermal recognition system performs better after temperature normalization.

5.4. Experimental results using IFN method

Figure 9 shows how the recognition rate varies with the number of eigenvectors removed, and Table 5 gives the recognition rates when different components are discarded.

Figure 9.

Recognition rate when discarding some components

As shown in Table 5, the recognition performance improves remarkably when principal components with the largest eigenvalues are discarded. Moreover, discarding the three components with the largest eigenvalues performs better than discarding two, which verifies that the second principal component is useful for identification and should not be discarded. These results agree with the analysis in Section 3.2.

s v Recognition rate
0 499 56.97%( 94/165)
1 498 81.21%(134/165)
2 497 74.55%(123/165)
3 496 80.61%(133/165)
4 495 80.61%(133/165)
5 494 77.58%(128/165)

Table 5.

Recognition rate versus the number of discarded components with the largest eigenvalues, where s is the number of discarded components and v is the number of principal components retained

As can be seen from Figure 10, the recognition rate with $\eta_{min} = 0.9$ is only 3.64 percentage points lower than that with $\eta_{max} = 0.99$. Hence, the block-PCA experiments are carried out with $\eta_{min} = 0.9$ and $\eta_{max} = 0.99$. The sub-block size is chosen to be 40×30, 20×30, 20×20 and 10×10. Since the positions covered by each block are different and their temperature changes with ambient temperature are different, the number of eigenvectors is determined by the information compression ratio, so the total number of eigenvalues after the PCA operation on each block image differs. After block-PCA is performed, the component with the largest eigenvalue of each block image is discarded, and the remaining eigenvectors are combined, block by block, into a new feature vector used for recognition.

Figure 10.

Recognition rate by deleting the biggest component when the information compression rate η = 0.99

η p × q Recognition rate
0.99 80×60 81.82%(135/165)
0.99 40×30 83.03%(137/165)
0.99 20×30 86.06%(142/165)
0.99 20×20 85.45%(141/165)
0.99 10×10 84.24%(139/165)

Table 6.

Recognition rate with different block size in η max = 0.99

Table 6 and Table 7 show the recognition rate for different block sizes, where η is the information compression ratio and p × q is the size of the sub-block.

η p × q Recognition rate
0.9 80×60 78.18%(129/165)
0.9 40×30 81.82%(135/165)
0.9 20×30 87.27%(144/165)
0.9 20×20 86.06%(142/165)
0.9 10×10 84.24%(139/165)

Table 7.

Recognition rate with different block size in η min = 0.9

Figure 11.

Recognition rates of normalization with and without blocking.

Table 6 and Table 7 show that the recognition performance is best when the sub-block size is 20×30, for both $\eta_{min} = 0.9$ and $\eta_{max} = 0.99$. This is because each sub-block has its own temperature change as the ambient temperature varies, and the different parts of the face are well localized in the sub-blocks when the size is 20×30.

In order to show the contribution of blocking to the recognition rate, the results with and without blocking are shown in Figure 11. Whatever the information compression ratio, the performance with blocking is better than without. It can be seen from all the experimental results that the recognition rates using block-PCA with the largest principal components discarded are significantly higher than those using PCA directly. Therefore, the proposed block-PCA method fully utilizes the local characteristics of the images and thus improves the robustness of the infrared face recognition system.

5.5. Experimental results using TMN method

As shown in Table 8, the normalization method without weight coefficients does not improve the performance. This is because the skin temperature variations at some points of the thermal images are smaller than the ambient temperature variation, so such processing changes the skin temperatures too much, especially in the high-temperature parts. With the weighted normalization method, the performance improves greatly no matter which kind of features is extracted. This illustrates that the normalization method based on the maximin model can improve the robustness and performance of the system.

Feature extraction method Pre-normalization Normalization without weight coefficient Normalization with weight coefficient
PCA 56.97%(94/165) 55.15%(91/165) 89.09%(147/165)
DWT 49.70%(82/165) 49.70%(82/165) 86.06%(142/165)
2DLDA 61.82%(102/165) 52.73%(87/165) 78.79%(130/165)
DWT+PCA 50.30%(83/165) 47.88%(79/165) 81.21%(134/165)
PCA+LDA 55.76%(92/165) 50.91%(84/165) 76.36%(126/165)

Table 8.

Recognition rate using TMN method with different feature extraction methods


6. Conclusion

This chapter is dedicated to eliminating the effect of ambient temperature on thermal images. The thermal images of a face are severely affected by a variety of factors, such as environmental temperature, eyeglasses and hairstyle. To alleviate the effect of ambient temperature variations, three methods are proposed to normalize thermal images: the normalization based on transformation uses functions obtained by blocking and the least squares method; the normalization based on invariant features extracts features that do not change with ambient temperature; and the third method normalizes the image through the maximin model. Extensive experiments demonstrate that the recognition performance with temperature normalization is substantially better than without it.

It should be highlighted that psychological (e.g., happy, angry, sad) and physiological (e.g., fever) conditions also affect the thermal patterns of faces. How to analyze and eliminate these variations will be our future work.


Acknowledgments

This work was partially supported by the National Natural Science Foundation of China (No. 61063035) and by the Merit-based Science and Technology Activities Foundation for Returned Scholars, Ministry of Human Resources of China.

References

  1. Buddharaju, P.; Pavlidis, I. & Kakadiaris, I. A. (2004). Face recognition in the thermal infrared spectrum, Proceedings of the IEEE International Conference on Computer Vision and Pattern Recognition Workshop, p. 133, Washington DC, USA, 2004
  2. Chen, X.; Flynn, P. J. & Bowyer, K. W. (2005). IR and visible light face recognition, Computer Vision and Image Understanding, Vol. 99, No. 3, pp. 332-358, 2005
  3. Chen, X. X.; Lee, V. & Don, D. (2005). A simple and effective radiometric correction method to improve landscape change detection across sensors and across time, Remote Sensing of Environment, Vol. 98, No. 1, pp. 63-79, 2005
  4. Guyton, A. C. & Hall, J. E. (1996). Textbook of Medical Physiology, 9th ed., W. B. Saunders Company, Philadelphia, 1996
  5. Houdas, Y. & Ring, E. F. J. (1982). Human Body Temperature: Its Measurement and Regulation, Plenum Press, New York, OSTI ID: 6601231, 1982
  6. Jones, B. F. & Plassmann, P. (2002). Digital infrared thermal imaging of human skin, IEEE Engineering in Medicine & Biology Magazine, Vol. 21, No. 6, pp. 41-48, 2002
  7. Kakuta, N.; Yokoyama, S. & Mabuchi, K. (2002). Human thermal models for evaluating infrared images, IEEE Engineering in Medicine & Biology Magazine, pp. 65-72, ISSN 0739-5175
  8. Kong, S. G.; Heo, J.; Abidi, B. R.; Paik, J. & Abidi, M. A. (2005). Recent advances in visual and infrared face recognition: a review, Computer Vision and Image Understanding, Vol. 97, No. 1, pp. 103-135, 2005
  9. Lu, Y.; Li, F.; Xie, Z. H. et al. (2010). Time-lapse data oriented infrared face recognition method using block-PCA, Proceedings of the 2010 International Conference on Multimedia Technology, pp. 410-414, Ningbo, China, October 2010
  10. Prokoski, F. J.; Riedel, B. & Coffin, J. S. (1992). Identification of individuals by means of facial thermography, Proceedings of the IEEE International Conference on Security Technology: Crime Countermeasures, pp. 120-125, Atlanta, USA, October 1992
  11. Ramamoorthi, R. (2002). Analytic PCA construction for theoretical analysis of lighting variability, including attached shadows, in a single image of a convex Lambertian object, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 24, pp. 1322-1333, 2002
  12. Socolinsky, D. A. & Selinger, A. (2004A). Thermal face recognition in an operational scenario, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 1012-1019, Washington DC, USA, 2004
  13. Socolinsky, D. A. & Selinger, A. (2004B). Thermal face recognition over time, Proceedings of the International Conference on Pattern Recognition, pp. 187-190, Cambridge, UK, 2004
  14. Song, Y. F.; Shao, X. P. & Xu, J. (2008). New enhancement algorithm for infrared image based on double plateaus histogram, Infrared and Laser Engineering, 2008
  15. Wilder, J.; Phillips, P. J.; Jiang, C. & Wiener, S. (1996). Comparison of visible and infrared imagery for face recognition, Proceedings of the 2nd International Conference on Automatic Face and Gesture Recognition, pp. 182-187, Killington, Vermont, USA, 1996
  16. Wu, S. Q.; Song, W.; Jiang, L. J. et al. (2005A). Infrared face recognition by using blood perfusion data, Proceedings of Audio- and Video-based Biometric Person Authentication, pp. 320-328, Rye Brook, NY, USA, 2005
  17. Wu, S. Q.; Gu, Z. H.; China, K. A. & Ong, S. H. (2007). Infrared facial recognition using modified blood perfusion, Proceedings of the 6th International Conference on Information, Communications & Signal Processing, pp. 1-5, Singapore, December 2007
  18. Wu, S. Q.; Lu, Y.; Fang, Z. J. et al. (2010a). Infrared image normalization using block and least-squares method, Proceedings of the Chinese Conference on Pattern Recognition, pp. 873-876, Chongqing, China, October 2010
  19. Wu, S. Q.; Lu, Y.; Fang, Z. J. & Xie, Z. H. (2010b). A weighted linear normalization method of infrared image, Journal of Wuhan University of Technology, Vol. 32, No. 20, pp. 1-5, 2010
  20. Wu, S. Q.; Liang, W.; Yang, J. C. & Yuan, J. S. (2009). Infrared face recognition based on modified blood perfusion model and 2DLDA in DWT domain, The 6th International Symposium on Multispectral Image Processing and Pattern Recognition, Yichang, China, 2009
  21. Xie, Z. H.; Wu, S. Q.; Fang, Z. J. et al. (2009). Weighted block-PCA and FLD infrared face recognition method based on blood perfusion images, Journal of Chinese Computer Systems, Vol. 30, No. 10, pp. 2069-2072, 2009
  22. Yang, J. C.; Yoon, S. & Park, D. S. (2006). Applying learning vector quantization neural network for fingerprint matching, Lecture Notes in Artificial Intelligence (LNAI 4304), Springer, Berlin, pp. 500-509, 2006
  23. Yang, J. C. & Park, D. S. (2008a). A fingerprint verification algorithm using tessellated invariant moment features, Neurocomputing, Vol. 71, pp. 1939-1946, 2008
  24. Yang, J. C. & Park, D. S. (2008b). Fingerprint verification based on invariant moment features and nonlinear BPNN, International Journal of Control, Automation, and Systems, Vol. 6, No. 6, pp. 800-808, 2008
  25. Zhang, X. J. & Sun, X. L. (2005). A research on the piecewise linear transformation in adaptive IR image enhancement, IT AGE, Vol. 3, pp. 13-16, 2005
