
Multi-Frequency Image Fusion Based on MIMO UWB OFDM Synthetic Aperture Radar

Written By

Md Anowar Hossain, Ibrahim Elshafiey and Majeed A. S. Alkanhal

Submitted: 21 February 2012 Published: 20 November 2013

DOI: 10.5772/56943

From the Edited Volume

New Advances in Image Fusion

Edited by Qiguang Miao


1. Introduction

The principal idea behind synthetic aperture radar (SAR) stems from the desire for high-resolution images. SAR transmits signals at regularly spaced intervals called the pulse repetition interval (PRI). The responses at each PRI are collected and processed to reconstruct a radar image of the terrain [1]. In general, high-resolution SAR images in the range domain are generated using ultra-wideband (UWB) waveforms as the transmitted radar pulse [2]. UWB pulses (500 MHz bandwidth and above) can enhance the range resolution considerably. UWB technology offers dual advantages for radar applications: good penetration capacity and high-resolution target detection in the range domain [3].

Orthogonal frequency division multiplexing (OFDM), a modulation scheme commonly used in commercial communications, shows great potential for forming radar waveforms. An OFDM signal is composed of several orthogonal subcarriers that are emitted simultaneously over a single transmission path, each occupying a small slice of the entire signal bandwidth [4]. Advances in sampling speed now allow accurate generation of UWB-OFDM waveforms, resulting in a diverse signal capable of high-resolution imaging. While OFDM has been extensively studied and commercialized in digital communications, it has not yet been widely studied by the radar community apart from a few efforts [5-7]. The advantages of using OFDM in radar applications include: (a) the transceiver is based on digital implementation using relatively inexpensive components; (b) narrowband interference is easily mitigated; (c) high resolution at UWB scale and good multi-path potential; (d) the same architecture can be used to transmit large amounts of data in real time; and (e) flexibility in pulse shaping using different subcarrier compositions.

SAR is a well-known remote sensing technique that obtains high resolution in the range domain by transmitting a wideband waveform, and high resolution in the azimuth domain by exploiting the relative motion between the target and the radar platform. However, a single-antenna SAR cannot provide certain capabilities, such as simultaneous high-resolution and wide-swath imaging. Multiple-input multiple-output (MIMO) SAR offers a solution to these problems and provides the following advantages over traditional SAR: diversity in viewing angles on a particular target to improve identifiability, and increased azimuth resolution or decreased pulse repetition frequency (PRF), which results in a wider swath. Due to the larger number of degrees of freedom of a MIMO system, enhanced resolution can be achieved by coherently processing multiple waveforms at multiple receivers simultaneously.

Several research works have been reported in recent years to overcome the trade-off between swath width and azimuth resolution in conventional SAR systems [8]. However, MIMO SAR has so far been investigated mainly by generalizing the theoretical modeling of MIMO communication systems and has only recently been discussed in the radar community [9, 10]. The MIMO SAR configuration proposed in this chapter consists of two co-located transmitters and two receivers combined with an image fusion technique.

In remote sensing applications, the increasing availability of spaceborne sensors motivates the use of image fusion algorithms. Many image processing tasks require both high spatial and high spectral resolution in a single image. Image fusion is the process of combining relevant information from two or more images into a single image; the resulting image is more informative than any of the input images [11].

The structure of the chapter is as follows. UWB-OFDM pulse shaping for MIMO SAR is described in section 2, while a comparison of the auto-correlation and cross-correlation of different pulses from a radar perspective is presented in section 3. A detailed analysis of the MIMO wide-swath SAR system and its functionality is given in section 4. MIMO wide-swath SAR imaging results based on UWB-OFDM waveforms are presented in section 5. Section 6 presents the optimized SAR image based on the image fusion technique. Final conclusions are provided in section 7.


2. MIMO UWB OFDM signal generation

A widely studied approach in MIMO architectures involves the transmission of orthogonal signals from the different antennas, which makes it possible to separate the target reflections arriving at each receiver. In particular, we develop a procedure to design waveforms that ensure orthogonality by imposing the rules shown in Table 1. The key to our approach is a model of the received radar signals that explicitly includes the transmitted waveforms. To achieve low cross-correlation between transmitted pulses that share a common bandwidth (and hence the same range resolution), the OFDM frequency-domain sample vector for N subcarriers is generated using the sequences shown in Table 1. These sequences produce orthogonal signals through the placement of 1's and 0's: in each column that contains a 1, all other elements are 0, and the following column is filled entirely with 0's to prevent over-sampling. The spectrum of an OFDM signal is shown in Figure 1, where the width of the main lobe depends on the duration of the pulse. In a digital implementation of an OFDM signal, the pulse duration is related to the number of subcarriers: as the number of subcarriers increases, the duration of the pulse increases.

Subcarrier index:  1  2  3  4  5  6  7  8  9  10 11 12 13 14 15 16 … N
Ψω1                1  0  0  0  0  0  0  0  1  0  0  0  0  0  0  0
Ψω2                0  0  1  0  0  0  0  0  0  0  1  0  0  0  0  0
Ψω3                0  0  0  0  1  0  0  0  0  0  0  0  1  0  0  0
Ψω4                0  0  0  0  0  0  1  0  0  0  0  0  0  0  1  0

Table 1.

OFDM frequency-domain sample vector generation
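The placement pattern of Table 1 is easy to reproduce programmatically. The Python sketch below builds the four sparse 0/1 frequency-domain sample vectors under the assumption that the pattern repeats every eight subcarriers with an offset of two subcarriers per sequence, which matches the sixteen columns shown; the function name and the generalization to arbitrary N are illustrative only.

```python
import numpy as np

def ofdm_spreading_sequences(n_subcarriers=16, n_pulses=4, period=8):
    """Sparse 0/1 frequency-domain sample vectors in the style of Table 1.

    Sequence i places a 1 every `period` subcarriers with an offset of
    2*(i-1) columns, so no two sequences share a subcarrier and every
    occupied column is followed by an all-zero column.
    """
    seqs = np.zeros((n_pulses, n_subcarriers), dtype=int)
    for i in range(n_pulses):
        seqs[i, 2 * i::period] = 1
    return seqs

print(ofdm_spreading_sequences())  # reproduces the 4 x 16 pattern of Table 1
```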

Figure 1.

OFDM signal spectrum

As an example, an OFDM signal is generated according to the scheme shown in Figure 2 by spreading the digital frequency-domain vector of Table 1 with modulation symbols from a random integer generator. The modulation order (M) is chosen as 4 for QPSK. An inverse fast Fourier transform (IFFT) is then applied to obtain the discrete time-domain OFDM signal, and finally a Hanning window is applied to minimize the side-lobes. The time-domain OFDM signal is given as

$$\Psi_{tx}^{i}(t) = \mathcal{F}^{-1}\{\Psi_{\omega}^{i}\}\, w(n), \qquad i = 1, 2, 3, 4 \qquad (1)$$

where the Hanning window is $w(n) = 0.5\left(1 - \cos\frac{2\pi n}{N}\right)$, $0 \le n \le N-1$, N is the number of subcarriers, and $\Psi_{\omega}^{i}$ denotes the spreading sequence for the ith sub-pulse. Each antenna transmits two sub-pulses simultaneously.

UWB-OFDM waveforms are generated using the following parameters: number of OFDM subcarriers N = 256 and sampling time Δts = 1 ns, which gives a baseband bandwidth B0 = 1/(2Δts) = 500 MHz, where the factor of two satisfies the Nyquist criterion. The UWB-OFDM waveform in the frequency domain and the time domain is shown in Figure 3 and Figure 4, respectively. We can observe that the Hanning window substantially reduces the side-lobes, which in turn improves the auto-correlation function (ACF) and cross-correlation function (CCF) of the time-domain OFDM waveforms, as shown in Figure 5 and Figure 6, respectively.
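As a concrete illustration of the generation scheme of Figure 2 and equation (1), the following Python sketch assembles the windowed time-domain sub-pulses from Table 1 style spreading sequences, QPSK symbols drawn from a random integer generator, an IFFT, and a Hanning window. The QPSK constellation mapping and the extension of the Table 1 pattern to N = 256 subcarriers are assumptions made for illustration.

```python
import numpy as np

def uwb_ofdm_pulse(spreading_seq, rng=None):
    """One time-domain UWB-OFDM sub-pulse, following Figure 2 / equation (1)."""
    rng = np.random.default_rng() if rng is None else rng
    N = len(spreading_seq)
    # QPSK (M = 4) symbols from a random integer generator (assumed mapping)
    symbols = np.exp(1j * (np.pi / 2 * rng.integers(0, 4, size=N) + np.pi / 4))
    # Spread: only the subcarriers selected by the sequence carry a symbol
    freq_vector = spreading_seq * symbols
    # IFFT -> discrete time-domain OFDM signal
    time_signal = np.fft.ifft(freq_vector)
    # Hanning window w(n) = 0.5 (1 - cos(2*pi*n/N)) to suppress side-lobes
    w = 0.5 * (1.0 - np.cos(2.0 * np.pi * np.arange(N) / N))
    return time_signal * w

N = 256                              # number of subcarriers (section 2)
seqs = np.zeros((4, N), dtype=int)   # Table 1 pattern extended to N = 256
for i in range(4):
    seqs[i, 2 * i::8] = 1
pulses = [uwb_ofdm_pulse(s) for s in seqs]   # four orthogonal sub-pulses
```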

Figure 2.

OFDM signal generator

Figure 3.

UWB-OFDM waveform in frequency-domain (a) before windowing (b) after windowing

Figure 4.

UWB-OFDM waveform in time-domain (a) before windowing (b) after windowing

Figure 5.

Auto-correlation function (ACF) of OFDM pulses in time domain (a) before windowing (b) after windowing

Figure 6.

Cross-correlation function (CCF) of OFDM pulses in time domain (a) before windowing (b) after windowing


3. Comparison of auto-correlation and cross-correlation

Cross-correlation is the measure of similarity between two different sequences and can be given as

$$R_{xy}(m) = \begin{cases} \displaystyle\sum_{n=0}^{N-m-1} x_{n+m}\, y_n^{*}, & m \ge 0 \\[4pt] R_{yx}^{*}(-m), & m < 0 \end{cases} \qquad (2)$$

where $x_n$ and $y_n$ are the elements of two different sequences of period N. The auto-correlation, which measures the similarity between a sequence and its cyclically shifted copy, is obtained from equation (2) as the special case x = y [12].
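A minimal numerical counterpart of equation (2) is shown below; note that numpy's correlate already conjugates its second argument, so the call maps directly onto the definition. The normalization helper and the example variable names are added for illustration only.

```python
import numpy as np

def correlation(x, y):
    """R_xy(m) of equation (2); calling it with x = y gives the auto-correlation.

    numpy.correlate computes c[m] = sum_n x[n + m] * conj(y[n]), which is the
    cross-correlation used here.
    """
    return np.correlate(x, y, mode="full")

def normalized_magnitude(c):
    """Magnitude scaled to a unit peak, convenient for comparing ACF/CCF plots."""
    mag = np.abs(c)
    return mag / mag.max()

# e.g., with the sub-pulses generated earlier (hypothetical variable `pulses`):
# acf = normalized_magnitude(correlation(pulses[0], pulses[0]))
# ccf = normalized_magnitude(correlation(pulses[0], pulses[1]))
```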

Figure 7.

Ideal Walsh-Hadamard sequences (a) ACF (b) CCF

The auto-correlation and cross-correlation properties of the sequences used to generate the transmitted waveforms play an important role in high-resolution SAR imaging based on a MIMO architecture. In practice, low cross-correlation between waveforms avoids interference, which yields independent information gains from the target signature at various angles. Similarly, a low auto-correlation peak side-lobe ratio ensures high resolution in the range domain. Thus, waveforms with low cross-correlation and low auto-correlation peak side-lobes are desired for MIMO SAR systems: good auto-correlation provides high-resolution target detection, while low cross-correlation mitigates interference from nearby sensors.

Orthogonality is the most important property of Walsh-Hadamard sequences [12]. Because of this property, the cross-correlation between any two codes of the same set is zero, as shown in Figure 7. Unfortunately, Walsh sequences are orthogonal only under perfect synchronization; in the asynchronous case they exhibit non-zero off-peak auto-correlation and cross-correlation. To compare the performance of OFDM signals using Walsh-Hadamard sequences and the proposed orthogonal sequences for radar applications, we analyze the ACF and CCF assuming a point target at the center of the target area.
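The synchronous-versus-asynchronous behavior of Walsh-Hadamard codes described above can be checked numerically; the sketch below uses scipy's Hadamard matrix generator and should be read as an illustrative check rather than the chapter's simulation setup.

```python
import numpy as np
from scipy.linalg import hadamard

H = hadamard(16)          # 16 x 16 Walsh-Hadamard matrix with +1/-1 entries
w1, w2 = H[1], H[2]       # two different codes of the same set

print(np.dot(w1, w2))     # 0: orthogonal under perfect synchronization
# Off-peak (asynchronous) correlations are generally non-zero:
print(np.correlate(w1, w2, mode="full"))
print(np.correlate(w1, w1, mode="full"))   # non-zero off-peak auto-correlation
```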

Figure 8.

Point target profile using Walsh-Hadamard sequences and proposed orthogonal sequences (a) ACF of Walsh-Hadamard (b) ACF of proposed orthogonal sequences (c) CCF of Walsh-Hadamard (d) CCF of proposed orthogonal sequences

Figure 8(a) and Figure 8(b) show the auto-correlation, while Figure 8(c) and Figure 8(d) show the cross-correlation for a point target using Walsh-Hadamard sequences and the proposed orthogonal sequences, respectively. The auto-correlation is measured between the received and transmitted signal of the same antenna, while the cross-correlation is measured between the transmitted signal of one antenna and the received signal of another antenna. We can observe a significant improvement in both ACF and CCF for the point target profile using the proposed orthogonal pulses compared with the Walsh-Hadamard sequences. In MIMO SAR, the ACF between the transmitted and received signal of the same antenna should have a narrow main-lobe width for high resolution, while a low CCF between the transmitted signal of one antenna and the received signal of another antenna is needed to avoid interference from nearby sensors. The main-lobe width of the proposed sequence shown in Figure 8(b) is considerably narrower than in Figure 8(a), which in turn improves the range resolution. For the CCF, since all cross-correlation values, not just the peak, affect system performance, the mean cross-correlation value is the appropriate measure. The mean CCF of the proposed sequence shown in Figure 8(d) is much lower than that of the Walsh-Hadamard sequence shown in Figure 8(c).
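To make the comparison quantitative, the two figures of merit used above, the auto-correlation peak side-lobe ratio and the mean cross-correlation, can be computed as in the sketch below; the main-lobe exclusion width and the normalization by the ACF peaks are assumptions of this illustration.

```python
import numpy as np

def peak_sidelobe_ratio_db(acf, mainlobe_halfwidth=2):
    """Largest ACF side-lobe relative to the main-lobe peak, in dB.

    A small window around the peak is excluded as the main lobe; its width
    is a tunable assumption.
    """
    mag = np.abs(acf)
    peak = int(np.argmax(mag))
    mask = np.ones(mag.size, dtype=bool)
    mask[max(0, peak - mainlobe_halfwidth):peak + mainlobe_halfwidth + 1] = False
    return 20.0 * np.log10(mag[mask].max() / mag[peak])

def mean_ccf(ccf, acf_x, acf_y):
    """Mean cross-correlation magnitude, normalized by the two ACF peaks."""
    norm = np.sqrt(np.abs(acf_x).max() * np.abs(acf_y).max())
    return float(np.abs(ccf).mean() / norm)
```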


4. MIMO wide-swath SAR imaging system

In MIMO SAR, independent signals are transmitted through different antennas, and these signals are received by multiple antennas after propagating through the environment. Each antenna transmits a waveform orthogonal to those transmitted by the other antennas, so the returns of each orthogonal signal carry independent information about the targets. At the receiver, a matched filter bank is used to extract the orthogonal waveform components. Consider a MIMO SAR system with a transmit array of 2 co-located antennas and a receive array (possibly the same array) of 2 co-located antennas. Both the transmit and receive arrays are close to each other in space, but they illuminate different target areas in different directions. Figure 9 shows the MIMO wide-swath stripmap SAR imaging topology and Figure 10 shows the block diagram of the MIMO OFDM SAR imaging system.

Figure 9.

MIMO stripmap wide-swath SAR imaging topology

Antenna beams A and B illuminate swaths A and B, respectively. At a given PRI, TxA transmits the pulse ΨtxA(t) via antenna beam A, while TxB simultaneously transmits the pulse ΨtxB(t) via antenna beam B. Echoes from swaths A and B are present at both receivers. To separate the echoes from swaths A and B, careful design of both the transmit antenna patterns and the transmitted pulses is required; this also reduces the disturbance from echoes of the momentarily undesired swath.

The OFDM signal generator produces the signals according to the scheme shown in Figure 2. Details of the block diagram components shown in Figure 10, such as the D/A converter, mixer, and power amplifier, can be found in [5]. We consider four orthogonal sub-pulses based on the sample vectors shown in Table 1. Two different signals, ΨtxA(t) and ΨtxB(t), are transmitted simultaneously from antennas A and B, respectively, at each PRI, where each signal is the combination of two sub-pulses and is given as

$$\Psi_{tx}^{A}(t) = \Psi_{tx}^{1}(t) + \Psi_{tx}^{2}(t) \qquad (3)$$
$$\Psi_{tx}^{B}(t) = \Psi_{tx}^{3}(t) + \Psi_{tx}^{4}(t) \qquad (4)$$

Figure 10.

MIMO OFDM SAR imaging system

The received signal for radar at antenna A is given by

$$\Psi_{rx}^{A}(t,u) = \alpha \sum_{n=1}^{N} \sigma_n\, \Psi_{tx}^{A}\!\left(t - t_{d_n}^{A}\right) + \beta \sum_{n=1}^{N} \sigma_n\, \Psi_{tx}^{B}\!\left(t - t_{d_n}^{B}\right) + \eta_A(t) \qquad (5)$$

Similarly, the received signal at antenna B is given as

$$\Psi_{rx}^{B}(t,u) = \alpha \sum_{n=1}^{N} \sigma_n\, \Psi_{tx}^{B}\!\left(t - t_{d_n}^{B}\right) + \beta \sum_{n=1}^{N} \sigma_n\, \Psi_{tx}^{A}\!\left(t - t_{d_n}^{A}\right) + \eta_B(t) \qquad (6)$$

where α and β are scale factors chosen as 1/2 and 1/10, respectively: α distributes the total power between the two sub-pulses and β models the out-of-beam signal. The term $t_{d_n}^{A} = \frac{2}{c}\sqrt{(X_c^A + x_n)^2 + (y_n - u)^2}$ is the time delay associated with the target position $(x_n, y_n)$ in swath A, and $t_{d_n}^{B} = \frac{2}{c}\sqrt{(X_c^B + x_n)^2 + (y_n - u)^2}$ is the corresponding delay for swath B. $X_c^A$ and $X_c^B$ denote the range distances to the centers of swaths A and B, respectively; n = 1, 2, …, N indexes the targets within the antenna beam at a given synthetic aperture position u in the azimuth direction, and $\sigma_n$ denotes the reflectivity of the nth target. The terms $\eta_A(t)$ and $\eta_B(t)$ denote additive white Gaussian noise.
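For reference, the received-signal model of equation (5) can be simulated directly; the sketch below delays each sub-pulse by the nearest sample, which, together with the noise level and the helper names, is a simplifying assumption rather than the chapter's raw-data generator.

```python
import numpy as np

C = 3.0e8      # propagation speed (m/s)
DT = 1.0e-9    # sampling interval, 1 ns (section 2)

def delayed(pulse, t_d, n_total):
    """Place a sampled pulse at a delay t_d (rounded to the nearest sample)."""
    out = np.zeros(n_total, dtype=complex)
    k = int(round(t_d / DT))
    out[k:k + len(pulse)] = pulse[:max(0, n_total - k)]
    return out

def rx_antenna_A(psi_A, psi_B, targets_A, targets_B, u, Xc_A, Xc_B,
                 n_total, alpha=0.5, beta=0.1, noise_std=1e-3, rng=None):
    """Echo at antenna A for one aperture position u (equation (5)).

    targets_A / targets_B are lists of (x_n, y_n, sigma_n) tuples.
    """
    rng = np.random.default_rng() if rng is None else rng
    rx = np.zeros(n_total, dtype=complex)
    for (x, y, sigma) in targets_A:                  # in-beam swath A echoes
        t_d = 2.0 / C * np.sqrt((Xc_A + x) ** 2 + (y - u) ** 2)
        rx += alpha * sigma * delayed(psi_A, t_d, n_total)
    for (x, y, sigma) in targets_B:                  # out-of-beam swath B echoes
        t_d = 2.0 / C * np.sqrt((Xc_B + x) ** 2 + (y - u) ** 2)
        rx += beta * sigma * delayed(psi_B, t_d, n_total)
    noise = noise_std * (rng.standard_normal(n_total)
                         + 1j * rng.standard_normal(n_total))
    return rx + noise
```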

Next, the received radar echoes are separated by matched filtering. Since the transmitted signal matrix is known to both transmitter and receiver and the transmitted waveforms are designed to be orthogonal, they should satisfy the conditions

$$\int_{0}^{T_p} \Psi_{rx}^{m}(t)\, \Psi_{tx}^{n*}(t)\, dt = \begin{cases} \delta(t), & m = n \\ 0, & m \ne n \end{cases} \qquad (7)$$

where $T_p$ is the sub-pulse duration and $(\cdot)^*$ denotes the conjugate operator. At receiving antenna A, the two received orthogonal sub-pulses are extracted by two matched filters given by

$$\Psi_{MF}^{n}(t) = \mathcal{F}^{-1}\!\left[\mathcal{F}\{\Psi_{rx}^{A}(t)\} \cdot \mathcal{F}\{\Psi_{tx}^{n*}(t)\}\right] \qquad (8)$$

Similarly, at receiving antenna B, two sub-pulses can be separated as

$$\Psi_{MF}^{n}(t) = \mathcal{F}^{-1}\!\left[\mathcal{F}\{\Psi_{rx}^{B}(t)\} \cdot \mathcal{F}\{\Psi_{tx}^{n*}(t)\}\right] \qquad (9)$$

where n = 1, 2 for equation (8) and n = 3, 4 for equation (9), while $\mathcal{F}^{-1}$ and $\mathcal{F}$ denote the inverse Fourier transform and Fourier transform operations, respectively. The echoes from the different swaths can therefore be considered well separated after matched filtering.
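A direct numerical realization of the matched filters in equations (8) and (9) is sketched below; zero-padding to the full linear-correlation length and conjugating the pulse spectrum are the usual matched-filter choices assumed in this illustration.

```python
import numpy as np

def matched_filter(rx, psi_tx):
    """Extract one sub-pulse from a received echo (equations (8)-(9)).

    Implements F^{-1}[ F{rx} . conj(F{psi_tx}) ] with zero-padding so that
    the product corresponds to a linear, not circular, correlation.
    """
    n = len(rx) + len(psi_tx) - 1
    return np.fft.ifft(np.fft.fft(rx, n) * np.conj(np.fft.fft(psi_tx, n)))

# At antenna A the two sub-pulses would be separated as, e.g.,
# out_1 = matched_filter(rx_A, pulses[0])
# out_2 = matched_filter(rx_A, pulses[1])
```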

Finally, we obtain a total of four extracted signals from the two receiving antennas. Compared to a traditional phased-array SAR, where the same waveform is used at all transmitting antennas and only 2 coefficients are obtained from the matched filtering, the MIMO OFDM SAR yields more coefficients and therefore provides more degrees of freedom [8]. Each matched filter output is then processed separately using a SAR imaging algorithm such as the Range-Doppler algorithm, and an image fusion technique is then applied to obtain the final reconstructed SAR image, as described in the following sections.


5. MIMO wide-swath SAR imaging

The scenario involves wide-swath SAR imaging using the four distinct orthogonal UWB-OFDM sub-pulses as SAR transmitted signals over two antennas. The objective is to investigate the performance of the proposed orthogonal waveforms in a MIMO architecture. Consider 2 point targets residing in swath A at positions [(x1, y1), (x2, y2)] = [(300, 100), (900, -50)] and 2 point targets in swath B at positions [(x3, y3), (x4, y4)] = [(300, -50), (900, 100)]. The stripmap SAR imaging topology is used for raw data generation based on the proposed UWB-OFDM waveforms [5], while the Range-Doppler algorithm is used for SAR image reconstruction [13, 14]. The SAR raw data from the multiple antennas can be processed in parallel; a field programmable gate array (FPGA) is a powerful tool for real-time implementation of SAR image reconstruction from raw data [15, 16]. Figure 11 and Figure 12 show the resolved images of swath A based on the outputs of matched filters 1 and 2, respectively, while Figure 13 and Figure 14 show the reconstructed images of swath B based on the outputs of matched filters 3 and 4.


6. Image fusion

Observing a given scene from two SAR antennas with distinct trajectories allows one to determine the position of the scattering points. Unfortunately, SAR interferometry fails when the scenes imaged by the two antennas are not really the same scene because the distance between the two antenna trajectories is too large; in such cases the two images may not be sufficiently correlated. SAR image fusion is presented here by exploiting the data recorded by the same antennas about the same scene using two sub-pulses simultaneously. The usefulness of the fusion technique is evaluated by estimating the noise level of the non-fused and fused images in terms of entropy. In addition, the behavior of a back-scatterer as a function of frequency changes with the surface type; therefore, if images acquired in several regions of the spectrum are fused, the output image carries useful information about specific back-scatterers. Furthermore, the fusion of multi-frequency images allows us to combine the information acquired about the observed object in several spectral bands within the same spatial context. Complementary information about the same observed scene is available in the following cases:
– data recorded by the same sensor scanning the same scene at different dates (multi-temporal image fusion);
– data recorded by the same sensor operating in different spectral bands (multi-frequency image fusion);
– data recorded by the same sensor at different polarizations (multi-polarization image fusion);
– data recorded by the same sensor located on platforms flying at different heights (multi-resolution image fusion).

Figure 11.

Reconstructed image from matched filter 1 (swath A)

Figure 12.

Reconstructed image from matched filter 2 (swath A)

Many methods exist to perform image fusion. A very basic one is based on the discrete wavelet transform (DWT), which has become a very useful tool for fusion. The DWT is a wavelet transform for which the wavelets are discretely sampled. As with other wavelet transforms, a key advantage over Fourier transforms is temporal resolution: it captures both frequency and location (in time) information. Figure 15 shows the block diagram of the wavelet transform based image fusion technique. The principle of image fusion using wavelets is to merge the wavelet decompositions of the two original images by applying fusion rules to the approximation and detail coefficients [11]. The DWT is a spatial-frequency decomposition that provides a flexible multi-resolution analysis of an image. The inverse discrete wavelet transform (IDWT) is applied to the combined coefficient map to produce the fused image from the two input images.

Figure 13.

Reconstructed image of matched filter 3 (swath B)

In all wavelet based image fusion schemes, the DWTs of the two registered input images I1(x, y) and I2(x, y) are computed and combined using a fusion rule. The inverse discrete wavelet transform (IDWT) is then computed and the fused image I(x, y) is reconstructed as

$$I(x,y) = W^{-1}\!\left\{\phi\!\left(W\{I_1(x,y)\},\, W\{I_2(x,y)\}\right)\right\} \qquad (10)$$

where $W$ and $W^{-1}$ denote the DWT and IDWT, respectively, and $\phi$ denotes the rules imposed in the fusion, such as the wavelet function, decomposition level, and the treatment of approximation and detail coefficients. Figure 16 shows the single-level decomposition of the image in Figure 14 using the Haar wavelet function.

Figure 14.

Reconstructed image of matched filter 4 (swath B)

Figure 15.

Wavelet transform based image fusion

Figure 17 shows the fused image obtained from the reconstructed images of matched filters 1 and 2, while Figure 18 shows the fused image obtained from the reconstructed images of matched filters 3 and 4. The fusion results in Figure 17 and Figure 18 are obtained by taking the 'maximum' for the 'approximations' and the 'minimum' for the 'details' at level 5 with the Haar wavelet. The Haar wavelet is chosen because of its simplicity and good reconstruction capability. Since wavelet coefficients with large absolute values carry the salient features of the images, such as edges and lines, a good fusion rule is to choose the 'maximum' for the 'approximation' coefficients, while the 'minimum' is chosen for the 'details' to suppress noise. The final reconstructed wide-swath SAR image shown in Figure 19, with all resolved point targets of swaths A and B, is the horizontal concatenation of the fused images of Figure 17 and Figure 18.
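A compact sketch of this fusion rule using the PyWavelets library is given below; the element-wise interpretation of the 'maximum'/'minimum' selection of approximation and detail coefficients, and the example input names, are assumptions of this illustration.

```python
import numpy as np
import pywt

def wavelet_fuse(img1, img2, wavelet="haar", level=5):
    """Fuse two co-registered SAR images: max of approximations, min of details."""
    c1 = pywt.wavedec2(img1, wavelet, level=level)
    c2 = pywt.wavedec2(img2, wavelet, level=level)
    fused = [np.maximum(c1[0], c2[0])]             # approximation coefficients
    for d1, d2 in zip(c1[1:], c2[1:]):             # (cH, cV, cD) at each level
        fused.append(tuple(np.minimum(a, b) for a, b in zip(d1, d2)))
    return pywt.waverec2(fused, wavelet)

# e.g. fused_A = wavelet_fuse(image_mf1, image_mf2)   # hypothetical inputs
#      fused_B = wavelet_fuse(image_mf3, image_mf4)
```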

Figure 16.

Single level decomposition (a) Approximation (b) Horizontal detail (c) Vertical detail (d) Diagonal detail

Figure 17.

Fused image of swath A

Figure 18.

Fused Image of Swath B

Figure 19.

Final SAR image

To assess the reduction in noise level due to the image fusion technique, we analyze both the input images and the fused image in terms of entropy. Entropy is a good measure of the information content (uncertainty) present in the image space and serves here as a measure of the roughness, i.e., the noise level, of the non-fused and fused images. Table 2 summarizes the entropy of the input images of swaths A and B as well as of the fused images for different wavelet families. We observe that the Haar wavelet gives the best reduction in noise level.

Fusion parameters (all wavelets): level 5, approximations: max, details: min.
Input image entropies: Swath A: image 1 = 4.6353, image 2 = 4.6072; Swath B: image 1 = 4.5606, image 2 = 4.5362.

Wavelet        Entropy of fused image (Swath A)    Entropy of fused image (Swath B)
Haar           3.9572                              3.8627
Daubechies1    3.9574                              3.8630
Symlets2       3.9594                              3.8696
Coiflets2      3.9605                              3.8723

Table 2.

Entropy of fused SAR images using different wavelet families.
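The entropy values of Table 2 can be estimated with a short routine such as the one below; the 256-bin histogram of intensity magnitudes and the base-2 logarithm are assumptions about how the entropy is computed.

```python
import numpy as np

def image_entropy(img, bins=256):
    """Shannon entropy (bits) of the normalized intensity histogram of an image."""
    hist, _ = np.histogram(np.abs(img).ravel(), bins=bins)
    p = hist.astype(float) / hist.sum()
    p = p[p > 0]                       # ignore empty bins (0 * log 0 -> 0)
    return float(-(p * np.log2(p)).sum())

# e.g. image_entropy(fused_A) is expected to be lower than the entropy of
# either non-fused input image, as in Table 2.
```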


7. Conclusions

An image fusion based MIMO UWB-OFDM SAR system has been presented that provides wide-swath imaging. Pulse shaping is an important component in OFDM applications, and since orthogonal transmission waveforms are required for the proposed MIMO OFDM SAR system, a new approach to generating OFDM waveforms has been explored and investigated. It is shown that the proposed MIMO UWB-OFDM SAR indeed offers a potential solution for high-resolution remote sensing as well as wide-swath imaging. The usefulness of the developed approach has been demonstrated by fusing SAR images. Image fusion provides a powerful tool to reduce clutter and certain types of noise such as AWGN, and can thus be used to enhance the quality of SAR images. The performance of the system is assessed by testing the proposed technique on SAR data acquired by multiple sensors, and the results are evaluated by estimating the information flow from the input data to the output image in terms of automatic recognition and detection of features present in the acquired images. Each SAR sensor acquires data about the inspected region at more than one frequency, so a processor that exploits the information carried by the multiple frequencies is needed. Future work may include investigation of the proposed system by exploiting sensors that scan the scene from multiple heights using various platforms.


Acknowledgments

This work is funded by the National Plan for Science and Technology, Kingdom of Saudi Arabia, under project number: 08-ELE262-2.

References

  1. Soumekh M. Synthetic Aperture Radar Signal Processing with MATLAB Algorithms. New York: Wiley, 1999.
  2. Lim B-G, Woo J-C, Lee H-Y, and Kim Y-S. A Modified Subpulse SAR Processing Procedure Based on the Range-Doppler Algorithm for Synthetic Wideband Waveforms. Sensors, 8, pp. 8224-8236, 2008.
  3. Aiello R. and Wood S. Essentials of UWB. New York: Cambridge, 2008.
  4. Hanzo L. and Keller T. OFDM and MC-CDMA. West Sussex: Wiley, 2006.
  5. Hossain M. A., Elshafiey I., Alkanhal M., and Mabrouk A. Adaptive UWB-OFDM Synthetic Aperture Radar. Proc. of Saudi International Electronics, Communications and Photonics Conference (SIECPC), Riyadh, Saudi Arabia, pp. 1-6, 2011.
  6. Hossain M. A., Elshafiey I., and Alkanhal M. High-resolution UWB SAR based on OFDM architecture. Proc. of 3rd Asia-Pacific International Conference on Synthetic Aperture Radar (APSAR), Seoul, South Korea, pp. 1-4, 2011.
  7. Hossain M. A., Elshafiey I., Alkanhal M., and Mabrouk A. Anti-jamming Capabilities of UWB-OFDM SAR. Proc. of European Radar Conference (EuRad), Manchester, United Kingdom, pp. 313-316, 2011.
  8. Wang W-Q. Space-Time Coding MIMO-OFDM SAR for High-Resolution Imaging. IEEE Trans. on Geoscience and Remote Sensing, 49(8), pp. 3094-3104, 2011.
  9. Lim S. H., Hwang C. G., Kim S. Y., and Myung N. H. Shifting MIMO SAR System for High-resolution Wide-swath Imaging. Journal of Electromagnetic Waves and Applications (JEMWA), 25, pp. 1168-1178, 2011.
  10. Xu W., Huang P. P., and Deng Y. K. MIMO-TOPS Mode for High-resolution Ultra-Wide-Swath Full Polarimetric Imaging. Progress in Electromagnetics Research (PIER), 121, pp. 19-37, 2011.
  11. Zeeuw P. M. Wavelet and Image Fusion. Amsterdam: CWI, 1998.
  12. Du K-L. and Swamy M. N. Wireless Communication Systems. New York: Cambridge University Press, 2010.
  13. Cumming I. G. and Wong F. H. Digital Processing of Synthetic Aperture Radar Data. London: Artech House, 2004.
  14. Woo J. C., Lim B. G., and Kim Y. S. Modification of the Recursive Sidelobe Minimization Technique for the Range-Doppler Algorithm of SAR Imaging. Journal of Electromagnetic Waves and Applications (JEMWA), 25, pp. 1783-1794, 2011.
  15. Hossain M. A., Elshafiey I., Alkanhal M., and Mabrouk A. Real-time Implementation of UWB-OFDM Synthetic Aperture Radar Imaging. IEEE Int. Conf. on Signal and Image Processing Applications (ICSIPA), Kuala Lumpur, Malaysia, pp. 450-455, 2011.
  16. Atoche A. C. and Castillo J. V. Dual Super-Systolic Core for Real-Time Reconstructive Algorithms of High-Resolution Radar/SAR Imaging Systems. Sensors, 12, pp. 2539-2560, 2012.
