Open access peer-reviewed chapter

A Review: Remote Sensing Sensors

Written By

Lingli Zhu, Juha Suomalainen, Jingbin Liu, Juha Hyyppä, Harri Kaartinen and Henrik Haggren

Submitted: 07 June 2017 Reviewed: 19 September 2017 Published: 20 December 2017

DOI: 10.5772/intechopen.71049

From the Edited Volume

Multi-purposeful Application of Geospatial Data

Edited by Rustam B. Rustamov, Sabina Hasanova and Mahfuza H. Zeynalova


Abstract

The cost of launching satellites is steadily decreasing thanks to reusable rockets (NASA, 2015) and single missions that carry multiple satellites (up to 37 in one Russian launch in 2014). In addition, low-orbit satellite constellations have been deployed in recent years. These trends indicate that satellite remote sensing has a promising future for acquiring high-resolution data at low cost and for integrating high-resolution satellite imagery with ground-based sensor data in new applications. These developments have motivated us to compile a comprehensive survey of remote sensing sensor development, including the characteristics of sensors with respect to the electromagnetic spectrum (EMS), imaging and non-imaging sensors, potential research areas, current practices, and the future development of remote sensors.

Keywords

  • remote sensing
  • satellite
  • sensors
  • electromagnetic spectrum
  • spectrum of materials
  • imaging sensors
  • non-imaging sensors

1. Introduction

In 2015, one of the most remarkable events in the space industry occurred when SpaceX landed an orbital rocket's first stage for the first time, a key step toward reusability. Additionally, in June 2014, Russia used one rocket to launch 37 satellites at the same time. At present, many countries have the capability to launch multiple satellites in one mission. For example, NASA and the US Air Force launched 29 satellites in a single mission in 2013; at that time, this was the most satellites ever launched at once [1]. In 2015 and 2016, China and India each launched 20 satellites in a single mission. At present, six parties have the capability to launch multiple satellites in a single mission: Russia, the USA, China, India, Japan, and ESA. This trend indicates that the cost of sending satellites to space will greatly decrease, and more and more remote sensing resources are becoming available. It is therefore of great importance to survey the available remote sensing technology comprehensively and to utilize inter- and trans-disciplinary knowledge and technology to create new applications.

Remote sensing is considered a primary means of acquiring spatial data. Remote sensing measures electromagnetic radiation that interacts with the atmosphere and objects. Interactions of electromagnetic radiation with the surface of the Earth can provide information not only on the distance between the sensor and the object but also on the direction, intensity, wavelength, and polarization of the electromagnetic radiation [2]. These measurements can offer positional information about the objects and clues as to the characteristics of the surface materials.

Satellite remote sensing consists of one or multiple remote sensing instruments located on a satellite or satellite constellation collecting information about an object or phenomenon on the Earth's surface without being in direct physical contact with it. Compared to airborne and terrestrial platforms, spaceborne platforms are the most stable carriers. Satellites can be classified by their orbital geometry and timing. Three types of orbits are typically used for remote sensing satellites: geostationary, equatorial, and sun-synchronous. A geostationary satellite has a period of rotation equal to that of Earth (24 hours), so the satellite always stays over the same location on Earth. Communications and weather satellites often use geostationary orbits, many of them located over the equator. In an equatorial orbit, a satellite circles the Earth at a low inclination (the angle between the orbital plane and the equatorial plane); the Space Shuttle, for example, flew in such an orbit with an inclination of 57°. Sun-synchronous satellites have orbits with high inclination angles, passing nearly over the poles. These orbits are timed so that the satellite always passes over the equator at the same local sun time; in this way, the satellites maintain the same relative position to the sun for all of their orbits. Many remote sensing satellites are sun synchronous, which ensures repeatable sun illumination conditions during specific seasons. Because a sun-synchronous orbit does not pass directly over the poles, it is not always possible to acquire data for the extreme polar regions. The frequency at which a satellite sensor can acquire data of the entire Earth depends on the sensor and orbital characteristics [3]. For most remote sensing satellites, the total coverage frequency ranges from twice a day to once every 16 days. Another orbital characteristic is altitude: the Space Shuttle had a low orbital altitude of 300 km, whereas common remote sensing satellites typically maintain higher orbits ranging from 600 to 1000 km.
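To make the geostationary condition above concrete, the following minimal sketch (assuming standard textbook constants, with Python used purely as an illustration) derives the geostationary altitude from Kepler's third law, T = 2π√(a³/μ):

```python
import math

# Sketch: solve Kepler's third law, T = 2*pi*sqrt(a^3/mu), for the orbit
# whose period matches one rotation of the Earth (the sidereal day).
MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.378e6           # equatorial radius, m
T_SIDEREAL = 86164.1        # sidereal day, s (slightly less than 24 h)

a = (MU_EARTH * (T_SIDEREAL / (2 * math.pi)) ** 2) ** (1 / 3)
print(f"Geostationary altitude: {(a - R_EARTH) / 1000:.0f} km")  # ~35,786 km
```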

The interaction between a sensor and the surface of the Earth has two modes: active or passive. Passive sensors utilize solar radiation that illuminates the Earth's surface and detect the reflection from the surface. They typically record electromagnetic waves in the visible (~430–720 nm) and near-infrared (NIR) (~750–950 nm) ranges. Some systems, such as SPOT 5, are also designed to acquire images at middle-infrared (MIR) wavelengths (1580–1750 nm). The power measured by passive sensors is a function of the surface composition, physical temperature, surface roughness, and other physical characteristics of the Earth [4]. Examples of passive satellite sensors are those aboard the Landsat, SPOT, Pléiades, EROS, GeoEye, and WorldView satellites. Active sensors provide their own source of energy to illuminate objects and measure the returned signal. These sensors use electromagnetic waves in the visible and near-infrared range (e.g., a laser rangefinder or a laser altimeter) and radar waves (e.g., synthetic aperture radar (SAR)). A laser rangefinder uses a laser beam to determine the distance between the sensor and the object and is typically used in airborne and ground-based laser scanning. A laser altimeter uses a laser beam to determine the altitude of an object above a fixed level and is typically utilized on satellite and aerial platforms. SAR uses microwaves to illuminate a ground target with a side-looking geometry and measures the backscatter and travel time of the transmitted waves reflected by objects on the ground. The distance the SAR device travels over a target in the time taken for the radar pulses to return to the antenna forms the synthetic aperture that produces the SAR image. SAR can be mounted on a moving platform, such as a spaceborne or airborne platform. According to the combination of frequency bands and polarization modes used in data acquisition, sensors can be categorized as single frequency (L-band, C-band, or X-band), multiple frequency (a combination of two or more frequency bands), single polarization (VV, HH, or HV), and multiple polarization (a combination of two or more polarization modes). Currently, there are three commercial SAR missions in space: Germany's TerraSAR-X and TanDEM-X (X-band with a ~3.5 cm wavelength), Italy's COSMO-SkyMed (X-band with a ~3.5 cm wavelength), and Canada's RADARSAT-2 (C-band with a ~6 cm wavelength). In addition, ESA's ERS-1, ERS-2, and Envisat also carried SAR, although these missions have ended. The latest SAR satellites from ESA include Sentinel-1A, Sentinel-1B, and Sentinel-3A. Typical SAR parameters are repeat frequency, pulse repetition frequency, bandwidth, polarization, incidence angle, imaging mode, and orbit direction [5].

As sensor technology has advanced, the integration of passive and active sensors into one system has emerged. This trend makes it difficult to categorize sensors in the traditional way, into passive sensors and active sensors. In this paper, we therefore introduce the sensors in terms of imaging or non-imaging functionality. Imaging sensors typically employ optical imaging systems, thermal imaging systems, or SAR. Optical imaging systems use the visible, near-infrared, and shortwave infrared spectrums and typically produce panchromatic, multispectral, and hyperspectral imagery. Thermal imaging systems employ mid- to longwave infrared wavelengths. Non-imaging sensors include microwave radiometers, microwave altimeters, magnetic sensors, gravimeters, Fourier spectrometers, laser rangefinders, and laser altimeters [6].

It has been decades since Landsat-1, the first Earth resources technology satellite, was launched in 1972. Satellite platforms have evolved from single satellites to multi-satellite constellations. Sensors have experienced unprecedented development over the years, from the first multispectral satellite, Landsat-1, with four spectral bands in 1972, to the first hyperspectral satellite, Lewis, with 384 spectral bands in 1997. Spatial resolution has also significantly improved over the decades, from 80 m on Landsat-1 to 31 cm on WorldView-3. A number of studies on satellite imagery processing methods and applications have been conducted, but only a few papers providing sensor overviews have been published, including [7, 8, 9]. Blais [7] reviewed the range sensors developed over the preceding two decades, including single-point and laser scanners, slit scanners, pattern projection, and time-of-flight systems, along with related commercial systems. Melesse et al. [8] provided a survey of remote sensing sensors for typical environmental and natural resources mapping purposes, such as urban studies, hydrological modeling, land-cover and floodplain mapping, fractional vegetation cover and impervious surface area mapping, surface energy flux and micro-topography correlation, remotely sensed rainfall, and potential evapotranspiration for estimating crop water requirement satisfaction indexes. Recently, a survey on remote sensing platforms and sensors was provided by Toth and Jóźków [9]. The authors gave a general review of current remote sensing platforms, including satellites, airborne platforms, UAVs, ground-based mobile and static platforms, sensor georeferencing and supporting navigation infrastructure, and provided a short summary of imaging sensors.

In the literature, we found that overviews of remote sensing sensors are quite rare, partly because the topic is fairly broad: one can usually find either detailed knowledge in thick books or a very simple overview on webpages. As most readers need to obtain relevant knowledge within a reasonable time and at a modest depth, our paper aims to fill this gap. In this paper, we review the history of remote sensing, the interaction of the electromagnetic spectrum (EMS) and objects, imaging sensors and non-imaging sensors (e.g., laser rangefinders/altimeters), and commonly used satellites and their characteristics. In addition, future trends and potential applications are addressed. Although this paper is mainly about satellite sensors, there is no sharp boundary between satellite sensors and airborne, UAV-based, or ground-based sensors, except that satellite sensors have more interaction with the atmosphere. Therefore, we use the term "remote sensing sensors" generally.


2. Remarkable development in spaceborne remote sensing

Although the term 'remote sensing' was introduced in 1960, the practice of remote sensing has a long history. In the 1600s, Galileo used optical enhancements to survey celestial bodies [10]. An early exploration of prisms was conducted by Sir Isaac Newton in 1666; Newton discovered that a prism dispersed light into a spectrum of red, orange, yellow, green, blue, indigo, and violet and recombined the spectrum into white light. Over a century later, in 1800, Sir William Herschel explored thermal infrared electromagnetic radiation for the first time, measuring the temperature of light that had been split with a prism into the spectrum of visible colors. In the following decades, attempts were made at aerial photography using cameras attached to balloons, but the results were not satisfactory until 1858, when Gaspard-Félix Tournachon took the first successful aerial photograph from a captive balloon at an altitude of 1200 feet over Paris. Later, in 1889 in Labruguière, France, Arthur Batut attached a camera and an altimeter to a kite for the first time so that the image scale could be determined; he is therefore considered the father of kite aerial photography. At the beginning of the twentieth century, cameras were miniaturized (to approximately 70 g) so that they could easily be carried by pigeons, and the Bavarian Pigeon Corps took the first aerial photos using a camera attached to a pigeon in 1903. During the First World War, the use of aerial photography grew. Later, in 1936, Albert W. Stevens took the first photograph of the actual curvature of the Earth from a free balloon at an altitude of 72,000 feet, and the first space photographs were acquired from V-2 rockets in 1946. Table 1 summarizes the evolution of remote sensing, excluding this early development stage; it starts with the use of aerial photographs for surveying and mapping as well as for military use. The milestones in this evolution (see Table 1) are referenced to [7, 10]. Recent developments in microsatellites and satellite constellations are also listed in Table 1.

| Phases | Time period | Remarks |
| --- | --- | --- |
| Airborne remote sensing | During the First and Second World Wars | The use of photographs for surveying, mapping, reconnaissance, and military surveillance |
| Rudimentary spaceborne satellite remote sensing | Late 1950s | The launch of Sputnik 1 by Russia in 1957 and Explorer 1 by the US in 1958 |
| Spy satellite remote sensing | Cold War (1947–1991) | Remote sensing for military use spilled over into mapping and environmental applications |
| Meteorological satellite remote sensing | 1960– | The launch of the first meteorological satellite (TIROS-1) by the US in 1960; since then, data in digital formats and the use of computer hardware and software |
| Landsat | 1972– | Landsat 1, 2, and 3 carried a multispectral scanner; Landsat 4 and 5 carried the Thematic Mapper; Landsat 7 carries the Enhanced Thematic Mapper; Landsat 8 carries the Operational Land Imager. Landsat satellites offer high resolution and global coverage; applications were initially local and have since become global |
| European Space Agency's first Earth observing satellite program | 1991– | ESA launched its first satellite, ERS-1, in 1991, carrying a variety of Earth observation instruments: a radar altimeter, ATSR-1, SAR, a wind scatterometer, and a microwave radiometer. Its successor, ERS-2, was launched in 1995 |
| Earth Observing System (EOS) | Since the launch of the Terra satellite in 1999 | Terra/Aqua satellites carry sensors such as MODIS and MOPITT (measuring pollution in the troposphere). Global coverage, frequent repeat coverage, a high level of processing, and easy, mostly free access to data |
| New millennium | Around the same time as EOS | Next generation of satellites and sensors, such as Earth Observing-1, which acquired the first spaceborne hyperspectral data |
| Private industry/commercial satellite systems | 2000– | (1) Very high-resolution data, such as from the IKONOS and QuickBird satellites; (2) a revolutionary means of data acquisition: daily coverage of any spot on Earth at high resolution, such as RapidEye; (3) Google streaming technology allowing rapid access to very high-resolution images; (4) the launch of GeoEye-1 in 2008 for very high-resolution imagery (0.41 m) |
| Microsatellite era and satellite constellations | 2008– | (1) Small satellites and constellations (RapidEye and Terra Bella, formerly Skybox): RapidEye, launched in August 2008 with five Earth observation satellites, included the first commercial Red-Edge band, which is sensitive to changes in chlorophyll content; on March 8, 2016, Skybox Imaging was renamed Terra Bella, whose satellites captured the first-ever commercial high-resolution video of Earth along with high-resolution color and near-infrared imagery; (2) Russia's first single mission launching 37 satellites, in June 2014; (3) ESA's launch of the first Sentinel constellation satellite in April 2014; (4) SpaceX's reusable rocket capability since December 2015; (5) current satellites with short revisit periods, large coverage, and spatial resolution up to 31 cm |

Table 1.

Evolution and advancement in remote sensing satellites and sensors.


3. Characteristics of materials in electromagnetic spectrum (EMS)

Remote sensors interact with objects on the surface of the Earth from a distance. These objects generally include terrain, buildings, roads, vegetation, and water. The typical materials of these objects are categorized by how they interact with the EMS into two groups: transparent and opaque (partly or fully absorbing).

3.1. Electromagnetic spectrum

Figure 1 shows the EMS range from gamma rays to radio waves. In remote sensing, typical applications use the visible light (380–780 nm), infrared (780 nm–0.1 mm), and microwave (0.1 mm–1 m) ranges. This paper treats the terahertz range (0.1–1 mm) as an independent spectral band separate from microwaves. Remote sensing sensors interact with objects remotely; between the sensors and the Earth's surface lies the atmosphere. It is estimated that only 67% of sunlight directly heats the Earth [11]; the remainder is absorbed and reflected by the atmosphere, which strongly absorbs infrared and UV radiation.

In visible light, typical remote sensing applications include the blue (450–495 nm), green (495–570 nm), and red (620–750 nm) spectral bands for panchromatic, multispectral, or hyperspectral imaging. Current bathymetric and ice LIDARs generally use green light (e.g., NASA's HSRL-1 LIDAR, operating at 532 nm). However, new experiments have shown that in the blue spectrum, such as at 440 nm, the absorption coefficient of water is approximately an order of magnitude smaller than at 532 nm, and 420–460 nm light can penetrate relatively clear water and ice much deeper, offering substantial improvements in sensing through water for the same optical power output and thus reducing power requirements [11]. The red spectrum, together with the near-infrared (NIR), is typically used for vegetation applications. For example, the Normalized Difference Vegetation Index (NDVI) is used to evaluate whether targets contain live green vegetation.

Infrared is invisible radiant energy. It is usually divided into regions: near IR (NIR, 0.75–1.4 μm), shortwave IR (SWIR, 1.4–3 μm), mid-IR (MIR, 3–8 μm), longwave IR (LWIR, 8–15 μm), and far IR (FIR, 15–1000 μm). Alternatively, the ISO 20473 scheme divides it into NIR (0.78–3 μm), MIR (3–50 μm), and FIR (50–1000 μm). Most of the infrared radiation in sunlight is in the NIR range, whereas most of the thermal radiation emitted by objects near room temperature is infrared in the mid-infrared region, at much longer wavelengths than in sunlight [14]. Of natural thermal radiation processes on the Earth's surface, only lightning and natural fires are hot enough to produce much visible energy, and fires produce far more infrared than visible light energy. NIR is mainly used in medical imaging and physiological diagnostics. One typical application of MIR and FIR is thermal imaging, for example, in night-vision devices. In the MIR and FIR bands, water shows high absorption, whereas biological systems are highly transmissive.

Figure 1.

The electromagnetic spectrum. Image from UC Davis ChemWiki, CC-BY-NC-SA 3.0.
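As a simple illustration of the NDVI mentioned above, the sketch below computes NDVI = (NIR − Red)/(NIR + Red) on two small reflectance arrays; the values are invented for the example, and a real workflow would read the bands from a multispectral product:

```python
import numpy as np

# NDVI = (NIR - Red) / (NIR + Red); dense vegetation approaches +1,
# bare soil sits near 0, and water is typically negative.
red = np.array([[0.05, 0.30], [0.04, 0.25]])  # red reflectance (illustrative)
nir = np.array([[0.50, 0.32], [0.45, 0.27]])  # NIR reflectance (illustrative)

ndvi = (nir - red) / (nir + red + 1e-10)      # small epsilon avoids division by zero
print(ndvi.round(2))
```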

With regard to the terahertz spectrum band, terahertz frequencies are useful for investigating biological molecules. Unlike more commonly used forms of radiated energy, this range has rarely been studied, partly because no one knew how to make these frequencies bright enough [12] and partly because practical applications have been impeded by ambient moisture interfering with wave transmission [13]. Nevertheless, terahertz light (also called T-rays) has remarkable properties. T-rays are safe, non-ionizing electromagnetic radiation. They pose little or no health threat and can pass through clothing, paper, cardboard, wood, masonry, plastic, and ceramics, and can also penetrate fog and clouds. THz radiation transmits through almost anything except metal and liquids (e.g., water). T-rays can be used to reveal explosives or other dangerous substances in packaging, corrugated cardboard, clothing, shoes, backpacks, and book bags. However, the technique cannot detect materials that might be concealed in body cavities [14].

The terahertz region is technically the boundary between electronics and opto-photonics [15]. The wavelengths of T-rays, shorter than microwaves and longer than infrared, correspond to biomolecular vibrations. This light can provide imaging and sensing capabilities not available through conventional technologies, such as microwaves [16]. For example, T-rays can penetrate fabrics, and many common materials and living tissues are semi-transparent to them and have 'terahertz fingerprints', permitting them to be imaged, identified, and analyzed [17]. In addition, terahertz radiation has the unique ability to non-destructively image physical structures and perform spectroscopic analysis without any contact with valuable and delicate paintings, manuscripts, and artifacts, and it can be utilized to measure objects that are opaque in the visible and near-infrared regions. Terahertz pulsed imaging techniques operate in much the same way as ultrasound and radar to accurately locate embedded or distant objects [18]. Current commercial terahertz instruments include terahertz 3D medical imaging, security scanning systems, and terahertz spectroscopy. A recent breakthrough (September 2016) was a terahertz camera from MIT that can read a closed book by distinguishing ink from blank regions on paper; the article indicates that 'in its current form the terahertz camera can accurately calculate distance to a depth of about 20 pages' [19]. It is expected that in the future this technology can be used to explore and catalog historical documents without actually having to touch or open them and risk damage.

Regarding microwaves, shorter microwaves are typically used in remote sensing; radar, for example, uses wavelengths just a few inches long. Microwaves are typically used for obtaining information on the atmosphere, land, and ocean, as in Doppler radar used in weather forecasting, and for gathering unique information such as sea wind and wave direction, derived from frequency characteristics, including the Doppler effect, polarization, and backscattering, that cannot be observed by visible and infrared sensors [20]. In addition, microwave energy can penetrate haze, light rain and snow, clouds, and smoke [21], so microwave sensors work in any weather condition and at any time.
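To make the Doppler principle concrete, the following minimal sketch applies the standard two-way Doppler relation, f_d = 2v/λ; the C-band frequency and target velocity are illustrative assumptions, not values from the text:

```python
C = 3.0e8  # speed of light, m/s

def doppler_shift(v_radial_ms: float, freq_hz: float) -> float:
    """Two-way Doppler shift (Hz) for a target moving toward the radar."""
    wavelength = C / freq_hz
    return 2.0 * v_radial_ms / wavelength

# Example: rain moving at 10 m/s seen by a 5.6 GHz (C-band) weather radar.
print(f"{doppler_shift(10.0, 5.6e9):.0f} Hz")  # ~373 Hz
```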

3.2. Objects and spectrum

When light encounters an object, the two can interact in several different ways: transmission, reflection, and absorption. The interaction depends on the wavelength of the light and the nature of the object's material.

Most materials exhibit all three behaviors when interacting with light: partial transmission, partial reflection, and partial absorption. According to the dominant optical property, we categorize objects into two typical types: transparent materials and opaque materials.

Transparent materials allow light to pass through without being scattered or absorbed. Typical transparent objects include plate glass and clean water. Figure 2 shows the transmission spectrum of soda-lime glass with a 2-mm thickness. Soda-lime glass is typically used in windows (also called flat glass) and glass containers. From Figure 2, it can be seen that soda-lime glass nearly blocks UV radiation but has high transmittance at visible and NIR wavelengths. It is thus easy to understand that when a laser scanner with a wavelength of 905, 1064, or 1550 nm hits a flat glass window or a glassy balcony, over 80% of the laser energy passes through the glass and hits the objects behind it. Another typical transmissive material is clear water. Water transmittance is very high in the blue-green part of the spectrum but diminishes rapidly at near-infrared wavelengths (see Figure 3). Absorption, correspondingly, is notably low at shorter visible wavelengths (less than 418 nm) but increases abruptly in the 418–742 nm range. A laser beam with a wavelength of 532 nm (green laser) is typically applied in bathymetric measurements, as this wavelength has a high water transmittance. According to the Beer-Lambert law, the relation between absorbance and transmittance is: Absorbance = −log10(Transmittance).
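The Beer-Lambert relation quoted above is easy to check numerically; in this minimal sketch the transmittance values are approximate readings in the spirit of Figures 2 and 3, not measured constants:

```python
import math

def absorbance(transmittance: float) -> float:
    # Beer-Lambert relation as stated above: A = -log10(T)
    return -math.log10(transmittance)

for t in (0.9, 0.5, 0.1):  # e.g., ~0.9 for soda-lime glass in the visible
    print(f"T = {t:.1f} -> A = {absorbance(t):.2f}")
# T = 0.9 -> A = 0.05; T = 0.5 -> A = 0.30; T = 0.1 -> A = 1.00
```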

Figure 2.

Transmission spectrum of soda-lime glass with a 2-mm thickness. Obtained from Wikipedia [22].

Figure 3.

Liquid water absorption spectrum. Obtained from Wikipedia [23].

Opacity arises from the reflection and absorption of light waves at the surface of an object. The reflectance depends on the material of the surface that the light encounters. There are two types of reflection: specular and diffuse. Specular reflection occurs when light from a single incoming direction is reflected into a single outgoing direction. Diffuse reflection occurs when an incident ray is reflected at many angles rather than just one, as in the specular case. Most objects have mixed reflective properties [24]. Representative reflective materials include metals, such as aluminum, gold, and silver. From Figure 4, it can be seen that aluminum has high reflectivity over a wide range of wavelengths: in the visible and NIR, its reflectance reaches up to 92%, increasing to 98% in the MIR and FIR. Silver has a higher reflectance than aluminum at wavelengths longer than 450 nm, although at a wavelength of approximately 310 nm the reflectance of silver drops to nearly zero [25]. The reflectance of gold increases significantly at a wavelength of approximately 500 nm, reaching a very high reflectance from the infrared onward. The figure indicates that regardless of the wavelength at which a sensor operates, high reflection from aluminum surfaces is inevitable.

Figure 4.

Reflective spectrum of metals: aluminum, gold, and silver.

The physical characteristics of a material determine which electromagnetic waves will and will not pass through it. Figure 5 shows examples of the reflection spectra of dry bare soil, green vegetation, and clear water. The reflectance of dry bare soil increases as the wavelength increases from 400 to 1800 nm. Green vegetation reflects strongly in the near-infrared (and, within the visible, in the green), while chlorophyll absorbs strongly in the red; these characteristics are used to distinguish green vegetation from other objects. In addition, the previous figure showed that water has a low absorbance in the visible region, and Figure 5 shows that water reflects visible light at a low rate (<5%); indirectly, this indicates that water has a high transmittance in the visible range.

Figure 5.

Examples of reflective materials. Image referenced from Wikimedia [26].

Advertisement

4. Spaceborne sensors

Spaceborne sensors have been under development for over 40 years. Currently, approximately 50 countries operate remote sensing satellites [9]. More than 1000 remote sensing satellites are in space; among these, approximately 593 are from the USA, over 135 from Russia, and approximately 192 from China [27].

Conventionally, remote sensors are divided into two groups, passive sensors and active sensors, as described in the first section. However, as sensor technology has advanced, this division is no longer absolute. For example, an imaging camera is usually regarded as a passive sensor, yet in 2013 a new approach was developed that integrates active and passive infrared imaging capability into a single chip, enabling lighter, simpler dual-mode active/passive cameras with lower power dissipation [28]. Alternatively, remote sensing sensors can be classified into imaging sensors and non-imaging sensors. In terms of their spectral characteristics, imaging sensors include optical imaging sensors, thermal imaging sensors, and radar imaging sensors. Figure 6 illustrates this categorization.

Figure 6.

Spaceborne remote sensing sensors.

4.1. Optical imaging sensors

Optical imaging sensors operate in the visible and reflective IR ranges. Typical optical imaging systems on space platforms include panchromatic systems, multispectral systems, and hyperspectral systems. In a panchromatic system, the sensor is a monospectral detector sensitive to radiation within a broad wavelength range; the image is black and white or gray scale. A multispectral sensor is a multichannel detector with a few spectral bands; each channel is sensitive to radiation within a narrow wavelength band, and the resulting multilayer image contains both the brightness and the spectral (color) information of the observed targets. A hyperspectral sensor collects and processes information from tens to hundreds of spectral bands; each narrow band forms an image, and the resulting image set can be utilized to recognize objects, identify materials, and detect elemental components. Table 2 gives a more detailed description of these optical imaging systems. When light is split into multiple spectral bands, the greater the number of bands, the lower the imaging resolution; that is, a panchromatic image usually presents a higher resolution than a multispectral or hyperspectral image. A pan-sharpening technique for improving the quality of multispectral images was described by Padwick et al. in 2010 [29]. This method combines the spectral information of the multispectral data with the spatial information of the panchromatic data, resulting in a higher-resolution color product at the panchromatic resolution; a minimal sketch of this idea follows.
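The sketch below is a generic Brovey-style transform that captures the pan-sharpening idea, not the specific algorithm of Padwick et al. [29]: each multispectral band, resampled to the panchromatic grid, is scaled by the ratio of the panchromatic band to a multispectral intensity image:

```python
import numpy as np

def brovey_pansharpen(ms_up: np.ndarray, pan: np.ndarray) -> np.ndarray:
    """ms_up: (bands, H, W) multispectral, already resampled to the pan grid;
    pan: (H, W) panchromatic band. Returns a sharpened (bands, H, W) stack."""
    intensity = ms_up.mean(axis=0)       # simple unweighted intensity image
    ratio = pan / (intensity + 1e-10)    # per-pixel injection gain
    return ms_up * ratio[None, :, :]

# Toy example: 3 bands on a 2x2 grid.
ms = np.random.rand(3, 2, 2)
pan = np.random.rand(2, 2)
print(brovey_pansharpen(ms, pan).shape)  # (3, 2, 2)
```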

| | Panchromatic systems | Multispectral systems | Hyperspectral systems |
| --- | --- | --- | --- |
| Spectral range (nm) | ~430–720 | ~430–720 and ~750–950 | ~470–2000 |
| Satellites | QuickBird, SPOT, IKONOS | SPOT, QuickBird, IKONOS | TRW Lewis, EO-1 |
| Spectral bands | Monospectral; black-and-white (gray-scale) image | Several spectral bands | Tens to hundreds of spectral bands |
| Spatial resolution | Submeter | Up to 1–2 m | Up to 2 m |
| Applications | Earth observation and reconnaissance | Red-green-blue (true color) for visual analysis; green-red-infrared for vegetation and camouflage detection; blue-NIR-MIR for visualizing water depth, vegetation coverage, soil moisture content, and the presence of fires in a single image | (i) Agriculture; (ii) eye care; (iii) food processing; (iv) mineralogy; (v) surveillance; (vi) physics; (vii) astronomy; (viii) chemical imaging; (ix) environment |
| Advantages (all systems) | High applicability in (i) imaging multiple targets; (ii) mosaicking strips into large areas; (iii) stereo and tri-stereo acquisition; (iv) linear feature acquisition, such as coastlines, pipelines, roads, and borders | | |
| Disadvantages (all systems) | Affected by sun illumination and cloud coverage, e.g., polar areas with seasonal changes in sun illumination and the equatorial belt with persistent cloud coverage | | |

Table 2.

Satellite optical imaging systems.

4.2. Thermal IR imaging sensors

A thermal sensor typically operates in the electromagnetic spectrum between the mid-to-far-infrared and microwave ranges, roughly between 9 and 14 μm. Any object with a temperature above absolute zero emits infrared radiation and can produce a thermal image. A warm object emits more thermal energy than a cooler object and therefore appears more visible in the image. This is especially useful for tracking living creatures, including animals and humans, and for detecting volcanoes and forest fires, because a thermal image is independent of the lighting in a scene and is available day or night. Commonly used thermal imaging sensors include IR imaging radiometers, imaging spectroradiometers, and IR imaging cameras. Satellite thermal IR sensors currently in use include ASTER, MODIS, ASAS, and IRIS. Table 3 lists thermal IR sensors and their applications.
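A short calculation with Wien's displacement law, λ_max = b/T, shows why the roughly 9–14 μm window quoted above suits Earth-temperature scenes; the scene temperatures below are illustrative assumptions:

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength_um(temp_k: float) -> float:
    """Wavelength of peak blackbody emission, in micrometers."""
    return WIEN_B / temp_k * 1e6

print(f"{peak_wavelength_um(300):.1f} um")   # ~300 K Earth surface -> ~9.7 um
print(f"{peak_wavelength_um(1100):.1f} um")  # ~1100 K forest fire  -> ~2.6 um
```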

| Sensor | Operational wave band | Definition | Satellite sensors | Applications |
| --- | --- | --- | --- | --- |
| IR imaging radiometer | UV, mid-to-far-infrared, or microwave | Measures the intensity of electromagnetic radiation | ASTER | Volcanological, mineralogical, hydrothermal, limnological, and climatological studies; forest fires; glaciers; DEM generation |
| Imaging spectroradiometer | Infrared | Measures the intensity of radiation in multiple spectral bands | MODIS, ASAS, IRIS | Sea surface temperature, cloud characteristics, ocean color, vegetation, trace chemical species in the atmosphere |
| Infrared imaging camera | Mid-to-far infrared | Measures radiant energy from the surface | — | Volcanology, determining thunderstorm intensity, identifying fog and low clouds |

Table 3.

Thermal IR sensors.

4.3. Radar imaging sensors

A radar (microwave) imaging sensor is usually an active sensor operating in the 1 mm–1 m range of the electromagnetic spectrum. The sensor transmits microwave pulses toward the ground, and the energy reflected from the target back to the radar antenna produces an image at microwave wavelengths. The radar moves along a flight path, and the area illuminated by the radar, its footprint, sweeps along the surface in a swath. Each pixel in the radar image represents the radar backscatter for that area on the ground. A microwave instrument can operate in cloudy or foggy weather and can also penetrate sand, water, and walls. Unlike infrared data, which help identify different minerals and vegetation types from reflected sunlight, radar only shows differences in surface roughness, geometry, and the moisture content of the ground (the complex dielectric constant). Radar and infrared sensors are complementary instruments and are often used together to study the same types of Earth surfaces [30]. Frequently used microwave bands for remote sensing include the X-band, C-band, S-band, L-band, and P-band; the specific characteristics of each band can be found in Table 4.

| Band | Frequency (GHz) | Wavelength (cm) | Key characteristics |
| --- | --- | --- | --- |
| Ka | 40–27 | 0.75–1.11 | Usually for astronomical observations |
| K | 27–18 | 1.11–1.67 | Used for radar, satellite communications, astronomical observations, automotive radar |
| Ku | 18–12 | 1.67–2.5 | Typically used for satellite communications |
| X | 12.5–8 | 2.4–3.75 | Widely used for military reconnaissance, mapping, and surveillance |
| C | 8–4 | 3.75–7.5 | Penetration of vegetation or solids is limited and restricted to the top layers; useful for sea-ice surveillance |
| S | 4–2 | 7.5–15 | Used for medium-range meteorological applications, for example, rainfall measurement and airport surveillance |
| L | 2–1 | 15–30 | Penetrates vegetation to support observation applications over vegetated surfaces and for monitoring ice sheet and glacier dynamics |
| P | 1–0.3 | 30–100 | So far, only for research and experimental applications; significant penetration capabilities for vegetation canopy, sea ice, soil, and glaciers |

Table 4.

Commonly used frequency and spectrum bands of radar imaging sensors.

Referenced from Born and Wolf [31].

Conventional passive microwave imaging instruments (such as cameras or imaging radiometers) provide imagery with a relatively coarse spatial resolution compared to optical instruments. The diffraction-limited angular resolution of an aperture is directly proportional to the wavelength and inversely proportional to the aperture dimension [33]. To achieve a spatial resolution similar to that of optical instruments, a very large antenna aperture (e.g., tens of kilometers) would be needed; clearly, it is not feasible to carry such an antenna on a space platform. SAR is an active microwave instrument that resolves this problem by utilizing the motion of the spacecraft to emulate a large antenna from the small craft itself. The longer the synthesized antenna, the narrower the beam, and a narrow beam width usually yields a fine ground resolution. At present, a synthesized aperture can be several orders of magnitude larger than the physical transmitter and receiver antenna, and it has become possible to produce SAR images with half-meter resolution [32].
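A back-of-envelope check of the aperture claim above, using the diffraction relation just cited (angular resolution ≈ λ/D, so ground resolution ≈ λR/D); the orbit height and target resolution are illustrative assumptions:

```python
# Required real-aperture size: D = wavelength * range / ground_resolution.
wavelength = 0.056   # C-band, m (~5.6 cm)
orbit = 800e3        # orbit height, m (illustrative)
target_res = 1.0     # desired optical-like ground resolution, m

aperture_km = wavelength * orbit / target_res / 1000
print(f"Required antenna aperture: {aperture_km:.1f} km")  # ~44.8 km
```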

Specifically, SAR uses microwaves to illuminate a ground target with a side-looking geometry and measures the backscatter and travel time of the transmitted waves reflected by objects on the ground. The distance the SAR device travels over a target in the time taken for the radar pulses to return to the antenna produces the SAR image. Typically, SAR is mounted on a moving platform, such as a spaceborne or airborne platform. According to the combination of frequency bands and polarization modes used in data acquisition, SAR can be categorized into [33]:

  • Single frequency (L-band, C-band, or X-band);

  • Multiple frequency (Combination of two or more frequency bands);

  • Single polarization (VV, HH, or HV);

  • Multiple polarization (Combination of two or more polarization modes).

The main parameters in designing and operating a SAR include the power of the electromagnetic energy, frequency, phase, polarization, incidence angle, spatial resolution, and swath width. There are different types of SAR techniques, including ultra-wideband SAR, terahertz SAR, interferometric SAR (InSAR), and differential interferometry (D-InSAR). Ultra-wideband SAR utilizes a very wide range of radio frequencies, which yields a better resolution and more spectral information on target reflectivity; it can therefore be applied to scanning smaller objects or closer areas. Terahertz SAR works in the spectral range from 0.3 to 10 THz, between infrared and microwave; typical characteristics of this range include transmission through plastics, ceramics, and even paper. Terahertz radiation is extraordinarily sensitive to water content: even a small amount of water makes a material fairly absorptive to terahertz light, so this radiation can be applied to detecting lake shores or coastlines. InSAR is a technique that produces measurements from two or more SAR images; it is widely applied in DEM production and in monitoring glaciers, earthquakes, and volcanic eruptions [34]. D-InSAR requires at least two images plus a DEM, which can be acquired from GPS measurements; this method is mainly used for monitoring subsidence movements, slope stability analysis, landslides, glacier movement, and 3D ground movement [35]. Doppler radar is used to acquire a distant object's velocity relative to the radar, with main applications in aviation, sounding satellites, and meteorology. In general, interferometric SAR techniques can measure ground displacement on the order of a millimeter.
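As a worked example of the interferometric measurement just described, the standard relation between interferometric phase change and line-of-sight displacement is d = Δφ·λ/(4π); the C-band wavelength below is an illustrative assumption:

```python
import math

def los_displacement_mm(delta_phi_rad: float, wavelength_m: float) -> float:
    """Line-of-sight displacement for a given phase change (two-way path,
    hence 4*pi of phase per wavelength of motion)."""
    return delta_phi_rad * wavelength_m / (4 * math.pi) * 1000

# One full fringe (2*pi) at C-band (~5.6 cm) corresponds to ~28 mm of motion.
print(f"{los_displacement_mm(2 * math.pi, 0.056):.1f} mm")
```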

4.4. Non-imaging sensors

A non-imaging sensor measures a signal based on the intensity of the whole field of view, mainly as a profile recorder; in contrast with imaging sensors, it does not record how the input varies across the field of view. In the remote sensing field, commonly used non-imaging sensors include radiometers, altimeters, spectrometers, spectroradiometers, and LIDAR. Table 5 provides detailed information about conventional non-imaging sensors. Non-imaging sensors typically work in the visible, IR, and microwave spectral bands, and their applications mainly focus on measurements of height, temperature, wind speed, and other atmospheric parameters.

| Sensor | Operational wave band | Definition | Applications |
| --- | --- | --- | --- |
| Radiometer | Ultraviolet, IR, microwave | Measures the amount of electromagnetic energy present within a specific wavelength range | Calculating various surface and atmospheric parameters |
| Altimeter | IR, microwave/radio wave, sonic | Measures the altitude of an object above a fixed level | Mapping ocean-surface topography and the hills and valleys of the sea surface |
| Spectrometer | Visible, IR, microwave | Measures the spectral content of the incident electromagnetic radiation | Multispectral and hyperspectral imaging |
| Spectroradiometer | Visible, IR, microwave | Measures the intensity of radiation in multiple spectral bands | Monitoring sea surface temperature, cloud characteristics, ocean color, vegetation, trace chemical species in the atmosphere |
| LIDAR | Ultraviolet, visible, NIR | Measures distance and intensity; Doppler LIDAR measures the Doppler shift for speed; polarization LIDAR indicates shape | Ocean, land, and 3D topographic mapping; meteorology, cloud measurements, wind profiling, and air quality monitoring |
| Sonar | Acoustic | Measures the distance to an object; determines the depth of water beneath ships and boats | Navigation, communication and security (e.g., vessels), and underwater object detection (e.g., handheld sonar for divers) |
| Sodar | Acoustic | As a wind profiler, measures wind speeds at various heights above the ground and the thermodynamic structure of the lower layer of the atmosphere | Meteorology: atmospheric research, wind monitoring (typically in a range from 50 to 200 m above ground level) |
| Radio acoustic sounding system (RASS) | Radio wave and acoustic wave | Measures the atmospheric lapse rate using backscattering of radio waves from an acoustic wavefront to determine the speed of sound at various heights above the ground | Added to a radar wind profiler or to a sodar system |

Table 5.

Non-imaging sensors.

Lasers are applied to measuring the distance and height of targets in the remote sensing field; a laser scanning system is generally called a LIDAR (light detection and ranging) system. Satellite LIDAR, airborne LIDAR, mobile mapping LIDAR, and terrestrial LIDAR differ in their carrier platforms. Laser sources include solid-state lasers, liquid lasers, gas lasers, semiconductor lasers, and chemical lasers (see Table 6). Typical laser sources for laser rangefinders and laser altimeters are semiconductor lasers and solid-state lasers. Semiconductor lasers typically produce light at wavelengths of 400–500 nm and 850–1500 nm, while solid-state lasers generate light at wavelengths of 700–820 nm, 1064 nm, and 2000 nm. Satellite and airborne LIDAR systems are typically operated at wavelengths of 905, 1064, and 1550 nm. One of the main considerations in wavelength selection is the atmospheric transmission between the sensor and the surface of the Earth; lower transmittance at a given wavelength also means less solar background radiation at that wavelength. The transmittance at 905 nm is approximately 0.6, whereas the wavelengths of 1064 and 1550 nm have similar transmittances of approximately 0.85. Wavelength selection can also be a cost issue: diode lasers at 905 nm are inexpensive compared with Nd:YAG solid-state lasers at 1064 nm and diode lasers at 1550 nm, and in 2007, diode lasers at 1550 nm cost 2.5 times more than lasers at 905 nm. However, 1550 nm is a good candidate for invisible-wavelength, eye-safe LIDAR: the absorption of 1550 nm light by water, approximately 175 times greater than that of 905 nm light, makes it eye safe, and the solar background at 1550 nm is approximately 50% lower than at 905 nm, so measurements at 1550 nm also achieve a higher signal-to-noise ratio than at 905 nm. All in all, cost aside, a wavelength of 1550 nm has a clear advantage over 905 nm [36].
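The core ranging computation of the laser rangefinders and altimeters discussed here is time-of-flight; this minimal sketch converts a round-trip pulse time to range with r = ct/2 (the 4 μs delay is an invented example):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Target range from round-trip pulse time: the pulse travels out and back."""
    return C * round_trip_s / 2.0

print(f"{tof_range_m(4.0e-6):.1f} m")  # a 4.0 us echo -> ~599.6 m
```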

| Laser type | Pump source | Typical applications |
| --- | --- | --- |
| Gas laser | Electrical discharge | Interferometry, holography, spectroscopy, material processing |
| Chemical laser | Chemical reaction | Military use |
| Dye laser | Other laser, flashlamp | Research, laser medicine |
| Metal-vapor laser | Electrical discharge | Printing and typesetting applications, fluorescence excitation examination, scientific research |
| Solid-state laser (e.g., fiber laser, Nd:YAG) | Flashlamp, laser diode | Material processing, rangefinding, laser target designation |
| Semiconductor laser | Electrical current | Telecommunications, holography, printing, weapons, machining |

Table 6.

Typical laser sources.

In general, at a wavelength of 1064 nm, vegetation has a stronger reflectance than soil, while at a wavelength of 1550 nm, soil shows a greater reflectance than vegetation; taking measurements at different wavelengths is therefore beneficial for object classification (a minimal sketch follows Table 7). Green lasers with a wavelength of 532 nm are usually generated by frequency-doubling a solid-state Nd:YAG laser; this type of laser is widely used for bathymetric measurement. Table 7 lists the typical applications of different laser wavelengths.

| Name | Wavelength (nm) | Laser sources | Typical applications |
| --- | --- | --- | --- |
| UV laser | 355 | Gas laser | Cutting and drilling |
| Violet laser | 405 | Semiconductor laser or solid-state laser | Laser printing, data recording, laser microscopy, laser projection displays, spectroscopic measurements |
| Blue laser | 488 | Solid-state laser | Environmental monitoring, medical diagnostics, handheld projectors and displays, telecommunications |
| Green laser | 532 | Solid-state laser (frequency-doubled Nd:YAG) | Bathymetric measurement |
| Red laser | 640 | Semiconductor laser | Vegetation measurement |
| NIR laser | 1064 | Semiconductor laser or solid-state laser (fiber laser) | Airborne laser scanning |
| NIR laser | 1550 | Semiconductor laser or solid-state laser (fiber laser) | Airborne laser scanning |

Table 7.

Commonly used laser wavelengths.

Referenced from Hey [36].
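Building on the dual-wavelength behavior noted before Table 7, the following hedged sketch separates vegetation from soil with a simple 1064/1550 nm intensity ratio; the reflectance values and the threshold of 1.0 are illustrative only, not calibrated values:

```python
import numpy as np

# Ratio classifier: vegetation tends to return more at 1064 nm, soil at 1550 nm.
refl_1064 = np.array([0.55, 0.30])  # e.g., [vegetation, soil] returns (invented)
refl_1550 = np.array([0.25, 0.45])

ratio = refl_1064 / (refl_1550 + 1e-10)
labels = np.where(ratio > 1.0, "vegetation", "soil")
print(labels)  # ['vegetation' 'soil']
```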

4.5. Commonly used remote sensing satellites

So far, more than 1000 remote sensing satellites have been launched, and they have been continually replaced by new-generation satellites. The few-band spectral sensors of the earliest missions have been upgraded to hyperspectral sensors with hundreds of spectral bands; spatial and spectral resolutions have improved roughly 100-fold; and revisit times have shortened from months to a day. In addition, more and more remote sensing data are available as open data sources. Table 8 gives an overview of the commonly used remote sensing satellites and their parameters.

| Mission | Country | Launch year | Sensors | Orbit height (km) | Swath (km) | Revisit (days) | Channels | Spatial resolution |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Landsat | USA | 1972, 1975, 1978, 1982, 1984, 1993, 1999, 2013, 2020 | Panchromatic and multispectral sensors | 705 | 185, 183 | 16 | 7–11 | 120 m, 100 m, 60 m, 30 m, 15 m |
| SPOT | France | 1986, 1990, 1993, 1998, 2002, 2012 | Imaging spectroradiometer | 694 | 60 | 1–3 | Panchromatic, B, G, R, NIR | 2.5 m, 5 m, 10 m, 20 m |
| ERS | ESA | 1991, 1995 | IR radiometer, microwave sounder, radiometer, SAR | 782–785 | 5–100 (AMI) to 500 (ATSR) | 3, 35, 336 | SAR | 26 m across track, 6–30 m along track |
| RADARSAT | Canada | 1995, 2007, 2018 | SAR | 793–821, 798, 592.7 | 45–100, 18–500, 5–500 | 1 | SAR | 8–100 m, 3–100 m, 3–100 m |
| MODIS | USA | 1999, 2002 | Imaging spectroradiometer | 705 | 2330 | 1 | 36 | 1000 m, 500 m, 250 m |
| IKONOS | USA | 1999 | Imaging spectroradiometer | 681 | 11.3 | 3 | Panchromatic, B, G, R, NIR | Panchromatic: 80 cm; B, G, R, NIR: 3.2 m |
| QuickBird | USA | 2000, 2001 | Imaging spectroradiometer | 482, 450 | 16.8–18 | 2.4–5.9 | Panchromatic, B, G, R, NIR | Panchromatic: 65 cm/61 cm; B, G, R, NIR: 2.62 m/2.44 m |
| Envisat | ESA | 2002 | ASAR, MERIS, AATSR, RA-2, MWR, GOMOS, MIPAS, SCIAMACHY, DORIS, LRR | 790 | 1150, 100, 400 | 35 | 15 bands (VIS, NIR), C-band | 300 m, 30–150 m |
| GeoEye | USA | 2008 | Imaging spectroradiometer | 681 | 15.2 | 8.3 | Panchromatic, B, G, R, NIR | Panchromatic: 41 cm; B, G, R, NIR: 1.65 m |
| WorldView-1/2/3/4 | USA | 2007, 2009, 2014, Sep 2016 | Imaging spectroradiometer, laser altimeter | 496, 770, 617, 681 | 17.6, 16.4, 13.1, 14.5 | 1.7, 1.1, <1, 3 | Panchromatic; panchromatic and eight multispectral; panchromatic and eight multispectral; panchromatic, B, G, R, NIR | Panchromatic 0.5 m; panchromatic and stereo 0.46 m, multispectral 1.84 m; panchromatic 0.34 m, multispectral 1.36 m |
| Sentinel-1 to -6 | ESA | 2014, 2015, 2016, 2017, 2021 | Radar and super-spectral imaging | 693, 786, 814 | 250, 290, 250 | 12, 10, 27 | C-SAR; 12 bands (VIS, NIR, SWIR); 21 bands (VIS, NIR); S-band and X-band | 5–20 m, 5–40 m, 10/20/60 m |

Table 8.

Remote sensing satellites.

Referenced from Refs. [37, 38, 39, 40].


5. Future and discussions

A common expectation of the remote sensing community is the ability to acquire data at high spatial, spectral, radiometric, and temporal resolutions, at low cost, with open-resource support, and with new applications created by the integration of spatial/aerial and ground-based sensors.

The development of smaller, cheaper satellite technologies in recent years has led many companies to explore new ways of using low Earth orbit satellites. Many companies have focused on remote imaging, for example, gathering optical or infrared imagery. In the future, a low-cost communications network between low Earth orbit satellites could be established to form a spatial remote sensing network that integrates with a large number of distributed ground sensors for ground-space remote sensing. In addition, satellites can easily cover large swaths of territory, thereby supplementing ground-based platforms, so data distribution and sharing would become very easy.

Openness and shared resources can promote the utilization of remote sensing and maximize its output, and open remote sensing resources have made great progress in recent years. Beginning on April 1, 2016, all Earth imagery from a widely used Japanese remote sensing instrument operating aboard NASA's Terra spacecraft since late 1999 became available to users everywhere at no cost [41]. On April 8, 2016, ESA announced that a 40-cm resolution WorldView-2 European cities dataset would be available for download, free of charge, through its Lite Dissemination Server. This dataset was collected by ESA, in collaboration with European Space Imaging, over the most populated areas in Europe between February 2011 and October 2013, and it is available to ESA member states (including Canada) and European Union member states [42]. Among open remote sensing resources, NASA (USA) was a pioneer in sharing its imagery data; it has cooperated with the open source community, made many of its projects open source, and set up a dedicated website to present them. In addition, some commercial companies, such as DigitalGlobe (USA), have also partly opened their data to the public. In the future, more and more open resources will become available.

Future applications in remote sensing will combine the available resources from space/aerial/UAV platforms with ground-based data. The prerequisites for such resource integration are as follows: (i) the spatial resolution of satellite data is high enough to match ground-based data, that is, spatial data and ground data are of a comparable order of accuracy; for example, WorldView-3 has achieved a 30-cm spatial resolution, approaching ground-based data accuracy (e.g., approximately 2 cm for mobile laser point clouds); and (ii) cloud-based computation supports large datasets from crowd-sourced remote sensing resources. The current situation shows promising support for the integration of multiple sources of remote sensing data, and we expect to see new applications developing in the coming years.


6. Conclusions

This paper investigated remote sensing sensor technology both broadly and in depth. First, we reviewed fundamental knowledge about the electromagnetic spectrum and the interaction of objects with the spectrum, which helps in understanding how environmental objects respond when a sensor operates at a certain wavelength. We also highlighted the terahertz region of the spectrum; since little research has been done in this range, future research efforts on new applications of terahertz radiation may be worth exploring. On the interaction of sensors with the environment, typical examples involving glass, metal, water, soil, and vegetation were provided. Remote sensors were presented in terms of imaging sensors and non-imaging sensors, with optical imaging sensors, thermal imaging sensors, radar imaging sensors, and laser scanning highlighted. In addition, commonly used remote sensing satellites, especially those from NASA and ESA, were detailed in terms of launch year, sensors, swath width, spectral bands, revisit time, and spatial resolution.


Acknowledgments

We would like to thank TEKES for its funding support of the COMBAT project and also acknowledge the financial support of the EU project 6Aika.

References

  1. Space. Record-Setting Rocket Launch on Nov. 19: The 29 Satellites [Internet]. 2013. Available from: http://www.space.com/23646-ors3-rocket-launch-satellites-description.html [Accessed: Feb 27, 2017]
  2. Microimages. Introduction to RSE [Internet]. 2012. Available from: http://www.microimages.com/documentation/Tutorials/introrse.pdf [Accessed: Feb 27, 2017]
  3. NASA. Earth Satellite Orbits [Internet]. 2012. Available from: http://earthobservatory.nasa.gov/Features/OrbitsCatalog/page2.php [Accessed: Mar 2, 2017]
  4. NASA. Passive Sensors [Internet]. 2012. Available from: https://www.nasa.gov/directorates/heo/scan/communications/outreach/funfacts/txt_passive_active.html [Accessed: Mar 3, 2017]
  5. Earth Imaging Journal. Exploring the Benefits of Active vs. Passive Spaceborne Systems [Internet]. 2013. Available from: http://eijournal.com/print/articles/exploring-the-benefits-of-active-vs-passive-spaceborne-systems [Accessed: Mar 8, 2017]
  6. Japan Association of Remote Sensing. Sensors [Internet]. 2010. Available from: http://www.jars1974.net/pdf/03_Chapter02.pdf [Accessed: Mar 20, 2017]
  7. Blais F. Review of 20 years of range sensor development. Journal of Electronic Imaging. 2004;13(1):231-240
  8. Melesse AM, Weng Q, Thenkabail PS, Senay GB. Remote sensing sensors and applications in environmental resources mapping and modelling. Sensors. 2007;7(12):3209-3241
  9. Toth C, Jóźków G. Remote sensing platforms and sensors: A survey. ISPRS Journal of Photogrammetry and Remote Sensing. 2016;115:22-36
  10. Rice B. A Brief History of Remote Sensing [Internet]. 2008. Available from: http://www.sarracenia.com/astronomy/remotesensing/primer0120.html [Accessed: Mar 28, 2017]
  11. An Introduction to Solar System Astronomy. The Earth's Atmosphere [Internet]. 2007. Available from: http://www.astronomy.ohio-state.edu/~pogge/Ast161/Unit5/atmos.html [Accessed: Mar 30, 2017]
  12. TechPort. New Technology Reports [Internet]. Available from: techport.nasa.gov [Accessed: Mar 30, 2017]
  13. Wikipedia. Infrared [Internet]. 2008. Available from: https://en.wikipedia.org/wiki/Infrared [Accessed: Mar 30, 2017]
  14. Lightsources. Terahertz Radiation or T-Rays [Internet]. 2010. Available from: http://www.lightsources.org/terahertz-radiation-or-t-rays [Accessed: Apr 3, 2017]
  15. PHYS. A Revolutionary Breakthrough in Terahertz Remote Sensing [Internet]. 2010. Available from: http://phys.org/news/2010-07-revolutionary-breakthrough-terahertz-remote.html#jCp [Accessed: Apr 4, 2017]
  16. Kasai Y. Introduction to terahertz-wave remote sensing. Journal of the National Institute of Information and Communications Technology. 2008;55(1):65-67
  17. Amir F. Advanced Physical Modelling of Step Graded Gunn Diode for High Power TeraHertz Sources [thesis]. 2011
  18. Wagh MP, Sonawane YH, Joshi OU. Terahertz technology: A boon to tablet analysis. Indian Journal of Pharmaceutical Sciences. 2009;71(3):235
  19. GIZMODO. MIT Invented a Camera that can Read Closed Books [Internet]. 2016. Available from: http://gizmodo.com/mit-invented-a-camera-that-can-read-closed-books-1786522492 [Accessed: Apr 10, 2017]
  20. Japan Association of Remote Sensing. Chapter 3: Microwave Remote Sensing [Internet]. 2010. Available from: http://wtlab.iis.u-tokyo.ac.jp/~wataru/lecture/rsgis/rsnote/cp3/cp3-1.htm [Accessed: Apr 18, 2017]
  21. NASA. Microwaves [Internet]. 2011. Available from: http://science.hq.nasa.gov/kids/imagers/ems/micro.html [Accessed: Apr 18, 2017]
  22. Wikipedia. Soda-Lime Glass [Internet]. 2010. Available from: https://en.wikipedia.org/wiki/Soda-lime_glass [Accessed: Apr 20, 2017]
  23. Wikipedia. Electromagnetic Absorption by Water [Internet]. 2017. Available from: https://en.wikipedia.org/wiki/Electromagnetic_absorption_by_water [Accessed: Apr 20, 2017]
  24. Wikipedia. Specular Reflection [Internet]. 2017. Available from: https://en.wikipedia.org/wiki/Specular_reflection [Accessed: Apr 20, 2017]
  25. More RM, editor. Laser Interactions with Atoms, Solids and Plasmas. Vol. 327. Berlin/Heidelberg, Germany: Springer Science & Business Media; 2013
  26. Wikimedia. File: Image-Metal-Reflectance.png [Internet]. 2010. Available from: https://commons.wikimedia.org/w/index.php?curid=1729695 [Accessed: Apr 21, 2017]
  27. UCSUSA. UCS Satellite Database [Internet]. 2017. Available from: http://www.ucsusa.org/nuclear-weapons/space-weapons/satellite-database#.WagRxrIjGpo [Accessed: Aug 25, 2017]
  28. Photonics. Active and Passive Modes in One IR Camera [Internet]. 2013. Available from: http://www.photonics.com/Article.aspx?AID=52832 [Accessed: Apr 21, 2017]
  29. Padwick C, Deskevich M, Pacifici F, Smallwood S. WorldView-2 pan-sharpening. In: ASPRS 2010 Annual Conference; April 26-30, 2010; San Diego, California; 2010
  30. Earth Imaging Journal. Radar [Internet]. 2012. Available from: http://eijournal.com/wp-content/uploads/2012/10/radar_table2.jpg [Accessed: Apr 27, 2017]
  31. Born M, Wolf E. Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light. 6th ed. Oxford: Pergamon; 1980. 836 p. ISBN: 978-0-08-026482-0
  32. Jackson CR, Apel JR. Synthetic Aperture Radar: Marine User's Manual. Washington, D.C.: US Department of Commerce, National Oceanic and Atmospheric Administration, National Environmental Satellite, Data, and Information Service, Office of Research and Applications; 2004. pp. 1-7
  33. Natural Resources Canada. Polarization in Radar Systems [Internet]. 2010. Available from: http://www.nrcan.gc.ca/earth-sciences/geomatics/satellite-imagery-air-photos/satellite-imagery-products/educational-resources/9567 [Accessed: Apr 28, 2017]
  34. Wikipedia. Synthetic Aperture Radar [Internet]. 2010. Available from: https://en.wikipedia.org/wiki/Synthetic_aperture_radar [Accessed: May 5, 2017]
  35. Permanet. Differential Interferometry Synthetic Aperture Radar (DInSAR) [Internet]. 2010. Available from: http://www.permanet-alpinespace.eu/archive/pdf/WP6_1_dinsar.pdf [Accessed: May 9, 2017]
  36. Hey JDV. A Novel LIDAR Ceilometer: Design, Implementation and Characterisation. In: Weber SM, editor. Handbook of Laser Wavelengths. Florida, United States: CRC Press; 2014. p. 784. ISBN: 9780849335082
  37. Wikipedia. Remote Sensing Satellite and Data Overview [Internet]. 2017. Available from: https://en.wikipedia.org/wiki/Remote_sensing_satellite_and_data_overview [Accessed: May 11, 2017]
  38. NASA. Missions [Internet]. 2010. Available from: http://www.nasa.gov/missions [Accessed: May 16, 2017]
  39. ESA. Latest Mission Operations News [Internet]. 2017. Available from: https://earth.esa.int/web/guest/missions/esa-operational-eo-missions/ers [Accessed: May 16, 2017]
  40. ESA. Sentinel [Internet]. 2016. Available from: https://sentinel.esa.int/web/sentinel/home [Accessed: May 16, 2017]
  41. Jet Propulsion Laboratory. NASA, Japan Make ASTER Earth Data Available at No Cost [Internet]. 2016. Available from: http://www.jpl.nasa.gov/news/news.php?feature=6253 [Accessed: May 18, 2017]
  42. ESA. WorldView-2 European Cities Dataset 40 cm Resolution [Internet]. 2016. Available from: https://earth.esa.int/web/guest/content/-/article/worldview-2-european-cities-dataset-40cm-resolution [Accessed: May 18, 2017]
