Open access peer-reviewed chapter

Development of an UAS for Earthquake Emergency Response and Its Application in Two Disastrous Earthquakes

Written By

Chaoyong Peng, Zhiqiang Xu, Jiansi Yang, Yu Zheng, Weiping Wang, Sha Liu and Baofeng Tian

Submitted: 01 December 2017 Reviewed: 29 March 2018 Published: 05 November 2018

DOI: 10.5772/intechopen.76885

From the Edited Volume

Earthquakes - Forecast, Prognosis and Earthquake Resistant Construction

Edited by Valentina Svalova

Abstract

To support humanitarian action after a disaster, reliable data such as high-resolution satellite images are required for analyses aimed at assessing damage to facilities and infrastructure. However, satellite images are often unavailable in the first few days after an event, so in situ surveys are preferred. Advances in unmanned aircraft systems (UAS) have made them valuable tools for capturing imagery and assessing the extent and volume of damage. Safety, flexibility, low cost, and ease of operation make UAS well suited to disaster assessment. In this chapter, we describe the development of a UAS for swiftly acquiring disaster information. With the selected fixed-wing UAS, we successfully performed data acquisition at the specified scales. For the image analysis, we applied a photogrammetric workflow designed to handle the very high resolution of images obtained without ground control points. The results obtained from two destructive earthquakes demonstrate that the presented system plays a key role in investigating and gathering information about a disaster in the earthquake epicentral area, including road damage detection, structural damage surveys, secondary disaster investigation, and rapid disaster assessment. It can effectively provide disaster information on hard-to-access areas to the rescue headquarters so that relief measures can be developed rapidly.

Keywords

  • unmanned aircraft system
  • emergency rescue
  • disaster assessment
  • regional panorama
  • earthquake

1. Introduction

To efficiently mitigate the losses caused by a destructive earthquake [1, 2, 3], accurately develop relief measures, and practically improve emergency rescue efficiency, we need to rapidly acquire earthquake information and promptly evaluate the probable damage. When ground transport is disrupted and communications are interrupted, remote sensing (RS) plays a crucial role in acquiring the disaster situation for post-earthquake emergency response and evaluation. Nevertheless, when satellite remote sensing (SRS) is used to acquire disaster information, it is currently restricted by its spatial resolution and revisit cycle. Traditional manned aerial photography is also constrained by several restrictions [4], such as airport and weather conditions and the potential danger to pilots during flight. The unmanned aircraft system (UAS) can be used as a supplement to manned aerial photography and SRS because of its unique advantages: real-time operation, flexibility, high image resolution, and low cost. It refers to a class of aircraft that can easily and safely fly without a pilot on board [5, 6, 7, 8, 9] and can offer several opportunities in disaster-related situations when equipped with RS instrumentation [10, 11, 12]. High-resolution images are analyzed and used to generate hazard maps, dense surface models, detailed building renderings, comprehensive elevation models, and other products for a disaster area. These data can then be analyzed with RS methods as well as visual interpretation to coordinate rescue efforts, record the responses of structural systems to the shaking, detect structural failures, investigate access issues, and verify disaster models. In addition, data can be gathered before a disaster to document the immediate pre-event conditions of critical facilities and infrastructure, monitor susceptible environmental concerns, and document historical conditions and sites [13]. Briefly, UAS-based RS yields the best possible spatial and temporal resolutions for the respective research question or application [14].

In this chapter, we first introduce a fixed-wing aircraft type called HW18, developed by HoverWings, China, which is equipped with a Sigma DP2 compact digital camera. We then present the field test results from Yuxi County, Yunnan province, China, and real applications in two severe earthquakes: the 2013 Ms7.0 Lushan earthquake (Sichuan province, China) and the 2014 Ms6.5 Ludian earthquake (Yunnan province, China). The main objectives of the UAS application are very high-resolution monitoring of the investigated post-earthquake areas with optical RS data and the creation of detailed digital maps. Owing to its very high image resolution, the UAS can be used in a broad range of applications with a greater level of detail than SRS data. In addition, the photogrammetric potential of small-format aerial photography makes it well suited to applications requiring higher efficiency and spatial completeness than traditional field work. However, the main prerequisite for developing a UAS survey approach is to satisfy different requirements of resolution, scale, and accuracy [15].

2. System structure

The developed system is composed of three main parts: a UAS, a ground control station, and an image processing system (Figure 1).

Figure 1.

Different parts of the presented UAS.

2.1. UAS specifications

As unmanned platforms continue to evolve, many options are available for low-altitude Earth observation, and it is difficult to generalize the advantages or disadvantages of a particular platform, because possible applications and operating conditions vary remarkably throughout the world. When small sites that require only a small number of images must be covered precisely, tethered, manually navigated systems (e.g., gliders, blimps, and kites) are ideal tools. However, such systems are rarely suitable for systematically surveying larger regions, where regular overlaps along evenly spaced flight lines are preferable for an efficient processing workflow. Recent developments in Global Positioning System (GPS) and Inertial Navigation System (INS) technology have led to the availability of a broad range of autopilot systems for platforms such as fixed-wing planes and quadcopters, which can autonomously follow predefined flight plans.

However, as an RS platform, a UAS requires compromises according to its hardware and software limitations, because it is limited in flight duration and payload capacity [16]. Thus, when developing a custom-built UAS, we must balance platform accessibility against the data quality of low-cost sensors and the technological limitations inherent to small-scale platforms. Such weight and cost limitations necessitate a tangible reduction in sensor manufacturing quality. These reductions are typically realized through cheaper fabrication methods and materials, the absence of on-board processing features, or limited data storage capacity.

In this work, we employed a fixed-wing aircraft type, the HW18, developed by HoverWings, China (Figure 1), with a weight of approximately 2.3 kg (without payload), a length of 110 cm, and a wingspan of 120 cm. It is equipped with a Sigma DP2 digital camera. The system is powered by aviation fuel and is based on a model airplane. At a ground speed of 70–110 km/h and with full payload (the maximum takeoff weight is 5.2 kg), the flight endurance is more than 100 minutes. In addition, the plane can be hand-launched and does not require a catapult (Figure 2). Figure 3 shows the operating principle of the UAS. During takeoff and flight, the UAS is autonomously controlled by an autopilot system and its GPS/IMU components. We use the Pix4UAV software [17] to design the flight paths followed by the UAS, and it can automatically create an accuracy report for checking the quality of the results. After finishing the flight plan, the UAS returns to the starting place and circles at a preset altitude. The starting point serves as the center of a so-called bounding box, a circular zone with a diameter of 150 m. When the pilot adopts the assisted flying mode, the maximum distance between the starting point and the UAS is limited by this bounding box. Whenever the UAS reaches the limit of the bounding box, it automatically and immediately turns to remain within the given extent (a minimal sketch of this check follows Figure 3).

Figure 2.

Hand-launching the UAV.

Figure 3.

Working principle of the UAV.
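As a concrete illustration of the bounding-box behavior described above, the following Python sketch checks whether the aircraft has left a circular zone around the starting point and, if so, commands a turn back toward it. The function names, the flat-Earth distance approximation, and the decision logic are our own assumptions for illustration and are not part of the HW18 autopilot.

```python
import math

EARTH_RADIUS_M = 6371000.0

def horizontal_distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance between two WGS-84 points (valid for short ranges)."""
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1) * math.cos(math.radians(0.5 * (lat1 + lat2)))
    return EARTH_RADIUS_M * math.hypot(dlat, dlon)

def bounding_box_action(home, current, diameter_m=150.0):
    """Return a steering hint: keep heading, or turn back toward the starting point.

    'home' and 'current' are (lat, lon) tuples; the 150 m diameter matches the
    bounding box mentioned in the text (hypothetical decision logic).
    """
    dist = horizontal_distance_m(home[0], home[1], current[0], current[1])
    if dist >= 0.5 * diameter_m:           # limit of the circular bounding box reached
        return "turn_back_to_start"
    return "keep_heading"

# Example: a point roughly 96 m east of the starting position triggers a turn.
home = (30.153, 102.877)
current = (30.153, 102.878)
print(bounding_box_action(home, current))  # -> turn_back_to_start
```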

The system also features an assisted flying mode, in which autopilot software supports the pilot in controlling the UAS. Destabilization due to wind or height loss while flying turns is automatically corrected. Within the software, turn radii are limited to avoid stress concentrations that might damage the UAS. Steering is also simplified: the pilot can turn the UAS directly to the left or right through software-controlled coordination of the roll aileron and yaw rudder. Limits on the maximum roll and pitch angles are set in the software. Through this flying mode, the pilot can safely steer and land the plane while the navigation region is constrained to a preset range, which is especially useful in difficult terrain where a fully autonomous landing may be impossible. It also increases flexibility during the survey when the pilot wants to cover additional targets not included in the original flight plans. However, the assisted flying mode does not relieve the pilot of responsibility, because he or she must still be able to safely steer and land the plane. Thus, it is vital to pass the training courses offered by HoverWings prior to independent surveys.

The UAS carries a 14 MP fixed-lens digital camera, the Sigma DP2 [18], as its onboard optical sensor. It is a high-end compact digital camera with a 14-megapixel Foveon X3 sensor (2652 × 1768 × 3 layers), a 2.5″ LCD, a fixed 24.2 mm f/2.8 lens (equivalent to 41 mm), and a pop-up flash. In addition, it is one of the few "compact" cameras featuring an APS-C-sized sensor, which can produce image quality comparable to that of a digital single-lens reflex (DSLR) camera. Despite this DSLR-comparable quality, it is much lighter (260 g excluding battery and card) and smaller (113.3 × 59.5 × 56.1 mm including lens) than a traditional single-lens reflex camera. These characteristics make it highly suitable for unmanned platforms. The camera also has several features that are important for photogrammetric analysis of the images: a large image sensor, multiple recording modes (e.g., JPEG, movie, and lossless compressed RAW data), a single focal length, and the absence of an image stabilizer. For safety, the entire camera is concealed in the body of the plane and points vertically toward the ground through an opening at the bottom. An on-board computer triggers the camera exposure at a regular interval defined by the flight-planning software, so images can be taken continuously during the flight.

Generally, to achieve regular stereoscopic overlap along the flight line, the flight plans are designed as parallel lines. The roll of the UAS may reach angles of up to 60° at the end of each line, where the image axis is off-nadir. Thus, we must allow for a rather large overshoot, so the area covered by the predefined flight plan is remarkably larger than the actual image acquisition area. In addition, by restricting the triggering control, we avoid acquiring large numbers of oblique images at the ends of the flight lines. Maximum pitch angles (e.g., 10° off-nadir) are predefined in the flight-planning software.

2.2. Ground station

A ground station is used to remotely control the UAS, receive telemetry data, and display images transferred in real time. The processing flow of the ground station is shown in Figure 4. As Figure 4 shows, the ground station includes three components: a ground monitoring station, a telemetry receiving antenna, and a remote control transmitter. An embedded PC104 computer serves as the hardware of the ground monitoring station and is connected to a digital video recorder, a 17″ LCD, a trackball mouse, and a membrane keyboard. The software installed on it consists of Windows-based modules for video processing, navigation, information processing, and map generation. The video interface, the navigation interface, and the task management and status bar interface can be displayed on the LCD and switched in real time.

Figure 4.

Processing flow of the ground station.

2.3. Image processing system

For applications that involve measuring and mapping, georeferencing and geometric correction of the acquired images are imperative. However, highly precise geometric correction requires effort, time, excellent ground control, and a digital elevation model. Our UAS application may not really require such efforts; depending on the relief and image features, simpler solutions may be quite sufficient.

The location of an image within a given ground coordinate system can be reconstructed in two ways. The first is to use ground control points (GCPs) with known locations in a reference system that appear in the images. The second uses the exterior orientation position (X, Y, Z) and rotations (yaw, pitch, and roll of the platform, or κ, φ, and ω of the image) recorded for the images during the flight. Although the data captured by the latter method are inaccurate relative to the image scale and resolution and cannot be used for direct georeferencing on their own, they can serve as initial values for an approximate orientation. We adopt the second approach for image location reconstruction because it is impossible to deploy GCPs for emergency rescue immediately after an earthquake, and the demand for timely data is of utmost priority.

Figure 5 shows the photogrammetry-based workflow for collecting and processing images acquired after a disaster. The exterior orientation values are taken from the log files recorded during the survey flight for initial direct georeferencing of the images. The log files contain the rotation angles (κ, φ, and ω) as well as the GPS coordinates (X, Y, and Z) of each image center. During image processing, the image orientation values are refined in iterative triangulation computations. In addition, camera calibration parameters, derived from self-calibration over a well-suited study site, are used for image orientation (an illustrative sketch of how these log values enter the orientation follows Figure 5).

Figure 5.

Workflow for collecting and processing disaster images.
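To make the role of these log values concrete, the following sketch shows how a recorded camera position (X, Y, Z) and rotation angles (ω, φ, κ) could be used to project a ground point into an image via the standard collinearity equations, i.e., the kind of approximate orientation that the logged values seed. The rotation-angle convention and the example numbers are assumptions for illustration; the actual refinement is performed inside the Pix4UAV triangulation.

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Object-to-image rotation built as R = R_kappa @ R_phi @ R_omega (angles in radians).

    This omega-phi-kappa order is one common photogrammetric convention and is an
    assumption here; the convention of the UAS log may differ.
    """
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    r_omega = np.array([[1, 0, 0], [0, co, so], [0, -so, co]])
    r_phi   = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])
    r_kappa = np.array([[ck, sk, 0], [-sk, ck, 0], [0, 0, 1]])
    return r_kappa @ r_phi @ r_omega

def project(ground_point, camera_center, angles, focal_mm=24.2):
    """Collinearity equations: map a ground point (m) to image coordinates (mm).

    The default focal length is the physical 24.2 mm lens of the DP2 (Section 2.1).
    """
    m = rotation_matrix(*angles)
    d = m @ (np.asarray(ground_point, float) - np.asarray(camera_center, float))
    x = -focal_mm * d[0] / d[2]
    y = -focal_mm * d[1] / d[2]
    return x, y

# Example with made-up values: camera 800 m above ground, a point 50 m east of nadir,
# small attitude angles as they might appear in a (hypothetical) log record.
print(project(ground_point=(50.0, 0.0, 0.0),
              camera_center=(0.0, 0.0, 800.0),
              angles=np.radians([1.0, -0.5, 3.0])))
```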

A dedicated workflow is used to preprocess the small-format aerial photographs (SFAPs). First, we screen all SFAPs obtained during the survey flight and select well-suited SFAPs with a sufficient stereoscopic overlap rate for further processing. The information in the log files enables us to visualize the image locations and their distribution in GIS software. We can therefore select well-suited SFAPs much more quickly through an attribute-based selection of points, e.g., by excluding images that deviate from the planned flight altitude or that exceed a predefined threshold for maximum deviation from the nadir position during image acquisition, such as those taken during the takeoff and landing phases.
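A minimal sketch of this attribute-based screening step, assuming each log entry provides the image name, flying height, and off-nadir angle (the record fields, threshold values, and example numbers below are illustrative, not the actual log format):

```python
from dataclasses import dataclass

@dataclass
class ImageRecord:
    """One entry from the flight log (field names are illustrative)."""
    name: str
    flying_height_m: float   # height above ground at exposure
    off_nadir_deg: float     # tilt of the image axis from vertical

def select_sfaps(records, target_height_m, height_tol_m=50.0, max_off_nadir_deg=10.0):
    """Keep images close to the planned altitude and close to nadir.

    Images taken during takeoff, landing, or turns typically violate one of the
    two thresholds and are excluded before triangulation.
    """
    return [
        rec for rec in records
        if abs(rec.flying_height_m - target_height_m) <= height_tol_m
        and abs(rec.off_nadir_deg) <= max_off_nadir_deg
    ]

# Example with made-up records: only the first survives the screening.
logs = [
    ImageRecord("IMG_0101.jpg", 795.0, 2.3),   # nominal survey image
    ImageRecord("IMG_0003.jpg", 310.0, 4.1),   # climbing after hand launch
    ImageRecord("IMG_0045.jpg", 802.0, 27.8),  # banked turn at line end
]
print([rec.name for rec in select_sfaps(logs, target_height_m=800.0)])
```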

3. Field test

After completing the system development, field tests were carried out in Yuxi County, Yunnan province, China, to acquire the optimal parameters of the system (e.g., flight heights and camera parameters). The required image scales and resolutions vary depending on the processes observed. To account for the different observation scales and site extents, we employed different flight plan designs for image acquisition.

3.1. Investigation of UAS flight parameters

The resolution of the images taken by the UAS varies with the flight altitude and other flight parameters. To obtain aerial photographs at higher resolution, flight tests were undertaken to determine the optimal flight parameters for the system.

First, different flight altitudes were chosen for the system: 200, 300, 500, and 800 m. The tests were performed under the same conditions: (a) infinite focal length, (b) exposure time of 1/1000 second, (c) ISO 200, (d) shooting interval of 4 seconds (the minimum shooting interval of the Sigma DP2), (e) wind speed of 5 m/s, (f) flight line length of 8 km, (g) horizontal and vertical viewing angles of 60° and 45°, and (h) flying speed of 30 m/s. The results showed that the image resolution obtained at these flight altitudes meets our requirements. Nevertheless, at the flight altitude of 200 m, we obtained only oblique images during the UAS turns in the overshoot zone and missed some shots. Considering the minimum turn radius of the UAS and its average speed, this altitude represents almost the lower survey limit and should not be used during the flight. The overlap rates of images acquired at the other three flight altitudes are shown in Table 1. As Table 1 shows, the image overlap rate depends on the shooting interval and the flight altitude, and all of these overlap rates fulfill our stitching requirements (an approximate geometric reconstruction of these rates is sketched after Table 1).

Flight height (m) | Shooting interval (s) | Overlap (%)
300 | 4 | 63–36
300 | 5 | 54–20
500 | 4 | 73–64
500 | 5 | 67–55
800 | 4 | 86–75
800 | 5 | 82–69

Table 1.

The image overlaps taken at flight heights of 300, 500, and 800 m.
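The dependence of overlap on flight height and shooting interval seen in Table 1 follows from simple geometry: the along-track footprint grows with flight height, while the distance flown between exposures equals ground speed times shooting interval. The sketch below is only an approximation of the measured values, which also reflect wind-induced variations in ground speed; treating the 20.7 mm sensor dimension as the along-track one is our assumption.

```python
def forward_overlap(flight_height_m, ground_speed_ms, interval_s,
                    focal_mm=41.0, sensor_along_track_mm=20.7):
    """Estimate forward (along-track) overlap between consecutive images.

    footprint = H * sensor / focal, baseline = speed * interval; the overlap is
    the fraction of the footprint shared with the next exposure (0 if none).
    """
    footprint_m = flight_height_m * sensor_along_track_mm / focal_mm
    baseline_m = ground_speed_ms * interval_s
    return max(0.0, 1.0 - baseline_m / footprint_m)

# Illustrative values spanning the ground-speed range of the HW18 (70-110 km/h).
for height in (300, 500, 800):
    for interval in (4, 5):
        low = forward_overlap(height, 110 / 3.6, interval)   # fast ground speed
        high = forward_overlap(height, 70 / 3.6, interval)   # slow ground speed
        print(f"{height} m, {interval} s: {high:.0%} to {low:.0%}")
```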

After obtaining the image overlap rates at various flight altitudes, we carried out further tests to determine the camera parameters, namely ISO sensitivity and exposure time, under different weather conditions (cloudy, dark cloud, cloudless, and fog). For ISO sensitivity, the automatic control mode is more suitable than the fixed setting for the first two weather types, whereas the fixed setting performs better in the latter two. The reason is that in cloudy and foggy weather the light changes with the state of the clouds or fog. To determine the optimal exposure time, the UAS was tested at different flight altitudes (300, 500, and 800 m) with infinite focal length and the same ISO sensitivity. Six exposure times were set during the flights: 1/250, 1/320, 1/500, 1/640, 1/700, and 1/1000 second. The findings showed that the clarity and resolution of images taken at flight altitudes of 300 and 500 m were lower than those taken at 800 m (Figure 6). In addition, because of the high-speed movement of the camera during flight, long exposure times blur the images, especially 1/250, 1/320, and 1/500 second. Thus, in real field operation, only a flight altitude of 800 m and an exposure time of 1/1000 second are used for image acquisition (a short numerical sketch of the motion-blur effect follows Figure 6).

Figure 6.

Images taken from three different flight altitudes: (a) 300 m, (b) 500 m, and (c) 800 m.
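The exposure-time constraint can be quantified roughly: during the exposure the camera moves forward, and once that distance approaches the ground sampling distance the image smears. The sketch below uses the 30 m/s flying speed from the tests and the roughly 0.1 m ground resolution reported in Section 3.2; airframe vibration adds further blur that is not captured here.

```python
def motion_blur_pixels(ground_speed_ms, exposure_s, ground_resolution_m):
    """Forward motion of the camera during the exposure, expressed in pixels."""
    return ground_speed_ms * exposure_s / ground_resolution_m

# Flying speed from the tests (30 m/s) and the ~0.1 m ground resolution of Section 3.2.
for denom in (250, 320, 500, 640, 700, 1000):
    blur = motion_blur_pixels(30.0, 1.0 / denom, 0.1)
    print(f"1/{denom} s -> {blur:.2f} px of forward smear")
```

With these values the smear exceeds a pixel at 1/250 s but stays well below a pixel at 1/1000 s, consistent with the blur observed at the longer exposure times.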

3.2. Data analysis

The image resolution can be obtained from a formula given in the 1:500, 1:1000, and 1:2000 aerial photogrammetry standard [19], based on the relation between focal length, flight altitude, and resolution. The formula can be expressed as follows:

f = H × C / A   (1)

where f is the focal length, H is the flight altitude, C is the CCD size, and A is the ground coverage, computed as A = pixel count × resolution.

For images taken at a flight altitude of 800 m, the fixed-focus mode was used during shooting. The focal length was 41 mm, the image sensor size was 20.7 × 13.8 mm², and the image size was 2640 × 1760 pixels. Substituting these data into Eq. (1), the theoretical resolution is 0.06 m, whereas the resolution obtained in the real test, between 0.1 and 0.12 m, is coarser than this theoretical value.
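The following is a small sketch of Eq. (1), rearranged to give the ground coverage A and the per-pixel resolution. The example call uses the values quoted above, but the output is only a geometric estimate (it ignores, for example, the three stacked layers of the Foveon sensor and any image-quality effects), so it will not necessarily reproduce the figures reported in the text.

```python
def ground_coverage_m(flight_height_m, sensor_dim_mm, focal_mm):
    """Eq. (1) rearranged: A = H * C / f, with C and f in the same units."""
    return flight_height_m * sensor_dim_mm / focal_mm

def ground_resolution_m(flight_height_m, sensor_dim_mm, focal_mm, pixels):
    """Per-pixel ground resolution along the chosen sensor dimension."""
    return ground_coverage_m(flight_height_m, sensor_dim_mm, focal_mm) / pixels

# Values quoted in the text: H = 800 m, C = 20.7 mm, f = 41 mm, 2640 pixels across.
cov = ground_coverage_m(800.0, 20.7, 41.0)
res = ground_resolution_m(800.0, 20.7, 41.0, 2640)
print(f"coverage ~{cov:.0f} m across track, ~{res:.2f} m per pixel")
```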

3.3. Regional panoramic image of Yuxi County

After determining the UAS flight parameters, we selected Yuxi County for a test flight to acquire a high-resolution panoramic image and verify the UAS performance. Yuxi County extends 8 km from northeast to southwest and 6 km from northwest to southeast, so the total area to be mapped is approximately 42 km². Based on the overlap requirement, at least 25 flight lines are needed at a flight altitude of 800 m. To obtain additional auxiliary data, 30 lines were finally planned, with a total flight distance of 200 km. In total, 1711 images (3 MB per image) with a data volume of 20 GB were captured in three flights. After excluding images that exceeded the predefined threshold for maximum deviation from the nadir position or deviated from the planned flight altitude, 1610 images remained. We then used the Pix4UAV software to rapidly stitch these images. Figure 7 shows the automatically computed flight plan. In the triangulation process, 387,784 three-dimensional (3D) points and 1,268,824 key-point observations were used for the bundle block adjustment, with a mean re-projection error of 0.711432 pixel. The geotag localization variances calculated by Pix4UAV were 0.251682, 0.258808, and 0.374265 m in the X-, Y-, and Z-directions, respectively.

Figure 7.

The flight line automatically calculated by the Pix4UAV software.

Eventually, through point cloud densification in the Pix4UAV software, we obtained the high-resolution panoramic image of Yuxi County shown in Figure 8. One can clearly distinguish the violin strings in Nie Er Square, the landmark of Yuxi County, by magnifying the image of the square step by step.

Figure 8.

Panorama and the landmark of Yuxi County (Nie Er Square).

4. Real field application to Lushan earthquake

A destructive Ms7.0 (Mw6.6) earthquake struck Lushan County, Sichuan province, China, at 08:02 Beijing Time (00:02 UTC) on April 20, 2013 [1]. The location of the main shock provided by the China Earthquake Networks Center was 30.3°N, 103.0°E, with a focal depth of 13 km. This earthquake caused 193 fatalities and left 25 people missing, in addition to more than 10,000 injuries. Direct economic losses were estimated at more than 10 billion RMB.

To apply the presented system to this earthquake and evaluate its performance, we departed from Beijing at 15:00 Beijing Time on the same day, and it took more than a day to reach the Lushan field headquarters (12:00 on April 22). Based on the disaster information acquired from the earthquake emergency team, we designed a flight plan and submitted it to the China Earthquake Administration for approval. It took about 1 week to obtain permission to fly. After the plan was approved, an aerial flight was immediately carried out on April 29 and 30. In total, 9 hours were spent completing the designed plan, which ran from Longmen Township to Lushan County with a total round-trip distance of about 40 km (Figure 9(a)). The longitudes and latitudes of the flight line waypoints are listed in Table 2. We used flight parameters similar to those obtained from the field test in Yuxi County, but set the flight altitude to 600 m according to the local conditions. From this flight, we acquired 1101 images covering 25.51 km² of the affected area, with a stored data volume of approximately 9.16 GB. Stitching these images with the Pix4UAV software took 2 hours, and the result is displayed in Figure 9(b).

Figure 9.

(a) The flight line and (b) the stitching results for the Lushan earthquake.

No. | Long. | Lat. || No. | Long. | Lat. || No. | Long. | Lat.
1 | 102.877 | 30.153 || 22 | 102.866 | 30.193 || 43 | 102.819 | 30.270
2 | 102.927 | 30.158 || 23 | 102.863 | 30.199 || 44 | 103.040 | 30.281
3 | 102.875 | 30.161 || 24 | 102.865 | 30.202 || 45 | 102.799 | 30.290
4 | 102.920 | 30.162 || 25 | 102.958 | 30.204 || 46 | 103.044 | 30.291
5 | 102.886 | 30.162 || 26 | 102.852 | 30.207 || 47 | 102.791 | 30.292
6 | 102.893 | 30.164 || 27 | 102.857 | 30.208 || 48 | 103.037 | 30.297
7 | 102.933 | 30.164 || 28 | 102.850 | 30.211 || 49 | 103.038 | 30.300
8 | 102.868 | 30.167 || 29 | 102.846 | 30.212 || 50 | 102.790 | 30.304
9 | 102.942 | 30.165 || 30 | 102.955 | 30.215 || 51 | 102.786 | 30.306
10 | 102.897 | 30.168 || 31 | 102.959 | 30.215 || 52 | 103.035 | 30.310
11 | 102.915 | 30.170 || 32 | 102.961 | 30.219 || 53 | 102.783 | 30.316
12 | 102.894 | 30.173 || 33 | 102.965 | 30.224 || 54 | 103.045 | 30.321
13 | 102.870 | 30.174 || 34 | 102.970 | 30.224 || 55 | 102.772 | 30.330
14 | 102.948 | 30.177 || 35 | 102.968 | 30.229 || 56 | 102.778 | 30.332
15 | 102.868 | 30.178 || 36 | 102.974 | 30.232 || 57 | 102.786 | 30.340
16 | 102.900 | 30.178 || 37 | 102.986 | 30.233 || 58 | 102.795 | 30.342
17 | 102.955 | 30.181 || 38 | 102.999 | 30.243 || 59 | 102.803 | 30.349
18 | 102.863 | 30.182 || 39 | 103.014 | 30.249 || 60 | 102.807 | 30.363
19 | 102.863 | 30.188 || 40 | 102.827 | 30.252 || 61 | 102.812 | 30.371
20 | 102.956 | 30.190 || 41 | 103.023 | 30.264 ||
21 | 102.953 | 30.193 || 42 | 103.033 | 30.266 ||

Table 2.

Longitude and latitude of the flight line waypoints.

For the Lushan earthquake, because flight permission could not be obtained in time, the flight was carried out only after the end of the earthquake emergency response (EER), and the result was therefore not used for emergency rescue and decision-making. Nevertheless, it served to further test the operational capability of the presented UAS for EER.

5. Real earthquake emergency application during Ludian earthquake

The Ludian earthquake was the first event in which the presented UAS was used operationally for EER. This earthquake occurred at 16:30 on August 3, 2014 (Beijing Time), at 27.1°N, 103.3°E with a focal depth of 12 km, as reported by the China Earthquake Networks Center. According to the Chinese intensity scale, the maximum intensity of this earthquake was IX, comparable to modified Mercalli intensity IX. The disaster caused 617 deaths, left 112 people missing, and injured 3143.

Because of serious damage to the roads, it was impossible for the earthquake emergency team to reach the hardest-hit areas in time for emergency rescue and disaster assessment. Therefore, the headquarters decided to dispatch the UAS team to learn more about the road destruction. In the early morning of August 5, we arrived at the field command, immediately began to design the flight route from Shaba through Longtoushan to Lehong town, and applied for air traffic control (ATC) clearance. At 12:00, with the coordination of the Yunnan Bureau of Surveying, Mapping, and Geoinformation, we obtained permission to fly. Because the barrier lake formed in the Niulan River by a huge landslide was too dangerous to approach, we immediately used Google Earth to design a flight plan along the river for this barrier lake. At 15:00, we departed from the field command and selected the Li yard in Huodehong town as a landing point. It took about 3 hours to complete the two flight plans. After returning to the headquarters at 21:00, we spent 2 hours stitching the images and obtained the panoramic image of the barrier lake (Figure 10).

Figure 10.

Panorama of the barrier lake.

On August 6 and 7, we designed another six flight plans and finally obtained aerial images of almost the entire extreme disaster area (Figure 11), shown together with the aftershock distribution. When the image is partially enlarged, the different types of disaster caused by this earthquake can be clearly seen, including landslides, road destruction, structural damage, and floods caused by barrier lakes.

Figure 11.

Coverage of the panorama image for the Ludian earthquake compared to the aftershock distribution of this earthquake. The inset represents the study region within Yunnan Province.

6. Discussion

Although the accuracy of this processing workflow cannot match that of workflows using field-measured GCPs, its results are very suitable for mapping post-earthquake epicentral areas, and the data are much more accurate and detailed than traditional SRS data. The image resolution is about 10–15 cm, approximately one order of magnitude better than images taken by QuickBird or WorldView. Such commercially available imagery can be used to analyze larger areas in considerable detail. However, UAS data obtained from a flight altitude of about 800 m can provide coverage of post-earthquake epicentral regions with much higher precision, at user-specified times and repeat rates and under varying lighting conditions.

In the authors' experience, the flight altitude should not be below 300 m for a UAS with the performance of the presented system; otherwise, the accuracy of the initial exterior orientation values obtained from the in-flight GPS logs would be extremely low relative to the size of the area covered by each image. When the initial image positions used in the triangulation process are inaccurate, the overlap rate with neighboring images cannot be calculated, and consequently tie points and the relative orientation between images cannot be established. In addition, the larger image extents obtained at higher flight altitudes can reduce the relative mispositioning of the images. Accordingly, the necessary initial triangulation and subsequent refinement should be performed with the Pix4UAV software.

To obtain results with a high level of detail, the image processing is rather labor-intensive and needs to be improved in the future. However, manual input during image processing can also be beneficial, because it gives the user more control over the process [20]. Recent techniques, such as workflows based on structure from motion (SfM) [21], may simplify this further.

7. Conclusions

Earth observation and GIS techniques can be used to improve efforts dedicated to developing applicable disaster mitigation strategies. In addition, they can provide the relevant agencies with crucial information to alleviate the impact of a disaster. However, technical and financial issues have challenged the traditional use of aerial and satellite images for this task.

More and more UAS are being used as photogrammetric platforms for civilian applications because of their ease of operation, their relatively low cost, and the emergence of low-cost imaging and navigation sensors with performance comparable to that of more expensive sensors. Given their operational nature and fabrication cost, this technology can be adopted to establish a low-cost mapping system.

In this chapter, we presented the application of a UAS to rapidly collect images of post-earthquake disasters. With the chosen fixed-wing UAS, we successfully performed data acquisition at different scales. The results obtained from the two disastrous earthquakes, especially the 2014 Ludian earthquake, demonstrate that the system plays a significant role in investigating and gathering information about a disaster in the epicentral area, including secondary disaster investigation, road damage detection, and rapid disaster assessment. The rescue headquarters can use the obtained earthquake information to rapidly develop relief measures and improve emergency rescue efficiency.

For the image analysis, we applied a photogrammetric workflow to cope with the high resolution of the obtained SFAPs, which were captured from relatively high flight altitudes (about 800 m above ground). At these flight altitudes, the resolution of the acquired images still remains below 15 cm. Because the aerial surveys were carried out in post-earthquake epicentral areas and covered very large areas, no GCPs were distributed in the mapping field. Instead, we applied direct georeferencing using the GPS log recorded by the UAS to create the image block. Among other information, the log files contain the position (X, Y, and Z) measured by GPS for each image as well as the tilt of the image axes (κ, φ, and ω). We used this information as initial exterior orientation values for the images in the photogrammetric block file and analyzed the accuracy of these values directly with the Pix4UAV software. According to our analysis, the accuracy of these values depends strongly on the camera alignment relative to the IMU coordinate system and on the measurement precision of the GPS and IMU units. In addition, the accuracy may be influenced by a time lag between the triggering command and the actual triggering of the camera. Therefore, these values can be expected to deviate by up to several degrees and several meters from the actual ones. Further study is needed to investigate these problems in the future.
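As a rough illustration of why deviations of several meters are plausible, the dominant horizontal terms are the GPS error itself and the distance flown between the triggering command and the actual exposure. The 0.2 s trigger lag and the 2.5 m GPS error used below are hypothetical values chosen only for illustration, not measured properties of the system.

```python
def geotag_offset_m(ground_speed_ms, trigger_lag_s, gps_error_m):
    """Worst-case horizontal offset of a geotag: lag-induced shift plus GPS error."""
    return ground_speed_ms * trigger_lag_s + gps_error_m

# Ground speed 70-110 km/h, a hypothetical 0.2 s trigger lag, and ~2.5 m GPS error.
for speed_kmh in (70, 110):
    offset = geotag_offset_m(speed_kmh / 3.6, 0.2, 2.5)
    print(f"{speed_kmh} km/h -> up to ~{offset:.1f} m geotag offset")
```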

Acknowledgments

This research was cofunded by the National Natural Science Foundation of China (41404048) and the Special Fund of the Institute of Geophysics, China Earthquake Administration (DQJB15C04).

Conflict of interest

The authors declare no conflict of interest.

References

  1. Peng CY, Yang JS, Chen Y, Zhu XY, Xu ZQ, Zheng Y, Jiang XD. Application of a threshold-based earthquake early warning method to the Mw6.6 Lushan earthquake, Sichuan, China. Seismological Research Letters. 2015;86(3):841-847
  2. Peng CY, Yang JS, Zheng Y, Zhu XY, Xu ZQ, Chen Y. New τc regression relationship derived from all P wave time windows for rapid magnitude estimation. Geophysical Research Letters. 2017;44:1724-1731. DOI: 10.1002/2016GL071672
  3. Peng CY, Chen Y, Chen QS, Yang JS, Wang HT, Zhu XY, Xu ZQ, Zheng Y. A new type of tri-axial accelerometers with high dynamic range MEMS for earthquake early warning. Computers & Geosciences. 2017;100:179-187. DOI: 10.1016/j.cageo.2017.01.001
  4. Boccardo P. New perspectives in emergency mapping. European Journal of Remote Sensing. 2013;46:571-582. DOI: 10.5721/EuJRS20134633
  5. Bendea H, Boccardo P, Dequal S, Tonolo FG, Marenchino D, Piras M. Low cost UAV for post-disaster assessment. In: Chen Jun, Jiang Jie, Ammatzia Peled, editors. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences; 3-11 July 2008; Beijing, China. Vol. XXXVII, Part B8. 2008. pp. 1373-1379
  6. Gaszczak A, Breckon TP, Han JW. Real-time people and vehicle detection from UAV imagery. In: Proceedings of SPIE 7878, Intelligent Robots and Computer Vision XXVIII: Algorithms and Techniques; January 24, 2011. 78780B. DOI: 10.1117/12.876663
  7. Al-Tahir R, Arthur M, Davis D. Low cost aerial mapping alternatives for natural disasters in the Caribbean. In: FIG Working Week 2011, Bridging the Gap Between Cultures; May 18-22, 2011; Marrakech, Morocco
  8. Al-Tahir R, Arthur M. Unmanned aerial mapping solution for small island developing states. In: Global Geospatial Conference 2012; Québec City, Canada. 9 p
  9. Meo R, Roglia E, Bottino A. The exploitation of data from remote and human sensors for environment monitoring in the SMAT project. Sensors (Basel). 2012;12(12):17504-17535. DOI: 10.3390/s121217504
  10. Watts AC, Ambrosia VG, Hinkley EA. Unmanned aircraft systems in remote sensing and scientific research: Classification and considerations of use. Remote Sensing. 2012;4:1671-1692. DOI: 10.3390/rs4061671
  11. Baiocchi V, Dominici D, Milone MV, Mormile M. Development of a software to plan UAVs stereoscopic flight: An application on post earthquake scenario in L'Aquila city. In: 13th International Conference on Computational Science and Its Applications (ICCSA 2013); June 24-27, 2013; Ho Chi Minh City, Vietnam. pp. 150-165. DOI: 10.1007/978-3-642-39649-6_11
  12. Baiocchi V, Dominici D, Mormile M. Unmanned aerial vehicle for post seismic and other hazard scenarios. WIT Transactions on the Built Environment. 2013;134:113-122
  13. Adams SM, Friedland CJ. A survey of unmanned aerial vehicle (UAV) usage for imagery collection in disaster research and management. In: Proceedings of the Ninth International Workshop on Remote Sensing for Disaster Response; September 15-16, 2011; Stanford, CA, USA
  14. d'Oleire-Oltmanns S, Marzolff I, Peter KD, Ries JB. Unmanned aerial vehicle (UAV) for monitoring soil erosion in Morocco. Remote Sensing. 2012;4:3390-3416. DOI: 10.3390/rs4113390
  15. Xu ZQ, Yang JS, Peng CY, Wu Y, Jiang XD, Li R, Zheng Y, Gao Y, Liu S, Tian BF. Development of an UAS for post-earthquake disaster surveying and its application in Ms7.0 Lushan earthquake, Sichuan, China. Computers & Geosciences. 2014;68:22-30. DOI: 10.1016/j.cageo.2014.04.001
  16. Pastor E, Lopez J, Royo P. UAV payload and mission control hardware/software architecture. IEEE Aerospace and Electronic Systems Magazine. 2007;22:3-8
  17. Pix4D. Pix4D Software Introduction [Internet]. 2010. Available from: https://pix4d.com/ [Accessed: February 1, 2018]
  18. Sigma DP2. The Sigma DP2: A Full Spec Compact Digital Camera with all the Power of DSLR [Internet]. 2009. Available from: http://www.sigma-dp.com/DP2/specification.html [Accessed: February 1, 2018]
  19. GB/T 6962-2005. 1:500 1:1 000 1:2 000 Aerial Photogrammetric Standard. Beijing: State Standardization Publishing House; 2005
  20. Hendrickx M, Gheyle W, Bonne J, Bourgeois J, de Wulf A, Goossens R. The use of stereoscopic images taken from a microdrone for the documentation of heritage – An example from the Tuekta burial mounds in the Russian Altay. Journal of Archaeological Science. 2011;38:2968-2978
  21. Turner D, Lucieer A, Watson C. An automated technique for generating georectified mosaics from ultra-high resolution unmanned aerial vehicle (UAV) imagery, based on structure from motion (SfM) point clouds. Remote Sensing. 2012;4:1392-1410
