Open access peer-reviewed chapter - ONLINE FIRST

Unmanned Aerial Systems (UAS)-Derived 3D Models for Digital Twin Construction Applications

Written By

Jhonattan G. Martinez, Luis A. Alarcon and Søren Wandahl

Submitted: 08 February 2024 Reviewed: 12 February 2024 Published: 14 March 2024

DOI: 10.5772/intechopen.1004746


From the Edited Volume

Point Cloud Generation and Its Applications [Working Title]

Associate Prof. Cumhur Şahin


Abstract

The advent of Construction 4.0 has marked a paradigm shift in industrial development, integrating advanced technologies such as cyber-physical systems (CPS), sensors, unmanned aerial systems (UAS), building information modeling (BIM), and robotics. Notably, UASs have emerged as invaluable tools seamlessly embedded in construction processes, facilitating the comprehensive monitoring and digitization of construction projects from the early design phase through construction to the post-construction phases. Equipped with various sensors, such as imaging sensors, light detection and ranging (LiDAR), and thermal sensors, UASs play an important role in data collection processes, especially for 3D point cloud generation. Presently, UASs are recognized as one of the most effective means of generating a Digital Twin (DT) of construction projects, surpassing traditional methods in terms of speed and accuracy. This chapter provides a comprehensive overview of the applications of UAS-derived 3D models in DT, outlining their advantages and barriers and offering recommendations to augment their quality and accuracy.

Keywords

  • unmanned aerial systems (UAS)
  • digital twin
  • construction industry
  • point cloud
  • photogrammetry
  • LiDAR
  • 3D model

1. Introduction

The advent of the Construction 4.0 concept has ushered in a new era in industrial development, characterized by the integration of various technologies such as cyber-physical systems (CPS), sensors, building information modeling (BIM), robotics, big data, cloud manufacturing, and virtual and augmented reality [1]. Among the various components of CPS, unmanned aerial systems (UASs) stand out as valuable tools that can be seamlessly integrated into construction processes [2]. UASs can monitor and digitize construction projects throughout their lifecycle, starting from the early design phase, continuing through construction, and extending to post-construction phases. Integrating UASs in construction processes enhances efficiency and effectiveness in project management and execution [3].

UAS stands for unmanned aircraft system: a complex system that includes an unmanned aircraft (UA), commonly called a drone, together with all the components and infrastructure necessary for its operation [4]. UASs are commonly used for various applications, including aerial surveillance, photography, agriculture, search and rescue, scientific research, and commercial services. They play a crucial role in a wide range of industries due to their adaptability and ability to access areas that may be dangerous or difficult for human pilots. The acceptance of UASs in the construction industry arises from their operational adaptability and their capability to collect visual information dynamically and promptly. A UAS can carry various sensors, which are crucial for collecting data and information from the environment during operations. The type of sensor used varies depending on the specific application and the data requirements of the flight mission. For instance, imaging sensors, light detection and ranging (LiDAR) sensors, thermal sensors, multispectral and hyperspectral sensors, and radar systems are some of the most common sensors used for data collection in construction [5]. UAS-derived products include aerial imagery, orthomosaic maps, digital elevation models (DEM), point clouds or 3D models, thermal maps, and inspection reports [6].

Nowadays, UASs are considered one of the most effective ways to create a Digital Twin (DT) of a construction project because they can collect various types of data (e.g., images, videos, and point clouds) faster and with higher accuracy than traditional methods. A DT is an interactive, real-time digital representation of building or infrastructure assets, employing sensors and cyber-physical systems to understand the physical world better [7]. Methods based on the DT concept have shown success, particularly in industries like manufacturing, where they have been effectively utilized for production planning and control. According to Tao et al. [8], a DT possesses three main elements: the physical object, the virtual element, and the link connections. Point clouds are crucial for creating and maintaining a DT because they provide a detailed and accurate representation of the physical environment [9]. They offer a highly detailed and precise 3D model of physical objects and their surroundings: each point in the cloud corresponds to a specific location in space, accurately capturing the shape and geometry of surfaces.

Accordingly, this chapter aims to provide an overview of the application of UAS-derived 3D models in DT. The chapter describes the advantages and barriers to UAS applications for 3D point cloud generation and offers recommendations to enhance their quality and accuracy.


2. UAS applications in construction

A UAS, commonly called a drone, unpiloted aerial vehicle, or remotely piloted aircraft (RPA), is a powered aerial vehicle without a human operator on board (see Figure 1). It relies on aerodynamic forces for lift, can operate autonomously or be controlled remotely, and can carry either lethal or nonlethal payloads [10]. The UAS’s flight can be managed by onboard computers or by a ground-based pilot in another vehicle. A UAS encompasses not only the aircraft but also additional elements such as sensors, control systems, control links, and a ground station. Sensors are among the most fundamental of these components, as they enable the UAS to gather data and perform specific tasks.

Figure 1.

Types of UASs: (a) fixed wing and (b) multirotor.

Some of the most common sensors used in construction are imaging, LiDAR, and thermal sensors (see Figure 2) [11].

Figure 2.

UAS-based sensors: (a) multispectral, (b) LiDAR, (c) thermal, and (d) imagery.

UASs have been incorporated into various construction-related tasks, as Ham et al. [12] outlined. Within the construction sector, UASs have been utilized for diverse purposes, including mapping and surveying construction sites [13], monitoring the progress of construction projects [14], safety inspections [15], inspecting structures [13], and conducting post-disaster assessment [16]. The integration of UASs into the construction industry is driven by the operational adaptability of aerial platforms and their ability to access areas that are difficult to reach or pose potential risks [17]. This adaptability allows UASs to contribute substantially across a spectrum of activities, from routine tasks such as site mapping and progress monitoring to critical functions like structural inspection and post-disaster assessment within the construction field.

In the construction stage, UASs are helpful, particularly in building inspection and progress monitoring [18]. Building inspection involves assessing a building’s structural integrity, and the deployment of UASs during this phase offers several benefits, including cost reduction, time efficiency, and risk mitigation compared to traditional inspection methods like elevating platforms and scaffolding. UASs equipped with high-resolution cameras can digitally monitor buildings, accurately detecting cracks and damage at millimeter scale [19]. This technology provides access to hard-to-reach areas, addressing the drawbacks of conventional methods, such as logistics time, safety hazards, and expensive equipment acquisition. The effectiveness of UASs in building inspection has been validated through comparisons with traditional methods. Various examples illustrate the versatility of UAS applications in building inspection, such as visualizing and assessing the condition of structures like stone-made church towers, hangar wall and roof joints, wind turbine cores, and chimneys for damage identification. Despite environmental challenges, UAS technology allows for detailed assessments, encouraging its use in such settings.

Integrating UASs can significantly enhance progress monitoring, which involves tracking construction project advancements. Conventional monitoring methods, such as on-foot inspections and terrestrial or satellite imagery collection, face limitations due to factors like the time required for manual surveillance and environmental influences on satellite image quality [20]. UASs offer improved flexibility, high-resolution imagery, and operational proximity, enabling monitoring of construction sites across various periods. UASs have also been effectively utilized in monitoring the construction progress of roads and bridges within extensive infrastructure projects [16]. Unger et al. [21] utilized UAS-generated orthophotos to monitor specific building-zoned areas over 5 months, accurately identifying changes. The authors highlighted UAS technology as a cost- and time-effective alternative to traditional photogrammetry for progress monitoring.


3. Digital Twin in construction

A DT is a digital emulation of a tangible asset, employing DT-enabling technologies like sensors, communication networks, and 3D models to acquire real-time information and facilitate two-way coordination [22]. This ensures that the digital model is a counterpart to the physical asset. A DT relies on the fundamental principles of the Internet of Things (IoT) for its functionality [23]. The implementation of a DT is contingent upon the level of information necessary to sustain and consistently update the model. As shown in Figure 3, a DT can be characterized by three main elements: (1) physical reality, (2) virtual representation, and (3) interconnections that exchange information between the physical and virtual counterparts [24].

Figure 3.

DT representation.

Researchers have used the term DT in the construction domain as an equivalent of BIM models made in design and construction [25]. Other researchers perceive a DT as a digital representation of construction assets such as buildings, bridges, or industrial facilities [26]. However, such definitions fall short of describing a DT since they consider only the physical representation of the assets rather than the processes and resources involved. A more accurate perspective defines a DT as a digital representation of resources, processes, or systems in the built or natural environment. This concept has been applied to all the construction stages, including design and engineering, construction, operation and maintenance, and demolition and recovery [27]. It has been used primarily in the construction stage to assess the structural integrity of construction elements, compare actual and planned performance, and detect deviations in the flows of materials, equipment, and labor [26]. Accordingly, a DT can provide proactive and accurate status information on the construction site.

Various researchers have explored DT applications in construction. Barazzetti et al. [28] developed a historical building information model (HBIM) with augmented reality (AR) and virtual reality (VR) to boost cultural tourism. Gabor et al. [29] utilized DT for online planning and analyzed information flows within a cyber-physical system, enhancing engineering processes. Alonso et al. [30] focused on optimizing energy consumption in buildings’ life cycles to reduce costs and environmental impact. Boje et al. [27] examined BIM applications in construction and outlined requirements for a new DT model. Lu et al. [31] developed a semi-automatic approach that produces an organized, accurate, and accessible DT system, combining images and computer-aided design (CAD) drawings. Angjeliu et al. [32] introduced a technique to build an accurate digital model that reproduces experimental physical conditions. This model is used for exploring the structural response of the system, as well as for analyzing preventive maintenance and reinforcement procedures. The research addressed the methodological development of structural simulation, analysis, and control models for historic buildings under the DT concept. More recently, Pan and Zhang [33] devised and developed a data-driven DT framework integrating BIM, IoT, and data mining for advanced project management. The proposed framework aimed to enhance data communication and exploration to better understand, predict, and optimize physical construction processes.


4. UAS-derived 3D models for DT

Construction researchers and practitioners have shown increased interest in enhancing the accuracy of UAS-generated 3D models and, ideally, improving this technology’s integration within construction practices [34]. High accuracy of 3D point clouds is required for various construction practices, including building surveying and digital construction site representation. 3D point clouds are crucial for creating and maintaining a DT because they provide a detailed and accurate representation of the physical environment. Some of the 3D point cloud applications in DT are:

  • High-precision 3D representation: Point clouds provide an intricately detailed and accurate three-dimensional portrayal of physical objects and their environment. Every individual point within the cloud corresponds to a precise spatial location, effectively capturing the surfaces’ shape and geometry.

  • Reality capture: Point clouds serve as a form of reality capture, allowing for the accurate documentation of the as-built environment. This is especially valuable in construction, architecture, and urban planning, where maintaining an up-to-date and accurate representation of the physical world is crucial.

  • Detailed spatial information: Point clouds provide rich spatial information about the objects within their scope. This includes geometry and the relationships between elements, enabling a comprehensive understanding of physical space.

  • Integration with other data sources: Point clouds can be integrated with data from other sensors and sources, such as images, LiDAR, or IoT devices. This integration enhances the DT’s overall fidelity, providing a more comprehensive and holistic view of the physical environment.

  • Visualization and simulation: The detailed information within point clouds facilitates realistic visualization and simulation within the DT. It allows users to investigate and analyze the environment virtually, helping in decision-making processes, planning, and simulations of various scenarios.

  • Monitoring changes over time: A DT must reflect changes over time as a living representation of the physical world. Point clouds can be periodically updated to capture alterations in the physical environment, allowing for an accurate historical record and aiding in predictive analysis.

  • Asset management: For infrastructure and facility management, point clouds provide essential data for maintaining and managing assets. They assist in assessing the condition of structures, planning maintenance, and optimizing operations.

Overall, 3D point clouds serve as a foundational element for a DT, offering the intricate spatial details necessary for creating a realistic and dynamic virtual replica of the physical world. This accuracy is vital for applications ranging from urban planning and construction to facility management.
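The change-monitoring role described above can be sketched as a simple cloud-to-cloud comparison between two survey epochs. The following is a minimal, illustrative example on synthetic data (the threshold and cloud sizes are arbitrary); production pipelines would use spatial indexing rather than brute force:

```python
import numpy as np

def changed_points(reference, survey, threshold=0.05):
    """Flag survey points farther than `threshold` (metres) from every
    reference point -- a basic cloud-to-cloud change detection."""
    # Brute-force nearest-neighbour distance; fine for small clouds,
    # a KD-tree (e.g. scipy.spatial.cKDTree) scales far better.
    d = np.linalg.norm(survey[:, None, :] - reference[None, :, :], axis=2)
    return d.min(axis=1) > threshold

rng = np.random.default_rng(0)
ref = rng.uniform(0, 10, size=(500, 3))            # previous survey epoch
new = np.vstack([ref, [[5.0, 5.0, 12.0]]])         # same scene plus one new point
mask = changed_points(ref, new)
print(mask.sum())  # 1 -> only the added point is flagged as a change
```

Periodically re-flying the site and diffing epochs this way yields the historical record the DT needs.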


5. Techniques to generate 3D point cloud data for DT

There are two main techniques to generate 3D point clouds of the construction site: photogrammetry and LiDAR-based mapping.

5.1 UAS-based photogrammetry

Photogrammetry is a technique used to extract geometric information from two-dimensional images or photographs. It encompasses analyzing the positions and relationships of points in multiple images to create a three-dimensional representation of the photographed object or scene [35]. This process relies on the principles of triangulation, where the positions of points are determined by measuring the angles formed by the lines of sight to those points from known positions.

UAS-based photogrammetry offers several advantages over traditional photogrammetry methods. UASs provide the flexibility to capture images from various angles and altitudes, allowing for more comprehensive coverage of an area. In addition, using UASs can significantly reduce costs compared with ground-based surveying because they can cover larger areas in less time [36]. Another advantage of UASs is their ability to capture high-resolution images, leading to detailed and accurate mapping results. This is especially beneficial for applications requiring precise measurements or the identification of small features [37]. UAS-based photogrammetry begins with the data collection process, during which it is important to properly define the area to be surveyed. Flight altitude, overlap settings, and camera parameters are essential for optimal image acquisition. Once the data is collected, the raw data is processed, considering parameters like camera calibration, camera alignment, and the coordinate system. The rectified data is then used to reconstruct a dense point cloud by triangulating the positions of features identified in multiple images (see Figure 4).

Figure 4.

UAS-based photogrammetry workflow.
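The triangulation principle underlying this workflow can be sketched numerically. The following is a minimal, illustrative example (the camera matrices and the 3D point are hypothetical) of linear triangulation of one point observed in two overlapping images:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 camera projection matrices.
    x1, x2 : (u, v) pixel observations of the same point in each image.
    Returns the 3D point in the world frame.
    """
    # Each observation contributes two linear constraints on the point.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                  # null-space vector = homogeneous solution
    return X[:3] / X[3]

# Two simple cameras: one at the origin, one shifted 1 m along x.
K = np.array([[1000.0, 0, 500], [0, 1000.0, 500], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.2, 0.1, 5.0])
x1 = P1 @ np.append(X_true, 1); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1); x2 = x2[:2] / x2[2]
print(np.round(triangulate(P1, P2, x1, x2), 3))  # recovers [0.2, 0.1, 5.0]
```

Photogrammetry software repeats this over millions of matched features, with the camera poses themselves estimated by bundle adjustment.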

5.2 UAS LiDAR-based mapping

LiDAR-based mapping is a remote sensing technique that uses laser light to calculate distances and create detailed, three-dimensional maps of the Earth’s surface [38]. It involves deploying a LiDAR sensor to gather precise and accurate elevation data, enabling the creation of high-resolution topographic maps, digital terrain models (DTMs), and 3D representations of the environment [39]. LiDAR-based technology is characterized by its high precision and accuracy, rapid data collection, day-and-night operation, high data density and resolution, and ability to create realistic representations of the environment. Like UAS-based photogrammetry, UAS LiDAR-based mapping requires planning the flight mission, considering flight altitude, speed, and overlap settings. A pre-flight check is recommended to ensure that the UAS and the LiDAR sensor are well calibrated. During data collection, the LiDAR sensor emits laser pulses toward the ground, and the system records the time it takes for each pulse to return, from which distances are calculated. The data is then uploaded into software to filter, classify, and organize the point cloud. The final step consists of removing noise, classifying ground and non-ground points, and enhancing data quality.
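The time-of-flight ranging described above reduces to one formula: distance = (speed of light × round-trip time) / 2. A minimal sketch (the return time is an illustrative value, not data from any particular sensor):

```python
# Time-of-flight ranging: a LiDAR range is half the round-trip distance
# travelled by the laser pulse at the speed of light.
C = 299_792_458.0  # speed of light, m/s

def pulse_range(round_trip_ns: float) -> float:
    """Distance in metres for a pulse return time given in nanoseconds."""
    return C * (round_trip_ns * 1e-9) / 2.0

# A return after ~667 ns corresponds to a target roughly 100 m away.
print(round(pulse_range(667.0), 1))  # 100.0
```

Combining each range with the sensor's scan angles and the UAS's GNSS/IMU pose turns these distances into georeferenced points.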


6. Improving UAS photogrammetry-based 3D point clouds

Various methods and techniques can be used to enhance 3D point clouds for DT applications. These methods involve optimizing several aspects of data acquisition, processing, and post-processing. Strategies to improve the accuracy of UAS-derived point clouds include:

Use ground control points (GCPs) to improve spatial accuracy: GCPs are points of known position strategically located in the area of interest. Such points can yield a 3D point cloud accuracy of up to 5 centimeters [40]. Nevertheless, this approach is time-consuming and labor-intensive. In classic photogrammetry, GCPs must be distributed uniformly throughout the whole area, including its center. A study conducted by Sanz-Ablanedo et al. [41] demonstrated that 3D point cloud accuracy depends on the location and the number of GCPs used during reconstruction. GCPs are often used to increase the positioning accuracy of UAS-derived 3D point clouds because commercially available off-the-shelf UASs are commonly equipped with a single-frequency global navigation satellite system (GNSS) (L1) receiver, which limits positioning accuracy, typically to between 1 and 25 meters [42]. Specific construction applications, especially DT-related ones, demand higher precision. To address this, a common approach involves identifying critical features in the 3D point clouds that can be matched to known real-world coordinates using GCPs [40]. Figure 5 presents a scenario in which four GCPs were placed to enhance the 3D point cloud accuracy of the building; the experiment compared the accuracy of 3D point clouds created with and without GCPs.

Figure 5.

Use of GCP to improve the 3D point cloud accuracy.
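Matching model-space features to surveyed GCP coordinates amounts to estimating a similarity transform (scale, rotation, translation). A minimal numerical sketch using the least-squares (Umeyama) solution; the GCP coordinates below are hypothetical:

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares similarity transform (Umeyama) mapping src points
    onto dst points: dst ~ scale * R @ src + t."""
    n = len(src)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    Xc, Yc = src - mu_s, dst - mu_d
    U, d, Vt = np.linalg.svd(Yc.T @ Xc / n)
    s = np.ones(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        s[-1] = -1.0                      # guard against a reflection
    R = U @ np.diag(s) @ Vt
    scale = (d * s).sum() * n / (Xc ** 2).sum()
    t = mu_d - scale * R @ mu_s
    return scale, R, t

# Hypothetical GCPs: surveyed world coordinates, and the same marks as they
# appear in an arbitrarily scaled/rotated/shifted photogrammetric model.
gcp_world = np.array([[0.0, 0, 0], [10, 0, 0], [0, 10, 0], [10, 10, 2]])
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0.0, 0.0, 1.0]])
gcp_model = (gcp_world - [2.0, 1.0, 0.0]) @ Rz.T / 1.05

scale, R, t = fit_similarity(gcp_model, gcp_world)
aligned = scale * gcp_model @ R.T + t
print(np.abs(aligned - gcp_world).max() < 1e-8)  # True: model snaps onto the GCPs
```

Applying the recovered transform to the whole point cloud georeferences it to the surveyed coordinate system.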

Use real-time kinematic (RTK) or post-processing kinematic (PPK) to enhance the geometric accuracy: Unlike GCPs, RTK and PPK methods utilize single-frequency GNSS (L1) or dual-frequency GNSS (L1/L2) receivers, offering positioning accuracy ranging from 1 to 30 centimeters and eliminating the need for GCPs, thus saving time and costs significantly [40, 43]. RTK corrects positioning coordinates during flight, while PPK applies corrections post-flight [44]. Dual-frequency GNSS receivers are preferred over single-frequency ones due to their ability to address ionospheric and tropospheric effects, resulting in higher accuracy in UAS-generated 3D point clouds [45]. RTK technology provides instant access to rover position and solution quality information, facilitating informed decision-making during field operations. It also saves time on post-processing calculations compared to PPK. However, RTK requires a live radio link between the reference station and rover, adding equipment and setup costs and complexity to the UAS payload. Loss of this link poses a risk of losing differentially corrected rover positions. In contrast, PPK operations do not require live links, simplifying logistics and reducing risk.

Use of high-resolution RGB sensors: The UAS’s camera resolution is crucial in determining the accuracy of 3D point reconstruction [46]. A higher camera resolution reduces the ground sample distance (GSD) during data collection. In remote sensing, GSD refers to the area on the ground that each pixel in the image represents [47]. Higher-resolution cameras have smaller pixels, resulting in a smaller GSD. A smaller GSD allows finer details in the scene to be differentiated, leading to more accurate and precise reconstruction of 3D points. In addition, a high camera resolution allows more information and features to be captured in the images. This increased level of detail enables the photogrammetric software to identify and match more distinct points across images, contributing to a more accurate reconstruction of the 3D scene.

Furthermore, higher-resolution images provide more overlap and redundancy in the captured data, allowing for better triangulation and more accurate determination of 3D point positions. This improves measurement precision, especially when determining distances between points in the images, and translates into more accurate 3D coordinates for the reconstructed points.
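The relationship between sensor geometry, flight altitude, and GSD can be made concrete. A minimal sketch; the parameters below (13.2 mm sensor width, 8.8 mm focal length, 5472 px image width) are typical of a 1-inch consumer UAS camera and are used only for illustration:

```python
def ground_sample_distance(sensor_width_mm: float, focal_mm: float,
                           altitude_m: float, image_width_px: int) -> float:
    """GSD in cm/pixel: the ground footprint of one pixel in a nadir image.

    GSD = (sensor width x altitude) / (focal length x image width),
    with a factor of 100 to convert metres to centimetres.
    """
    return (sensor_width_mm * altitude_m * 100) / (focal_mm * image_width_px)

# GSD shrinks (detail improves) as the UAS flies lower:
for alt in (60, 100, 120):
    print(alt, round(ground_sample_distance(13.2, 8.8, alt, 5472), 2))
# ~1.64, 2.74, and 3.29 cm/px respectively
```

The same formula shows why a higher-resolution sensor (more pixels across the same sensor width) lowers the GSD at a fixed altitude.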

Proper flight mission design: A proper flight mission is essential to guarantee an adequate data collection process. Initially, it is critical to define the overlap percentage between images. A higher overlap between images (both along-track and across-track) ensures more redundant information for accurate feature matching and 3D reconstruction, helping to create a denser point cloud [48]. At the same time, it is vital to ensure that the images adequately cover the entire area of interest, as gaps or missing sections can lead to incomplete or inaccurate point clouds. Another important aspect is the camera orientation and angle of view: nadir (straight-down) imagery suits top-down mapping, while oblique imagery provides more detail for vertical structures. The choice depends on the application, but combining both can enhance accuracy and completeness. Flight altitude determines the GSD, which in turn affects the accuracy of the 3D point cloud; lower altitudes with smaller GSD capture finer details but may require more flight lines for complete coverage. Finally, flight patterns, such as a grid or spiral, affect the distribution of images and the redundancy of coverage. A well-designed pattern ensures optimal overlap and helps overcome challenges like occlusions and shadows.
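The overlap and altitude choices above translate directly into flight-line spacing and camera trigger distance. A minimal sketch under simplifying assumptions (nadir imagery, flat terrain; the numeric inputs are illustrative):

```python
def flight_plan(gsd_cm: float, image_w_px: int, image_h_px: int,
                front_overlap: float, side_overlap: float):
    """Image footprint and spacing (metres) for a nadir grid mission.

    gsd_cm                     : target ground sample distance, cm/pixel
    front_overlap, side_overlap: overlap fractions, e.g. 0.8 and 0.7
    """
    footprint_w = image_w_px * gsd_cm / 100           # across-track width, m
    footprint_h = image_h_px * gsd_cm / 100           # along-track height, m
    line_spacing = footprint_w * (1 - side_overlap)   # distance between flight lines
    trigger_dist = footprint_h * (1 - front_overlap)  # distance between exposures
    return line_spacing, trigger_dist

# e.g. 2.7 cm/px with a 5472x3648 image, 80 % front / 70 % side overlap:
spacing, trigger = flight_plan(2.7, 5472, 3648, 0.8, 0.7)
print(round(spacing, 1), round(trigger, 1))  # 44.3 19.7 (metres)
```

Raising either overlap fraction shrinks the spacing, so more flight lines and exposures are needed for the same area.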

Proper camera calibration: The calibration process involves determining a camera’s intrinsic and extrinsic parameters, which helps correct distortions and align the camera’s view with real-world coordinates [49]. Camera calibration enhances 3D point cloud accuracy and quality by correcting distortions, determining intrinsic and extrinsic parameters, aligning coordinates, and reducing systematic errors. It ensures that the 3D reconstruction accurately represents real-world dimensions and relationships, improving depth accuracy and providing reliable spatial information for various applications.
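The distortions that calibration estimates are commonly described by the Brown-Conrady model. A minimal sketch of its forward form (mapping ideal to distorted normalized image coordinates); the coefficient values are illustrative, not from any real camera:

```python
def distort(x: float, y: float, k1: float, k2: float, p1: float, p2: float):
    """Brown-Conrady forward model: where an ideal (undistorted) normalized
    image point actually lands on the sensor, given radial (k1, k2) and
    tangential (p1, p2) coefficients estimated during calibration."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return xd, yd

# With zero coefficients the camera is ideal and points are unchanged:
print(distort(0.1, -0.2, 0, 0, 0, 0))            # (0.1, -0.2)
# Barrel distortion (k1 < 0) pulls off-centre points inward:
xd, _ = distort(0.4, 0.0, -0.2, 0.0, 0.0, 0.0)
print(xd < 0.4)                                  # True
```

Calibration software fits these coefficients from images of a known target; undistorting an image then inverts this mapping (usually iteratively) so that triangulation sees geometrically correct rays.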


7. UAS LiDAR-based 3D point clouds enhancement

Generating 3D point clouds from LiDAR data collected by a UAS involves several steps to improve data quality, accuracy, and interpretability:

  • Noise removal: LiDAR data often contains noise caused by factors such as sensor inaccuracies, environmental conditions, or object interference [50]. Filtering techniques like statistical outlier removal or voxel-based filtering can help eliminate noise points from the point cloud.

  • Point cloud registration: If the LiDAR data is collected from multiple flight passes or from different sensors, aligning and registering the point clouds into a single coordinate system is necessary [51]. Iterative closest point (ICP) algorithms or feature-based registration methods can be used for point cloud registration.

  • Data fusion: Combining LiDAR data with other datasets like imagery or GIS data can provide additional context and improve the point cloud interpretation [52]. Techniques such as sensor fusion or data integration can be employed for this purpose.

  • Colorization: Adding color information to the point cloud can enhance visualization and interpretation [53]. This can be achieved by integrating imagery data collected concurrently with LiDAR or by using image-based methods to colorize the point cloud.

  • Quality assessment and control: Performing quality assessment and control measures to identify and correct errors or artifacts in the point cloud is essential [54]. This can involve visual inspection, comparing against ground truth data, or using automated validation algorithms.
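The noise-removal step above can be sketched with a brute-force statistical outlier removal, the filter named in the list (the parameters `k` and `std_ratio` are illustrative defaults):

```python
import numpy as np

def statistical_outlier_removal(points, k=8, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbours is
    more than `std_ratio` standard deviations above the cloud average.
    Brute force for clarity; production code would use a KD-tree."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)   # column 0 is the self-distance
    keep = mean_knn < mean_knn.mean() + std_ratio * mean_knn.std()
    return points[keep]

rng = np.random.default_rng(1)
cloud = rng.uniform(0, 1, size=(300, 3))                  # dense scene points
cloud_with_noise = np.vstack([cloud, [[50.0, 50.0, 50.0]]])  # one stray return
cleaned = statistical_outlier_removal(cloud_with_noise)
print(len(cloud_with_noise) - len(cleaned))  # the stray return is removed
```

The same neighbour distances can feed the quality-assessment step, flagging regions where point density falls below expectations.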


8. Advantages and disadvantages of using UAS-based photogrammetry and UAS LiDAR-based mapping for DT

UAS-based photogrammetry and UAS LiDAR-based mapping are powerful tools for the real-time digitalization of construction job sites. Nevertheless, factors such as deployment cost, environmental conditions, payload capacity, and flight regulation can significantly influence the impact of such technologies for DT-related applications. On the one hand, UAS-based photogrammetry offers cost-effectiveness, flexibility, and accessibility, with the ability to access remote areas swiftly. It provides high-resolution imagery for detailed mapping and analysis, along with real-time data processing capabilities, facilitating immediate decision-making. Moreover, its rapid deployment capability allows for quick response to changing project needs or emergencies, ensuring near-instantaneous data acquisition. On the other hand, UAS LiDAR-based mapping enables efficient data collection over vast areas, reducing field time. Additionally, LiDAR generates accurate 3D point clouds, allowing for detailed analysis and visualization of terrain features. Its suitability for mapping complex terrain, including urban areas, forests, and rugged landscapes, makes it particularly advantageous where traditional photogrammetry may be limited.

Nevertheless, the application of UAS-based photogrammetry for DT is highly dependent on weather conditions. Unfavorable conditions can introduce errors, distortions, and limitations during data acquisition. To ensure precision, it is crucial to use sensors designed for robustness in diverse conditions, employ appropriate calibration and filtering techniques, and consider environmental factors during data collection [55]. Adequate lighting is essential for precise depth sensing, and issues such as low light or strong shadows can lead to inaccuracies in depth measurements and degrade overall 3D point cloud quality. To address these challenges, controlled lighting setups or sensors designed for varied lighting conditions can be beneficial. Outdoor settings are particularly susceptible to weather conditions affecting data quality. For instance, rain, snow, or fog can scatter or absorb light, reducing sensor range and accuracy. Advanced sensors with features like adaptive filtering or environmental compensation can help mitigate the impact of adverse weather. Moreover, temperature and humidity fluctuations can influence sensor performance, potentially causing thermal expansion or contraction that alters internal geometry. Maintaining stable environmental temperature and humidity is imperative for consistent and accurate 3D data acquisition. Finally, UAS platforms have limited payload capacity, restricting the types of sensors and equipment that can be carried and thus the versatility of UAS-based photogrammetry solutions.

UAS LiDAR-based mapping solutions present challenges, including high cost due to expensive LiDAR sensors and complex, time-consuming data processing that requires specialized expertise. In addition, highly reflective surfaces, such as glass or water, pose challenges for depth sensing, as reflections can lead to inaccurate depth measurements and artifacts in the 3D point cloud. Specialized algorithms or sensors with enhanced capabilities for handling reflective surfaces are essential for achieving precise 3D reconstructions.

For both approaches, UAS flight regulations can significantly impact the quality and accuracy of point clouds generated through photogrammetry or LiDAR. Compliance with regulations is essential to ensure safe and legal UAS operations, and these regulations can influence the data collection process and subsequent data quality [56]. Regulatory restrictions on maximum flight altitudes may affect the resolution of the captured imagery or LiDAR data. Higher altitudes can result in lower point cloud density, potentially reducing the level of detail in the reconstruction. Besides, regulations may impose constraints on flight patterns and the area that can be covered during a single flight. Compliance with these boundaries may impact the completeness and accuracy of the point cloud, particularly for large or complex survey areas. UAS regulations often define no-fly zones around airports or critical infrastructure. These restrictions can limit data collection in certain areas, affecting the overall coverage and accuracy of the point cloud.

Another important aspect of the UAS flight is the operational hours. Guidelines may specify permissible operational hours for UAS flights. Adhering to these timeframes is crucial, as variations in lighting conditions (e.g., sunrise and sunset) can influence image quality and, consequently, point cloud accuracy. Moreover, compliance with safety regulations, including collision avoidance measures, can impact flight paths and speed. Safe flight practices are essential to prevent accidents that could lead to data loss or inaccuracies in the point cloud. Finally, collaborating with relevant authorities and obtaining necessary permits is essential for planning UAS surveys. Understanding and adhering to regulatory requirements during the planning phase is critical for accurate and legally compliant data collection.


9. Conclusions

This chapter provides an overview of the application of UAS-derived 3D models in DT, describing the advantages of and barriers to their use for generating 3D point clouds, and offers insights for enhancing UAS-based 3D point clouds for DT applications. Using UASs as a data collection system in construction provides substantial benefits by reducing the cost and time needed to generate 3D point clouds. The data gathered by UASs provide a high-precision 3D representation and rich spatial information about the physical asset. These characteristics make UASs fundamental tools for DT applications in construction. Two techniques for generating 3D point clouds stand out over traditional mapping techniques. On the one hand, UAS-based photogrammetry is accessible and cost-effective, allows rapid data collection, and supports real-time construction site monitoring. On the other hand, UAS LiDAR-based mapping offers high-precision 3D mapping and efficient data processing and reduces GCP requirements.

Despite the many advantages of both approaches, challenges persist in addressing safety concerns, privacy issues, and regulatory compliance when deploying UASs in the construction industry. This is particularly critical given that a significant number of people are likely to live and work in deployment areas. To address these challenges effectively, frequent routine training sessions and thorough risk assessments should be integral components of any UAS deployment. This proactive approach minimizes the risks of using UASs in the construction domain.

References

  1. Forcael E, Ferrari I, Opazo-Vega A, Pulido-Arcas JA. Construction 4.0: A literature review. Sustainability. 2020;12:9755. DOI: 10.3390/su12229755
  2. Gheisari M, Costa DB, Irizarry J. Unmanned aerial system applications in construction. In: Construction 4.0. United Kingdom: Routledge; 2020
  3. Rakha T, Gorodetsky A. Review of Unmanned Aerial System (UAS) applications in the built environment: Towards automated building inspection procedures using drones. Automation in Construction. 2018;93:252-264. DOI: 10.1016/j.autcon.2018.05.002
  4. Albeaino G, Gheisari M. Trends, benefits, and barriers of unmanned aerial systems in the construction industry: A survey study in the United States. Journal of Information Technology in Construction. 2021;26:84-111. DOI: 10.36680/j.itcon.2021.006
  5. Toth C, Jóźków G. Remote sensing platforms and sensors: A survey. ISPRS Journal of Photogrammetry and Remote Sensing. 2016;115:22-36. DOI: 10.1016/j.isprsjprs.2015.10.004
  6. Pricope NG, Mapes KL, Woodward KD, Olsen SF, Baxley JB. Multi-sensor assessment of the effects of varying processing parameters on UAS product accuracy and quality. Drones. 2019;3:63. DOI: 10.3390/drones3030063
  7. Lopez V, Akundi A. A conceptual model-based systems engineering (MBSE) approach to develop digital twins. In: 2022 IEEE International Systems Conference (SysCon), Montreal, QC, Canada. 2022. pp. 1-5. DOI: 10.1109/SysCon53536.2022.9773869
  8. Tao F, Sui F, Liu A, Qi Q, Zhang M, Song B, et al. Digital twin-driven product design framework. International Journal of Production Research. 2019;57:3935-3953. DOI: 10.1080/00207543.2018.1443229
  9. Xue F, Lu W, Chen Z, Webster CJ. From LiDAR point cloud towards digital twin city: Clustering city objects based on Gestalt principles. ISPRS Journal of Photogrammetry and Remote Sensing. 2020;167:418-431. DOI: 10.1016/j.isprsjprs.2020.07.020
  10. Vergouw B, Nagel H, Bondt G, Custers B. Drone technology: Types, payloads, applications, frequency spectrum issues and future developments. In: Custers B, editor. The Future of Drone Use: Opportunities and Threats from Ethical and Legal Perspectives. The Hague: T.M.C. Asser Press; 2016. pp. 21-45. DOI: 10.1007/978-94-6265-132-6_2
  11. Rao AS, Radanovic M, Liu Y, Hu S, Fang Y, Khoshelham K, et al. Real-time monitoring of construction sites: Sensors, methods, and applications. Automation in Construction. 2022;136:104099. DOI: 10.1016/j.autcon.2021.104099
  12. Ham Y, Han KK, Lin JJ, Golparvar-Fard M. Visual monitoring of civil infrastructure systems via camera-equipped Unmanned Aerial Vehicles (UAVs): A review of related works. Visualization in Engineering. 2016;4:1. DOI: 10.1186/s40327-015-0029-z
  13. Álvares JS, Costa DB, de Melo RRS. Exploratory study of using unmanned aerial system imagery for construction site 3D mapping. Construction Innovation. 2018;18:301-320. DOI: 10.1108/CI-05-2017-0049
  14. Jacob-Loyola N, Muñoz-La Rivera F, Herrera RF, Atencio E. Unmanned aerial vehicles (UAVs) for physical progress monitoring of construction. Sensors. 2021;21:4227. DOI: 10.3390/s21124227
  15. Martinez JG, Gheisari M, Alarcón LF. UAV integration in current construction safety planning and monitoring processes: Case study of a high-rise building construction project in Chile. Journal of Management in Engineering. 2020;36:05020005. DOI: 10.1061/(ASCE)ME.1943-5479.0000761
  16. Ezequiel CAF, Cua M, Libatique NC, Tangonan GL, Alampay R, Labuguen RT, et al. UAV aerial imaging applications for post-disaster assessment, environmental management and infrastructure development. In: 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, USA: IEEE; May 2014. pp. 274-283. DOI: 10.1109/ICUAS.2014.6842266
  17. Albeaino G, Gheisari M, Franz B. A systematic review of unmanned aerial vehicle application areas and technologies in the AEC domain. The Electronic Journal of Information Technology in Construction. 2019;24:381-405
  18. Lin JJ, Han KK, Golparvar-Fard M. A framework for model-driven acquisition and analytics of visual data using UAVs for automated construction progress monitoring. In: Computing in Civil Engineering. 2015. pp. 156-164. DOI: 10.1061/9780784479247.020
  19. Mader D, Blaskow R, Westfeld P, Weller C. Potential of UAV-based laser scanner and multispectral camera data in building inspection. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. 2016;XLI-B1:1135-1142. DOI: 10.5194/isprs-archives-XLI-B1-1135-2016
  20. Zhou Z, Irizarry J, Lu Y. A multidimensional framework for unmanned aerial system applications in construction project management. Journal of Management in Engineering. 2018;34:04018004. DOI: 10.1061/(ASCE)ME.1943-5479.0000597
  21. Unger J, Reich M, Heipke C. UAV-based photogrammetry: Monitoring of a building zone. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. 2014;XL-5(5):601-606. DOI: 10.15488/892
  22. Madubuike OC, Anumba CJ, Khallaf R. A review of digital twin applications in construction. Journal of Information Technology in Construction. 2022;27:145-172. DOI: 10.36680/j.itcon.2022.008
  23. Datta SPA. Emergence of Digital Twins—Is this the march of reason? Journal of Innovation Management. 2017;5:14-33. DOI: 10.24840/2183-0606_005.003_0003
  24. VanDerHorn E, Mahadevan S. Digital twin: Generalization, characterization and implementation. Decision Support Systems. 2021;145:113524. DOI: 10.1016/j.dss.2021.113524
  25. Aengenvoort K, Krämer M. BIM in the operation of buildings. In: Borrmann A, König M, Koch C, Beetz J, editors. Building Information Modeling: Technology Foundations and Industry Practice. Cham: Springer International Publishing; 2018. pp. 477-491. DOI: 10.1007/978-3-319-92862-3_29
  26. Opoku D-GJ, Perera S, Osei-Kyei R, Rashidi M. Digital twin application in the construction industry: A literature review. Journal of Building Engineering. 2021;40:102726. DOI: 10.1016/j.jobe.2021.102726
  27. Boje C, Guerriero A, Kubicki S, Rezgui Y. Towards a semantic Construction Digital Twin: Directions for future research. Automation in Construction. 2020;114:103179. DOI: 10.1016/j.autcon.2020.103179
  28. Barazzetti L, Banfi F, Brumana R, Oreni D, Previtali M, Roncoroni F. HBIM and augmented information: Towards a wider user community of image and range-based reconstructions. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. 2015;40:35-42
  29. Gabor T, Belzner L, Kiermeier M, Beck MT, Neitz A. A simulation-based architecture for smart cyber-physical systems. In: 2016 IEEE International Conference on Autonomic Computing (ICAC). 2016. pp. 374-379. DOI: 10.1109/ICAC.2016.29
  30. Alonso R, Borras M, Koppelaar RHEM, Lodigiani A, Loscos E, Yöntem E. SPHERE: BIM digital twin platform. Proceedings. 2019;20:9. DOI: 10.3390/proceedings2019020009
  31. Lu Q, Chen L, Li S, Pitt M. Semi-automatic geometric digital twinning for existing buildings based on images and CAD drawings. Automation in Construction. 2020;115:103183. DOI: 10.1016/j.autcon.2020.103183
  32. Angjeliu G, Coronelli D, Cardani G. Development of the simulation model for Digital Twin applications in historical masonry buildings: The integration between numerical and experimental reality. Computers & Structures. 2020;238:106282. DOI: 10.1016/j.compstruc.2020.106282
  33. Pan Y, Zhang L. A BIM-data mining integrated digital twin framework for advanced project management. Automation in Construction. 2021;124:103564. DOI: 10.1016/j.autcon.2021.103564
  34. Malihi S, Valadan Zoej MJ, Hahn M. Large-scale accurate reconstruction of buildings employing point clouds generated from UAV imagery. Remote Sensing. 2018;10:1148. DOI: 10.3390/rs10071148
  35. Gini R, Pagliari D, Passoni D, Pinto L, Sona G, Dosso P. UAV photogrammetry: Block triangulation comparisons. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. 2013;XL-1-W2:157-162. DOI: 10.5194/isprsarchives-XL-1-W2-157-2013
  36. Kršák B, Blišťan P, Pauliková A, Puškárová P, Kovanič Ľ, Palková J, et al. Use of low-cost UAV photogrammetry to analyze the accuracy of a digital elevation model in a case study. Measurement. 2016;91:276-287. DOI: 10.1016/j.measurement.2016.05.028
  37. Saadatseresht M, Hashempour AH, Hasanlou M. UAV photogrammetry: A practical solution for challenging mapping projects. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. 2015;XL-1-W5:619-623. DOI: 10.5194/isprsarchives-XL-1-W5-619-2015
  38. Pricope NG, Halls JN, Mapes KL, Baxley JB, Wu JJ. Quantitative comparison of UAS-Borne LiDAR systems for high-resolution forested wetland mapping. Sensors. 2020;20:4453. DOI: 10.3390/s20164453
  39. Li X, Liu C, Wang Z, Xie X, Li D, Xu L. Airborne LiDAR: State-of-the-art of system design, technology and application. Measurement Science and Technology. 2020;32:032002. DOI: 10.1088/1361-6501/abc867
  40. Bolkas D. Assessment of GCP number and separation distance for small UAS surveys with and without GNSS-PPK positioning. Journal of Surveying Engineering. 2019;145:04019007. DOI: 10.1061/(ASCE)SU.1943-5428.0000283
  41. Sanz-Ablanedo E, Chandler JH, Rodríguez-Pérez JR, Ordóñez C. Accuracy of Unmanned Aerial Vehicle (UAV) and SfM photogrammetry survey as a function of the number and location of ground control points used. Remote Sensing. 2018;10:1606. DOI: 10.3390/rs10101606
  42. Yao H, Clark RL. Evaluation of sub-meter and 2 to 5 meter accuracy GPS receivers to develop digital elevation models. Precision Agriculture. 2000;2:189-200. DOI: 10.1023/A:1011429815226
  43. Fazeli H, Samadzadegan F, Dadrasjavan F. Evaluating the potential of RTK-UAV for automatic point cloud generation in 3D rapid mapping. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. 2016;XLI-B6:221-226. DOI: 10.5194/isprs-archives-XLI-B6-221-2016
  44. Tomaštík J, Mokroš M, Surový P, Grznárová A, Merganič J. UAV RTK/PPK method—An optimal solution for mapping inaccessible forested areas? Remote Sensing. 2019;11:721. DOI: 10.3390/rs11060721
  45. Deng Z, Bender M, Zus F, Ge M, Dick G, Ramatschi M, et al. Validation of tropospheric slant path delays derived from single and dual frequency GPS receivers. Radio Science. 2011;46:1-11. DOI: 10.1029/2011RS004687
  46. da Silva Neto JG, da Lima Silva PJ, Figueredo F, Teixeira JMXN, Teichrieb V. Comparison of RGB-D sensors for 3D reconstruction. In: 2020 22nd Symposium on Virtual and Augmented Reality (SVR), Porto de Galinhas, Brazil. 2020. pp. 252-261. DOI: 10.1109/SVR51698.2020.00046
  47. Tahar KN. An evaluation on different number of ground control points in unmanned aerial vehicle photogrammetric block. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. 2013;XL-2-W2:93-98. DOI: 10.5194/isprsarchives-XL-2-W2-93-2013
  48. Biundini IZ, Pinto MF, Melo AG, Marcato ALM, Honório LM, Aguiar MJR. A framework for coverage path planning optimization based on point cloud for structural inspection. Sensors. 2021;21:570. DOI: 10.3390/s21020570
  49. Cramer M, Przybilla H-J, Zurhorst A. UAV cameras: Overview and geometric calibration benchmark. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. 2017;XLII-2-W6:85-92. DOI: 10.5194/isprs-archives-XLII-2-W6-85-2017
  50. Zeybek M, Şanlıoğlu İ. Point cloud filtering on UAV based point cloud. Measurement. 2019;133:99-111. DOI: 10.1016/j.measurement.2018.10.013
  51. Shao J, Yao W, Wan P, Luo L, Wang P, Yang L, et al. Efficient co-registration of UAV and ground LiDAR forest point clouds based on canopy shapes. International Journal of Applied Earth Observation and Geoinformation. 2022;114:103067. DOI: 10.1016/j.jag.2022.103067
  52. Elamin A, El-Rabbany A. UAV-based multi-sensor data fusion for urban land cover mapping using a deep convolutional neural network. Remote Sensing. 2022;14:4298. DOI: 10.3390/rs14174298
  53. Xu J, Yao C, Ma H, Qian C, Wang J. Automatic point cloud colorization of ground-based LiDAR data using video imagery without position and orientation system. Remote Sensing. 2023;15:2658. DOI: 10.3390/rs15102658
  54. Pereira LG, Fernandez P, Mourato S, Matos J, Mayer C, Marques F. Quality control of outsourced LiDAR data acquired with a UAV: A case study. Remote Sensing. 2021;13:419. DOI: 10.3390/rs13030419
  55. Long N, Millescamps B, Pouget F, Dumon A, Lachaussée N, Bertin X. Accuracy assessment of coastal topography derived from UAV images. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. 2016;XLI-B1:1127-1134. DOI: 10.5194/isprs-archives-XLI-B1-1127-2016
  56. Yang Q, Yoo S-J. Optimal UAV path planning: Sensing data acquisition over IoT sensor networks using multi-objective bio-inspired algorithms. IEEE Access. 2018;6:13671-13684. DOI: 10.1109/ACCESS.2018.2812896
