Open access peer-reviewed chapter

Enhancing BIM through Mixed Reality for Facility Management

Written By

Massimo Vaccarini, Alessandro Carbonari, Francesco Spegni and Alberto Giretti

Submitted: 14 June 2022 Reviewed: 17 June 2022 Published: 07 September 2022

DOI: 10.5772/intechopen.106186

From the Edited Volume

From Theory of Knowledge Management to Practice

Fausto Pedro García Márquez and René Vinicio Sánchez Loja

Abstract

Implementation of processes in facility management calls for coordination and collaboration among several actors, each implementing its own sub-process. This goal must cope with several challenges caused by the fragmentation of the AEC (architecture, engineering, construction) industry. In the specific case of facility recommissioning, further constraints are determined by the current status of the existing facility, whose knowledge is often limited to coarse preliminary surveys. In this chapter, the benefits brought by the integration between BIM and mixed reality are presented, along with a prototypical platform that realizes an efficient, distributed collaborative workflow enabling asynchronous collaboration among members of the facility management office, the owner, the design team, and technical specialists that may be appointed in recommissioning workflows. Technically, this approach provides an immersive mixed-reality environment capable of seamlessly displaying project information, through which specialists can evaluate and refine different recommissioning options. In addition, the platform supports on-site enrichment of BIM models for a facilitated, yet asynchronous, collaboration between remote and on-site users. This technology was validated by means of real-life experiments regarding a hypothetical recommissioning project of the Construction Division of the DICEA Department at Università Politecnica delle Marche (Ancona, Italy).

Keywords

  • facility management
  • asset recommissioning
  • BIM
  • AR/MR
  • mixed reality
  • efficiency

1. Introduction

1.1 Facility management

Facility Management is inherently a management discipline that spans a very broad and disparate set of technical-operational domains. Consider, first and foremost, the areas of maintenance, energy efficiency and sustainability, safety and resilience, and the organization and usability of spaces. In addition to these purely technical areas, there is the economic-financial management of both current activities and new construction projects. Finally, at the top of this challenging pyramid is perhaps the most important organizational aspect: the management of the quality of the day-to-day service offered by facilities in relation to their intended use and their ability to adapt to the sudden changes in the operational processes that take place within them. Today, the holistic and managerial view of Facility Management therefore accompanies every phase of the building life cycle and represents a key competence of construction management engineering [1]. Although many studies have highlighted the importance of consistency and interoperability in FM, the processes used in facilities still lack an adequate conceptual framework to support informed and data-driven decision-making. Like any management activity, Facility Management is data intensive. A correct, complete, and timely information flow is of fundamental importance for supporting decision-making processes, extending from medium-term strategic decisions to the real-time management of events [2].

1.2 The BIM promise for the FM revolution

Building Information Modelling (BIM) can play a key role as an operational framework for coordinating FM operations. BIM has the potential to enable the facility manager to manage the complexity of processes through a structured, referenced, and easily accessible data flow [3]. The BIM data interchange structure is based on the official Industry Foundation Classes (IFC) standard. An IFC model can map a complete facility model, topography, and technical systems, including 3D representations, structural design, and all technical features. In its possible extensions, IFC allows, through the COBie MVD [4], handover management and the inclusion of management data, such as maintenance plans, technical specifications, operating instructions, warranty and service contract information, and costs. From an ICT perspective, the use of a BIM-based operational framework in a cloud platform has the potential to integrate and make enriched data (e.g., from energy simulations or predictive maintenance assessments) accessible to all stakeholders [5]. Finally, from a purely operational standpoint, the virtualization of the facility model allows the operational workflow to evolve through the use of new mixed reality techniques [6].

1.3 Integrating BIM into FM workflows

The multifaceted influence of BIM technology on the Facility Management operational framework therefore requires a careful analysis to meet operational requirements. FM mainly concerns assets and processes operating on assets. BIM integration in FM cannot be limited to information mapping (e.g. [7]), because this narrow scope fails to capture the real FM perspective. Assuming a systemic view of FM processes, the requirements analysis must identify the operational context, defined as the set of processes, assets, and services oriented to support one or more well-defined core processes, and on this basis characterize the entire information profile, according to its four fundamental qualities:

  • What information is to be processed,

  • How it is to be stored and retrieved,

  • Where it is to be stored,

  • When it must be stored and used.

In other words, to meet the operational requirements of FM processes, not only must the information content be defined, but also the way it is stored and retrieved, as well as the storage location. Consider, for example, the asset management process in the context of sustainable Facility Management. From a circular economy perspective, a building component, e.g. a window or a doorframe, has a life cycle that goes beyond the simple usage phase. It is in fact produced through processes that tend to minimize the use of resources, used in such a way as to guarantee the best performance in terms of energy, and recovered as far as possible in its materials or sub-components after its disposal from the FM cycle.

Therefore, while the asset’s information life cycle intersects to a large extent with the asset management information profile, it has a much more extended scope, involving both the production and the recovery phases, which are themselves complex processes with highly articulated information structures. This scenario is not purely conceptual. In fact, it has a definite impact on the technology that manages the information. The transition of information from one phase to another involves the transition between different management systems. At the practical level of the construction production system, given the quantity of materials and the diversity and spread of manufacturers, this transition is difficult to manage through communication between different information systems, for the usual reasons of standardization of protocols and accessibility. A more effective solution is to embed the information in the component itself through inexpensive RFID tags [8, 9, 10]. Ultimately, the integration of BIM-based technologies into FM processes requires going beyond the simple mapping of information into the IFC interchange structures and including the entire process workflow, making explicit the specific operations that emerge from the adoption of a particular technology in the operational process, especially for on-site processes.

In the next Section 2, the scientific background concerning the integration between BIM and advanced visualization techniques will be reported, with a specific focus on mixed reality and applications to facility management. Section 3 suggests an efficient cooperative workflow enabling asynchronous interactions between remote and onsite users in the Facility Management context. This is the premise for the development of a prototypical web platform for Asynchronous Collaborative Onsite Survey (ACOS), which is the subject of Section 4, where its architecture, GUIs, alignment, and onsite apps will be reported. Experimental results and remarks about the management of a recommissioning project in a hypothetical real-life test case are then presented in Section 5. Finally, Section 6 ends this chapter with conclusions.

2. Scientific background about mixed reality

The basic technology requirement for Mixed Reality (MR) systems is that they provide a sensory, visually coherent set of stimuli. This is fundamentally different from other competing or complementary technologies [11]. The conventionally held view of a Virtual Reality (VR) environment is one in which the participant-observer is totally immersed in, and able to interact with, a completely synthetic world. Augmented Reality (AR) is characterized by digital content superimposed on the user’s real surroundings; Augmented Virtuality (AV) involves real content overlaid onto the user’s virtual environment.

In Mixed Reality, users are placed in the real world and digital content is totally integrated into their surroundings so that they can interact with both digital and real content, and these elements too can interact [12]. The basic elements of a mixed reality environment are interaction, lighting, objects, and the real environment [11]. They involve both hardware and software issues, such as the technical performances of head-mounted displays and scene reconstruction combining both real and virtual content.

Building Information Modelling is being increasingly used in the construction industry not only to produce n-dimensional, data-rich models but also to promote the use of such models on site. This application can enhance communication among all stakeholders. For this reason, BIM and visualization technologies, such as VR, are integrated to create an immersive environment to be used for assessment tasks. Examples range from site layout and planning to the evaluation of construction scenarios, inspection, and maintenance. Indeed, VR broadens the vision of potential end-users about what output to expect after the completion of the project [13]. The potential of improved visualization enabled by BIM has been discussed from several points of view:

  • interactive MR visualization can connect virtual models and digital planning information based on BIM with the physical building or production site for self-inspection and self-instruction; this means, respectively, that workers on-site can check their own working processes and results in collaboration with others, and that actors on-site can be provided with interactive guidance to prevent incorrect actions, even helping workers to rectify errors immediately, if any occur [14].

  • BIM can provide technical information integrated with management functions supported by computerized maintenance management systems (CMMS), which can facilitate data collection and data entry, as well as visualization when and where needed; more specifically, the technical office can update the inspector with information from the database or can monitor and advise him/her by sharing the inspector’s view; in addition, data collected by an inspector operating on-site, such as defects and their attribution, can be saved in the database and immediately shared with the technical office [15].

  • data from different sources can be linked within a BIM environment, provided that a platform for big data management has been put in place; then, several layers devoted to data extraction, integration, analysis, and man-machine interoperability can facilitate several facility management tasks in which operators are usually involved, such as interacting with asset data in a 3D environment [16].

As an additional application, advanced visualization provided by AR in a BIM environment has been used as a tool to enable untrained individuals (ranging from professionals down to unskilled personnel) to complete construction tasks [17]. These tests found that the use of such a technology can be of great advantage for all the considered categories of workers, with some limitations to the simplest tasks for under-trained individuals.

Overall, mixed reality setups allow the generation of distributed collaborative construction processes, where personnel located at remote sites and equipped with smart see-through glasses cooperate in the construction of a virtual 3D model combining tangible and virtual objects. Such a collaboration environment is characterized by multiple client instances, each of them defining the environment that will run at a remote site. Those instances must communicate via a middleware allowing the flow of three main types of data: real objects incorporated by means of the user’s scene visualization; virtual objects and information that are retrieved from the construction model and relocated to the remote site; and the output of the gesture recognition component, which informs about physical interactions with virtual objects [18, 19]. Another specific application involves a BIM system, coupled with AR and integrated with a location-based management system, to provide context-specific information on construction projects, as well as the evaluation of performance indicators on the progress and execution of construction activities. Such information is displayed within head-mounted displays worn by managers while they are walking through the construction site [20]. The main challenges addressed by the authors are the integration between BIM and the development environment, and the interfacing of positioning systems with the BIM environment.

The target of the solution showcased in the present contribution is to improve the efficiency of planning renovation actions in asset management. This might concern both repairs of built assets and the replacement or modification of building sub-systems and components [21]. More specifically, the data exchange processes described in the next paragraphs can be made more efficient thanks to enhanced collaboration between the technical office and on-site personnel. In order to successfully apply MR technologies in the field of FM, the full integration between reality and virtuality is required. The users’ perception of the real environment can be improved by showing information that users cannot directly acquire otherwise. One essential requirement for using MR technology for on-site FM purposes is the alignment of the virtual BIM model so that it perfectly matches its physical counterpart [22]. Some applications (e.g. Trimble Connect [23, 24]) support model alignment functions. In the remainder of this chapter, four methodological and technical challenges will be tackled. The first one concerns the development of a workflow supporting recommissioning in facility management. The second one reports the technical development of a web platform facilitating collaboration in facility management, which is an extension of some technology developed within the H2020 EU project ENCORE (Id: 820434) [27]. The third challenge presents the details of a technique to align virtual models over real facilities during on-site surveys. Finally, the use of the platform is showcased in the case of a hypothetical recommissioning project.

3. Workflow and information model

In order to fully exploit the potential of novel technologies, it is of paramount importance to have a clear understanding of how they impact existing workflows, what tasks are eased or even automated, and who will benefit from such innovation. In the following, the workflow of Facility Management is described as a collaboration among several actors, each implementing its own process. Next, the activities that are supported by the presented approach are broken down and analyzed in detail.

3.1 Workflow for facility management

Recommissioning can be described as the interaction among three main actors: the client, the facility management office, and the design team. The client is responsible for expressing the project needs and allocating the budget for the project, and has the final word about which design should be accepted among the several alternative ones. The facility management office receives the needs of the client and, through careful analysis, isolates the set of requirements that the recommissioning must or should meet in order to satisfy those needs. Such requirements express both technical constraints or desiderata (e.g. as in the case of seismic or energy retrofits) and non-technical ones (e.g. as in the case of budget constraints or safety regulations).

The facility management office is in charge of managing the overall recommissioning project and coordinating several technical specialists on behalf of the client. Most of the time, indeed, the latter is not able by itself to identify how the many possible technical solutions may impact the ability to meet the desired needs. In the recommissioning workflow, a tight interaction between the facility management office and one or more design teams is needed. For the sake of simplicity, and without loss of generality, in this work it is assumed that there is only one design team responsible for producing the desired design options. Finding the best design, i.e. one able to satisfy the client’s needs, is an iterative process. One attempt is never enough, and thus a sequence of design options is usually produced by the design team and then assessed by the facility management office. This loop involves only the design team and the facility management office, and its purpose is to settle a great number of technical and non-technical details, ruling out solutions that do not fit the client’s budget or that do not meet the client’s needs. Usually, after some attempts, the design team and the facility management office agree that a small set of the produced design options meets all the relevant requirements, and the facility management office decides that they are ready to be presented to the client, which in turn is in charge of selecting the final design (Figure 1).

Figure 1.

Recommissioning workflow.

Even the process of selecting the final appointed design can be iterative, since the client can ask the facility management office and the design team to provide different solutions, thus triggering a new round of generating and assessing design options. If, on the other hand, the presented design options satisfy the client’s expectations, the client selects one of them as the appointed design, thus passing it over to the next steps (e.g. involving a call for tenders, selecting the construction company, executing the designed project, and so on).

The degree and quality of the interaction among the parties involved in the workflow strongly determine the feasibility of the overall recommissioning project and the quality of the final renovated building or infrastructure: on the one hand, each new loop between the client, the facility management office, and the design team may introduce a delay in the project schedule; on the other hand, a superficial assessment of the design options may speed up the project schedule in the first stages at the cost of overlooking significant design flaws that will be caught only at the execution phase, causing increases in project costs and even greater delays in the overall project schedule.

This motivated the definition of the aforementioned ACOS methodology, which assists the parties during two critical steps of the overall recommissioning workflow: the design assessment, involving a tight interaction between the facility management office and the design team, and the final design selection, which involves a tight interaction between the client and the facility management office.

3.2 Workflow about the usage of the information platform

The ACOS platform usage workflow depicted in Figure 2 begins with the facility management office creating a recommissioning project in the platform itself, collecting the requirements elicited by analyzing the project needs expressed by the client. The recommissioning project is then shared with the design team, which in turn is required to produce and upload design options that should meet the given (technical as well as non-technical) requirements.

Figure 2.

The ACOS platform usage workflow.

The design team usually works with a wide range of BIM authoring tools in order to produce design options. The ACOS platform is compatible with all the BIM authoring tools that can export the produced design options as IFC files. For every IFC file, a new design option can be created in the ACOS platform, within the given recommissioning project, and the IFC file is uploaded into the newly created design option. Once the IFC dataset is imported, the platform triggers an automatic conversion service extracting two different pieces of information from the IFC file: on one side, a 3D model representation is saved as a GLTF object, while on the other side a JSON document represents the graph of objects contained in the IFC model (walls, windows, doors, furniture, …) together with their properties and relevant IFC metadata (e.g. the GUID identifying each object). After this stage, the facility management office or any appointed technical specialist can go on-site, wear the MR headset and exploit WiFi and Internet connectivity in order to select the design option that he/she wants to assess directly on site.
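
As an illustration of the conversion step just described, the following Python sketch shows how a service of this kind could extract both artifacts from an uploaded IFC file. It assumes the open-source IfcOpenShell toolkit (the ifcopenshell Python module and the IfcConvert command-line tool) is available; it is not the actual ACOS converter, and file names are placeholders.

```python
import json
import subprocess

import ifcopenshell  # open-source IFC parser (IfcOpenShell project)


def convert_design_option(ifc_path: str, glb_path: str, json_path: str) -> None:
    """Illustrative conversion: produce a GLB/GLTF 3D model plus a JSON
    graph of the IFC products with their GUIDs and basic metadata."""
    # Geometry export: IfcConvert can write glTF/GLB when built with that
    # serializer enabled; the output format is inferred from the extension.
    subprocess.run(["IfcConvert", ifc_path, glb_path], check=True)

    # Object graph export: walk every IfcProduct and keep GUID, type and name.
    model = ifcopenshell.open(ifc_path)
    products = [
        {"guid": p.GlobalId, "ifc_type": p.is_a(), "name": p.Name}
        for p in model.by_type("IfcProduct")
    ]
    with open(json_path, "w", encoding="utf-8") as f:
        json.dump({"products": products}, f, indent=2)


if __name__ == "__main__":
    convert_design_option("design_option.ifc", "design_option.glb", "design_option.json")
```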

Once the on-site user assessing the project has comments to make about one or several components, he/she can point at them, exploiting the capability of the MR headset to interpret user gestures, and record an audio comment as an MP3 file. The file is then uploaded to the web server, wrapped into an annotation, and linked to the GUID of the pointed object within the assessed design option. The assessment activity ends when the facility manager or technical specialist in charge has no more notes to leave.
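
The interaction between the headset app and the web server can be pictured as a simple multipart REST request. The following Python sketch illustrates the kind of call involved; the base URL, endpoint, and field names are assumptions and do not describe the actual ACOS API.

```python
import requests

ACOS_API = "https://acos.example.org/api"  # hypothetical base URL
session = requests.Session()
session.auth = ("onsite_user", "secret")   # placeholder credentials


def upload_annotation(design_option_id: int, ifc_guid: str, mp3_path: str) -> dict:
    """Post an audio comment and link it to the pointed IFC object (GUID)."""
    with open(mp3_path, "rb") as audio:
        response = session.post(
            f"{ACOS_API}/annotations/",  # endpoint name is illustrative
            data={"design_option": design_option_id, "ifc_guid": ifc_guid},
            files={"audio": ("comment.mp3", audio, "audio/mpeg")},
            timeout=30,
        )
    response.raise_for_status()
    return response.json()
```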

Let us underline that, while in medium and big organizations the client is unlikely to show up on the site of the building or infrastructure to be recommissioned, nothing prevents clients from using the same technology in case they want to assess the designed options in person. In this case, the client can wear the same MR headset, add annotations, and contribute to the BIM enrichment process that links extra information to the objects present in each design option.

Remote users from the design team and the facility management office, as well as the client itself, are able to access the web interface of the platform, display the 3D representation of the enriched BIM, and listen to the recorded annotations. The enriched project information model obtained using the ACOS platform becomes a key enabling factor for increasing the quality of interaction among clients, the facility management office, design team members, and appointed technical specialists. This in turn translates into a better recommissioning process and helps intercept design flaws during the early stages of the overall recommissioning project, reducing the cost of fixing them.

4. Technology implementation

4.1 Architecture

The ACOS methodology has been implemented on a platform whose architecture is depicted in the UML component diagram in Figure 3. One of the main purposes of the platform is to ease the collaboration between two types of users: Remote Users and On-Site Users. The former have at their disposal tools to be used on a desktop computer, while the latter can work in the field using wearable devices such as an MR headset and mobile devices such as a smartphone or a tablet. Synchronization among the tools is made possible using the Internet and JSON web services implementing a RESTful API. The latter offers a unified view of the information in the system, allowing all the distributed tools to access the most up-to-date information available.

Figure 3.

Architecture of the ACOS platform.

The components in the server are deployed according to a microservice-based architecture, where each component offers a separate service and runs in a separate environment called a container. This architectural pattern has several known advantages that pay off the added complexity required in the configuration phase.

First, running services in separate containers makes each of them more secure and able to protect its own datasets from intruders or non-authorized users trying to steal sensitive information about the managed projects. Second, the architecture is more scalable and can rapidly migrate from deployment on a single server to deployment on a pool of servers where each component runs on a separate machine, globally achieving higher performance levels. Third, it eases the maintenance of the services themselves over a longer period of time, because each service can be replaced by any better implementation of the same service, as long as the service interface is respected by the new implementation. Fourth, services running in containers generally ensure higher availability to the service clients, because one of the main characteristics of containers is that they can easily be programmed to restart even when critical errors happen, thus limiting the disconnection time for each service and allowing the so-called “five-nines” availability target to be approached, i.e. ensuring that a service is up and responding for 99.999% of the time.

All the aforementioned features are very desirable for cloud-based architectures that may transition from being a prototype in a test environment to being an actual business service in a production environment.
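
For illustration only, a containerized deployment of this kind could be described with a Docker Compose file similar to the sketch below; service and image names are our assumptions and do not reflect the actual ACOS configuration.

```yaml
# Illustrative container layout only: service and image names are assumptions.
version: "3.9"
services:
  web:                        # Django web application and RESTful API
    build: ./web
    depends_on: [db]
    ports: ["8000:8000"]
    restart: unless-stopped   # containers restart automatically after critical errors
  ifc-converter:              # background service turning IFC files into GLB/GLTF + JSON
    build: ./converter
    depends_on: [web]
    restart: unless-stopped
  db:                         # persistent storage for the platform entities
    image: postgres:14
    volumes: ["db-data:/var/lib/postgresql/data"]
    restart: unless-stopped
volumes:
  db-data:
```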

The platform organizes information in objects known as entities that are stored persistently on a database. Each entity has a type, called entity type, and entities are linked among themselves through relations. Figure 4 is a UML Entity-Relationship diagram documenting the core entity types and relations used to implement the platform.

Figure 4.

Entity-Relationship UML diagram of the data model implemented by the platform.

The core entity types are Users, Projects, Design Options, Conversions, Annotations, NFC Tags, and NFC Readings. An instance of User stores basic metadata about the users allowed to access the platform, in order to implement standard authentication and authorization mechanisms (e.g. username, password, first and last names, contact information, …). Projects store information concerning the overall facility management project (e.g. project name, owner, contact information, project begin and end dates, …). Design Options are developed by the design team using their favorite authoring software tools and can be uploaded to the platform in the form of IFC files. Conversions store the 3D object model in the form of a GLB/GLTF file together with a JSON representation of the product objects described in the original IFC file. Both the GLB/GLTF file and the JSON file are generated automatically by the IFC Converter component described in the architecture. Annotations wrap audio files recording the comments of On-Site Users expressing their opinions about specific objects of the IFC model they are assessing. The IFC object is referred to through its GUID, detected when the user points to a virtual object in the AR/MR Headset App. Finally, NFC Tags store information about each tag deployed in the facility that is used to complete the model alignment procedure, while NFC Readings store the serial identifiers of tags scanned using the Mobile NFC Scanner, together with the timestamps of the scan operations. Later, in Section 4.3.1, it is explained in further detail why, for alignment purposes, it is more suitable to rely on NFC tags rather than on other RFID technologies.
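
Since the Web Application component is built with Django (see Section 4.2), one possible way to persist these entity types is sketched below; field names, relations, and cardinalities are our reading of Figure 4 and are not the actual ACOS schema.

```python
from django.db import models


class Project(models.Model):
    name = models.CharField(max_length=200)
    owner = models.CharField(max_length=200)
    begin_date = models.DateField(null=True, blank=True)
    end_date = models.DateField(null=True, blank=True)


class DesignOption(models.Model):
    project = models.ForeignKey(Project, on_delete=models.CASCADE)
    ifc_file = models.FileField(upload_to="ifc/")


class Conversion(models.Model):
    # Assumed one conversion per design option, generated by the IFC Converter.
    design_option = models.OneToOneField(DesignOption, on_delete=models.CASCADE)
    gltf_file = models.FileField(upload_to="gltf/")
    objects_json = models.FileField(upload_to="json/")


class Annotation(models.Model):
    design_option = models.ForeignKey(DesignOption, on_delete=models.CASCADE)
    ifc_guid = models.CharField(max_length=22)   # IFC GUIDs are 22-character strings
    audio = models.FileField(upload_to="audio/")
    created_at = models.DateTimeField(auto_now_add=True)


class NFCTag(models.Model):
    serial_code = models.CharField(max_length=32, unique=True)
    ifc_guid = models.CharField(max_length=22)   # link to the tag object in the BIM


class NFCReading(models.Model):
    tag = models.ForeignKey(NFCTag, on_delete=models.CASCADE)
    read_at = models.DateTimeField()             # timestamp of the scan operation
```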

4.2 Desktop/remote tools

Through their desktops and browsers, Remote Users access the Web Application through a web GUI obtained by combining HTML and Javascript. The Web Application component is realized using the Python Django framework and is deployed on the server. The purpose of the Web Application component is to generate the web GUI that the Browser displays to the user.

The web GUI allows the Remote Users to access the Web Application entities (projects, design options, …). Different users can have different privileges over the platform, and thus they can see or modify different entities, depending on the authorizations they have been assigned. For each entity type in the platform, the user has several operations available: list all the entities of the given type, search and filter the list of entities, create a new entity, change an existing entity, and delete an existing entity. In Figure 5 the Remote Users can see the available entity types they can operate upon, while in Figure 6 the same users can edit a single entity, in this case an Annotation containing an audio comment uploaded by the On-Site User through the AR/MR headset and linked to the IFC GUID the comment refers to.

Figure 5.

The web GUI shows the available entity types to the Remote User.

Figure 6.

The web GUI allows the user to view and edit a single entity (in this example, of type Annotation).
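
The list, search, filter, create, change, and delete operations described above map naturally onto Django’s administration interface. The sketch below shows how the entity types could be exposed through it, assuming the illustrative models introduced in Section 4.1; it is not the actual ACOS implementation.

```python
from django.contrib import admin

from .models import Project, DesignOption, Conversion, Annotation, NFCTag, NFCReading


@admin.register(Annotation)
class AnnotationAdmin(admin.ModelAdmin):
    list_display = ("design_option", "ifc_guid", "created_at")  # columns of the list view
    search_fields = ("ifc_guid",)                               # free-text search box
    list_filter = ("design_option",)                            # sidebar filters


# The remaining entity types can be exposed with the default model admin.
admin.site.register([Project, DesignOption, Conversion, NFCTag, NFCReading])
```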

4.3 On-site tools

4.3.1 The alignment tool

For mixed reality technologies to be successfully applied in the field, full integration between reality and virtuality is required. In MR devices, this integration requires a set of enabling technologies [25, 26], namely: displays, calibration, tracking, registration, and interaction. Among these, the tracking and registration tasks are the most challenging and play a key role in Facility Management. The tracking task is strictly connected to the registration problem, which aims to achieve a precise real-time alignment between virtual and real elements. Especially indoors, due to the absence of an absolute tracking system, there is an unknown offset between the coordinate system of the virtual model and that of the 3D representation of the environment.

Feature recognition could be exploited for tracking and registration tasks when the visible features can be exactly matched with the virtual model, but this is not always the case, for instance during building construction. The approaches currently adopted by commercially available devices, such as the Trimble™ XR10 for Microsoft™ HoloLens, to address the registration issue are semi-manual alignment and marker-based alignment. The first method consists of aligning any two surfaces in the model with the corresponding real ones; then, manual scaling and rotation of a cube are required for fine-tuning. This method produces a rough but acceptable alignment, provided that two conditions are fulfilled: first, the user must be aware of his/her position in the virtual space; second, he/she must be able to visually select any item, even those usually hidden by other holograms. Moreover, such an alignment method can be applied only with reference to virtual elements that have a physical and stable counterpart in the real world. These requirements can be hard to meet in AEC scenarios that continuously evolve (e.g. during work progress) and may not offer stable references. The marker-based alignment method overcomes those issues by aligning the BIM model based on the user position, retrieved by scanning a real and visible marker, e.g. a QR code or a target image, having its virtual replica [23, 24]. Although the correctness of the visual markers’ positions must be verified before each scan, this method ensures a reasonably accurate model alignment.

In this chapter, the Microsoft™ HoloLens 2 device is adopted for addressing all the above MR requirements, except for registration, which is achieved by means of the tracking capability of the MR device together with an offset elimination procedure (which we call model alignment) that must be implemented for each specific model. The developed approach uses RFID tags together with a handheld device (e.g. a smartphone) capable of reading them. We assume such tags have been previously and once and for all embedded in the building (e.g. during construction or during the first on-site survey). They can also be embedded in building components, making them invisible yet persistent throughout the whole building life cycle. In addition, this approach removes the need for the real counterpart of a virtual element, which is compulsory for the manual/semi-manual alignment methods. This improves and generalizes the alignment process both in terms of efficiency and quality of results.

Various RFID technologies can be exploited for model alignment, but MIFARE-type Near Field Communication (NFC) tags (based on ISO/IEC 14443 Type A) have several advantages over others: they can be read only at short distances, thus allowing their location in space to be determined more accurately (within a few centimeters); they do not require batteries, since they are passive and powered by the reader; both the tag and the reader are very small and inexpensive compared to other technologies; furthermore, their packaging can be very strong and durable, thus allowing persistent incorporation within building components. The drawback of the first advantage is the need to know almost exactly where the tags are located in order to read them, but this problem can be overcome by placing them under small notable elements (e.g., near corners) or under nameplates usually attached to certain elements (e.g., doors, equipment, electrical panels, etc.).

Alignment of models with the physical world is performed by visually localizing the handheld device when it reads a tag and by calculating its position relative to the model (Figure 7). The offset between its position in the model and its actual position is used for aligning models over the physical world. When the NFC scanner embedded in the mobile device detects an NFC tag, it sends a request to the RESTful API to save a new NFC reading entity. The latter stores the tag serial code together with the timestamp of the reading event. If the communication with the RESTful API is successful, a target image is shown on the display of the mobile device; this image is recognized by the MR application, which precisely and rapidly localizes it in space by exploiting the capabilities of an AR engine (in our application, the Vuforia™ engine is embedded for this purpose).

Figure 7.

First-person view of the expert while reading an NFC tag.
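
The exchange triggered by a tag reading can be illustrated with a short Python sketch of the request that the Mobile NFC Scanner would issue before displaying the target image; the endpoint and field names are assumptions, not the actual ACOS API.

```python
import datetime

import requests

ACOS_API = "https://acos.example.org/api"  # hypothetical base URL


def register_nfc_reading(serial_code: str) -> dict:
    """Store the serial code and the timestamp of a scan event, as the Mobile
    NFC Scanner would do before the target image is shown on its display."""
    response = requests.post(
        f"{ACOS_API}/nfc-readings/",  # endpoint name is illustrative
        json={
            "serial_code": serial_code,
            "read_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()
```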

Since the detected distance from the observer is highly sensitive to scale variations of the target image, only the observer direction is used for image detection, and the actual distance is measured by exploiting the raytracing capability of the MR headset. The resulting position is then always snapped to the observed surface, regardless of image scaling issues.

The alignment procedure based on NFC tags assumes that at least two NFC tags have been placed in known positions in the BIM model of the building, with their GUIDs as identifiers. The same tags must be registered in the web GUI under NFC Tags in order to link the GUIDs with the serial codes of the corresponding physical tags.

For the sake of simplicity, but without loss of generality, the model is assumed to be horizontal, since the inertial sensors of the MR headset are usually accurate enough to ensure this. Therefore, the remaining degrees of freedom to align the model with reality are just four: three for translation and one for rotation around the vertical axis. This implies that just two reference points are enough to perform the alignment: the first point is used to translate the model and the second one to rotate it.

In the beginning, the loaded model is placed at position $P$ with respect to the right-handed reference system of the MR device. When the image target is visually detected for the first time, its position in space $i_1$ is returned by the MR headset, and the handheld device sends the serial code of the NFC tag to the server to match it with the model and retrieve its GUID and the position $t_1$ of the tag in the model. Since the two positions should overlap, the first alignment operation is a translation of the model by the detected offset, $P = P + (i_1 - t_1)$.

Then, the user looks at the second image target at position $i_2$, which corresponds to the tag at position $t_2$ in the model. The distance vectors between the pair of image targets in reality, $i_{21} \equiv i_2 - i_1$, and the pair of tags in the model, $t_{21} \equiv t_2 - t_1$, are then used to determine the rotation that overlaps the second tag as well. The horizontality assumption allows us to project them onto the horizontal plane and to perform just a rotation around the vertical axis.

By denoting the projections of vectors $i_{21}$ and $t_{21}$ on the horizontal plane with $\bar{i}_{21}$ and $\bar{t}_{21}$, and by defining their unit vectors $\bar{i} \equiv \bar{i}_{21}/\|\bar{i}_{21}\|$ and $\bar{t} \equiv \bar{t}_{21}/\|\bar{t}_{21}\|$ respectively, the rotation matrix that rotates $\bar{i}$ onto $\bar{t}$ by a counterclockwise angle $\theta$ around the vertical axis is

$$R = \begin{bmatrix} c & -s & 0 \\ s & c & 0 \\ 0 & 0 & 1 \end{bmatrix},$$

where $v \equiv \bar{i} \times \bar{t}$ is the cross product of the two unit vectors, with magnitude $s = \|v\| = \sin\theta$, and $c \equiv \bar{i} \cdot \bar{t} = \cos\theta$ is their dot product.

Since the two vectors are applied to a common point $i_1 = t_1$ that is generally not placed at the origin, this pivot point must be moved to the origin before rotation by applying the translation $P = P - t_1$; the model is then rotated with the rotation matrix $R$ (possibly regularized by using the corresponding quaternion), and at the end it is translated back to the original position by $P = P + t_1$. Due to unavoidable uncertainties, the distance between the image targets will always be slightly different from the distance between the tags in the model. Once the plane containing the tags and the plane containing the image targets overlap, the two remaining degrees of freedom are used to fine-tune the position of the model by moving it along this plane so as to minimize the residual distance between the two pairs, $P = P + (i_2 - t_2)/2$.

In order to avoid unnecessary computational burden, in the implementation on the MR device all the transformations are performed on a non-rendering dummy game object and, only at the end of the alignment, they are applied to the building model. In case more than two reference points are detected, the procedure can be repeated by considering the most significant pair of tags (e.g. the last two or the farthest apart). A regression approach can also be implemented for progressively refining the alignment when many tags are considered simultaneously.
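
For clarity, the two-point alignment of Section 4.3.1 can be condensed into the following NumPy sketch, written under the same horizontality assumption. In the real MR app the transformations are applied to the dummy game object inside Unity, so the code below is only a numerical illustration: a z-up frame is assumed (the headset may use a y-up frame), and the rotation is applied so that the model’s tag direction matches the measured direction.

```python
import numpy as np

UP = np.array([0.0, 0.0, 1.0])  # vertical axis; written for a z-up frame (permute for y-up)


def horiz_unit(v):
    """Project a vector onto the horizontal plane and normalize it."""
    h = v - np.dot(v, UP) * UP
    return h / np.linalg.norm(h)


def align_model(P, t1, t2, i1, i2):
    """Two-point alignment sketch.

    P      -- current model position in the device frame
    t1, t2 -- tag positions according to the current model placement
    i1, i2 -- detected image-target positions returned by the headset
    Returns the adjusted model position P and the rotation matrix R to be
    applied to the model about the vertical axis (pivot at the first tag).
    """
    P, t1, t2, i1, i2 = (np.asarray(v, dtype=float) for v in (P, t1, t2, i1, i2))

    # 1) Translation: make the first tag overlap the first image target.
    P = P + (i1 - t1)

    # 2) Rotation about the vertical axis so that the tag direction matches
    #    the measured direction (signed angle; sign depends on the frame).
    t_bar = horiz_unit(t2 - t1)                    # tag direction in the model
    i_bar = horiz_unit(i2 - i1)                    # measured direction in reality
    c = float(np.dot(t_bar, i_bar))                # cos(theta)
    s = float(np.dot(np.cross(t_bar, i_bar), UP))  # sin(theta), signed
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])

    # Rotating about the pivot (the first tag, now at i1) also moves the model origin.
    P = i1 + R @ (P - i1)

    # 3) Fine-tuning: split the residual error at the second tag between the two points.
    t2_after = i1 + R @ (t2 - t1)                  # where the second tag ends up
    P = P + (i2 - t2_after) / 2.0

    return P, R
```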

4.3.2 The MR app

The MR application for Microsoft HoloLens has been developed in the Unity3D 2019 environment with Microsoft Visual Studio 2019 and the Mixed Reality Toolkit (MRTK) for Unity. The on-site app is a client of the ACOS server, which hosts all the information used and produced by it. As reported in Section 4.2, at least one project and one design option must be loaded via the remote tools into the server, and the corresponding “Conversion” entity must be generated.

The assessment task starts as soon as an expert gets on site and establishes Internet connectivity. Then, he/she wears the MR headset, logs into the device and runs the app. At this point a virtual menu appears in front of the user, who can switch across the available alternatives by selecting the desired project (Figure 8a) and, within it, the desired design option (Figure 8b). After the selection of the design option, the MR app queries the RESTful API for the related “Conversion” entity (GLTF object and JSON document). Once the download has been completed, the “Assessment menu” pops up (Figure 8c) for managing the loaded model, which is not yet displayed in the MR app. This menu is made of several virtual buttons that allow the user to manage the model, align it via RFID tags, adjust it manually, or hide/show specific IFC categories of objects. As shown in Figure 8c, by using the buttons in the last two columns of the menu, any components corresponding to pre-defined IFC types can be hidden/shown in order to keep visible only those parts of the renovation options that are useful to perform the assessment task and to hide objects that may obstruct the observer’s view of the model parts to be assessed.

Figure 8.

Main menu to select the project options (a) and design options (b); the virtual menu to navigate on-site (c) and to attach an audio comment to an IFC object (d).

Moreover, every component of the virtual model can be selected by gazing at it and performing an “air tap” gesture. The selected object is highlighted, and a new virtual menu is shown (the “Recording menu” of Figure 8d) that enables the user to record a voice comment as an audio file. Once the recording phase is over, the file is sent to the ACOS server together with the GUID of the object to which it should be attached, and it is shown as an available file on the right side of the “Recording menu” (Figure 8d). As a result, the final assessment of a design option consists of a set of Annotation entities attached to IFC elements of the available design options. Each such entity wraps an audio file that can be played back both on-site by technical specialists, using the MR application, and remotely and asynchronously by the facility manager, using the web GUI.

5. Experimental results

5.1 Case studies

The ACOS web platform can support various types of case studies, several of which have been tested by the authors. Among them, it is worth mentioning:

  1. productivity improvement in maintenance processes, thanks to on-site localization and visualization of components and to the retrieval and display of information relevant for repairs; previous tests regarding a scenario of potential failure of a communication plug in an office room showed that, once the virtual model of the communication system has been aligned over the real asset, an operator using the MR app and aiming at the switchboard can query the model to single out which switch in the board is connected to the failed plug [27];

  2. enabling the asynchronous cooperation between designers and specialists involved in the assessment of a set of alternative renovation projects of residential buildings, which requires the evaluation of some factors that can be assessed only through on-site surveys; in particular, a group of volunteers was tasked with testing how efficient an MR application is in the on-site display of renovation design options regarding a residential building located in Cáceres (Extremadura, Spain); those renovation options were assessed, and some constructability issues were found and fixed in order not to hamper the execution of the renovation works [28];

  3. the real-life tests reported in this chapter, regarding a facility management scenario and carried out in the construction division laboratory of the DICEA Department at the Engineering Faculty of Università Politecnica delle Marche (Ancona, Italy); these tests highlighted the validity of the automatic alignment approach carried out by means of the alignment tool; in addition, appointed technical specialists were shown to be able to check models of recommissioning projects of the asset and to gradually enrich and refine such models by wandering throughout the asset and generating annotations.

More specifically, the last scenario mentioned in the above list concerns the recommissioning of two laboratory rooms of the Construction Division at the DICEA Department (Figure 9a). Such a recommissioning involves three categories of actions:

  • installation of a new air supply system in both rooms and modifications of the ceiling;

  • construction of a new partition that creates a new shared entrance room leading into the two laboratory rooms;

  • installation of two doors as the new entrances into the laboratory rooms.

Figure 9.

Current layout of the two laboratory rooms (a) and renovation model (b), both developed in a BIM authoring software tool.

The changes listed above are also depicted in Figure 9b, where the addition of a new shared entrance with two doors and the new air supply system are visible in the model. Both models have been uploaded on the web platform as shown in Figure 10a.

Figure 10.

Alternative design options uploaded on the web service for each project (a), and first-person view of the operator while wandering in the room and looking at the “Renovation Demo” design option for “UnivPM” project (b).

Comparing the two sides of Figure 9, possible critical issues that might emerge after an on-site assessment relate to conflicts between the new partition and secondary elements of the existing electrical and communication systems (e.g., switches and plugs). Also, the installation of the new modular partition requires connections with the existing prefabricated modular wall panels; these connections must be constrained at specific locations and may intersect other, non-compatible elements. In addition, unacceptable clashes between the new partition, the air supply system, and other systems accommodated in the ceiling panels may be detected. From a functional point of view, the size and relative positions of the doors must be assessed in relation to the actual use of the shared room. In this regard, an immersive view, such as the ones shown in Figures 10b and 11a, is required. In particular, the view in the second picture shows a clash between the virtual air inlet included in the recommissioning and the existing light fixture embedded in the ceiling. Finally, Figure 11b shows how the operator typically interacts with the environment in which he/she is immersed, thanks to the capability of selecting components and enriching them with annotations. In other words, this case study allowed the developers to test the reliability of a step-by-step enrichment of the initial recommissioning model with specialists’ opinions and other information helpful to refine it and converge towards a correct and approved version of the model.

Figure 11.

First-person view of the on-site operator while checking a clash (a) and while recording an annotation with audio comments about the observed clash (b).

5.2 Assessment of results

As described in the previous section, the case study concerns two laboratory rooms of the DICEA Department at the Engineering Faculty of Università Politecnica delle Marche (Ancona, Italy) (see Figure 9a). In this scenario, the dean of the faculty (the client) presents to the facility management office the need to renovate the two rooms by creating a new shared room in front of their entrances, installing two new doors in the new partition, and installing a new air supply system.

The FM office, using the Desktop/Remote tools (see Section 4.2), creates a new project and registers, by means of the web GUI in Figure 5, the two NFC tags already placed inside the building. After an appropriate procedure for selecting the design team, it hands the team the requirements and a reference BIM (Figure 9a). This model also includes the NFC tags embedded in building components; in this case, a pair of tags has been embedded in each partition wall.

After some time, the design team produces a design option (Figure 9b) that is submitted to the facility manager for approval. The latter uploads this option into the previously created project (Figure 5), and this event triggers the IFC conversion into 3D holograms. The facility manager then asks a technical specialist to evaluate the corresponding solution directly on site and provides him/her with a summary of the evaluation task (e.g. drawings with annotations in PDF format) in which tag locations and major renovations are highlighted.

The technical specialist, as soon as possible, goes on site with the MR headset and the smartphone, selects a project among the available ones and then loads the renovation option through the virtual menus depicted in Figure 8a and b. With the support of the drawings, he/she looks for the first NFC tag by sliding the smartphone over the wall; when the smartphone displays a target image, the MR app surrounds it with a transparent blue square (Figure 7) and the user makes a tap gesture on it in order to lock the tag and store its position. The same procedure is then repeated for the second tag (see Section 4.3.1).

After the second tag is locked, the holograms of the design option are automatically displayed superimposed on reality and the technical specialist is in an immersive view (Figure 10b). Wandering inside the building, the user checks the location and appearance of the new partition wall and door inside each room. He also checks that the new furniture does not interfere with the door opening space and the wall plugs, which are not modeled in the IFC file. Then, he hides the false-ceiling holograms (by air tapping the corresponding button of the virtual menu in Figure 8c) to uncover and check the new ventilation system above it. He notices that a ventilation outlet partially overlaps an existing light fixture (Figure 11a) and decides to leave an annotation for the facility manager. By making an air-tap gesture on the outlet, this object is highlighted and the recording menu is displayed (Figure 11b); then the operator selects the microphone button and starts talking.

After some time, the technical specialist decides to close the assessment, as no problems with the other elements are detected. The facility manager, notified by the specialist of the conclusion of the assessment, opens the Desktop/Remote tool (Figure 5), selects the “Annotation” entities, and checks the results of the assessment. By exploiting the embedded viewer, the user can browse the construction elements with annotations and listen to the audio comments by using the multimedia controls and the model viewer in Figure 6. The facility manager then asks the design team to fix the problem and to provide a new solution that could eventually be the object of a new assessment.

When the design options are finalized, the client itself can ask for an on-site survey in order to better evaluate the proposed solutions and to make a more informed final decision.

It is noteworthy that, as is often the case, due to the lack of information at the design stage, the problem was only discovered on site. In fact, many elements of the building, such as lighting fixtures, plugs, etc., are fully defined only as built and are subject to many changes during the building’s life cycle. Since time and resources are always limited, it is not possible to capture all the details of the existing building during preliminary surveys. This case study demonstrates that such situations are successfully handled by the ACOS platform, avoiding future constructability problems in the planned renovation and improving the final result in terms of execution time and quality.

6. Conclusions

One of the tasks of facility managers is the adaptation of facilities to the sudden changes they may undergo over their everyday operational processes. This may trigger a recommissioning process that involves several parties, each implementing its own process, all steered by the facility management office. In order for this process to be successful, the ACOS platform implements mixed reality technology in a BIM environment to generate distributed collaborative construction processes and to ensure a timely information flow and enhanced collaboration among the parties. Technically, an information platform and an MR application supporting the most critical steps of the workflow were developed. The platform enables asynchronous collaboration between remote users (e.g. facility manager and owner) and on-site users (e.g. technical specialists). Thanks to the mixed reality app, appointed technical specialists can work in an immersive environment to enrich and refine BIM recommissioning projects directly on-site. The outcomes of this task then constitute the basis for the decision-making process carried out by the manager and the owner. Validation tests showed that the ACOS platform allows a better assessment process for the recommissioning workflow; it helps intercept design flaws at the early stages of the workflow and, as a consequence, it reduces the effort and costs involved in the recommissioning process.

Funding

This work was partially supported by the EU H2020 ENCORE Project (Grant Agreement No. 820434).

Thanks

The authors wish to thank the master’s students Eng. Eleonora Monaldi and Eng. Filipe Mate, who contributed to the technical setup of validation tests.

References

  1. Brunet M, Motamedi A, Guenette LM, Forgues D. Analysis of BIM use for asset management in three public organizations in Québec, Canada. Built Environment Project and Asset Management. 2019;9(1):153-167
  2. Patacas J, Dawood N, Kassem M. BIM for facilities management: A framework and a common data environment using open standards. Automation in Construction. 2020;120:103366
  3. Pishdad-Bozorgi P, Gao X, Eastman C, Self AP. Planning and developing facility management-enabled building information model (FM-enabled BIM). Automation in Construction. 2018;87:22-38
  4. COBie MVD. 2013. Available from: http://docs.buildingsmartalliance.org/MVD_COBIE/
  5. Matarneh ST, Danso-Amoako M, Al-Bizri S, Gaterell M, Matarneh R. Building information modeling for facilities management: A literature review and future research directions. Journal of Building Engineering. 2019;24:100755
  6. Jurado D, Jurado JM, Ortega L, Feito FR. GEUINF: Real-time visualization of indoor facilities using mixed reality. Sensors. 2021;21:4
  7. Marmo R, Polverino F, Nicolella M, Tibaut A. Building performance and maintenance information model based on IFC schema. Automation in Construction. 2020;118:103275
  8. Jaselskis EJ, El-Misalami T. Implementing radio frequency identification in the construction process. Journal of Construction Engineering and Management. 2003;129(6):680-688
  9. Tzeng CT, Chiang YC, Chiang CM, Lai CM. Combination of radio frequency identification (RFID) and field verification tests of interior decorating materials. Automation in Construction. 2008;18(1):16-23
  10. Naranje V, Swarnalatha R. Design of tracking system for prefabricated building components using RFID technology and CAD model. Procedia Manufacturing. 2019;32:928-935
  11. Collins J, Regenbrecht H, Langlotz T. Visual coherence in mixed reality: A systematic enquiry. Presence: Teleoperators and Virtual Environments. 2017;26(1):16-41
  12. Milgram P, Kishino F. A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems. 1994;77(12):1321-1329
  13. Muhammad AA, Yitmen I, Alizadehsalehi S, Celik T. Adoption of virtual reality (VR) for site layout optimization of construction projects. Teknik Dergi. 2020;31(2):9833-9850
  14. Riexinger G, Kluth A, Olbrich M, Braun JD, Bauernhansl T. Mixed reality for on-site self-instruction and self-inspection with building information models. In: Procedia CIRP, 51st CIRP Conference on Manufacturing Systems. 2018
  15. Ammari KE, Hammad A. Collaborative BIM-based markerless mixed reality framework for facilities maintenance. In: Issa R, Flood I, editors. Computing in Civil and Building Engineering. Orlando, Florida: ASCE; 2014. pp. 657-664
  16. Farghaly K, Abanda H, Vidalakis C, Wood G. BIM big data system architecture for asset management: A conceptual framework. In: Proceedings of the Joint Conference on Computing in Construction (JC3). Heraklion, Greece; 2017. pp. 289-296
  17. Chalhoub J, Ayer SK, Ariaratnam ST. Augmented reality for enabling un- and under-trained individuals to complete specialty construction tasks. Journal of Information Technology in Construction. 2021;26:128-143
  18. Blank C, Eckhoff M, Petersen I, Wege R, Wendholt B. Distributed collaborative construction in mixed reality. In: Proceedings of ACHI. Lisbon, Portugal; 2015
  19. El Ammari K, Hammad A. Remote interactive collaboration in facilities management using BIM-based mixed reality. Automation in Construction. 2019;107:102940
  20. Ratajczak J, Riedl M, Matt DT. BIM-based and AR application combined with location-based management system for the improvement of the construction performance. Buildings. 2019;9(5):118
  21. Cotts D, Roper K, Payant R. The Facility Management Handbook. New York, NY: AMACOM; 2010
  22. Huang Y. Evaluating mixed reality technology for architectural design and construction layout. Journal of Civil Engineering and Construction Technology. 2020;11(1):1-12
  23. Trimble Inc. Aligning a Model to Site in Trimble Connect for HoloLens. 2020. Available from: https://www.youtube.com/watch?v=AUfv2dRGcNc [Accessed: May 27, 2022]
  24. Trimble Inc. Aligning Models Using Markers in Trimble Connect for HoloLens. 2020. Available from: https://www.youtube.com/watch?v=n5TKyeQ8QOU [Accessed: May 27, 2022]
  25. Azuma R, Baillot Y, Behringer R, Feiner S, Julier S, MacIntyre B. Recent advances in augmented reality. IEEE Computer Graphics and Applications. 2001;21(6):34-47
  26. Costanza E, Kunz A, Fjeld M. Mixed reality: A survey. In: Lalanne D, Kohlas J, editors. Human Machine Interaction. Lecture Notes in Computer Science, vol. 5440. Berlin, Heidelberg: Springer; 2009. pp. 47-68
  27. Naticchia B, Vaccarini M, Corneli A, Messi L, Carbonari A. Leveraging extended reality technologies with RFID to enhance on-field maintenance of buildings. In: Proceedings of the 38th International Conference of CIB W78. Luxembourg; 2021. pp. 378-387
  28. Carbonari A, Vaccarini M. Pictures and Videos Collected during ODAVS Activity. Zenodo; 2022. Available from: https://doi.org/10.5281/zenodo.6531860 [Accessed: May 27, 2022]
