The human exploration of the Universe is a genuine challenge for both the scientific and engineering communities. The space technology developed so far has allowed scientists to achieve outstanding results (e.g. missions around and landings on the Moon, the International Space Station as an outpost of human presence, satellites and spaceships investigating and exploring Solar System planets as well as asteroids and comets), but further steps are required both to overcome existing problems and to attain new and exceptional goals. One of the harshest difficulties is the operative environment in which astronauts and rovers have to work. Indeed, outer space and extra-terrestrial planets have such different physical properties with respect to Earth that space machinery has to be conceived accordingly and crews have to be suitably trained to adapt to them. Nevertheless, the entire product assembly, integration and test campaign is carried out on Earth at 1 g. Given such different ambient conditions, each phase in the whole life cycle of a space product is thorny and should therefore be carefully engineered. In particular, the testing and operative phases involve the most risk because of the different environmental conditions the product will face. Micro- or zero-gravity environments cannot be found on Earth and are hard to reproduce practically and realistically. In the past, astronaut training relied on parabolic flights and underwater conditions, which led to some limited success, but their drawbacks, especially costs and hazards, exceeded the possible benefits; nowadays they have only a limited use.
The outstanding development of computer science techniques such as virtual and augmented reality has provided an interesting way to deal with such problems. Indeed, computer simulation of (portions of) real worlds is currently the best practice for feeling one's concrete presence in a totally different environment without being there physically. The high realism, immersion and ease of interaction of Virtual Reality (VR in the following) are the key ideas behind such applications. Realism means faithfully reproducing those environments, and not only from a graphical point of view: physical/functional simulation of the presence, behavior and (mutual) interaction of objects is fundamental too, especially for those disciplines (e.g. electrical, thermal, mechanical) heavily based on ambient reactions. Immersion enhances the perception of the new environment and allows users (e.g. astronauts, engineering disciplines, manufacturing) to behave as if it were their real world. Finally, interaction is the user's capability to communicate with the simulation: the easier it is, the more effective and expressive the experience; the more intuitive, the less time both specialist and unskilled users need to practice it; the more suitable, the better the available VR capabilities are exploited.
The space industry can largely benefit from the virtual simulation approach. In this context, the main help for aerospace disciplines lies in improving the mission planning phase: virtual simulation allows realistic digital mock-up representations, supports collaborative multidisciplinary engineering tasks, and simulates both critical ground and flight operations. But benefits can arise in a number of other ways too. For instance, due to the one-of-a-kind nature of space products, the only product available on ground after the spacecraft launch is its digital representation. Second, complex scientific data can be successfully represented by VR applications. Indeed, data belonging to the astrophysics and space mission domains are usually very hard to understand in all their relationships, essence and meaning, especially those describing invisible properties such as radiation. In that sense, a suitable graphical representation can help scientists (and non-specialized audiences too) improve their knowledge of those data. Finally, VR laboratories can be organized to host virtual training of human crews, by exploiting their capability for direct interaction and physical behavior simulation ().
Thales Alenia Space – Italy (TAS-I from now on) experience in virtual reality technologies is mainly focused on considerably enhancing the use of such tools. Two main research branches can be found there: user interaction with the virtual product/environment, and management of the data cycle (that is, from data production to their exchange among engineering teams). In the former case, the research is devoted to virtual reality technologies themselves, with an emphasis on the way different scenarios and large amounts of data are visualized, while in the latter case the focus is on system data modeling. When put together, they shall converge towards a complex system architecture for collaborative, human and robotic space exploration (see  for a more detailed insight). Our vision entails a unique framework to enforce the development and maintenance of a common vision of such a complex system. Therefore, recent advances in the entertainment and games domains coexist alongside the most up-to-date methodologies to define the most complete and reliable Model-Based System Engineering (MBSE) approach. This multidisciplinary view shall have an impact on the way each actor conceives its own activity. For instance, engineering activity will benefit from such a representation because the big picture could be at its disposal at any level of detail; it should be easier to prevent possible problems by detecting weak and critical points; it should improve the organization of the entire system and check whether all the mandatory requirements/constraints are met. Astronauts themselves find this a worthwhile experience to gain skills and capabilities, especially from a training viewpoint. Finally, scientific missions could be planned more carefully because the simulation of several scenarios requires a fraction of the time, can be easily customized through a suitable set of parameters, and provides valuable feedback in several forms (e.g. simulation data and sensory perceptions). Collecting and analyzing data and information from such simulations can help diminish crash and failure risks and consequently increase the chance that the mission targets will really be achieved. For all the aforementioned reasons, the pillars of our research policy comprise, but are not limited to: concurrent set-up and accessibility; several elements of 4D (space + time), 3D and 2D features for data manipulation and representation; exploitation of immersive capabilities; ease of interfacing with highly specialized tools and paradigms; user-friendly capabilities; adaptability and scalability to work in several environments (e.g. from desktop workstations to CAVEs).
This chapter is organized as follows: Sections 2 and 3 introduce the space domain context and the current state of the art of VR systems in this field, focusing especially on requirements, the key points of view of our research and the objectives we aim to meet. Progress in the VR field, focused on collaborative features and the interdisciplinary approach put into practice, is described in Section 4. Section 5 is instead targeted at modeling complex space scenarios, while in Section 6 some practical examples of space applications are given. The chapter ends with Section 7, illustrating both some final remarks and a possible road-map for further improvements and future work.
2. Motivation and goals
The aerospace domain embraces an enormous variety of scientific and engineering fields and themes. Given the intrinsic complexity of the matter, space challenges can be successfully tackled only when all the inputs from those disciplines are conveniently blended together in a collaborative way. The lack of both a common vision and suitable management tools to coordinate so many subjects can indeed limit the sphere of activity and the incisiveness of research in the space sciences. Computer science in general, and VR in particular, could play an indispensable role in helping space scientists take a significant qualitative leap.
To substantiate the previous assessment, we chose to discuss a specific theme from a practical point of view: the simulation of hostile environments. In this context, the term "hostile" refers, without loss of generality, to those places where either permanent or temporary human presence is extremely difficult because of harsh physical conditions. That is certainly the case for outer space and extra-terrestrial planets, but the definition can be extended to some environments on Earth too. In the latter case, it could even denote built-up areas after an exceptional event, such as a natural disaster, that are temporarily unreachable, with limited communication links or an altered topography. The virtual reproduction of such environments is a particularly interesting activity from several points of view. To that end, three main steps can be outlined. For each of them, examples will be discussed in detail aiming at presenting our approach.
3. State of the art
The use of virtual reality (VR) and immersive technologies for design, visualization, simulation and training in support of aerospace research has become an increasingly important medium in a broad spectrum of applications, such as hardware design, industrial control and training for aerospace systems or complex control rooms. VR applications provide a panorama of unlimited possibilities for remote space exploration, and their flexibility and power can impact many aspects of future space programs and missions. Modeling and interactive navigation of virtual worlds can provide an innovative environment, which can be thought of as an excellent medium for brainstorming and the creation of new knowledge, as well as for synthesizing and sharing information from a variety of sources. Moreover, virtual worlds can serve as a platform to carry out experiments with greater flexibility than those conducted in the real world. In this section, a review is given of the projects and works related to the use of VR in the fields of (1) planet rendering (see Section 3.1), (2) remote space exploration (see Section 3.2) and (3) virtual prototyping (see Section 3.3).
3.1. Planet rendering
Recently Google Earth, one of the most popular works related to the visualization of the terrestrial environment in 3D, has enabled users to fly virtually over the Mars and Moon surfaces, providing a three-dimensional view that aids public understanding of space science. Moreover, it has given researchers a platform for sharing data similar to what Google Earth provides for Earth scientists. The Mars mode includes global 3D terrain, detailed maps of the Mars rover traverses and a complete list of all satellite images taken by the major orbital cameras. Likewise, the Moon mode includes global terrain and maps, featured satellite images, detailed maps of the Apollo surface missions and geologic charts. Similar purposes and results are achieved by a number of 3D astronomy programs and planetarium software packages. Their limited 3D modeling capabilities are their major drawback; nonetheless, their usefulness in terms of public outreach has been definitively demonstrated by the increasing interest of the public audience in space exploration.
In any case, such tools are somewhat limited in providing suitable services to the space industry. The importance of supporting the work of scientists and engineers with highly specialized, immersive facilities is a milestone at the Jet Propulsion Laboratory and is clearly described, among others, in . In this paper, the authors remark on the contribution of 3D Martian soil modeling to the success in accurately planning the Sojourner rover's sorties during the Mars Pathfinder mission. The need for a well-structured and comprehensive reproduction of the large amount of data collected by Mars probes (especially the Mars Pathfinder and Mars Global Surveyor missions) brought researchers to lay stress on VR coupled with astronomy and cartography applications (). Indeed, the frontiers of knowledge can achieve unprecedented results when coupled with tailored VR tools: new research directions spent their efforts both to increase the overall visual quality of the virtual scenes (e.g. see ) and to improve the user's interaction with those VR facilities (e.g.  and ). In particular, the first real immersive environment is the one described in Head's work . His ADVISER system (Advanced Visualization in Solar System Exploration and Research) was conceived as a new form of problem-solving environment, in which scientists can directly manipulate massive cartographic data sets, represented as 3D models. Its novelty lay in integrating hardware and software technologies into a very powerful corpus, able to extend and improve scientists' capabilities in analyzing such data as if they were physically on the planet's surface. On the other hand, a first attempt to place virtual and augmented reality tools side by side was described in . In order to enrich the users' experience, the authors created MarsView, adding a force feedback device to a topographic map viewer.
Thus, the haptic interface favors a more intuitive 3D interaction in which physical feeling allows users to actually touch the Martian surface as they pan around and zoom in on details. The golden age of Mars exploration from the late 1990s onward has generated an impressive mass of data, whose main challenge lies in the analysis tools. In this sense, the examples above illustrate how it can be efficiently faced by exploiting simulation and interaction capabilities. Nowadays, these are considered indispensable winning points for saving time, for effectiveness, and for grasping complex interactions and relationships.
3.2. Virtual remote space exploration
Interactive 3D computer graphics, virtual worlds and VR technology, along with computer and video game technology, support the creation of realistic environments for such tasks as dock landing and planetary rover control, and for an effective simulation of the space-time evolution of both the environment and the exploration vehicles. In  the major characteristics of the available virtual worlds are described, along with the potential of virtual worlds for remote space exploration and other space-related activities. Here, a number of NASA sponsored activities in virtual worlds are described, like 'NASA CoLab Island' and 'Explorer Island' in Second Life (the latter providing spacecraft models and a Mars terrain surface model based on real NASA data), 'SimConstellation', which explores a broad range of lunar mission scenarios, and 'SimStation', which simulates the operation of the ISS and trains astronauts to work on the space shuttle and space station. This work also describes some tools for virtual space activities, including Google Mars 3D and Google Moon. Landing on planets and their subsequent exploration in space missions requires precise information about the landing zone and its surroundings. The use of optical sensors mounted on the landing unit helps to acquire data about the surface during descent. The retrieved data enable the creation of navigation maps suitable for planetary exploration missions executed by a robot on the surface. In  a Virtual Testbed approach is used to generate close-to-reality environments for testing various landing scenarios, providing artificial descent image test data with a maximum of flexibility for landing trajectories, sensor characteristics, lighting and surface conditions.
In particular, a camera simulation is developed, including a generic camera model described by a set of intrinsic parameters and distortions; moreover, further camera effects like noise, lens flare and motion blur can be simulated, along with the correct simulation of lighting conditions and reflection properties of materials in space. Since these images are generated algorithmically, the known data in the Virtual Testbed can be used for ground truth verification of the map-generation algorithms. The work in  describes Human Mars mission planning based on the Orbiter space flight simulator, where the authors used Orbiter to create and investigate a virtual prototype of the design reference mission known as 'Mars for Less'. The Mission Simulation Toolkit (MST)  is a software system developed by NASA as part of the Mission Simulation Facility (MSF) project, which was started in 2001 to facilitate the development of autonomous planetary robotic missions. MST contains a library that supports surface rover simulation, including features like simulation setup, steering and locomotion control of the rover, simulation of the rover/terrain interaction, power management, rock detection and graphical 3D display. In another work carried out by the NASA Ames Research Center , visualization and surface reconstruction software for Mars Exploration Rover science operations is analyzed and described. It is based on a 'stereo pipeline', a tool that generates accurate and dense 3D terrain models with high-resolution texture mapping from stereo image pairs acquired during the Mars Exploration Rover (MER) mission. With regard to lunar environment modeling, a realistic virtual simulation environment for a lunar rover is presented in , where the fractional Brownian motion technique and real statistical information have been used to model the lunar terrain and stones, forming a realistic virtual lunar surface whose main features can be easily expressed as simulation parameters.
In this work a dynamics simulation model is developed, considering the mechanics of wheel-terrain interaction and the articulated body dynamics of the lunar rover's suspension mechanism. A lunar rover prototype has been tested in this environment, including its mechanical subsystem, motion control algorithm and a simple path planning system.
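To give a flavor of the fractional Brownian motion technique mentioned above, the following minimal sketch generates a 1-D terrain height profile by midpoint displacement, a classic approximation of fractional Brownian motion. It is purely illustrative and not the method of the cited work; the function name, parameters and roughness value are our own assumptions.

```python
import random

def fbm_profile(n_levels=8, roughness=0.5, seed=42):
    """Generate a 1-D terrain height profile by midpoint displacement,
    a common approximation of fractional Brownian motion.
    roughness in (0, 1): smaller values give smoother terrain."""
    rng = random.Random(seed)
    heights = [0.0, 0.0]          # flat profile endpoints
    amplitude = 1.0
    for _ in range(n_levels):
        new = []
        for a, b in zip(heights, heights[1:]):
            # Displace each midpoint by a random amount at this scale.
            mid = (a + b) / 2.0 + rng.uniform(-amplitude, amplitude)
            new.extend([a, mid])
        new.append(heights[-1])
        heights = new
        amplitude *= roughness    # smaller displacements at finer scales
    return heights

profile = fbm_profile()
print(len(profile))  # 2**8 + 1 = 257 samples
```

In a real terrain generator the same idea is applied in 2D (e.g. the diamond-square algorithm), with the roughness parameter tuned against the statistical properties of the measured lunar surface.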
3.3. Virtual prototyping
Prototypes or mock-ups are essential in the design process . Generally a mock-up is a scale model, more frequently full size, of a product. It is used for studying, training, testing and manufacturability analysis. Prototyping, that is, the use of mock-ups for designing and evaluating candidate designs, can occur at any stage of the design process. At a later stage, mock-ups are completed in every detail and can be used for testing ergonomic aspects. However, physical prototypes can be expensive and slow to produce, and can thus lead to delays in detecting potential problems or mismatches in the solution under development.
Computer science offers the opportunity to reduce or replace physical prototypes with virtual prototypes (VP). A VP is a computer-based simulation of a physical prototype, with a comparable degree of functional realism but with the potential to add extra functionality. By using VPs, different design alternatives can be immediately visualized, allowing users to give real-time feedback about the design alternatives and their use. Furthermore, changes to the solutions can be made interactively and more easily than with a physical prototype, which means that more prototypes can be tested in a fraction of the time and at a fraction of the cost otherwise required. The last feature is particularly crucial for the development of 'one-of-a-kind' or 'few-of-a-kind' products.
The use of VR can contribute to taking full advantage of virtual prototyping. In order to test the design optimization of a VP in the same way as a physical mock-up, a human-product interaction model is required. Ideally, the VP should be seen, heard and touched by all the people involved in its design, as well as by the potential users. In this scenario VR plays a meaningful role, since it allows different alternative solutions to be evaluated and compared in quite a realistic and dynamic way, for instance using stereoscopic visualization, 3D sound rendering and haptic feedback. Therefore VR provides a matchless and more realistic interaction with prototypes than is possible with CAD models .
By using VR tools, not only aesthetic but also ergonomic features can be evaluated and optimized. There are several approaches to ergonomic analysis in a VR scenario. The first involves a human operator interacting with the virtual environment through haptic and/or tactile interfaces; the second is based on human virtual models that interact with the VP, as a pure simulation technique. These human virtual models can be agents, which are created and controlled by the computer, or avatars, controlled by a real human.
4. VR for collaborative engineering
Model Based System Engineering (MBSE) is the term currently used to denote the transition from system data management through documents (e.g. specifications, technical reports, interface control documents) to standards-based, semantically meaningful models that can be processed and interfaced by engineering software tools. MBSE methodologies enable a smoother use of VR in support of engineering teams, which represents one of its most interesting applications.
The core of an MBSE approach is the so-called system model, that is, the collection of different models representing one of the possible baselines of the product and formally describing its different characterizing features throughout the product life cycle. In particular, MBSE provides a consistent representation of data from the system requirements to the design and analysis phases, and finally to the verification and validation activities. With respect to a more document-centric approach, the different characteristics of a product are defined more clearly, from its preliminary definition up to a more detailed representation. This shall ensure less sensitivity to errors than the traditional document-centric view still widely used for system design. MBSE methodologies have demonstrated the capability to manage system information more efficiently than existing approaches. This brings advantages that draw attention particularly because of their commercial implications. Indeed, over the last decade many industrial domains have been adopting a full-scale MBSE approach in their research, developments and applications, as demonstrated by the initiatives of INCOSE (International Council on Systems Engineering, ) in that sense. There is no unique way to approach MBSE. The main discriminating factor is the definition of concepts as a semantic foundation derived from the analysis of the system engineering process. The resulting conceptual data model shall be able to support product and process modeling, with a particular emphasis on the data to be exchanged during the engineering activities, considering both people and computer tools. The selection or definition of the modeling and notation meta-models is specific to the needs of a particular domain, and even to an engineering culture, but it shall be compatible with current efforts, so as to assure compatibility between tools and companies.
A joint team from TAS-I and Politecnico di Torino is currently involved in research focusing on the latest developments in this domain, with a particular emphasis on active participation in the related European initiatives. Worthwhile experiences include the Concurrent Design Facilities for the preliminary phases (led by ESA experience in its CDF , but also in the ASI CEF&DBTE  and in industrial practices inside TAS-I) and the ongoing ESA Virtual Spacecraft Design study for more advanced phases . The current developments aim to consolidate the above-mentioned initiatives, keeping in line with the ongoing standardization and language definition efforts (e.g. ECSS-E-TM-10-25, ECSS-E-TM-10-23 (), OMG SysML , Modelica ). The definition of a system model generally involves several engineering disciplines more deeply than the traditional approach does. The project team is composed of experts belonging to engineering and/or scientific areas that are very different from one another. In this context VR definitely becomes a useful tool for managing the available data, providing the technology necessary for effective collaboration between different disciplines. VR allows the direct viewing of data and information that are often difficult to read for those who may not have a technical background but who are otherwise involved in the design process of a given system.
The MBSE methodology is commonly characterized by the definition of all the processes, methods and tools that support and improve the engineering activities. In particular, it is possible to consider some of the experiences that are evolving within various organizations' system engineering structures and procedures and that are spreading through technical publications and studies. For instance, Telelogic Harmony-SE® represents a subset of a well-defined development process identifiable with Harmony® . In this case, activities such as requirements analysis, system functional analysis and architectural design are properly related to each other within the context of the life cycle development process. Another example is the INCOSE Object-Oriented Systems Engineering Method (OOSEM). The model-based approach it introduces is characterized by the use of OMG SysML™ as an instrument to outline the system model specification. This language enables a well-defined representation of systems, supporting the analysis, design and verification activities . The IBM Rational Unified Process for Systems Engineering (RUP SE) for Model-Driven Systems Development (MDSD) may be considered an interesting methodology along the lines of the examples above. In particular, this process is derived from the Rational Unified Process® (RUP®) and is used for software development by government and industrial organizations . The Vitech Model-Based System Engineering (MBSE) Methodology is another example, where a common System Design Repository is linked to four main concurrent activities: Source Requirements Analysis, Functional/Behavior Analysis, Architecture/Synthesis and, finally, Design Validation and Verification . The elements characterizing the methodologies presented above, like other similar initiatives, are particularly suitable for the management of complex situations, which become difficult to handle as product development progresses over time.
For instance, the study of hostile environments, such as the analysis of certain space mission scenarios, generally leads to the definition of highly complex systems. In this case, the need to manage a considerable amount of data in a coherent and flexible way has expedited the spread of model-based methods. As the complexity of the systems under analysis grows, it often becomes too difficult to realize proper collaboration while avoiding potential design errors. MBSE provides the necessary tools to formally relate all the relevant aspects of a given system. Representing hostile environments through VR techniques, together with a similar view of the data generated, yields many advantages. Building on the data structured through the MBSE approach, VR allows the system architecture to be defined in an extended manner, while ensuring greater availability of information. Another benefit is linked to the clarity with which VR can report, for instance, the development phases of a given system. A virtual model directly connected to the information network of a unique data structure also ensures access to the most current representation of the system.
Based on the progress made in recent years, VR has made it possible to generate an ever more faithful representation of reality with respect to the physical phenomena under analysis. It is therefore possible to generate virtual environments in which to conduct realistic simulations of the possible scenarios in which the system may operate, making use of the time variable (the 4D). The advantages of this capability lie in the ability to reproduce situations for which the construction of a real mock-up would require substantial economic investment. This becomes evident especially in the aerospace industry, where the complexity of the systems involved, the high number of changes to manage and the possible operational scenarios all call for limiting the number of physical prototypes that are built. Today the space domain is becoming a free worldwide market, so there is a clear trend towards reducing the costs incurred during a project, which mostly affect the tests made on real physical systems. The generation of virtual models also has the advantage of allowing, for example, different possible design alternatives to be analyzed directly. Through the use of VR, in fact, more people may be involved at the same time in project activities in which equivalent system configurations are discussed. Generally, the development of virtual environments becomes necessary when critical situations have to be faced. VR in fact allows the consideration of environments that cannot commonly be reproduced on Earth, as for instance in the case of space mission scenarios: gravity, dust. In a virtual model it is instead possible to recreate some of the characteristic features that could potentially be encountered in these situations. Moreover, it is possible to manage the system variables to properly modify the scenario, thereby considering other, different conditions for the system under analysis.
This capability would be difficult to reproduce with real physical elements, mainly because of the economic investment it would require. The simulations that can be realized in a VR environment also make it possible to avoid all potentially unsafe situations for the user. This characteristic is of particular interest for human space activities, where certain actions may often lead to harmful situations.
MBSE techniques applied to space projects are often associated with 2D diagram-based models (e.g. an activity diagram in SysML, a control loop visualized in Simulink) or with 3D virtual models (e.g. a virtual mock-up built with a CAD application, multi-physics analyses visualized with CAE tools). These visualization techniques reached a high degree of maturity in the last decade, deriving from different experiences performed at discipline level. Just as an example, a SysML-like representation is closer to a software engineer than to a mechanical engineer. In a multidisciplinary team, the integration of discipline-level data in a system-level virtual environment represents an effective way to assure the full understanding of the key system issues by the whole team, acting as a WYSIWYG at product level, much as a modern word processor does for a document. Figure 2 shows a simplified example of the integration of tools in VR. The CAD model is used to define the physical configuration and to retrieve the related drawing. Current applications allow the user to calculate and/or store in the same CAD model relevant properties such as mass, moments of inertia (MOI) and center of gravity position. Such values are of interest to the whole team, and through dedicated interfaces those properties may be extracted and related to the system architecture (product structure, interfaces between elements). If, in the same integrated environment, the CAD model is linked with the system model providing input for simulations (e.g. mass properties for spacecraft dynamics), then the virtual environment allows a project team to visualize them in the same place.
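As a small illustration of extracting CAD-level mass properties and rolling them up to system level, the sketch below aggregates the total mass and composite center of gravity of a few hypothetical components. The component names and values are invented for the example; a real implementation would read them through the CAD tool's interface.

```python
# Hypothetical component records as they might be extracted from a CAD model:
# (name, mass in kg, centre of gravity as (x, y, z) in metres).
components = [
    ("structure", 120.0, (0.0, 0.0, 0.5)),
    ("battery",    35.0, (0.3, -0.2, 0.1)),
    ("payload",    60.0, (-0.1, 0.4, 0.8)),
]

# System-level roll-up: total mass and mass-weighted centre of gravity.
total_mass = sum(m for _, m, _ in components)
cog = tuple(
    sum(m * c[axis] for _, m, c in components) / total_mass
    for axis in range(3)
)
print(total_mass)  # prints 215.0
print(cog)
```

Values like `total_mass` and `cog` are exactly the kind of system-level properties that, once linked to the system model, can feed spacecraft dynamics simulations and be displayed alongside the 3D geometry in the virtual environment.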
The above-mentioned approach may be used to visualize products and their properties (with precise values, such as mass properties or nominal values). As long as the product elements are linked with the virtual reality elements, their behavior may also be associated through the related parameters (e.g. instantaneous position). Behaviors are represented by functions (e.g. Provide Locomotion, with related ports towards the Distribute Electrical Energy function and the Environment functions for the terrain). Each function (or composition of functions) can be represented by a model able to provide simulation capabilities. Figure 3 shows an example, at data level, of the linking between virtual reality and Modelica code through the system model. The integration of simulation models allows the virtual environment to be the collector of engineering discipline analyses, but a complete system-level simulator is still far from being implemented in this way and is the subject of our current research. The integration of several simulations requires a simulation process manager and a revision of the simulation models so that they can include the multi-physics effects. As explained in previous sections, the virtual environment may contain its own simulation capabilities, thanks to an embedded physics engine able to simulate e.g. collisions, dynamics and soft bodies. These features may be used for rapid prototyping of the simulation, providing quick feedback during concept and feasibility studies, as well as during the evaluation of alternatives.
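To make the function/port linking concrete, the toy sketch below models a "Provide Locomotion" function whose output port exposes an instantaneous position that a virtual environment could poll each frame to update the pose of the linked 3D model. The class, port name and constant-speed behavior are our own illustrative assumptions, not the actual system model or Modelica interface.

```python
class Function:
    """Toy system-model function with named ports (illustrative only)."""
    def __init__(self, name):
        self.name = name
        self.ports = {}

def provide_locomotion(t, speed=0.2):
    # Simplified behavior model: the rover moves along x at constant speed.
    return (speed * t, 0.0, 0.0)

locomotion = Function("Provide Locomotion")
locomotion.ports["position_out"] = provide_locomotion

# The virtual environment polls the output port at each simulated second
# and updates the pose of the linked 3D model accordingly.
trajectory = [locomotion.ports["position_out"](t) for t in range(5)]
print(trajectory[-1])  # (0.8, 0.0, 0.0)
```

In a full implementation each port would instead be bound to a simulation model (e.g. Modelica code exchanging values through the system model), with a simulation process manager coordinating the individual discipline simulations.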
Product and operational simulations do not exhaust the VR support capabilities available to a project team. VR with embedded simulation capabilities may also be used to validate part of the AIT (Assembly, Integration and Test) planning, supporting the definition and simulation of procedures, or for training purposes. Procedures can be created in VR, validated, and then made available in Augmented Reality (AR) format so as to guide hands-free assembly task execution (see Figure 4).
5. Modeling environments
Since space environments are extreme with respect to Earth's, a careful model of them is mandatory before undertaking any scientific mission. The study of real operative conditions spans from understanding physical laws to defining the geological composition of the surface, from measuring magnetic fields to analyzing natural phenomena. Of course, the better the knowledge, the greater the likelihood of succeeding in a mission. That is, failure factors such as malfunctions, mechanical crashes, accidents and technical unsuitability become less likely, while crew safety, decision support, cost reduction and scientific throughput increase accordingly. The added value of VR in this context is its ability to support this need for realism in a smart and effective way.
5.1. Physics laws
Technically speaking, a physics engine is software providing a numerical simulation of systems under given physical laws. The dynamics most commonly investigated by such engines comprise fluids and both rigid and soft bodies. Engines are usually based on a Newtonian model, and their contribution to virtual worlds is to handle the interactions among several objects/shapes. This way it is possible to model object reactions to ambient forces and therefore create realistic and complex software simulations of situations that could hardly be reproduced in reality: for instance, by changing the gravity constant to the Moon's (roughly one sixth of the terrestrial value), it is possible to handle objects as if they really were on Earth's satellite; similarly, precise space module conditions could be reproduced in order to train astronauts in a (close to) zero gravity environment. The great advantages of these solutions are low cost, flexible customization and safety. Indeed, with respect to other commonly adopted solutions, such as parabolic flights, they do not require expensive settings to work: a modern PC with standard hardware, graphics card and processing power is more than enough to perform simulations of medium complexity. At the same time, setting up virtual world behaviors relies mainly on customizable parameters as inputs for the simulation algorithms. Lastly, digital mock-ups can be stressed up to very extreme conditions without physically breaking, and final users are not exposed to any risk while facing a simulation.
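The effect of swapping the gravity constant can be sketched with a minimal Newtonian integrator; the values are standard, the code is an illustrative toy rather than a real engine.

```python
# Same integrator, two gravitational accelerations: an object "hangs"
# about 2.5 times longer when dropped from the same height on the Moon.
G_EARTH = 9.81  # m/s^2
G_MOON = 1.62   # m/s^2, roughly one sixth of the terrestrial value

def fall_time(height_m: float, g: float, dt: float = 1e-4) -> float:
    """Integrate a free fall from rest until the object reaches the ground."""
    z, v, t = height_m, 0.0, 0.0
    while z > 0.0:
        v += g * dt   # semi-implicit Euler step
        z -= v * dt
        t += dt
    return t

t_earth = fall_time(2.0, G_EARTH)
t_moon = fall_time(2.0, G_MOON)
print(round(t_moon / t_earth, 2))  # ~2.46, i.e. sqrt(9.81 / 1.62)
```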
The two main components a modern physics engine typically provides concern rigid body dynamics: a collision detection/collision response system, and the dynamics simulation component responsible for solving the forces affecting the simulated objects. More complex cores allow engines to successfully deal with particle/fluid, soft body, joint and cloth simulations. Given all those features, it appears clear why a physics engine allows studying natural and artificial phenomena under ambient conditions different from Earth's: for example, testing dust behavior under Mars gravity (natural phenomena), or driving a Martian rover by acting on velocity, friction and external forces (artificial phenomena). Virtual reality simulations are so flexible that specific tests can be reiterated several times in a row. This could be accomplished for a variety of scenarios: for instance, training a crew in performing particularly difficult actions could lead to finding the best practice for a given task; simulating different terrain conformations could help in finding possible obstacles in the path of an autonomous robotic vehicle; pushing a mechanical component to its limits could reveal how resilient it is to external stresses, its risk threshold and so on.
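The two rigid-body components mentioned above can be illustrated in their simplest form: a sphere-overlap test for detection and an impulse exchange along the contact normal for response. A real engine generalizes this to arbitrary meshes; the numbers here are illustrative.

```python
import math

def detect(p1, r1, p2, r2):
    """Return the penetration depth if two spheres overlap, else None."""
    d = math.dist(p1, p2)
    return (r1 + r2) - d if d < r1 + r2 else None

def respond(v1, v2, m1, m2, restitution=0.5):
    """Impulse exchange between two bodies; v1, v2 are the scalar
    velocities projected onto the contact normal."""
    rel = v1 - v2  # closing speed along the normal
    j = -(1 + restitution) * rel / (1 / m1 + 1 / m2)
    return v1 + j / m1, v2 - j / m2

# Two 1 m radius rocks whose centers are 1.5 m apart overlap by 0.5 m
depth = detect((0, 0, 0), 1.0, (1.5, 0, 0), 1.0)
v1, v2 = respond(2.0, 0.0, m1=10.0, m2=10.0)
print(depth, v1, v2)  # 0.5 0.5 1.5 (momentum is conserved)
```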
When physics engine results are connected to suitable input/output devices able to return perceptions to the user, the realism of the simulation increases markedly. Feedback that makes the user feel lifelike forces and sensations (e.g. the bumps of an irregular terrain while driving a rover, or the weight of objects being moved) enables further studies in complex fields. For example, by means of a haptic feedback device and a motion capture suit it is possible to perform ergonomic and feasibility studies (e.g. a reachability test to check whether an astronaut is able to get to an object and then perform a particular action, like screwing a bolt). On the other side, a primary limit on physics engine realism is the precision of the numbers representing the positions of, and forces acting upon, objects. The direct consequences are that rounding errors can affect final computations (even heavily, when precision is too low) and that simulated results can drastically differ from predicted ones if small numerical fluctuations are not properly taken into account. To avoid such problems, several tests on well-known phenomena should be performed before any other simulation, in order to detect the margin of error and the level of trust that can be placed in the results.
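The calibration step suggested above can be sketched by simulating a well-known phenomenon, uniform free fall, and measuring the numerical error against the closed-form solution z(t) = z0 - g t^2 / 2 at two step sizes:

```python
def simulated_drop(z0, g, t_end, dt):
    """First-order integration of a free fall; error shrinks with dt."""
    z, v, t = z0, 0.0, 0.0
    while t < t_end:
        v += g * dt
        z -= v * dt
        t += dt
    return z

G = 9.81
exact = 100.0 - 0.5 * G * 4.0 ** 2  # analytic height after 4 s
for dt in (0.1, 0.001):
    err = abs(simulated_drop(100.0, G, 4.0, dt) - exact)
    print(f"dt={dt}: error = {err:.3f} m")
```

Comparing such errors against the mission's accuracy requirements gives the margin of error to count on before trusting any more complex simulation.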
5.2. Terrain modeling
To model planetary surfaces like those of the Moon and Mars, a Digital Elevation Model (DEM) is required. Technically speaking, it looks like a grid or a raster image where elevation values are provided at regularly spaced points called posts. Reference DEMs come from the NASA High Resolution Imaging Science Experiment and the Lunar Reconnaissance Orbiter missions (HiRISE and LRO respectively) and represent the most up-to-date and precise advances in space geology measurements and cartographic imagery. In general, ground data can be derived at a post spacing of about 4x the pixel scale of the input imagery. Since HiRISE images are usually between 0.25 and 0.5 m/pixel, each post describes about 1-2 m. Vertical precision is also very accurate, being in the order of tens of centimeters. The altitude computation is a very time-intensive procedure and requires several stages as well as careful pre- and post-processing of the data, sophisticated software, and specialized training. During this process, image elaboration techniques may inherently introduce some artifacts, but despite this fact a near-optimal reconstruction satisfying the modeling constraints is largely possible. For more detailed information about the complete (Mars) DEM computation process, see  and the on-line resources at . For a visual reference, see Figure 5.
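Since a DEM is just elevations sampled on a regular grid of posts, recovering the elevation at an arbitrary point reduces to interpolation between the four surrounding posts. The toy 3x3 grid and 1 m post spacing below are illustrative assumptions:

```python
dem = [                    # elevations [m] at regularly spaced posts
    [10.0, 10.5, 11.0],
    [10.2, 10.8, 11.4],
    [10.1, 10.6, 11.2],
]
POST_SPACING = 1.0         # metres between posts (HiRISE-derived: ~1-2 m)

def elevation(x: float, y: float) -> float:
    """Bilinear interpolation between the four posts surrounding (x, y)."""
    i, j = int(y / POST_SPACING), int(x / POST_SPACING)
    fy, fx = y / POST_SPACING - i, x / POST_SPACING - j
    top = dem[i][j] * (1 - fx) + dem[i][j + 1] * fx
    bot = dem[i + 1][j] * (1 - fx) + dem[i + 1][j + 1] * fx
    return top * (1 - fy) + bot * fy

print(elevation(0.5, 0.5))  # 10.375, halfway between four posts
```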
Inserting a terrain model into a virtual scene is only the first step we perform to achieve environmental reconstruction. Indeed, the description of a planet can be more complicated than it appears at first glance. In the next sub-sections, we describe how to enrich the simulation of a planetary terrain by inserting the most typical landscape elements and modeling natural phenomena occurring on planetary surfaces.
5.2.1. Rocks
Almost every image taken by astronauts and/or robotic instrumentation shows Mars (and, to some extent, the Moon too) to be a very rocky planet. But those details do not appear in reference DEMs, despite their astonishing resolution. Even though such small details cannot (yet) be caught by advanced laser instrumentation, the presence of rocks and stones poses a severe challenge for robotic equipment, because they increase the chance of a mechanical crash in case of collision. For the sake of better plausibility, therefore, we have to add rock models to the reconstructed surface. In that sense, studies made for Mars, like  and , are really useful because they describe a statistical distribution of rocks, with particular emphasis on the terrains visited during rover missions, like the Pathfinder site. Moreover, they estimate both the density and the rock size-frequency distributions with simple mathematical functions, so that a complete description of the area is furnished. Those data turn out to be really useful, especially during landing operations or when a site has to be explored to assess the risks involved in performing exploration tasks. For instance, those distribution models estimate that the chance of a lander impacting a >1 m diameter rock in the first 2 bounces is <3% and <5% for the Meridiani and Gusev landing sites, respectively.
Our 3D rock models are inserted onto the terrain following that statistical approach and according to specific site parameters, such as the total number of models, size and type. During simulation sessions, the distribution can be changed: the aim is clearly to force operational situations in order to analyze the reactions of the simulated equipment under extreme conditions. In particular, thanks to the collision detection engine, it is possible to evaluate impact resistance factors so as to guarantee the highest possible level of safety. From a modeling point of view, the rock generation procedure can be summarized as follows: i) generate a random set of points (rock vertices) in a given 3D space; ii) compute the convex hull in order to create the external rock surface; iii) compute the mesh of the given volume; iv) adjust and refine the model (e.g., simulate erosion or modify the outer appearance with respect to shape and roundness) in order to give it a more realistic look; v) statistically compute the site on the planet surface where the rock will be laid; vi) put the rock onto that site according to the normal direction at that point. Examples of rock skeletons (that is, after the first three steps of the previous algorithm) are shown in Figure 6, while complete rocks can be seen in many figures throughout this paper.
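Steps i-ii of the rock-generation procedure can be sketched in 2D for brevity: scatter random points, then keep their convex hull as the rock outline (the actual pipeline builds a triangulated 3D hull and mesh instead).

```python
import random

def convex_hull(points):
    """Andrew's monotone-chain hull; returns vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

random.seed(42)
cloud = [(random.random(), random.random()) for _ in range(30)]  # step i
outline = convex_hull(cloud)                                     # step ii
print(len(outline), "hull vertices out of", len(cloud), "points")
```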
5.2.2. Dust
Another issue is represented by the presence of a huge quantity of dust lying on the soil. When any perturbation of the state of stillness occurs (such as a rover transit or an astronaut's walk), a huge amount of small dust particles is displaced: they can form big clouds that rise up quickly and remain in suspension for a long time afterwards (because of the lower gravity). The scientific literature describes this phenomenon mainly for the Moon, because of the several lunar missions undertaken in the '70s and '80s. For instance, studies like ,  and  show in detail the typical behavior of dust particles emitted by a rover wheel: schemes and formulas are given (for instance, to determine the angle of ejection or the distance a particle covers during its flight) with the aim of characterizing this unavoidable effect, which should definitely be modeled in our simulations since it affects any operational progress. Indeed, both the visual appearance and the physical behavior of dust have to be carefully represented: the former, to test driving sessions under limited visibility conditions or to find sets of manoeuvres that lift as little dust as possible; the latter, because avoiding malfunctions, especially for those modules directly exposed to dust interaction (e.g. solar panels, radiators and wheel joints), is still a highly complex engineering challenge.
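The distance a particle covers during its flight can be sketched with the vacuum ballistic-range formula R = v^2 sin(2 theta) / g; the ejection speed and angle below are illustrative, and real ejecta models add wheel-geometry terms.

```python
import math

def ballistic_range(v: float, theta_deg: float, g: float) -> float:
    """Distance a particle covers before landing: flat terrain, no drag."""
    return v ** 2 * math.sin(2 * math.radians(theta_deg)) / g

G_MOON, G_EARTH = 1.62, 9.81
v, theta = 3.0, 45.0                       # m/s off a wheel rim, ejection angle
print(ballistic_range(v, theta, G_MOON))   # ~5.56 m on the Moon
print(ballistic_range(v, theta, G_EARTH))  # ~0.92 m on Earth: dust settles fast
```

The factor-of-six difference in range (and the correspondingly longer flight time) is what makes lunar and Martian dust clouds broader and longer-lived than terrestrial ones.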
5.2.3. Atmosphere events
A thin atmosphere surrounds Mars. Even though it cannot be compared with Earth's, some weak weather activity nevertheless takes place in it: winds blow and seasons alternate. The presence of winds in particular can be an issue, especially during delicate tasks such as a capsule landing. Therefore this factor too should be simulated efficiently.
The Mars Climate Database (MCD, ) offers an interesting set of data particularly suitable for that purpose. Indeed, it collects several observations (e.g. temperature, wind, chemical composition of the air and so on), caught at different sites and over different periods, and is focused on the definition of a complete 3D Global Climate Model (GCM) for Mars. Further details on such models can be found in  and . A complete predictive model of Martian atmospheric behavior is still far from complete, but good approximations can be achieved through a simplified version of Earth's weather models. In particular, and without loss of generality, a simplified version of the equations described in  has been considered throughout our experiments, where the simplification comes from considering the distinctive features of the Martian atmosphere, such as its extreme rarefaction, the (almost complete) absence of water vapor and heat exchange, the lower gravity and so on.
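One way a simulation can consume such gridded climate data is to interpolate a wind profile between tabulated altitudes; the sample table below is invented for illustration and does not come from the MCD.

```python
wind_profile = [      # (altitude above surface [m], wind speed [m/s])
    (0.0, 2.0),
    (100.0, 6.5),
    (1000.0, 14.0),
]

def wind_at(alt_m: float) -> float:
    """Piecewise-linear interpolation, clamped at the table edges."""
    if alt_m <= wind_profile[0][0]:
        return wind_profile[0][1]
    for (a0, w0), (a1, w1) in zip(wind_profile, wind_profile[1:]):
        if alt_m <= a1:
            f = (alt_m - a0) / (a1 - a0)
            return w0 + f * (w1 - w0)
    return wind_profile[-1][1]

print(wind_at(50.0))  # 4.25, halfway between the first two entries
```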
First results obtained for the Pathfinder site showed a good approximation in describing the wind activity, compared against different MCD entries. Visualizing them in a 3D environment (see Figure 7) therefore represents a first step towards the full definition and integration of a Martian weather 'forecast' predictor. When this result is achieved robustly, mission definition will gain another powerful tool to ensure reliability and safety.
6. Case studies
The goal of this section is to show how the virtual reality paradigm can be adopted for real applications in the space industry domain. The case studies described in the following represent only a small part of the most innovative activities undertaken at TAS-I. Nevertheless, they are really representative of how flexible and effective VR simulations are for several challenging practical problems.
6.1. Rover driving
This is maybe the best example to explain the tight collaboration among several scientific disciplines when several kinds of data need to be represented in a single visualization application. Indeed, it comprises contributions from: astronomy and geology (high-resolution planet surfaces and rock modeling); physics (to handle the behavior of objects according to specific environmental conditions); engineering disciplines (to set up the 3D rover model as a logical set of layers and sub-systems, considering for each of them its working functionality both stand-alone and in collaboration with all the others); ergonomics (to understand astronauts' requirements for a comfortable and safe life on board and therefore design suitable tools); and human-computer interaction (to design interfaces that help the crew understand the surrounding environment and take actions accordingly).
Figures 8 to 13 show many of the aforementioned features. We present two different scenarios: on Mars (Figures 8-10) and on the Moon (Figures 11-13). In the former case, we reconstructed an area of approximately 1 km² containing the Victoria Crater, an impact crater located at 2.05°S, 5.50°W and about 730 meters wide. In the latter case, our attention is paid to Linné Crater in Mare Serenitatis, at 27.7°N 11.8°E. The goal is to drive a (prototype of a) pressurized rover, that is, an exploratory machine with a cabin for a human crew, over those surfaces, avoiding both falling into the pits and crashing against natural obstacles (mainly massive rocks, such as those depicted in Figures 8 and 9). The task is made more difficult by the presence of huge clouds of dust which, according to the specific planetary conditions, are usually thicker and broader and take more time to dissolve completely than on Earth. Since in those situations visibility can be extremely reduced, being able to rely on trustworthy instrumentation, prior knowledge of the terrain to be explored and accurate training sessions is essential: indeed, any error could have devastating consequences for crew and equipment. Therefore, astronauts should be able to fully understand all the risks, the policies to avoid them and how to approach every step of such missions. In this context, a VR simulation offers a reliable tool to safely undertake such training. To help the crew perform their duty, a suitable, basic interface has been built. It stands on the rightmost side of the screen, where a double panel is shown. In the first one, at the top right corner, parameters such as roll, pitch and yaw angles, battery level, speed, acceleration and outside temperature are mapped onto a deformable hexagon, to keep them always under control. Their values are continuously updated during the simulation to immediately reflect the current situation.
If all of them are kept under a pre-defined safety threshold, the whole hexagon is green. When an alert occurs, the respective parameter turns red: in this case, the crew should take appropriate countermeasures to face that danger (for instance, by reducing the rover speed). In the second control panel, a small bird's-eye-view map of the surroundings is depicted. On this map, small red circles represent potential hazards, such as huge rocks. If the rover gets closer to a hazard than its minimum safety distance (that is, it could collide with a rock), a red alert appears, so that a corrective manoeuvre can be undertaken in time. To help the drivers, a blue cylinder is also projected ahead of the rover: it points out where the rover will be after a configurable, small amount of time (e.g., 20 seconds) if no change of course occurs. The driving commands are given through a suitable interface aiming at reproducing the corresponding controls to be mounted on the rover (e.g. control sticks, levers, steering wheel and so on). They can be either haptic interfaces (with or without force feedback) or, as in our case, wii-motes. The direction as well as the intensity of the applied force is shown by a couple of green arrows.
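The alert logic described above can be sketched as a per-parameter range check; the parameter names and thresholds below are invented for illustration.

```python
THRESHOLDS = {           # parameter: (min allowed, max allowed)
    "speed_ms": (0.0, 4.0),
    "roll_deg": (-20.0, 20.0),
    "battery_pc": (15.0, 100.0),
}

def panel_state(telemetry: dict) -> dict:
    """Map each telemetry parameter to 'green' or 'red' for the HUD hexagon."""
    return {
        name: "green" if lo <= telemetry[name] <= hi else "red"
        for name, (lo, hi) in THRESHOLDS.items()
    }

state = panel_state({"speed_ms": 5.2, "roll_deg": 3.0, "battery_pc": 60.0})
print(state)  # speed exceeds 4 m/s, so its sector of the hexagon turns red
```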
6.2. Planet landing
Another essential task (and another typical example where cooperation among disciplines is strictly essential) is to bring onto the extra-terrestrial surface all the machinery required for the scientific mission. This operation is usually performed by a lander. It can be thought of as a composition of at least three distinct parts: the capsule, the propulsion system, and the anchoring units. The first module carries all the machinery to settle on the ground; the second is used during both take-off and landing, and aims at balancing loads and thrusts and avoiding sharp and compromising movements; the last one is the first to touch the soil and has to soften the landing and provide stability. This kind of operation is really delicate because, in case of failure, the equipment is very likely to be lost, damaged or left malfunctioning. To avoid such a possibility, care in choosing the landing site is mandatory: scientifically interesting sites can be landing targets only if a flat terrain, almost rock-free and without any other obstacle, is present in the surroundings. Therefore, accurate research should be performed prior to the implementation of the mission itself. During the VR tests, different landing sites can be tried until the most appropriate one is found (see the first two pictures in Figure 14). Those trials are suitable for another couple of purposes. First of all, to test endurance, impact absorption, breaking and tensile strength and other mechanical properties of the lander legs; in this case, series of physical simulations should be set up to test changes in parameters and find the right combination that guarantees maximum safety in real operative environments (see the last picture in Figure 14). Then, since dust clouds are a major challenge, blind landing should be taken into account: both automatic and manual landing operations have to deal with complementary sensors (e.g. sonar and radar) integrating previous knowledge of the targeted site. In this case, VR simulations can help scientists find the best descent plan according to the assumed hypotheses and the real operative situations, which can be surprisingly different from the former. Therefore, plan corrections should be undertaken to face problems such as malfunctions, higher speed, errors in measuring heights, winds (on Mars) and other unpredictable events.
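One check such simulations perform can be sketched as a powered-descent integration: fall under Martian gravity with a constant braking thrust and verify the touchdown velocity stays within what the anchoring legs can absorb. All numbers (mass, thrust, limits) are illustrative assumptions.

```python
G_MARS = 3.71        # m/s^2
MAX_TOUCHDOWN = 2.5  # m/s the legs are assumed to absorb safely

def touchdown_speed(alt, v, mass, thrust, dt=0.01):
    """Integrate vertical descent; returns the speed at ground contact."""
    while alt > 0.0:
        a = G_MARS - thrust / mass  # net downward acceleration
        v += a * dt
        alt -= v * dt
        if v <= 0.0:                # descent arrested before touchdown
            return 0.0
    return v

v_land = touchdown_speed(alt=50.0, v=10.0, mass=600.0, thrust=2800.0)
verdict = "safe" if v_land <= MAX_TOUCHDOWN else "unsafe"
print(f"touchdown at {v_land:.2f} m/s -> {verdict}")
```

Sweeping the thrust (or the initial descent speed) over such a model is a cheap way to bound the parameter combinations worth testing in the full VR simulation.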
6.3. Visualizing radiations
Scientific visualization is an interdisciplinary field whose objective is to represent scientific data graphically, so that scientists can understand them and gain deeper insight into them. It usually deals with 3D structures and phenomena coming from several science branches such as astronomy, architecture, biology, chemistry, medicine, meteorology and so forth. Computer graphics plays a central role because of its techniques for rendering complex objects and their features (among others: volumes, surfaces, materials and illumination sources) and for dealing with their evolution in time (see ). Visualization is essential for managing complex systems and when the events to be displayed are invisible (i.e., they cannot be perceived because they occur at micro or even smaller scales, or outside the optical frequency band). In those cases, visual metaphors should be used to show such phenomena and thus keep the audience aware of their existence, effects and consequences. This approach has been successfully applied to projects investigating how radiation will affect human health and electronic components during space missions. In particular, we focused on representing the Van Allen radiation belt surrounding the Earth. This area is located in the inner region of the magnetosphere and is mainly composed of energetic charged particles coming from cosmic rays and the solar wind. The purpose of this study is to show how radiation spreads and accumulates on and around the whole spaceship volume, given the significant time spaceships spend in orbit. This way, it will be possible to design suitable countermeasures to shield against all the potential risks. As shown in Figure 15, the belt has been represented as a ball of threads enveloping Earth, getting thicker and thicker as time flows and spaceships orbit our planet. At the same time, a color scale gives the observer a feeling of the danger, ranging from cold colors (low risk) to warm colors (highest damage) (Figure 16).
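The cold-to-warm metaphor can be sketched as a normalization of accumulated dose followed by a blue-to-red color ramp; the saturation level is an illustrative assumption and a real tool would likely use a perceptual colormap.

```python
DOSE_MAX = 500.0  # mSv, hypothetical saturation level for the color scale

def dose_to_rgb(dose_msv: float):
    """Linear blue (low risk) to red (highest damage) ramp, clamped."""
    t = max(0.0, min(1.0, dose_msv / DOSE_MAX))
    return (int(255 * t), 0, int(255 * (1 - t)))

for dose in (0.0, 250.0, 500.0):
    print(dose, dose_to_rgb(dose))  # blue -> purple -> red
```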
6.4. Cargo accommodation
The International Space Station (ISS) is the farthest outpost of the human presence in space and can be thought of as a habitable satellite. Since 1999, its pressurized modules have hosted astronauts whose main goal is to conduct experiments in several fields by exploiting its micro-gravity and space environment research facilities. Shuttle services have provided over the years a continuous turnover of astronauts as well as supplies, vital items and scientific equipment. However, carrying provisions and other material back and forth is far from being a simple task, at least in its design phase. Indeed, the most difficult challenge is how to put the greatest amount of items into a cargo so that time, money and fuel are saved while providing the best possible service. In other words, it means facing the well-known knapsack problem on a larger scale. The CAST (Cargo Accommodation Support Tool) program has been established to work out that problem by optimizing the loading within transportation vectors such as Columbus and the ATV (Automated Transfer Vehicle). Practically speaking, it has to find the optimal arrangement of items (usually bags) in racks; that is, the main focus is on properly balancing the load. This means finding the best center of mass position for each rack in the vector, such that resource waste is minimal, no safety issues occur and the smallest possible number of journeys is needed. The balancing problem can be solved algorithmically through an interactive, multi-stage process, in which problems such as item-rack correlation, rack configuration and item, rack and cargo accommodation have to be addressed. The result is a series of 3D points whose final configuration corresponds to how bags have to be stored in the racks according to the given constraints. A visual representation of them is particularly useful, as it can serve as a practical guide to help people during the load/unload phases.
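The balancing objective can be sketched in one dimension: place bags of known mass into rack slots so that the loaded center of mass lands on a target position. The greedy pairing below is a toy stand-in for CAST's multi-stage optimization, with invented masses and slot positions.

```python
def centre_of_mass(placements):
    """placements: list of (mass_kg, position_m along the rack axis)."""
    total = sum(m for m, _ in placements)
    return sum(m * x for m, x in placements) / total

slots = [-1.0, -0.5, 0.5, 1.0]  # symmetric slot positions around the target
bags = [12.0, 8.0, 8.0, 12.0]   # bag masses to accommodate

# Greedy heuristic: heaviest bags go into the innermost free slots,
# so equal masses cancel out symmetrically around the target.
placements = list(zip(sorted(bags, reverse=True),
                      sorted(slots, key=abs)))
print(centre_of_mass(placements))  # 0.0: the load is balanced on target
```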
In order to allow users to test several configurations at run-time and analyze how they affect the final cargo accommodation, direct interaction has been guaranteed through wii-motes, data gloves and force-feedback haptic devices. Moreover, in order to guarantee as realistic a simulation as possible, physical constraints have been added too. Thus, the ease of picking and moving objects is affected by object masses and weights, while collision detection among bags and racks limits movements when changing object positions and guarantees at the same time the consistency of results (that is, impossible positions cannot occur).
6.5. Understanding natural risks
Although TAS-I experience in modeling 3D terrains is principally devoted to reconstructing extra-terrestrial soils, we present here an example of an application involving Earth territories. The work comes from the Alcotra-Risknat (Natural Risks) project. Alcotra is a European Commission approved program for cross-border cooperation between Italy and France. In the context of improving the quality of life and the sustainable development of economic systems across the Alpine frontier between the two countries, special care is given to strengthening public and technical services in the natural risk protection field through a web-based platform. Among the objectives, we can recall the need to provide innovative technological strategies to manage territory policies efficiently; to develop environmental awareness based on sustainability and responsible resource management; and to coordinate civil defense facilities and equipment in the cross-border areas. Given this context, our main contribution consisted of a 4D physically realistic simulation demo of a landslide that occurred at Bolard, in the upper Susa Valley. Thanks to stereoscopic vision and 3D sound effects, we developed interactive and highly immersive scenarios for citizen risk awareness purposes. The demo consists of a 3D model simulating the physical propagation of debris and rock slides in a mountain site (see Figures 19 and 20). The simulation has been built on real geological data.
7. Conclusions and future work
The COSE Center facility is an innovative and highly technologically equipped laboratory, currently involved in developing both VR and AR applications to support research at TAS-I. After being successfully used in several fields, such as the entertainment industry, these technologies have also been satisfactorily introduced into the management of complex production projects, with the aim of improving the quality of the whole engineering chain, from the collection and validation of requirements to the final realization of the product itself. The application of VR to TAS-I products is twofold. First, as a new, integrating tool in all the decision-making phases of a project, supporting manual engineering tasks and other well-known instruments (e.g., CAD) and overcoming their limitations. Second, as a set of interactive simulation tools able to realistically reproduce hostile, extra-terrestrial environments, thereby supporting the disciplines in properly understanding operational behavior under extreme conditions. The VR facilities can be considered a center of attraction for improving knowledge, technical skills and know-how. This has enabled the COSE Center research activities to reach several positive results in simplifying the team approach to complex products and projects. Among them, we can cite better interaction with customers and suppliers, and among multidisciplinary experts too, and improved effectiveness of evaluation/assessment by the program teams according to a tightly collaborative approach. The good results achieved thanks to the VR lab have been reached because the system structure and behavior are shown to the team in a more realistic way. Running several simulation sessions that stress virtual models under different conditions is a fast and economical way to collect data about product requirements, limitations and strong points.
Practically speaking, the set of virtual tools adopted at TAS-I and the current research results have in some cases led engineering disciplines to rethink both their relationship to the system being implemented and the necessity of focusing on new critical aspects that emerged during interactive sessions. In other cases, engineers decided to optimize their internal processes given the results obtained through virtual tool analysis.
In the future, we aim to improve the capabilities of our VR facility in several research directions. First of all, by implementing new features and applications according to the needs of the engineering fields, and by allowing a more natural interaction with them through specific devices (e.g., new tracking devices, touch-screen devices, improved AR interfaces and so on). Second, by involving a higher number of disciplines in order to achieve as complete a vision as possible of the environment to be simulated. A complete simulator of hostile environments is still far from being implemented, but our efforts tend towards that end. This means that the physics engine features will have to be extended to encompass a wider range of dynamics, and that a tighter cooperation with scientists is mandatory to enforce the realism of the simulations.
This work is the result of several years of research. The authors gratefully thank all the people from the COSE Center involved in enriching our experience in the Virtual Reality field. This work is also the result of several projects, such as: Astro-VR, STEPS, Manu-VR and Alcotra-RiskNat.