The AWEAR platform enables the creation of 3D maps (3D models) of complex industrial facilities using low-cost mobile sensors. Based on these offline models and current sensor data (RGB + IMU), the platform can accurately localize a worker (position and orientation) without the need for further expensive sensing infrastructure and can provide assistance in the form of augmented navigation guidance, e.g. for maintenance workers or remote-expert applications.
The challenge of the AWEAR project was to integrate cutting-edge awareness technology (object tracking, real-world interaction, user localization in complex environments, augmented reality) and to provide a unique combination of technology and thus functionality. The modular technical augmented reality (AR) platform can be adapted to a wide range of industrial assistance applications, enabling both adaptivity and scalability.
The realization of the AWEAR platform consists of three main components:
- Generation of a point cloud 3D model of the relevant environment.
- Localization of the user in the recorded environment and tracking the motion of one or more users.
- Displaying AR guidance markers.
Environment Model Generation – SLAM (Simultaneous Localization and Mapping) addresses the problem of localizing a mobile device in a potentially unknown environment while simultaneously keeping track of its position within the map being built.
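The interplay of odometry prediction and map-based correction at the heart of SLAM-style localization can be illustrated with a minimal sketch. This is not the AWEAR implementation: the 2D pose, the yaw-only motion model, and the single-landmark correction with a fixed `gain` are simplifying assumptions for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose2D:
    x: float
    y: float
    theta: float  # heading in radians

def apply_odometry(pose: Pose2D, forward: float, turn: float) -> Pose2D:
    """Prediction step: dead-reckon along the current heading, then rotate."""
    return Pose2D(
        pose.x + forward * math.cos(pose.theta),
        pose.y + forward * math.sin(pose.theta),
        pose.theta + turn,
    )

def correct_with_landmark(pose: Pose2D, observed, mapped, gain=0.5) -> Pose2D:
    """Correction step: pull the predicted position toward a known map landmark.
    `observed` is where the landmark appears under the current estimate,
    `mapped` is its position in the offline model."""
    dx, dy = mapped[0] - observed[0], mapped[1] - observed[1]
    return Pose2D(pose.x + gain * dx, pose.y + gain * dy, pose.theta)
```

A real system fuses IMU and visual measurements probabilistically (e.g. with a filter or pose-graph optimization); the predict/correct structure, however, is the same.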
The localization of the camera in its environment is performed by extracting 2D features from the RGB image and associating them with 3D locations in the environment. Once the environment is mapped, the 2D features are saved to a database.
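The feature database described above can be pictured as a lookup from descriptors to mapped 3D locations. The following sketch uses toy three-element descriptors and a nearest-neighbor match; production pipelines use high-dimensional descriptors (e.g. ORB or SIFT) and approximate search, so the vectors and the `max_dist` threshold here are illustrative assumptions.

```python
import math

# Toy database built during mapping: stored descriptor -> 3D point in the model.
feature_db = [
    ((0.9, 0.1, 0.3), (1.0, 2.0, 0.5)),
    ((0.1, 0.8, 0.2), (4.0, 1.0, 2.0)),
    ((0.5, 0.5, 0.9), (0.0, 3.0, 1.5)),
]

def match_feature(descriptor, db, max_dist=0.5):
    """Return the 3D point whose stored descriptor is nearest to `descriptor`,
    or None if no stored descriptor is close enough (an unmatched feature)."""
    best, best_d = None, max_dist
    for stored, point3d in db:
        d = math.dist(descriptor, stored)
        if d < best_d:
            best, best_d = point3d, d
    return best
```

Given enough such 2D-3D correspondences from one camera frame, the camera pose can then be recovered with a Perspective-n-Point solver.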
Visual guidance markers were created in Unity, a cross-platform real-time 3D engine developed by Unity Technologies. The application creates markers and places them in 3D AR space according to the localized position obtained from the SLAM / odometry information. This application was deployed to the smartphone and used in the testing and demonstration cases.
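The underlying math of placing a world-anchored marker on the screen can be sketched independently of Unity: transform the marker's world position into the camera frame using the localized pose, then project it with a pinhole model. The yaw-only rotation and the focal/principal-point values below are simplifying assumptions, not AWEAR's actual camera model.

```python
import math

def world_to_camera(point, cam_pos, cam_yaw):
    """Express a world-space point in the camera frame (yaw-only rotation for brevity)."""
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = point[2] - cam_pos[2]
    c, s = math.cos(-cam_yaw), math.sin(-cam_yaw)
    # rotate about the vertical (y) axis
    return (c * dx + s * dz, dy, -s * dx + c * dz)

def project(point_cam, focal=800.0, cx=640.0, cy=360.0):
    """Pinhole projection to pixel coordinates; None if the point is behind the camera."""
    x, y, z = point_cam
    if z <= 0:
        return None  # marker not visible from this pose
    return (cx + focal * x / z, cy - focal * y / z)
```

In Unity this transform-and-project step is handled by the engine's camera; the sketch shows why an accurate localized pose (`cam_pos`, `cam_yaw`) is the prerequisite for markers appearing anchored to the right real-world spot.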
Economic / Business impact
The number of embedded and wearable computers is increasing exponentially; they are being introduced into more and more everyday applications, and a wide span of algorithms tackles increasingly complex problems. However, man-machine interaction is still largely dominated by decades-old paradigms. The latest developments in interactive ICT systems are approaching truly immersive and intuitive interaction. Augmented and mixed reality technologies in particular offer interaction potential that will shape the future of interaction design.
Research Studios Austria FG, Pervasive Computing Applications