The VSimulators facility at the University of Bath was finalised in October 2019. This new-generation simulator consists of an environmental chamber on a hydraulically actuated motion platform, which Antycip equipped with a projection-based virtual reality solution.
VSimulators at the University of Bath is designed primarily as a research tool for conducting experiments with real humans in the loop. It is already being used to immerse people in a range of lifelike environments in order to study their reactions to different structures, including swaying skyscrapers and bridges.
Research teams have already identified over 50 potential applications for the facility, including immersive VR game development, physical rehabilitation and driverless vehicle design, bringing together varied industry and academic sectors.
The space can be realistically configured as an office, apartment, hotel room or hospital ward, giving the researchers the ability to create convincing ‘mixed reality’ simulations.
The virtual reality projected onto the walls of the 3 × 4 metre chamber is combined with motion-tracking glasses and programmed to adjust the visual and audio output according to the time of day and simulated building height. The front projection is delivered across three active display “faces”, which combine to form a continuous image inside the specialist chamber mounted on the motion system.
The visual requirements called for an immersive virtual environment projected across three internal screen faces (walls) of the specialist motion-enabled chamber. At its most basic, the concept was to offer computer-generated, real-time interactive visuals of a realistic cityscape, combined with partial interior views of a room space that virtually extended the internal dimensions of the chamber itself.
First, ST Engineering Antycip (formerly known as Antycip Simulation) had to commission content: a 3D-modelled cityscape environment that could be rendered at high resolution and real-time frame rates. The content also had to conform to criteria outlined by the University, which defined the required richness of the virtual world.
Working with our technology partners at Real-Media, a Unity framework was produced whose content could be switched at run-time between elevated synthetic viewpoints at different altitudes within the virtual buildings.
A control interface was created to switch and cycle through the options selected for each experiment scenario. The entire visual pipeline then needed to be enabled for spatial tracking and multichannel synchronisation. To that end, a MiddleVR licence was supplied and configured by Antycip’s engineering team, ensuring the software could address the hardware correctly and deliver the realistic experiences required.
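The run-time switching described above can be sketched as a simple scenario controller that cycles through viewpoint presets. This is a minimal illustration only: the preset names, altitudes and lighting conditions below are assumptions for the example, not the facility’s actual scenario list, and the real system drives these changes through the Unity/MiddleVR pipeline.

```python
from dataclasses import dataclass


@dataclass
class Viewpoint:
    """One selectable vantage point in the virtual cityscape (illustrative values)."""
    name: str
    altitude_m: float   # eye height above street level
    time_of_day: str    # drives the lighting/audio preset

# Hypothetical scenario presets; the real facility defines its own.
PRESETS = [
    Viewpoint("street level", 1.7, "day"),
    Viewpoint("10th floor", 35.0, "day"),
    Viewpoint("50th floor", 175.0, "dusk"),
]


class ScenarioController:
    """Cycles through viewpoint presets at run-time, as an operator console might."""

    def __init__(self, presets):
        self._presets = presets
        self._index = 0

    @property
    def current(self):
        return self._presets[self._index]

    def next(self):
        """Advance to the next preset, wrapping around at the end."""
        self._index = (self._index + 1) % len(self._presets)
        return self.current


controller = ScenarioController(PRESETS)
controller.next()
print(controller.current.name)
```

In practice each preset change would also retune the visual and audio output (time of day, wind sway amplitude and so on) before the renderer picks up the new viewpoint.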
The project called for a projector chassis that was lightweight and compact, as it would be carried as part of the payload supported by the motion platform.
The 7.5 kg chassis keeps payload weight down, and the advanced ultra-short-throw lens optics allow the projector to be mounted vertically while throwing a large image over an extremely short distance.
These 0.3:1 optics allowed our engineers to design a unique approach whereby the three projectors remain outside the chamber, projecting through apertures in the chamber’s ceiling. The result is a larger usable space within the chamber’s interior, with the bulk of the projectors’ noise and exhaust kept outside the user’s domain.
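The benefit of the 0.3:1 optics is easy to quantify: throw distance is simply throw ratio × image width. A hedged sketch, assuming each screen face is roughly 3 m wide (consistent with the 3 × 4 m chamber) and comparing against a typical ~1.5:1 standard lens chosen here purely for illustration:

```python
def throw_distance(throw_ratio: float, image_width_m: float) -> float:
    """Distance from lens to screen required to fill a given image width."""
    return throw_ratio * image_width_m

# Assumed 3 m wide screen face; 0.3:1 ultra-short-throw lens as fitted.
ust = throw_distance(0.3, 3.0)       # 0.9 m from ceiling aperture to wall
# A conventional ~1.5:1 lens (illustrative comparison) would need far more room.
standard = throw_distance(1.5, 3.0)  # 4.5 m

print(ust, standard)
```

At under a metre of throw per face, the projectors can sit just above the ceiling apertures rather than occupying the chamber volume.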
All three projectors sit in a custom mounting harness on the chamber’s ceiling, and each unit is fed by pure fibre-optic cable links to ensure that the high-bandwidth stereo signals are delivered artefact-free in operation.
For VR applications of this nature, the projectors can be mapped to the display surface to produce an image that minimises the shadowing artefacts caused by a user moving close to the screen.
The stereoscopic visuals needed to be dynamically tracked, so that a user within the chamber could look at the virtual content from any viewpoint and have that viewpoint computed and the imagery corrected to their vision accordingly.
This created a need for spatial tracking cameras, and the Vicon Vertex units were the obvious choice. These specialist cameras separate their primary electronics from the lens and camera head in order to be less obtrusive to the user. Antycip took advantage of this unique design, placing the camera heads inside the chamber while keeping the electronics fixed outside.
A tracking volume was calibrated, enabling stereo and monoscopic glasses to be worn by a user within the chamber so that their head movements and gaze could be perfectly tracked.
This tracking aspect was very important, as the VSimulators requirement was to simulate a realistic virtual world around the chamber that could be perceived through a virtual window, as if the chamber itself were part of the fabric of a much larger physical building. The user is thus able, as in real life, to walk and look freely out of the virtual windows and have their view corrected rather than distorted. This yields natural viewing angles of the virtual cityscape and helps secure a sense of presence in that virtual world for the research and test scenarios.
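The view correction described here is the standard off-axis (asymmetric-frustum) projection used in projected VR: the tracked eye position and the fixed corners of the physical screen determine the frustum each frame. Below is a minimal sketch of that computation; the wall dimensions and eye position are illustrative assumptions, not the facility’s actual geometry, and in the real system MiddleVR performs this per channel.

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a):
    m = math.sqrt(dot(a, a))
    return tuple(x / m for x in a)
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def off_axis_frustum(pa, pb, pc, eye, near):
    """Frustum extents (left, right, bottom, top) at the near plane for a
    head-tracked screen. pa, pb, pc are the lower-left, lower-right and
    upper-left corners of the physical screen; eye is the tracked position."""
    vr = norm(sub(pb, pa))       # screen right axis
    vu = norm(sub(pc, pa))       # screen up axis
    vn = norm(cross(vr, vu))     # screen normal, pointing towards the viewer
    va, vb, vc = sub(pa, eye), sub(pb, eye), sub(pc, eye)
    d = -dot(va, vn)             # perpendicular eye-to-screen distance
    left   = dot(vr, va) * near / d
    right  = dot(vr, vb) * near / d
    bottom = dot(vu, va) * near / d
    top    = dot(vu, vc) * near / d
    return left, right, bottom, top

# Illustrative wall: 3 m wide, 2.4 m tall, with the eye centred 1.5 m away.
pa, pb, pc = (-1.5, 0.0, 0.0), (1.5, 0.0, 0.0), (-1.5, 2.4, 0.0)
eye = (0.0, 1.2, 1.5)
print(off_axis_frustum(pa, pb, pc, eye, near=0.1))
```

With the eye centred the frustum is symmetric; as the tracked user walks towards a corner of the chamber, the extents skew so the virtual window stays perspective-correct rather than distorting.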
To drive the projection array and provide control for the technologies delivered by Antycip, we supplied a dedicated equipment rack with associated middleware components, populated with image generators that feature flagship GPUs and both hardware- and software-level synchronisation for the multi-channel array. A control operator station enables the user to change scenarios and communicate with the projectors and tracking cameras, while also providing administrative access to the PCs within the rack.