Augmented Reality sees what lies beneath

Royal Navy Vanguard Class submarine HMS Vigilant returning to HMNB Clyde after her extended deployment. The four Vanguard-class submarines form the UK's strategic nuclear deterrent force. Each of the four boats is armed with Trident II D5 nuclear missiles. Like all nuclear submarines, the Vanguard Class are steam powered, their reactors converting water into steam to drive the engines and generate electricity.

With wearable technology on the rise, Augmented Reality (AR) has the potential to push the boundaries of innovation for Defence platforms. AR is a live direct or indirect view of a physical, real-world environment where elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data. The technology functions by enhancing the viewer's current perception of reality. By contrast, virtual reality (VR) entirely replaces the real world with a simulated one.

As Head of Innovation at Babcock’s Analytic Solutions business, my job involves researching and integrating innovative and emerging technologies that explore different ways of working in order to provide effective customer solutions.

The Defence sector is keen to learn what role emerging technologies will play in future military operations, and the availability and readiness of technologies such as Augmented Reality are adding momentum to Defence work. Keen to see a quick demonstration of the art of the possible, the MOD wanted to understand whether AR had any utility for the Royal Navy, and to gauge the maturity of the technology as it looks ahead to the Digital future.

In order to demonstrate the benefits of AR, Babcock’s Analytic Solutions business developed a scenario for the MOD in support of maintenance activity on board a submarine. A truly immersive experience was created by building a physical mock-up of a submarine compartment, with real-life equipment located inside to enhance the effect. We tracked the location of the user down to millimetre accuracy, and determined the exact direction they were looking.

This meant we could overlay the computer-aided design (CAD) models used in the making of the real-life equipment, along with other animations and videos, onto the user's view of the existing real equipment. This gives a really easy and efficient way of viewing information such as how components are disassembled for repair. We really wanted to push this demo to the limits, both to show off the benefits of AR and to demonstrate how other technologies can enhance the user experience.
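Aligning a CAD overlay with real equipment comes down to transforming points from the compartment's world frame into the tracked viewer's frame. The sketch below is purely illustrative (it is not Babcock's actual pipeline, and the function names and the single-axis yaw model are my own simplification): it shows how a tracked head position and heading can be used to express a world-space point relative to the headset.

```python
# Illustrative sketch only: transforming a world-space point into a tracked
# viewer's frame so an AR overlay lines up with the real equipment.
# Real systems track full 6-DoF pose; here orientation is reduced to a
# single yaw angle for clarity.
import math

def look_rotation(yaw_rad):
    """Rotation matrix for a viewer turned yaw_rad about the vertical axis."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]

def world_to_view(point, head_pos, yaw_rad):
    """Express a world-space point relative to the tracked headset."""
    # Translate so the headset is at the origin...
    rel = [p - h for p, h in zip(point, head_pos)]
    # ...then rotate by the inverse of the head orientation.
    R = look_rotation(-yaw_rad)
    return [sum(R[i][j] * rel[j] for j in range(3)) for i in range(3)]

# A valve handle 2 m in front of a user standing at the origin, facing +z:
# the result is ~2 m straight ahead (+z) in the view frame.
print(world_to_view([0.0, 1.5, 2.0], [0.0, 1.5, 0.0], 0.0))
```

In a real pipeline this transform would be a full 4x4 model-view matrix fed to the renderer every frame, which is what keeps the overlay locked to the equipment as the user moves.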

We came up with the concept of digital dashboards: by taking sensor readings from Condition Based Monitoring (CBM) systems, we were able to display all of the statistical information for an asset straight to the engineer's glasses. By using simple traffic light colours, problems could be easily highlighted to the user.
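The traffic-light idea is simple to express in code. This is a minimal sketch under assumed thresholds and sensor names (none of these values come from the real CBM system): each reading is compared against a warning and an alarm limit and classified green, amber or red for display.

```python
# Hypothetical sketch: mapping condition-based monitoring (CBM) readings
# to traffic-light statuses for an AR dashboard. Sensor names and
# threshold values are illustrative only.

def traffic_light(value, warn_threshold, alarm_threshold):
    """Classify a single sensor reading as green/amber/red."""
    if value >= alarm_threshold:
        return "red"
    if value >= warn_threshold:
        return "amber"
    return "green"

def dashboard(readings, limits):
    """Build a status map for every monitored sensor on an asset."""
    return {
        name: traffic_light(value, *limits[name])
        for name, value in readings.items()
    }

# Example: bearing temperature and vibration on a pump.
readings = {"bearing_temp_C": 82.0, "vibration_mm_s": 3.1}
limits = {"bearing_temp_C": (75.0, 90.0), "vibration_mm_s": (4.5, 7.1)}
print(dashboard(readings, limits))
# {'bearing_temp_C': 'amber', 'vibration_mm_s': 'green'}
```

Rendering the resulting colour next to each asset in the glasses lets an engineer spot the amber bearing at a glance before drilling into the detail.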

Our demonstrator meant that once the engineer put the glasses on they became truly immersed – able to stand back and view the overall status of the equipment, and then get right down to the detail by taking advantage of the animations, technical diagrams and videos.

The team are now exploring how the technology could be used to improve communications in the support environment and dramatically shorten timelines for gaining permissions for critical engineering processes whilst enhancing skills, safety and auditability.
