Interview: Rear Admiral Jim Robb, US National Training and Simulation Association
Rear Admiral Jim Robb, President of the US National Training and Simulation Association, spoke to Mandy Langfield about how simulation technology has changed the way pilots train in different aircraft types, and why the shift to the Metaverse will move the industry forward
As President of the National Training and Simulation Association (NTSA), what does your role involve?
I lead a team that brings government, industry and academia together in support of the development and fielding of cutting-edge training and simulation systems for our warfighters and first responders. NTSA hosts eight events a year that address key elements of the training and simulation community and lead up to the Interservice/Industry Training, Simulation and Education Conference (I/ITSEC), which is the largest training and simulation event in the world (iitsec.org).
How did your naval experience prepare you for the challenge of engaging national security markets with technology providers?
Throughout my 34 years of service in the US Navy, I worked extremely closely with aviation technology providers. I was part of the initial cadre flying the F-14 Tomcat, which had many challenges in its early years. Then, as now, aircrew had to interact closely with vendors to identify and solve anomalies that occurred on the ground and in flight. We also worked with industry to assess threat systems and determine how our own aircraft would need to be modified. At TOPGUN, we spearheaded new real-world capabilities that would be essential to maintaining a strategic advantage. As a flag officer, I was in charge of identifying future national security challenges that would shape requirements across all of aviation. In my current position, I represent a consortium of government, academic and industry entities that work together to define future training requirements and push for alignment of government and industry investment to bring timely and economical capabilities to the force.
As a fighter pilot, how did you see training technology change during your service?
When I started flying the Tomcat, our training was not structured and there were very few systems that would allow you to review performance in the air. That all changed in the mid-1970s, when the Cubic Corporation fielded the Air Combat Maneuvering Range (ACMR). This range tracked all the aircraft in flight with great accuracy by interacting with pods attached to the aircraft. This capability revolutionized aviation training by giving us detailed visualization of flight maneuvers, weapons flyout simulations, and the ability to review the flights in great detail. Previously, all this information had to be ‘remembered’ by the pilots, and you can imagine the differing recollections and disagreements that would drive the debriefs. In the old days, whoever got to the chalkboard won the fight; ACMR brought facts to the table. This system also brought the ability for third-party safety officials to monitor the fights.
You were in command of the Navy Fighter Weapons School, better known as TOPGUN – what were the simulation options available to you then, and what options are available to crews now?
At TOPGUN, we used the instrumented ranges on most flights. We also had video recordings of the radar scopes and gunsight to use in the debriefs. We had the ability to simulate enemy aircraft and tactics as well as the surface-to-air threats to the aircrews, but they weren’t flying against the real threat. Next-generation simulations are coming in the form of Live, Virtual and Constructive (LVC) systems that allow us to present simulated threats to the aircrews electronically from the ground. Tomorrow, with virtually unlimited computing power and data storage capabilities, we will be able to record detailed data twins of the entire ground, sea, air and space activity for a training event. This database will revolutionize how we assess performance and improve training. It will also allow us to assess how we can merge multi-domain operations into a single picture.
It costs a great deal for operators of HEMS and SAR services to buy the latest high-fidelity simulation software; are you seeing more open-source solutions on the market?
There is a real revolution going on in air training through the use of small simulator ‘sleds’ that use virtual reality goggles to allow aircrews to practice procedures and flight maneuvers at very low cost. The sleds are being used in military air training to prepare students for live flight but, perhaps more importantly, to allow them to come back and practice areas they find challenging in their live training. The use of these simulators is reducing the time to train and has almost eliminated attrition in the program, at great savings to the services. We are also engaging the large gaming companies to enter the military training space, which is fairly small compared with the commercial gaming industry.
How much further do you think simulation companies can go in terms of providing ultra-realistic scenarios and environments in which pilots can practice?
Simulators for large aircraft are so good that the first time an airline pilot actually flies an aircraft, there are passengers in the back. For tactical aircraft, simulators are being used to a much higher degree in training due to the inherent classification of the capabilities of the aircraft. Today, you cannot fly an F-35 to its full capability in live airspace due to classification and range constraints. The simulators are also being netted together on highly classified networks, allowing tactics that involve the integration of many platforms to be practiced and repeated. In many cases, these capabilities could not be trained to in the real world.
What was the latest simulation training technology you saw on show at the recent I/ITSEC?
There were two cutting-edge demonstrations at the most recent I/ITSEC that I found very exciting. The first is the evolution of the family of augmented reality/virtual reality (AR/VR) and extended reality (XR) systems that are being fielded. New systems include haptics, which are gloves that allow the aircrews to ‘feel’ the objects they pick up or actuate. Some systems now include smell, heat and motion to enhance realism, and the visuals in the goggles are becoming lifelike.
The second important demonstration I observed involved what is being called the beginnings of the ‘Metaverse’. This is the development of new networks and data management systems that will allow the production of ‘Digital Twin’ environments that you will be able to enter and experience immersively. Here, a simulated threat could be presented to the pilot in the aircraft or in a simulator, to the point where he or she will be looking at what seems to be a real enemy aircraft in their helmet display. This is a generational leap in the use of simulation for training, and one that will be extremely cost-effective and save asset life.