The power of eye-tracking technology for pilot training
Eye-tracking technology could be a game changer for pilot training, giving participants and trainers valuable feedback and ultimately improving safety. Colin Gunn, General Manager, Toll Helicopters, and Pat Nolan, General Manager of Aviation at Seeing Machines, report on the latest developments in this pioneering sector.
Seeing Machines' eye- and face-tracking sensor technology, with a core focus on alertness and attention, provides the data and metrics needed to understand and improve human performance. The technology is unique: a non-wearable, unobtrusive system that can be adapted and integrated into a range of real-world environments to provide new data for optimising training and operational monitoring. It challenges the current way of doing things and offers a new perspective, with data centred on the human element, alertness and performance.
Insufficient knowledge of automation behaviour, mode confusion and loss of awareness, poor scanning techniques, and over-confidence in and over-trust of automation are all recognised as prevalent among today's onshore and offshore helicopter crews. Poor visual scanning signals an emerging split between the pilot and the automated system, which can lead to adverse safety outcomes. Inappropriate use of autopilot modes was cited in the fatal ditching of an Airbus AS332 L2 Super Puma near Sumburgh Airport, UK, in 2013. Addressing and rectifying these deficiencies is, however, difficult.
Seeing Machines has pioneered the development and commercialisation of proprietary algorithms and hardware that help machines interpret the human face and eyes in order to understand a person's state.
System data
The non-intrusive, fully automatic, camera-based technology does not require the user to wear any hardware or sensors. It detects and locates a human face and then tracks it in real time, without any pre-calibration or prior knowledge of the subject. The system can immediately provide a variety of head- and eye-related data, measures and metrics, including gaze tracking, microsleep detection and accurate measurement of pupil diameter. These core signals can be integrated into mutually developed visualisations, or further interpreted to develop higher-level signals that support the carrier's or operator's use case.
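To make the idea of these core signals concrete, the sketch below shows one way such per-frame data might be represented and consumed; the field names, units and microsleep flag are illustrative assumptions, not the Seeing Machines API.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    """One hypothetical per-frame record from a camera-based head/eye tracker."""
    timestamp_s: float         # simulator or wall-clock time of the frame
    gaze_vector: tuple         # unit (x, y, z) gaze direction in cockpit coordinates
    eyelid_aperture_mm: float  # eyelid openness
    pupil_diameter_mm: float   # pupil size, often used as a workload/alertness proxy
    microsleep: bool           # True if a microsleep event is flagged for this frame

def log_microsleeps(samples):
    """Print any frames flagged as microsleeps, e.g. for an instructor display."""
    for s in samples:
        if s.microsleep:
            print(f"[{s.timestamp_s:7.2f} s] microsleep flagged "
                  f"(eyelid aperture {s.eyelid_aperture_mm:.1f} mm)")

# Example usage with made-up values
samples = [
    GazeSample(12.00, (0.10, -0.30, 0.95), 9.8, 3.4, False),
    GazeSample(12.03, (0.10, -0.30, 0.95), 1.2, 3.1, True),
]
log_microsleeps(samples)
```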
Seeing Machines data output, which can be produced in real time or saved for post-run review and debrief, is relevant to several major aviation initiatives, including the following:
- The IATA (International Air Transport Association) Evidence-Based Training (EBT) initiative, launched in 2013 and building on over 20 years of flight operations.
- A new evidence-based paradigm for competency-based training and assessment.
- The aim of EBT: to identify, develop and evaluate the core competencies required by pilots to operate safely, effectively and efficiently in a commercial air transport environment by managing the most relevant threats and errors, based on evidence collected in operations and training.
- Pilot gaze tracking – a key enabler of and contributor to enhanced EBT:
  - New training data, specific to where a trainee is looking.
  - Improved instructional awareness.
  - Support for early, evidence-based identification, evaluation, development/correction and refinement of pilot competencies related to operation of the specific aircraft type.
Background and history
The venture between Toll Helicopters and Seeing Machines began immediately after the PACDEFF 2016 Human Factors Conference in Brisbane in November 2016. During PACDEFF, Seeing Machines and Boeing jointly briefed early results from trials of Seeing Machines technology on a B737 simulator at Brisbane Airport. At the same time, Toll Helicopters was at the peak of its training workload as it mobilised Australia's largest ever aeromedical contract: over a six-month period it had to train 80 pilots and front-seat crewmen in AW139 aeromedical operations at the ACE Training Centre in Sydney, which housed a Level D AW139 Full Flight Simulator. Colin Gunn, then Chief Pilot for Toll, saw an immediate correlation and the potential benefits of eye-tracking technology in helicopter training, and the collaboration commenced.
At about the same time, two further developments supported investment in the technology. Firstly, Gretchen Haskins, CEO of HeliOffshore, a global safety-focused association for the offshore helicopter industry with a mandate to deliver industry-wide safety enhancements through collaboration with and between its members, had recently released HeliOffshore's priorities for 2017. These priorities represented the commitments and investments the industry could make to best enhance aviation safety in oil and gas operations. Several of them could be directly supported by Seeing Machines technology, including:
- Effective design and use of automation, including Flight Crew Operating Manuals (FCOMs) and automation training videos.
- Enhanced situational awareness through better understanding of eye movement.
- Evidence-based training to support effective operational performance.
Secondly, Leonardo Helicopters was preparing to release the first ever FCOM for the AW139 type and was seeking industry input, not only to optimise the FCOM itself but also to identify ways to align training and operations with its guidance.
Results are in
Several trials and studies have been conducted to date. Seeing Machines' agility as an SME, combined with the ready availability of Toll's AW139 Full Flight Simulator, meant the first trial took place in mid-January 2017. The technology was installed in a rudimentary but non-intrusive manner and the results were immediate.
The initial integration provided headtracking and eyelid-behaviour data:
- Headtracking provides precise detection and measurement of the frontal area and sides of a subject's face and head in real time.
- The fully automatic, camera-based Seeing Machines technology returns a comprehensive model of the face that includes the coordinates of all facial features, their current state and their rate of change, along with accurate measures of blink rate and eyelid aperture (a simple sketch of deriving such a measure follows this list).
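As a loose illustration of how a higher-level measure such as blink rate could be derived from a sampled eyelid-aperture signal, the sketch below uses an assumed closure threshold and sample rate; it is not the vendor's actual processing.

```python
# Loose illustration: derive a blink count and blink rate from a sampled
# eyelid-aperture signal. The 2 mm closure threshold and 60 Hz sample rate
# are assumptions for this sketch, not the vendor's actual processing.

def blink_rate_per_min(apertures_mm, sample_hz=60, closed_mm=2.0):
    """Count closed-to-open transitions and convert to blinks per minute."""
    blinks = 0
    was_closed = False
    for a in apertures_mm:
        closed = a < closed_mm
        if was_closed and not closed:
            blinks += 1          # eyelids reopened: one completed blink
        was_closed = closed
    duration_min = len(apertures_mm) / sample_hz / 60.0
    return blinks / duration_min if duration_min else 0.0

# Two seconds of made-up data containing one blink
signal = [9.5] * 60 + [1.0] * 6 + [9.5] * 54
print(f"{blink_rate_per_min(signal):.1f} blinks/min")
```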
Eye gaze tracking:
Eye gaze tracking is the measurement of where a person is looking. By directing a safe, invisible light source at a subject's eye and using a specialised camera to track the resulting glint, the system can determine precisely where, or at what, the subject is looking.
When combined with a precise understanding of the real-world environment, such as the cockpit of a helicopter and a Line Orientated Flight Training (LOFT) scenario, gaze tracking can be used in real-time (or recorded and played back) to help assess exactly how the subject is processing their visual surrounds.
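A minimal sketch of how recorded gaze points might be matched to cockpit areas of interest and summarised as dwell times is shown below; the area names, coordinates and sample data are invented for illustration and do not reflect the actual Toll/Seeing Machines integration.

```python
# Sketch only: cockpit areas of interest (AOIs) are modelled as simple
# rectangles on a 2D panel plane, and each gaze point is classified
# against them to build a dwell-time summary for debrief.

AOIS = {
    "PFD":         (0.10, 0.30, 0.20, 0.40),  # (x_min, x_max, y_min, y_max) in metres
    "ND":          (0.32, 0.52, 0.20, 0.40),
    "windscreen":  (0.00, 1.00, 0.55, 1.00),
    "chin_bubble": (0.00, 1.00, 0.00, 0.15),
}

def classify_gaze(x, y):
    """Return the name of the AOI containing the gaze point, or 'other'."""
    for name, (x0, x1, y0, y1) in AOIS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "other"

# Dwell-time summary over a recorded run (timestamps and gaze points are made up)
track = [(0.00, 0.15, 0.30), (0.03, 0.16, 0.31), (0.06, 0.40, 0.62)]
dwell = {}
prev_t = track[0][0]
for t, x, y in track:
    aoi = classify_gaze(x, y)
    dwell[aoi] = dwell.get(aoi, 0.0) + (t - prev_t)
    prev_t = t
print(dwell)
```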
In the initial trials, students benefited from improved feedback from flight examiners and instructors on their use of checklists in all normal phases of flight and their handling of emergency procedures, both live during the training sortie and afterwards in the debrief using recorded material. Conversely, the flight examiners also noticed students (some with 10-15 years of aeromedical, SAR and/or offshore flying experience) managing their instrument flying (IF) scan pattern, or dealing with malfunctions, in alternative but perhaps more effective ways than traditionally taught.
The initial study offered access to naturalistic pilot behaviours that would not normally be visible to an instructor or evaluator. The eye-tracking sensor suite expands the understanding of native operator behaviour, enabling continuous improvement of the training system and its methods.
The initial study also prompted some tweaks to camera placement and the incorporation of an audio feed, and revealed the value of a rear-of-cabin video feed for the bigger picture, capturing arm and hand movements alongside the scan pattern. A further short trial in mid-February 2017 implemented these enhancements and confirmed their suitability.
The predominant impression from the second study was how much 'contextual data' eye tracking supplied beyond what observer assessment of a pilot and/or crew could give. The evaluators (instructors) were able to add considerable context to the performance of the subjects (students). The second study also reiterated that the gap between 'work as imagined' and 'work as done' raises a core question: are the subjects doing something wrong that needs correcting, or are the training strategies simply ignoring the way expert practitioners behave on the line? In short, do we correct the trainee or the training?
The results of these initial trials have largely remained in-house while the technology integration is formalised. Toll's helicopter team continues to use learnings from the integration to better understand how pilots monitor cockpit instruments and use automation during flight, and to complement the company's focus on training to the FCOM.
A further study was conducted in August 2017, in which a discrete set of emergency procedures was selected, including engine, electrical and avionics malfunctions, to evaluate and optimise aircrew response in a variety of settings, including runways and offshore helidecks, by day and by night (unaided). Again, the results were outstanding.
At its simplest, what these initial trials have shown is that knowing where a pilot is looking is 'interesting', but when that scan behaviour is matched to aircraft context, the results are 'compelling'.
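A simplified, hypothetical example of that pairing: given a log of autopilot mode changes from the simulator and a log of gaze samples already classified by area of interest, flag any mode change that was not followed by a glance at the flight mode annunciator within a short window. The events, labels and three-second threshold below are invented for illustration and do not represent the trial data.

```python
# Hypothetical illustration of pairing gaze data with aircraft context:
# flag autopilot mode changes that are not followed by a glance at the
# flight mode annunciator (FMA) within a short confirmation window.

CONFIRM_WINDOW_S = 3.0    # assumed threshold, for illustration only

mode_changes = [          # (time_s, new_mode) from the simulator data feed
    (102.0, "ALT"),
    (148.5, "VS"),
]
gaze_log = [              # (time_s, aoi) from the eye tracker, already classified
    (101.0, "PFD"), (103.2, "FMA"), (149.0, "windscreen"), (155.0, "ND"),
]

for t_change, mode in mode_changes:
    confirmed = any(
        aoi == "FMA" and t_change <= t_gaze <= t_change + CONFIRM_WINDOW_S
        for t_gaze, aoi in gaze_log
    )
    status = "confirmed" if confirmed else "NOT visually confirmed"
    print(f"Mode change to {mode} at {t_change:.1f}s: {status}")
```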
Looking forward
A key focus for the next phase of investigation will be a more effective application layer that supports the instructor and fits into the workflow of both instructor and pilot/crew. The collaboration will also look to identify more precisely the key areas of interest, across the instruments, windscreen and chin bubble, to ensure accurate attention capture and that the most relevant information is presented at the most appropriate time.
Toll and Seeing Machines will present their findings at the PACDEFF Human Factors Conference in September 2019 on the Gold Coast, Queensland.
Colin Gunn
Colin is General Manager of Toll Aeromedical / Helicopters
Pat Nolan
Patrick leads the Seeing Machines aviation business, and focuses on consultation with manufacturers, Tier 1 companies, operators, carriers and air navigation service providers (ANSPs).