Artificial intelligence is helping pilots to learn from their mistakes through targeted teaching

Big data and artificial intelligence (AI) are helping FlightSafety International instructors provide more effective, targeted training – by detailed analysis of a pilot’s performance based on behaviours and actions sometimes invisible to an expert human eye. However, the benefits of the technology may not always be immediately clear to those doing the flying.

“One of the concerns is likely to be: ‘Why is a computer grading me?’,” says Chris Starr, senior product manager for FlightSmart®, an AI-based training tool developed by FlightSafety with IBM. “It’s not. FlightSmart is an enhancement. It gives the instructor an additional tool to help get the pilot to proficiency. The goal is not machine-based grading, but making the instructor even more effective.”

[Image: FlightSafety FlightSmart. Source: FlightSafety International. The programme is an entirely new way of working for many instructors]

FlightSafety unveiled FlightSmart in November 2019 as a method of improving teaching and enhancing safety “through automated, intelligent and objective training.” It blends AI and machine learning to evaluate a pilot’s ability to perform critical tasks and manoeuvres, before creating a “customized corrective action training path that addresses any identified deficiencies.”
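To make the idea concrete, the sketch below shows one way a corrective training path could be assembled from per-manoeuvre scores. The manoeuvre names, threshold and remedial modules are invented for illustration only and do not describe FlightSmart’s actual model.

```python
# Hypothetical sketch of a corrective-action training path.
# Manoeuvre names, scores and thresholds are invented for illustration;
# they do not reflect FlightSmart's actual scoring model.

PROFICIENCY_THRESHOLD = 0.80  # assumed pass mark on a 0-1 scale

# Assumed mapping from a deficient manoeuvre to remedial training modules
REMEDIATION = {
    "steep_turn": ["bank-control drill", "altitude-hold drill"],
    "engine_out_approach": ["energy-management review", "single-engine pattern work"],
    "crosswind_landing": ["rudder-coordination drill", "gusty-crosswind scenario"],
}

def build_training_path(scores: dict[str, float]) -> list[str]:
    """Return an ordered list of remedial modules, weakest manoeuvre first."""
    deficiencies = sorted(
        (item for item in scores.items() if item[1] < PROFICIENCY_THRESHOLD),
        key=lambda item: item[1],
    )
    path = []
    for manoeuvre, _score in deficiencies:
        path.extend(REMEDIATION.get(manoeuvre, [f"instructor review: {manoeuvre}"]))
    return path

print(build_training_path({"steep_turn": 0.72, "crosswind_landing": 0.91,
                           "engine_out_approach": 0.65}))
```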

[Image: FlightSmart pilot dashboard example. Source: FlightSafety International. The FlightSmart tool maps the pilot's input for later analysis]

FlightSafety has been using its technology to evaluate pilots on 16 Textron Aviation Beechcraft T-6A initial and operational training devices, as part of a contract with the US Air Force’s Air Education and Training Command at Columbus Air Force Base in Mississippi.

The company is now introducing FlightSmart into its other markets, including business aviation.

Starr says his FlightSafety colleagues had the idea to approach IBM in 2017 after realising “how much data we had in our simulators that we were otherwise discarding.” They were convinced that by partnering with a data science specialist, they could transform that surplus information into a “tool for instruction and client debrief.”

He believes FlightSmart, a proprietary FlightSafety product, will speed the sector’s transition away from qualitative instruction built around a fixed syllabus towards competency- and evidence-based training. This shift to identifying and focusing on the areas that need most attention is a cultural change in training philosophy that regulators and other industry bodies are keen to foster.

Starr explains how the technology works in practice. A session in the simulator proceeds as normal: the client knows that FlightSmart is mapping their every input for later analysis, but is not burdened by it, and is still monitored by an instructor in real time. Because the recording happens behind the scenes through AI algorithms, the instructor is not distracted and can remain totally focused on the pilot.
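As an illustration of what “mapping every input” might involve, the following sketch samples a handful of assumed flight parameters in the background and writes them to a log for later analysis. The parameter names, sample rate and simulator interface are hypothetical, not FlightSmart’s actual recording pipeline.

```python
# Minimal sketch of background telemetry capture during a simulator session.
# The parameter names, sample rate and data source are assumptions for
# illustration only.
import json
import time
from random import uniform

SAMPLE_HZ = 10          # assumed sampling rate
PARAMETERS = ["pitch_deg", "roll_deg", "airspeed_kt", "pitch_input", "roll_input"]

def read_simulator_state() -> dict:
    """Stand-in for a real simulator interface; returns random values here."""
    return {name: round(uniform(-1, 1), 3) for name in PARAMETERS}

def record_session(duration_s: float, out_path: str) -> None:
    """Append time-stamped samples to a log file for post-session analysis."""
    with open(out_path, "w") as log:
        start = time.time()
        while time.time() - start < duration_s:
            sample = {"t": round(time.time() - start, 2), **read_simulator_state()}
            log.write(json.dumps(sample) + "\n")
            time.sleep(1 / SAMPLE_HZ)

record_session(duration_s=2.0, out_path="session_log.jsonl")
```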

Detailed debrief

Once the session is over, the recording is stopped and the data sent to a secure cloud, where algorithms process the information, explains Starr. Two to five minutes later, FlightSmart’s assessment of the pilot’s performance is available on a dashboard, where the instructor uses it as part of the debrief.
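The post-session flow Starr describes – upload, cloud processing, then results on a dashboard a few minutes later – could be sketched along the lines below. The endpoints and response fields are purely invented, since FlightSafety’s service interface is not public.

```python
# Hypothetical sketch of the post-session flow: upload the recording to a
# cloud endpoint, then poll until the processed assessment is ready.
# URLs and response fields are invented placeholders.
import time
import requests  # third-party HTTP client, used here for brevity

UPLOAD_URL = "https://example.invalid/flightsmart/sessions"       # placeholder
RESULT_URL = "https://example.invalid/flightsmart/sessions/{id}"  # placeholder

def submit_and_wait(log_path: str, timeout_s: int = 300) -> dict:
    """Upload a session log, then poll for the processed assessment."""
    with open(log_path, "rb") as f:
        session_id = requests.post(UPLOAD_URL, files={"log": f}).json()["id"]
    deadline = time.time() + timeout_s        # article cites two to five minutes
    while time.time() < deadline:
        result = requests.get(RESULT_URL.format(id=session_id)).json()
        if result.get("status") == "complete":
            return result["assessment"]       # shown on the instructor dashboard
        time.sleep(15)
    raise TimeoutError("assessment not ready within timeout")
```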

Training instructors with a “robust” programme of their own is crucial, as, for many, this may be a new way of working, adds Brian Moore, FlightSafety’s senior vice president of operations. “It is critical that instructors are using this well,” he says. “We really wanted to be sure they were proficient and felt comfortable with it.”

Although FlightSmart is still in the process of being rolled out, its effectiveness will improve the more it is used. “As we capture anonymous data from more and more pilots, the system itself will become smarter,” says Starr. “We are spending hundreds of hours to identify what data we need to be aware of, and how that feeds our algorithms.”

There is much scope for future enhancements. “You could take it to a different level,” says Starr. “Imagine you are the head of a flight department with 10 to 12 pilots. Upon request, and with the appropriate data privacy controls, historical performance from all these pilots could be aggregated to recognise patterns and gain insights into areas where we could help you improve: for example, if your pilots have a tendency to come in too fast on approach.”
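A minimal sketch of that kind of fleet-level aggregation might look like the following, assuming anonymised approach records and an invented speed tolerance; none of the figures are FlightSafety’s.

```python
# Hypothetical sketch of fleet-level aggregation: pool approach records from a
# flight department's pilots and flag a shared trend such as excess speed over
# the runway threshold. Field names and limits are invented.
from statistics import mean

APPROACH_SPEED_TOLERANCE_KT = 5  # assumed allowable excess over reference speed

# Each record: (pilot_id, indicated airspeed at threshold, reference speed)
approaches = [
    ("pilot_01", 128, 120), ("pilot_01", 131, 120),
    ("pilot_02", 126, 118), ("pilot_03", 129, 121),
]

excess_by_pilot: dict[str, list[float]] = {}
for pilot, ias, vref in approaches:
    excess_by_pilot.setdefault(pilot, []).append(ias - vref)

fast_pilots = {p: mean(v) for p, v in excess_by_pilot.items()
               if mean(v) > APPROACH_SPEED_TOLERANCE_KT}

if len(fast_pilots) / len(excess_by_pilot) > 0.5:
    print("Department-wide trend: approaches flown too fast", fast_pilots)
```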

Ultimately, Starr believes that initiatives such as FlightSmart could help re-frame training requirements. He gives an example: “We could go to the regulators with data that suggests that pilots who can perform an upset recovery manoeuvre and a windshear escape manoeuvre are 96% competent at unusual attitudes, thereby relieving them of unnecessary training in the future.”
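The arithmetic behind a claim like that could resemble the sketch below, which computes the proportion of pilots proficient at both manoeuvres who are also proficient at unusual attitudes. The records are invented purely to show the calculation, not drawn from FlightSafety data.

```python
# Hypothetical calculation of a conditional competency rate: among pilots
# graded proficient at both upset recovery and windshear escape, what fraction
# were also proficient at unusual-attitude recovery? Records are invented.
records = [
    # (upset_recovery_ok, windshear_escape_ok, unusual_attitudes_ok)
    (True, True, True), (True, True, True), (True, True, False),
    (True, False, True), (False, True, True), (True, True, True),
]

both_ok = [r for r in records if r[0] and r[1]]
rate = sum(1 for r in both_ok if r[2]) / len(both_ok)
print(f"{rate:.0%} of pilots proficient at both manoeuvres "
      "were also proficient at unusual attitudes")
```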

As with any new technology when it comes to training pilots, Starr, a former major and helicopter instructor with the US Marine Corps, admits that there will always be some reluctance to embrace change. But he insists that by “focusing on the value” of FlightSmart, sceptics can be won over.

“Ultimately, the simulator is a big computer, and we are putting all that data into something that is usable, that can provide objective root cause analysis and see where a student is having issues,” he says. “We will make the case that this is not a way of beating you about the head for what you didn’t do right. Instead, it is all about making you a better, safer pilot.”