Embry-Riddle and Eclipse Aerospace Build AI Tool to Cut Pilot Workload
Embry-Riddle Aeronautical University and Eclipse Aerospace are developing an AI system that listens to pilot-to-air traffic control communications and extracts critical flight instructions in real time. The tool uses speech-to-text and natural language processing to capture commands such as heading, altitude, speed, and frequency, then displays them so pilots can verify the information before it is sent to the avionics.
Eclipse Aerospace is funding the project and providing flight test infrastructure. The research team includes Dr. Andrew Schneider, director of Flight Research at Embry-Riddle, and Dr. Jianhua Liu, an associate professor of electrical and computer engineering.
How It Works
The system processes voice transmissions through automatic speech recognition to identify specific ATC instructions. Natural language processing then extracts the key data points and presents them on the flight deck.
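The article does not describe the team's implementation, but the extraction step it outlines can be illustrated with a minimal sketch: a pattern-matching pass over an ASR transcript that pulls out heading, altitude, speed, and frequency values. All function names and patterns below are hypothetical, and a production system would need far more robust handling of non-standard phraseology and recognition errors.

```python
import re

# Hypothetical patterns for common ATC instruction phraseology.
# A real system must cope with readbacks, clipped audio, and
# non-standard phrasing; these cover only textbook forms.
PATTERNS = {
    "heading": re.compile(r"\b(?:fly\s+|turn\s+(?:left|right)\s+)?heading\s+(\d{3})\b"),
    "altitude": re.compile(r"\b(?:climb|descend)(?:\s+and\s+maintain)?\s+(\d{1,2},?\d{3})\b"),
    "speed": re.compile(r"\b(?:maintain|reduce\s+speed\s+to)\s+(\d{2,3})\s+knots\b"),
    "frequency": re.compile(r"\bcontact\s+\w+(?:\s+\w+)*?\s+on\s+(\d{3}\.\d{1,3})\b"),
}

def extract_instructions(transcript: str) -> dict:
    """Pull structured instruction values from a speech-to-text transcript."""
    transcript = transcript.lower()
    found = {}
    for name, pattern in PATTERNS.items():
        match = pattern.search(transcript)
        if match:
            # Normalize values (e.g., strip thousands separators in altitudes).
            found[name] = match.group(1).replace(",", "")
    return found

# Example transcript (invented for illustration):
result = extract_instructions(
    "Delta 123, turn left heading 270, descend and maintain 5,000, "
    "contact approach on 124.35"
)
```

In this sketch the extracted values would then be rendered on the flight deck display for the pilot to confirm, consistent with the human-in-the-loop design the article describes.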
Pilots retain full control. They review the extracted information and decide whether to execute the command.
The Challenge
Cockpit environments are noisy. Radio signals degrade. Pilots don't always follow standard phraseology.
Schneider said the team must "parse out these instructions in a really messy environment." Early testing in the lab and in actual flight operations shows the speech recognition working with high accuracy, and natural language processing development is on track.
What Comes Next
Researchers will measure whether the tool actually reduces pilot workload. A human factors study will track pilot performance, workload levels, and trust in the automation.
The team will also examine how pilots interact with AI on the flight deck and whether those interactions affect performance.
"This collaboration allows us to explore how human-AI teaming can strengthen decision-making," Schneider said.