Thales Aerospace is busy developing the flight decks for the Sukhoi Superjet 100, ATR 600 series, Sikorsky S-76D and Airbus A350 at its Toulouse facility. At the same time, the company is working to visualize what the cockpit of a next-generation widebody might look like 20 years from now. The biggest potential breakthrough from this could be single-pilot operations for commercial aircraft.
So what current consumer technologies might find their way onto the flight decks of the future? In a press presentation just ahead of the Farnborough airshow, Dennis Bonnet, head of safety and human engineering in Thales’ cockpit center of competence, pointed to examples including the Philips Research Intelligent Shopwindow concept, jewelry that changes color to reflect the wearer’s mood and the driver monitoring system available on Lexus automobiles.
This is what Thales categorizes as “cockpit 3.0” in a classification that ranks the Concorde, with its electromechanical instruments and a processor and display for each sensor, as cockpit 1.0, and glass cockpits with information merged into displays as version 2.0.
The irony is that while the pilot using cockpit 1.0 had a limited number of tools, he could master them completely and understand the cause of problems. The more capable cockpit 2.0 in some ways makes life harder for pilots, leaving them little to do when things are going well but proving complicated to handle when something goes wrong.
“It requires too much training,” Bonnet said. “The cockpits are designed by very clever engineers but managed by pilots from diverse backgrounds. There is too much opportunity for human error and too many complex functions that are not used.” Cockpit 3.0, accordingly, needs to be crew-centric, “using the benefits of the crew’s strengths and helping them manage their weaknesses.”
Along with intelligent interfaces such as the Philips Intelligent Shopwindow, Bonnet pointed to helpful tools such as the Vodafone Android smartphone, which uses GPS to show an image of the real world behind the screen, and automobile engine stop/start buttons.
Other advances include more extensive networking along with interactive languages. “Until 2000 we had only keyboards,” Bonnet said. Since then, advances such as touch screens, multi-touch systems and 3-D interfaces have redefined interactivity to the extent that “if you give an iPhone owner an old phone, he will think the screen is broken.”
So cockpit 3.0 is likely to feature intelligent interfaces that deduce what the pilot wants to do and help him do it; it would probably also monitor crew safety. “An eye tracker, for example,” said Bonnet, “would see what the pilot is looking at and know it’s not the right tool for the problem, directing him to the right tool or even removing the wrong one from view.”
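To make the idea concrete, the logic Bonnet sketches could look something like the following minimal Python sketch. It is purely illustrative: the problem names, page names and the problem-to-tool mapping are invented here and do not reflect any actual Thales design.

# Hypothetical sketch of the eye-tracker assistance Bonnet describes: compare
# the display element the pilot is fixating on with the tool the current
# problem calls for, then suggest the right one. All names and mappings are
# invented for illustration.

# Which cockpit tool is appropriate for each (simplified) problem.
RIGHT_TOOL_FOR_PROBLEM = {
    "engine_fire": "engine_page",
    "cabin_depressurization": "pressurization_page",
    "fuel_imbalance": "fuel_page",
}

def advise(problem: str, gazed_tool: str) -> str:
    """Return guidance based on where the pilot is looking."""
    needed = RIGHT_TOOL_FOR_PROBLEM.get(problem)
    if needed is None:
        return "No active problem: leave the display unchanged."
    if gazed_tool == needed:
        return f"Pilot is already on {needed}: no intervention."
    # The interface could highlight the right tool, or de-emphasize the wrong one.
    return f"Pilot is looking at {gazed_tool}; highlight {needed} and dim {gazed_tool}."

if __name__ == "__main__":
    print(advise("fuel_imbalance", "weather_radar_page"))
    print(advise("fuel_imbalance", "fuel_page"))

In a real system the gaze target would come from the eye tracker and the active problem from the aircraft’s monitoring functions; the point of the sketch is simply the comparison between where attention is and where it needs to be.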
Interfaces would also be intuitive, providing a synthetic-view or combined-vision system to help the pilot navigate safely and manage the mission. And they would provide user-centric system management for pilots who are not engineers. “A lot of what is displayed is engineering stuff,” Bonnet said. “Pilots don’t care about it and it costs a lot to maintain their skill.” Losing the main galley at the beginning of a flight, for instance, can be a serious issue. “You could have a display to suggest how to reallocate power in order to keep the galley,” he said.
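A crude sketch of such a power-reallocation aid might look like the following, again in Python and purely hypothetical: the loads, priorities and kilowatt figures are invented, and a real electrical load-management function would be far more sophisticated.

# Hypothetical sketch of the user-centric power-reallocation aid: after losing
# a generator, suggest which lower-priority loads to shed so that a load the
# crew cares about (the galley) can stay powered. All figures are invented.

AVAILABLE_KW = 60.0  # capacity remaining after the failure (assumed)

# (name, power draw in kW, priority: lower number = more important)
LOADS = [
    ("flight_instruments", 10.0, 0),
    ("fuel_pumps",          8.0, 0),
    ("galley",             25.0, 2),
    ("cabin_entertainment", 20.0, 3),
    ("secondary_lighting",  12.0, 3),
]

def suggest_reallocation(keep: str) -> dict:
    """Greedy sketch: power essential loads first, then the load the crew
    wants to keep, then whatever else still fits within the budget."""
    ordered = sorted(LOADS, key=lambda l: (l[0] != keep and l[2] != 0, l[2]))
    budget, powered, shed = AVAILABLE_KW, [], []
    for name, kw, _priority in ordered:
        if kw <= budget:
            budget -= kw
            powered.append(name)
        else:
            shed.append(name)
    return {"powered": powered, "shed": shed, "spare_kw": budget}

if __name__ == "__main__":
    print(suggest_reallocation(keep="galley"))

The display Bonnet imagines would present the outcome of that kind of calculation as a plain recommendation, for example “shed cabin entertainment to keep the galley,” rather than as raw electrical-system data.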
There will be new interactive languages for touch screens and localized or 3-D sound. And there will be dematerialization, with several small screens replaced by one big one plus a head-up display. The upshot will be a cockpit that is safer, simpler, easier to train for and smaller.