An advanced helicopter cockpit free of the traditional trio of controls could be on the way if Airbus Helicopters opts to advance the development of an automated flight-control system that was successfully trialled in recent flight tests.

Performed by innovation arm Airbus UpNext, the Vertex project integrated a suite of sensors and high-power flight-control computers, coupled to a four-axis autopilot, into the airframer’s H130-based FlightLab demonstrator.


Source: Airbus Helicopters

Tests were performed using H130 FlightLab demonstrator

Ultimately, the goal of the project was to simplify mission preparation and management, reduce helicopter pilot workload, and further increase safety.

The test flights, which began in the summer, culminated on 22 November with a 1h sortie during which the FlightLab flew a fully automated mission along a pre-defined route, including lift-off, taxi, take-off, cruise, approach and then landing.

Rather than the standard helicopter controls – the collective and cyclic levers, and the foot pedals – the crew were able to input instructions using a touchscreen tablet computer backed up by a primary display.

Automated departures saw the Vertex-equipped FlightLab taxi at a height of 7ft, along a 2m-wide corridor, from Airbus Helicopters’ Marignane base in southern France towards the main runway at Marseille airport, stopping at predetermined intersections before proceeding to take off.

“It was probably a world premiere to have such precision along a corridor that was two metres-wide,” says Alexandre Gierczynski, head of the Vertex demonstrator.
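To give a rough sense of what holding such a corridor implies, the hypothetical Python sketch below checks a cross-track error against a 1m half-width about the planned centreline; the names, reference frame and tolerance are illustrative assumptions, not details of the Vertex system.

```python
# Illustrative sketch only: a simplified cross-track check for staying inside
# a 2m-wide taxi corridor. Names and tolerances are hypothetical, not Airbus's.
from dataclasses import dataclass
import math

CORRIDOR_HALF_WIDTH_M = 1.0  # a 2m corridor implies +/- 1m about the centreline


@dataclass
class Point:
    x: float  # metres east of a local reference
    y: float  # metres north of a local reference


def cross_track_error(pos: Point, seg_start: Point, seg_end: Point) -> float:
    """Perpendicular distance from `pos` to the corridor centreline segment."""
    dx, dy = seg_end.x - seg_start.x, seg_end.y - seg_start.y
    length = math.hypot(dx, dy)
    # 2D cross product of the segment direction and the vector to the position
    return abs(dx * (pos.y - seg_start.y) - dy * (pos.x - seg_start.x)) / length


def inside_corridor(pos: Point, seg_start: Point, seg_end: Point) -> bool:
    return cross_track_error(pos, seg_start, seg_end) <= CORRIDOR_HALF_WIDTH_M


# Example: 0.4m left of a north-running centreline segment -> still inside
print(inside_corridor(Point(-0.4, 5.0), Point(0.0, 0.0), Point(0.0, 10.0)))  # True
```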

As an additional safety feature, a LIDAR sensor is able to detect potential ground obstacles, automatically bringing the helicopter to a halt before a collision if the pilot does not select an alternate route suggested by the computer.
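The decision described – stop short of a detected obstacle unless the pilot has accepted the suggested reroute – might be summarised along the lines of the sketch below; the stopping margin, names and structure are assumptions for illustration, not Airbus UpNext's implementation.

```python
# Minimal sketch of the obstacle-handling decision described above; the
# threshold and names are invented for illustration.
from enum import Enum, auto

STOP_MARGIN_M = 15.0  # hypothetical distance at which the taxi is halted


class TaxiAction(Enum):
    CONTINUE = auto()
    FOLLOW_ALTERNATE = auto()
    STOP = auto()


def resolve_obstacle(obstacle_range_m: float | None,
                     pilot_accepted_alternate: bool) -> TaxiAction:
    """Halt before a lidar-detected obstacle unless the pilot has selected
    the alternate route suggested by the system."""
    if obstacle_range_m is None:           # nothing detected along the corridor
        return TaxiAction.CONTINUE
    if pilot_accepted_alternate:           # pilot approved the suggested reroute
        return TaxiAction.FOLLOW_ALTERNATE
    if obstacle_range_m <= STOP_MARGIN_M:  # no decision taken: stop short of it
        return TaxiAction.STOP
    return TaxiAction.CONTINUE             # obstacle still far ahead


print(resolve_obstacle(12.0, pilot_accepted_alternate=False))  # TaxiAction.STOP
```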

Once airborne, the same LIDAR system is combined with an array of electro-optical/infrared cameras and an image-processing system – building on technologies demonstrated by Airbus Helicopters through its earlier Eagle project – to build up a colour-coded picture of the terrain surrounding the helicopter on a specially configured display. A collision avoidance system also plots a safe path away from obstacles in flight.
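A colour-coded terrain picture of this kind could, in principle, be driven by something as simple as banding each terrain or obstacle cell by its vertical clearance below the aircraft; the bands and thresholds below are invented for illustration and do not reflect the actual display logic.

```python
# Toy illustration of colour-coding a terrain/obstacle grid by vertical
# clearance below the aircraft; thresholds and colour bands are invented.
def clearance_colour(cell_elevation_ft: float, aircraft_alt_ft: float) -> str:
    clearance = aircraft_alt_ft - cell_elevation_ft
    if clearance <= 0:
        return "red"      # cell is at or above the aircraft: immediate hazard
    if clearance < 100:
        return "amber"    # marginal clearance
    return "green"        # comfortable clearance


# Elevation samples (ft) around the aircraft, e.g. from lidar/camera fusion
grid = [[250, 320, 480],
        [260, 400, 510],
        [240, 390, 620]]
aircraft_alt_ft = 500
coloured = [[clearance_colour(cell, aircraft_alt_ft) for cell in row] for row in grid]
print(coloured)
```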

Images captured are also used to augment the existing Synthetic Vision System (SVS), creating a higher-fidelity representation of the surroundings.

The utility of the SVS can be further bolstered through the addition of specific data sets, such as the location of powerlines – a well-known hazard for helicopters.

Using the tablet, the pilot can also select a landing site and the Vertex system will automatically calculate – and, once the pilot confirms, fly – the optimum approach and landing. A related emergency landing function is also incorporated, adds Gierczynski.

If an emergency landing is needed, the Vertex system displays the nearest known landing site. Someone on board the aircraft still has to approve the selection, but once that is done the helicopter automatically descends to land safely.
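The flow Gierczynski describes – propose the nearest known site, wait for approval, then descend – can be sketched as follows; the site database, coordinates, distance helper and approval callback are all hypothetical.

```python
# Hedged sketch of the emergency-landing flow described above: pick the nearest
# known site, but descend only once someone on board has approved it.
import math


def haversine_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles."""
    r_nm = 3440.065
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r_nm * math.asin(math.sqrt(a))


known_sites = [  # illustrative database of surveyed landing sites
    {"name": "Helipad A", "lat": 43.44, "lon": 5.22},
    {"name": "Field strip north", "lat": 43.52, "lon": 5.30},
]


def propose_emergency_site(lat, lon):
    return min(known_sites, key=lambda s: haversine_nm(lat, lon, s["lat"], s["lon"]))


def handle_emergency(lat, lon, pilot_approves) -> str:
    site = propose_emergency_site(lat, lon)        # display nearest known site
    if pilot_approves(site):                       # crew must validate the choice
        return f"descending to {site['name']}"     # automated descent and landing
    return "awaiting alternative selection"


print(handle_emergency(43.45, 5.23, pilot_approves=lambda s: True))
```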


Source: Airbus Helicopters

Simplified human machine interface comprises touchscreen tablet and primary flight display

Although a high level of autonomy was achieved, he says it was always the project’s intention to “keep the pilot as the captain of the aircraft”; suggested manoeuvres must be approved and can also be overridden if the pilot wants to take charge. 

“The pilot always remains in control of the system and has to validate and monitor what’s going on in the aircraft,” he says.

To simplify the cockpit displays, Airbus UpNext took advice from those at the coal face: “We worked a lot with our test crews and we changed and adapted all along the project what we display and how we display it to make sure we show enough but not too much,” says Gierczynski.

Although the tablet is the primary control interface, it is also equipped with two miniature joysticks – not unlike those you would find on a games console – in case vibration levels made the touchscreen hard to use. But in the end, says Gierczynski, the feature was not needed.

A “smart radio” is also included in Vertex, which automatically selects the correct setting depending “on where you are and where you are going”, he says, adding: “Radio management can represent 80% of pilot workload.”
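Conceptually, such a "smart radio" amounts to a lookup keyed on flight phase and location; the frequencies and phase names in the sketch below are placeholders for illustration, not real Marseille or Marignane assignments.

```python
# Illustrative "smart radio" logic only: choose a frequency from the flight
# phase and location. All values here are placeholders, not real assignments.
FREQUENCIES_MHZ = {
    ("ground", "Marseille"): 121.80,   # placeholder ground frequency
    ("tower", "Marseille"): 118.10,    # placeholder tower frequency
    ("enroute", None): 120.50,         # placeholder information frequency
}


def select_frequency(phase: str, airfield: str | None) -> float:
    """Return the frequency to pre-tune for the current phase and location."""
    return FREQUENCIES_MHZ.get((phase, airfield), FREQUENCIES_MHZ[("enroute", None)])


print(select_frequency("ground", "Marseille"))  # 121.8 pre-tuned while taxiing
```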

Flights of the Vertex-equipped FlightLab began in July, although at that point the sensors were active but not coupled to the flight-control system. Tests with Vertex working as planned began in September, with the main evaluation period running from 27 October until 22 November. In all, around 10 flights of 1-1.5h each were completed.

“We had a very mature product after 7h of flight and just made some incremental improvements to reach the level we had [on the final flight],” says Gierczynski.

Technologies matured through the Vertex flight demonstrations could go on to equip any rotorcraft – although light helicopters are seen as a particular target – or electric vertical take-off and landing aircraft such as the CityAirbus NextGen.

Airbus Helicopters says it will continue to mature the different technologies from Vertex: vision-based sensors and algorithms for situational awareness and obstacle detection; fly-by-wire for enhanced autopilot; and an advanced human-machine interface in the form of a touchscreen and head-worn display for in-flight monitoring and control.

A separate project using the H130 FlightLab has also tested a fly-by-wire control system and a single joystick control lever.

A three-year project, Vertex represents a double-digit-million euro investment that was part funded by the French civil aviation regulator DGAC.