Synthetic vision: WYSIWYHROT*
The cockpit of the future will still face the challenge of making the pilot focus on the outside world when appropriate.
Synthetic vision has been enthusiastically talked about for all of this century. Leading manufacturers have already developed certified versions, used as situational awareness tools from 2006 onwards and adopted by air cargo operators, military transports and fighters, as well as high-end GA aircraft, with business aircraft and advanced rotorcraft now flying with tailored versions.
And by 2020 it’s tipped to become a standard installation in airliners. It won’t be retrofitted to current designs as such; rather, NASA predicts that when the next clean-sheet designs emerge beyond the Boeing 777X and the Airbus A350-1000, versions of the synthetic vision system (SVS) will serve as the primary flight and taxi navigational instrument for pilots from gate to gate. (Manufacturers typically roll out newly designed airliners about every 10 years.)
SVS creates a virtual-reality world outside and ahead of the aircraft: a 3-D rendering of runways, terrain and obstacles as if seen at midday on a sunny day, with the unfolding optical view ahead artificially creating a sense of aircraft movement. This imparts an intuitive sense of orientation intended to give pilots continuous situational awareness, as if they could see past the real outer world of darkness and buffeting squalls to a sunny-day view of the same area, illuminated on the primary flight display (PFD).
Alternative displays can be rendered on the windscreen straight ahead as a head-up display (HUD), or even as a helmet-mounted display (HMD).
At present, elements that will be standard in synthetic systems are still being pondered from a human-factors perspective: which colours to use to depict features such as oceans, mountains and undeveloped land, and whether pilots ought to have some control over brightness, contrast, resolution and opacity. Opacity is an important consideration when the virtual world is combined with digital flight instrumentation (probably with HUD-type symbology superimposed on the LCD screen and set to infinity for pilot focus).
Also appearing on the screen can be a flight plan shown both graphically and spatially in 3-D. How much information is too much is another design question for the future SVS—could pilots be overwhelmed by excess clutter?
NASA first conceived the idea of developing the SVS in the early 1990s, regularly discussing the practicalities and ergonomics of a standardised visual system with the FAA, the USAF, and scientists and engineers from highly regarded universities.
In 1998 a number of stakeholders combined to form CAST, the Commercial Aviation Safety Team, comprising experts from government and manufacturers. The team’s initial goal was to identify the causes of controlled flight into terrain (CFIT) accidents and then to reduce their number. Analysis revealed that pilots tended to lose situational awareness and spatial orientation in reduced visibility, such as at night and in closed-in weather. NASA identified fog, haze, rain, snow and darkness as the common elements impeding visibility and leading to disorientation.
Other stakeholders have come on board: the airframe manufacturers Boeing, Airbus, Bombardier and Embraer, as well as original equipment manufacturers (OEMs, in the prosaic jargon of the aviation industry) such as Honeywell, Rockwell Collins, Universal Avionics and Garmin. These avionics leaders, through their own rigorous R&D programmes, along with safety agencies such as the FAA, EASA and Transport Canada, will have a major say in what the imagery of the SVSs of the 2020s will look like and, to a fine degree, how the visual systems will perform.
CAST conducted a four-year study from 2009 and concluded that the way to lower CFIT accidents was to further develop and refine the SVS. Although CAST is a voluntary body without any overt authoritative clout, its standards will eventually morph into guidelines and rules promulgated by regulators.
CAST now seeks to further lower civil aviation accidents by making synthetic vision the norm for high-end operations. It sees the ideal SVS as providing a full-time presentation on the PFD and a clear representation of the aircraft’s energy-state cues (flight path, acceleration, speed deviations) in HUD-type symbology. It is predicted, incidentally, that like other innovations in avionics, a cheaper SVS will eventually emerge and find its way into the basic GA cockpit.
Screen format
Generally the ideal display screen is seen as uncluttered, yet carrying every visual cue necessary to orientate pilots within their 3-D environment, to guide them along their assigned velocity vectors and down to decision height at the destination runway, and to alert them throughout, with visual and aural cues, to recover from an unusual attitude or to regain track and/or level after a diversion.
Airliners currently use flight directors (F/D) to assist in pitch and roll: the pilot seeks to match the “wings” of the flight attitude indicator (FAI) to the moving bars or chevrons generated by the F/D within the same instrument, whether recovering from an attitude upset, returning to a planned track or level, or simply following an assigned approach path.
A number of sensing devices on an aircraft equipped with SVS feed data to the flight computer, which renders the input into a digital depiction of the outside world as described. Through an image generator, this information is presented in the cockpit on the PFD and, if required, on additional screens such as the navigation display, in moving-map form (top-down view), even with an additional inset in flight-profile aspect, similar to an instrument let-down chart.
The SVS shows mountains, valleys and other landforms just as they are, including obstacles such as towers, and even other traffic in the nearby airspace. Superimposed over this topographical representation are ideal flight paths and waypoints for the next route segment, with visual and aural warning cues for any deviations from level, track, speed, or climb and descent profiles, or if any sensor input becomes degraded.
The stakeholders driving this innovation are convinced that the synthetic vision system will make flying easier, both for routine operations and—more importantly—when pilots are under pressure, by providing the reassurance of always knowing precisely where the operating aircraft is and how it is configured for the situation.
The latter is sometimes referred to as the aircraft’s energy state, defined as a combination of aircraft airspeed, altitude, rate of descent and to some extent thrust setting and flap configuration. When an aircraft is under control while manoeuvring, it is said to be in a stable energy state.
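The potential-plus-kinetic idea behind “energy state” can be made concrete with the standard specific-energy formula from flight mechanics, total energy per unit weight expressed as an equivalent height. This is a rough illustration only; the function name and figures are invented, and a real SVS would track far more (thrust, configuration, trends).

```python
G = 9.81  # gravitational acceleration, m/s^2

def specific_energy_height(altitude_m, airspeed_ms):
    """Total specific energy as an equivalent height (metres):
    potential term (altitude) plus kinetic term (V^2 / 2g)."""
    return altitude_m + airspeed_ms ** 2 / (2 * G)

# An aircraft at 3000 m and 120 m/s holds the same total energy as one
# parked (motionless) at roughly 3734 m.
energy_ht = specific_energy_height(3000, 120)
```

Trading airspeed for altitude (or vice versa) leaves this figure unchanged; it falls only when drag exceeds thrust, which is why energy-state cues matter on approach.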
If the stakeholders have any reservations, it’s that the SVS may make flying and navigation so simple that pilots succumb to the insidious sirens of laziness and complacency, and are then caught out if the SVS suddenly fails or degrades at a crucial time, such as during an approach to a destination runway in adverse IMC with threatening terrain all around.
Further, it is of concern that a new breed of pilot might have difficulty in reverting to a more arcane system of shuffling charts and having to conjure up mental pictures of the aircraft’s position—and whether it’s appropriately set up—by scanning the backup instruments as in days of yore. (You young blokes don’t know how easy you’ve got it!)
We know from actual incidents that equipment and systems losses on the flight deck can be exacerbated by various alarms and voice warnings all vying for the pilots’ attention. In the 2020s, recovering from such scenarios will be mandated as an emergency procedure and a regular simulator requirement in the future SVS training syllabus.
Even with all of the advanced instrumentation of 2016, complying fully as it does with required navigation performance (RNP) requirements, it’s deemed essential to improve on it yet further to lower airline, air cargo, regional and military accidents due to limited visibility.
This century, commercial aviation accidents have often been attributed to pilots losing spatial orientation and, while still in control of a serviceable aircraft, flying it into the ground, as the black boxes have attested. These accidents are officially referred to as controlled flight into terrain (CFIT), and in the USA 66 percent of fatal accidents in commercial aviation are attributed to it.
It is predicted that SVS will improve safety by some 70 percent, and that in the 2020s the system will be the norm, acting as the primary flight and navigation aid. The pilot will fly from A to B as if on a VMC day, with each sector en route defined by graphical representation.
The aircraft will then descend along the virtual pathway to the virtual runway: numbered, identified and clearly delineated, even depicting the particular runway surface texture and the aircraft’s closing speed. As the aircraft nears the runway, an electronic billboard will appear beside the aiming point, giving speed, height and runway visual range.
The ultimate aim, not yet finalised, is to be able to land and taxi to the gate in zero visibility, such as dense fog. Apparently this can already be performed in the simulator during R&D exercises; infrared cameras, laser sensors with variable scanning of the field of view, and radar will make it possible, but research continues into some of the vagaries of radar and infrared propagation.
High-frequency (94 GHz) radar and infrared sensors, for example, can suffer degraded range in heavy precipitation and some types of fog. Conversely, low-frequency (9 GHz) and middle-frequency (35 GHz) radars penetrate these conditions better but deliver a poorer signal with lower resolution. They are often unable to extract some image attributes, and are more susceptible to interference from other radar sources, such as nearby aircraft and airport installations.
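The trade-off described above, resolution versus penetration, is essentially a weighting decision. A toy sketch of how a fusion stage might prefer one band over another follows; the band names echo the text, but the numeric scores and the selection rule are invented purely for illustration.

```python
# Qualitative trade-offs (invented scores, higher is better), per the text:
# 94 GHz gives the sharpest picture but suffers most in heavy precipitation.
SENSOR_TRAITS = {
    "94GHz_radar": {"resolution": 3, "precip_penetration": 1},
    "35GHz_radar": {"resolution": 2, "precip_penetration": 2},
    "9GHz_radar":  {"resolution": 1, "precip_penetration": 3},
}

def best_sensor(heavy_precip):
    """Pick the band with the trait that matters most right now."""
    key = "precip_penetration" if heavy_precip else "resolution"
    return max(SENSOR_TRAITS, key=lambda name: SENSOR_TRAITS[name][key])
```

In clear air the sketch favours the 94 GHz band for its resolution; in heavy rain it falls back to 9 GHz. A real system would blend all inputs continuously rather than switch outright.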
At present it’s anticipated that an enhanced SVS ought to safely direct an aircraft through the approach phase down to a 100 ft decision height, equivalent to a CAT II ILS approach. The SVS will be tailored to perform RNP approaches and departures, and to monitor the integrity of its own performance and accuracy, alerting pilots if its self-sensed accuracy falls below set benchmarks.
If one of the enhanced sensors failed, for example, the enhanced imagery component would discreetly fade and the screen would inform the pilot(s) of the lost input. Imagery could still be rendered from GPS and the onboard database, but it would lack the accuracy and detail of the enhanced input, and navigational accuracy would suffer. Former skills would then need to be employed and alternatives considered: perhaps continuing the approach on the ILS, or declaring a missed approach and diverting to an alternate airport.
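The graceful-degradation behaviour just described amounts to a mode ladder. The sketch below is a hypothetical simplification, with made-up mode names and a deliberately crude health model, not any certified system’s logic.

```python
def svs_mode(gps_ok, database_ok, enhanced_sensors_ok):
    """Degrade step by step as inputs are lost:
    full enhanced imagery -> basic synthetic imagery -> conventional
    instruments. All names are illustrative."""
    if gps_ok and database_ok and enhanced_sensors_ok:
        return "enhanced"      # full fused imagery, best accuracy
    if gps_ok and database_ok:
        return "basic"         # GPS + database rendering, reduced detail
    return "conventional"      # revert to standby/backup instruments
```

Losing an infrared or radar feed drops the display to basic synthetic imagery; losing GPS or a trustworthy database drops it out of synthetic vision altogether.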
Straightforward architecture
The architecture of the SVS is quite straightforward. A basic version could consist simply of GPS, a database, a flight computer and an EFIS. With the appropriate software, this system will render basic synthetic imagery to help pilots confirm their energy states and positions, with certain provisos.
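That basic chain (GPS fix in, terrain database lookup, rendered frame out) can be sketched in a few lines. Everything here is a toy: the grid, the elevations and the “frame” structure are invented to show the data flow, not to resemble real avionics software.

```python
# Invented terrain database: (lat, lon) grid cell -> elevation in feet.
TERRAIN_DB = {
    (0, 0): 1200,
    (0, 1): 2500,
    (1, 0): 800,
}

def render_frame(gps_fix):
    """Build a minimal 'frame' an EFIS could draw: each database cell
    with its elevation relative to the aircraft's GPS altitude."""
    lat, lon, alt_ft = gps_fix
    frame = []
    for (cell_lat, cell_lon), elev in TERRAIN_DB.items():
        frame.append({
            "cell": (cell_lat, cell_lon),
            "relative_ft": alt_ft - elev,  # negative = terrain above aircraft
        })
    return frame

# At 2000 ft over cell (0, 0) the aircraft clears that terrain by 800 ft,
# but cell (0, 1) rises 500 ft above it -- exactly the kind of fact the
# rendered picture makes obvious at a glance.
frame = render_frame((0, 0, 2000))
```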
Of course such a system has limitations: whether the terrain, obstacle and cultural (built-up areas) data are up to date; whether there’s lag in the computer processing required to produce the imagery; and whether it can be trusted without some sort of data-integrity monitoring. An inaccurate database has already led to at least one fatal accident in New Zealand.
A synthetic vision system becomes enhanced when multiple inputs are employed, resulting in much more sophisticated imagery and also with additional artefacts appearing on the screens that greatly assist the pilots in energy state and navigational management.
For instance, infrared cameras (which transduce thermal energy into meaningful imagery), laser altimeters and millimetre-wave radar could all be melded with air data computer information and GPS to provide a highly accurate rendition of the aircraft’s position, indicating how closely it adheres to the flight plan or clearance, all transparently superimposed onto the screens. Such a system is anticipated to perform as a highly accurate navigation tool that could eventually see aircraft landing in zero visibility and taxying to the gate through impenetrable fog in complete safety.
At present there appears to be some confusion among the terms “synthetic vision system”, “enhanced vision system” and even “enhanced synthetic vision system”. “Synthetic vision system” appears to be the generic descriptor for the new flight-deck development, while an “enhanced vision system” composes its presentation from much more input: enhanced sensors (laser, radar and infrared) plus flight symbology and pathway-in-the-sky depictions, utilising enhanced software.
In the future it’s likely that today’s enhanced SVS will simply be called the SVS because by then new enhancements will exist to further the SVS evolution. (Or will there be an enhanced “enhanced visual system”?)
Flight instrumentation symbology will also appear on the PFD, emulating that used in HUDs. (Young aspirants keen on airline careers in the 2020s might do well to familiarise themselves with HUD symbology if they haven’t already done so.)
Flight paths will be depicted as tunnels-in-the-sky, and the pilots, or autopilot, will simply fly the aircraft along the rendered tunnels, with indications of any divergence and recommended corrective guidance. The imagery will appear to alter in accord with the aircraft’s ground speed and actual elevation. Some commercial SVS models already feature additions such as grid lines, to give a sense of movement, and range rings, to show distances to features of interest such as runways.
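The divergence cues a tunnel-in-the-sky implies can be reduced to comparing the aircraft’s position against the tunnel centreline. The tunnel half-width, sign conventions and cue wording below are all assumptions for the example.

```python
HALF_WIDTH_FT = 200  # assumed tunnel half-width, lateral and vertical

def tunnel_guidance(cross_track_ft, vertical_dev_ft):
    """Return corrective cues when outside the tunnel.
    Convention (assumed): positive cross-track = right of centreline,
    positive vertical deviation = above the tunnel centreline."""
    cues = []
    if abs(cross_track_ft) > HALF_WIDTH_FT:
        cues.append("fly left" if cross_track_ft > 0 else "fly right")
    if abs(vertical_dev_ft) > HALF_WIDTH_FT:
        cues.append("descend" if vertical_dev_ft > 0 else "climb")
    return cues or ["on path"]
```

Inside the tunnel the display stays quiet; drift 300 ft right and 250 ft low and it would call for a left turn and a climb.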
Enhanced synthetic vision also allows for the fusion of other developments such as the traffic collision avoidance system (TCAS) and taxi control, complete with airport maps. On the ground, enhanced synthetic vision is expected to greatly assist pilots in fog, with infrared sensors picking up tugs, equipment and people on the tarmac. Taxi clearances at complex airports ought to become as simple as a PC flight simulator program in which the user simply follows the “breadcrumbs” (a dashed line).
As we have seen with such innovations as the electronic flight instrumentation system—now over 30 years old—any new development is initially very expensive and finds entry only in high-level applications such as in airliners, military aircraft and business jets. Manufacturers later work out ways to produce smaller and cheaper versions that eventually find their way into base-level GA aircraft. This has already occurred with basic SVS, but enhanced versions will also inevitably follow.
There will probably be some yin and yang in all of this. It’s almost certain flying will appear to become easier, with a CAVOK view ahead and cues on how to get into and out of airports and, most importantly, how to recover from an aircraft upset leading to an unusual attitude.
But could a little bit of knowledge be a dangerous thing? Might an overconfident beginner get beyond his or her depth and rely too much on the visual representation? What if the vision became compromised or lost altogether?
If flying with a basic SVS, what if the database was out of date and didn’t include the new microwave repeater tower smack bang in the middle of the saddle ahead, intended to be the escape route home from the impending gloom closing in?
There’s no doubt civil aviation will benefit from this inevitable aid to flying. Since NASA began driving it more than 20 years ago, a lot of thought and effort have gone into the SVS. In 10 years’ time it will probably seem commonplace—even its predicted touch-screen facility.
It’s glaringly obvious that it will be the perfect tool to conduct RNP procedures and to provide pilots with assurance when operating into airports such as Kathmandu and Queenstown. It might even finally overcome the Achilles heel of most aircraft (the pitot-static system), ensuring that pilots never again slavishly follow an erroneous instrument indication.
Forty years ago commercial aviation was able to perform operations in lowered visibility due to gyroscopic instruments and precision ground aids. Pilots were required to make mental transformations of the aircraft’s situation. The artificial horizon in its many incarnations with its basic blue-over-brown world depiction was the primary instrument for resolving attitude departures.
Now, with the SVS about to be ushered into the airlines, the system’s designers are reminding the industry of how it will benefit all: safety, for the reasons given above, but also more movements at airports, more on-time flights, better ATC spacing and more completed flights. In short, it will lessen the corrosive effects of flight delays.
Both CAST and NASA confidently express that all categories of aviation can benefit from the SVS—business jets, cargo jets, airliners, military transport lifters, jet fighters and rotorcraft. When datalink communications such as ADS-B are comprehensively introduced, aircraft flow between airports is predicted to increase significantly. The SVS is seen as the logical tool to integrate with the advanced air traffic management network of the 2020s.
Synthetic vision will enhance safety and save money—but it’s worth remembering, especially for private operations, that up-to-date databases are crucial. And despite a virtual sunny day, knowing when to retreat from adverse IMC will still require airmanship. Mountain waves, ice and turbulence can still be lethal.
- Report by Peter Gamble, photographs by Christian Stirling and others.