Capt Darryl "Spike" Long, aviation training systems programme manager for the US Naval Air Systems Command (NAVAIR), has no doubt about the vector the service's simulator technology needs to take.

"Vision systems have always been the way to create a realistic environment," he says. "The eyes are immediate."

Although partial or full motion-based simulation remains an important part of the US Navy's training regime, recent advances in display systems, image generators, visual databases and networked systems lead Long to believe a portable, vision-based, full-immersion simulator experience is coming soon.

With the equivalent of 20/20 visual acuity - a "God's world view" state that Long says the industry will achieve in two to six years - pilots will be immersed in the "fog of war", flying in tight formation with other manned simulators during networked exercises in which the aircraft track hundreds of datalinked targets. Visual systems will be detailed to the point where pilots, unmanned vehicle operators and ground troops will simultaneously see the actual targets in multiple spectrums, while sensing activity in other electromagnetic realms, such as cellphone signals.

A survey of technology advances by suppliers to the US military simulator and training market appears to support Long's hypothesis.

Rockwell Collins, the market share leader in visual systems for aviation simulators and a provider of full-flight simulators for the US Air Force's Rockwell B-1B and Boeing B-52 programmes, says its new image generator, the EP-8000, is a key step toward full immersion.

"It's not an incremental change," says Kent Statler, executive vice-president and chief operating officer of Rockwell Collins services group, "it's a transformation in realism to the war fighter." The company is unveiling the technology to the public for the first time at this year's Interservice/Industry Training, Simulation and Education Conference (I/ITSEC) show in Orlando during the first week of December. Included will be a "special session" for Long, who has not yet seen the system in operation, says Statler.

VISUAL SCENE

The "near eye-limiting" technology, resulting in sub-meter "out the window" and sensor imagery "over very large areas" is a combination of more megapixels, polygons and clarity in the visual scene. Behind the technology is Rockwell Collins' "Core" open architecture and a software-programmable image generator, the first of its kind, says the company.

Rockwell Collins says the EP-8000 also has "correlated lighting, atmospherics and special effects and the largest dedicated texture memory in the industry for high-resolution imagery". The benefit of having software programmability is that new features and performance upgrades can be accomplished with software improvements rather than graphics card swaps, as is the case with legacy PC-based systems, says the company.

Driving the higher "scene density" required for full immersion are, in part, fundamental changes in the way the military is preparing its troops to fight.

"The whole training experience is going through a major revision," says Statler. Based on experience from the campaigns in Iraq and Afghanistan, Statler says training must include the ability for house-to-house searches, looking for details as small as disturbed soil where an improvised explosive device (IED) may have been planted, or studying facial features to determine whether a person is friend or foe. "Visuals in the past were not very good for that," he says.

Behind the visuals are image generators and databases that are growing in speed and size, respectively. "The first level of real immersion is display technology," says Frank Delisle, simulator manufacturer L-3 Link's vice-president of engineering and technology. "Now you've got to feed it with content." Included in Delisle's definition of content are databases representing parameters of the physical world, such as terrain, and image generators that compute the visual content sent to the digital displays, which increasingly use liquid crystal on silicon technology from the consumer television market.

REAL WORLD

"We use the analogy of a home TV," says Delisle. "If you bought a high-definition TV and didn't order HD cable, the picture still looks fuzzy. We've been focusing on the kinds of content that would replicate fidelity in the real world, down to the leaves on trees for a specific geographical area."

Recent wins for L-3 include a $68 million contract from the US Air Force in January for 20 four-ship F-16 mission training centre simulators for basic and advanced pilot mission training, tactics validation and mission rehearsal. L-3 also recently won upgrades to its Predator mission aircrew training system (PMATS), first ordered by the USAF in 2005. Enhancements to the Predator system include low-level light TV simulation, high-resolution databases of the Kabul and Bagram airfields in Afghanistan, and the addition of L-3's physics-based environment generator (PBEG) to the simulator. PBEG includes software from the gaming world that predicts human behaviour based on certain stimuli. Where the gaming industry pales in comparison with military requirements, however, is in the scale of the game. "We have to go across a city that might be 100 square miles," says Rockwell Collins' Statler.

L-3 Link F-16 MTC
L-3 Link is providing 20 four-ship F-16 mission training centre simulators to the air force 
© L-3 Link

L-3 plans to boost the immersion capability of the F-16 mission training centre this year by increasing the number of entities, including people, cars and aircraft, injected into any real-time scene from 5,000 to about 100,000. "Five thousand doesn't represent a typical urban environment," says Delisle. "As you get up to the 100,000 level of magnitude, you can more correctly emulate the urban environment." The code that makes those entities look and act credibly, from facial features such as eye movement to body motion, comes from artificial intelligence advances in the gaming industry.
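
At that scale, per-entity behaviour has to stay cheap. The sketch below is illustrative only - the entity fields, reaction radius and update rule are assumptions, not L-3's code - but it shows the kind of stimulus-response loop that lets tens of thousands of entities react individually while the cost grows only linearly with the count.

```python
import random

NUM_ENTITIES = 100_000          # the scale Delisle cites for urban scenes
FLEE_RADIUS = 50.0              # m, reaction distance (assumed value)

# Each entity is [x, y, state]; a production system would add type,
# animation state, goals and pathfinding.
entities = [[random.uniform(0, 10_000), random.uniform(0, 10_000), "idle"]
            for _ in range(NUM_ENTITIES)]

def update(entities, stim_x, stim_y):
    """One simulation tick: entities near the stimulus change behaviour."""
    for e in entities:
        dx, dy = e[0] - stim_x, e[1] - stim_y
        if dx * dx + dy * dy < FLEE_RADIUS ** 2:
            e[2] = "fleeing"    # an AI layer would choose animation and path
            e[0] += dx * 0.1    # take a step directly away from the stimulus
            e[1] += dy * 0.1

update(entities, 5_000.0, 5_000.0)  # e.g. a helicopter passing over the city
print(sum(1 for e in entities if e[2] == "fleeing"), "entities reacted")
```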

Such an environment is what L-3's customers are "salivating over", says Delisle. "They want to make more and more scenarios with adversaries in cities".

The "adversary" can also be a natural disaster, he adds.

"What our helicopter customers are saying now, especially the Air National Guard, is that they can truly train emergency rescue operations with this technology," says Delisle. "They want to create the content and scenarios of floods and natural disasters with people stuck in certain situations." He says the company is working "with several customers" and hopes to get contracts soon, the primary customer being the National Guard. "They have pilots trained in the military for military operations. Now, when they come in to do emergency rescue operations in the civilian side, tactics are different."

In a recent demonstration, L-3 created the scenario of a flood zone where a building had collapsed and the helicopter pilots were tasked with rescuing people from the roof. Elsewhere in the same city, people were stranded on top of cars and on ledges. For the National Guard, the idea was to help train pilots to prioritise which victims to rescue first. Delisle notes that the PBEG algorithms provide realism: in this case, when pilots rescue someone from the top of a floating car, the vehicle picks up speed as it becomes lighter with one person removed, changing the equation for how the rescue must proceed.
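
The floating-car behaviour Delisle describes falls out of basic physics: drag from the flood current accelerates the car toward the water speed, and a lighter car responds faster. A minimal sketch of that dynamic, with assumed values for current speed and drag (this is illustrative, not PBEG code):

```python
# Assumed parameters for illustration only.
WATER_SPEED = 3.0      # m/s, speed of the flood current
DRAG_COEFF = 80.0      # kg/m, lumped quadratic drag coefficient
DT = 0.1               # s, integration step

def drift_speed(mass_kg: float, duration_s: float) -> float:
    """Integrate the car's drift speed under quadratic water drag."""
    v = 0.0
    for _ in range(int(duration_s / DT)):
        rel = WATER_SPEED - v                 # water velocity relative to car
        force = DRAG_COEFF * rel * abs(rel)   # drag pushes car downstream
        v += (force / mass_kg) * DT           # F = ma: lighter car speeds up faster
    return v

# Two occupants vs. one: removing a person cuts the mass, so the
# remaining car is drifting faster after the same elapsed time.
print(drift_speed(mass_kg=1600.0, duration_s=5.0))  # heavier: slower drift
print(drift_speed(mass_kg=1520.0, duration_s=5.0))  # lighter: faster drift
```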

As the databases defining the various spectrums grow, so does the need for commonality to save cost and time. Each service is pursuing its own common databases, and there has been no wholesale merging of needs and provisions to date. "The end of the day hasn't been reached yet where anyone can work with the same database," says Rockwell Collins' Statler. "We're still on the journey to where you become totally agnostic to database."

The US Army's 160th Special Operations Aviation Regiment, based at Fort Campbell, Kentucky, is making progress towards a comprehensive single database that spans boundaries with the help of CAE. Called the common database (CDB), the system is now installed on a CAE-built MH-60L Black Hawk full-flight simulator and an MH-47D simulator at Fort Campbell, and more upgrades are on the way.

"There has been a push in military procurement to think of ways to increase the use of content," says Philippe Perey, director of synthetic environment engineering at CAE. Perey says the CDB was designed "from the ground up" to tackle the challenges of rapid turnaround of a database after a change and managing content and format so it does not become obsolete with database evolution.

 MH-60S
CAE to date has delivered four of seven MH-60S operational flight trainers to the navy
© CAE

DATABASE UPDATING

Perey says the traditional approach to database updating is to build the database, which might be 50 terabytes in size, then make some tweaks and compile a new version. The CDB, by contrast, has a revision control mechanism whereby changes are stored alongside the original data and flagged with a revision level. "Only the deltas are stored," says Perey, leading to reduced storage needs and the ability for soldiers in the field to update these immense databases over a low-bandwidth satellite link to a laptop computer.

"This is exactly what special operations was looking for - not days or months in waiting for new database, but minutes or hours," says Perey.

Also helping to speed database compilation is CAE's motif compositing technology, in which, by knowing the latitude, longitude, altitude and geopolitical boundaries, algorithms can be used in near real time to populate the particulars of the visual scene and make it look more realistic without burdening the database unnecessarily. "There are many decisions you can make about that data and populate the area of the database with content that is representative," says Perey.
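
A simplified illustration of the compositing idea: derive representative content from location attributes at run time rather than storing every object, and seed placement from the coordinates so the same patch always composites the same way. The motif names and selection rules below are invented for illustration, not CAE's:

```python
import random

def choose_motif(lat_deg: float, elevation_m: float, region: str) -> str:
    """Pick a representative vegetation/building motif for a terrain patch."""
    if elevation_m > 3000:
        return "rock_and_snow"
    if abs(lat_deg) < 23.5:                   # inside the tropics
        return "tropical_broadleaf"
    if region == "central_asia":
        return "arid_scrub_with_mudbrick"     # e.g. Afghan terrain
    return "temperate_mixed_forest"

def populate_patch(lat_deg: float, lon_deg: float, elevation_m: float,
                   region: str, density: int = 50):
    """Scatter motif instances with a coordinate-derived seed, so the
    patch composites identically every time without storing each object."""
    rng = random.Random(round(lat_deg * 1000) * 1_000_000 + round(lon_deg * 1000))
    motif = choose_motif(lat_deg, elevation_m, region)
    return [(motif, rng.uniform(0, 1), rng.uniform(0, 1)) for _ in range(density)]

# A patch near Kabul composites as arid scrub; placement happens at run time.
patch = populate_patch(34.5, 69.2, elevation_m=1800, region="central_asia")
print(patch[0])
```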

CAE received "a significant endorsement" of the CDB concept last year with a contract from the German air force, which will upgrade a fleet of more than 60 of its simulators to be CDB-compliant. Perey says CAE successfully demonstrated the system in July, proving that early concerns about end-to-end data latency with the CDB were unfounded.

 M-346
CAE's Medallion-6000 image generator provides realism for the company's Aermacchi M-346 trainer simulator
© CAE

NEED FOR MOTION

Largely missing from the discussion of full immersion so far has been whether motion is needed. "We struggle with motion," says NAVAIR's Long. "Most people in most cases are going to go away from motion. They would rather make visual and weapons system accuracy more important." While full-motion simulation may be important for initial training of pilots, Long says eyes "do the precision stuff at the end of the day, and motion makes the simulators very expensive and harder to maintain."

Long says simulators "have to do a lot of trickery" to make the human vestibular system think it is moving in a certain way. "Every study you look at, whether done by NASA, the military or others, finds that motion is OK, but it's not the most important thing," says Long. "There is no proof that motion systems enhance the training experience, but if the cues are mismatched, it causes conflict between the senses."

This mismatch can be amplified in centrifuge-type simulators, which are used for annual high-g tolerance training for fast-jet pilots or, in some cases, as sustained-g full-flight simulators.

In an effort to get quantitative data on centrifuge applications, Long says the navy is conducting a "series of studies" with NASA on the effects of motion, in particular the impact of g-loading on the vestibular system and whether anything can be done to nullify the false cues.

"Everyone will look for ways to go to motionless simulators," says Rockwell Collins' Stater, but he says the level of acceptance of being able to certificate such a system is not a given. "You will see a lot of that technology introduced in fixed-base simulators before you see it come from the top down [from the full-flight simulator]," he says.

Motion or motionless, Long says the bigger question is the best way to create an environment "where a service can generate the training required to make sure pilots have the requisite knowledge, skills and experience to fly the mission".

The answer to that question is a work in progress, but the evidence to date indicates that full immersion will be a growing part of the solution for training and mission readiness.

"We are using simulators across the spectrum," says Long. "As you get further down the training continuum, the complexity of the mission goes up. Early on, we're grading your ability to null your errors to zero. Later on, you have to null errors and operate the weapons system."

Long says the Marine Corps is "leading the way" on using simulators for mission readiness, with about 50% of readiness tasks done in the simulator for some platforms and up to 25% for the F/A-18 in a fixed-base simulator.

"For them to accept the simulator for 25% of training is significant," says Long. "We intend to take the numbers dramatically higher in the simulator, not because it's cheaper, but because the training is that much better."

 CAE compositing
CAE's motif compositing technology takes an otherwise bland scene (left) and makes it more lifelike (right), as compared with the Google Earth snapshot (centre)
© CAE

Source: Flight International