CLOUDS, CODES AND AI ON “THE PERFECT PLATFORM”

“The U-2 is the perfect platform to drive cutting-edge military technology,” said the 9th Wing commander, Col Heather Fox. She was referring to recent pioneering U-2 flights that have explored how cloud computing can deliver new software to aircraft that are already airborne, enabling sensors and navigation systems to react quickly to fast-changing threats. The flights have been well-publicized, with a degree of hype. But one innovation to the aircraft’s radar sensor has gone almost unnoticed.

I have previously described how the Dragon Lady has already demonstrated extended multi-domain networking and fast integration of new sensors, made possible by open mission systems (OMS). The US Air Force has funded a major avionics and sensor upgrade of the entire U-2 fleet, including an OMS computer.

Now the process has been taken a step further. Cutting-edge digital technology developed commercially in ‘cloud’ environments is being adapted for military use. The work on the U-2 has been done at the prompting of the Air Force’s Open Architecture Management Office (OAMO); by the Lockheed Martin Skunk Works; and most notably by the “U-2 Federal Laboratory”, a small group of civilian and military developers based at Beale AFB and led by Major Ray Tierney, a former U-2 pilot.

[Image: U-2 Federal Lab artificial intelligence flight, via Dr Will Roper, December 2020]

Last August, the OAMO posed a challenge to software engineers from the Air Force and three major aerospace companies – Boeing, Lockheed Martin, and Northrop Grumman – in the OpenAirKube (OAK) project. They were asked to demonstrate, within 60 days, how the new digital technology could be quickly and safely integrated with the embedded systems on military aircraft. Each team used open-source cloud applications, notably Kubernetes, which manages and optimizes computer memory and processing for software packaged by a method that developers have named containerization. Containerization enables software to run correctly when moved from one computing environment to another. It allows the services provided by one computer to be used on another, and the two to interact. It makes for easier software builds. Crucially for the military, therefore, it allows software to be re-used across platforms, rather than relying on bespoke software that is slower and more expensive to develop.
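To make the idea concrete, here is a minimal sketch of how a containerized mission application might be declared to a Kubernetes cluster, using the official Python client. All of the names – the container image, the labels, the resource figures – are hypothetical illustrations; none of this reflects the actual U-2 software.

```python
# A minimal sketch: declaring a containerized mission application to a
# Kubernetes cluster via the official Python client. The image name,
# labels, and resource figures are hypothetical illustrations only.
from kubernetes import client, config

def deploy_mission_app() -> None:
    config.load_kube_config()  # read cluster credentials, e.g. from ~/.kube/config
    apps = client.AppsV1Api()

    container = client.V1Container(
        name="atr-processor",                      # hypothetical name
        image="registry.local/atr-processor:1.0",  # hypothetical image
        resources=client.V1ResourceRequirements(
            requests={"cpu": "500m", "memory": "512Mi"},
            limits={"cpu": "1", "memory": "1Gi"},
        ),
    )
    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name="atr-processor"),
        spec=client.V1DeploymentSpec(
            replicas=2,  # Kubernetes keeps two copies alive, restarting any that fail
            selector=client.V1LabelSelector(match_labels={"app": "atr"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "atr"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )
    apps.create_namespaced_deployment(namespace="default", body=deployment)

if __name__ == "__main__":
    deploy_mission_app()
```

The key design point is that the developer declares what should run, and Kubernetes decides which of the available computers runs each copy – which is precisely what allows the same software to move between platforms.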

For its part in the OAK project, Lockheed Martin demonstrated in a laboratory at Palmdale how mission software could run via Kubernetes on the company’s Enterprise Open System Architecture Mission Computer (EMC2). EMC2 is representative of the OMS computer that will be installed on all U-2s, and is already flying on test flights from Palmdale on a U-2.

[Image: U-2 with the SYERS nose, from the “Federal Lab Achieves Flight on U-2 with Kubernetes” release, October 2020]

Meanwhile, on 22 September, there was a flight from Beale on an operational Dragon Lady (left). This used an open software architecture developed by the U-2 Federal Lab and orchestrated by Kubernetes. According to an Air Force press release, the flight “brought together the power of four individual, flight-certified computers onboard the aircraft, leveraging the advantages of Kubernetes to run advanced machine-learning algorithms without any negative effects on the aircraft’s flight or mission systems.”

Two inflight updates were performed. The first added code to help track future changes. The second added new automatic target recognition (ATR) algorithms, designed by Sandia National Laboratories, to improve the performance of the U-2’s SYERS-2C electro-optical imaging sensor.
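A rolling update is one plausible mechanism for an inflight change like this: Kubernetes swaps the container image and replaces running copies one at a time, which would be consistent with the release’s claim of no negative effects on flight or mission systems. The sketch below continues the hypothetical names from the earlier example:

```python
# A hedged sketch of an inflight update as a Kubernetes rolling update:
# only the container image changes, and pods are replaced one at a time.
# Deployment and image names continue the hypothetical example above.
from kubernetes import client, config

def push_update(deployment: str, new_image: str, namespace: str = "default") -> None:
    config.load_kube_config()
    apps = client.AppsV1Api()
    # Patch only the container image; the deployment's rolling-update
    # strategy handles draining and restarting each copy in turn.
    patch = {
        "spec": {
            "template": {
                "spec": {
                    "containers": [{"name": "atr-processor", "image": new_image}]
                }
            }
        }
    }
    apps.patch_namespaced_deployment(deployment, namespace, patch)

# e.g. push_update("atr-processor", "registry.local/atr-processor:1.1")
```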

Lockheed Martin recognized that in-mission software updates like these were innovative. But at Palmdale, the focus was more on the distributed processing that Kubernetes makes possible, and how that could help the networking of platforms that the company had already been exploring, notably using the U-2 as a demonstrator.

Einstein Box

In mid-November, the Skunk Works flew the U-2 equipped with the EMC2 computer (right) to prove the work previously done in the lab. The jet offloaded sensor-processing tasks to a ground node that offered additional processing power. This “tactical load-balancing” helps to enhance networking in a “distributed battlespace”. The ground node was a bespoke surrogate for long-term solutions, in which the nodes could also be airborne, e.g. other platforms such as fighters or bombers. The U-2 communicated with the ground node via new OMS-compliant datalink gateway software, which ran in the airborne Kubernetes cloud and was capable of bridging a variety of other datalinks, such as those found on the F-22 and F-35.
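No details of the load-balancing logic have been released, but the basic trade-off is easy to sketch: offload a task only if the datalink can move the data in time, and the remote node has meaningfully more spare capacity. The thresholds and node names below are invented purely for illustration:

```python
# A speculative sketch of "tactical load-balancing": choose where to run a
# sensor-processing task based on onboard CPU headroom and datalink quality.
# All thresholds and node names are invented for illustration.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    cpu_free: float   # fraction of CPU currently idle, 0..1
    link_mbps: float  # usable datalink bandwidth to this node

def choose_node(task_mb: float, deadline_s: float, onboard: Node, remote: Node) -> Node:
    # Offload only if the raw sensor data can cross the link in time
    # and the remote node has materially more headroom than the aircraft.
    transfer_s = (task_mb * 8) / max(remote.link_mbps, 1e-6)
    if transfer_s < deadline_s and remote.cpu_free > onboard.cpu_free + 0.2:
        return remote
    return onboard

onboard = Node("u2-oms", cpu_free=0.15, link_mbps=0.0)
ground = Node("ground-node", cpu_free=0.80, link_mbps=50.0)
print(choose_node(task_mb=120, deadline_s=30, onboard=onboard, remote=ground).name)
```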

[Image: U-2 Federal Lab artificial intelligence flight, from the USAF release, December 2020]

Back at Beale, the flight that has attracted the most attention took place on 15 December (left). An artificial intelligence (AI) algorithm derived from the world of computer gaming was modified in the U-2 Federal Lab (below) and used to manipulate the aircraft’s ASARS imaging radar reconnaissance sensor for tactical advantage, and to make corresponding navigational adjustments. To prepare for the flight, the Lab performed over half a million software simulations.

[Image: the U-2 Federal Lab at Beale AFB, September 2020]

The modified algorithm was named ARTUu, after the robotic droid R2-D2 in the Star Wars movies. This, together with some publicity-seeking statements from Dr Will Roper, the chief of acquisition for the Air Force, such as “we flew an AI co-pilot”, succeeded in misleading many media outlets. “AI robot takes control of US spy plane” was a typical headline.

Control of the ASARS sensor is normally vested in the Distributed Ground Station (DGS). It sends commands via datalinks if the collection plan that was devised before takeoff needs to be adjusted. If a change of course is required, for instance to go back and image a particular area again, the DGS sends new waypoints to the pilot, who approves their insertion into the aircraft’s navigation system. Apart from this, the only intervention the pilot makes is to consult a checklist if an ASARS fault code is displayed, and to input some digits that might restore its operation. The data that ASARS collects may be recorded and processed onboard, which conserves bandwidth, but more usually it is sent ‘raw’ to the ground for processing. Personnel in the DGS then analyze the radar imagery, using a variety of tools including ATR, and take action as appropriate.

But on this flight, “sensor control was positively handed off to ARTUu after takeoff”, according to the Air Force press release. “ARTUu had complete radar control while switching off access to other subsystems,” said Roper.

The advantage of doing this was played out in a scenario in which the U-2 was threatened by a possible surface-to-air missile (SAM) attack, and air-to-air interception. “ARTUu’s primary responsibility was finding enemy launchers while the pilot was on the lookout for threatening aircraft, both sharing the U-2’s radar,” said the press release.


Although no further detail was given, this revealed new ASARS capabilities that were presumably only made possible by AI. To quickly confirm the location of a SAM threat, a site or sites indicated by ASARS operating in search mode is most likely confirmed when ARTUu uses the aircraft’s onboard processor to direct the radar to perform a spot collection, i.e. a higher-resolution image. ARTUu then compares that image to known SAM patterns and, if a correlation is found, provides new waypoints so that the aircraft can avoid the threat. Whether those waypoints had to be approved by the pilot before they were implemented, as per standard practice, was not made clear.
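That inferred sequence – search, spot collection, pattern match, avoidance waypoints – can be sketched as a simple decision loop. Every interface here is a hypothetical placeholder, since the Air Force has released no detail of the real ones:

```python
# A speculative reconstruction of the inferred decision loop. All objects
# and methods (radar, classifier, nav) are hypothetical placeholders.
def confirm_and_avoid(radar, classifier, nav, threshold: float = 0.9) -> None:
    for candidate in radar.search_mode():              # wide-area search hits
        chip = radar.spot_collect(candidate.location)  # higher-resolution image
        score, sam_type = classifier.match(chip)       # compare to known SAM patterns
        if score >= threshold:
            waypoints = nav.plan_avoidance(candidate.location)
            # Standard practice would require pilot approval before new
            # waypoints are inserted; the press release leaves this unclear.
            nav.propose(waypoints, requires_pilot_approval=True)
```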

Meanwhile, the pilot was supposedly “on the lookout” for threatening aircraft by “sharing the radar”, per the press release. This could be a clumsy description of a technique that may have been made possible by modifying ASARS to interleave an air-to-air search mode with the air-to-ground search mode, making use of the time between transmitted pulses. The radar receiver could process a Doppler return to indicate the range of an approaching interceptor, and whether it is closing or departing. The pilot would receive a warning by audio or by a visual display. The interleave technique might also have been used to detect threat emissions.
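The underlying physics is standard: for a monostatic radar, the radial (closing) velocity of a target follows from the measured Doppler shift as v = f_d λ / 2. A short worked example, with generic numbers rather than ASARS parameters:

```python
# The standard monostatic-radar Doppler relation: radial velocity
# v = f_d * wavelength / 2. The numbers are generic, not ASARS parameters.
C = 3.0e8  # speed of light, m/s

def closing_rate(doppler_hz: float, radar_freq_hz: float) -> float:
    wavelength = C / radar_freq_hz
    return doppler_hz * wavelength / 2.0  # positive = target closing

# e.g. a 10 kHz Doppler shift on a 10 GHz (X-band) radar:
print(f"{closing_rate(10e3, 10e9):.0f} m/s closing")  # ~150 m/s
```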

Dr Roper noted that “with no pilot override, ARTUu made final calls on devoting the radar to missile hunting versus self-protection”.

[Image: Col Heather Fox, USAF photo]

This is all clever stuff, but can it provide better protection than the ALQ-221, which is the U-2’s Electronic Warfare System (EWS)? This is an integrated warning and jamming system that has been frequently modified to cope with new threats. It has longer range than ASARS. Together with appropriate tactics, techniques and procedures, and the protection afforded by flying above 70,000 feet, the EWS has enabled the U-2 to fly closer to “denied airspace” than other ISR aircraft, according to Lockheed Martin. Moreover, the Dragon Lady’s altitude advantage negates many, if not most, air-to-air interception threats. Still, according to Col Fox (right), “the integration of Kubernetes onto the U-2 capitalizes on the aircraft’s high altitude line of sight and makes it even more survivable in a contested environment”.

Creating what amounts to a “mini-cloud” onboard the U-2 seems like a good idea. But why the need to send new code to the aircraft inflight, rather than loading it prior to takeoff? I presume that the new code was sent via the aircraft’s standard datalinks, and not by direct communication between the aircraft and a cloud server. If the latter, can a constant high-speed, high-signal-strength broadband connection be guaranteed? The commercial customers of cloud services have discovered, to their cost, that this is not always the case. And how can a direct connection to the cloud be as secure as directional datalinks, which provide greater protection from interception and hostile jamming?

Either way, there are caveats. Nicholas Chaillan, the USAF’s chief software officer, said of cloud computing that “we’re pushing the envelope when it comes to security”. As Dr Roper admitted, “today’s AI can easily be fooled by adversary tactics.”

[Image: Dr Will Roper, Assistant Secretary of the Air Force for Acquisition, Technology and Logistics, November 2020]

Nevertheless, Dr Roper (left) proclaimed that “putting AI safely in command of a US military system for the first time ushers in a new age of human-machine teaming”. Ah, but teaming a human with a machine is a description that could equally be applied to the operation of UAVs. Including the Global Hawk that was supposed to replace the U-2. But it didn’t.

A few months ago, Dr Roper took an orientation flight in one of the 9th Wing’s two-cockpit U-2 trainers. He must have passed through Building 1025, the wing’s U-2 headquarters. I wonder if he noticed the framed photo of a pressure-suited U-2 pilot hanging on the wall in one of the offices. It has a large caption that reads: The Ultimate Computer.
