Dawn Zoldi on “Autonomous Drones and National Security”

Yesterday Duke’s Center on Law, Ethics and National Security (LENS) completed its 26th Annual LENS National Security Law Conference. Conducted virtually, the conference garnered, by far, our largest number of registrants ever. That’s no surprise, given the conference’s lineup of fabulous speakers.

Our guest post today is by Dawn Zoldi, a retired USAF JAG Colonel, about the conference discussion she had with Duke University’s Dr. Missy Cummings, whose background made her a perfect fit for the event. Here’s just a glimpse of her bio from the university’s site:

A naval officer and military pilot from 1988 to 1999, she was one of the Navy’s first female fighter pilots. Cummings is currently a Professor in the Duke University Pratt School of Engineering and the Duke Institute of Brain Sciences, and is the director of the Humans and Autonomy Laboratory and Duke Robotics.

Her research interests include human-unmanned vehicle interaction, human-autonomous system collaboration, human-systems engineering, public policy implications of unmanned vehicles, and the ethical and social impact of technology.

I’ve known Missy for many years, and I’d underline her background as a fighter pilot. Why? Beyond her obvious technical brilliance, she gives you her views in a refreshingly unambiguous and straightforward manner, as is the way of the fighter-pilot community.

Colonel (ret.) Zoldi is familiar (and a favorite) to LENS conference veterans and Lawfire® readers (see, e.g., here). She’s become one of the nation’s top authorities on unmanned system (drone) law and policy, so the combination of these two authentic experts, both military veterans, made for a very lively presentation.

Col (Ret.) Zoldi and Dr. Cummings at the virtual LENS conference 2021

Dawn summarizes much of yesterday’s discussion and adds some observations (and links!) of her own. Her essay is a rare opportunity to get up to date on some of the latest developments in both artificial intelligence and drone technology. I think many readers will find, as I did, lots of ‘I-didn’t-know-that’ material here!

Here’s Dawn:

Autonomous Drones and National Security

By Dawn M.K. Zoldi (Colonel, USAF Retired)

This past Saturday at the 26th Annual LENS National Security Law Conference, I was privileged to discuss autonomous drones and their national security implications with Dr. Missy Cummings, Professor in the Department of Electrical and Computer Engineering at Duke University and one of the U.S. Navy’s first female F-18 fighter pilots.

She has extensively researched autonomous systems, artificial intelligence and their nexus with drones, in both military and civilian contexts. Below, I summarize and synthesize some highlights from our talk and a few of her articles.

Words Matter 

According to Dr. Cummings, “Conversations about autonomous systems are often filled with a lack of technical literacy and emotional rhetoric.” What some may think is AI may not actually be. Case in point: software from a search and rescue company that uses spectral data to find and sort through colors may not actually be “AI,” says Dr. Cummings, but rather simply good engineering. Knowing what you are actually getting for your money is important too, especially if you are the Department of Defense (DOD), according to Cummings. Words matter.

So, what is the difference between automated and autonomous systems and where does artificial intelligence (AI) fit into the mix?

Automated systems operate by clear, repeatable rules based on unambiguous sensed data. For the military, the Patriot missile system provides an example of the defensive application of automation. The Predator’s and Reaper’s laser-guided missiles illustrate offensive uses. In the case of the latter, a human guides the drone to the target, identifies the target, and approves weapons launch.

Contrast this with autonomous systems, which take in data about the unstructured world around them, process that data to generate information, and then generate alternatives and make decisions in the face of uncertainty. People often refer to this set of capabilities as self-governing. In the military context, an autonomous system would be given a goal (e.g., find and kill the enemy) and would then execute the mission by itself until that goal is accomplished.
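To make that distinction concrete, here is a minimal Python sketch (my illustration, not Dr. Cummings’): the automated controller applies one fixed, repeatable rule to unambiguous input, while the autonomous controller generates alternatives and chooses among them under injected uncertainty. Every threshold, action name, and score here is hypothetical.

```python
import random

# --- Automated: a clear, repeatable rule over unambiguous sensed data ---
def automated_intercept(track_speed_mps: float, inbound: bool) -> str:
    """Fixed rule in the spirit of a defensive system: same input, same output."""
    if inbound and track_speed_mps > 600.0:
        return "ENGAGE"
    return "HOLD"

# --- Autonomous: generate alternatives and decide in the face of uncertainty ---
def autonomous_step(observations: dict) -> str:
    """Scores candidate actions toward a goal using noisy, uncertain estimates."""
    candidates = ["search_north", "search_south", "loiter", "return_to_base"]

    def expected_value(action: str) -> float:
        prior = observations.get(action, 0.5)   # belief about each option's payoff
        return prior + random.gauss(0.0, 0.1)   # uncertainty enters the decision

    return max(candidates, key=expected_value)

print(automated_intercept(800.0, inbound=True))   # always "ENGAGE"
print(autonomous_step({"search_north": 0.7}))     # may vary run to run
```

Run it twice: the automated rule always gives the same answer for the same input, while the autonomous step may not.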

AI is difficult to define. Dr. Cummings states it is “the capability of a computer system to perform tasks that normally require human intelligence, such as visual perception, speech recognition and decision-making.” Some autonomous systems contain AI; some do not. Thus, AI and autonomous systems are not necessarily synonymous in practice.

Do Robots “Think”?  

Not really.

Cummings explained that human intelligence generally follows a sequence known as the perception–cognition–action information processing loop. Individuals perceive something in the world around them, think about what to do, and then, once they have weighed the options, make a decision to act.

AI is programmed to do something similar. A computer senses the world around it, and then processes the incoming information through optimization and verification algorithms, with a choice of action made in a fashion similar to that of humans.
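As a rough illustration of that loop, consider the skeletal Python sketch below. The three stage names mirror the perceive-think-act sequence described above, and the optimize/verify split stands in for the “optimization and verification algorithms” step. Everything in it (the lidar field, the five-meter threshold, the two options) is a made-up stub, not any real autopilot’s logic.

```python
# A skeletal perception-cognition-action loop (illustrative stubs only).

def perceive(raw_sensor_frame: dict) -> dict:
    """Perception: turn raw sensor data into a structured estimate of the world."""
    return {"obstacle_ahead": raw_sensor_frame.get("lidar_min_m", 99.0) < 5.0}

def cognize(world_state: dict) -> str:
    """Cognition: weigh the options (optimize), then check the result (verify)."""
    if world_state["obstacle_ahead"]:
        options = {"continue": 0.0, "stop": 1.0}
    else:
        options = {"continue": 1.0, "stop": 0.0}
    best = max(options, key=options.get)  # optimization: pick the best-scoring option
    assert best in ("continue", "stop")   # verification: sanity-check before acting
    return best

def act(decision: str) -> None:
    """Action: execute the chosen option."""
    print(f"executing: {decision}")

# One pass through the loop per incoming sensor frame.
for frame in [{"lidar_min_m": 20.0}, {"lidar_min_m": 3.2}]:
    act(cognize(perceive(frame)))
```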

Benefits of Autonomous Drones

In the fog and friction of combat, when warriors are stressed, hungry and fatigued, autonomous drones can fill a gap. Specifically, autonomous weaponized drones, or what some organizations refer to as “killer robots,” could, in theory, replace error-prone humans. Absent human targeting errors, autonomous drones could aid human decision makers in time-critical targeting scenarios by making unbiased decisions more quickly and accurately against static, predesignated and well-mapped targets. One important benefit: reducing blue-on-blue (friendly fire) incidents.

However, Dr. Cummings explained, autonomous drones are not the be-all and end-all solution for every scenario. “Autonomous technologies perform best when they are doing a very narrow task with little to no uncertainty in their environments.” Dynamic environments are filled with uncertainty, and AI can be imperfect. For example, a car’s self-driving feature may generally be able to “recognize” a stop sign. Add snow to the top of that sign, however, and maybe not so much.
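The snow-capped sign is easy to demonstrate with a toy model. The sketch below (mine, and deliberately simplistic) scores an observed image patch against a stored template by normalized correlation and only “detects” above a confidence threshold; masking the top rows, the stand-in for snow, drops the score below the bar. Real perception stacks are far more sophisticated, but the failure mode is the same in kind.

```python
import numpy as np

# Toy "stop sign detector": matches an observed patch against a stored
# template and only declares a detection above a confidence threshold.
rng = np.random.default_rng(0)
template = rng.random((8, 8))   # stands in for a learned sign pattern
THRESHOLD = 0.95

def detector_confidence(observed: np.ndarray) -> float:
    """Normalized correlation between the observation and the template."""
    t, o = template.ravel(), observed.ravel()
    return float(np.dot(t, o) / (np.linalg.norm(t) * np.linalg.norm(o)))

clean = template + rng.normal(0.0, 0.02, template.shape)   # clear day
snowy = clean.copy()
snowy[:3, :] = 1.0                                         # snow caked on top

for name, patch in [("clean", clean), ("snow-capped", snowy)]:
    conf = detector_confidence(patch)
    verdict = "STOP SIGN" if conf >= THRESHOLD else "no detection"
    print(f"{name}: confidence={conf:.2f} -> {verdict}")
```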

LIMFACs

What are the current limiting factors (LIMFACs) of autonomous systems, military and civilian? According to Dr. Cummings, the “Achilles’ heels” are the limitations of machine learning and computer vision systems. All autonomous systems are deeply flawed in their ability to reliably identify objects, as the stop sign example illustrates. On the battlefield, Dr. Cummings described a system that, unable to distinguish between adults and children, would have mowed down a bunch of kids. Yikes!

She explained, “There remain fundamental problems in current computer vision technology that make it very unreliable in understanding the world, especially in dynamic situations. These computer vision weaknesses could be exploited and create new cybersecurity concerns. This makes them less useful in high-certainty engagements and dynamic environments.”

Humans and Loops

We’ve all heard the terms “human in the loop” and “human on the loop.” Because of the uncertainty in current autonomous systems, at least with regard to weapons systems, people want to see humans remain involved, to one degree or another. Some refer to this as meaningful human control (MHC). However, defining MHC is also incredibly difficult.

Most people frame the MHC debate in terms of what the human should or must do. Does this mean that a human has to initiate the launch of such a weapon? Does MHC mean a human has to monitor the weapon until impact and perhaps have the ability, however remotely, to abort the mission? Are we asking humans to take on MHC tasks that have high likelihoods of human error and if so, what are the consequences?

Dr. Cummings has reframed the debate as primarily a design issue. To her, engineering is key, and meaningful certification of autonomous systems is the answer. Once an autonomous system can be shown to perform better than a human at a complex safety-critical task, it should be certified. The challenge, she explains, is that engineers and designers must know the strengths and limitations of both humans and the relevant technologies, and such boundaries are not always clear.
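What might “shown to perform better than a human” mean in practice? One plausible, and entirely hypothetical, reading is a statistical acceptance test: compare error rates on the same safety-critical task and certify only if the system beats the human baseline with confidence. The trial counts and error rates below are invented for illustration.

```python
from math import sqrt

# Hypothetical certification check: does the autonomous system's error rate
# beat the human baseline with statistical confidence? All numbers invented.
human_errors, human_trials = 48, 1000   # assumed 4.8% human error rate
sys_errors, sys_trials = 22, 1000       # assumed 2.2% system error rate

p_h = human_errors / human_trials
p_s = sys_errors / sys_trials
p_pool = (human_errors + sys_errors) / (human_trials + sys_trials)

# One-sided two-proportion z-test; H0: the system is no better than the human.
se = sqrt(p_pool * (1 - p_pool) * (1 / human_trials + 1 / sys_trials))
z = (p_h - p_s) / se
print(f"human={p_h:.1%}  system={p_s:.1%}  z={z:.2f}")
print("meets the 'better than human' bar at ~95% confidence" if z > 1.645
      else "insufficient evidence; keep testing")
```

As Dr. Cummings notes, the hard part is not the arithmetic but knowing which tasks, conditions and error types belong in the test in the first place.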

The Public-Private Connection

Funding for test and evaluation (T&E) and research and development (R&D) in the DOD has declined. Until perhaps recently, autonomous systems development has not been a priority for the DOD, owing to intense competition for limited investment dollars in which traditional platforms, such as narrowly capable manned fighter aircraft (e.g., the F-35) and extremely costly laser weapons, are viewed as more important. Increasingly, the DOD has outsourced the development of autonomous systems and AI to Silicon Valley and other groups. Cummings sees a shift in AI and autonomy expertise from the military to commercial enterprises.

This is a double-edged sword.

On the upside, public-private partnerships often bear fruit by putting advanced tech capabilities in warfighters’ hands. One example is Skydio, a company that developed an autonomous drone and, on the commercial side, just received the first Federal Aviation Administration (FAA) approval for state-wide beyond visual line of sight (BVLOS) bridge inspections in North Carolina. BVLOS is key to repeatable and scalable commercial drone operations. On the military side, Skydio is part of the Defense Innovation Unit (DIU) Blue Small Unmanned Aircraft System (sUAS) program and one of only five companies on the current GSA-approved list of commercial-off-the-shelf (COTS) drones for the DOD. Blue sUAS arose out of cybersecurity concerns with Chinese COTS drones; it filled the gap created by the FY20 National Defense Authorization Act ban on Chinese drones in the DOD.

Another example of collaborative R&D and T&E on the aviation side is the U.S. Air Force’s (USAF) non-traditional Agility Prime effort, which seeks to field “flying cars,” or electric vertical takeoff and landing (eVTOL) aircraft, for a variety of military and commercial purposes (distributed logistics, medical evacuation, firefighting, disaster response, search and rescue, and humanitarian-relief operations). Partnering with Joby Aviation, which just this week announced a $6.6B valuation, as part of the “Agility Prime Air Race,” the Air Force is pushing the FAA on first-ever certifications. The U.S. Army is also assessing the utility of autonomous drones through programs like the Future Tactical UAS (FTUAS) and the Joint Tactical Autonomous Aerial Resupply System (JTAARS) for tactical reconnaissance, strike and resupply missions.

On the downside, this reliance on the commercial sector for key technologies results in brain drain. In the hotly competitive market for roboticists and related engineers, the aerospace and defense sectors, especially where funding lags behind, are less appealing to the most able personnel. This creates a situation in which the global defense industry falls behind its commercial counterparts in technology innovation. According to Cummings, that gap continues to widen as the best and brightest engineers move to the commercial sphere.

Securing National Security

Education and workforce development are key to building DOD expertise in all facets of AI and autonomous systems. Failing to focus on this will eventually lead to deficient fielded products, or ones lacking appropriate safeguards and testing. Perhaps even more alarming, continuing to outsource AI and autonomy will negatively affect military readiness in both the short and the long term. Other second- and third-order effects are as yet unimagined.

To learn more about Dr. Cummings’ thoughts on autonomous drones in national security, find two of her excellent articles here and here, and check out Duke’s website for the impressive list of her other scholarly musings.

For additional articles and podcasts on commercial and military drones, visit Col (Ret) Zoldi’s website at https://www.p3techconsulting.com, follow her at http://patreon.com/dawnzoldi, and tune into the Drones at Dawn podcast every Monday at 10:00 a.m. ET at https://launchpad.aero/podcasts/.

About the author:

Col. Dawn Zoldi, USAF (Ret.)

Dawn M.K. Zoldi (Colonel, USAF, Retired) is a licensed attorney with 28 years of combined active duty military and federal civil service to the Department of the Air Force. She is an internationally recognized expert on unmanned aircraft system law and policy, the Law-Tech Connect™ columnist for Inside Unmanned Systems magazine, a recipient of the Woman to Watch in UAS (Leadership) Award 2019, and the CEO of P3 Tech Consulting LLC.

The views expressed by guest authors also do not necessarily reflect the views of the Center on Law, Ethics and National Security, or Duke University.

Remember what we like to say on Lawfire®: gather the facts, examine the law, evaluate the arguments – and then decide for yourself!
