
US Air Force Tests an AI-Powered Drone Aircraft Prototype – Slashdot


An anonymous reader shared this report from the New York Times:

It is powered into flight by a rocket engine. It can fly a distance equal to the width of China. It has a stealthy design and is capable of carrying missiles that can hit enemy targets far beyond its visual range. But what really distinguishes the Air Force’s pilotless XQ-58A Valkyrie experimental aircraft is that it is run by artificial intelligence, putting it at the forefront of efforts by the U.S. military to harness the capacities of an emerging technology whose vast potential benefits are tempered by deep concerns about how much autonomy to grant to a lethal weapon.

Essentially a next-generation drone, the Valkyrie is a prototype for what the Air Force hopes can become a potent supplement to its fleet of traditional fighter jets, giving human pilots a swarm of highly capable robot wingmen to deploy in battle. Its mission is to marry artificial intelligence with its sensors to identify and evaluate enemy threats and then, after getting human sign-off, to move in for the kill… The emergence of artificial intelligence is helping to spawn a new generation of Pentagon contractors who are seeking to undercut, or at least disrupt, the longstanding primacy of the handful of giant firms that supply the armed forces with planes, missiles, tanks and ships. The possibility of building fleets of smart but relatively inexpensive weapons that could be deployed in large numbers is allowing Pentagon officials to think in new ways about taking on enemy forces.

It also is forcing them to confront questions about what role humans should play in conflicts waged with software that is written to kill…
The article adds that the U.S. Air Force plans to build 1,000 to 2,000 AI drones for as little as $3 million apiece. "Some will focus on surveillance or resupply missions, others will fly in attack swarms and still others will serve as a 'loyal wingman' to a human pilot…

“A recently revised Pentagon policy on the use of artificial intelligence in weapons systems allows for the autonomous use of lethal force — but any particular plan to build or deploy such a weapon must first be reviewed and approved by a special military panel.”

