Military, tech experts raise concerns about AI weaponization: ‘We have to be very concerned’ – The Hill


(NewsNation) – The speed at which artificial intelligence (AI) is developing and its vast array of capabilities have legislators wondering how much the advanced tech is capable of doing.

“It really takes me back to the 1930s, 1940s, 1950s, and nuclear physics. We knew right away nuclear physics could build energy,” said Sen. Roger Marshall, R-Kan., who serves on the Homeland Security Committee. “But we also knew it could turn into atom bombs.”

Marshall is one of many lawmakers to tell NewsNation that rapid development of AI technology is inevitable but that Congress should put guardrails in place to ensure it is not misused.

“We have to be very concerned with what we talked about, which is the unintended consequences,” said retired Air Force Maj. Gen. William Enyart.

The voices of caution come amid reports that the military has explored scenarios in which AI-powered drones are weaponized.

“Whether we’re talking about the Chinese building it, or us building it, or Iran building it … once it’s built, you can’t keep it secret. That’s just not going to happen. That’s not the way science works,” Enyart cautioned.

On Friday, China’s communist government reportedly voiced similar concerns about the weaponization of AI.

Meanwhile, even those who have helped develop the technology have issued warnings to lawmakers. Last month, OpenAI CEO Sam Altman told lawmakers on Capitol Hill that humans could lose control of AI.

“I think if this technology goes wrong, it can go quite wrong,” Altman said. “And we want to be vocal about that. We want to work with the government to prevent that from happening.”

Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
