Senior scientists and academics reportedly say Artificial Intelligence is ready to engage on the battlefield. But in the wrong hands, technology could carry out heinous acts. (Zapp2Photo/Getty Images)
© 2024 Blaze Media LLC. All rights reserved.
Killer robots will likely fall into the hands of "terrorist groups and rogue states" within the next few years, experts warn.
Senior scientists and academics say artificial intelligence is ready to engage on the battlefield, the BBC reports. But in the wrong hands, the technology could carry out heinous acts, such as self-driving cars plowing into crowds of innocent people.
It's part of a debate on just how dangerous autonomous weapons are becoming.
Drones are part of the equation, said Alvin Wilby, vice-president of research at French defense firm Thales. He made his comments to a Parliament committee in Great Britain, the BBC reported.
The U.S. and China are already testing technology that includes "dozens of cheap, unmanned aircraft" that can hunt down targets or be used as a way to protect people from attacks, Wilby said.
"The technological challenge of scaling it up to swarms and things like that doesn't need any inventive step," he told the Lords Artificial Intelligence committee. "It's just a question of time and scale and I think that's an absolute certainty that we should worry about."
Bad copies of these autonomous weapons could easily fall into the hands of terrorists, said Noel Sharkey, emeritus professor of artificial intelligence and robotics at the University of Sheffield.
When used for evil, Sharkey warned, the robots are simply killing machines carrying out orders without conscience. He paints a picture in which drones and other robots roam around "firing at will."
And he believes that scenario has a very good chance of quickly becoming a reality.
"I don't want to live in a world where war can happen in a few seconds accidentally and a lot of people die before anybody stops it," said Sharkey, who is also a spokesman for the Campaign to Stop Killer Robots.
The campaign is part of an effort to raise awareness of the issue by Human Rights Watch, an international, nonprofit, non-governmental group.
Some activists say there is far too little oversight on the development of autonomous technology. Sharkey said the United Kingdom needs to make a commitment to "banning AI weapons in law."
Russia, for example, recently said it would ignore any United Nations attempt to ban "killer robots."
Some of the robots appear to be quite nimble. Boston Dynamics, a defense contractor that is linked to DARPA and Google, demonstrated a new humanoid robot jumping over blocks and doing backflips.