The US Air Force has denied that an AI-based drone “attacked and killed” its human operator during a simulated test.
Reports attributed to a US Air Force colonel claimed that, earlier this week, an AI-controlled drone turned on and killed its operator during a simulated exercise. The claims spread quickly and became top news around the world.
Air Force spokeswoman Ann Stefanek said in a June 2 statement that no such test had taken place, adding that the colonel’s comments were likely “taken out of context” and intended as a “fictional experiment.”
“The Air Force Department has not conducted any such AI drone simulations and remains committed to the ethical and responsible use of AI technology,” Stefanek said. “This was a hypothetical thought experiment, not a simulation.”
C4ISR.net reported that the account of the deadly drone was attributed to Colonel Tucker “Cinco” Hamilton, the Air Force’s chief of AI testing and operations, in a summary of the Royal Aeronautical Society’s FCAS23 summit held in May.
The summary was later updated to include additional comments from Hamilton, who said he misspoke at the conference.