In this screenshot from the film, students flee as AI drones programmed to track anyone identified as a university student close in. (Stop Autonomous Weapons via YouTube)
Artificial intelligence has advanced to the point where it requires little human intervention. The applications seem endless: automated services, content generation, and basic care can all be greatly improved by the introduction of AI.
But what about weaponry?
This short film, called Slaughterbots, serves as a horrific prediction of what could happen should automated weaponry outpace our ability to protect ourselves.
[WARNING: The following video contains violence and images of injury and even death. Viewer discretion is advised.]
For those who support AI-based weaponry, it's an object without feelings that can do the work of a military while sparing soldiers from potentially devastating situations -- both mentally and physically.
And while no governments are openly and brazenly funding it now, AI-driven weaponry was a big enough threat to prompt Elon Musk, Stephen Hawking, and other industry leaders to pen a letter to the United Nations urging it to work preemptively toward solutions.
The video opens a bit like Tesla's semi truck reveal, actually. The press event sways the audience into oohs and aahs as they watch miniaturized, AI-driven robots kill a group of "bad guys" during the demonstration. The presenter encourages the enthusiasm, promising that all it takes to program these drone bots with a target profile is someone's age, sex, fitness, uniform, and ethnicity.
However, as with a lot of technology, the drones fall into the wrong hands. Hacked, the AIs first target a single group of politicians, then break through concrete and glass to kill university students who had shared a video exposing the injustices of the drones and their policies.
At the end of the video, you're left wondering if this is what Elon Musk sees when he rails against autonomous AI weaponry being developed.
And it's not just Musk. Stuart Russell, a leading AI scientist at the University of California, Berkeley, says the world is closer to deploying autonomous weapons than it is to self-driving cars.
"The technology illustrated in the film is simply an integration of existing capabilities. It is not science fiction. In fact, it is easier to achieve than self-driving cars, which require far higher standards of performance," Russell said in an interview with the Guardian.
Russell also pointed out similar issues in the video for the non-profit Stop Autonomous Weapons.
"I've worked in AI for more than 35 years," says Russell in the video. "Its potential to benefit humanity is enormous, even in defense, but allowing machines to choose to kill humans will be devastating to our security and freedom."
"Thousands of my fellow researchers agree. We have that opportunity to prevent the future you just saw, but the window to act is closing fast."
Those researchers include Noel Sharkey, emeritus professor of AI at Sheffield University, who notably warned the robotics community about this issue as early as 2009.
“The movie made my hair stand on end as it crystallizes one possible futuristic outcome from the development of these hi-tech weapons,” he said. “There is an emerging arms race among the hi-tech nations to develop autonomous submarines, fighter jets, battleships and tanks that can find their own targets and apply violent force without the involvement of meaningful human decisions. It will only take one major war to unleash these new weapons with tragic humanitarian consequences and destabilization of global security.”