Everyone knows what the AI apocalypse is supposed to look like. Movies like WarGames and The Terminator feature a super-intelligent computer seizing control of weapons to end humanity. Fortunately, that scenario is unlikely for now: U.S. nuclear missiles run on decades-old technology and require humans with physical keys to launch them.
But artificial intelligence is already helping to kill people around the world in more banal ways. The U.S. and Israeli militaries have reportedly been using artificial intelligence systems to sift through intelligence and plan air strikes, according to Bloomberg News, The Guardian, and +972 Magazine.
This kind of software lets commanders find and list targets far faster than human staff officers could. The attacks themselves are then carried out by human pilots in manned aircraft or remotely piloted drones. “The machine did it coldly, and that made it easier,” an Israeli intelligence officer reportedly told The Guardian.
Going further, Turkish, Russian, and Ukrainian weapons makers claim to have built “autonomous” drones that can attack targets even when the link to the remote pilot is lost or jammed. Experts, however, are skeptical that these drones have actually killed anyone autonomously.
In war as in peace, artificial intelligence is a tool that helps humans do what they already want to do more efficiently. Human leaders will go on making the decisions about war and peace, and for the foreseeable future, most weapons will still need a flesh-and-blood warrior to pull a trigger or push a button. What AI changes is the work in the middle: it lets the staff officers and intelligence analysts in windowless rooms mark enemies for death with less effort, less time, and less thought.
“The Terminator image of killer robots obscures all the existing ways in which data-driven warfare, along with data-driven policing, analytics, and border control, already poses a serious threat,” Suchman said.
Suchman believes it is most helpful to think of AI as “stereotype machines” running on top of old surveillance networks. “Powered by large amounts of data and computing power, these machines can learn to pick out patterns and people that are of interest to the government,” she said. Think Minority Report rather than The Terminator.
Suchman said that even when humans review AI decisions, the speed of automated targeting leaves “less and less room for judgment.” “It’s a very bad idea to take an area of human practice that’s fraught with problems and try to automate it.”
Artificial intelligence can also be used to home in on targets that humans have chosen. Turkey’s Kargu-2 attack drones, for example, were reportedly able to keep pursuing targets even after losing their connection to their operators, according to a United Nations report on 2021 fighting in Libya that involved the Kargu-2.
Zachary Kallenborn, a policy researcher at George Mason University who specializes in drone warfare, said the usefulness of “autonomous” weapons “is really very situational.” A ship’s missile defense system, for example, may have to shoot down dozens of incoming rockets with little risk of hitting anything else. Kallenborn thinks AI-controlled guns make sense in situations like that, but that firing autonomous weapons at “human beings in urban environments” is a bad idea, because it is so hard to distinguish among friendly troops, enemy fighters, and bystanders.
The scenario that really keeps Kallenborn up at night is the “drone swarm”: a network of autonomous weapons issuing commands to one another, where a single mistake could propagate across dozens or hundreds of killing machines.
Several human rights activists, Suchman among them, are pushing for a treaty that would ban or regulate autonomous weapons. So is the Chinese government. And while Washington and Moscow have resisted international controls, both have imposed internal limits on AI weapons.
The U.S. Department of Defense has issued regulations requiring human supervision of autonomous weapons. And Russia appears to have quietly switched off the artificial intelligence capabilities of its Lancet-2 drones, according to an analysis cited by the online military magazine Breaking Defense.
The same impulse that drives the development of artificial intelligence warfare appears to be limiting it: human leaders’ desire for control.
Military commanders “want to be very careful about managing the level of violence you inflict, because ultimately you’re doing it to support a larger political goal,” Kallenborn said.