The Drone Wars: Coming Soon?

03/01/2023
  • Ukrainian soldiers launch a drone near Bakhmut, Donetsk region, Ukraine, on December 15, 2022. (AP/LIBKOS)
  • Ukrainian soldiers shoot at a drone in the sky in Kyiv, Ukraine. (AP/Vadym Sarakhan)
  • A drone is seen in the sky seconds before it fired on buildings in Kyiv, Ukraine, on October 17, 2022. (AP/Efrem Lukatsky)
  • A Switchblade 600 drone (AP/AeroVironment, Inc.)
  • A United Nations report suggests fully automated Turkish drones like this one killed Libyan combatants in 2020. (STM)


Could Russia’s war with Ukraine contribute to the rise of killer robots?

The conflict has pressed fast-forward on the development of drone technology. Soon, military drones might not even need pilots.

Experts say autonomous (self-controlled) drones could soon soar into battle—and even choose targets—without the help of humans. They would function entirely on artificial intelligence (AI).

Many drones, such as Ukraine’s Switchblade 600, already use AI. But even a Switchblade drone needs a human pilot to select targets. Ukraine could convert these drones into fully autonomous robots.

“The technology to achieve a fully autonomous mission with Switchblade pretty much exists today,” says Wahid Nawabi. Nawabi is the CEO of AeroVironment, the company behind the U.S.-made Switchblade drones.

Some experts believe autonomous drones have already seen action—just not in Ukraine. A United Nations report suggests fully automated Turkish drones killed Libyan combatants in 2020. That claim has not been confirmed.

That leads to a question: How can you tell if a drone is piloted by AI? There’s no easy way to know. Some military officials claim AI already performs just as well as human pilots—if not better.

Many experts worry about giving AI this much power. AI isn’t perfect. For instance: What if facial recognition technology mistakes an innocent person for a terrorist? Similar technology has already led police departments to make mistaken arrests. (See Facial Recognition Fail.) An arrest can be reversed; a killing cannot.

And what if the technology fell into evil hands? Terrorists could use drones to attack entire categories of people.

“If you can get a robot to kill one person, you can get it to kill a thousand,” says Toby Walsh, author of Machines Behaving Badly.

But what if “killer robots” actually lower the human cost of war?

It’s true that AI can make mistakes—but so can humans. Russian President Vladimir Putin predicts wars will soon be determined by drone-on-drone battles. “When one party’s drones are destroyed by drones of another, it will have no other choice but to surrender,” he says.

Some members of the United Nations have tried to set international rules for drone use. But several nations, such as the United States and Russia, oppose such regulations.

God created humans in His own image. Even wartime enemies bear the image of God. When war is unavoidable, nations can still try to avoid unnecessary loss of life. But will self-piloting drones make wars less violent? Or will they create more opportunities for bloodshed?

Only the future can tell—and the future is fast approaching.

Why? Technologies such as AI can be used for good or ill. Ultimately, it comes down to the hearts of the people—or nations—who use them.