KYIV — On a sunny, balmy Tuesday in late April, a line of cars formed outside a military proving ground near Kyiv, their passengers eager to catch a glimpse of the future of warfare. As the drivers waited to enter, hosts of maybugs swarmed overhead. Above them buzzed bigger, robotic competitors: military drones crisscrossing the sky.
The grounds were full of techies and soldiers, gathering on a field scarred by Russia’s foiled assault on Kyiv to test-drive their latest innovations. Among them was a Ukrainian startup called Swarmer. Registered in Delaware and with offices in Romania and Poland, it had something special to demo: drones that use artificial intelligence to work together as a coordinated swarm.
It took about 20 minutes for Swarmer’s team of five engineers to prepare the drones for their mission: to find and destroy two targets hidden somewhere in the field. Usually, drones need pilots equipped with video-game-style controllers and goggles to see through their cameras. For this test, Yaroslav Sherstyuk, a former Ukrainian army officer, planned to run three reconnaissance drones and two larger bombers. “I will be in charge,” Sherstyuk said. “But that only takes pushing three buttons.”
He indicated the targets on a map, pushed start and leaned back in his chair. Two reconnaissance drones zoomed off. “Each of them decided on the best trajectory on their own, based on a possible terrain relief or other possible obstacles we pointed out on the map,” said Sherstyuk, watching their progress on his screen. The two bombers followed. Only the last, small drone stayed back.
The bombers found their targets. Sherstyuk approved the attack, then hopped on a phone call with his son while the drones carried out the strike. The last drone whirred forward. “It’s checking whether the bombers destroyed the target,” Sherstyuk said. “Command usually demands destruction or damage confirmation, so it’s doing that.”
Mission accomplished, the drones floated back to Sherstyuk. Swarmer’s team toasted the demo’s success with alcohol-free beer.
As its war against Russia drags on, Ukraine is emerging as a testing ground for cutting-edge warfare, including drones and other vehicles capable of carrying out parts of their mission on their own. Serhii Kuprienko, Swarmer’s 39-year-old founder and chief executive, thinks his swarms might be deployed this year. “We have already passed the first combat tests,” he said.
By allowing a single pilot to control multiple drones, Swarmer aims to alleviate the manpower shortages that have put Ukraine’s armed forces on the back foot in recent weeks. Using artificial intelligence, or AI, the company’s drones will react autonomously to changing circumstances and communicate with each other to orchestrate a sortie.
“AI-powered drones can do in seconds what would take a human several hours, simply because we are slow to process a large volume of information,” Kuprienko said. “The swarm is effective because one experienced drone pilot can work effectively with dozens of drones at the same time.”
He stressed that the decision to attack a target would always be made by a human. Still, he couldn’t resist making the Arnold Schwarzenegger-sized sci-fi reference that inevitably looms over any conversation about AI-powered combat robots.
“My goal,” he said, “is to make a proper version of the Terminator that will protect us and help our army.”
Ukraine turns to AI to maintain its drone advantage
Technology has been the linchpin of Ukraine’s fightback from the very moment Russian tanks rolled over its borders in February 2022. Demography — Russia’s population of 144 million versus Ukraine’s 38 million — made Moscow’s preponderance in manpower and materiel unmatchable. There was no fighting fire with fire. You had to douse it with bits.
Drones — military-grade or cheap hobbyist models — epitomize that strategy. On the battlefield, they are ubiquitous, scanning the ground to spot targets with ammo-saving precision, dropping grenades or ramming kamikaze-style into Russian vehicles, gear and troops. Telegram, a laxly moderated social network popular in Russia and Ukraine, is awash with videos of Ukrainian drones rigged with explosives fluttering around Russian soldiers and tanks like angry, deadly mosquitos. Russia has also leaned heavily on drones, both imported from allies and domestically produced.

Drones have broadened the buffer zone between Ukrainian and Russian combatants from a couple of kilometers to as much as 20 km, Ukraine’s deputy minister of digital transformation Alex Bornyakov told POLITICO. “Anything you put in this ‘gray area’, it can be destroyed by drones,” he said.
Their pivotal role was officially acknowledged in February 2024, when Ukrainian President Volodymyr Zelenskyy launched a new branch of the military: the Unmanned Systems Forces, entirely dedicated to working with aerial, ground and sea drones. Days later, his government announced that Ukraine was on track to manufacture more than a million drones by the end of 2024.
Even if the recently announced inflow of U.S. ammo helps Ukraine diversify its approach, drones are not going away. “Once additional artillery is available again, the Ukrainians will use [commercial drones] less,” said Torsten Reil, co-CEO of the pan-European AI defense company Helsing, which works with the Ukrainian government. “[But] midterm, we will see a development of much more capable drones, strike drones, that will complement and mirror artillery in terms of their effect.”
The challenge for Ukraine is that Russia has recognized the disruptive power of the technology and turned to electronic warfare — deploying vehicles covered in dish antennas to jam the radio frequencies relied on by Ukrainian drone pilots. “Russia has jamming stations every 10 kilometers at the front line, so Ukraine’s previously successful drone-based approach doesn’t work as well as it used to,” Reil said.
In addition to cutting off drones from their pilots, jamming can interfere with navigation systems like GPS. Worse, the Russian technology can work out the location of drone pilots — who in many cases need to remain within line of sight of their vehicles — and tag them for artillery strikes. Concerns about jamming and other types of electronic warfare have contributed to reluctance in Washington to provide Kyiv with American-made Reaper spy drones.
To try to maintain its edge, Ukraine has launched an effort to restyle itself as a tech accelerator in the field of drone warfare, working with local and Western companies to develop, test and deploy countermeasures on the battlefield.
At the forefront is a government program called Brave1, a joint venture encompassing six ministries modeled on the U.S.’s Defense Innovation Unit (DIU), a Department of Defense outfit tasked with facilitating military deployments of commercial technologies.
“Everyone was like: ‘Let’s build Ukrainian DARPA’,” said Bornyakov, referring to the Department of Defense’s military research agency. “But DARPA is more about complex theoretical innovations.” Brave1 is focused on developing solutions that can be rolled out, stat.
Roughly a year after its launch, some 700 inventions that went through Brave1’s program were approved for use by the Ukrainian Armed Forces; around 40 have found their way to the front. Once a working prototype is identified, government certification can be obtained in as little as 21 days, Bornyakov said.
Brave1 supports startups working on medical technologies, logistics and cybersecurity, among other areas. But its top priority is unmanned systems. “The goal [is] that robots, not humans, should fight on the battlefield,” Bornyakov said.
Given the Russian jamming, achieving that goal often boils down to one thing: AI, the use of vast troves of data — images, text, audio, video, radio signals — to teach drones to act on their own. Reconnaissance drones use image recognition to navigate without GPS or to spot and identify camouflaged military targets. Explosive kamikaze drones can be trained to forge on even if they lose contact with their pilots.

“AI can help lock in on targets and then automatically — without communication, or in conditions of suppression by the enemy’s electronic warfare systems — make it possible for the drone to hit the target,” Ukraine’s deputy prime minister and technology chief Mykhailo Fedorov said in an interview in a high-security governmental building in April. He said that 10 companies active in Ukraine are developing AI technology to help drones close in on their targets.
The technology on offer may fall far short of Terminator-style robots: in Ukraine, the call to engage with a given target still lies with a human decisionmaker. But for how long?
“If you think of AI as drones autonomously and independently making some decisions to strike or not to strike something, there is no such thing yet,” said Fedorov. “But I think that there is a certain future for it.”
In an interview with the Associated Press in 2023, he was more explicit: Autonomous killer drones, he said, are “a logical and inevitable next step.”
The future of warfare is being written in Ukraine
Just how much of a difference AI is making on the battlefield is a matter of some debate. Jim Acuna, a former Central Intelligence Agency officer and the founder of the Baltic Ghost Wing Center of Excellence drone pilot school in Estonia, is skeptical. “It’s all wishful thinking, it’s not a reality,” he said.
Autonomous weapons systems still don’t make sense from a “cost-benefit analysis” perspective, said Franz-Stefan Gady, an associate fellow at the International Institute of Strategic Studies think tank. “None of these platforms currently are at a stage where they can genuinely be deployed at scale, consistently, and without a huge support infrastructure.”
Technologists, unsurprisingly, have a different take. “If you zoom in, it’ll look like pilot projects,” said Louis Mosley, vice president for Europe for the U.S. artificial intelligence giant Palantir, which works in Ukraine. “But when you zoom out, you can see that this is now the way the war is being fought.”
Peter W. Singer, a defense analyst and author of the 2009 best-seller “Wired for War”, sees the war in Ukraine as playing a similar role to the Spanish Civil War, which served as a dress rehearsal for new techniques and technologies ahead of World War II. Modern tank warfare and aerial bombing — as captured in Pablo Picasso’s dramatic painting “Guernica” — were arguably forged in the Spanish crucible.
“There’s a bit of debate in defense circles that in Ukraine, yes, you’ve got drones but also trenches and cannons, so nothing’s really changing,” Singer said. “That’s like looking at Spain in 1936 and saying, ‘They are still using rifles’ and ignoring that they are bringing together these new technologies like aerial bombing.”
As Ukraine and Russia fight, “everyone else is watching and learning. And they’re not watching and learning about trenches, but about the use of drones and AI and how it’s being brought together,” Singer said.

Ukraine has attracted dozens of Western defense and technology companies eager to test — or advertise — their offerings in a live-fire situation. “If companies want to do something in the field of defense innovations, they have to be in Ukraine,” said Brave1’s Chief Operating Officer, Nataliia Kushnerska. “Ukraine is definitely the most dynamic innovation ecosystem in the world.”
One of those companies is Quantum Systems, which has deployed 400 reconnaissance drones in Ukraine, with a contract to deliver 800 more and eventually build a factory in the country, according to its CEO Florian Seibel, who accompanied Germany’s Vice Chancellor Robert Habeck on a mission to the country last month.
The company has made waves as far away as Silicon Valley: German-American billionaire Peter Thiel is an investor and Seibel met emissaries from the venture capital firm Sequoia just before sitting down with POLITICO in his office, a squat, drone-filled building in an industrial park outside Munich.
Quantum Systems’ drones are expensive, at €200,000 each, but they use AI to overcome electronic warfare with preloaded maps and landmarks to navigate a GPS-denied environment, and machine vision to flag foes. “You can completely take operators out of the loop,” Seibel said.
While Quantum Systems’ drones only work as spotters, Seibel said the Ukrainian experience convinced him to form a new company called Stark Defense to develop autonomous weapons with full-strike capability. “If we don’t want our kids to fight Chinese war robots in the future, we have to get going and work on robots ourselves,” he said. Crucially, he added, Stark’s systems will be “capable of operating without a human in the loop.”
He acknowledged that this was a controversial approach. “We will prepare the grounds so that it is possible, but in the end, it’s not my decision,” he said. “If the decision of the German government is that we cannot have autonomous weapons with no human in the loop, well, then this will not be used.”
Stark is still, to a certain extent, in “stealth” mode — recruiting, seeking venture capital investment — but it would theoretically be able to deploy its products on short notice, he said. “We will deploy whenever we think it’s ready to be viable, and whenever the Ukrainians decide to,” he said.
Has the company already tested its drones in Ukraine?
“Maybe,” Seibel said.
Autonomous weapons: ‘The Oppenheimer moment of our generation’
Among the people watching what’s happening in Ukraine are those seeking to prevent the technology from ever coming to fruition. In late April, delegates from 143 countries joined activists, academics and at least one journalist in the caryatid-graced hall of the Hofburg imperial palace in Vienna to attend the biggest ever conference on autonomous weapons systems.
Above a bust of the 19th-century Emperor Franz Joseph, a screen displayed digital images of autonomous drones devastating cities with firework-like effects. Nearby, a blueish cut-out of Austria-born Schwarzenegger’s Terminator fixed visitors with glowing red eyes and a skeletal steel grin.

“This is the ‘Oppenheimer moment’ of our generation,” Austrian Foreign Minister Alexander Schallenberg told the assembled audience. “Autonomous weapons systems will soon fill the world’s battlefields … Technology is moving ahead with racing speed, while politics is lagging behind.”
He ceded the floor to Jaan Tallinn, an Estonian tech entrepreneur who has warned against the dangers posed by AI. The risk that comes with autonomous weapons (or killer robots, as their critics call them) is of a “suicidal arms race” to more autonomy, he said. Mass killing will become a mechanized, easy affair. AI-controlled drones will be deployed in assassinations or by terrorists targeting civilians. “When you opt for speed, you give up control,” Tallinn said.
For Anthony Aguirre, the executive director of the Tallinn-backed think tank the Future of Life Institute, the problem is not so much what is taking place in Ukraine right now — but what will happen when powerhouses like China and the U.S. get in the game. “If you end up with half a million drones in a shipping container able to go out and kill roughly half a million people — that’s a WMD,” he said.
The solution, Schallenberg said in an interview after the event, is “a legally binding international instrument” — a treaty banning fully autonomous weapons whose actions cannot be predicted or explained (AI algorithms are notoriously opaque) by their operators.
Alexander Kmentt, the Austrian diplomat who organized the conference, said that the goal is to ban “weapons designed in a way that they could not be used in accordance with international humanitarian law.”
The Austrian foreign minister acknowledged that many countries, including the United States and Russia, oppose the initiative and made that clear during the conference.
Last summer, U.S. Deputy Secretary of Defense Kathleen Hicks announced the Pentagon is seeking to build thousands of “attritable, autonomous systems” able to “overcome [China’s] biggest advantage, which is mass.” In early May, the secretary of the U.S. Air Force Frank Kendall took a ride in an AI-controlled F-16, an early version of what the service hopes will be a fleet of 1,000 unmanned warplanes, with the first deployed by 2028. The U.S. Marines are reportedly testing rifle-equipped robotic dogs capable of scanning a battlefield for targets before requesting permission to attack.
In early 2023, France’s Defense Innovation Agency put out a call for tenders for two national efforts to develop kamikaze drones, some of which will be delivered to Ukraine in the coming months. And on Thursday, the Bundestag was expected to begin discussion on a motion entitled “Building a Drone Army.”
China too is developing AI-augmented drones. Its military has test-flown drones capable of carrying an assault to its explosive conclusion after they were cut off from their operators. On the other side of the Taiwan Strait, Taiwan President Tsai Ing-wen has said she has drawn “great inspiration” from the use of autonomous weapons in Ukraine and pledged to close the drone gap with Beijing. Iran and — of course — Russia are also working to integrate AI into their fleets.
“The fact that maybe not everybody would agree at the very beginning is maybe an obstacle, but not an excuse,” said Schallenberg. He pointed to earlier treaties on landmines and cluster munitions as examples of rules that, while not backed by everyone, including the U.S., established a baseline of decency. “Countries that do not abide by the rules — even if they’ve not ratified the treaty — now feel obliged to explain themselves,” he said.
For Singer, the military futurist, the Rubicon on autonomous weapons has already been crossed. “Autonomy was this big red line, and we crossed it without a lot of hullabaloo,” he said.
“If your vision of the future is using more robotic systems at the end of a long leash — a physical wire or wireless … The enemy is going to cut that wire through electronic or cyber warfare,” he added. “Then you will have the option of either doing nothing about it — or going for more autonomy.”
Ukraine’s delegation at the Vienna conference did not make a statement, but Fedorov told POLITICO he did not believe in “quick decisions to limit such technologies, because they are actively developing and there is still no stable experience of how they can have a negative impact.”
“I am sure that the question of limiting the use of AI should be resolved,” he said. “After our victory.”