Trump’s strike on Iran and the new generation of AI warfare mean bombs could fall faster than the speed of thought


AI has entered the war room, and it’s not going anywhere anytime soon, according to experts.

Although President Donald Trump has asked federal agencies and military contractors to stop working with Anthropic, the U.S. military used the company’s AI model, Claude, in its attack on Iran, according to The Wall Street Journal.

Now, some experts are raising concerns about the use of artificial intelligence in military operations. “The AI machine makes recommendations about what to target, which is actually much faster than the speed of thought in some ways,” Dr. Craig Jones, author of The War Lawyers: The United States, Israel, and Juridical Warfare, which examines the role of military lawyers in modern warfare, told The Guardian.

In conversation with Fortune, Jones, a Newcastle University lecturer on war and conflict, said AI had dramatically accelerated the “kill chain,” compressing the time from initial target identification to final strike. He said the U.S.-Israeli strikes on Iran, which resulted in the death of Ayatollah Ali Khamenei, might not have occurred in the absence of artificial intelligence.

“It would have been impossible, almost impossible, to do it that way,” Jones told Fortune. “I think the speed with which they were executed and the size and magnitude of the strikes were aided by artificial intelligence.”

The Pentagon has tapped AI companies to speed up and enhance war planning, entering into a partnership with Anthropic in 2024 that collapsed last week over disagreements about the use of the company’s AI model, Claude. But OpenAI quickly signed a deal with the Pentagon, and Elon Musk’s xAI reached an agreement to use the company’s AI model, Grok, in classified systems. The U.S. Army also uses Palantir’s data-mining software to gain AI-powered insights for decision-making.

Artificial intelligence on the battlefield

The U.S. Air Force has used the “speed of thought” as a benchmark for decision-making pace for years, Jones said. The time from collecting intelligence, such as aerial reconnaissance, to executing a bombing mission could take up to six months during World War II and the Vietnam War, he said. AI has compressed that timeline significantly.

The main role of AI tools in the war room is to quickly analyze massive amounts of data. “We’re talking about terabytes and terabytes and terabytes of data, everything from aerial imagery, human intelligence, cyber intelligence, cell phone tracking, anything and everything,” Jones said.

Dr. Amir Husain, co-author of Hyperwar: Conflict and Competition in the Artificial Intelligence Century, said AI is being used to compress the U.S. military’s decision-making framework, known as the OODA loop — short for Observe, Orient, Decide, and Act. He said artificial intelligence already plays an important role in observation, such as interpreting satellite and electronic data; in decision-making at the tactical level; and in the “act” phase, particularly through autonomous drones that must operate without human guidance when signals are jammed. Some of those drones are, in fact, imitations of Iran’s Shahed drones.

Artificial intelligence has also appeared on other battlefields. Israel has reportedly used artificial intelligence to identify Hamas targets during the Israel-Hamas war. And autonomous drones have been deployed on the front lines of the Russo-Ukrainian War, where both Russia and Ukraine use some variation of autonomous technology.

Multiplying the risks

However, Jones raised a number of concerns about AI-driven warfare. “The problem when you add artificial intelligence to that is that you multiply the errors by orders of magnitude,” he said.

Human error certainly exists with or without artificial intelligence technology, Jones said, citing the 2003 US invasion of Iraq as a conflict built on flawed intelligence gathering. But he said AI could exacerbate such errors thanks to the volume of data the technology analyzes.

AI warfare also raises a series of ethical questions, particularly around accountability, something Husain said the Geneva Conventions and the laws of armed conflict already require states to uphold. As artificial intelligence blurs the lines between machine and human decision-making, he said, the international community must ensure that human responsibility is attributed to all actions on the battlefield.

“The laws of armed conflict require us to blame a person,” Husain said. “A person must be responsible regardless of the level of automation used on the battlefield.”
