MarketAlert – Real-Time Market & Crypto News, Analysis & Alerts
‘A real risk’: the rise of weapons that can act alone

Last updated: October 29, 2025 9:10 pm
Published: 4 months ago
“Latest: 8,300 students killed. 12 universities hit,” reads a news ticker at the bottom of the screen. “Authorities are still struggling to make sense of an attack on university campuses worldwide, which targeted some students and not others,” the newsreader says.

This scene is from Slaughterbots, a short, fictional film made by the Future of Life Institute, a California-based group that has called for a ban on autonomous weapons enabled by artificial intelligence. In the story, an executive at an arms company proudly pitches swarms of tiny drones that use facial recognition to spot and execute its customers’ enemies in a crowd. Spoiler alert: things don’t end well for student activists targeted by the weapons system because of their online political profiles.

At the time of the film’s 2017 release, Paul Scharre, a former special-operations infantry soldier and executive vice-president of the Center for a New American Security, a think tank in Washington DC, said that although a Slaughterbots-style scenario was technologically feasible, there were defensive measures against the type of attack portrayed.

Scharre, who has helped to draft US government policies on autonomous weapons and published a book on the topic in 2019, fears that humanity is at risk of an arms race that could lead to unreliable, hard-to-control machines with the power to decide when and whom to kill. Nature spoke to Scharre about how technology is enabling these new weapons systems.

There is no internationally agreed definition of autonomous weapons, which can complicate discussions about them. I define them as weapons that, once launched by humans, can search for, find and attack targets on their own. Since the 1980s, many countries have acquired air-defence systems that can automatically track and shoot down incoming threats, for example.

When people debate autonomous weapons today, however, they are generally referring to those used in offensive combat. An Israeli drone system called Harpy hunts targets that emit radar signals, such as ground-based air defences designed to detect incoming aircraft and missiles. Such offensive systems are designed to be used in specific scenarios, and I’m not aware that they have been used widely in combat.

Humans are still generally making the final decisions about targets, but the pace of technological development means I can certainly see a world in which offensive weapons systems with greater autonomy become widely used.

Ukraine has been experimenting with autonomous terminal-guidance systems, also called the autonomous last-mile solution. In this kind of system, human operators navigate a drone towards a moving vehicle or a person, for example, and once it has locked on, even if the communications link to the operator is severed, the drone can continue to track and then strike the target on its own. Although a human is still choosing the target, these systems are a potential stepping stone towards weapons with greater autonomy.

As well as terminal-guidance systems, we have also seen drones equipped with machine learning being used to identify targets in Ukraine. It is not a huge leap to see these two technologies being combined to enable systems to hunt for and attack targets without human involvement.

Countries are investing in ways to counter the increased use of drones, such as jamming their communications. If drones can operate autonomously, then jamming isn’t such a big problem. So I think we’re going to see more and more autonomy in warfare.

One vision of the future is that autonomous weapons systems are given the authority to strike targets within kill zones, or ranges of space and time, on the basis of certain criteria.

One risk is that, over time, militaries will hand over more decision-making responsibility to machines, potentially leading to greater suffering in war. If humans delegate decisions to autonomous weapons, they might feel less morally responsible if the systems attack civilians or cause excessive collateral damage and deaths.

There’s a risk that, when you look to identify responsibility, militaries will say that no one is responsible because a machine made that decision. We need to ask whether this is an acceptable answer and whether we are comfortable handing over life-and-death decisions to machines. There is nothing specific in international law that states it has to be a human that makes the decision to kill. However, existing laws governing conduct on the battlefield are designed to reduce civilian casualties, ensure attacks are militarily necessary and prevent the use of weapons that cause unnecessary suffering.

I am concerned about the trajectory the technology is on. We have no internationally agreed rules for how to use AI and autonomous weapons, and that’s a problem, because although some countries might be careful about how they are used, others certainly won’t be. Some international humanitarian organizations and some nation states have argued that there should be an international treaty that would ban the use of lethal autonomous weapons. There are some efforts to make headway on that at the United Nations, but it’s hard to see how effective a ban could be when major military powers including the United States, China and Russia don’t support it.

There’s a real risk that as militaries prioritize staying ahead of competitors, the concerns get shunted to one side and we end up with dangerous, unreliable weapons that are difficult to control. In the future, humans might develop autonomous weapons over which they technically have control but to which, in practice, they have ceded judgement about targeting people. Given the way these technologies are being used today, I think that’s a real danger in the coming years.

This news is powered by Nature.

© Market Alert News. All Rights Reserved.