How AI defeats humans on the battlefield | BBC News

Published 2024-07-25
An array of tools powered by artificial intelligence (AI) is under development or already in use in the defence sector.

For instance, BAE Systems, a global defence contractor, has unveiled the industry’s first AI-powered learning system, which aims to make military trainees “mission-ready” faster.

By blending human and machine intelligence, AI-enabled weapons such as autonomous systems can improve military capabilities through rapid data processing and more accurate targeting.

However, researchers and critics caution against this appetite for accelerating AI in the defence industry, noting that several of the companies developing these systems operate without checks on transparency and accountability.

AI presenter Priya Lakhani joins this week’s AI Decoded to discuss the military use of AI-enabled weapons.


Subscribe here: bit.ly/1rbfUog

For more news, analysis and features visit: www.bbc.com/news

#Technology #War #BBCNews

All Comments (21)
  • @fspg3207
    Down through history, humans have only ever sought evermore efficient ways to destroy themselves. - A log cabin in the wilderness sure looks good right now...
  • @tobyli52
    The tone, filter and music would be very different if BBC was reporting Chinese military with AI. 😂
  • @mybad.7164
    Crazy How she Smiles while she says Ai is Much more effective at Killing humans than Humans =)
  • I thought the point was to avoid war altogether not to celebrate advancements in mass killing.
  • @leevah
    Weapons with brains, perfect.
  • @HI-sw7vj
    I've made an AI-powered mechanical arm to wipe my arse. It's attached to the pot and goes live tomorrow.
  • @luihinwai1
    Can we just have AIs fighting each other in a simulation, instead of actually going to war?
  • @BsERoss
    This is the most dystopian s**t I've heard all year and BBC are reporting it as if it's just another weekly roundup of cool gadgets!?
  • @8ersoul8
    Laying out a perimeter with AI. Launch vehicles. It's amazing.
  • @honotron
    Cyberdyne will become the largest supplier of military computer systems. All stealth bombers are upgraded with Cyberdyne computers, becoming fully unmanned. Afterwards, they fly with a perfect operational record. The Skynet Funding Bill is passed. The system goes online August 4th. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug.
  • @KaiserV-2
    Hail our new AI Overlords! Neuro should rule the solar system!
  • @WormholeJim
    Came to this from a documentary about the Ukrainian attack on the Crimean bridge in July last year. The attack was originally carried out by five drones. But due to human decisions en route, three of those five drones did not have enough fuel to reach the bridge and had to be self-destructed. Of the remaining two, one missed its initial target, the railroad bridge, again due to a human decision. It came around and instead rammed load-bearing pillars under the road bridge. In the end, the bridge was severely damaged and could not be used at full capacity for a good while. But it didn't come down as intended, as would have happened if all five drones had managed to hit it, and it was repairable. If those drones had been autonomous, they likely would have brought the entire bridge down, and there is little doubt what the Ukrainians would have preferred. That is super scary to me, because it shows that it's only during peacetime that everyone agrees AI shouldn't be left with the power to decide whether to initiate an attack. During wartime that all changes, without anyone batting an eyelid.