Friday, June 10, 2022

Experts warn that UN failure to ban slaughterbots could spell the end of humanity

Written by Cassie B.



Experts in military strategy and artificial intelligence are raising the alarm after a recent UN conference in Geneva failed to reach an agreement on banning the use of so-called slaughterbots.

Slaughterbots is the name given to weapons that can select and apply force to targets without human intervention. These weapons make their decisions using artificial intelligence algorithms. Capable of hunting and striking targets without any input from human controllers, the technology is advancing so quickly that many fear societies and governments have not taken the time to fully consider the dangers.

This year, for the first time, most of the 125 nations in the UN Convention on Certain Conventional Weapons called for new laws governing the killer robots. However, some countries opposed the measure, such as the U.S. and Russia, both of which are known to be working on developing such weapons. Other nations that objected included India, Australia, and the UK, with some arguing that continued development of these killer robots is vital to avoid a strategic disadvantage.

Emilia Javorsky, who leads the Future of Life Institute's advocacy program on autonomous weapons, called the conference's inability to reach an agreement an "epic failure."

She added: “It is now blatantly clear this forum — whose unanimity requirement makes it easily derailed by any state with a vested interest — is utterly incapable of taking seriously, let alone meaningfully addressing, the urgent threats posed by emerging technologies such as artificial intelligence.”

Unfortunately, time appears to be running out, as slaughterbots are already being used on some battlefields. For example, a UN report published this spring showed that STM Kargu drones were used in the Libyan civil war. These small, portable rotary-wing attack drones have precision strike capabilities and were used to hunt down retreating soldiers.

The companies developing these drones are working on AI systems that can find a human target's thermal signature or even identify faces using a camera. However, the systems appear to lack the accuracy needed to reliably distinguish between combatants and non-combatants.

These weapons could be easy for anyone to obtain

The STM drones are among the most worrying for many officials, not least because of their resemblance to normal consumer drones. They are fairly inexpensive, easy to mass-produce, and can be equipped with guns. Some experts have warned that this accessibility means gangs and other criminals could try to use them.

Massachusetts Institute of Technology Professor Max Tegmark believes we’re headed for the “worst possible outcome.” He said: “That’s going to be the weapon of choice for basically anyone who wants to kill anyone. A slaughterbot would basically be able to anonymously assassinate anybody who’s pissed off anybody.”

Tegmark told The Sun some of the ways this technology could be used. For example, he pointed out that if slaughterbots cost about the same as AK-47s, drug cartels would use them to kill people without being caught. He also said that a judge protected by many bodyguards could still be killed by one of these drones if it were flown through their bedroom window while they slept.

Macalester College Professor James Dawes said: “It is a world where the sort of unavoidable algorithmic errors that plague even tech giants like Amazon and Google can now lead to the elimination of whole cities.”

“The world should not repeat the catastrophic mistakes of the nuclear arms race. It should not sleepwalk into dystopia,” he added.

There's no way it can end well when machines that are prone to unpredictable errors are allowed to make their own decisions about who to kill. If these artificial intelligence weapons were equipped with chemical, biological, or nuclear warheads, they could even wipe out humanity.

(In ROBOT.NEWS)




