Killer Robots - It Is No Longer A Fantasy, But A Reality - Alternative View

It must be admitted that robotics has come a long way in recent years. The weapons created by defense companies are becoming smarter, artificial intelligence systems are being integrated into them, robots are gaining full autonomy, and so on. This means that a killer robot could become a reality sooner than we think. At least, that is the view of PAX, a non-profit organization based in the Netherlands that advocates for world peace. The group announced this in a report covered by the magazine Quartz.

Why build killer robots?

Killer robots are designed to make decisions about taking or preserving life on their own, without human control. PAX specialists have called this alarming development "the third revolution in warfare," after the invention of gunpowder and the atomic bomb. Both activists and states are calling for an international set of rules governing the creation of such weapons, or even an outright ban on their use. But some countries, including the United States, China and Russia, have not yet taken action on the issue.

PAX specialists have identified at least 30 global arms manufacturers that have no policy against developing such weapons systems. These include the American defense firms Lockheed Martin, Boeing and Raytheon, the Chinese state conglomerates AVIC and CASC, the Israeli firms IAI, Elbit and Rafael, Russia's Rostec, and Turkey's STM.

At the same time, the activists do not consider the military application of artificial intelligence as such to be the problem. The problem is precisely that such systems can slip beyond human control.

For example, the US military is already developing an AI-equipped cannon that will independently select and hit targets, as well as AI-equipped tanks that will be able to "identify and engage targets three times faster than any human." And STM, the Turkish state defense company, is already producing an AI-powered drone called KARGU at full pace. Equipped with facial recognition capabilities, KARGU can autonomously select and attack targets using coordinates pre-selected by an operator. Turkey reportedly intends to use KARGU in Syria.

PAX is most concerned about the potential deployment of AI in offensive systems that would select and attack targets on their own, without human oversight. The group asks how such weapons would distinguish between combatants and civilians. Moreover, lawyers still do not know who would be held responsible if autonomous weapons violated international law.

Promotional video: the Turkish drone KARGU tracks down its target and destroys it, diving from above like a kamikaze.

However, unlike Google or Amazon, which have faced both public and internal backlash over their work on military systems, companies like Lockheed Martin and Raytheon deal exclusively with the military, so they face minimal pushback from partners or from ordinary people, since most of their developments remain classified until a certain point.

While the development of autonomous weapons continues, PAX believes there is still a way to prevent a possible catastrophe. The group says that manufacturing companies can play a critical role here and must refuse to produce fully autonomous lethal weapons. As for AI-enabled weapons systems, PAX officials say defense firms must follow a set of rules that has yet to be developed. But no one is calling for abandoning AI entirely.

Vladimir Kuznetsov