Robotic Personalities: Lovers And Soldiers - Alternative View


Everyone remembers the three laws of robotics formulated by Isaac Asimov back in the 1940s. At the time, a future inhabited by robots seemed like fantasy, but today robots have become a reality, and it has turned out that three laws regulating the relationship between robots and humans are not enough. We asked Oksana Moroz, a culturologist and associate professor at RANEPA and MSSES, to discuss the ethical problems posed by the intrusion of robots into our lives.

Robotics is developing so rapidly that machines seem set to become humans' permanent partners in almost every area of activity. Experts already expect social robots to spread widely, and by 2020 they predict the widespread emergence of "smart" enterprises. The machine not as a participant in an uprising against humanity but as an assistant, a necessary and useful element of a technocratic society: this is the image that emerges from the dialogue between artificial intelligence and its creators.

This rapprochement gives rise to a struggle over how to regulate relations between beings of human and non-human nature. Over the past year, the legitimization of robots' rights and the movement to recognize the machine as a subject of law have become an integral part of the political agenda: recall the stories about the gynoid Sophia, who received citizenship, or the European Parliament resolution that outlined civil-law norms on robotics and laid the foundation for a future Robotics Charter.

The drafting of all these documents may look like a political exercise in futurology. Yet such conventions are needed now, if only because humanity is drawing machines into relationships that require legal regulation and the definition of mutual obligations. Isaac Asimov's three laws of robotics can hardly be considered sufficient, ethically and even less so formally, to support such interactions.

For example, engineers are racing to develop a new type of robotic assistant that can not only relieve loneliness but also satisfy the natural human need for sexual pleasure. And while the tabloids circulate news about androids with bionic penises, activists seriously fear the negative impact of machines on human intimate practices. Against the background of ever-multiplying harassment scandals, a future in which potentially any sexual habit is satisfied by a submissive sex doll only seems cloudless.

People, it seems, still do not always know how to negotiate the boundaries of permissible, acceptable and undesirable sexual behavior, and partners are rarely inclined to discuss this side of a relationship in dialogue. When machines, programmed to meekly fulfill the owner's every wish and equipped with algorithms for learning his or her tastes, enter this zone, hard to formalize yet in need of some regulation, they only at first glance look like salvation.

In fact, they dehumanize sex, turning it into the use of an object deprived of any voice or will, and, according to human rights activists, they thereby provoke the growth of misogyny and misandry. People are capable of feeling sympathy for robots, even ones that are barely anthropomorphic, and of empathizing with them; highly anthropomorphic machines become the object of such emotions all the more easily, emotions implicitly based on recognizing a certain identity in the programmed object. The habit of using sex robots, whose identity consists of external attractiveness and active humility, support, and the fulfillment of the owner's every demand, can lead to recognizing only such behavior from a counterpart as the norm, and even to transferring exactly this form of relationship onto living people, or to refusing to communicate with "organic" partners altogether.

Robots, by the way, are good not only in bed but also at war. At least one of Boston Dynamics' infamous four-legged creatures was produced with direct funding from the US Defense Advanced Research Projects Agency (DARPA) under the Maximum Mobility and Manipulation program. The use of drones in the fight against international terrorism, and more broadly in the military operations conducted in the Middle East as so-called low-intensity conflicts, is a story that in the 2010s can only nominally be called a secret.


Some experts believe the digitalization of war is altogether a consequence of the dot-com bubble: back then, new technologies served to increase the efficiency of traditional business, and what could be a more traditional, more classic profit-making activity than war?

Others believe that the price states and citizens pay to digitize war is too high. The impossibility of completely removing the human factor from the process of remotely destroying victims leads to new forms of PTSD in drone operators, and fully adequate methods of treating this condition have not yet been found. Moreover, officially recognizing this type of service as potentially traumatic would damage the blissful image being cultivated by every means around remote military operations.

On the other hand, the development of robotics, and especially of artificial intelligence whose decisions are not always clear even to its developers, is not just another step toward automating war as an extremely resource- and energy-intensive kind of human activity. It is a fundamental intrusion into ethical conventions that have been woven into the laws of war for centuries and that in modern times have served as the basis for a whole body of legal norms and principles, International Humanitarian Law. It is the introduction of mathematical logic and algorithms into an equation whose terms used to be "human, all too human", together creating something that can learn but definitely cannot make moral choices, that is, lacks a skill essential to any discussion of matters of life and death.

It is extremely interesting to observe the endless ethical paradoxes that accompany the nascent cooperation between humanity and robotic systems claiming to possess some kind of identity. But the prospects for collaboration between people and machines will become truly interesting when we become acquainted not with the fantastic assumptions of writers or IT evangelists about what robots care about, but with the opinion of these beings of non-human nature themselves.

Oksana Moroz
