Consciousness Under Control - Alternative View

The Internet has spawned subtle new forms of influence that can sway election results and manipulate what we say, think and do.

In the past century, a considerable number of writers have expressed concern about the future of humanity. In his book The Iron Heel, American writer Jack London presented a picture of a world in which a handful of wealthy corporate titans - "oligarchs" - keep the masses at bay with a brutal system of rewards and punishments. Most of the population is enslaved, while those who are fortunate receive decent wages that allow them to live comfortably - but without any real control over their lives.

In the novel We (1924), the brilliant Russian writer Yevgeny Zamyatin, anticipating the excesses of the emerging Soviet Union, described a world in which people are controlled through total surveillance. The walls of their houses are made of glass, so everything they do is visible. They are allowed to draw the curtains for only one hour a day to have sex, and even then both the date and the lover must be registered with the state in advance.

In Brave New World (1932), British author Aldous Huxley painted a picture of a near-perfect society in which unhappiness and aggression have been engineered out of humanity through a combination of genetic engineering and psychological conditioning. In the much darker novel 1984 (1949), Huxley's compatriot George Orwell depicted a society in which thought itself is controlled; in Orwell's world, children are taught to use a simplified form of English called Newspeak so that they can never express ideas that are dangerous to the regime.

These are all fictional stories, of course, and in each of them the leaders use overt forms of control that at least some people resist, sometimes successfully. But in the bestselling book The Hidden Persuaders (1957) - recently republished in a 50th-anniversary edition - American journalist Vance Packard described a "strange and rather exotic" type of influence that was spreading rapidly in the United States and that was, in a way, more threatening than the fictional types of control portrayed in those novels. According to Packard, American corporate executives and politicians were beginning to use subtle and in many cases entirely undetectable methods to change people's thinking, emotions and behavior - methods drawn from psychiatry and the social sciences.

Many of us have heard of at least one of these methods: subliminal stimulation, or what Packard called "subthreshold effects" - short messages that tell us what to do but are flashed so briefly that we are not aware of having seen them. In 1958, the National Association of Broadcasters, which sets standards for US television, was forced to respond to concerns that a movie theater in New Jersey had been flashing hidden messages in films to increase ice cream sales. In response, the association changed its rules and banned the use of subliminal messages in broadcasting. In 1974, the Federal Communications Commission declared the use of such messages "contrary to the public interest." A bill banning subliminal messages was also submitted to the US Congress but was never enacted. Both the United Kingdom and Australia have strict laws against their use.

Subliminal stimulation is probably still in use in the United States - it is difficult, after all, to establish whether it is being used or not - but studies suggest that its impact is small and limited mainly to people who are already motivated to follow the suggestion; subliminal messages urging us to drink affect only people who are already thirsty.

Packard uncovered a much more serious problem: powerful corporations were constantly looking for - and in some cases already applying - a wide range of techniques for controlling people's behavior without their awareness. He described a kind of cabal in which marketers worked closely with social scientists to determine, among other things, how to get people to buy things they don't need and how to condition children to become good consumers - inclinations of exactly the kind that are openly instilled, through special training, in Huxley's Brave New World. Drawing on modern science, marketers quickly learned to exploit people's insecurities, weaknesses, subconscious fears, aggressive impulses and sexual desires in order to change their thinking, emotions and behavior, while leaving them entirely unaware that they were being manipulated.

By the early 1950s, Packard noted, politicians had gotten the message and begun marketing themselves using the same subtle forces used to sell soap. As the epigraph to his chapter on politics, Packard quoted an alarming statement by the British economist Kenneth Boulding: "A world of unprecedented dictatorship is possible, still using the forms of democratic government." Could this really happen, and if so, how would it work?

The forces Packard described have only grown more pervasive in recent decades. The soothing music we hear in supermarkets makes us move more slowly and buy more, whether we need it or not. Most of the vapid thoughts and intense feelings our teenagers experience from morning to night are carefully orchestrated by highly skilled marketing professionals in the fashion and entertainment industries. Politicians work with a wide range of consultants who test every aspect of a candidate's bid for voters' support - intonation, facial expressions, makeup, hairstyle and wording are all optimized, just like the packaging of a breakfast cereal. Some of these persuaders want us to buy or believe one thing, others a competing thing - and it is the competitive nature of our society that ultimately allows us to remain relatively free.

But what happens if sources of control begin to emerge that have little or nothing to do with competition? What happens if new controls are developed that are significantly more powerful - and much more subtle - than any of the previous options? And what if new types of control allowed a small group of people to have a huge impact not only on the citizens of the United States, but also on most of the world's inhabitants?

You might be surprised if I say that all of this is already happening.

To understand how these new forms of mind control work, we first need to look at search engines - one in particular: the largest and best of them all, Google. The Google search engine is so good and so popular that the company's name has become a common verb in languages around the world. To "google" something is to look it up with the Google search engine, and that is in fact how most computer users on the planet now get most of their information about almost everything. Google has become a gateway to virtually all knowledge, mainly because it performs its function so well, giving us exactly the information we are looking for almost instantly, nearly always in the first positions of the list of "search results" it returns for our query.

Those search results are so good, in fact, that about 50% of our clicks go to the first two positions, and more than 90% of our clicks go to the 10 items listed on the first page of results; few people look at the other pages, even though they often number in the thousands, which means they presumably contain plenty of good information. Google decides which of the billions of web pages will be included in our search results and in what order they will be ranked. How it does this is one of the world's best-kept secrets - like the formula for Coca-Cola.
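The concentration of attention at the top of the list is easy to see with a toy model. The click-through rates below are invented for illustration - they are assumptions, not Google data - chosen only to roughly match the figures cited above: about half of all clicks on the first two results, and over 90% on the first page.

```python
# Hypothetical click-through rates by search-result rank (assumed values,
# chosen to roughly match the article's figures; not real measurements).
ctr_by_rank = [0.32, 0.18, 0.11, 0.08, 0.06, 0.05, 0.04, 0.03, 0.02, 0.02]

top_two = sum(ctr_by_rank[:2])   # share of clicks on positions 1-2
first_page = sum(ctr_by_rank)    # share of clicks on positions 1-10

print(f"top two results: {top_two:.0%}")     # -> 50%
print(f"first page:      {first_page:.0%}")  # -> 91%
```

Under a distribution like this, nudging a page from position 11 to position 3 captures a large share of all clicks - which is why ranking position is worth fighting over.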

Because people are so much more likely to click on the top search positions, companies now spend billions of dollars every year trying to influence Google's search algorithm - the computer program that selects and ranks the items in the list - in the hope of lifting their pages a few positions higher in the results. Those few positions can spell the difference between success and failure in business, and a spot at the top of the list can be the key to a healthy profit.

Toward the end of 2012, I began to ask myself: could high positions in search results influence more than just consumer choices?

Maybe, I thought, a high position on the results list also shapes how people think about all sorts of things. In early 2013, with my colleague Ronald E Robertson of the American Institute for Behavioral Research and Technology in Vista, California, I put this idea to the test in an experiment in which 102 participants from the San Diego area were randomly assigned to three groups. In one group, participants saw search results that favored one political candidate - that is, results linking to websites that made that candidate look better than his or her rival. In the second group, people saw results arranged to favor the opposing candidate, and in the third group - the control group - people saw a mixed arrangement that favored neither candidate. All three groups had access to the same search results and web pages; the only difference was the order in which the results were presented.

To make the experiment realistic, we used real search results linked to real web pages. We also used a real election - the 2010 election for Prime Minister of Australia. We chose a foreign election for one reason: we had to make sure our participants had no preexisting preferences, which was guaranteed by their unfamiliarity with the candidates. Through advertising, we also recruited an ethnically diverse group of registered voters of various ages, matched to key characteristics of the US electorate.

All participants were first given short descriptions of the candidates and then asked to rate them on various measures, as well as to indicate which candidate they would vote for. As you might expect, participants initially favored neither candidate on any of the five measures we used, and their votes were split evenly across the three groups. Participants were then given 15 minutes to research the candidates online using Kadoodle, our mock search engine, which gave them access to five pages of search results linked to actual web pages. Participants could move freely between the search results and the websites - just as we do when we use Google. When they finished their search, we asked them to rate the candidates again and tell us once more whom they would vote for.

We predicted that the opinions and preferences of people in the biased groups - those who saw rankings favoring one candidate - would shift toward that candidate by perhaps 2% or 3%. What we actually found astonished us. The proportion of people favoring the search engine's top-ranked candidate increased by 48.4%, and all five of our measures shifted in that candidate's favor. What's more, 75% of the people in the biased groups seemed entirely unaware that they had been viewing biased search results. In the control group, opinions barely shifted. This seemed to be a major discovery. We named the shift we had produced the "search engine manipulation effect" (SEME, pronounced "seem"), and it may be one of the largest effects on human behavior ever recorded. We did not rush to open the champagne, however: for one thing, we had tested only a small group of people, all from the San Diego area.
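One plausible way to read a shift figure like the 48.4% above is as the relative increase, after the search session, in the number of biased-group participants favoring the ranking-advantaged candidate. The counts below are invented for illustration; they are not the study's actual data.

```python
def vote_shift(pre: int, post: int) -> float:
    """Relative change in the number of participants favoring the
    candidate whom the search rankings were biased toward."""
    return (post - pre) / pre

# Invented example counts: 31 supporters before the search session,
# 46 after, among participants who saw the biased rankings.
print(f"{vote_shift(31, 46):.1%}")  # -> 48.4%
```

The same function applied to pre/post counts in a control group would show the near-zero shift described above.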

Over the next year, we replicated our study three more times, the third time with more than 2,000 people from all 50 US states. In that experiment, the shift in voting preferences was 37.1% - and higher still in some demographic groups, reaching as much as 80%.

In this series of experiments we also learned that by slightly reducing the bias on the first page of search results - specifically, by including one item favoring the other candidate in the third or fourth position - we could mask our manipulation so well that few or none of the participants noticed they were seeing a biased ranking, while still producing large shifts in voting preferences. Our results were substantial and consistent, but they all concerned a foreign election - Australia's in 2010. Could the preferences of real voters be shifted in the middle of a real campaign? We were skeptical. In real elections, people are bombarded with information from many sources, and they already know a great deal about the candidates. It seemed unlikely that a single search experience could meaningfully affect voting preferences.

To answer this question, in early 2014 we went to India just before the start of the largest democratic election in the world - the Lok Sabha election for Prime Minister of India. The three main candidates were Rahul Gandhi, Arvind Kejriwal and Narendra Modi. Using online recruitment as well as online and print advertising, we enrolled 2,150 people from 27 of India's 35 states and territories in the study. To participate, they had to be registered voters who had not yet decided whom to vote for.

All participants were randomly assigned to three groups, shown rankings favoring Gandhi, Kejriwal or Modi, respectively. As expected, familiarity with the candidates was high - between 7.7 and 8.5 on a 10-point scale. We thought our manipulation would therefore have little or no effect, but we found otherwise. On average, we were able to shift the proportion of people favoring a given candidate by more than 20% overall, and by more than 60% in some demographic groups. Even more alarming, 99.5% of participants showed no awareness that they were viewing biased search rankings - in other words, no awareness that they were being manipulated.

The near-invisibility of SEME is curious. It means that when people - you and me included - look at biased search results, everything seems fine. So if you google "US presidential candidates" right now, the results you see will probably look quite random, even if they favor one candidate.

Even I have trouble spotting bias in search rankings that I know in advance to be biased (because they were prepared by my colleagues). Yet our randomized, controlled experiments show the same thing again and again: when higher-ranked items link to web pages favoring one candidate, undecided voters shift dramatically toward that candidate, largely for one simple reason - people tend to click only on the highest-ranked items. This is truly scary: like subliminal stimuli, SEME is a force you cannot see; but unlike subliminal stimuli, it has an enormous impact - like Casper the friendly ghost shoving you down a flight of stairs.

We published a detailed account of our first five SEME experiments in the prestigious Proceedings of the National Academy of Sciences (PNAS) in August 2015. We had found something important, especially given Google's dominance of search. Google has a near-monopoly on internet searches in the United States: according to the Pew Research Center, 83% of Americans name Google as their most-used search engine. So if Google favors one candidate in an election, its influence over undecided voters could easily decide the outcome.

Keep in mind that we exposed our participants to biased rankings only once. What if such bias in favor of one candidate were applied to people's searches over the weeks or months leading up to an election? The impact would likely be far greater than what we saw in our experiments.

Other sources of campaign influence are counterbalanced by competing sources of information - a wide selection of newspapers, radio programs and television networks, for example - but Google, for all its good intentions, has no real rival, and people generally trust search results, assuming that the company's mysterious search algorithms are entirely objective and unbiased. That high level of trust, combined with the absence of competition, puts Google in a unique position to influence elections. Even more alarming, search rankings are completely unregulated, so Google could favor any candidate it likes without breaking any law. Some courts have even ruled that Google's right to rank search results as it sees fit is protected as a form of free speech.

Has Google ever favored a candidate? In the 2012 US presidential election, Google and its top executives donated $800,000 to President Obama and only $37,000 to his rival Mitt Romney. And in 2015, a team of researchers from the University of Maryland and other universities reported that Google's search results routinely favored Democratic candidates. But are Google's search rankings really biased?

An internal 2012 report by the US Federal Trade Commission concluded that Google's search rankings routinely put Google's own financial interests ahead of those of its competitors, and the antitrust investigations currently underway against Google in both the European Union and India rest on similar findings.

In most other countries, 90% of online searches are conducted on Google, which gives the company even greater power to influence elections there than in the United States - and as internet penetration grows worldwide, so does Google's reach. In our PNAS article, Robertson and I ran the relevant calculations and reached the following conclusion: Google now has the power to shift the votes of 25% of the electorate in national elections around the world with no one suspecting a thing. Indeed, by our reckoning, the rankings of Google's search results - with or without deliberate planning by company executives - have been affecting election outcomes for years, and that influence grows every year. And because search rankings are ephemeral, they leave no paper trail, giving the company deniability.

Power of this magnitude and this kind of stealth is unprecedented in human history. But, as it turns out, our discovery regarding the SEME effect is just the tip of a huge iceberg.

Recent reports suggest that the Democratic presidential candidate Hillary Clinton is making heavy use of social media to expand her following - Twitter, Instagram, Snapchat and Facebook among them. At this writing, she has 5.4 million followers on Twitter, and her staff tweets several times an hour during every waking moment. The Republican front-runner, Donald Trump, has 5.9 million Twitter followers and posts with similar frequency.

Is social media as big a threat to democracy as search rankings appear to be? Not necessarily. When new technologies are used competitively, they pose no threat. Even on the newer platforms, they tend to be used the way billboards and television commercials have been used for decades: you put up a billboard on one side of the street; I put mine up on the other. I might have the money for more billboards than you, but the process is still competitive.

But what happens when such technologies are misused by the companies that own them? A study by Robert M Bond, now a professor of political science at Ohio State University, and colleagues, published in Nature in 2012, described an ethically questionable experiment in which Facebook sent a "go out and vote" reminder to more than 60 million of its users. The reminder caused about 340,000 people to vote who would otherwise have stayed home. Writing in The New Republic in 2014, Jonathan Zittrain, professor of international law at Harvard University, observed that, given the enormous amount of information Facebook has accumulated about its users, the company could easily send such reminders only to people who support a particular party or candidate - and that doing so could tip the outcome of a close election without anyone ever finding out. And because advertisements, like search rankings, are ephemeral, manipulating an election this way would leave no paper trail. Are there laws prohibiting Facebook from targeting advertisements at specific users? None at all; in fact, targeted advertising is precisely how Facebook makes its money. Is Facebook currently manipulating elections this way? No one knows, but in my view it would be foolish - and perhaps even improper - for Facebook not to do so. Some candidates are better for the company than others, and Facebook's executives have a fiduciary duty to their shareholders to advance the company's interests.

Bond's study was largely ignored, but another Facebook experiment, reported in PNAS, provoked protests around the world. In that study, over the course of a week, 689,000 Facebook users were sent news feeds that contained either an excess of positive terms, an excess of negative terms, or neither. People in the first group then used correspondingly more positive language in their own posts, and those in the second group more negative language. The study was said to demonstrate that social media can deliberately manipulate people's "emotional states" on a massive scale - a notion many found disturbing. People were also troubled that this large-scale experiment in emotional manipulation had been conducted without the explicit consent of any of its participants.

Facebook undoubtedly holds an enormous amount of data about its users, but it pales beside Google's, which is collected 24 hours a day, seven days a week, through more than 60 different observation platforms - the search engine, of course, but also Google Wallet, Google Maps, Google AdWords, Google Analytics, Google Docs, Android, YouTube and many more. Gmail users tend to forget that Google stores and analyzes every email they write, even the drafts they never send - as well as all the incoming messages they receive, both from other Gmail users and from users of other services.

According to Google's privacy policy - to which a person agrees simply by using a Google product, even when they have not been informed that they are using one - Google can share the information it collects about you with virtually anyone, including government agencies. But it will never share that information with you. Google's own secrets are sacrosanct; your privacy is of no concern to it.

Could Google and "those we work with" (a phrase from the privacy policy) use the information they collect about you for nefarious purposes - to manipulate or coerce, for example? Could inaccurate information in people's profiles (which they have no way to correct) limit their opportunities or destroy their reputations?

Certainly, if Google wanted to sway an election, it could first dip into its database of personal information to identify exactly those voters who are still undecided. It could then send customized rankings favoring one candidate to those people, day after day. One advantage of this approach is that it would make Google's manipulation extremely difficult for investigators to detect.

Extreme forms of monitoring, whether by the KGB in the Soviet Union, the Stasi in East Germany, or Big Brother in 1984, are essential ingredients of tyranny, and technology has made both surveillance and the collation of surveillance data easier than ever before. By 2020, China plans to launch the most ambitious government monitoring system ever created - a single database called the Social Credit System, in which ratings and records on every one of its 1.3 billion citizens will be gathered in one place for quick access by officials and bureaucrats. At a glance, they will be able to see whether someone has cheated on a school exam, failed to pay bills, urinated in public, or posted inappropriate comments online. As Edward Snowden's revelations made clear, we are rapidly moving toward a world in which governments and corporations - sometimes working together - collect vast amounts of data about every one of us every day, with few or no laws limiting how that data can be used. When you combine such data collection with the desire to control or manipulate, the possibilities are endless - but perhaps the most chilling possibility is the one captured in Boulding's assertion that an "unprecedented dictatorship" is possible "using the forms of democratic government."

Since Robertson and I submitted our initial report on SEME to PNAS in early 2015, we have conducted a sophisticated series of experiments that has greatly expanded our understanding of the phenomenon, and additional experiments will be completed in the coming months. We now have a much better sense of why SEME is so powerful and of how, to some extent, it can be suppressed.

We have also learned something alarming: search engines influence far more than what people buy and whom they vote for. Our data suggest that search rankings affect virtually every issue on which people are initially undecided - which is to say, almost every decision a person makes. They shape the opinions, beliefs, attitudes and behavior of internet users worldwide, entirely without people's awareness. This happens with or without deliberate intervention by company employees; even so-called "organic" search processes regularly generate results that favor one point of view, and that alone can tip the opinions of millions of people who are undecided on an issue. In one of our recent experiments, biased search results shifted people's opinions about fracking by 33.9%. Perhaps even more disturbing, the few people who do recognize that they are seeing biased search rankings shift even further in the predicted direction; simply knowing that a list is biased does not necessarily protect you from SEME.

Remember what search algorithms do: in response to your query, they select a handful of websites from among the billions available and order them using secret criteria. Seconds later, the decision you make or the opinion you form - about the best toothpaste, the safety of fracking, where to go on your next vacation, who would make the best president, or whether global warming is real - is shaped by the short list you are shown, even though you have no idea how that list was generated.

Meanwhile, behind the scenes, search engine consolidation is quietly taking place, and as a result, more people are using the dominant search engine even when they think they are not.

Because Google is the best search engine, and because indexing the rapidly expanding internet has become prohibitively expensive, more and more search engines now draw their information from the industry leader rather than collecting it themselves.

The most recent such deal, disclosed in an SEC filing in October 2015, was between Google and Yahoo! Inc.

Looking ahead to the November 2016 US presidential election, I see clear signs that Google is backing Hillary Clinton. In April 2015, Clinton hired Stephanie Hannon away from Google to be her campaign's chief technology officer, and a few months ago Eric Schmidt, chairman of the holding company that controls Google, set up a semi-secret company - The Groundwork - for the specific purpose of putting Clinton in office. The formation of The Groundwork prompted WikiLeaks founder Julian Assange to call Google Clinton's "secret weapon" in her bid for the US presidency.

We estimate that Hannon's former colleagues have the power to drive between 2.6 million and 10.4 million votes to Clinton on election day, with no one knowing what has happened and no paper trail left behind. They could also, of course, help her win the nomination by influencing undecided voters during the primaries. Swaying undecided voters has always been the key to winning elections, and there has never been a tool as powerful, efficient, and inexpensive as SEME for doing so.
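A range like the one above is consistent with simple back-of-the-envelope arithmetic: electorate size, times the share of undecided voters, times the shift SEME can produce among them. The inputs below are my own assumptions chosen only to illustrate how such a range could arise; they are not the authors' published model.

```python
def potential_swing(electorate: int, undecided_share: float, shift: float) -> int:
    """Votes moved = voters x fraction undecided x shift achievable among them."""
    return round(electorate * undecided_share * shift)

ELECTORATE = 130_000_000   # rough US presidential-election turnout (assumed)
UNDECIDED = 0.10           # assumed share of undecided voters
LOW_SHIFT, HIGH_SHIFT = 0.20, 0.80  # shift range reported across demographics

print(potential_swing(ELECTORATE, UNDECIDED, LOW_SHIFT))   # -> 2600000
print(potential_swing(ELECTORATE, UNDECIDED, HIGH_SHIFT))  # -> 10400000
```

With these assumed inputs, the arithmetic happens to reproduce the 2.6-million-to-10.4-million range; different assumptions about turnout or the undecided share would of course change the bounds.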

We now live in a world in which a handful of high-tech companies, sometimes working hand in hand with governments, not only monitor much of our activity but also invisibly control much of what we think, feel and say. The technologies that surround us today are not harmless toys; they have made possible the undetectable and untraceable manipulation of entire populations. Manipulation of this kind has no precedent in human history, and it currently lies beyond the reach of existing regulations and laws. The new hidden persuaders are bigger, bolder and more brazen than anything Vance Packard ever envisioned. If we choose to ignore them, we do so at our peril.