Celestial Purge: How Chinese Cyberspace Cleaners Work - Alternative View

What Facebook failed to cope with, China has taken firmly in hand: live broadcasts are moderated by hundreds of employees, aided by artificial intelligence and tracking systems.

Attacks on Facebook, Alphabet Inc. and YouTube by officials and public figures came in the wake of the March 15 terrorist attack in New Zealand, which killed 50 people. The companies were unable to quickly remove the alleged gunman's live video.

In the first hours after the attack, the full 17-minute video and fragments of it spread widely on social networks despite police appeals not to share it, and media outlets, including Australian ones, used the footage. Facebook was criticized for failing to block the broadcast and for doing little to discourage distribution of the video, contrary to its own rules.

Mark Zuckerberg does not admit any failure on his part: he claims the video was watched only 4,000 times before being blocked on the platform and therefore sees no need to introduce a delay into live broadcasts. Yet in an interview with ABC, Zuckerberg answered in the affirmative when asked whether such a delay would have reduced the number of views of the Christchurch attack broadcast. "Yes, it could have been so in this case," he said.

In the Celestial Empire, companies hire armies of censors to police live streams. As of the end of last year, nearly 400 million people in the country were broadcasting. Most of this content is harmless: showing family and friends the sights of Paris, or what is being prepared for lunch or dinner.

There are also professional streamers who make a living from it, much as on YouTube. And that is only live streams, not counting short videos, instant messengers, online forums and the other places and formats where users generate terabytes of video content. Monitoring all of it manually is naturally impossible.

Inke is a startup that runs a live-streaming platform and moderates its video content. In an interview with the South China Morning Post, the company's founders said that moderators are assisted by artificial-intelligence algorithms and facial-recognition software.

Artificial intelligence does the basic work of labeling, evaluating and sorting content into different risk categories. This classification lets the company allocate resources in ascending order of risk: a single reviewer can track many low-risk streams, such as food shows, while high-risk content is flagged for closer examination.
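The triage described above can be sketched as a simple tiering scheme: a model's risk score maps to a tier, and reviewer capacity is allocated in ascending order of risk. All thresholds, names and capacities below are invented for illustration; Inke's actual system is not public.

```python
# Hypothetical risk-tier triage: map an AI risk score in [0, 1] to a
# moderation tier, then estimate how many human reviewers a batch of
# streams needs. Numbers are illustrative assumptions, not Inke's.
from collections import Counter

RISK_TIERS = [
    (0.2, "low"),      # e.g. food shows — one reviewer covers many streams
    (0.6, "medium"),
    (1.0, "high"),     # flagged for closer human examination
]

STREAMS_PER_REVIEWER = {"low": 50, "medium": 10, "high": 2}

def tier_for(score: float) -> str:
    """Map a model risk score in [0, 1] to a moderation tier."""
    for threshold, tier in RISK_TIERS:
        if score <= threshold:
            return tier
    return "high"

def reviewers_needed(scores) -> int:
    """Estimate reviewers for a batch: ceil(streams / capacity) per tier."""
    counts = Counter(tier_for(s) for s in scores)
    return sum(-(-n // STREAMS_PER_REVIEWER[t]) for t, n in counts.items())
```

With this allocation, fifty low-risk cooking streams need only one reviewer, while a handful of high-risk streams immediately consume several.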

Human moderators are needed to assess what machines handle poorly: context. As an example, the journalists cite bikinis. To a computer, a woman's swimsuit means nothing, but to a person a bikini can mean completely different things in different settings. A bikini by a pool with children running around? Fine to publish. A revealing swimsuit in a bedroom with romantic background music? That video will most likely be blocked.

Nonetheless, the most heavily censored content on the Inke platform is smoking, because the authorities believe it promotes an unhealthy lifestyle. Showing excessive numbers of tattoos is also prohibited.

China actively monitors and censors content that criticizes the ruling Communist Party or mentions sensitive topics such as the Dalai Lama, Tiananmen Square, and the persecution of Falun Gong. Beijing justifies the "Great Firewall" of China by appealing to "cyber sovereignty": the idea that each country has the right to control its internal Internet space.

Inke's team of moderators is its largest: it accounts for about 60% of all employees. Moderators work from detailed rules explaining what is allowed and what must be removed. Based on guidance published by the China Performing Arts Association, the manual is updated weekly to cover new cases.

The highest-risk content includes politically sensitive speech, sexual acts, violence, terrorism and suicide. Depending on the severity of the violation, moderators can issue a warning, block the account or blacklist it.

Moderators are helped by the fact that "live" broadcasts are not actually live: they carry a built-in delay of 10-15 seconds. It is in this narrow window that a moderator must decide whether questionable content goes on air.
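The delay mechanism just described amounts to holding frames in a buffer before release, so a moderator can drop flagged segments before viewers see them. The sketch below is entirely illustrative; the class and its interface are invented, not Inke's implementation.

```python
# Minimal sketch of a delayed "live" broadcast: frames sit in a buffer
# for a fixed delay, giving a moderator a window to suppress flagged
# segments before they reach viewers. Hypothetical design, not Inke's.
import collections

class DelayedBroadcast:
    def __init__(self, delay_seconds: float = 12.0):
        self.delay = delay_seconds
        self.buffer = collections.deque()  # (timestamp, frame, flagged)

    def ingest(self, timestamp: float, frame: str) -> None:
        """Receive a frame from the streamer; it is not shown yet."""
        self.buffer.append((timestamp, frame, False))

    def flag(self, timestamp: float) -> None:
        """Moderator marks a buffered frame for suppression."""
        self.buffer = collections.deque(
            (t, f, flagged or t == timestamp) for t, f, flagged in self.buffer
        )

    def release(self, now: float) -> list:
        """Emit frames older than the delay, dropping flagged ones."""
        out = []
        while self.buffer and now - self.buffer[0][0] >= self.delay:
            _, frame, flagged = self.buffer.popleft()
            if not flagged:
                out.append(frame)
        return out
```

A frame flagged within the delay window simply never leaves the buffer, which is exactly the decision the moderator has 10-15 seconds to make.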

Inke has another effective tool for suppressing unwanted content. When residents began gathering to protest a local government's plan to build an incinerator, Inke used device-location data to block all broadcasts within a 10 km radius.

Despite high demand for content moderators, the job itself is monotonous and does not pay well: mostly, people spend hours watching bad singing, mediocre jokes and boring monologues. Of Inke's 1,200 moderators, about 200 are full-time employees; the rest are contractors. The starting salary is 3,000 yuan a month, about $3 an hour, compared with New York City's $15-an-hour minimum wage. Many new hires quit before completing the mandatory one-month training course; others leave within six months.