Your Face Is In The Hands Of Corporations - Alternative View

Tech companies have been quietly using your photos to improve their facial recognition systems. Here's how they did it.

Facial recognition is a powerful technology that poses a serious threat to civil liberties. It is also a thriving business. Today, many startups and tech giants sell facial recognition systems to hotels, retailers, and even schools and summer camps. The business is booming thanks to new algorithms that can identify people far more accurately than was possible five years ago. To improve those algorithms, companies have trained them on billions of faces, sometimes without anyone's permission. Chances are good, then, that your face is part of a "training set" for one of these systems, or sits in some company's customer database.

The methods companies use to gather this data can surprise consumers. In at least three cases, for example, companies obtained millions of images through smartphone photo apps. Facial recognition is currently poorly regulated, so people have almost no way to restrict the commercial use of their faces.

In 2018, a camera captured the faces of passengers hurrying off a plane near Washington, D.C. In fact, both the plane and the passengers were part of a simulation created by the National Institute of Standards and Technology (NIST) to show how facial data can be collected "in the field." The faces used in the experiment became part of a NIST competition in which companies from around the world test their facial recognition systems.

The volunteers in the airplane simulation agreed to the use of their faces. That was typical of the early days of facial recognition, when researchers generally sought people's consent before including them in their datasets. Today, businesses rarely bother to ask permission.

According to Market Research Future, companies (including leaders such as Face++ and Kairos) are vying to be No. 1 in an industry that is growing 20% a year and is projected to reach $9 billion by 2022. Their business model rests on licensing software to a growing range of clients - from law enforcement agencies to universities - who build it into their own systems.

The winners in this competition are the products whose algorithms detect faces accurately and with the fewest errors. As with all artificial intelligence, building a facial recognition system requires amassing a large amount of training data. Companies can use government- and university-approved datasets (such as the Yale Face Database), but these training sets are small, containing no more than a few thousand faces.

These official datasets have other shortcomings as well. Many lack racial diversity and the varied conditions - shadows, hats, makeup - that change how a face looks in the real world. A workable facial recognition system needs more images. A lot more.

“A hundred is not enough; a thousand is not enough. We need millions of images. If you don't train the system on people with glasses and with different skin colors, you won't get good results,” says Peter Trepp, CEO of FaceFirst, a California-based company that helps retailers identify criminals in their stores.

One company that built its system this way is Ever AI, whose photo-storage app collected users' images. Notably, the app pushed users to send sponsored links to all of their contacts, a tactic known in Silicon Valley as "growth hacking." Users also complained about data theft.

"Immediately after installation, the application collects all the phones from the list of contacts and starts texting them … And then it downloads all your photos and copies them to the cloud storage," wrote Greg Miller, owner of a photo studio in Texas, in a 2015 review on Facebook.

Four years later, Miller was horrified to discover that his photographs were still being stored by EverRoll - whose maker is now a facial recognition company.

“No, I didn't know about it, and I absolutely do not agree to it,” Miller told Fortune. “This kind of surveillance is a real problem. Privacy is gone, and that scares me a lot.”

Doug Aley, the CEO of Ever AI, says the company does not pass information from its database to anyone and that the photos are used only to train its system. He added that the company is, in this respect, like a social network that users can abandon at any time. Aley denied that Ever AI intended from the start to become a facial recognition company, saying the decision to shut down the app was a management call. Ever AI's customers now use its technology for their own purposes, including employee identification systems, retail, telecommunications and law enforcement.

Ever AI isn't the only facial recognition company that once offered a photo app. Orbeus, a San Francisco-based startup bought by Amazon in 2016 (a deal that was never announced), offered the popular photo-storage app PhotoTime.

According to a former Orbeus employee, the startup's appeal to Amazon lay in its artificial intelligence technology and its vast collection of photographs of people in public places.

“Amazon was looking for exactly those capabilities. They bought everything and then shut down the app,” says the former employee, who asked to remain anonymous, citing a nondisclosure agreement.

PhotoTime no longer exists, though Amazon continues to sell another product derived from Orbeus's work under the Rekognition brand. Businesses and law enforcement agencies use it as a facial recognition system.

Amazon declined to say whether the Orbeus photo app was used to train Rekognition, stating only that the system draws on data from a variety of sources. The company added that it does not use customer data from its Prime Photos app to train identification systems.

RealNetworks is another company using an app to train its system. Based in Seattle and once famous for its 1990s media player, the company now focuses on recognizing children's faces in schools. At the same time, it offers a family app called RealTimes, which critics say collects data about users' faces.

“The app lets you build video presentations from photos. Imagine that a mom sends such a presentation to grandma, and the system uses those photos for training. It sounds creepy,” says Clare Garvie, a researcher at Georgetown University who has published influential work on facial recognition technology.

RealNetworks confirmed that the app is used to improve its facial recognition, but added that it draws on other sources of data as well.

In every case in which a company used data from its own photo app to train its systems, it did not ask users for permission directly; consent was buried in the user agreement.

Even that is more than some other companies bother with. According to Patrick Grother, who runs the NIST competition, it is common for companies collecting facial data to write programs that scrape images from sites such as SmugMug or Tumblr. In those cases, user permission is not even implied.

This help-yourself approach was highlighted in a recent NBC News report detailing how IBM pulled more than a million facial images from Flickr for an AI study. (John Smith, who oversees artificial intelligence research at IBM, said that "personal data is protected" and that the company works with anyone who wants their information removed from the database.)

All of this raises questions about how the companies collecting facial data protect it, and about the need for government oversight of the field. The issue will only grow more serious as facial recognition spreads further through society and through businesses large and small.

From shops to schools

Facial recognition systems are not new. The simplest versions have existed since the 1980s, when American mathematicians began representing faces as series of numerical values and using probabilistic models to find matches. Police in Tampa, Florida, used the technology at the 2001 Super Bowl, and casinos have used it for years. But a great deal has changed over the past few years.

"The face recognition system is going through something like a revolution," says Patrick Grother, adding that the change is most noticeable in the increasing quality of images. “The underlying technology has changed. Old developments have been replaced by new, much more efficient systems."

The facial recognition revolution was driven by two factors that have greatly expanded what artificial intelligence technology can do. The first is the rise of deep learning, a pattern-recognition approach loosely modeled on the human brain. The second is an unprecedented abundance of data that can be stored and analyzed cheaply using cloud computing.

Not surprisingly, the first companies to take full advantage of these developments were Google and Facebook. In 2014, Facebook released a program called DeepFace, which can determine with 97.24% accuracy whether two faces belong to the same person - roughly the level people achieve on the same test. A year later, Google's FaceNet program reached 100% accuracy (according to the security firm Gemalto).
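
Concretely, systems like DeepFace and FaceNet map each face image to a compact numeric vector (an "embedding") and decide whether two faces match by measuring the distance between those vectors. The sketch below is only a simplified illustration of that comparison step, not either company's actual code: the embed() function is a hypothetical stand-in for a trained deep network, and the 0.6 threshold is an arbitrary assumption.

```python
import numpy as np

def embed(face_image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a trained deep network (a FaceNet-style
    model) that maps an aligned face image to a unit-length vector."""
    vector = face_image.astype(np.float64).flatten()[:128]
    return vector / (np.linalg.norm(vector) + 1e-12)

def same_person(face_a: np.ndarray, face_b: np.ndarray, threshold: float = 0.6) -> bool:
    """Verification: declare a match when the two embeddings are close.
    The threshold is an illustrative assumption, not a published value."""
    distance = float(np.linalg.norm(embed(face_a) - embed(face_b)))
    return distance < threshold
```

In a real pipeline the heavy lifting happens inside the network that produces the embeddings; the benchmark accuracies quoted above measure how reliably this kind of comparison separates matching from non-matching pairs.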

Today, thanks largely to access to huge databases of facial data, these and other tech giants (such as Microsoft) lead the field in facial recognition. But a growing number of startups are posting strong results as they carve out niches in the expanding market for facial identification software.

There are more than a dozen such companies in the United States alone, including Kairos and FaceFirst. According to market researchers at PitchBook, the sector is rapidly gaining traction in Silicon Valley, with significant investment over the past few years. PitchBook puts total investment over the past three years at $78.7 million. These are not spectacular numbers by Valley standards, but they reflect venture capitalists' confidence that a few facial recognition startups will soon grow into large companies.

Venture capital activity in the facial recognition industry in the United States.

New business models built around facial recognition are still taking shape. This is most visible in licensed enterprise software. According to Crunchbase, annual revenues at companies such as Ever AI and FaceFirst are modest, ranging from $2 million to $8 million. Amazon and the other tech giants do not disclose comparable figures.

For a long time, the keenest users of facial recognition were law enforcement agencies. But now many companies, including Walmart, use such software to learn more about the shoppers in their stores.

California-based FaceFirst, for example, supplies its systems to hundreds of retailers, including second-hand stores and pharmacies. According to the company's CEO, many clients use the technology to detect theft, but a growing number are putting it to other uses, including spotting VIP customers and identifying employees.

Amazon, too, appears to be looking across its wide range of activities for places to apply facial recognition. In addition to working with police departments, the retail giant is helping hotels speed up check-in, according to various sources.

“Companies from all over the world come to Amazon and say, 'This is exactly what we want to do.' And you realize what a remarkable area this is - everyone is interested in it,” says an anonymous source who joined Amazon through its purchase of Orbeus, the facial recognition startup.

For Amazon, this business has not been without controversy. Last July, the American Civil Liberties Union (ACLU) tested the company's system by comparing the faces of every member of Congress against a database of criminals. The test produced 28 false matches, and the errors disproportionately involved members of Congress with darker skin. The ACLU then called for a ban on the use of facial recognition by law enforcement. Amazon, however, has continued to sell the system to police departments and to US Immigration and Customs Enforcement.

Several members of Congress, including Rep. Jerrold Nadler and Senator Ron Wyden, then asked the Government Accountability Office to investigate the use of facial recognition software. Leading companies are concerned about these systems as well: in December, Microsoft President Brad Smith called for government regulation of such technologies.

But even as concerns grow, the use of facial recognition keeps expanding as companies find more and more uses for it. RealNetworks, the developer of the family photo app, offers its software to schools across the country for free and says hundreds of schools are clients. In an interview with Wired magazine, RealNetworks CEO Rob Glaser said he launched the project as a nonpartisan contribution to the debate over school safety and gun control. The company's website currently positions the product as a technology that lets event organizers “recognize every fan, customer, employee or guest,” even if their face is hidden.

The technology recognizes faces even through face paint or heavy makeup, and can distinguish and identify faces in a variety of lighting conditions.

RealNetworks isn't the only company targeting the children's market. A Texas-based startup called Waldo offers similar technology to hundreds of schools, as well as to children's sports leagues and summer camps. In practice, that means scanning images taken by video cameras or official photographers and matching the children's faces against a database of images provided by their parents. Parents can opt out at any time.
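
In practical terms, this is a one-to-many search: each opted-in face from a parent becomes a reference embedding, and every face detected in event photos is compared against that gallery. The sketch below is a generic illustration of that search step under the same assumptions as the earlier example (embeddings produced by some trained model), not a description of Waldo's actual system; the threshold is again an arbitrary value.

```python
from typing import Dict, Optional

import numpy as np

def find_match(query_embedding: np.ndarray,
               gallery: Dict[str, np.ndarray],
               threshold: float = 0.6) -> Optional[str]:
    """Return the name of the closest enrolled face, or None if nothing
    in the gallery falls within the (illustrative) distance threshold."""
    best_name, best_distance = None, float("inf")
    for name, reference in gallery.items():
        distance = float(np.linalg.norm(query_embedding - reference))
        if distance < best_distance:
            best_name, best_distance = name, distance
    return best_name if best_distance < threshold else None
```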

According to Waldo CEO Rodney Rice, schools take tens of thousands of photographs every year, and only a few of them end up in yearbooks. Facial recognition, he says, is an efficient way to get the rest to the people who want them.

"For the price of popcorn or brown paper, you can order these photos for your kids' grandparents," Rice says, explaining that Waldo has a revenue-sharing agreement with public schools. Currently, the company's services are used in more than 30 states of the United States.

The rise of Waldo and FaceFirst shows how business is normalizing facial recognition technology that until recently seemed like science fiction. And as such technologies proliferate, more companies will be collecting photos of your face - whether to train algorithms or to find customers and criminals - even as the risk of error and abuse grows.

The future of your face

In 2017, the techno-dystopian TV series Black Mirror aired an episode in which an anxious mother worries about the reckless young man spending time with her daughter. To find out who he is, she uploads his photo to a consumer identification service. The program quickly returns his name and place of work, and the woman goes to confront him.

That once-fictional scenario now seems entirely plausible. While most concerns about facial recognition have focused on its use by government agencies, its use by commercial companies and even private individuals (in the style of Black Mirror) poses obvious risks to personal privacy.

As more companies sell facial recognition systems and our faces land in more databases, the software could become a tool for voyeurs and stalkers. Retailers and landlords could use it to flag unwanted customers and tenants and quietly deny them service or housing.

“Anyone with a video camera in a densely populated area can start building image databases and then use this analytical software to see whether the faces they capture match your data,” says Jay Stanley, an analyst at the ACLU.

There is also the risk of hacking. Andrei Barysevich of Gemini Advisory, a cybersecurity firm, says he has seen profiles stolen from India's national biometric database offered for sale on dark-web sites. He has not seen such data on Americans, but adds: "It's just a matter of time." A leak of customer data from a hotel or store could help criminals commit fraud or identity theft.

Because the technology is spreading with little government oversight, the responsibility for limiting its misuse rests largely with the software vendors. In interviews with Fortune, the CEOs of facial recognition startups said they were prepared for such threats. Some, including FaceFirst's CEO, described the way such systems are deployed in China as dangerous.

The executives also suggested two approaches to curbing abuse. The first is working closely with buyers of the software to ensure it is used appropriately. Ever AI's Doug Aley, for example, says his company applies a higher standard than Amazon, which he claims provides its Rekognition tool to just about anyone.

In response to a question about abuse controls, Amazon pointed to a previously released statement from Matt Wood, who leads artificial intelligence at Amazon Web Services. Wood notes that company policy prohibits harmful and illegal uses.

The other possible safeguard is technical: making sure that facial databases, even if breached, yield nothing an attacker can use.

Rodney Rice, the Waldo CEO, says faces are stored as alphanumeric hashes, so that even in the event of a leak, privacy would not be compromised: a hacker could not reverse the hashes and make use of them. Other executives made similar claims.
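
The article does not describe Waldo's storage scheme beyond "alphanumeric hashes," but the general idea can be sketched as deriving a one-way, salted hash from a face template so that a stolen record cannot be turned back into the original data. The function below is a hypothetical illustration of that idea, not Waldo's implementation; the salt and the quantization step are assumptions made for the example.

```python
import hashlib

import numpy as np

def face_token(embedding: np.ndarray, salt: bytes) -> str:
    """Derive a one-way, salted token from a face embedding.
    Quantizing before hashing keeps the token stable against tiny
    floating-point differences in an otherwise identical template."""
    quantized = np.round(embedding, 3).astype(np.float64).tobytes()
    return hashlib.sha256(salt + quantized).hexdigest()
```

One caveat: an exact hash only matches identical templates, so a scheme like this protects stored data but cannot by itself perform the fuzzy comparison that recognition requires; vendors' real designs, which are not described here, have to reconcile those two goals.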

Rice worries that writing rules for facial recognition into law could do more harm than good. “Leaving a child to figure it out and create the rules is ridiculous,” he says.

Meanwhile, some facial recognition companies are adopting techniques that reduce the need for massive training datasets. One is Kairos, a Miami-based facial recognition startup that, among other things, works with a wide range of hotels. According to Stephen Moore, the company's head of security, Kairos creates “synthetic” faces that simulate a wide range of emotions and lighting conditions. These artificial faces reduce the amount of real-world facial data needed to build its products.
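
Kairos's generation pipeline is not described in the article; real synthetic-face systems typically rely on generative models or 3D rendering. As a much simpler illustration of how synthetic variation can stretch a small set of real images, the sketch below applies random brightness and contrast shifts to mimic different lighting; the value ranges are assumptions chosen for the example.

```python
import numpy as np

def simulate_lighting(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Create one synthetic variant of a face image (uint8, values 0-255)
    by randomly shifting its contrast and brightness."""
    contrast = rng.uniform(0.7, 1.3)       # illustrative range
    brightness = rng.uniform(-30.0, 30.0)  # illustrative range
    variant = image.astype(np.float64) * contrast + brightness
    return np.clip(variant, 0, 255).astype(np.uint8)

def augment(image: np.ndarray, n_variants: int = 10, seed: int = 0) -> list:
    """Turn one real face image into several synthetic training samples."""
    rng = np.random.default_rng(seed)
    return [simulate_lighting(image, rng) for _ in range(n_variants)]
```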

All of these measures - vetting the users of the systems, strong data protection, and synthetic training data - could ease some of the privacy concerns around the commercial use of our faces. FaceFirst's Trepp believes, moreover, that the anxiety will fade as people get a closer look at the technology. He even predicts that the facial recognition scenes in the 2002 sci-fi film Minority Report will come to feel normal.

“Millennials are much more willing to share information. That world [of Minority Report] is getting closer to ours,” he says. “If you do it right, I think people will like it, and it will be a positive experience. It won't be that scary.”

Others, including the ACLU, are less optimistic. Yet despite the growing debate around the technology, there is at this point almost nothing that limits the use of your face. The only exceptions are three states - Illinois, Texas and Washington - which require some degree of consent before someone's face can be used. In practice these laws have had little effect, except in Illinois, where consumers can sue to enforce the right.

The Illinois law is currently the subject of a high-profile appeal involving Facebook, which argues that scans of digital photographs are not covered by the law's restrictions on collecting facial data. In 2017, Facebook and Google ran an unsuccessful lobbying campaign to persuade Illinois lawmakers to soften the law. And in late January, the law's supporters were bolstered when the Illinois Supreme Court ruled that consumers can sue over unauthorized use of their biometrics even if they have suffered no concrete harm.

Other states may yet adopt biometric laws of their own. At the federal level, legislators have so far paid little attention. That could change, however: Senators Brian Schatz and Roy Blunt introduced legislation this month that would require companies to obtain permission before using facial recognition in public places or sharing facial data with any third party.

Clare Garvie, the Georgetown researcher, supports laws to regulate these systems, but says lawmakers have struggled to keep pace with the technology.

“One of the challenges with facial recognition is how incredibly fast it has been adopted, thanks to databases that already exist. Our faces are already out there,” she says. “Unlike fingerprints, which have long been subject to rules on data collection, facial recognition technology is still unregulated.”

By Jeff John Roberts

Translated by: Ekaterina Egina

Edited by: Sergey Razumov