Amazon recently found itself at the center of a debate over whether its facial recognition software, Rekognition, is a violation of privacy.
Since George Orwell’s novel 1984 was published, people have feared the arrival of an omnipresent and omnipotent government capable of tracking the physical location, behavioral patterns, and thought processes of any and all individuals. Yet despite that novel being a work of harrowing dystopian fiction, we have still ended up developing software (in the form of machine learning/deep learning algorithms) that is somewhat, if not very, capable of meeting these demands. The inherent problem with such technologies, however, arises not from businesses or individuals using these services, but from nefarious individuals, police forces, or other law enforcement agencies having control of such powerful technology.
In 2016, Amazon released a facial recognition service, Rekognition, that could detect a person of interest in an image of a crowd of 100 people with pinpoint accuracy. Two years later, the service can recognize a variety of objects, animals, and people in real-time video, with improved accuracy, even when the person of interest’s face is covered or distorted in the footage. Additionally, Rekognition can query a user-provided database of millions of faces and detect any of those people in video.
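The face-database search described above is exposed through the AWS SDK for Python (boto3). The sketch below is a minimal illustration under stated assumptions, not Amazon’s internal pipeline: the collection ID, image bytes, and both helper functions are hypothetical examples, and actually running the API call requires configured AWS credentials and a collection built beforehand with `index_faces`.

```python
def search_face(collection_id, image_bytes, threshold=80.0):
    """Query a user-provided face collection for matches to one probe image.

    Hypothetical helper: collection_id names a collection previously
    populated with index_faces; image_bytes is a raw JPEG/PNG image.
    """
    import boto3  # AWS SDK for Python; needs configured credentials
    client = boto3.client("rekognition")
    return client.search_faces_by_image(
        CollectionId=collection_id,
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=threshold,  # minimum similarity (percent) to report
        MaxFaces=5,
    )

def matched_ids(response, min_similarity=90.0):
    """Pull the caller-assigned IDs of strong matches out of a search response."""
    return [
        m["Face"]["ExternalImageId"]
        for m in response.get("FaceMatches", [])
        if m["Similarity"] >= min_similarity
    ]
```

The response-parsing helper shows the shape of the result: each match carries a similarity score and whatever external ID the user attached when indexing the face, which is exactly the mechanism that lets a watchlist of millions be matched against live footage.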
The main capabilities of Rekognition, as stated on Amazon’s site, are as follows:
- Object, scene, and activity detection.
  - “With Amazon Rekognition, you can identify thousands of objects (e.g. bike, telephone, building) and scenes (e.g. parking lot, beach, city). When analyzing video, you can also identify specific activities happening in the frame, such as ‘delivering a package’ or ‘playing soccer’”.
- Facial recognition.
  - “Rekognition’s fast and accurate search capability allows you to identify a person in a photo or video using your private repository of face images.”
- Facial analysis.
  - “You can analyze the attributes of faces in images and videos to determine things like happiness, age range, eyes open, glasses, facial hair, etc. In video, you can also measure how these things change over time, such as constructing a timeline of the emotions of an actor.”
- Person tracking.
  - “When using Rekognition to analyze video, you can track people through a video even when their faces are not visible, or as they go in and out of the scene. You can also identify their movements in the frame to tell things like whether someone was entering or exiting a building.”
- Unsafe content detection.
  - “Amazon Rekognition helps you identify potentially unsafe or inappropriate content across both image and video assets and provides you with detailed labels that allow you to accurately control what you want to allow based on your needs.”
- Celebrity recognition.
  - “You can quickly identify well known people in your video and image libraries to catalog footage and photos for marketing, advertising, and media industry use cases.”
- Text in images.
  - “Specifically built to work with real world images, Rekognition can detect and recognize text from images, such as street names, captions, product names, and license plates.”
As we can clearly see, the range of image and video analysis that Rekognition can perform is vast. This variety is what fuels the software’s inherent abuse potential: the person or organization using Rekognition gets to decide which words, behaviors, demographics, etc., are to be detected.
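To make the first capability concrete, a label-detection call and a simple confidence filter might look like the following. This is a sketch assuming the boto3 SDK; the bucket and file names are invented for illustration, and the API call itself requires AWS credentials.

```python
def detect_scene_labels(bucket, key, max_labels=10, min_confidence=70.0):
    """Ask Rekognition to label an image stored in S3 (hypothetical bucket/key)."""
    import boto3  # AWS SDK for Python; needs configured credentials
    client = boto3.client("rekognition")
    return client.detect_labels(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        MaxLabels=max_labels,
        MinConfidence=min_confidence,  # floor (percent) for returned labels
    )

def high_confidence_labels(response, threshold=90.0):
    """Keep only the label names Rekognition reported above a confidence threshold."""
    return [
        label["Name"]
        for label in response.get("Labels", [])
        if label["Confidence"] >= threshold
    ]
```

Every label comes back with a confidence score, so the caller, not Amazon, decides the threshold at which “bike” or “person entering a building” counts as a detection; that design choice is precisely why the abuse potential rests with whoever operates the system.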
Many civil liberties groups in the US fear that these technologies (Amazon Rekognition, in this case) will be used to discriminate based on race, rather than simply to search a database for people with outstanding warrants, further heightening racial tensions within the nation. There is already strong evidence that police are more likely to pull over Latino or African American drivers during stops involving searches. Supporting this claim, a statistical study by a team of Stanford researchers in 2016 found that “In nearly every one of the 100 departments we consider, we find that black and Hispanic drivers are subjected to a lower search threshold than whites, suggestive of widespread discrimination against these groups.”
Even if we set aside the racial, gender, or other discrimination that would likely take place under a system using such technologies, the attack on personal privacy and the excessive power bestowed upon those wielding them are concerning. We have witnessed the violations of privacy committed by the NSA, the CIA, the FBI, and other governmental bodies, and for years local police have been heavily armed with surplus military equipment donations, a program President Trump reinstated last year. So it isn’t conspiratorial nonsense to fear an abuse of power by those who have already abused their power, especially when the technology in question may allow millions of people to be tracked in real time. Fortunately, and unfortunately, the seemingly faceless entities we hold in contempt are still people. More than anything, in this new technological era, we must hold empathy, compassion, and humanity above all else, lest we lose what made us human in the first place.