Apple's new iPhone X creates detailed 3D models of users' faces to unlock the phone. But that's not the only way the images are used. Apple can share the images with app developers, and privacy advocates are worried.
The 3D facial model is a kind of biometric scan: data created by software for the purpose of positively identifying a human face.
How does it work?
Apple's front-facing sensors project 30,000 infrared dots onto a user's face to create a 3D model of it. The technology is so advanced it has been compared to the animation software used for Hollywood films.
Phone application developers who have permission to tap into the phone's camera can retrieve the images.
The Washington Post explained that "the iPhone lets other apps now tap into two eerie views from the so-called TrueDepth camera. There’s a wireframe representation of your face and a live read-out of 52 unique micro-movements in your eyelids, mouth and other features. Apps can store that data on their own computers."
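The data the Post describes is exposed through Apple's public ARKit face-tracking framework. Here is a minimal sketch of how an app with camera permission might read those values; the class names and properties (ARFaceTrackingConfiguration, ARFaceAnchor, blendShapes) are real ARKit API, while the logging logic is purely illustrative:

```swift
import ARKit

// A minimal face-tracking sketch (requires an iPhone with a TrueDepth
// camera). ARSession delivers ARFaceAnchor updates carrying both the
// wireframe mesh and per-feature movement coefficients.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // The wireframe: a mesh of vertices describing the face surface.
            let vertexCount = faceAnchor.geometry.vertices.count

            // The "micro-movement" coefficients (values from 0.0 to 1.0),
            // e.g. how far the jaw is open or an eyelid is closed.
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            print("mesh vertices: \(vertexCount), jawOpen: \(jawOpen)")
        }
    }
}
```

Once an app has camera permission, nothing in the API itself stops it from sending these coefficients to its own servers, which is the data flow the Post's quote refers to.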
Although app developers must explain how they plan to use the images, privacy advocates say the problem lies in enforcing those rules.
Why would anyone want pictures of our faces?
Advertisers and data brokers use our personal information to build profiles on us. In turn, the information is sold or traded to businesses or government agencies.
Apps that collect images of your face can gather a wealth of information about you. For example, when you unlock the phone, your facial expression can give away your mood. An app could also potentially infer your gender, race, and sexual orientation.
Researchers at Stanford University created software that could guess someone's sexual orientation by studying facial features.
Is it secure?
An analysis by the Post pointed out that Apple makes its money by selling products, not by distributing pictures of our faces.
“We take privacy and security very seriously,” Apple spokesman Tom Neumayr said. “This commitment is reflected in the strong protections we have built around Face ID data — protecting it with the Secure Enclave in iPhone X — as well as many other technical safeguards we have built into iOS.”
What does the ACLU say?
The American Civil Liberties Union has repeatedly warned about privacy issues concerning facial recognition software. According to the ACLU, biometric facial scans could tie into a larger surveillance grid to monitor the public.
"The biggest danger is that this technology will be used for general, suspicion-less surveillance systems," the ACLU website states. A "comprehensive system of identification and tracking" could be created by linking facial scans with other systems, such as public cameras in stores.
Matching your face to a store camera would allow app developers to observe and track you. Public cameras require no consent from the people being watched, although many stores do post signs stating that video recording is in progress.
“I think we should be quite worried,” Jay Stanley, a senior policy analyst at the ACLU, said. “The chances we are going to see mischief around facial data is pretty high — if not today, then soon — if not on Apple then on Android.”