On Friday Apple fans were queuing to get their hands on the newly released iPhone X: The flagship smartphone that Apple deemed a big enough update to skip a numeral. RIP iPhone 9.
The shiny new hardware includes a front-facing sensor module housed in the now-infamous ‘notch’, which takes an unsightly but necessary bite out of the top of an otherwise (near) edge-to-edge display and thereby enables the smartphone to sense and map depth — including facial features.
So the iPhone X knows it’s your face looking at it and can act accordingly, e.g. by displaying the full content of notifications on the lock screen vs just a generic notice if someone else is looking. So hello contextual computing. And also hey there additional barriers to sharing a device.
Face ID has already generated a lot of excitement but the switch to a facial biometric does raise privacy concerns — given that the human face is naturally an expression-rich medium which, inevitably, communicates a lot of information about its owner without them necessarily realizing it.
There’s no arguing that a face tells rather more stories over time than a mere digit can. So it pays to take a closer look at what Apple is (and isn’t) doing here as the iPhone X starts arriving in its first buyers’ hands…
Face ID
The core use for the iPhone X’s front-facing sensor module — aka the TrueDepth camera system, as Apple calls it — is to power a new authentication mechanism based on a facial biometric. Apple’s brand name for this is Face ID.
To use Face ID, iPhone X owners register their facial biometric by tilting their face in front of the TrueDepth camera; the full enrollment process requires two scans.
The face biometric system replaces the Touch ID fingerprint biometric which is still in use on other iPhones (including on the new iPhone 8/8 Plus).
Only one face can be enrolled for Face ID per iPhone X — vs multiple fingerprints being allowed for Touch ID. Hence device sharing is harder, though you can still share your passcode.
As we’ve covered in detail before, Apple does not have access to the depth-mapped facial blueprints that users enroll when they register for Face ID. A mathematical model of the iPhone X user’s face is encrypted and stored locally on the device in a Secure Enclave.
Face ID also learns over time, and some additional mathematical representations of the user’s face may also be created and stored in the Secure Enclave during day-to-day use — i.e. after a successful unlock — if the system deems them useful to “augment future matching”, as Apple’s white paper on Face ID puts it. This is so Face ID can adapt if you put on glasses, grow a beard, change your hair style, and so on.
The key point here is that Face ID data never leaves the user’s phone (or indeed the Secure Enclave). And any iOS app developers wanting to incorporate Face ID authentication into their apps do not gain access to it either. Rather authentication happens via a dedicated authentication API that only returns a positive or negative response after comparing the input signal with the Face ID data stored in the Secure Enclave.
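In practice, that dedicated API is Apple’s LocalAuthentication framework (which the App Store guidelines, quoted further below, mandate for facial recognition-based account authentication). Here is a minimal sketch of the developer’s side of the exchange; note that the app receives only a success-or-failure verdict, never the biometric data itself:

```swift
import LocalAuthentication

let context = LAContext()
var error: NSError?

// Check that biometric auth is available and enrolled
// (Face ID on iPhone X, Touch ID on other recent iPhones).
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Log in to your account") { success, authError in
        if success {
            // Positive response: the user's face matched. Proceed.
        } else {
            // Negative response: fall back to a passcode or app-level login.
            // The app never sees depth maps or the enrolled face model.
        }
    }
}
```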
Senator Al Franken wrote to Apple asking for reassurance on exactly these sorts of questions. Apple’s response letter also confirmed that it does not generally retain face images during day-to-day unlocking of the device — beyond the sporadic Face ID augmentations noted above.
“Face images captured during normal unlock operations aren’t saved, but are instead immediately discarded once the mathematical representation is calculated for comparison to the enrolled Face ID data,” Apple told Franken.
Apple’s white paper further fleshes out how Face ID functions — noting, for example, that the TrueDepth camera’s dot projector module “projects and reads over 30,000 infrared dots to form a depth map of an attentive face” when someone tries to unlock the iPhone X (the system tracks gaze as well, which means the user has to be actively looking at the face of the phone to activate Face ID), as well as grabbing a 2D infrared image (via the module’s infrared camera). This also allows Face ID to function in the dark.
“This data is used to create a sequence of 2D images and depth maps, which are digitally signed and sent to the Secure Enclave,” the white paper continues. “To counter both digital and physical spoofs, the TrueDepth camera randomizes the sequence of 2D images and depth map captures, and projects a device-specific random pattern. A portion of the A11 Bionic processor’s neural engine — protected within the Secure Enclave — transforms this data into a mathematical representation and compares that representation to the enrolled facial data. This enrolled facial data is itself a mathematical representation of your face captured across a variety of poses.”
So as long as you have confidence in the calibre of Apple’s security and engineering, Face ID’s architecture should give you confidence that the core encrypted facial blueprint used to unlock your device and authenticate your identity in all sorts of apps is never being shared anywhere.
But Face ID is really just the tip of the tech being enabled by the iPhone X’s TrueDepth camera module.
Face-tracking via ARKit
Apple is also intending the depth-sensing module to enable flashy and infectious consumer experiences for iPhone X users by letting developers track their facial expressions — especially for face-tracking augmented reality. AR in general is a huge new area of focus for Apple, which revealed its ARKit framework for building augmented reality apps at its WWDC event this summer.
And while ARKit is not limited to the iPhone X, ARKit face-tracking via the front-facing camera is. So that’s a big new capability arriving with Apple’s new flagship smartphone.
“ARKit and iPhone X enable a revolutionary capability for robust face tracking in AR apps. See how your app can detect the position, topology, and expression of the user’s face, all with high accuracy and in real time,” writes Apple on its developer website, going on to flag up some potential uses for the API — such as for applying “live selfie effects” or having users’ facial expressions “drive a 3D character”.
The consumer showcase of what’s possible here is of course Apple’s new animoji. Aka the animated emoji characters which were demoed on stage when Apple announced the iPhone X, and which enable users to virtually wear an emoji character as if it were a mask, and then record themselves saying (and facially expressing) something.
So an iPhone X user can automagically ‘put on’ the alien emoji. Or the pig. The fox. Or indeed the 3D poop.
But again, that’s just the beginning. With the iPhone X, developers can access ARKit face-tracking to power their own face-augmenting experiences — such as the already showcased face masks in the Snap app.
“This new ability enables robust face detection and positional tracking in six degrees of freedom. Facial expressions are also tracked in real-time, and your apps provided with a fitted triangle mesh and weighted parameters representing over 50 specific muscle movements of the detected face,” writes Apple.
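To make that concrete, here is a minimal sketch of an ARKit face-tracking session (the FaceTracker wrapper class is our own illustrative scaffolding; the ARKit types are Apple’s). Each update delivers an ARFaceAnchor carrying the fitted mesh plus named expression coefficients, i.e. the “weighted parameters” Apple describes:

```swift
import ARKit

final class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth camera, i.e. iPhone X.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // The fitted triangle mesh of the detected face.
            let vertexCount = face.geometry.vertices.count
            // Weighted muscle-movement parameters, each 0.0 to 1.0.
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            print("vertices:", vertexCount, "jawOpen:", jawOpen, "smile:", smile)
        }
    }
}
```

Those per-frame expression weights are what drive both animoji-style masks and, potentially, any expression analysis.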
Now it’s worth emphasizing that developers using this API are not getting access to every datapoint the TrueDepth camera system can capture. Nor does the API literally recreate the Face ID model that’s locked up in the Secure Enclave — which Apple touts as being accurate enough to have a false match rate as small as one in one million.
But developers are clearly being given access to some pretty detailed face maps. Enough for them to build powerful user experiences — such as Snap’s fancy face masks that really do seem to be stuck to people’s skin like facepaint…
And enough, potentially, for them to read some of what a person’s facial expressions are saying — about how they feel, what they like or don’t like.
(Another API on the iPhone X provides for AV capture via the TrueDepth camera — which Apple says “returns a capture device representing the full capabilities of the TrueDepth camera”, suggesting the API returns photo + video + depth data (though not, presumably, at the full resolution that Apple is using for Face ID) — likely aimed at supporting additional visual special effects, such as background blur for a photo app.)
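Going by the AVFoundation depth-capture APIs Apple added in iOS 11, that capture path plausibly looks like the following sketch (session configuration, error handling and the delegate that actually receives depth frames are elided):

```swift
import AVFoundation

let session = AVCaptureSession()

// Select the front-facing TrueDepth camera as the input device.
if let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                        for: .video,
                                        position: .front),
   let input = try? AVCaptureDeviceInput(device: device),
   session.canAddInput(input) {
    session.addInput(input)

    // Depth frames arrive separately from the color stream, at a
    // lower resolution, via AVCaptureDepthDataOutputDelegate callbacks.
    let depthOutput = AVCaptureDepthDataOutput()
    if session.canAddOutput(depthOutput) {
        session.addOutput(depthOutput)
    }
    session.startRunning()
}
```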
Now here we get to the fine line around what Apple is doing. Yes, it’s protecting the mathematical models of your face that the iPhone X’s depth-sensing hardware generates — the models which, via Face ID, become the key to unlocking your smartphone and authenticating your identity.
But it is also normalizing and encouraging the use of face mapping and facial tracking for all sorts of other purposes.
Entertaining ones, sure, like animoji and selfie lenses. And even neat stuff like helping people virtually try on accessories (see: Warby Parker for a first mover there). Or accessibility-geared interfaces powered by facial gestures. (One iOS developer we spoke to, James Thomson — maker of calculator app PCalc — said he’s curious “whether you could use the face tracking as an accessibility tool, for people who might not have good (or no) motor control, as an alternative control method”, for example.)
Yet it doesn’t take much imagination to think what else certain companies and developers might really want to use real-time tracking of facial expressions for: hyper-sensitive, expression-targeted advertising, and thus even more granular user profiling for ads/marketing purposes. Which would of course be another tech-enabled blow to privacy.
It’s clear that Apple is well aware of the potential risks here. Clauses in its App Store Review Guidelines specify that developers must have “secure user consent” for collecting “depth of facial mapping information”, and also expressly prohibit developers from using data gathered via the TrueDepth camera system for advertising or marketing purposes.
In clause 5.1.2 (iii) of the developer guidelines, Apple writes:
Data gathered from the HomeKit API or from depth and/or facial mapping tools (e.g. ARKit, Camera APIs, or Photo APIs) may not be used for advertising or other use-based data mining, including by third parties.
It also forbids developers from using the iPhone X’s depth sensing module to try to create user profiles for the purpose of identifying and tracking anonymous users of the phone — writing in 5.1.2 (i):
You may not attempt, facilitate, or encourage others to identify anonymous users or reconstruct user profiles based on data collected from depth and/or facial mapping tools (e.g. ARKit, Camera APIs, or Photo APIs), or data that you say has been collected in an “anonymized,” “aggregated,” or otherwise non-identifiable way.
While another clause (2.5.13) in the policy requires developers not to use the TrueDepth camera system’s facial mapping capabilities for account authentication purposes.
Rather, developers are required to stick to using the dedicated API Apple provides for interfacing with Face ID (and/or other iOS authentication mechanisms). So basically, devs can’t use the iPhone X’s sensor hardware to try to build their own version of ‘Face ID’ and deploy it on the iPhone X (as you’d expect).
They’re also barred from letting kids younger than 13 authenticate using facial recognition.
Apps using facial recognition for account authentication must use LocalAuthentication (and not ARKit or other facial recognition technology), and must use an alternate authentication method for users under 13 years old.
The sensitivity of facial data hardly needs to be stated. So Apple is clearly aiming to set parameters that narrow (if not entirely defuse) concerns about potential misuse of the depth and face tracking tools that its flagship hardware now provides. Both by controlling access to the key sensor hardware (via APIs), and by policies that its developers must abide by or risk being shut out of its App Store and barred from being able to monetize their apps.
“Protecting user privacy is paramount in the Apple ecosystem, and you should use care when handling personal data to ensure you’ve complied with applicable laws and the terms of the Apple Developer Program License Agreement, not to mention customer expectations,” Apple writes in its developer guidelines.
The wider question is how well the tech giant will be able to police each and every iOS app developer to ensure they and their apps stick to its rules. (We asked Apple for an interview on this topic but at the time of writing it had not provided a spokesperson.)
The depth data Apple is providing to iOS developers — previously available to them only at even lower resolution on the iPhone 7 Plus, via that device’s dual cameras — arguably makes facial tracking applications a whole lot easier to build, thanks to the additional sensor hardware in the iPhone X.
Though developers aren’t yet being widely incentivized by Apple on this front — as the depth sensing capabilities remain limited to a minority of iPhone models for now.
Although it’s also true that any iOS app granted access to iPhone camera hardware in the past could potentially have been using a video feed from the front-facing camera, say, to try to algorithmically track facial expressions (i.e. by inferring depth).
So privacy risks around face data and iPhones aren’t entirely new, just maybe a little better defined thanks to the fancier hardware on tap via the iPhone X.
Questions over consent
On the consent front, it’s worth noting that users do also have to actively give a particular app access to the camera in order for it to be able to access iOS’ face mapping and/or depth data APIs.
“Your app description should let people know what types of access (e.g. location, contacts, calendar, etc.) are requested by your app, and what aspects of the app won’t work if the user doesn’t grant permission,” Apple instructs developers.
Apps also can’t pull data from the APIs in the background. So even after a user has consented to an app accessing the camera, they have to be actively using the app for it to be able to pull facial mapping and/or depth data. So it should not be possible for apps to continuously track users’ faces — unless a user keeps on using the app.
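Concretely, that consent gate has two parts: a usage description the developer must declare in the app’s Info.plist, and a runtime prompt the user must accept. A minimal sketch:

```swift
import AVFoundation

// The app's Info.plist must declare NSCameraUsageDescription,
// the explanation shown to the user in the permission prompt.

AVCaptureDevice.requestAccess(for: .video) { granted in
    if granted {
        // The camera, and with it the depth/face-mapping APIs, is now
        // available, but only while the app is in the foreground.
    } else {
        // No camera access: face-tracking features must be disabled
        // and the app should explain which functionality won't work.
    }
}
```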
Although it’s also fair to say that users failing to read and/or properly understand T&Cs for digital services remains a perennial problem. (And Apple has sometimes granted additional permissions to certain apps — such as when it temporarily gave Uber the ability to record the iPhone user’s screen even when the app was in the background. But that is an exception, not the rule.)
Add to that, certain popular apps that make use of the camera as part of their core proposition — say, social sharing apps like Facebook, Snap, Instagram etc — are likely going to be able to require that users give access to the TrueDepth API if they want to use the app at all.
So the ‘choice’ for a user may be between being facially tracked by their favorite app or foregoing using the app entirely…
One iOS developer we spoke to played down any expansion of privacy concerns related to the additional sensor hardware in the TrueDepth module, arguing: “To a certain extent, you could do things already with the 2D front facing camera if the user gives access to it — the added depth data doesn’t really change things.”
Another suggested the resolution of the depth data that Apple supplies via the new API is still “relatively low” — while also being “slightly higher res data” than the iPhone 7 Plus depth data. Though this developer had yet to test out the TrueDepth API to prove out their supposition.
“I’ve worked with the iOS 11 depth data APIs (the ones introduced at WWDC; before TrueDepth) a bit, and the data they supply at least with the iPhone 7 Plus is pretty low res (<1MP),” they told us.
Most of the iOS devs we contacted were still waiting to get their hands on an iPhone X to be able to start playing around with the API and seeing what’s possible.
Ultimately, though, it will be up to individual iPhone X users to decide whether they trust a particular company or app developer enough to give it access to the camera — and thus also to the facial tracking and facial mapping toolkit that Apple is placing in developers’ hands with the iPhone X.
The issue of user consent is a potentially thorny one, though, especially given incoming tighter regulations in the European Union around how companies handle and process personal data.
The GDPR (General Data Protection Regulation) comes into force across the 28 Member States of the EU in May next year, and sets new responsibilities and liabilities for companies processing EU citizens’ personal data — including by expanding the definition of what personal data is.
And since US tech giants have many EU users, the new rules in Europe are effectively poised to drive up privacy standards for major apps — thanks to the risk of far steeper fines for companies found violating the bloc’s rules.
Liabilities under GDPR can also extend to any third party entities a company engages to process personal data on its behalf — though it’s not entirely clear, in the case of Apple, whether it will be at risk of being in any way liable for how iOS developers process their app users’ personal data given its own business relationship with those developers. Or whether all the risk and responsibility pertaining to a particular app will lie with its developer (and any of their own sub-processors).
The EU regulation is undoubtedly already informing how Apple shapes its own contractual arrangements with app developers — such as stating developers must get appropriate consents from users so that it can demonstrate it’s taken appropriate contractual steps to safeguard user data. And also by setting limits on what developers can do with the data, as the clauses detailed above show.
Although, again, Apple is also creating risk by making it easier for developers to map and track users’ faces at scale. “Every time you introduce a new player into the ecosystem by definition you create vulnerability,” agrees Scott Vernick, a partner and privacy and cybersecurity expert at law firm Fox Rothschild. “Because it’s a question of… how can you police all of those app developers?”
One thing is clear: The level of consent that app developers will need to obtain to process EU users’ personal data — and facial data is absolutely personal data — is going to step up sharply next year.
So the sort of generic wording that Snap, for example, is currently showing to iPhone X users when it asks them for camera permissions is unlikely to meet Europe’s incoming standard on consent next year — since it’s not even specifying what it’s using the camera access for, nor saying whether it’s engaging in facial tracking. A vague reference to “and more” probably won’t suffice in future…
GDPR also gives EU citizens the right to ask what personal data a company holds on them, and the right to request their personal data be deleted — which requires companies to A) know exactly what personal data they are holding on each user and B) have systems in place capable of deleting specific user data on demand.
Vernick believes GDPR will likely have a big impact when it comes to a feature like iPhone X-enabled facial tracking — saying developers making use of Apple’s tools will need to be sure they have “proper disclosures” and “proper consent” from users or they could risk being in breach of the incoming law.
“That issue of the disclosure and the consent just becomes incredibly magnified on the EU side in view of the fact that GDPR comes into place in May 2018,” he says. “I think you will see a fair amount of interest on the EU side about exactly what information third parties are getting. Because they’ll want to make sure the appropriate consents are in place — but also that the appropriate technical issues around deletion of the data [are addressed], and so forth.”
What does an appropriate consent look like under GDPR when facial mapping and tracking comes into play? Could an app just say it wants to use the camera — as Snap is — without specifying it might be tracking your expressions, for example?
“If you just look at it from the perspective of GDPR I think that there will have to be a very notorious and outright disclosure,” responds Vernick. “I haven’t quite thought through whether the consent comes from Facebook or whether it comes from the application developer itself or the application but in any event, regardless of who’s responsible for the consent, as we would say here in the States the consent will have to be open and notorious in order for it to satisfy the GDPR.”
“Start with the premise that the GDPR is designed to, as a default, establish that the data is the consumer’s data. It’s not the technology company’s data or the app developer’s data,” he continues. “The premise of GDPR is that every company controlling or processing data of EU citizens will have to get specific consents with respect to every use that’s intended by the application, product or service. And you will have to give EU citizens also the right to delete that information. Or otherwise reclaim it and move it. So those general rules will apply here with equal force but even more so.”
Asked whether he thinks the GDPR will effectively raise privacy standards for US users of digital services as well as for EU users, Vernick says: “It will depend on the company, obviously, and how much of their business is tied to the EU vs how much of it is really just based in the US but I actually think that as a regulatory matter you will see much of this converge.”
“There will be less of a regulatory divide or less of a regulatory separateness between the EU and the States,” he adds. “I don’t think it will happen immediately but it would not surprise me at all if the kinds of things that are very much present and top of mind for EU regulations, and the GDPR, if you don’t see those morph their way over to the States… [Maybe] it just becomes technically more efficient to just have one standard so you don’t have to keep track of two schemes.
“I think that the regulatory climate will hew if you will towards standards being set by the EU.”
In the GDPR context, Apple’s own decision to encrypt and only locally store users’ sensitive facial biometric data makes perfect sense — helping it minimize its own risk and liabilities.
“If you start with the premise that it’s encrypted and stored locally, that’s great. If the app developers move away from that premise, even in a partial manner, even if they don’t have the entire [facial] mapping and all of the co-ordinates, again that produces a risk if in fact there’s unlawful access to it. In terms of the same risk of getting hold of any other personally identifiable information,” says Vernick.
“Every time you collect data you expose yourself to law enforcement, in that law enforcement generally wants a piece of it at some point,” he adds. “Now Apple seems to have headed that off by saying it can’t give over what it doesn’t have because the information is not stored at Apple, it’s stored on the device… But if there’s any erosion to that principle by whatever means, or by the app developers, you sort of become targets for that kind of thing — and that will raise a whole host of questions about what exactly is law enforcement looking for and why is it looking for it, and so forth and so on.
“At a minimum those are some of the legal challenges that facial recognition poses.”