On September 12th, Apple held its iPhone launch at the newly opened Steve Jobs Theater in Cupertino, California. Besides the new Apple Watch Series 3 and Apple TV, the launches app developers should be most excited about are the three new iPhones and the upgrades in iOS 11.
One feature that deserves special mention is the new TrueDepth camera system. In many ways, it goes further than anything other phone brands have achieved. To illustrate, consider how much technology the iPhone X packs into the small strip at the top of its front screen:
As you look at the phone, the flood illuminator detects your face. The infrared camera takes an image, and the dot projector casts more than 30,000 invisible IR dots to build a 3D dot pattern of your face. This data is sent to the new A11 Bionic chip, whose neural network was trained on over a billion images.
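The raw infrared pipeline stays on-device and isn't exposed directly, but developers can already reach the resulting 3D face data through ARKit's face tracking in iOS 11. A minimal sketch, assuming the app is running on an iPhone X (`ARFaceTrackingConfiguration` and `ARFaceAnchor` are real ARKit APIs):

```swift
import ARKit

// Minimal face-tracking session: the TrueDepth camera feeds ARKit,
// which surfaces a 3D mesh of the user's face as an ARFaceAnchor.
final class FaceCaptureDelegate: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is only supported on TrueDepth hardware (iPhone X).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // geometry.vertices is the 3D point mesh derived from the dot pattern.
            let vertexCount = faceAnchor.geometry.vertices.count
            print("Tracking face mesh with \(vertexCount) vertices")
        }
    }
}
```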
For the development community, the new hardware has enormous implications. Judging from its features, could it make waves in the movie industry? What about app security in business technologies like enterprise applications?
Apple has already demonstrated two applications of the system, FaceID and Animoji, and both promise implications of their own once the TrueDepth SDK opens up for developers to tinker with. Let's see what those implications might be.
Until now, TouchID has been a staple of every iPhone: put your finger on the home button and it recognizes your fingerprint to unlock the phone. With TrueDepth, Apple has built an even more secure system: FaceID.
The new application is exactly what it says. Instead of your fingerprint, it is your face that is the new passcode. All you have to do is look at the screen to unlock the phone.
To set up FaceID, you move your head in a circle while looking at the camera, giving it a more complete view of your facial features.
This feature has taken Apple into territory that no other phone brand offers app developers – the chance to move from 2D facial recognition to 3D. For developers, it opens the door to products whose security is far harder to defeat, as illustrated below:
Accurate facial recognition for security purposes: Would you recognize your friend if he suddenly came in to work with a full beard? Probably not, and unfortunately, apps that rely on facial recognition, such as banking apps, share these limits. An app might recognize a user once and then fail when the same user wears glasses. Some systems can even be fooled with a picture of the user's face taken from social networks.
Once the TrueDepth SDK opens up, it could enable developers to create software that is much harder to fool. Facial recognition in banking applications, for instance, would become more accurate and robust. And because the system is already familiar with a 3D version of your facial features, a 2D picture of you won't fool it.
More secure sign-in for enterprise software: The TrueDepth camera system is so accurate that Apple claims there is only a one-in-1,000,000 chance that a random person could unlock your phone. Imagine the same accuracy applied to enterprise software that, say, requires employees to sign in via facial recognition to access classified documents. Apple's engineering team even used Hollywood-grade face masks to train FaceID, so not even a user's evil twin should be able to gain unauthorized access to sensitive material.
FaceID will also work with Apple Pay and third-party apps such as the budgeting app Mint, 1Password and E*Trade.
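Third-party apps don't talk to FaceID directly; they go through Apple's existing LocalAuthentication framework, and on the iPhone X the same biometrics policy that previously meant TouchID is backed by FaceID. A minimal sketch of how a banking app might gate a sensitive screen (the `showAccountDetails`/`showPasscodeFallback` helpers are hypothetical stubs):

```swift
import LocalAuthentication

func showAccountDetails() { /* hypothetical: reveal the gated screen */ }
func showPasscodeFallback() { /* hypothetical: fall back to passcode entry */ }

func unlockSensitiveContent() {
    let context = LAContext()
    var error: NSError?

    // On an iPhone X this policy is backed by FaceID; on earlier
    // devices the same call uses TouchID.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        showPasscodeFallback()  // biometrics unavailable on this device
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account details") { success, _ in
        DispatchQueue.main.async {
            if success {
                showAccountDetails()
            } else {
                showPasscodeFallback()
            }
        }
    }
}
```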
The same tech that powers FaceID also comes to life in the TrueDepth camera system’s second application – Animojis.
Animoji is a new iOS 11 feature that uses the TrueDepth camera in the iPhone X to create animated 3D versions of your favorite emojis based on your facial expressions. You can also record personalized audio messages in which your chosen emoji mouths the words as you say them; recipients receive these messages as looping videos with audio.
It's safe to say that no other smartphone has taken emojis as far as Apple has. Thanks to TrueDepth, text messaging in iMessage has become more immersive: the recipient doesn't have to see you in person to know how you feel, because your Animoji performs your facial expressions for you (e.g. a frown or a downturned mouth).
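Those expressions reach developers as ARKit "blend shape" coefficients: roughly fifty named facial movements, each reported as a value between 0 and 1. A sketch of reading a smile and an open jaw from a tracked face (`ARFaceAnchor.blendShapes` is a real ARKit API; it assumes an active face-tracking session):

```swift
import ARKit

// Each ARFaceAnchor carries ~50 blend shape coefficients describing
// the current facial expression, each in the range 0.0...1.0.
func describeExpression(of face: ARFaceAnchor) -> String {
    let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
    let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0

    // An Animoji-style renderer would feed these coefficients straight
    // into a rigged 3D model; here we just label the dominant movement.
    if smile > 0.5 { return "smiling" }
    if jawOpen > 0.5 { return "mouth open" }
    return "neutral"
}
```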
Judging by how far Apple’s TrueDepth camera system has elevated the static emoji, it could have implications beyond text messages.
To illustrate, consider its application in filmmaking, particularly the Fast and the Furious franchise. In the seventh installment of the series, filmmakers replaced a substitute actor's face with a digital image of the late Paul Walker's face. There were no face scans of the actor, so to create a digital model of his face, the visual effects team had to rely on older footage and reconstruct how his facial features moved.
If the TrueDepth SDK makes its way into the movie industry, it might drastically cut the time and effort needed by visual effects artists to accomplish something like this. Additionally, movies could incorporate more dangerous scenes in which a stunt double can stand in with a digital “mask” of a real actor’s face in full view of the camera.
To recap, Apple's TrueDepth camera system offers more than existing facial recognition technologies. It's more accurate, and once the SDK opens up, it might allow developers to create more effective facial recognition features in their products.
In the meantime, check out Apple's September keynote if you haven't already.