Apple Face ID uses both a normal camera and an IR projector to build a depth map.
What is the purpose of this technology? It is certainly not about convenience (it is not even as convenient as a fingerprint sensor).
Such technology is needed now to establish a tight link between your face on every surveillance camera and your identity.
It also hands the government an astonishingly detailed model of your face.
Surveillance camera networks are growing exponentially, and around 80% are already connected to face recognition systems.
Access to a database of detailed 3D models would let such systems boost their recognition rate and make new approaches work well in low light and under visual occlusion.
The main goal of all new surveillance systems is near-real-time tracking of individuals using both smartphone signal triangulation and cameras.
By the way, this week Apple tried hard to dance around the simple fact that it provides the face depth map and image to any app that asks for it via the API, and that it can share this data with any third party and sell it. They just lied, claiming it is not the same data that is used for recognition.
And this is how the Apple marketing team works:
In reality, all of Apple's "revolutionary" ID is very simple (the only hard part is making it small). It is an IR dot-pattern projector, an IR LED flash, and a monochrome camera with an IR filter. The depth map is produced using long-known algorithms that process the dots.
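For what it's worth, the underlying math is plain triangulation: the further a projected dot shifts in the IR image, the closer the surface is. Here is a minimal sketch of that idea; the baseline, focal length and disparity values are made-up illustrative numbers, not Apple's actual TrueDepth parameters.

```python
# Illustrative structured-light depth from dot displacement (triangulation).
# All hardware numbers below are assumptions for the sake of the example.

def depth_from_disparity(disparity_px: float,
                         focal_length_px: float = 600.0,  # assumed IR camera focal length, in pixels
                         baseline_m: float = 0.02) -> float:  # assumed projector-to-camera baseline (2 cm)
    """Depth is inversely proportional to how far the dot is displaced."""
    if disparity_px <= 0:
        raise ValueError("dot not displaced -> point at infinity or not matched")
    return focal_length_px * baseline_m / disparity_px

# A dot shifted 40 px maps to ~0.3 m; the same dot shifted only 4 px maps to ~3 m.
for d in (40.0, 20.0, 4.0):
    print(f"disparity {d:5.1f} px -> depth {depth_from_disparity(d):.2f} m")
```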
Here's my take on the real use of 3D face scans... Apple studio entertainment and VR.
They want to motion-capture all people and every place you go, indoors and out, so they can use it for AR & VR movie production, entertainment, and marketing to advertisers... with or without your consent.
Once the Apple AI has enough data on you, it can place you anywhere for any use, from games to movies to a mall on the other side of the world.
Of course, the government and other agencies want that data so they can place you anywhere you may or may not be in the future...
No thanks, Apple. The Big Brother "1984" commercial directed by Ridley Scott is so appropriate... Anybody doing the spoof video yet?
It works only at very short distances because of the dot projector. So none of the things written above make sense.
You really believe this tech was introduced ONLY for controlling emoji and unlocking your phone with your face?
Nope, Apple REALLY wants you scanned and inside their artificial AR/VR environment 24/7, 365, so they can turn you into their perfect digital consumer, with all relevant facial emotional-response data points and even heart-rate responses monitored, all in real time, to gauge what to sell you, or sell about you, next.
This first depth-mapping use is just the start of a slippery slope; they know where they want you to fall and in which direction, and billions of 2D photos were insufficient.
You have no thinking to do, just research. Kindly retract your "all things written make no sense" comment after doing a couple of relevant searches, and I'll continue posting... Here's a head start with three very relevant data points:
https://techcrunch.com/2015/11/24/apple-faceshift/
"Apple Has Acquired Faceshift, Maker Of Motion Capture Tech"
Apple itself already has patents and assets across motion capture, facial recognition and augmented reality, partly by way of three other European acquisitions, respectively PrimeSense, Polar Rose and Metaio. Faceshift could complement and expand Apple’s capabilities in these areas going forward.
https://www.ft.com/content/c4e4d1aa-8ee9-11e7-a352-e46f43c5825d
Apple eyes iconic studio as base for Hollywood production push
The Culver Studios, where ‘The Matrix’ was filmed, may be leased for content arm.
September 1, 2017
by Matthew Garrahan in London and Tim Bradshaw in Los Angeles
"Apple is eyeing the studio where films from Gone With The Wind to The Matrix were shot, as the base for its big push into Hollywood production.The iPhone maker is in discussions to move its original content division to The Culver Studios, whose former owners include RKO, Howard Hughes and Cecil B DeMille.Apple’s interest in a studio which has been central to Hollywood moviemaking for close to a century, comes amid an intensifying Silicon Valley battle for the best movie scripts and television projects...
Apple’s expansion in the heart of the entertainment industry comes as the company has said that it wants to double the amount of revenues it receives from services such as Apple Music, iCloud and the App Store by 2021 to make them into a $50bn business..."
https://www.cultofmac.com/501560/apples-big-push-augmented-reality-will-bring-new-hardware/
“A new high-end iPhone is expected to be unveiled Tuesday with a dual-lens camera system and 3-D sensors that improve depth-sensing and enhance augmented-reality experiences,” it reads.
This will be followed by a “flood” of new AR apps and games that will place products in our homes before we buy them, allow us to explore virtual worlds from the comfort of our living rooms, and enjoy immersive movies like never before.
“This is one of those huge things that we’ll look back at and marvel on the start of it,” Apple CEO Tim Cook told analysts last month..."
Finally, https://www.google.com.au/patents/US20090195392
"Laugh detector and system and method for tracking an emotional response to a media presentation US 20090195392 A1"
Just Apple's small steps into "entertainment", aka emotional response research and control complete with biometric signatures only you have...
I really like that you dig deep into this, but here you are wrong.
The way this 3D sensing works is by using the dot projector and then the IR camera to capture an image. From how far the dots are displaced, you can calculate a depth map. Unfortunately, due to simple physics, it is completely useless at larger distances.
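To make the "simple physics" point concrete: for a fixed dot-matching error, the depth uncertainty of a triangulation system grows roughly with the square of the distance. A rough sketch below, using the same assumed (not measured) numbers as before.

```python
# Rough illustration of why structured light degrades with range.
# Assumes a 0.5 px dot-matching error and the same illustrative optics as above;
# these are not real TrueDepth specs.

def depth_error(depth_m: float,
                focal_length_px: float = 600.0,
                baseline_m: float = 0.02,
                disparity_error_px: float = 0.5) -> float:
    """Approximate depth uncertainty from triangulation geometry: error ~ Z^2."""
    return (depth_m ** 2) * disparity_error_px / (focal_length_px * baseline_m)

for z in (0.3, 1.0, 3.0, 10.0):
    print(f"at {z:4.1f} m the depth error is ~{depth_error(z) * 100:.1f} cm")
```

With these assumed numbers the error is under a centimetre at arm's length but grows to metres at 10 m, which is why a short-range face scanner is a poor fit for room- or street-scale capture.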