I got the iPhone X for Christmas (thanks to my lovely wife!) and I have been impressed by its capability to recognise my face in different lighting conditions, even in the dark (of course that is possible because it uses an infrared projector to detect facial features; still, it is amazing).
The interaction with the phone is straightforward: you just look at it and you are identified and ready to go. However, I expect that Face ID technology could be used for much more than identification, and indeed we are starting to see the first applications taking advantage of it.
Rainbrow (watch the clip) lets you control a game by moving your eyebrows. It may not be pleasant for a bystander to watch you grimacing and making … faces … but it is an example of what can be done.
The facial data generated by the TrueDepth camera (actually only a small subset of it, since Apple does not release all of it out of privacy concerns) are analysed by the app on the phone and are not shared with third parties.
Another app is Nose Zone, where you use your nose to steer a dot on the screen to hit and destroy boxes.
I am pretty sure many more, and more sophisticated, apps will become available in the coming months. It may seem like an ineffective way of interacting with a phone, but we have to consider how effective facial communication is in our everyday interactions with other people. By looking at a fellow's face you can read between the lines and get much more information than what is being voiced. The presence of AI in applications will open up a host of mood-detection capabilities, and in turn this will change the way applications respond.
So far Apple is limiting the use of facial data, for the privacy considerations I mentioned before, but I can imagine that web browsing, online shopping and cooperative working will eventually take advantage of the possibility of reading "between the lines" by observing facial expressions and customising the interaction accordingly.