According to the article, this happens on the iPhone X and on iPad Pro models with the A12X Bionic chip. Apple says it's for Face ID and Animoji, but it generates a 3D map of the user's face even when those features aren't being used.
I assume it's doing that so it can keep recognizing you despite growing facial hair, wearing different clothing, etc. The more data it has on your face shape, the better. What did people expect? If you use Face ID, how do you think it knows what your face looks like?
Now, if you didn't use Face ID and it was still scanning your face, that would be a news story.