iOS 13 and iPadOS will correct our gaze on FaceTime calls to make it look like we are looking at the camera

The third beta of iOS 13 and iPadOS has revealed a very subtle novelty, but one that could change FaceTime video calls. If you have ever been on a video call, you will know that you usually look at the screen to see the person you are talking to, not directly at the camera. That means the person you are speaking with does not see you looking at them directly, but rather with a gaze slightly turned in another direction.

iOS 13 and iPadOS want to solve this long-standing quirk of video calls with a software correction: the systems will detect our face and adjust our eyes, so that the person or people watching us on screen get the feeling that we are looking at them directly.

There will be an option to disable it (if nothing changes)

The finding was shared on Reddit, and there are already some images circulating on Twitter showing that the effect works.

The secret lies in the use of ARKit: iOS builds a three-dimensional map of our face and uses it to reposition both the nose and the eyes.
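Apple has not documented how the correction itself is implemented, but the face data it would rely on is exposed by ARKit's public face-tracking API. The sketch below, which requires a device with a TrueDepth camera, only illustrates the kind of per-frame data mentioned above (the 3D face mesh, each eye's orientation, and the estimated gaze point); the `FaceTracker` class name is our own, not Apple's:

```swift
import ARKit

// Sketch: reading the per-frame face data that ARKit exposes.
// This does not reproduce FaceTime's attention correction; it only
// shows the 3D face map and eye/gaze data the article refers to.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking needs a TrueDepth camera (iPhone X or later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // The three-dimensional mesh of the face (the "map" above).
            let vertexCount = face.geometry.vertices.count
            // Orientation of each eye and the point the user is looking
            // at, both relative to the face anchor.
            let leftEye = face.leftEyeTransform
            let rightEye = face.rightEyeTransform
            let gazeTarget = face.lookAtPoint
            print(vertexCount, leftEye, rightEye, gazeTarget)
        }
    }
}
```

With data like this, a renderer could in principle warp the eye region of the video frame toward the camera, which is presumably the kind of repositioning the betas perform.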

Anticipating that the correction may feel uncomfortable to some users, Apple has included an option in both systems to deactivate it.

Of course, if this feature is not removed in upcoming betas, it will be something we can try out when iOS 13 and iPadOS launch as stable releases for everyone. Some questions remain, such as whether the correction will work only on one-to-one calls or also on group calls, and whether it will also be present in macOS Catalina.
