iPhone 12 Pro and iPhone 12 Pro Max are getting a new feature called “People Detection” that will allow visually challenged users to detect how far away other people are from them. The new feature is a part of a new iOS 14.2 beta release. It uses the LiDAR sensor and wide-angle camera of the latest iPhone models to precisely measure the distance between users and people nearby. Apple may include it among the key accessibility-focussed changes in the public release of iOS 14.2 in the coming days.
Available in the Magnifier app, the People Detection feature uses a combination of augmented reality (AR) and machine learning to help visually challenged iPhone users detect where humans and objects are around them. The addition was initially spotted in September. However, it’s now available on the iPhone 12 Pro and iPhone 12 Pro Max through the iOS 14.2 “Release Candidate” build that reached both developers and public beta testers on Friday.
As reported by TechCrunch, the People Detection feature is an extension of the people occlusion capability of Apple’s ARKit, which allows virtual content to be hidden behind people identified in the camera feed.
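For context, the sketch below shows how a third-party app could enable ARKit’s people occlusion on supported devices. This is a minimal illustration of the underlying ARKit capability, not Apple’s own Magnifier implementation.

```swift
import ARKit

// Minimal sketch: enable ARKit's people occlusion, which segments people
// in the camera feed so virtual content can be hidden behind them.
// This is an illustration of the public ARKit API, not Apple's Magnifier code.
func startPeopleOcclusionSession(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()

    // Person segmentation with depth is only supported on recent hardware,
    // so check for support before enabling it.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }

    session.run(configuration)
}
```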
In addition to the software tweaks, the feature utilises the LiDAR sensor and the wide-angle camera of the iPhone 12 Pro and iPhone 12 Pro Max to accurately measure the distance between users and nearby people as well as objects. This helps provide assistance to users who are blind or have very low vision.
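To give a sense of how LiDAR-based distance measurement works in practice, here is a hedged sketch that reads ARKit’s scene-depth map and samples the distance at the centre of the frame. The sampling point is an arbitrary choice for illustration; it is not how Apple’s feature picks the person to measure.

```swift
import ARKit

// Hedged sketch: read a distance (in metres) from the LiDAR scene-depth
// data that ARKit exposes. Requires `.sceneDepth` frame semantics to be
// enabled on the session configuration.
func distanceAtCentre(of frame: ARFrame) -> Float? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }

    // The depth map stores 32-bit float distances in metres; sample the
    // pixel at the centre of the frame as a simple example.
    let rowPointer = base.advanced(by: (height / 2) * rowBytes)
    let pixels = rowPointer.assumingMemoryBound(to: Float32.self)
    return pixels[width / 2]
}
```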
The People Detection feature tells the users whether people are in their space. If it finds someone in close proximity, it then measures the distance and then produces a sound in stereo corresponding to the direction of that person. It is also capable of allowing users to set any particular tones once a certain distance is measured by the iPhone. This essentially helps maintain some distance with other people and objects. It is also something quite helpful in today’s time when people are advised to maintain at least six feet distance from others.
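The stereo-feedback idea can be illustrated with a short sketch: pan a tone left or right to match the person’s direction and raise the volume as they get closer. The mapping values below are assumptions for demonstration only and do not reflect Apple’s implementation.

```swift
import AVFoundation

// Illustrative sketch (not Apple's code): map a detected person's
// direction to stereo pan and their distance to volume.
final class ProximityTonePlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let buffer: AVAudioPCMBuffer

    init() {
        let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)!
        let frameCount = AVAudioFrameCount(format.sampleRate * 0.2) // 200 ms beep
        buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount)!
        buffer.frameLength = frameCount

        // Fill the buffer with an 880 Hz sine tone.
        let samples = buffer.floatChannelData![0]
        for i in 0..<Int(frameCount) {
            samples[i] = Float(sin(2.0 * Double.pi * 880.0 * Double(i) / format.sampleRate)) * 0.5
        }

        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try? engine.start()
    }

    /// `direction` runs from -1 (far left) to 1 (far right); `distance` is in metres.
    func beep(direction: Float, distance: Float) {
        player.pan = max(-1, min(1, direction))
        player.volume = max(0.1, min(1, 2 / max(distance, 0.5)))
        player.scheduleBuffer(buffer, at: nil, options: [])
        player.play()
    }
}
```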
Apple has also provided a haptic pulse option that becomes faster as a person gets closer, as noted by TechCrunch. The haptic pulse is, however, limited to the iPhone and is not available on the Apple Watch. There is also an option to get a visual readout of how far away the person is from the user.
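A distance-dependent haptic pulse of this kind could look something like the following sketch, where the pulse interval shrinks as the measured distance drops. The interval mapping is an assumption for illustration and is not taken from Apple.

```swift
import UIKit

// Illustrative sketch (not Apple's code): a haptic pulse whose rate
// increases as the measured distance to a person shrinks.
final class ProximityHaptics {
    private let generator = UIImpactFeedbackGenerator(style: .medium)
    private var timer: Timer?

    /// Restart the pulse for a new distance reading (in metres).
    func update(distance: Double) {
        timer?.invalidate()

        // Pulse roughly every 0.15 s at arm's length, slowing to 1 s at ~6 m.
        let interval = min(1.0, max(0.15, distance / 6.0))
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            self?.generator.impactOccurred()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```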
Since the People Detection feature is currently in its beta stage, it has some bugs and may not provide accurate results all the time. Nevertheless, Apple is likely to fix them ahead of its public debut, which may come alongside the public release of iOS 14.2.
It is important to highlight that the feature is currently meant specifically for the iPhone 12 Pro and iPhone 12 Pro Max. This means that you won’t be able to experience it even if you’re using the iPhone 12 or the iPhone 12 mini. The exclusivity may give some users another reason to pick a pricier model in the iPhone 12 series.