Vision Pro Will Use External Display for More Than Just Showing Your Eyes


The recently released visionOS beta 6 contains a video showing how users will scan their face with the Vision Pro's cameras to create their avatar. Perhaps more interestingly, the video reveals that Apple plans to use the external display for more than just showing the user's eyes through the headset.

Probably the most unexpected thing about the Apple Vision Pro reveal is the headset's external display, something no commercial XR headset has shipped with to date. Apple calls this the EyeSight display, because its primary function is to show the wearer's eyes 'through' the headset, so people nearby can tell whether the wearer is looking at them or is fully immersed and unable to see.

Image courtesy Apple

Technically, the EyeSight display isn't showing the user's real face; it's projecting a view of their Vision Pro avatar (or 'Persona,' as Apple calls it). Apple masks this fact with a stereoscopic display and some clever blurring and coloring effects that hide the avatar's limited resolution and quality.

To generate the avatar, users will use the headset's own cameras to capture multiple views of their face. The exact procedure was found in the files of visionOS beta 6, which developers can access.

New video tutorial showing Persona Enrollment for Apple Vision Pro added in visionOS beta 6!

The enrollment uses the EyeSight display to guide the user.

— M1 (@M1Astra) November 14, 2023

In the video we see a quick and easy process that uses the headset's external display as a step-by-step guide.

The scanning process is interesting in itself, but perhaps more interesting is the way Apple is thoughtfully using the external display to help guide the user.

It seems likely that Apple will leverage the display for more than just showing the user's eyes and guiding them through the scanning process, which opens some interesting doors.

For one, the display could let the headset communicate with the user in other ways when it isn't being worn. For instance, it could light up green to indicate an incoming FaceTime call; or blue to tell the user that a large download has finished; or red to indicate that it's low on battery and should be plugged in.

While there’s nothing stopping Apple from literally just putting text on the display and going full Daft Punk, the company seems to be thinking of the external display as something a bit more organic and magical than a readout of how many emails are waiting for you or how many calls you missed.

Can you think of any other interesting use-cases for the headset’s external display? I’d love to hear more ideas in the comments below!