Meta research suggests that VR’s most transformative gains in telepresence and visual realism may come from improvements in screen brightness and dynamic range.
On Meta CTO Andrew Bosworth’s podcast, the company’s head of display systems research spoke of the huge gap in brightness between the roughly 100 nits delivered by Meta’s industry-leading Quest 2 headset and the more than 20,000 nits delivered by the Starburst research prototype. The latter can even match bright indoor lighting, far outperforming today’s best high dynamic range (HDR) televisions, which top out around 1,000 nits.
Douglas Lanman, Meta’s top display researcher, referred to this gap as what “we want most, but are least able to deliver right now.” At 5 to 6 pounds, with its heat sinks, powerful light source and optics, the prototype is so heavy that it has to be hung from above and held against the face with handles. And while we know Sony’s PlayStation VR 2 display brings HDR to consumer VR for the first time, its exact brightness and dynamic range are unknown.
“You said you feel your eye react to it in a certain way,” Meta researcher Nathan Matsuda told Tested’s Norman Chan when he tried Starburst. “We know there are a variety of perceptual cues that you get from that expanded luminance, and some of that is due to work that was done for the television and cinema display industry. But of course, if you have a more immersive display device like this, where you have a wide field of view, binocular parallax and so on, we don’t know if the perceptual responses actually come straight from the previous work that was done with TVs. So one of the reasons we built this to begin with is so we can start to unravel where those differences are, where the thresholds might be where you feel like you’re looking at a real light rather than an image of a light, which will eventually lead us to being able to build devices that content creators can then use to produce content that takes advantage of this full range.”
For those who missed it, Meta this week offered an unprecedented look at its prototype VR headset research alongside announcing a goal to one day pass the “visual Turing test.” Passing the test would mean creating a VR headset with images indistinguishable from reality. On Bosworth’s podcast, Boz to the Future, Lanman described the challenges of advancing VR screens toward this goal in four areas – resolution, varifocal, distortion correction and HDR – with the latter described as arguably the most challenging to fully achieve.
In these [Starburst] prototypes we’ve built, you’re watching a sunset… And when we talk about presence, you feel like you’re there. You are on Maui, watching the sun go down, and the hairs on the back of your neck rise.
So this is the one we want the most, but can deliver the least at the moment. Where we are is just conducting studies to determine: what would work? How can we change the rendering engine? How could we change the optics and displays to give us this? But high dynamic range, that’s the fourth, arguably the king of them all.
The Starburst prototype, pictured below, demonstrated extremely clear, high dynamic range (HDR) visuals in VR, which Meta CEO Mark Zuckerberg described as “perhaps the most important dimension of all.”
While Starburst’s brightness greatly enhances the sense of presence and realism, the current prototype would be “extremely impractical” to ship as a product, as Zuckerberg put it. If you haven’t dived in yet, we highly recommend taking the time to watch Tested’s full video above and listen to the podcast with Lanman and Bosworth embedded below. As Meta’s CTO said, “the prototypes give you the ability to reason about the future, which is super useful because it lets us focus.”
We also reached out to Norman Chan of Tested, as his exclusive look at the hardware prototypes, and his comment to Zuckerberg that Starburst was “the demo I didn’t want to leave,” suggest HDR will likely become a critical area of improvement for future HMDs. Where the gap between Quest 2’s angular resolution and the “retinal” resolution of the Butterscotch prototype is 3x, the gap between the brightness of Starburst and Quest 2 is close to 200x, meaning there’s a far bigger gap to cross in brightness and dynamic range before a headset can match “pretty much any indoor environment,” as Lanman said of Starburst.
“The qualitative benefits of HDR were noticeable in the Starburst prototype demo I tried, even though the headset screen was far from retinal resolution,” Chan wrote to us. “It’s going to be a big technical challenge to get to about 20,000 nits in a consumer headset, but I could see incremental improvements in luminance through efficiency in the display’s transmission. What excites me is that producing HDR images isn’t computationally taxing – there are so many existing media with built-in HDR metadata that will benefit in HDR VR headsets, I can’t wait to replay some of my favorite VR games that have been remastered for HDR!”
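As a rough back-of-the-envelope check on the figures cited above (about 100 nits for Quest 2, 20,000 nits for Starburst, and 1,000 nits for today’s best HDR TVs – these are the article’s approximations, not official specs), the brightness gaps work out as follows:

```python
import math

# Approximate peak-brightness figures cited in the article (not official specs).
QUEST_2_NITS = 100       # Meta Quest 2
STARBURST_NITS = 20_000  # Starburst research prototype
HDR_TV_NITS = 1_000      # today's best HDR televisions

# Starburst vs Quest 2: the ~200x gap mentioned above.
vs_quest = STARBURST_NITS / QUEST_2_NITS
print(f"Starburst vs Quest 2: {vs_quest:.0f}x brighter")  # 200x

# Starburst vs top HDR TVs.
vs_tv = STARBURST_NITS / HDR_TV_NITS
print(f"Starburst vs HDR TVs: {vs_tv:.0f}x brighter")  # 20x

# The same gap expressed in photographic stops (doublings of luminance).
stops = math.log2(vs_quest)
print(f"Quest 2 -> Starburst: about {stops:.1f} extra stops of peak luminance")
```

Put in photographic terms, the roughly 200x gap is about 7.6 stops of additional peak luminance, which helps explain why Lanman calls HDR the hardest of the four display challenges to close.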
UploadVR News Writer Harry Baker contributed to this report.