# VR, PPD, Aliasing and Readability

Date: 2024-11-03
Tags: Tech, VR

# Quick Introduction

In my last post, I basically had a rant about how VR feels useless despite offering so much potential. Since then, I did some work with the main, basic goal of reliably and measurably making things "readable" in VR. As seen in this [GDC Presentation from Valve in 2015](https://media.steampowered.com/apps/valve/2015/Alex_Vlachos_Advanced_VR_Rendering_GDC2015.pdf), aliasing and generally making stuff readable is a pretty big deal.

This made me think about PPD, or pixels per degree (of your field of view). Pixels in a headset can be any size, but they sit so close to your eyes that measuring their physical size isn't really valuable. Instead, measuring how many you can see at once (hence PPD) gives a much better idea of the sort of clarity you can get and, ultimately, the amount of aliasing you will be subjected to.

# The world of PPD

When the Index came out, PPI was still being thrown around without explanation, but everyone was already converging on measuring the "clarity" of HMDs in PPD. However, to this day, calculating PPD is ambiguous at best, with massive [blogposts](http://doc-ok.org/?p=1414) written with the sole objective of finding an accurate way to calculate it. In a semi-related attempt to get the Apple Vision Pro's (AVP) actual PPD (since no solid values existed online), I ended up coming up with a way to calculate PPD fairly reliably. I'm not claiming this is perfect, but if I've learned anything from my engineering courses, it's that close enough is the best you can ask for when working with real life (also, I don't really care that much; this is comparative anyway).
So I've come up with this simple equation:

$$ \text{PPD} = \frac{H_{t}}{F_{h} \times (1 + O_{p})} $$

Where:

- $\text{PPD}$ is the headset's pixels-per-degree value
- $H_{t}$ is the total horizontal resolution (if 1440px horizontal per eye, then this is 2880)
- $F_{h}$ is the total horizontal FOV provided by the headset
- $O_{p}$ is the stereoscopic/binocular overlap fraction

All of these values have specific meanings and ways to obtain them. Mainly, the total FOV and stereo overlap *should* be obtained from this amazing app called [wimFOV](https://boll.itch.io/wimfov), which just gives these values with no additional work. Using this method, I can get an almost perfect estimate for Quests, Rifts, Pimaxes, the Varjo Aero and the Index, and I managed to estimate the PPD of an AVP at around 39 PPD using data from SadlyItsBradley on Twitter. That said, I couldn't make it consistent with the Varjo XR-4.

Analysing this equation yields some interesting information. We already knew from simple logic that increasing the resolution over a narrower FOV would provide better PPD, but accounting for the binocular overlap yielded almost perfect values every time. Although there is no empirical backing for this, the equation basically states that any pixel shared by both eyes (as in, the pixels in the area of binocular overlap) doesn't "count" in the otherwise simple PPD calculation of resolution over FOV. The greater the overlap, the closer the "basic" PPD gets to being halved. Essentially, a simple monitor has an overlap of zero, so the equation defaults to basic PPD (resolution divided by FOV). But if each eye sees the same image from identical displays (100% overlap), then the basic PPD gets effectively halved, as if only a single screen were visible.

Humans need a bit of overlap to perceive depth. Too much and you feel cross-eyed; too little and you might not be able to see depth (kind of like putting a book an inch from your face and trying to read). Most headsets are around 80%+ overlap.
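The equation translates directly into a few lines of code. As a minimal sketch, here it is in Python; note that the Index-like example values below (2880px total horizontal, 108° horizontal FOV, 80% overlap) are illustrative assumptions of mine, not measured wimFOV output:

```python
def ppd(total_horizontal_px: float, horizontal_fov_deg: float, overlap: float) -> float:
    """Estimate a headset's pixels per degree.

    total_horizontal_px: combined horizontal resolution of both eyes
                         (e.g. 1440px per eye -> 2880)
    horizontal_fov_deg:  total horizontal FOV in degrees
    overlap:             binocular overlap fraction, 0.0 to 1.0
    """
    return total_horizontal_px / (horizontal_fov_deg * (1 + overlap))

# A flat monitor has zero overlap, so this reduces to resolution / FOV.
# Illustrative Index-like numbers (assumed, not measured):
print(round(ppd(2880, 108, 0.8), 2))  # -> 14.81
```

With zero overlap the function returns the "basic" PPD, and at 100% overlap it returns exactly half of it, matching the reasoning above.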
Notably, Varjo and Pimax have consistently low overlap, which might just be their approach to wide FOV with reasonable PPD. For instance, almost all Varjo devices are around 70%.

# But why does this matter?

Well, now that we can reliably get PPD, why not use that information to address readability in VR? We now know that a headset has a set number of pixels per degree. That means that whatever image we try to show in VR, even one with infinite resolution (e.g. an SVG drawing), will eventually get "filtered" down to the headset's PPD. This is the most basic principle of aliasing on digital displays: if your filter (the screen itself) can't capture details of a certain size, they will be lost. If you try to render something 1 unit wide but your display's pixels are 2 units wide, some error will be induced. Do that at different scales and you get aliasing. Again, from that GDC presentation, aliasing is the bane of VR. If it weren't an issue, we wouldn't be chasing higher-resolution headsets. So how can we make the most of what we have?

I have a Valve Index. It gets approximately 14.84 PPD. That means that:

- Anything rendered with denser pixels (effective PPD goes up) will get aliased
- Anything rendered with sparser pixels (effective PPD goes down) will waste space in the user's FOV, since no additional detail can be seen

So technically, you'd want any text that would be readable on a normal screen to be scaled such that it gives approximately the same NUMBER of pixels to, for example, a letter in the HMD as it does on the monitor. If the letter "a" needs 10 pixels, we want it to also have 10 pixels in the HMD; otherwise, the pixels become visible if the letter is rendered too big, or get aliased into mush if it's too small.
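This pixel-matching rule can be restated as an angular budget: a glyph that needs $n$ pixels should span $n / \text{PPD}$ degrees of the user's vision. A tiny sketch of that conversion (using the 14.84 PPD Index figure from above):

```python
def glyph_angle_deg(pixels: float, headset_ppd: float) -> float:
    """Angular size a glyph should occupy so its on-screen pixel count
    matches the headset's native pixels per degree."""
    return pixels / headset_ppd

# A letter needing 10 pixels on an Index (~14.84 PPD) should span about 0.67 degrees:
print(round(glyph_angle_deg(10, 14.84), 2))  # -> 0.67
```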
Therefore, using this [calculator](https://phrogz.net/tmp/ScreenDensityCalculator.html#find:size,pxW:1280,pxH:720,sizeUnit:cm,axis:horiz,distance:130,distUnit:cm,density:14.84), for a 720p 16:9 window, we would need a "screen" 1.3m from the user (roughly the distance at which the SteamVR dashboard spawns) with a horizontal width of about 243cm for it to be at around the same PPD as the headset. Testing this in VR using [Desktop+ on Steam](https://store.steampowered.com/app/1494460/Desktop/) (since it offers reliable metric sizing of windows), the results are honestly perfect. Everything is readable, and the boundary between aliasing and pixels that are too large is easy to spot by moving further away or closer, respectively.

As a little quirk of the calculator, it's technically possible to also refer to a headset's clarity in terms of the horizontal FOV that must be occupied by a window of a set resolution, say 1280px wide, for it to match the headset's PPD. For an Index, this is about 86° horizontal (most of the available FOV), whereas something like the AVP only [requires 33°](https://phrogz.net/tmp/ScreenDensityCalculator.html#find:size,pxW:1280,pxH:720,sizeUnit:cm,axis:horiz,distance:130,distUnit:cm,density:39). Do also note that this angle is actually constant: for a fixed PPD and a fixed window resolution, any combination of width, height and distance of the screen in VR will ALWAYS occupy this same horizontal "arc" of the user's vision. What this also means is that for windows of the same resolution, where the Index can only show one at "matching" resolution (one window needs 86°), the AVP can show almost three (one needs 33°, three need about 100°).

# Moving Forward

Basically, I can now reliably get the PPD value of a headset using experimental results, and we can estimate the size a screen must be to match this PPD in VR.
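The calculator's results can be reproduced with basic trigonometry: a window of $W$ pixels must span $W / \text{PPD}$ degrees, and a flat screen at distance $d$ spans that arc when its width is $2d \tan(\theta/2)$. A sketch, assuming the flat-screen chord approximation and the PPD figures from the text:

```python
import math

def matching_angle_deg(window_px: float, headset_ppd: float) -> float:
    """Horizontal arc a window must occupy to match the headset's PPD.
    Depends only on resolution and PPD, not on screen size or distance."""
    return window_px / headset_ppd

def matching_width_m(window_px: float, headset_ppd: float, distance_m: float) -> float:
    """Physical width of a flat screen at distance_m that spans that arc."""
    half_angle = math.radians(matching_angle_deg(window_px, headset_ppd) / 2)
    return 2 * distance_m * math.tan(half_angle)

print(round(matching_angle_deg(1280, 14.84), 1))  # Index: ~86.3 degrees
print(round(matching_angle_deg(1280, 39.0), 1))   # AVP:   ~32.8 degrees
print(matching_width_m(1280, 14.84, 1.3))         # ~2.43 m, i.e. roughly 243 cm
```

This reproduces both numbers from the text: the ~86° arc for the Index and the ~243cm width at 1.3m for a 1280px-wide window.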
Based on this, it's technically possible to automatically determine the size of any UI element for any headset such that there is no aliasing. I will attempt to incorporate this in future projects. I hope this is useful for someone lol.