Friday 30 April 2021

VR Review Roundup 1

Since last month, I've had some time to enjoy my new PC VR setup [I also switched domain hosts, so if anyone has had any problems with this site or any of my other subdomains, let me know]. I can definitely feel the growing room that exists for really pushing the fidelity of this VR display (and with displays only getting higher resolution from here, that will continue), so all of this comes with the caveat that we are fast approaching the fifth birthday of my GPU - at some point I'm going to be able to get more games looking nicer, or spend far less time viewing reprojected alternating frames, maybe even at the highest 144Hz refresh rate that this headset can do. One of the difficulties I have in VR is being as analytical as I'd like while wrapped inside the virtual space: the default tools for capturing moments don't give you raw grabs (of the actual distorted view fed to the headset), while adding a proper DirectX frame capture into the rendering chain might mess with latencies etc (I've yet to look into it). Let's run down some notable things I've played recently and what my perceptions are of the rendering going on:

No Man's Sky

I just couldn't get this working correctly. I'm not sure if I'm still finding my "PC expert" legs on how to set things up correctly for VR, but the flying-through-space loading screen (along with very unstable motion-to-photon delay) was enough to make me feel slightly unwell, and the framerate once I'd landed on a planet simply wasn't where it needed to be (even after tweaking the Index scaling option well below the automatic value). Maybe I needed to poke more at the in-game settings or wipe my previous config file (from before VR was patched into the game), because aiming for 2D 4K60 and aiming for VR numbers are not remotely similar optimisation processes, and it's possible the game isn't reading the SteamVR requested resolution correctly. It's important not to take this initial post as my final decree on modern PC VR (from the perspective of someone who previously has mainly been configuring console VR experiences) - it is still early days and I'm still learning how best to tweak VR games.
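
For anyone else poking at the same problem, the number the game should be respecting is queryable straight from the OpenVR API, which makes it easy to confirm what SteamVR is actually asking for after the per-app scaling is applied. A minimal sketch (assuming the openvr_api library is linked and SteamVR is already running - this isn't anything No Man's Sky itself exposes):

// Print the per-eye render target size SteamVR is currently requesting
// (this already folds in the global and per-application scaling options).
#include <openvr.h>
#include <cstdio>

int main() {
    vr::EVRInitError err = vr::VRInitError_None;
    // Background mode: we only want to query, not render a scene.
    vr::IVRSystem* sys = vr::VR_Init(&err, vr::VRApplication_Background);
    if (err != vr::VRInitError_None) {
        std::printf("VR_Init failed: %s\n",
                    vr::VR_GetVRInitErrorAsEnglishDescription(err));
        return 1;
    }
    uint32_t width = 0, height = 0;
    sys->GetRecommendedRenderTargetSize(&width, &height);
    std::printf("Recommended per-eye render target: %u x %u\n", width, height);
    vr::VR_Shutdown();
    return 0;
}

If that number and the resolution the game reports internally don't line up, at least you know where the blame sits.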


Star Wars: Squadrons

The scale is wrong. Digital Foundry noted something similar during their stream of Doom VFR on PS VR last month and it's immediately very noticeable as soon as you get into this game. When sat down, the floor in the game is roughly at the level of your actual floor but everything is scaled as if you were standing up. If you do stand up and reset the position (your view initially cutting to black as soon as you move out of the sweet spot it expects you to be in, which avoids letting you walk through and clip into too much geometry) then the virtual floor is clearly at the wrong depth, as you might expect from a game designed only for seated play.
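
The floor mismatch smells like a seated-origin problem rather than a pure world-scale one: OpenVR keeps a calibrated transform between the seated origin and the floor-level (standing) origin, and a game that ignores or rescales that offset ends up with exactly this wrong-depth floor. A small sketch of how you could sanity-check the runtime's side of that (my assumption about what's going wrong, not anything confirmed about Squadrons):

// Print the vertical offset between the seated origin and the floor-level
// (standing) origin that the SteamVR runtime has calibrated.
#include <openvr.h>
#include <cstdio>

int main() {
    vr::EVRInitError err = vr::VRInitError_None;
    vr::IVRSystem* sys = vr::VR_Init(&err, vr::VRApplication_Background);
    if (err != vr::VRInitError_None) return 1;

    vr::HmdMatrix34_t m = sys->GetSeatedZeroPoseToStandingAbsoluteTrackingPose();
    // The last column is the translation; m.m[1][3] is the Y offset in metres.
    std::printf("Seated origin sits %.2f m above the floor origin\n", m.m[1][3]);

    vr::VR_Shutdown();
    return 0;
}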

Getting into the game design choices, the cockpits may be accurate to the fictional universe but the often limited front-only (or slit-window) view into space removes the field of view advantage of VR, while the instruments feel insufficient to give a good sense of where things are around you (again, this may be true to the source material but I'd much rather they offered "upgraded" in-world interfaces rather than leaning on an optional floating HUD). There was a later mission around setting off floating reactor cores as large ships passed them (while also skirmishing with fighters) and I realised I basically didn't have a good sense of the 3D space for the majority of that mission. That seems like a failure to really utilise what VR can do. It didn't help to find plenty of threads of others swearing about that mission's design, despite it being something that theoretically should be cool if it were easy to judge 3D relationships. Ultimately I restarted the mission rather than keep banging my head against the third checkpoint, just so I could swap to a ship with a somewhat better canopy, and by then I had almost learned it by rote (fly here, shoot this, then fly there, shoot that at this timing, etc) - if it had involved a more dynamic setting then I might well have just given up.

The Frostbite temporal anti-aliasing is surprisingly good here (considering the FoV & pixel density requirements). Zero ghosting issues, even with the added difficulty of regular reprojected frames, because I couldn't get a high-res VR output at a stable frame time budget close to what you'd want - even with lots of settings dropped and including the new [lighting: Low] forward renderer mode that was patched in precisely to try and offer higher framerates for VR. The way fine detail starts to flicker out of existence at certain distances can become visible (sometimes in only one eye at a time, for a real headache) but it's generally very rare and a lot better than constant "army of ants" edge aliasing (especially once that goes through the inverse lens distortion, which makes it even more distracting than in 2D). As we get higher and higher res panels in VR, we will need to find a better solution than brute force (very high res super-sampled internal rendering) for cleaning object edges, and DLSS or TAA (without introducing significant latency) seems like the future (not just for 2D). I was also recently playing Battlefield V (with settings trying to hit a stable 4K60) and the TAA there caused significantly more issues with thin objects fading out of existence, so something in the TAA used here (with a decent 'TAA sharpness' slider that doesn't default to far more sharpening than anyone would want) feels like the best of what EA are doing.
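
To be clearer about what "behaving well" means here: the core of a typical TAA resolve is blending the current frame with a history sample reprojected via motion vectors, with that history clamped to the neighbourhood of the current pixel so stale data can't smear into ghosts. A toy, single-channel sketch of that shape (nothing like Frostbite's actual implementation, which works in YCoCg with variance clipping and a lot more besides):

// Toy single-channel TAA resolve for one pixel: blend reprojected history
// with the current frame, after clamping history to the 3x3 neighbourhood
// of the current pixel so stale samples can't ghost.
#include <algorithm>

float ResolveTAA(const float current3x3[9],  // current frame, centre at index 4
                 float history,              // history reprojected via motion vectors
                 float blendFactor = 0.1f)   // weight given to the new frame
{
    float lo = current3x3[0], hi = current3x3[0];
    for (int i = 1; i < 9; ++i) {
        lo = std::min(lo, current3x3[i]);
        hi = std::max(hi, current3x3[i]);
    }
    // Neighbourhood clamp: history outside the current frame's local range
    // (disocclusion, lighting change) gets pulled back into it.
    float clampedHistory = std::clamp(history, lo, hi);
    // Exponential blend: most of the weight stays on the (clamped) history,
    // which is what accumulates the effective super-sampling over time.
    float current = current3x3[4];
    return clampedHistory + (current - clampedHistory) * blendFactor;
}

That clamp is also exactly why thin detail flickers out: once a feature drops below a pixel wide, its accumulated history keeps getting rejected, so the extra effective samples never build up.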

I was definitely far more aware of polish issues than any aliasing flicker or eye discrepancy. Plenty of walk animations seemed not to have been set up to actually plant feet on the ground, so they ended up with very obvious foot skating. By no means is this just a tick-box "IK enabled" fix, but it's a lot closer to a solved problem than it was a decade-plus ago (when I was doing some light animation work for video games), which makes it slightly weird to see it not working correctly. Reflections on the black gloss floor of Imperial bases constantly showed cubemaps that were not well aligned with the static positions from which the player observes them (it seems like they could have generated enough static cubemaps, given you teleport between very few view locations and there's no free movement or room-scale VR, with the screen fading to black if you move around).
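
To illustrate why I say the maths itself is close to solved: the core of foot planting is a two-bone IK solve, which for a thigh/shin pair falls straight out of the law of cosines. A minimal 2D sketch (purely illustrative; the genuinely hard production work is deciding when to plant, blending back to the authored animation and coping with slopes without popping):

// Two-bone (thigh/shin) IK in 2D: given the hip position, segment lengths
// and a desired foot position, return hip and knee angles via the law of
// cosines. Knee angle is the interior angle (pi = leg fully straight).
#include <cmath>
#include <algorithm>

struct LegAngles { float hip; float knee; };  // radians

LegAngles SolveTwoBoneIK(float hipX, float hipY,
                         float footX, float footY,
                         float thighLen, float shinLen)
{
    float dx = footX - hipX, dy = footY - hipY;
    float dist = std::sqrt(dx * dx + dy * dy);
    // Clamp so an out-of-reach target just fully extends the leg.
    dist = std::clamp(dist, std::fabs(thighLen - shinLen) + 1e-4f,
                      thighLen + shinLen - 1e-4f);

    // Law of cosines for the knee's interior angle.
    float cosKnee = (thighLen * thighLen + shinLen * shinLen - dist * dist) /
                    (2.0f * thighLen * shinLen);
    float knee = std::acos(std::clamp(cosKnee, -1.0f, 1.0f));

    // Hip angle: direction towards the foot, plus the thigh's offset inside
    // the hip-knee-foot triangle.
    float cosHipOffset = (thighLen * thighLen + dist * dist - shinLen * shinLen) /
                         (2.0f * thighLen * dist);
    float hip = std::atan2(dy, dx) + std::acos(std::clamp(cosHipOffset, -1.0f, 1.0f));

    return { hip, knee };
}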

SuperHot VR

Now here is where I wish I knew how to really prod a game (maybe one of the Unity tweaker/console tools could provide some aid). This style would be perfect for MSAA, which would do far more for the edges than mild super-sampling (which wastes shader perf on repeatedly sampling inside basically flat-shaded triangles while still undersampling the edges, where the sharp contrast demands the best treatment). The game as shipped doesn't even seem to offer the post-AA (FXAA) that the non-VR games from this team have integrated. And you can't inject FXAA at the driver layer, because the inverse warp for the lenses will remove the clean aliased lines that the morphological pass is looking for (assuming the Nvidia driver doesn't detect VR titles and disable such tweaking entirely). While moving the Index scaling option clearly affected framerates, the aliasing never cleaned up significantly.
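
For contrast, what I'd want here is plain hardware MSAA: the pixel shader still runs once per pixel, but coverage and depth are tested at several sample positions, so triangle edges get smoothed without multiplying the shading cost across those flat interiors. At its most basic it's only a couple of calls - a sketch of the raw GL/GLFW setup, not how this Unity title would actually be configured:

// Requesting a 4x MSAA default framebuffer with GLFW and enabling it in GL.
// Sketch only - a VR title renders into its own offscreen eye textures, so
// there you'd create multisampled colour/depth attachments and resolve
// (glBlitFramebuffer) before submitting frames to the compositor.
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_SAMPLES, 4);  // ask for 4 coverage samples per pixel
    GLFWwindow* window = glfwCreateWindow(1280, 720, "MSAA sketch", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);
    glEnable(GL_MULTISAMPLE);  // shading stays per-pixel; coverage is per-sample

    while (!glfwWindowShouldClose(window)) {
        glClear(GL_COLOR_BUFFER_BIT);
        // ... draw the flat-shaded geometry here ...
        glfwSwapBuffers(window);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}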

The game itself is still just as fun as it was when first released but, even with tweakable internal res on PC, it's still a long way short of where I would hope it could get to visually (and it will likely not be getting any more updates that could add better anti-aliasing, as the team are almost finished with the third game in the series, which is not in VR, and will then probably move on to new things). Even a lot of brute-force super-sampling may only go so far towards fixing those incredibly sharp aliased edges accentuated by the game's style - it makes you wonder if something in the pipeline explodes beyond 8K rendering, so brute force will never be viable even with GPUs several generations out. You can definitely get immersed in the experience and have it bother you slightly less over time (especially if you push up the refresh rate so you're getting more temporal data rather than letting aliased frames linger, something faster GPUs certainly help with) but quite a few games I've sampled seem to have decided that AA, even a cheap post-AA pass before distortion, isn't in their performance budget and I really don't think that's paying off vs targeting a lower internal res with an AA method enabled. Of course, on PS VR you often had the combination of a low internal res and no AA, so at least on PC things are always less bad.

Tetris Effect

All the games I'm talking about this month provide a contrast of different techniques and rendering challenges. I talked about this one on PS VR several years ago. It was one of the best games of 2018 and the multiplayer mode is a nice addition in 2021 (but not really why I come to Lumines-style games) so it's still great today. The fidelity here is clearly better than on PS VR, although I found that super-sampling can push the framerate down below where it should be (even with only a 90Hz target rather than pushing towards 144Hz) without ever really making it feel like every sharp edge is anti-aliased (even in combination with the FXAA the game uses). Much of the amazing particle work doesn't need AA (despite the High setting defaulting to 150% super-sampling of the entire scene) and those semi-transparent particles probably cause major issues for any turn-key AA solution, so it's a shame someone hasn't built a more bespoke approach that merges the different techniques each element of the scene needs while maximising performance (to hit the high framerates and native resolution needed for this generation of VR headset).
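
It's worth keeping in mind what those percentages actually cost. As I understand it the SteamVR slider is area-based (I haven't confirmed whether the game's own High setting counts the same way), so a quick bit of arithmetic against the roughly-100% target SteamVR asks for on an Index shows how fast the pixel counts run away - the base figures below are approximations from my setup:

// Rough pixel-count arithmetic for super-sampling on an Index-class target.
// Assumes area-based percentages (how the SteamVR slider works, as far as
// I know); a per-axis 150% would cost 2.25x the pixels instead.
#include <cmath>
#include <cstdio>

int main() {
    const int baseW = 2016, baseH = 2240;      // per-eye, ~100% on Index (approx)
    const double scales[] = { 1.0, 1.5, 2.0 }; // 100%, 150%, 200% (area-based)
    for (double s : scales) {
        double axis = std::sqrt(s);            // area scale -> per-axis scale
        long w = std::lround(baseW * axis);
        long h = std::lround(baseH * axis);
        std::printf("%3.0f%%: %ldx%ld per eye = %.1f Mpix both eyes\n",
                    s * 100.0, w, h, 2.0 * w * h / 1e6);
    }
    return 0;
}

Under those assumptions 150% already lands around 13-14 megapixels per frame across both eyes, 90+ times a second, which is why the framerate sags well before every edge looks clean.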

As with Rez (which I have not yet tried on PC - I'm waiting for a sale to buy a second copy for a second system), there is something I find deeply pleasing about the audio-visual combination here, and the soundtrack shows off the built-in speakers on the Index when cranked all the way up. There is certainly not the same deep bass you'd get from a subwoofer (it would be interesting, if not ideal for those living in apartment complexes, to be able to feed the LFE channel to a separate audio device in a game that already bothers to enable a second rumble device for additional haptic feedback) but it's not bad. This isn't tinny (which is always the fear with something like small off-ear speakers) and is at least as good as a quality set of in-ear canalphones, but with the potential for better positional audio because it never feels like the sound is originating from inside your head.

I couldn't get the Index controls working exactly how I wanted (anything linked to the right analogue stick is locked out, despite the VR mode not using the right stick for anything, so I couldn't rebind it; for some reason the individual buttons & pressable surfaces on the Index controllers did not all seem to turn up in the menus, which seem designed assuming Vive or Oculus layouts and even recommend you not use those but rather plug in an Xbox controller) and this is a bit of a recurring theme in games I've poked at. The Index controllers are a bit of a variation on the Vive design, which changes the angle the game thinks "forward" is from them but also shuffles the inputs around, so you're sometimes left wondering exactly what the game is expecting when a Vive icon pops up. It's something that likely won't get ported back into older VR games, and hopefully Valve will provide free engineering time to assist VR developers in integrating prompts and defaults into their current or upcoming releases. This game is totally fine with a very old 360 controller (as long as you map things off the d-pad, because you can't drop and move Tetris pieces with a d-pad that poor at reading precise inputs) but I'd really like it to be pick-up-and-play with the Index controllers.
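
The frustrating part is that SteamVR's action-based input system exists precisely so a game never hard-codes "right stick" anywhere: the game declares named actions in a manifest and polls those, and the binding layer (plus community binding sharing) decides which physical control on whichever controller triggers them. A rough sketch of that shape, with an entirely made-up action path and manifest location:

// Shape of the SteamVR Input (action-based) flow: the game declares actions
// like "rotate_piece" in a JSON manifest and polls them; which physical
// button/stick fires them is decided by the binding layer, not the game.
// The action paths and manifest path here are made up for illustration.
#include <openvr.h>
#include <cstdio>

int main() {
    vr::EVRInitError err = vr::VRInitError_None;
    vr::VR_Init(&err, vr::VRApplication_Scene);
    if (err != vr::VRInitError_None) return 1;

    vr::VRInput()->SetActionManifestPath("C:/path/to/actions.json");

    vr::VRActionSetHandle_t gameplaySet = vr::k_ulInvalidActionSetHandle;
    vr::VRActionHandle_t rotatePiece = vr::k_ulInvalidActionHandle;
    vr::VRInput()->GetActionSetHandle("/actions/gameplay", &gameplaySet);
    vr::VRInput()->GetActionHandle("/actions/gameplay/in/rotate_piece", &rotatePiece);

    // Per frame: update the active action set, then read the digital action.
    vr::VRActiveActionSet_t active = {};
    active.ulActionSet = gameplaySet;
    vr::VRInput()->UpdateActionState(&active, sizeof(active), 1);

    vr::InputDigitalActionData_t data = {};
    vr::VRInput()->GetDigitalActionData(rotatePiece, &data, sizeof(data),
                                        vr::k_ulInvalidInputValueHandle);
    if (data.bActive && data.bChanged && data.bState)
        std::printf("rotate_piece pressed\n");

    vr::VR_Shutdown();
    return 0;
}

With that in place an Index-specific default binding is just data, which is why it stings when a game sticks to fixed Vive/Oculus layouts instead.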

Half-Life: Alyx

And here we reach the culmination of a lot of VR work. Valve created an updated version of their engine and built the next entry in the Half-Life series around the development of a new headset and controller, updated from their earlier cooperation on the Vive ecosystem. That is the Index headset and controllers I'm currently using. This is exactly the game you expect from a developer with infinite money and time (but seemingly far fewer developers than the studios that scaled out when AAA asset creation demanded it) to iterate on their previous design ethos: constant innovation during play. When I discussed Killzone: Shadow Fall in 2014, Half-Life 2 was the obvious title to compare it to when talking about combining narrative progression with first-person gameplay variety. And that's exactly what we get here in VR: a slow development of new tools and ways of interacting with the world that also slowly eases between several genres, from action to horror. Very early on, when you're introduced to the power of the 'gravity gloves' to point at an object and pull it towards you, it becomes obvious: "surely everyone should be doing this!" That's the Valve magic: making something that feels like the only answer, something everyone else must adopt because it so cleanly solves a problem (you don't want to have to physically, slowly move over to pick up every little thing while keeping up the pace of the action as you move through a game environment).

On the technical side, this engine is doing exactly what all the early best-practice notes (which came from engineers pushing VR, like the team at Valve) said you should do. Get back to forward rendering (using forward+ or similar clustered options if you want the many real-time light sources your deferred renderer was enabling at high framerates), go back to classic MSAA, and try to get a lot of pixels rendered while maintaining modern geometry and texture detail. Step back to less dynamic lighting if you have to - something HL2 was already excellent at mixing to hide just how much wasn't part of some unified real-time lighting solution. The end result is a very sharp image and something I fully expect to really sing on future hardware (both higher-res headsets than the Index and the future GPUs that can drive them at high resolution while hitting 144Hz natively). The only thing that feels extremely outdated is the level loads between sections, something a level streaming solution could surely have completely alleviated.
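
The forward+/clustered part of that advice boils down to binning lights into screen tiles (or 3D clusters) before shading, so the forward pass only loops over the handful of lights that actually touch each tile instead of everything in the scene. A simplified CPU-side sketch of the tile-binning idea (real clustered renderers also slice in depth and do this on the GPU; this is not a claim about how Source 2 does it):

// Simplified tiled light binning: project each light's bounding sphere to
// screen space and add its index to every tile it might touch, so the
// forward shading pass only iterates that tile's short list instead of
// every light in the scene.
#include <vector>
#include <algorithm>
#include <cstdint>

struct ScreenLight { float x, y, radiusPx; };  // already projected to pixels

struct TileGrid {
    int tileSize, tilesX, tilesY;
    std::vector<std::vector<uint32_t>> lightLists;  // one list per tile
};

TileGrid BinLights(const std::vector<ScreenLight>& lights,
                   int width, int height, int tileSize = 16)
{
    TileGrid grid;
    grid.tileSize = tileSize;
    grid.tilesX = (width + tileSize - 1) / tileSize;
    grid.tilesY = (height + tileSize - 1) / tileSize;
    grid.lightLists.resize(size_t(grid.tilesX) * grid.tilesY);

    for (uint32_t i = 0; i < lights.size(); ++i) {
        const ScreenLight& l = lights[i];
        int x0 = std::max(0, int((l.x - l.radiusPx) / tileSize));
        int x1 = std::min(grid.tilesX - 1, int((l.x + l.radiusPx) / tileSize));
        int y0 = std::max(0, int((l.y - l.radiusPx) / tileSize));
        int y1 = std::min(grid.tilesY - 1, int((l.y + l.radiusPx) / tileSize));
        for (int ty = y0; ty <= y1; ++ty)
            for (int tx = x0; tx <= x1; ++tx)
                grid.lightLists[size_t(ty) * grid.tilesX + tx].push_back(i);
    }
    return grid;
}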

As to how it looks on my older GPU driving the Index? At points it's a touch too sharp for me. The textures can crawl and alias a bit in spots and the edge anti-aliasing is good but not perfect. I'd prefer a softer output that manages to deal with shader aliasing, even if it might have more issues around transparencies and thin edges (Alyx leans on super-sampling for texture transparency here, the old classic we don't see so much of in 2021 but which really made the chain-link fences pop in games like Half-Life 2 back in 2005). But beyond some mild criticisms, it holds together really well. That's why I think it'll work very well in the future (selling an entirely new generation of headsets on PC and presumably even console). Unlike some of the other games, I think you could pump up the internal res and maybe integrate some VRS or even DLSS to boost output resolution without linearly increasing GPU load (spending your fidelity more smartly with VRS Tier 2, or simply letting AI magic clean up aliasing defects while chasing a fixed frame time with DLSS 2.1) and so remove those small criticisms without demanding a radically more powerful GPU.
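
The per-draw flavour of VRS in particular is cheap to wire up once the hardware tier is there. A sketch of how the D3D12 API exposes it (purely illustrative - not something Alyx ships, and Source 2 would have its own path): check the feature tier, then drop the shading rate for draws that can hide it.

// Per-draw variable rate shading in D3D12: check hardware support, then
// ask for coarser shading on draws that can hide it. Tier 2 adds
// screen-space shading-rate images on top of this per-draw control.
#include <windows.h>
#include <d3d12.h>

bool SupportsVRS(ID3D12Device* device, D3D12_VARIABLE_SHADING_RATE_TIER& tierOut)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &options6, sizeof(options6))))
        return false;
    tierOut = options6.VariableShadingRateTier;
    return tierOut != D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED;
}

void DrawWithCoarseShading(ID3D12GraphicsCommandList5* cmdList)
{
    // 2x2 coarse shading for draws that can take it (distant or
    // low-contrast geometry); nullptr keeps the default rate combiners.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    // ... record the cheap draws here ...
    // Back to full rate for everything that needs to stay sharp.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
}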
