The Decline of PC Gaming
The PC gaming landscape has changed, and its not pretty (literally).
When the Nvidia RTX series of graphics cards was introduced, Nvidia overpromised and underdelivered on many of its features. Ray tracing is one of them: it turned out to be too demanding for any of the 20-series cards to handle effectively.
Another feature introduced was DLSS, an AI upscaling tool that games could use to boost performance: the game renders at a lower resolution, and a real-time upscaling model reconstructs an image closer to the target resolution. Initially it performed poorly and many had their doubts. DLSS 1 was the first version of the tech, and barely any games supported it, because the model had to be trained on each game's own graphics to produce results that could be deemed acceptable. This demanded a significant amount of developer time and simply turned many away.
Then came DLSS 2, a much more effective solution that did not require manual training for individual games. Instead, Nvidia created a universal model that worked across all games and integrated directly into the engine's rendering pipeline. Like version 1, it ran on all RTX cards, but it produced much higher performance gains along with better image quality. Initially pushed as a way for lower-end cards to reach playable framerates, or for higher-end ones to reach demanding resolution targets such as 4K, it was adopted by almost every new AAA game.
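The arithmetic behind upscaling is worth spelling out, because it explains why the performance gains are so large. A minimal sketch, using the commonly cited per-mode scale factors as an assumption (treat them as illustrative, not as Nvidia's official numbers):

```python
# Sketch of how an upscaler's input resolution relates to the target
# resolution. The per-axis scale factors below are the commonly cited
# DLSS mode values -- assumptions for illustration, not authoritative.
MODE_SCALE = {
    "quality": 2 / 3,         # ~0.667 per axis
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def render_resolution(target_w, target_h, mode):
    """Resolution the game actually renders before upscaling."""
    s = MODE_SCALE[mode]
    return round(target_w * s), round(target_h * s)

def pixel_savings(target_w, target_h, mode):
    """Fraction of pixels the GPU no longer shades each frame."""
    rw, rh = render_resolution(target_w, target_h, mode)
    return 1 - (rw * rh) / (target_w * target_h)

# A 4K target in "performance" mode renders internally at just 1080p:
print(render_resolution(3840, 2160, "performance"))        # (1920, 1080)
print(f"{pixel_savings(3840, 2160, 'performance'):.0%}")   # 75%
```

Shading a quarter of the pixels is where the headline FPS gains come from; the upscaling model then has to reconstruct everything the renderer skipped.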
At the end of 2020, the year DLSS 2 was released, Sony released the PS5, the successor to the PS4. Unlike Nvidia's offering, the PS5 did not include any dedicated upscaling hardware.
You're probably wondering: why is the PS5 relevant in this story? Well, let's explore how the combination of AI and the PS5 ultimately caused PC gaming to suffer, immensely.
Memory
The PS5 differs from the PS4 in that its system memory is shared between the CPU and GPU. This means games are not strictly limited by the memory of a particular component; they can instead dynamically allocate between the two based on the engine's needs. This works similarly to Apple Silicon chips with their unified memory architecture. PCs do not have this luxury, with DRAM and VRAM physically separated between the motherboard and the GPU.
Integrated I/O
Yes, this is a PS5 marketing term, but it's important. By integrated I/O, I am mostly interested in the PS5's storage solution. The console's unique architecture and memory handling allow games to read from the SSD at incredible speeds, which let games built with the console in mind load gigantic worlds in only a few seconds, compared to the HDD in the older console. Ratchet & Clank: Rift Apart is one game that showcases this tech really well.
Well, the PC must have no trouble with this at all, right? Right? Wrong. Even our NVMe SSDs with PCIe 5 and blazing-fast read and write speeds cannot inherently achieve the response times an integrated storage solution provides. This is exactly what happened when the console launched, and for a few years the PS5 had the upper hand in this regard. Windows' system APIs had no way to read from the SSD directly into the GPU, where the data was needed most. Eventually, Microsoft released DirectStorage, which let developers get much closer to the PS5's storage read speeds.
While this setback was eventually solved with DirectStorage, the others were not.
AI Upscaling
The phrase "AI" has been thrown around so many times that its significance as a marketing term is dying out. Nevertheless, it has played an important part in destroying the performance of new AAA games on PC, especially ports from the PS5 and Xbox. DLSS is now at version 3 as of this post, which introduced Frame Generation. This is the bane of image optimisation for me: having the GPU create fake frames to compensate for lower performance.
I knew that when DLSS 2, and especially frame generation, was introduced, developers would get lazy. Unfortunately, this turned out to be more than true. Alongside DLSS, AMD's FSR was an alternative that ran on all GPUs, not just Nvidia's. Even Intel came out with XeSS. With frame generation, developers can get away with even less optimisation by having the game run at a lower resolution while also using the GPU to generate fake frames to boost FPS. Now, I can totally see the use case for the image upscaling part, but the fake frames are too much.
Frame generation isn't even being adopted properly, with some games recommending it be enabled when running below 60fps, going against Nvidia's own guidance, because of course they would. Not only does it increase input latency, it can produce an extremely smeary and ugly image in motion.
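The latency cost has a simple explanation: an interpolation-based generator needs the frame *after* the one it is inventing, so it must hold the newest real frame back for roughly one real frame-time before display. A rough sketch of that arithmetic, under that interpolation assumption (numbers are illustrative, not measurements of any specific implementation):

```python
# Why interpolation-based frame generation adds latency: the generator
# interpolates between two real frames, so the newest real frame is
# buffered for about one real frame-time before it can be shown.
# Assumption: pure interpolation, no other pipeline overhead counted.

def frame_time_ms(fps):
    return 1000 / fps

def added_latency_ms(real_fps):
    """Extra display delay from buffering one real frame, at minimum."""
    return frame_time_ms(real_fps)

# The lower the real frame rate, the bigger the penalty -- which is
# exactly why enabling it below 60 real fps is the worst case.
for real_fps in (30, 60, 120):
    print(f"{real_fps:>3} real fps -> at least +{added_latency_ms(real_fps):.1f} ms")
```

At 30 real fps (doubled to a "60fps" counter), that is roughly 33 ms of extra delay on top of the game's normal input latency, whereas at 120 real fps it shrinks to around 8 ms.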
So, with practically every graphics card capable of running some sort of AI upscaling tech, what did developers do? That's right: they made AI upscaling part of their optimisation. In my opinion, this pretty much instantly killed any chance for games to run cleanly and efficiently.
Unreal Engine 5
This next-gen engine that is supposed to produce next-gen visuals instead produces last-gen performance. Because the engine runs on DX12 on Windows, games need to account for the shader compilation that must happen for them to run smoothly. Do games do this and provide a pre-compilation step at start-up? Rarely. Even games that include it don't catch everything, and the game still stutters constantly.
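The mechanism behind those stutters is simple: the first use of each shader combination pays a compile cost, and unless that cost was paid up front at start-up, it lands mid-gameplay as a hitch. A toy sketch of the idea (the class, shader names, and timings here are all made up for illustration):

```python
# Toy sketch of why a shader pre-compilation step matters: compiling a
# shader combination is slow the first time, instant once cached. If the
# start-up pre-pass misses a combination, its compile cost lands as a
# mid-game hitch instead. Names and costs are illustrative only.
import time

class ShaderCache:
    def __init__(self):
        self._compiled = set()

    def compile(self, shader_id, cost_s=0.0):
        """Compile once; repeat requests hit the cache with no delay."""
        if shader_id not in self._compiled:
            time.sleep(cost_s)          # stand-in for real compile work
            self._compiled.add(shader_id)

    def is_warm(self, shader_id):
        return shader_id in self._compiled

cache = ShaderCache()

# Start-up pre-compilation: pay the cost up front, off the critical path.
for shader in ("terrain", "water", "foliage"):
    cache.compile(shader)

# Mid-game: pre-passed shaders are warm (no hitch); anything the pass
# missed still compiles on first use -- that is the stutter players feel.
print(cache.is_warm("water"))      # True
print(cache.is_warm("explosion"))  # False
```

This is also why even games with a pre-compilation step can still stutter: the pre-pass has to enumerate every shader combination the game will ever use, and in practice it misses some.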
Developers are not optimising their games as much as before, instead brute-forcing performance through AI upscaling, frame generation, or dynamic resolution.
The result
We are seeing more and more current-generation games release with terrible optimisation. Leaning on these tools to prop up performance has caused visuals to take a hit: DLAA aside, games running AI upscaling look blurrier than a native image. Companies are trying to normalise running games at low resolutions as "enough" optimisation, while using upscaling to hide the fact that they are just coded like crap. Let's take a look at a few examples.
Final Fantasy XVI
Oh wow, this one really hurt. You know, I was waiting it out thinking that it would get better but no, it really runs like shit.
FFXVI recommends an RTX 2080 for 1080p 60fps. You might be thinking, well, that's not too bad, it's quite reasonable. No, it's not. The 2080 was a top-of-the-line card made for 1440p. And that's not all: many benchmarks have shown that the card does not even hold 60fps properly at 1080p. That's right, a top-of-the-line card cannot perform well in this game.
For a smooth 60fps experience at 1440p, the recommendation is an RTX 4070 Ti Super. That is completely unacceptable for how the game looks. The visuals are nice, but they are really not next-gen, and they do not justify the extremely high GPU requirements. As a result, sales of the PC version have been poor; only users with a high-end GPU can play the game.
Monster Hunter Wilds
For this game, the recommended GPU can only produce 1080p at 60fps with FRAME GENERATION! That means both image upscaling and frame generation are recommended just to reach 1080p 60fps. The game also seems to suffer from really bad optimisation all around: it is resource-intensive on everything, from the CPU to the GPU to VRAM. On top of that, the visuals are not impressive at all. It should not require image upscaling and frame generation to run well.
Valheim
I was really rooting for this game, having only tried it a few days ago, but unfortunately it runs extremely poorly. It is made by an indie studio, and its visuals are kind of a hybrid between Minecraft and Terraria, which looks really interesting.
Somehow, despite the quite simplistic graphics, I can't even achieve 120fps on an RTX 2060 at 1440p. Are you serious? This isn't supposed to be a graphically demanding game, and the fact that it runs worse than Minecraft with shaders is so disappointing. This is unacceptable, even for an early access game.
The future
Unfortunately, while AI image upscalers were a good idea, they have allowed the gaming industry to get lazy and produce shitty games for the PC platform. The future does not look good, and unless developers start optimising their games properly, this type of performance is going to become standard. How disappointing...
Hopes for the future
We have started to plateau on the graphical fidelity of games. Unreal Engine 5 has pushed graphics incredibly far, and the team behind it has done amazing work bringing this technology to more developers and smaller studios. Even so, it is crucial that developers understand: even if you make the most graphically impressive game, you must put in the effort to optimise it correctly, or you risk cutting off a massive portion of your player base on PC.