Stalker 2 has me reflecting on whether making Epic Games’ Unreal Engine 5 the industry standard for the next decade was a mistake.
Of course, GSC Game World’s big comeback FPS/survival game is the ‘worst case scenario’ of a UE5-based title being broken at launch, but I’ve played my fair share of UE5 games in the four years since this generation began, and maybe Epic’s powerful engine isn’t the great fit for everyone that the early demos and The Matrix Awakens made it out to be.
Some context before my rant: I don’t know much about game engines, programming, 3D modeling, and the like. I’ve tinkered with Bethesda’s Creation Engine, and that’s about it. I’ve never been very interested in working with specialized software and tools beyond the ‘power user’ level. However, for obvious reasons, I’m very curious about all the processes and tasks that go into creating video games and interactive experiences of all kinds.
Of course, I’m also someone who plays way more games than I should in a year, which is a great way to start noticing both the good and the bad. Combine that with self-taught hardware knowledge and OS tinkering, and as a consumer you start to understand performance issues beyond just saying ‘this sucks’ and asking for a refund (which I encourage everyone to do more often).
Anyway, remember the big Fortnite update in late 2022 that ported everything to UE5.1 to take advantage of Nanite, Lumen, and the rest? After years of promoting the new engine while developers squeezed the last drops out of PS4 and Xbox One, it felt like a big win for Epic’s new technology: everyone and their mother could experience UE5’s shiny, stunning new visuals for free, in a full-featured AAA online game.
Naturally, things didn’t go as planned. The average Fortnite pro has been playing at the lowest possible settings for years to maximize their K/D ratio, and those of us packing sturdy enough hardware noticed more stuttering and overall poorer performance than with the latest versions of UE4. It wasn’t worth the hassle. Two years later, the situation hasn’t changed much: jump straight into a Fortnite match after updating your drivers or the game itself and you won’t perform well, because the shaders have to be rebuilt from scratch and all that heavy lifting happens mid-match. It’s not ideal.
For those of you who don’t know, the important thing about shaders is that each hardware configuration needs its own compiled versions ready before they’re used. Consoles are fixed hardware, so shaders can be compiled ahead of time; that’s why they aren’t affected by these issues, and why modern games on PC can be a bit rough these days, at least until your PC gets ‘used to’ the latest AA/AAA monsters. Different engines (and developers) handle this in different ways. In the case of UE5, the ‘stutter struggle’ is very real, especially when traversing huge levels/worlds, and the lack of a proper shader-compilation step at launch in some titles only makes things worse.
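To make the mechanism concrete, here’s a minimal, self-contained sketch of the idea. To be clear, PsoCache, CompileShader, and the 50 ms compile cost are all made up for illustration, none of this is Unreal’s actual API; the point is simply that compiling a shader the first time it’s needed stalls the frame, while a precompile pass moves that cost to a loading screen.

```cpp
#include <chrono>
#include <iostream>
#include <string>
#include <thread>
#include <unordered_map>

// Illustrative sketch only: these names and timings are invented for
// this example and do not reflect any real engine's API.
struct CompiledShader { std::string bytecode; };

static std::unordered_map<std::string, CompiledShader> PsoCache;

static CompiledShader CompileShader(const std::string& source) {
    // Stand-in for the expensive driver-side compile that causes the hitch.
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
    return {"<native code for " + source + ">"};
}

static const CompiledShader& GetShader(const std::string& key) {
    auto it = PsoCache.find(key);
    if (it == PsoCache.end()) {
        // First sighting of this material/state combo: compile on the spot.
        // If this happens mid-frame, the player sees it as a stutter.
        it = PsoCache.emplace(key, CompileShader(key)).first;
    }
    return it->second;
}

int main() {
    // A loading-screen precompile pass pays the cost up front...
    for (const std::string key : {"rock_wet", "foliage_lit", "water_lumen"})
        GetShader(key);

    // ...so during gameplay every lookup hits the fast path.
    const auto t0 = std::chrono::steady_clock::now();
    GetShader("rock_wet");
    const auto us = std::chrono::duration_cast<std::chrono::microseconds>(
        std::chrono::steady_clock::now() - t0).count();
    std::cout << "cached lookup took " << us << " us\n";
}
```

Unreal does ship machinery for this (PSO precaching got a big push in 5.1, as I understand it), but actually warming the cache before gameplay is still on each developer, and that seems to be exactly where so many releases fall down.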
Even when UE5 is used reasonably well (see Remnant 2), with Lumen and Nanite adding lighting and a level of granular detail to scenes that would have seemed impossible just a few years ago, the resulting performance hit is hard to justify, even on expensive PC hardware. It’s simply not worth it for the average gamer who wants a smooth, painless experience, especially in stressful, demanding games.
What solution took over the industry much faster than UE5? Aggressive AI-based upscaling and frame generation. Both AMD and Nvidia have it covered, with the latter locking its (more powerful) take on the technology behind the 40 series and beyond. Now that developers can conjure frames out of thin air, graphical fidelity feels like it’s advancing faster than the hardware that actually runs it, ray tracing and all that jazz included. The end result? Most big studios are trying to be 2007 Crytek, pushing for outrageous visual quality that current hardware can only deliver at high frame rates thanks to DLSS, FSR, and the like.
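To put rough numbers on why this took over so quickly: the GPU only shades pixels at the internal resolution and the upscaler reconstructs the rest, so the savings grow with the square of the scale factor. A quick back-of-the-envelope sketch (the mode names and per-axis scale factors, roughly 0.667 for Quality and 0.5 for Performance, are the commonly cited DLSS figures, used here as ballpark assumptions):

```cpp
#include <cstdio>

int main() {
    // Per-axis render scales commonly cited for DLSS-style quality modes;
    // treat these as ballpark figures, not spec values.
    const double outW = 3840.0, outH = 2160.0; // 4K output
    const struct { const char* name; double scale; } modes[] = {
        {"Native", 1.0}, {"Quality", 0.667}, {"Performance", 0.5},
    };
    for (const auto& m : modes) {
        const double w = outW * m.scale, h = outH * m.scale;
        std::printf("%-12s %4.0f x %4.0f internal (%5.1f%% of native shading)\n",
                    m.name, w, h, 100.0 * (w * h) / (outW * outH));
    }
}
```

At 4K, Performance mode shades only a quarter of the native pixels, and frame generation then roughly doubles the presented frame rate on top of that. Hence ‘frames out of thin air’.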
And you know what? I think the technology works pretty well and shows more promise with every passing year. I love how my 4070 Ti conjures frames into existence. But even with all that nice boost, some games remain a janky, blocky mess, and I’m not happy to say UE5 continues to be the worst offender. A good recent example is 2023’s ill-fated Immortals of Aveum, which remains largely broken on a technical level to this day, even after several patches; despite FSR 3 and DLSS 3 support, it’s unstable and prone to crashes. The same goes for other 2023 games, like the surprisingly decent Lords of the Fallen reboot. It has gained a sizable following, but it still hitches for split seconds regardless of settings and remains rough on console.
These are just a few examples, but you can see the pattern. If you don’t believe me, let the fine folks at Digital Foundry convince you with more data and deeper research than I can pretend to provide. Or, if you’re an avid gamer, make a list of the games you’ve played recently and count how many drew complaints about their performance. The biggest exception, at least in my experience, has been Hellblade 2 (not surprising considering how much time Ninja Theory devoted to its audiovisual presentation). The game was incredibly smooth and stutter-free, save for some crashes on certain PC setups around launch. But Hellblade 2 is also a very linear game, so make of that what you will.
Looking to the future, with UE5-based behemoths like the next-gen Witcher, Mass Effect, and Star Wars Jedi entries getting closer by the day, I can’t help but worry about how many major studios have abandoned their own technology to rely on Epic’s engine, which has so far felt quite underwhelming in real-world use outside of polished tech demos and projects where huge amounts of time and resources were poured into ironing out its problems.
Stalker 2 may have been the straw that broke the camel’s back, and GSC Game World deserves plenty of blame for not putting in the extra time, but I can’t help thinking about how smoothly Dragon Age: The Veilguard runs across a very wide range of hardware, on an engine (Frostbite) that not long ago was considered unsuitable for anything other than an FPS. Have we all been fooled by Tim Sweeney again? Uh-oh…