
ClassicGamer592: This YouTube channel explains what's wrong with Unreal Engine 5 and modern graphics.

And many games from the late 2000s still look good today, like Killzone 2 running at native 720p with 2x MSAA on PS3. We didn't need blurry TAA, upscaling, or frame generation to get amazing graphics back then. Now we have games running below 720p on PS5, or worse, below 480p on Xbox Series S in some Unreal Engine 5 games!
This video is really detailed and digs into the complexity of the issue.
It's the kind of video people don't like to watch, so they fall back on the generalizations that caused this mess to begin with.

Generalized solutions don't work when you need something as precise as this, and his explanation is really deep.
Gotta be honest, I couldn't understand all of it.

But here is what I noticed (and I don't really think this is indie/AA devs' fault):

I saw a pattern of companies creating huge amounts of propaganda to get people to normalize and use specific products, products apparently made by the same people who produced that propaganda in the first place, so that those products become the default in the market for customers who supposedly have no access to better information.

As you explained about how 2000s games can still look good and run well today without all these upscaling technologies, I think it's all because back then most developers had to create their own engines and optimize them for the hardware of the time.

Nowadays I always hear people saying: "Yeah, we could optimize that, but now we don't need to because we have the technology to run it!" Yes, sure! Except only 5% of humanity will have the money to enjoy it. The others will, well, as always, complain that the product was not well made.

Dropping the "rant-like commentary", it's a difficult problem to solve:
While we could argue that devs should develop their own engines like in the past, this is by no means feasible for 8th-gen demands. For small games with simple mechanics and not many complex assets, maybe, but what about middle-sized dev companies producing AA games?

Those engines "empowered" (I've started hating this word because of how it's being used nowadays, but anyway... :P) developers to create AAA-like products, and it's amazing.

My mind rests on the idea that a middle-ground solution will be found where, like others have said in this thread, developers will learn to use those awesome tools correctly, rejecting the one-size-fits-all features that leave games unoptimized while embracing and developing, together with the community, technologies that produce better optimized games for last-gen and current-gen hardware.

There are other problems "below the surface" though.
But that would derail the talk to the point of needing its own thread, or even its own forum.

Thanks for sharing his channel and this amazing video!

idbeholdME: This is the core of the issue, yes. Time pressure and/or wanting to cheap out by not hiring high-level developers. It's much easier to hire a person and tell them "check these 2 boxes here and then do this" than to hire someone who can modify the underlying engine code as needed and cook up a customized/modified solution for the project in question. And most importantly, it's much easier and cheaper to find somebody able to work in a widely available engine tailored for large-scale projects than to initiate someone completely from scratch into your in-house engine. When a developer can start working on the project immediately, rather than having to go through a month-long initiation into the inner workings of your custom engine, it saves a lot of money. That is also a major part of why a lot of studios are dropping their engines and switching to Unreal: there is pretty much guaranteed to be a steady supply of hirable, ready-to-go devs. Maintaining and updating/developing a custom, studio-specific engine is extremely expensive.

But it's true UE focuses a lot on ease of use and may be pushing still basically experimental features as the go-to/definitive solution a little too fast. Nanite and Lumen don't have to be used and may not always be the best solution for a specific scenario. But that's a choice by the developer, not really the fault of the engine. The features were first introduced with the launch of UE5, a mere 2.5 years ago. Look at ray tracing, for example, and how much time passed before it slowly started becoming the norm.

Lastly, here are some highlights of the latest 5.5 version of the engine.
https://www.youtube.com/watch?v=BcmUZpdChhA
About this, I completely agree.
Here's an example of that from the comments on the video linked by @ClassicGamer592:

@unrealcreation07 - 3 months ago

10 years Unreal developer here. I've never really been into deep low-level rendering stuff, but I just discovered your channel and I'm glad your videos answered some long-lasting questions I had, like "why is this always ugly and/or blurry, no matter the options I select?? (or I drop to 20fps)".

With time, I have developed a kind of 6th sense about which checkbox will visually destroy my game, or which option I should uncheck to "fix" some horrible glitches... but I still don't know why most of the time. In many cases, I just resign and have to choose which scenario I prefer being ugly, as I can never get a nice result in all situations. And it's kind of frustrating to have to constantly choose between very imperfect solutions or workarounds. I really hope you'll make standards change!
We can't say this happens most of the time, but it's definitely happening.
Not even the developer knows what he is doing, and it kinda breaks my heart.
He's just following generalized criteria based on his own senses, or on his project superiors.
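To make this concrete: most of those "checkboxes" are just console variables under the hood, and they can be set explicitly in code instead of being left to whatever the editor defaults to. Here's a minimal sketch, assuming a UE5 C++ project; the CVar names below are the commonly documented rendering ones, but their exact values and defaults vary between engine versions, so treat it as illustrative rather than authoritative:

```cpp
// Rough sketch (UE5 C++): treating the rendering "checkboxes" as explicit
// console variables. CVar names/values are the commonly documented ones and
// may differ between engine versions, so verify against your own version.
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

static void ApplyLeanRenderingDefaults()
{
    struct FCVarSetting { const TCHAR* Name; int32 Value; };
    const FCVarSetting Settings[] =
    {
        { TEXT("r.DynamicGlobalIlluminationMethod"), 0 }, // assumption: 0 = no Lumen GI, 1 = Lumen
        { TEXT("r.ReflectionMethod"),                2 }, // assumption: 2 = screen-space reflections instead of Lumen
        { TEXT("r.AntiAliasingMethod"),              2 }, // assumption: 2 = plain TAA instead of TSR (4)
    };

    for (const FCVarSetting& Setting : Settings)
    {
        if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Setting.Name))
        {
            CVar->Set(Setting.Value, ECVF_SetByCode);
        }
        else
        {
            UE_LOG(LogTemp, Warning, TEXT("CVar %s not found in this engine version"), Setting.Name);
        }
    }

    // Render at native resolution, i.e. no upscaling (this CVar is a float).
    if (IConsoleVariable* ScreenPercentage =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.ScreenPercentage")))
    {
        ScreenPercentage->Set(100.0f, ECVF_SetByCode);
    }
}
```

Whether these particular trade-offs are the right ones obviously depends on the project; the point is only that the switches are inspectable and explicit, not magic.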

Post edited 7 hours ago by .Keys
botan9386: Either that, or developers are not as good as they used to be at optimisation. Some of my favourites still look and run great and have tiny file sizes compared to modern titles.
Both. UE5's appallingly bad optimisation (and developers' complete lack of competence / laziness) is disproportionately worse than before. E.g., wind back the clock a decade to the UE3 era of the early 2010s: games like Alien Isolation, BioShock Infinite and Dishonored actually ran very well at 1080p on High settings on low-end GPUs of the same era (e.g. GTX 750 Ti, HD 7790) that cost £100-£150, and only dipped below 60fps on Ultra. Back then, turning down settings from Ultra to Low also often doubled the frame-rate, because developers put in the play-testing effort to "disable the settings with the largest performance hit but the smallest image quality impact first".

Today you can blow £300-£400 on a "budget" GPU and still get 40-50fps at 1080p, and turn down the settings only for the game to look worse rather than run faster, only to be met with "upscaling is the new norm!" gaslighting. Compare how a shiny new UE5 2024 AAA game looks on Low using 9GB of VRAM at 1080p next to a 2014 indie game on High using 1.5GB of VRAM at 1440p and conclude: we are very definitely on a "post-peak developer competence" downslope at the moment...
Post edited 3 hours ago by AB2012
P. Zimerickus: I never managed to get past a total system load of 550 W myself, and that's with a 450 W GPU.
I also rarely use more than 170 W when gaming (GPU only, according to NVIDIA...)

The power consumption is rising noticeably though. I can't run Cities: Skylines II under 200 W with medium-low settings and the Performance DLSS option... 30 fps at the moment. I had actually gotten used to an average of 140 W in previous years but, as you just read, that figure has increased by almost 30 W.
Palestine: P. Zimerickus, are these wattage figures obtained via software, or from the wall outlet (using something akin to a 'Kill A Watt')?
The latter is far more accurate.
It is difficult to give an accurate figure, I guess... I mean, these are only measurements taken from my card... but from my experience with NVIDIA I'd assume that for a reasonable 60 fps at 2K the numbers probably won't differ too much whether you're running a 3090 Ti, a 4090 or the coming 5090... That always manages to confuse me a lot. Like, why does a card that is sold as four times more powerful end up with similar behaviour in fps-locked or power-locked situations??? You tell me!
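For what it's worth, the number the NVIDIA software reports is just the board power the driver exposes through NVML (the same thing nvidia-smi reads out), so it never includes CPU, RAM, fans or PSU losses, which is part of why a wall meter always shows more. A minimal sketch of where that figure comes from, in plain C++ against the NVML library, assuming the NVIDIA driver and its headers are installed and the card is device index 0:

```cpp
// Minimal sketch: read the GPU board power the NVIDIA driver reports via NVML.
// This is the card only, not CPU/RAM/PSU losses, so wall-outlet readings are higher.
// Assumes the NVIDIA driver is installed; build with: g++ power.cpp -lnvidia-ml
#include <nvml.h>
#include <cstdio>

int main()
{
    if (nvmlInit() != NVML_SUCCESS)
    {
        std::fprintf(stderr, "NVML init failed (is the NVIDIA driver installed?)\n");
        return 1;
    }

    nvmlDevice_t device;
    if (nvmlDeviceGetHandleByIndex(0, &device) == NVML_SUCCESS)
    {
        unsigned int milliwatts = 0;
        if (nvmlDeviceGetPowerUsage(device, &milliwatts) == NVML_SUCCESS)
        {
            // NVML reports board power in milliwatts.
            std::printf("GPU board power: %.1f W\n", milliwatts / 1000.0);
        }
    }

    nvmlShutdown();
    return 0;
}
```

As far as I know, the overlays that show a live wattage figure are basically just polling that same call once a second.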
Developers are coding like improv groups; you need to see these monkeys code.
I'm 42!! I am still coming up with new and faster ways to do math and draw visuals.
We have all this hardware that optimizes data flow and has advanced methods.

Nearly every system has an NPU... how many games have used it so far?
Every NVIDIA GPU has PhysX; no one uses it. "Maybe because NVIDIA didn't make it less annoying to use..."
Games could run at twice their speed... but that does not sell hardware... so it's crime for crime..

The days of developers brute-forcing their code to waste no cycle... are dead..
Post edited 3 hours ago by XeonicDevil
The most impressive engine overall at the moment is probably Decima, the one used in Horizon Zero Dawn and the other titles from that series. Apart from Kojima, only Guerrilla Games is using this engine.

UE5 is... simply a new "industry standard" able to provide next-gen graphics and certainly high flexibility... but it is a big performance hog. This is the true downside of this engine, which will need some improvements, or else it will turn into a "hardware war"; GPUs above the lower midrange (basically above Intel's Arc series) have already been "luxury hardware" for quite some time.

Gamers can check out the price of a PS5 Pro, which is one of the most expensive consoles ever made... yet this is what it takes to have sufficient hardware for really demanding engines. The price will probably not come down... so I guess it is safe to say this will be the new price tag for "capable" hardware in the future.

Sure, the new Intel Arc is a "fresh piece of affordable GPU", but let's be honest... it is not a GPU you can run 4K and/or nearly maxed-out graphics with. The minimum would be the "category" I have been using for more than 2 years already, which sits around the 3090 Ti / 7900 XTX and 4070 Ti+.

Sure, even the Decima engine works the PS5 Pro hard on the newest Horizon Zero Dawn remaster, but much of what I was able to see there on the PS5 Pro was really impressive... in many cases it looked better than, for example, "Stalker 2", which is a UE5 game.
Post edited 2 hours ago by Xeshra
Palestine: In my opinion, the price is not even a concern; it is the energy consumption of new GPUs. It is beyond ridiculous that it is now regarded as the norm for dedicated graphics cards to have a TDP of over 200 watts... some as high as 450 watts (uncertain of the true consumption during full utilization). A few years prior, I made the decision to limit myself to only purchasing relatively efficient APUs (Accelerated Processing Units), which, fortunately, are now able to compete with (or, at the very least, match the performance of) some decent dedicated graphics cards.
Since I play only on (gaming) laptops, my solution to that problem is to play AAA games many years after their release, when mobile GPUs have caught up and are able to run those games satisfactorily. Plus, of course, the games themselves might have received some performance optimizations in the meantime as well.

This has worked so far, but unfortunately now they are saying Moore's law (and Moore himself) is dead, so maybe we will need that 450 W GPU to play The Witcher 4 even in 2050 (provided it is released before then)?

Maybe we should at least leave x86 behind and start using more energy-efficient ARM-based CPUs, like Snapdragon? That's a start. I'm not sure if NVIDIA GPUs have similar old backwards-compatibility baggage dragging them down as well, or whether they are already as optimized and power-efficient as GPUs can be.
Post edited 33 minutes ago by timppu