avatar
Xeshra: Hmmm? Ah yes, they got a bit above a 40% income increase over 15 years, but during that time living expenses most likely went up well above 50%... sometimes as high as 100%. So the reality is... the "average" earner has been robbed by inflation, because the higher salary cannot "keep up".

Usually they use the median value, so they can get rid of the multi-millionaires or even billionaires. If we threw the wealthiest into the same pot, the average would become way higher.

Usually I do not use the word "average", because this is pretty much a class-related issue, and we have to focus on those classes in order to get the most truthful maths. The reality is that humans are not equal, so any maths has to take this into account.

Sure, we already know how it works with the classes: GOG users, for example, are "lower middle class or second class". Steam is somewhat middle class, or a bit above second class. No customer is actually first class... for that to happen we may have to "stand up" for our rights a lot more.
Yeah, but isn't that why they use these calculations, based on how atoms move etc.? It is not really possible to tell the whole truth, but this figure is, according to the Dutch government, the best way to describe it.
The median is more truthful than the average, but even better is to split people into different classes and then take the average of every class. That would reveal the biggest... and indeed sometimes shocking... truth. We would then clearly see where the coins flow and, in general, who is being hit hardest by inflation.

The median is still based on the common assumption that humans are more or less equal, "with just a few crazy hogs who should be thrown out of any comparison", which in economic and many other respects is far from true. It is simply closer to the truth than a plain average, indeed.
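The effect described above is easy to demonstrate with a tiny sketch (the salary figures are made up for illustration):

```python
from statistics import mean, median

# Hypothetical yearly incomes: nine ordinary earners and one billionaire.
incomes = [30_000, 32_000, 35_000, 38_000, 40_000,
           42_000, 45_000, 48_000, 52_000, 1_000_000_000]

print(f"mean:   {mean(incomes):,.0f}")    # dragged to ~100 million by one outlier
print(f"median: {median(incomes):,.0f}")  # stays at 41,000, near the typical earner
```

One outlier is enough to make the mean meaningless for "typical" income, while the median barely moves.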

Oh, besides, here is a pretty reasonable name tag for those frame "improvements":

Previous FSR = pretend it does not exist... best choice!
-FSR 3.1 = 3 fake pixels out of 1 true pixel
-FSR 4.0 = 4 fake pixels out of no real pixel

Previous DLSS = pretend it does not exist... best choice!
-DLSS2 = 1 fake frame out of a real frame
-DLSS3 = 2 fake frames out of a real frame
-DLSS4 = 3 fake frames out of a real frame

XeSS = Upscaling
XeSS2 = Upscaling + frame generation
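Taken at face value, the frame-generation ratios in the list are simple multipliers on the rendered frame rate; a minimal sketch (the function name and the 30 FPS base are mine, for illustration):

```python
# With frame generation, each rendered ("real") frame is accompanied by N
# generated ("fake") frames, so the displayed rate is a simple multiplier.

def displayed_fps(rendered_fps, generated_per_real):
    return rendered_fps * (1 + generated_per_real)

print(displayed_fps(30, 1))  # 60  -- "1 fake frame out of a real frame" (2x)
print(displayed_fps(30, 3))  # 120 -- "3 fake frames out of a real frame" (4x)
```

Note that the rendered rate, and hence the input latency, stays the same; only the displayed rate is multiplied.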
Post edited January 11, 2025 by Xeshra
The "BRICK" grows ever larger (on the non-Founders versions):
https://www.youtube.com/watch?v=hrwrwY2_DHw

It's getting comical at this point.
Post edited January 12, 2025 by idbeholdME
After watching GamersNexus, Hardware Unboxed, Vex and Dan Owen's takes on it, it's apparent Nvidia are a big joke now. They've become a software company selling visual cheats because they can't sustain 60 FPS without some kind of crutch in Unreal Engine 5 games (Unreal Engine 5 is also turning out to be a poorly optimized piece of crap engine).

Also Digital Foundry are clearly shills for nVidia's mind games.

Long live the GeForce 9800 GTX and RX 6800.
My choice is definitely the liquid-cooled Suprim from MSI. I would not get any other design anymore on cards with such a big TDP. There is a valuable board and other parts inside, and a non-liquid-cooled design simply gets too hot. Of course, the radiator should be mounted at the top and the CPU liquid cooler at the front. A gaming CPU is luckily pretty lean on heat, but better cooling ultimately means more performance and/or higher stability... and, as always, a longer lifetime for valuable parts.

Anyway... yes... prices are huge, and if anyone wants to go insane, I suggest watching the market for a deal that has not yet left the earth's atmosphere. Such deals will be rare for sure...

For example, the old 4090 flagship with nearly the same design (just smaller and with one fan fewer) usually sat around 2000 coins; only for a very short period did the price drop to around 1700... and the highest price was over 3000.

As for the other cards... as already said: as long as someone does not enjoy frame generation and AI, it seems to be a pretty minor upgrade vs. many other cards in the same range.

Some more info in this video. The Suprim from MSI is basically a "Founders Edition" PCB but with a liquid-cooled design. It was already the same for the old flagships (3090 Ti, 4090), including the Gaming X Trio (which was basically a Suprim PCB with two fewer power stages). Regarding other manufacturers I cannot say much; I did not check them out properly.

https://www.youtube.com/watch?v=WlND7pFg2pk
Post edited January 12, 2025 by Xeshra
The 9070 XT's performance is now pretty clear according to some leaked sources. Its RT + native performance is comparable to a 4070 Ti Super, which is more or less on par with the 7900 XTX and 3090 Ti.

Ah yes, native 4K is not possible on any of those cards, the FPS is just too low (around 30 or less), but they are all good for 1080p up to 1440p (2K), so that's why I say "it is a 1080p up to 1440p card" for native gamers. Anyone using "magic" may obviously get way better performance, but at the cost of quality in many respects (which is surely not the reason we go 4K). The 5090 seems to be the only card able to run ANY game at native 4K while keeping up 60 FPS.

https://videocardz.com/newz/alleged-amd-radeon-rx-9070-xt-performance-in-cyberpunk-2077-and-black-myth-wukong-leaked

The RT cores are at approximately 80-100 TFLOPS for those GPUs, with comparable rasterization performance.

From what I can tell, the PS5 Pro GPU, which still uses RDNA 3 cores for rasterization, is around the performance of a 4070 (non-Ti) or 7900 GRE overall, with a slight edge in RT because those cores are RDNA 4 spec and most likely in the same range as the 9070/9070 XT and ultimately the 4070 Ti Super, 7900 XTX and 3090 Ti. Obviously the 3090 Ti has by far the lowest efficiency of those, yet it is still impressive that it can "match" so many new midrange cards after all these years. The 9070 XT's efficiency is more or less on par with Nvidia's 4070 Ti Super (the last generation) but may not match the 5000 series. Then again, rasterization efficiency barely improved in the newest generation, not even at Nvidia, which is why the new flagship needs way more power.

Most likely the 5070 is comparable to those cards in RT, but I am not sure it can keep up in native rasterization... it may still take a 5070 Ti to beat or match the cards I already mentioned.

Sure, according to Nvidia, "native" is almost dead... because the biggest improvement they made was on the NPU (neural processing unit), which Nvidia brands as "Tensor cores"; no different in what it does, but surely exceeding everything in marketing and, currently, in performance. Those cores are basically the closest a processor gets to "how a brain works".

However, what's clear is that this is still an alpha technology, and gamers will face many years of countless issues, as this technology is extremely complicated.

No clue about the TOPS (AI performance) of the 9070/9070 XT; I think they may have an NPU comparable to the PS5 Pro's, which would be around 300 TOPS. Nvidia surely exceeds that by at least 3x with the newest generation. The 3090 Ti, the second flagship using Tensor cores, got 320 TOPS, so AMD is basically two generations behind in NPU cores and one generation behind in RT cores. Intel is even weaker than AMD; their NPU on Battlemage provides 48 TOPS, as far as I know.

It would not make sense to offer another flagship, because AMD would have to say "here, our flagship has 300 TOPS, and that other flagship 3000 TOPS"... it would get demolished in marketing. Even when AMD offers equal performance, people still buy Nvidia in 9 out of 10 cases; so AMD is kinda lost.

I still want to bring up a matter raised by Sony while working on their AI upscaling (PSSR): the TOPS may not be fully utilizable, not even nearly... because the RAM bandwidth is too slow on any GPU. Although Nvidia achieved nearly 2 TB/s on the 5090 with an expensive 512-bit bus, even this crazy value is in many respects insufficient to provide the access an NPU needs. No matter what... in marketing at least, Nvidia has already wiped out the entire competition, comparable to Apple.

On server GPUs Nvidia does not use "cheap GDDR7 or lower"; instead they use HBM (high bandwidth memory), which can reach many TB/s; for example, over a 5120-bit interface they can go as high as 3.3 TB/s, or even more in the future. What I want to say is: in gaming terms, those TOPS are usually not properly utilized, and they are more marketing than "a real thing".
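For reference, peak bandwidth follows directly from bus width and per-pin speed; a small sketch (the per-pin data rates are my assumptions for illustration, not official specs):

```python
# Peak memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps).

def peak_bandwidth_gbs(bus_width_bits, gbps_per_pin):
    """Theoretical peak bandwidth in GB/s."""
    return bus_width_bits / 8 * gbps_per_pin

print(peak_bandwidth_gbs(512, 28))    # 1792.0 GB/s, i.e. "nearly 2 TB/s"
print(peak_bandwidth_gbs(5120, 5.2))  # ~3328 GB/s, i.e. ~3.3 TB/s
```

A 10x wider HBM interface can run each pin slower and still come out far ahead, which is why server parts get it and gaming cards do not.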
Post edited January 13, 2025 by Xeshra
Well... the news on rasterization performance and efficiency is worse than I would have guessed a long time ago.

According to the most recent leaks it looks more or less like this:

5090 vs. 4090: +35% FPS, +8% higher efficiency, +27% higher TDP. Most of the extra TDP seems to be used for AI and a bit of RT.

5080 vs. 4080

5070 Ti vs. 4070 Ti

5070 vs. 4070
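As a rough sanity check on those leaked numbers: efficiency here is frames per watt, so the relative gain can be computed directly (a sketch; the +35%/+27% figures are from the leak above):

```python
# Efficiency = frames per watt, so the relative efficiency gain is the
# relative FPS gain divided by the relative power increase.

def relative_efficiency(fps_gain, tdp_gain):
    """Both arguments are fractional gains, e.g. 0.35 for +35%."""
    return (1 + fps_gain) / (1 + tdp_gain) - 1

# 5090 vs. 4090 with the leaked +35% FPS at +27% TDP:
print(f"{relative_efficiency(0.35, 0.27):+.1%}")  # roughly +6%, close to the quoted +8%
```

The small gap to the quoted +8% would come down to how the leak measured power (TDP rating vs. actual draw).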

It looks like the major purpose of the 5090 and its chip design is to be a good datacenter AI card, with the gamer card only as a spin-off. The digital-industrial complex (especially those close to where the coins are made) already has more money than all the "common people" put together... let's face the truth.

The 5080/5070 Ti and 5070 are probably a bit more efficient because they are not wasting too many resources on AI. However... the low CUDA core count does not make them strong cards at all. They can barely beat the last generation in rasterization... without "magic" they are truly almost useless.

However... nope, the near-600 W TDP card is not very big; it is actually a 2-slot design for the in-house cooler. One of the secrets is surely engineers like "Malcolm Gutenburg"...

Incredible NVIDIA RTX 5090 Founders Edition: Liquid Metal & Cooler ft. Malcolm Gutenburg

JayzTwoCents got one as well... and is trying to put it into an SFF build, as hilarious as it sounds.

RTX 5090 is HERE and I already got to do a build with it!

Intel surely has been catching up because, in my mind, neither Nvidia nor AMD is currently doing great, but at least Nvidia beats everyone by far in AI performance and has the most efficient cooler design, including an innovative PCB design never seen before.

GPUs have been SO BAD that Intel Caught Up

RT is another story... in the beginning this was Nvidia's big advantage, but AMD and Intel have both almost caught up now and are only roughly one generation behind in RT, while AMD is probably already less than a generation behind with RDNA 4. It is quite possible AMD may go head-to-head with the next GPU generation, known as UDNA (which will also be used in the PS6 at some point).
Post edited January 18, 2025 by Xeshra
avatar
Xeshra: Well... the news on rasterization performance and efficiency is worse than I would have guessed a long time ago.

According to the most recent leaks it looks more or less like this:
The opinion on Intel Ultra CPUs is also changing at the moment; there is always so much focus on the initial performance, while we all know how this stuff works.
I do expect some improvements as soon as there are better drivers and more optimization... yet I do not feel the 5090 will ever go above +50% in rasterization... it is simply not designed for that, as there are close to no improvements in the CUDA cores.

The 5090, at least the Founders Edition, which is extremely rare, with barely any vendor offering it, uses liquid metal... for the first time on a dedicated GPU, as far as I can tell. However, Nvidia was not the first to use it in a bigger series... as we all know, Sony used it over 4 years ago in the PS5. Sony now has some experience thanks to long-term data and has already optimized their liquid metal for the new PS5 Pro.
Post edited January 18, 2025 by Xeshra
huh????

did you say anything??
my current card plays all my games and Bioware can shub their valguard crap so pass
avatar
ussnorway: my current card plays all my games and Bioware can shub their valguard crap so pass
uhm, Veilguard?? Veilguard is pretty optimized; I mean, I could run it at 100 W and it still looks nice. VRAM does start to play an important role though. I can't comment on a low/mid-tier GPU/CPU combination.
Post edited January 19, 2025 by P. Zimerickus
avatar
idbeholdME: The "BRICK" grows ever larger (on the non-Founders versions):
https://www.youtube.com/watch?v=hrwrwY2_DHw

It's getting comical at this point.
Generative AI at work, generating 98% of your pixels from only 2% pixel input!
Some of us like to speculate on buying a previous-generation flagship for cheaper when a new gen launches ;) I bought a 7900 XTX today.
The review embargo on the 5090 was recently lifted. For the important stuff (gaming, of course :D), it seems like it's +30% price (for the base model; it will be MUCH higher from any 3rd-party sellers and especially outside the US) and +30% power consumption for 20-30% better frame rates over the 4090. Seems my plan of upgrading only every 3-4 gens remains unaffected. Planning on riding out the 4080 until at least the 70 series, especially seeing how the 4080 remains among the top dogs in watt/frame graphs + I have mine undervolted.

5080 reviews are going to drop next week.