Nvidia GeForce RTX 4090 vs. AMD Radeon RX 7900 XTX: Is the GeForce Premium Worth It?


Last updated 9 months ago




New year, new GPU comparison. Today we're taking an updated look at the current-generation flagships, pitting the Radeon RX 7900 XTX against the GeForce RTX 4090. Of course, we know the GeForce GPU is faster, but it also costs $2,000, whereas you can pick up a 7900 XTX for just under $1,000, or roughly half the price. That's a big difference, so is the GeForce really worth such a significant premium?

For all the AMD enthusiasts screaming at their monitors: we know the 7900 XTX more directly competes with the RTX 4080, not the exorbitantly priced RTX 4090. We've already made that comparison, pitting the Radeon 7900 XTX against the RTX 4080 across a range of games with extensive rasterization, ray tracing, and even DLSS vs. FSR upscaling benchmarks. So check out that article if that's the matchup you're interested in.

Back to this comparison: it actually came about because of a recent reader question, where someone asked whether we would buy an RTX 4080 or a hypothetical "GTX 4090," a GPU with the rasterization performance of the RTX 4090 but priced like the RTX 4080, because it lacked tensor cores and therefore couldn't take advantage of RTX features.

We were divided between the RTX 4080 and the hypothetical GTX 4090, the latter arguably being the better GPU for someone like me who mainly plays competitive multiplayer shooters. Surprisingly, a sizeable number of readers claimed we already have a GTX 4090: it's called the Radeon RX 7900 XTX.

But last we checked, the Radeon 7900 XTX was often quite a bit slower than the RTX 4090 in rasterization performance, so has anything changed since our last review? It seems unlikely, but several new and visually stunning games were released late last year, so an updated comparison seemed appropriate, and here we are.

For testing, we're using our Ryzen 7 7800X3D test system with 32 GB of DDR5-6000 CL30 memory. The display drivers used were Game Ready 546.33 and Adrenalin Edition 23.12.1. It's time to dig into the data, so let's do it…

Benchmarks

First up, we have results for the latest version of The Last of Us Part I, and here the 7900 XTX was 23% slower at 1080p, 20% slower at 1440p, and 21% slower at 4K. Those are decent results overall for the XTX, especially since at MSRP it's almost 40% cheaper, and at 4K we're looking at over 60 fps without having to enable upscaling.

Speaking of which, when using the FSR and DLSS quality modes, we see substantial frame rate increases for both GPUs in this title. Here the XTX was 26% slower at 1080p, 20% slower at 1440p, and 22% slower at 4K.
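As a side note on why the quality-mode gains are so large: both upscalers render the game internally at a reduced resolution and then reconstruct the output image. Below is a minimal sketch of that relationship, assuming the commonly cited 1.5x per-axis scale factor for the DLSS/FSR Quality presets; the helper function is purely illustrative and not part of either SDK.

```python
# Estimate the internal render resolution used by DLSS/FSR "Quality" mode,
# assuming the widely documented 1.5x per-axis scale factor (~67% per axis).
# Illustrative helper only; not an API from either upscaling SDK.
def quality_mode_internal_res(width: int, height: int, scale: float = 1.5) -> tuple[int, int]:
    return round(width / scale), round(height / scale)

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(name, "->", quality_mode_internal_res(w, h))
# 4K Quality, for example, renders internally at roughly 2560 x 1440,
# which is why the frame rate uplift is so substantial at that resolution.
```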

Next, we have Starfield, and again this data is based on the most up-to-date version of the game at the time this went live. At 1080p, the 7900 XTX is just 3% slower than the RTX 4090, then 12% slower at 1440p and 17% slower at 4K, making for another set of positive results for the much cheaper Radeon GPU.

Even if we enable upscaling, the margins remain much the same, with the XTX being 3% slower at 1080p, 11% slower at 1440p, and 16% slower at 4K.

Performance in Assassin's Creed Mirage is very competitive; here the 7900 XTX was just 5% slower at 1080p, 11% slower at 1440p, and 19% slower at 4K. We're also looking at well over 100 fps at 4K using the highest quality preset.

Now, with FSR and DLSS enabled, the 7900 XTX can beat the RTX 4090 at 1080p and 1440p, delivering 11% more frames at 1440p. That said, the Radeon GPU does fall behind at 4K, though only by a 13% margin.

Moving on to Ratchet & Clank: Rift Apart, we see that neither GPU has any trouble with this title, although, as expected, the RTX 4090 is faster. The XTX trailed by a 16% margin at 1080p, 20% at 1440p, and then just 8% at 4K.

However, what significantly impacts the Radeon GPU in this title is ray tracing. Switching it on using the high settings, performance falls off considerably. We're now looking at 77 fps at 1080p, making the XTX almost 50% slower than the 4090. In fact, it's 52% slower at 1440p and an incredible 58% slower at 4K; put another way, the RTX 4090 is almost 140% faster at 4K.
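If you're wondering how a 58% deficit translates into a roughly 140% lead, the two figures describe the same ratio from opposite sides. Here's a quick illustration of that arithmetic; the function name is ours, purely for demonstration.

```python
# Convert "GPU A is s% slower than GPU B" into "GPU B is f% faster than GPU A".
# If A delivers (1 - s) of B's frame rate, then B's advantage is 1 / (1 - s) - 1.
def slower_to_faster(percent_slower: float) -> float:
    fraction = percent_slower / 100
    return (1 / (1 - fraction) - 1) * 100

print(round(slower_to_faster(58)))  # ~138, i.e. "almost 140% faster" at 4K
```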

Enabling upscaling does help the 7900 XTX relative to the 4090, as now it's 42% slower at 1080p, 45% slower at 1440p, and 50% slower at 4K.

Next up, we have Star Wars Jedi: Survivor, and here we find some heavily CPU-limited results at 1080p, though frame rates are well above 150 fps. Still, the Radeon GPU manages to take the lead due to requiring less CPU processing power, nudging ahead by a mere 3% margin.

But once we hit 1440p, the 7900 XTX configuration becomes heavily GPU-limited, and now it's 30% slower, then 32% slower at 4K.

Now with ray tracing enabled, the CPU load increases, and as a result the 7900 XTX delivers a rather large 18% performance advantage over the RTX 4090, as the added CPU overhead of the GeForce GPU sees it become CPU-limited much earlier. Because of this, even at 1440p the margins are very similar, and it's not until we hit 4K that the results are GPU-limited, with the Radeon GPU falling behind by a 35% margin.

Enabling FSR and DLSS doesn't do much in this title, as we're CPU-limited for the most part at 1080p and 1440p, and in fact the overhead of these upscaling technologies actually reduces performance in some instances. But at 4K, where we're GPU-limited, the 7900 XTX only saw a 4% increase, while DLSS delivered a far more substantial 25% boost for the RTX 4090.

Forza Motorsport runs well on both GPUs using the ultra quality preset, but even so the RTX 4090 is faster, with the 7900 XTX coming in 23% slower at 1080p and then 21% slower at 1440p and 4K.

Enabling full RT further favors the GeForce GPU, and now the 7900 XTX was 26% slower at 1080p, 28% slower at 1440p, and 30% slower at 4K.

Again, we have another title where FSR doesn't appear to be working correctly, as we see almost no change in performance for the 7900 XTX with it enabled. Apparently, DLSS had been broken in this title in the past, but that no longer appears to be the case, as we're seeing roughly a 40% increase across the board. So, a big performance gain here for the RTX 4090.

Next up, we have Cyberpunk 2077: Phantom Liberty, and without ray tracing but using the Ultra preset, the 7900 XTX performs very well, rendering 139 fps at 1080p, making it just 18% slower than the RTX 4090.

However, at the more GPU-limited 1440p resolution, the margin increases to 37%, and then 33% at 4K.

But that's not all. If we enable the Ray Tracing Ultra preset, the 7900 XTX struggles badly, rendering just 41 fps at 1080p, making it 62% slower, or the 4090 161% faster. The margins remain similar at 1440p and 4K, so this is a disappointing set of results for AMD.

Now, enabling upscaling allows the 7900 XTX to average 61 fps at 1080p, but that's half the performance we're seeing from the RTX 4090, so again, disappointing results for AMD.

Marvel's Spider-Man Remastered is a well-optimized title, so much so that we end up CPU-limited even at 1440p. Therefore, we have to turn to 4K to see a difference in performance between these GPUs, where the XTX falls behind by a 24% margin.

Enabling ray tracing doesn't further favor the GeForce GPU, as the game becomes more CPU-limited, although the XTX was still 18% slower at 4K. But, had the RTX 4090 not been capped at around 150 fps, the margin would have been larger.

With the game CPU-limited, enabling upscaling doesn't do much, though it does help the 7900 XTX close the gap at 4K.

Hogwarts Legacy is another new game that's heavily CPU-limited, even when using the best gaming CPUs. As we've seen a few times now, when CPU-limited the 7900 XTX can outperform the RTX 4090, and we're seeing that at 1080p and 1440p, while the 4K data is still competitive.

Enabling ray tracing reduces the 7900 XTX to a similar level of performance as the RTX 4090 at 1080p, as both are CPU-limited here. Then at 1440p it falls 19% behind, and a substantial 39% behind at 4K.

Enabling upscaling helps both GPUs, especially at 4K, and now the 7900 XTX is just shy of 60 fps, making it 19% slower than the 4090.

One of the most recent games added to our list is Alan Wake 2, and here the Radeon GPU is surprisingly competitive. At 1080p it was just 12% slower, and we're also seeing only a 16% margin at 1440p and then 21% at 4K, which are reasonable margins considering the 7900 XTX costs nearly 40% less at MSRP.

The problem for the Radeon GPU becomes obvious when enabling ray tracing. Here, it lags drastically behind, delivering unplayable performance with a mere 21 fps at 1080p. The 4090 isn't stellar here either, with just 66 fps at 1080p and 44 fps at 1440p, but it's considerably better than the XTX.

Of course, enabling upscaling helps the 4090 achieve more respectable frame rates at 1080p and 1440p, though it does little to help the 7900 XTX.

Next, we have Fortnite using the Epic quality settings in DX11 mode, and here the 7900 XTX is around 30% slower than the RTX 4090 at every tested resolution, which isn't great but not terrible either.

However, if we opt for the more competitive medium quality settings, we find that at 1080p and 1440p the XTX is over 40% slower, which isn't ideal for those seeking every competitive advantage. The margin shrinks slightly to 35% at 4K, but even so, the GeForce GPU was significantly faster.

Oddly, if we switch to DX12 using the Epic preset with ray tracing enabled, the Radeon GPU is shockingly competitive, and even faster at 4K. We triple-checked these results and received the same data every time.

Like Fortnite, Warzone highlights an issue for Radeon GPUs when using competitive quality settings. For an unknown reason, the section of the map used for testing limited the 7900 XTX to around 200 fps, and this was observed regardless of the resolution used, which typically suggests a strong CPU bottleneck.

However, the RTX 4090 performed significantly better, with over 400 fps at 1080p and over 300 fps at 1440p, making it 63% faster. Strangely, the two are very similar at 4K, so the problem for the Radeon GPU at lower resolutions is unclear, possibly a driver issue. For those seeking maximum fps in Warzone, a GeForce GPU seems to be the best choice for now.

Also, enabling upscaling did little for either GPU using the basic quality preset, so running at the native resolution is probably better, particularly for those gaming at 1080p.

Unlike Fortnite and Warzone, performance in Rainbow Six Siege is much more competitive. The 7900 XTX was just 5% slower at 1080p, 12% slower at 1440p, and then 24% slower at 4K. In terms of value for money, the Radeon GPU performs really well here, enabling extremely high frame rates at resolutions such as 1440p.

Finally, we have Counter-Strike 2 using the medium preset, and again the 7900 XTX performs quite well, delivering similar performance at 1080p, though the RTX 4090 is CPU-limited at this resolution and both GPUs average over 500 fps. Then at 1440p the XTX was 18% slower than a CPU-limited 4090, and 26% slower at 4K. Overall, these are good results.

Performance Summary

Here's a look at the overall performance at 1080p, excluding any ray tracing or upscaling (DLSS/FSR) results. On average, the 7900 XTX was 12% slower than the RTX 4090, which isn't dramatically different from our day-one 7900 XTX review data, where the Radeon GPU trailed by a 5% margin. The main difference is that we were using the 5800X3D then, whereas now we're using the 7800X3D.

Increasing the resolution to 1440p allows the RTX 4090 to extend its lead, with the 7900 XTX now 18% slower on average. Although we observed margins as large as 44%, it's still a solid overall showing for the Radeon GPU.

At 4K, the margin increases to 21%, which is still reasonable for AMD considering the 7900 XTX is 40% cheaper at MSRP. This margin aligns with our day-one review data using the slower 5800X3D, where the 7900 XTX was, on average, 20% slower.
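To put those margins in value terms, they can be folded into a rough cost-per-frame comparison. The sketch below is illustrative only, using the $999 and $1,599 launch MSRPs, treating the RTX 4090 as a 100 fps baseline, and applying the average 21% deficit; it isn't data pulled from our charts.

```python
# Rough 4K rasterization cost-per-frame comparison at launch MSRPs.
BASELINE_FPS = 100  # treat the RTX 4090 as a 100 fps baseline

gpus = {
    "RTX 4090":    {"price": 1599, "relative_perf": 1.00},
    "RX 7900 XTX": {"price": 999,  "relative_perf": 0.79},  # 21% slower on average
}

for name, g in gpus.items():
    fps = BASELINE_FPS * g["relative_perf"]
    print(f"{name}: {g['price'] / fps:.1f} dollars per fps")
# At MSRP the Radeon works out to roughly 20% fewer dollars per frame,
# and the gap widens further at the 4090's current ~$2,000 street pricing.
```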

Ray Tracing Performance

The big problem for AMD remains its ray tracing performance, evident in the 1080p results where the 7900 XTX is already 22% slower, with several results showing margins greater than 40%.

The situation worsens at 1440p, where the 7900 XTX is 28% slower, with several examples like Dying Light 2, Ratchet & Clank, Cyberpunk, and Alan Wake 2 where the Radeon GPU falls substantially behind.

Even the RTX 4090 wasn't always practical at 4K with ray tracing enabled, so these margins are largely academic, but still, here the 7900 XTX was 34% slower.

Ray Tracing Upscaling Performance

In games where we could test with ray tracing and FSR or DLSS upscaling, the 7900 XTX was on average 27% slower at 1080p, indicating a reasonable performance advantage for the RTX 4090 even at this lower resolution.

The margin increases in favor of the GeForce GPU at 1440p, where the 7900 XTX was on average 33% slower. Again, we're looking at significant losses for the Radeon GPU in titles like Ratchet & Clank, Forza Motorsport, Cyberpunk, and Alan Wake 2.

Finally, at 4K, the Radeon GPU was on average 42% slower, which is not a strong result for AMD and makes the RTX 4090 look like relatively good value compared to what the 7900 XTX achieved.

Would you purchase the 7900 XTX or the RTX 4090?

After all that testing and performance comparison, which GPU would you buy? Or in other words, are you spending $1,000 or $2,000? As we noted at the start of this review, despite being flagship products from competing companies, they're not exactly competing products.

If you're willing to spend $1,600 – or right now it would have to be $2,000 – to acquire the GeForce RTX 4090, you're unlikely to even consider the 7900 XTX. Likewise, if you're considering the 7900 XTX, it's extremely unlikely you'd be choosing between it and the RTX 4090. Instead, you'd more likely be weighing a possible step up to the RTX 4080.

So in a sense, it's a bit of a pointless comparison, but we knew that going into it. Still, it's nice to have an updated look at AMD's and Nvidia's best offerings.

Essentially, if you want the best of the best, it's the RTX 4090, which appeals to those with ample budgets. The 7900 XTX, on the other hand, really has to earn your money. While it might be 18% cheaper than the RTX 4080 for similar overall performance, its ray tracing performance is often much worse, and its feature set can be lacking, either in support or quality.

Still, at ~20% cheaper, the 7900 XTX becomes more appealing, especially if you're not too concerned about RT performance, which for now we're still not, though it's nice to have if you don't have to pay a substantial premium for it.

As it turns out, the Radeon 7900 XTX isn't the hypothetical GeForce GTX 4090, as its rasterization performance isn't fast enough. However, it's better in one respect: its ray tracing performance is generally usable for those prioritizing visuals over frame rate.

A significant problem for AMD is Radeon performance in popular competitive titles such as Fortnite and Warzone. If you increase the visual quality settings, the Radeon GPUs are competitive in those titles, especially Fortnite when using DX12. But DX12 isn't the best way to play Fortnite competitively, and neither are maxed-out visuals. This gives Nvidia an advantage, which is why most Warzone and Fortnite players prefer and use GeForce GPUs.

I've played a lot of Fortnite with a Radeon GPU, and the experience can range from great to downright terrible. New game updates often disrupt driver optimization, causing frequent stuttering that makes the game unenjoyable, and this can persist for weeks or months until AMD addresses it. Using DX12 can help, but performance is typically weaker, and you'll likely encounter occasional stuttering. This is why Performance mode is based on the DX11 API, and it's what every pro player uses.

Ultimately, much will depend on the games you play. Our in-game testing for Counter-Strike 2 was very positive for the 7900 XTX, and we saw much the same in Rainbow Six Siege, so it could be a great GPU for esports titles. As always, make sure you research performance for the games you play the most.

Shopping Shortcuts:
  • AMD Radeon RX 7900 XTX on Amazon
  • Nvidia GeForce RTX 4090 on Amazon
  • Nvidia GeForce RTX 4080 on Amazon
  • AMD Radeon RX 7900 XT on Amazon
  • Nvidia GeForce RTX 4070 on Amazon
  • AMD Radeon RX 7800 XT on Amazon
  • Nvidia GeForce RTX 4060 Ti on Amazon
  • Nvidia GeForce RTX 4070 Ti on Amazon
  • AMD Radeon RX 7600 on Amazon