CPU vs GPU: We tested 16 hardware combinations to show which upgrade will boost your gaming performance the most (2024)

Jump to:

  • Executive Summary
  • 1080p Medium
  • 1080p Ultra
  • 1440p Ultra
  • 4K Ultra
  • Power, temperatures, and clocks
  • Balancing your PC build

Your once-great PC is feeling a bit long in the tooth and can no longer handle the games you want to play — we've all been there. It's time for an upgrade, but you're not sure if you need a completely new PC build or if you can make it another couple of years with a few judicious component upgrades. Many users will upgrade their GPU with one of the best graphics cards and leave everything else in place. But would they get more for the money if they spent it all on a CPU instead? And what about some combination of hardware upgrades?

Executive Summary (TLDR)

  • Use a balanced approach to selecting a CPU and GPU, with midrange CPUs generally working well with midrange GPUs.
  • Pairing a top-end graphics card with an older/slower CPU can result in a significant loss of performance, particularly at lower resolutions.
  • RTX 4080 with an older/slower 8700K CPU loses up to 40% of its performance at 1080p, 33% at 1440p, and 10% at 4K.
  • RTX 3080 loses up to 25% of its performance at 1080p, but only 10% at 1440p and 4% at 4K.
  • Using an RTX 2080 with an 8700K only drops performance 10% at 1080p, and less than 5% at 1440p and 4K.

In a perfect world, you'd have a massive database of performance testing at your fingertips and you could simply plug in any combination of CPU and GPU to see how it would perform. Tools like 3DMark try to provide that sort of service, but given the myriad combinations of hardware, not to mention the fact that 3DMark isn't actually a game engine, you end up with many gaps in the knowledge you seek.

Today, we're going to fill in some of those gaps by putting four different CPUs and GPUs to the test — mixed and matched so we'll test every GPU with every CPU. This is by no means an exhaustive selection of hardware, but we'll use our full current GPU test suite of 19 games at four different settings/resolution combinations. That gives us 16 total reference points showing how different hardware combinations stack up. That should be sufficient to help you plan for your next PC upgrade.
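The mix-and-match grid is just a Cartesian product of the two component pools. Here's a quick sketch — the model names come from this article, but the code itself is purely illustrative:

```python
from itertools import product

# The four CPUs and four GPUs tested in this article.
cpus = ["Core i7-8700K", "Core i9-11900K", "Core i9-13900K", "Ryzen 7 7800X3D"]
gpus = ["RTX 3050 8GB", "RTX 2080", "RTX 3080", "RTX 4080"]

# Every GPU paired with every CPU: 4 x 4 = 16 reference points.
test_matrix = list(product(cpus, gpus))
print(len(test_matrix))  # 16
```

Each of those 16 pairings then runs the full 19-game suite at four settings/resolution combinations.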

PC Test Systems

There's an elephant over in the CPU corner, of course: You can't just upgrade your processor, at least in most cases. If you're on a socket AM4 motherboard running a first-generation Ryzen chip, you could upgrade to a Zen 3 Ryzen 5000-series CPU. AMD's AM4 is the longest-lived platform of all time, though it's been replaced by socket AM5 now. Intel platforms meanwhile typically only support two generations of processors. So, if you have an 8th Gen Core Intel CPU from 2017–2018, the best you can do without also replacing the motherboard would be a 9th Gen Core chip from 2018–2019.

Depending on your upgrade path, you may also need to replace your system RAM. Most reasonably recent systems use at least DDR4 memory — Intel made the switch to DDR4 in 2015 and AMD's socket AM4 platform also requires DDR4. Intel started supporting DDR5 on its socket LGA1700 platform with Alder Lake 12th Gen Core in late 2021, though the platform still supports DDR4 as well. AMD's AM5 platform made a wholesale switch to DDR5, marking a clean break from the past.

For our look at CPU and GPU upgrade options, we have four different test platforms. We drew the line with Windows 11 (23H2), which requires TPM support and some other bits and bobs. While it's possible to work around some of the limitations, we opted to go with the officially supported Intel 8th Gen Core i7-8700K as our oldest CPU. Then we have the Core i9-11900K from early 2021 and the Core i9-13900K from late 2022. Rounding things out, AMD's Ryzen 7 7800X3D from 2023 reigns as the current king of the best CPUs for gaming.

All the specs for our test PCs can be seen in the boxout. We have 32GB (2x16GB) of memory for all four PCs, with DDR4-3600 RAM in the 8700K and 11900K, DDR5-5600 XMP for the 13900K, and DDR5-6000 EXPO for the 7800X3D. We used a Crucial 4TB T700 drive, since that's large enough to comfortably hold our gaming test suite, though the oldest platform we tested only ran the drive at PCIe 3.0 speeds, and the 11900K has a PCIe 4.0 interface.

Our test GPUs are all Nvidia cards, to keep things consistent — all support DLSS and use the same 555.85 drivers (except The Last of Us, Part 1, which was tested with the 552.44 drivers due to a bug in the 555 drivers that prevented the game from running on 8GB cards). All four GPUs have been among the best graphics cards at some point in the past six years, though only the RTX 4080 Super (a proxy for the 4080) currently ranks on our list.

Note that we're using the original RTX 4080, but if you're actually in the market for such a GPU, you're better off getting the 4080 Super variant simply because it's the better value — it costs $200 less than the vanilla RTX 4080 that launched in 2022, and it offers basically the same performance (within 3%). The RTX 3080 (10GB) was Nvidia's second-fastest 30-series card when it launched, and the same goes for the RTX 2080. Both were later displaced by the 2080 Super and 3080 Ti, respectively, but we're sticking to the initial GPUs rather than the mid-cycle refreshes. We also added the RTX 3050 8GB (not the new 3050 6GB variant) to show how things change with what is basically the slowest desktop RTX card — it's slightly slower than even the RTX 2060, though some games now prefer the 3050's 8GB of VRAM.

We're using our full 19-game GPU test suite, with eight ray tracing (DXR, or DirectX Raytracing) enabled games and eleven rasterization games — we left ray tracing disabled in these, even if a game supports it. Each setting gets tested at least three times, and we discard the first run and then take the better result of the remaining two runs. We break things down into an overall geometric mean (equal weighting to every game), separate geomeans of just the rasterization and ray tracing games, and then the individual game results.
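The scoring described above can be sketched in a few lines: discard the warm-up run, keep the better of the remaining runs, then combine games with a geometric mean so every title gets equal weight. All fps numbers here are hypothetical, purely to illustrate the method:

```python
from math import prod

def benchmark_result(runs):
    """Discard the first (warm-up) run, then take the better of the rest."""
    return max(runs[1:])

def geomean(values):
    """Geometric mean, so every game carries equal weight in the overall score."""
    return prod(values) ** (1 / len(values))

# Hypothetical per-game results (fps) for one CPU/GPU combination:
# three runs per game, warm-up discarded.
games = {
    "Game A": [100.0, 118.0, 120.0],
    "Game B": [90.0, 95.0, 94.0],
    "Game C": [140.0, 150.0, 149.0],
}
per_game = [benchmark_result(runs) for runs in games.values()]  # [120.0, 95.0, 150.0]
overall = geomean(per_game)  # roughly 119.6 fps
```

The geometric mean matters because an arithmetic average would let one very high-fps game drown out the rest of the suite.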

GPU vs CPU upgrades: 1080p Medium

Let's start with 1080p medium; these are the results that should be the most CPU dependent. As you'd expect, the 7800X3D with the RTX 4080 lands at the top of our chart, with the 13900K coming in 5% behind. Then there's a pretty sizable drop to the 11900K with the 4080, which basically ties the 7800X3D and 13900K with the previous generation RTX 3080. And that's where things start to get interesting.

Pairing the RTX 4080 with the oldest CPU of our test hardware ends up yielding worse performance than a 3080 with any of the newer CPUs. In fact, the 4080 with the 8700K only beats the 3080 with the same CPU by 10%. Using those same GPUs with the 7800X3D, you'd see a 36% increase in gaming performance — versus a 28% improvement on the 13900K, and 17% with the 11900K.
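The arithmetic behind those percentages is just a ratio of average framerates. A small illustration, with hypothetical fps numbers chosen only to echo the ratios above:

```python
def uplift_pct(new_fps, old_fps):
    """Percentage gain from swapping one component, everything else fixed."""
    return (new_fps / old_fps - 1) * 100

# Hypothetical 1080p medium averages: the same 3080 -> 4080 swap is worth
# far more on a modern CPU than on an older one.
fps_8700k   = {"RTX 3080": 100.0, "RTX 4080": 110.0}  # ~10% gain
fps_7800x3d = {"RTX 3080": 125.0, "RTX 4080": 170.0}  # ~36% gain

print(uplift_pct(fps_8700k["RTX 4080"], fps_8700k["RTX 3080"]))      # 10.0
print(uplift_pct(fps_7800x3d["RTX 4080"], fps_7800x3d["RTX 3080"]))  # 36.0
```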

This shows one of the most important lessons when it comes to contemplating what to upgrade. If you have an older CPU, moving to the fastest (or second fastest) graphics card may not give you a significant performance boost, particularly at lower resolutions and settings. You want to keep at least some level of balance between the two core components of a gaming PC.

On the other hand, a new CPU will only do so much if you have a slower GPU. The RTX 3050 8GB card ranks at the bottom of the charts, regardless of which CPU you use. Going from a seven-year-old 8700K to a newer CPU nets at best a 7% improvement in performance. All four systems with the RTX 2080 are the next step up in the charts, indicating they're generally not hitting major CPU limitations yet. It's only with the 3080 and 4080 that the CPU becomes a bigger limiting factor.

And yes, the "fastest CPU for gaming" actually came in second place when paired with the older and slower RTX 3050. That's not the only place where the 7800X3D failed to beat the 13900K, as the 2080 and 3080 also (slightly) favored Intel's processor. There are variables in play besides just the CPU — perhaps some of the deficit comes from the 2080's lack of ReBAR (Resizable BAR) support, along with driver optimizations that likely target newer hardware.

That's all just focusing on the overall performance, of course, and individual games can show larger or smaller differences. The rasterization and DXR overall charts mostly echo the overall results, though the gains from a faster CPU tend to be muted on the more graphically demanding games — DXR games in particular.

Minimum framerates — or the average fps of the bottom 1% of frames that we use, which acts as a reasonable proxy for overall framerate consistency — show much larger differences as well. Even with the RTX 3050, where performance is mostly GPU limited, the 13900K improved 1% lows by 13% overall. Stepping up to the RTX 2080, there's a 22% improvement in minimum fps, and that increases to 44% with an RTX 3080. And if you move from the 8700K to a 7800X3D with the RTX 4080, minimum fps improves by 61%.
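For readers who want to reproduce that metric, here's a minimal sketch of one way to compute a 1% low from per-frame render times. The exact tooling used for this article may differ; this just shows the idea:

```python
def one_percent_low(frame_times_ms):
    """Average fps of the slowest 1% of frames, from per-frame render times."""
    fps = sorted(1000.0 / t for t in frame_times_ms)  # ascending: worst frames first
    n = max(1, len(fps) // 100)                       # bottom 1%, at least one frame
    return sum(fps[:n]) / n

# 100 frames: 99 smooth frames at 10 ms (100 fps) plus one 50 ms stutter (20 fps).
# The overall average is still ~96 fps, but the 1% low exposes the hitch.
frame_times = [10.0] * 99 + [50.0]
print(one_percent_low(frame_times))  # 20.0
```

This is why 1% lows separate the CPUs so much more than averages do: stutters from a slow CPU barely move the mean but dominate the worst-frame bucket.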

One final interesting item to pull out from all the individual gaming results is Diablo IV. The game runs quite well in general on a wide range of hardware... until you try to turn on ray tracing. With DXR enabled, there's a pretty massive CPU bottleneck that comes into play. At RT Medium settings, with DLSS Quality mode upscaling, even the combination of 7800X3D and 4080 only managed 118 fps — and the 7800X3D with the 2080 actually did slightly better at 120 fps.

Diablo IV with DXR ends up as the most CPU-limited game in our suite, as indicated by the clumping of red/green/blue/grey in the chart. Even with an RTX 3050, you could almost double your framerate by upgrading from an 8700K to a 13900K or 7800X3D. Other games that are more CPU limited include Far Cry 6, Flight Simulator, Horizon Zero Dawn, Spider-Man: Miles Morales, and Watch Dogs Legion.

On the other side of the fence, even at 1080p, there are still several games that are almost entirely GPU limited — where you'll see four distinct tiers of performance, with all four CPU colors at each level: A Plague Tale: Requiem, Bright Memory Infinite, Control (mostly), Cyberpunk 2077, Minecraft, and The Last of Us (mostly). As we increase the resolution and quality settings, more of the games become primarily GPU limited and start to look like the Plague Tale chart.

GPU vs CPU upgrades: 1080p Ultra

If you only look at the 1080p medium results, upgrading your CPU as well as your graphics card might seem like the only sensible way to go, but even the shift to 1080p ultra starts to level the playing field among the various CPUs. Overall, the 4080 with either the 7800X3D or 13900K CPU performs best, but now the 11900K only falls off the pace by 16% — still noticeable but not quite as bad as the 20–24 percent deficit we measured at 1080p medium.

The 8700K on the other hand still ends up as a significant bottleneck. Again, any of the three newer CPUs paired with an RTX 3080 would provide a better 1080p ultra experience than the 8700K with a 4080. That's the overall view, of course — there are definitely games where the 8700K is still fast enough to keep the 4080 busy.

If you're a gamer using a 1080p display, however, you're probably not also looking at a $1000 GPU as a reasonable upgrade option. Or maybe you built your current gaming PC back when the 8700K and RTX 2080 were basically as good as it got, six years ago, and you want to upgrade.

Either way, we wouldn't feel much need to move beyond the RTX 3080 for 1080p gaming, even at ultra settings — and the 3080 offers very similar performance to the newer and less expensive RTX 4070, with the RTX 4070 Super being a decent $50 upsell, if you check our GPU benchmarks hierarchy.

For the older and slower GPUs, graphics card bottlenecks become the limiting factor. There's only a 3% spread among the RTX 3050 results, and a moderately larger 7% spread on the RTX 2080 results. But as soon as we move up to the faster GPUs, the gap between the CPUs starts to grow. The 13900K and 7800X3D are about 20% faster than the 8700K with the RTX 3080, and that grows to nearly a 50% lead with the RTX 4080.

The GPU continues to be the biggest limiting factor for the more demanding games, including most of the DXR-enabled titles. Most of those show four distinct tiers of performance based on the GPU, with only the 4080 needing more than the slowest CPU from our testing. Diablo IV remains a notable exception, with a massive CPU bottleneck when DXR gets enabled. (The easy and sensible solution is to not play that game with ray tracing turned on, but that's a different topic.)

Far Cry 6 and Flight Simulator are also pretty CPU limited, at least with the 3080 and 4080 cards. If you're trying to find a good GPU upgrade for an older 8700K, again we wouldn't go beyond a 3080 for 1080p gaming — eight of the games show less than a 10% improvement with an RTX 4080 versus the RTX 3080 when using the 8700K, with a 17% increase overall. Compare that to the 7800X3D where only a single game (Flight Simulator) shows less than a 10% improvement from moving to the 4080, and the overall increase is 45%.

GPU vs CPU upgrades: 1440p Ultra

The importance of the CPU becomes far less of a factor once we step up to 1440p ultra. Our two slowest GPUs show less than a 5% difference between the 8700K and 13900K — and again, it's a bit curious to see the Ryzen 7 7800X3D somewhat underperforming when paired with the RTX 2080 and RTX 3050, as even the 11900K delivered slightly higher results overall.

Even the RTX 3080 doesn't need a ton of CPU performance to keep it happy, with an 11% spread between the fastest and slowest CPUs. But once we get to the RTX 4080, the 7800X3D returns to the top spot, just fractionally ahead of the 13900K, with both outperforming the 11900K by 12%. There's an even bigger gap to the 8700K, with nearly a 30% advantage by upgrading to a newer CPU.

So, if you're rocking a top-tier GPU like the RTX 4080 or above, or the RX 7900 XTX, but you're running a five- or six-year-old CPU, you're still giving up a lot of performance at 1440p ultra and it's time for an upgrade — or at least, it will be once AMD's Zen 5 and Ryzen 9000 CPUs arrive, unless you want to wait a bit longer for Intel's Arrow Lake CPUs.

So the overall results don't show much of a difference, and that goes for the rasterization and DXR charts as well. There's a 2.5% spread on the rasterization results with the RTX 3050, a 3.8% spread with the 2080, an 11% spread on the 3080, and a 36% spread with the 4080. With the DXR chart, it's a 3% spread on the 3050, 8.3% on the 2080, 10% with the 3080, and 22% with the 4080. But there are still some noteworthy items in the individual charts.

Assassin's Creed Mirage shows almost no difference between the 3080 and 4080 on the 8700K, even at 1440p. The same goes for the 8700K and those GPUs in Far Cry 6, Flight Simulator, Spider-Man: Miles Morales, and Watch Dogs Legion. But the real killer on the CPU continues to be Diablo IV with DXR, where the 4080, 3080, and 2080 are still basically tied at just 55 fps. Note that the 1% lows do show a much larger spread, however, with the 2080 dipping to just 24 fps while the 4080 maintains a much steadier 43 fps.

Something else to note here is that upscaling algorithms like DLSS, FSR 2/3, and XeSS can all shift the bottleneck more toward the CPU as well — which perhaps partly explains the Diablo IV results, though Avatar doesn't hit CPU limits nearly as hard. If you turn on upscaling in Flight Simulator or some of the other games that are already showing CPU bottlenecks, though, you'd see more instances of the 8700K holding back the RTX 4080.

GPU vs CPU upgrades: 4K Ultra

We don't normally do any CPU performance testing in games at 4K ultra, and this is why. Yes, there's still some separation between the fastest and slowest processors, but going from the 8700K to the 13900K or 7800X3D only nets an 8–9 percent improvement in performance. Obviously, the newer CPUs are much faster than that, but at 4K the GPU becomes the major bottleneck in just about every game we've tested.

So, if you're a 4K gamer — or you're planning to become one with a new 4K monitor upgrade — you'll generally want to get the fastest, most expensive graphics card you can justify before worrying about your CPU. The RTX 3080, which again also acts as a proxy for the RTX 4070, delivers nearly the same gaming experience across all four of our test platforms, with the 8700K falling just 3% behind the other CPUs. Curiously, we see a larger 9% deficit with the RTX 2080 and a 6% deficit with the RTX 3050, but that's due to a few outliers in our test suite where 8GB of VRAM becomes a serious issue.

You can easily spot the outliers in the individual game charts. Far Cry 6, as an example, almost totally fails to run on the combination of 8700K and 2080. We tried repeating the test dozens of times, rebooting the PC, cleaning drivers, etc., and it never gave a decent result — but that's a known problem with the game, where run-to-run variance at 4K ultra becomes massive on most 8GB cards.

The other games all perform as expected, with the PCIe 3.0 interface of the old platform having a slight effect in Cyberpunk 2077 and The Last of Us. Diablo IV also continues to be problematic with DXR enabled on the 8700K — which is easily fixed by turning off ray tracing and losing the very (very!) slight improvements in reflections and shadows, but we were mostly interested in pushing settings as high as possible to see how that impacted the various platforms.

Seeing almost no difference in framerates at 4K ultra doesn't mean the CPU doesn't matter at all, of course. There are lots of other tasks that will benefit from additional CPU performance, including just loading games and pre-compiling shaders. The Last of Us for example is notorious for taking a while to compile the shaders the first time you launch it on a new GPU. On the 7800X3D and 13900K processors, it takes around 10 minutes. On the 8700K, it probably took at least twice that long — we stopped paying attention and just went off to do something else for 30 minutes.

A slower CPU can also result in more stutters and framerate inconsistencies. We run each test multiple times and use the best result (after discarding the first run), which represents something of a best-case scenario. Diablo IV, to cite that example again, has a ton of stuttering as you enter new areas on the 8700K, to the point that it can at times become unplayable — you definitely don't want to try hardcore mode with a slower CPU and DXR enabled.

Even the 11900K has more stuttering, as you can see with the 63 fps result for 1% lows when using the RTX 4080, compared to 70 on the 13900K and 7800X3D. Given that's averaged across 19 games, you can tell just from the overall chart that there are a few games that will struggle with maintaining a consistent framerate. Far Cry 6, Flight Simulator, Horizon Zero Dawn, and Total War: Warhammer 3 are the main culprits, if you're wondering.

GPU vs CPU upgrades: Power, temperatures, and clocks

Since we collected all the data, we've also provided the GPU power, temperature, and clock speed results. These are averaged across all 19 games and aren't super interesting to dissect, but you can clearly see instances where the CPU holds the GPU back.

As one clear example, the RTX 4080 on the 8700K averages just 284W of power use at 4K ultra, compared to 305W/308W on the two faster processors. That in turn means lower temperatures, though clock speeds end up being higher since the GPU isn't straining as hard and can simply run at its maximum clocks.

We also have our full test results that show efficiency of the GPU (FPS/W) and the relative value (FPS/$). We've grouped things by GPU, and you can see that you get proportionally less "value" from a GPU upgrade if you're using an older CPU. Again, we won't try to fully dissect all the data here, but it's available for those who want to see it.
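Both derived metrics are simple ratios. A quick sketch with hypothetical numbers — the point being that a faster CPU raises fps without changing the card's price, so the same GPU scores better on FPS/$ in a newer system:

```python
def fps_per_watt(avg_fps, avg_gpu_watts):
    """Efficiency: frames per second delivered per watt of GPU power."""
    return avg_fps / avg_gpu_watts

def fps_per_dollar(avg_fps, gpu_price_usd):
    """Value: frames per second delivered per dollar of GPU price."""
    return avg_fps / gpu_price_usd

# Same hypothetical $1,000 card, two different CPUs driving it.
value_new_cpu = fps_per_dollar(110.0, 1000.0)  # fps per dollar on a modern CPU
value_old_cpu = fps_per_dollar(80.0, 1000.0)   # fps per dollar on an older CPU
print(value_new_cpu > value_old_cpu)  # True
```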

GPU vs CPU upgrades: Balancing your PC build

PC enthusiasts likely won't find anything too shocking about all of these results. Putting together a gaming PC requires a balanced approach if you want to get the most out of your build, which is why most people who buy extreme performance graphics cards like the RTX 4080 Super and RTX 4090 also purchase an equally extreme CPU, motherboard, RAM, etc. to go with it. Likewise, mainstream builds tend to use modest GPUs, CPUs, and other components.

You don't need to use the fastest CPU to get a good gaming experience, particularly at 4K and even 1440p, but when top-tier hardware can cost $1,000 or more, you also don't want other bottlenecks holding you back. And when the next generation Nvidia Blackwell RTX 50-series GPUs arrive, alongside the AMD RDNA 4 GPUs, the bottleneck will shift even more toward the CPU on older systems.

Results will also vary — often significantly — from game to game. We've provided a relatively large look at 19 different games from the past four years, but there are tens of thousands of games available these days. If you're playing a lot of pixel art games or indie titles, chances are high that they're not going to push your GPU or CPU as hard as most of the games we used for testing.

There are also plenty of PC gamers still getting by with hardware that's slower than even the lowest combination (i7-8700K with RTX 3050 8GB) that we used for testing. If you're using such a PC and it does what you need, great — be happy and don't worry about upgrading until you find something that doesn't work the way you'd like.

While it's been interesting to go back and test these four CPUs and four GPUs, it's a very time-consuming process and we've only scratched the surface of the configurations people could be using. Looking at the major desktop graphics cards of the past six years, there have been (by my count) 65 different GPUs from AMD, Intel, and Nvidia. That's the easy part.

There have also been around 127 different CPUs from AMD and Intel going back to 2017 — and that's not counting the F/T/S parts or Pentium chips, AMD's XT parts, or any of the various AMD APUs. If we wanted to test each of those 65 GPUs with every CPU, that's only 8,255 permutations, requiring about one full day per configuration. See you in 23 years or so...
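The back-of-the-envelope math checks out:

```python
gpu_models = 65    # desktop GPUs from AMD, Intel, and Nvidia, per the count above
cpu_models = 127   # AMD and Intel CPUs going back to 2017

configs = gpu_models * cpu_models  # 8,255 CPU/GPU pairings
years = configs / 365              # at roughly one benchmarking day per pairing
print(configs, round(years, 1))    # 8255 22.6
```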

In other words, there's no intent to go and test a bunch of other combinations of older CPUs and GPUs beyond this current set of 16 results. Thankfully, even this relatively small sampling provides a decent starting point, and when combined with our CPU benchmarks hierarchy and our GPU benchmarks hierarchy, you can get a decent idea of what to expect.

Keep in mind that a lot of the time, a newer CPU or GPU from a lower tier will also perform about the same as a previous-gen higher-tier part, like the 4070 matching the 3080, and the 4070 Super (mostly) matching a 3090 Ti; the same goes for the Core i7-14700K and i9-13900K. There's a good chance that the future RTX 5070 will perform similarly to the current 4080, and Arrow Lake and Zen 5 midrange parts will look like the current high-end offerings.

Whatever your hardware, if you're in the market to upgrade and only thought the GPU mattered for gaming, we've hopefully provided some additional food for thought.


Jarred Walton

Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.


55 Comments from the forums

  • Jagar123

    Very cool to see these charts. My 2080 will be very limited once I upgrade to a 9x00X3D. I'll see if next gen GPUs are a "good deal". If so, I'll have to upgrade my GPU at that time, or at least get the current gen for a potential discount.

    I've always upgraded my CPU and GPU at the same time, but with the price shenanigans the past 4+ years of GPUs I haven't felt the need to participate.

    Reply

  • stonecarver

    Thank you JarredWaltonGPU
    A Tom's shoot out with actual meat in the article. :ouimaitre:

    Reply

  • Amdlova

I'm using a low-power 35W CPU from Intel to play. Only Helldivers 2 shows some dips — I've seen some CPUs draw 150W of power in that game.

The 35W CPU with a 4060 Ti makes a gaming monster with the same power draw as a PS5 or Xbox Series X.

You can find these T-series chips cheap on eBay.

    Reply

  • Mattzun

    Good article
It's nice to see an occasional review of the actual system bottlenecks at settings most people use, even if it just confirms what most readers believe.

    I have an AMD 5800x and a 6700XT - close enough to a 11900k and 2080 in gaming to check the impact of upgrades.
    I game at 1440p and use high/ultra settings and I'm considering moving to either 4k or ultrawide

    If I keep my 6700xt and upgrade my CPU, I'll see no real difference

    If I upgrade to a current mid-range card (4070 super/5800xt), I'll see some performance improvements but since I'm already at 50FPS min in raster, it's probably not worth it for me. I still won't need a CPU upgrade.

    If I upgrade to a 4k monitor, my framerates will be unacceptable to me.
    I'd need a GPU upgrade to a card at least as fast as a 3080 and I still wouldn't need a CPU upgrade.

    Reply

  • maestro0428

Nice work. I'm willing to bet more readers want articles like these than another high-end GPU review.

    Reply

  • theronaldoblack

Building a computer and micro-upgrading it is something I used to love doing. But for the last several years I find it best to just build completely new, optimizing the CPU/GPU for the best performance at my build budget.
    I have built computers since my first computer class — yes, it was called computer class — way back in 1997-98, with my first build running Windows 98 Second Edition.
    Upgrading a part here and there to eke out a bit more performance was so awesome.
    But now building a computer has become so much simpler and more defined, I guess. It's more cost-effective to buy a great combination of hardware to get the best performance for your budget, then sell the entire rig within a few years (or give it to family or friends, repurpose it, etc.) and put that money toward a new gaming rig.
    I mean, buying used parts has become insanely risky and easy at the same time.
    It's so hard to get any kind of money out of selling your RAM, storage, PSU, or motherboard, because the used market is so toxic or specialized, and high in price. If you find a good seller on a good platform like Amazon or eBay reselling a graphics card, they want too much; and if you find a good deal on a graphics card, there's probably something wrong with it.

    But I can resell my (under three-year-old) gaming PC for about 70%+ of what I put into it, if not more. I say: come try it out, play on it. Here are some pics, etc. Clean, non-smoking home — look inside it. I meet people at my place of business's main office, a hotel lobby, or a library — NOT a parking lot. I also sell the monitor with it if possible. Because all of the parts are in working order, the complete system is worth more than the individual parts alone. Again, who is going to buy a used PSU or a stick of RAM? During COVID, I actually sold my three-year-old build for more than I had in it.

    If a part dies, upgrade. But if you're going to upgrade your CPU/GPU, what do you do with the old parts — resell them, e-waste them, hoard them in a closet? With the best RAM that goes with this CPU, that goes with this motherboard, that works with this GPU and PSU, you lose too much of your investment by upgrading just one part. I would rather sell it all and build new.

    Reply

  • blargh4

    Thanks, this is a useful article.
I'm surprised by how little my 8700K is bottlenecking me at high resolutions, even with current games.

    Reply

  • Dr3ams

    "So, if you're rocking a top-tier GPU like the RTX 4080 or above, or the RX 7900 XTX, but you're running a five or six years old CPU, you're still giving up a lot of performance at 1440p ultra and it's time for an upgrade — or at least, it will be time to upgrade once AMD's Zen 5 and Ryzen 9000 CPUs arrive, unless you want to wait a bit longer for Intel's Arrow Lake CPUs."

    I didn't see any AMD GPUs in that test, so can I safely assume that the above statement is subjective?

    Reply

  • baboma

I'm hoping Jarred would consider doing a similar piece to this, focusing on current-gen mainstream gaming hardware.

    For mainstream users building a new gaming PC, GPU is the first consideration, and $300-500 (or perhaps $600) is the most opted-for range. 1080p & 1440p would be the target. Then the question is what's the most cost-effective CPU to couple with this.

I find the typical gaming benchmarks unhelpful, as they peg CPU performance against the 4090 as a baseline, which is out of reach for mainstream gamers. Using the 4060/4070 (or 76/77/7800) as the base would be more relevant.

    Of course, we have rule-of-thumb to go by. Conventional consensus is that i5/Ryzen 5 suffices for midrange, with i7/Ryzen 7 for high-end (and i9/Ryzen 9 for the money-no-object crowd).

    But it would be good to see this consensus put to the test.

    Reply

  • Unolocogringo

    stonecarver said:

    Thank you JarredWaltonGPU
    A Tom's shoot out with actual meat in the article. :ouimaitre:

    Ditto from me.
    An article that gives good viable information.

    Reply

