Author: Xaltar
Subject: Asrock X99E-ITX/ac 64GB Ram
Posted: 16 Oct 2016 at 2:25pm
In many cases this holds true. In most instances where a CPU is found to be bottlenecking a GPU, it is a dual core, an old quad core (FX 4k/Phenom/Core2Quad) or an ultra low power CPU (AM1, SoC, Intel U series). There are some cases where games need a lot of CPU grunt for things like AI, physics and predictive algorithms, and almost all of those cases are MMOs (typically FPS titles). In rare cases a CPU overclock or a more powerful CPU will have an impact on gameplay, mostly reducing stuttering but also improving FPS when all cores were at or near 100% load on the slower chip. All of these situations, however, come down to the way those games are optimized, NOT insufficient CPU grunt.

In games like these the focus is the target audience: letting as many people as possible play, on the broadest possible hardware spectrum. This means many of these games do not fully utilize the capabilities of a modern GPU; if the guy on an iGPU can't use a feature, it gets handed off to the CPU for compatibility. I wouldn't call that poor or lazy optimization, it is quite deliberate, and in every situation where the CPU proves to be a bottleneck it can be negated by lowering certain visual settings. Basically, the game should play smoothly at its lowest settings on its minimum requirement hardware, which tends to be rather low spec. The problem is that while everything works great at low settings, the higher tier settings push the CPU load beyond what a non-MMO title would require, and then we start seeing bottlenecks in large battles with 10 - 50 players and the like (see the toy sketch below).
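To put a rough shape on that, here is a toy sketch (plain Python, with all names and numbers invented for illustration, not taken from any real engine): per-entity work like AI and animation grows linearly with player count, but pairwise work like collision and hit detection grows with its square, which is why a 50 player battle can hammer the CPU far harder than five 10 player skirmishes while the GPU is doing little more than drawing extra character models.

```python
import time

def frame_cpu_work(num_players):
    """Stand-in for the per-frame CPU work a big battle generates:
    one pass of per-entity updates (AI, animation) plus pairwise
    interaction checks (collision, hit detection). The pairwise
    part grows with the square of the player count."""
    total = 0
    for i in range(num_players):
        total += i                    # per-entity update: linear
        for j in range(i + 1, num_players):
            total += i ^ j            # pairwise check: quadratic
    return total

for players in (10, 25, 50):
    t0 = time.perf_counter()
    for _ in range(1000):             # 1000 simulated frames
        frame_cpu_work(players)
    per_frame_us = (time.perf_counter() - t0) * 1e6 / 1000
    print(f"{players:>2} players: ~{per_frame_us:.1f} us of CPU work per frame")
```

Going from 10 to 50 players is a 5x increase in bodies on screen but roughly a 25x increase in the pairwise checks, so cores pinned at 100% with an underutilized GPU is exactly what you would expect.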
GPU benchmarks are designed to stress the GPU and push it to its limits. Everything is scripted, so the CPU does not have a whole lot to do. A benchmark has no need for user input, interaction or random movement, and those are exactly the elements that load the CPU most in games (the sketch below contrasts the two).
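A toy contrast of the two loop styles (again, every name here is invented for illustration; no real benchmark or engine is being quoted): the scripted run just replays a pre-baked camera path, while the live run has to react to input and re-decide NPC behaviour every single frame.

```python
PRE_BAKED_PATH = [(f * 0.1, 20.0) for f in range(600)]  # recorded flythrough

def benchmark_frame(frame_no):
    """Scripted benchmark: everything the CPU needs was decided in
    advance, so preparing a frame is a cheap table lookup before the
    draw calls are handed to the GPU."""
    return PRE_BAKED_PATH[frame_no % len(PRE_BAKED_PATH)]

def gameplay_frame(player_pos, stick_input, npcs):
    """Live gameplay: the CPU must react to input and re-decide what
    every NPC does, every frame, because nothing is known in advance."""
    player_pos = (player_pos[0] + stick_input[0],
                  player_pos[1] + stick_input[1])
    # simple 'chase the player' AI -- the kind of per-entity decision
    # making a scripted benchmark never has to run
    npcs = [
        (nx + (player_pos[0] > nx) - (player_pos[0] < nx),
         ny + (player_pos[1] > ny) - (player_pos[1] < ny))
        for nx, ny in npcs
    ]
    return player_pos, npcs
```

That is why a system can post great benchmark numbers and still stutter in a real battle: the benchmark only measured the half of the frame the GPU is responsible for.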