Steve started GamersNexus back when it was just a cool name, and it has since grown into an expansive website with an overwhelming number of features. He recalls his first difficult decision about GN's direction: "I didn't know whether or not I wanted 'Gamers' to have a possessive apostrophe -- I mean, grammatically it should, but I didn't like it in the name. It was ugly. I also had people who were typing apostrophes into the address bar - sigh. It made sense to just leave it as 'Gamers.'"
First world problems, Steve. First world problems.
The Witcher 3's bombastic launch included bonus, spillover fanfare surrounding the use of nVidia's GameWorks middleware in Project Cars. AMD spewed fire, telling Ars that GameWorks “completely sabotaged” AMD's performance, further stating “it's wrecked our performance, almost as if it [were] put in to achieve that goal.” This implication of an nVidia-branded torpedo to AMD's performance garnered attention on Reddit and other social networks, following a week of similar postings related to Project Cars. We decided to do some of our own research.
During the GTA V craze, we posted a texture resolution comparison that showcased the drastic change in game visuals from texture settings. The GTA content also revealed VRAM consumption and the texture setting's effectively non-existent impact on framerate. The Witcher 3 has a similar “texture quality” setting in its game graphics options, something we briefly mentioned in our Witcher 3 GPU benchmark.
This Witcher 3 ($60) texture quality comparison shows screenshots with settings at Ultra, High, Normal, and Low using a 4K resolution. We also measured the maximum VRAM consumption for each setting in the game, hoping to determine whether VRAM-limited devices could benefit from dropping texture quality. Finally, in-game FPS was measured as a means to determine the “cost” of higher quality textures.
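To illustrate the kind of per-setting analysis described above, here's a minimal sketch of finding peak VRAM consumption from polled samples and checking it against a card's VRAM budget. The sample figures below are purely illustrative placeholders, not GN's measured data, and the helper names are our own.

```python
# Minimal sketch: peak-VRAM-per-setting analysis. Sample logs below are
# illustrative numbers only, NOT measured Witcher 3 data.

def peak_vram_mb(samples):
    """Maximum VRAM reading (MB) seen across a polled session."""
    return max(samples)

def fits_budget(samples, budget_mb):
    """True if a card with budget_mb of VRAM is never exceeded at this setting."""
    return peak_vram_mb(samples) <= budget_mb

# Hypothetical per-setting logs (MB), e.g. polled once per second in-game:
logs = {
    "Ultra":  [2900, 3100, 3350],
    "High":   [2600, 2750, 2800],
    "Normal": [2300, 2400, 2450],
    "Low":    [2000, 2100, 2150],
}

for setting, samples in logs.items():
    peak = peak_vram_mb(samples)
    ok = fits_budget(samples, 3072)
    print(f"{setting}: peak {peak} MB, fits a 3 GB card: {ok}")
```

With numbers like these, a VRAM-limited card would clear every setting except Ultra, which is exactly the sort of conclusion the real measurements are meant to support.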
Not one recent triple-A PC title has launched without its share of crashing, flickering, mouse acceleration / smoothing, or other issues. In our time benchmarking the Witcher 3's PC performance, we encountered a couple of resolvable issues pertaining to the game's stability.
Benchmarking the Witcher 3 proved more cumbersome than any game we've benchmarked before. CD Projekt Red's game doesn't present the overwhelming assortment of options that GTA V does – all of which we tested, by the way – but it was still a time-consuming piece of software to analyze. This is largely due to optimization issues across the board, but we'll dive into that momentarily.
In this Witcher 3 – Wild Hunt PC benchmark, we compare the FPS of graphics cards at varied settings (1080p, 1440p, 4K) to uncover achievable framerates. Among others, we tested SLI GTX 980s, a Titan X, GTX 960s, last-gen cards, and AMD's R9 290X, 285, and 270X. Game settings were tweaked per our methodology (below) for the fairest comparison, but we primarily checked FPS at 1080p (ultra, medium, low), 1440p (ultra, medium), and 4K (ultra, medium).
That's a big matrix.
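For a rough sense of scale, the per-card combinations listed above can be enumerated with a few lines of code. This is only a sketch of the test matrix as stated in the article (resolutions paired with the presets actually run at each, not a full cross product):

```python
# Per-card test matrix as described: each resolution with its tested presets.
matrix = {
    "1080p": ["Ultra", "Medium", "Low"],
    "1440p": ["Ultra", "Medium"],
    "4K":    ["Ultra", "Medium"],
}

runs = [(res, preset) for res, presets in matrix.items() for preset in presets]
print(len(runs), "configurations per card")  # 7 configurations per card
```

Seven configurations per card, multiplied across every GPU on the bench, is where the time goes.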
Let's get started.
We normally upload a “benchmark course” video to the site's YouTube channel, dedicated to showcasing just a small part of our extensive testing methodology for each game benchmarked. The previous title we tested was GTA V, and with the launch of the Witcher 3 – Wild Hunt, focus shifts to CD Projekt Red's new game.
This video walks through our benchmark course used during video card benchmarking for the Witcher, an exhaustive process that seeks to uncover the best graphics cards at various settings. We've only got four video cards left in tonight's test, which will be followed up immediately by an article with FPS charts and other data. More performance articles will follow shortly thereafter.
In the meantime, take a look at the game's high-fidelity graphics at 4K resolution: