The Witcher 3 Video Card Benchmark - Poor Software Optimization by CDPR

Posted on May 19, 2015

Benchmarking The Witcher 3 proved more cumbersome than any game we've benchmarked before. CD Projekt Red's game doesn't front the overwhelming assortment of options that GTA V does – all of which we tested, by the way – but it was still a time-consuming piece of software to analyze. This is largely due to optimization issues across the board, but we'll dive into that momentarily.

In this Witcher 3 – Wild Hunt PC benchmark, we compare the FPS of graphics cards at varied settings (1080p, 1440p, 4K) to uncover achievable framerates. Among others, we tested SLI GTX 980s, a Titan X, GTX 960s, last-gen cards, and AMD's R9 290X, 285, and 270X. Game settings were controlled per our test methodology (below) for the fairest comparison, primarily checking FPS at 1080p (ultra, medium, low), 1440p (ultra, medium), and 4K (ultra, medium).

That's a big matrix.
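
To put a rough number on it, here's a minimal sketch (Python, purely illustrative) that enumerates the combinations; in practice a few cards were excluded from some resolutions – see the 1440p notes below – so the real pass count was somewhat lower.

```python
# Illustrative sketch of the test matrix: each (resolution, preset) pair is
# run on every card, three 30-second passes each. Card and preset lists
# mirror the article; exclusions (e.g., 1440p on the 290X) are ignored here.
from itertools import product

RESOLUTION_PRESETS = {
    "1080p": ["ultra", "medium", "low"],
    "1440p": ["ultra", "medium"],
    "4K":    ["ultra", "medium"],
}
CARDS = ["SLI GTX 980", "Titan X", "GTX 980", "GTX 780", "GTX 770",
         "GTX 750 Ti", "GTX 960 4GB", "GTX 960 2GB",
         "R9 290X", "R9 285", "R9 270X", "R7 250X"]
PASSES = 3

runs = [(res, preset, card, n)
        for res, presets in RESOLUTION_PRESETS.items()
        for preset in presets
        for card, n in product(CARDS, range(PASSES))]

print(len(runs), "passes")  # 7 configs x 12 cards x 3 passes = 252
```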

Let's get started.

Briefly Explaining The Witcher 3's Settings

Note: Our Witcher 3 Graphics Optimization guide is now out!

First, note that this article makes no attempt to comment on gameplay value or the worthiness of the game beyond its raw performance output.

[Screenshots: The Witcher 3 graphics settings menu]

[Screenshots: The Witcher 3 post-processing settings menu]

The Witcher 3's settings feel relatively tame in the face of other recent releases, but still require demystification. The options are split into two tabs: Graphics and post-processing. Graphics contains the core, familiar game settings; post-processing contains filter effects that are applied to the rendered output.

We'll skip the obvious ones and keep this at a top-level for now, but check back shortly for more information on the game's settings.

NOTE: We just got our Witcher 3 texture quality comparison article online, complete with VRAM & FPS measurements against quality settings. Read more here.

NVidia HairWorks: Part of nVidia's GameWorks SDK that's provided to game developers. HairWorks is built entirely for nVidia devices and should be disabled on all AMD devices. We disabled it completely for all testing to eliminate test variance, regardless of which manufacturer's card was presently on the bench. HairWorks draws additional strands of hair and adds a fluidity of motion that assists in realism. We discuss this more in an older article about the technology, found here. HairWorks enacts one of the largest hits to performance out of all game settings, even on nVidia hardware.

Number of Background Characters: Pretty simple. This slider defines a limit on non-essential actors rendered to the screen. We'll validate this in our next test run, but our current hypothesis is that this setting impacts non-character actors as well, like some unessential wildlife.

Terrain Quality: There's no direct mention of tessellation or deformation of terrain elements, but terrain quality does impact the detail of hills and other terrain elements. More on this as we continue to dissect the game.

Water Quality: In our test section, this didn't seem to impact performance too heavily – other settings produced greater, more noticeable FPS hits. When the rendered output is more heavily dominated by water, framerate will show greater dips at the higher water qualities defined here.

Grass Density: We learned with GTA V that grass density can have a profound impact on game performance. In the case of The Witcher 3, it's not quite as bad, but still noteworthy. When immersed in prairies and grassy countrysides, framerate hits can emerge from higher settings here.

Texture Quality: The resolution of the game's textures. Texture quality contributes to VRAM consumption notably.

Foliage Visibility Range: This setting dealt somewhat large hits to performance as it increased. The setting governs the view distance of grass and unessential shrubs / plantlife that juts up from the terrain. Lowering this means that foliage will "pop" into view as the visibility threshold is met.

Detail Level: The LOD of objects and game elements. Objects will lose sharpness, texture (as in grit, not as in actual textures), and apparent depth as this is decreased.

Bloom: Bloom's been around forever in games. Bloom is a shader effect that's prevalent when light bleeds into environments; an example would be light cast through a window or the "shimmering" seen above an open flame.

Ambient Occlusion: We set this to "SSAO" for all testing. HBAO+ is an nVidia-enabled AO technology that produces a slight performance hit to hardware, but is optimized for nVidia GPUs. Ambient Occlusion, at a top-level, governs how ambient light is occluded by nearby geometry, producing soft shadowing in crevices, corners, and where objects meet.

Depth of Field: DOF, as in the photography term. Depth of Field layering produces a “bokeh” (blurred background) effect surrounding the in-focus object or character.

Chromatic Aberration: Also found in photography. Chromatic Aberration doesn't necessarily make a lot of real-world sense in The Witcher, as it is primarily used to describe a camera lens' inability to bring all EM wavelengths (colors) into focus at the same point. In-game, it manifests as subtle color fringing around objects.

Vignetting: The final photography term in The Witcher 3's settings. A photographic vignette applies a soft shadow or darkness around the edges of an image.

Light Shafts: Also known as "God Rays," which are cast down from the sun through gaps in tree coverage or other obstructing items.

The Witcher 3 at Max Graphics Settings (4K) – Benchmark Course

The above video shows our benchmarking course used for testing video cards. This is a site-defined, in-game course that encounters water, particle FX, lighting and bloom (conducted at 4AM game time – and real time, for that matter), and a regular stream of NPCs and complex objects.

Witcher 3 Minimum Requirements

  • CPU: Intel i5-2500K or AMD Phenom II X4 940
  • GPU: GTX 660 or HD 7870
  • RAM: 6GB
  • OS: Windows 7 or Windows 8.1 64-bit
  • DirectX 11
  • Storage: 40GB

Witcher 3 Recommended System Specs

  • CPU: i7-3770 or AMD FX-8350
  • GPU: GTX 770 or R9 290
  • RAM: 8GB
  • OS: Windows 7 or Windows 8.1 64-bit
  • DirectX 11
  • Storage: 40GB

Test Methodology

We tested using our updated 2015 GPU test bench, detailed in the table below. Our thanks to supporting hardware vendors for supplying some of the test components.

The latest 352.86 GeForce driver was used during testing. AMD's 15.4 Catalyst Beta was attempted and worked for some cards, but was eventually abandoned for 14.2 stable drivers (250X, 285 struggled on 15.4). Game settings were manually controlled for the DUT. Overclocking was neither applied nor tested, though stock overclocks (“superclocks”) were left untouched.

VRAM utilization was measured using in-game tools and then validated with MSI Afterburner, which is built on the RivaTuner software. Parity checking was performed with GPU-Z. FPS measurements were taken using FRAPS and then analyzed in a spreadsheet.

Each test consisted of a 30-second loop through an identical scenario on all cards, repeated two more times for parity.

GN Test Bench 2015

Component      Name                              Courtesy Of                Cost
Video Card     GTX Titan X 12GB                  NVIDIA, PNY, EVGA, ASUS,   Ranges
               GTX 980 Reference                 ZOTAC, AMD, MSI
               SLI GTX 980s Gaming 4G
               GTX 780
               GTX 770
               GTX 750 Ti
               EVGA GTX 960 SuperSC 4GB
               ASUS GTX 960 Strix 2GB
               AMD R9 290X
               AMD R9 285
               AMD R9 270X
               AMD R7 250X
CPU            Intel i7-4790K                    CyberPower                 $340
Memory         32GB 2133MHz HyperX Savage RAM    Kingston Tech.             $300
Motherboard    Gigabyte Z97X Gaming G1           GamersNexus                $285
Power Supply   NZXT 1200W HALE90 V2              NZXT                       $300
SSD            HyperX Predator PCI-e SSD         Kingston Tech.             TBD
Case           Top Deck Tech Station             GamersNexus                $250
CPU Cooler     Be Quiet! Dark Rock 3             Be Quiet!                  ~$60

Average FPS, 1% low, and 0.1% low framerates are measured. We do not measure maximum or minimum FPS results, as we consider these numbers to be pure outliers. Instead, we take an average of the lowest 1% of results (1% low) to show real-world, noticeable dips; we then take an average of the lowest 0.1% of results (0.1% low) for severe dips.
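
For the curious, here's a minimal sketch of that calculation in Python. The input format – one frame time in milliseconds per line – is illustrative; FRAPS' own frametimes log stores cumulative timestamps that would need to be diffed first.

```python
# Sketch: derive average FPS, 1% low, and 0.1% low from per-frame times (ms).

def low_averages(frametimes_ms):
    """Return (average FPS, 1% low FPS, 0.1% low FPS)."""
    fps = sorted(1000.0 / ft for ft in frametimes_ms)  # per-frame FPS, ascending

    def low(fraction):
        n = max(1, int(len(fps) * fraction))  # the slowest n frames
        return sum(fps[:n]) / n               # average of the lowest results

    return sum(fps) / len(fps), low(0.01), low(0.001)

# Assumed file format: one frame time (ms) per line.
with open("frametimes.txt") as f:
    times = [float(line) for line in f if line.strip()]

avg, low1, low01 = low_averages(times)
print(f"AVG {avg:.1f} FPS | 1% low {low1:.1f} | 0.1% low {low01:.1f}")
```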

Anti-Aliasing was left completely disabled for this testing. We saw moderate FPS impact with AA -- not as much as one would expect -- but elected to disable the tech to stick with our past methodologies. The on/off charts will be shown in our next post.

Proprietary technologies were avoided when they did not improve performance even on the host manufacturer's hardware. SSAO was used as the AO technology (HBAO+ is nVidia-specific and also creates a slight performance hit); HairWorks was entirely disabled (big FPS hit, too).

The game was restarted after every graphics settings change – and it must be, as performance suffers a slight hit if the game is not manually restarted.
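
For anyone replicating this, the procedure looks roughly like the loop below. This is a hedged sketch, not our actual harness: the paths, file names, and settings-swap mechanic are all placeholders.

```python
# Sketch: restart the game between every settings change. All paths and the
# per-preset settings files are hypothetical placeholders.
import shutil
import subprocess
import time

GAME_EXE = r"C:\Games\The Witcher 3\bin\x64\witcher3.exe"    # placeholder path
SETTINGS_FILE = r"C:\Users\bench\Documents\The Witcher 3\user.settings"

def run_pass(preset_file, course_seconds=30, load_seconds=60):
    shutil.copy(preset_file, SETTINGS_FILE)   # swap in a pre-made settings file
    game = subprocess.Popen([GAME_EXE])
    time.sleep(load_seconds + course_seconds) # load the save, run the course
    game.kill()                               # full restart before the next pass

for preset in ["ultra.settings", "medium.settings", "low.settings"]:
    for _ in range(3):                        # three passes per config for parity
        run_pass(preset)
```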

Foreword: AMD's Drivers Are A Mess. We Did Our Best to Make It Work.

AMD's drivers set us back several hours tonight, but we fought to get everything functional to balance the testing as evenly as possible, given available hardware. First note that, as of this test, AMD has not yet provided game-ready drivers for The Witcher 3. This means our results will likely see drastic improvement as soon as AMD pushes a driver update. We will revisit the AMD device benchmark as soon as that update is available to show the cards in the most true-to-world light. For now, we used the available Catalyst beta drivers (15.4) and fell back to the stable 14.2 drivers when the beta failed for some cards.

Despite using a completely clean image, the driver installs often had trouble detecting devices and exhibited other oddities – like turning the display into purely monochromatic output. We've had success with the newest Catalyst Omega beta in the past, but it's still not perfectly stable. Investing a few hours of extra effort, we built a completely driver-clean image (one that has never had any nVidia or AMD drivers) strictly for testing The Witcher 3. We eventually found relative stability and were able to test the game, but ran into other issues attributable to CD Projekt Red. For instance...

The Witcher is Also Buggy: 1440p and Other Issues

A 1440p option wasn't present for some devices. Our reference GTX 980 and our AMD R9 290X could not select 2560x1440 without workarounds; after digging through the game's configuration files, I believe 1440p support could be forced through game files or external software. For the time being, we found a workaround with the GTX 980: By installing an SLI configuration and then disabling SLI (running a single GTX 980), we were able to conjure a 2560x1440 resolution option in the game's graphics settings. Odd, but it worked. A standalone GTX 980 did not present this option.

We were unable to find a solution for the R9 290X, and so it went untested at 1440p.
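
For readers who want to experiment, the config-file route might look like the snippet below. The Witcher 3 keeps display options in a plain-text user.settings file under Documents, but the key name and value format here are assumptions from our digging, not documented behavior – back the file up before trying anything.

```python
# Hypothetical sketch: force 2560x1440 by rewriting the resolution entry in
# user.settings. The Resolution="WxH" key format is an assumption, not a
# documented setting.
import re
from pathlib import Path

settings = Path.home() / "Documents" / "The Witcher 3" / "user.settings"
text = settings.read_text()

patched = re.sub(r'Resolution="\d+x\d+"', 'Resolution="2560x1440"', text)
settings.write_text(patched)
```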

Witcher 3 4K Benchmark – SLI GTX 980s vs. Titan X, 290X, 285, More

[Chart: The Witcher 3 4K benchmark – ultra settings]

At 4K, we were only able to test the GTX 980, Titan X, and R9 290X. This test was conducted using a native 4K display and did not rely on DSR or VSR scaling.

The GTX 980s in SLI produced the best performance, predictably outperforming the single Titan X. It's still not exactly “playable” at ultra settings with 4K resolution, but we're getting there. Let's look at medium.

[Chart: The Witcher 3 4K benchmark – medium settings]

Using medium settings, a slight FPS gain emerges and pushes SLI GTX 980s just barely into a playable spectrum – but dips in combat or certain game areas may force still lower settings.

Witcher 3 1440p Benchmark with Reservations

[Chart: The Witcher 3 1440p benchmark – ultra settings]

If you skimmed past the part where we talk about 1440p above, it's worth a quick read. We did what we could here, but 1440p refused to appear as an option on some tested devices. This is a game software issue.

At 1440p, the SLI GTX 980s are more than capable of running the game in excess of 60FPS at ultra settings. The Titan X is just barely within range. A single GTX 980 or R9 285 would not be sufficient for 1440p / ultra settings.

1440p medium shows substantially improved performance. Remember that we hacked the single GTX 980 to run by disabling SLI in an SLI config, but were unable to find a means to run 1440p on a 290X. The chart is below:

[Chart: The Witcher 3 1440p benchmark – medium settings]

Witcher 3 1080p Ultra PC Benchmark

[Chart: The Witcher 3 1080p benchmark – ultra settings]

AMD runs the middle of the pack with its not-yet-driver-optimized R9 290X and R9 285, though neither is particularly playable at ultra / 1080p. In fact, almost none of these devices are. We saw a playable average FPS with the GTX 980, but its 1% low and 0.1% low FPS were abysmal. The dips are jarring and noticeable enough that, were I to play the game, I'd be forced to drop the settings.

Even the Titan X and SLI 980s don't show particularly compelling high-percentile metrics, falling far below what we normally see in other games. The R9 290X is almost within “playable” range for average FPS on ultra, but isn't quite there; both AMD and nVidia are equal opportunity sufferers when it comes to 1% and 0.1% lows on this configuration.

The Witcher 3 1080p Medium & Low Graphics Card Benchmarks – 270X, 960, 770, etc.

[Chart: The Witcher 3 1080p benchmark – medium settings]

1080p at medium offers some reassurance. The R9 290X is well within playable range, though it has poor 0.1% low performance (something we'd imagine would be fixed with driver tuning, but can't be sure). The single GTX 980 hums along without issue, and even last gen's GTX 780 does fairly well, with greater floor values than in the previous suite.

The GTX 960s do well, too, with no clear victor between the 4GB and 2GB models (though this isn't always the case). The R9 270X isn't quite there for 1080p / medium performance on the current drivers, but let's see how it does on low:

[Chart: The Witcher 3 1080p benchmark – low settings]

Not much better. The disparity between “low” and “medium” isn't quite as huge as “medium” and “ultra” (we skipped “high”), and that shows. The 270X is just barely playable with additional tuning.

Conclusion: A Poorly Optimized Launch

Some of the game's settings are tanking the 1% and 0.1% low FPS ranges. The lowest 1% and 0.1% dips in our FPS charts above are bad enough that the game visibly stutters at times, producing a hiccup-like motion as we move across the map. It's not experience-ruining, for the most part, but is noticeable on occasion to the experienced PC gamer. The settings are something we're actively investigating for the next article, due for publication shortly. Bugs with 1440p availability are somewhat confusing and make it impossible to bench cards that could otherwise prove worthy at the mid-range resolution.

It takes serious hardware to play The Witcher 3 right now. The game doesn't scale remotely as well as GTA V – a title that plays shockingly well on some IGPs – and it remains to be seen whether further driver updates or game patches will resolve this. To play the game at medium / 1080p with a 60+ FPS average, you'd need either an R9 290X from the AMD camp or a GTX 970 from the nVidia camp; 1080p / ultra seems hard even on a GTX 980. And, yes, we do not presently have a GTX 970 available for benchmarking – that's something we're looking to remedy. We're also missing an R9 280X despite requests, but hope to add one of those (or a predecessor) in the near future.

VRAM consumption never exceeded 2.7GB during our testing, which is noteworthy -- many of the games we've benchmarked lately (ACU, FC4, GTA V) have exceeded 4GB when the memory is available to the system. System RAM consumption hovered around 2.1GB.
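
As a side note for anyone wanting to spot-check VRAM numbers on nVidia hardware, the snippet below polls nvidia-smi for a peak reading. We used in-game tools, Afterburner, and GPU-Z for the figures above; this is merely an alternative sketch.

```python
# Sketch: sample GPU memory use once per second and report the peak (nVidia only).
import subprocess
import time

def vram_used_mib():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"])
    return int(out.decode().strip().splitlines()[0])  # first GPU only

peak = 0
for _ in range(60):            # sample for one minute while the game runs
    peak = max(peak, vram_used_mib())
    time.sleep(1)

print(f"Peak VRAM observed: {peak} MiB")
```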

In the meantime, stay tuned to our YouTube, Twitter, or Facebook accounts for updates as we investigate optimization pathways.

Be sure to check out our texture quality comparison and its impact on VRAM / FPS once you're done here. Check out our Graphics Optimization Guide for optimal settings in the Witcher 3.

- Steve “Lelldorianx” Burke.