Game Benchmarks

Activision's latest in its seemingly undying shooter franchise launched with fairly simplistic graphics settings, but still has a few items that may raise questions – like Order Independent Transparency and Subsurface Scattering. We cover some of these at a top level in our Black Ops 3 GPU benchmark, found here, but dive deeper in this guide. Setting aside the VRAM and memory difficulties we encountered, the game's heavy LOD scaling and graphics controls allow for scalability across the $130 to $1000 GPU range.

Our Call of Duty: Black Ops 3 optimization guide lays out the best graphics settings for improving FPS, complete with screenshot comparisons of each option. We independently benchmarked all of the game's settings. The screenshots below show texture quality (resolution) comparisons, preset & texture VRAM consumption, FPS performance for each setting, and more. We also define Order Independent Transparency, Volumetric Lighting, Subsurface Shadows, Mesh Quality, Shadow Mapping, and more of Call of Duty's options.

All of these tests were conducted using the patch released on November 7, which contained fixes for some bugs present at launch. The latest nVidia (358.87) and AMD (15.11) drivers were used for testing. More detail can be found in the methodology sections below.

Each setting will be listed by the severity of its impact on FPS; the heaviest hitters come first.

We're in the final throes of our Call of Duty: Black Ops III content before moving on to the next game – you know the one. While processing data for our forthcoming graphics optimization guide, we realized that Black Ops III is among the most VRAM-hungry games we've ever tested, consuming upwards of 10GB GDDR5 on the Titan X.

Our GPU benchmarks included some initial memory testing, showing that the 980 Ti saw full saturation of its 6GB framebuffer at 4K/max settings. We also showed that the game commits 15.2GB of memory under max settings (pageable address space), with active physical consumption of about 6.7GB (working set) in multiplayer. Our testing shows that the singleplayer campaign is far more intensive than multiplayer, to the tune of 38.6% lower FPS on the GPU side.
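As a point of reference for those two figures, "committed" memory is address space the OS has promised to the process (backed by the pagefile), while the "working set" is what actually resides in physical RAM at a given moment. Below is a minimal sketch of how both can be sampled on Windows with Python's psutil library; the process name is a placeholder, and this is our illustration of the general approach, not the tooling behind the numbers above.

import time
import psutil

def sample_memory(process_name="blackops3.exe", samples=10, interval=1.0):
    """Print working set (physical RAM in use) and committed memory
    (pagefile-backed address space) for a running process."""
    # "blackops3.exe" is a placeholder, not confirmed against the
    # game's actual executable name.
    target = next(
        (p for p in psutil.process_iter(["name"])
         if (p.info["name"] or "").lower() == process_name),
        None,
    )
    if target is None:
        raise RuntimeError(f"{process_name} is not running")
    for _ in range(samples):
        info = target.memory_info()
        # On Windows, rss maps to the working set and vms to the commit
        # charge (PagefileUsage): the ~6.7GB and 15.2GB figures above.
        print(f"working set: {info.rss / 2**30:.2f} GiB | "
              f"committed: {info.vms / 2**30:.2f} GiB")
        time.sleep(interval)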

During tests of all of Call of Duty: Black Ops 3's graphics settings, we uncovered VRAM consumption approaching 10GB in campaign mode when using 4K & “Extra” settings.

Call of Duty: Black Ops 3 arrived on PC at midnight, bringing with it high-fidelity graphics that stress PC components – all the way down to the memory. We began benchmarking Call of Duty: Black Ops III immediately, with our first FPS tests focusing on GPU performance alongside some RAM & VRAM insights. More tests are forthcoming, so be sure to follow us for those.

Before jumping into the BLOPS 3 benchmarks, let's explain the game's graphics settings. Until our graphics optimization guide for Call of Duty arrives, this detailed list of game settings should assist in determining which options can be disabled or tuned for a higher framerate.

Update: Our Black Ops III graphics optimization guide is now live for per-setting analysis.

UPDATE: Our launch-day Battlefront GPU benchmarks are now live. Refer here for updated charts.

The PC version of Star Wars: Battlefront was made available through beta channels yesterday and, somewhat surprisingly, the graphics settings and assets appear to be fairly feature-complete. It's possible (even likely) that some final optimizations are in the pipe leading up to launch but, for now, the game's high-resolution, high-LOD assets are a testament to its readiness for benchmarking.

Star Wars: Battlefront fronts some of the most advanced, realistic graphics we've yet seen, rivaling GTA V and The Witcher 3 in intensity and technology. Battlefront makes heavy use of terrain deformation and tessellation to add the appearance of greater depth, smooth terrain elements, and create a landscape that scales impressively well at various view distances.

We deployed a suite of video cards to benchmark Star Wars: Battlefront in an exhaustive test, including SLI GTX 980 Tis, the Sea Hawk 980 Ti, GTX 980, GTX 970, 960s, 950, the 390X, 290X, 270X, and more. This Star Wars: Battlefront benchmark compares the FPS of graphics cards at maximum (ultra), high, and medium settings at 1080p, 1440p, and 4K resolutions.

Disclaimer: This article makes no attempt to comment on gameplay value. We're strictly looking at visuals and framerate performance in Battlefront.

Regardless of how its mechanics pan out, Star Citizen is slated to claim the throne as one of the most graphically intense PC games in recent history. This is something we discussed with CIG's Chris Roberts back when the Kickstarter was still running, diving into the graphics technology and the team's intent to fully utilize all tools available to them.

We've been benchmarking Star Citizen at regular intervals as the game progresses. This progress monitoring comes with a massive disclaimer, though, one we'll revisit shortly: the game isn't finished.

The recent launch of the GTX 980 Ti, R9 Fury X, and AMD 300 series cards almost demands a revisit to Star Citizen's video card performance. This graphics benchmark looks at GPU performance in Star Citizen's 1.1.3 build, testing framerates at various settings and resolutions.

The launch of the Witcher 3 introduced a couple of graphics options that aren't commonly available in settings menus. Photographers may be familiar with the likes of chromatic aberration and vignetting, but few games have offered these items for tweaking in the past.

We recently benchmarked The Witcher 3 for GPU performance and remarked that the game was horridly optimized, taking the opportunity to expand on the graphics settings in a limited fashion. Since that posting, CD Projekt Red has released a new game patch (1.03) that drastically improves PC performance on various video cards; AMD is expected to release a Catalyst 15.5 beta driver update focusing on the Witcher in the near future.

This Witcher 3 optimization guide defines the best graphics settings for improving FPS in the game, seeking to explain each option in greater depth. We independently benchmarked various game settings on a Titan X (to eliminate bottlenecking on the hardware) and recorded a graphics settings comparison video, found below. Although screenshots get some of the job done, a comparison video is critical for a game like The Witcher; CD Projekt Red's newest RPG makes heavy use of temporal filters, which have their greatest impact over time and in motion (something a screenshot can't convey). We'd encourage checking out the video for comparisons of just a few of the many options.

During the GTA V craze, we posted a texture resolution comparison that showcased the drastic change in game visuals from texture settings. The GTA content also revealed VRAM consumption and the texture setting's effectively non-existent impact on framerate. The Witcher 3 has a similar “texture quality” setting in its game graphics options, something we briefly mentioned in our Witcher 3 GPU benchmark.

This Witcher 3 ($60) texture quality comparison shows screenshots at Ultra, High, Normal, and Low settings in 4K resolution. We also measured the maximum VRAM consumption for each setting in the game, hoping to determine whether VRAM-limited cards could benefit from dropping texture quality. Finally, in-game FPS was measured to determine the “cost” of higher quality textures.
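For readers curious how a per-setting VRAM ceiling can be captured on NVIDIA hardware, below is a minimal sketch using Python and the NVML bindings (the nvidia-ml-py package); it's an assumed approach shown for illustration, not the tool used for our measurements.

import time
from pynvml import (
    nvmlInit, nvmlShutdown,
    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo,
)

def peak_vram_gib(duration_s=60.0, interval_s=0.5, gpu_index=0):
    """Poll total GPU memory in use and return the peak, in GiB."""
    nvmlInit()
    try:
        handle = nvmlDeviceGetHandleByIndex(gpu_index)
        peak = 0
        deadline = time.time() + duration_s
        while time.time() < deadline:
            # NVML reports device-wide usage, not just the game's
            # allocations, so background GPU apps will skew the reading.
            peak = max(peak, nvmlDeviceGetMemoryInfo(handle).used)
            time.sleep(interval_s)
        return peak / 2**30
    finally:
        nvmlShutdown()

print(f"peak VRAM used: {peak_vram_gib():.2f} GiB")

Run during a repeatable in-game pass at each texture setting, the peak yields a comparable ceiling figure per setting.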

Benchmarking the Witcher 3 proved more cumbersome than any game we've tested to date. CD Projekt Red's game doesn't front the tremendously overwhelming assortment of options that GTA V does – all of which we tested, by the way – but it was still a time-consuming piece of software to analyze. This is largely due to optimization issues across the board, but we'll dive into that momentarily.

In this Witcher 3 – Wild Hunt PC benchmark, we compare the FPS of graphics cards at varied settings and resolutions (1080p, 1440p, 4K) to uncover achievable framerates. Among others, we tested SLI GTX 980s, a Titan X, GTX 960s, last-gen cards, and AMD's R9 290X, 285, and 270X. Game settings were tweaked per our methodology (below) for the fairest comparison, primarily checking FPS at 1080p (ultra, medium, low), 1440p (ultra, medium), and 4K (ultra, medium).

That's a big matrix.

Let's get started.

With a week of benchmarking behind us, we've now tested most major aspects of Rockstar's new GTA V PC release. We've elected to adopt the game into our test methodology for future component reviews, given its wide performance demands and load balancing between the CPU and GPU. This final GTA V benchmark looks at CPU bottlenecking at various resolutions and settings; we pit the 3570K, 4790K, FX-8320E, FX-8370E, FX-9590, G3258, and Athlon 760K against one another.

The selection casts a wide net across core counts and price points, hopefully illustrating where CPU bottlenecks may appear when playing GTA V.

GTA V shipped alongside an onslaught of graphics settings – none of which offer tool-tips – that can vastly impact the fluidity of gameplay. In our recent and comprehensive GTA V benchmark, we tested multiple video cards for FPS at simple “max” and “high” settings, fluctuating resolution between 1080p, 1440p, and 4K along the way. That content now behind us, we took the opportunity to objectively benchmark various graphics settings for performance differences, then took a few screenshots for comparison of those settings.

This GTA V optimization guide assists in choosing the best graphics settings for frame-limited video cards, explaining the options along the way.
