Game Benchmarks

Blizzard announced in January that Overwatch had surpassed the 25 million player milestone, but despite being nearly a year old, there’s still no standardized way to benchmark the game. We’ve developed our own method instead, which we’re debuting with this GPU optimization guide.

Overwatch is an unusual title for us to benchmark. As a first-person shooter, it places the priority for many players on sustained high framerates rather than on overall graphical quality. Although Overwatch isn’t incredibly demanding (the original recommended specs were a GTX 660 or a Radeon HD 7950), users with mid-range hardware might have a hard time staying above 60 FPS at the highest presets. This Overwatch GPU optimization guide is for those users, with some graphics settings explanations straight from Blizzard to GN.

Benchmarking Mass Effect: Andromeda immediately revealed a few considerations for our finalized testing. Frametimes, for instance, were markedly lower on the first test pass than on subsequent passes. The game also prides itself on casting players into a variety of environs, including ship interiors, planet surfaces of varying geometric complexity (generally simpler), and space stations with high poly density. Given all these gameplay options, we prefaced our final benchmarking with an extensive study period to research the game’s performance in various areas, then determined which area best represented the whole experience.

Our Mass Effect: Andromeda benchmark starts with definitions of settings (like framebuffer format), then works through that research, and finishes with the final benchmarks at 4K, 1440p, and 1080p.

Watch Dogs 2 CPU Benchmark - Threads Matter


With Ryzen around the corner, we wanted to publish a full CPU benchmark of Watch Dogs 2 using our test course, as we’ve recently found the game to be heavily thread-intensive and responsive to CPU changes. The game even posts sizable gains from some overclocks, like on the i5-2500K, and serves as a real-world example of when CPU choice matters. It’s easy for the CPU to bottleneck the GPU in Watch Dogs 2, which is something of a unique characteristic among modern games.

Watch Dogs 2 is a familiar title by now at the GN test bench, and while we’ve published a GPU benchmark and a more recent CPU optimization guide, we never published a comprehensive CPU benchmark. We’ve gathered together all our results here, from the 2500K revisit all the way to Kaby Lake reviews (see: 7600K review & 7350K review), and analyzed what exactly makes a CPU work well with Watch Dogs 2 and why.

In this Watch Dogs 2 CPU benchmark, we’ll recap some graphics optimization tips for CPUs and test whether an i7 is worth it, alongside tests of the 7600K, 7700K, 6600K, 7350K, FX-8370, and more.

One interesting aspect of the Watch Dogs 2 benchmarking we did for our 2500K revisit was the difference in performance between i5s and i7s. At stock speeds, the i7-2600K easily outpaced the i5-2500K by roughly 15 FPS. Even more interestingly, the i7-6700K managed to hit our GTX 1080’s ceiling of 110-115 FPS, while the i5-6600K only managed 78.7 FPS with the same settings. Watch Dogs 2 is clearly a game where the additional threads are beneficial, making it an exciting test opportunity, as that’s not a common occurrence. We decided to look into settings optimization for CPUs with Watch Dogs 2, and have tested a few of the most obvious graphics settings to see which ones can really help.

This Watch Dogs 2 graphics optimization guide focuses on CPU performance, aiming to figure out which settings can be increased (using spare GPU headroom) and which should be decreased (to relieve CPU limits).
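As a rough, hypothetical illustration of that sorting logic (not our actual tooling), the sketch below classifies a test pass as CPU-bound or GPU-bound from its average framerate and logged GPU utilization; the ~95% utilization threshold, the function name, and the example utilization figures are assumptions made purely for the example.

```python
# Hypothetical helper for sorting test passes into CPU-bound vs. GPU-bound;
# the ~95% utilization threshold is a common rule of thumb, not a hard rule.
def classify_bottleneck(avg_fps, avg_gpu_util_pct, gpu_ceiling_fps):
    """Guess the limiting component for a single benchmark pass.

    avg_fps          - average framerate for the pass
    avg_gpu_util_pct - average GPU utilization logged during the pass
    gpu_ceiling_fps  - framerate this GPU reaches when paired with a top-end CPU
    """
    if avg_gpu_util_pct >= 95 or avg_fps >= 0.97 * gpu_ceiling_fps:
        # The GPU is saturated: lowering GPU-heavy settings (or resolution)
        # is what buys frames here.
        return "GPU-bound"
    # The GPU has headroom but frames still aren't coming any faster:
    # the CPU (draw calls, game logic) is the limiter, so CPU-heavy
    # settings are the ones worth reducing.
    return "CPU-bound"

# Example, loosely based on the Skylake numbers above with a GTX 1080
# (the utilization percentages are illustrative, not measured):
print(classify_bottleneck(78.7, 70, 113))   # CPU-bound (i5-6600K-like case)
print(classify_bottleneck(113.0, 98, 113))  # GPU-bound (i7-6700K-like case)
```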

Before even getting started here, let’s put out the obvious disclaimer. This GPU benchmark is for the beta version of For Honor, which means a few things: (1) the game’s not final yet and, despite launch being just two weeks away, there are still some graphics settings missing from the menu; (2) nVidia’s current drivers are optimized for the beta, but the company plans another update at some point soon for further optimizations; (3) AMD has not yet released drivers for the game, though we did ask for early access and were told that the company won’t be ready until launch day; day-0 drivers are planned from AMD.

We tested anyway to see how the beta performs and to get a baseline understanding of what we should expect overall from the new multiplayer brawler. For Honor has thus far proven impressively detailed in geometry and texturing (especially texturing), and deserves high marks for the art department. Granted, that generally means more abuse on the video card or CPU (for the complex geometric draw calls), so we’ve got some For Honor graphics settings scaling tests as well.

This graphics card benchmark tests For Honor’s performance at 4K, 1440p, and 1080p with Extreme settings. We tested using a real, in-game scenario rather than the built-in benchmark, which generally makes performance look a lot worse than it is in actual gameplay (we have a chart demonstrating this). Settings scaling was tested from low to extreme, as was multiplayer versus ‘singleplayer’ (bot match). We primarily ran For Honor benchmarks with the AMD RX 480 8GB & 4GB, RX 470 4GB, RX 460 2GB, and 390X cards vs. the GTX 1080, 1070, 1060 6GB & 3GB, 1050 & 1050 Ti, and 970 AIB partner cards.

Ubisoft's newest dystopian effort starts strong with allusions to modern-day challenges pertaining to privacy and "cyber warfare," working to build up our character as a counter-culture hacker. And, as with Ubisoft's other AAA titles, the game builds this world with high-resolution textures, geometrically complex and dense objects, taxing shadow/lighting systems, and an emphasis on graphics quality.

Watch Dogs 2 is a demanding title to run on modern hardware. We spent the first 1-2 hours of our time in Watch Dogs 2 simply studying the impact of various settings on performance, then studying locales and their performance hits. Areas with grass and foliage, we found, hit framerate the hardest. Nightfall and dark rain play a role in FPS hits, too, particularly when running high reflection quality and headlight shadows.

We look at the performance of 11 GPUs in this Watch Dogs 2 video card benchmark, including the RX 480 vs. GTX 1060, GTX 1070, GTX 1080, RX 470, R9 Fury X, and more.

We've been through Battlefield 1 a few times now. First were the GPU benchmarks, then the HBAO vs. SSAO benchmark, then the CPU benchmark. This time it's RAM, and the methodology remains mostly the same. Note that these results are not comparable to previous results because (1) the game has received updates, (2) the memory spec has changed for this test, and (3) we have updated our graphics drivers. The test platforms and memory are the variables for this test, with the rest remaining similar to what we've done in the past; all of that is defined in the methodology below.

Our CPU benchmark had us changing frequencies between test platforms as we tried to determine our test patterns, methodology, and bench specs for the endeavor. During that exploratory process, we noticed that memory speeds of 3200MHz appeared noticeably faster in informal testing than speeds of, say, 2400MHz. That was just done by eye, though; it wasn't an official benchmark, so we wanted to dedicate a separate piece to the question.

This content benchmarks memory performance in Battlefield 1, focusing on RAM speed (e.g. 1600MHz, 1866, 2133, 2400, and so forth) and capacity. We hope to answer whether 8GB is "enough" and find a sweet spot for price-performance in memory selection.

This benchmark took a while to complete. We first started benchmarking CPUs with Battlefield 1 just after our GPU content was published, but ran into questions that took some back-and-forth with AMD to sort out. Some of that conversation will be recapped here.

Our Battlefield 1 CPU benchmark is finally complete. We tested most of the active Skylake suite (i7-6700K down to i3-6300), the FX-8370, -8320E, and some Athlon CPUs. Unfortunately, we ran out of activations before getting to Haswell or last-gen CPUs, but those may be visited at some point in the future. Our next goal is to look into the impact of memory speed on BF1 performance, or determine if there is any at all.

Back on track, though: today's feature piece determines at what point a CPU begins to bottleneck performance elsewhere in the system when playing Battlefield 1. Our previous two content pieces related to Battlefield 1 are linked below:

The goal of this content is to show that choosing between HBAO and SSAO has a negligible impact on Battlefield 1 performance. This benchmark arose following our Battlefield 1 GPU performance analysis, which demonstrated consistent frametimes and frame delivery on both AMD and nVidia devices when using DirectX 11. Two of our YouTube commenters asked if HBAO would create a performance swing favoring nVidia over AMD and, although we've discussed this topic with several games in the past, we decided to revisit it for Battlefield 1. This time, we'll also spend a bit of time defining what ambient occlusion actually is and how screen-space occlusion relies on information strictly within the z-buffer, and then look at the performance cost of HBAO in BF1.

We'd also recommend our previous graphics technology deep-dive for folks who want a more technical explanation of what's going on with various AO technologies. Portions of this new article also appear in that deep-dive.
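For readers who want a concrete picture of what "relies strictly on the z-buffer" means, here is a minimal, illustrative sketch of screen-space occlusion written in Python/NumPy; the random sampling kernel, radius, and bias values are simplified assumptions and are not how Frostbite's SSAO or nVidia's HBAO are actually implemented.

```python
import numpy as np

def ssao_factor(depth, x, y, radius=4, samples=16, bias=0.002):
    """Estimate ambient occlusion for pixel (x, y) using only the depth buffer.

    `depth` is a 2D array of normalized scene depths (the z-buffer). Screen-space
    AO needs no scene geometry: it just compares each pixel's depth against
    nearby depths to estimate how 'blocked' that point is.
    """
    h, w = depth.shape
    rng = np.random.default_rng(0)
    occluded = 0
    for _ in range(samples):
        # Pick a random offset within the sampling radius (in pixels).
        dx, dy = rng.integers(-radius, radius + 1, size=2)
        sx, sy = np.clip(x + dx, 0, w - 1), np.clip(y + dy, 0, h - 1)
        # A neighbor noticeably closer to the camera counts as an occluder.
        if depth[sy, sx] < depth[y, x] - bias:
            occluded += 1
    # 1.0 = fully lit; lower values darken the pixel in the lighting pass.
    return 1.0 - occluded / samples

# Example: a flat floor (constant depth) has nothing occluding it.
flat = np.full((64, 64), 0.5)
print(ssao_factor(flat, 32, 32))  # -> 1.0
```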

Battlefield 1 marks the arrival of another title with DirectX 12 support – sort of. The game still supports DirectX 11, and thus Windows 7 and 8, but makes efforts to shift DICE and EA toward the new world of low-level APIs. This move comes at a bit of a cost, though; our testing of Battlefield 1 has uncovered some frametime variance issues on both nVidia and AMD devices, resolvable by reverting to DirectX 11. We'll explore that in this content.
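To put "frametime variance" in concrete terms, the sketch below shows one common way to reduce a logged frametime trace (milliseconds per frame, as produced by tools like PresentMon) to average and 1% / 0.1% low framerates; it is an illustration of the metric, not our exact analysis pipeline.

```python
import numpy as np

def frametime_metrics(frametimes_ms):
    """Summarize a frametime log (milliseconds per frame) into FPS metrics.

    Consistent frame delivery means the slowest frames (the 1% / 0.1% lows)
    stay close to the average; large gaps show up to the player as stutter.
    """
    ft = np.sort(np.asarray(frametimes_ms, dtype=float))[::-1]  # slowest first
    worst_1pct = ft[: max(1, len(ft) // 100)]     # slowest 1% of frames
    worst_01pct = ft[: max(1, len(ft) // 1000)]   # slowest 0.1% of frames
    return {
        "avg_fps": 1000.0 / ft.mean(),
        "1%_low_fps": 1000.0 / worst_1pct.mean(),
        "0.1%_low_fps": 1000.0 / worst_01pct.mean(),
    }

# Example: a mostly smooth run with a few long frames (stutter spikes).
log = [16.7] * 980 + [40.0] * 20
print(frametime_metrics(log))
```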

In today's Battlefield 1 benchmark, we're strictly looking at GPU performance using DirectX 12 and DirectX 11, including the recent RX 400 series, GTX 10 series, GTX 900 series, and R9 300 series GPUs. Video cards tested include the RX 480, RX 470, RX 460, 390X, and Fury X from AMD and the GTX 1080, 1070, 1060, 970, and 960 from nVidia. We've got a couple of others in there, too. We may separately look at CPU performance, but not today.

This BF1 benchmark bears with it extensive testing methodology, as always, which is fully detailed within the methodology section below. Please check that section for any questions about drivers, test tools, measurement methodology, or GPU choices. Note also that, as with all Origin titles, we were limited to five device changes per game code per day (24 hours). We have three codes, which allowed us up to 15 total device tests within our test period.
