Game Benchmarks

Our Destiny 2 GPU benchmark highlighted massive performance uplift vs. beta on some devices, upwards of 50% on Vega, but was conducted in largely GPU-constrained scenarios. For this content piece, we’ll be exploring the opposite: CPU-constrained scenarios to benchmark Destiny 2 performance on AMD Ryzen and Intel Kaby/Coffee Lake parts, including the R7 1700, R5 1600X, R3 1200, and i7-7700K, i5-7600K, i3-8350K, and G4560.

Most of our test notes have already been recapped in the GPU benchmark, and won’t be fully repeated. Again, we ran a wide spread of tests during the beta, which will be informing our analysis for the Destiny 2 launch benchmarks. Find the previous content below:

As stated in the video intro, this benchmark contains some cool data that was exciting to work with. We don’t normally accumulate enough data to run historical trend plots across various driver or game revisions, but our initial Destiny 2 pre-launch benchmarks enabled us to compare that data against the game’s official launch. Bridging our pre-launch beta benchmarks with similar testing methods for the Destiny 2 PC launch, including driver changes, makes it easier to attribute performance deviations to CPU, driver, or game-code optimizations.
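To make that cross-version comparison concrete, here is a minimal sketch of the uplift calculation behind such trend plots; the FPS figures are hypothetical placeholders, not our measured results:

```python
# Percent uplift from beta to launch, per GPU (hypothetical numbers).
beta = {"Vega 56": 72.0, "GTX 1070": 98.0}
launch = {"Vega 56": 108.0, "GTX 1070": 104.0}

for gpu in beta:
    uplift = (launch[gpu] / beta[gpu] - 1) * 100
    print(f"{gpu}: {uplift:+.1f}% vs. beta")
```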

Recapping the previous tests, we already ran a wide suite of Destiny 2 benchmarks that included performance scaling tests in PvP multiplayer, campaign/co-op multiplayer, and various levels/worlds in the game. Find some of that content below:

NOTE: Our Destiny 2 CPU benchmark is now live.

Some of our original graphics optimization work also carried forward, allowing us to better pinpoint Depth of Field on Highest as one of the major culprits behind AMD’s performance deficit. This has changed somewhat with launch, as you’ll find below.

We’re sticking with FXAA for testing. Bungie ended up removing MSAA entirely, as the technique had been buggy since the beta, and left only SMAA and FXAA in its place.

Destiny 2 Texture Quality Comparison

Published August 31, 2017 at 12:01 am

As we’ve done in the past for GTA V and Watch_Dogs 2, we’re now taking a look at Destiny 2’s texture resolution settings. Our other recent Destiny 2 content includes our GPU benchmark and CPU benchmark.

All settings other than texture resolution were loaded from the highest preset and left untouched for these screenshots. There are five degrees of quality, but only highest, medium, and lowest are shown here to make differences more obvious. The blanks between can easily be filled in.

UPDATE: We have run new CPU benchmarks for the launch of this game. Please view the Destiny 2 launch CPU benchmarks here.

Our Destiny 2 GPU and CPU benchmarks were conducted alongside one another, with the CPU tests drawing on many of the same learnings from our GPU research. For GPU testing, we found Destiny 2 to be remarkably consistent between multiplayer and campaign performance, scaling all the way down to a 1050 Ti. This remained true across the campaign, which performed largely identically across all levels, aside from a single level with high geometric complexity and heavy combat. We’ll recap some of that below.

For CPU benchmarking, GN’s Patrick Lathan used this research (starting one hour after the GPU bench began) to conduct the CPU tests. We ultimately found more test variance between CPUs – particularly at the low-end – when switching between campaign and multiplayer, so much of this content piece is dedicated to the research behind our Destiny 2 CPU testing. We cannot yet publish this as a definitive “X vs. Y CPU” benchmark, as we don’t have full confidence in the comparative data given Destiny 2’s sometimes nebulous behaviors.
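Quantifying that run-to-run variance is straightforward; the sketch below shows the idea with hypothetical pass results rather than our actual data:

```python
import statistics

# Hypothetical average-FPS results from repeated passes of the same test.
campaign_passes = [112.4, 113.1, 111.8, 112.9]
multiplayer_passes = [104.2, 96.8, 110.5, 99.3]

for name, passes in (("campaign", campaign_passes),
                     ("multiplayer", multiplayer_passes)):
    mean = statistics.mean(passes)
    stdev = statistics.stdev(passes)
    print(f"{name}: mean={mean:.1f} FPS, stdev={stdev:.2f} "
          f"({stdev / mean * 100:.1f}% of mean)")
```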

For instance, Destiny 2 doesn’t utilize SMT with Ryzen, producing utilization charts where the logical SMT threads sit largely idle while the physical cores carry the load.
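If you want to observe this yourself, the sketch below is one rough way to sample per-logical-core utilization during a test pass. It assumes the third-party psutil library, and the even/odd mapping of physical cores to SMT siblings is a topology assumption that varies by OS:

```python
import psutil  # third-party: pip install psutil

# Sample per-logical-core utilization over one second while the game runs.
# If SMT goes unused, roughly half the logical cores should read near zero.
samples = psutil.cpu_percent(interval=1.0, percpu=True)
for core, pct in enumerate(samples):
    sibling = "SMT sibling" if core % 2 else "physical"  # topology-dependent
    print(f"logical core {core:2d} ({sibling}): {pct:5.1f}%")
```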

UPDATE: We have run benchmarks of the launch version of Destiny 2. Please view the launch Destiny 2 GPU benchmarks here.

The Destiny 2 beta’s arrival on PC provides a new benchmarking opportunity for GPUs and CPUs, and will allow us to plot performance uplift once the final game ships. Aside from gauging the popular beta itself, we also want to know whether Bungie, AMD, and nVidia will work to further improve performance in the final stretch before the official October 24 launch date. For now, we’re conducting an exploratory benchmark of multiplayer versus campaign test patterns for Destiny 2, quality settings, and multiple resolutions.

A few notes before beginning: This is a beta, first off, and everything is subject to change. We’re ultimately testing this as it pertains to the beta, but using that experience to learn more about how Destiny 2 behaves so that we’re not surprised at its release. Some of this testing is to learn about settings’ impact on performance (including some unique behavior between “High” and “Highest”), multiplayer vs. campaign performance, and level performance. Note also that drivers will iterate and will likely change for the final release, although nVidia and AMD both recommended their respective drivers for this test (385.41 and 17.8.2). AMD in particular is in need of a more Destiny-specific driver, based on our testing, so keep in mind that performance metrics are in flux for the final launch.

Note also: Our Destiny 2 CPU benchmark will be up not long after this content piece. Keep an eye out for that one.

Blizzard announced in January that Overwatch had surpassed the 25 million player milestone, but despite the game being nearly a year old, there’s still no standardized way to benchmark it. We’ve developed our own method instead, which we’re debuting with this GPU optimization guide.

Overwatch is an unusual title for us to benchmark. As a first-person shooter, the priority for many players is sustained high framerates rather than overall graphical quality. Although Overwatch isn’t incredibly demanding (original recommended specs were a GTX 660 or a Radeon HD 7950), users with mid-range hardware might have a hard time staying above 60FPS at the highest presets. This Overwatch GPU optimization guide is for those users, with some graphics settings explanations straight from Blizzard to GN.

Benchmarking Mass Effect: Andromeda immediately revealed a few considerations for our finalized testing. Frametimes, for instance, were markedly lower on the first test pass. The game also prides itself on casting players into a variety of environs, including ship interiors, planet surfaces of varying geometric complexity (generally simpler), and space stations with high poly density. Given all these gameplay options, we prefaced our final benchmarking with an extensive study period to research the game’s performance in various areas, then determined which area best represented the whole experience.
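Because the first pass deviates from subsequent runs, a common remedy (an illustration on our part, not a description of our exact tooling) is to treat pass one as a warm-up and average the remainder:

```python
# Treat the first pass as warm-up; average the rest (hypothetical numbers).
passes = [102.3, 96.5, 96.1, 96.8]  # pass 1 deviates from the others
usable = passes[1:]
print(f"mean of measured passes: {sum(usable) / len(usable):.1f} FPS")
```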

Our Mass Effect: Andromeda benchmark starts with definitions of settings (like framebuffer format), then goes through research, then the final benchmarks at 4K, 1440p, and 1080p.

Watch Dogs 2 CPU Benchmark - Threads Matter

Published February 20, 2017 at 1:00 pm

With Ryzen around the corner, we wanted to publish a full CPU benchmark of Watch Dogs 2 in our test course, as we’ve recently found the game to be heavily thread-intensive and responsive to CPU changes. The game even posts sizable gains from some overclocks, like on the i5-2500K, and establishes a real-world example of when CPU choice matters. It’s easy for the CPU to bottleneck the GPU in Watch Dogs 2, which is something of a unique characteristic for modern games.

Watch Dogs 2 is a familiar title by now at the GN test bench, and while we’ve published a GPU benchmark and a more recent CPU optimization guide, we never published a comprehensive CPU benchmark. We’ve gathered together all our results here, from the 2500K revisit all the way to Kaby Lake reviews (see: 7600K review & 7350K review), and analyzed what exactly makes a CPU work well with Watch Dogs 2 and why.

In this Watch Dogs 2 CPU benchmark, we’ll recap some graphics optimization tips for CPUs and test whether an i7 is worth it, alongside tests of the 7600K, 7700K, 6600K, 7350K, FX-8370, and more.

One interesting aspect of the Watch_Dogs 2 benchmarking we did for our 2500K revisit was the difference in performance between i5s and i7s. At stock speeds, the i7-2600K was easily outpacing the i5-2500K by roughly 15 FPS—and even more interestingly, the i7-6700K managed to hit our GTX 1080’s ceiling of 110-115 FPS, while the i5-6600K only managed 78.7FPS with the same settings. Watch_Dogs 2 is clearly a game where the additional threads are beneficial, making it an exciting test opportunity as that’s not a common occurrence. We decided to look into settings optimization for CPUs with Watch Dogs 2, and have tested a few of the most obvious graphics settings to see which ones can really help.
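As a quick sanity check on those figures, using the midpoint of the quoted ceiling range:

```python
# Rough scaling math from the figures quoted above.
i5_6600k = 78.7             # measured average FPS
i7_6700k = (110 + 115) / 2  # midpoint of the GPU-ceiling range

uplift = (i7_6700k / i5_6600k - 1) * 100
print(f"i7-6700K over i5-6600K: ~{uplift:.0f}%")  # roughly 43%
```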

This Watch Dogs 2 graphics optimization guide focuses on CPU performance, trying to figure out which settings can be increased when GPU headroom exists and which should be decreased when CPU-limited.

Before even getting started here, let’s put out the obvious disclaimer. This GPU benchmark is for the beta version of For Honor, which means a few things: (1) the game’s not final yet and, despite launch being just two weeks away, there are still some graphics settings missing from the menu; (2) nVidia’s current drivers are optimized for the beta, but the company plans another update at some point soon for further optimizations; (3) AMD has not yet released drivers for the game; we did ask for early access and were told that the company won’t be ready until launch day, though day-0 drivers are planned.

Regardless, we tested anyway to see how the beta performs and get a baseline understanding of what we should expect overall from the new multiplayer brawler title. For Honor thus far has proven impressively detailed in geometry and texturing (especially texturing), and deserves high marks for the art department. Granted, that generally means more abuse on the video card or CPU (for the complex geometric draw calls), so we’ve got some For Honor graphics settings scaling tests as well.

This graphics card benchmark tests For Honor’s performance at 4K, 1440p, and 1080p with Extreme settings. We tested using a real, in-game test course rather than the built-in benchmark, as the latter generally makes performance look a lot worse than it is in reality (we have a chart demonstrating this). Settings scaling was tested from low to extreme, as were multiplayer and ‘singleplayer’ (bot match). We primarily ran For Honor benchmarks with the AMD RX 480 8GB & 4GB, RX 470 4GB, RX 460 2GB, & 390X cards vs. the GTX 1080, 1070, 1060 6GB & 3GB, 1050 & 1050 Ti, and 970 AIB partner cards.
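For reference on how averages and 1% lows are derived from a test pass, below is a minimal sketch using one common definition (the slowest 1% of frametimes). The file name and the MsBetweenPresents column follow PresentMon conventions and are assumptions here, not GN’s internal tooling:

```python
import csv

# Average FPS and 1% low from a frametime log. "frametimes.csv" and the
# MsBetweenPresents column (PresentMon's naming) are assumptions here.
with open("frametimes.csv", newline="") as f:
    ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

avg_fps = 1000.0 * len(ms) / sum(ms)
worst = sorted(ms, reverse=True)[:max(1, len(ms) // 100)]  # slowest 1%
low_1pct = 1000.0 * len(worst) / sum(worst)

print(f"AVG: {avg_fps:.1f} FPS | 1% LOW: {low_1pct:.1f} FPS")
```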
