Destiny 2 Texture Quality Comparison

Published August 31, 2017 at 12:01 am

As we’ve done in the past for GTA V and Watch_Dogs 2, we’re now taking a look at Destiny 2’s texture resolution settings. Our other recent Destiny 2 content includes our GPU benchmark and CPU benchmark.

All settings other than texture resolution were loaded from the highest preset and left untouched for these screenshots. There are five degrees of quality, but only highest, medium, and lowest are shown here to make differences more obvious. The blanks between can easily be filled in.

Our Destiny 2 GPU benchmark was conducted alongside our CPU benchmark, drawing on the same research. For GPU testing, we found Destiny 2 to be remarkably consistent between multiplayer and campaign performance, scaling all the way down to a 1050 Ti. This held across the campaign, where all levels performed nearly identically, aside from a single level with high geometric complexity and heavy combat. We’ll recap some of that below.

For CPU benchmarking, GN’s Patrick Lathan used this research (starting one hour after the GPU bench began) to begin CPU tests. We ultimately found more test variance between CPUs – particularly at the low-end – when switching between campaign and multiplayer, and so much of this content piece will be dedicated to the research portion behind our Destiny 2 CPU testing. We cannot yet publish this as a definitive “X vs. Y CPU” benchmark, as we don’t have full confidence in the comparative data given Destiny 2’s sometimes nebulous behaviors.

As one example, Destiny 2 doesn’t utilize SMT with Ryzen, producing utilization charts in which every other logical thread sits near idle.
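A rough way to see why that matters: if a game schedules at most one worker per physical core, an aggregate-utilization readout tops out at half on a 2-way SMT part. The sketch below is illustrative arithmetic only (the helper name and numbers are our assumptions, not GN’s tooling):

```python
def aggregate_utilization(busy_threads, physical_cores, smt_ways=2):
    """Estimate the aggregate CPU utilization (%) a task manager reports
    when a game loads at most one thread per physical core, leaving
    SMT sibling threads idle."""
    logical_threads = physical_cores * smt_ways
    active = min(busy_threads, physical_cores)  # SMT siblings stay idle
    return 100.0 * active / logical_threads

# On an 8-core/16-thread Ryzen, fully loading one thread per physical
# core reads as only 50% aggregate utilization:
print(aggregate_utilization(8, 8))  # 50.0
```

This is why a CPU can look "half idle" in monitoring tools while still being the bottleneck in a game that ignores SMT.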

The Destiny 2 beta’s arrival on PC provides a new benchmarking opportunity for GPUs and CPUs, and will allow us to plot performance uplift once the final game ships. Beyond the beta’s popularity, we also want to know whether Bungie, AMD, and nVidia will further improve performance in the final stretch before the official October 24 launch date. For now, we’re conducting an exploratory benchmark of multiplayer versus campaign test patterns for Destiny 2, quality settings, and multiple resolutions.

A few notes before beginning: this is a beta, and everything is subject to change. We’re ultimately testing the beta as it stands, but using that experience to learn how Destiny 2 behaves so that we’re not surprised at release. Some of this testing is to learn about settings’ impact on performance (including some unique behavior between “High” and “Highest”), multiplayer vs. campaign performance, and per-level performance. Note also that drivers will iterate: although nVidia and AMD both recommended their respective drivers for this test (385.41 and 17.8.2), those will likely change for the final release. AMD in particular needs a more Destiny-specific driver, based on our testing, so keep in mind that performance metrics are in flux ahead of launch.

Note also: Our Destiny 2 CPU benchmark will be up not long after this content piece. Keep an eye out for that one.

Blizzard announced in January that Overwatch had surpassed the 25 million player milestone, but despite being nearly a year old, there’s still no standardized way to benchmark the game. We’ve developed our own method instead, which we’re debuting with this GPU optimization guide.

Overwatch is an unusual title for us to benchmark. As a first-person shooter, the priority for many players is on sustained high framerates rather than on overall graphical quality. Although Overwatch isn’t incredibly demanding (original recommended specs were a GTX 660 or a Radeon HD 7950), users with mid-range hardware might have a hard time staying above 60FPS at the highest presets. This Overwatch GPU optimization guide is for those users, with some graphics settings explanations straight from Blizzard to GN.

Benchmarking Mass Effect: Andromeda immediately revealed a few considerations for our finalized testing. Frametimes, for instance, were markedly lower on the first test pass. The game also prides itself on casting players into a variety of environs, including ship interiors, planet surfaces of varying geometric complexity (generally simpler), and space stations with high poly density. Given all these gameplay options, we prefaced our final benchmarking with an extensive study period to research the game’s performance in various areas, then determine which area best represented the whole experience.
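Frametime data like this is typically summarized as average FPS and 1% low FPS (the average of the slowest 1% of frames). A minimal sketch of that math follows; the helper name and percentile convention are our assumptions, not GN’s exact pipeline:

```python
def frametime_metrics(frametimes_ms):
    """Compute (average FPS, 1% low FPS) from a list of frametimes in ms.

    1% lows are derived from the slowest 1% of frames, which is why a
    run can have a healthy average but still feel stuttery."""
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                # slowest 1% (at least one frame)
    low_1pct_fps = 1000.0 / (sum(worst[:n]) / n)
    return avg_fps, low_1pct_fps

# 99 smooth frames at 10ms plus one 20ms hitch:
avg, low = frametime_metrics([10.0] * 99 + [20.0])
print(round(avg, 1), round(low, 1))  # average ~99 FPS, 1% low 50 FPS
```

A single hitch barely moves the average but halves the 1% low, which is why frametime-derived metrics matter for test-pass consistency.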

Our Mass Effect: Andromeda benchmark starts with definitions of settings (like framebuffer format), then goes through research, then the final benchmarks at 4K, 1440p, and 1080p.

Anyone who sticks to one medium for gaming -- PC, Xbox, Playstation, destroyed Switch -- inevitably misses out on some games. For us at GamersNexus, Monster Hunter has long been one of those franchises. Luckily, Phoenix Labs felt the same way, and so created a more platform-favorable co-op, behemoth-slaying RPG called “Dauntless.” Dauntless aims to bring a refreshed, new take on the hunting experience, adding a healthy dash of Dark Souls-inspired combat for the PC platform.

The very existence of humanity is being threatened by aether-fueled behemoths, we’re told, and so you shouldn’t feel bad about eradicating entire families of beasts, Design Director Chris Cleroux informed us. Just murder all of them. They’re all bad.

Thunder Lotus has hit their stride in game making. Their first game, Jotun, made waves largely due to its beautiful hand-drawn visuals. They’ve now embarked on their second title, Sundered, hoping to do the same again. The shift to a more “Metroidvania”-style game benefits from Thunder Lotus’ hand-drawn aesthetic, which carries over into the new action/platformer.

In Sundered, you control Eshe, a survivor in a post-apocalyptic world. Separated from your group by an eldritch sandstorm, Eshe is forced to investigate the powerful arcane forces that shattered the world. Throughout the game, enemies come from two factions: the Valkyries and the Eschatons. The Valkyries were once humanity’s best soldiers, formed to try and stop the cataclysm that shattered the world. The Eschatons, on the other hand, were humans that fell under the sway of those same eldritch powers. Eshe will have to overcome both throughout the game.

Watch Dogs 2 CPU Benchmark - Threads Matter

Published February 20, 2017 at 1:00 pm

With Ryzen around the corner, we wanted to publish a full CPU benchmark of Watch Dogs 2 in our test course, as we’ve recently found the game to be heavily thread-intensive and responsive to CPU changes. The game even posts sizable gains from some overclocks, like on the i5-2500K, and offers a real-world look at when CPU choice matters. It’s easy for the CPU to bottleneck the GPU in Watch Dogs 2, which is something of a unique characteristic among modern games.

Watch Dogs 2 is a familiar title by now at the GN test bench, and while we’ve published a GPU benchmark and a more recent CPU optimization guide, we never published a comprehensive CPU benchmark. We’ve gathered together all our results here, from the 2500K revisit all the way to Kaby Lake reviews (see: 7600K review & 7350K review), and analyzed what exactly makes a CPU work well with Watch Dogs 2 and why.

In this Watch Dogs 2 CPU benchmark, we’ll recap some graphics optimization tips for CPUs and test whether an i7 is worth it, alongside tests of the 7600K, 7700K, 6600K, 7350K, FX-8370, and more.

One interesting aspect of the Watch_Dogs 2 benchmarking we did for our 2500K revisit was the difference in performance between i5s and i7s. At stock speeds, the i7-2600K was easily outpacing the i5-2500K by roughly 15FPS, and even more interestingly, the i7-6700K managed to hit our GTX 1080’s ceiling of 110-115FPS, while the i5-6600K only managed 78.7FPS with the same settings. Watch_Dogs 2 is clearly a game where additional threads are beneficial, making it an exciting test opportunity, as that’s not a common occurrence. We decided to look into settings optimization for CPUs with Watch Dogs 2, and have tested a few of the most obvious graphics settings to see which ones can really help.
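To put that i5/i7 gap in relative terms, a simple percent-uplift calculation makes the thread-scaling claim concrete (the helper below is an illustrative sketch, not GN’s analysis tooling):

```python
def pct_uplift(new_fps, base_fps):
    """Percent FPS uplift of one CPU over another at identical settings."""
    return 100.0 * (new_fps - base_fps) / base_fps

# i7-6700K at the GTX 1080's ~110FPS ceiling vs. the i5-6600K's 78.7FPS:
print(round(pct_uplift(110.0, 78.7), 1))  # ~39.8% more FPS from the extra threads
```

Note the i7’s true uplift could be larger still, since it was GPU-capped at the 1080’s ceiling rather than CPU-limited.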

This Watch Dogs 2 graphics optimization guide focuses on CPU performance, figuring out which settings can be raised (where GPU headroom exists) and which should be lowered (where the CPU is the limit).
