Steve Burke

Steve started GamersNexus back when it was just a cool name, and now it's grown into an expansive website with an overwhelming number of features. He recalls his first difficult decision with GN's direction: "I didn't know whether or not I wanted 'Gamers' to have a possessive apostrophe -- I mean, grammatically it should, but I didn't like it in the name. It was ugly. I also had people who were typing apostrophes into the address bar - sigh. It made sense to just leave it as 'Gamers.'"

First world problems, Steve. First world problems.

Everyone talks a big game about not caring about power consumption. We took that comment to the extreme, using a registry hack to give Vega 56 enough extra power to kill the card, if we wanted, and a Floe 360mm CLC to keep temperatures low enough that GPU diode reporting inaccuracies emerge. “I don’t care about power consumption, I just want performance” is now met with exactly that: 100% more power and an overclock to 1742MHz core. We've got room to go to 200% power, but things would start popping at that point. The Vega 56 Hybrid mod is the most heavily modified entry in our Hybrid series to date, and leverages powerplay table registry changes to provide that additional power headroom. This is an alternative to BIOS flashing, which is limited to signed BIOS images (like V64 on V56, though we had issues flashing the V64 Liquid BIOS onto V56). Last we attempted it, a modified BIOS did not work. Powerplay tables do work, though, and let us raise the power target beyond V56’s artificial power limitation.

The limitation on power provisioned to the V56 core is, we believe, purely to prevent V56 from too easily outmatching V64 in performance. The card’s BIOS won’t natively allow more than 300-308W down the PCIe cables, even though official BIOS versions for V64 cards can support 350-360W. The VRM itself easily sustains 360W, and we’ve tested it as handling 406W without a FET popping. 400W is probably pushing what’s reasonable, but limiting V56 to ~300W, when an additional 60W is fully within the capabilities of the VRM and GPU, is a means of capping V56 performance so that it doesn’t compete with V64.

We fixed that.
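For readers curious about the mechanics, below is a minimal sketch of how a soft powerplay table override lands in the Windows registry. The display-adapter class GUID and the "PP_PhmSoftPowerPlayTable" value name are the commonly cited locations for this style of mod; the adapter index ("0000") and the table blob (prepared separately in a table editor) are placeholders here, so verify both against your own system before writing anything.

```python
# Minimal sketch (Windows-only, requires administrator rights): write a modified
# soft powerplay table into the registry so the AMD driver reads it at load,
# in place of the BIOS-provided table. The class GUID and value name below are
# the commonly cited locations for this mod; the adapter index ("0000") and the
# blob file name are placeholders -- confirm the correct adapter key and prepare
# the table in a dedicated editor before writing anything.
import winreg

ADAPTER_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
               r"\{4d36e968-e325-11ce-bfc1-08002be10318}\0000")  # placeholder index

def write_soft_powerplay_table(blob_path: str) -> None:
    """Write the prepared powerplay table blob as a REG_BINARY value."""
    with open(blob_path, "rb") as f:
        blob = f.read()
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ADAPTER_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "PP_PhmSoftPowerPlayTable", 0,
                          winreg.REG_BINARY, blob)

if __name__ == "__main__":
    write_soft_powerplay_table("modified_table.bin")  # hypothetical file name
```

A driver restart or reboot is typically needed before the modified power limits take effect.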

AMD’s CU scaling has never been that impactful to performance; clock speed closes most gaps with AMD hardware. Even without the extra shaders of V64, we can outperform V64’s stock performance, and we’ll soon find out how we do versus V64’s overclocked performance. That will have to wait until after PAX, but it’s something we’re hoping to study further.

Our Destiny 2 GPU benchmark was conducted alongside our CPU benchmark, and much of what we learned from that GPU research carried over to CPU testing. For GPU testing, we found Destiny 2 to be remarkably consistent between multiplayer and campaign performance, scaling all the way down to a 1050 Ti. This held true across the campaign, which performed largely identically in all levels aside from a single level with high geometric complexity and heavy combat. We’ll recap some of that below.

For CPU benchmarking, GN’s Patrick Lathan used this research (starting one hour after the GPU bench began) to begin CPU tests. We ultimately found more test variance between CPUs – particularly at the low-end – when switching between campaign and multiplayer, and so much of this content piece will be dedicated to the research portion behind our Destiny 2 CPU testing. We cannot yet publish this as a definitive “X vs. Y CPU” benchmark, as we don’t have full confidence in the comparative data given Destiny 2’s sometimes nebulous behaviors.
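To illustrate how that run-to-run variance gets quantified, here is a minimal sketch, assuming per-frame frametime logs (in milliseconds) from any capture tool; the function names and sample numbers are placeholders, not our actual test data.

```python
# Minimal sketch of summarizing repeated benchmark passes: average FPS, an
# approximate 1% low (mean FPS of the slowest 1% of frames), and a run-to-run
# coefficient of variation between passes. All numbers below are placeholder
# data, not measured Destiny 2 results.
import statistics

def summarize_pass(frametimes_ms):
    """Return (average FPS, approximate 1% low FPS) for one benchmark pass."""
    avg_fps = statistics.mean(1000.0 / ft for ft in frametimes_ms)
    slowest = sorted(frametimes_ms, reverse=True)
    worst_1pct = slowest[: max(1, len(slowest) // 100)]
    low_1pct_fps = statistics.mean(1000.0 / ft for ft in worst_1pct)
    return avg_fps, low_1pct_fps

def run_to_run_variation(avg_fps_per_pass):
    """Percent coefficient of variation across a set of passes."""
    return 100.0 * statistics.stdev(avg_fps_per_pass) / statistics.mean(avg_fps_per_pass)

if __name__ == "__main__":
    passes = [[16.6, 17.1, 15.9, 18.4], [16.9, 17.3, 16.2, 18.0]]  # placeholder frametimes
    averages = [summarize_pass(p)[0] for p in passes]
    print(f"run-to-run variation: {run_to_run_variation(averages):.2f}%")
```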

For instance, Destiny 2 doesn’t utilize SMT with Ryzen, producing utilization charts like this:

The Destiny 2 beta’s arrival on PC provides a new benchmarking opportunity for GPUs and CPUs, and will allow us to plot performance uplift once the final game ships. Aside from the beta’s popularity, we also want to know whether Bungie, AMD, and nVidia work to further improve performance in the final stretch before the official October 24 launch date. For now, we’re conducting an exploratory benchmark of Destiny 2’s multiplayer versus campaign test patterns, quality settings, and multiple resolutions.

A few notes before beginning: This is a beta, first off, and everything is subject to change. We’re ultimately testing this as it pertains to the beta, but using that experience to learn more about how Destiny 2 behaves so that we’re not surprised at its release. Some of this testing is to learn about the impact of settings on performance (including some unique behavior between “High” and “Highest”), multiplayer vs. campaign performance, and per-level performance. Note also that drivers will iterate and will likely change for the final release, although nVidia and AMD both recommended their respective drivers for this test (385.41 and 17.8.2). AMD in particular is in need of a more Destiny-specific driver, based on our testing, so keep in mind that performance metrics are in flux for the final launch.

Note also: Our Destiny 2 CPU benchmark will be up not long after this content piece. Keep an eye out for that one.

We’re revisiting an old topic. A few years ago, we posted an article entitled “How Many Watts Does a Gaming PC Really Need,” which focused on testing multiple configurations for power consumption. We started working on this revisit last week, using a soon-to-be-released Bronze 450W PSU as a baseline, seeing as we’ve recently advocated for more 400-450W PSUs in PC builds. We'll be able to share more about this PSU (and its creator and name) soon. This content piece shows how far we can get on lower wattage PSUs with modern hardware.

Although we’ll be showing an overclocked 7700K + GTX 1080 FTW as the high-end configuration, we’d recommend going higher than 450W for that particular setup. It is possible to run on 450W, but the continuous load on the PSU starts reaching a point of driving up noise levels (from the PSU fan) and abusing the power supply. There’s also insufficient headroom for 100% GPU / 100% CPU workloads, though those should be uncommon for most of our audience. Most of the forum builds we see host PSUs in the 700-800W+ range, which is overkill for most modern gaming PCs. You’d want the higher capacity for something like Threadripper or X299, for instance, but those are HEDT platforms. For gaming platforms, power requirements largely stop around 600W, sans serious overclocking, and most systems can get by on less than that.
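As a rough illustration of that sizing logic, here is a back-of-envelope sketch; the per-component wattage figures and the 80% target load are generic placeholder assumptions, not our measured numbers, and a measured wall draw should replace them where available.

```python
# Back-of-envelope PSU sizing sketch: sum estimated component draw, then size the
# unit so the estimated gaming load sits at roughly 80% of capacity instead of
# pinning the PSU near 100% continuous load. All wattage figures are placeholder
# estimates, not measured values for the test bench described above.
ESTIMATED_DRAW_W = {
    "cpu_7700k_overclocked": 110,      # placeholder gaming-load estimate
    "gpu_gtx_1080_ftw": 200,           # placeholder gaming-load estimate
    "motherboard_ram_storage_fans": 60,
}

def recommended_capacity(draw_w: dict, target_load: float = 0.8) -> float:
    """Suggest a PSU capacity that keeps the estimated load near target_load."""
    return sum(draw_w.values()) / target_load

if __name__ == "__main__":
    total = sum(ESTIMATED_DRAW_W.values())
    print(f"estimated load: {total} W")
    print(f"suggested capacity: {recommended_capacity(ESTIMATED_DRAW_W):.0f} W")
```

With those placeholder numbers, the estimate lands just above 450W, which tracks with the "possible, but tight" result described above.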

Ask GN serves as a consistent format in our video series, now 57 episodes strong. We are still filming at a pace of roughly one Ask GN episode every 1-2 weeks, so if you’ve got questions, be sure to submit them in the YT comments section.

This week’s episode gives a brief break from the deeper overclocking, undervolting, and benchmarking topics of late. We briefly visit Pascal temperature response observations and a user’s chipped GPU (plus our own tech battle scars), then talk monitor overclocking. The article referenced during the monitor OC section can be found here.
