Vega 64 vs. Vega 56 Clock-for-Clock Shader Comparison (Gaming)

By Steve Burke. Published September 13, 2017 at 5:24 pm

AMD’s architecture hasn’t generally shown a large gain from increasing CU count between the top-tier and second-to-top cards. The Fury and Fury X, for instance, could be made to match with an overclock on the lower-tiered card. Additional gains on the higher-tiered card often come from the increased power limit and clock, not from the straight shader increase. We’re putting that knowledge to the test on the Vega architecture, equalizing the Vega 56 & Vega 64 core clocks (and HBM2 clocks, at 945MHz) to determine how much of a difference emerges from V64’s 4096 shaders versus V56’s 3584 shaders. Purely counting shaders, that’s a 14% increase for V64, but as with most such specs, that won’t translate into a linear performance increase.
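For reference, the 14% figure is straight arithmetic on the published shader counts. A quick, purely illustrative sketch:

```python
# Shader counts from the spec sheets:
v64_shaders = 4096
v56_shaders = 3584

# Relative increase of V64 over V56:
increase_pct = (v64_shaders / v56_shaders - 1) * 100
print(f"V64 carries {increase_pct:.1f}% more shaders than V56")  # ~14.3%
```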

We were able to crush Vega 64’s performance with our heavily modded Vega 56 card, using powerplay tables and liquid cooling to jump to 1742MHz clock speeds. That's with modding, though, and isn't out-of-box performance -- it also doesn't tell us anything about shader differences. By backing off the overclock and limiting both cards to matched speeds, we can isolate the shader count difference.

Challenges with Shader Testing

The biggest issue with this test is clock-matching the GPUs. AMD has implemented its own version of nVidia’s GPU Boost 3.0 for Vega: Clocks bounce around based on power headroom, voltage headroom, and temperature headroom. For users who never touch overclocking, this hypothetically extracts the most from the card with the least effort, and so is a hypothetical net positive; unfortunately, it also makes overclocking more of a pain, because there’s now much more work involved in determining whether the clocks have actually moved. If the GPU occasionally downclocks to accommodate some limited parameter, we need more careful measurement to catch that change. This means logging during benchmarking, then checking both frequency over the time axis and average frequency. Significant swings in either direction could invalidate a shader comparison. As a result, our margin for error is wider with this test than usual: We’re looking at roughly a +/- 2.5% tolerance when combining test variance and clock variance. We cannot account for shader increase benefits that fall within this window of tolerance.
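As a rough sketch of what that validation looks like, the check boils down to averaging each card's logged core clock over the active test window and confirming the delta sits well inside the tolerance. The log format and function names below are illustrative, not our actual tooling:

```python
# Minimal sketch of the clock-matching check, assuming each log is a
# list of core clock samples (MHz) already trimmed to the active test
# window. Names and structure are illustrative, not our actual tooling.

TOLERANCE = 0.025  # combined test variance + clock variance (+/- 2.5%)

def average_clock(samples_mhz):
    return sum(samples_mhz) / len(samples_mhz)

def clocks_matched(v56_log, v64_log, tolerance=TOLERANCE):
    """Return (matched, delta) for the two cards' averaged clocks."""
    v56_avg = average_clock(v56_log)
    v64_avg = average_clock(v64_log)
    delta = abs(v64_avg - v56_avg) / min(v56_avg, v64_avg)
    return delta < tolerance, delta

# e.g., with the FireStrike averages reported later in this article:
matched, delta = clocks_matched([1583.5], [1585.6])
print(matched, f"{delta * 100:.2f}%")  # True, 0.13%
```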

We ended up clocking to around 1590MHz, though that frequency bounced around depending on the game. The only critical factor was keeping both cards matched. On both the Vega 56 and Vega 64 cards, we applied power offsets of 80% with powerplay tables, just to ensure there was no power limit, then set Vcore to 1.2V. HBM2 speeds were set to 945MHz on each card, meaning we’ve brought V56 up to V64 memory speeds. We did not encounter thermal or power limits during these tests.
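For reference, here is the equalized configuration collected in one place -- a simple summary structure with our own labels, not driver tooling:

```python
# Equalized settings applied to both cards (labels are our own):
matched_config = {
    "core_target_mhz": 1590,  # actual clock bounced per game; verified via logging
    "hbm2_mhz": 945,          # V56 raised to match V64 memory speed
    "vcore_v": 1.2,
    "power_offset_pct": 80,   # via powerplay tables, to rule out power limits
}
```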

We ended up retesting this three times, thanks to the new clock behavior in Vega (each game received a minimum of 4-6 passes per set, most running 6, repeated for each of the three sets). For the final two sets, we logged frequency for every test and every game, then checked averaged frequencies and frequency over time for each test. This allowed us to better determine whether the clocks were actually roughly matched, or whether one card had more boosting headroom. The first two sets were invalidated due to unexpected clock behavior differences between the two cards, leaving us with the third and final set for our data. The Vega 64 card often needed to be set to around 1632MHz to achieve 1590MHz in some games, for example, which can make things confusing. Other games, like Ashes, would boost higher than requested. The only solution was logging.

We ultimately ended up with a clock delta of about 2-10MHz for most tests. Ashes ranged 2-15MHz at times, but smoothed out toward the end of the test pass. Keep in mind that some of our frequency vs. time charts are not fully aligned, and idle periods will plot much higher frequencies (1600+) because we’ve disabled the lower DPM states. The only relevant portion of the vs. time charts is when the test is active; we do not average idle or load-screen frequencies. We are also only showing the first test pass (generally around 200s worth of testing), not the entire test, as we’d have to align a lot more data for visualization, and the result is ultimately the same.
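To illustrate the idle filtering described above: with the lower DPM states disabled, idle and load-screen samples report inflated clocks at near-zero GPU load, so only samples taken under load count toward the average. The log layout and load threshold here are hypothetical:

```python
# Sketch of excluding idle/load-screen samples before averaging.
# Each sample is (time_s, core_mhz, gpu_load_pct); layout is hypothetical.

def active_samples(log, load_threshold=90):
    """Keep (time, clock) samples where GPU load indicates an active pass."""
    return [(t, mhz) for (t, mhz, load) in log if load >= load_threshold]

log = [
    (0.0, 1620.0, 2),    # load screen: inflated clock at near-zero load
    (1.0, 1588.3, 98),   # active test pass
    (2.0, 1590.1, 99),
]

active = active_samples(log)
avg = sum(mhz for _, mhz in active) / len(active)
print(f"average active clock: {avg:.1f} MHz")  # ~1589.2, idle sample excluded
```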

We will only be focusing on gaming today. Shaders may behave or engage differently in compute or production applications, where we’d anticipate potential for more of a swing than what we ultimately saw in gaming.

Test Platform

GN Test Bench 2017 | Name | Courtesy Of | Cost
Video Card | This is what we're testing | - | -
CPU | Intel i7-7700K (locked to 4.2GHz) | GamersNexus | $330
Memory | Corsair Vengeance LPX 3200MHz | Corsair | -
Motherboard | Gigabyte Aorus Gaming 7 Z270X | Gigabyte | $240
Power Supply | NZXT 1200W HALE90 V2 | NZXT | $300
SSD | Plextor M7V, Crucial 1TB | GamersNexus | -
Case | Top Deck Tech Station | GamersNexus | $250
CPU Cooler | Asetek 570LC | Asetek | -

BIOS settings include C-states completely disabled with the CPU locked to 4.2GHz, so these results are not directly comparable to our tests at 4.5GHz. Memory is at XMP1. Tested using AMD driver version 17.8.2.

Vega 56 vs. 64 at Same Clocks: 3DMark FireStrike

[Chart: Vega 56 vs. Vega 64 FireStrike frequency over time]

(Note: The above misalignment is not factored toward the score or average frequency -- it's just a matter of 3DMark taking different lengths of time to load the next scene)

Here’s the clock comparison chart for 3DMark Firestrike at 1080p. As seen here, we’re roughly equal in clocks: Averaging the clock during actual tests, not between tests, we end up at 1585.6MHz for Vega 64 and 1583.5MHz for Vega 56, or a 0.1% offset. That’s close enough to be effectively the same. The ‘spiky’ behavior is during load screens between tests, and is not counted toward the score or average frequency.

[Chart: Vega 56 vs. Vega 64 3DMark FireStrike graphics score]

Averaged, our graphics score for Vega 64 lands at 22761, with Vega 56 at 22724. That is a difference of 0.16%, which coincides with our clock speed deficit on Vega 56 and is well within variance of 3DMark test execution. The scores are effectively the same between Vega 56 and Vega 64 when matching clock speeds, showing no advantage for higher shader count on Vega 64.
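Worked out, that score delta looks like this (scores from the chart above):

```python
# Graphics score delta between the two cards:
v64_score = 22761
v56_score = 22724

delta_pct = (v64_score - v56_score) / v56_score * 100
print(f"{delta_pct:.2f}%")  # 0.16% -- well inside the +/- 2.5% tolerance
```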

To put those scores into perspective, here are the FPS scores:

[Chart: Vega 56 vs. Vega 64 3DMark FireStrike FPS (GT1 & GT2)]

111.3FPS for Vega 64 and 111.27FPS for Vega 56, looking at GT1. GT2 puts us at 89.09 and 88.85 for 64 and 56, respectively. This difference is within our error and variance tolerances.

Vega 56 vs. 64 at Same Clocks: For Honor (4K & 1080p)

[Chart: Vega 56 vs. Vega 64 For Honor frequency over time]

Just to make sure there’s not some major advantage in higher resolutions, we tested a few games at both 1080p and 4K. Here’s the clock chart for For Honor, where we see effectively equal clocks throughout the first clock validation pass. The average clock for Vega 64 was 1578.5MHz, with Vega 56 at 1580.4MHz, sticking to our 0.1% difference.

[Chart: Vega 56 vs. Vega 64 For Honor 4K benchmark]

At 4K, we fall within our error margins and test-to-test variance. The Vega 56 card technically plots higher, but this is a statistically insignificant difference; that lead could just be because Vega 56 happened to hit a higher clock in a more abusive scene, or maybe something slightly different happened on-screen during that test pass, or maybe something to do with power delivery and chip quality. There is no statistically significant difference in For Honor at 4K.

[Chart: Vega 56 vs. Vega 64 For Honor 1080p benchmark]

At 1080p, we see similarly insignificant differences: We’re at roughly 137FPS AVG for each GPU. The shaders provide no clear advantage in any of our three measured metrics.

Vega 56 vs. 64 at Same Clocks: Hellblade (4K & 1080p)

[Chart: Vega 56 vs. Vega 64 Hellblade 4K benchmark]

Hellblade at 4K and Very High had both the Vega 56 and 64 cards at around 34FPS AVG, with lows similarly matched. In this particular test, there is not an advantage from increased shader count.

[Chart: Vega 56 vs. Vega 64 Hellblade 1080p benchmark]

At 1080p, we see the same: We’re at about 81FPS AVG for each GPU.

Vega 56 vs. 64 at Same Clocks: Ashes of the Singularity

[Chart: Vega 56 vs. Vega 64 Ashes of the Singularity frequency over time]

Ashes of the Singularity, tested with DX12 and the GPU-focused benchmark, lands an average V64 frequency of 1600.6MHz, with V56 averaging 1598.9MHz. Close enough.

[Chart: Vega 56 vs. Vega 64 Ashes of the Singularity 4K benchmark]

Measured performance has Vega 64 at 58FPS AVG, with Vega 56 at 56.7FPS AVG. This is close enough to be within our error tolerances for this particular test. Lows are also largely the same.

Vega 56 vs. 64 at Same Clocks: Ghost Recon

[Chart: Vega 56 vs. Vega 64 Ghost Recon: Wildlands 4K benchmark]

Ghost Recon: Wildlands at 4K had Vega 56 and Vega 64 both operating at 42FPS AVG, with 1% lows between 37 and 38FPS and 0.1% lows roughly between 36 and 37FPS.

[Chart: Vega 56 vs. Vega 64 Ghost Recon: Wildlands 1080p benchmark]

1080p exhibited similar performance behavior.

Conclusion

There might be applications where the shader difference is more noticeable, but it’s not any of these games. These titles are intended as an analog for other games, but we obviously can’t account for every scenario -- there are likely instances where the shader difference emerges. We’d expect shader differences to become more visible in compute and production applications, but the focus for today was on gaming.

Difficulties in this type of testing included controlling for frequency on both the core and HBM2, for thermals with liquid cooling, and for voltages by taking DMM measurements on the back of the card (to ensure the software readings were accurate). Performance in gaming is effectively the same once we equalize clocks between V56 and V64 -- at least, in the games that we tested. It is possible that we become memory bound, but the trouble is determining whether a limit is encountered on the memory clock before differences emerge from the shaders. That will require additional testing. Our main V56 sample is presently hitting 980-990MHz on memory, but we will soon be pushing for 1100MHz. If we can hit 1100MHz on each, we'll consider a revisit. Until then, the shader differences aren't emerging at these tested clocks.

If Vega 56 is able to stick near the GTX 1070’s price, it’s AMD’s strongest argument in the Vega line. The biggest downside is the increased power consumption, but if that’s not a concern to you, Vega 56 is a good buy, assuming similar pricing between the two. Prices are so volatile right now that we’ll refrain from hard numbers, and just suggest checking that the cards are relatively close. We’d strongly encourage solving for thermals with an aftermarket cooler or a board partner card, then overclocking. Vega 56 can match or outperform Vega 64 with the right mods, including powerplay tables and BIOS mods. For these gaming workloads, the only reason Vega 56 would underperform versus Vega 64 is AMD’s power limit, which is higher on V64. That can be fixed with a BIOS flash or registry mod.

As for the shaders, it looks like there’s not a big difference for the games we tested. There’s probably an application out there that likes the extra shaders, but for gamers, we’d say hard pass on Vega 64 and strongly consider Vega 56 as a highly modifiable counter.

Editorial: Steve Burke
Video: Andrew Coleman

