Everyone talks a big game about how they don’t care about power consumption. We took that sentiment to the extreme, using a registry hack to give Vega 56 enough extra power to kill the card, if we wanted, and a Floe 360mm CLC to keep temperatures low enough that GPU diode reporting inaccuracies emerge. “I don’t care about power consumption, I just want performance” is now met with exactly that – 100% more power and an overclock to 1742MHz core. We’ve got room to go to 200% power, but things would start popping at that point. The Vega 56 Hybrid mod is the most heavily modified card in our Hybrid series to date, and leverages powerplay table registry changes to provide that additional power headroom. This is an alternative to BIOS flashing, which is limited to signed BIOS files (like V64 on V56, though we had issues flashing V64L onto V56). Last we attempted it, a modified BIOS did not work. Powerplay tables do work, though, and let us raise the power target past V56’s artificial power limitation.
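For the curious, the powerplay table is a binary blob the driver reads from the Windows registry (under the AMD display class key, value `PP_PhmSoftPowerPlayTable`). A minimal sketch of the patching idea follows – note that the field offset below is a made-up placeholder for illustration; the real table layout varies by revision and must be confirmed against community layout documentation before touching actual hardware:

```python
import struct

# HYPOTHETICAL offset, for illustration only. The real power-limit field
# location depends on the powerplay table revision and must be verified
# before patching a real registry value.
POWER_LIMIT_OFFSET = 0x60

def patch_power_limit(table: bytes, new_limit_watts: int) -> bytes:
    """Return a copy of a powerplay-table blob with the (assumed)
    16-bit little-endian power-limit field overwritten."""
    buf = bytearray(table)
    struct.pack_into("<H", buf, POWER_LIMIT_OFFSET, new_limit_watts)
    return bytes(buf)

# Demo on a dummy zero-filled blob; on a real system the blob would be
# read from and written back to the registry value noted above.
patched = patch_power_limit(bytes(0x80), 350)
print(struct.unpack_from("<H", patched, POWER_LIMIT_OFFSET)[0])
```

The appeal over BIOS flashing is exactly this: the blob is plain registry data, so no signature check stands in the way.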
The limitation on power provisioned to the V56 core is, we believe, purely to prevent V56 from too easily outmatching V64 in performance. The card’s BIOS won’t natively allow greater than 300-308W down the PCIe cables, even though official BIOS versions for V64 cards can support 350~360W. The VRM itself easily sustains 360W, and we’ve tested it as handling 406W without popping a FET. 400W is probably pushing what’s reasonable, but limiting V56 to ~300W, when an additional 60W is fully within the capabilities of the VRM & GPU, is a means of capping V56 performance so that it doesn’t compete with V64.
We fixed that.
AMD’s CU scaling has never had that much impact on performance – clock speed closes most gaps with AMD hardware. Even without the extra shaders of V64, we can outperform V64’s stock performance, and we’ll soon find out how we do versus V64’s overclocked performance. That’ll have to wait until after PAX, but it’s something we’re hoping to further study.
| GN Test Bench 2017 | Name | Courtesy Of | Cost |
|---|---|---|---|
| Video Card | This is what we're testing | - | - |
| CPU | Intel i7-7700K 4.5GHz locked | GamersNexus | $330 |
| Memory | Corsair Vengeance LPX 3200MHz | Corsair | - |
| Motherboard | Gigabyte Aorus Gaming 7 Z270X | Gigabyte | $240 |
| Power Supply | NZXT 1200W HALE90 V2 | NZXT | $300 |
| Case | Top Deck Tech Station | GamersNexus | $250 |
| CPU Cooler | Asetek 570LC | Asetek | - |
BIOS settings include C-states completely disabled with the CPU locked to 4.5GHz at 1.32 vCore. Memory is at XMP1.
Max Clock Stability vs. Power Limit
Given all the uncertainty surrounding the initial driver launch – limited voltage control and broken clock reporting – we first needed to check whether the powerplay tables were actually doing anything. Fortunately, clock reporting was fixed in drivers 17.8.1 onward, something we showed in our latest livestream with the Hybrid 56, so we now need to isolate the power target. During the last livestream, we also checked voltage at the back of the card (rather than just trusting software) to better determine undervolting functionality with the latest drivers. Check that stream for more on the undervolting aspect.
Here’s a look at clock stability versus the power target, represented by PCIe power draw. Power consumption starts at around 300W with a 50% offset (the stock maximum), a bit below what we saw on the reference Vega 56 card. The improved cooling reduces power leakage, so the 50% offset now comes down to 296W rather than 308W, a reduction of about 4%. That doesn’t really do enough to get us much extra; regardless, that’s the starting point. The clock bounces around sporadically at this point, jumping between 1474MHz and 1702MHz (our target). Because the card’s power consumption is limited to about 300W, we can’t achieve stability at higher clocks without pushing power. Dithering clocks are a direct result of the card not getting enough power, and in this case, that’s because AMD set a hard cap on what V56 can pull.
We first go to 60% power, which gets us another 16W to work with in this workload. The clocks still aren’t stable, so we move to 70% power around the 240-second mark. That boosts us 33W over the original 50% offset, providing a clear boost to somewhat stabilize the frequency. At 80% power, we start pushing 350W through the PCIe cables, which largely helps stabilize the 1702MHz target. The next attempt is 1722MHz, which proves unstable, and requires a further jump to 90% power. This puts us at 360W-365W. 1732MHz is roughly stable at this same power draw. For 1742MHz, we boost to 95% power to fully stabilize and draw 370-380W.
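To put numbers on the progression above, here is a quick sketch assuming PCIe-cable draw scales linearly with the power-target percentage, using our 296W reading at +50% to infer a ~197W baseline (that baseline is our assumption, not a measured figure). The measured draws track this model to within roughly 10-15W, running slightly under it at the top end:

```python
# Inferred 0%-offset baseline from the 296W reading at +50% (assumption).
BASELINE_W = 296 / 1.5  # ~197W

def expected_draw_w(offset_percent: float) -> float:
    """PCIe-cable draw predicted by a purely linear power-target model."""
    return BASELINE_W * (1 + offset_percent / 100)

for pct in (50, 60, 70, 80, 90, 95):
    print(f"{pct:>3}% power target -> ~{expected_draw_w(pct):.0f}W")
```

At 80%, the model predicts ~355W, matching the "pushing 350W" we observed; at 90-95%, the card lands a few watts under the linear prediction, consistent with the power limit being only one of several constraints at that point.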
The result is 70-80W more than reference, but a clock that stabilizes at 1742MHz rather than one that bounces between 1474 and 1702. This is better than we saw in the livestream, where we just blasted power at 406W down the PCIe cables to achieve stability the quickest. Our max clock in the stream was also 20MHz lower, so we’ve done better here. Instability began to emerge regularly at 1762MHz (even with 120% power), so we called it at 1742/980MHz for synthetic tests, eventually dropping down to 1732MHz for gaming tests.
V56 Hybrid Temperature Response
Here’s the temperature response to all of that. Core temperature pings 28C quickly, maxing out at 31C. This is not a delta T over ambient reading – ambient is about 24C here, but we know that our Vega 56 underreports its temperature by at least a few degrees. We can’t be sure just how much, unfortunately. Either way, we’re about 40-45C below the reported temperature on air. The MOSFET temperatures max out at 46C for the middle-right FET and 32C for the top FET, both of which were cooled by direct airflow from a high-RPM fan. For perspective, the reference card had FET temperatures of about 63C for the right FET and 73C for the top FET. We’ve managed to reduce MOSFET temperatures by blasting them with air, despite the 70-80W of extra power to the core.
V56 Comparative Thermals – Reference vs. Hybrid
Getting into comparative thermals, we see the RX Vega 56 reference card operates close to the 75C target, with our Hybrid mod operating at a 25C (reported) GPU diode temperature. Overclocking to 1742MHz core and 980MHz HBM2, using a 105% power target, we push 30C GPU diode temperature. Note here that ambient temperature is about 23-24C during most of these tests and is logged actively, which means that Vega’s temperature reporting – on our card, at least – is not fully accurate. It is common for thermistors and thermocouples to have some level of inaccuracy, but this output does appear more impacted than even K-types, which have a +/-2.2C variance, in most cases. Under idle workloads, we’re sitting between 2-4C below ambient, which obviously is physically impossible with this kind of cooling setup. You’d need something chilled to actually achieve that, so the temperature reporting is incorrect to some degree.
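The idle readings make it easy to put a floor on the sensor error: a non-chilled liquid cooler cannot pull an idle card below ambient, so any reported value under ambient is, at minimum, that far off. A minimal sketch of that reasoning:

```python
def min_sensor_error_c(reported_c: float, ambient_c: float) -> float:
    """Lower bound on GPU diode under-reporting. An idle card on a
    non-chilled liquid cooler cannot physically sit below ambient, so
    any gap between ambient and the reported value is guaranteed
    sensor error (it only bounds the error from below)."""
    return max(0.0, ambient_c - reported_c)

# Idle diode readings from the text: 2-4C below a ~23-24C ambient.
print(min_sensor_error_c(20.0, 24.0))  # at least 4C of under-reporting
```

This is only a lower bound – the true offset under load could be larger, which is why we can’t say exactly how far off the 25-30C load readings are.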
Looking at power consumption at the wall, our RX Vega 56 Hybrid system pushes 465W when overclocked to 1742MHz core and 980MHz HBM2, an increase of 165W over the reference RX Vega stock card.
V56 Hybrid Power Consumption at the Rails
This chart will be quick: This is what we used when doing our initial undervolting – that sporadic orange line is from fighting with the software – and shows our range of PCIe consumption being roughly 180W to 355W. We were able to wrangle the Hybrid down to 355W in this particular test, done by dropping to a 95% power target. The drop proved stable for FireStrike, though it was ultimately increased a bit more for gaming. Keep in mind that power consumption varies based on scene and benchmark used, so you’ll see different results based on which test we’re actively showing.
Power Consumption at the Wall
Big note, here: We’ve got a ton more fans hooked up, which means more power consumption at the wall. That’s why the at-rails measurements above are important – they ignore other system variable changes, like the extra fans and pump for the V56 Hybrid.
For some quick FireStrike numbers at the wall – again, we’re switching back to wall draw – we see total system power at 447W for the Hybrid OC V56, nearly tying it with CrossFire RX 580 and 480 cards. The stock Vega 64 system draws 372W – so we’ve got a 75W increase – and the stock V56 system draws 303W. This aligns with the ~60W higher out-of-box power allowance we reported for V64, despite both cards using the same VRM. A GTX 1080 Ti system draws 347W, or 100W less than the V56 Hybrid OC. Efficiency is not in our favor with this mod – but it’s also not the point. We’re spending a lot of power to keep this thing cool.
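The deltas above all reduce to simple subtraction against the Hybrid OC's 447W wall draw; a quick sketch with the FireStrike numbers from the text:

```python
# FireStrike total system draw at the wall (watts), from the text.
wall_draw_w = {
    "V56 Hybrid OC": 447,
    "Vega 64 stock": 372,
    "GTX 1080 Ti": 347,
    "V56 stock": 303,
}

hybrid_w = wall_draw_w["V56 Hybrid OC"]
for system, watts in wall_draw_w.items():
    if system != "V56 Hybrid OC":
        print(f"Hybrid OC draws {hybrid_w - watts}W more than {system}")
```

Remember that these are whole-system numbers, so the extra fans and pump are baked into the Hybrid's figure.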
A gaming workload (Ghost Recon) places our V56 mod’s system power consumption ahead of the nVidia Titan XP by about 50W, maintaining the same ~70W offset from the stock V56.
Here’s For Honor, just to show another game.
Similar scaling to GRW.
And idle at desktop, which shows increased power consumption both from GPU modifications and from adding more fans to the system: