We’ve already reviewed the reference RX 5700 XT, the Sapphire Pulse model that we received as a frontrunner for board partner cards, and the MSI Evoke OC, which was overall poor value compared to cheaper solutions on the market. Now, we’re looking at the Gigabyte Gaming OC card, the first triple-axial-fan contender in our RX 5700 series benchmarks. It all comes down to thermals and noise with these cards, as gaming performance is functionally identical between them at stock, so we’ll focus on whether Gigabyte can achieve competitive thermals at noise levels equivalent to the Sapphire and MSI cards. At $420 MSRP, the Gigabyte Gaming OC is priced around the Pulse and just below the Evoke.

For this testing, we’re focusing most heavily on thermals and noise. As a reminder, and we’ll repeat this in the conclusion, board partner cards ultimately come down to thermal and acoustic differences. A few have an outsized impact on XOC capabilities, like a KINGPIN or HOF card, but none of the 5700 XTs we’ve looked at so far target that market. Gaming benchmarks become unimportant, as performance remains unchanged in nearly all partner designs. The one exception is the Evoke OC, which posted roughly a +2% performance advantage over baseline reference/Pulse performance; otherwise, expect no meaningful or even measurable gaming performance impact from these cards. The biggest gains are quality-of-life improvements: noticeably reduced noise levels, reduced thermals, features like an extra on-board VBIOS (the Pulse has this), or fitment advantages.

Silicon quality and the so-called silicon lottery are often discussed in the industry, but it’s rare for anyone to have enough sample size to actually demonstrate what those phrases mean in practice. We asked Gigabyte to loan us as many of a single model of video card as they could so that we could demonstrate the frequency variance card-to-card at stock, the variations in overclocking headroom, and actual gaming performance differences from one card to the next. This helps to more definitively strike at the question of how much silicon quality can impact a GPU’s performance, particularly when stock, and also looks at memory overclocking and range of FPS in gaming benchmarks with a highly controlled bench and a ton of test passes per device. Finally, we can see the theory of how much one reviewer’s GPU might vary from another’s when running initial review testing.

As we board another plane, just five days since landing home from Taipei, we're recapping news leading into next week's E3 event, positioned exhaustingly close to Computex. This recap talks AMD and Samsung partnerships on GPUs, Apple's $1000 monitor stand and accompanying cheese grater, and the Radeon Vega II dual-GPUs located therein. We also talk tariff impact on pricing in PC hardware and, as an exclusive story for the video version, we talk about the fake "X499" motherboard at Computex 2019.

Show notes below the video embed.

Today’s benchmark is a case study by the truest definition of the phrase: We are benchmarking a single sample, overweight video card to test the performance impact of its severe sag. The Gigabyte GTX 1080 Ti Xtreme was poorly received by our outlet when we reviewed it in 2017, primarily for its needlessly large size that amounted to worse thermal and acoustic performance than smaller, cheaper competitors. The card is heavy and constructed using through-bolts and complicated assortments of hardware, whereas competition achieved smaller, more effective designs that didn’t sag.

As is tradition, we put the GTX 1080 Ti Xtreme in one of our production machines alongside all of the other worst hardware we worked with, and so the 1080 Ti Xtreme was in use in a “real” system for about a year. That amount of time has allowed nature – mostly gravity – to take its course, and so the passage of time has slowly pulled the 1080 Ti Xtreme apart. Now, after a year of forced labor in our oldest rendering rig, we get to see the real side-effects of a needlessly heavy card that’s poorly reinforced internally. We’ll be testing the impact of GPU sag in today’s content.

We previously deep-dived on MCE (Multi-Core Enhancement) practices with the 8700K, revealing the performance variance that can occur when motherboard makers “cheat” results by boosting CPUs out of spec. MCE has become less of a problem with Z390 – namely because it is now disabled by default on all boards we’ve tested – but boosted BCLKs are the new issue.

If you think Cinebench is a reliable benchmark, we’ve got a histogram of all of our test results for the Intel i9-9900K at presumably stock settings:

[Image: Z390 motherboard differences Cinebench histogram]

(Yes, the scale starts at a non-zero value -- given a range of results from 1976 to 2300, we had to zoom in on the axis for a better histogram view.)

The scale is shrunken and non-zero because the results are so tightly clustered, but you can still see that we’re ranging from 1976 cb marks to 2300 cb marks, which is a massive range. That’s the difference between a heavily overclocked R7 2700 and an overclocked 7900X, except this is all on a single CPU. The only variable is that we used 5 different motherboards for these tests, along with a mix of auto, XMP, and MCE settings. Today’s discussion focuses on when it is considered “cheating” to modify CPU settings via BIOS without the user’s awareness of those changes. The most common change is to the base clock, where BIOS might report a value of 100.00 but actually produce 100.8 or 100.9 on the CPU. This functionally pre-overclocks the CPU, but in a way that is hard for most users to ever notice.
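To illustrate how a small, unreported BCLK bump compounds into a measurable frequency delta, here is a minimal sketch. The multiplier and BCLK values are hypothetical examples chosen for the math, not measurements from any specific board:

```python
# Illustrative only: core clock is BCLK multiplied by the CPU ratio, so a
# hidden 0.9MHz BCLK bump scales with the multiplier. Values are hypothetical.

def effective_clock_mhz(bclk_mhz: float, multiplier: int) -> float:
    """Effective core clock in MHz from base clock and CPU ratio."""
    return bclk_mhz * multiplier

reported = effective_clock_mhz(100.0, 47)  # what BIOS reports: 4700.0 MHz
actual = effective_clock_mhz(100.9, 47)    # what the CPU actually runs
gain_pct = (actual - reported) / reported * 100

print(f"Reported: {reported:.1f} MHz, actual: {actual:.1f} MHz, "
      f"gain: {gain_pct:.2f}%")
```

A sub-1% frequency gain is invisible to a user glancing at BIOS, but across a fleet of review benchmarks it is enough to reorder a tightly clustered chart.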

We’re at PAX West 2018 for just one day this year (primarily for a discussion panel), but stopped by the Gigabyte booth for a hands-on with the new RTX cards. As with most other manufacturers, these cards aren’t 100% finalized yet, although they do have some near-final cooling designs. The models shown today appear to use the reference PCB design with hand-made elements to the coolers, as partners had limited time to prepare. Gigabyte expects to have custom PCB solutions at a later date.

There’s a new trend in the industry: Heatsinks. Hopefully, anyway.

Gigabyte has listened to our never-ending complaints about VRM heatsinks and VRM thermals, and has outfitted its X470 Gaming 7 motherboard with a full, proper fin stack and heatpipe. We’re happy to see it, and we hope the trend continues, though it’s not strictly necessary on this board. That doesn’t make us any less excited to see an actual heatsink on a motherboard; it does, however, potentially point toward higher core-count Ryzen CPUs in the future. This is something Buildzoid speculated on in our recent Gaming 7 X470 VRM & PCB analysis. The amount of “overkill” power delivery capability on high-end X470 boards would suggest plans to support higher power consumption components from AMD.

Take the Gigabyte Gaming 7: it’s a 10+2-phase VRM, with the VCore VRM using IR3553 40A power stages. That alone is enough to run passively, but a heatsink drags temperatures so far below the requirements of operating spec that there’s room to spare. Cooler is always better in this instance (insofar as ambient cooling goes, anyway), so we can’t complain, but we can speculate about why it’s been done this way. ASUS’ Crosshair VII Hero uses a similar VRM layout, but with 60A power stages. That board, like Gigabyte’s, could run with no heatsink and be fine.
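A back-of-envelope check makes the “overkill” argument concrete. The phase count and per-stage rating come from the board above; the CPU power draw is our own generous assumption for an overclocked Ryzen chip, not a figure from Gigabyte or AMD:

```python
# Rough headroom estimate for a 10-phase VCore VRM built from 40A power
# stages. The 200W CPU draw is a hypothetical, deliberately generous OC figure.

phases = 10
stage_current_a = 40                        # rated continuous current per stage
vcore = 1.41                                # overclocked VCore from our test config

vrm_capacity_a = phases * stage_current_a   # 400 A of rated delivery
cpu_power_w = 200                           # assumed worst-case OC power draw
cpu_current_a = cpu_power_w / vcore         # current the CPU would actually pull

headroom = vrm_capacity_a / cpu_current_a
print(f"VRM capacity: {vrm_capacity_a} A, CPU draw: {cpu_current_a:.0f} A, "
      f"headroom: {headroom:.1f}x")
```

Even with a pessimistic power estimate, the VRM is rated for nearly three times what the CPU draws, which is why the heatsink reads as future-proofing rather than necessity.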

We tested with thermocouples placed on one top-side MOSFET, located adjacent to the SOC VRM MOSFETs (1.2V SOC), and one centrally positioned left-side MOSFET. We ran stock and overclocked configurations (4.2GHz at 1.41 VCore with Extreme LLC), then retested with the heatsink removed entirely. By design, this test had no active airflow over the VRM components. Ambient temperature was controlled and logged every second.

GamersNexus secured an early exclusive with the new Gigabyte Gaming 7 motherboard at CES 2018, equipped with what one could confidently assume is an AMD X470 chipset. Given information from AMD on launch timelines, it would also be reasonable to assume that the new motherboards can be expected for roughly April of this year, alongside AMD’s Ryzen CPU refresh. This is all information learned from AMD’s public data. As for the Gigabyte Gaming 7 motherboard, the first thing we noticed is that it has real heatsinks on the VRMs, and that it’s actually running what appears to be a higher-end configuration for what we would assume is the new Ryzen launch.

Starting with the heatsink, Gigabyte has taken pride in listening to media and community concerns about VRM heatsinks, and has now added an actual fin stack atop its 10-phase VCore VRM. To give an idea of the difference, we saw significant improvement on the EVGA X299 DARK motherboard from the finned heatsinks alone, without even using the built-in fans. It’s upwards of a 20-degree-Celsius improvement over the solid blocks in some cases, since those blocks provide comparatively little surface area.

Having gone over the best CPUs, cases, some motherboards, and soon coolers, we’re now looking at the best GTX 1080 Tis of the year. Contrary to popular belief, the model of cooler does actually matter for video cards. We’ll be going through thermal and noise data for a few of the 1080 Tis we’ve tested this year, including MOSFET, VRAM, and GPU temperatures, noise-normalized performance at 40dBA, and the PCB and VRM quality. As always with these guides, you can find links to all products discussed in the description below.

Rounding up the GTX 1080 Tis means we’re primarily focused on cooler and PCB build quality: noise, noise-normalized thermals, raw thermals, and VRM design are at the forefront of competition among same-GPU parts. Ultimately, as far as gaming and overclocking performance goes, much of that is dictated by silicon-level quality variance, which is nearly random. For that reason, we differentiate board partner GPUs by thermals, noise, and potential for low-thermal overclocking (quality VRMs).

Today, we’re rounding up the best GTX 1080 Ti graphics cards that we’ve reviewed this year, across categories of Best Overall, Best for Modding, Best Value, Best Technology, and Best PCB. Gaming performance is functionally the same on all of them, as silicon variance is the bigger dictator of performance, with thermals the next governor; after all, a Pascal GPU under 60C is a higher-clocked, happier Pascal GPU, and that’ll drive framerate more than advertised clocks will.
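The “under 60C” point can be sketched as a simple model of temperature-based boost binning, in the spirit of Pascal’s GPU Boost behavior: as the core warms past a threshold, the boost clock steps down in small bins. The knee temperature, bin interval, and step size below are hypothetical illustrations, not NVIDIA’s actual tables:

```python
# Illustrative model only: boost clock sheds small fixed-size bins as core
# temperature rises past a knee. All thresholds here are assumptions.

BIN_STEP_MHZ = 13  # Pascal clocks move in roughly 13MHz increments

def boost_clock_mhz(base_boost: int, temp_c: float) -> int:
    """Drop one bin per 5C above a hypothetical 60C knee; hold boost below it."""
    if temp_c <= 60:
        return base_boost
    bins_dropped = int((temp_c - 60) // 5) + 1
    return base_boost - bins_dropped * BIN_STEP_MHZ

print(boost_clock_mhz(1936, 55))  # a cool card holds its full boost clock
print(boost_clock_mhz(1936, 72))  # a warmer card sheds a few bins
```

This is why two cards with identical advertised clocks can post different framerates: the better cooler keeps the GPU in higher bins for longer.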

Gigabyte is releasing security updates for Intel motherboards making use of Intel ME (Management Engine) and TXE (Trusted Execution Engine). The first batch of updates will be for Z370 and 200-series boards, with older generations following. Gigabyte will be supplying patched BIOS versions as well as the latest Intel ME and TXE drivers on their website.

Gigabyte’s announcement follows a notice from the Intel Security Center about “security vulnerabilities that could potentially place impacted platforms at risk.” These vulnerabilities have to do with MINIX, a lightweight OS designed by educator Andrew Tanenbaum (as discussed in this week’s HW News), and its use in Intel’s ME. As reported by Tom’s Hardware earlier this month, a Google team led by software engineer Ron Minnich is responsible for uncovering MINIX’s role in the ME and expressing their concerns in a presentation bluntly titled “Replace your exploit-ridden firmware with a Linux kernel.”

