Steve Burke

Steve started GamersNexus back when it was just a cool name, and now it's grown into an expansive website with an overwhelming amount of features. He recalls his first difficult decision with GN's direction: "I didn't know whether or not I wanted 'Gamers' to have a possessive apostrophe -- I mean, grammatically it should, but I didn't like it in the name. It was ugly. I also had people who were typing apostrophes into the address bar - sigh. It made sense to just leave it as 'Gamers.'"

First world problems, Steve. First world problems.

nVidia’s Titan Xp 2017 model video card was announced without any pre-briefing for us, marking the second recent Titan X model card to take us by surprise on launch day. The Titan Xp, as it turns out, isn’t necessarily targeted at gaming – though it does still bear the GeForce GTX mark. nVidia’s Titan Xp followed the previous Titan X (which we called “Titan XP” to reduce confusion with the Maxwell-based Titan X before it), and knocks the Titan X 2016 out of its $1200 price bracket.

With the Titan Xp 2017 now firmly socketed into the $1200 category, we’ve got a gap of $450-$500 between the GTX 1080 Ti at $700 MSRP ($750 common price) and the Titan Xp. Even with that big of a gap, though, diminishing returns in gaming or consumer workloads are to be expected. Today, we’re benchmarking and reviewing the nVidia Titan Xp for gaming specifically, with additional thermal, power, and noise tests included. This card may be better deployed for neural net and deep learning applications, but that won’t stop enthusiasts from buying it simply to have “the best.” For them, we’d like to have some benchmarks online.

EVGA’s GTX 1080 Ti SC2 ($720) card uses the same ICX cooler that we reviewed back in February, where we intensely detailed how the new solution works (including information on the negative temperature coefficient (NTC) thermistors and accuracy validation of those sensors). To get caught up on ICX, we’d strongly recommend reading the first page of that review, and then maybe checking the thermal analysis for A/B testing versus ACX in an identical environment. As a fun add, we’re also A/B testing the faceplate – it’s got all those holes in it, so we thought we’d close them off and see if they actually help with cooling.

The fast version is basically this: EVGA, responding to concerns about ACX last year, decided to fully reinvent its flagship cooler to better monitor and cool power components in addition to the GPU component. The company did this by introducing NTC thermistors to its PCB, used for measuring GPU backside temperature (rather useless in a vacuum, but more of a validation thing when considering last year’s backplate testing), memory temperature, and power component temperature. There are thermistors placed adjacent to 5 MOSFETs, 3 memory modules, and the GPU backside. The thermistors are not embedded in the package, but placed close enough to get an accurate reading for thermals in each potential hotspot. We previously validated these thermistors versus our own thermocouples, finding that EVGA’s readings were accurate to reality.
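For readers unfamiliar with NTC thermistors: their resistance drops as temperature rises, so a simple voltage divider plus a bit of math yields a temperature reading. Below is a minimal, hypothetical sketch using the common Beta-parameter equation – the resistor values, Beta coefficient, and divider topology are our own illustrative assumptions, not EVGA’s actual circuit.

```python
# Hypothetical sketch: converting an NTC thermistor divider reading to a
# temperature with the Beta-parameter equation. Component values below are
# illustrative assumptions, not EVGA's design.
import math

R_FIXED = 10_000.0   # assumed fixed divider resistor (ohms)
R0 = 10_000.0        # assumed thermistor resistance at 25 C (ohms)
BETA = 3950.0        # assumed Beta coefficient of the thermistor
T0_K = 298.15        # 25 C in Kelvin

def ntc_temp_c(adc_fraction: float) -> float:
    """Return temperature in C from an ADC reading of the divider (0..1)."""
    # Thermistor assumed on the low side of a simple voltage divider.
    r_ntc = R_FIXED * adc_fraction / (1.0 - adc_fraction)
    inv_t = 1.0 / T0_K + (1.0 / BETA) * math.log(r_ntc / R0)
    return 1.0 / inv_t - 273.15

# Example: a reading of 0.5 means R_NTC == R_FIXED, i.e. ~25 C with these values.
print(round(ntc_temp_c(0.5), 1))
```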

Although this is absolutely a unique, innovative approach to GPU cooling – no one else does it, after all – we found its usefulness primarily relegated to noise output. After all, a dual-fan ACX cooler was already enough to keep the GPU cool (and the FETs, with the help of some thermal pads), and ICX is still a dual-fan cooler. The ICX sensors primarily add a toy for enthusiasts to play with, as they won’t improve gaming performance in any meaningful way, though those enthusiasts could benefit from fine-tuning the fan curve to reduce VRM fan speeds. This primarily benefits noise levels, as the VRM fan doesn’t need to spin all that fast (FETs can take ~125C heat before they start losing efficiency in any meaningful way), and so the GPU and VRM fans can spin asynchronously to help with the noise profile. Out of the box, EVGA’s fan curve is a bit aggressive, we think – but we’ll talk about that later.
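To illustrate what an asynchronous GPU/VRM fan setup might look like, here’s a small sketch of two fan curves built from linear interpolation between breakpoints. The breakpoints are purely illustrative assumptions on our part, not EVGA’s stock ICX curve – the point is only that the VRM curve can stay flat and quiet well past temperatures that would already have the GPU fan ramping.

```python
# Hypothetical sketch of asynchronous fan curves. Breakpoints are assumptions
# for illustration; they do not reflect EVGA's shipping fan curve.

def fan_speed(temp_c, curve):
    """Linearly interpolate fan speed (%) from (temp C, speed %) breakpoints."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t1, s1), (t2, s2) in zip(curve, curve[1:]):
        if temp_c <= t2:
            return s1 + (s2 - s1) * (temp_c - t1) / (t2 - t1)
    return curve[-1][1]

GPU_CURVE = [(30, 0), (60, 40), (75, 70), (85, 100)]    # assumed GPU breakpoints
VRM_CURVE = [(40, 0), (80, 30), (100, 60), (115, 100)]  # flatter, quieter VRM curve

print(fan_speed(70, GPU_CURVE))  # 60.0  -> GPU fan already ramping at 70C
print(fan_speed(70, VRM_CURVE))  # 22.5  -> VRM fan can stay slow and quiet
```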

Thanks to GamersNexus reader ‘Grant,’ we were able to obtain a loaner nVidia Titan Xp (2017) card for review and thermal analysis. Grant purchased the card for machine learning and wanted to liquid cool the GPU, which happens to be something with which we’re well-versed. In the process, we’ll be reviewing the Titan Xp from a gaming standpoint, tearing it down, analyzing the PCB & VRM, and building it back into a liquid-cooled card. All the benchmarking is already done, but we’re opening our Titan Xp content string with a tear-down of the card.

Disassembling Founders Edition nVidia graphics cards tends to be a little more tool-intensive than most other GPU tear-downs. nVidia uses 2.0mm & 2.5mm Allen keys to secure the shroud to the baseplate, and then the baseplate to the PCB; additionally, a batch of ~16x 4mm hex heads socket through the PCB and into the baseplate, each of which hosts a small Phillips screw for the backplate.

The disassembly tutorial continues after this video version:

AMD’s taken a page out of nVidia’s book, apparently, and nVidia probably took that page from Apple – or any number of other companies that elect to re-use product names. The new Radeon Pro Duo uses the same name as last year’s launch, but has updated the internals.

The RX 580, as we learned in the review process, isn’t all that different from its origins in the RX 480. The primary difference is in voltage and frequency afforded to the GPU proper, with other changes manifesting in maturation of the process over the past year of manufacturing. This means most optimizations are relegated to power (when idle – not under load) and frequency headroom. Gains on the new cards are not from anything fancy – just driving more power through under load.

Still, we were curious as to whether AMD’s drivers would permit cross-RX series multi-GPU. We decided to throw an MSI RX 580 Gaming X and MSI RX 480 Gaming X into a configuration to get things close, then see what’d happen.

The short of it is that this works. There is no explicit inhibitor built in to forbid users from running CrossFire with RX 400 and RX 500 series cards, as long as you’re doing 470/570 or 480/580. The GPU is the same, and frequency will just be matched to the slowest card, for the most part.

We think this will be a common use case, too. It makes sense: If you’re a current owner of an RX 480 and have been considering CrossFire (though we didn’t necessarily recommend it in previous content), the RX 580 will make the most sense for a secondary GPU. Well, primary, really – but you get the idea. The RX 400 series cards will see EOL and cease production in short order, if not already, which means that prices will stagnate and then skyrocket. That’s just what retailers do. Buying a 580, then, makes far more sense if you’re dying for a CrossFire configuration, and you could even move the 580 to the top slot for best performance in single-GPU scenarios.
