Laptop reviewing and benchmarking comes with a unique challenge: unlike with other review products, we don’t typically get to keep review samples once the cycle is complete, which limits regression testing for content like today’s. This means we need to rely on some of our older testing and methodology, but we can still judge scaling based on old games – scaling should be mostly linear, with some exceptions (which we’ve accounted for in our summary of tests).
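To illustrate what we mean by judging scaling from old results, here’s a minimal sketch – not our actual test pipeline, and the GPU names and FPS values below are placeholders, not measurements – of the kind of ratio check involved:

```python
# Hypothetical sketch: sanity-check GPU scaling against archived results.
# If the relative performance between two known GPUs holds steady across
# titles, scaling is ~linear and old data remains usable for placement.

archived_fps = {  # placeholder numbers for illustration only
    "GPU A": {"Title 1": 50.0, "Title 2": 60.0},
    "GPU B": {"Title 1": 75.0, "Title 2": 92.0},
}

def scaling_factor(gpu_a: str, gpu_b: str, title: str) -> float:
    """Return gpu_b's performance relative to gpu_a in a given title."""
    return archived_fps[gpu_b][title] / archived_fps[gpu_a][title]

for title in archived_fps["GPU A"]:
    factor = scaling_factor("GPU A", "GPU B", title)
    print(f"{title}: GPU B is {factor:.2f}x GPU A")
# Tightly clustered factors suggest linear scaling; large per-title
# divergence flags the exceptions noted in our summary of tests.
```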

Fortunately, the upshot of revisiting older titles for comparative analysis is that those titles do not change. They don’t get updates to game code and they don’t get driver updates, so results should largely exist in a hermetically sealed state.

Regardless, today’s goal is to benchmark the GTX 1050 Ti notebook GPU. We still have a lot of work ahead as we rebuild our notebook bench, but this will start us off. The GTX 1070 is next. We’re starting with an MSI GE72 7RE Apache Pro with a GTX 1050 Ti and an i7-7700HQ CPU. This isn’t a review of the GE72 – that’s upcoming – but just a GPU benchmark to help determine scaling and placement of the 1050 Ti against other notebook GPUs.

Our review of the notebook is forthcoming, as are a few feature benchmark pieces. It’ll be interesting stuff, as we’ve got some key things to point out with this one. Be sure to follow or subscribe to catch that. For today, let’s get into the 1050 Ti notebook benchmarks.

NVidia’s Volta GV100 GPU and Tesla V100 Accelerator were revealed yesterday, delivering on a 2015 promise of Volta’s arrival by 2018. The initial DGX servers will ship by 3Q17, containing multiple V100 Accelerator cards at a cost of $150,000, with individual units priced at $18,000. These devices are obviously for enterprise, machine learning, and compute applications, but the architecture will inevitably work its way into gaming through subsequent V102 (or equivalent) chips. This is similar to the GP100 launch, where the Accelerator server-class card arrived prior to consumer availability, which ultimately helps consumers by recouping some of the initial R&D cost through major B2B sales.

The EVGA GTX 1080 Ti FTW3 is the company’s attempt at a 3-fan cooler, joining ASUS, Gigabyte, and MSI in the three-fan ranks. The difference with EVGA’s card, though, is that it’s a two-slot design; board partners have gone with a “bigger is better” mentality for the 1080 Ti, and that’s not necessarily advantageous. Sure, there are benefits – taller cards mean larger fans, like on the Gaming X, which allows slower fan rotation without sacrificing the volume of air moved. It follows that larger fans on taller cards could be profiled to run quieter, without necessarily sacrificing thermal performance of the GPU, VRM, and VRAM components.
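To put rough numbers behind that fan reasoning, here’s a minimal sketch using the standard fan affinity law (airflow Q scales with rotational speed N times diameter D cubed). This is an idealized model with hypothetical fan sizes, not measurements from these specific coolers:

```python
# Fan affinity law sketch: Q ~ N * D^3 (idealized; ignores blade geometry,
# static pressure, and shroud effects). Shows why a larger fan can spin
# slower while moving the same volume of air.

def rpm_for_equal_airflow(rpm_small: float, d_small: float, d_large: float) -> float:
    """RPM a larger fan needs to match a smaller fan's airflow, per Q ~ N*D^3."""
    return rpm_small * (d_small / d_large) ** 3

# Hypothetical example: a 100mm fan matching a 90mm fan spinning at 2000 RPM.
required = rpm_for_equal_airflow(2000, 90, 100)
print(f"Larger fan needs ~{required:.0f} RPM for equal airflow")  # ~1458 RPM
```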

But we’re testing today to see how all that plays out in reality. In our EVGA GTX 1080 Ti FTW3 review, we benchmark the card vs. EVGA’s own SC2, MSI’s 1080 Ti Gaming X, Gigabyte’s Xtreme Aorus, and the Founders Edition card. Each of these also has an individual review posted, if you’re looking for break-outs on any one device. See the following links for those (listed in order of publication):

It’s Not About Gaming Performance

After reviewing this many cards in the past few weeks, it should be apparent that same-GPU cards aren’t really differentiated by gaming performance. Gaming performance will land within a few percentage points across all devices, no matter what, because they’re ultimately governed by the GPU. A manufacturer can throw the world’s best PCB, VRM, and cooler together, and the card will still hit Pascal’s wall of voltage and power budget. Further, chip quality dictates performance in greater ways than the PCB or VRM will. We have duplicates of most of our cards, and they can perform 1-3% apart from one another, depending on which boosts higher out of the box.

Our Titan Xp Hybrid mod is done, soon to be shipped back to its owner in its new condition. Liquid cooling mods in the past have served as a means to better understand where a GPU could perform given a good cooler, and are often conducted on cards with reference coolers. The Titan Xp won’t have AIB partner cooler models, and so building a Hybrid card gives us a glimpse into what could have been.

It’s also not a hard mod to do – an hour tops, maybe a bit more for those who are more hesitant – and costs $100 for the Hybrid kit. Against the $1200 purchase for the card, that’s not a tall order.

In today’s benchmarks and conclusion of the Titan Xp Hybrid mod, we’ll cover thermals and noise levels extensively, look at overclocking, and throw in some gaming benchmarks.


We just posted the second part of our Titan Xp Hybrid mod, detailing the build-up process for adding closed-loop liquid coolers (CLCs) to the Titan Xp. The process is identical to the one we detailed for the GTX 1080 Ti FE card, since the PCB is effectively identical between the two devices.

For this build, we added thermocouples to the VRAM and VRM components to try to determine whether Hybrid mods help or hurt VRAM temperatures (and, with that part of testing done, we have some interesting results). Final testing and benchmarking are being run now, with plans to publish by Monday.

In the meantime, check out part 2 below:

NVidia’s Titan Xp 2017 model video card was announced without any pre-briefing for us, making it the second recent Titan X card to take us by surprise on launch day. The Titan Xp, as it turns out, isn’t necessarily targeted at gaming – though it does still bear the GeForce GTX mark. NVidia’s Titan Xp follows the previous Titan X (which we called “Titan XP” to reduce confusion with the Maxwell Titan X before it), and knocks the Titan X 2016 out of its $1200 price bracket.

With the Titan Xp 2017 now firmly socketed into the $1200 category, we’ve got a $450-$500 gap between the GTX 1080 Ti ($700 MSRP, $750 common price) and the Titan Xp. Even with a gap that big, though, diminishing returns in gaming and consumer workloads are to be expected. Today, we’re benchmarking and reviewing the nVidia Titan Xp for gaming specifically, with additional thermal, power, and noise tests included. This card may be better deployed for neural net and deep learning applications, but that won’t stop enthusiasts from buying it simply to have “the best.” For them, we’d like to have some benchmarks online.

Thanks to GamersNexus reader ‘Grant,’ we were able to obtain a loaner nVidia Titan Xp (2017) card for review and thermal analysis. Grant purchased the card for machine learning and wanted to liquid cool the GPU, which happens to be something with which we’re well-versed. In the process, we’ll be reviewing the Titan Xp from a gaming standpoint, tearing it down, analyzing the PCB & VRM, and building it back into a liquid-cooled card. All the benchmarking is already done, but we’re opening our Titan Xp content string with a tear-down of the card.

Disassembling Founders Edition nVidia graphics cards tends to be a little more tool-intensive than most other GPU tear-downs. NVidia uses 2.0mm & 2.5mm Allen keys to secure the shroud to the baseplate, and then the baseplate to the PCB; additionally, a batch of ~16x 4mm hex heads socket through the PCB and into the baseplate, each of which hosts a small Phillips screw for the backplate.

The disassembly tutorial continues after this video version:

AMD’s got a new strategy: Don’t give anyone time to blink between product launches. The company’s been firing off round after round of products for the past month, starting with Ryzen 7, then Ryzen 5, and now Polaris Refresh. The product cannon will eventually be reloaded with Vega, but that’s not for today.

The RX 500 series officially comes to market today, primarily carried in on the backs of the RX 580 and RX 570 Polaris 10 GPUs. From an architectural perspective, there’s nothing new – if you know Polaris and the RX 400 series, you know the RX 500 series. This is not an exciting, bombastic launch that requires delving into some unexplored architecture; in fact, our original RX 480 review heavily detailed Polaris architecture, and all of that information is relevant to today’s RX 580 launch. If you’re not up to speed on Polaris, our review from last year is a good place to start (though the performance numbers are now out of date, the architectural information is still accurate).

Both the RX 580 and RX 570 will be available as of this article’s publication. The RX 580 we’re reviewing should be listed here once the retailer embargo lifts, with our RX 570 model posting here. Our RX 570 review goes live tomorrow. We’re spacing the two out to allow for better per-card depth, having just come off a series of 1080 Ti reviews (Xtreme, Gaming X).

Our Gigabyte GTX 1080 Ti Aorus Xtreme ($750) review looks at one of the largest video cards in the 1080 Ti family, matching it against the MSI 1080 Ti Gaming X. Our tests today cover thermals (most heavily), noise levels, gaming performance, and overclocking, with particular interest in the efficacy of Gigabyte’s copper insert in the backplate. The Aorus Xtreme is a heavyweight in all departments – size being one of them – and its $750 price matches the MSI Gaming X directly. A major point of differentiation is Gigabyte’s bigger focus on RGB LEDs, though the three-fan design is also interesting from a thermal and noise perspective. We’ll look at that more on page 3.

We’ve already posted a tear-down of this card (and friend of the site ‘Buildzoid’ has posted his PCB analysis), but we’ll recap some of the PCB and cooler basics on this first page. The card uses a 3-fan cooler (with smaller fans than on Gaming X-type cards, but more of them) and a large aluminum heatsink, ultimately taking up nearly three PCI-e slots. It’s the same GPU and memory underneath as all other GTX 1080 Ti cards, with differences primarily in the cooling and power management departments. The clocks, of course, have some pre-overclocking applied to boost over the reference model: Gigabyte is shipping the Xtreme variant of the 1080 Ti at 1632/1746MHz (OC mode) or 1607/1721MHz (gaming mode), toggleable through software if not manually overclocking.
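For a quick sense of how much pre-OC that is, here’s a minimal sketch comparing those shipped clocks against the GTX 1080 Ti Founders Edition’s 1480/1582MHz spec (reference values assumed from nVidia’s published FE numbers):

```python
# Compare Gigabyte's shipped clock modes against the GTX 1080 Ti
# Founders Edition reference spec (assumed: 1480 base / 1582 boost, MHz).

REFERENCE = {"base": 1480, "boost": 1582}

modes = {
    "OC mode":     {"base": 1632, "boost": 1746},
    "Gaming mode": {"base": 1607, "boost": 1721},
}

for mode, clocks in modes.items():
    for kind, mhz in clocks.items():
        delta = mhz - REFERENCE[kind]
        pct = 100 * delta / REFERENCE[kind]
        print(f"{mode} {kind}: {mhz}MHz (+{delta}MHz, +{pct:.1f}% over reference)")
# OC mode works out to roughly +10% over reference; gaming mode to roughly +9%.
```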

On the heels of the media world referring to the Titan X (Pascal) as “Titan XP” – mostly to reduce confusion with the previous Titan X – nVidia today announced its actual Titan Xp (lowercase ‘p,’ very important), successor to the Titan XP. Lest Titan X, Titan X, and Titan X be too confusing, we’ll be referring to these as Titan XM [Maxwell], Titan X (Pascal), and Titan Xp. We really should apologize to Nintendo for making fun of their naming scheme, as nVidia now seems to be in competition; next, we’ll have the New Titan Xp (early 2017).

Someone at nVidia is giddy over taking the world’s Titan XP name and changing it, we’re sure.
