Hardware Guides

MSI GTX 1080 Sea Hawk EK Tear-Down

Published September 13, 2016 at 6:27 pm

Upon returning home from PAX, we quickly noticed that the pile of boxes included an MSI GTX 1080 Sea Hawk EK graphics card, which uses a pre-applied GPU water block for open loop cooling. This approach is more traditional and in-depth than the AIO / CLC solutions we've shown for GPUs, like the one the EVGA GTX 1080 FTW Hybrid uses (review here).

The Sea Hawk EK ($783) partners with, obviously, EK WB for the liquid cooling solution, and uses a full coverage block atop a custom MSI PCB for cooling. The biggest difference in such a setup is coverage of the VRAM, MOSFETs, capacitor bank, and PWM. The acrylic is channeled out for the inductors, so their heat is not directly conducted to the water block; this would increase liquid temperature unnecessarily, anyway.

We won't be fully reviewing this card. It's just not within our time budget right now, and we'd have to build up a wet bench for testing open loop components; that said, we'll soon be testing other EK parts – the Predator, mostly – so keep an eye out for that. The Sea Hawk EK was sent by MSI before confirming our review schedule, so we decided to tear it apart while we've got it and see what's underneath.

As soon as a switch depression joins the switch's electrical contacts, an electrical signal is dispatched within the mouse for processing by its internal components. That initial stage of processing helps rule out potential spurious behavior and electromagnetic interference (or cross-talk), and performs any necessary calculations for the input command. If deemed an intentional user action, the input is sent down the USB cable (or transmitted wirelessly) to the system.
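That "rule out spurious behavior" step is typically a debounce filter: a contact change is only reported as a real press or release once the reading has stayed stable for a short window. The sketch below is a generic illustration of that idea, not the actual firmware logic of any particular mouse; the 8 ms window and all names are hypothetical.

```python
class SwitchDebouncer:
    """Minimal debounce sketch: a state change is only confirmed
    after the raw contact reading stays stable for the full window.
    The 8 ms default is a hypothetical value, not a vendor spec."""

    def __init__(self, debounce_ms=8):
        self.debounce_s = debounce_ms / 1000.0
        self.stable_state = False   # last confirmed (debounced) state
        self.pending = None         # most recent raw reading
        self.last_change = 0.0      # timestamp of last raw transition

    def sample(self, raw_pressed, now):
        """Feed one raw contact sample taken at time `now` (seconds).
        Returns the new debounced state on a confirmed edge, else None."""
        if raw_pressed != self.pending:
            # Raw reading flipped -- restart the stability timer.
            self.pending = raw_pressed
            self.last_change = now
        if (self.pending != self.stable_state
                and now - self.last_change >= self.debounce_s):
            # Reading has differed from the confirmed state long
            # enough to count as an intentional press/release.
            self.stable_state = self.pending
            return self.stable_state
        return None  # no confirmed change yet
```

Feeding it a bouncy press shows the filter at work: the brief contact bounce at 1 ms is ignored, and the press is only confirmed once the reading has been stable past the window.

```python
d = SwitchDebouncer()
d.sample(True, 0.000)    # first contact -> not yet confirmed
d.sample(False, 0.001)   # contact bounce -> ignored, timer restarts
d.sample(True, 0.002)    # contact again -> timer restarts
d.sample(True, 0.012)    # stable for 10 ms -> press confirmed (True)
```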

We discussed this process in our Logitech G900 Chaos Spectrum review. There's a misconception among users that wireless input devices are inherently slower than their wired counterparts, when the reality is that the opposite can be true – and is, with the G900 and G403 wireless. The recent PAX West 2016 event gave us an opportunity to get hands-on with the company's USB analyzer setup to demystify some of the wireless vs. wired mouse arguments.

EVGA GTX 1080 FTW PCB & Overclocking Analysis

Published September 02, 2016 at 10:59 am

Buildzoid of “Actually Hardcore Overclocking” joined us to discuss the new EVGA GTX 1080 FTW PCB, as found on the Hybrid that we reviewed days ago. The PCB analysis goes into the power staging, and spends a few minutes explaining the 10-phase VRM, which is really a doubled 5-phase VRM. Amperage supported by the VRM and demanded by the GPU are also discussed, for folks curious about the power delivery capabilities of the FTW PCB, and so is the memory power staging.

If you're curious about the thermal solution of the EVGA FTW Hybrid, check out the review (page 1 & 3) for that. EVGA is somewhat uniquely cooling the VRAM by sinking it to a copper plate, then attaching that to the CLC coldplate. We say “somewhat” because Gigabyte also does this, and we hope to look at their unit soon.

EVGA GTX 1080 FTW Hybrid Tear-Down

Published August 30, 2016 at 12:25 pm

The review is forthcoming – within a few hours – but we decided to tear down EVGA's GTX 1080 FTW Hybrid ahead of the final review. The card is more advanced in its PCB and cooling solution than what we saw in the Corsair Hydro GFX / MSI Sea Hawk X tear-down, primarily because EVGA is deploying a Gigabyte-like coldplate that conducts heat from the VRAM to the CLC coldplate. It's an interesting fusion of cooling solutions, and one which makes GPU temperatures look higher than seems reasonable on the surface – prompting the tear-down – but is actually cooling multiple devices.

Anyway, here's a video of the tear-down process – photos to follow.

When we received the new 10-series laptops for review, we immediately noticed sluggishness in the OS while just preparing the environment for testing. Even with an SSD, opening Windows Explorer took at least one full second – an eternity, by today's standards. It was anything but instant, as a new computer should be, and would prompt outrage from any real-world consumer.

Looking further into the issue, we realized that the system tray accommodated 13 icons of pre-installed software that opened on launch. This included an incessant warranty registration pop-up/reminder, Norton Anti-Virus (the biggest offender on spurious CPU utilization), about three different control panels – because we need multiple paths to one location – and a few other programs.

This, traditionally, is what's known as “bloatware;” it's software pre-installed by the manufacturer that the user didn't necessarily request, and it bloats the system's process list until everything slows to a crawl. Today, we're showing just how profoundly a new system's framerate is dragged down by bloat. Using an MSI GE62VR Apache Pro laptop (~$1600) with a GTX 1060 and an i7-6700HQ CPU (boosts to 3.5GHz), 16GB DDR4, and an M.2 SSD, we're clearly not running Windows on slow hardware. And that's the thing, too – even Windows is slow at the desktop level. Just using the desktop, we'd occasionally spike to ~30% load for no good reason, and frequently hit 100% load during file transfers (thanks, Norton).

For validation purposes, we also ran the same tests on an MSI GE62 Apache Pro with a GTX 970M and i7 CPU. That's one last-gen model and one current model, both running factory Windows images with all the preset software included.

We've got a new thermal paste applicator tool that'll help ensure consistent, equal spread of TIM across cooler surfaces for future tests. As we continue to iterate on "Hybrid" DIY builds, or even just re-use coolers for testing, we're also working to control for all reasonable variables in the test process. Our active ambient monitoring with thermocouple readers was the first step of that, and ensures that even minute (0.1C resolution) fluctuations in ambient are accounted for in the results. Today, we're adding a new tool to the arsenal.

This is a production tool used in Asetek's factory, and is deployed to apply that perfect circle of TIM that comes pre-applied to all the liquid cooler coldplates. By using the same application method on our end (rather than a tube of compound), we eliminate the chance of testers changing application methods and the chance of applying too much or too little compound. These tools ensure exactly the same TIM spread each time, and mean that we can further eliminate variables in testing. That's especially important for regression testing.

This isn't a tool for home use; it's meant for production and test environments. Cooling manufacturers often fight over half a degree of temperature advantage, so it would be unfair to the products not to account for TIM application, which could easily create a 0.5C temperature swing. For consumers, that's irrelevant -- but we're showing a stack of products in direct head-to-head comparisons, and that stack needs to be accurate.

The Titan X (Pascal) DIY “Hybrid” project has come to a close, and with that, we've reached our results phase. This project has yielded the most visible swings in clock performance that we've yet seen from a liquid cooling mod, and has revealed significant thermal throttling in the reference nVidia Titan XP design. What's more, this card will not see the market saturation that AIB partners create with lower-end cards, so more advanced coolers do not seem to be available without going open loop or DIY.

Our liquid-cooled Titan X Pascal Hybrid has increased the card's non-overclocked frequency by an average of nearly 200MHz – again, pre-overclock – because we've removed the thermal throttle point. The card has also improved its clock-rate stability versus temperature and time, as demonstrated during our two-hour endurance run.


We've just finished testing the result of this build, and the results are equal parts exciting and intriguing – but that will be published following this content. We're still crunching data and making charts for part 3.

In the meantime, tearing down our reader's loaner Titan X (Pascal) GPU and fitting it with an EVGA Hybrid kit liquid cooler proved relatively easy. The mounting points on the Titan XP are identical to a GTX 1080's, and components can be used between the two cards almost interchangeably. The hole spacing on the Titan XP is the same as on the GTX 1080, which is the same as on the 980 Ti and 1070, and very similar to the GTX 1060 (which has a different base plate).

Here's the new video of the Titan X build, if you missed it:

With thanks to GamersNexus viewer Sam, we were able to procure a loaner Titan X (Pascal) graphics card whilst visiting London. We were there for nVidia's GTX 10 Series laptop unveil anyway, and without being sampled the Titan X, this proved the best chance at getting hands-on.

The Titan X (Pascal) GP102-400 GPU runs warmer than the GTX 1080's GP104-400 chip, as we'll show in benchmarks in Part 3 of this series, but still shows promise as a fairly capable overclocker. We've already managed +175MHz offsets from core with the stock cooler, but want to improve clock-rate stability over time and versus thermals. The easiest way to do that – as we've found with the 1080 Hybrid, 1060 Hybrid, and 480 Hybrid – is to put the card under water cooling (or propylene glycol, anyway).

In this first part of our DIY Titan XP “Hybrid” build log, we'll tear-down the card to its bones and look at the PCB, cooling solution, and potential problem points for the liquid cooling build.

Here's the video, though separate notes and photos are below:

MSI has begun filling in its X99A line of Broadwell-E motherboards with workstation-targeted options, built for compliance with ECC Registered DIMMs and with boosted maximum data throughput via M.2. The motherboard fits LGA2011-3 socketed CPUs, including Haswell-E and Broadwell-E, and supports SLI with nVidia Quadro GPUs for production workloads. Additional focus is placed on storage controllers and HSIO allocation, fitting for a board that will be deployed in workstation environments (e.g. render machines, CAD/ProE machines).

The X99A Workstation motherboard uses what appears to be an 8-phase power design for its core VRM, with additional phases for the memory. The VRM is built from titanium inductors rated to a max temperature of 220C, supporting higher current for extreme overclocks. Dark capacitors (solid caps) populate the board and the VRM's capacitor bank, rated for a 10-year lifespan.
