2xSLI PNY GTX 980s vs. Titan X 4K Gaming Benchmark

By Steve Burke. Published April 16, 2015 at 10:43 am

nVidia's latest addition to the Titan family diverges from its predecessors' market objectives. Previous Titan cards were fully double-precision enabled, ensuring marketability as affordable production and simulation cards that, by nature, also served reasonably well as gaming cards. Because double-precision is detrimental to gaming performance, the original Titan and current Titan Z can be set to a “single-precision mode” for better gaming performance, but they aren't targeted as the “best gaming video card” out there. The Titan X is; in fact, that's exactly what nVidia calls it – the best single-GPU on the market. The selection of these words is intentional, ruling out dual-GPU single cards (like the 295X2 or 690) and multi-card configurations (like what we're testing today).

Because the Titan X is heavily marketed as a gaming solution – something reinforced by double-precision output at just 1/32 of its single-precision rate – we decided to perform a value comparison against 2xGTX 980s in SLI. The SLI configuration offers indisputably powerful raw computational output, but its usable 4GB memory pool is smaller than the Titan X's 12GB, since SLI mirrors memory across cards rather than combining it.

The specificity of this benchmark effectively assumes that the reader is already nearing a point of contention between the Titan X and its alternatives. In an act of curiosity, we benchmarked the new GTX Titan X 12GB single-card against the performance and power consumption of 2xPNY GTX 980s in SLI. The test analyzes the value provided by each choice, then assists in determining which configuration is the best fit for users demanding an nVidia solution at the high-end. We do not analyze competing cards in this article due to the nature of the comparison – we're assuming you already want either 2x980s or a single Titan X. This test may be revisited in a competitive analysis as AMD continues to iterate on its graphics lineup and introduce multi-GPU, single-card solutions.

Note that this is a single-GPU vs. dual-GPU comparison, so a large performance difference should be expected.

Titan X & GTX 980 Specs

                      GTX Titan X       GTX 980             PNY GTX 980 XLR8 Pro
GPU                   GM200             GM204               GM204
Fab Process           28nm              28nm                28nm
Texture Filter Rate   192GT/s           144.1GT/s           144.1GT/s
TjMax                 91C               95C                 95C
Transistor Count      8B                5.2B                5.2B
ROPs                  96                64                  64
TMUs                  192               128                 128
CUDA Cores            3072              2048                2048
Base Clock (GPU)      1000MHz           1126MHz             1228MHz
Boost Clock (GPU)     1075MHz           1216MHz             1329MHz
GDDR5 / Interface     12GB / 384-bit    4GB / 256-bit       4GB / 256-bit
Memory Bandwidth      336.5GB/s         224GB/s             224GB/s
Mem Speed             7Gbps             7Gbps               7Gbps
Power                 1x8-pin           2x6-pin             2x6-pin
TDP                   250W              165W                165-180W
Output                3xDisplayPort,    3xDisplayPort 1.2,  3xDisplayPort 1.2,
                      1xHDMI 2.0,       1xHDMI 2.0          1xHDMI 2.0
                      1xDual-Link DVI
MSRP                  $1000             $550                $590

Maxwell made its official debut with the 750 Ti, but didn't see its full feature set unveiled until the GTX 980 and its GM204 chip. The Titan X's GM200 uses an architecture identical to the GTX 980's GM204, but increases a few core specifications (as above) in critical measures. The transistor count of the GM200 GPU sees a ~54% increase over GM204, climbing from 5.2B to 8B transistors. Raster output units (ROPs) also see a meaningful jump, moving from 64 to 96 units on the single GM200 GPU. Perhaps most critically, the GM200 hosts 3072 CUDA cores and supports 12GB of on-card GDDR5 memory on a 384-bit interface. All calculated, memory bandwidth sits at a massive 336.5GB/s.
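That bandwidth figure is just the per-pin data rate multiplied across the bus width. A quick sanity check (the helper function here is our own illustration, not a vendor tool):

```python
def gddr5_bandwidth_gbs(gbps_per_pin, bus_width_bits):
    # GB/s = per-pin data rate (Gb/s) * bus width (bits) / 8 bits-per-byte
    return gbps_per_pin * bus_width_bits / 8

print(gddr5_bandwidth_gbs(7.0, 384))  # 336.0 GB/s (Titan X at a flat 7Gbps)
print(gddr5_bandwidth_gbs(7.0, 256))  # 224.0 GB/s (GTX 980)
```

nVidia's quoted 336.5GB/s implies an effective memory rate a hair above a flat 7Gbps.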

TDP is also worth paying careful attention to: the Titan X carries a 250W TDP for its single GPU, while each 980 suggests ~165-180W – roughly 330-360W for the SLI pair.

Priced at roughly $1000, we pitted the Titan X against two of PNY's $590 GTX 980 XLR8 Pro OC video cards. The PNY 980s have a combined cost of roughly $1180, but a current $550 sale puts them at $1100.

NVIDIA & AMD Both Refocusing on Multiple Graphics Cards

We've historically shown preference toward single-card configurations for most gaming environments. Until recently, few games adequately supported SLI and CrossFire configurations, drivers were less optimized than for a single video card, and gamers ultimately had to disable SLI or CrossFire in instances where the technology was not supported. Aside from these issues, single-device configurations are simpler, often consume less power, and allow more versatility in component selection.

There's often an argument that “I'll buy one now and another one in a few years,” but we've found that multi-card configurations really don't work this way; wait too long, in many cases, and the device's EOL will ensure it becomes difficult or offensively expensive to pick up another.

Still, nVidia and AMD have both heavily pushed for SLI and CrossFire support in recent driver updates. The two companies have dedicated no small amount of time to talking multi-card configurations in press conferences. As game developers expand support (even if as a result of nudges from manufacturers) and drivers increment SLI or CrossFire performance, the technology finally appears viable.

Test Methodology

We tested using our updated 2015 GPU test bench, detailed in the table below. Our thanks to supporting hardware vendors for supplying some of the test components.

The latest 350.12 GeForce driver was used during testing. Game settings were manually controlled for the DUT. Overclocking was neither applied nor tested, though stock overclocks (“superclocks”) were left untouched.

VRAM utilization was measured using in-game tools (when present) and then validated with MSI Afterburner, which is built atop the RivaTuner software. Parity checking was performed with GPU-Z. FPS measurements were taken using FRAPS, analyzed with FRAFS, then added to an internal spreadsheet for further analysis.

Each game was tested for 30 seconds in an identical scenario on the two cards, then repeated for parity.

GN Test Bench 2015   Name                              Courtesy Of       Cost
Video Card           GTX Titan X 12GB /
                     2x PNY GTX 980 SLI
CPU                  Intel i7-4790K                    CyberPower
Memory               32GB 2133MHz HyperX Savage RAM    Kingston Tech.    $300
Motherboard          Gigabyte Z97X Gaming G1           GamersNexus       $285
Power Supply         NZXT 1200W HALE90 V2              NZXT              $300
SSD                  HyperX Predator PCI-e SSD         Kingston Tech.    TBD
Case                 Top Deck Tech Station             GamersNexus       $250
CPU Cooler           Be Quiet! Dark Rock 3             Be Quiet!         ~$60

Average FPS, 1% low, and 0.1% low framerates are measured. We do not measure maximum or minimum FPS results, as we consider these numbers to be pure outliers. Instead, we take an average of the lowest 1% of results (1% low) to show real-world, noticeable dips; we then take an average of the lowest 0.1% of results (0.1% low) to show severe drops.
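The 1% low and 0.1% low math can be sketched roughly like this (a simplified illustration of the approach, not our production spreadsheet; the function name is invented for the example):

```python
def low_averages(fps_samples, fractions=(0.01, 0.001)):
    """Average of the lowest 1% and 0.1% of logged FPS samples."""
    ordered = sorted(fps_samples)
    results = {}
    for frac in fractions:
        # Keep at least one sample so short logs still yield a value.
        count = max(1, int(len(ordered) * frac))
        results[frac] = sum(ordered[:count]) / count
    return results

# Example: 1000 logged FPS values from a 30-second pass
samples = list(range(1, 1001))
lows = low_averages(samples)
print(lows[0.01])   # 5.5  (average of the ten lowest samples)
print(lows[0.001])  # 1.0  (average of the single lowest sample)
```

This is why a single stray minimum frame doesn't skew our numbers: it gets averaged into its percentile bucket rather than reported alone.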

3DMark FireStrike Scoring – Ultra (4K) & Extreme (1440p)

The first chart we're showing uses Futuremark's 3DMark FireStrike benchmark. This utility is a synthetic benchmarking platform that tests video cards in an easily-replicated environment. Each card is placed under stress using graphics and physics effects of the sort implemented in real-world games. The framerates (in FPS) are combined and computed into a 3DMark score, shown below:


Continue on for the more relatable FPS numbers.

3DMark FireStrike FPS – Ultra (4K) & Extreme (1440p)

At 1440p, FireStrike Extreme showcases effectively identical physics performance but large gains in graphics FPS. The biggest disparity is shown between the Titan X and SLI 980s during “Graphics Test 1,” where the 2x980s trounce the Titan X ~34FPS to ~22FPS.


FireStrike Ultra evokes greater memory saturation with its 4K rendering, but still favors the 2x980 configuration. Combined and physics performance are largely uninteresting, though graphics performance sees gaps greater than 20FPS.


GTA V 4K FPS Benchmark – SLI GTX 980s vs. Titan X

Here's a chart pulled from our recently-conducted GTA V benchmark:


Although the Titan X offers an indisputably powerful FPS for a single video card, two GTX 980s outperform it handily in GTA V.

Far Cry 4 4K FPS Benchmark – SLI GTX 980s vs. Titan X

At 4K resolution, the Titan X falls behind enough that we'd have to slightly lower settings if we wanted to assure fluid playability at a higher framerate. The 2x980 configuration offers powerful output at a more-than-acceptable FPS, but occasionally exhibits texture and world environment flickering. The flickering is jarring enough to warrant disabling SLI in some in-game locales.


Assassin's Creed Unity 4K Benchmark – SLI GTX 980s vs. Titan X

This is exactly what we discussed above. In SLI, ACU refuses to work in any measurable regard. We tweaked various settings through the files and were unable to establish system stability.


Regardless of potential gains, the biggest ding against SLI here is the requirement to disable one of the two cards entirely.

GRID: Autosport 4K Benchmark – SLI GTX 980s vs. Titan X

Performance in GRID: Autosport – which caps close to 2GB in VRAM consumption – is effectively identical. The Titan X has a slight lead in AVG FPS, but this is within margin of error.


Battlefield: Hardline 4K Benchmark – 2x980s vs. Titan X FPS

1% low and 0.1% low FPS are not extrapolated from our Battlefield: Hardline data due to software-side restrictions on our usual test suite. We use MSI Afterburner for Hardline FPS logging and have not yet implemented a means to analyze data in an equivalent fashion to our other tests. For this reason, we're only showing average FPS.


The Titan X trails behind at below 50FPS when running BFHL at 4K resolution, with the 2x980 configuration pushing 63FPS – more than acceptable.

Metro: Last Light 4K Benchmark – 2x980s vs. Titan X


Metro: Last Light remains among the most widely-used video card benchmarking platforms. This real-world, actual game provides useful insight into heavily driver-optimized performance across different hardware configurations.

At 4K, the difference is noticeable enough to make an impact on playability (~60 FPS vs. ~50 FPS), with the 2x980s leading the way again.

Power Consumption – Maximum Combined Peak Load

The below chart represents peak system load using each device. This is measured using monitoring hardware that rests between the outlet and the system, logging power consumption as the GPU is placed under 100% load from synthetic FireStrike testing. Note well that the number displayed is not the TDP or power consumption of the video card itself, but rather that of an identical system with the cards swapped.


The Titan X draws significantly less power, sitting at almost 100W lower demand than the 2x980 SLI configuration. This is of little consequence to most gamers, but could be important to power-conscious users with specific thermal or power demands.

Given that a very high-end bench consumes just 454W with 2x980s, it's tough to make a power argument against either card other than in very particular use case scenarios.

Conclusion: Titan X or 2xGTX 980s?

SLI has its downsides, but a lot of that falls upon game developers and publishers. The 2x980 configuration excels in almost all tests above, sans software-bottlenecked performance in GRID, but still isn't perfect; both titles we complain about, ACU and FC4, come from the same publisher and exhibit consistent SLI compatibility issues. On our platform, it is necessary to disable SLI in order to play ACU. FC4 was playable, but its issues took some time to resolve.

These two hiccups hurt the argument for multi-card solutions, but as long as the majority of games you're interested in support the tech, it's still worth considering. Overall performance is exceptional on the system using the two PNY GTX 980s, often surpassing the Titan X by measurable leaps and bounds. Power draw is noticeably higher with the two 980s, though, and the memory pool is smaller, so scalability becomes questionable as games continue to demand greater VRAM capacities at higher resolutions. For the immediate future, the 980s appear the victor.

As for the increased VRAM: playing GTA V at 4K with MSAA enabled will exceed the 4GB memory pool. Although this title shows few noticeable dips in performance today, it won't remain that way forever. At UHD resolutions and with filter settings enabled, the 4GB pool is rapidly consumed by the increased pixel count and filtering workload.

A single Titan X would also be worthwhile if you're determined to leave room for additional expansion cards, are limited on PCI-e lanes, or are concerned about SLI compatibility (which is still an issue, as highlighted).

- Steve “Lelldorianx” Burke.

Last modified on April 16, 2015 at 10:43 am
Steve Burke

Steve started GamersNexus back when it was just a cool name, and now it's grown into an expansive website with an overwhelming amount of features. He recalls his first difficult decision with GN's direction: "I didn't know whether or not I wanted 'Gamers' to have a possessive apostrophe -- I mean, grammatically it should, but I didn't like it in the name. It was ugly. I also had people who were typing apostrophes into the address bar - sigh. It made sense to just leave it as 'Gamers.'"

First world problems, Steve. First world problems.
