Game Benchmarks

AMD was first to market with Doom-ready drivers, but exhibited exceptionally poor performance with a few of its cards. The R9 390X was one of them, outperformed massively (~40%) by the GTX 970 and nearly matched by the GTX 960 at 1080p. If the price difference between the two doesn't make it apparent, that's unacceptable: the R9 390X's hardware should effortlessly outperform the GTX 960, a budget-class card, and it just wasn't happening. Shortly after the game launched and AMD posted its initial driver set (16.5.2), a hotfix (16.5.2.1) was released to resolve performance issues on the R9 390-series cards.

We had a moment to re-benchmark DOOM using the latest drivers between our GTX 1080 Hybrid experiment and current travel to Asia. The good news: AMD's R9 390X has improved substantially – about 26% in some tests. Other cards were unaffected by this hotfix (though we did test them), so don't expect a performance gain out of your 380X, Fury X, or similar non-390-series device.
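Percent uplift figures like the ~26% above are computed against the old driver's average framerate. A minimal sketch of that arithmetic, using hypothetical placeholder FPS values rather than our actual test data:

```python
# Minimal sketch of driver-to-driver percent uplift. The FPS values
# below are hypothetical placeholders, not our benchmark results.
def percent_uplift(before_fps: float, after_fps: float) -> float:
    """Return the percent change relative to the old driver's average FPS."""
    return (after_fps - before_fps) / before_fps * 100.0

old_avg = 50.0  # hypothetical average FPS on driver 16.5.2
new_avg = 63.0  # hypothetical average FPS on hotfix 16.5.2.1

print(f"{percent_uplift(old_avg, new_avg):.1f}% uplift")  # prints "26.0% uplift"
```

Note that the baseline matters: a 26% uplift over the old driver is not the same as the new driver being "26% of the way" to another card's performance.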

Note: These charts now include the GTX 1080 and its overclocked performance.

Following our GTX 1080 coverage of DOOM – and preempting the eventual review – we spent the time to execute GPU benchmarks of id Software's DOOM. The new FPS boasts high-fidelity visuals and fast-paced, Quake-era gameplay mechanics. Histrionic explosions dot Doom's hellscape, overblown only by its omnipresent red tint and magma flows. The game is heavy on particle effects and post-processing, performing much of its crunching toward the back of the GPU pipeline (after geometry and rasterization).

Geometry isn't particularly complex, with the game's indoor settings composed almost entirely of labyrinthine corridors and rooms. Framerate fluctuates heavily: the more lighting effects and particle simulation in the camera frustum, the greater the swings in FPS as players emerge into or depart from lava-filled chambers and other areas of post-FX interest.

In this Doom graphics card benchmark, we test the framerate (FPS) of various GPUs in the new Doom “4” game, including the GTX 980 Ti, 980, 970, Fury X, 390X, 380X, and more. We'll briefly define game graphics settings first; game graphics definitions include brief discussion on TSSAA, directional occlusion quality, shadows, and more.

Note: Doom will soon add support for Vulkan. It's not here yet, but we've been told to expect Vulkan support within a few weeks of launch. All current tests were executed with OpenGL. We will revisit for Vulkan once the API is enabled.

Ashes of the Singularity has become the poster child for early DirectX 12 benchmarking, if only because it was the first to market with ground-up DirectX 12 and DirectX 11 support. Just minutes ago, the game officially updated its early build to include its DirectX 12 Benchmark Version 2, making critical changes that include cross-brand multi-GPU support. The benchmark also received updates to improve reliability and reproducibility of results, primarily by giving all units 'god mode,' so inconsistent deaths don't impact the workload.

For this benchmark, we tested explicit multi-GPU functionality by using AMD and nVidia cards at the same time, something we're calling “SLIFire” for ease. The benchmark specifically uses MSI R9 390X Gaming 8G and MSI GTX 970 Gaming 4G cards vs. 2x GTX 970s, 1x GTX 970, and 1x R9 390X for baseline comparisons.

Cloud Imperium Games' Star Citizen achieved a major milestone with the distribution of its Alpha 2.0 package, allowing multiplayer exploration in addition to existing dog-fighting and free flight. This release gives players the first glimpse of the game's open world intentions, presenting environments forged in Sci-Fi influence.

There's not much in the way of gameplay just yet, but Alpha 2.0 has been made available to all backers for initial bug- and stress-testing. We decided to conduct a test of our own, specifically looking at GPU performance and preset scaling across multiple “game modes.” Right now, because the pre-release game is composed of several disjointed modules, there's no one “Play Star Citizen” button – it's split into parts. Racing, free flight, and dog-fighting are in one module (Arena Commander), the Hangar stands alone, and online testing with ArcCorp and Crusader was just released.

For our Star Citizen video card benchmark, we look at GPU vs. GPU performance in the race, delta performance scaling on ArcCorp and in the hangar or free flight, and talk methodology. The game isn't done and has yet to undergo performance optimizations and official driver support, so we won't be recommending the usual “best graphics cards for [game]” this time, as we usually do in our game benchmarks.

Rico's back in town. This time, the vigilante who saves the people by blowing up The People's Water Tower comes in high-fidelity graphics with a focus on lighting FX and water tech. Just Cause 3 revisits a partnership with nVidia's GameWorks development kit, making use of the WaveWorks tech that was previously found in Just Cause 2 (a 2010 release). The game's graphics settings are fairly simple for anyone following our game benchmarks, but we'll recap potential points of confusion further down.

Our Just Cause 3 GPU benchmark puts nVidia & AMD graphics cards to the test at 1080, 1440, and 4K resolutions, using “Very High” and “High” settings for FPS testing. Among others, the video card benchmark includes the 980 Ti (+ SLI), 980, 970, 960, et al., and AMD's 390X, 380X (+ CrossFire), 290X, 270X, et al.

We've noticed some curiosities with Just Cause 3's implementation of water detail scaling and will cover that further down.

Forthcoming team shooter Overwatch is Blizzard's first new IP in years, fusing familiar FPS and team-based elements with MOBA-like playable characters. That, at its core, is what we'd call a “team shooter,” a genre that's been popularized most recently by Team Fortress 2.

The game is still going through closed beta testing, with select Battle.net accounts receiving invites to play-test the game over a few weekends. This weekend's test was, according to Overwatch PR Manager Steven Khoo, an attempt at learning “how Overwatch runs on your system” and a reach-out for “technical feedback.” We figured we'd throw ten video cards at the game and see how it does.

Overwatch isn't particularly GPU intensive, but it does make use of some advanced shadow and reflection techniques that can impact FPS. We performed some initial settings analysis – shown further down – to determine top-level performance impact on a per-setting basis. This is the basis of our eventual graphics optimization guide (see: Black Ops equivalent), something we'll finalize at the game's launch. For now, the goal was to provide a foundation upon which to base our GPU test methodology with Overwatch. This graphics card benchmark looks at the best GPUs for Overwatch (beta), testing 1080p, 1440p, and 4K resolutions across “Epic” and “Ultra” settings.
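The per-setting analysis boils down to comparing average FPS with one setting lowered against a maxed-out baseline. A minimal sketch of that comparison, with entirely hypothetical setting names and FPS figures (not our Overwatch data):

```python
# Hypothetical per-setting impact sketch: measure average FPS with one
# setting reduced, then express the change relative to the maxed baseline.
# All names and numbers here are placeholders for illustration.
baseline_fps = 100.0  # hypothetical "everything maxed" average FPS

lowered = {  # hypothetical averages with a single setting reduced
    "shadows": 112.0,
    "reflections": 108.0,
    "ambient_occlusion": 104.0,
}

# Percent FPS gained by lowering each setting, sorted by biggest impact.
impact = {
    setting: (fps - baseline_fps) / baseline_fps * 100.0
    for setting, fps in lowered.items()
}

for setting, pct in sorted(impact.items(), key=lambda kv: -kv[1]):
    print(f"{setting}: +{pct:.1f}% FPS when lowered")
```

Lowering one setting at a time (rather than several at once) is what makes the impact attributable to that setting, which is why optimization guides test this way.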

Battlefront is one of the best-optimized games right now, strictly looking at the graphics-versus-framerate output across multiple GPUs. The game fronts brilliant shading, lighting, and post-FX, leveraging what appears to be some form of PBR (though we're not positive) to create a more realistic aesthetic without hammering draw calls and polys.

That was all tested on an X99 platform, though, so we figured it'd be worth a look at Battlefront's fluidity across our (still limited) CPU suite. We benchmarked Battlefront with the Intel lineup (G3258 to i7) and some of AMD's FX CPUs, including one APU + dGPU combination. If a CPU isn't present here, it means one of two things: we either don't have it, or it's presently being used for another benchmark – which accounts for quite a few CPUs, given game launch season.

Assassin's Creed: Unity was the last AC title we benched, and it led us to an exciting discovery: some games, AC Unity among them, will present a sizable performance disparity between 2GB and 4GB models of the same video card. The Assassin's Creed series has long been heralded as a standard-bearer for lighting, shading, and FX technologies, emboldening its geometrically complex environments with supporting filtration and processing. Lighting and shadows propel games like Assassin's Creed to a point in visual fidelity where, without bloating poly-count beyond reason, the models and meshes look smooth and sculpted beyond the geometry's own means, were it unassisted by lighting.

Ubisoft's Assassin's Creed Syndicate was made available to us last night, at which point it was immediately used for benchmarking AMD's brand-new R9 380X GPU. This graphics card benchmark of Assassin's Creed Syndicate tests the game at Ultra and Medium settings across 1080p, 1440p, and 4K resolutions. All the latest cards are present in our Syndicate GPU benchmark – the GTX 980 Ti, 980, 970, 960 4GB vs. 2GB, 950, and 750 Ti from nVidia; the R9 390X, R9 380X (~$240), R9 290X, R9 285, and R9 270X from AMD.

Of most note, AC Syndicate carries Unity's legacy of truly accentuating the 2GB vs. 4GB VRAM gap in the GTX 960 cards, something that should, theoretically, propagate to other multi-size models (like the R9 380, if we had them).

October 9 saw the conclusion of a beta week full of GPU benchmarking, posting initial performance numbers that were favorable across nearly all modern $120+ GPUs. Our testing put everything from the 750 Ti to the 390X – and above – to the test, and we ultimately concluded that the ideal GPU selection included the R9 380 & GTX 960 for 1080p and the GTX 970 & R9 390X for 1440p. But that was the beta, something we indicated amply in the first benchmark, and those numbers had potential to shift as the game approached launch.

Star Wars Battlefront is now officially out for PC. Our refreshed Star Wars Battlefront GPU benchmark tests FPS output at 1080p, 1440p, and 4K, using Ultra, High, and Medium settings. The tests, as always, bench the GTX 980 Ti vs. the 980, 970, 960, and downward, alongside AMD's R9 390X vs. the R9 290X, 285, 270X, and 250X.

Below is a look at the game's graphics settings maxed-out at 4K, followed by a quick recap of the Battlefront graphics settings and what they do.

Fallout 4 – now an entire day old – is nearing the end of its content lifecycle on our test bench. We'll soon move on to the next major hardware and games, but for now, we've got a few tests left in Bethesda's latest title. This next benchmark looks at the game's CPU performance, a greater methodological challenge than our preceding GPU benchmark, volumetric lighting benchmark, and texture quality comparison.

Our Fallout 4 CPU benchmark compares FPS across the Intel & AMD lineups, including an i3-4130, i5-4690K, i5-6600K, some i7- CPUs, and AMD's FX-8370E, 8320E, & 9590. Other CPUs were tested as well, like the popular G3258 dual-core Pentium and A10-7870K APU, and are all included on the benchmark charts. The top-level goal was to find the best CPU for Fallout 4, but we also wanted to identify performance disparities and anomalies across models.
