AMD issued a statement moments ago pertaining to the Radeon Software (see also: Radeon Settings, Crimson, formerly Catalyst) fan speed configuration that is causing GPU overheating issues. Some users have reported GPU death resulting from excessive thermals, which correlate with fan speeds set inadequately low for the heat being generated.

This article specifically looks at single-GPU solutions for gaming at various price points. We scale our GPU search from $100 to $600, covering PC builders across budget, mid-range, and high-end configurations. We've had extensive hands-on time testing the cards below, a fact accentuated by the burst of game launches in the past few weeks. Most of these cards have been tested in Battlefront, Fallout 4, AC Syndicate, Black Ops III, and the year's earlier titles, like The Witcher 3 and GTA V.

With Black Friday starting to hit full swing, we found some of the best graphics cards of the year on sale at – in some cases – significant discounts. The GTX 970 at $290, R9 380 at $143, and GTX 980 at $400 are just a few of the finds below.

Software doesn't normally warrant a standalone review on this site; we'll review the hardware and, as an accompaniment, talk about the software's ability to adequately enable that hardware. AMD's newest “Radeon Settings – Crimson Edition” (introduced here) supersedes its long-standing Catalyst Control Center, which has been retired from service. Radeon Settings, which we'll interchangeably refer to as “Crimson,” is a complete overhaul of the AMD control interface. This, we think, warrants more of an in-depth tear-down than a simple news post.

There shouldn't be major performance updates included in the preview package we were provided; at least, not any more than what we've found in 15.11.1 benchmarking. This is largely an interface improvement, moving to a minimalistic UI – the trend of late – and attempting to improve ease-of-use for anyone with AMD Radeon hardware.

Forthcoming team shooter Overwatch is Blizzard's first new IP in years, fusing familiar FPS and team-based elements with MOBA-like playable characters. That, at its core, is what we'd call a “team shooter,” a genre that's been popularized most recently by Team Fortress 2.

The game is still going through closed beta testing, with select accounts receiving invites to play-test the game over a few weekends. This weekend's test was, according to Overwatch PR Manager Steven Khoo, an attempt at learning “how Overwatch runs on your system” and a reach-out for “technical feedback.” We figured we'd throw ten video cards at the game and see how it does.

Overwatch isn't particularly GPU intensive, but it does make use of some advanced shadow and reflection techniques that can impact FPS. We performed some initial settings analysis – shown further down – to determine top-level performance impact on a per-setting basis. This is the basis of our eventual graphics optimization guide (see: Black Ops equivalent), something we'll finalize at the game's launch. For now, the goal was to provide a foundation upon which to base our GPU test methodology with Overwatch. This graphics card benchmark looks at the best GPUs for Overwatch (beta), testing 1080p, 1440p, and 4K resolutions across “Epic” and “Ultra” settings.
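
To illustrate how that per-setting analysis reduces to numbers – the setting names and FPS figures below are placeholders, not measured Overwatch results – the relative cost of each option can be computed as the FPS drop against a lowest-settings baseline:

```python
# Hypothetical per-setting FPS impact calculation -- the figures below are
# placeholders for illustration, not measured Overwatch data.
baseline_fps = 120.0  # average FPS with all tested settings at their lowest values

# Average FPS observed with each individual setting raised to its maximum,
# all other settings held at the baseline.
fps_with_setting_maxed = {
    "Shadow Detail": 104.0,
    "Reflections": 109.0,
    "Texture Quality": 118.0,
}

for setting, fps in fps_with_setting_maxed.items():
    cost_pct = (baseline_fps - fps) / baseline_fps * 100
    print(f"{setting}: -{cost_pct:.1f}% average FPS")
```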

Battlefront is one of the best-optimized games right now, strictly looking at the graphics-versus-framerate output across multiple GPUs. The game fronts brilliant shading, lighting, and post-FX, leveraging what appears to be some form of PBR (though we're not positive) to create a more realistic aesthetic without hammering draw calls and polys.

That was all tested on an X99 platform, though, so we figured it'd be worth a look at Battlefront's fluidity across our (still limited) CPU suite. We benchmarked Battlefront with the Intel lineup (G3258 to i7) and some of AMD's FX CPUs, including one APU + dGPU combination. Anything not present here means one of two things: we either don't have it, or it's presently in use for another benchmark – which accounts for quite a few CPUs, given game launch season.

“Team Red” appears to have been invigorated lately, inspired by unknown forces to “take software very seriously” and improve timely driver roll-outs. The company, which went about half a year without a WHQL driver from 2H14-1H15, has recently pushed game-ready drivers out nearer to launch dates, refocused on software, and is marketing its GPU strengths.

The newest video card from AMD bears the R300 series mark, from which we previously reviewed the R9 380 & R9 390 GPUs. AMD's R9 380X 4GB GPU carries a $230 MSRP, but retails closer to $240 through board partners, and hosts roughly 14% more cores (2,048 vs. 1,792 stream processors) than the championed R9 380 graphics card (~$200 after MIRs). That places the R9 380X in direct competition with nVidia's GTX 960 4GB, priced at roughly $230, and its 2GB alternative at $210.
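
For context, the quick arithmetic behind those figures (stream-processor counts are the published Tonga specs; prices are the approximate figures cited above) works out as follows:

```python
# Quick arithmetic behind the R9 380X vs. R9 380 comparison.
r9_380_cores, r9_380x_cores = 1792, 2048    # published stream-processor counts
r9_380_price, r9_380x_price = 200.0, 230.0  # approximate pricing cited above

core_delta = (r9_380x_cores / r9_380_cores - 1) * 100
price_delta = (r9_380x_price / r9_380_price - 1) * 100
print(f"R9 380X: {core_delta:.1f}% more cores for {price_delta:.1f}% more money")
# -> roughly 14% more cores for 15% more money
```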

Today, we're reviewing the Sapphire Nitro version of AMD's R9 380X graphics card, including benchmarks from Battlefront, Black Ops III, Fallout 4, Assassin's Creed Syndicate, and more. The head-to-head pits the R9 380X 4GB vs. the GTX 960 4GB, something we've done in-depth below. We'll go into thermals, power consumption, and overclocking on the last page.

October 9 saw the conclusion of a week of beta GPU benchmarking, posting initial performance numbers that were agreeable for nearly all modern $120+ GPUs. Our testing put everything from the 750 Ti to the 390X – and above – to the test, and we ultimately concluded that the ideal GPU selection included the R9 380 & GTX 960 for 1080p and the GTX 970 & R9 390X for 1440p. But that was the beta, something we indicated amply in the first benchmark, and those numbers had potential to shift as the game approached launch.

Star Wars Battlefront is now officially out for PC. Our refreshed Star Wars Battlefront GPU benchmark tests FPS output at 1080p, 1440p, and 4K, using Ultra, High, and Medium settings. The tests, as always, bench the GTX 980 Ti vs. the 980, 970, 960, and downward, alongside AMD's R9 390X vs. the R9 290X, 285, 270X, and 250X.

Below is a look at the game's graphics settings maxed-out at 4K, followed by a quick recap of the Battlefront graphics settings and what they do.

We're in the final throes of our Call of Duty: Black Ops III content before moving on to the next game – you know the one. While processing data for our forthcoming graphics optimization guide, we realized that Black Ops III is among the most VRAM-hungry games we've ever tested, consuming upwards of 10GB GDDR5 on the Titan X.

Our GPU benchmarks included some initial memory benchmarking, showing that the 980 Ti saw full saturation of its 6GB framebuffer at 4K/max settings. We also showed that the game commits 15.2GB of memory under max settings (pageable address space), with active physical consumption of about 6.7GB (working set) in multiplayer. Our testing shows that the singleplayer campaign is far more intensive than multiplayer, to the tune of 38.6% lower FPS on the GPU side.
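
For readers who want to reproduce that committed-vs-working-set distinction on their own systems, here's a minimal sketch (assuming the third-party psutil package; the process name is just an example) that reads both counters for a running game process:

```python
# Minimal sketch: read working-set vs. committed memory for a running process.
# Assumes the third-party psutil package; "BlackOps3.exe" is an example name.
import psutil

def report_memory(process_name: str) -> None:
    for proc in psutil.process_iter(["name", "memory_info"]):
        if (proc.info["name"] or "").lower() == process_name.lower():
            mem = proc.info["memory_info"]
            # On Windows, rss reports the working set (physical RAM in use),
            # while vms roughly corresponds to the commit charge (private,
            # pageable bytes) -- the two figures contrasted above.
            print(f"Working set: {mem.rss / 2**30:.1f} GB")
            print(f"Committed:   {mem.vms / 2**30:.1f} GB")

report_memory("BlackOps3.exe")
```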

During tests of all of Call of Duty: Black Ops III's graphics settings, we uncovered VRAM consumption approaching 10GB in campaign mode when using 4K & “Extra” settings.

Readers following our story on the Asetek vs. Cooler Master lawsuits may remember a call to attention regarding the Fury X's use of a Cooler Master Seidon-equivalent CLC. Gigabyte's newest GTX 980 WaterForce card uses a 120mm CLC supplied by Cooler Master, with the pump mounted atop the coldplate (GPU block). This falls within the scope of Asetek's patent claims – the company holds patents covering GPU- and CPU-mounted pumps – and Gigabyte could reasonably be impacted by the resolutions set forth in the recent lawsuits.

A new GeForce Experience update sees the addition of 4K60 game streaming support, allowing nVidia users to remotely render game content at high native resolutions. Changes to the distribution approach for “Game-Ready” drivers – issued on launch day for popular betas and AAA titles – are also being made with the new GFE update, discussed below.

GeForce Experience is nVidia's software “ecosystem” that handles driver downloads, updates, game optimization, retroactive game capture (ShadowPlay), game streaming and sharing, and more. The service now claims a 65-million user install base, with nVidia noting that “90% or more” of its driver downloads are now distributed directly through the utility. NVidia's strategy is to unify its consumer base under a single application, with the current trajectory aiming to introduce news updates, contests, and giveaways through email newsletters. Long-standing GFE features, namely game optimization and driver notifications, remain a mainstay of the application, greatly aiding usability for gamers who don't necessarily check for drivers with each game release.

