Johan Andersson, a Frostbite developer at EA, today posted a photograph of AMD's new liquid-cooled video card. It is already known that the R9 300-series video cards are due for release in the summer – likely June – and that the flagship devices will be liquid cooled, but little has been officially announced. The pictured Pirate Islands card is assumed to be a 390X.

Benchmarking The Witcher 3 proved more cumbersome than any game we've previously tested. CD Projekt Red's game doesn't present the overwhelming assortment of options that GTA V does – all of which we tested, by the way – but it was still a time-consuming piece of software to analyze. This is largely due to optimization issues across the board, but we'll dive into that momentarily.

In this Witcher 3: Wild Hunt PC benchmark, we compare the FPS of graphics cards at varied settings and resolutions (1080p, 1440p, 4K) to uncover achievable framerates. Among others, we tested SLI GTX 980s, a Titan X, GTX 960s, last-gen cards, and AMD's R9 290X, 285, and 270X. Game settings were tweaked per our methodology (below) for the fairest comparison, but we primarily checked FPS at 1080p (ultra, medium, low), 1440p (ultra, medium), and 4K (ultra, medium).

That's a big matrix.
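For a sense of scale, here's a minimal sketch – purely illustrative Python, not our actual test harness, and the card list is an assumption pulled from the cards named above – that simply enumerates the combinations:

# Illustrative sketch of the test matrix described above; not our benchmarking
# tooling, just an enumeration of the resolution/preset/card combinations.
combos = [
    ("1080p", ["Ultra", "Medium", "Low"]),
    ("1440p", ["Ultra", "Medium"]),
    ("4K",    ["Ultra", "Medium"]),
]
cards = ["SLI GTX 980", "Titan X", "GTX 960", "R9 290X", "R9 285", "R9 270X"]

for resolution, presets in combos:
    for preset in presets:
        for card in cards:
            # each pass launches the game, runs the fixed course, and logs FPS
            print(f"{card} @ {resolution} / {preset}")

Seven resolution/preset combinations multiplied across every card adds up quickly, which is exactly why the testing took as long as it did.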

Let's get started.

It's been a while since our last card-specific GTX 980 review – and that last one wasn't exactly glowing. Despite the depressing reality that 6 months is “old” in the world of computer hardware, the GTX 980 and its GM204 GPU have both remained near the top of single-GPU benchmarks. The only single-GPU AIC – meaning one GPU on the card – that's managed to outpace the GTX 980 is the Titan X, and that's $1000.

This review looks at PNY's GTX 980 XLR8 Pro ($570) video card, an ironclad AIC with pre-overclocked specs. Alongside the XLR8 Pro graphics card, we threw in the reference GTX 980 (from nVidia) and MSI's Gaming 4G GTX 980 (from CyberPower) when benchmarking.

NVidia and AMD both define the ~$200 price range as a zone of serious contention among graphics cards. The launch of the GTX 960 held the card to high standards for 1080p gaming, a point nVidia drove home with data showing the prevalence of 1920x1080 as the standard desktop resolution for most gamers.

Our GTX 960 review employed ASUS' Strix 960, a 2GB card with a heavy focus on silence and cooling efficiency, but we've since received several other GTX 960 devices. In this round-up, we'll review the ASUS Strix, EVGA SuperSC 4GB, MSI Gaming 4GB, and PNY XLR8 Elite GTX 960 video cards. The benchmark tests each device for heatsink efficacy, framerate output (FPS) in games, and memory capacity advantages.

NVidia's latest addition to the Titan family diverges from its predecessors' market objectives. Previous Titan cards were fully double-precision enabled, ensuring marketability as affordable production and simulation cards that, by nature, also served reasonably as gaming cards. Because double-precision is detrimental to gaming performance, the original Titan and current Titan Z can be set to a “single-precision mode” to game better, but they aren't targeted as the “best gaming video card” out there. The Titan X is; in fact, that's exactly what nVidia calls it – the best single-GPU on the market. The selection of these words is intentional, ruling out dual-GPU single cards (like the 295X2 or 690) and multi-card configurations (like what we're testing today).

Because the Titan X is heavily marketed as a gaming solution – something reinforced by a double-precision rate of just 1/32 its single-precision performance – we decided to perform a value comparison against two GTX 980s in SLI. The SLI configuration offers indisputably powerful raw computational output, but has a smaller memory capacity than the Titan X's 12GB single-GPU pool.
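To put that 1/32 ratio in perspective, here's a quick back-of-envelope calculation – a sketch assuming the commonly published Titan X spec-sheet figures (3072 CUDA cores, ~1000MHz base clock), not measured numbers:

# Rough throughput math for the Titan X's 1/32 double-precision rate.
# Core count and clock are the commonly published spec-sheet figures (assumed),
# not measured values; an FMA counts as two floating-point operations.
cuda_cores = 3072
base_clock_ghz = 1.0
ops_per_core_per_clock = 2  # fused multiply-add

sp_tflops = cuda_cores * base_clock_ghz * ops_per_core_per_clock / 1000
dp_tflops = sp_tflops / 32

print(f"Single-precision: ~{sp_tflops:.2f} TFLOPS")  # ~6.14 TFLOPS
print(f"Double-precision: ~{dp_tflops:.2f} TFLOPS")  # ~0.19 TFLOPS

In other words, raw double-precision output falls from roughly 6 TFLOPS to under 0.2 TFLOPS, which is why nVidia no longer pitches this Titan at production and simulation buyers – and why the value question against SLI GTX 980s is worth asking at all.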

Following our GTA V benchmark from yesterday, we decided to embark on a mission to determine the impact of texture quality on system performance and visual fidelity. We took screenshots of identical objects at Very High, High, and Normal texture resolutions at 4K, then compared the textures in combined screenshots. During this process, we also analyzed maximum theoretical VRAM consumption and the impact of texture quality on FPS and tearing, resulting in a specific settings benchmark for GTA V.
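As a rough frame of reference for why texture resolution hits VRAM so hard, here's an illustrative sketch – it assumes uncompressed RGBA8 textures with a full mip chain and does not model GTA V's actual formats or compression:

# Rough estimate of a single texture's uncompressed VRAM footprint.
# Assumes 4 bytes per texel (RGBA8) and a full mip chain (~4/3 multiplier);
# real engines compress textures, so treat these as ceiling figures.
def texture_mib(width, height, bytes_per_texel=4, mip_chain=True):
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mip_chain else base
    return total / (1024 ** 2)

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mib(size, size):.1f} MiB uncompressed")

Doubling a texture's resolution quadruples its footprint, so settings that swap in higher-resolution assets balloon VRAM usage far faster than most other options.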

It's finally here.

Grand Theft Auto V took its time to migrate to PC, and from our preliminary overview and testing, it seems like the wait was worthwhile. GTA V's PC port exhibits unique PC features, like a VRAM consumption slider indicative of the maximum VRAM requirement of the current settings. The port also added first-person mode, complete with new 3D models and animations for the characters' arms, phone, guns, and what-have-you. As you'll find out in our benchmark results below, the game is also incredibly well-optimized across most graphics card configurations, something we can't say has been true for most games in recent history.

These things take time, and Rockstar certainly took as much of it as it needed.

Using a suite of video cards spanning the Titan X, SLI GTX 980s, R9 290X and 270Xs, GTX 960s, 750 Ti cards, and more, we benchmarked GTA V in an intensive test. This GTA V PC benchmark compares FPS of various graphics cards at maximum settings in 1080p, 1440p, and 4K resolutions.

This article makes no attempt to comment on gameplay value.

Graphics manufacturer EVGA yesterday announced its second iteration of “Pro SLI Bridges,” effectively prettied-up bridges for multi-card solutions. The Pro SLI Bridge V2 is housed in aluminum and offers an illuminated EVGA logo front-and-center, for those with compatible EVGA devices.

Following the launch of 2GB cards, major board partners – MSI and EVGA included – have begun shipment of 4GB models of the GTX 960. Most 4GB cards will be back in stock in early April at around $240 MSRP, approximately $30 more expensive than their 2GB counterparts. We've already got a round-up pending publication with more in-depth reviews of each major GTX 960, but today, we're addressing a much more basic concern: Is 4GB of VRAM worth it for a GTX 960?

This article benchmarks an EVGA GTX 960 SuperSC 4GB card vs. our existing ASUS Strix GTX 960 2GB unit, testing each in 1080p, 1440p, and 4K gaming scenarios.

With AMD's Mantle in dire straits and losing ongoing support, the question of when its inevitable death will come has been fresh in our minds. Microsoft's DirectX 12 promises to accomplish many of the same objectives that made Mantle appealing – namely, putting developers “closer to the metal” – while being distributed alongside the ubiquitous Windows OS; this, we think, has already stifled Mantle's viability for developers.
