October 9 marked the conclusion of a beta week full of GPU benchmarking, during which we posted initial performance numbers that were agreeable to nearly all modern, $120+ GPUs. Our testing ran everything from the 750 Ti to the 390X – and above – through its paces, and we ultimately concluded that the ideal GPU selection included the R9 380 & GTX 960 for 1080p and the GTX 970 & R9 390X for 1440p. But that was the beta, as we noted amply in the first benchmark, and those numbers had the potential to shift as the game approached launch.

Star Wars Battlefront is now officially out for PC. Our refreshed Star Wars Battlefront GPU benchmark tests FPS output at 1080p, 1440p, and 4K, using Ultra, High, and Medium settings. The tests, as always, bench the GTX 980 Ti vs. the 980, 970, 960, and downward, alongside AMD's R9 390X vs. the R9 290X, 285, 270X, and 250X.

Below is a look at the game's graphics settings maxed-out at 4K, followed by a quick recap of the Battlefront graphics settings and what they do.

Fallout 4 – now an entire day old – is nearing the end of its content lifecycle on our test bench. We'll soon move on to the next major hardware and game releases, but for now, we've got a few tests left in Bethesda's latest title. This next benchmark looks at the game's CPU performance, a greater methodological challenge than our preceding GPU benchmark, volumetric lighting benchmark, and texture quality comparison.

Our Fallout 4 CPU benchmark compares FPS across the Intel & AMD lineups, including an i3-4130, i5-4690K, i5-6600K, several i7 CPUs, and AMD's FX-8370E, 8320E, & 9590. Other CPUs were tested as well – like the popular G3258 dual-core Pentium and the A10-7870K APU – and all are included on the benchmark charts. The top-level goal was to find the best CPU for Fallout 4, but we also wanted to identify performance disparities and anomalies across models.

Following our Fallout 4 GPU benchmark and the follow-up volumetric lighting benchmark, we're now looking at Bethesda's odd texture quality presentation. Fallout 4 includes a Texture Quality option, which should dictate the resolution of textures applied to game elements. The setting scales from Medium to Ultra, with one step (“High”) in between. Usually – as we found in Black Ops III especially – texture resolution can have a profound impact on performance and VRAM consumption, which led us to run texture tests in Fallout 4.

Here's an example of what we're used to when it comes to texture quality comparisons:

Activision's latest in its seemingly undying shooter franchise launched with fairly simplistic graphics settings, but it still has a few items that may raise questions – like Order Independent Transparency and Subsurface Scattering. We talk about some of these at a top level in our Black Ops 3 GPU benchmark, found here, but dive deeper in this latest guide. Setting aside the difficulties we encountered with VRAM and memory, the heavy LOD scaling and graphics controls allow for scalability across the $130 to $1000 GPU range.

Our Call of Duty: Black Ops 3 optimization guide shows the best graphics settings for improving FPS, including screenshot comparisons of the settings. We independently benchmarked all of the game's settings. The screenshots below show texture quality (resolution) comparisons, preset & texture VRAM consumption, FPS performance for each setting, and more. We also define Order Independent Transparency, Volumetric Lighting, Subsurface Shadows, Mesh Quality, Shadow Mapping, and more of Call of Duty's options.

All of these tests were conducted using the patch released on November 7, which contained some bug fixes not addressed at launch. The latest nVidia (358.87) and AMD (15.11) drivers were used for testing. More below in the methodology sections.

Each setting is listed by the severity of its impact on FPS; the settings with the heaviest FPS impact come first.
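To give a sense of how that ranking is derived, here's a minimal sketch of the arithmetic: each setting's FPS delta against a maxed-out baseline is expressed as a percentage and sorted in descending order. The setting names and FPS values below are hypothetical placeholders, not our measured results.

```python
# Hypothetical sketch: rank graphics settings by FPS impact.
# Baseline and per-setting FPS values are placeholders, not measured data.
baseline_fps = 60.0  # average FPS with every setting maxed out

# Average FPS after lowering each setting individually (hypothetical values).
fps_with_setting_lowered = {
    "Shadow Mapping": 68.0,
    "Volumetric Lighting": 66.5,
    "Mesh Quality": 63.0,
    "Subsurface Shadows": 61.0,
    "Order Independent Transparency": 60.5,
}

# Percent FPS gained by lowering each setting -- a proxy for its severity.
impact = {
    name: (fps - baseline_fps) / baseline_fps * 100.0
    for name, fps in fps_with_setting_lowered.items()
}

for name, pct in sorted(impact.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {pct:+.1f}% FPS vs. maxed baseline")
```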

We're in the final throes of our Call of Duty: Black Ops III content before moving on to the next game – you know the one. While processing data for our forthcoming graphics optimization guide, we realized that Black Ops III is among the most VRAM-hungry games we've ever tested, consuming upwards of 10GB GDDR5 on the Titan X.

Our GPU benchmarks included some initial memory benchmarking, showing that the 980 Ti saw full saturation of its 6GB framebuffer at 4K/max settings. We also showed that the game commits 15.2GB of memory under max settings (pageable address space), with an active physical consumption of about 6.7GB (working set) in Multiplayer. Our testing shows that the singleplayer campaign is far more intensive than multiplayer, to the tune of 38.6% lower FPS on the GPU side.
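For readers who want to reproduce the committed-versus-working-set distinction themselves, below is a minimal sketch using the cross-platform psutil package; the executable name is an assumption, and on Windows psutil maps rss to the working set and vms to the pagefile-backed commit.

```python
# Minimal sketch: report committed memory vs. active working set for a game
# process using psutil (pip install psutil). The executable name is an
# assumption -- adjust it to the actual process name.
import psutil

TARGET = "BlackOps3.exe"  # hypothetical process name

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        mem = proc.memory_info()
        # On Windows, rss corresponds to the working set (physical RAM in
        # active use) and vms to the pagefile-backed commit charge.
        print(f"Working set: {mem.rss / 1024**3:.2f} GB")
        print(f"Committed:   {mem.vms / 1024**3:.2f} GB")
        break
else:
    print(f"{TARGET} not found")
```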

During tests of all of Call of Duty: Black Ops 3's graphics settings, we uncovered VRAM consumption approaching 10GB in campaign mode when using 4K & “Extra” settings.
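As a reference point for anyone replicating the measurement, dedicated VRAM usage on nVidia cards can be polled during gameplay with nvidia-smi; the sketch below assumes a single GPU, nvidia-smi on the PATH, and a one-second polling interval, and the reading reflects all allocations on the card, not just the game.

```python
# Minimal sketch: poll dedicated VRAM usage once per second via nvidia-smi.
# Assumes a single nVidia GPU and nvidia-smi available on the PATH.
import subprocess
import time

while True:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used_mib, total_mib = (int(v) for v in out.strip().split(", "))
    print(f"VRAM: {used_mib / 1024:.1f} / {total_mib / 1024:.1f} GB")
    time.sleep(1)
```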

Call of Duty: Black Ops 3 arrived on PC at midnight, bringing with it high-fidelity graphics that stress PC components – all the way down to the memory. We set forth on benchmarks for Call of Duty: Black Ops III immediately, with our first FPS tests focusing on GPU performance, alongside some RAM & VRAM insights. More tests are forthcoming, so be sure to follow us for those.

Before jumping into the BLOPS 3 benchmarks, let's explain the game's graphics settings. Until our graphics optimization guide for Call of Duty arrives, this detailed list of game settings should assist in determining which options can be disabled or tuned for a higher framerate.

Update: Our Black Ops III graphics optimization guide is now live for per-setting analysis.

UPDATE: Our launch-day Battlefront GPU benchmarks are now live. Refer here for updated charts.

The PC version of Star Wars: Battlefront was made available through beta channels yesterday and, somewhat surprisingly, the graphics settings and assets appear to be fairly feature-complete. It's possible (even likely) that some final optimizations are in the pipe leading up to launch but, for now, the game's high-resolution, high LOD assets are a testament to its preparedness for benchmarking.

Star Wars: Battlefront boasts some of the most advanced, realistic graphics we've yet seen, rivaling GTA V and The Witcher 3 in intensity and technology. Battlefront makes heavy use of terrain deformation and tessellation to add the appearance of greater depth, smooth out terrain elements, and create a landscape that scales impressively well at various view distances.

We deployed a suite of video cards to benchmark Star Wars: Battlefront in an exhaustive test, including SLI GTX 980 Tis, the Sea Hawk 980 Ti, the GTX 980, GTX 970, 960s, 950, the 390X, 290X, 270X, and more. This Star Wars: Battlefront benchmark compares graphics card FPS at maximum (ultra), high, and medium settings in 1080p, 1440p, and 4K resolutions.

Disclaimer: This article does not intend to comment on gameplay value. We're strictly looking at visuals and framerate performance in Battlefront.

So, you've decided to play Skyrim again. Or perhaps this is the first time. Either way, you've installed the game, played a few minutes, and realized something: wow, this is pretty ugly.

Skyrim isn't exactly a game that has visually aged well. It's more than three years old, was already a bit dated when it came out (Bethesda's four-year development cycle shows), and with gorgeous games like The Witcher 3 having been released this year, Skyrim doesn't really have much to offer on the visual front.

It is, however, a game that runs on the Creation Engine, and it has a development kit with an active community. We have the technology. We can rebuild it.

The Fury X has been a challenging video card to review. This is AMD's best attempt at competition and, as it so happens, the card includes two items of critical importance: A new GPU architecture and the world's first implementation of high-bandwidth memory.

Some system builders may recall AMD's HD 4870, a video card that was once a quickly-recommended solution for mid-to-high range builds. The 4870 was the world's first graphics card to incorporate high-speed GDDR5 memory, reinforcing AMD's record of technological leaps in the memory field. Prior to the AMD acquisition, graphics manufacturer ATI designed the GDDR3 memory whose lineage carried all the way through to GDDR5 (GDDR4 had a lifecycle of less than a year, but was also first implemented on ATI devices).

Our recent Fury X driver comparison put rumors of a performance disparity between press and launch drivers to the test, ultimately finding that no real difference existed. That testing procedure exposed us to the much-discussed “coil whine” and “pump whine” of the new R9 Fury X. Today's test seeks to determine, with objectivity and confidence, whether the whine is detrimental in a real-world use case.

AMD's R9 Fury X video card emits a high-frequency whine when under load. We have observed this noise on both of our retail units – sold under Sapphire's banner, but effectively identical to all Fury X cards – and reviewers with press samples have cited the same noise. The existence of a sound does not inherently point toward an unusably loud product, though, and it must be tested in a sterile environment to determine its impact on the user experience. The noise resembles coil whine, for those familiar with the irritating hum, but is actually an emission from the high-speed pump on the Fury X. That makes the noise ultimately a mechanical engineering flaw rather than an electrical one, as coil whine would suggest.
