With Ryzen around the corner, we wanted to publish a full CPU benchmark of Watch Dogs 2 in our test course, as we’ve recently found the game to be heavily thread-intensive and responsive to CPU changes. The game even posts sizable gains from some overclocks, like on the i5-2500K, and serves as a real-world example of when CPU choice matters. It’s easy for a CPU to bottleneck the GPU in Watch Dogs 2, which is something of a unique characteristic among modern games.
Watch Dogs 2 is a familiar title by now at the GN test bench, and while we’ve published a GPU benchmark and a more recent CPU optimization guide, we never published a comprehensive CPU benchmark. We’ve gathered all of our results here, from the 2500K revisit through the Kaby Lake reviews (see: 7600K review & 7350K review), and analyzed what exactly makes a CPU perform well in Watch Dogs 2 and why.
In this Watch Dogs 2 CPU benchmark, we’ll recap some graphics optimization tips for CPUs and test whether an i7 is worth it, alongside tests of the 7600K, 7700K, 6600K, 7350K, FX-8370, and more.
One interesting aspect of the Watch Dogs 2 benchmarking we did for our 2500K revisit was the difference in performance between i5s and i7s. At stock speeds, the i7-2600K was easily outpacing the i5-2500K by roughly 15 FPS -- and even more interestingly, the i7-6700K managed to hit our GTX 1080’s ceiling of 110-115 FPS, while the i5-6600K only managed 78.7 FPS with the same settings. Watch Dogs 2 is clearly a game where the additional threads are beneficial, making it an exciting test opportunity, as that’s not a common occurrence. We decided to look into settings optimization for CPUs with Watch Dogs 2, and have tested a few of the most obvious graphics settings to see which ones really help.
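To put that gap in perspective, here’s a quick back-of-envelope calculation using the figures above -- an illustrative sketch only, not part of our benchmark tooling, and it assumes the low end of the GTX 1080 ceiling as the i7-6700K’s result:

```python
# Illustrative math using the figures cited above; not our actual benchmark tooling.
i5_6600k_fps = 78.7   # i5-6600K result at the same settings
i7_6700k_fps = 110.0  # assumed low end of the 110-115 FPS GTX 1080 ceiling

uplift_pct = (i7_6700k_fps - i5_6600k_fps) / i5_6600k_fps * 100
print(f"i7-6700K advantage over i5-6600K: ~{uplift_pct:.0f}%")  # ~40%
```

Even taking the conservative end of that ceiling, the i7’s extra threads are worth roughly a 40% framerate uplift in this title at these settings.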
This Watch Dogs 2 graphics optimization guide focuses on CPU performance, trying to figure out which settings can be increased (when GPU overhead is available) and which should be decreased (when CPU-limited).
Before even getting started here, let’s put out the obvious disclaimer. This GPU benchmark is for the beta version of For Honor, which means a few things: (1) the game’s not final yet and, despite launch being just two weeks away, some graphics settings are still missing from the menu; (2) nVidia’s current drivers are optimized for the beta, but the company plans another update at some point soon for further optimizations; (3) AMD has not yet released drivers for the game -- we asked for early access and were told the company won’t be ready until launch day, when its day-0 drivers are planned.
Regardless, we tested to see how the beta performs and to get a baseline understanding of what to expect overall from the new multiplayer brawler. For Honor has thus far proven impressively detailed in geometry and texturing (especially texturing), and deserves high marks for the art department. Granted, that generally means more abuse of the video card or CPU (for the complex geometric draw calls), so we’ve got some For Honor graphics settings scaling tests as well.
This graphics card benchmark tests For Honor’s performance at 4K, 1440p, and 1080p with Extreme settings. We tested using a real, in-game course rather than the built-in benchmark, which tends to make performance look a lot worse than it is in actual gameplay (we have a chart demonstrating this). Settings scaling was tested from Low to Extreme, as were multiplayer and ‘singleplayer’ (bot match) modes. We primarily ran For Honor benchmarks with the AMD RX 480 8GB & 4GB, RX 470 4GB, RX 460 2GB, & 390X cards vs. the GTX 1080, 1070, 1060 6GB & 3GB, 1050 & 1050 Ti, and 970 AIB partner cards.
The fourth installment in the Mass Effect series, Mass Effect: Andromeda, will launch on March 21, 2017 in North America on PS4, Xbox One, and PC. Europe will see a release date of March 23, 2017. Andromeda’s story takes place hundreds of years after the events of the original Mass Effect trilogy, completely separate from the original storyline. Andromeda will be set in a true open world environment instead of the “linear open world” of the first three Mass Effect games.
In this article, we’ll explore everything we know so far (most of the important bits, anyway) about Mass Effect: Andromeda. We’re leaving out major speculation and instead focusing on information officially released or deduced through careful observation of trailers and screenshots. We’ve also had some interview time with the Andromeda team, and will be including that coverage in this content.
Medieval action/strategy RPG Mount & Blade II: Bannerlord moved one step closer to release when a Steam page was unveiled in October, but still has no official release date.
Confusingly, Mount & Blade II: Bannerlord is a prequel to Mount & Blade: Warband, itself a 2010 standalone expansion to the 2007 game Mount & Blade -- it helps to just think of Warband as a complete overhaul of the original. Warband has maintained a loyal fanbase since release: thanks to the winter sale, steamcharts.com reported roughly 12,000 players in-game as of this writing (a bit more than Elder Scrolls Online). In 2012, developer TaleWorlds announced it would follow up on Warband’s success with Bannerlord, and it has been slowly releasing tidbits of information since. Here’s what we know so far:
As the Black Friday and Cyber Monday deals wind down, we’ll begin looking at some of the most popular games on Steam. Some of these games were heavily discounted during the Autumn Sale, and while that sale has ended, Steam usually hosts a Winter Sale -- that’ll be coming up shortly.
For today, we’ve got a list of some of our top PC games for 2016 (including previous releases and sales). All of the below games are available on Steam. This year’s releases felt a little lighter in the AAA category than last year’s -- at least when considering that last year offered GTA V and The Witcher 3 in rapid succession -- but active sales have revived last year’s titles alongside a couple of major indie hits for 2016.
Here is the shortlist:
Ubisoft's newest dystopian effort starts strong with allusions to modern-day challenges pertaining to privacy and "cyber warfare," working to build up our character as a counter-culture hacker. And, as with Ubisoft's other AAA titles, the game builds this world with high-resolution textures, geometrically complex and dense objects, taxing shadow/lighting systems, and an emphasis on graphics quality.
Watch Dogs 2 is a demanding title to run on modern hardware. We spent the first 1-2 hours of our time in Watch Dogs 2 simply studying the impact of various settings on performance, then studying locales and their performance hits. Areas with grass and foliage, we found, hit framerate the hardest. Nightfall and dark rain play a role in FPS hits too, particularly when running high reflection quality and headlight shadows.
We look at the performance of 11 GPUs in this Watch Dogs 2 video card benchmark, including the RX 480 vs. the GTX 1060, GTX 1070, GTX 1080, RX 470, R9 Fury X, and more.
We've been through Battlefield 1 a few times now. First were the GPU benchmarks, then the HBAO vs. SSAO benchmark, then the CPU benchmark. This time it's RAM, and the methodology remains mostly the same. Note that these results are not comparable to previous results because (1) the game has received updates, (2) memory spec has changed for this test, and (3) we have updated our graphics drivers. The test platforms and memory are the variables for this test, with the rest remaining similar to what we've done in the past; specifics are defined in the methodology below.
Our CPU benchmark had us changing frequencies between test platforms as we worked out our test patterns, methodology, and bench specs for the endeavor. During that exploratory process, we noticed that memory speeds of 3200MHz were noticeably faster than speeds of, say, 2400MHz. That was just an eyeball observation, though; it wasn't an official benchmark, and we wanted to dedicate a separate piece to the topic.
This content benchmarks memory performance in Battlefield 1, focusing on RAM speed (e.g., 1600MHz, 1866MHz, 2133MHz, 2400MHz, and so forth) and capacity. We hope to answer whether 8GB is "enough" and to find a sweet spot for price-to-performance in memory selection.
This benchmark took a while to complete. We first started benchmarking CPUs with Battlefield 1 just after our GPU content was published, but ran into questions that took some back-and-forth with AMD to sort out. Some of that conversation will be recapped here.
Our Battlefield 1 CPU benchmark is finally complete. We tested most of the active Skylake suite (i7-6700K down to i3-6300), the FX-8370 and -8320E, and some Athlon CPUs. Unfortunately, we ran out of activations before getting to Haswell or other last-gen CPUs, but those may be visited at some point in the future. Our next goal is to look into the impact of memory speed on BF1 performance, or to determine whether there is any at all.
Back on track, though: today's feature piece aims to determine at what point a CPU begins bottlenecking performance elsewhere in the system when playing Battlefield 1. Our previous two Battlefield 1 content pieces are linked below: