Battlefield 1 CPU Benchmark (Dx11 & Dx12) - i7 vs. i5, i3, FX-8370, X4 845

By Steve Burke | Published November 07, 2016 at 9:57 am

This benchmark took a while to complete. We first started benchmarking CPUs with Battlefield 1 just after our GPU content was published, but ran into questions that took some back-and-forth with AMD to sort out. Some of that conversation will be recapped here.

Our Battlefield 1 CPU benchmark is finally complete. We tested most of the active Skylake suite (i7-6700K down to i3-6300), the FX-8370, -8320E, and some Athlon CPUs. Unfortunately, we ran out of activations before getting to Haswell or last-gen CPUs, but those may be visited at some point in the future. Our next goal is to look into the impact of memory speed on BF1 performance, or determine if there is any at all.

Back on track, though: Today's feature piece aims to determine at what point a CPU begins bottlenecking the rest of the system when playing Battlefield 1. Our previous two content pieces related to Battlefield 1 are linked below:

The video form of this article is embedded below. You may continue reading if preferred or for additional depth not provided in the video.

Test Methodology

We first detailed our Battlefield 1 testing methodology in the GPU benchmark article. Much of that has been redeployed here, with the exception being a focus on CPUs. Our Battlefield 1 CPU benchmarking leveraged three different types of tests for final validation, one of which partially mirrors AMD's internal BF1/CPU test methodology as a means to further verify our results against theirs.

First, some re-printed text from our BF1 GPU test methodology, then we'll define the additions:

The above video shows some of our gameplay while learning about Battlefield 1's settings. This includes our benchmark course (a simple walk through Avanti Savoia - Mission 1), our testing in Argonne Forest to get a quick look at FPS during 64-player multiplayer matches, and a brief tab through the graphics settings.

We tested using our GPU test bench, detailed in the table below. Our thanks to supporting hardware vendors for supplying some of the test components.

Game settings were manually controlled for the DUT. All games were run at presets defined in their respective charts. Our test courses are all manually conducted. In the case of our bulk data below, the same, easily repeatable test was conducted a minimum of eight times per device, per setting, per API. This ensures data integrity and helps to eliminate potential outliers. In the event of a performance anomaly, we conduct additional test passes until we understand what's going on. In NVIDIA's control panel, we disable G-Sync for testing (and disable FreeSync for AMD, where relevant). Note that these results were all collected with the newest drivers and the newest game patches, and may not be comparable to previous test results. Also note that tests between reviewers should not necessarily be directly compared, as testing methodology may differ. The measurement tool alone can have a major impact.

We execute our tests with PresentMon via command line, a tool built by Microsoft and Intel to hook into the operating system and accurately fetch framerate and frametime data. Our team has built a custom, in-house Python script to extract average FPS, 1% low FPS, and 0.1% low FPS data (effectively 99/99.9 percentile metrics). The test pass is executed for 30 seconds per repetition, with a minimum of 3 repetitions. This ensures an easily replicated test course for accurate results between cards, vendors, and settings. You may learn about our 1% low and 0.1% low testing methodology here:

Or in text form here.
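For readers curious about the mechanics, here is a minimal sketch of that extraction step -- not our production script -- assuming PresentMon's MsBetweenPresents frametime column and a percentile-style reading of the 1% / 0.1% lows:

import csv
import statistics

def fps_metrics(csv_path):
    # Read per-frame times (ms) from a PresentMon CSV log.
    frametimes_ms = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                frametimes_ms.append(float(row["MsBetweenPresents"]))
            except (KeyError, ValueError):
                continue  # skip malformed rows
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    # 1% / 0.1% lows here are the 99th / 99.9th percentile frametimes
    # converted to FPS; exact aggregation varies between reviewers.
    sorted_ft = sorted(frametimes_ms)
    low_1 = 1000.0 / sorted_ft[int(len(sorted_ft) * 0.99)]
    low_01 = 1000.0 / sorted_ft[int(len(sorted_ft) * 0.999)]
    return avg_fps, low_1, low_01

avg, low_1, low_01 = fps_metrics("presentmon_log.csv")
print(f"AVG: {avg:.1f} FPS | 1% low: {low_1:.1f} | 0.1% low: {low_01:.1f}")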

Windows 10-64 Anniversary Update was used for the OS.

Partner cards were used where available and tested for out-of-box performance. Frequencies listed are advertised clock-rates. We tested both DirectX 11 and DirectX 12.

Please note that we use onPresent to measure framerate and frametimes. Reviewers must decide whether to use onPresent or onDisplay when testing with PresentMon. Neither is necessarily correct or incorrect; it just comes down to the type of data the reviewer wants to work with and analyze. For us, we look at frames on the Present. Some folks may use onDisplay, which would produce different results (particularly at the low end). Make sure you understand what you're comparing against if cross-referencing results, and ensure that the same tools are used for analysis. A frame does not necessarily equal a frame between software packages. We trust PresentMon as the immediate future of benchmarking, particularly with its open source infrastructure built and maintained by Intel and Microsoft.
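To make the onPresent / onDisplay distinction concrete, the sketch below simply selects between the two frametime columns in a PresentMon CSV log. The column names (MsBetweenPresents, MsBetweenDisplayChange) and the dropped-frame handling reflect our understanding of PresentMon's output at the time of writing and are assumptions for illustration, not a definition of our pipeline:

import csv

def load_frametimes(csv_path, on_display=False):
    # onPresent uses the time between Present() calls; onDisplay uses the
    # time between flips that actually reach the screen.
    column = "MsBetweenDisplayChange" if on_display else "MsBetweenPresents"
    times = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            value = float(row[column])
            # Dropped frames never reach the display and are typically logged
            # with a zero display-change time, so they vanish from onDisplay
            # data -- one reason the two views diverge at the low end.
            if value > 0:
                times.append(value)
    return times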

Also note that we are limited on our activations per game code. We can test 5 hardware components per code within a 24-hour period. We've got three codes, so we can test a total of 15 configurations per 24 hours.

Battlefield 1 has a few critical settings that require tuning for adequate benchmarking. Except where otherwise noted, we disabled the GPU memory restriction setting for testing, as it triggers dynamic quality scaling and creates unequal tests. We also set resolution render scale to 100% to match render resolution to display resolution. Field of View was changed to 80 degrees vertical to more appropriately fit what a player would use, since the default 55-degree vertical FOV is a little bit silly for competitive FPS players. This impacts FPS and should be accounted for if attempting to cross-compare results. V-Sync and adaptive sync are disabled. Presets are used for quality, as defined by chart titles. Game performance swings based on test location, map, and in-game events. We tested in the Italian Avanti Savoia campaign level for singleplayer and on Argonne Forest for multiplayer. You can view our test course in the above, separate video.

The campaign was used as the primary test platform, but we tested multiplayer to determine the scaling between singleplayer and multiplayer. Multiplayer is not a reliable test platform given our lack of control (pre-launch) over servers, tick rate, and network interference with testing. Thankfully, the two are actually pretty comparable in performance. FPS depends heavily on the map, as always, but even 64-player servers, assuming the usual map layout where you never see everyone at once, are not too abusive on the GPU.

Note that, in applicable test cases, we used the console command gametime.maxvariablefps 0 to remove Battlefield 1's built-in 200FPS framerate cap.

In addition to the above (previous testing methodology), note that we have updated to the latest driver sets as of test execution: nVidia 375.70 and AMD 16.10.3. Both have since been superseded by hotfixes, but the Battlefield 1 optimizations are the same as those in our test drivers.

We conduct eight test passes per CPU. The first two are discarded, as Battlefield 1 still exhibits sporadic asset pop-in during early passes and performance is chaotic as a result (see the video version of this benchmark for an example).
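As a rough sketch of that aggregation step (assuming per-pass averages have already been computed, e.g. by a script like the one above), dropping the warm-up passes is as simple as:

def aggregate_passes(per_pass_avg_fps, warmup=2):
    # Discard the warm-up passes affected by asset pop-in, then report the
    # mean of the remaining passes. Function name and structure are illustrative.
    kept = per_pass_avg_fps[warmup:]
    return sum(kept) / len(kept)

# Hypothetical placeholder values, not measured results:
print(round(aggregate_passes([110.0, 118.0, 130.0, 131.0, 129.0, 130.0, 131.0, 130.0]), 1))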

Tests were conducted using ultra settings at 1080p with two GPUs. The first was a GTX 1080 FTW Hybrid, used to place emphasis on CPU performance; the GTX 1080 is powerful enough that simply dropping the resolution demonstrates CPU scaling, letting us leave the graphics options more realistically high. This is new with this generation of GPUs. The second card was an RX 480 8GB Gaming X, which gives more of a middle-of-the-road look at things.

The point of running a lower resolution is to show scaling performance. As you increase resolution, load shifts more heavily onto the pixel pipeline and the GPU's ability to draw and sample all of those pixels, which obfuscates CPU performance. Still, there is something to be said for the real-world aspect of this: with more demanding quality settings (and reduced settings for CPU-intensive options), it would be possible to pair lower-end CPUs with reasonably high settings.

DirectX 12 performance was measured using the onPresent variable from PresentMon. We extract 1% low and 0.1% low metrics using a Python script that GamersNexus created. Game graphics are configured to Ultra, 96° horizontal FOV, GPU memory restriction off, and V-Sync off.

Our other testing methodology, which leverages a 3-minute long test in the Through Mud & Blood map, is detailed in our second set of test results further down the page.

CPUs Tested

Intel CPUs used:

We tested the 6700K with hyperthreading enabled, then again with it disabled for a better understanding of HT impact in BF1. Memory was Corsair Vengeance LPX.

Intel motherboards used:

AMD CPUs used:

Note that we are using an A10-7870K as a sort of 880K "surrogate," as the performance is more or less identical; the 880K simply disables its IGP for a lower price. The 880K does run about 100MHz faster, but performance should be effectively equal to the 7870K.

Memory was HyperX Savage. AMD motherboards used:

Why did you use these parts?

These are the most recent and relevant CPUs and motherboards that we presently have access to in our lab. If a component is missing, it's either because we don't have it or we didn't have time to test it in addition to our tests of the latest components.

Continue to Page 2 for the results.


