This benchmark took a while to complete. We first started benchmarking CPUs with Battlefield 1 just after our GPU content was published, but ran into questions that took some back-and-forth with AMD to sort out. Some of that conversation will be recapped here.
Our Battlefield 1 CPU benchmark is finally complete. We tested most of the active Skylake suite (i7-6700K down to i3-6300), the FX-8370, -8320E, and some Athlon CPUs. Unfortunately, we ran out of activations before getting to Haswell or last-gen CPUs, but those may be visited at some point in the future. Our next goal is to look into the impact of memory speed on BF1 performance, or determine if there is any at all.
Back on track, though: Today's feature piece is to determine at what point a CPU will begin bottlenecking performance elsewhere in the system when playing Battlefield 1. Our previous two content pieces related to Battlefield 1 are linked below:
The goal of this content is to show that the choice between HBAO and SSAO has negligible impact on Battlefield 1 performance. This benchmark arose following our Battlefield 1 GPU performance analysis, which demonstrated consistent frametimes and frame delivery on both AMD and nVidia devices when using DirectX 11. Two of our YouTube commenters asked if HBAO would create a performance swing that would favor nVidia over AMD and, although we've discussed this topic with several games in the past, we decided to revisit it for Battlefield 1. This time, we'll also spend a bit of time defining what ambient occlusion actually is, how screen-space occlusion relies on information strictly within the z-buffer, and then look at the performance cost of HBAO in BF1.
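Because screen-space occlusion draws only on per-pixel depth, the core idea can be illustrated with a toy sketch. This is hypothetical illustrative code, not BF1's (or any shipping) implementation: neighboring depth samples that sit closer to the camera than the current pixel are counted as occluders, and the pixel is darkened in proportion.

```python
import numpy as np

def ssao_factor(depth, radius=2, bias=0.01):
    """Toy screen-space ambient occlusion estimate.

    For each pixel, compare its depth against neighboring depth
    samples within `radius`; neighbors closer to the camera
    (smaller depth) count as occluders. Real implementations,
    HBAO included, also reconstruct view-space positions and
    normals, but the key point is the same: every input comes
    from the z-buffer, never from the full scene geometry.
    """
    h, w = depth.shape
    occlusion = np.zeros_like(depth, dtype=float)
    offsets = [(dy, dx)
               for dy in range(-radius, radius + 1)
               for dx in range(-radius, radius + 1)
               if (dy, dx) != (0, 0)]
    for y in range(h):
        for x in range(w):
            occluders = 0
            for dy, dx in offsets:
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    # Neighbor meaningfully in front of this pixel?
                    if depth[ny, nx] + bias < depth[y, x]:
                        occluders += 1
            occlusion[y, x] = occluders / len(offsets)
    return 1.0 - occlusion  # 1.0 = fully lit; lower = more occluded
```

A flat depth buffer yields no occlusion anywhere, while a pixel at the bottom of a "pit" (deeper than all of its neighbors) comes back heavily occluded, which is exactly the crevice-darkening effect AO is after.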
We'd also recommend our previous graphics technology deep-dive, for folks who want a more technical explanation of what's going on for various AO technologies. Portions of this new article exist in the deep-dive.
Battlefield 1 marks the arrival of another title with DirectX 12 support – sort of. The game still supports DirectX 11, and thus Windows 7 and 8, but makes efforts to shift DICE and EA toward the new world of low-level APIs. This move comes at a bit of a cost, though; our testing of Battlefield 1 has uncovered some frametime variance issues on both nVidia and AMD devices, resolvable by reverting to DirectX 11. We'll explore that in this content.
In today's Battlefield 1 benchmark, we're strictly looking at GPU performance using DirectX 12 and DirectX 11, including the recent RX 400 series, GTX 10 series, GTX 9 series, and RX 300 series GPUs. Video cards tested include the RX 480, RX 470, RX 460, 390X, and Fury X from AMD and the GTX 1080, 1070, 1060, 970, and 960 from nVidia. We've got a couple others in there, too. We may separately look at CPU performance, but not today.
This BF1 benchmark bears with it extensive testing methodology, as always, and that's been fully detailed within the methodology section below. Please be sure that you check this section for any questions as to drivers, test tools, measurement methodology, or GPU choices. Note also that, as with all Origin titles, we were limited to five device changes per game code per day (24 hours). We've got three codes, so that allowed us up to 15 total device tests within our test period.
This is the last of our CitizenCon coverage from Sunday. Following this interview with Erin Roberts, we flew up to San Jose to tour a few of the hardware manufacturers located in the area. We'll likely have some coverage of those visits online within the next 2-3 days, for folks looking for a return to hardware industry discussion and architecture dives. A few reviews are also pending publication, likely going live next week.
We try to focus on technology at GN, as always, and so spent our previous interview talking about parallax occlusion mapping and 64-bit engine technology. We think that this is interesting and useful information to learn to better understand GPU interactions with game engines, hopefully better painting a picture of what's going on behind the GPU shroud. With Erin Roberts, Studio Head for the Foundry 42 UK branch, we discussed procedural generation edge blending for planets v2 (also defined here), development timelines to build the demo shown, and workload distribution between studios.
Immediately following our already-published interview with Star Citizen's Chris Roberts, we encountered Technical Director Sean Tracy, recently responsible for educating us on the game's 64-bit engine. The Technical Director took a few moments after CitizenCon to share details about the lower level technology driving the night's demonstration, like real-time physics, poly per pixel budgets, occlusion mapping, and more.
Tracy's role for the CitizenCon presentation primarily had him demonstrating the production tools utilized by CIG for planet development. This includes Planet Ed, already somewhat detailed here and here, which is the team's creation kit for planet customization and design. The toolkit takes the approach of getting artists “90% of the way there,” allowing them to fine-tune the final 10% for faster production pipelines and a hands-on polish. The tech demo showed a walk-through of CIG's team using large brushes to paint the surface with biomes, hand-placing bodies of water and buildings, and blending everything together.
CitizenCon 2016 included the biggest technological demonstrations that the game has publicly shown to-date, including fully functional procedural generation 2.0 for planets, real-time spring physics, and authoring tools. The technology suite was detailed in two of our recent interviews with Sean Tracy and Chris Roberts, but was overshadowed in some ways by statements given regarding a potential Squadron 42 demonstration for CitizenCon 2016.
Viewers of the stream (or our content) will know that Squadron 42 didn't make it into the live demonstration, already packed with hours of discussion on Spectrum (comms systems), procedural generation, authoring tools, and roadmaps. The demonstration was technologically and graphically impressive, but Squadron 42 was left out for polish and refinement reasons. Roberts indirectly referenced our interview on stage, stating that, “I gave an interview about 3 weeks ago and probably spoke too soon,” but continued that he hoped the planet and tools demonstrations would make up for this.
In our post-event interview, we spoke with CIG CEO Chris Roberts on his thoughts regarding the event, the roadmap for Star Citizen, and Squadron 42's absence. Learn more in the video, or find a transcript below.
Leading into Star Citizen's annual “CitizenCon” event, held today, we received preliminary details from CIG CEO Chris Roberts and Technical Director Sean Tracy, both of whom heavily focused on the unveiling of new procedural planets technology. The first interview (with Roberts) covered the top-level overview of procedural planet generation technology, with the third interview (with Tracy) focusing on the tools driving the planets.
But CitizenCon 2016 marks the first time we've seen those tools in action, unveiled on stage in front of a live audience of more than 600 people, with more tuning in to the stream. Our earlier interviews suggested that the first major, complete Squadron 42 mission would be unveiled at CitizenCon alongside this “Planets V2” tech and character technology updates, but the plans changed in the weeks since that discussion. Planets V2 took the spotlight with its Homestead demo, while Squadron 42 was delayed to allow for quality improvement on the single-player demo.
Note: This was written live during the event for immediate publication. We are continuing to live update.
(7:50PM PT 10/9 - We are done live updating. We've added 4K screenshots to this article, all new from tonight. We've got interviews with Chris Roberts, Sean Tracy, and Erin Roberts going live shortly.)
Before getting to further discussion, a recap of the last two weeks:
The Gears of War franchise may be 10 years old, but this fourth title is only the second to make its way to PC. Third-person shooters in general have never had as big of a following on PC as they have on console. Gears of War, it seems, has become something of a gold standard of third-person shooters for play with dual-analog controllers. We're here to put it through its paces on keyboard and mouse.
There are good reasons for the success of the franchise. The dual thumbsticks of an Xbox controller suit maneuvering through cover and “locking” to walls. Then there's the trademark thickness of the characters; even normal humans stand out against other games. Against a backdrop of superfluous gore and chainsaw-guns like the Lancer, Gears of War has established itself as one of the meatiest, most visceral shooters on the market.
One of the perennial remarks about the Gears of War franchise has always been the game's character design. Every male character seems to have a jaw like a brick, or Robert Z’Dar, and the armor looks built of heavy metals. Now, the Gears of War franchise is five major titles in, with novel, comic, and board game adaptations. The odd, not-entirely-human look of the characters has become a dedicated part of the franchise, and GoW4 remains true to that aesthetic. The Coalition may have taken over development from Epic, but they’ve also come a long way in design from the first Gears of War. One of the characters in this game -- Del -- even appears to have a neck. As with all the previous titles, female characters still look pretty human. 25 years older, Marcus Fenix still looks more like a bicep drawn by Rob Liefeld than a human.
We had a clerical error in our original Gears of War 4 GPU benchmark, but that's been fully rectified with this content. The error was a mix of several variables, primarily having three different folks working on the benchmarks, and working with a game that has about 40 graphics settings. We also had our custom Python script (which works perfectly) for interpreting PresentMon, a new tool for FPS capture, and that threw enough production changes into the mix that we had to unpublish the content and correct it.
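For readers curious what "interpreting PresentMon" involves, here's a minimal sketch of that kind of post-processing. This is hypothetical illustrative code, not our production script; it assumes only that the capture contains a `MsBetweenPresents` column, which PresentMon CSV logs include, and derives average FPS plus 1% and 0.1% low figures from the slowest frametimes.

```python
import csv
from statistics import mean

def summarize_frametimes(csv_path):
    """Summarize a PresentMon capture: average FPS plus 1% low and
    0.1% low figures computed from the slowest frametimes."""
    with open(csv_path, newline="") as f:
        frametimes = [float(row["MsBetweenPresents"])
                      for row in csv.DictReader(f)]
    frametimes.sort(reverse=True)  # slowest frames first
    n = len(frametimes)

    def low_avg_fps(pct):
        # Average the slowest pct of frames, express as FPS
        k = max(1, int(n * pct))
        return 1000.0 / mean(frametimes[:k])

    return {
        "avg_fps": 1000.0 / mean(frametimes),
        "1%_low_fps": low_avg_fps(0.01),
        "0.1%_low_fps": low_avg_fps(0.001),
    }
```

A single 20 ms hitch in an otherwise steady 10 ms (100 FPS) run barely moves the average but halves the 1% low figure, which is why low metrics surface stutter that averages hide.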
All of our tests, though, were good. That's the good news. The error was in chart generation, where nVidia and AMD cards were put on the same charts using different settings, creating an unintentional misrepresentation of our data. And as a reminder, that data was valid and accurate – it just wasn't put in the right place. My apologies for that. Thankfully, we caught that early and have fixed everything.
I've been in communication with AMD and nVidia all morning, so everyone is clear on what's going on. Our 4K charts were completely accurate, but the others needed a rework. We've corrected the charts and have added several new, accurately presented tests to add some value to our original benchmark. Some of that includes, for instance, new tests that look at Ultra performance on nVidia vs AMD properly, tests that look at the 3GB vs 6GB GTX 1060, and more. Gears of War 4 is among the titles distributed to both PC and Xbox, generally leveraging UWP as a link.
Gears of War 4 is a DirectX 12 title. To this end, the game requires Windows 10 to play – Anniversary Edition, to be specific about what Microsoft forces users to install – and grants lower level access to the GPU via the engine. Asynchronous compute is now supported in Gears of War 4, useful for both nVidia and AMD, and dozens of graphics options make for a brilliantly complex assortment of options for PC enthusiasts. In this regard, The Coalition has done well to deliver a PC title of high flexibility, going the next step further to meticulously detail the options with CPU, GPU, and memory intensive indicators. Configure the game in an ambitious way, and it'll warn the user of a specific setting which may cause issues on the detected hardware.
That's incredible, honestly. This takes what GTA V did by adding a VRAM slider, then furthers it several steps. We cannot commend The Coalition enough for not only supporting PC players, but for doing so in a way which is so explicitly built for fine-tuning and maximizing hardware on the market.
In this benchmark of Gears of War 4, we'll test the FPS of various GPUs at Ultra and High settings (4K, 1440p, 1080p), furthering our tests by splashing in an FPS scaling chart across Low, Medium, High, and Ultra graphics. The benchmarks include the GTX 1080, 1070, 1060, RX 480, 470, and 460, and then further include last gen's GTX 980 Ti, 970, 960, and 950 with AMD's R9 Fury X, R9 390X, and R9 380X.
We moderate comments on a ~24-48 hour cycle. There will be some delay after submitting a comment.