During GPU and CPU benchmarking for Battlefront, we encountered a 200FPS framerate cap when using high-end configurations. For accurate benchmarking, this FPS limit had to be removed to reveal the “natural,” uncapped performance of the hardware on-bench. Thankfully, because Battlefront runs on a Frostbite engine familiar from previous Battlefield titles, its console commands are nearly identical to Battlefield 4's.
Here's the command you want to unlock FPS and disable the FPS limit in Star Wars Battlefront:
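In Battlefield 4, the Frostbite console (opened with the tilde key) exposes a framerate variable named GameTime.MaxVariableFps. Assuming Battlefront inherits it unchanged, as its other commands suggest, setting it to 0 should remove the cap entirely:

```
GameTime.MaxVariableFps 0
```

Any non-zero value instead sets a new ceiling (e.g. a value of 144 for a 144Hz display).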
Fallout 4 – now an entire day old – is nearing the end of its content lifecycle on our test bench. We'll soon move on to the next major hardware launches and game releases, but for now, we've got a few tests left in Bethesda's latest title. This next benchmark looks at the game's CPU performance, a greater methodological challenge than our preceding GPU benchmark, volumetric lighting benchmark, and texture quality comparison.
Our Fallout 4 CPU benchmark compares FPS across the Intel & AMD lineups, including the i3-4130, i5-4690K, i5-6600K, several i7 CPUs, and AMD's FX-8370E, 8320E, & 9590. Other CPUs were tested as well – like the popular G3258 dual-core Pentium and the A10-7870K APU – and are all included on the benchmark charts. The top-level goal was to find the best CPU for Fallout 4, but we also wanted to identify performance disparities and anomalies across models.
Following our Fallout 4 GPU benchmark and the follow-up Volumetric Lighting Benchmark, we're now looking toward Bethesda's odd texture quality presentation. Fallout 4 includes a Texture Quality option, which should dictate the resolution of textures as applied to game elements. The setting scales from medium to ultra, with one step (“high”) in between. Usually – especially in Black Ops III, as we found – texture resolution can have a profound impact on performance and VRAM consumption, leading us to run some texture tests in Fallout.
Here's an example of what we're used to when it comes to texture quality comparisons:
nVidia's implementation of volumetric lighting utilizes tessellation for light shaft radiation and the illumination of air. This approach allows better lighting when light sources are occluded by objects or when part of a light source is obfuscated, but requires that the GPU perform tessellation crunching to draw the light effects to the screen. nVidia handles tessellation well thanks to its architecture and targeted optimizations, but AMD isn't as strong here – Team Red regularly struggles with nVidia-implemented technologies that drive tessellation for visual fidelity, as seen with The Witcher 3's HairWorks hair.
When benchmarking Fallout 4 on our lineup of GPUs, we noticed that the R9 390X was outclassed by the GTX 970 at 1080p with ultra settings. This raised a few red flags worth further investigation; we tuned each setting individually and ultimately found that the 970 always led the 390X in our tests – no matter the configuration. Some settings, like shadow distance, can produce massive performance deltas (about 16-17% here), but still concluded with the 970 in the lead. It isn't until resolution is increased to 1440p that the 390X takes charge, somewhat expected given AMD's ability to handle raw pixel counts at the higher end.
Further research was required.
This is sort of a two-in-one fix – at least, it was for us.
Fallout 4, shipping tomorrow, is built on the same engine as Skyrim and previous Fallout games. Anyone familiar with Skyrim's expandability through mods and .ini tweaking may recall “iPresentInterval” – well, it's back.
iPresentInterval isn't just a V-Sync equivalent, which would lock the framerate to the refresh rate; instead, iPresentInterval caps the framerate at a hard 60 max (even with a 120Hz display). In Skyrim, changing this setting could impact physics events, and leaving it on was often recommended despite the framerate limitation. To be fair, neither Skyrim nor Fallout is a game that benefits from the notoriously high framerates demanded by CSGO players, for instance, but users of high refresh rate monitors still want their FPS.
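If the setting behaves as it did in Skyrim – an assumption, since mod tools and official documentation aren't out yet – the cap can be lifted by editing the game's preferences ini, likely Fallout4Prefs.ini under Documents\My Games\Fallout4:

```ini
[Display]
; 1 = default, hard 60FPS cap; 0 = uncapped
; Skyrim's physics misbehaved at high framerates, so the same risk may apply here
iPresentInterval=0
```

As in Skyrim, uncapping may destabilize physics events, so high-refresh users should watch for oddities after making the change.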
With a dozen hours of Black Ops 3 testing completed, we're moving to the next sequentially-incremented video game: Fallout 4. It's got a bigger number at the end.
Bethesda has one of the longest development life cycles in the industry, but the company's games are also among – arguably – the longest lasting, thanks to the undying efforts of modders. It helps that the modding community is able to fill gaps in Bethesda's code or build entirely new games from the strong foundation set forth by the veteran RPG team.
Our Fallout 4 game review & gameplay analysis is live on the homepage already, if that's what you're looking for. This post looks exclusively and in depth at Fallout 4's graphics settings and performance on PC. The below Fallout 4 PC benchmark tests FPS performance across nVidia and AMD GPUs, including the GTX 750 Ti, 960, 970, 270X, 285, 390X, and many more. VRAM and memory consumption are also examined loosely below, hopefully establishing a baseline for the best video cards for Fallout 4 on PC.
Because mod tools don't yet exist – and certainly no mods did during our pre-release testing – we are not accounting for the inevitable performance hit created by future graphics mods.
Update: Volumetric lighting benchmark now live.
Living up to the rolling hype-ball generated by Fallout 4 seems nearly impossible. After reminiscing about Fallout 3 for years, living through false rumors, and the non-stop recent postings pertaining to Bethesda’s latest game, expectations are at an all-time high.
Fallout 4 is the much-anticipated continuation of the Fallout series. The first-person shooter RPG is set in a futuristic world similar to our own, but one that diverges along a timeline wrought with nuclear war.
The newest game in the series is set in the Commonwealth of Massachusetts – Boston, specifically. Fallout 4 has been in development since the release of Fallout 3, a late 2008 launch, and uses the same engine as Skyrim. The basic storyline (note: this only covers the very beginning and initial storyline setup, nothing more) behind Fallout 4 is that a family is cryogenically frozen in Vault 111, having gotten inside just as the nuclear bombs go off. We’re then awoken and see our child being taken by two unknown NPCs who’ve just killed our significant other. After waking up again, we discover that we’re the only one left alive in Vault 111. So begins the game: we set forth on a journey to find the child.
Now that the background is established, let’s cover some of the features Bethesda did well on: settlement building, FPS mechanics, weapon and armor modding, and game atmosphere. We’ll later visit subpar performance for average graphics and irritating bugs.
Activision's latest in its seemingly undying shooter franchise launched with fairly simplistic graphics settings, but it still has a few items that may raise questions – like Order Independent Transparency and Subsurface Scattering. We talk about some of these at a top level in our Black Ops 3 GPU benchmark, found here, but dive deeper in this latest guide. Ignoring difficulties encountered with VRAM and memory, the heavy LOD scaling and graphics controls allow for scalability across the $130 to $1000 GPU range.
Our Call of Duty: Black Ops 3 optimization guide shows the best graphics settings for improving FPS, including screenshot comparisons of the settings. We independently benchmarked all of the game's settings. The screenshots below show texture quality (resolution) comparisons, preset & texture VRAM consumption, FPS performance for each setting, and more. We also define Order Independent Transparency, Volumetric Lighting, Subsurface Shadows, Mesh Quality, Shadow Mapping, and more of Call of Duty's options.
All of these tests were conducted using the patch released on November 7, which contained some bug fixes not addressed at launch. The latest nVidia (358.87) and AMD (15.11) drivers were used for testing. More below in the methodology sections.
Each setting is listed by the severity of its impact on FPS; the heaviest FPS impactors come first.
We're in the final throes of our Call of Duty: Black Ops III content before moving on to the next game – you know the one. While processing data for our forthcoming graphics optimization guide, we realized that Black Ops III is among the most VRAM-hungry games we've ever tested, consuming upwards of 10GB GDDR5 on the Titan X.
Our GPU benchmarks included some initial memory benchmarking, noting that the 980 Ti saw full saturation of its 6GB framebuffer at 4K/max settings. We also showed that the game commits 15.2GB of memory under max settings (pageable address space), with an active physical consumption of about 6.7GB (working set) in multiplayer. Our testing shows that the singleplayer campaign is far more intensive than multiplayer, to the tune of 38.6% lower FPS on the GPU side.
During tests of all Call of Duty: Black Ops 3's graphics settings, we uncovered a VRAM consumption approaching 10GB in campaign mode when using 4K & “Extra” settings.