While testing Titanfall, I kept tally of memory consumption to get a ballpark understanding of the game's bulk.
Titanfall's Graphics Settings Explained
I'll get into our test methodology and applied settings in a moment, but first, a look at what Titanfall offers:
- Impact Marks: How many visible landing marks are simultaneously tracked and displayed (from Titans and other falling objects). The CPU handles this task almost exclusively.
- Ragdoll physics: The accuracy with which dead body physics are presented is contingent upon the CPU's processing abilities.
- Lighting Quality: The diffusion effects, light count, and types of lights displayed on the screen. This is primarily a GPU-intensive task.
- Model Detail: The Level of Detail (LOD) for character models in-game. Lower LOD settings will significantly improve framerate.
- Anti-Aliasing: Higher anti-aliasing settings (and certain types of AA) will push GPUs harder. Lowering AA settings and selecting the correct type for your GPU will improve framerate.
- Texture Filtering: The perceived depth, grit, and 'pop' of surface details. Higher anisotropic filtering settings give the illusion of added surface depth (actual texture and feel) to the painted objects on screen.
- Texture Resolution: As we've discussed countless times before on this website, high-resolution textures are the number one consumer of on-card video memory. Amping-up the texture resolution will eat into video RAM if using a dedicated card and system RAM if using an APU/IGP. System RAM will still be consumed for textures even with a discrete card, but to a lesser degree.
Effects details and Shadow details are a fair mix between CPU- and GPU-intensive processing.
Some items to note:
- Trilinear filtering is presently required to make use of 120Hz refresh rate displays.
- 'Insane' texture resolution is allegedly only available when more than 3GB of video RAM is present, though we were able to access it on AMD devices and with SLI.
- Titanfall appears to lock the FPS at the monitor's refresh rate (60Hz = 60FPS locked framerate; 120Hz = 120FPS locked framerate, etc.). Ensure your display device is set to its optimal refresh rate and ensure V-Sync is configured properly. You can learn about V-Sync in our previous post, found here; you'll want it either disabled or, if you're using a 120Hz display, triple-buffered.
While on the topic of V-Sync and our previous analysis of it, Titanfall has severe tearing and stuttering issues right now (pick your poison based upon your settings). It makes play a bit jarring, but that's the nature of a beta. No GPU appeared to handle this better than another, so I believe it's simply an issue with the pre-launch rev of the game and not a user-resolvable hardware issue.
With all of these items outlined, it's worth giving some optimization tips for weaker or lopsided hardware configurations. If you find that your GPU is underpowered in comparison to the accompanying CPU, we found the best graphics settings to diminish for Titanfall would be model detail, texture detail (if you're maxing out your video RAM), and texture filtering. The opposite -- a strong GPU and weak CPU -- would do well to minimize ragdoll physics and impact marks, then effects detail, then shadow detail.
Our initial tests were thrown out due to limitations discovered in some configurations. Titanfall appears to lock the framerate to the display, so if you've got a 60Hz display, your GPU will never report more than 60 frames per second. This would obviously have a massive negative impact on relative performance results, so we began investigating both software and hardware solutions. Because Titanfall runs on the Source engine, I spent a bit of time playing around in configuration files in an attempt to force com_maxfps to be a greater value, but to no avail. This is not aided by the fact that Respawn has restricted access to the command console within Titanfall. Finally, we stopped being lazy and substituted in a 27" 120Hz display to counteract this issue. Done. Solved.
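For context, this is the sort of Source-engine config edit attempted -- a sketch only, since Titanfall's beta ignored it and the in-game console is restricted (the value and file placement here are assumptions, not a working fix):

```
// autoexec.cfg -- hypothetical override; Titanfall's beta did not honor it.
com_maxfps 300   // attempt to raise the framerate ceiling beyond the display lock
```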
It was on the other side of the room. Can't blame me for spending an hour with configuration files before moving it.
The test bench used was our standard hardware review platform, detailed in all HW reviews.
| GN Test Bench 2013 | Name | Courtesy Of | Cost |
| --- | --- | --- | --- |
| CPU | Intel i5-3570k CPU | GamersNexus | ~$220 |
| Memory | 16GB Kingston HyperX Genesis 10th Anniv. @ 2400MHz | Kingston Tech. | ~$117 |
| Motherboard | MSI Z77A-GD65 OC Board | GamersNexus | ~$160 |
| Power Supply | NZXT HALE90 V2 | NZXT | Pending |
| SSD | Kingston 240GB HyperX 3K SSD | Kingston Tech. | ~$205 |
| Optical Drive | ASUS Optical Drive | GamersNexus | ~$20 |
| CPU Cooler | Thermaltake Frio Advanced | Thermaltake | ~$65 |

(This is what we're testing.)
The system was kept in a constant thermal environment (21C - 22C at all times) while under test. We kept the CPU overclocked at 4.4GHz (44x multiplier) running a 1.265V vCore. The 4x4GB memory modules were kept overclocked at 2133MHz. All case fans were set to 100% speed and automated fan control settings were disabled for test consistency.
A 120Hz display was connected for purposes of testing Titanfall without a frame-lock limiter. The native resolution of the display is 1920x1080, which is what we used throughout the test for best real-world spread. Trilinear filtering was enabled to allow a 120Hz framerate.
We ran the following settings for all discrete devices under test (DUT):
When testing the A10-5800K Trinity APU with Titanfall, we had to lower settings due to compatibility and performance concerns. For these reasons, the A10 (7660D)'s results on the benchmark are not tested on the same maxed settings as all other devices. See the 7660D's settings below:
The video cards tested included, in ascending order of gaming power:
- Intel HD4000 3rd-Gen IGP (last-gen).
- AMD A10-5800K APU 7660D IGP (superseded by Kaveri).
- AMD Radeon 7850 1GB GPU.
- CrossFire 2xAMD Radeon 7850 1GB GPUs.
- NVIDIA GeForce GTX 650 Ti Boost 2GB.
- SLI 2xNVIDIA GeForce GTX 650 Ti Boost 2GB.
- NVIDIA GeForce GTX 580 1.5GB (Fermi flagship, superseded by Kepler).
- NVIDIA GeForce GTX 760 2GB.
We ran each test five times for 60 seconds apiece, using FRAPS to benchmark frametimes and framerate. All tests were conducted on the same map in roughly the same area, and all were conducted on full servers. We did our best to keep a similar approach to play throughout the tests. The five runs for each card were averaged for maximum FPS, minimum FPS, average FPS, and 1% low FPS (a better indicator than 'minimum,' as it eliminates outliers).
Test Objective: Eliminate the CPU as a limiter and test GPU-limited gameplay for framerate performance (exception: APU test, wherein only the APU was used with no discrete component).
Titanfall PC Benchmarks: 7850 vs. 650 Ti Boost vs. 650 Ti Boost SLI vs. 760
First off, I stated earlier that RAM was loosely tracked throughout these tests. I'll note immediately that I often saw system memory consumption sitting in the range of ~3-3.5GB consumed by Titanfall's process. During simpler tests and when under less load, we tended to see 1.7GB of active memory consumption.
Now, the results:
(Note: We wrote a guide that contains common Titanfall crash/stutter fixes; if you're having issues with your beta, check the guide for assistance).
What you're looking at is:
- Average FPS: This is the number you care most about. This is the most realistic representation of what you'll experience with this video card in Titanfall.
- Minimum FPS: The lowest FPS ever reported (<0.01% of the total frame count). This is an outlier by nature and is a less realistic measurement than the next item.
- 1% Time Low FPS: The (low) FPS displayed 1% of the time; a representation of how low your 'lag spikes' will go when limited by video hardware. This has a serious impact on streaming and video capture.
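As an aside on the metrics themselves, here's a minimal sketch (in Python, with made-up frametimes -- not our actual tooling) of how average, minimum, and 1% low FPS can be derived from a FRAPS frametime log. The 1% low here averages the slowest 1% of frames, which is one common method:

```python
# Sketch: deriving FPS metrics from per-frame frametimes (milliseconds).
# The frametime values below are hypothetical, not captured data.

def fps_metrics(frametimes_ms):
    """Return (average, minimum, 1% low) FPS from a list of frametimes."""
    fps = sorted(1000.0 / t for t in frametimes_ms)   # per-frame FPS, ascending
    avg_fps = len(fps) * 1000.0 / sum(frametimes_ms)  # time-weighted average
    min_fps = fps[0]                                  # single worst frame (outlier-prone)
    worst = fps[:max(1, len(fps) // 100)]             # slowest 1% of frames
    one_pct_low = sum(worst) / len(worst)             # average of that slowest slice
    return avg_fps, min_fps, one_pct_low

# Example: ~60FPS play (16.7ms frames) with two spikes at 33.3ms and 40ms.
times = [16.7] * 98 + [33.3, 40.0]
avg, low, pct = fps_metrics(times)
```

With only 100 frames the slowest 1% is a single frame, so the 1% low equals the minimum here; over a 60-second run at 60+ FPS, the two diverge and the 1% low becomes the more stable number.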
You'll notice straight away that the CrossFire 7850s and HD4000 are marked as '0' in their scoring channel. Although the 7850s in CrossFire should theoretically perform well, the game exhibited fierce artifacting that made play impossible. It is likely an issue with the beta that will be resolved prior to launch. The HD4000 exhibited the same artifacting, though for the few seconds I managed to play, I was met with ~9-18 FPS. I did not scientifically measure the HD4000's framerate due to the impossibility of playing under those artifacting conditions.
So of these tested GPUs, what's the best video card for Titanfall?
There are a lot more video cards out there, but our suite does a pretty good job of covering low-end, really low-end/HTPC, mid-range, and borderline-high-end PCs. The only thing we truly lacked in this benchmark was a representative of the flagship class -- like a 780, 780 Ti, or an AMD R9 290 of some sort. That said, judging by the stellar performance of the GTX 760, it's fair to assume that a flagship card's primary advantage over the 760 (which already exceeds 'playable' standards) would be live video capture and multi-monitor / 3D / VR headroom.
Regardless, of these devices, the GTX 760 very clearly 'wins' over the rest. I had very high hopes for SLI GTX 650 Ti Boosts, but Titanfall seems to have some serious multi-GPU issues right now, so I can't say I trust that the SLI test was representative of the finalized product. We'll see. It's too close to the single 650 TiB and SLI actually underperformed in other aspects, so we'll have to revisit this toward a finalized release.
On the low-end, it's safe to say that playing Titanfall on a flagship APU of each generation (Trinity onward) is possible. With Trinity, you'll be pretty close to the lowest possible settings (but still at 1080), though Kaveri's 7850K APU should seriously outperform its predecessor in this regard. It is unreasonable to expect playability on an Intel IGP at this time.
AMD's Radeon 7850 -- although we no longer recommend it for purchase (poor value due to GPU market price inflation and other options) -- performs surprisingly well on the whole, but has some issues. There were very noticeable framerate drops during gameplay; intense battles would drop us from 60FPS to the 30-45FPS range, which felt like a totally different game and made it much more difficult to compete. As a player, I'd dial settings back to a moderate medium-high mix to compensate.
NVIDIA's GTX 650 Ti Boost was one of our favorite budget GPUs until recently (it will be superseded by the impending 750 Ti launch), and it actually outperformed the (admittedly abused) former Fermi flagship, the GTX 580. This is primarily due to memory availability: Titanfall's texture size and resolution fill the framebuffer quickly. The industry has generally pushed us to strictly recommend 2GB and larger framebuffers these days, and this test reinforces our suggestion to pursue more video memory.
Finally, the GTX 760 serves unsurprisingly well in Titanfall. The video card offered the most reliable framerate with the fewest dips below 30, making it an ideal mid-range purchase for most gamers with the budget for it; if you're doing any sort of lightweight FRAPS, streaming, or ShadowPlay recording for YouTube, we'd recommend a GTX 760 or higher. It is likely that you will need to drop one or two settings while recording with a 760, but the mid-range cost makes it worth it.
If you need the best visuals on a 120Hz+ display and have the money, the GTX 760 seems an easy choice. I'd say hold out for a GTX 750 Ti if you're considering a more budget-friendly card (it will replace the 650 Ti Boost, from what we understand).
For those on a tighter budget, the usual recommendation would be either a 2GB 7850 or R9 270, but AMD's prices are still rubber-banding from the cryptocurrency boom. Keep an eye on prices of those units and the 750 Ti.
Please don't hesitate to leave us a comment below with any questions! If you need support building a PC for Titanfall, drop us a forum post!
- Steve "Lelldorianx" Burke.