The goal of this content is to show that HBAO and SSAO have a negligible impact on Battlefield 1 performance when choosing between the two. This benchmark arose following our Battlefield 1 GPU performance analysis, which demonstrated consistent frametimes and frame delivery on both AMD and nVidia devices when using DirectX 11. Two of our YouTube commenters asked if HBAO would create a performance swing that would favor nVidia over AMD and, although we've discussed this topic for several games in the past, we decided to revisit it for Battlefield 1. This time, we'll also spend a bit of time defining what ambient occlusion actually is, explaining how screen-space occlusion relies on information strictly within the z-buffer, and then look at the performance cost of HBAO in BF1.
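
For readers who'd like a concrete picture of the "z-buffer only" point, the toy sketch below estimates per-pixel occlusion purely from a depth buffer, darkening pixels whose neighbors sit closer to the camera. It's a minimal illustration of the screen-space idea – the function name, sample counts, and bias value are all ours, not Frostbite's HBAO or SSAO code.

```python
import numpy as np

def ssao_from_depth(depth, radius=4, samples=16, bias=0.002, seed=0):
    """Toy screen-space AO: estimate occlusion per pixel using only the depth
    (z) buffer, by counting how many nearby samples sit closer to the camera
    than the pixel itself. Illustrative only -- not Frostbite's implementation."""
    rng = np.random.default_rng(seed)
    h, w = depth.shape
    occlusion = np.zeros_like(depth)

    # Random 2D offsets within a disc of `radius` pixels around each pixel.
    angles = rng.uniform(0, 2 * np.pi, samples)
    radii = radius * np.sqrt(rng.uniform(0, 1, samples))
    offsets = np.stack([radii * np.cos(angles), radii * np.sin(angles)], axis=1)

    ys, xs = np.mgrid[0:h, 0:w]
    for dy, dx in offsets:
        sy = np.clip((ys + dy).astype(int), 0, h - 1)
        sx = np.clip((xs + dx).astype(int), 0, w - 1)
        sample_depth = depth[sy, sx]
        # A sample counts as an occluder if it sits in front of this pixel.
        occlusion += (sample_depth < depth - bias).astype(depth.dtype)

    return 1.0 - occlusion / samples   # 1.0 = fully open, 0.0 = fully occluded

# Usage: a synthetic depth buffer with a "step" whose base should darken.
depth = np.ones((64, 64), dtype=np.float32)
depth[:, 32:] = 0.5                    # nearer geometry on the right half
ao = ssao_from_depth(depth)
print(ao.min(), ao.max())
```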

We'd also recommend our previous graphics technology deep-dive for folks who want a more technical explanation of what's going on in various AO technologies. Portions of this new article also appear in that deep-dive.

A frame's arrival on the display is predicated on an unseen pipeline of command processing within the GPU. The game's engine calls the shots and dictates what's happening instant-to-instant, while the GPU is tasked with drawing the triangles and geometry, applying textures, rendering lighting and post-processing effects, and dispatching the packaged frame to the display.

The process repeats dozens of times per second – ideally 60 or higher, as in 60 FPS – and is only feasible through joint efforts by GPU vendors (IHVs) and engine, tools, and game developers (ISVs). The canonical view of game graphics rendering starts with the geometry pipeline, where the 3-dimensional model is created. Eventually, lighting gets applied to the scene, textures and post-processing are applied, and the scene is compiled and “shipped” for the gamer's viewing. We'll walk through the GPU rendering and game graphics pipeline in this “how it works” article, with detailed information provided by nVidia Director of Technical Marketing Tom Petersen.
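
As a rough mental model of that per-frame flow, the sketch below strings together stubbed-out stages – logic update, geometry, shading and post-processing, presentation – inside a loop paced toward a 60 FPS target. Every function here is hypothetical shorthand for work a real engine, driver, and GPU divide among themselves.

```python
import time

# A deliberately simplified sketch of the frame pipeline described above. Every
# stage is a hypothetical stub standing in for work that a real engine, driver,
# and GPU split across command buffers, shaders, and the display engine.

def update_game_logic(state):            # the engine "calls the shots"
    state["tick"] += 1
    return state

def build_geometry(state):               # triangles, draw calls, 3D models
    return {"draws": 128, "tick": state["tick"]}

def shade_and_postprocess(geometry):     # lighting, textures, post-processing
    return {"frame": geometry["tick"]}

def present(frame):                      # "ship" the packaged frame to the display
    pass

state = {"tick": 0}
target_frame_time = 1.0 / 60.0           # a 60 FPS target

for _ in range(3):                       # in a real game, this loop runs until quit
    start = time.perf_counter()
    state = update_game_logic(state)
    present(shade_and_postprocess(build_geometry(state)))
    elapsed = time.perf_counter() - start
    time.sleep(max(0.0, target_frame_time - elapsed))
```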

We spoke exclusively with the Creative Assembly team about its game engine optimization for the upcoming Total War: Warhammer. Major moves to optimize and refactor the game engine include DirectX 12 integration, better CPU thread management (decoupling the logic and render threads), and GPU-assigned processing to lighten the CPU load.
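
To illustrate what decoupling the logic and render threads buys, here's a minimal sketch – our own, not Creative Assembly's code – in which the simulation publishes state snapshots at its own cadence while a separate render thread draws whichever snapshot is newest, so neither side stalls the other.

```python
import queue
import threading
import time

# A minimal, hypothetical sketch of decoupled logic and render threads: the
# simulation publishes state snapshots at its own rate, and the renderer draws
# whichever snapshot is newest. Concept illustration only.

snapshots = queue.Queue(maxsize=1)       # renderer only ever needs the latest state
stop = threading.Event()

def logic_thread():
    tick = 0
    while not stop.is_set():
        tick += 1                                # advance the simulation
        if snapshots.full():
            try:
                snapshots.get_nowait()           # discard the stale snapshot
            except queue.Empty:
                pass
        snapshots.put({"tick": tick})            # publish the newest state
        time.sleep(1 / 30)                       # e.g. a 30 Hz simulation rate

def render_thread():
    while not stop.is_set():
        try:
            state = snapshots.get(timeout=0.1)
            print(f"rendering tick {state['tick']}")   # "draw" the snapshot
        except queue.Empty:
            continue                             # nothing new; keep the loop alive

threads = [threading.Thread(target=logic_thread), threading.Thread(target=render_thread)]
for t in threads:
    t.start()
time.sleep(0.5)                                  # let the demo run briefly
stop.set()
for t in threads:
    t.join()
```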

The interview with Al Bickham, Studio Communications Manager at Creative Assembly, can be found in its entirety below. We hope to soon visit the topic of DirectX 12 support within the Total War: Warhammer engine.

Our East Coast Game Conference coverage kicks off with Epic Games' rendering technology, specifically as it pertains to implementation within upcoming MOBA “Paragon.” Epic Games artist Zak Parrish covered topics relating to hair, skin, eyes, and cloth, providing a top-level look at game graphics rendering techniques and pipelines.

The subject was Sparrow, a braid-sporting playable archer hero with intensely detailed hair and lighting. Parrish used Sparrow to demonstrate each of his rendering points – but we'll start with sub-surface scattering, which may be a bit of a throwback for readers of our past screen-space subsurface scattering article (more recently in our Black Ops III graphics guide).

In our latest graphics technology interview – one of many from GDC 2016 – we spoke with Crytek's Frank Vitz about CryEngine's underlying graphics tech. Included in the discussion is a brief-but-technical overview of DirectX 12 integration (and why it's more than just a wrapper), particle FX, audio tech, and more.

This follows Crytek's announcement of CryEngine V (published here), where we highlighted the company's move to fully integrate DirectX 12 into the new CryEngine. As an accompaniment to this interview, we'd strongly encourage a read-through of our 2,000-word document on new graphics technologies shown at GDC 2016.

GDC 2016 marks further advancement in game graphics technology, including a somewhat uniform platform update across the three major game engines. That'd be CryEngine (now updated to version V), Unreal Engine, and Unity, of course, all synchronously pushing improved game fidelity. We were able to speak with nVidia to get in-depth and hands-on with some of the industry's newest gains in video game graphics, particularly involving voxel-accelerated ambient occlusion, frustum tracing, and volumetric lighting. Anyone who's benefited from our graphics optimization guides for Black Ops III, the Witcher, and GTA V should hopefully enjoy the new game graphics knowledge in this post.

The major updates come down the pipe through nVidia's GameWorks SDK version 3.1 update, which is being pushed to developers and engines in the immediate future. NVidia's GameWorks team is announcing five new technologies at GDC:

  • Volumetric Lighting algorithm update

  • Voxel-Accelerated Ambient Occlusion (VXAO)

  • High-Fidelity Frustum-Traced Shadows (HFTS)

  • Flow (combustible fluid, fire, smoke, dynamic grid simulator, and rendering in Dx11/12)

  • GPU Rigid Body tech

This article introduces the new technologies and explains how, at a low-level, VXAO (voxel-accelerated ambient occlusion), HFTS (high-fidelity frustum-traced shadows), volumetric lighting, Flow (CFD), and rigid bodies work.

Readers interested in this technology may also find AMD's HDR display demo a worthy look.

Before digging in, our thanks to nVidia's Rev Lebaredian for his patient, engineering-level explanation of these technologies.

NVidia's implementation of volumetric lighting utilizes tessellation to render light shafts and the illumination of air. This approach allows better lighting when light sources are occluded by objects or when part of a light source is obscured, but it requires that the GPU perform tessellation crunching to draw the light effects to the screen. NVidia is good at tessellation thanks to its architecture and specific optimizations, but AMD isn't as good at it – Team Red regularly struggles with nVidia-implemented technologies that drive tessellation for visual fidelity, as seen in the Witcher's hair.
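
For a sense of what's actually being computed, the sketch below uses the more common ray-march formulation of single-scattering volumetric light rather than nVidia's tessellated light volumes: it walks along a view ray, skips samples shadowed by an occluder, and accumulates the light the unshadowed stretch of air scatters toward the camera. Names, constants, and the sphere stand-in for the shadow test are all illustrative.

```python
import numpy as np

# A minimal single-scattering sketch of volumetric ("god ray") lighting.
# GameWorks extrudes tessellated light volumes instead of marching rays, but the
# quantity computed is similar: how much light the air along each view ray picks
# up, even when the light source itself is partly occluded.

def _segment_hits_sphere(a, b, center, radius):
    """True if segment a->b passes through the sphere (point-to-segment test)."""
    ab = b - a
    t = np.clip(np.dot(center - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    closest = a + t * ab
    return float(np.linalg.norm(center - closest)) < radius

def inscattered_light(ray_origin, ray_dir, light_pos, occluders,
                      steps=64, max_dist=20.0, density=0.05):
    """March along the view ray, accumulating light scattered toward the camera.
    `occluders` is a list of (center, radius) spheres used as a toy shadow test."""
    total = 0.0
    step = max_dist / steps
    for i in range(steps):
        p = ray_origin + ray_dir * (i + 0.5) * step       # sample point in the air
        dist_sq = float(np.dot(light_pos - p, light_pos - p))
        shadowed = any(_segment_hits_sphere(p, light_pos, c, r) for c, r in occluders)
        if not shadowed:
            total += density * step / dist_sq             # inverse-square falloff
    return total

# One ray looking past a sphere that partially blocks the light: the unshadowed
# portion of the ray still accumulates scattered light, producing the shaft.
cam = np.array([0.0, 0.0, 0.0]); fwd = np.array([0.0, 0.0, 1.0])
light = np.array([0.0, 5.0, 10.0]); blocker = [(np.array([0.0, 2.5, 10.0]), 1.0)]
print(inscattered_light(cam, fwd, light, blocker))
```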

When benchmarking Fallout 4 on our lineup of GPUs, we noticed that the R9 390X was outclassed by the GTX 970 at 1080p with ultra settings. This set off a few red flags, so we investigated further, tuning each setting individually and ultimately finding that the 970 always led the 390X in our tests – no matter the configuration. Some settings, like shadow distance, can produce massive performance deltas (about 16-17% here), but those tests still concluded with the 970 in the lead. It isn't until resolution is increased to 1440p that the 390X takes charge, which is somewhat expected given AMD's ability to handle raw pixel counts at the higher end.

Further research was required.

During the GTA V craze, we posted a texture resolution comparison that showcased the drastic change in game visuals from texture settings. The GTA content also revealed VRAM consumption and the texture setting's effectively non-existent impact on framerates. The Witcher 3 has a similar “texture quality” setting in its game graphics options, something we briefly mentioned in our Witcher 3 GPU benchmark.

This Witcher 3 ($60) texture quality comparison shows screenshots with settings at Ultra, High, Normal, and Low using a 4K resolution. We also measured the maximum VRAM consumption for each setting in the game, hoping to determine whether VRAM-limited devices could benefit from dropping texture quality. Finally, in-game FPS was measured as a means to determine the “cost” of higher quality textures.
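
For those curious how a peak-VRAM figure like this can be captured, one simple approach – an assumption on our part, not the exact tooling behind these results – is to poll nvidia-smi's memory.used query during a benchmark pass and keep the maximum reading, once per texture setting:

```python
import subprocess
import time

# Hypothetical peak-VRAM logger: poll nvidia-smi while the benchmark pass runs
# and keep the highest memory.used reading. Requires an nVidia GPU with the
# nvidia-smi utility installed; values are reported in MiB.

def peak_vram_mib(duration_s=60.0, interval_s=0.5):
    peak = 0
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        used = max(int(line) for line in out.splitlines() if line.strip())
        peak = max(peak, used)              # keep the worst-case reading
        time.sleep(interval_s)
    return peak

if __name__ == "__main__":
    # Run the same benchmark pass at Ultra/High/Normal/Low and log each peak.
    print(f"peak VRAM during pass: {peak_vram_mib(duration_s=30)} MiB")
```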

“Tessellation” isn't an entirely new technology – Epic and Crytek have been talking about it for years, alongside nVidia's own pushes – but it's been getting more visibility in modern games. GTA V, for instance, has a special tessellation toggle that can be tweaked for performance. Like most settings found in a graphics menu, the general understanding of tessellation is nebulous at best; it's one of those settings that, perhaps like ambient occlusion or anti-aliasing, has a loose tool-tip of a definition but doesn't get broken down with much depth.

As part of our efforts to expand our game graphics settings glossary, we sat down with Epic Games Senior Technical Artist Alan Willard, a 17-year veteran of the company. Willard provided a basic overview of tessellation: how it is used in game graphics, its GPU load and performance cost, and implementation techniques.
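
As a taste of the core idea before the interview, the toy sketch below subdivides a single coarse triangle by connecting edge midpoints – the essence of what tessellation does – and prints how quickly the triangle count (and therefore GPU work) grows per subdivision level. Real hardware tessellation runs in dedicated pipeline stages and typically feeds displacement; this is only the concept.

```python
# A toy sketch of what tessellation does to geometry: one coarse triangle is
# subdivided into smaller triangles, which the GPU can then displace for added
# detail. Triangle counts show why high tessellation factors cost GPU time.

def subdivide(tri):
    """Split one triangle into four by connecting its edge midpoints."""
    a, b, c = tri
    mid = lambda p, q: ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2, (p[2] + q[2]) / 2)
    ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def tessellate(tris, levels):
    for _ in range(levels):
        tris = [small for tri in tris for small in subdivide(tri)]
    return tris

base = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))]
for level in range(5):
    print(f"level {level}: {len(tessellate(base, level))} triangles")
# level 0: 1 triangle ... level 4: 256 triangles -- 4x more work per level
```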

Physically-based rendering promises photorealistic lighting in 3D environments by offering a mathematical, less production-intensive approach to the rendering of light. The three-letter acronym – “PBR” – has circulated lately as industry frontrunners like Chris Roberts (Star Citizen) have touted its presence in triple-A titles. A few game engines come to mind when we think of advanced, hyper-realistic graphics capabilities; one of those engines is CryEngine, developed and maintained by Crytek and best known for advancing PC graphics with Crysis.

While at GDC, we had the opportunity to ask a pair of CryEngine developers to explain what PBR is, how it affects gameplay, and whether it has any performance or production impact on games. Joining us in the video below are 3D Environment Artist Sina Els and Engine Programmer Scott Peter, providing a top-level definition of physically-based rendering and its uses.
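
As background for the interview, the sketch below shows the sort of math the “physically-based” label refers to: a textbook Cook-Torrance specular term built from the GGX distribution, Schlick's Fresnel approximation, and a Smith geometry term. It's a generic formulation for illustration, not CryEngine's shader code.

```python
import math

# A generic sketch of the math behind "physically-based" specular shading: the
# Cook-Torrance BRDF with GGX distribution, Schlick Fresnel, and Smith geometry.
# Textbook formulation, evaluated for a single light and surface point.

def ggx_distribution(n_dot_h, roughness):
    a2 = (roughness * roughness) ** 2                  # Disney roughness remap
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

def fresnel_schlick(v_dot_h, f0):
    return f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5

def smith_geometry(n_dot_v, n_dot_l, roughness):
    k = (roughness + 1.0) ** 2 / 8.0                   # direct-lighting remapping
    g1 = lambda x: x / (x * (1.0 - k) + k)
    return g1(n_dot_v) * g1(n_dot_l)

def cook_torrance_specular(n_dot_l, n_dot_v, n_dot_h, v_dot_h, roughness, f0=0.04):
    """Specular reflectance for one light; f0 = 0.04 approximates dielectrics."""
    d = ggx_distribution(n_dot_h, roughness)
    f = fresnel_schlick(v_dot_h, f0)
    g = smith_geometry(n_dot_v, n_dot_l, roughness)
    return (d * f * g) / max(4.0 * n_dot_v * n_dot_l, 1e-4)

# At an off-peak half-angle, rougher surfaces return more specular here because
# the same energy is spread over a wider lobe:
for r in (0.1, 0.4, 0.8):
    print(r, cook_torrance_specular(0.9, 0.9, 0.95, 0.9, roughness=r))
```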
