To everyone’s confusion, a review copy of Dragon Ball FighterZ for Xbox One showed up in our mailbox a few days ago. We’ve worked with Bandai Namco in the past, but never on console games. They must have cast a wide net with review samples, and judging by the Steam Charts stats, it worked.

It’d take some digging through the site archives to confirm, but we might never have covered a real fighting game before. None of us play them, we’ve tapered off doing non-benchmark game reviews, and they generally aren’t demanding enough to be hardware testing candidates (recommended specs for FighterZ include a 2GB GTX 660). For the latter reason, it’s a good thing they sent us the Xbox version. It’s “Xbox One X Enhanced,” but not officially listed as 4K, although that’s hard to tell at a glance: the resolution it outputs on a 4K display is well above 1080p, and the clear, bold lines of the cel-shaded art style make it practically indistinguishable from native 4K even during gameplay. Digital Foundry claims it’s 3264 x 1836 pixels, or 85% of 4K in height/width.

Today, we’re using Dragon Ball FighterZ to test our new console benchmarking tools and to further iterate on them for, frankly, bigger future launches. This will enable us to run console vs. PC testing in greater depth going forward.

This week's hardware news recap teases some of our upcoming content pieces, including a potential test of Dragon Ball FighterZ, along with pending-publication interviews with key Spectre & Meltdown researchers. As usual, we also discuss the major hardware news of the past few days. The headline item pertains to Samsung's GDDR6 memory entering mass production, nearing readiness for deployment in future products. This will almost certainly include GPU products, alongside the expected mobile device deployments. We also talk AMD's new hires and RTG restructure, its retirement of implicit primitive shaders for Vega, and SilverStone's new low-profile air cooler.

Show notes are below the embedded video.

This content piece was highly requested by the audience, although there is presently limited point to its findings. Following the disclosure of the Meltdown and Spectre exploits last week, Microsoft pushed a Windows security update that sought to close some of the gaps, and that update has been speculated to cause performance dips of anywhere from 5% to 30%. As of today, Intel has not yet released its microcode update, which means it is largely premature to draw firm conclusions from the benchmarks in this content piece – that said, there is merit to the exercise, provided it is viewed from the right perspective.

From the perspective of advancing knowledge and building a baseline for the next round of tests – those which will, unlike today’s, factor in microcode patches – we must eventually run the tests being run today. This gives us a performance baseline and grants us two critical measurements: (1) baseline, pre-Windows-patch performance, and (2) post-patch, pre-microcode performance. Together, these will let us isolate the impact of Intel’s firmware update from that of Microsoft’s software update. This alone makes the endeavor worthwhile – particularly because our CPU suite is automated anyway, so the time cost is small, despite CES looming.
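To make the isolation concrete, here's a short sketch with hypothetical FPS numbers (all three values are invented purely for illustration – they are not our test results):

```python
# Hypothetical FPS values, purely for illustration.
baseline_fps = 144.0    # (1) pre-Windows-patch baseline
os_patch_fps = 138.0    # (2) post-Windows-patch, pre-microcode
microcode_fps = 133.0   # post-Windows-patch, post-microcode

# Percentage deltas isolate each update's contribution.
os_impact = (os_patch_fps - baseline_fps) / baseline_fps * 100
micro_impact = (microcode_fps - os_patch_fps) / os_patch_fps * 100
total_impact = (microcode_fps - baseline_fps) / baseline_fps * 100

print(round(os_impact, 2), round(micro_impact, 2), round(total_impact, 2))
```

With all three data points captured, the microcode delta is measured against the post-OS-patch figure, not the original baseline, so the two updates' costs don't get conflated.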

Speaking of CES, we only had time to run one CPU through the suite, and only with a few games. This is enough for now, though, and should sate some of the demand and interest.

Microsoft has, rather surprisingly, made it easy to get into and maintain the Xbox One X. The refreshed console uses just two screws to secure the chassis – two opposing, plastic jackets for the inner frame – and then uses serial numbering to identify the order of parts removal. For a console, we think the Xbox One X’s modularity of design is brilliant and, even if it’s just for Microsoft’s internal RMA purposes, it makes things easier for the enthusiast audience to maintain. We pulled apart the new Xbox One X in our disassembly process, walking through the VRM, APU, cooling solution, and overall construction of the unit.

Before diving in, a note on the specs: The Xbox One X uses an AMD Jaguar-based APU with an integrated AMD Polaris GPU carrying 40 CUs. This CU count is greater than the RX 580’s 36 CUs (and so yields 2560 SPs vs. 2304 SPs), but the GPU runs at a lower clock speed. Enter our errata from the video: the clock speed of the integrated Polaris GPU in the Xbox One X is purportedly 1172MHz (some early claims indicated 1720MHz, but that proved to be the memory speed); at 1172MHz, the integrated Polaris GPU is about 100MHz slower than the original reference Boost of the RX 480, or about 168MHz slower than some of the RX 580 partner models. Consider this a correction of those numbers – we cited the 1700MHz figure in the video as the core clock, but the correct figures are 1172MHz core and 1700MHz memory (6800MHz effective). The memory operates at 326GB/s of bandwidth on its 384-bit bus. As for the rest, 40 CUs means 160 TMUs, giving a texture fill-rate of about 188GT/s.
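The derived figures above follow directly from the listed specs; as a quick sanity check (per-CU SP and TMU counts are standard for GCN-based Polaris):

```python
# Sanity-check the Xbox One X GPU figures cited above.
cus = 40
sps = cus * 64             # 64 stream processors per GCN CU -> 2560
tmus = cus * 4             # 4 texture units per GCN CU -> 160

core_mhz = 1172
tex_fill = tmus * core_mhz / 1000            # gigatexels/s -> ~187.5

bus_bits = 384
eff_mem_mhz = 6800         # 1700MHz GDDR5, quad-pumped
bandwidth = bus_bits / 8 * eff_mem_mhz / 1000  # GB/s -> ~326.4

print(sps, tmus, round(tex_fill, 1), round(bandwidth, 1))
```

The 187.5GT/s result rounds to the ~188GT/s figure cited, and the bandwidth math lands on the 326GB/s spec exactly.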

Our hardware news round-up for the past week is live, detailing some behind-the-scenes / early information on our thermal and power testing for the i9-7900X, the Xbox One X hardware specs, Threadripper's release date, and plenty of other news. Additional coverage includes final word on Acer's Predator 21 X, Samsung's 64-layer NAND finalization, GlobalFoundries' 7nm FinFET for 2018, and some extras.

We anticipate a slower news week for non-Intel/non-AMD entities this week, as Intel launched X299/Skylake-X and AMD is making waves with Epyc. Given the command both these companies have over consumer news, it's likely that other vendors will hold further press releases until next week.

Find the show notes below, written by Eric Hamilton, along with the embedded video.

The right-to-repair bills (otherwise known as “Fair Repair”) making their way through a few different states are facing staunch opposition from the Entertainment Software Association, a trade organization that includes Sony, Microsoft, and Nintendo, as well as many video game developers and publishers. The proposed legislation would make it easier for consumers to fix not only consoles, but electronics in general, including cell phones. Bills have been introduced in Nebraska, Minnesota, New York, Massachusetts, and Kansas. The bill is currently furthest along in Nebraska, where the ESA has concentrated its lobbying efforts.

Console makers have been a notable enemy of aftermarket repair, but they are far from alone; both Apple and John Deere have vehemently opposed this kind of legislation. In a letter to the Copyright Office, John Deere asserted—among other spectacular delusions, such as the claim that owners have only an implied license to operate their tractors—that allowing owners to repair, tinker with, or modify their tractors would “make it possible for pirates, third-party developers, and less innovative competitors to free-ride off the creativity, unique expression and ingenuity of vehicle software.”

This issue has been driving us crazy for weeks. All of our test machines connect to shared drives on a central terminal running Windows 10. As tests are completed, we open a File Explorer window and navigate to \\COMPUTER-NAME\data to drop our results into the system. This setup is used for rapid file sharing across our internal gigabit network, rather than going through cumbersome USB keys or bloating our NAS with small test files.

Unfortunately, updating our primary test benches to the Windows 10 Anniversary Update broke this functionality. We’d normally enter \\COMPUTER-NAME\data to access the shared drive over the network, but after the update, that started returning an “incorrect username or password” error despite the credentials being correct. We worked around the issue for a few weeks, but it finally became annoying enough to warrant some quick research.
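As a hypothetical first step for this class of SMB credential error, explicitly storing the credentials with Windows' built-in `cmdkey` tool, then mapping the share with `net use`, sometimes sidesteps the failing automatic authentication. This is a sketch, not necessarily the fix for this specific bug; USERNAME and PASSWORD are placeholders for the actual share credentials:

```shell
:: Store credentials for the target machine explicitly.
:: USERNAME and PASSWORD are placeholders, not real values.
cmdkey /add:COMPUTER-NAME /user:USERNAME /pass:PASSWORD

:: Map the share to a drive letter using the stored credentials.
net use X: \\COMPUTER-NAME\data /persistent:yes
```

Both commands ship with Windows; whether this resolves the post-Anniversary-Update error depends on what the update actually changed in the authentication path. (No automated test is included, as the commands require a live Windows network environment.)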

Our full OCAT content piece is still pending publication, as we ran into some blocking issues when working with AMD’s OCAT benchmarking utility. In speaking with the AMD team, we learned those issues are being worked out behind the scenes for this pre-release software and are still being actively documented. For now, we decided to publish a quick overview of OCAT: what it does, and how the tool should theoretically make it easier for all users to perform DX12 & Vulkan benchmarks going forward. We’ll revisit with a performance and overhead analysis once the tool works out some of its bugs.

The basics, then: AMD has only built the interface and overlay here, and relies on the existing, open-source PresentMon – an Intel tool that leans on Microsoft’s event tracing – to perform the hooking and performance interception. We’ve been detailing PresentMon in our benchmarking methods for a few months now, using it to monitor low-level API performance and using Python and Perl scripts built by GN for data analysis. That’s the thing, though – PresentMon isn’t necessarily easy to understand, and our model of usage revolves entirely around the command line. We use the preset commands established by the tool’s developers, then crunch the data with spreadsheets and scripts. That’s not user-friendly for a casual audience.

Just to deploy the tool, the Visual Studio package requirements and a rudimentary understanding of CMD – while not hard to figure out – mean it’s not exactly fit for easy benchmarking by general users. And even for technical media, an out-of-box PresentMon isn’t exactly the fastest tool to work with.
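To illustrate the command-line workflow described above, here's a simplified sketch of the kind of post-processing such scripts perform on a PresentMon CSV log. The `MsBetweenPresents` column is PresentMon's per-frame interval; the `fps_stats` function is a hypothetical helper for this example, not part of PresentMon or GN's actual scripts:

```python
import csv
import statistics

def fps_stats(csv_path):
    """Compute average FPS and a 1% low from a PresentMon CSV log.

    PresentMon logs one row per frame; MsBetweenPresents is the
    frame-to-frame interval in milliseconds.
    """
    with open(csv_path, newline="") as f:
        frametimes = [float(row["MsBetweenPresents"])
                      for row in csv.DictReader(f)]

    avg_fps = 1000.0 / statistics.mean(frametimes)

    # 1% low: FPS corresponding to the 99th-percentile frame time.
    worst = sorted(frametimes)[int(len(frametimes) * 0.99)]
    one_pct_low = 1000.0 / worst

    return avg_fps, one_pct_low
```

This is the easy half of the job; the friction OCAT aims to remove is everything before it: building, launching, and hooking PresentMon correctly from the command line.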

The Coalition's Gears of War 4 demonstrated the capabilities of nVidia's new GTX 1070-enabled notebooks, operating at 4K with fully maxed-out graphics options. View our Pascal notebook article for more information on the specifics of the hardware. While at the event in England, we documented the game's complete graphics settings and took notes on each setting's impact on the GPU and CPU. The Coalition may roll out additional settings by the game's October launch.

We tested Gears of War 4 on the new MSI GT73 notebook with 120Hz display and a GTX 1070 (non-M) GPU. The notebook was capable of pushing maxed settings at 1080p and, a few pre-release bugs aside (pre-production hardware and an unfinished game), gameplay ran in excess of 60FPS.

We've got an early look at Gears of War 4's known graphics settings, elevated framerate, async compute, and dynamic resolution support. Note that the Gears team has promised “more than 30 graphics settings,” so we'll likely see a few more in the finished product. Here are our photos of the graphics options menu:

UWP, the Windows 10 games distribution platform, has previously forced V-Sync onto users, but the setting will be toggleable in Gears of War 4, The Coalition's Mike Rayner told Eurogamer. Among other technical changes, Gears of War 4 appears to be shaping up as a proper benchmark title for our future GPU reviews. The game will host a benchmark mode – always a plus – while unlocking the framerate and adding super-resolution support. That means, like Shadow of Mordor and similar games, players will be able to run the game at whatever resolution they want. It’s similar to DSR/VSR in that the game renders at the higher resolution, then downscales to fit the display. This results in greater pixel density and increased clarity.
