Hardware Guides

AMD R3 2200G Delid & Liquid Metal Thermals

Published February 14, 2018 at 1:03 pm

Delidding the AMD R3 2200G wasn’t as clean as using pre-built tools for Intel CPUs, but we have a separate video that’ll show the delid process to expose the APU die. The new APUs use thermal paste, rather than AMD’s usual solder, which is likely a cost-saving measure for the low-end parts. We ran stock thermal tests on our 2200G using the included cooler and a 280mm X62 liquid cooler, then delidded it, applied Thermal Grizzly Conductonaut liquid metal, and ran the tests again. Today, we’re looking at that thermal test data to determine what sort of headroom we gain from the process.

Delidding the AMD R3 2200G is the same process as for the 2400G, and liquid metal application follows the same guidelines as for Intel CPUs. This isn’t something we recommend for the average user. As far as we’re aware, one of Der8auer’s delid kits does work for Raven Ridge, but we went the vise & razor route. This approach, as you might expect, is a bit riskier to the health of the APU. It wouldn’t be difficult to slide the razor too far and destroy a row of SMDs (surface-mount devices), so we’d advise against following our example unless you’re willing to risk the investment.

APU reviews have historically proven binary: Either it’s better to buy a dGPU and dirt-cheap CPU, or it’s actually a good deal. There is zero room for middle-ground in a market that’s targeting $150-$180 purchases. There’s no room to be wishy-washy, and no room for if/but/then arguments: It’s either better value than a dGPU + CPU, or it’s not worthwhile.

Preceding our impending Raven Ridge 2400G benchmarks, we decided to test the G4560 and R3 1200 with the best GPU money can buy – because it’s literally the only GPU you can buy right now. That’d be the GT 1030. Coupled with the G4560 (~$72), we land at ~$160 for both parts, depending on the momentary fluctuations of retailers. With the R3 1200, we land at about $180 for both. The 2400G is priced at $170, or thereabouts, and lands between the two.

(Note: The 2400G & 2200G appear to already be listed at retailers, despite the fact that, at the time of writing, the embargo is still on.)

We’re revisiting one of the best ~200mm fans that existed: The SilverStone Air Penetrator 180, or AP181, found in the chart-topping Raven02 case that we once held in high regard. We dug these fans out of our old Raven, still hanging around post-testing from years ago, and threw them onto a test bench versus the Noctua 200mm and Cooler Master 200mm RGB fans (the latter coming from the H500P case).

These three fans, two of which are advertised as 200mm, all have different mounting holes. This is part of the reason that 200mm fans faded from prominence (the other being the replacement of mesh side panels with sheets of glass), as companies were all fighting over a non-standardized fan size. Generally speaking, buying a case with 200mm fans did not – and still does not – guarantee that other 200mm fans will work in that case. The screw hole spacing is different, the fan size could be different, and there were about four sizes of 200mm-ish fans at the time: 180mm, 200mm, 220mm, and 230mm.

That’s a large part of the 200mm fans’ vanishing act, although Cooler Master has recently resurrected some interest in them. It’s almost like a fashion trend: All the manufacturers saw at Computex that 200mm fans were “in” again, and immediately, we started seeing CES 2018 cases making a 200mm push.

The short answer to the headline is “sometimes,” but it’s more complicated than just FPS over time. To really address this question, we first have to explain the oddity of FPS as a metric: Frames per second is inherently an average – if we tell you something is running at a variable framerate, but is presently at 60FPS, what does that really mean? Because framerate is an average over a period of time, deriving spot-measurements in frames per second is fundamentally flawed. All that stated, the industry has accepted frames per second as the standard performance rating for games, and it is one of the most user-friendly ways to convey the actual, underlying metric: Frametime, or the frame-to-frame interval, measured in milliseconds.
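To make that relationship concrete, here’s a minimal sketch – with invented frametime numbers, not measured data – showing how an FPS figure is derived from frame-to-frame intervals, and why the averaging has to happen on frametimes rather than on per-frame FPS readings:

```c
#include <stdio.h>

int main(void) {
    /* Hypothetical frame-to-frame intervals in milliseconds; one hitch at 41.7 ms. */
    double frametimes_ms[] = { 16.4, 16.8, 15.9, 41.7, 16.2, 16.5 };
    int n = sizeof frametimes_ms / sizeof frametimes_ms[0];

    double total_ms = 0.0;
    for (int i = 0; i < n; i++) {
        /* A "spot" FPS reading is just the reciprocal of a single interval. */
        printf("frame %d: %5.1f ms  (%6.1f FPS instantaneous)\n",
               i, frametimes_ms[i], 1000.0 / frametimes_ms[i]);
        total_ms += frametimes_ms[i];
    }

    /* Average FPS over the window is frames divided by elapsed time, i.e. the
       reciprocal of the mean frametime, not the mean of the spot FPS values. */
    double avg_frametime_ms = total_ms / n;
    printf("average: %5.1f ms  (%6.1f FPS over the window)\n",
           avg_frametime_ms, 1000.0 / avg_frametime_ms);
    return 0;
}
```

In this made-up window, the single hitch frame drags the true average down further than the mostly-60FPS spot readings would suggest, which is exactly why frametime plots expose stutter that an FPS counter smooths over.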

Today, we’re publicly releasing some internal data that we’ve collected for benchmark validation. This data looks specifically at benchmark duration optimization – min-maxing for maximum accuracy and card throughput against the minimum test time required to retain that accuracy.

Before we publish any data for a benchmark – whether that’s gaming, thermals, or power – we run internal-only testing to validate our methods and thought process. This is often where we discover flaws in our methods, allowing us to refine them prior to publishing any review data. There are a few things we traditionally research for each game: benchmark duration requirements, the load level of a particular area of the game, the best- and worst-case performance scenarios in the game, and the average expected performance for the user. We also regularly find shortcomings in test design – that’s the nature of working on a test suite for a year at a time. As with most things in life, the goal is to develop something good, then iterate on it as we learn from the process.
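As a rough illustration of the duration question – this is a simplified sketch with invented numbers, not our internal tooling – the basic check is whether run-to-run variance at a shorter pass length stays within the error margin we’re willing to accept:

```c
#include <math.h>
#include <stdio.h>

/* Mean of n values. */
static double mean(const double *x, int n) {
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += x[i];
    return sum / n;
}

/* Sample standard deviation of n values about a given mean. */
static double stddev(const double *x, int n, double m) {
    double ss = 0.0;
    for (int i = 0; i < n; i++)
        ss += (x[i] - m) * (x[i] - m);
    return sqrt(ss / (n - 1));
}

static void report(const char *label, const double *runs, int n) {
    double m = mean(runs, n);
    double sd = stddev(runs, n, m);
    /* Coefficient of variation: run-to-run spread relative to the mean. */
    printf("%s: mean %.1f FPS, stdev %.2f, variation %.2f%%\n",
           label, m, sd, 100.0 * sd / m);
}

int main(void) {
    /* Invented average-FPS results from five repeated passes at each duration. */
    double short_pass[] = { 98.2, 101.7, 95.4, 103.0, 97.1 };  /* e.g., a 30-second pass */
    double long_pass[]  = { 99.0, 99.6, 98.7, 99.3, 99.1 };    /* e.g., a 120-second pass */

    report("short pass", short_pass, 5);
    report("long pass ", long_pass, 5);
    /* If the shorter pass's variation stays inside the acceptable margin, it is
       "accurate enough" and frees bench time for testing more cards. */
    return 0;
}
```

The trade-off is exactly what the data release covers: the shortest pass that still reproduces the longer pass’s results within an acceptable margin is the one worth standardizing on.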

To everyone’s confusion, a review copy of Dragon Ball FighterZ for Xbox One showed up in our mailbox a few days ago. We’ve worked with Bandai Namco in the past, but never on console games. They must have cast a wide net with review samples--and judging by the SteamCharts stats, it worked.

It’d take some digging through the site archives to confirm, but we might never have covered a real fighting game before. None of us play them, we’ve tapered off doing non-benchmark game reviews, and they generally aren’t demanding enough to be hardware testing candidates (recommended specs for FighterZ include a 2GB GTX 660). For the latter reason, it’s a good thing they sent us the Xbox version. It’s “Xbox One X Enhanced,” but not officially listed as 4K, although that’s hard to tell at a glance: the resolution it outputs on a 4K display is well above 1080p, and the clear, bold lines of the cel-shaded art style make it practically indistinguishable from native 4K even during gameplay. Digital Foundry claims it’s 3264 x 1836 pixels, or 85% of 4K in height/width.

Today, we’re using Dragon Ball FighterZ to test our new console benchmarking tools, and further iterate upon them for -- frankly -- bigger future launches. This will enable us to run console vs. PC testing in greater depth going forward.

It’s been nearly a month since news broke on Meltdown and Spectre, but the tech industry is still swarming like an upturned anthill as patches have been tumultuous, hurting performance, causing reboots, and then getting halted and replaced, while major manufacturers try to downplay the problem. Yes, that sentence was almost entirely about Intel, but they aren’t the only ones affected. We now return to the scene of the crime, looking at the Meltdown and Spectre exploits with the assistance of several research teams behind the discovery of these attacks.

To summarize the summary of our previous article: Meltdown is generally agreed to be more severe, but limited to Intel, while Spectre has to do with a fundamental aspect of CPUs made in the past 20 years. They involve an important technique used by modern CPUs to increase efficiency, called “speculative execution,” which allows a CPU to preemptively queue up tasks it speculates will occur next. Sometimes, these cycles are wasted, as the actions never occur as predicted; however, most of the time, speculating on incoming jobs greatly improves the efficiency of the processor by preemptively computing the inbound instructions. That’s not the focus of this article, but this Medium article provides a good intermediate-level explanation of the mechanics, as do the Spectre and Meltdown whitepapers themselves. For now, it’s important to know that although “speculative execution” is a buzzword being tossed around a lot, it isn’t in itself an exploit--the exploits just take advantage of it.
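For readers who want a concrete picture of where speculation becomes exploitable, here’s a simplified snippet modeled on the bounds-check pattern described in the Spectre whitepaper (Variant 1). It is not a working exploit – there’s no timing side channel here – it only shows the branch body a CPU may execute speculatively before the bounds check resolves:

```c
#include <stddef.h>
#include <stdint.h>

uint8_t array1[16];
unsigned int array1_size = 16;
uint8_t array2[256 * 4096];

/* The bounds check below is architecturally sound: an out-of-range x never
 * commits a result. But while array1_size is still being fetched from memory,
 * the CPU may speculatively run the body with an attacker-chosen x, touching
 * a cache line in array2 that depends on the out-of-bounds byte. The
 * speculative work is discarded, yet the cache footprint remains -- that
 * residue is what the Spectre attacks measure. */
void victim_function(size_t x) {
    if (x < array1_size) {
        volatile uint8_t tmp = array2[array1[x] * 4096];
        (void)tmp;
    }
}

int main(void) {
    victim_function(1); /* in-bounds call; the snippet is illustrative only */
    return 0;
}
```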

The most comprehensive hub of information on Meltdown and Spectre is the website hosted by Graz University of Technology in Austria, home of one of the research teams that discovered and reported them to Intel. That’s “one of” because there are no fewer than three other teams acknowledged by Graz that independently discovered and reported these vulnerabilities over the past few months. We’ve assembled a rough timeline of events, with the aid of WIRED’s research:

A GTX 1080 Ti today costs the same as an entire PC build in 2017 – and one containing said 1080 Ti, at that. RAM today costs 2-4x its price in 2016 and 2017. SSDs, at best, have stagnated; at worst, some have increased in price marginally.

Today, we’re benchmarking a 2017 “MSRP” PC build versus a 2018 current-price PC build, using a $1500 budget. Our objective was to see how far we could push performance at around $1500, using only new components, when comparing the best prices of yesteryear versus the prices of today. If there must be a point to this content, the primary takeaway is to avoid purchasing new GPUs at prices so far beyond MSRP that they enter old flagship categories.

As for components, we’re using Intel as a baseline, as platform scalability makes more sense when tested between the same architectures (going to Ryzen, for instance, would make for better performance in Blender, but worse performance in games, thus killing the point of a like-for-like dollar-stretching benchmark). Intel has also had the most severe price swings in the past year, whereas AMD has remained largely steady at launch Ryzen prices and has often dipped well under them; Intel has remained at or north of MSRP.

Our builds are as follows:

PC versus console is an ancient debate, long discussed by the wisest and most scholarly of YouTube commenters. PCs are described as expensive, bulky, and difficult to assemble or work with, while consoles are called underpowered, underperforming systems that hold game development back for the duration of each generation. The pro-console responses to our first Xbox One X tests usually boiled down to: “it’s still better than a $500 PC.”

It’s a reasonable argument, and it’s the basis on which consoles are sold these days. By popular demand*, then, we’ve built a $500 PC to compare to the Xbox One X (list price: $500) in performance. We tested whether the 4K-capable Xbox One X is “better” than an equivalently priced PC, judging by framerates in two of the Xbox’s first batch of 4K-enabled games, Destiny 2 and Assassin’s Creed: Origins.

Given the recent insanoland surge in RAM and GPU prices, the argument is more poignant than ever. DIY PCs stand to lose marketshare if people can’t afford to build a cheap machine, and so we thought we’d use our new in-house software to benchmark a low-end PC and an Xbox One X.

We spent the whole of 2017 complaining about airflow in cases, and it seems that some manufacturers have heard our complaints for 2018. Cooler Master has revamped its H500P to now feature a mesh face, and has also resolved significant build quality concerns with the front and top panels. Enermax rebuilt its Saberay to use mesh front and top panel inserts (optionally), a major step forward. Lian Li put out the best case at this year’s show, focusing on both looks and airflow (with two different models).

This is our review of the best cases of 2018, CES edition, continuing a now six-year (!) tradition of “Best Case” coverage from CES. We started CES case round-ups in 2012, and have advanced them significantly since. Our demands have also changed significantly, as we look more toward function-focused designs that can artfully integrate ease-of-installation features.

This content piece was highly requested by the audience, although there is presently limited point to its findings. Following the confluence of the Meltdown and Spectre exploits last week, Microsoft pushed a Windows security update that sought to fill some of the security gaps, one speculated to cause a performance dip of between 5% and 30%. As of today, Intel has not yet released its microcode update, which means it is largely folly to run the benchmarks in this content piece – that said, there is merit to the exercise, but it must be looked at from the right perspective.

From the perspective of advancing knowledge and building a baseline for the next round of tests – those which will, unlike today’s, factor in microcode patches – we must eventually run the tests being run today. This will give us a baseline for performance, and will grant us two critical opportunities: (1) We may benchmark baseline, pre-Windows-patch performance, and (2) we can benchmark post-patch performance, pre-microcode. Both will allow us to see the isolated impact of Intel’s firmware update versus Microsoft’s software update. This is important, and alone makes the endeavor worthwhile – particularly because our CPU suite is automated anyway, so there’s no big time loss, despite CES looming.

Speaking of which, we only had time to run one CPU through the suite, and only with a few games, as, again, CES is looming. This is enough for now, though, and should sate some demand and interest.
