Steve Burke

Steve started GamersNexus back when it was just a cool name, and now it's grown into an expansive website with an overwhelming number of features. He recalls his first difficult decision with GN's direction: "I didn't know whether or not I wanted 'Gamers' to have a possessive apostrophe -- I mean, grammatically it should, but I didn't like it in the name. It was ugly. I also had people who were typing apostrophes into the address bar - sigh. It made sense to just leave it as 'Gamers.'"

First world problems, Steve. First world problems.

Our viewers have long requested that we add standardized case fan placement testing to our PC case reviews. We’ve previously talked about why this is difficult – largely logistically, as it’s neither free in cost nor free in time – but we are finally in a good position to add the testing. The tests clearly must offer some value, we think, as they have been among our most-requested test items over the past two years. We ultimately want to act on community interests and explore what the audience is curious about, and so we’ve added tests for standardized case fan benchmarking and for noise-normalized thermal testing.

Normalizing for noise and running thermal tests has been our main, go-to benchmark for PC cooler testing for about 2-3 years now, and we’ve grown to really appreciate the approach. Coolers are simpler than cases, as there’s not much in the way of “fan placement,” and normalizing to a 40dBA level has allowed us to determine which coolers cool most efficiently under identical noise conditions. As we’ve shown in our cooler reviews, this bypasses the issue where a cooler with significantly higher RPM always chart-tops. It’s not exactly fair if a cooler at 60dBA “wins” the thermal charts versus a bunch of coolers at, say, 35-40dBA, and so normalizing the noise level allows us to see whether proper differences emerge when the user is subjected to the same “volume” from their PC cooling products. We have also long used noise-normalized testing for GPU cooler reviews. It’s time to introduce it to case reviews, we think, and we’ll be doing that by sticking with the stock case fan configuration and reducing case fan RPMs equally to meet the target noise level (CPU and GPU cooler fans remain unchanged, as these most heavily dictate CPU and GPU thermals; they are held at fixed speeds throughout).
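To illustrate the normalization step, here is a minimal Python sketch of the logic, assuming hypothetical helpers read_dba() (a sound level meter reading at a fixed distance) and set_case_fan_pwm() (a single PWM duty applied equally to all stock case fans). This is only an illustration of the idea under those assumptions, not our actual bench tooling or procedure.

```python
# Minimal sketch of noise normalization: step all case fans down in lockstep
# until the system measures roughly 40dBA. read_dba() and set_case_fan_pwm()
# are hypothetical stand-ins for a sound level meter and a fan controller;
# CPU and GPU cooler fans are intentionally left alone at their fixed speeds.

import time

TARGET_DBA = 40.0   # target noise level for the normalized tests
TOLERANCE = 0.2     # how close to the target counts as "normalized"


def normalize_case_fans(read_dba, set_case_fan_pwm, duty=100, step=1):
    """Reduce all case fans' PWM duty equally until the target dBA is reached."""
    while duty > 0:
        set_case_fan_pwm(duty)      # same duty applied to every stock case fan
        time.sleep(5)               # allow fan speeds and noise to settle
        level = read_dba()
        if level <= TARGET_DBA + TOLERANCE:
            return duty, level      # normalized: run thermal logging from here
        duty -= step
    return duty, read_dba()         # fans at minimum and still above target
```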

This content piece started with Buildzoid’s suggestion that we install a custom VBIOS on our RX 570 for memory timing tuning tests. Our card proved temperamental with the custom VBIOS, so we ended up instead – for now – testing AMD’s built-in timing level options in the drivers. AMD’s GPU drivers have a drop-down featuring “automatic,” “timing level 1,” and “timing level 2” settings for Radeon cards, none of which has any formal definition within the drivers. We ran an RX 570 and a Vega 56 card through most of our tests with these timing options, using dozens of test passes across the 3DMark suite (for each line item) to minimize the error margins and narrow the range of statistically significant results. We also ran “real” gaming workloads in addition to these 3DMark passes.
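As a rough illustration of why dozens of passes per line item matter, the Python sketch below computes each configuration's mean score and an approximate 95% interval; more passes shrink that interval, which is what lets small timing-level differences stand out from run-to-run variance. The score lists are placeholders, not our measured results.

```python
# Rough sketch: averaging many benchmark passes shrinks the standard error,
# which narrows the interval around each configuration's mean score. The
# score lists below are placeholders, not measured results.

from math import sqrt
from statistics import mean, stdev


def mean_with_interval(scores, z=1.96):
    """Return (mean, approximate 95% half-width) for a list of pass scores."""
    half_width = z * stdev(scores) / sqrt(len(scores))
    return mean(scores), half_width


automatic = [7012, 7035, 6998, 7021, 7044, 7008]   # placeholder 3DMark scores
level_2 = [7088, 7102, 7075, 7091, 7110, 7083]     # placeholder 3DMark scores

for name, runs in (("automatic", automatic), ("timing level 2", level_2)):
    m, hw = mean_with_interval(runs)
    print(f"{name}: {m:.0f} +/- {hw:.0f}")

# If the two intervals do not overlap, the difference is unlikely to be just
# run-to-run variance (a quick heuristic, not a formal significance test).
```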

Were we to step it up, the next goal would be to use third-party tools to manually tune the memory timings, whether GDDR5 or HBM2, or custom VBIOSes on cards that are more stable. For now, we’ll focus on AMD’s built-in options.

AMD didn’t claim that its R7 2700X Gold Edition would be special in any frequency or binning sense of the word, but exposure to the Intel i7-8086K has obviously led us to project our hopes for a binned part onto AMD. This is, of course, a fault of our own and not of AMD’s, as it’s not like the company claimed binning, but we still wanted to see if we could get a golden Gold Edition sample. In this content, we’ll establish that the special 50th anniversary edition 2700X doesn’t come with higher clocks than stock (but it’s not like AMD claimed otherwise), then attempt to find more overclocking headroom than our original 2700X and 2700 samples.

For the most part, this CPU was released as a commemorative item. It has a laser engraving of CEO Lisa Su’s signature (not an actual signature), which clearly marks it as a display piece rather than a special bin. Despite the traditional 50th anniversary gift being gold, it would seem the 2700X Gold Edition is named more for its bundling with The Division 2 Gold Edition and a 1-year season pass, alongside World War Z. If you were buying these anyway, it’s not a bad deal. If not, you’d still be better off buying a 2700 and overclocking it – purely from a financial standpoint – than spending the extra money on the Gold Edition. That said, you wouldn’t get the box or laser-etched name, so once again, this is very obviously priced higher for AMD purists and fans.

One of our most popular videos of yore talks about the GTX 960 4GB vs. GTX 960 2GB cards and the value of choosing one over the other. The discussion continues today, but is more focused on 3GB vs. 6GB or 4GB vs. 8GB comparisons. Now, looking back at 2015’s GTX 960, we’re revisiting the card with locked frequencies to compare memory capacities. The goal is to look at both framerate and image quality to determine how well the 2GB card has aged versus the 4GB card.

A lot of things have changed for us since our 2015 GTX 960 comparison, so these results will obviously not be directly comparable to those from that time. We’re using different graphics settings, different test methods, a different OS, and much different test hardware. We’ve also improved our testing accuracy significantly, and so it’s time to take all of this new hardware and knowledge and re-apply it to the GTX 960 2GB vs. 4GB debate, looking into whether there was really a “longevity” argument to be made.

NVIDIA’s GTX 1650 launch was kept under tight secrecy, with drivers held back for “unification” reasons until the actual launch date. The GTX 1650 comes in variants ranging from 75W to 90W and above, meaning that some options will run without a power connector while others focus on boosted clocks and a higher power target and require a 6-pin connector. GTX 1650s start at $150, with this model costing $170, running a higher power target, offering more overclocking headroom, and potentially better challenging some of NVIDIA’s past-generation products. We’ll see how far we can push the 1650 in today’s benchmarks, including overclock testing to look at maximum potential versus a GTX 1660. We’re using the official, unmodified GTX 1650 430.39 public driver from NVIDIA for this review.

We got our card two hours before product launch and got the drivers at launch, but noticed that NVIDIA tried to push drivers heavily through GeForce Experience. We pulled them standalone instead.

