Hardware Guides

GPU manufacturer Visiontek is old enough to have accumulated a warehouse of unsold, refurbished cards. Once in a while, they’ll clear stock by selling them off in cheap mystery boxes. It’s been a long time since we last reported on these boxes, and GPU development has moved forward quite a bit, so we wanted to see what we could get for our money. PCIe cards were $10 for higher-end and $5 for lower, and AGP and PCI cards were both $5. On the off chance that Visiontek would recognize Steve’s name and send him better-than-average cards, we placed two identical orders, one in Steve’s name and one in mine (Patrick). Each order was for one better PCIe card, one worse, one PCI, and one AGP.

The AMD R9 290X, a 2013 release, was the once-flagship of the 200 series, later superseded by the 390X refresh, (sort of) the Fury X, and eventually the RX-series cards. The R9 290X typically shipped with 4GB of memory – the 390X refresh later made 8GB somewhat commonplace – and was a strong performer for early 1440p gaming and high-quality 1080p gaming. The goalposts have moved, of course, as games have become more demanding to render, but the 290X is still a strong enough card to warrant a revisit in 2019.

The R9 290X still has some impressive traits today, and those influence results to the point of being clearly visible at certain resolutions. One of the most noteworthy features is its 64 ROPs – the units where shaded output is converted into a bitmapped image – alongside its 176 TMUs. The ROPs help performance scale as resolution increases, something that also correlates with higher anti-aliasing values (same idea – sampling more times per pixel or drawing more pixels). For this reason, we’ll want to pay careful attention to performance scaling at 1080p, 1440p, and 4K versus some other device, like the RX 580. The RX 580 is a powerful card for its price-point, often managing comparable performance to the 290X while running half the ROPs (32) and 144 TMUs, but the 290X can close the gap (mildly) at higher resolutions, as the rough fill-rate sketch below illustrates. This isn’t particularly useful to know, but it is interesting, and it illustrates how specific parts of the GPU can change the performance stack under different rendering conditions.
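
To make the ROP reasoning concrete, here’s a minimal back-of-the-envelope sketch of theoretical pixel fill rate (ROPs multiplied by core clock). The clocks and the simple fill-rate model are our illustrative assumptions; real performance also depends on memory bandwidth, shader load, and more:

```python
# Rough theoretical pixel fill rate: ROPs * core clock (GHz) ~= Gpixels/s.
# Clocks are approximate reference values and are assumptions for illustration.
cards = {
    "R9 290X": {"rops": 64, "clock_ghz": 1.00},  # ~1000MHz reference clock (assumed)
    "RX 580":  {"rops": 32, "clock_ghz": 1.34},  # ~1340MHz reference boost (assumed)
}

for name, spec in cards.items():
    fill_rate = spec["rops"] * spec["clock_ghz"]  # Gpixels/s
    print(f"{name}: ~{fill_rate:.0f} Gpixels/s theoretical pixel fill rate")

# The 290X's extra ROPs work out to roughly 64 vs. ~43 Gpixels/s, which is
# why it can close the gap at 1440p/4K or with heavy anti-aliasing, where
# more pixels (or samples) must be written out per frame.
```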

Today, we’re testing with a reference R9 290X that’s been run both stock and overclocked, giving us a look at bottom-end performance and at average partner model or OC performance. This should cover most of the spectrum of R9 290X cards.

Today’s benchmark is a case study by the truest definition of the phrase: We are benchmarking a single sample of an overweight video card to test the performance impact of its severe sag. The Gigabyte GTX 1080 Ti Xtreme was poorly received by our outlet when we reviewed it in 2017, primarily for its needlessly large size, which amounted to worse thermal and acoustic performance than smaller, cheaper competitors. The card is heavy and constructed using through-bolts and a complicated assortment of hardware, whereas the competition achieved smaller, more effective designs that didn’t sag.

As is tradition, we put the GTX 1080 Ti Xtreme in one of our production machines alongside some of the other worst hardware we’ve worked with, and so the 1080 Ti Xtreme was in use in a “real” system for about a year. That amount of time has allowed nature – mostly gravity – to take its course, slowly pulling the 1080 Ti Xtreme apart. Now, after a year of forced labor in our oldest rendering rig, we get to see the real side-effects of a needlessly heavy card that’s poorly reinforced internally. We’ll be testing the impact of GPU sag in today’s content.

We’re revisiting the Intel i7-7700K today, following its not-so-distant launch in January of 2017 for about $340 USD. The 7700K was shortly followed by the i7-8700K – still selling well – which launched later in the same year with an additional two cores and four threads. That was a big gain, and one which stacked atop the 7700K’s already relatively high overclocking potential and regular 4.9 to 5GHz OCs. This revisit looks at how the 7700K compares to modern Coffee Lake 8000- and 9000-series CPUs (like the 9700K), alongside modern Ryzen CPUs from the Zen+ generation.

For a quick reminder of 7700K specs versus “modern” CPUs – or, at least, as “modern” as a launch one year later can be – remember that the 7700K was the last of the 4C/8T parts in the i7 line, still using hyper-threading to hit 8T. The 8700K was the next launch in the family, releasing at 6C/12T and changing the lineup substantially at a similar, albeit slightly higher, price-point. The 9900K was the next remarkable launch, but it exited the price category and became more of a low-end HEDT CPU. The 9700K is the truer follow-up to the 7700K, though it oddly regresses to an 8T configuration from the 8700K’s 12T, this time using 8 physical cores for its 8 threads rather than 6 hyper-threaded cores. Separately, and critically, the 7700K operated with 8MB of total cache, as opposed to 12MB on the 9700K. The price also changed, with the 7700K closer to $340 and the 9700K at $400 to $430, depending. Even taking the $400 mark, that’s more than an adjustment for inflation.
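
As a quick sanity check on that inflation point, here’s a minimal sketch; the ~2% annual inflation rate is an assumption for illustration only, not an official CPI figure:

```python
# Compare the 9700K's launch price against the 7700K's price adjusted for
# inflation. The ~2% annual rate is assumed for illustration.
launch_price_7700k = 340   # USD, January 2017
launch_price_9700k = 400   # USD, late 2018 (low end of the range above)
annual_inflation = 0.02    # assumed

adjusted_7700k = launch_price_7700k * (1 + annual_inflation) ** 2  # ~2 years later
premium = (launch_price_9700k / adjusted_7700k - 1) * 100

print(f"Inflation-adjusted 7700K price: ~${adjusted_7700k:.0f}")   # ~$354
print(f"9700K premium over adjusted price: ~{premium:.0f}%")       # ~13%
```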

We’re revisiting the 7700K today, looking at whether buyers truly got the short straw with the subsequent and uncharacteristically rapid release of the 8700K. Note also, however, that the 8700K didn’t really have a proper release at the end of 2017. That was more of a paper launch, with few products actually available. Regardless, the feeling is the same for the 7700K buyer.

We already reviewed an individual NVIDIA Titan RTX over here, used first for gaming, overclocking, thermal, power, and acoustic testing. We may look at production workloads later, but that’ll wait. We’re primarily waiting for our go-to applications to add RT and Tensor Core support for 3D art. After replacing our bugged Titan RTX (the one that was clock-locked), we were able to proceed with SLI (NVLink) testing for the dual Titan RTX cards. Keep in mind that NVLink is no different from SLI when using these gaming bridges, aside from increased bandwidth, and so we still rely upon AFR and independent resources.

As a reminder, these cards really aren’t built for the way we’re testing them. You’d want a Titan RTX card as a cheaper alternative to Quadros, but with the memory capacity to handle heavy ML/DL or rendering workloads. For games, that extra (expensive) memory goes unused, thus diminishing the value of the Titan RTX cards in the face of a single 2080 Ti.

This is really just for fun, in all honesty. We’ll look at a theoretical “best” gaming GPU setup today, then talk about what you should buy instead.

Finding the "best" workstation GPU isn't as straightforward as finding the best case, best gaming CPU, or best gaming GPU. While games typically scale reliably from one to the next, applications can deliver wildly varying performance. Those gains and losses could be chalked up to architecture, drivers, and whether or not we're dealing with a true workstation GPU versus a gaming GPU trying to fill in for workstation purposes.

In this content, we're going to be taking a look at current workstation GPU performance across a range of tests to figure out if there is such a thing as a champion among them all. Or, at the very least, we'll figure out how AMD differs from NVIDIA, and how the gaming cards differ from their workstation counterparts. Part of this will look at Quadro vs. RTX or GTX cards, for instance, and WX vs. RX cards for workstation applications. We have GPU benchmarks for video editing (Adobe Premiere), 3D modeling and rendering (Blender, V-Ray, 3ds Max, Maya), AutoCAD, SolidWorks, Redshift, Octane Bench, and more.

Though NVIDIA's Quadro RTX lineup has been available for a few months, review samples have been slow to escape NVIDIA's grasp, and if we had to guess why, it's likely because few software solutions are currently available that can take advantage of the new features. That excludes deep-learning tests, which can benefit from the Tensor cores, but for optimizations derived from the RT cores, we're still waiting. It seems likely that Chaos Group's V-Ray will be one of the first plugins to hit the market with support for NVIDIA's RTX, though Redshift, Octane, Arnold, Renderman, and many others have planned support.

The great thing for those planning to go with a gaming GPU for workstation use is that where rendering is concerned, the performance between gaming and workstation cards is going to be largely equivalent. Where performance can improve on workstation cards is with viewport performance optimizations; ultimately, the smoother the viewport, the less tedious it is to manipulate a scene.

Across all of the results ahead, you'll see that there are many angles from which to view workstation GPUs, and that there isn't really such a thing as one-size-fits-all – not like there is on the gaming side. There is such a thing as an ultimate choice, though, so if you're not afraid of spending substantially more than the gaming equivalents for the best performance, there are models vying for your attention.

As we get into the holiday spirit here at GN, it’s time for our year-end round-ups and best of series—probably some of our favorite content. These guides provide a snapshot of what the year had to offer in certain spaces, like SSDs, for instance. You can check our most recent guides for the Best Cases of 2018 and Best CPUs of 2018.

These guides will also help users navigate the overwhelming amount of Black Friday and Cyber Monday marketing ahead of us all. SSD prices have been especially good lately, and the holidays should certainly net opportunities for even better deals.

That said, buying something just because it’s cheap isn’t ever a good idea, really; better to know what’s best first, then buy cheap—or cheaper than usual, anyway. This guide will take the legwork out of distinguishing what the year’s best SSDs are based on use case and price. Today, we're looking at the best SSDs for gaming PCs, workstations, budget PC builds, and for cheap, high-capacity storage. 1TB SSDs are more affordable than ever now, and we'll explore some of those listings.

As we continue our awards shows for end of year (see also: Best Cases of 2018), we’re now recapping some of the best and worst CPU launches of the year. The categories include best overall value, most well-rounded, best hobbyist production, best budget gaming, most fun to overclock, and biggest disappointment. We’ll be walking through a year of testing data as we recap the most memorable products leading into Black Friday and holiday sales. As always, links to the products are provided below, alongside our article for a written recap. The video is embedded for the more visual audience.

We’ll be mailing out GN Award Crystals to the companies behind the most important products of the year. The award crystal is a 3D laser-engraved GN tear-down logo with extreme attention to detail and, although the products have to earn the award, you can buy one for yourself at store.gamersnexus.net.

As a reminder here, data isn’t the focus today. We’re recapping coverage, so we’re pulling charts sparingly and as needed from a year’s worth of CPU reviews. For those older pieces, keep in mind that some of the tests are using older data. For full detail on any CPU in this video, you’ll want to check our original reviews. Keep in mind that the most recent review – that’ll be the 9600K or 9980XE review – will contain the most up-to-date test data with the most up-to-date Windows and game versions.

Awards Show: Best & Worst PC Cases of 2018


It’s time for the annual GN Awards series, starting off with the best – and worst – cases of 2018. Using our database of over 160 case test results, we crawled through our reviews for the year to pull the cases with the best out-of-the-box thermals, the best noise levels, the best quality on a budget, the best design, and the best all-around performance, plus the most overhyped and most disappointing cases. We hit every price category in this round-up and cover cases that are both subjectively and objectively good. Links will be provided for anyone shopping this season.

Leading into Black Friday and Cyber Monday, let's walk through the best and worst PC cases of 2018.

Every manufacturer featured in this content will receive one of our Large GN Awards for the Best Of categories – no award for the worst categories, sadly. The GN Award Crystal is only given out for prestige, featuring a 3D laser-engraved GN tear-down logo with fine detail, like VRM components, fans, and electrical circuitry in the design. Although manufacturers have to earn their award, you can buy one for yourself on store.gamersnexus.net in large and medium sizes.

The RTX 2080 Ti failures aren’t as widespread as they might have seemed from initial reddit threads, but they are absolutely real. When discussing internally whether we thought the issue of artifacting and dying RTX cards had been blown out of proportion by the internet, we had two frames of mind: On one side, the level of attention did seem disproportionate to the size of the issue, particularly as RMA rates are within the norm. Partners are still often under 1% and retailers are under 3.5%, which is standard. The other frame of mind is that, actually, nothing was blown out of proportion for people who spent $1250 and received a brick in return. For those affected buyers, the artifacting is absolutely a real issue, and it deserves real attention.

This content marks the closing of a storyline for us. We published previous videos detailing a few of the failures on our viewers’ cards (borrowed by GN on loan), including an unrelated issue of a 1350MHz lock and BSOD issue. We also tested cards in our livestream to show what the artifacting looks like, seen here. Today, we’re mostly looking at thermals, firmware, the OS, downclocking impact, and finding a conclusion of what the problem isn’t (rather than what it 100% is).

With over a dozen cards mailed in to us, we had a lot to sort through over the past week. This issue certainly exists in a very real way for those who spent $1200+ on an unusable video card, but it isn’t affecting everyone. It’s far from “widespread,” fortunately, and our present understanding is that RMA rates remain within reason for most of the industry. That said, NVIDIA’s response times to some RMA requests have been slow, from what our viewers have expressed, and replacements can take upwards of a month given supply constraints in some regions. That’s a problem.
