I'm not sure why the hotel phone rang – a loud, cursed beige thing – at 11PM. I was asleep; it was 11AM on the East Coast, about bedtime locally, and the woman speaking to me was doing so in Chinese. There's something especially vexing about trying to come to from a Snorlax-like slumber while on the receiving end of an unfamiliar language. I sat there in silence for a moment, trying to piece together what she'd just said, then realized it was no use – “English?”

She laughed. I said “It's OK,” phonetically stammered out “may qwan qi” – something I'd learned a few hours prior, though not how to spell it – and then we hung up. This curious episode was matched moments later, when one of the hotel staff knocked (loudly) on the door. I still wasn't sure of the time, and figured it was room service: “Later?”

Sticking to single words seemed the best bet.

She knocked again. I cracked it open and was handed a lighter, and she was whisked away by the darkness of the hall. After looking at the thing for a moment, I put it on the bathroom counter and returned to bed.

It's been a few months since our “Ask GN” series had its last installment. We got eleven episodes deep, then plunged into the non-stop game testing and benchmarking of the fourth quarter. At last, following fan requests and interest, we've proudly resurrected the series – not the only thing resurrected this week, either.

So, amidst the resurrection of Games for Windows Live and RollerCoaster Tycoon's re-re-announcement of mod support, we figured we'd brighten the week with something more promising: cherry-picked DirectX & Vulkan topics, classic GPU battles, and power supply testing questions. There's a bonus question at the end, too.

Thermal testing for cases, coolers, CPUs, and GPUs requires very careful attention to methodology and test execution. Without proper controls for ambient or other variables within a lab/room environment, it's exceedingly easy for tests to vary to a degree that effectively invalidates the results. Cases and coolers are often fighting over one degree (Celsius) or less of separation, so having strict tolerances for ambient and active measurements of diodes and air at intake/exhaust helps ensure accurate data.

We recently put our methodology to the test by borrowing time on a local thermal chamber – a controlled environment – and checking our delta measurements against it. GN's thermal testing is conducted in a lab on an open-loop HVAC system; we measure ambient constantly (second-to-second) with thermocouples, then subtract those readings from diode readings to create a delta value. For the thermal chamber, we followed identical methodology within a more tightly controlled environment. The goal was to determine whether the delta value within the chamber paralleled the delta value achieved in our own (open air) labs, within reasonable margin of error; if so, we'd know our testing properly accounts for ambient and other variables.
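To make that subtraction concrete, here's a minimal sketch of the delta calculation, assuming time-aligned, once-per-second diode and ambient logs; the tolerance value, function names, and sample readings are hypothetical illustrations, not GN's actual tooling or data.

```python
# Minimal sketch of the ambient-corrected delta-T methodology described above.
# Assumes two equal-length, time-aligned logs sampled once per second:
# diode temperatures (C) from the component under test, and ambient
# temperatures (C) from intake thermocouples. All values are made up.

AMBIENT_TARGET_C = 20.0      # lab / chamber setpoint
AMBIENT_TOLERANCE_C = 1.0    # hypothetical allowed drift before a run is re-done

def delta_over_ambient(diode_c, ambient_c):
    """Subtract second-to-second ambient from diode readings to get delta-T."""
    if len(diode_c) != len(ambient_c):
        raise ValueError("logs must be time-aligned and equal length")
    return [d - a for d, a in zip(diode_c, ambient_c)]

def ambient_within_tolerance(ambient_c):
    """Flag runs where ambient drifted too far from the setpoint."""
    return all(abs(a - AMBIENT_TARGET_C) <= AMBIENT_TOLERANCE_C for a in ambient_c)

# Example: a ~62 C diode reading at ~20.3 C ambient works out to ~41.7 C over ambient.
diode = [61.8, 62.0, 62.1]
ambient = [20.2, 20.3, 20.3]
if ambient_within_tolerance(ambient):
    deltas = delta_over_ambient(diode, ambient)
    print(f"mean delta-T: {sum(deltas) / len(deltas):.2f} C")
```

The point of comparing deltas rather than raw diode temperatures is that a matching delta in the chamber and in the open-air lab indicates the ambient subtraction is doing its job.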

The chamber used has climate control functions that include temperature settings. We set the chamber to match our normal lab temps (20C), then checked carefully where the intake and exhaust are set up within the chamber. This particular unit has slow, steady intake from the top that helps circulate air by pushing it down to an exhaust vent at the bottom. Otherwise, it'd just turn into an oven, as the system's rising temperatures would increase ambient. This still happens to some degree, but a control module on the thermal chamber helps adjust and regulate the 20C target as the internal temperature demands. The control module is also the most expensive part; our chaperone told us that the units cost upwards of $10,000 – and that's for a 'budget-friendly' approach.
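We don't know how the chamber's control module is implemented internally, but conceptually it acts as a feedback loop that ramps cooling as the system under test heats the air. A purely hypothetical proportional-control sketch (not the chamber's actual firmware) might look like this:

```python
# Hypothetical illustration only: a bare-bones proportional control loop that
# increases cooling output as internal temperature climbs above a 20 C setpoint.
# The real chamber uses a commercial control module; this is not its firmware.

SETPOINT_C = 20.0
GAIN = 0.5  # hypothetical proportional gain

def cooling_duty(internal_temp_c):
    """Return a 0-1 cooling duty cycle proportional to the overshoot above setpoint."""
    error = internal_temp_c - SETPOINT_C
    return max(0.0, min(1.0, GAIN * error))

# As the system under test warms the chamber, cooling ramps up to hold 20 C.
for temp in (20.0, 20.5, 21.0, 22.0):
    print(f"internal {temp:.1f} C -> cooling duty {cooling_duty(temp):.2f}")
```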

 

I'll be honest: This post started, as most site updates do, because we're running behind on other, deeper content. We've got some charts-heavy benchmarks lined up through the week, not the least of which is tomorrow's exhaustive wattage consumption analysis. This stuff takes time to do.

But we do at least one “state of the site” style update each year. We're due for another. These have, for years, served as a means to thank our readers and staff, to highlight accomplishments and things we've learned, and to publicize some of our loose plans for the coming year.

Following months of nonstop, long-form content – and the subsequent imperialistic takeover of additional rooms as lab or studio space – we have seen ~143% growth in annual pageviews. From January 2015 through now, the end of November, we're sitting at just under 7,000,000 pageviews. Last year's same-period pageview count was around 3,000,000 (still a tremendous feat for a small outlet like ours). Our YouTube subscriber base has grown from around 8,000 to ~22,000 subscribers over the same period, year-over-year. By the way, you should subscribe if you haven't. Views-wise, total views for the period have grown nearly 100% year-over-year.

With PAX now behind us, we've returned to our newfound effort of addressing direct reader questions via YouTube and Twitter comments. This new series has been dubbed “Ask GN” and, to our great satisfaction, has thus far yielded excellent discussion points on current topics. A couple of article ideas have emerged from the questions, too, so keep them coming!

Episode 3's list included discussion of open vs. closed liquid cooling loops, brief cable management tips, device controllers on motherboards, and whether or not a motherboard impacts the gaming experience.

I've never before had GN's lab so fully equipped. It's an exciting period of growth for us. For the first time in the site's seven-year history, it feels appropriate to slap the “lab” label on our multi-room testing setup. Following weeks of intensive cleaning and organization efforts, we've now got shelving units installed (all grounded and ESD compliant) that house motherboards, video cards, CPUs, and more – much of it bought out of pocket – and a complex network of systems.

The brain of the network is our rendering rig, which sits opposite my main production system. The rendering rig is used exclusively to render and edit videos for GamersNexus, with the primary user being GN's Keegan Gallick. The system hosts RAID HDDs that are utilized for all of our test data, video media, and photo media.

We put together a quick video showcasing the rig:

Our publication schedule is constantly littered with research-intensive articles. The workload is split – Michael Kerns heads up feature posts like keyboard round-ups, Keegan Gallick assists with sustaining news posts and videos, and Patrick Lathan handles social media and some posts. This is done intentionally, as I'm normally pulling my (abundant) hair out trying to devise new testing methodology for upcoming products. It's fun, though, and we do our best to split resources well enough that the site is sustained during times of intense research.

This week's slow-down on major content posts has allowed for some behind-the-scenes work on a new benchmarking platform – an X99 rig using an i7-5930K and DDR4 memory for SLI and CrossFire benchmarking. System integrator iBUYPOWER had some spare X-rev lab samples left over, so with their support and our own investment in other hardware, the site will soon be able to produce SLI tests without concern of bottlenecking from the lane and processing limitations of the 4790K CPU.

We've been hard at work with some behind-the-scenes efforts lately. As the site has continued to grow, we've found more room in the budget to make upgrades in necessary areas to allow continued growth; we've also allocated some funding to review hardware, camera / convention equipment, and staffing power.

If we were “on the map” before our GTA V and Witcher coverage, we've become a much more noticeable – but still medium-sized – dot on that map. GTA V's launch saw our day-1 publication of a GPU benchmark, followed rapidly by texture comparisons, a heavily-trafficked graphics optimization guide, and a CPU benchmark. All told, these items helped the site exceed 1,000,000 pageviews in a one-month period for the first time. That's huge news for us, as we've traditionally rested in the 300-500k pageviews-per-month range. Much of that momentum persisted through The Witcher 3, for which we offered the same types of content (GPU benchmark, graphics optimization guide, and more) with slightly lower inbound traffic.

As we ramp into GDC and PAX East, we're using the gap in review time to overhaul our testing methodology and test platforms. Yesterday's post revealed our open-air GPU testing station, a direction that'll drastically improve our efficiency when testing multiple graphics configurations. Today, we're looking at the new case review test bench. The site has grown substantially in the past two years; we'll no longer be using the same bench for testing all components, and will instead use individual systems for testing each component. This will reduce the chance of test error, improve efficiency, and allow each of our writers to specialize in an area.

GN's Staff Writer & Social Media Manager, Patrick Lathan, will be handling most ATX and micro-ATX case reviews going forward. As such, I dropped off a load of parts for Patrick's new test bench, which will be put to immediate use with NZXT's S340. Following his review of the S340, we'll look at Be Quiet's Silent Base 800.
