We asked Intel why Kaby Lake-X exists at its recent press day, challenging that the refreshed 7700 & 7600 CPUs can’t be used in LGA1151 sockets, that they aren’t significantly different from their predecessors, and that LGA2066 boards are far more expensive. The socket and chipset alone have a higher BOM cost for manufacturers than 200-series boards, and that cost is passed on to consumers. The consumer also pays for components that will go unused, like the trace routing for half of the DIMMs (and the physical slots themselves).

But Intel gave us an answer to that query.

The Steam Summer Sale is upon us, and we’ve put together a list of some of the best deals. This year’s Summer Sale runs from June 22nd to July 5th, so there is time to pick up these or any other games that might be of interest.

Keeping marketing checked by reality is part of the reason that technical media should exist: part of the job is to filter out the adjectives and subjective language for consumers and get to the objective truth. Intel’s initial marketing deck contained a slide suggesting that its new X-series CPUs could run 3-way or 4-way GPUs for 12K Gaming. Those are their exact words: "12K Gaming," supported by orange demarcation for the X-series, whereas it is implicitly not supported (in the slide) on the K-SKU desktop CPUs. Setting aside how uncommon such a setup would be, "12K" as Intel defines it isn’t even a real resolution. Regardless, we’re using this discussion of Intel’s "12K" claims as an opportunity to benchmark two x8 GPUs on a 7700K against two x16 GPUs on a 7900X, with some tests disabling cores and boosting clocks. We have also received a statement from Intel to GamersNexus regarding the marketing language.

First of all, we need to define a few things: Intel’s version of 12K is not what you’d normally expect – in fact, it’s actually fewer pixels than 8K, so the naming is strongly misleading. Let’s break this down.
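To put rough numbers on that, here is a quick pixel-count comparison. It assumes Intel’s "12K" refers to three 4K panels arranged side by side (11520x2160), which is the most plausible way the figure gets reached; treat the snippet as an illustrative sketch rather than Intel’s own definition.

```python
# Rough pixel-count comparison: an assumed "12K" surround (three 4K panels
# side by side, 11520x2160) versus true 8K and a single 4K panel.
resolutions = {
    "Assumed '12K' surround (11520x2160)": (11520, 2160),
    "8K UHD (7680x4320)": (7680, 4320),
    "4K UHD (3840x2160)": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} megapixels")

# Assumed '12K' surround (11520x2160): 24.9 megapixels
# 8K UHD (7680x4320): 33.2 megapixels
# 4K UHD (3840x2160): 8.3 megapixels
```

By that math, the "12K" surround works out to roughly 25 megapixels against 8K’s 33 megapixels – fewer pixels than 8K despite the bigger number in the name.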

Our hardware news round-up for the past week is live, detailing some behind-the-scenes / early information on our thermal and power testing for the i9-7900X, the Xbox One X hardware specs, Threadripper's release date, and plenty of other news. Additional coverage includes final word on Acer's Predator 21 X, Samsung's 64-layer NAND finalization, GlobalFoundries' 7nm FinFET for 2018, and some extras.

We anticipate a slower news week for non-Intel/non-AMD entities this week, as Intel launched X299/Skylake-X and AMD is making waves with EPYC. Given the command both these companies have over consumer news, it's likely that other vendors will hold further press releases until next week.

Find the show notes below, written by Eric Hamilton, along with the embedded video.

Intel’s past few weeks have seen the company enduring the ire of a large portion of the tech community, perhaps undeservedly in some instances -- certainly deservedly in others. We criticized the company for its initial marketing of the 7900X – but then, we criticize nearly everyone for marketing claims that border on silly. “Extreme Mega-Tasking,” for instance, was Intel’s new invention.

But it’d be folly to assume that Skylake-X won’t perform. It’s just a matter of how Intel positions itself on pricing, particularly considering the imminent arrival of Threadripper. Skylake-X is built on a known and documented architecture and is accompanied by the usual platform roll-out, with the anomaly of Kaby Lake-X sharing that same platform.

Today, we're reviewing the Intel Core i9-7900X Skylake-X CPU, benchmarking it in game streaming (Twitch, YouTube) vs. Ryzen, in Blender & Premiere rendering, VR gaming, and standard gaming.

With AMD’s Threadripper line of CPUs on the horizon, the existing Ryzen R7 and R5 chips have seen significant price cuts. The R5 1600X has dropped below its launch price, which, coupled with the Editor’s Choice Award from our initial review, makes it an attractive foundation for a mid-range PC. We also found a slight discount on an X370 motherboard from MSI, a kit of DDR4 RAM, and a 500W PSU from EVGA.

EVGA GTX 1080 Ti Kingpin PCB & VRM Analysis

Published June 17, 2017 at 6:16 pm

When interviewing EVGA Extreme OC Engineer “Kingpin,” the term “dailies” came up – as in daily users, or “just gamers,” or generally people who don’t use LN2 to overclock their GPU. The GTX 1080 Ti Kingpin card is not a device built for “dailies,” but rather for extreme overclockers – people who are trying to break world records.

Cards like this – the Lightning would be included – do have a reason to exist. Criticism online sometimes calls such devices “pointless” for delivering the same overall out-of-box experience as nearly any other 1080 Ti, but those criticizing aren’t looking at it from the right perspective. A Kingpin, Lightning, or other XOC card is purchased to eliminate the need to perform hard mods to get a card up to speed. It’s usable out of the box as an XOC tool.

GN’s Camera Upgrade: 200Mbps, 4K60 Video

Published June 16, 2017 at 8:04 pm

It’s a far cry from our last “major” camera purchase, which consisted of about $3000 to buy a then-new Canon XA20 and shotgun mic. That was around 2013. Since then, we’ve invested thousands in audio equipment, sliders, tripods, and lighting – but as the video team’s skills and arsenal have grown, we’ve had one straggler: the camera. The XA20 was a fantastic camera to buy for our first major video equipment, replacing our previous Canon Vixia HF S20; the XA20 permitted 1080p60 uploads, put us on the map for video, and continues to be an absolute workhorse for road production. We’re planning to keep it around for multi-cam interview shooting in the future, alongside giving us an option for multiple video staff on-site at an event. Logistically, it makes good sense to keep the XA20 around – again, the thing is truly a workhorse, and I’d be lying if I didn’t acknowledge a sentimental attachment.

Although it may feel like one GTX 1080 Ti isn’t too different from the next, that’s only “true” when comparing the least meaningful metric: Framerate. Once we’ve established a baseline framerate for the actual GPU – that is, GP102 – there’s not going to be a whole lot of difference between most partner cards. The difference is in thermals and noise, and most people don’t go too in-depth on either subject. For our testing, we look at thermal performance on various board components (not just the GPU), we look at noise, and we look at noise-normalized thermal performance (every card at 40dBA) for cooling efficiency testing.
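As a rough illustration of what noise normalization involves – this is a minimal sketch, not GN’s actual tooling, and the instrument calls (read_spl_dba, set_fan_duty, read_gpu_edge_temp) are hypothetical placeholders – the procedure amounts to dialing fan speed until the meter reads the target dBA, then logging steady-state thermals under load:

```python
import time

TARGET_DBA = 40.0   # noise level every card is normalized to
TOLERANCE = 0.5     # acceptable +/- dBA around the target

def normalize_to_target(set_fan_duty, read_spl_dba, duty=50.0):
    """Step fan duty up or down until measured noise sits at ~40 dBA."""
    set_fan_duty(duty)
    for _ in range(60):                       # bail out after 60 adjustments
        time.sleep(5)                         # let fan speed and SPL settle
        spl = read_spl_dba()                  # sound level at a fixed distance
        if abs(spl - TARGET_DBA) <= TOLERANCE:
            return duty
        duty += 1.0 if spl < TARGET_DBA else -1.0
        duty = max(0.0, min(100.0, duty))
        set_fan_duty(duty)
    raise RuntimeError("could not settle at target dBA")

def average_steady_state_temp(read_gpu_edge_temp, minutes=25):
    """Sample GPU temperature once per second over a steady-state load window."""
    samples = []
    for _ in range(minutes * 60):
        samples.append(read_gpu_edge_temp())
        time.sleep(1)
    return sum(samples) / len(samples)
```

With every card pinned to the same noise level, the remaining temperature differences reflect cooler efficiency rather than how aggressively a vendor tuned its fan curve.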

EVGA’s SC2 Hybrid is an SC2 in every aspect except cooling. The PCB is the same, the clocks are the same, and so the gaming performance is the same. For this reason alone, there’s no point in testing FPS. If framerates are all you care about, check our SC2 review.

This episode of Ask GN is our first since returning from Taiwan, where we spent nearly two weeks covering Computex and other on-site events. In the time since, the internet has erupted in questions pertaining to PCIe lanes, specifically as they relate to Intel CPUs and chipsets. We address some of those questions in this Ask GN, explain differences in CPU and PCH HSIO lanes, and then get into some other questions.

Those other questions, for the interested, are all timestamped below. They cover whether AIB partner cards are "worth it" for their power design versus an FE card, repasting CPU TIM, and coolers for the new, larger CPU sockets.

Video below:
