Z97 motherboards have been floating around for a little while now -- here's our round-up of them -- but we haven't had a chance to actually look at the Z97 chipset as a product. Z97 launches alongside the Devil's Canyon CPUs announced at GDC and will later host 5th Gen Broadwell CPUs. Devil's Canyon is due out shortly, though another Haswell Refresh chip (the i5-4690, among others) was recently posted and has seen minimal interest thus far; Broadwell is due out in 4Q14 or later and features a die-shrink to a 14nm fab process.

This is a SATA Express connector -- effectively dual SATA ports with additional power/ground.

Thus far, we know of Intel's Z97 and H97 chipsets and have heard no news of an "H91" or "B95" equivalent to last generation's H81 and B85. For this Intel 9-series (Broadwell-ready) chipset comparison, we'll look strictly at Z97 vs. H97 for gaming and overclocking purposes; the goal of this guide is to help PC builders determine which chipset best fits their objectives at each price point.

I wrote a similar chipset comparison for AMD FM2/FM2+ chipsets last week.

AMD announced on Monday its upcoming plans for "SkyBridge," due out in 2015, and custom AMD/ARM cores ("K12") due in 2016. The 2014 road map AMD laid out is promising across mobility and x86 applications (more on the latter in a future post). AMD is the first company to cut costs by designing a single motherboard platform for both ARM and x86 architectures, extending its potential reach to more consumers and continuing AMD's effort to let buyers keep their boards when upgrading CPUs. AMD has always owned a bit of a niche market (excepting the days when it dominated with the Athlon 64), but it is shifting its strategy toward differentiation from semiconductor-industry norms.


We've seen a lot of discussion spurred by Kingston's silent decision to switch its mainstream, consumer-targeted SSDNow V300 drive from synchronous to asynchronous NAND. In fact, on one of our PC builds that recommended the drive, a reader encouraged us to run updated performance benchmarks to validate the impact of the NAND switch. A recent article published by AnandTech went after the V300 fiercely, referencing user-submitted AS-SSD benchmark data from forums to highlight the theoretical performance hit.


Upon publication of Kristian's post on AnandTech, I called our Kingston contact to press on a few points and give the company a chance to defend its position. Unsurprisingly, Kingston supported the product readily; switching the NAND supply was done in favor of price, they said, and is the reason we've seen the V300 as low as $60-$70 at some retailers. The 19nm Toshiba Toggle-Mode 2.0 NAND in the original V300 either became more scarce or too expensive, so the company switched to Micron's 20nm asynchronous NAND for cost reasons.

I wanted to give everyone a quick update as to why hardware features have been quiet since the unveiling of our dual- vs. single-channel RAM test a couple of weeks back. There's been a lot of fuss lately about asynchronous NAND finding its way into a specific Kingston SSD that was previously known to use synchronous NAND. Most of the discussion was relegated to forums, where users ran somewhat haphazard benchmarks and concluded that the new model of the V300 (using asynchronous NAND) was severely underperforming the earlier model, which used a higher-quality NAND supply. I'll get into what this means in a moment.


In short, the NAND in the older V300 was Toshiba's 19nm Toggle-Mode 2.0 supply, which I've confirmed by opening our drive; the NAND used in the newer iteration of the V300 -- which has no listing in the specs to make the difference clear -- is Micron's 20nm asynchronous NAND. The switch to asynchronous NAND allows the drive to be produced far more cheaply, but carries an inherent performance detriment. If you're unsure of what I mean by "asynchronous" and "synchronous," it boils down to this: synchronous (including Toggle-Mode) NAND transfers data on both edges of a clock or strobe signal -- much like DDR memory -- while asynchronous NAND has no such clocked interface and moves data at a substantially lower rate, which caps the drive's real-world throughput.
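To illustrate the sort of test forum users ran, here's a rough sketch of a sequential-write pass -- not AS-SSD itself; the function name and parameters are hypothetical. Incompressible (random) data matters here because the V300's SandForce controller compresses data on the fly, which can mask the underlying NAND's speed with compressible test data:

```python
import os
import time

def seq_write_throughput(path, size_mb=256, block_kb=1024):
    """Write size_mb of incompressible (random) data in block_kb chunks,
    fsync to force it to the drive, and return throughput in MB/s."""
    block = os.urandom(block_kb * 1024)  # random data defeats compression
    blocks = (size_mb * 1024) // block_kb
    start = time.perf_counter()
    with open(path, "wb", buffering=0) as f:
        for _ in range(blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # ensure data actually hits the drive
    elapsed = time.perf_counter() - start
    os.remove(path)
    return size_mb / elapsed
```

A run with random data versus a run of repeated (highly compressible) blocks can differ dramatically on SandForce-based drives, which is why forum AS-SSD numbers (incompressible workload) looked so much worse than the spec-sheet figures.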

Memory tends to get overlooked when building a new system. Capacity and frequency steal the show, but beyond that, it's largely treated as a check-the-reviews component. Still, a few guidelines exist, like not mixing-and-matching kits and purchasing in matched pairs where dual-channel is applicable. These rules make sense, especially to those of us who've been building systems for a decade or more: mixing kits was a surefire way to encounter stability or compatibility issues in the past (and is still questionable -- I don't recommend it), and as for dual-channel, no one wanted to cut their memory bandwidth in half.


When we visited MSI in California during our 2013 trip (when we also showed how RAM is made), they showed us several high-end laptops that all featured a single stick of memory. I questioned this choice since, surely, it made more sense to use 2x4GB rather than 1x8GB from a performance standpoint. The MSI reps noted that "in [their] testing, there was almost no difference between dual-channel performance and normal performance." I tested this upon returning home (results published in that MSI link) and found that, from a very quick test, they appeared to be right. I never got to definitively prove where -- or if -- dual-channel would be sorely missed, though I hypothesized that it'd be in video encoding and rendering.

In this benchmark, we'll look at dual-channel vs. single-channel platform performance in Adobe Premiere, gaming, video encoding, transcoding, number crunching, and daily use. The aim is to debunk or confirm a few myths about computer memory. I've seen a lot of forum posts claiming (without supporting data) that dual-channel is somehow completely useless, and in the same vein, we've seen counter-arguments that buying anything less than two sticks of RAM is foolish. Both positions have merit.
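For a sense of what a synthetic memory test actually measures, here's a minimal sketch (the function name is hypothetical; this is illustration, not our benchmark suite). A large buffer copy approximates effective memory bandwidth, which is the figure dual-channel doubles in theory:

```python
import time

def copy_bandwidth_gbps(size_mb=256, iters=5):
    """Estimate effective memory bandwidth by timing large buffer copies.
    A copy reads and writes every byte once, so bytes moved = 2 * size.
    The fastest pass is kept to reduce timing noise from other processes."""
    buf = bytearray(size_mb * 1024 * 1024)
    best = float("inf")
    for _ in range(iters):
        start = time.perf_counter()
        copied = bytes(buf)  # read buf, write a new buffer of equal size
        best = min(best, time.perf_counter() - start)
        del copied  # free the copy before the next pass
    return (2 * size_mb * 1024 * 1024) / best / 1e9
```

For context, single-channel DDR3-1600 tops out at 12.8GB/s theoretical, and dual-channel doubles that to 25.6GB/s; real copy numbers land well below peak, but in a dual- vs. single-channel comparison it's the ratio between configurations that matters.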



