We’ve been writing about the latest memory and Flash price increases for a while now – and this does seem to happen every few years – but relief remains distant. Memory supply is limited for a few reasons right now, including retooling by the big suppliers (Samsung, Toshiba, SK Hynix, Micron) as some of them move toward new process technology. More immediately and critically, the phone industry’s launch cycle is on the horizon, and that means drastically increased memory sales to phone vendors. Supply is finite – the difference has to come out of inventory somewhere, and that tends to be components. As enthusiasts, that’s where we see the increased prices come into play.
Professional overclocker Toppc recently set another world record for DDR4 SDRAM frequency. Using a set of G.SKILL DDR4 sticks (an unidentified kit from the Trident Z RGB line) seated in an MSI X299 Gaming Pro Carbon AC motherboard, Toppc achieved a 5.5 GHz DDR4 frequency—approximately a 500 MHz improvement over his record from last year.
Toppc’s new record is verified by HWBot, accompanied by a screenshot of CPU-Z and of Toppc’s extreme cooling setup, which involved LN2. Although an exact temperature was not provided, and details on the aforementioned G.SKILL kit are scant, we do know that the modules used Samsung 8Gb ICs. Based on the limited information, we can reasonably infer that this is a new product from G.SKILL, as the company announced new memory kits at Computex.
We recently covered Intel’s DC P4800X data center drive, with takes on the technology from two editors in video and article form. Those pieces served as a technology overview for 3D XPoint and Intel Optane (and should be referenced as primer material), but both noted a distinct lack of any consumer-focused launch for the new half-memory, half-storage amalgam.
Today, we’re back to discuss Intel’s Optane Memory modules, which will ship April 24 in the form of M.2 sticks.
As Intel’s platform for 3D XPoint (Micron also has one: QuantX), Optane will be deployed on standardized interfaces like PCI-e AICs, M.2, and eventually DIMM form factors. This means no special “Optane port,” so to speak, which should make adoption at least somewhat more likely. There’s still a challenging road ahead for Intel, of course, as Optane has big goals: to partially unify memory and storage by creating a device with storage-like capacities and memory-like latencies. For more of a technology overview, check out Patrick Stone’s article on the DC P4800X.
The finer distinctions between DDR and GDDR can easily be masked by the impressive on-paper specs of the newer GDDR5 standard, often inviting an obvious question with a not-so-obvious answer: Why can’t GDDR5 serve as system memory?
The simple answer is analogous to why a GPU cannot suffice as a CPU. More precisely, CPUs are composed of a few complex cores using complex instruction sets, in addition to on-die cache and integrated graphics. This makes the CPU suitable for the multitude of latency-sensitive tasks demanded of it; however, that aptness comes at a cost—a cost paid in silicon. Conversely, GPUs can apportion more die space to compute by using simpler cores with reduced instruction sets. As such, GPUs can feature hundreds, if not thousands, of cores designed to process huge amounts of data in parallel. Whereas CPUs are optimized to process tasks serially with as little latency as possible, GPUs have a parallel architecture and are optimized for raw throughput.
While the above doesn’t exactly explicate any differences between DDR and GDDR, the analogy is fitting. CPUs and GPUs both have access to temporary pools of memory, and just like both processors are highly specialized in how they handle data and workloads, so too is their associated memory.
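The serial-versus-parallel split described above can be sketched in a few lines of Python. This is a toy illustration only, not a real GPU workload: thread workers stand in for a GPU’s many simple cores, and the function and variable names are our own invention.

```python
from concurrent.futures import ThreadPoolExecutor

def sum_squares(chunk):
    # One worker's share of the job -- a stand-in for one simple "core."
    return sum(x * x for x in chunk)

data = list(range(100_000))

# "CPU-style": a single worker walks the entire dataset sequentially,
# which favors low per-item latency.
serial_total = sum_squares(data)

# "GPU-style": split the data across many simple workers, run them in
# parallel, then merge the partial results -- favoring raw throughput.
n_workers = 8
chunks = [data[i::n_workers] for i in range(n_workers)]
with ThreadPoolExecutor(max_workers=n_workers) as pool:
    parallel_total = sum(pool.map(sum_squares, chunks))

# Both execution models compute the same answer; they differ in how the
# work is scheduled, not in what is computed.
assert serial_total == parallel_total
```

The point of the sketch is that the data-parallel model only pays off when the work divides cleanly into many independent pieces, which is exactly the kind of workload GDDR is built to feed.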
At the tail-end of a one-day trip across the country, this episode of Ask GN tides us over until our weekend burst of further content production. We’re currently working on turning around a few case reviews, some game benchmarks, and implementing new thermal calibrators and high-end equipment.
In the meantime, this episode addresses questions involving “doubled” DRAM prices, delidding plans for the i7-7700K, contact between a heatsink and the back of a video card, and a few other topics. Check back soon, as we’ll ramp into publication of our i5-7600K review within the next day.
Video below, timestamps below that:
As predicted, DRAM-dependent components continue to grow more expensive as demand outpaces supply. Nanya Technology president Pei-Ing Lee confirmed that the average price of the company’s DRAM will increase in the first and second quarters of 2017.
When we published our “Why Are RAM Prices So High” article in 2014, DRAM was transitioning to 25nm production—and now it’s transitioning again, this time to 20nm. Prices in the second half of 2017 are expected to stabilize, but that depends largely on how quickly manufacturers gear up for the move to smaller dies—Nanya Technology will be increasing 20nm production while cutting down on 30nm going into 2018.
We recently reported on G.SKILL’s announcement of the new Trident Z RGB series of memory. G.SKILL has now announced their high-end Trident Z DDR4 DIMMs designed for Kaby Lake CPUs and Z270 chipset motherboards.
For the launch of Kaby Lake and the Z270 chipset, G.SKILL will offer various dual-channel kits in 16GB, 32GB, and 64GB options. The modules themselves will only come in 8GB or 16GB densities, consistent with G.SKILL’s claim that 16GB and 32GB memory kits are now the mainstream choice. As seen below, kits clocked at 4000MHz and beyond will only be offered in configurations of 8GB modules.
As the pre-CES hardware news keeps pouring in, HyperX has announced new products today that expand its peripheral and component lineup aimed at the gaming market. HyperX has introduced two new Alloy keyboards, a Pulsefire Gaming Mouse, a new Cloud Revolver S Headset, and HyperX Predator DDR4 RGB LED Memory.
We will be visiting HyperX and Kingston this week, and hope to have more in-depth, on-site coverage from the show floor. For now, we’ve got the basic specs and introductory information for each new peripheral and memory kit.
Starting with the audio gear, HyperX has announced the new headset that will be showcased at CES 2017 -- the Cloud Revolver S. The new gaming headset will feature plug-and-play Dolby 7.1 virtual surround sound via a connected USB dongle. HyperX claims no additional software or audio control box will be needed to get the Dolby 7.1 surround sound functional. This isn’t the first time we’ve seen an implementation of 7.1 surround in this fashion -- Turtle Beach and Plantronics have done this for ages -- but it’s the first major noise HyperX is making about Dolby Surround.
More unique to the unit, the HyperX Cloud Revolver S will have a condenser microphone with a bi-directional pattern; the condenser mic, although we’ll have to test it, could be promising for streamers and video casters who’d rather not use standalone input hardware. The HyperX Cloud Revolver S will be available in February 2017 for $150 USD.
We've been through Battlefield 1 a few times now. First were the GPU benchmarks, then the HBAO vs. SSAO benchmark, then the CPU benchmark. This time it's RAM, and the methodology remains mostly the same. Note that these results are not comparable to previous results because (1) the game has received updates, (2) memory spec has changed for this test, and (3) we have updated our graphics drivers. The test platforms and memory are the variables for this test, with the rest remaining similar to what we've done in the past. That'll be defined in the methodology below.
Our CPU benchmark had us changing frequencies between test platforms as we tried to determine our test patterns and methodology / bench specs for the endeavor. During that exploratory process, we noticed that memory speeds of 3200MHz were measurably faster in informal testing than speeds of, say, 2400MHz. That was just done by eye, though; it wasn't an official benchmark, and we wanted to dedicate a separate piece to the topic.
This content benchmarks memory performance in Battlefield 1, focusing on RAM speed (1600MHz, 1866MHz, 2133MHz, 2400MHz, and so forth) and capacity. We hope to answer whether 8GB is "enough" and find a sweet spot for price-to-performance in memory selection.
This episode of Ask GN (#28) addresses the concept of HBM in non-GPU applications, primarily concerning its potential deployment alongside CPUs. We also explore GPU Boost 3.0 and its variance within testing when working on the new GTX 1080 cards. The question of Boost's functionality arose as a response to our EVGA GTX 1080 FTW Hybrid vs. MSI Sea Hawk 1080 coverage, asking why one 1080 was clock-dropping differently from the other. We talk about that in this episode.
Discussion begins with proof that the Cullinan finally exists and has been sent to us – it was impossible to find after Computex – and carries into coverage of Intel’s Knights Landing and its MCDRAM, or “CPU HBM.” Testing methods are slotted in between, with an explanation of why some hardware choices are made when building a test environment.