Popular memory manufacturer G.SKILL has announced its answer to the RGB LED movement: the Trident Z RGB series. At this point, it may be hard to pinpoint the origin of the RGB trend, but its spread across components and peripherals is one we predicted here at GN, along with some other fads.
The Trident Z RGB series will be—you guessed it—adorned with RGB LEDs in the form of a translucent lightbar affixed to the aluminum heat spreaders. The lightbar runs the length of the DIMM and operates by default in a “wave-style” effect, cycling through a range of hues. These effects can be modified with a future software release, scheduled for February 2017. The Trident Z RGB lineup is somewhat unique in its implementation, chiefly in that it does not require any additional power connections from the motherboard for user control; all necessary power is drawn from the DIMM slot. This diverges from GeIL’s EVO X RGB memory, which must be tethered to the motherboard for proper function of the LEDs, and from other LED memory options (Vengeance, Avexir) that are mono-color.
Ramping up the video production for 2016 led to some obvious problems – namely, burning through tons of storage. We’ve fully consumed 4TB of video storage this year with what we’re producing, and although that might be a small amount to large video enterprises, it is not insignificant for our operation. We needed a way to handle that data without potentially losing anything that could be important later, and ultimately decided to write a custom PowerShell script for automated HandBrake CLI compression routines that execute monthly.
Well, will execute monthly. For now, it’s still catching up and is crunching away on 4000+ video files for 2016.
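Our actual script is PowerShell; as a rough illustration of the same pipeline, here’s a minimal Python sketch that walks a video archive and builds one HandBrake CLI job per file. The folder layout, file extensions, and preset name here are placeholders for illustration, not our production settings.

```python
from pathlib import Path

# Hypothetical sketch of the batch-compression idea described above.
# Extensions and preset are assumptions, not GN's real configuration.
VIDEO_EXTS = {".mp4", ".mov", ".mts"}

def build_handbrake_jobs(src_dir, dst_dir, preset="Fast 1080p30"):
    """Return one HandBrakeCLI argument list per video file found."""
    src, dst = Path(src_dir), Path(dst_dir)
    jobs = []
    for f in sorted(src.rglob("*")):
        if f.suffix.lower() not in VIDEO_EXTS:
            continue
        # Mirror the source tree in the destination, re-encoded to .mp4
        out = dst / f.relative_to(src).with_suffix(".mp4")
        jobs.append(["HandBrakeCLI", "-i", str(f), "-o", str(out),
                     "--preset", preset])
    return jobs
```

Each job list could then be handed to `subprocess.run` and the whole script scheduled monthly via Task Scheduler (or cron), which is the same cadence described above.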
Thermal cameras have proliferated to the point that people are buying them as tech toys, made possible by new prices nearer $200 than the multi-thousand-dollar thermal imaging cameras that have long been the norm. Using a thermal camera that connects to a mobile phone eliminates a lot of the cost for such a device, relying on the mobile device’s hardware for the post-processing and image cleanup that make the cameras semi-useful. They’re not the most accurate and should never be trusted over a dedicated, proper thermal imaging device, but they’re accurate enough for spot-checking and rapid concepting of testing procedures.
Unfortunately, we’ve seen them used lately as hard data for thermal performance of PC hardware. For all kinds of reasons, this needs to be done with caution. We urged in our EVGA VRM coverage that thermal imaging was not perfect for the task, and later stuck thermal probes directly to the card for more accurate measurements. Even ignoring the factors of emission, transmission, and reflection (today’s topics), using thermal imaging to take temperature measurements of core component temperatures is methodologically flawed. Measuring the case temperature of a laptop or chassis tells us nothing more than that – the temperature of the surface materials, assuming an ideal black body with an emissivity close to 1.0. We’ll talk about that contingency momentarily.
But even so: Pointing a thermal imager at a perfectly black surface and measuring its temperature is telling us the temperature of the surface. Sure, that’s useful for a few things; in laptops, that could be determining if case temperature exceeds the skin temp specification of a particular manufacturer. This is good for validating whether a device might be safe to touch, or for proving that a device is too hot for actual on-lap use. We could also use this information as troubleshooting to help us determine where hotspots are under the hood, potentially useful in very specific cases.
That doesn’t, however, tell us the efficacy of the cooling solution within the computer. For that, we need software to measure the CPU core temperatures, the GPU diode, and potentially other components (PCH and HDD/SSD are less popular, but occasionally important). Further analysis would require direct thermocouple probes mounted to the SMDs of interest, like VRM components or VRAM. Neither of those two examples is equipped with internal sensors that software, or even the host GPU, can read.
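To put a rough number on the emissivity caveat above: a thermal camera infers temperature from radiated power, so a low-emissivity surface (bare aluminum, nickel-plated heatsinks) radiates far less than a black body at the same temperature and reads dramatically low. The sketch below uses the total-radiation (Stefan-Boltzmann) model with a reflected-ambient term; real LWIR cameras integrate over a limited band, so treat this as an illustration of the error’s scale, not a calibration formula.

```python
# Simplified emissivity correction using the total-radiation
# (Stefan-Boltzmann) model. Real thermal cameras are band-limited, so
# this shows the scale of the error, not an exact correction.

def true_temp_k(t_indicated_k, emissivity, t_ambient_k=293.15):
    """Correct a reading taken with the camera's emissivity set to 1.0.

    Radiance balance (reflected ambient included):
        emissivity * T_true^4 + (1 - emissivity) * T_amb^4 = T_indicated^4
    """
    t4 = (t_indicated_k ** 4 - (1 - emissivity) * t_ambient_k ** 4) / emissivity
    return t4 ** 0.25

# Bare aluminum (emissivity ~0.1) indicating 40 C (313.15 K):
print(true_temp_k(313.15, 0.10) - 273.15)  # actual surface temp, in C
```

In this simplified model, a bare-aluminum surface indicating 40°C works out to well over 100°C actual – which is why we fell back to contact probes for the EVGA VRM testing rather than trusting the imager alone.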
The holiday season is upon us. In due time, the Steam Holiday/Winter sale will be flowing like a river, and many users will be preparing their wallets for the impending profligacy. As Newegg, Amazon, and other retailers usually offer sales of their own, other users may be eyeing core component upgrades or new systems entirely. To that end, we’ve attempted to take some of the legwork out of putting together a mid-level gaming machine comprised mostly of hardware currently on sale or discounted through current rebates. Admittedly, that narrows the options; even so, we’ve come up with a very capable and modern build without creating a lusus naturae.
This rig will be a sub-$700 system focused on gaming at the respectable, and still most popular, 1080p. If you happen to need more horsepower for, say, the 1440p domain, check out another recent build guide of ours. As an aside, we’ve selected mATX hardware housed in an mATX chassis; something that will please space-mindful users wanting a build with a minimal footprint. Before getting into it, I’ll preface with this: more ardent enthusiasts might balk at the presence of a Core i3, specifically the i3-6100, but keep in mind that this is a value-oriented build, and the i3-6100 fills the role well. We’ll discuss this a bit more below.
Per the usual format, we will list an OS in the below DIY build list as an optional purchase in addition to an optional, but advised, SSD. Also below, find our tutorial on building a gaming PC or check out our more in-depth article.
This gaming PC build is priced below $700 (though may be below $600, if the sales are still active), and is targeted at high graphics settings with a 1080p monitor.
We’re still cranking through content as we roll into the weekend’s holiday, and that’s not going to stop. We’ll be posting on the 25th, as we do each year, and I’ll likely be working on server upgrades. It’s somewhat of a Christmas tradition for GN, it seems.
But until that time, we’ve still got a solid week of production to cut through. This week starts off with the usual Ask GN episode, now #38, and will briefly highlight advantages of the Xeon or i7 CPUs in different use cases, frequency and core count discussion, and threats to Intel’s dominance in the computing world.
We’ll have a new PC build article going live within the next day, too. That’ll feature a few sales for Christmas, so be sure to check that guide. There’ll be a few fairly big ones.
Episode below. Timestamps below that.
Newegg’s Techmas and pre-holiday sales continue throughout the month of December. Throughout this weekend, you can pick up an ADATA 256GB SSD, a Seasonic Modular 650W Power Supply, and an iFixit toolkit if you need a good set to tackle a late 2016 PC build. We have also identified a couple different kits of G.SKILL Ripjaws V Series 16GB (2x8) DDR4 RAM, presently down-priced for EOY Christmas sales.
The second card in our “revisit” series – sort of semi-re-reviews – is the GTX 780 Ti from November of 2013, which originally shipped for $700. This was the flagship of the Kepler architecture, followed later by Maxwell architecture on GTX 900 series GPUs, and then the modern Pascal. The 780 Ti was in competition with AMD’s R9 200 series and (a bit later) R9 300 series cards, and was accompanied by the expected 780, 770, and 760 video cards.
Our last revisit looked at the GTX 770 2GB card, and our next one plans to look at an AMD R9 200-series card. For today, we’re revisiting the GTX 780 Ti 3GB card for an analysis of its performance in 2016, as pitted against the modern GTX 1080, 1070, 1060, 1050 Ti, and RX 480, 470, and others.
LG has made a preliminary announcement heralding the arrival of a new flagship display: the LG 32UD99. Poised to entice creative professionals, gamers, and prosumers, the LG 32UD99 targets a more encompassing demographic; a contrast to the fairly recent announcement of the LG UltraFine 4K and 5K panels that seemingly left Windows users out in the cold. LG plans to demonstrate the 32UD99 at CES next month alongside some other panels. Naturally, many specifications were left undisclosed. Here is what we know so far:
The LG 32UD99 touts a 32” IPS panel at a native resolution of 3840 x 2160, making this a UHD 4K display. The IPS panel has 10-bit color depth and can reproduce 1.07 billion colors, versus the 16.77 million of an 8-bit panel. The panel of the LG 32UD99 allegedly saturates 95% of the DCI-P3 color space, though LG has reported nothing of other color spaces such as sRGB and Adobe RGB. The LG 32UD99 also supports 3D LUTs (look-up tables), but again, there are no details on the LUTs. As look-up tables are primarily for color enhancement and correction, this is a feature aimed at users working in digital media.
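The color counts above follow directly from the bit depth: each of the three channels (R, G, B) gets 2^bits levels, and the total palette is that figure cubed. A quick sanity check:

```python
# Where the "1.07 billion" and "16.77 million" figures come from:
# total colors = (2 ** bits_per_channel) ** 3, one factor per RGB channel.
for bits in (8, 10):
    per_channel = 2 ** bits
    print(f"{bits}-bit: {per_channel}^3 = {per_channel ** 3:,} colors")
# 8-bit:  256^3  = 16,777,216     (~16.77 million)
# 10-bit: 1024^3 = 1,073,741,824  (~1.07 billion)
```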
Hardware news has, somewhat surprisingly, maintained its pace through the late months of the year. We normally expect a slowdown in December, but with AMD’s onslaught of announcements (Instinct, Ryzen, Vega), and with announcements leading into CES, we’ve yet to catch a break.
This week’s hardware news focuses on the RX 460 unlocking discovered by Der8auer, new SSDs from Corsair (MP500) and Zadak, and TSMC’s fab expansion.
Even amid troubling SSD news, Corsair fans may rejoice. After a seeming lack of focus in the SSD market, Corsair has announced the immediate availability of the new Force Series MP500 M.2 solid-state drives. Although laggardly, Corsair now joins other companies like Samsung, Plextor, Toshiba, and Intel in leaving behind the limited SATA III 6Gb/s connection in favor of NVMe via PCIe x4.
Corsair avers the new Force Series MP500 to be the fastest drive it has yet produced, with sequential read/write speeds rated at 3000/2400 MB/s and random 4K read/write speeds at 250,000/210,000 IOPS, nominally. Theoretically, system boot times, large file transfers, and game load times will see improvement over a single SATA 6Gb/s connection. This also puts the drive roughly on par with the Samsung 960 EVO.
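As back-of-envelope context for those sequential ratings – best-case math only, since real transfers are bounded by the slower end of the copy, queue depth, and cache behavior – the gap over SATA is easy to quantify. The 50GB file size and ~550 MB/s practical SATA ceiling below are illustrative assumptions, not benchmark results.

```python
# Best-case arithmetic only: sustained sequential throughput, no overhead.
def transfer_seconds(size_gb, rate_mb_s):
    """Time to move size_gb gigabytes (decimal GB) at rate_mb_s MB/s."""
    return size_gb * 1000 / rate_mb_s

# A hypothetical 50 GB project folder, SATA III practical ceiling vs
# the MP500's rated sequential read:
print(f"SATA III (~550 MB/s): {transfer_seconds(50, 550):.0f} s")
print(f"MP500   (3000 MB/s): {transfer_seconds(50, 3000):.0f} s")
```

In the ideal case, that’s roughly 91 seconds versus 17 – the kind of difference that shows up in large file transfers more readily than in boot times.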
The MP500 series will utilize a Phison PS5007-E7 NVMe memory controller in conjunction with the high bandwidth afforded by PCIe Gen 3.0 x4 lanes. The MP500 conforms to the M.2 2280 form factor and sports a black PCB with a black cover hiding the NAND (and so we haven’t yet identified the modules used). Although not particularly relevant, the color does coincide with the recent motherboard color trend and should please users aiming for a uniform aesthetic, compared to the overdone green PCBs. The Phison PS5007-E7 controller supports SLC/MLC/TLC and 3D NAND (V-NAND), although we are currently unable to ascertain the specific NAND type used in the Corsair Force MP500. The Force Series MP500 range will offer 120GB, 240GB, and 480GB capacities priced at $110, $170, and $325, respectively.