Antlion ModMic 4.0 Review & Recording Samples
By Michael Kerns | Published November 16, 2016 at 3:34 pm
Teamwork is vitally important in current leading games: Dota 2, Overwatch, Rocket League, League of Legends, Battlefield 1’s new squad system -- the industry has been trending toward team-heavy play for a few years now. Voice chat is the only real solution to communication in anything faster paced, and so we normally look toward headsets for an easy plug-and-play solution. Unfortunately, bad microphones plague even the most expensive headsets.
Headsets often bundle a mediocre-quality mic with headphones and price the package above what each would be worth individually. On top of that, for folks already in possession of higher quality standalone headphones, replacing them with a headset with worse sound quality isn’t that appealing. Clip-on and desk (see: Yeti, Snowball) mics are convenient for PC gamers who already have nice headphones (or for some other reason don’t want a headset), and can provide higher quality input. Not always -- but it’s not hard to beat the average headset.
Another option, aside from a clip-on or desktop mic, is the Antlion ModMic. The ModMic pairs a magnet on an adjustable boom mic with a magnetic base that attaches to the side of the headphones using adhesive, allowing users to add a headset-style boom mic to their already-good headphones. Currently, the ModMic 4.0 Uni-directional and Omni-directional versions (with mute) are both $50 on Antlion’s site and $55 on Amazon. Overall, it achieves its goal well by letting users keep their own headphones while gaining a decent-quality adjustable mic. In this review of Antlion’s ModMic 4.0, we’ll look at mic quality, usability, and build/sound quality.
Logitech G403 & G Pro Review - A Pair of $70 Gaming Mice
By Steve Burke | Published November 14, 2016 at 11:30 am
Logitech's latest obsession seems to be weight reduction. The G502 didn't make as much noise about weight as its top-tier successors: sequentially, the G303 ($42), the G900 ($135), and now the G403 ($70) and G Pro. The company has fine-tuned its tooling to allow for manufacturing of ultra-thin plastic shells, making for a lighter mouse chassis that can seemingly still withstand the usual wear-and-tear imposed on a mouse. In previous efforts, Logitech has boasted sensor optimization through firmware and other collaborative efforts with its sensor suppliers, has boasted lights, and has moved to implement keyplates with more consistent "clickiness" as governed by spring-tensioned switches.
But again, the latest trends have been to reduce weight and improve wireless functionality -- two efforts that seemingly go hand-in-hand. We talked about these moves heavily in our G900 review, and will leave most of the technical discussion (radiation patterns, wireless strength and reliability, etc.) to that content.
NZXT Kraken X52, X62, & X42 Review & Benchmarks vs. H100iV2, Predator 280
By Steve Burke | Published October 29, 2016 at 12:14 pm
A new series of Kraken liquid coolers from NZXT marks the first time that Asetek has afforded a customer the responsibility of designing custom electronics, which NZXT deploys for RGB LED control and future firmware revisions. The coolers use Gen5 Asetek pumps with custom-built pump blocks, "infinity mirror" pump plates, and NZXT fans that differentiate the X42, X52, and X62 line-up from Corsair's nearby competition. Corsair most heavily competes in the 240mm market -- that'd go up against the X52 -- where the H100iV2 is priced at ~$105 right now, though the H90 also competes with the X42.
Our disassembly of the Kraken X42 liquid cooler showed the device's internals and explained that the high quality of design and component selection made for a promising set of tests, but didn't dive into performance details. This review looks at the temperature performance and noise performance, along with a noise-temperature curve, of the new NZXT Kraken X62, X52, and X42 liquid coolers, particularly matched against the H100iV2. We've got the EK WB Predator XLC 280 as a high-end alternative, alongside the Be Quiet! Dark Rock 3 as a $50 air cooler, just to provide a baseline.
Logitech G213 Prodigy Membrane Keyboard Review
By Michael Kerns | Published October 27, 2016 at 4:46 pm
Whenever we get a new keyboard to review, we make a point to put away the regularly used keyboards. It’s easy to gravitate toward what we’re familiar with, and so those things must be put aside for the review. Oftentimes, putting away the usual keyboards is easy since we have worked with a number of good releases lately, but sometimes it’s not so trivial.
Frankly, we expected the latter situation when unboxing the Logitech G213 Prodigy ($70). It’s a rubber dome keyboard, and those don’t get quite the fanfare that mechanical boards do. Setting the keyboard up revealed inclusion of RGB lighting, fully functional media keys, and a tuned force profile on the switches. The G213 also positions itself at a $70 “budget” price-point for an RGB board, but we’ll talk more about that later.
NVIDIA GTX Titan X (Pascal) Review vs. GTX 1080, SLI 1070s
By Steve Burke | Published October 26, 2016 at 2:24 pm
The Titan X Hybrid mod we hand-crafted for a viewer allowed the card to stretch its boost an additional ~200MHz beyond the spec. This was done for Sam, the owner who loaned us the Titan XP, and was completed back in August. We also ran benchmarks before tearing the card down, albeit on drivers from mid-August, and never did publish a review of the card.
This content revisits the Titan XP for a review from a gaming standpoint. We'd generally recommend such a device for production workloads or CUDA-accelerated render/3D work, but that doesn't change the fact that the card is marketed as a top-of-the-line gaming device with GeForce branding. From that perspective, we're reviewing the GTX Titan X (Pascal) for its gaming performance versus the GTX 1080, hopefully providing a better understanding of value at each price-point. The Titan X (Pascal) card is priced at $1200 from nVidia directly.
Review content will focus on thermal, FPS, and overclocking performance of the GTX Titan X (Pascal) GP102 GPU. If you're curious to learn more about the card, our previous Titan XP Hybrid coverage can be found here:
MSI GTX 1050 Ti & 1050 Review – Benchmarks vs. RX 460, 470, More
By Steve Burke | Published October 25, 2016 at 9:00 am
AMD issued a preemptive response to nVidia's new GTX 1050 and GTX 1050 Ti by dropping the RX 460 MSRP to $100 and the RX 470 MSRP to $170. The price reduction is meant to battle the GTX 1050, a $110 MSRP card, and the GTX 1050 Ti, a $140-$170 card. These new Pascal-family devices are targeted most appropriately at the 1080p crowd, where the GTX 1060 and up were all capable performers for most 1440p gaming scenarios. AMD has held the sub-$200 market since the launch of its RX 480 4GB, RX 470, and RX 460 through the summer months, and is only now seeing its competition's gaze shift from the high-end.
Today, we've got thermal, power, and overclocking benchmarks for the GTX 1050 and GTX 1050 Ti cards. Our FPS benchmarks look at the GTX 1050 OC and GTX 1050 Ti Gaming X cards versus the RX 460, RX 470, GTX 950, 750 Ti, and 1060 devices. Some of our charts include higher-end devices as well, though you'd be better off looking at our GTX 1060 or RX 480 content for more on that. Here's a list of recent and relevant articles:
MSI GE62VR Apache Pro Laptop Review: GTX 1060 Benchmark vs. 1070, 1080
By Steve Burke | Published October 03, 2016 at 3:06 pm
The GTX 980's placement in notebooks heralded the now-present era of desktop GPUs in laptops, but was still something of a trial of the tech. NVidia and AMD have both introduced their Pascal and Polaris architectures in full, uncut versions to notebooks this generation, with performance generally within about 10% of an equivalent desktop build. Despite the desktop-level power, battery life should also improve as a result of an overall reduction in power consumption by the GPU and CPU alike. And almost every other component, for that matter – like DDR4, which requires lower voltage and draws less power than DDR3.
Today, we're looking at the MSI GE62VR 6RF Apache Pro laptop with GTX 1060 & i7-6700HQ, priced at $1600. The benchmarks follow our previous notebook 1070 vs. 1080 tests, but with proper depth and hands-on. Note also that we already wrote about the GE62VR's bloatware problem.
In this review of the MSI GE62VR 6RF Apache Pro ($1600), we'll be testing FPS on the GTX 1060, temperatures, noise levels, and build quality.
Gigabyte GTX 1080 Xtreme Water Force Review vs. EVGA Hybrid, Sea Hawk
By Steve Burke | Published September 22, 2016 at 11:40 am
Implementation of liquid coolers on GPUs makes far more sense than on the standard CPU. We've shown in testing that actual performance can improve as a result of a better cooling solution on a GPU, particularly when replacing weak blower-fan or reference cooler configurations. With nVidia cards, Boost 3.0 dictates clock-rate based upon a few parameters, one of which (GPU temperature) is remedied with more efficient cooling solutions. On the AMD side of things, our RX 480 Hybrid mod garnered some additional overclocking headroom (~50MHz), but primarily reduced noise output.
Clock-rate also stabilizes with better cooling solutions (and that includes well-designed air cooling), which helps sustain more consistent frametimes and tighten frame latency. We call these 1% and 0.1% lows, though that presentation of the data is still looking at frametimes at the 99th and 99.9th percentile.
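For those curious how those figures are derived: a "1% low" can be computed from a capture's frametime list by averaging the slowest 1% of frames and converting back to an FPS figure (likewise 0.1% for the 0.1% lows). Below is a minimal, illustrative sketch using made-up frametime numbers, not our actual benchmarking tooling:

```python
# Sketch: computing "1% low" / "0.1% low" FPS from a list of frametimes.
# The frametime values below are made-up illustrative numbers, not test data.

def low_metrics(frametimes_ms, percentile):
    """Average FPS of the slowest (1 - percentile) share of frames.

    A '1% low' (percentile=0.99) averages the frametimes at or beyond the
    99th percentile -- the slowest 1% of frames -- and reports that as FPS.
    """
    count = max(1, int(len(frametimes_ms) * (1 - percentile)))
    worst = sorted(frametimes_ms)[-count:]          # largest frametimes = slowest frames
    avg_worst_ms = sum(worst) / len(worst)
    return 1000.0 / avg_worst_ms

# Mostly 60 FPS frames (16.7ms) with ten 30 FPS spikes (33.3ms):
frames = [16.7] * 990 + [33.3] * 10

print(round(low_metrics(frames, 0.99), 1))              # 1% low   -> 30.0
print(round(low_metrics(frames, 0.999), 1))             # 0.1% low -> 30.0
print(round(1000.0 / (sum(frames) / len(frames)), 1))   # average  -> 59.3
```

Note how the average FPS barely registers the stutter while the 1% low captures it, which is exactly why we report these metrics alongside averages.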
The EVGA GTX 1080 Hybrid has thus far had the most interesting cooling solution we've torn down on an AIO cooled GPU this generation, but Gigabyte's Xtreme Waterforce card threatens to take that title. In this review, we'll benchmark the Gigabyte GTX 1080 Xtreme Water Force card vs. the EVGA 1080 FTW Hybrid and MSI/Corsair 1080 Sea Hawk. Testing is focused on thermals and noise primarily, with FPS and overclocking thrown into the mix.
A quick thanks to viewer and reader Sean for loaning us this card, since Gigabyte doesn't respond to our sample requests.
Rosewill Cullinan Case Review (& the Anidees AI Crystal)
By Steve Burke | Published September 19, 2016 at 6:09 pm
Rosewill's Cullinan PC case is the company's most modern endeavor since the R5, and manages to get to the front of the case industry's current trends. It's a mid-tower with a PSU shroud and a full tempered glass side window, checking almost all the boxes created by NZXT's H440, In-Win's more expensive cases, and Corsair's 760T. The only 2016 trend missing from the Cullinan is a set of RGB LED fans, though the included fans still have blue LEDs.
We first saw the Rosewill Cullinan mid-tower at Computex 2016, but the case was impacted by shipping delays (and other internal delays) that pushed back its launch until now-ish. In theory, the ~$150 Cullinan will begin availability just before October, and should begin shipping to customers by the first week of October. That long lead-in to production has allowed competitors to enter the growing market of cases with tempered glass side panels, including Corsair with its brand new 460X, In Win with its 303, and Anidees with its identical AI Crystal ($150).
The Anidees AI Crystal and Rosewill Cullinan enclosures both boast 5mm thick tempered glass side windows and a 4mm thick tempered glass front panel. The enclosures target the front edge of a trend in the industry to adopt tempered glass on affordable cases (read: ~$100 to ~$200), replacing the cheaper acrylic that's found in almost all windowed panels. Rosewill and Anidees both use Chinese OEM designer Jonsbo, whom we believe to be a customer of case factory God Speed Casing. If that name's familiar, it's because God Speed Casing is the manufacturer used (and effectively grown) by NZXT; we've even toured their factories in China.
GTX 1060 3GB vs. 6GB Benchmark: Some Major Performance Swings
By Steve Burke | Published September 15, 2016 at 8:30 am
The GTX 1060 3GB ($200) card's existence is curious. The card was initially rumored to exist prior to the 1060 6GB's official announcement, and was quickly debunked as mythological. Exactly one month later, nVidia did announce a 3GB GTX 1060 variant – but with one fewer SM, reducing the core count by 10%. That drops the GTX 1060 from 1280 CUDA cores to 1152 CUDA cores (128 cores per SM), alongside 8 fewer TMUs. Of course, there's also the memory reduction from 6GB to 3GB.
The rest of the specs, however, remain the same. The clock-rate has the same baseline 1708MHz boost target, the memory speed remains 8Gbps effective, and the GPU itself is still a declared GP106-400 chip (rev A1, for our sample). That makes this card most of the way toward a GTX 1060 as initially announced, aside from the disabled SM and halved VRAM. Still, nVidia's marketing language declared a 5% performance loss from the 6GB card (despite a 10% reduction in cores), and so we decided to put those claims to the test.
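The core-count arithmetic above can be sanity-checked in a few lines (the figures come from nVidia's published specs; the script itself is purely illustrative):

```python
# Sanity-checking the GTX 1060 3GB's cut-down configuration.
CORES_PER_SM = 128                    # Pascal: 128 CUDA cores per SM

sms_6gb = 1280 // CORES_PER_SM        # 10 SMs on the 6GB card
sms_3gb = sms_6gb - 1                 # one SM disabled on the 3GB card
cores_3gb = sms_3gb * CORES_PER_SM

print(cores_3gb)                                  # 1152 CUDA cores
print(f"{(1280 - cores_3gb) / 1280:.0%}")         # 10% core reduction
```

One disabled SM also accounts for the 8 fewer TMUs, since Pascal pairs 8 texture units with each SM.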
In this benchmark, we'll be reviewing the EVGA GTX 1060 3GB vs. GTX 1060 6GB performance in a clock-for-clock test, with 100% of the focus on FPS. The goal here is not to look at the potential for marginally changed thermals (which hinges more on AIB cooler than anything) or potentially decreased power, but to instead look strictly at the impact on FPS from the GTX 1060 3GB card's changes. In this regard, we're very much answering the “is a 1060 6GB worth it?” question, just in a less SEF fashion. The GTX 1060s will be clocked the same, within normal GPU Boost 3.0 variance, and will only be differentiated in the SM & VRAM count.
For those curious, we previously took this magnifying glass to the RX 480 8GB & 4GB cards, where we pitted the two against one another in a versus. In that scenario, AMD also reduced the memory clock of the 4GB models, but the rest remained the same.