Steve started GamersNexus back when it was just a cool name, and now it's grown into an expansive website with an overwhelming number of features. He recalls his first difficult decision with GN's direction: "I didn't know whether or not I wanted 'Gamers' to have a possessive apostrophe -- I mean, grammatically it should, but I didn't like it in the name. It was ugly. I also had people who were typing apostrophes into the address bar - sigh. It made sense to just leave it as 'Gamers.'"
First world problems, Steve. First world problems.
Raven Ridge APUs are interesting as products. In a world where MSRP acted as an infallible decree handed down by galactic overlords, the GT 1030 would cost $70, the RX 560 would cost $100, and the G4560 would always have been $60. In this world, however, the GT 1030 has now usurped both the GTX 1050 and RX 560 in price, landing at $110 to $120, and the G4560 has… actually fallen in price, down to $60 from an overpriced $80 previously.
Then the R3 2200G and R5 2400G entered the market, priced at $100 and $170, respectively. These APU launches are different from previous ones: AMD previously pushed variants of the Bulldozer architecture with older-generation GPU components; today's APUs pair Ryzen and Vega, both of which significantly outperform AMD's previous parts.
We’re benchmarking the Raven Ridge parts entirely for gaming right now. In our eyes, the Raven Ridge APUs – the R3 2200G and R5 2400G – are gaming parts, and so we’ll leave the production workloads to the higher-end Ryzen desktop parts. We are also focusing our performance testing on the R3 2200G, R5 2400G, and competing, similarly priced discrete CPU + dGPU combinations. This includes the G4560 + GT 1030 and R3 1200 + GT 1030. For determining performance scalability, we have a few charts from our GPU bench (run with an unconstrained GPU on an i7-7700K). These are obviously not meant to compare the APU performance to high-end desktop components, but rather to offer a sense of scale – it’s a look at how much performance an APU provides at its price.
Note also that we’ve not bothered to test the Intel IGP performance, as we already know its performance is, comparatively speaking, garbage. There’s no need to do in-depth testing on that; no one should reasonably be using an Intel IGP for gaming at any meaningful quality level. Because our performance floor cuts out the IGPs, we are left with the APUs and the immediately competing discrete components.
We're back with Ask GN! It's been a long week of testing: Patrick has been working on FFXV, an Elgato 4K60 review, and other pieces; I've been working on managing the upcoming travel schedule, primarily for Computex and other tradeshows, and also have a whole slew of in-depth content coming up. One of our biggest endeavors for the week will be our upcoming livestream, where we intend to battle the LinusTechTips team for a top 10 spot in 3DMark benchmark rankings. It's a bit of a friendly rivalry, and we think you all will enjoy tuning in. We'll talk about that more soon.
We also have a news video going up tomorrow, as usual. The video will include several major news items for the past week, including some discussion of the nVidia GPP story that's been going around. Stay tuned for all of that.
In the meantime, Ask GN is below, and the timestamps are below that. Our Patreon bonus episode is here.
01:37 - David Watson: “Hey Steve , do you think we will end up seeing Nvidia competing with the aftermarket cards more directly by releasing their own non founders edition cards with new custom cooling solutions and heatsinks with double or triple fan designs ? i think its a fascinating prospect and one i feel Nvidia has had a lot of thought bent on and is surely bound to use much sooner rather than later because if they can get more marketshare then they surely will and i honestly feel its coming from Nvidia and could be massive for them , because i know one thing , if they released a badass new aftermarket card design that not only performed really well but looked really cool and actually ran cool then i for one would certainly be very tempted by it as many surely would Steve , do you feel this is on the horizon ? thanks man”
06:47 - Stank Buddha: “Quick question, regarding the 200fps limit in (newer?) games. Is this applied per monitor if you were using multi monitor or is it the actual game that is locked to to pushing 200fps total??? Like if the game is locked itself then a theoretical 3 60hz monitors would be maxing it out(60*3=180). Or can you do a 3 monitor 240hz and max em out each at 200fps. just wondering.”
08:57 - Michael Morgan: “Can you demonstrate the end user benefit of HBM memory over GDDR5 or GDDR5X on GPU's please?”
13:12 - vishal bobde: “#askgn-questions Why do CPU don't have different manufacturers like GPU. If there were more manufacturers we might get more enthusiasts features from factory like LM tim and better IHS.”
17:23 - Satoshi_Nakamoto: “@GN Staff Hey guys could you reach out to Thermaltake and ask them if they have any idea for the arrival of the Level 20 Case?”
18:14 – defenestrationize: “Steve, a massive limit limit for APUs is their need to use system memory. Do you think, APUs will remain on the low end or end up high end (is there not actually a limit to DDR for APUs) , More memory channels on APUs (possibly separate for CPU/GPU so 2 sticks DDR 2 sticks GDDR) or on chip memory (hbm) will appear in the near future? Given how board partners operate and push for chip consolidation , do you think we might see a MB, RAM, GPU, CPU as a single pcb ? Feel free to cut this question as needed to perhaps a simpler version.”
22:36 - Dayne_ofStarfall: “@Steve Burke Hello Steve, I’m a bit confused by case fans lately, specifically RPM and in relation to voltage. If I understand correctly different fans have different MAX and MIN RPM at a given voltage. But what happens when you connect two fans with different MIN/MAX RPM to a single header on the motherboard using a Y-splitter? Do the fans spin at different RPMs? And how does this work when they’re connected to a SATA-powered PWM Hub (like the one that comes with most Phanteks cases)? Also what is the amount of fans that can be safely connected to one header? I’ve read on forums that the cable or port can catch fire if they draw too much power, is this true? Thank you.”
24:25 - Ash_Borer: “#askgn-questions how do delidded (with LM) temperatures compare to soldered CPU temperatures? Do you ever plan to delid a ryzen and test the results? Im under the assumption that delidding provides better temps, so i dont mind that intel doesnt solder anymore - as an enthusiast i want to delid anyway and if its not soldered it is easier to delid.”
25:29 - Armand B.: “Modmat out of stock ? DAMN MINERS !”
25:53 - Nory The Explorer.exe: “@GN Staff What is an important fact, viewers should know about GamersNexus? And the opposite, what is a big misconception viewers have expressed about GamersNexus?”
Host: Steve Burke
Video: Andrew Coleman
Even when using supposedly “safe” voltages as a maximum input limit for overclocking via BIOS, it’s possible that the motherboard is feeding a significantly different voltage to the CPU. We’ve demonstrated this before, like when we talked about the Ultra Gaming’s Vdroop issues. The opposite side of Vdroop is overvoltage, of course, and it is also quite common. Inputting a value of 1.3V SOC, for instance, could yield a socket-side voltage measurement of ~1.4V. This difference is significant enough that you may exit the territory of “reasonably usable” and enter “will definitely degrade the IMC over time.”
But software measurements won’t help much in this regard. HWINFO is good, and AIDA also does well, but both rely on the CPU’s sensors to deliver that information. Pin/pad resistances alone can cause that number to underreport in software, whereas measuring the back of the socket with a digital multimeter (DMM) can tell a very different story.
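To put the gap in perspective, the difference between a BIOS-set voltage and a DMM reading at the socket can be expressed as a simple percentage. A minimal sketch, using the hypothetical 1.3V-set / 1.4V-measured example from above (the function name is ours, not from any monitoring tool):

```python
def overvolt_percent(set_v: float, measured_v: float) -> float:
    """Percentage difference between the voltage set in BIOS
    and the voltage actually measured at the back of the socket."""
    return (measured_v - set_v) / set_v * 100.0

# Example from the article: 1.3V SOC set in BIOS, ~1.4V measured with a DMM.
delta = overvolt_percent(1.3, 1.4)
print(f"{delta:.1f}% overvoltage")  # roughly +7.7%
```

A ~7.7% overshoot on SOC voltage is the kind of discrepancy that software sensors can mask entirely, which is why a physical measurement matters.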
This will be a quick one. There is some required viewing/reading before diving in: Previously, with the FFXV standalone benchmark release, we found significant culling deficiencies of objects in the game, including both GameWorks and non-GameWorks objects. This suggested overall inefficiency and hasty development, as opposed to some sort of malfeasance. Square Enix later tweeted rather direct acknowledgement of the benchmark’s issues, and began work to optimize the game (and the GameWorks integration) for launch.
Today’s test is a quick one. Square Enix launched a playable demo of Final Fantasy XV and, although it’s still not the complete game, we wanted to see if any of the object culling issues had been addressed. We were primarily interested in HairWorks LOD scaling, as that was previously an issue responsible for causing performance loss on both nVidia and AMD hardware – even when no HairWorks objects were anywhere remotely close to the player.
Following Final Fantasy XV’s benchmark launch, which we found to be flawed in a few ways, Square Enix has now launched its playable demo for the first portion of the game. This is the first time that FFXV has been playable on PC, barring some flukes in the benchmark, and is also the first revisit to the game since the benchmark’s launch.
Our primary concerns with the benchmark tool were validated by Square Enix, who noted they’d be addressing them. The first issue was that no graphics customization options were available without exposing the game’s .ini files via .dlls, which we did. We later found the other issue: some objects were being drawn at high LOD despite never appearing on screen, something we validated with inspection tools. This included both non-GameWorks and GameWorks objects, with the latter impacting performance more heavily on both AMD and nVidia devices.
We moderate comments on a ~24-48 hour cycle. There will be some delay after submitting a comment.