Room Temperature vs. GPU Heat: Vega 64, GTX 1080, & Mining Machines
Sunday, 10 December 2017

Vega 64 may consume more power than a GTX 1080, but until now, we haven't known whether that difference is meaningful to room temperature. That's what we wanted to find out, and we eventually expanded the concept to include how much a 900W+ mining machine increases room temperature, how much a 600W machine does, and so on. We were able to effectively replace any need for a heater for the past week, right as it started to get colder.
In this test, we're looking at the room ambient impact of various PC builds. This helps to conceptualize the real-world impact of all those power and thermal tests you see us (and others) publish, as it puts real numbers to the user experience outside of the case. Although this concept has about a million variables and "what ifs," we controlled them to the best of our abilities, are laying out all the major variables, and can present an academic experiment that demonstrates room temperature increase from computer equipment. For the purposes of this test, all watts are created equal: a 940W mining rig will output just as much heat into the room as a 940W gaming rig, a 940W rendering machine, and so forth. As long as the power load is equal (read: constant) between all of these machines, watts are watts, and you can extrapolate room temperature impact for each type of machine.
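To make that extrapolation concrete, here's a minimal sketch (not part of our test methodology) of the underlying arithmetic: effectively all of a computer's electrical draw ends up as heat in the room, so sustained wattage converts directly to heater-equivalent output. The room_loss_w_per_c coefficient and the 100 W/°C example value below are hypothetical placeholders; real rooms vary with insulation, volume, and HVAC.

```python
# Sketch: treating a PC as a space heater. Nearly 100% of a computer's
# electrical draw is ultimately dissipated as heat into the room, so
# wattage alone lets us compare machines regardless of workload type.

WATTS_TO_BTU_PER_HR = 3.412  # 1 W = 3.412 BTU/hr

def heat_output_btu_per_hr(load_watts: float) -> float:
    """Heat a machine dumps into the room at a given sustained load."""
    return load_watts * WATTS_TO_BTU_PER_HR

def steady_state_rise_c(load_watts: float, room_loss_w_per_c: float) -> float:
    """Rough first-order steady-state temperature rise above ambient,
    assuming the room sheds heat in proportion to the indoor/outdoor
    delta. room_loss_w_per_c is a hypothetical heat-loss coefficient
    (W per deg C) that depends on insulation, room size, HVAC, etc."""
    return load_watts / room_loss_w_per_c

# Example loads from this article: the 940W mining rig, the ~600W
# middle-ground machine, and a ~300W single-GPU gaming box.
for load in (940, 600, 300):
    print(f"{load}W load = {heat_output_btu_per_hr(load):.0f} BTU/hr, "
          f"~{steady_state_rise_c(load, 100):.1f} C rise "
          f"(assuming a 100 W/C room-loss coefficient)")
```

This is a lumped, first-order model; it ignores thermal mass (how long the room takes to warm up) and only estimates where the temperature settles once heat in equals heat out, but it illustrates why equal-wattage machines are interchangeable for this test.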
The testing was originally conceived after our Vega 56 Hybrid mod, which used power mods (among others) to push the card toward 400W of power consumption. We wanted to test a stock Vega 56 against a GTX 1070 for room ambient impact, but shifted that up a tier (to a Vega 64 and a GTX 1080) to parts more likely to show a difference. After that, we shifted up to a 940W mining machine, then picked a middle-ground ~600W machine (which could also represent SLI gaming or HEDT rendering systems).