NVidia ShadowPlay: Retroactive Gameplay Capture Software
Fraps is probably the longest-standing video capture technology available to gamers, and given its frametime analysis and benchmarking options, it's become an important tool in nearly all video card reviews. Still, even Fraps can't record the past -- and even Fraps imposes a fairly significant performance hit, especially on disk I/O, given its raw footage capture.
NVidia's ShadowPlay software can capture gameplay video at a maximum resolution of 1080p (with plans for more) at 30FPS, using the card-optimized, Kepler-integrated H.264 encoder. Users can configure ShadowPlay to automatically capture up to 20 minutes of footage in either "Shadow mode" or "manual mode," with manual mode performing similarly to existing video capture tech (hit the button and it records).
With Shadow mode, however, things are a bit different: ShadowPlay runs in the background and captures gameplay silently, but won't save the file unless instructed by the user. This reduces overall performance tie-ups and keeps storage utilization near zero (files are only written to disk once a hotkey is hit), while still offering the potential to capture key gaming moments. Undesired footage is dumped as it falls out of the user-designated timeframe.
I'll outline the most obvious use-case scenario of this tech, if you haven't already figured it out: Rather than recording all of your gameplay -- which undoubtedly includes boring deaths, loading, or wandering -- you can let ShadowPlay run silently, then instruct it to only save after you've just done something spectacular. Playing Shootmania, for instance, I'd be able to wait until I've had a major killstreak to save the file, then dump the useless footage that shows my limitless prior failed attempts at playing properly.
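Mechanically, what's described above amounts to a ring buffer of encoded frames: capture continuously, let the oldest frames fall out of the user-designated window, and only write to disk when the hotkey fires. Here's a minimal sketch of that idea in Python -- all names and structure here are hypothetical illustration, not nVidia's actual implementation:

```python
from collections import deque

class ShadowBuffer:
    """Sketch of Shadow-mode-style capture: frames are pushed into a
    fixed-length ring buffer and only written to disk on demand."""

    def __init__(self, fps=30, minutes=20):
        # Oldest frames fall off automatically once the window fills,
        # so steady-state memory use is bounded and disk I/O is zero.
        self.frames = deque(maxlen=fps * 60 * minutes)

    def capture(self, encoded_frame):
        self.frames.append(encoded_frame)  # silent background capture

    def save(self, path):
        # Invoked by the hotkey: dump only the retained window to disk.
        with open(path, "wb") as f:
            for frame in self.frames:
                f.write(frame)

buf = ShadowBuffer(fps=30, minutes=2)
for i in range(30 * 60 * 3):              # three minutes of "gameplay"
    buf.capture(i.to_bytes(4, "little"))
# Only the last two minutes (30 * 60 * 2 frames) are retained:
print(len(buf.frames))                    # 3600
```

The point of the structure is that "recording" costs nothing extra at save time -- the footage is already encoded in memory; the hotkey just flushes the window.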
NVidia phrased it best in their press conference:
"The key to ShadowPlay is that it uses the built-in H.264 encoder that we have built in on all our GPUs to help with recording gameplay. We've reduced a lot of the overhead, so now when you record your gameplay, you see a 5% or less impact on performance, and you're encoding files that are smaller and more manageable: Easier to store, easier to share. [...]
[With Shadow mode], if you have some epic feat of gaming goodness, you hit a hotkey and it saves it - just the last [2-to-20] minutes. You don't have to know you're about to do something cool, you just capture anything you do automatically."
There is presently no support for stream integration (Twitch or otherwise), higher FPS or resolution, or frametime analytics (which is what FCAT is for). Members of the press reinforced the importance of these features in the conference, to which nVidia stated that they're wide open to feature suggestions.
ShadowPlay is expected to be available sometime this summer, likely June, and will be integrated with GeForce Experience.
NVidia GeForce Experience & Day-One Drivers
Other than ShadowPlay, the other side of the software news is largely focused around GFE, which has already been available in beta for a few months. To catch the unfamiliar up, GeForce Experience (GFE) effectively auto-tweaks your game's graphics settings based upon system specs, aiming to optimize performance for an underlying FPS/frametime bare minimum.
For nearly everyone reading this article, that's a useless feature; gaming and hardware enthusiasts—speaking for our staff, anyway—head straight for the "options" panel, eager to see what graphics tech to expect. Once you know what your equipment is capable of, it's not that difficult to tweak options (and it's somewhat entertaining), so automating this process doesn't necessarily net us anything usable in the field. NVidia didn't deny this when we met with them at PAX East, but did mention that it'll help in the console-to-PC transition for gamers new to the PC's flexibility.
Part of GFE's mission, as was made clear last week, will be to automatically fetch and install relevant driver updates. NVidia's supporting logic effectively stated that "gamers don't care" if a driver update comes out days or a week later; a lot of gamers will be done with the game within a matter of days (if not the first day), and even fewer want to go fetch drivers manually. "They don't want to do that, they want to play their game," we were told.
Settings recommendations aside, GFE theoretically streamlines driver installation so that day-one drivers are fetched (hopefully) pre-install, so you don't need to waste any time getting to the game. It's not going to blow the lid off the industry, but it's worthwhile for certain types of users. Were I to briefly interject an opinion, I don't personally see myself (as a gamer) using GFE -- namely because I don't want more software, but also because I'm quite capable of doing all the things it presently does. That stated, I do see it as being relevant and offering great features for new-to-PC gamers. It also has potential for future features that are a bit more advanced, which is promising.
Official nVidia GeForce GTX 780 Specifications, Release Date, & MSRP
Now to the big news -- the hardware news. We already covered the GTX 780, 770, 760 Ti, and subsequent releases (speculation) in our last TechRAID episode, but we've finally received official confirmation from nVidia.
GeForce GTX 780 Specs
| Spec | GeForce GTX 780 |
|---|---|
| GPU Boost | Boost 2.0 |
| Single Precision | 4.0 TFLOPs |
| Memory Config. | 3GB GDDR5 |
| Memory Speed | 6.0 Gbps |
| Power Connectors | 6-pin + 8-pin |
| MSRP | $649 |
| Release Date | May 23 |
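As a quick sanity check on the table's 4.0 TFLOPs and 6.0 Gbps figures, the back-of-envelope arithmetic works out. Note that the CUDA core count, base clock, and memory bus width used below are the commonly cited GTX 780 figures, not values from nVidia's slide:

```python
# Back-of-envelope check of the spec table's figures. The core count
# (2304), base clock (863MHz), and 384-bit bus are NOT in the table
# above -- they're the widely reported GK110/GTX 780 numbers.
cores = 2304
clock_hz = 863e6               # base clock
flops_per_core = 2             # one fused multiply-add per cycle
tflops = cores * clock_hz * flops_per_core / 1e12
print(round(tflops, 2))        # ~3.98, i.e. the quoted "4.0 TFLOPs"

bus_bits = 384
bandwidth_gbs = 6.0e9 * bus_bits / 8 / 1e9   # 6.0 Gbps per pin
print(bandwidth_gbs)           # 288.0 GB/s
```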
NVidia GeForce GTX 780 vs. AMD Radeon HD 7970
Internal testing bias noted, nVidia presented its own tests of the GTX 780 vs. AMD's 7970. We haven't yet tested the card, so take these results with a grain of salt.
In addition to these specs, we were told that "GTX 780 pricing is now confirmed at $649 USD," placing it right around where we'd expect an X80 card. The GTX 780's official release date is today, with nVidia's reference model being the only unit available for approximately one month; after a month has passed, card manufacturers will be shipping their own models with the usual cooler alterations. If you're not dying for a day-one 780, it's probably worthwhile to hold off for the manufacturer models given their often better cooling and pricing. I'd expect that the reference cards will have higher prices than manufacturer-made cards, for both competitive and manufacturing reasons (nVidia wouldn't want to undercut its board partners, for instance).
The 780 runs on GK110, as Titan does, and the new line of 700-series cards should all be equipped with Boost 2.0 (found on Titan); this is coupled with nVidia's push toward lower TDP, lower dBA, and greater customization and overvolting/overclocking (OV/OC) functionality. NVidia also noted that the GTX 780 should operate nearly 10dBA quieter than the 680 and 580, specifically hovering at around 45dBA on the reference card. Their Adaptive Temperature Controller (see: Titan) means a more predictable, less spiky fan curve. If there's a sudden drop in thermals due to decreased demand (like when loading a level), rather than fan speed immediately falling off a cliff in RPM, the controller will attempt to predict demand and adapt accordingly.
The reasoning for this is fairly straightforward: As noted in the press conference, the human ear picks up noise differentials with great accuracy, and it's generally considered optimal to have a more consistent volume (and pitch) than a constantly changing one. This is true even when the dBA and RPMs are dropping -- your ear will pick up that difference, so if the card drops from 55dBA to 45dBA, then jumps back up a minute later, that's probably going to be more annoying than remaining at 55dBA. We're able to tune constants out.
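One simple way to get that kind of less-spiky behavior is to smooth the thermal signal before mapping it to fan RPM -- for example, with an exponential moving average, so a brief load drop (like a loading screen) only pulls fan speed down gently. This is a hypothetical sketch of the general technique, not nVidia's actual controller logic:

```python
# Hypothetical "less spiky" fan controller: smooth the load signal
# with an exponential moving average before mapping it to RPM, so
# brief dips don't make the fan fall off a cliff and spin back up.
def smooth_fan_speeds(loads, alpha=0.1, min_rpm=1000, max_rpm=3000):
    rpms, ema = [], loads[0]
    for load in loads:
        ema += alpha * (load - ema)   # slow response to sudden changes
        rpms.append(min_rpm + (max_rpm - min_rpm) * ema)
    return rpms

# Full load, a brief five-sample dip to near-idle, then full load:
loads = [1.0] * 10 + [0.1] * 5 + [1.0] * 10
rpms = smooth_fan_speeds(loads)
# The dip only pulls RPM down a few hundred instead of ~1800:
print(round(min(rpms)))               # 2263
```

With a raw (unsmoothed) mapping, the same dip would drop the fan from 3000 to 1200 RPM and back -- exactly the audible yo-yo effect the paragraph above describes.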
Separately, more effort has been invested in SLI optimization this time around, with two GTX 780s in SLI consistently spitting out roughly 175%-180% the performance of a single 780 (which is about what you'd hope for). Multi-GPU arrays aren't a linear gain over their single base card, but if nVidia's internal benchmarks are to be trusted, this is a promising step toward real SLI optimization.
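Put another way, those quoted scaling numbers translate to roughly 87.5%-90% efficiency from the second card, against the 200% a perfectly linear pairing would deliver:

```python
# Scaling efficiency implied by the quoted 175%-180% SLI figures:
# combined output as a fraction of ideal linear (n * 100%) scaling.
def sli_efficiency(combined_pct, n_gpus=2):
    return combined_pct / (n_gpus * 100)

print(sli_efficiency(175))   # 0.875
print(sli_efficiency(180))   # 0.9
```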
NVidia posted a video of the GTX 780 unveil on YouTube.
- Steve "Lelldorianx" Burke.