NVIDIA and AMD constantly go tit-for-tat on GPU + game bundles, each attempting to add that extra bit of value to sway buyers of new cards. How much that affects buying decisions is questionable, but I won't fight free games.
Anyway, we're keeping this one a lot quicker than previous AMD press conference write-ups. Here's the slide you care about:
In late 2013, AMD came out with their new GPU series that included the R9 290 and R9 290X, both of which ran quite loudly and at 95C. This has led third-party vendors to release a plethora of custom coolers for the two cards, one of the most unexpected being VisionTek's CryoVenom, a liquid-cooled 290 for $600.
Almost immediately after our press conference with nVidia concluded -- the one directly challenging the validity of Mantle -- AMD contacted us for a discussion on their R9 295X2 video card. The R9 295X2 has been teased for quite a while in traditional AMD marketing fashion, namely through extremely flattering photos sent to some of the major tech outlets. As of last week's call, we were able to get the full Radeon R9 295X2 specs, including TDP, fab process, memory & buses, and details on the difficulty with PSU support.
Let's get right to it with this one.
Day one of GTC saw the presentation of nVidia's refreshed lineup of VisualFX SDK tools, including GameWorks, FaceWorks, HairWorks, WaterWorks, and other *Works software. These software development kits aid game development, including the optimization of graphically-intensive dynamic elements like fur, fire, and smoke. Graphics and CPU technology companies are often behind many of the game industry's visual advancements, but as PC gamers, we don't see much of that work in-game for several years. Development time is part of the delay, adoption is another part, and consoles are partly responsible as well.
Let’s look at some of nVidia’s more recent advancements in character faces, real-time smoke and fire effects, pre-baked lighting effects, subsurface scattering, deep surface scattering, and fur/hair technology. It also seems pertinent to recommend watching Epic’s Unreal Engine tech demo, since it utilizes many of these technologies and render methods; you can read our full article on UE4 here.
In a somewhat tricksy move today, AMD hosted a press conference a couple of miles down the road from nVidia’s active GTC event. In yesterday’s keynote, nVidia CEO Jen-Hsun Huang introduced the new Titan Z video card, Pascal architecture, machine learning, and other upcoming GPU technologies. Now, less than 24 hours later, AMD has invited us by to look at their new high-end workstation solution – the FirePro W9100 GPU.
The presentation was pretty quick compared to what we got with nVidia, but the primary focus was on computationally-intensive OpenCL tasks, real-time color correction and editing playback in full 4K resolution, and “enabling content creation.”
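For a concrete sense of what "computationally-intensive OpenCL tasks" means in a color pipeline, here's a minimal sketch of a lift/gamma/gain correction pass over a single 4K frame -- our own illustrative example (the kernel and parameter names are ours, not AMD's), assuming a working OpenCL driver and the pyopencl package:

```python
# Illustrative sketch only -- not AMD's code. A lift/gamma/gain color
# correction over one 4K RGB frame: the kind of embarrassingly-parallel,
# per-sample work that OpenCL-capable GPUs chew through at playback speed.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

KERNEL = """
__kernel void lift_gamma_gain(__global const float *src,
                              __global float *dst,
                              const float lift,
                              const float gamma,
                              const float gain)
{
    int i = get_global_id(0);                   /* one work-item per sample */
    float v = clamp(src[i] * gain + lift, 0.0f, 1.0f);
    dst[i] = pow(v, 1.0f / gamma);              /* gamma adjustment */
}
"""
prg = cl.Program(ctx, KERNEL).build()

# One 4K (3840x2160) RGB frame as normalized floats; random data stands in
# for real footage here.
frame = np.random.rand(3840 * 2160 * 3).astype(np.float32)
mf = cl.mem_flags
src = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=frame)
dst = cl.Buffer(ctx, mf.WRITE_ONLY, frame.nbytes)

# ~25 million independent work-items dispatched in a single launch; every
# sample is corrected in parallel, which is what makes real-time 4K grading
# a natural GPU workload.
prg.lift_gamma_gain(queue, frame.shape, None, src, dst,
                    np.float32(0.02), np.float32(1.1), np.float32(1.05))

out = np.empty_like(frame)
cl.enqueue_copy(queue, out, dst)
```

Each sample is independent of its neighbors, so the whole frame goes out as one massively-parallel dispatch -- that property, more than any single spec, is why AMD pitches cards like the W9100 at real-time 4K editing work.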
Let’s start with the obvious.
NVIDIA's keynote today opened with Linkin Park's appropriately-loud "A Light That Never Comes." We’ve been at nVidia’s GTC since yesterday, but today marked the official kick-off of the GPU Technology Conference. NVidia CEO Jen-Hsun Huang hosted today’s keynote – part of nVidia’s annual hype parade – to discuss advancements in GPU technology, SDKs for gaming applications, machine learning, PCIe bandwidth, and the new architecture to follow Maxwell (Pascal). We’ll primarily focus on the GPU technology and Pascal here – I have a full-feature article pending publication that covers all of the SDK information.
The big news is the announcement of nVidia's Pascal architecture, the successor to Maxwell, along with Titan Z specifications and NVLink.
Last week we visited San Francisco for GDC, the Game Developers Conference, where we interviewed AMD (content forthcoming), Intel about Devil’s Canyon, the EverQuest Next team, Clockwork Empires devs, Dreamfall Chapters devs, and more. Content is still pending publication for many of these interviews (thanks to our hotel internet bottlenecking video uploads), but it’s time to start switching gears for GTC – the GPU Technology Conference, hosted by nVidia.
There's been a lot of discussion about Titanfall's performance lately. Our most recent Titanfall GPU performance benchmark showed that the game still exhibits serious issues on certain devices: nVidia cards showed severe stuttering, SLI exhibits micro-stuttering and works better disabled, and the game is simply needlessly large. With all of this taken into account, the performance issues feel almost unjustified for the visuals -- the game looks fine, sure, but it's not melt-your-GPU graphics and certainly isn't spectacular to look at. It's another Source Engine game with adequate graphics. And I'm not saying that's a bad thing, so please don't get me wrong -- just that the performance isn't perfectly tuned, at least not yet. More drivers and patches will smooth that out.
I don't want to come off as too harsh, though. The mechanics are enjoyable for certain types of players and the game overall seems 'good'; it's just experiencing some (now-standard) launch issues with PC optimization. All is survivable.
NVidia started its press release off with some overly-marketed, infomercial-esque questions, but got to the point quickly: Daylight, the new psychological thriller (we previewed this) running on Unreal Engine 4, will be included with purchases of the Titan / Titan Black, GTX 780 Ti, 780, 770, 760, 690, 680, 670, 660 Ti, and GTX 660. NVidia says the game will activate on April 8th.
Titanfall's official launch brings us back to the topic of video card performance in the Source Engine-based game. When we originally benchmarked how various video cards performed in Titanfall, we clearly noted that the pre-release state of the game and lack of official driver support likely contributed to SLI micro-stuttering, CrossFire's catastrophic failure, and overall odd performance. We're now back with a full report using the latest beta drivers (with Titanfall profiles and support) and the full version of the game.
In this Titanfall PC video card benchmark, we look at the FPS of the GTX 760, GTX 650 Ti Boost, GTX 750, R9 270X, R7 260X, 7850, the A10-5800K 7660D APU, and Intel's HD4000. I threw a GTX 580 in there for fun. Our thanks to MSI for providing the 750, 260X, and 270X for these tests.