For years, the de facto standard for PC gaming and consoles was 1920x1080 – even if consoles occasionally struggled to reach it. For most of that time, 1080p monitors were the only practical choice for gaming, but the viability of 1440p-ready hardware in mid-range gaming PCs means the market for 1440p monitors has become more competitive. The 4K monitor market is growing similarly competitive, but unfortunately mid-range (and even higher-end) GPUs still struggle to run many modern games at 4K.

As 4K becomes more attainable for the average consumer, 2560x1440 monitors fit the needs of many gamers who want a higher resolution than 1080p while still being able to render – and display – 120+ FPS. With this in mind, we’ve created this buyer’s guide for the best 1440p gaming monitors presently on the market, accounting for price, refresh rate, and panel type. Since the primary use case for the monitors in this guide is gaming, we have primarily included G-Sync (covered here) and FreeSync (covered here and here) compatible monitors for users with nVidia and AMD GPUs, respectively.

“Ye-- ye cain't take pictures h-- here,” a Porky Pig-like voice meekly spoke up from behind the acrylic windshield of a golf cart that'd rolled up behind us, “y-ye cain't be takin' pictures! I'm bein' nice right now!”

Most folks in media production, YouTube or otherwise, have probably run into this. We do regularly. We wanted to shoot an Ask GN episode while in California, and decided to opt for one of the fountains in Fountain Valley as the backdrop. That's not allowed, apparently, because that's just how rare water is in the region – don't look at it the wrong way. It might evaporate. Or something.

But no big deal – we grab the bags and march off wordlessly, as always, because this sort of thing just happens that frequently while on the road.

Regardless, because Andrew was not imprisoned for sneaking a shot of the fountain into our video or taking two pretzel snacks on the plane, Ask GN 29 has now been published to the web. The questions from viewers and readers this week include a focus on “why reviewers re-use GPU benchmark results” (we don't – explained in the video), the scalers in monitors and what “handles stretching” for resolutions, pump lifespan and optimal voltage for AIOs, and theoretical impact from HBM on IGPs.

Despite AMD’s FreeSync arriving later than nVidia’s G-Sync, FreeSync has seen fairly widespread adoption, especially among gaming monitors. The latest monitor – and the 101st – to officially support FreeSync is Lenovo’s Y27f, which also happens to be Lenovo’s first FreeSync monitor.

For those interested in learning about FreeSync and G-Sync, check out our articles explaining G-Sync, explaining FreeSync, and comparing the two both technically and logically.

Ask GN episode 25 carries our content output while we travel, granting a brief reprieve from the unrelenting GPU reviews of late. As always, post questions on the YouTube video page for potential inclusion in the next Ask GN episode. Non-GPU questions would be especially appreciated to help break up the content!

For this episode, we're focusing on the question of Fast Sync vs. V-Sync, talking GPU binning, the impact of power supply selection on overclocking headroom, and more. The very last comment in the video addresses our RX 480 Endurance test – mostly the difficulty of crunching and presenting the volume of data we've collected.

Video and time stamps below:

This fifteenth episode of Ask GN brings a few quick-hitter questions, plus a couple that require greater depth than our episodic format can address. Those longer questions will be explored in more depth in future content pieces.

For today, we're looking at the future of Zen for AMD, forecasting HDR and monitor tech, discussing IGP and CPU performance gains, and talking thermals in laptops. As always, there's one bonus question at the end.

Timestamps are below the embedded video.

Stutter as a result of V-Sync (which was made to fix screen tearing – another problem) has been a consistent nuisance in PC gaming since its inception. We’ve talked about how screen-tearing and stutter interact here.

Despite the fact that FPS in games can fluctuate dramatically, monitors have long been stuck at a fixed refresh rate. Then nVidia’s G-Sync cropped up. G-Sync was the first technology to eliminate both stutter and screen-tearing on desktop PCs by synchronizing the display’s refresh to the GPU’s fluctuating framerate. Quickly after nVidia showed off G-Sync, AMD released its competing technology: FreeSync. G-Sync and FreeSync are the only adaptive refresh rate technologies currently available to consumers at large.
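
To make the stutter mechanism concrete, here is a minimal, hypothetical sketch – illustrative numbers only, not our test code – of how a fixed 60Hz display handles frames under V-Sync versus an adaptive-refresh display:

```python
import math

# Illustrative model only: a fixed 60 Hz display under V-Sync versus an
# adaptive-refresh (G-Sync/FreeSync-style) display. Numbers are hypothetical.
REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh at 60 Hz

# Hypothetical per-frame render times (ms); the third frame misses the budget.
render_times_ms = [15.0, 16.0, 19.0, 15.5, 16.2]

for t in render_times_ms:
    # V-Sync holds each frame until the next refresh boundary, so a frame that
    # runs even slightly long is displayed a whole refresh late (stutter).
    vsync_ms = math.ceil(t / REFRESH_MS) * REFRESH_MS
    # Adaptive sync refreshes when the frame is ready (within the panel's VRR range).
    print(f"render {t:5.1f} ms -> V-Sync shows it after {vsync_ms:5.1f} ms, adaptive after {t:5.1f} ms")
```

In this toy model, the single 19 ms frame doesn't appear until 33.3 ms after the previous one under V-Sync – roughly double its render time – while an adaptive-refresh display presents it as soon as it's done, which is exactly the judder these technologies exist to remove.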

We welcomed AMD's Scott Wasson on-camera at the company's Capsaicin event, where we also spoke to Roy Taylor about driver criticism and covered roadmap updates. Wasson was eager to discuss new display technology demonstrated at the event and highlighted a critical shift toward greater color depth and vibrancy. We saw early samples of HDR screens at CES, but the Capsaicin display was far more advanced.

But that's not all we spoke about. As a site which prides itself on testing frame delivery consistency – reported as our 1% and 0.1% low frametime metrics – it made perfect sense to speak with frametime testing pioneer Scott Wasson about the importance of this metric.

For the few unaware, Wasson founded The Tech Report and served as the site's Editor-in-Chief until January, when he departed and made the move to AMD. Wasson helped pioneer “frametime testing,” detailed in his “Inside the Second” article, which we'd strongly recommend reading.
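
As a rough illustration of how such a metric can be derived – a sketch of the general idea, not our exact pipeline – 1% and 0.1% lows take the slowest slice of recorded frametimes and express their average as FPS:

```python
# Rough sketch of the "1% low / 0.1% low" idea, assuming a list of per-frame
# render times in milliseconds. Illustrative only, not GN's exact methodology.
def low_fps(frametimes_ms, percent):
    """Average the slowest `percent` of frames and express the result as FPS."""
    slowest = sorted(frametimes_ms, reverse=True)            # worst frames first
    count = max(1, int(len(frametimes_ms) * percent / 100))  # at least one frame
    avg_ms = sum(slowest[:count]) / count
    return 1000 / avg_ms

frametimes = [16.7] * 990 + [33.3] * 10   # mostly smooth, with a few hitches
print(f"average FPS:  {1000 / (sum(frametimes) / len(frametimes)):.1f}")
print(f"1% low FPS:   {low_fps(frametimes, 1):.1f}")
print(f"0.1% low FPS: {low_fps(frametimes, 0.1):.1f}")
```

In this made-up sample, the average sits near 59 FPS while the 1% and 0.1% lows fall to 30 FPS – exactly the kind of hitching an average alone would hide.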

We recently proved the viability of UltraWide (21:9) monitors on an equally ultra-wide range of video cards, with performance benchmarking indicative of playability on lower-end hardware than might be expected. 21:9 displays have seen a resurgence of interest lately, with slowly dropping prices at the low end and increasing GPU performance.

We've lately used the ~$1300 Acer Predator X34 and ~$540 Samsung 29” Ultra-wide; following more hands-on gaming experience, it made sense to address a new question: Do UltraWide (21:9) monitors give an advantage in gaming? We'll be looking at this topic from two angles – the competitive and the immersive aspects – using a 2560x1080 and 3440x1440 set of UltraWide displays.

Monitors have undergone a revolution over the past few years. 1080p lost its luster as 1440p emerged – and later 4K, which still hasn't quite caught on – and that's to say nothing of the frequency battles. 144Hz has largely supplanted 120Hz as the sort-of “premium” refresh rate, and adaptive synchronization technologies G-Sync and FreeSync have further complicated the monitor buying decision.

But we think the most interesting recent trend has to do with aspect ratios. Modern displays don't really get called “widescreen” anymore; there's no point – they're all wide. Today, we've got “UltraWides,” much like we've got “Ultra HD” – and whatever else has the U-word thrown in front of it – and they're no gimmick. UltraWide displays run a 21:9 aspect ratio (21 units across for every 9 units down – think of it like run/rise), a noticeable difference from the 16:9 of normal widescreens. These UltraWide displays afford greater production capabilities by effectively serving the role of two side-by-side displays, just with no bezel; they also offer greater desk compatibility, more easily centered atop smaller real estate.

For gaming, the UltraWide argument is two-fold: Greater immersion with a wider, more “full” space, and greater peripheral vision in games which may benefit from a wider field of view. Increased pixel throughput more heavily saturates the pipeline, of course, which means that standard 1080p and 1440p benchmarks won't reflect the true video card requirements of a 3440x1440 UltraWide display. Today, we're benchmarking graphics card FPS on a 3440x1440 Acer Predator 34” UltraWide monitor. The UltraWide GPU performance benchmark includes the GTX 980 Ti, 980, 970, and 960 from nVidia and the R9 390X, 380X, 290X, and 285 from AMD.
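
For a rough sense of that extra load, the per-frame pixel counts can be compared directly (simple arithmetic, not benchmark data):

```python
# Back-of-the-envelope pixel counts per frame for common gaming resolutions.
resolutions = {
    "1920x1080 (16:9)": (1920, 1080),
    "2560x1080 (21:9)": (2560, 1080),
    "2560x1440 (16:9)": (2560, 1440),
    "3440x1440 (21:9)": (3440, 1440),
}
base = 1920 * 1080  # 1080p as the reference point
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP per frame, {pixels / base:.2f}x the pixels of 1080p")
```

At roughly 2.4x the pixels of 1080p and about 1.3x those of 2560x1440, 3440x1440 demands meaningfully more from the GPU – which is why these dedicated UltraWide benchmarks exist in the first place.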

AMD’s new Polaris architecture sat discreetly in the company’s CES 2016 suite, running Star Wars Battlefront with impressively low system power consumption. Quietly the GPU sat, running a completely new architecture and process. No fanfare, no bombastic marketing videos projected on the walls, no product unveil insanity.

The demo was simple: show two Intel i5 Haswell systems side-by-side, one with an nVidia GTX 950 ($160) and one with AMD’s undisclosed Polaris GPU. AMD locked framerate to 60FPS in the demo, showing both GPUs at a constant 60FPS on the X-Wing Survival map (singleplayer), and directed focus toward Kill-A-Watt wall meters. The wall meters showed total system power consumption in watts and, as one would expect from an AMD suite, the AMD-powered system drew less power overall.
