An Overclocking Primer: The Basics of Overclocking

By Steve Burke | Published September 20, 2012 at 8:56 pm

System building is an exercise in both education and enthusiasm -- as the core principles of system assembly become second nature, the enthusiast approach to commanding more power from hardware is the next fitting evolution. Overclocking is a part of this process.


In this primer to overclocking, we'll provide the fundamentals of overclocking, discuss the principles behind it, and cover what, exactly, overclocking is and what it nets you (and supply a brief tutorial). This is the ultimate in top-level guides, so those who have a firm grasp of the basics may not find any new information herein; with that said, the goal of this primer is to provide a solid foundation for your future overclocking exploits. You will not find CPU-specific OC advice in this guide, but can certainly ask for it below or in our forums!

Let's dive in.


What is Overclocking?

Overclocking means luring more kick out of your components - primarily the CPU, RAM, or GPU - by amping up multipliers, base clock rates (BCLK), voltage, and other aspects of the targeted components. Overclocking can effectively create or unlock a better product with nothing more than a bit of time and some keystrokes. Some CPUs ship with locked settings (such as the capped multipliers in non-K series Intel CPUs), but they can normally still be pushed at least slightly beyond stock, even if it requires third-party software or a motherboard with unlocking features. The reasoning for "locked" components is explained below.
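To illustrate the multiplier/BCLK relationship mentioned above, here's a minimal sketch of the arithmetic. The clock values and multipliers below are hypothetical examples for illustration, not recommendations for any specific CPU:

```python
# Effective CPU frequency = base clock (BCLK) x multiplier.
# The numbers below are illustrative, not tuning advice.
def cpu_frequency_mhz(bclk_mhz, multiplier):
    return bclk_mhz * multiplier

stock = cpu_frequency_mhz(100, 34)        # 100MHz BCLK x 34 = 3400MHz (3.4GHz)
overclocked = cpu_frequency_mhz(100, 42)  # raising the multiplier to 42 = 4200MHz (4.2GHz)

print(stock, overclocked)
```

This is why unlocked multipliers matter: on a locked chip, you're limited to nudging the base clock, which also affects other subsystems tied to it; on an unlocked chip, the multiplier gives you a cleaner lever.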

Why is Overclocking Even Possible?

To achieve the best yield-per-wafer (read about the silicon die process here), chip manufacturers determine the most stable speeds of the chip currently being fabricated. To use memory as an example, Kingston previously explained its extensive burn-in process, which bins memory chips by their highest stable speeds (1600MHz, 1866MHz, etc.). Fewer chips in this process bin out as natively higher-speed modules, because reliability decreases at higher frequencies. As the price of a chip is largely dictated by the die yield per wafer (more dies successfully sliced per wafer equates to more supply and lower prices), companies aim to output more dies at 'safer' specs. By doing this, companies opt for an overall more reliable product at the expense of speeds that are quite possibly lower than their fullest potential. That's where you come in as an overclocker - unveiling the full potential of your chip.

A silicon wafer! This is where your CPU dies are sliced.

It's all about manufacturing toward the lowest-common (stable) denominator, and ensuring the largest percentage of stable products for the end-user.

This is why overclocking can happen: Chances are, the silicon dies behind your CPU, memory, or GPU have sizable levels of overhead in their performance output, but until you unlock that overhead and push the chip to its highest stable capabilities, this overhead goes untapped.

It's highly likely that the very chip you're operating on right now is capable of higher performance levels, but was binned-out to the lowest stable specification determined by the factory.

If you want the most headroom for overclocking, be sure to look into overclocking-ready or unlocked CPUs (Intel uses the "K" suffix to denote unlocked multipliers; AMD's 'Black Edition' CPUs have traditionally had great flexibility, although we've found that even AMD's entry-level CPUs have reasonable OC capabilities).

When is overclocking useful? Why should I overclock my PC?

The most important question in the entire process is whether or not overclocking is "worth it" to you. Hopefully, if only because it's fun to play with hardware, the answer is an easy 'yes,' no matter its practicality.

Every ounce counts when performing intensive tasks (transcoding/rendering featured here).

There are many practical uses for overclocked components, however fun it may be to simply play with hardware, so let's go over a few of those:

Faster Processing: This is particularly important for anyone utilizing multithreaded applications (compilers, encoders, renderers). Some programs, such as those that render edited videos, don't suffer from the same utilization limitations that the common game does; where a game can only use so much of a CPU's offerings (as opposed to video hardware), rendering applications will take as much CPU and memory as they can get. The faster the hardware, in these instances, the faster encoding tools can transcode or render video output. It is likely that the limiting factor will be a magnetic hard disk drive, but that's a different subject. Video editing suites like Adobe Premiere will also make use of video hardware for its number-crunching capabilities, but it never hurts to amp up the usable clock speed of any hardware handling intense applications.

Increased Usable Life: As components near the end of their usable life for your purposes, there's not much that can be done to prevent their inevitable disuse or replacement. Overclocking won't counter the impending requirement for replacements, but it can fend off those looming expenses for just slightly longer. This is under the assumption, of course, that other significant hardware advances (which have undoubtedly been made by the time you're desperate to extend usable life) are not required in daily use. DirectX is an example on the GPU end -- no matter how far you push your GPU, no matter how many aftermarket heatsinks you solder to the chip, it's not going to be able to run iterations of DirectX that it does not natively support.

With this said, a good CPU can last years upon years. Once things start to slow down (or once you have a desire for more), slap a sturdy, re-usable aftermarket cooler onto the processor and amp up the speeds until satisfied. Please keep in mind that over-volting your CPU can result in the opposite of increased usable life -- an early demise.

Is overclocking useful for gaming?

This is a tough question to answer -- the viability of overclocking as a route to better gaming performance largely depends on two key factors: the game's ability to actually make use of the added speed, and which component you've overclocked.

A GPU that is pumped-up a bit through EVGA Precision or MSI Afterburner may help push your graphics settings from "medium-high" to "high," if you've been sitting on the razor's edge between the two. Turning a 2.8GHz CPU into a 4GHz CPU will not produce linear improvements in gaming quality for most games.
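As a back-of-the-envelope sketch of why GPU overclocks tend to pay off in GPU-bound games while CPU overclocks don't scale linearly, consider the best case: if a game is completely limited by the GPU, frame rate can scale at most proportionally with the GPU clock. The FPS and clock figures here are hypothetical:

```python
# Rough upper-bound estimate of FPS gain from a GPU core clock bump,
# valid only if the game is fully GPU-bound. Numbers are hypothetical.
def estimated_fps(baseline_fps, stock_clock_mhz, oc_clock_mhz):
    return baseline_fps * (oc_clock_mhz / stock_clock_mhz)

# A 10% core overclock lifts a 55 FPS baseline to ~60.5 FPS at best.
print(estimated_fps(55, 1000, 1100))
```

Real gains are usually smaller, since memory bandwidth, CPU, or engine limits cap the benefit -- which is exactly why a 2.8GHz-to-4GHz CPU jump rarely translates into a proportional frame rate improvement.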

Some games, like Civilization and the Total War series, are more CPU-intensive than their peers, and so it is likely that bolstered CPU performance will yield a more noticeable enhancement than in, for instance, TERA (as one of our users noted) or Battlefield (both of which favor the GPU more than many other games, from our testing).

As lame as it may be, the answer here is "it depends." It depends on the game, on the hardware involved, the age of the hardware, whether or not newer technologies (shader technologies, DirectX, caching systems, filtering protocols, etc.) are required and present, and many other factors. Feel free to post on our forums for in-depth support on a per-game basis!

Continue on to Page 2 for our overclocking videos and for the "how-to overclock your CPU or GPU" section.


Last modified on September 20, 2012 at 8:56 pm
Steve Burke

Steve started GamersNexus back when it was just a cool name, and now it's grown into an expansive website with an overwhelming amount of features. He recalls his first difficult decision with GN's direction: "I didn't know whether or not I wanted 'Gamers' to have a possessive apostrophe -- I mean, grammatically it should, but I didn't like it in the name. It was ugly. I also had people who were typing apostrophes into the address bar - sigh. It made sense to just leave it as 'Gamers.'"

First world problems, Steve. First world problems.

