Understanding Server CPU Specs
Servers are essential for so many business applications these days. If you rely on your servers for anything, then you tend to rely on them for everything.
When you’re shopping for servers (or looking to upgrade your hardware), you’re going to have to compare a lot of different specifications, and one of the most important components to analyze is the central processing unit (CPU). This is the core of the server that does all of the heavy lifting and “thinking” for the device. A great CPU runs a great server, and an inferior CPU leaves you frustrated.
So, the best way to compare servers is to understand CPUs. Below, you’ll find a breakdown of the essential metrics so you can make informed comparisons and find the right technology for your applications.
Multiple Processors
The first thing to understand about servers is that they aren’t like personal devices. Laptops, PCs, smartphones, and other things you might use in your daily life typically have a single processor. This one processor probably has multiple cores and threads (more on those concepts later), but there is one, centralized processing device.
Servers are different. The point of a server is that it can handle large volumes of data and processing, and it’s normal for a single server to have multiple processing units, all working together. You can then make a bunch of servers work together, multiplying your overall processing power.
When it comes to making purchasing decisions, here’s the bottom line: more processors are usually better, but also more expensive. In general, you want as many processors as you can get while staying within your budget. That maximizes your computational power and lets your server(s) keep up as your operation grows.
But there’s a catch. Not all processors are equal, and a server with more processing units can still have less overall processing capability. It comes down to the components within each unit, which is what the rest of this discussion covers.
Cores
You know how a server can have multiple processors that work together to expand the device’s total capabilities? Well, a couple of decades ago, engineers figured out how to apply this concept to a single processor.
Within one processing unit, you can have multiple processing cores. Each CPU core is capable of independently crunching numbers or carrying out programming tasks. That means a single processor can multitask if it has multiple cores.
On top of that, software engineers can design their code to take advantage of multiple cores. Basically, each core can handle a different part of the code at the same time, allowing for simultaneous processing within a single CPU.
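To make that concrete, here is a minimal sketch of how code can take advantage of multiple cores. The prime-counting function and the chunk sizes are hypothetical stand-ins for any CPU-heavy task; Python’s `concurrent.futures.ProcessPoolExecutor` is just one common way to spread such work across cores.

```python
import math
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) -- a CPU-heavy stand-in task."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, math.isqrt(n) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Split one big range into four chunks; each chunk can run on its own core.
    chunks = [(i * 25_000, (i + 1) * 25_000) for i in range(4)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        total = sum(pool.map(count_primes, chunks))
    print(total)
```

On a four-core machine, the four chunks can genuinely run at the same time; on a single core, they would have to take turns.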
The upshot is that multi-core processors are much faster than their single-core predecessors. In fact, virtually all modern processors are multi-core.
What you’re really comparing is the number of cores. Generally speaking, more cores means more processing power, just like more processing units means more power.
Again, there are other factors below that impact how your server performs, but on average, you want to get as many cores as you can while staying within your budget.
Threads
You’re about to notice a trend. More CPUs improve a server’s power. More cores improve a CPU’s processing power. And more CPU threads can improve a core’s power.
The same way that multiple cores can exist within a single processing unit, each core can actually handle multiple processing threads at a time.
This concept is a little more complicated than the ones above. In essence, a processing core can be designed to work on two instruction streams at once, a technique known as simultaneous multithreading (SMT), or Hyper-Threading on Intel chips. Some designs expand multithreading beyond this, but by and large, you’re looking at two threads per core.
A multithreaded core is a convergence of software and hardware design: a core can only use multithreading with software that supports it, and vice versa. To keep this from taking up your whole day, multithreading lets a single core keep its execution units busier by working on two streams of instructions, effectively allowing one core to act like two virtual cores. The real-world gain depends on the workload and usually falls short of a full doubling, but it all boils down to design efficiency.
In terms of shopping, the vast majority of modern CPU cores are designed for multithreading. In general, any server CPU is going to have twice as many threads as cores.
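The arithmetic behind spec sheets can be sketched in a few lines. This is a simple illustration (the socket and core counts are made-up examples); note that on a real machine, Python’s `os.cpu_count()` reports logical processors (threads), not physical cores.

```python
def total_threads(sockets, cores_per_cpu, threads_per_core=2):
    """Total hardware threads a server exposes to the operating system."""
    return sockets * cores_per_cpu * threads_per_core

# Example: a dual-socket server with 16-core CPUs and SMT (2 threads/core)
print(total_threads(2, 16))  # → 64

# Without SMT, the same server exposes only its physical cores:
print(total_threads(2, 16, threads_per_core=1))  # → 32
```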
If it’s so standard, why do you really need to know this?
There are two reasons.
First, it means that doubling the core count of a CPU also doubles its thread count, compounding the multitasking benefit. Going from 4 to 8 cores looks even better when you realize you’re jumping from 8 to 16 hardware threads.
Second, multithreading isn’t always better. Some software will deliberately disable multithreading because it runs better for specific applications. If you’re running such software, then increasing the number of threads in your CPU won’t do you much good — at least for that specific software.
Clock Speeds
Let’s recap. More CPUs are better. More cores are even better. And you usually get two threads per core, so thread count grows right along with core count.
Still, we have to cover clock speed. This is basically a measure of how fast a single core can carry out tasks. A higher clock speed translates to more processing power. So, all other things being equal, the processor with the higher clock speed is the better choice.
Unfortunately, all other things are often not equal. How would you choose between two CPUs if the one with the higher clock speed has fewer cores?
The answer is a bit complicated, and ultimately, you’re making trade-offs. In this example, the CPU with the higher clock speed can perform a single task better, but the CPU with more cores has substantially better multitasking capability.
So, what does your server do? If it handles communication for a large number of devices, more cores is probably better. If it’s crunching numbers for the math department at a university, you might actually favor the clock speed. If you’re primarily running software that doesn’t use multithreading, then clock speed is much more important to you.
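The trade-off above can be sketched with a very rough heuristic. The CPU specs here are hypothetical, and cores × clock is only a crude proxy; real performance also depends on instructions per cycle, cache, and memory bandwidth.

```python
def throughput_estimate(cores, clock_ghz):
    """Crude aggregate throughput proxy: cores x clock speed.
    Ignores IPC, cache, and memory bandwidth -- real workloads vary widely."""
    return cores * clock_ghz

cpu_a = throughput_estimate(cores=8, clock_ghz=3.8)   # fewer, faster cores
cpu_b = throughput_estimate(cores=16, clock_ghz=2.4)  # more, slower cores

# cpu_a wins on any single-threaded task (3.8 GHz vs 2.4 GHz per core),
# but cpu_b has the larger aggregate (38.4 vs 30.4 in this crude proxy).
print(cpu_a, cpu_b)
```

Which one is “better” depends entirely on whether your workload can actually spread itself across all 16 cores.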
You can see how this gets complicated, and the right choice isn’t always obvious.
A Simplified Bottom Line
Fortunately, there is a final specification that can help you make the right choice. For any processing unit, you should be able to find the maximum flop rate, measured in FLOPS (floating-point operations per second). This is a measure of how many calculations the device can perform per second, accounting for all of the cores working at the maximum clock speed at the same time.
If you want an apples-to-apples comparison of two CPUs with different clock speeds and varying numbers of cores, flop rate is your metric. A higher flop rate means more total processing power. So, if you don’t have applications that specifically favor more cores or a higher clock speed, then get the processor with the highest flop rate (that’s still within your budget). It’s the ultimate tie-breaker.
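For the curious, theoretical peak FLOPS is commonly estimated as sockets × cores × clock × floating-point operations per cycle. The numbers below are purely illustrative; the operations-per-cycle figure depends on the chip’s vector units (SIMD width, fused multiply-add), so check the vendor’s datasheet for a real CPU.

```python
def peak_gflops(sockets, cores, clock_ghz, flops_per_cycle):
    """Theoretical peak throughput in GFLOPS.
    flops_per_cycle depends on the vector units (SIMD width, FMA support)."""
    return sockets * cores * clock_ghz * flops_per_cycle

# Illustrative: dual-socket, 16 cores per CPU, 2.4 GHz, 16 FLOPs/cycle
print(peak_gflops(2, 16, 2.4, 16))  # → 1228.8 GFLOPS
```

Published peak figures are an upper bound; sustained real-world throughput is always lower, but the metric still works well as an apples-to-apples tie-breaker.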