Do you need a Quad-core CPU?

Unless you’re a hardware enthusiast, you may have missed the fact that Intel significantly lowered the price on its quad-core CPUs this weekend. A quad-core CPU has four processing units on one chip (this first quad-core from Intel is actually two dual-core dies packaged together). With the price drop, a quad-core CPU can now be purchased for just a little more than a high-end dual-core. Four CPUs for the price of two. Sounds like a good deal, right?

Well, not so fast. Unlike clock speed increases, multiple cores don’t scale linearly. While a 2GHz CPU is twice as fast as the same CPU at 1GHz, a quad-core CPU is not four times as fast as a single core. Like most desktop technology, quad-core CPUs have migrated from the server market. Comparing server performance using round numbers, a dual-core CPU offers about a 50% performance increase over a single core (not 100%), and there are diminishing returns: a quad-core CPU is only about 25% faster than a dual-core CPU.
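
Put those round numbers together and the gap is stark: four cores end up at roughly 1.9 times the throughput of one core, nowhere near 4x. Here’s a quick back-of-the-envelope sketch in Java, illustrative only, using the rough estimates above rather than any real benchmark:

    // Back-of-the-envelope only: these multipliers are the rough round
    // numbers above, not benchmark results.
    public class CoreScaling {
        public static void main(String[] args) {
            double singleCore = 1.0;               // baseline throughput
            double dualCore = singleCore * 1.50;   // ~50% gain over one core
            double quadCore = dualCore * 1.25;     // ~25% gain over two cores

            System.out.printf("1 core:  %.2fx%n", singleCore);
            System.out.printf("2 cores: %.2fx (not 2x)%n", dualCore);
            System.out.printf("4 cores: %.2fx (not 4x)%n", quadCore);
        }
    }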

Running multiple cores is very complex. Performance varies depending on hardware configuration, software, etc. Those four CPUs are sharing bandwidth on the motherboard used to access RAM, hard drives, video, etc. In addition, very little software has been optimized to utilize four cores, and even less of it is mainstream, consumer desktop software; Photoshop and a handful of new games come to mind. It’s very difficult to write multi-threaded applications that utilize multiple cores, and developers are still learning. It will likely be many years before the majority of mainstream applications are able to use multiple cores. There is also added overhead associated with managing multiple cores in both the chip’s microarchitecture and the operating system.
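
To give a sense of what “utilizing multiple cores” asks of a developer, here is a minimal, purely illustrative Java sketch (not taken from any real application; the class and names are made up) of splitting one trivial job, summing an array, across a pool of worker threads sized to the machine’s core count. Even this simple case needs the work partitioned, handed out, and the partial results collected:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    // Minimal sketch: summing an array in parallel across all available cores.
    // Even this trivial task needs explicit partitioning and result collection.
    public class ParallelSum {
        public static void main(String[] args) throws Exception {
            long[] data = new long[10_000_000];
            for (int i = 0; i < data.length; i++) data[i] = i;

            int cores = Runtime.getRuntime().availableProcessors();
            ExecutorService pool = Executors.newFixedThreadPool(cores);

            int chunk = data.length / cores;
            List<Future<Long>> partials = new ArrayList<>();
            for (int c = 0; c < cores; c++) {
                final int start = c * chunk;
                final int end = (c == cores - 1) ? data.length : start + chunk;
                partials.add(pool.submit(() -> {       // each worker sums its own slice
                    long sum = 0;
                    for (int i = start; i < end; i++) sum += data[i];
                    return sum;
                }));
            }

            long total = 0;
            for (Future<Long> f : partials) total += f.get();  // wait for and combine the pieces
            pool.shutdown();
            System.out.println("Sum = " + total);
        }
    }

And that’s the easy part. The hard part is the shared data, locking, and subtle timing bugs that show up once threads start working on the same state.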

Operating systems like Windows, Linux, and OS X are multi-threaded, and support splitting applications between CPUs. While this makes sense for a dual-core system on the desktop, it would very rarely require four CPUs. Modern CPUs are able to multi-task common applications very easily; only heavy jobs like a virus scan or ripping a CD need a second CPU so the system can work on different tasks without slowing down. But how often are you going to be encoding a video, running a virus scan, unpacking compressed files, and surfing the Internet with a half dozen other applications open, all at the same time?
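
It’s worth noting that this kind of multitasking asks nothing special of the applications themselves; the operating system’s scheduler simply places independent threads and processes on whichever cores are free. A rough Java sketch (the “virus scan” and “CD rip” below are just stand-in busy loops, not real tools):

    // Two unrelated background jobs running at the same time. The OS scheduler
    // can place them on separate cores with no extra work from the applications.
    // The "virus scan" and "CD rip" are just stand-in busy loops.
    public class TwoTasks {
        static void busyWork(String name, long iterations) {
            double x = 0;
            for (long i = 0; i < iterations; i++) x += Math.sqrt(i);
            System.out.println(name + " finished (checksum " + x + ")");
        }

        public static void main(String[] args) throws InterruptedException {
            Thread scan = new Thread(() -> busyWork("virus scan", 200_000_000L));
            Thread rip  = new Thread(() -> busyWork("CD rip", 200_000_000L));
            scan.start();
            rip.start();
            scan.join();
            rip.join();
        }
    }

On a dual-core machine those two jobs overlap on separate cores; on a quad-core, the extra two cores mostly sit idle unless something else is running.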

Will you have a choice? Here is an example from the web server segment. If you want a web server, you can now get a quad-core for the same price as a dual-core from many vendors. However, due to its higher clock speed, the fastest web server for most applications is not a quad-core, it’s a dual-core Xeon 5160 (Woodcrest). Because of the TDP (thermal design power) of a quad-core CPU (Clovertown), it’s not clocked as high as the dual-core; the four cores simply generate too much heat. The fastest quad-core CPU available is clocked slower than the fastest dual-core CPU. Intel needs adoption of quad-cores in the server market. So what do they do? They’re discontinuing the faster dual-core Xeon 5160.

Ready or not, quad-core CPUs are coming to the desktop, and again Intel is pushing them with marketing and low prices. I’m guessing they’ll soon eliminate the faster dual-core chips, as they’ve done in the server market, and it may cost you. Instead of a dual-core system that would run your most-used applications faster, you may be forced into a quad-core with a lower clock speed and a higher system price. Quad-cores are hungry: from the amount and speed of RAM, to the size of the power supply, to motherboard performance, quad-core systems have higher requirements.

How did we get here? Intel started down this path a couple of years ago when they were unable to reach the clock speeds they had hoped for with the Pentium 4. With the marketing department unable to sell clock speed, a new feature was badly needed. Moore’s law, which meant CPUs were still doubling their transistor count every 24 months, provided the answer. CPUs had already added features for multimedia and huge on-die memory caches. Multiple cores provided another way to use those extra transistors, while taking the focus off of clock speed.

No doubt multi-core CPUs are the future; Intel has promised 80-core CPUs within five years. I’m just not sure we’re ready for four.