The new MacBook Air is a complete refresh of the product line, except for the CPU. What’s up with that? The analysts at Jon Peddie Research weigh in.
Today I noticed that the uber-cool new MacBook Air models introduced last week use the same generation of CPU as the older models. On the face of it, that seems a bit odd, since a newer generation of CPU is available and Apple refreshed everything else about the Air.
The CPU in the older MacBook Air is an Intel Core 2 Duo; the CPU in the new line is the same model. Compare that with the graphics unit: the old Airs used the Nvidia GeForce 9400M, while the new Airs use the Nvidia GeForce 320M. As we reported last week, Apple claims the new Air’s graphics performance is more than twice that of the old. (See “New MacBook Air: Light and Lighter from Apple.”)
Nvidia told a reporter earlier this year that the 320M was made especially for Apple’s MacBook Pro line. But why should the Air, more of an accessory than a power user’s computer, sport the same GPU?
Wanting to separate the silicon forest from the trees, I turned to the rest of the team here at Jon Peddie Research for some insight. Weighing in were Jon Peddie, Alex Herrera, Steve Wolfe, and Ted Pollak. The consensus JPR view: Apple is a clever shopper. It has decided that Nvidia offers a superior graphics experience with lower power consumption, and it has mixed and matched components to strike a balance of price and performance.
Without turning the rest of this article into a back-and-forth volley of “he said / he said,” I’ll summarize my colleagues’ thoughts as if speaking with one voice.
The JPR view
There has been a long-running controversy about where computer makers can get the most bang for the buck in silicon: the CPU or the GPU. The CPU is typically the single most expensive component in a computer. When Intel introduces a new CPU, it doesn’t stop production of previous generations; it just sells them for less. Thus it becomes possible for new computers to come out with older CPUs.
The consensus is that CPU speed increases have topped out, and that the move to multi-core has brought little benefit to applications, because most software vendors haven’t rewritten their code to exploit multiple cores. At the same time, graphics processing units (GPUs) keep getting better, delivering more performance with each generation. Because on-screen performance is where buyers judge the computing experience, anything that improves it pays off and helps sell new computers. GPUs, it seems, are delivering more buyer pleasure these days than CPUs.
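To make the multi-core point concrete, here’s a minimal sketch in Python (purely illustrative; the function name and workload are my own, not anything Apple or the software vendors ship). The same CPU-bound job is run serially, the way most applications of the day ran, and then split across four worker processes, which is the kind of rewrite required before extra cores pay off:

```python
import time
from multiprocessing import Pool

def busy_sum(n):
    # A CPU-bound job: sum of squares computed in pure Python.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    N = 2_000_000

    # Serial: four jobs run one after another on a single core,
    # which is how unmodified single-threaded software behaves
    # no matter how many cores the CPU offers.
    t0 = time.time()
    serial = [busy_sum(N) for _ in range(4)]
    print(f"serial:   {time.time() - t0:.2f}s")

    # Parallel: the same four jobs spread across four worker
    # processes. Only code restructured like this sees a
    # meaningful speedup from a multi-core CPU.
    t0 = time.time()
    with Pool(4) as pool:
        parallel = pool.map(busy_sum, [N] * 4)
    print(f"parallel: {time.time() - t0:.2f}s")
```

On a multi-core machine the parallel run finishes in a fraction of the serial time; on a single core it doesn’t, which is exactly why unrewritten software leaves the extra cores idle.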
So when Intel tried to sell a low-performance integrated graphics processor (IGP) alongside its new CPUs, Apple, ever sensitive to the user experience, decided (as Sony, HP, and others have) to shift the balance of the silicon budget in favor of the GPU. It bought a slightly less-than-top CPU, rejected Intel’s IGP, and chose Nvidia’s GPU/chipset, the 320M.
Apple’s decision makes sense when you consider the options it faced. Intel’s 32nm Westmere-generation Arrandale processor packages an integrated GPU alongside its two CPU cores. So if you want Arrandale, you either use Intel’s integrated graphics, or you add a third-party discrete graphics processor and waste the money and silicon spent on the integrated GPU. The other choice is Intel’s Clarksfield CPU, the quad-core Core i7, but its power consumption is 45 watts or higher, clearly a poor fit for a lightweight Air designed for long battery life.
Apple makes a living differentiating its products from Windows-based PCs, so even if its most general-purpose notebooks were to adopt Arrandale and its integrated graphics, Apple would still want to make its mark with superior graphics. And given that Apple doesn’t want Arrandale’s GPU, it might as well stick with the less expensive previous-generation Core 2 Duo. After all, Arrandale is essentially a dual-core Nehalem shrunk to 32nm with a package-integrated GPU thrown in.
Apple is not so much dissing the Westmere CPU; more specifically, it thinks Westmere’s integrated GPU sucks badly enough that a discrete graphics unit is still required. By spec’ing the Core 2 Duo, Apple still gets essentially the same dual-core CPU experience it would have had with Arrandale, and it saves enough money to upgrade the user experience with the superior Nvidia GPU.