Is Apple crazy enough to design its own application processor?
Every chip inside every iOS device to date has been “custom” designed by Apple. By that we mean the company has licensed application processor cores from ARM, graphics processors from Imagination Technologies, and then arranged those and various other components onto a chip that would then be manufactured by Samsung.
Last week, when Apple announced the iPhone 5, they said that their latest creation, the A6, was at the heart of it. They also said the A6 was twice as fast as the A5 inside the iPhone 4S. Most people thought that meant the company just doubled the clock rate (the MHz) of the A5 and simply called it the A6. Doing so wouldn’t have hurt them, since the A6 is known to be a 32 nanometer chip, whereas the A5 is a 45 nanometer chip. Smaller transistors translate to less heat, which translates to headroom for faster speeds.
A second camp, this writer included, thought Apple switched to ARM’s next generation application processor, the Cortex A15. The Apple A5 chip uses a pair of ARM Cortex A9 cores, so one obvious way to make it faster would be to swap out those cores for “faster” ones. In this case, “faster” means an ARM Cortex A15 can do more work per clock cycle than a Cortex A9.
Now it’s been discovered that Apple isn’t overclocking, and they’re not using ARM’s Cortex A15 either. Instead, they’re designing their own application processor. The only other company we can think of that does this is Qualcomm, who designs the “Krait” application processor, the “Adreno” graphics processor, and then slaps both of those onto a chip that’s marketed as the “Snapdragon”.
We know there’s a lot of technical jargon in this article, so here’s what we want you to take away from this piece: Apple has become even more vertically integrated than they’ve been in the past. Whereas before they were “designing” their own chips, now they’re designing the cores inside those chips.
To put it another way: it would be like HP saying they’re tired of using Intel’s chips, so they’re going to start designing their own. From scratch.
This isn’t something to be taken lightly.