The first half of next year should get very interesting for mobile devices. We’re expecting the Qualcomm S4, NVIDIA Kal-El, Texas Instruments OMAP 4470, and Samsung’s new Exynos 4212 chips to arrive then. But even if these chips all arrive in the first half of 2012, we don’t know the exact dates, and that might make a difference.
For example, if one chip is 30% better than another but arrives in June while the other arrives in January, the difference might not mean much, because the January chip will probably have an upgraded version by June, too, which might equal or even surpass the original “better” chip. So keep that in mind while we compare these chips. Timing is very important here.
Qualcomm had a great 2010 because of the Nexus One, which made its Scorpion-based Snapdragon chip very popular. But 2011 wasn’t that great for Qualcomm, at least in terms of mindshare. Its market share actually expanded in 2011 thanks to the momentum it built in 2010, but my point is that its chips were no longer the best of the best, in either CPU or GPU performance.
That looks like it might change a bit, at least in the first half of 2012, though mainly when compared to other dual-core processors. NVIDIA should still provide very strong competition in terms of overall performance.
The new S4 is based on a completely new architecture called Krait (a jump more like the one from Cortex A9 to Cortex A15 than the one from Cortex A8 to Cortex A9), and it is also manufactured at the smallest process node so far for ARM chips: 28 nm.
This would allow chips of the same performance to be smaller and more power efficient, although I don’t think they’ll go that route too much. They’ll probably keep power consumption and chip size more or less the same and increase performance instead. Apple has already started a trend toward bigger chips: the A5 is about twice as large as a Tegra 2 and has a huge GPU on die, for example, which is why its GPU was so much faster than the others.
But, it’s very possible that Qualcomm’s initial dual-core, 1.5-GHz Krait S4 chip will compare favorably with other high-end ARM chips in the market. However, unless the difference is significant, it might not be noticeable because of all the other factors that influence power consumption (e.g., display type and size, battery size, background software, etc.).
One other way the S4 manages to save power is by running its two cores asynchronously. This means the cores don’t have to run at the same clock speed all the time, and each can execute instructions independently. This is Qualcomm’s way of competing with NVIDIA’s “companion” core and the Cortex-M3 cores in Texas Instruments’ OMAP 4.
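To make the asynchronous-clocking idea concrete, here is a minimal sketch. It assumes the common rough model that dynamic power scales with roughly the cube of frequency (since voltage tends to scale with frequency); the workload numbers are made up for illustration, not Qualcomm’s figures.

```python
def power(freq_ghz):
    """Rough dynamic-power model: P ~ f^3 (arbitrary units)."""
    return freq_ghz ** 3

# Hypothetical workload: core 0 is busy (needs 1.5 GHz),
# core 1 is nearly idle (0.4 GHz is enough).
demand = [1.5, 0.4]

# Synchronous cores: both must run at the highest demanded clock.
sync_power = sum(power(max(demand)) for _ in demand)

# Asynchronous cores: each runs at its own clock.
async_power = sum(power(f) for f in demand)

print(f"synchronous: {sync_power:.2f}, asynchronous: {async_power:.2f}")
```

Under these made-up numbers, the asynchronous design burns roughly half the power for the same work, which is the intuition behind Qualcomm’s approach.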
Regarding performance, the Krait architecture gets about 3.3 DMIPS/MHz, as opposed to Scorpion’s 2.1 DMIPS/MHz. This is why Qualcomm says the S4 offers about 60% higher performance than older Snapdragon chips: a dual-core, 1.5-GHz S4 should be about 60% faster than the dual-core, 1.5-GHz Snapdragon chips we’re starting to see in new phones right now.
The difference is a little smaller compared to Cortex A9 chips, which get 2.5 DMIPS/MHz. That means the Krait S4 will only be about 30% faster than a Cortex A9 at the same clock frequency, but that is still a significant improvement.
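The percentages above fall straight out of arithmetic on the per-MHz figures. A quick sketch (the DMIPS/MHz numbers are the ones quoted in this article; everything else is just multiplication):

```python
def dmips(dmips_per_mhz, clock_mhz, cores=1):
    """Aggregate Dhrystone MIPS for a chip at a given clock."""
    return dmips_per_mhz * clock_mhz * cores

krait    = dmips(3.3, 1500, cores=2)  # dual-core, 1.5-GHz Krait S4
scorpion = dmips(2.1, 1500, cores=2)  # dual-core, 1.5-GHz Scorpion Snapdragon
a9       = dmips(2.5, 1500, cores=2)  # dual-core, 1.5-GHz Cortex A9

print(f"Krait vs. Scorpion: +{(krait / scorpion - 1) * 100:.0f}%")  # ~+57%
print(f"Krait vs. Cortex A9: +{(krait / a9 - 1) * 100:.0f}%")       # +32%
# Put differently, a 1.5-GHz Krait is roughly a 2.0-GHz Cortex A9:
print(f"A9-equivalent clock: {1500 * 3.3 / 2.5:.0f} MHz")           # 1980 MHz
```

The last line is also why, later in this article, a 1.5-GHz Krait gets described as behaving “more like a dual-core, 2.0-GHz Cortex A9.”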
Plus, Krait is getting the fully out-of-order execution that Cortex A9 has been benefiting from for a while now and that Scorpion has been lacking. Out-of-order execution means the CPU can execute instructions as soon as their inputs are ready, rather than strictly in program order. This hides some of the delay from slow instructions, so it makes the CPU perform better.
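A toy scheduler shows the effect. In this sketch (entirely hypothetical machine: one instruction issued per cycle, loads taking three cycles), the in-order core stalls behind a slow load while the out-of-order core issues independent work in the meantime:

```python
# Each instruction: (name, source operands, result latency in cycles).
PROGRAM = [
    ("load r1", [], 3),           # slow memory load
    ("add r2",  ["load r1"], 1),  # depends on the load
    ("mul r3",  [], 1),           # independent work
    ("mul r4",  [], 1),           # independent work
]

def run(program, out_of_order):
    """Issue one instruction per cycle; return the cycle the last result is ready."""
    ready_at = {}             # name -> cycle its result becomes available
    pending = list(program)
    cycle = 0
    while pending:
        cycle += 1
        issuable = [i for i in pending
                    if all(ready_at.get(s, float("inf")) <= cycle for s in i[1])]
        if out_of_order:
            pick = issuable[0] if issuable else None   # oldest *ready* instruction
        else:
            # in-order: may only issue the oldest pending instruction
            pick = pending[0] if pending[0] in issuable else None
        if pick:
            name, _, latency = pick
            ready_at[name] = cycle + latency
            pending.remove(pick)
    return max(ready_at.values())

print(run(PROGRAM, out_of_order=False))  # 7 cycles: stalls behind the load
print(run(PROGRAM, out_of_order=True))   # 5 cycles: the muls fill the stall
```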
As a side note, Intel’s Atom is still in-order, which won’t change until 2013, and both Krait and Cortex A15 architectures should be more powerful than Intel Atom’s architecture. So, in 2012, we should see ARM chips that not only outperform Atom but also use less power. This is why I think Atom has no chance of catching up with ARM chips and why I found Google’s alliance with Intel and Google’s support for Intel Atom rather strange.
The “initial” GPU of the S4 will be the Adreno 225, which seems to be getting the smallest upgrade yet: a 50% increase in performance over the Adreno 220, whereas the Adreno 220 and Adreno 205 each doubled the performance of their respective predecessors. The Adreno 225 will probably only match the “current” Mali GPU in the Samsung Galaxy S II.
Personally, I would only start to get excited about the S4 once it gets the Adreno 300 GPU, which Qualcomm claims will deliver graphics similar to the PlayStation 3 or Xbox 360. That sounds a bit like hyperbole, but we’ll still probably see at least a twofold improvement over the Adreno 225, plus whatever advantages the new Adreno 300 architecture brings. The only problem is that the Adreno 300 will probably not arrive until the second half of 2012; when it does, it should be pretty competitive with the other GPUs out there.
NVIDIA should have had a nice head start by now with Tegra 2, but because of Honeycomb delays and other manufacturer-related delays, Tegra 2 devices arrived on the market much later than everyone expected. As a result, only one or two months later we saw similar dual-core chips from competitors that were already surpassing Tegra 2 in many areas, including CPU, GPU, and video playback performance.
It looks like NVIDIA’s next chip, Kal-El, is suffering from the same kind of delays, but hopefully this time it won’t be as bad. Last time, Tegra 2 arrived so late that NVIDIA even had to cancel its mid-life kicker, the Tegra 2 3D (a dual-core, 1.2-GHz chip). But now we might actually get to see the Kal-El+ chip, Kal-El’s mid-life kicker until Wayne arrives. This is something everyone is doing: Qualcomm with its dual-core, 1.5-GHz Scorpion chip; Samsung with the Exynos 4212; and Texas Instruments with the OMAP 4470.
NVIDIA also does something differently from the others. It releases a more powerful tablet chip first, and then about 3-4 months later, it releases the more-optimized, lower-performance smartphone chip. NVIDIA intended to do this with Tegra 2 and Tegra 2 3D, as well, but because of all the delays, Tegra 2 smartphones and tablets arrived more or less at the same time.
NVIDIA’s intended schedule looked more like this: Kal-El for tablets in August, Kal-El for smartphones in November, Kal-El+ for tablets in February 2012, Kal-El+ for smartphones in May 2012, Wayne for tablets in August 2012, and so on. As it stands, that whole schedule has been pushed back by a few months, but the structure should remain more or less the same.
The original Kal-El was supposed to be a quad-core, 1.5-GHz chip. NVIDIA recently announced that it has added a 500-MHz “companion core,” which is still a full Cortex A9 core. This companion core is meant to handle most of the easy, low-performance tasks and keep the more powerful cores idle until they’re needed for more intensive tasks, like browsing, gaming, and more advanced apps. Going by the DMIPS benchmark, this 500-MHz core scores about 1250 DMIPS, which is as much as a 1.0-GHz ARM11 chip would score, to put things in perspective.
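That 1250-DMIPS figure follows directly from the per-MHz ratings. A quick check, assuming the roughly 1.25 DMIPS/MHz usually quoted for ARM11 (that ARM11 figure is my assumption, not from this article):

```python
# Companion core: Cortex A9 at 2.5 DMIPS/MHz, clocked at 500 MHz.
companion_core = 2.5 * 500

# Comparison point: ARM11 at ~1.25 DMIPS/MHz, clocked at 1.0 GHz.
arm11_1ghz = 1.25 * 1000

print(companion_core, arm11_1ghz)  # 1250.0 1250.0 -- the same ballpark
```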
Now, I’ve also seen some blogs mentioning that the other four cores now run at 1.3 GHz each. Maybe NVIDIA gave up on 1.5 GHz per core; if you think about it, it’s already pretty impressive that NVIDIA managed to fit five cores in there, at the same or almost the same clock speed, while the others are still working with two.
But it’s also possible that 1.3 GHz per core is for smartphones, which would make sense if NVIDIA plans to release both chips at the same time (because of the delay); that is, quad-core, 1.5 GHz for tablets and quad-core, 1.3 GHz for smartphones.
The companion core is also a much more efficient core than an ARM11, so if it ends up being used, say, 50% of the time, it could save a lot of battery life. So far, I think it’s the best power-saving method employed by any chip maker, better even than Qualcomm’s asynchronous 28-nm cores or the OMAP 4470’s Cortex-M3 cores. But so far, this is all theory. Kal-El is still made at 40 nm, and while NVIDIA claims it uses less energy than competitors’ current chips at maximum performance, it remains to be seen whether that’s true in practice.
Kal-El’s GPU should be up to three times faster than Tegra 2’s GPU, which would be a much-needed improvement. But the Mali-400 in the current Exynos is already twice as fast as Tegra 2’s GPU, and the Mali GPU in the upcoming Exynos 4212 will get a 50% improvement, which should make it about equal to Kal-El’s GPU.
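Normalizing everything to Tegra 2 makes that comparison explicit. These are just the ratios quoted above, not benchmark results:

```python
# Relative GPU performance, normalized to Tegra 2 = 1.0,
# using the vendor-quoted ratios from this article.
tegra2      = 1.0
kal_el      = 3.0 * tegra2    # "up to three times faster" than Tegra 2
mali_400    = 2.0 * tegra2    # current Exynos (Galaxy S II), twice as fast
exynos_4212 = mali_400 * 1.5  # +50% over the current Mali-400

print(kal_el, exynos_4212)    # 3.0 3.0 -- roughly a tie on paper
```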
That doesn’t necessarily mean games won’t look better on Kal-El devices. NVIDIA tends to focus more on geometry and physics. Plus, Kal-El will benefit from four CPU cores, which should be equal to or slightly weaker than the Exynos’s cores (if clocked at 1.3 GHz), and games could take advantage of that, too.
It seems Texas Instruments will manage to remain competitive in the first half of 2012 with its OMAP 4470, and the biggest reason is that the company is finally moving on from the old, repeatedly overclocked PowerVR SGX540 to the single-core PowerVR SGX544 MP1. What’s interesting is that Android 4.0 Ice Cream Sandwich is being optimized for the OMAP 4 platform, so even though rumors said the OMAP division was up for sale, this should keep it in business a while longer and might even help it stage a comeback.
The OMAP 4470 will get a CPU boost of 20% compared to the OMAP 4460. This will get interesting if it ships at the same time as the Exynos 4212 (dual-core, 1.5 GHz) and Kal-El (1.3 GHz, or even 1.5 GHz). If it arrives near mid year, it might not be so interesting anymore, as even a dual-core, 1.5-GHz Krait S4 should surpass it in performance, due to the latter’s 30% improvement over Cortex A9 (more like a dual-core, 2.0-GHz Cortex A9).
Unless NVIDIA’s quad-core actually uses more power than dual-core, 1.8-GHz chips, the OMAP 4470 and its two Cortex-M3 cores should compare unfavorably with Kal-El in power consumption, because Kal-El’s companion core will handle a lot more low-end tasks, while the OMAP 4470 will be forced to use its high-clock-frequency cores more often. But this depends a lot on implementation and how the device is used. The Cortex-M3 cores will handle 2D animations, touch interaction, and other low-end multimedia tasks.
Texas Instruments is moving away from the PowerVR SGX540 and adopting the newer PowerVR SGX544, which is essentially the same as the SGX543 in Apple’s A5 but adds DirectX 9 support. It has only one core, though, as opposed to the A5’s two, and we don’t yet know whether it’s overclocked (it probably is), just as the SGX540 that originally shipped in the Galaxy S was overclocked by 50% when it was used in the OMAP 4430.
Texas Instruments promises a 2.5x GPU improvement in the OMAP 4470 over the OMAP 4430. If that’s true, its performance should be similar to the Exynos 4212’s GPU and Kal-El’s GPU. But we don’t know exactly when it will ship (probably sometime in spring).
Samsung has surprisingly become one of the very best ARM chip makers out there, in many cases besting long-time players such as Texas Instruments and Qualcomm, and that’s not even counting others such as ST-Ericsson, Freescale, and Marvell.
Samsung tends to offer the most cutting-edge Cortex A9 chips with the best GPUs available in its latest flagships, along with its latest Super AMOLED displays. That’s a killer combination, and neither NVIDIA, Qualcomm, nor Texas Instruments should treat it lightly.
The good news for Samsung’s competitors is that Samsung doesn’t always use its own chips inside its phones. I don’t know whether that’s because it can’t manufacture enough chips itself or because it simply wants to offer more choices to its customers, but my guess is the former.
The “bad” news here is that Samsung’s CPU is only a dual-core, 1.5-GHz Cortex A9. But, as I said in the beginning, that’s not such bad news if it arrives much earlier than the others, and this chip should arrive very soon: either very early in 2012 or even at the end of 2011.
By the time Qualcomm releases its dual-core, 1.5-GHz Krait S4, Samsung could already be launching a dual-core, 2.0-GHz Exynos chip, or even a 2.5-GHz one soon after the S4. As I said, timing matters a lot, too.
The power consumption of the dual-core, 1.5-GHz Exynos 4212 should be the same as or better than that of the previous dual-core, 1.2-GHz Exynos 4210. That’s because it’s made at 32 nm instead of 45 nm, so it can increase performance at the same power consumption. Plus, I’m sure Samsung has optimized it further, just as NVIDIA must have optimized its Cortex A9 cores.
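As a first-order sanity check on that claim: dynamic power scales roughly as C·V²·f, and a shrink from 45 nm to 32 nm cuts switched capacitance substantially. The specific capacitance, voltage, and clock numbers below are illustrative assumptions, not Samsung’s figures:

```python
def dynamic_power(cap, volts, freq_ghz):
    """First-order dynamic power model: P ~ C * V^2 * f (arbitrary units)."""
    return cap * volts**2 * freq_ghz

# Assume capacitance scales with feature size and voltage stays the same.
exynos_4210 = dynamic_power(cap=1.0,     volts=1.1, freq_ghz=1.2)  # 45 nm
exynos_4212 = dynamic_power(cap=32 / 45, volts=1.1, freq_ghz=1.5)  # 32 nm

print(exynos_4212 < exynos_4210)  # higher clock at comparable (even lower) power
```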
Samsung says its new GPU will bring over 50% increase in performance, after the previous Mali 400 was already about twice as fast as other GPUs on the market. As mentioned above, that should put it somewhere around Kal-El’s GPU performance. Hopefully, Samsung’s latest success with the Galaxy S II will also get developers to optimize their games well on this GPU.
One thing I really like about the Mali GPU, and a reason I want it to succeed, is that it’s more “open”: just as open as the Cortex A9 CPU. Anyone can license it from ARM (Mali is made by ARM) and modify it, while NVIDIA’s GPU and Qualcomm’s Adreno can’t be licensed.
What, then, is the best chip of them all? Here’s how I see it. I would definitely get a Kal-El tablet over anything else out there. You want your tablet to be as powerful as possible, because you’ll be browsing the Web a lot more than on a phone. You’ll also be playing advanced 3D games more often, and you’ll need all the performance you can get.
So, unless you want to save US$100 or US$200 and get the cheapest tablet you can find with a dual-core processor, get a Kal-El-based tablet (possibly the ASUS Transformer 2, to benefit from a keyboard, too). It should also be the best chip at multitasking, thanks to all its cores.
On a phone, you won’t be multitasking or browsing as much, so you’ll want the maximum per-core performance possible, because you’ll be running single-threaded applications more often. Plus, even though NVIDIA says its quad-core processor uses less power at maximum performance, we can’t know for sure right now. Could it really have made a five-core chip at 40 nm that uses less power than a dual-core 28-nm or 32-nm processor? We’ll see. But I wouldn’t put too much stock in it until we see it in practice and see how well that companion core manages to save power.
So, until we see all that, you’d probably want to go with an Exynos 4212 or an OMAP 4470 if you care a lot about gaming, or a Qualcomm S4 if you don’t care as much about maximum gaming performance but still want low power consumption and strong single-threaded performance in a smartphone.
It’s not about the dates, it’s about the cycles. Besides, dates don’t even matter in the US: we got our Galaxy S II’s when Samsung was already planning the Galaxy S III.
It’s about real-world performance. Saying gaming performance is equal to the Xbox 360/PS3 is just talk. Wait to see who will prove it, then buy that one. LOL!
I wish the competition would help speed up the development of 5-core mobile chips…
Does anyone know if you can play NVIDIA Tegra 2 games on the 1.5-GHz Galaxy S II?
Yes, you can, but you have to download Chainfire from the Android Market and get the plugins.
That’s right, timing is everything. They need to bring out a chip faster than the A5 ASAP, before Apple steals Android’s market. They also need to unify a graphics standard so that there are no problems with games; to do that, it’s best to separate the GPU from the CPU. That way the CPU makers can worry about how to unify multithreading techniques, and the GPU can adopt a standardized graphics API, the way DirectX is for Windows, eliminating the need for special games or apps that work on one phone and not another. This is what is killing the growth of good software development and the availability of titles. Without a larger catalog of software in the Android Market than iOS devices have, they will never stand a chance of beating Apple’s devices, no matter how much better the hardware is; software is the key to winning. This is why Apple computers have not won over Windows just yet.
couldn’t have said it better myself
I really hope the Exynos 4212 chip comes in a carrier variant of the Galaxy Nexus.
Like how T-Mobile’s SGSII came with a snapdragon chip instead of the Exynos 4210.
Even though Android 4.0 is “Optimized” for OMAP, I would still greatly prefer it to the 4460 and SGX540.
You need to revisit your source material relating to in-order vs out-of-order execution in CPUs.
I had read several sources that compare the Adreno 220 favorably to the GPU in the Galaxy S II.
Samsung recently announced the Exynos 5250, a dual-core chip with two Cortex A15 cores. We’ll probably see a quad-core version soon. Samsung is going to murder NVIDIA if that happens.
NVIDIA had better watch out for the quad-core Exynos 4412 with the Mali-T604 GPU. The Exynos may take the crown.
No. The Exynos 4412 has the Mali-400; the Exynos 5210 has the Mali-T604.
I get the feeling Qualcomm may make a comeback this year. Its Adreno 225 GPU is said to compete with the PowerVR SGX543MP2 and maybe the SGX544.
The Tegra 3 benchmark was worse than the A5 in both performance and power. I don’t see how a quad-core A9 @ 1 GHz can compete with the 2x Krait @ 1.5 GHz.
Qualcomm Snapdragon S4 [asynchronous dual-core 1.5 GHz] (Adreno 300), NVIDIA Kal-El (1.3-GHz quad-core + companion – 45nm), Texas Instruments OMAP 4470 (PowerVR SGX544), Samsung Exynos 4212 (Mali-400 – open), Apple A5 (NA)
I suggest writing a single paper comparing each processor’s features (like an infographic)
THANKS very much for the info (just what i was searching for) ^_^
Samsung Exynos is the best AP!
Look at the picture of the Samsung Exynos 5. We know Samsung has made a sample chip!
And the Samsung Exynos 5 will include the quad-core ARM Mali-T604 GPU.
It’s sad that most people who voted probably never read the white papers from each core maker. I can easily say that of all these CPUs, the S4 is the most advanced in everything; the only thing it really won’t be able to touch is the GPU performance of the other top GPU makers (and it will still manage PS2/Xbox-level graphics). All that aside, just the fact that it’s 28 nm and has integrated LTE should speak for itself… finally, an LTE phone capable of staying alive for more than a day…
Well, I have to say, quite a readable article. I agree with you on all but one point: Intel. To say that Intel is not competitive with the others is really “dangerous”; it’s like irritating a snake with a bare foot. Yes, the x86 platform has a reputation that it can’t really compete with the ARM architecture in terms of power consumption, but Intel’s recent advances refute this, and I’m confident that in a fairly short time we’ll see Intel solutions that prove who’s the daddy. That time is yet to come, but it’s not far off; Intel has some aces up its sleeve, and the partnership with Google is just one of them. Looking to the future, Intel’s Atom platform seems very promising.
It’s a perfect opportunity to realize that device computing (both CPU & GPU) is moving fast forward and cloud computing with all the routing processing involved is a great buzz and overall energy waster and where’s the compelling use case anyway? Cloud storage is great for some things but then again, we had that for a looong time so what’s new about all this Cloud fuss really? My life didn’t improve much using iTunes (I do get the use cases of data storage on this).
I don’t like Qualcomm. They are a giant PITA when it comes to GPU drivers for projects. While NVIDIA’s Tegra GPU is not as open as Mali, at least it’s pretty easy to get help if you need a module built for a specific environment. Mali is, of course, very nice in every regard, and I wish it the very best!
I think you are missing a low-end contender in this article, though (at least now that it is released): the Rockchip RK3066. With a nice dual-core A9 it is certainly NOT a Krait contender, but at $15 per SoC it isn’t meant to be. The Mali-400 MP (4 cores) should nip at the heels of the SGX543 (with two cores).
In my mind it might be one of the most interesting things to happen in 2012 for embedded devices, as it can give us sub-$250 10″ tablets with IPS screens and decent battery life, with “enough” CPU power for “anyone” and enough GPU power for anyone.
The S4 is the best for now; the HTC One S uses it to beat the Tegra 3 and Exynos quad cores in most of the benchmarks. Have you ever heard of a dual-core processor that beats quad cores?
Yup, the Xperia T beats the hell out of the S3.
Qualcomm Snapdragon S4 Pro APQ8064 vs. Tegra 3 AP37 vs. A5: which is faster? Please do a new article to see who is king of the CPU/GPU as of Oct 2012. Thanks!