Testing how well a smartphone performs is no small source of controversy. People normally look to benchmarks for the answer, but the truth is there's no one right way to do it, so we have to pick and choose what we're going to focus on to contextualize our results.
How we test
When we test a phone's performance, we load up a stock backup on the phone, then run each of our target benchmarks three to five times and log the average. This way, unexpected outliers don't make it into the published results, and the score you read reflects typical performance.
While that may seem a little too straightforward, there's really nothing else to it. The software does the heavy lifting.
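The averaging step above can be sketched in a few lines. Here `run_benchmark` is a hypothetical stand-in for launching one of the benchmark apps and reading back its score; the numbers are invented for illustration:

```python
import statistics

def average_score(run_benchmark, runs=5):
    """Run a benchmark several times and return the mean score,
    so a single outlier run can't skew the published number."""
    scores = [run_benchmark() for _ in range(runs)]
    return statistics.mean(scores)

# Hypothetical scores from five back-to-back runs of the same test.
fake_runs = iter([1520, 1498, 1510, 1505, 1492])
print(average_score(lambda: next(fake_runs)))  # 1505
```

In practice the per-run spread is what justifies the repetition: if one run thermal-throttles or a background task fires, its score gets diluted rather than published.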
One benchmark won’t cut it
Most benchmarks only look at one aspect of performance, which might seem like enough on its own. But people tend to forget that smartphones are made up of lots of parts: storage, RAM, a processor, an integrated GPU, and so on. Because of this, we chose benchmarks that cover just about everything we could want to learn about each device.
By collecting a constellation of results instead of a single overall score, we can identify the strengths and weaknesses of each phone. While most modern units are far superior to anything from past years, and performance consequently matters less as a differentiator than it used to, these results still help people understand what tradeoffs they're making when choosing a phone.
Benchmarks that try to measure everything at once tend not to do any one thing well, so our approach sidesteps the errors introduced by jumbling all the scores into one number. While our final scores do combine results, there's a big difference between summing gross scores and averaging normalized ones. We collect and analyze results from:
- GFXBench (T-Rex)
- GFXBench (Manhattan)
- Basemark OS
- Geekbench (single core)
- Geekbench (multi core)
- Geekbench (single core stealth)
- Geekbench (multi core stealth)
- 3DMark (Slingshot Extreme)
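To illustrate the gross-versus-normalized distinction, here's a toy sketch. The raw scores, the baseline device, and the equal weighting are all invented for illustration; this is not our actual scoring pipeline:

```python
def normalize(scores, baseline):
    """Scale each benchmark result against a baseline device so that
    tests with big raw numbers can't dominate the combined score."""
    return {name: score / baseline[name] for name, score in scores.items()}

# Hypothetical raw results for one phone and a baseline device.
phone = {"GFXBench T-Rex": 3400, "Geekbench multi": 9800}
baseline = {"GFXBench T-Rex": 3000, "Geekbench multi": 10000}

# Gross total: Geekbench's sheer magnitude swamps the GPU result.
gross_total = sum(phone.values())  # 13200

# Normalized total: each test contributes on equal footing,
# expressed as a multiple of the baseline's performance.
normalized_total = sum(normalize(phone, baseline).values()) / len(phone)
print(round(normalized_total, 3))  # 1.057
```

In the gross total, a few hundred points of GPU improvement vanish next to Geekbench's five-digit scale; after normalizing, both tests move the final number equally.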
Sometimes the numbers don’t matter
As we’ve seen before, benchmarks can sometimes be gamed. While companies have differing strategies and reasons for manipulating performance, it’s not always actually “cheating.” Sometimes it’s simply something a company does in certain situations to free up resources for common tasks that need a little more juice. Other times, it’s straight-up tomfoolery.
To catch this, we’ve partnered with friends in the benchmark industry to defeat straight-up software gaming. This way, we can see whether the results are genuinely representative or not. For our CPU tests, we’ll compare only the results we’ve confirmed to be accurate.
If you’re worried about your phone not scoring as well as others, don’t panic. Smartphones have come an unbelievably long way since their inception, and the problems of 2013 just don’t exist in today’s world. Even the “worst” phones are still pretty damn good. At a certain point, the benchmarks are just a number.