
Android Authority's Best of Android Awards: The evolution

In 2015, Android Authority created its signature awards series, the Best of Android. This is its evolving story.
July 25, 2020

In 2015, Android Authority created its signature awards series, the Best of Android. The awards have evolved over the years, both in scope and authority, into something the entire industry and our global audience look forward to. This page traces the history of the Best of Android awards and tracks the winners — and the evolution of our testing methodology — since the series began.



2015

We kicked off Best of Android with what we called the “ultimate smartphone comparison.” We had realized that, while most tech media was still trying to “eyeball” differences that were already beyond the abilities of human perception, we needed to adopt more objective means of measuring the gaps between the best of the best.

Our initial focus for in-depth objective testing was narrow — only flagship phones from the major manufacturers — but the award (and our process for getting there) was so well received that we decided to expand its scope in 2016. In its inaugural year, we also introduced our Reader’s Choice survey to crown the crowd favorite, with over 12,000 votes cast. We also awarded the best apps, innovations, wearables, and more.


2016

In 2016, we almost doubled the number of contenders and removed the “flagship” requirement for eligibility. The decision had quite the impact, with our overall Best of Android winner priced at just $439. We otherwise followed the same objective testing criteria to crown our overall winner and runners-up. This year our ultimate winner very nearly nabbed the Reader’s Choice award too, but was defeated by just 1% of the vote.


2017

In 2017, Android Authority moved all of its objective testing to a central lab in San Francisco to ensure absolutely no variables could creep into our Best of Android data. Our primary focus was ensuring our camera testing was both objective and identical for all our finalists, but we also offered a special reader’s choice round to identify the fan-favorite camera. In our new lab, we used a mix of custom and off-the-shelf third-party tools, equipment, and software.


2018

In 2018, we went big. We tested no fewer than 30 devices for Best of Android, completely redesigned our testing methodology from the ground up, and ran a massive 24-phone bracket-style competition over the course of several weeks to crown our Reader’s Choice winner.

In hindsight, that may have been a bit of overkill, with over 650,000 votes cast (more than many U.S. states in the 2018 election), but boy was it fun! Not only did the winner of our internal testing match the crowd favorite, but our top three positions were also identical.

One other thing to note: by 2018 we had discovered that some manufacturers were back to their old dirty tricks, detecting when benchmark apps were running to temporarily ramp up device performance. This was done with no regard for overheating or battery performance, in order to artificially inflate their device’s score.

We realized that relying on third-party benchmarking apps was problematic. So we started using stealth versions of these apps to maintain a level playing field, and called out the brands we caught cheating. Coming up with a more robust and secure solution to this problem is what later led Android Authority’s Gary Sims to create Speed Test G.


2019

Having perfected our objective testing methodology, in 2019 we began to confront the fact that there’s more to a phone than just raw performance. To address this, we added an Editor’s Choice award that allowed us to take into account the more intangible aspects of the consumer journey, including after-sales service, software update track record, and market availability.

We still crowned an overall winner based on raw performance, with a result that only served to prove how dominant low-cost Chinese manufacturers had become at building flagship-level products. That our winner wasn’t available in many markets underscored the need to start looking a bit more closely at non-objective data points.

2019 also saw our first mid-year edition of Best of Android, because many manufacturers release two flagships per year, and our audience always wants to know what the best phone is at any given time.


2020

2020 was Best of Android’s swansong. Since its inception in 2015, our site and range of coverage had simply expanded far too much to comfortably fit within the confines of the “Best of Android” name any longer. In its final year, Best of Android 2020 crowned an overall winner — the Samsung Galaxy S20 FE — and a Reader’s Choice winner — the Samsung Galaxy Note 20 Ultra.

We also held a mid-year BoA for those looking for the latest and greatest from the first half of the year. The OnePlus 8 Pro scooped both the Editor’s Choice and Reader’s Choice awards for the mid-year event.


Starting in 2020, Android Authority made a more definitive shift in its testing methodology. Rather than have a raw performance winner and an Editor’s Choice winner as in 2019, we combined the two awards, tempering the objective data with a swathe of intangible factors that greatly affect the overall value proposition of a product. Due to the inherent problems involved in relying on third-party software, we also replaced third-party benchmarking apps with a custom version of Speed Test G.

Check out the video versions for more.