The 5 biggest things to happen to smartphones since the Nexus 5
Five years ago today, Google and LG partnered to release one of the most cherished Android smartphones ever: the Nexus 5. Though not necessarily the best looking or the most popular handset, the Nexus 5 nonetheless captured something special.
Despite its oft-discussed flaws, this smartphone stole the hearts of many Android fans (indeed, created many fans) and is still used as a reference device even today.
During its birthday week, I’ve been thinking about how much it achieved, and how far smartphones have really come since its release at the end of 2013. Here are some of the biggest changes we’ve seen in the intervening years.
Camera technology
One of the easiest ways to tell if someone is using an old phone, even without seeing the handset, is through the photos it takes. Smartphone camera tech has transformed dramatically in the last five years, from the noise and blur of the Nexus 5 era (you can find some sample photos from its camera below) to the near-DSLR quality cameras with which smartphones are now equipped.
We’ve seen an average megapixel increase, from the 5MP-13MP main cameras common in 2013 to 20MP and up to 40MP in 2018. This has provided higher resolution photos, but it’s only a small part of the total improvements.
Modern smartphones are often equipped with multiple cameras on the front or rear, helping them reach new heights in zooming, low-light performance, and the now ubiquitous bokeh-effect shot (often called Portrait Mode).
None of these advances necessarily requires two or more cameras; the multi-lens approach simply provided a cost-effective means of achieving superior results, making it part of the core Android fabric as of 2018.
Interestingly, Google still uses a single rear camera on its most recent flagships, the Pixel 3 and Pixel 3 XL (take a look at the kinds of photos they can achieve below), just like in the Nexus 5 days.
How Google managed to stay competitive (if not become the very best) in smartphone photography with only one rear sensor relates to our second major smartphone development.
AI and machine learning
The Nexus 5 was powerful, intuitive, and launched the smartest OS Google had ever produced: Android KitKat. It was a pinnacle of Android usability, making everyday operation a pleasure. Fast-forward five years and we don’t even need to touch our smartphones to get things done.
When the Nexus 5 launched, Google had been investigating ways to assist users through Android. Google Now, Google’s card-based web service, could pull flight information from your emails or birthdays from your calendar to give you a heads-up about them. You could even use some simple voice instructions to make appointments and the like.
This would eventually evolve into Google Assistant and change the Android landscape.
Google Assistant is the AI with which you can have conversations. It can read the news to you, and soon it could even speak on your behalf (see the video above). It supports hundreds of commands to help with a wide variety of tasks, from finding vacation spots, to getting sports scores, to tuning a musical instrument.
In the last five years, AI hasn’t just improved how it responds to our specific queries, though. Now it can manage chipset processes, intelligently arrange our photographs, and even improve our smartphone cameras.
The Google Pixel 3’s main camera takes just about the best photos around using AI and machine learning. This has enabled it to take superb low-light shots, recommend photos you never took, and much more.
We took a deep dive into AI cameras recently to see how they compared, so if you want to learn more, don’t miss that coverage in our AI camera shootout.
Most Android OEMs now offer AI in some form and the implications are simultaneously more nebulous and more intriguing than the advances in other smartphone areas. You can always make a display crisper, or speakers louder, but what the future holds for AI is hard to imagine.
Fingerprint scanners
Fingerprint scanning was a smartphone fixture before the arrival of the Nexus 5, but it certainly wasn’t as common as it is now. Its widespread adoption has changed not only the way we interact with our phones but also how secure they are.
Unlocking a phone is now just a case of holding it in the correct position. This works so incredibly fast it often feels like I’ve barely reached the sensor before the display activates. In the past, we’d have to punch a few digits into an often inaccurate keypad or draw a pattern. Now we just hold a finger in place, and there you have it.
Not only is it convenient, but it’s harder to steal a fingerprint than a PIN or pattern (hello Kanye West), lowering the risk of someone nabbing your phone and operating it later. Fingerprint sensors can authorize payments too, which means spending even less time putting in credit card details and providing fewer chances for someone to see your digits.
Modern smartphones are fundamentally operated and secured differently than in 2013, and biometric identification will no doubt continue to play a part in the smartphone experience as it develops. We’ve already seen biometric security evolve in just a few years, with Face ID and in-display fingerprint sensors, and it’s unlikely to end here.
The headphone jack’s removal
Not all smartphone developments since the Nexus 5 have been for the better. We could argue for weeks about the introduction of the notch, but I’m sure most of you would agree one recent change to the smartphone landscape has been categorically terrible.
The headphone jack’s fate was sealed on Sept. 7, 2016, when Apple announced the iPhone 7 would not include a 3.5mm headphone port (cue the obligatory “I know Apple wasn’t the first” comments). Instead of using the headphone standard almost every pair of wired headphones came with, folks would have to use an adapter or dedicated headphones with a Lightning connector to listen to music on the iPhone.
Within a year, several Android OEMs had ditched the headphone jack too. The 3.5mm port has only become more scarce since.
At the time, Apple chalked this move up to making thinner smartphones and offering superior audio quality. Apple marketing chief Phil Schiller called it “courage.”
Reality check: phones slimmer than 3.5mm offer no real advantage. Reality check: the quality of a headphone’s drivers has a more tangible impact on the audio than its connector ($40 headphones with Lightning or USB Type-C connectivity won’t sound as good as $200 headphones with a 3.5mm jack), and Bluetooth can’t measure up either.
Reality check: You have been misled.
The removal of the headphone jack stands to increase Apple’s and others’ profits, but it has come at convenience’s expense. The 3.5mm port-toting Nexus 5 can still lord that over numerous 2018 flagships.
Smartphone prices
The retail value of technology products tends to decrease over time. The first cell phone cost a whopping $3,995; some years later, it cost less than dinner for two. The smartphone era, which generally began to heat up with the first iPhone’s release in 2007, also saw a high entry price fall in the following years.
By the time the Nexus 5 landed (costing $399), around halfway between the iPhone release and where we are at now, we’d already seen $100 smartphones — the particularly impressive first generation Moto E arrived just a few months after the Nexus 5 in 2014.
Since then, two significant, if predictable, things have happened: low-cost phones became even better and premium smartphone prices increased.
The iPhone 5S started at $650 in September 2013 and at the time it was one of the most expensive commercial phones on the market. Five years later, the most expensive smartphones on the market are reaching nearly double that.
The Huawei Mate 20 Pro starts at 1,049 euros (~$1,200). There aren’t many phones as expensive as the Mate 20 Pro, but even mainstream phones like the Sony Xperia XZ3 and Samsung Galaxy Note 9 start at $899 and $999 respectively (the Samsung Galaxy Note 3, released around the same time as the Nexus 5, cost around $724.99 off-contract).
Meanwhile, the mid-to-low tier sector started churning out even greater hardware and we now have phones like the Xiaomi Pocophone F1, which delivers flagship specs for around $300.
Unlike the headphone jack removal, both of these trends have been good for consumers. You can find better phones at lower prices than five years ago, and there are more options at the premium end for those eager to have the very best Android can offer.
There have been other big changes for smartphones in the last half-decade. Fast charging only narrowly missed this list (it’s come a long way and I wouldn’t buy a phone without it), while improvements in build quality and damage resistance mean smartphone screens don’t shatter like the Nexus 5’s did.
However, progress has been slow in other areas. Processing is faster, but not in a particularly meaningful way (the Snapdragon 800 was pretty nippy). Display quality is better, but not significantly (1080p was common then and still is).
When lighting up a phone’s display edges on an incoming call could be marketed as an innovative new feature, the jig was up: meaningful smartphone improvements have been floundering.
In the next five years, some life could be jolted back into the industry: folding smartphones, which we may even get a first look at next month, are almost here. If you think the handsets of today are all too similar to those of 2013, rest assured: the new breed will be unlike anything we’ve seen before.
What do you think has been the biggest smartphone industry change in the last five years? Give me your ideas in the comments below.