The State of Android & Audio

by: Robert Triggs, August 8, 2014

Over the past several years, mobile has become the go-to platform for most people’s media consumption. From audio playback to movie streaming, there is a growing amount of content available in your pocket and on your tablet, and the market is still expanding.

Today we are seeing a move towards high-end 3D gaming environments, live music aids, and even home studio audio software suites designed to work on mobile devices and tablets. However, Android sadly has not been at the forefront of this growing market; that position is firmly held by Apple.

Particularly in a creative capacity, tablets are quickly replacing laptops for music creation and live performance uses. Not to mention that there’s a whole market for digital effects, which can be purchased at much lower costs than traditional analogue equipment.

Line 6 Amplifi

Line 6’s latest digital effects amplifier is designed entirely around a mobile interface, but Android support is nowhere in sight.

However, the migration towards more digital content demands higher levels of processing power from a platform limited by small batteries and thermal constraints. Android owners pride themselves on having some of the best hardware on the market, so why is it that Android seems so far behind its rival when it comes to audio applications?

A little about audio processing


Our mobile phones are more than powerful enough for simple playback tasks. However, as processing power has increased, we have also begun to demand more signal processing from our mobile devices, and a lot more of it in real time, too.

We may take it for granted, but even when playing a game, every sound file has to be hauled from memory and converted from binary data into sample values before being pushed to a DAC, all of which takes up valuable clock cycles. Additional post-processing, such as passing the audio through your optimized EQ settings or supplementing the sound with extra reverb, takes up even more time, and modern applications are becoming increasingly complex.
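To make that per-sample cost concrete, here is a minimal Python sketch. It is purely illustrative, not Android's actual audio path: the function name, gain, and filter coefficient are all made up for the example. It applies a gain and a simple one-pole smoothing filter (a crude stand-in for EQ-style post-processing) to a block of samples, one sample at a time:

```python
import math

def process_block(samples, gain=0.8, alpha=0.2):
    """Apply a gain and a one-pole low-pass filter, one sample at a time."""
    out = []
    y = 0.0
    for x in samples:
        # Each output sample requires a handful of multiplies and adds;
        # every extra effect in the chain multiplies this work.
        y = alpha * (x * gain) + (1.0 - alpha) * y
        out.append(y)
    return out

# A 1 kHz sine at a 48 kHz sample rate; 480 samples = 10 ms of audio.
block = [math.sin(2 * math.pi * 1000 * n / 48000) for n in range(480)]
filtered = process_block(block)
```

Even this trivial chain touches every sample individually, which is why stacking effects quickly eats into the CPU's time budget.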

Although modern mobile processors long ago surpassed the multiple-GHz mark and can match high-end PC equipment in core count, these simple figures are not all that matters when it comes to digital signal processing. Different processor designs complete the same task in a different number of clock cycles, making some CPUs faster than others at identical workloads. This is why direct GHz and core-count comparisons don’t always apply across designs.

The Beat Suite Recording Studio

Android might not be able to compete with expensive studio grade hardware, but a lot can be accomplished on a tight processing budget, if you know where to optimize.

With real-time audio, it is essential to be able to process floating point data (digital numbers with a decimal point) and SIMD (single instruction, multiple data) instructions quickly, preferably within the short window between samples, which arrive at 44,100 or 48,000 samples per second (44.1 or 48 kHz) in most audio applications. Floating point units, mathematical coprocessors commonly found in CPU core designs, are used to perform mathematical operations on the digital audio signal with high levels of accuracy.
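That time budget is easy to quantify. The sketch below (plain Python; the 256-frame buffer is just an illustrative size, not a figure from the article) computes the per-sample period and the deadline for filling a buffer before the hardware runs dry:

```python
def sample_period_us(rate_hz):
    """Microseconds available per sample at a given sample rate."""
    return 1_000_000 / rate_hz

def buffer_deadline_ms(frames, rate_hz):
    """Milliseconds the CPU has to fill a buffer of `frames` samples."""
    return frames * 1000 / rate_hz

print(round(sample_period_us(48000), 2))        # 20.83 µs per sample at 48 kHz
print(round(buffer_deadline_ms(256, 48000), 2)) # a 256-frame buffer: ~5.33 ms
```

All of the decoding, mixing, and effects work has to fit inside that few-millisecond window, every time, or the listener hears a glitch.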

Multiple cores are not so important for audio -- instead brute speed is the key.

Multiple cores are not so important for audio, as most DSP algorithms are not optimized for multiple threads; instead, brute speed is the key. The limitations of mobile processors, in this regard, can be found in their narrower memory bus bandwidth and smaller CPU caches compared with beefier desktop-grade CPUs. This can mean that your mobile CPU actually ends up spending more time waiting for data than it does processing it.

An example of one of the more demanding audio processing tasks, if not the most demanding, is time stretching, where the tempo/speed of an audio sample is altered without the pitch/frequency trade-offs that come from changing a sample’s wavelength. In this technique, the audio is digitized, a Fast Fourier Transform algorithm extracts the frequency information from the sound, and that information is used to correct/restore the frequencies as the sample is stretched or shrunk in the time domain.

fast fourier transform example

Fast Fourier Transform is the process of extracting specific frequency information from a more complex waveform, and is highly CPU intensive.

Sounds pretty complicated, right? This type of process puts a huge strain on the CPU, which can result in unacceptable latency. Only a handful of FFT implementations can run this type of process efficiently on mobile devices.
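To illustrate the core idea, here is a minimal pure-Python radix-2 FFT that recovers the frequency of a test tone. This is a textbook sketch only; real mobile implementations rely on heavily optimized native libraries, and the 1024 Hz sample rate is chosen purely so that each FFT bin is exactly 1 Hz wide:

```python
import cmath
import math

def fft(x):
    """Radix-2 Cooley-Tukey FFT; input length must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])  # recursively transform even-indexed samples
    odd = fft(x[1::2])   # ...and odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * math.pi * k / n) * odd[k]  # twiddle factor
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

rate = 1024  # assumed sample rate (Hz), so bins are 1 Hz apart
n = 1024
tone = [math.sin(2 * math.pi * 100 * i / rate) for i in range(n)]  # 100 Hz sine
spectrum = fft(tone)
magnitudes = [abs(c) for c in spectrum[: n // 2]]
peak_bin = magnitudes.index(max(magnitudes))
print(peak_bin)  # 100: the FFT recovers the tone's frequency
```

Even this toy transform performs thousands of complex multiplications per block; a time stretcher has to do this, modify the result, and invert it, continuously and in real time.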

The maximum latency in any real-time system should ideally not exceed 20ms, which is roughly the perceptual limit of delay in humans. Any longer and our brains will notice the difference between sound coming in and out of a system, or between a button press and something happening on screen. Unfortunately, typical Android latency lies in the region of 100 to 250 milliseconds.
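A rough model shows why buffering dominates these figures. The Python sketch below is illustrative only: the buffer sizes are hypothetical stand-ins (small, iOS-style buffers versus the much larger ones often blamed for Android's numbers), not measured values from any device:

```python
def round_trip_ms(frames_in, frames_out, rate_hz, processing_ms=0.0):
    """Estimate round-trip latency: input buffering + processing + output buffering."""
    buffering_ms = (frames_in + frames_out) * 1000 / rate_hz
    return buffering_ms + processing_ms

# Small buffers: 2 x 128 frames at 48 kHz
print(round(round_trip_ms(128, 128, 48000), 2))    # 5.33 ms
# Large buffers: 2 x 2048 frames at 48 kHz
print(round(round_trip_ms(2048, 2048, 48000), 2))  # 85.33 ms
```

With buffers that large, the round trip blows past the 20 ms perceptual limit before any processing time is even counted.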

In a bid to increase performance and work around some of these shortcomings, mobile SoC developers, like Qualcomm, have started including their own dedicated DSP hardware alongside their main processors.


ARM has long included floating point units in nearly all of its core designs, excluding the Cortex-M3 and below, and supports extra Digital Signal Processing and SIMD extensions in its mobile processors.

ARM’s SIMD extension and NEON engine are particularly important for these types of scenarios

This DSP processing capability is aimed at keeping power consumption down while offering the maximum performance available, up to 75 percent higher than can be achieved without the extensions. ARM’s tools are used for a range of common mobile applications, from monitoring sensors to voice recognition, VOIP, and audio encode/decode.

ARM’s SIMD extension and NEON engine, found in the commonplace ARMv7 architecture, are particularly important for the types of scenarios that we are talking about. ARM has made particular optimizations for faster sleep, 4-8x DSP algorithm performance enhancements, dedicated tools for Fast Fourier Transform applications, and a whole host of other optimizations for performing complex, processor-heavy mathematical calculations on a strict power budget.


ARM’s NEON Data Engine and Floating Point Units, found in all Cortex-A designs, are essential for efficient DSP processing.

The move to ARM’s 64-bit ARMv8 architecture could also have some useful benefits for audio software developers and consumers, as audio applications can be heavily memory dependent, and 64-bit could allow for devices with larger pools of RAM.

However, there is only so much that ARM can do on its own, and ARM’s library really only serves as an example of how developers could go about creating their own lower-level code. Without a fully-fledged library, different developers would have to go over the same processes again and again just to build the basic tools that they need. A further hindrance to smaller development teams is the high cost of ARM’s proprietary compiler.

Audio Development and Android

While we have mobile hardware that is clearly capable of providing a high-quality audio app experience, there seems to be a lack of software support for developers on Google’s side of things.

For the app developer, the first port of call is usually the Android SDK. However, Google’s media APIs for Android are rather limited, to say the least. You won’t find many useful tools beyond the very basic MediaRecorder and playback-from-file functions. Delving a little deeper into the various Android packages will reveal a few tools for an equalizer, reverb pre-sets, and noise suppression. However, there aren’t any acceptable tools for low-latency, real-time audio processing, and the various operating system versions found out in the wild often mean that these tools can be hit and miss depending on the user’s hardware.

Android Audio Examples

Android has an acceptable selection of audio focused apps already, but the platform doesn’t play so nice with the wider world of audio.

Compare this situation to Apple’s iOS platform and the contrast couldn’t be greater. Apple has long included its Core Audio digital audio infrastructure in its operating systems, which offers developers a dedicated software framework for a variety of applications, such as the ones that we have already discussed.

There seems to be a lack of software support for developers on Google’s side of things

The Core Audio library includes tools for mixing and converting signals and files, for easily implementing signal chains, and for essential built-in effects, all while maintaining high performance. Apple also includes easy access to its hardware abstraction layer, allowing audio applications to effortlessly interface and communicate with other pieces of hardware, such as microphones or output devices that accept incoming audio signals. Most of this functionality is completely missing from the Android platform.

Apple Core Audio digital studio

As painful as it is to admit, Apple’s Core Audio platform is far more developer friendly than the Android ecosystem.

Instead, more complicated applications may find that they have to do a lot more of the low-level coding themselves, adding to development times and costs. This is the primary reason why Android is so far behind Apple when it comes to advanced audio applications. That is, unless you can find a third-party SDK.

Introducing – Superpowered

Superpowered is one of the few feature-rich audio SDKs available for mobile, and it has just recently been made available for Android. It offers a range of tools for Android and iOS developers to easily implement more complex audio applications and effects. The SDK provides a library of pre-built functions for audio filters, reverbs, echo effects, time-domain stretching, and FFT, all designed with high-quality, studio-grade audio in mind.

Superpowered has been built from the ground up to maximise DSP performance, while sidestepping Android’s audio issues

Unlike other audio engines, Superpowered is not a wrapper around Core Audio or Android’s pre-built library. Instead, it has been built from the ground up to maximise DSP performance while sidestepping Android’s fragmentation, its lacklustre feature set, and its latency issues. Superpowered claims that, as a result, it can even outperform Apple’s industry-renowned Core Audio platform, which is no mean feat.

Superpowered is designed for ARM devices that make use of the NEON architecture extension, which means that some 99% of smartphones and tablets are covered. It can be used to speed up development of almost anything audio related, from DJ apps and instrument effects to audio book readers, podcast apps, and games. The video below shows Superpowered’s co-founder demoing a wearable DJ interface powered by the platform.

Importantly for developers, Superpowered is a cross platform SDK, allowing apps to seamlessly operate on both Android and iOS without any differences in audio quality. While iOS may be the lead platform at the moment, this opens the door for a wider number of developers to consider Android too.

Superpowered isn’t stopping with audio, though; the company will also be releasing DSP SDKs for image and video processing in the near future, which could open up Android to a new generation of media editing apps and content.

If you’re a developer interested in Superpowered’s SDK, the good news is that it is free to download and implement in your app. Once your app reaches 50,000 installs, Superpowered will help you set up a contract with them, which includes extra support for your app.

With hindsight, Android’s lack of out-of-the-box support for advanced audio apps and features looks like a missed opportunity. Fortunately, third-party developers have stepped up to provide solutions to the problem. In the future, hopefully Android will prove itself a worthy platform for power media developers too.

  • alex

    Maybe it was wishful thinking, but didn’t they mention at this year’s Google I/O that in Android L they’ve revamped the entire audio framework so that latency is now approaching 20ms? Regardless, this was an excellent post and one of the reasons I keep coming back to Android Authority.

    • joser116

      Approaching 20 ms is not good enough. iOS has had that kind of low latency for years. Google needs to tackle this problem with full force so developers can easily and consistently make apps with low-latency audio

      • alex

        I think (“think” being the operative word) I remember reading they’ve got it down to ~50ms, but fully admit they still need to get it to 20ms. I agree with you though, Google has been sorely behind Apple on this front.

        • Dominic Powell

          It’s at 20ms as of Android L. They did more work and got it to 20ms. In JB 4.2.2 the guy gave an audio talk and spoke about lowering it from 100 ms to 45 ms. This year he demonstrated the difference from 45ms to 20ms. As suggested, 20 ms is usable; iOS is currently between 9ms and 14ms. We will probably see parity at I/O next year.

          • joser116

            Why is it taking them so long? Why only incremental improvements? I don’t know much about how Android audio works. I think it is different than iOS’s audio framework since Android is basically a virtual machine and iOS is native so latency and performance are fundamental problems on Android, but Google needs to do a complete overhaul of the Android audio framework, not just improve it or tweak it here and there. What they are doing is simply putting Band-Aids over the problem. They need to do some surgery.

          • Mike Reid

            IMO because some of the OEMs are dragging their feet. Google has not been “forcing” OEMs to improve audio performance.

            HTC in particular uses old style HALs, in part I think because of their “legacy” code for DSPs and such.

          • Grahaman27

            just to put google’s efforts in perspective- fireside chat, google io 2012:

            “We’re introducing a global fast mixer to reduce the latency. The targets we’re going for are sub-10ms. Anything that preempts you in the system will cause a problem. On the Galaxy Nexus the latency went from 100ms on ICS to 12ms on JB but we want it to always be below 10ms. Other JB devices are not as good as 12ms. We want to mandate a maximum latency but we aren’t there yet.”

            so what happened? kitkat introduced a low-power DSP, which INCREASED latency. Then Android L promises to reduce it again? I wonder how much google really cares about audio.

          • s2weden2000

            That was touchscreen latency..not audio

    • joser116

      Another thing I hate about how Android handles audio is that sound from taps only plays AFTER you lift your finger off the screen.

    • Grahaman27

      it’s amazing how short the section is on latency in this article, which is probably the main problem with android’s audio situation.

    • Cuerex

      Android L is said to be further working on audio processing. Does this have an impact on Superpowered?

      • Pv

        We’re all for lower latency on Android, and if Android L materially lowers latency, that would be fantastic.

        That said, even if Android L solved the low latency problem, once and for all (it won’t) — developers still have the fragmentation issues and API issues to deal with — which is a major part of the value that Superpowered provides.

        • Cuerex

          Is Superpowered interested in a contract with google about this specific problem?

          • Pv

            @Cuerex:disqus — we’d love to speak to the powers that be at Google about this — would you have specific folks we could get in contact with?

          • Cuerex

            Apparently no. But I believe it shouldn’t be too hard to contact google about this, with the authority Superpowered possesses.

            Maybe you should make some activity on Reddit about Superpowered initiatives. It’s free advertising as well, and I see many industry people reading topics there.

  • Sweet

    Excellent article.

  • Anonymousfella

    Nice article. I didn’t know Android was behind in the audio department…

  • DDT

    “…so why is it that Android seems so far behind its rival when it comes to audio applications?”

    Because it’s lacking some powerful 64-bit processing, great set of API’s and of course developers.

    “Most of this functionality is completely missing from the Android platform.”

    But, but, iOS 8 is just catching up in features to android 2.0….

    • So you can make shitty unprofessional sounds in garage band, congrats. We’d like to have MMS, E-mail attachments, and cut, copy, and paste prior to the 3rd generation of a mobile OS, thank you very much. BTW I have an iPad so not a fan-boy, just saying don’t come here with all that nonsense.

      • mobilemann

        it’s not just for that. Professional studios use iPads too, as control surfaces, with support for full faders, pan, recording playback, etc. This doesn’t in any way stop the accomplishments of Android, which leads the way in lots of other areas.

        There’s lots more too. If you have an ipad air, i suggest you learn how to use it; if you’re just trying to make yourself feel better because you’re a brand cheerleader, then that’s pathetic.

  • s2weden2000

    Superpowers for everyone…

  • Gerontis

    The latency issues of Android are built into the kernel of the OS, there is simply no way that ‘SuperPowered’ can do anything about that – unless it patches the kernel and then it opens up a can of worms of incompatibility with hardware.

    • Pv

      @Gerontis — the latency issues are actually even more complicated than hacking the kernel — to truly get low latency Android, the prescription includes:

      “In contrast, Apple’s hardware-software symbiotic paradigm is required here to provide a real low latency audio solution on Android with buffer sizes of 128 or even less. For Android, this would require a SoC supplier who can fine-tune the firmware and the audio drivers as well as providing a good Android low latency audio api instead of the OpenSL ES….
      And then we’ll have a low-latency audio in/out foundation to build upon. Fortunately, Superpowered has everything on top of that.”


  • Gon

    My audio feels normal no lag?

    • gommer strike

      That’s because you’re only doing the expected consumer-level stuff…playing your music, games, notifications etc which will all seem fine. But it’s when it comes to professional-grade stuff when all this stuff comes bubbling to the surface.

      Think of the stuff that you do for a living, that the average joe/jane would never see nor understand why things work the way they do. You could surely talk for hours, if not days, about the stuff you do and why people should buy X or Y, because of the underlying things that set it apart from the usual stuff that people who don’t know any better spend their money on instead.

      This is one of those things. Consumers would never know. But the professionals…oh yeah.

  • agedpom

    All very well, but when all is said and done, the device is, first and foremost, a PHONE.
    I want a phone where I can hear the conversation with good volume and good quality, just as I could on the old copper land line.
    Please Mr Android, let it be; and soon!!

  • Pv

    As we all know, Android’s (lack of) low latency is the bane of digital music creation. I wanted to alert everyone to a free app we have recently released that allows developers to measure device round-trip audio latency.

    Interestingly, our data prove that even with the advent of Lollipop, Android audio latency isn’t yet “real-time”, i.e. on par with iOS.

    The fastest Android device we have tested so far is the Nexus 9 clocking in at 48 ms roundtrip, while iPhone 5 comes in at 7 ms roundtrip.

    The free app, device latency data and the source code are all available online.

  • MattEgansHairLine

    You are never going to get an optimised OS if the Legal dept has to be part of your design team.

    If you have to reverse engineer Java and iOS to produce a saleable product, the legal dept had more design input than the designers or engineers.