Why conference calls sound bad, and what you can do about it
The gig economy prompted swathes of the global workforce to forgo their morning commutes, trading cubicle jobs for at-home desk jobs. This abrupt shift in work culture breathed new life into the dreaded conference call. It’s no longer about a group of your colleagues huddled around a 1990s speakerphone hub. Rather, modern conference callers rely heavily on software. Regardless of the technology though, call quality still leaves plenty to be desired.
Let’s dial in and figure out the technical issues behind conference calls and how to mitigate them.
What happens when a conference call is placed?
When you’re speaking on the phone to let your partner know you’re running late, your voice is converted into electrical signals. These are further processed by your smartphone’s antenna so they may be transmitted as radio waves to the nearest cell tower. The waves are then pin-balled from tower to tower until arriving at your partner’s phone. Here the electrical signals are converted back into audible sounds.
Two-way call quality depends on signal strength: the stronger the signal, the better the call. There are a few variables to keep track of, though. Connectivity varies with network type (CDMA vs. GSM), your distance from a compatible cell tower, and any physical barriers between you and that tower. When a conference call is held, these same variables are multiplied by the number of callers, increasing the chance of poor call quality.
Transmission efficiency is prioritized over audio quality
There’s only so much available bandwidth when making two-way calls, not to mention conference calls. Similar to how Bluetooth codecs process audio by compressing data, our voices are compressed as they’re relayed. Dynamic range compression reduces the volume of loud sounds while simultaneously amplifying quiet ones. Less auditory data is conveyed, making the entire process more efficient at the expense of sound quality.
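Dynamic range compression can be sketched in a few lines. The threshold, ratio, and target peak below are illustrative values for demonstration, not figures from any telephony codec: samples louder than the threshold are scaled down, then the whole signal is amplified back up, lifting the quiet parts.

```python
# A minimal sketch of dynamic range compression on normalized audio
# samples (values in -1.0..1.0). Threshold and ratio are illustrative,
# not taken from any real telephony codec.

def compress_sample(x, threshold=0.5, ratio=4.0):
    """Reduce the level of any sample louder than the threshold.

    Samples below the threshold pass through unchanged; only the
    portion above the threshold is attenuated by the ratio.
    """
    sign = 1.0 if x >= 0 else -1.0
    magnitude = abs(x)
    if magnitude <= threshold:
        return x
    return sign * (threshold + (magnitude - threshold) / ratio)

def make_up_gain(samples, target_peak=0.9):
    """Amplify the whole (now quieter) signal back toward a target peak."""
    peak = max(abs(s) for s in samples) or 1.0
    gain = target_peak / peak
    return [s * gain for s in samples]

quiet_and_loud = [0.05, 0.1, 0.4, 0.9, -0.95]
compressed = [compress_sample(s) for s in quiet_and_loud]
leveled = make_up_gain(compressed)  # quiet samples end up louder than before
```

The net effect matches the description above: loud peaks shrink, quiet sounds come up, and the signal needs less headroom to transmit.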
The adaptive multi-rate (AMR) audio codec is used for standard calls. It narrows the transmitted frequency band to 200-3400Hz. Because the low end is cut off, anyone with a deep voice may sound strange or muffled. That’s the trade-off though: efficiency gains equate to quality drops. This codec is great since it can maintain a call in poor conditions, but it can cause a message to lag. You know when you start a sentence and your co-worker accidentally interrupts you? There’s a good chance the AMR codec is to blame for slowing transmission in order to keep you on the line.
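You can see why deep voices suffer with some quick arithmetic. A voice is a fundamental frequency plus harmonics at whole-number multiples of it; the 110Hz and 220Hz fundamentals below are illustrative examples of a low and a higher voice, not values from any spec:

```python
# Which harmonics of a voice survive AMR's 200-3400Hz passband?
# 110Hz approximates a deep male voice; 220Hz a higher register.
# Both figures are illustrative examples, not from the AMR spec.

AMR_LOW_HZ, AMR_HIGH_HZ = 200, 3400

def surviving_harmonics(fundamental_hz, count=10):
    """Return the harmonics (f, 2f, 3f, ...) that fit inside the band."""
    harmonics = [fundamental_hz * n for n in range(1, count + 1)]
    return [f for f in harmonics if AMR_LOW_HZ <= f <= AMR_HIGH_HZ]

low_voice = surviving_harmonics(110)   # the 110Hz fundamental is cut off
high_voice = surviving_harmonics(220)  # the fundamental survives intact
```

The deep voice loses its fundamental entirely, which is why it arrives sounding thin, while the higher voice passes through the band untouched.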
Variable bitrate maintains a stable connection but sacrifices audio quality.
Stepping up from the AMR codec is HD Voice. This uses the adaptive multi-rate wideband (AMR-WB) speech coding standard, which doubles vanilla AMR’s sampling rate and extends the transmitted band to 50-7000Hz. If you have a very high or low vocal register, you’ll sound more natural with HD Voice. Like AMR, it adapts its bitrate to conditions, topping out at 23.85kbps. Clarity is greater and latency proves less problematic, but there’s a catch: both callers must have HD Voice-capable phones. It’s also more data demanding, which can be an issue with multiple callers.
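The data demand is easy to estimate at AMR-WB’s top 23.85kbps rate. This back-of-the-envelope sketch counts codec payload only and ignores signaling and transport overhead, so real usage runs higher:

```python
# Rough data use for an HD Voice stream at its top 23.85kbps rate.
# Codec payload only -- real calls add signaling and transport
# overhead, so treat these figures as lower bounds.

AMR_WB_KBPS = 23.85

def call_data_megabytes(minutes, kbps=AMR_WB_KBPS, streams=1):
    """Data consumed by `streams` audio streams over `minutes`, in MB."""
    bits = kbps * 1000 * 60 * minutes * streams
    return bits / 8 / 1_000_000

one_on_one = call_data_megabytes(30)             # a 30-minute call: ~5.4MB
conference = call_data_megabytes(30, streams=5)  # five inbound streams: ~27MB
```

A one-on-one call is cheap, but receiving a separate stream from each participant in a conference multiplies the load, which is why wideband conference calls strain constrained connections.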
Microphone quality and your environment
Regardless of your service provider, much of a conference call’s quality boils down to the type of microphone in use. If you’ve ever made a call from a cheap pair of earbuds, only to have the person on the other end repeat “what” like a broken record, you know how much mic quality affects voice clarity.
Oftentimes, however, people call in from their respective handsets. This places a lot of responsibility on the integrated microphone, which can be good depending on what phone you have. For instance, iPhones use a slew of microphones to promote clear voice quality. Meanwhile, the Samsung Galaxy S10 line uses high acoustic overload point (AOP) microphone technology to mitigate distortion in noisy environments.
Flagships are usually outfitted with much better microphones than budget smartphones. There’s a high chance that a few of your conference call participants are using cheaper smartphones, which may be unable to combat background noise effectively.
How to improve conference calls
Now that we have a more comprehensive understanding of why conference calls sound bad, let’s explore a few ways to fix them.
Apps support greater bandwidth
While applications introduce their own host of problems, they dramatically increase the amount of available bandwidth. This essentially widens the pipe so data can flow at a greater rate, which is essential for multi-way calls.
Our company uses Slack or Zoom, which provide Voice over Internet Protocol (VoIP) support. Why choose VoIP? It’s cheap and efficient. Voice and data are communicated over a single network. If users don’t have immediate internet access, they have the option to dial in to the call from their smartphones. Opting for an umbrella solution like Zoom is easier than making sure all team members have an HD Voice-enabled phone on a supported provider.
Granted, there are some drawbacks to VoIP apps. For one, they’re not subjected to the same Quality of Service guarantees as telephone networks. Plus, just like any mode of telecommunication, they’re susceptible to data loss and may lag. VoIP apps are still an effective alternative for companies in need of an affordable, oftentimes free, solution to conference call frustration.
Offices should invest in a telecommunication hub
In a similar vein, offices will also benefit from getting a dedicated communication hub and microphones. This is less applicable to companies relying heavily on remote workers. For offices, though, it may be a justifiable expense. Understandably, the beyerdynamic Phonum or Shure MXA 310 may be difficult to justify for smaller companies. If, however, employees frequently telecommute or you make conference calls to other offices, the investment is well worth it.
By spending money on a good telephony hub, you’re increasing productivity by minimizing wasted time. Depending on the participants’ pay rates, how many people are involved, and how much time is spent repeating and answering “What?” or getting everyone in on a call individually, you could end up saving quite a bit of money over the long run.
Get headphones with a good microphone
Another option is to get a pair of headphones with a good dedicated microphone. The OnePlus Bullets Wireless 2 have an excellent in-line mic and retail for $99. This is a great option if you’re an individual looking to upgrade your headset as it also serves as an excellent pair of everyday earbuds.
If you’re responsible for doling out headsets to the entire office, the Poly Voyager 6200 UC is a brilliant pick. It has a slew of office-friendly features and is officially certified for Skype for Business. While this doesn’t solve the problem of connection strength, it chips away at one of the main issues surrounding conference calls.
Related: Best wireless earbuds
Will conference call quality improve with 5G?
Yes, as 5G becomes the norm, the enhanced voice services (EVS) codec will be widely supported. We’ve seen this with certain 4G-enabled devices including the iPhone 8 and iPhone XR. Users are afforded up to 20kHz of audio bandwidth. Additionally, it’s backward compatible with AMR-WB and uses a variable bitrate to strengthen connection stability. Once globally supported, its error concealment mechanism will lessen distortion. In sum, calls will be clearer and more reliable as technology develops.
Until then, pick your favorite VoIP app or invest in a dedicated headset. Both are quick, easy remedies for poor conference call quality.