
Smart home privacy: What data is collected, and how is it used?

Are you worried about big tech snooping on you with smart speakers? Here's what you need to know about smart home privacy.
July 26, 2022
A Google Home Hub and an Amazon Echo Show

Smart home privacy can be a tough balancing act. While apps, devices, and cloud services all need a base level of user data to function and improve, tech firms are sometimes caught collecting more than they really need. That can pose risks if the data is stolen by hackers, shared with authoritarian governments, or sold to aggressive marketers. Some people simply object to anyone having a glimpse into their habits.

It’s impractical to detail the privacy policies of every company and platform in the smart home industry, but to cover a broad base we can examine what Amazon (Alexa), Apple (Siri/HomeKit), and Google (Google Assistant) collect via compatible smart speakers and displays.

Read more: The best smart home devices you can buy

QUICK ANSWER

Amazon, Apple, and Google all use an anonymized sample of recorded voice commands to analyze and improve their smart home assistants, though Google keeps recordings off by default. Collection can sometimes include accidental recordings when a speaker mistakes something for its wake word. With Amazon and Google, the things you ask for (like product orders and music) may also indirectly feed into advertising. All three companies are increasingly using on-device processing to reduce data collection, and all offer some way of deleting past recordings, with Apple and Google making it easiest to avoid keeping a voice history at all. Amazon and Google will share some security camera video with police without requiring a warrant.



Amazon (Alexa) smart home privacy

A top view of the 4th-gen Amazon Echo
Adam Molina / Android Authority

Unless you mute it, every Alexa-equipped speaker continually listens for a wake word to be ready for voice commands. Typically, when it hears that word, a recording of the subsequent phrase (e.g. “turn on the living room lights”) is interpreted on Amazon’s servers. Recent devices like the fourth-generation Echo can optionally process that audio locally, but they still send transcripts to the cloud. This is why Alexa speakers (and most smart speakers, really) have little to no functionality if your internet goes down.
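If you like seeing the moving parts, here’s a minimal, purely hypothetical sketch of that flow. None of this is Amazon’s real code; every function name (detect_wake_word, transcribe_locally, send_to_cloud) is a made-up placeholder. The point is simply the difference between uploading raw audio and uploading a text transcript, and broadly the same pattern applies to Siri and Google Assistant as well.

```python
# Hypothetical sketch of the flow described above -- not Amazon's actual code.
# Every function is a made-up stand-in for on-device firmware or cloud servers,
# meant only to show what leaves the device (raw audio vs. a text transcript).

def detect_wake_word(buffered_audio: bytes) -> bool:
    """Stand-in for the small always-on model listening for the wake word."""
    return b"alexa" in buffered_audio.lower()

def transcribe_locally(command_audio: bytes) -> str:
    """Stand-in for on-device speech recognition on newer Echo hardware."""
    return "turn on the living room lights"

def send_to_cloud(payload) -> str:
    """Stand-in for the servers that interpret the request and return a reply."""
    return "OK, turning on the living room lights."

def handle_utterance(buffered_audio: bytes, command_audio: bytes,
                     supports_local_asr: bool) -> str | None:
    if not detect_wake_word(buffered_audio):
        return None                       # nothing is sent without a wake word match
    if supports_local_asr:
        transcript = transcribe_locally(command_audio)
        return send_to_cloud(transcript)  # only the text transcript goes to the cloud
    return send_to_cloud(command_audio)   # older devices upload the recording itself

print(handle_utterance(b"Alexa...", b"<captured phrase>", supports_local_asr=True))
```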

Amazon encrypts recordings, but they are linked to your account, and are kept indefinitely unless you shorten that timeframe in Alexa’s privacy settings. You can additionally delete recordings and/or force Amazon to stop saving them entirely, but there is the risk that Alexa will be less accurate in responding to you. Similar options are available for smart home accessory activity, as well as detected sounds if you have Alexa Guard and/or other sound-based automation routines active.

Except for people who opt out or stop recording entirely, Amazon uses 'an extremely small sample' of voice recordings to analyze Alexa's performance.

When it comes to camera-equipped devices like an Echo Show, Amazon says it never saves anything from video calls, and that its Visual ID feature (on products like the Echo Show 15) is handled on-device.

Except for people who opt out or stop recording entirely, Amazon uses “an extremely small sample” of voice recordings to analyze and improve Alexa’s performance, as hinted at earlier. Some people are uncomfortable with this, since even though review teams shouldn’t be able to identify you personally, it still means strangers are hearing a tiny slice of your life. There’s also the chance that false Alexa triggers will be picked up, and that your Alexa recordings might prove useful in criminal cases, even if Amazon seems to resist sharing them with law enforcement.

The company’s Ring division is known to sometimes give security camera recordings to law enforcement without telling owners or requiring a warrant, mainly in “emergency” situations where death or serious injury is an immediate risk. Police are normally expected to serve Ring with a warrant or ask owners for that footage publicly, but if you’re understandably concerned, you’ll want to turn on end-to-end encryption via the Ring app. That should block access to anyone but yourself, although it’s only available for wired cameras, not wireless ones.

You should also be aware that, as with any interaction with Amazon, the things you do on an Alexa device will ripple out to marketing, advertising, and linked services. If you order products via your Echo, for example, Amazon will learn your cumulative shopping habits and target ads accordingly. If you listen to music on Spotify, that service will decode your tastes just as it would if you were picking tracks on your phone, serving up ads and recommendations to match.

See also: How to use Amazon Alexa

Apple (Siri/HomeKit) smart home privacy

An Apple HomePod mini in orange
Apple

Apple is eager to sell how much it cares about privacy, and for the most part, the company lives up to its promises. HomeKit is heavily encrypted and secure, possibly to a fault — unlike Alexa or Google Assistant, you almost always have to scan (or type in) a physical code if you want to pair smart home accessories. Trying to re-pair something with HomeKit can be a pain, especially if you lose its ID sticker.

Siri, like Alexa, often depends on cloud processing to handle voice commands. It’s continually listening for the “Hey Siri” wake word and, when necessary, sends recordings to Apple’s servers after a two-step check to verify the wake word was actually spoken. Thankfully, devices with at least iOS 15 and an A12 Bionic processor (or newer) can now process many requests on-device. Some commands can be handled without any internet access at all, such as setting a timer, launching an app, or toggling settings.

On Apple’s end, a selection of recordings (and/or transcripts) may be used for review, and they remain linked to you by a random identifier for six months. You can opt out of review pretty easily, and delete anything saved within that six-month window. In fact, anything deleted within 24 hours will never be reviewed, and if your Apple device supports local processing, Siri can catch and delete some false triggers before they’re uploaded. Apple also promises to delete most accidental recordings that do make it to the cloud, the exception being a portion used to make sure false trigger detection is working.

Apple is eager to sell how much it cares about privacy, and for the most part the company lives up to its promises.

Apple doesn’t record FaceTime video calls, but it does save a record of metadata (who called whom, and when) for 30 days. Apple claims this info is stored “in a way that doesn’t identify you,” yet it’s not clear whether this would hold true if it were requested by government investigators. After all, conventional phone metadata can be used to piece together someone’s identity if it matches outside clues.

To improve Siri (unless you opt out), the company likewise collects contact names, a list of installed apps, and location data, but this is associated only with your random identifier. It isn’t connected to your Apple ID or email address, so it wouldn’t be of much use to hackers even if they could get past encryption and other security measures.

In theory it could be exploited by governments, but this is only a serious threat in regions that have both authoritarian regimes and local Apple data centers, like China. The company does regularly cooperate with law enforcement and intelligence agencies around the world — including the US — but it’s famously reluctant to do so, and has designed its systems and encryption in a way that limits what it can access internally.

Perhaps most relevant to users on a daily basis is the fact that Apple doesn’t use Siri data to build marketing profiles. There’s no chance of talking to your iPhone or HomePod about buying a crib and then being targeted with ads for baby products. Connected third-party services may receive data, but generally only the bare necessities.

We should lastly touch on HomeKit Secure Video, which allows iCloud Plus subscribers to save (compatible) security camera footage in the cloud, and detect objects like people, pets, and cars. This video is saved for just 10 days, and encrypted end-to-end, meaning that even Apple employees can’t view it.
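As a rough illustration of what end-to-end encryption means in practice (for HomeKit Secure Video here, and for Ring’s optional setting mentioned earlier), consider the toy sketch below. It is not Apple’s or Ring’s actual implementation; it simply uses the third-party Python cryptography package to show that when the key is generated and kept on your own devices, the clip stored in the cloud is unreadable to the provider.

```python
# Toy illustration only -- not Apple's or Ring's real scheme. The point is that
# the key never leaves your devices, so the cloud only ever holds ciphertext.
from cryptography.fernet import Fernet

key = Fernet.generate_key()             # created on your device, never uploaded
clip = b"raw footage from the doorbell camera"

ciphertext = Fernet(key).encrypt(clip)  # this opaque blob is all the cloud stores

# Anyone without the key (including the provider) sees only unreadable bytes.
# Your own devices, which hold the key, can still recover the original clip:
assert Fernet(key).decrypt(ciphertext) == clip
```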

Google (Google Assistant) smart home privacy

A morning view on the second-generation Google Nest Hub.
Jimmy Westenberg / Android Authority

Google Home (the smart home app/framework) and Assistant (the voice command tech) are intertwined not just with each other, but with the rest of the Google universe. That’s both a problem and an advantage.

It’s an advantage in that there’s a lot of flexibility. Assistant is omnipresent in Google’s ecosystem, and doesn’t care if you’re talking to it from Chrome, an Android phone, the Google iOS app, or a Nest Hub Max — everything is tied into your Google account, so you can theoretically control speakers, lights, and other smart home accessories from any device, anywhere. You also get to tap into the vast power of Search, YouTube, and other Google services.

More: The best services for your Google Nest speaker

To keep all of this manageable, Google unifies data and privacy settings under the Google Dashboard. You do have a fair amount of control, but it can be intimidating, and changing some settings there can have wide ramifications when you might only care about what your speakers and smart displays are doing. There are also odd exceptions: if you want a streamlined activity history for your smart home accessories, you need the Google Home app, because the company’s My Activity web tool is focused on Assistant as a whole.

What might surprise some people is that when you trigger Assistant with “Hey Google” or “OK Google,” voice recordings aren’t kept by default. You have to opt in, and like Apple, Google both encrypts this audio and assigns it an anonymous identifier. If you’ve opted in, you can delete conversations via My Activity, ask Assistant to delete a set range, or limit your history to three or 18 months. Some Assistant-equipped devices can handle basic commands offline, such as local settings or playing saved music.

What might surprise some people is that when you trigger Assistant with 'Hey Google' or 'OK Google,' voice recordings aren't kept by default.

The company says that just 0.2% of recorded voice commands are reviewed for improving Assistant responses. It likewise never saves any video from third-party cameras, and if you’re using a Nest Cam or Doorbell, you can delete or disable your video history at will on top of any rolling plan-based limits.

Nest Hub Max displays have a feature called Face Match, which uses the onboard camera to detect who’s using them. Google claims that no Face Match video is uploaded to its servers beyond initial setup.

If you’re worried about smart speaker activity making its way to advertisers, Google is probably the platform to avoid, for the obvious reason that the company is a global juggernaut in online ads. It analyzes requests and serves ads accordingly, without distinguishing between asking your speaker something and typing it into Google’s website. If you ask your Nest Audio about places to buy engagement rings, expect to see a few wedding-related ads the next time you fire up a web browser.

The company does take some precautions — it doesn’t share any personally identifiable info with third parties unless it’s necessary to make something work, such as placing a phone call or hailing an Uber ride. You have to authorize sharing smart home accessory data, and Google also never shares audio recordings with third parties, instead providing them with transcripts if that content is necessary.

Less clear is how false voice command recordings are handled. Like Apple, Google says it automatically deletes them when detected, if only to avoid tainting its performance analysis. It’s trying to cut down on false triggers overall, yet states only that it has “a number of protections in place to prevent this from occurring.”

Google’s security camera policies are similar to Amazon’s, in that they let police bypass warrants for recordings in “emergency” scenarios where death or serious injury is possible. Google specifically cites “bomb threats, school shootings, kidnappings, suicide prevention, and missing persons cases” as examples. The company says it tries to notify customers when this sharing happens, but those notifications might not arrive if Google isn’t told the emergency is over.

See more: The Google Assistant commands you need to know

Which smart home platform has the best privacy?

The new look for Siri as of Apple's WWDC 2020

The clear choice is Apple. Even if its policies and track record aren’t perfect, Apple is taking privacy more seriously than its competitors. You don’t have to worry about marketers gleaning too much about your activity, and overall security is about as high as it can (plausibly) get.

There are two caveats, the first being compatibility. Siri is exclusively on Apple devices and a handful of HomeKit-ready speakers — it doesn’t exist on Android or Windows. HomeKit is itself platform-restricted, unusable for a smart home unless you own an iPhone or iPad. You can’t even add accessories using a Mac.

Even if its policies and track record aren't perfect, Apple is taking privacy more seriously than its competitors.

The second issue is ecosystem size. There aren’t nearly as many smart home products for Siri/HomeKit as there are for Alexa or Google Assistant. This is attributable not just to the latter two working with more operating systems, but to the difficulty of developing for HomeKit, which involves strict standards and high levels of encryption processing.

That leaves many people having to choose between Amazon and Google, unless they’re technically savvy enough to deal with standards like Zigbee or Z-Wave. Google is probably more trustworthy than Amazon since it doesn’t record any voice data by default, but either way you’re making privacy concessions in the name of convenience.


Read more: The best smart speakers you can buy