Virtual Assistants - How Do They Handle Your Privacy?

My privacy is important to me: it’s not for sale.

If you use computing devices, the Internet, or any payment method other than cash, you are sharing information about… you. Unless you stay off the Internet, don’t carry a cell phone, don’t use GPS, don’t use a credit card, don’t fly, don’t rent cars, don’t use E-ZPass, and pay for everything in cash, your information, activities, and transactions are being logged one way or another. Using modern technology means having your information recorded. When people consider a virtual assistant, privacy is often the first concern, in part because a voice-activated virtual assistant is always listening. Always. And that feels like a departure from what we are used to.

Here’s how virtual assistants listen and how they handle what they hear.

Your virtual assistant listens in short snippets (a few seconds each) for the hotword. Those snippets are deleted if the hotword is not detected, and none of that information leaves your device until the hotword is heard. When the virtual assistant detects that you’ve said the hotword, or that you’ve long-pressed the listen button on your device, the LEDs on top of the device light up to tell you that recording is happening, the virtual assistant records what you say, and it sends that recording (including the few-second pre-hotword snippet) to our servers in order to fulfill your request.

This language is pulled directly from Google’s FAQ, edited here to make it generic. It’s an excellent description, and it is essentially the way all virtual assistants function.
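To make that flow concrete, here is a minimal, purely illustrative Python sketch of an always-listening loop built around a rolling audio buffer and a hotword detector. Every name in it (microphone.capture, detector.detect_hotword, record_until_silence, assistant_client.send, the device LED calls) is a hypothetical stand-in, not any vendor’s actual API; the point is only that audio stays in a short local buffer and leaves the device only after the hotword or a long press.

```python
import collections

# Hypothetical numbers for illustration only.
SNIPPET_SECONDS = 2        # length of each short local snippet
PRE_HOTWORD_SNIPPETS = 1   # how many pre-hotword snippets ride along with a request

def listen_loop(microphone, detector, assistant_client, device):
    """Illustrative always-listening loop (not any vendor's actual code).

    Audio lives only in a short in-memory buffer; nothing leaves the device
    unless the hotword is detected or the listen button is long-pressed.
    """
    # Rolling buffer: old snippets fall off the end and are discarded.
    recent = collections.deque(maxlen=PRE_HOTWORD_SNIPPETS)

    while True:
        snippet = microphone.capture(seconds=SNIPPET_SECONDS)  # stays local

        if detector.detect_hotword(snippet) or device.listen_button_long_pressed():
            device.leds_on()                                # visible cue: recording now
            request_audio = list(recent) + [snippet]        # include pre-hotword audio
            request_audio += microphone.record_until_silence()
            assistant_client.send(request_audio)            # audio leaves the device only here
            device.leds_off()
            recent.clear()
        else:
            recent.append(snippet)   # soon overwritten by newer audio; effectively deleted
```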

Here are more specifics on how the top virtual assistant companies handle or claim to handle your recorded information and your privacy:

Amazon Alexa

“Amazon processes and retains your Alexa Interactions, such as your voice inputs, music playlists, and your Alexa to-do and shopping lists, in the cloud.” That is from Amazon’s Alexa Terms of Use. These retained items are linked to you and your account. You can review and even delete your interactions from your Amazon account, but doing so only removes them from your view; Amazon retains all interactions for an unspecified amount of time. See Alexa and Alexa Device FAQs for more detailed information.

Google Home

Google collects your information in a similar fashion to Amazon and, like Amazon, links the information to your “permanent Google record.” Google opens its privacy policy with “When you use Google services, you trust us with your information,” and goes on to state: “We collect information about the services that you use and how you use them, like when you watch a video on YouTube, visit a website that uses our advertising services, or view and interact with our ads and content.” Google claims you can delete the information they store, but they also state: “When you delete items from My Activity, they are permanently deleted from your Google Account. However, Google may keep service-related information about your account, like which Google products you used and when, to prevent spam and abuse and to improve our services.” So, like Amazon, Google keeps unspecified information that they deem useful for improving their services, for an unspecified period of time. See: Data security & privacy on Google Home.

Google also states: “We take your privacy seriously. You can control what information is stored and shared with Google.” They provide more information pertaining to specific products, as well as other useful privacy-related materials, in the Google Privacy Policy.

Apple Siri

Apple also collects your information, just as Amazon and Google do, but as usual, they do something unique. Apple collects all kinds of information about you when you use their devices and software, but when you use Siri, they anonymize your interactions. Unlike Amazon and Google, who tie your interactions to a “persistent personal identifier” (your Google and/or Amazon account), Apple uses a random, rotating identifier that refreshes every 15 minutes. This means Apple doesn’t store your interactions against your Apple ID profile; it stores them anonymously and relates them to you only temporarily, using a random number. But as with the other companies, those disassociated interactions are stored for an unspecified amount of time on Apple’s servers.
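As a thought experiment (Apple’s actual implementation is not public), the rotating-identifier idea can be sketched in a few lines of Python: requests are tagged with a random value instead of an account ID, and that value is thrown away and regenerated on a fixed schedule, so stored interactions can only be linked to one another within a short window. The class and field names here are made up for illustration.

```python
import time
import uuid

ROTATION_SECONDS = 15 * 60   # the article's stated window: 15 minutes

class RotatingIdentifier:
    """Illustrative rotating identifier (not Apple's actual implementation).

    Requests are tagged with a random value instead of an account ID, and
    the value is replaced on a fixed schedule, so stored interactions can
    only be linked to each other within one short window.
    """

    def __init__(self, rotation_seconds=ROTATION_SECONDS):
        self.rotation_seconds = rotation_seconds
        self._rotate()

    def _rotate(self):
        self.value = uuid.uuid4().hex        # random, not derived from the user
        self.issued_at = time.monotonic()

    def current(self):
        if time.monotonic() - self.issued_at >= self.rotation_seconds:
            self._rotate()                   # the old identifier is simply abandoned
        return self.value


# Usage sketch: tag each (hypothetical) request with the rotating ID
ids = RotatingIdentifier()
request = {"session_id": ids.current(), "utterance": "set a timer for 10 minutes"}
```

The trade-off in this design is that the service still sees individual requests, but it loses the ability to stitch them into a long-term profile tied to a permanent account.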

Apple starts with the following premise: “At Apple, we believe privacy is a fundamental human right.” They go on to say in their Apple Privacy Statement: “At Apple, we build privacy into every product we make, so you can enjoy great experiences that keep your personal information safe and secure.” That is pretty strong language, but it is more marketing language than policy language.

Apple’s actual policy is not too dissimilar from Google’s and Amazon’s. Apple collects a lot of information about its users and uses much the same language, for example: “We also use personal information to help us create, develop, operate, deliver, and improve our products, services, content and advertising…” More specifics are described in the Apple Privacy Policy.

Mycroft

To round out this article’s look at virtual assistants, we’ll consider Mycroft’s privacy statement (they don’t appear to have a formal posted policy yet): “At Mycroft, we delete the recordings as they come in, unless you actively choose to share your data by opting into our open data set.” Even though Mycroft sends interactions to their servers for processing, they claim that commands are never stored; they are deleted as soon as they have been processed.

For more information about Mycroft see the Mycroft II Kickstarter page and their Usability vs. Privacy blog article.

Wait, there’s more…

The examples above show how the virtual assistant companies handle privacy, but this is only one aspect of smart home and office privacy. If you are really concerned about privacy, you also need to look at the policies of the device manufacturers and service providers integrated into your smart home or office solution. Say you are using Ecobee thermostats with Philips Hue lighting, Leviton light switches, and Netgear Arlo Pro security cameras: all of these devices collect information and communicate with your virtual assistant through the cloud and over the Internet. Essentially all of these manufacturers state something along the lines that they collect and use data related to device and app activity, as long as the information does not identify the individual or the device. The acceptable usage of that data ranges from “to help improve the product” to “without restriction.”

In comparison, credit card and cell phone companies collect more information about us than we probably care to dwell upon.

Reading privacy policies could be a full-time job. For almost everything we do nowadays, we are presented with agreements and policies that we must accept to use the related services. Sometimes these documents are a single page, but more often they are dozens of pages of fine print. Do any of us really take the time to read them before we click the “sign me up for free now” button? We ignore the small print and trust it all.

Privacy v. Function

Our privacy must always be of concern, and it must direct our choices. Understanding privacy in context is not a simple equation. A credit card company can know what you purchased, when, and where. A cell phone company can know where you are, what you do with your phone, and the specifics of your communications. A virtual assistant can listen in on conversations and know your routines, your whereabouts, and more. If you didn’t tolerate some level of information sharing, and extend some amount of trust to this modern system of services, you would be paying cash and doing a lot of walking or bike riding.

AI, virtual assistants, and smart homes and offices are new. We aren’t even close to understanding their reach, and we haven’t figured out what is “right” in terms of use, privacy, and legislation. The financial cost of using an AI solution is relatively small compared to the complexity of the technology. That is due in part to AI companies trying to gain market share, and in part to the valuable user and consumer information they capture and collect as part of their services, which arguably subsidizes those same services. In many ways, the information they collect is worth more than the dollar amounts they might otherwise charge for services and devices.

Should you be concerned?

AI services are not much different from any of the other modern products and services we have become accustomed to using every day. Nevertheless, if you are considering setting up a virtual assistant, go into it with your eyes wide open: assess your comfort level with the technology, and the amount of trust you are placing in the multitude of service providers that make up your virtual assistant solution.

They call this the Information Age. Information is everywhere.