Virtual Assistants - How Do They Handle Your Privacy?
If you use computing devices, the Internet, or any payment method other than cash, you are sharing information about… you. Unless you avoid the Internet, don't have a cell phone, don't use GPS, don't use a credit card, don't travel by air, don't rent cars, don't use EZ-Pass, and only pay in cash, your information, activities, and transactions are logged one way or another. Using modern technologies means having your information recorded. When considering a virtual assistant, privacy is often a top concern. This is in part because a voice-activated virtual assistant is always listening, and that feels like a departure from what we are used to.
Here’s how virtual assistants listen and how they handle what they hear.
Your virtual assistant listens in short (a few seconds) snippets for the hotword. Those snippets are deleted if the hotword is not detected, and none of that information leaves your device until the hotword is heard. When the virtual assistant detects that you've said the hotword, or that you've long-pressed the listen button on your device, the LEDs on top of the device light up to tell you that recording is happening; the virtual assistant records what you say and sends that recording (including the few-second pre-hotword recording) to our servers in order to fulfill your request.
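The listening behavior described above can be sketched as a rolling buffer: audio stays in a small local buffer and is discarded unless the hotword fires, and only then does anything leave the device. This is a hypothetical illustration, not any vendor's actual implementation; the function names, buffer size, and callbacks are all assumptions.

```python
import collections

# Assumption: keep roughly "a few seconds" of pre-hotword audio locally.
PRE_HOTWORD_SNIPPETS = 2

def process_stream(snippets, detect_hotword, send_to_server):
    """Hypothetical rolling-buffer hotword loop.

    Each snippet enters a small fixed-size buffer. If no hotword is
    detected, old snippets simply age out of the buffer and are
    discarded; nothing is transmitted. On a hotword, the buffer
    (including the pre-hotword audio) is sent for processing.
    """
    buffer = collections.deque(maxlen=PRE_HOTWORD_SNIPPETS)
    for snippet in snippets:
        buffer.append(snippet)
        if detect_hotword(snippet):
            send_to_server(list(buffer))  # includes pre-hotword snippets
            buffer.clear()                # nothing lingers locally
        # otherwise: snippet ages out of the deque and is never sent
```

The key privacy property is that `send_to_server` is only ever reachable after a positive hotword detection; everything else lives and dies in the on-device buffer.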
This language is actually pulled directly from Google's FAQ, edited to make it generic. It's an excellent description, and it is essentially how all virtual assistants function.
Here are more specifics on how the top virtual assistant companies handle or claim to handle your recorded information and your privacy:
Apple also collects your information, just as Amazon and Google do, but as usual, they do something unique. Apple collects all kinds of information about you when you use their devices and software, but when you use Siri, they anonymize your interactions. Unlike Amazon and Google, who use a "persistent personal identifier" (your Google and/or Amazon account) to record your interactions, Apple uses a random rotating identifier that refreshes every 15 minutes. This means Apple doesn't store your interactions against your AppleID profile; it stores them anonymously and relates them to you only temporarily using a random number. But as with the other companies, the disassociated interactions are stored for an unspecified amount of time on Apple's servers.
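The rotating-identifier idea can be sketched as follows. This is a hypothetical illustration of the general technique, not Apple's actual code; the class name and injectable clock are assumptions for the sake of the example.

```python
import time
import uuid

ROTATION_SECONDS = 15 * 60  # refresh window, per the 15-minute description

class RotatingIdentifier:
    """Hypothetical rotating anonymous identifier.

    Interactions within the same window share an ID (so a quick
    follow-up request can be correlated with the previous one), but
    the ID is random and never derived from a persistent account.
    """
    def __init__(self, clock=time.time):
        self._clock = clock       # injectable for testing
        self._issued_at = None
        self._id = None

    def current(self):
        now = self._clock()
        if self._id is None or now - self._issued_at >= ROTATION_SECONDS:
            self._id = uuid.uuid4().hex  # fresh random value, no user data
            self._issued_at = now
        return self._id
```

The trade-off this models: requests are still linkable within a short window (useful for conversational context), but there is no persistent key joining today's interactions to last month's.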
Apple starts with the following premise: “At Apple, we believe privacy is a fundamental human right.” They go on to say in their Apple Privacy Statement: “At Apple, we build privacy into every product we make, so you can enjoy great experiences that keep your personal information safe and secure.” That is pretty strong language, but it is more marketing language than policy language.
To round out this article's look at virtual assistants, we'll look at Mycroft's privacy statement (they don't appear to have a posted policy yet): "At Mycroft, we delete the recordings as they come in, unless you actively choose to share your data by opting into our open data set." Even though Mycroft sends interactions to their servers for processing, they claim that commands are deleted as soon as they are processed and are never stored.
Wait, there’s more…
The examples above show how the virtual assistant companies are handling privacy, but this is only one aspect of smart home privacy. If you are really concerned about privacy, you also need to look at the policies of the device manufacturers and service providers integrated into your smart home solution. Say you are using Ecobee thermostats with Philips Hue lighting, Leviton light switches, and Netgear Arlo Pro security cameras: all of these devices collect information and communicate with your virtual assistant through the cloud and over the Internet. Essentially all of these manufacturers state something along the lines that they collect and use data related to device and app activity as long as the information does not identify the individual or device. The acceptable usage of that data ranges from "to help improve the product" to "without restriction."
In comparison, credit card and cell phone companies collect more information about us than we probably care to dwell upon.
Reading privacy policies could be a full-time job. For almost everything we do nowadays, we are presented with agreements and policies that we must accept to use the related services. Sometimes these documents are a single page, but mostly they are dozens of pages of fine print. Do any of us really take the time to read them before we click the "sign me up for free now" button? We ignore the small print and trust it all.
Privacy v. Function
Our privacy must always be of concern, and it must direct our choices. Understanding privacy in context is not a simple equation. A credit card company can know what you purchased, when and where. A cell phone company can know where you are, what you do with your phone, and the specifics of your communications. A virtual assistant can listen in on conversations, know your routines, your whereabouts, and more. If you didn’t tolerate some level of sharing your information or exhibit some amount of trust to this modern system of services, you would be paying cash and doing a lot of walking or bike riding.
Should you be concerned?
AI services are not much different from any of the other modern products and services we have become accustomed to using daily. Nevertheless, if you are considering setting up a virtual assistant, go into it with your eyes wide open: assess your comfort level with the technologies, as well as the level of trust you place in the multitude of service providers that make up your virtual assistant solution.
They call this the Information Age. Information is everywhere.