Three members of the Schellman & Co. team were interviewed by ECT News Network on the issue of smart speakers (voice assistants) and consumer privacy. Are the devices always listening? What exactly is being recorded? You can read the entire article below or on the ECT News Network website.
Written by Jack M. Germain
Three leading smart speaker technology makers -- Amazon, Google and Apple -- have suspended contractor review of consumer recordings following disclosures that the devices are nearly always listening and have captured personal, business and other delicate human interactions.
Consumers using smart speakers and digital assistant apps from Amazon and Google can apply some control over their system settings to mute the device microphones and erase recordings.
Apple does not yet offer users a way to delete previously stored interactions recorded by its smart speaker.
In theory, smart speakers are intended to record only after consumers utter the wake phrase for the particular device -- "OK Google" for the Google Home speaker series, "Alexa" for the Amazon speaker, or "Hey Siri" for Apple's speaker device. In practice, unless users physically turn off the microphones or unplug their devices, the devices appear to do more than just snooze until they hear the wake-up word.
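Conceptually, a wake-word loop works something like the following sketch (illustrative Python only, not any vendor's actual implementation; the wake phrase and text matching are stand-ins for an on-device acoustic model): the device keeps a short rolling audio buffer in memory at all times, but only captures and transmits audio after the wake phrase is detected locally.

```python
from collections import deque

WAKE_PHRASE = "hey assistant"  # hypothetical wake word
BUFFER_FRAMES = 50             # short rolling window kept only in memory


def detect_wake_word(frames):
    """Stand-in for an on-device acoustic model; here we simply
    match text so the sketch stays self-contained."""
    return WAKE_PHRASE in " ".join(frames)


def listen_loop(audio_frames):
    """Continuously buffer audio locally; capture for upload only
    after the wake phrase is heard. Returns the frames that would
    leave the device."""
    ring = deque(maxlen=BUFFER_FRAMES)  # old frames fall off the end
    uploaded = []
    recording = False
    for frame in audio_frames:
        ring.append(frame)
        if not recording and detect_wake_word(ring):
            recording = True  # wake word heard: start capturing
            ring.clear()
        elif recording:
            uploaded.append(frame)  # only post-wake audio is kept
    return uploaded


# Private chatter passes through the buffer but is never uploaded;
# only the query following the wake phrase is.
frames = ["private", "chat", "hey assistant", "what", "is", "the", "weather"]
print(listen_loop(frames))  # → ['what', 'is', 'the', 'weather']
```

The crux of the privacy debate is the gap between this ideal and observed behavior: the microphone is always on by design, so everything hinges on how reliably the local detector gates what actually leaves the device.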
Smart speakers have been recording conversations when not specifically alerted to turn on, according to recent reports from publications in the European Union. Leaked recorded conversations include discussions of medical information, drug deals and intimate moments.
Although Apple, Google and Amazon recently implemented measures for consumers to control the human review of their recordings, there are still limitations on how much control consumers have over the data associated with these services, noted Chris Lippert, privacy manager at Schellman & Company.
"The associated devices will still log the activity made between the various paired apps and will still need to be listening at all times in order for consumers to actually use them on a daily basis."
"The associated devices will still log the activity made between the various paired apps and will still need to be listening at all times in order for consumers to actually use them on a daily basis," he told the E-Commerce Times.
Potential Privacy Provocations
The decision to pull the plug on third-party recording reviewers followed privacy problems uncovered among smart speaker users in Europe this summer. Some privacy scholars believe practices associated with Google Assistant violate the EU's GDPR rules.
There were harsh reactions after someone leaked 1,000 private conversations recorded by Google Assistant via Google Home smart speakers to a Belgian news outlet. The audio recordings involved users from the Netherlands and Belgium.
Amazon recently admitted that it stores its Alexa recordings indefinitely, which critics argue violates GDPR rules that say personal data cannot be stored by companies for longer than necessary.
Apple permitted third-party contractors to listen to questions consumers asked Siri, according to a whistleblower's report published in The Guardian. Workers reportedly listened to the recordings to help Siri improve and determine if the request was handled correctly.
Amazon and Google have defended their technology in the wake of the controversial reports about their use of the recordings.
"We take customer privacy seriously and continuously review our practices and procedures. Customers can review voice recordings associated with their account and delete those voice recordings one by one or all at once," an Amazon spokesperson said in comments provided to the E-Commerce Times by company rep Samantha Kruse.
Google employs language reviewers to review and transcribe a small set of queries to better understand different languages. It transcribes only a small fraction (0.2 percent) of the audio snippets. Those recordings are not associated with user accounts, the company said in an online post.
"Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally. You can turn off storing audio data to your Google account completely, or choose to auto-delete data after every three months or 18 months," a Google spokesperson said in comments provided to the E-Commerce Times by company rep Ashley Thompson.
In the reported cases, it seems that humans listened to recorded phrases to see if speech recognition and intent detection provided accurate results. Presumably, the phrases were not linked to any individual's identity, said John Foster, CEO of Aiqudo.
"So I think the privacy angle may be overblown. What's important is that each of the big voice platforms be open about their processes, which would have been a much better approach. Be clear and transparent about what you are doing and why, rather than having to come up with these opt-out policies after the fact, which end up looking reactive and ad hoc," he told the E-Commerce Times.
The end result likely will be increased wariness among consumers over using their voice assistants, Foster said, which sets back the whole voice industry and impacts consumers negatively as well.
"Voice, done properly, will make our lives easier and our interactions with our technology frictionless, but these issues will delay progress toward that goal," he noted.
Removing human auditors who review the quality of the transcriptions can make machine understanding of commands slightly worse. However, it can produce much stronger privacy safeguards, suggested Ka Mo Lau, COO of Thunder Experience Cloud.
"It will simply take more time across more users for tech companies to achieve better-quality voice command understanding," he told the E-Commerce Times. "Users concerned by privacy should not keep voice assistant devices in sensitive areas, such as the bedroom. Similar to growing usage of camera blockers, privacy-focused consumers can cover their audio devices to muffle input when having private conversations."
How to Control the Recordings
How much control users have over access to the recordings varies by device. Smart speaker users need to exercise a degree of care, just as they should with video recording devices.
Users can turn off human review on their Amazon and Google smart speakers. They also can delete previously recorded interactions.
Amazon customers can go to Settings > Alexa Privacy in the Alexa app or visit the website.
Alexa users also can enable the ability to delete their recordings by voice. Once enabled, they can delete the voice recording of their last request by saying, "Alexa, delete what I just said." Or they can delete all the voice recordings from their account for the day by saying, "Alexa, delete everything I said today."
To turn off human review of your Alexa interactions:
- Open the Amazon Alexa app.
- Go to Menu > Settings > "Alexa Privacy."
- Find and tap "Manage How Your Data Improves Alexa."
- Toggle off the switch next to "Help Improve Amazon Services and Develop New Features."
Alternatively, you can visit this Web page.
You also can delete any of your existing Alexa voice recordings by going back to the Alexa Privacy screen in the app and selecting "Review Voice History."
To turn off the human review of your Google Home interactions:
- Open the Google Home app and select "Account."
- Tap on "More Settings," and then select "Your Data In the Assistant."
- Tap on "Voice & Audio Activity" and toggle off.
To delete any of your existing Google Assistant recordings, go to "Manage Activity" in the Google Home app.
While Apple Siri users cannot delete stored recordings, they can turn off new Siri recordings. However, they must do it manually on each Apple device. That is not necessary on the Apple Watch or HomePod.
Here is how to turn off new Siri recordings:
- Go to Settings > Siri & Search.
- Turn off "Listen for 'Hey Siri.'"
- Turn off "Press Home for Siri" (on phones with home buttons).
- Turn off "Press Side Button for Siri."
- Go back to Settings > General > Keyboard and turn off "Enable Dictation."
Proactive Consumer Actions
Smart speaker users can further mitigate risk by taking several additional steps, said Avani Desai, president of Schellman & Company.
First, most virtual assistants have a mute button. When having confidential conversations, whether about clients or internal matters, mute the hardware, she suggested. If there is no physical mute button -- as with Siri, for instance -- you can tell her to stop listening and then confirm she has stopped.
Second, make sure you have implemented two-factor authentication on the apps to review, change or update settings on the voice assistant devices. Passwords are not enough -- nor is SMS/text 2FA.
"Get yourself a hardware key for 2FA so it is impossible for unauthorized users to remote into your system and capture the data," Desai told the E-Commerce Times.
Third, review and then delete your historical data. Deleting this on an ongoing basis or turning off the history can mitigate that risk, she added. Last, unplug the device when you do not need it.
The Privacy Road Ahead
The major smart speaker providers could give consumers the ability to modify what is shared, potentially with a "you need to share X to enable Y" approach, said Doug Barbin, cybersecurity and emerging technologies practice leader at Schellman.
"While consumers certainly share privacy concerns, many are willing to share a certain level of information for the convenience offered through integrated home management and entertainment capabilities," he told the E-Commerce Times. "At a minimum, start by giving them the option."
Two other concerns need consumers' attention, noted Schellman's Lippert.
First, you may be able to limit the human review aspect, but will there also be a limitation to how artificial intelligence can analyze the voice data? Also, how long will the voice data be stored in raw form and also for analytical purposes?
"There will always be limitations in some of these areas as consumers will naturally want their voice assistants to improve over time with the rest of their technology," Lippert said.
One potential solution, he suggested, may be introducing some sort of anonymization/data aggregation option for consumers.
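As an illustration of the kind of aggregation Lippert describes (a hypothetical sketch, not any vendor's actual pipeline; the threshold and data shapes are assumptions), per-user voice queries could be stripped of identifiers and reported only as counts, with rare queries suppressed so no individual's request stands out:

```python
from collections import Counter

K = 2  # suppression threshold: drop queries seen fewer than K times


def aggregate_queries(log):
    """log: list of (user_id, query) pairs. Returns a query -> count
    mapping with user identifiers dropped and rare queries suppressed."""
    counts = Counter(query for _user, query in log)
    return {q: n for q, n in counts.items() if n >= K}


log = [
    ("u1", "weather today"),
    ("u2", "weather today"),
    ("u3", "play jazz"),
    ("u1", "refill my prescription"),  # unique, so it is suppressed
]
print(aggregate_queries(log))  # → {'weather today': 2}
```

Aggregates like this would still let a vendor spot common phrases its models mishear, while the sensitive one-off request never appears in the analytics output.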
About the Schellman team members interviewed in this article:
Chris Lippert is a Manager and Privacy Technical Lead with Schellman and is based in Atlanta, GA. With more than 6 years of experience in information assurance across numerous industries, regulations and frameworks, Chris developed a passion for and concentration in data privacy. He is an active member of the International Association of Privacy Professionals (IAPP), holding his Fellow of Information Privacy (FIP) designation, and advocates for privacy by design and the adequate protection of personal data in today’s business world.
Avani Desai is the President at Schellman. Avani has more than 15 years of experience in IT attestation, risk management, compliance and privacy. Avani's primary focus is on emerging healthcare issues and privacy concerns for organizations. Named one of the 2017 Global Leaders in Consulting by Consulting Magazine, she has also been featured and published in the ISSA Journal, ITSP Magazine, ISACA Journal, Information Security Buzz, Healthcare Tech Outlook, and many more.
Doug Barbin is a Principal at Schellman & Company, LLC. Doug leads all service delivery for the western US and also oversees firm-wide growth and execution for security assessment services, including PCI, FedRAMP, and penetration testing. He has over 19 years of experience. A strong advocate for cloud computing assurance, Doug spends much of his time working with cloud computing companies and has participated in various cloud working groups with the Cloud Security Alliance and the PCI Security Standards Council, among others.