WASHINGTON, D.C. – U.S. Senator Chris Coons (D-Del.), a member of the Senate Judiciary Committee, today sent a letter to Amazon Chief Executive Officer Jeff Bezos requesting information about the company’s privacy and data security practices for Alexa devices. The letter follows recent news reports indicating that Amazon stores and indefinitely preserves text transcripts of user voice recordings, a practice that potentially puts Amazon users’ privacy at risk.
 
Last year, Coons and Senator Jeff Flake (R-Ariz.) sent a letter to Bezos regarding privacy concerns about the Amazon Echo and its voice-activated software, Alexa. In response, Amazon highlighted many of its privacy protections, including the ability for a consumer to delete any and all voice recordings that Alexa sends to the cloud.
 
“Unfortunately, recent reporting suggests that Amazon’s customers may not have as much control over their privacy as Amazon had indicated,” Senator Coons wrote. “While I am encouraged that Amazon allows users to delete audio recordings linked to their accounts, I am very concerned by reports that suggest that text transcriptions of these audio recordings are preserved indefinitely on Amazon’s servers, and users are not given the option to delete these text transcripts. The inability to delete a transcript of an audio recording renders the option to delete the recording largely inconsequential and puts users’ privacy at risk.”
 
In the letter, Coons requests information on the types of data Amazon collects, stores, and preserves, as well as the degree to which consumers control their personal information.
 
Full text of the letter can be found below. 
 
Mr. Jeff Bezos                    
Chief Executive Officer
Amazon.com, Inc.
410 Terry Avenue North
Seattle, WA 98109  
 
Dear Mr. Bezos:
 
I write regarding Amazon’s privacy and data security practices in light of recent news reports indicating that your company stores and indefinitely preserves text transcriptions of user voice recordings, a practice that potentially puts Amazon users’ privacy at risk.  
 
Last year, I wrote a letter with Senator Flake regarding concerns about the Echo, Amazon’s Internet-connected personal assistant-style device, and its voice-activated software, Alexa.  In response to this letter, Amazon highlighted many of its privacy protections.  These included assurances about Alexa’s on-device buffer, as well as the ability for a consumer to delete any and all voice recordings that the Alexa system sends to the cloud.
 
Unfortunately, recent reporting suggests that Amazon’s customers may not have as much control over their privacy as Amazon had indicated.  In the letter that we received on July 27, 2018, Amazon wrote that “[c]ustomers can review and listen to the voice recordings associated with their account in the Alexa app, and delete them individually or all at once, which also deletes them from our server.”  While I am encouraged that Amazon allows users to delete audio recordings linked to their accounts, I am very concerned by reports that suggest that text transcriptions of these audio recordings are preserved indefinitely on Amazon’s servers, and users are not given the option to delete these text transcripts.  The inability to delete a transcript of an audio recording renders the option to delete the recording largely inconsequential and puts users’ privacy at risk.
 
In Amazon’s response to my concerns last year, I was assured that “[f]rom early-stage development, [Amazon] built privacy deeply into the hardware and service by design, and with Alexa and Amazon’s Alexa-enabled products we strive to put the control with our customers.”  If consumers cannot delete transcripts of their voice recordings, I am concerned that Amazon has not lived up to this standard.
 
The increasing popularity of in-home, Internet-connected devices and voice-activated technologies raises questions about the types of data they collect, store, and share, as well as the degree to which consumers control their personal information.  Companies like Amazon that offer services through these devices should address these concerns by prioritizing consumer privacy and protecting sensitive personal information. 
 
As a member of the Judiciary Committee, I have a longstanding interest in the privacy and security of consumers’ personal data, including information collected by in-home, Internet-connected devices and voice-activated technologies.
  
I therefore request that Amazon provide answers to the following questions.

1. Regarding transcripts of user voice recordings that have been sent to the cloud: 
      • How long does Amazon store the transcripts of user voice recordings? 
      • Do users have the ability to delete any or all of these transcripts?  
      • Are there any transcripts that a user cannot delete?  If so, why is a user permitted to delete voice recordings but not the corresponding text transcript?
      • For what purpose does Amazon use these transcripts?
      • Does Amazon take any measures to anonymize user identity and other information in relation to these transcripts?

2. Does Amazon preserve, either in audio or textual form, the Alexa system’s responses to user questions?  If so, is a user able to delete the audio or the transcript of such a response?

3. Regarding the wake word processing system:
      • In determining when a user has finished issuing a command, how long, on average, does the system wait before it stops recording?
      • Is any audio that is captured by the on-device buffer ever sent to the cloud if the wake word is not detected?
      • Is audio that is captured and stored in the device’s temporary memory transcribed by the automatic speech recognition system?  If so, are those transcripts similarly stored only in the device’s temporary memory, or are they sent to the cloud?
      • Amazon’s July 27, 2018 letter indicates that the Alexa system comes with a setting whereby a user can allow Alexa to respond to a series of requests without the customer needing to repeat the wake word.
            • Is this a default setting, or does a consumer need to affirmatively enable this setting?
            • For how long does Alexa listen for subsequent commands after the wake word is spoken when this setting is enabled?

I appreciate your prompt attention to this matter and would respectfully request a response by June 30, 2019.

 
                                                                                                Sincerely,