Woman Learns That Amazon Stores Thousands Of Her Voice Recordings


By Mayukh Saha / Truth Theory

A website's privacy policy is usually put right in front of our faces, and we are repeatedly told to read it. But its sheer length makes most of us click past it while signing up. To be fair, it does not make for an interesting read, being long-winded and filled with jargon. Yet it is the only place where we learn everything we are agreeing to when we check that box.

One TikToker has taken it upon herself to help those who do not read that section. She reviews popular tech products, posts their privacy policies, and highlights the parts that might be red flags.

Your Amazon Data Can Be Freely Requested

The TikToker in question goes by the apt handle @my.data.not.yours. She recently rated Amazon's privacy policy, giving it an 8 out of 10, and told her audience that they can ask Amazon at any time for a copy of whatever data the company has collected about them.

But one section of the privacy policy caught her attention: Amazon states that it may store recordings of your voice from your conversations with Alexa, and that images and videos related to Amazon Services may also be collected or stored.



This prompted the TikToker to request a copy of her own data, and what she received shocked her. Amazon had stored 3,354 short audio clips from her interactions with her Alexa-enabled devices: one Echo and two Echo Dots.

Read: Apple Privacy Update Shows Tik-Tok Is Harvesting Sensitive Data From Millions Of Users 

On her TikTok, she showed a huge folder containing every audio clip, then played a recording in which she could be heard telling Alexa to switch on the lights. A more shocking discovery awaited her: Amazon had the complete list of contacts on her phone, even though she could not remember ever synchronizing that data with Alexa. Finally, the folder also contained data on her location.

More Educational Than Shocking

Since it was uploaded, the short TikTok about her Amazon data has been watched more than 2.5 million times and has received over 185,000 likes. Most users took the revelation humorously.

One commenter joked that this means Amazon also has all the clips of them verbally abusing their device when it failed to listen. Another said this was hardly 'scary', since there was nothing in their audio or contacts interesting enough to mind Amazon having. A third pointed out that the data collection exists to make the user's life simpler, and that it is detailed in the terms and conditions we agree to while setting up the device.

The TikToker has since written a guide showing how others can request their data as well.

Read: Guy Claims To Expose How Pigs Are Fed Plastic And Paper And Got Fired For Doing So 

Even though 3,354 clips may sound like a large number, Echo devices are programmed to listen only for the wake word chosen by the user. No audio is sent to or stored in the cloud until the device detects this wake word. Users can also tell when Alexa is communicating with the cloud: the device plays a tone or lights up blue.

Alexa users are also reminded to review their privacy preferences a month after setting up their devices, and Amazon sends them an annual email outlining the available privacy features and settings. In that sense, there was nothing extraordinary about the TikToker's story. But it should serve as a reminder to spend a little more time with privacy policies.



Feature Image Credits: @my.data.not.yours
