In the world of smart assistants, Amazon Alexa is one of the most popular and widely used. It’s accessible through smart speakers, phones, and smartwatches, and while it’s an incredibly powerful tool, that convenience comes at the expense of your privacy.
Privacy concerns surrounding Alexa, Google Assistant, and the like aren’t new, but according to a new report from Bloomberg, employees of Amazon’s Alexa Data Services team have access to users’ precise home addresses — and in some cases, full names, phone numbers, and more.
The Alexa Data Services team is in charge of reviewing recordings of Alexa conversations to help train the AI and has thousands of members in Boston, India, and Romania. Per Bloomberg:
Team members with access to Alexa users’ geographic coordinates can easily type them into third-party mapping software and find home residences, according to the employees, who signed nondisclosure agreements barring them from speaking publicly about the program.
While there’s no indication Amazon employees with access to the data have attempted to track down individual users, two members of the Alexa team expressed concern to Bloomberg that Amazon was granting unnecessarily broad access to customer data that would make it easy to identify a device’s owner.
Bloomberg says it was shown a demonstration of this: an Amazon employee took the latitude and longitude of one user stored in Amazon’s database, entered it into Google Maps, and pulled up an image of that area and the user’s house in “less than a minute.”
In addition to location data, a smaller category of employees “who tag transcripts of voice recordings to help Alexa categorize requests” have access to even more data:
After punching in a customer ID number, those workers, called annotators and verifiers, can see the home and work addresses and phone numbers customers entered into the Alexa app when they set up the device, the employee said. If a user has chosen to share their contacts with Alexa, their names, numbers and email addresses also appear in the dashboard. That data is in the system so that if a customer says “Send a message to Laura,” human reviewers can make sure transcribers wrote the name correctly so that the software learns to pair that request with the Laura in the contact list.
Amazon has since issued a statement in response to this story, saying:
Access to internal tools is highly controlled, and is only granted to a limited number of employees who require these tools to train and improve the service by processing an extremely small sample of interactions. Our policies strictly prohibit employee access to or use of customer data for any other reason, and we have a zero tolerance policy for abuse of our systems. We regularly audit employee access to internal tools and limit access whenever and wherever possible.
Despite that statement, one employee who spoke with Bloomberg said they believed “the vast majority of workers in the Alexa Data Services group were, until recently, able to use the software.”
While there’s no reason to believe any user information was compromised or used maliciously, the fact that such sensitive data is so readily available is not a good look for Amazon. Then again, if you’re trusting a device to live in your house and constantly listen to you, how concerned can you really be about things like this?