According to a new report from market research firm Canalys, there will be more than 200 million smart speakers in homes around the world by the end of this year. Devices such as the Amazon Echo, Apple HomePod and Google Home were very popular Christmas gifts last year – and will probably be again this year.
The ability to stream millions of songs, answer basic questions, and compile shopping lists has helped make digital assistants a useful addition to our homes. But questions about privacy and security continue to cause problems.
Always on, always listening?
Smart speakers are designed to respond to specific keywords so that they know when you are talking to them. Say the trigger word ("Hey Siri", "Alexa", "OK Google") and the speaker knows that the words that follow are an instruction. Which means the speaker really is always listening.
The good news is that only the words spoken after the trigger word are uploaded and stored in the service provider's cloud. However, this may not stay that way – Google already holds a patent for technology that analyzes background noise in order to serve advertisements to the device owner. And Amazon filed a similar patent last year.
It would be relatively easy to integrate this feature into smart speakers someday. So it may be that privacy is about to become much harder to protect.
Is your voice data already being misused?
Last month, reports revealed that Amazon employees have access to private recordings made by Alexa owners. Amazon insists that these recordings are reviewed to ensure that Alexa responds correctly to commands.
But Amazon employees are also asked to review recordings where Alexa responded incorrectly. Bloomberg claims that these recordings included a woman singing in the shower and a child screaming for help. And unlike Apple's voice analysis, Amazon's recordings are not anonymous: employees who listen to a recording can see the account name of the person who owns the Echo speaker.
How to protect your Alexa recordings
Unfortunately, there is no way to prevent your Alexa recordings from being reviewed by Amazon. However, you can delete them – and if you remove them quickly enough, Amazon will not have had a chance to listen to them.
You can access (and listen to) all your recordings on the Alexa Privacy Settings web page. Every recording must be deleted manually – there is no way to have them deleted automatically – so you will need to remember to log in regularly and clear out new recordings. There are some more instructions in our blog post, "Your virtual assistant knows quite a lot about you".
Whether you have an Amazon Echo or an Apple HomePod, you still have to protect yourself. Unfortunately, this means that you will probably have to be more careful with what you say at home – and never say anything that sounds like the trigger word unless you want to talk to your digital assistant.
Voice assistants are still an emerging technology, and tech giants such as Amazon, Google and Apple are still trying to find the right balance between functionality and privacy. Given the negative reaction to this latest Amazon data-sharing incident, we should (hopefully) see these companies take action sooner rather than later.