With a recent change to Alexa, you can now opt out of having your voice recordings reviewed by Amazon's employees.

Why the change? Is it the reports of Google and Apple employees listening in on personal exchanges recorded by the voice-activated assistants? Or perhaps it's a response to European court rulings limiting what companies (like Amazon) can do with the data collected by said voice-activated assistants.
To Opt-Out of Human Review
- Open the Alexa app on your phone or tablet.
- Select Settings, Alexa Privacy, then Manage How Your Data Improves Alexa.
- Locate the setting “With this setting on, your voice recordings may be used … manually reviewed.” and tap OFF.
Siri, Cortana, Alexa: each is markedly female. And, despite settings that allow you to modify your voice-activated assistant, the voices have been decidedly binary.

Until now. Meet Q.
Created by a group of linguists, technologists, and sound designers, Q hopes to “end gender bias” and encourage “more inclusivity in voice technology.” They recorded the voices of two dozen people who identify as male, female, transgender, or non-binary in search of a voice that typically “does not fit within male or female binaries.” To find this voice, the Q team conducted a test involving over 4,600 people, who were asked to rate the voice on a scale of 1 (male) to 5 (female).
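As a rough illustration of how such a rating test might be scored (the Q team hasn't published their exact method, so the sample data and the "neutral" band below are invented for this sketch), a voice could be judged gender-neutral when its average rating lands near the midpoint of the 1–5 scale:

```python
# Hypothetical sketch: aggregating listener ratings on a
# 1 (male) to 5 (female) scale. The data and the neutrality
# band are assumptions, not from the Q project.

def mean_rating(ratings):
    """Average of listener ratings on the 1-5 scale."""
    return sum(ratings) / len(ratings)

def is_gender_neutral(ratings, low=2.5, high=3.5):
    """Treat a voice as 'neutral' if its mean rating falls near
    the scale midpoint (3); the band is an assumed threshold."""
    return low <= mean_rating(ratings) <= high

# Example: a voice most listeners place near the middle of the scale.
sample = [3, 3, 2, 4, 3, 3, 4, 2, 3, 3]
print(mean_rating(sample))        # 3.0
print(is_gender_neutral(sample))  # True
```

In practice the team reportedly iterated on the recorded voice's pitch and timbre until listeners stopped clustering it at either end of the scale.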
For more on the research and technology behind Q, visit TheNextWeb.
With Google's “Interpreter mode,” your Home devices can act as an on-the-fly translator. One person speaks one language, the other person speaks another, and Google Assistant tries to be the middleman between the two.
To get started, you just say something like “Hey Google, be my Spanish interpreter.”
via Google Home can now translate conversations on-the-fly — TechCrunch
We’ve heard about her creepy laugh, but this may be the first reported instance of her tattling on her master’s family. In what Amazon calls an “unlikely but possible” string of events, one Echo device recorded a family’s private conversation and sent the recording to a contact.
Apparently, the device interpreted a word in the background conversation as “Alexa,” prompting her to wake up and take notice. She then interpreted the ensuing words in the conversation as a request to send a message.

Alexa’s query, “To whom?” went unheard among the conversants, and she (Alexa) interpreted a word in the conversation as the name of a saved family contact. Asking for confirmation (“________, right?”), she interpreted, among the following words, a confirming “right.”

Then she did what she believed she was told to do.
Amazon’s statement: “As unlikely as this string of events is, we are evaluating options to make this case even less likely.”