
Hey Alexa, come clean about how much you’re really recording us

© Provided by WP Company LLC d/b/a The Washington Post

By Geoffrey Fowler, The Washington Post

We’re learning an important lesson about cutting-edge voice technology: Amazon’s Alexa is always listening. So are Google’s Assistant and Apple’s Siri.

Putting live microphones in our homes has always been an out-there idea. But tech companies successfully marketed talking speakers such as the Amazon Echo and Google Home to millions by assuring us the devices record only when we say a “wake word.”

That assurance turns out to be misleading. These devices are always “awake,” passively listening for the command that activates them, such as “Alexa,” “O.K. Google,” or “Hey Siri.” The problem is they’re far from perfect about responding only when we want them to.

The latest, and most alarming example to date: A family in Portland, Ore., two weeks ago found its Echo had recorded a private conversation and sent it to a random contact. The event, reported by Washington state’s KIRO 7, went viral Thursday among Echo owners — and naysayers on the idea of allowing tech companies to put microphones all over our homes.

Privacy is the one aspect of Alexa that Amazon can’t afford to screw up. (Amazon's chief executive, Jeffrey P. Bezos, owns The Washington Post.)

Amazon, in a statement, made it sound like the Portland case involved a sequence of events you might expect in a “Seinfeld” episode. It said the Echo woke up when it heard a word that sounded like Alexa. "The subsequent conversation was heard as a 'send message' request. At which point, Alexa said out loud 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list."

Amazon also said the incident was rare and it is “evaluating options to make this case even less likely.”

But how often do these devices go rogue and record more than we’d like them to? Neither Google nor Amazon immediately responded to my questions about false positives for their “wake words." But anyone who lives with one of these devices knows it happens.

As a tech columnist, I’ve got an Echo, Google Home and Apple HomePod in my living room — and find at least one of them starts recording, randomly, at least once per week. It happens when they pick up a sound from the TV, or a stray bit of conversation that sounds enough like one of their wake words.

Separating a command from surrounding home noise — especially loud music — is no easy task. Amazon’s Echo uses seven microphones and noise-canceling tech to listen for its wake word. As it listens, it records about a second of ambient sound on the device, which it constantly discards and replaces. But once it thinks it hears its wake word, the Echo’s blue light ring activates and it begins sending a recording of what it hears to Amazon’s computers.
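That rolling, on-device buffer can be pictured as a small ring buffer that is continuously overwritten, with nothing leaving the device until something sounds like the wake word. Here is a minimal sketch in Python — the class, its names, and the wake-word check are illustrative assumptions, not Amazon’s actual implementation:

```python
from collections import deque

class WakeWordBuffer:
    """Illustrative sketch of an always-listening device: roughly one
    second of audio is kept locally and constantly overwritten; audio
    streams out only after a (possibly mistaken) wake-word match."""

    def __init__(self, sample_rate=16000, seconds=1.0):
        # Fixed-size ring buffer: old samples fall off as new ones arrive.
        self.buffer = deque(maxlen=int(sample_rate * seconds))
        self.streaming = False  # flips to True on a wake-word match

    def feed(self, samples, sounds_like_wake_word):
        self.buffer.extend(samples)
        if sounds_like_wake_word:
            # A near-match -- say, TV audio that resembles "Alexa" --
            # is enough to start streaming: the over-recording problem.
            self.streaming = True
        # Before a match, audio is discarded locally and nothing is sent.
        return list(self.buffer) if self.streaming else None

# Tiny demo with a 4-sample "second" of audio:
buf = WakeWordBuffer(sample_rate=4, seconds=1.0)
print(buf.feed([1, 2, 3], sounds_like_wake_word=False))  # None: stays on-device
print(buf.feed([4, 5], sounds_like_wake_word=True))      # [2, 3, 4, 5]: streams out
```

The key point the sketch makes is that “muting” in software would only stop the upload step; the microphone and buffer are live the whole time, which is why the Echo’s mute button physically disconnects the microphone instead.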

Over-recording isn’t just an Amazon problem. Last year, Google faced a screw-up where some models of its Home Mini were set to record everything and had to be patched. Earlier this month, researchers reported they were able to make Siri, Alexa and Google’s Assistant hear secret audio instructions undetectable to the human ear.

So what should you do about this? You can mute these devices, which in the case of the Amazon Echo physically disconnects the microphone — until you’re ready to use it. But that partly defeats the usefulness of a computer you can just holler at when your hands are otherwise occupied.

Another approach is to turn off some of the more sensitive functions in the Alexa app, including making product purchases via voice. You can also turn off the “drop in” feature that lets another Echo automatically connect and start a conversation.

You also have the ability to dig deeper into what’s being recorded. Prepare to be a bit horrified: Amazon and Google keep a copy of every single conversation, both as a nod toward transparency and to help improve their voice-recognition and artificial intelligence systems. In the Alexa app and on Google's user activity site, you can listen to and delete these past recordings. (Apple also keeps Siri recordings, but not in a way you can look up — and anonymizes them after six months.)

The nuclear response is to unplug your smart speaker entirely until the companies come clean about how often their voice assistants over-listen — and what they’re doing to stop it.
