Erasing Alexa | God's World News

01/01/2020
  • Uh-oh! Did Alexa hear that? (AP)
  • Tell Alexa to erase what it has heard. (AP)

“Forget I said that.” Ever wish you could erase what comes out of your mouth? Amazon’s Alexa virtual assistant not only hears but also records every word . . . and saves it too. Amazon has settings to let users erase voice recordings automatically. But the company warns that deletions hamper Alexa’s service. That forces users to choose between privacy and quality.

The idea that someone (or something!) might be listening to everything we say seems alarming. It has many folks either worried or creeped out. Christians know that God is omniscient. He sees and hears all—without high-tech help. But unlike people and businesses, His intentions for us are always for good. The Alexa debate is a reminder to “let no corrupting talk come out of your mouths” (Ephesians 4:29).

Alexa is one of several smart services that respond to user questions and commands. When Alexa (or Siri or Google . . .) hears a “wake word,” it knows a request is coming: “Alexa, will it rain today?” “Hey Siri, play ‘The World and Everything in It’ podcast.” Virtual assistants (VAs) can make to-do lists, give news and weather updates, play music, set alarms, and much more. The services can also control smart devices like lights and thermostats.

VAs “learn” about users from requests. To improve service (and, ahem, marketing), the companies behind the VAs save voice recordings—sometimes indefinitely.

That practice raises concerns among privacy experts. Companies often use human reviewers to sort through some requests, and that human involvement is where the worry lies: Unscrupulous employees could leak private details embedded in the voice commands or in conversations that the VA “overhears.”

Previously, most users had to delete recordings manually. Now Alexa users can change the settings to have Amazon automatically delete recordings either immediately or after a certain length of time. But automatic erasure triggers a warning about degrading Alexa’s ability to respond or understand.

Apple, Microsoft, Google, and Amazon all claim they strip personal info from recordings before reviewers see them. They also say that a real person reviews less than one percent of conversations with smart assistants. But while assuring users that the listening is no big deal, VA makers are either increasing their warnings about the recordings or putting other safeguards—including strict hiring policies—in place.

Today’s tech users are starting to realize what makes smart devices so smart: They’re always listening—and often watching too. Users will need to decide whether they’re okay with devices constantly eavesdropping on their lives.