“Hello, how can I help you?”
“Hi, I’m calling to book a woman’s haircut for a client. Umm, I’m looking for something on August third.”
“Sure, give me one second.”
The exchange sounds pleasant. But it’s no ordinary phone call. Half of the conversation is computer-generated. Can you guess which half? A machine called to make the hair appointment. This example from a recent technology demonstration represents a new challenge for artificial intelligence experts: how—and whether—to let people know there’s a bot on the line.
Tech company Google recently revealed a new computer assistant, Duplex. The assistant made human-sounding phone calls, including the salon call above. Duplex’s speech includes verbal clutter and slang—words like uh and gotcha. Duplex even understands thick accents. At least in the demos, the people don’t seem to realize they’re talking to a machine.
But critics say that kind of mix-up raises a basic question: Is it fair—or even legal—to trick people into talking to an artificial intelligence (AI) system . . . especially since computers can record all conversations? In some places, recording a phone call without the consent of both the caller and the person being called is already illegal.
Duplex and other automated technologies show that computer-dialed phone calls, or robocalls, can have harmless uses. “It’s important to us that users and businesses have a good experience with this service,” Google engineers say.
But sadly, humans are often bent on doing wrong. (Genesis 6:5) So what happens when unethical people program computers to start making calls?
Matthew Fenech researches AI’s effect on policies and laws. He calls the Duplex technology “impressive.” But he warns, “It can clearly lead to more sinister uses.” Fenech envisions nefarious robocalls—ones that spam businesses, scam seniors, or steal the voices of political or personal enemies. (Just think of the fake soundbites a computer could generate!)
Duplex technology is still in the testing stage. Google and other AI developers will need to carefully follow laws about privacy. Those can vary by state or by country. And consumers will need to start being more aware of who—or what!—they’re talking to.