Massachusetts Institute of Technology, or MIT, is at it again. This time, a tech team says it has developed a virtual assistant headset that can capture and translate human thoughts.
The headset, called AlterEgo, can’t actually read your mind. But it can interpret the words you intentionally think inside your head and then relay those words to another device. For instance, if you think “Hey, Siri,” the headset could interpret that phrase and send a message to activate Siri, the virtual assistant used in Apple operating systems.
Here’s how it works. AlterEgo is a wearable technology headset. It’s noninvasive, but it fits onto a person’s face so that sensors can pick up tiny nerve signals the brain sends to the muscles that operate the jaw and face. When the wearer intentionally thinks a thought in words, such as “What is 25 percent off of $37?” the headset recognizes, from those nerve signals, the words the jaw and face muscles would be forming.
At that point, the message can be transmitted to another device wirelessly and silently. The headset could send the question to a calculator app to do the math and respond with the answer through the headset’s earpiece. Or it might send the message to a virtual assistant, such as Google Assistant, to look up the answer and respond. If desired, the message could even be transferred to an audio speaker and broadcast to anyone in earshot. All that happens without the wearer ever opening his or her mouth!
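The math in the example question is trivial for a computer once the words are recognized. Here’s a minimal sketch in Python of how a calculator app might answer the query from the article (the function name and the routing are hypothetical illustrations, not part of AlterEgo itself):

```python
def percent_off(price, percent):
    """Return the price after taking the given percent off."""
    return price * (1 - percent / 100)

# The example query from the article: 25 percent off of $37
answer = percent_off(37, 25)
print(f"${answer:.2f}")  # → $27.75
```

The hard part, of course, is not the arithmetic but reliably turning nerve signals into the right words in the first place.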
At present, AlterEgo can read internalized thoughts with 92 percent accuracy. The MIT team says that as the technology continues to improve, it will become possible to control any computer using the mind alone.
In one test, a wearer used the headset while playing chess. He internally described each move his opponent made. The headset interpreted those thought words and sent them to a chess strategy program. After each move, the program reported the next best move for the wearer through the headset’s earpiece.
Potential uses for the headset include secret military operations—situations in which it may be beneficial for personnel to communicate silently and receive detailed instructions the same way. It could also give an artificial voice to those who are unable to speak out loud. Those are potentially good uses. But knowing human nature and its tendency to corrupt every good thing for personal gain (see Genesis 6:5), others worry the device might be used wrongly. It could make cheating on tests harder to detect. Even worse, some envision how AlterEgo could be misused to forcefully extract information that another person might not willingly share.