Activate This 'Bracelet of Silence,' and Alexa Can't Eavesdrop

[Photo caption: The bracelet, a cuff with spiky transducers, has 24 speakers that emit ultrasonic signals when the wearer turns it on; University of Chicago, Illinois, Feb. 13, 2020.]

Last year, Ben Zhao decided to buy an Alexa-enabled Echo speaker for his Chicago home. Zhao just wanted a digital assistant to play music, but his wife, Heather Zheng, was not enthused. “She freaked out,” he said.

Zheng characterized her reaction differently. First she objected to having the device in their house, she said. Then, when Zhao put the Echo in a workspace they shared, she made her position perfectly clear: “I said, ‘I don’t want that in the office. Please unplug it. I know the microphone is constantly on.’”

Zhao and Zheng are computer science professors at the University of Chicago, and they decided to channel their disagreement into something productive. With the help of an assistant professor, Pedro Lopes, they designed a piece of digital armor: a “bracelet of silence” that can jam the Echo, or any other microphones in the vicinity, to keep them from listening in on the wearer’s conversations.

The bracelet is like an anti-smartwatch, both in its cyberpunk aesthetic and in its purpose of defeating technology. A large, somewhat ungainly white cuff with spiky transducers, the bracelet has 24 speakers that emit ultrasonic signals when the wearer turns it on. The sound is imperceptible to most ears, with the possible exception of young people and dogs, but nearby microphones will detect the high-frequency sound instead of other noises.

“It’s so easy to record these days,” Lopes said. “This is a useful defense. When you have something private to say, you can activate it in real time. When they play back the recording, the sound is going to be gone.”

During a phone interview, Lopes turned on the bracelet, resulting in static-like white noise for the listener on the other end.
Polite Surveillance Society

As American homes are steadily outfitted with recording equipment, the surveillance state has taken on an air of domesticity. Google and Amazon have sold millions of Nest and Ring security cameras, while an estimated 1 in 5 American adults now owns a smart speaker. Knocking on someone’s door or chatting in someone’s kitchen now involves the distinct possibility of being recorded.

It all presents new questions of etiquette about whether and how to warn guests that their faces and words could end up on a tech company’s servers, or even in the hands of strangers.

By design, smart speakers have microphones that are always on, listening for so-called wake words like “Alexa,” “Hey, Siri,” or “OK, Google.” Only after hearing that cue are they supposed to start recording. But contractors hired by device-makers to review recordings for quality reasons report hearing clips that were most likely captured unintentionally, including of drug deals and sex.

Two Northeastern University researchers, David Choffnes and Daniel Dubois, recently played 120 hours of television for an audience of smart speakers to see what activated the devices. They found that the machines woke up dozens of times and started recording after hearing phrases similar to their wake words.

“People fear that these devices are constantly listening and recording you. They’re not,” Choffnes said. “But they do wake up and record you at times when they shouldn’t.”

Rick Osterloh, Google’s head of hardware, recently said homeowners should disclose the presence of smart speakers to their guests. “I would, and do, when someone enters into my home, and it’s probably something that the products themselves should try to indicate,” he told the BBC last year.

Welcome mats might one day be swapped out for warning mats. Or perhaps the tech companies will engineer their products to introduce themselves when they hear a new voice or see a new face.
Of course, that could also lead to uncomfortable situations, like having the Alexa in your bedside Echo Dot suddenly introduce herself to your one-night stand.

‘No Longer Shunned as Loonies’

The “bracelet of silence” is not the first device invented by researchers to stuff up digital assistants’ ears. In 2018, two designers created Project Alias, an appendage that can be placed over a smart speaker to deafen it. But Zheng argues that a jammer should be portable to protect people as they move through different environments, given that you don’t always know where a microphone is lurking.

At this point, the bracelet is just a prototype. The researchers say that they could manufacture it for as little as $20, and that a handful of investors have asked them about commercializing it.

“With the internet of things, the battle is lost,” Zhao said, referring to a lack of control over data captured by smart devices, whether it gets into the hands of tech companies or hackers.

“The future is to have all these devices around you, but you will have to assume they are potentially compromised,” he added. “Your circle of trust will have to be much smaller, sometimes down to your actual body.”

Other precursors to the bracelet include a “jammer coat” designed by an Austrian architecture firm in 2014 to block radio waves that could collect information from a person’s phone or credit cards. In 2012, artist Adam Harvey created silver-plated stealth wear garments that masked people’s heat signature to protect them from the eyes of drones, as well as a line of makeup and hairstyles, called CV Dazzle, to thwart facial recognition cameras.

In 2016, Scott Urban, an eyewear-maker in Chicago, developed a line of reflective frames that turn back visible and infrared light. When a surveillance camera films a person wearing the $164 frames, the reflected light blurs out the face. Urban calls them Reflectacles.
He is working full time on privacy protection eyewear, including a new version with lenses that absorb infrared light to deter iris-scanning and facial recognition cameras. His customers include privacy enthusiasts, political activists and card counters whose faces have been placed on casinos’ watch lists.

“People into their privacy are no longer shunned as loonies,” Urban said. “It’s become a concern for people of all ages, political perspectives and walks of life.”

He added: “New technologies are continually eroding our privacy and anonymity. People are looking for an opt-out, which is what I’m trying to provide.”

Woodrow Hartzog, a law and computer science professor at Northeastern University, doesn’t think privacy armor is the solution to our modern woes. “It creates an arms race, and consumers will lose in that race,” he said. “Any of these things is a half-measure or a stopgap. There will always be a way around it.”

Rather than building individual defenses, Hartzog believes, we need policymakers to pass laws that more effectively guard our privacy and give us control over our data.

“Until then, we’re playing cat and mouse,” he said. “And that always ends poorly for the mouse.”