TIME Magazine
When Victor Collins was found dead, floating faceup in his friend James Bates’ hot tub in Bentonville, Ark., one chilly morning in November 2015, police were quick to suspect foul play. Broken glass littered the patio, and blood was spattered on a brown vinyl pool cover nearby. But in the subsequent investigation, which led police to charge Bates, 32, with Collins’ murder, some of the most crucial evidence was gleaned not only from the crime scene but also from an array of Internet-connected devices in Bates’ home.
Data from his “smart” utility meter, for example, indicated that someone had used 140 gal. of water between 1 a.m. and 3 a.m., a detail that seemed to confirm investigators’ suspicions that the patio had been hosed down before they arrived. Records from Bates’ iPhone 6s Plus, which required a passcode or fingerprint to unlock, suggested he had made phone calls long after he told police he’d gone to sleep. And audio files captured by Bates’ Echo, Amazon’s popular personal assistant that answers to “Alexa,” promised to offer police a rare window into Bates’ living room the night Collins died.
The case, which goes to trial in July, marks the first time that data recorded by an Echo, or any other artificial intelligence–powered device, like Google’s Home or Samsung’s smart TV, will be submitted as evidence in court. The move has alarmed tech analysts and privacy advocates nationwide. The issue is not only that these new devices are equipped with so-called smart microphones that, unless manually disabled, are always on, quietly listening for a “wake word,” like “Alexa” or “Hey, Siri.” It’s also that these now ubiquitous microphones live in our most intimate spaces: in our living rooms and kitchens, on our bedside tables. In a world in which these personal assistants are always listening for our voices and recording our requests, have we given up any expectation of privacy within our own homes?