I recently had lunch in Washington, D.C., with a woman who told me an unremarkable story that I can’t get out of my head.
She had been in Asheville, North Carolina, shopping for shoes with a friend who suffers from Morton’s neuroma or plantar fasciitis — I can’t remember which. The idea was to find a shoe with ameliorative properties, and she stumbled onto a brand promising just that.
“How about this one?” she called to her friend. “Ug-ly,” her friend mouthed. And that was that.
Until the next day when the woman who had held the shoe received online ads for it. Wherever she browsed, the shoe ad followed. What she had done was stand by a particular shoe rack in a particular store while holding her cellphone. The company geotagged it, and her. Her hobbling friend, who had stood 15 feet away from her, got no ads.
This woman’s tale forced me to consider how much data I give away to Google, to my seven Alexas (yes, I have a problem) and, on business trips, through GPS and other travel records. Each of us leaves a digital trail that could be misappropriated in a sinister world.
Last week, San Francisco officials banned government use of facial recognition software in the city. In typical San Francisco fashion, the ban was enacted to stymie ICE, but I can’t help feeling joyful about the vote just the same. It’s a reminder that we’re still allowed to push back in this country. Shortly after the vote, a bill was introduced in Albany that would ban the use of facial recognition technology by landlords in residential buildings.
This will be an increasingly important consideration as surveillance, bioengineering and other life-altering technologies proliferate.
Already we’re tracked pretty much everywhere we go. The average Londoner is reportedly caught on camera more than 300 times per day; U.S. city dwellers go relatively unscathed with just 70 daily camera appearances, according to estimates. Our license plates are routinely logged by highway and speed cameras.
When we read about this technology in China or North Korea, we gasp at the thought of what a totalitarian state might do with the data. But here, we tell ourselves it’s about catching bad guys, rarely thinking that officials looking to inhibit our freedoms could one day peer through those camera lenses, making Morton’s neuroma seem like a walk in the park.
William F.B. O’Reilly is a consultant to Republicans.