
The many warts of facial recognition technology

I had lunch in Washington, D.C., a few weeks ago with a woman who told me an unremarkable story that I can’t get out of my head.

She had been in Asheville, North Carolina, shopping for shoes with a friend who suffers from Morton’s neuroma or plantar fasciitis — I can’t remember which. The idea was to find a shoe with ameliorative properties, and she stumbled onto a brand promising just that.

“How about this one?” she called to her friend an aisle or two away. She was instantly waved off. “Ug-ly,” her friend mouthed. And that was that.

Until the next day when the woman who had held the shoe began getting online ads for it. Wherever she browsed, the shoe ad followed. At first, she couldn’t figure it out. It would be one thing if she had done an online search for the brand — the shoe company would then be able to identify her as a potential customer — but she hadn’t. What she had done, though, was stand by a particular shoe rack in a particular store while holding her mobile phone. The company had geofenced that spot, and her phone’s location did the rest. Her hobbling friend, who had stood just 15 feet away from her in the store, got no ads.

We’ve all been aware of technology like this for some time, but this woman’s tale still startled me. It forced me to consider how much data I give away to Google, to my seven Alexas (yes, I have a problem) and, on business trips, through GPS and other transportation records. Each of us is leaving a digital trail that could be misappropriated in a sinister world. But rather than think too much about that, I typically whisper one of three things to myself and keep kicking it to Alexa: “My life isn’t that interesting,” “I have nothing to hide” or “This is how they catch the terrorists and crooks.” I suspect most of us do that.

Last week in California, though, something titillating happened that’s already kicking off a larger conversation. The San Francisco Board of Supervisors banned government use of facial recognition software within city limits. In typical San Francisco fashion, the ban was enacted for a far-out, left-wing reason — to stymie Immigration and Customs Enforcement in this case — but I can’t help feeling joyful about the vote just the same. It’s a reminder that we’re still allowed to push back in this country. Just because it’s possible to do something doesn’t mean we have to do it, or have it done to us. Shortly after the San Francisco vote, a bill was introduced in Albany that would ban the use of facial recognition technology by landlords in residential buildings.

This will be an increasingly important consideration in the years immediately ahead as surveillance, bioengineering and other life-altering technologies proliferate. When you consider the pace of Moore’s law — the capacity of computer chips doubling roughly every two years, which compounds into more than a thirtyfold increase over a decade — it’s hard to imagine what 2030 or 2040 might look like.

Already we’re tracked pretty much everywhere we go. The average Londoner is reportedly caught on camera more than 300 times per day; city dwellers in the United States go relatively unscathed, with an estimated 70 daily camera appearances. Our license plates are routinely logged on highways, and speed cameras are so accurate that they can catch you picking your nose if you’re not careful. (A portrait of yours truly was mailed to me by Iceland’s Reykjavik Police Department a couple of years back. I was only scratching — swear!)

When we read about this type of technology in China or North Korea, we gasp at the thought of what a totalitarian state might do with the data. But here, we tell ourselves it’s all about catching bad guys, rarely thinking that officials looking to inhibit our freedoms could one day be peering through those camera lenses, making Morton’s neuroma seem like a walk in the park.

William F.B. O’Reilly is a consultant to Republicans.