A Nightmarish Army of Unblinking Spies

By: Mike Maharrey

The surveillance state constantly expands. Something that seems like no big deal today can suddenly become a big deal as technology evolves.

Take the proliferation of surveillance cameras. We’ve come to accept electronic eyes recording our every move like a normal part of life. Most of the time we hardly even notice the cameras. At some level, we may realize we’re being recorded, but we figure nobody will ever actually watch the footage. Even with cameras everywhere, we feel relatively safe in our anonymity.

But how would you feel if you knew somebody was monitoring every camera pointed in your direction 24/7? Scrutinizing your every move. Judging your every action. Noting whom you associate with and scouring your facial expressions for signs of suspicious behavior.

We’re rapidly getting to that place.

Of course, we’re not talking about human “somebodies.” We’re talking about artificial intelligence – “AI agents” capable of scouring video footage every second of every day and flagging “suspicious” behavior.

The ACLU recently released a report on the rapidly growing use of “video analytics” as a surveillance tool. As the ACLU puts it, AI has the potential to turn everyday surveillance cameras into a “nightmarish army of unblinking watchers.”

What we found is that the capabilities that computer scientists are pursuing, if applied to surveillance and marketing, would create a world of frighteningly perceptive and insightful computer watchers monitoring our lives. Cameras that collect and store video just in case it is needed are being transformed into devices that can actively watch us, often in real-time. It is as if a great surveillance machine has been growing up around us, but largely dumb and inert — and is now, in a meaningful sense, “waking up.”

According to the report, police and government intelligence agencies have used AI to develop “anomaly detection” algorithms that can pick up on “unusual,” “abnormal,” “deviant,” or “atypical” behavior and flag such individuals for further scrutiny. As the ACLU reports, this could have far-reaching ramifications and brings with it tremendous potential for abuse.

Think about some of the implications of such techniques, especially when combined with other technologies like face recognition. For example, it’s not hard to imagine some future corrupt mayor saying to an aide, “Here’s a list of enemies of my administration. Have the cameras send us all instances of these people kissing another person, and the IDs of who they’re kissing.” Government and companies could use AI agents to track who is “suspicious” based on such things as clothing, posture, unusual characteristics or behavior, and emotions. People who stand out in some way and attract the attention of such ever-vigilant cameras could find themselves hassled, interrogated, expelled from stores, or worse.

AI also raises concerns about accuracy. We’ve already heard about problems with facial recognition systems misidentifying people – particularly minorities. As the ACLU puts it, “Many or most of these [AI] technologies will be somewhere between unreliable and utterly bogus.”

The interconnectedness of the U.S. surveillance state magnifies the danger these systems pose to your privacy. If a local camera happens to flag you, you will almost certainly end up in national databases accessible by police and government officials across the U.S. Federal, state and local law enforcement agencies can share and tap into vast amounts of information gathered at the state and local level through fusion centers and a system known as the “information sharing environment,” or ISE.

George Orwell’s Big Brother would drool over the all-encompassing surveillance system quietly under construction in the United States. Cameras equipped with facial recognition technology, monitored by “AI agents,” and linked to federal, state and local databases can track your every move. It effectively turns each of us into a suspect standing in a perpetual lineup.

Police operate these camera systems with little oversight and oftentimes in complete secrecy.

Given the rapid proliferation of camera surveillance, its potential for abuse, and the threat it poses to basic privacy rights, state and local governments need to make oversight of law enforcement use of facial recognition, and limits on it, a top priority. At the very least, law enforcement agencies should be required to get local government approval in a public meeting before obtaining facial recognition technology. The TAC’s Local Ordinance to Limit Surveillance Technology covers this.