The Journal of
Cognitive Liberties

This article is from Vol. 2, Issue No. 3, pages 47–54.
© 2001 CENTER FOR COGNITIVE LIBERTY AND ETHICS
All rights reserved worldwide. ISSN: 1527-3946

Face It, They’re Watching You

nessie

The Federal Bureau of Investigation, Central Intelligence Agency, Drug Enforcement Administration, Bureau of Alcohol, Tobacco and Firearms, and National Security Agency constitute America’s de facto gestapo. They would like nothing better than to have us believe they mainly spend their time and our money protecting warm fuzzy kittens from those heinous fiends at Bonsaikitten.com, keeping kids off drugs, and trying to convince those woolly-headed do-gooders in Congress to let them look over our shoulders while we surf the Net so that bogeyman extraordinaire Osama bin Laden doesn’t slip another one past them and kill more innocent Americans. They also like to look like they are keeping a lid on domestic ecosabotage and the depredations of militant animal-rights activists.

Don’t believe it. Oh sure, they do make an effort to protect us. We are, after all, relatively valuable livestock. But their primary role is to keep us in line. This cannot be done by brute force alone. There are simply too many of us, and we are too well armed. So instead they rely on information, informers, and information technology to stay one step ahead of us. So far, it’s working. These people like nothing better than to keep track of our numbers, our locations, and our activities. The virtual panopticon is closing in around us at an alarming rate. The renaming of its components and the concealment of its processes fool only the most naive.

It used to be that those of us who weren’t criminals or political activists could expect to be able to conduct our lives without being subject to government surveillance. Those days are over. Now even sports fans are being subjected to treatment once reserved for criminal suspects. Fans who lined up to attend Super Bowl XXXV were, without their knowledge, standing in a virtual lineup. According to the Los Angeles Times on Feb. 1, 2001:

 

Hidden cameras scanned each of their faces and compared the portraits with photos of terrorists and known criminals of every stripe.

In a command post at Raymond James Stadium in Tampa, Fla., the digitized images of fans and workers were cross-checked against files of local police, the FBI and state agencies at the rate of a million images a minute.

The cameras identified 19 people with criminal histories, none of them of a "significant" nature. [Tampa police spokesman Joe] Durkin said the department wanted to screen for pickpockets and other potential scam artists drawn to the huge event and for potential terrorists who wanted to use its worldwide TV and radio audience to make a political statement ...

No arrests were made that day. But, Durkin said, "it alerted us that they were there. It confirmed our suspicions that a crowd of this magnitude would attract people trying to take advantage of the situation."

 

Typically for corporate news, this story is only partly true. Obviously, it wasn’t cameras that compared the portraits. It was face-recognition software (FRS). The Times also neglects to question how the police confirmed their suspicions without making arrests. Can they read minds? Questioning the police, or authority in general, is not something the corporate media does well or often. If that’s what you want, you’d better go to the anti-corporate media instead. At Indymedia, the news about what was done to the fans at the Super Bowl has sparked vigorous debate about how FRS can be deceived, or “spoofed,” as it’s called in the trade. Sports themselves have become the subject of long-overdue discussion.

Long overdue as well is recognition by the public of the threat that FRS, and artificial intelligence in general, poses to political activism. Though you’d never know it from the Times’ account of Super Sunday, FRS is nothing new. Back in 1997, Science Daily reported:

 

Computer "eyes" are now up to such tasks as watching for fugitives in airline terminals and other busy locations. A sophisticated face-recognition system that placed first in recent Army competitive trials has been given the added ability to pick out faces in noisy or chaotic "street" environments.

The new Mugspot software module developed at the University of Southern California automatically analyzes video images, looking for passers-by. When it finds them, it picks out the heads in the images and then tracks the heads for as long as they remain in the camera's field...

This face-recognition software, developed at USC and the University of Bochum, Germany, and now in commercial use for clients such as Germany's Deutsche Bank, is robust enough to make identifications from less-than-perfect face views. It can also often see through such impediments to identification as mustaches, beards, changed hair styles and glasses – even sunglasses.

Take note of that date. As well as being a technology with many commercial applications, artificial-intelligence software such as FRS is of great use to the military and intelligence communities. It is not at all atypical for technology with military and intelligence applications to exist for 10, 20, even 30 or more years before reaching the commercial market (if it ever does). The entire dynamic of identity disguise at public demonstrations must be reevaluated in the light of FRS, and that reevaluation must be backdated considerably. The calculus has changed. Those of you who still wonder why political activists might want to conceal their identity need only read history. Start with COINTELPRO. Even a cursory perusal will set you straight. As recently as last year’s political conventions, the arbitrary, preemptive arrests of those whom the state sees as leaders of dissent illustrated the enormous threat to liberty that FRS represents when it is in the wrong hands. And make no mistake about it, it is in the wrong hands.

FRS programs mimic the way that the human brain recognizes a face. They electronically analyze the distances between various parts, or landmarks, of the face. Every face has its own distinct pattern, so the information enables the programs to distinguish one individual from another. Facial landmarks lie on distinctive structures, such as the eye sockets, the bridge of the nose, or the cheekbones. FaceIt, one of Mugspot’s competitors, defines the face as having 60 landmarks. According to its developers, FaceIt needs only 14 of these landmarks to reconstruct an individual’s distinctive facial pattern.
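A minimal sketch of the landmark idea, in Python. The landmark names and coordinates here are hypothetical, and commercial systems like FaceIt locate landmarks automatically in video and use far more robust matching; this sketch only shows the geometric principle. It reduces a face to the normalized distances between its landmarks, so two images of the same face at different sizes produce the same signature:

```python
import itertools
import math

def face_signature(landmarks):
    """Reduce a face to its pattern of inter-landmark distances.

    landmarks: dict mapping a landmark name to an (x, y) position.
    Distances are divided by the largest one, so the signature does
    not change when the same face is photographed larger or smaller.
    """
    names = sorted(landmarks)
    dists = [math.dist(landmarks[a], landmarks[b])
             for a, b in itertools.combinations(names, 2)]
    scale = max(dists)
    return [d / scale for d in dists]

def similarity(sig_a, sig_b):
    """Distance between two signatures: smaller means more alike."""
    return math.dist(sig_a, sig_b)
```

Sixty landmarks yield 1,770 pairwise distances; FaceIt’s claim is that a pattern built from just 14 of them is already distinctive enough to single out an individual.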

Since FRS makes such effective use of bone structure, a ski mask or bandanna probably won’t defeat it. If it can see through a beard and sunglasses, how much good do you think a rag over your face is going to do? A loose rubber mask may spoof FRS, but don’t bet your freedom, or even your life, on it. No one who takes an active role in organizing public dissent is safe from the withering gaze of techno-repression. Toss Echelon, Carnivore, the Prosecutor’s Management Information System (PROMIS), and high-definition TV into the mix, and it’s a whole new world. Today, anyone who does more with their political convictions than grumble into their beer is, of necessity, forced to consider their personal life an open book. People’s opinions, appearance, and even location are matters of record. These records can be cross-matched, sometimes with life-altering results.

Modern information technology, especially artificial intelligence, has redefined forever the economics of surveillance. No longer is the tedious, expensive, and intrinsically subjective work of the human mind required. The days of three shifts a day, 24-7, trench coat-and-sunglasses-wearing teams working for scale are over. Today, even as innocuous an expression of one’s objection to the tyranny of our rulers as kvetching over the Internet is not too expensive to investigate. Artificial intelligence has made the cost of conducting surveillance virtually negligible. It has made truly effective mass covert surveillance a possibility for the first time in history. The powers that be not only admit to using covert surveillance on innocent citizens, they brag about it. They are justifiably proud of themselves. But that’s not why they are bragging. They are bragging to send us a message.

Covert mass surveillance has been a long-standing, front-burner project since before we were born. SS chief Heinrich Himmler, for example, was a notoriously obsessive collector of records about minutiae. He was supposedly asked once what possible value there could be in knowing that a “Private so-and-so did KP duty on such-and-such a night.” He is said to have answered, “One never knows.”

Not only do our rulers now employ artificial intelligence to keep track of what we are doing, they have apparently begun using it to predict what we will do in the future. This is called behavioral-recognition software. If it’s not already in use, it’s in the pipeline. They seem to be trying to break this to us gently. Last April, we were permitted to learn that TASC, a subsidiary of defense giant Litton Industries, was joining with Loronix Information Systems to co-develop a state-of-the-art digital video technology that employs software to find behavioral patterns in video images.

The proposed technology will allow retailers to catch shoplifters before they ever take an object, capture the image of people performing a fake “slip and fall” for an illegal lawsuit, and clean up a spill before an accident occurs. Law enforcement could use such intelligent video technology to spot erratic traffic patterns, such as cars moving at high speeds, irregular turning, or other atypical traffic behavior. By using intelligence extracted from the video, law enforcement officials could proactively manage problem spots by isolating trends before problems got out of hand. Highway officials, Loronix points out, could also monitor critical safety areas like railroad crossings more effectively. Imagine getting a ticket for an infraction you haven’t even committed yet.
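TASC and Loronix have not published their algorithms, but the core idea of such systems, flagging behavior that deviates statistically from the norm, can be sketched in a few lines of Python. The threshold and the use of a simple z-score are assumptions for illustration, not the vendors’ method:

```python
import statistics

def flag_atypical(readings, threshold=2.0):
    """Return the readings that sit more than `threshold` standard
    deviations from the mean -- a crude model of 'erratic' behavior."""
    mean = statistics.mean(readings)
    spread = statistics.pstdev(readings)
    if spread == 0:
        return []  # everyone behaves identically: nothing stands out
    return [r for r in readings if abs(r - mean) / spread > threshold]
```

Fed a stream of ordinary highway speeds plus one car doing 120, a filter like this flags only the outlier. Real systems presumably work on far richer features than a single number, but the principle of “isolating trends” before a human ever looks is the same.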

It gets worse. They are now teaching computers to hunt in packs. According to EurekAlert, an NEC Institute-Penn State study shows that computer programs, known as autonomous agents, not only can evolve their own language and talk with one another, but also can use communication to improve their performance in solving the classic predator-prey problem. Like kids playing hide and seek, the autonomous agents used in the study hunted for and found their prey faster and more efficiently if they communicated with one another. Who, we must wonder, are these packs being taught to hunt?
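The study’s agents evolved their own language and pursuit strategies; the toy model below captures only the core point, that searchers which share information find their target faster than searchers which don’t. Two “predators” sweep a row of cells for hidden “prey”; with communication they divide the territory between them, without it each duplicates the other’s work:

```python
def search_steps(cells, prey, share):
    """Count the probes made per agent until one of them finds the prey.

    With `share`, the two agents split the cells between them;
    without it, each blindly sweeps the whole row from the start.
    Assumes `prey` appears somewhere in `cells`.
    """
    if share:
        routes = [cells[0::2], cells[1::2]]  # divide and conquer
    else:
        routes = [cells, cells]              # redundant, uncoordinated sweeps
    step = 0
    while True:
        for route in routes:
            if step < len(route) and route[step] == prey:
                return step + 1
        step += 1
```

On a row of 100 cells, the communicating pair finds prey hidden near the far end in roughly half the time the silent pair needs: the same qualitative result the study reports for its far more sophisticated agents.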

Because the technology does not simply “look” for an object or an individual, security teams at airports and casinos can use it to spot a person’s irregular behavior. If it can detect suspicious behavior in an airport, it can detect suspicious behavior at a demonstration. What, exactly, is “suspicious behavior” in the government’s eyes, anyway?

Here—as compiled by Center for Constitutional Rights lawyer David Cole in Insight—are reasons the DEA has actually given in court for targeting people:

 

Arrived in the afternoon

Was one of the first to deplane

Was one of the last to deplane

Deplaned in the middle

Purchased ticket at the airport

Made reservation on short notice

Bought coach ticket

Bought first-class ticket

Used one-way ticket

Carried no luggage

Carried small bag

Carried a medium-sized bag

Carried two bulky garment bags

Carried two heavy suitcases

Carried four pieces of luggage

Disassociated self from luggage

Traveled alone

Traveled with a companion

Suspect was Hispanic

Suspect was a black female

Acted too nervous

Acted too calm

Walked quickly through the airport

Walked slowly through the airport

Walked aimlessly through the airport

Imagine having your face recognized in a crowd, instantly cross matched by a computer program with a record of every time you have interfaced with the Internal Revenue Service; the Department of Motor Vehicles; and local, state, and federal law enforcement; with a profile of your political opinions as expressed over the Internet; with your current credit rating; with a list of your last six months of telephone traffic; with your home address; and with all the same information about your friends, family, and associates, and anybody else who came up in the search.

Now imagine all that information being used to predict what you will do next. Imagine what happens if the program thinks that whatever it thinks you are going to do rates proactive intervention. Imagine being then subjected to a preemptive strike by the jack-booted thugs of the state.

Now imagine what would happen if it wasn’t your face that alarmed the software, but the face of someone who looked like you, only the software couldn’t tell the difference. It could happen. Sooner or later, it will happen. It might sound like science fiction, but it’s not. It’s life in the world today. It’s not even a secret. It’s a brag. Welcome to the New World Order.


____________________________________
nessie is an independent investigator, who can be found on the Internet at: http://www.sfbg.com/nessie/index.html. A version of this article first appeared at sfbg.com, the website of the San Francisco Bay Guardian.