Civil Libertarians Wrong to Fear Biometrics
Civil libertarians are wrong to fear facial recognition and other biometric identity technologies. But those technologies will fundamentally change the way we must think about privacy, and they could have very negative consequences for democracy if not regulated correctly, said constitutional law professor Jonathan Turley of George Washington University at the AFCEA International Federal Identity Forum and Expo in Tampa, Florida.
Facial recognition “is perfectly suited to blow privacy law to pieces,” Turley told the audience in his closing keynote.
He said his message to his fellow civil libertarians was, “We are going to lose this battle if we fight on conventional privacy grounds, I am 100 percent confident of that.” He compared the civil liberties movement to generals fighting the last war and predicted they would lose.
The main reasons, he said, were connected with the historical origins of the right to privacy in the United States.
“Usually from a privacy standpoint, most of our doctrines, and certainly the constitution itself, is designed to deal with the threat from the government … The greatest threat with biometrics is coming from … private products, commercial products,” he said.
Moreover, the case he called “the foundation of privacy law in this country,” Katz v. United States, restricted surveillance based on the doctrine that the government couldn’t violate citizens’ “reasonable expectations” of privacy.
But that doctrine created a “downward and rather ruinous” spiral, he said. “If reasonable expectations fall, government authority, ability rises. As government authority and ability rises, reasonable expectations fall,” Turley explained.
Biometric identity techniques weren’t new when Katz was decided, he pointed out, but there had been a technological “quantum leap” in recent years, which meant they could now be deployed on a mass scale. “Much of what we relied upon in the privacy community was the technological barrier to being able to spy on a lot of people at the same time,” he said. “That barrier has collapsed.”
And the U.S. should not look to Europe for a different approach, he argued, because the EU relies on the principle of consent. “And this is the rub: Consent will be given … The public wants the products … Most people in the world are readily consenting to having their facial identities used in products,” said Turley.
Nor should civil libertarians regard the use of face recognition by law enforcement as somehow “evil,” he added. “I don’t like the old system” that facial recognition is replacing, he said. As a trained defense lawyer, “I don’t like eyeballing for identification.” For decades, Turley said, he had fought cases based on police officers’ identification of a suspect. “The error rate is huge,” he stated, arguing that accurate facial matching by new technologies would produce fewer false arrests and prosecutions.
“If you give me a 99 percent accuracy rate, I’m not going to reject it. I’m going to regulate it. I’m going to make sure it’s used correctly,” Turley said.
And privacy was the wrong approach to that regulation. “That dog doesn’t hunt,” he said.
Civil libertarians should start by deciding what they want to protect. “We can’t protect anonymity, because there is no anonymity, and the public doesn’t want anonymity,” Turley said, adding, “We don’t live in an anonymous world, we live in a nonymous one,” that is, a world where everyone’s identity is known.
“What we need to protect is democratic values,” Turley said. “The objective has to be to assure citizens that even though they live in a nonymous world … they are still protected in their public movements and associations,” the professor added. Only that would avoid the chilling effect that would otherwise flow from the growing ubiquity of facial recognition and surveillance tools.
“We have to find ways of obscuring some information [about citizens] to protect the values we’re trying to protect,” Turley concluded. “We’re still going to be in a fishbowl society, but we can obscure the fish.”