
TSA's Biometric Screening May Not Be Optional for Long – Reason


The Transportation Security Administration’s facial recognition program is voluntary, for now. If you don’t want the federal government to scan your face each time you fly, through a program going nationwide in 2023, you are, at least in theory, allowed to decline.

But eventually, TSA Administrator David Pekoske admitted in mid-March, biometric screening will cease to be optional. The TSA wants to make sure “that this system has as little friction as it possibly can,” Pekoske said, and, apparently, assertion of a basic constitutional right to (some remnant of) privacy is just too much friction.

It’s remarkable how casually Pekoske describes eliminating a simple opt-out procedure—one TSA representatives have previously touted as proof that all is above board—for a major new surveillance technology being foisted upon the public.

It’s maybe more remarkable how utterly useless our lawmakers have been in restricting the agency here, and that includes members of Congress who object to the biometrics rollout. Even a February letter from five senators, which casts the program as a threat to democracy itself, wields no bigger stick than urging the TSA to pause and answer a few questions.

The way this is going, TSA facial scanners will become a new example of a too-common pattern in state surveillance and law enforcement more broadly: What isn’t banned often becomes mandatory.

It’s easy to understand how this happens. It’s the result of an unconstrained interplay of technological advances and human fear. Better safe than sorry, we say when a new policing technology drops. Why wouldn’t you want law enforcement to have that tool at hand for catching criminals? we ask. Why would you hobble the TSA as they try to protect us from terrorists? Seems kind of suspicious that you don’t want to take advantage of everything science can offer to keep us safe.


This aspect of human nature isn’t new, of course, but the tech available to us certainly is. A Thursday New York Times report on a police conference in Dubai would’ve read like science fiction or a cheap spy novel not so long ago. “A brain wave reader that can detect lies. Miniaturized cameras that sit inside vape pens and disposable coffee cups. Massive video cameras that zoom in more than a kilometer to capture faces and license plates”—isn’t that James Bond stuff?

In more authoritarian countries whose pursuit of “zero crime” is unhampered by constitutional limits, it’s not. A United Arab Emirates official told the New York Times that a “headset that is said to detect when a part of the brain concerning memories is activated” is “useful during interrogations to determine if a suspect was lying.” In Dubai, police already have a “citywide facial recognition program called Oyoon—Arabic for eyes—[which] can pull the identity of anyone passing one of at least 10,000 cameras, linking to a database of images from airport customs and residents’ identification cards.” Private businesses are required to contribute footage from their own security cams.

Absent preemptive legal protections, ideally crafted after we get a good sense of where tech advances are heading but before some crisis sends us whirling into a do-something panic, similarly invasive and dystopian technology will probably be adopted here, too.

The airport face scans are a good case study in how this works. The TSA is pursuing the program on the strength of the 2001 Aviation and Transportation Security Act, which was passed roughly two months after the 9/11 attacks and authorized use of biometric technology “for providing access control and other security protections for closed or secure areas of the airports.”


But the biometrics tech of 22 years ago was not the biometrics tech we have today. Reports from the time speak of one-time fingerprinting for known travelers, and while facial recognition did exist, it was hardly the artificial intelligence–enhanced screening of 2023. Proposals from two decades ago tended to involve searching anonymous crowds for a small group of known terrorists. That’s meaningfully different from the organized, ID-linked facial recognition the TSA is now implementing—matching a photo of each traveler to their image in a massive database of government-issued IDs—evidently without any further congressional sign-off or oversight.

Pekoske said the reason the TSA isn’t saving every traveler photo it takes is just that the agency itself decided not to do so. When asked under what conditions the TSA’s biometrics use would expand—perhaps “when you get approval from the stakeholders in Congress,” the interviewer helpfully prompted—he spoke of popular acceptance nudged upwards through sheer familiarity, not any explicit direction from legislators. That is: The public will accept it because the TSA is doing it already, and the TSA will do it more once the public accepts it.

But hey, don’t worry, they’re not saving the pictures. They value our trust. And our representatives haven’t banned this stuff. That’s why the TSA can make it mandatory.




