
U.S. Border Protection app causes tech headaches for asylum seekers


Last month, the Biden administration unveiled a new strategy in its use of a mobile phone app for asylum seekers, allowing them to schedule appointments to enter the country.

Since the scheduling feature launched, thousands of migrants coming from Latin America have been scrambling to sign up. But many are finding the app full of glitches.

The app, called CBP One, is part of the toolkit of Customs and Border Protection, an agency of the Department of Homeland Security. If asylum seekers want to get into the United States, they have to use it.

Cristian Valencia is one of them. He was staying at a migrant shelter in Tijuana, Mexico, and running into technical difficulties.

“The app keeps crashing, the screen stays frozen,” he said in Spanish.

The shelter’s Wi-Fi isn’t strong enough for the app, Valencia added. 

CBP One is the administration’s latest effort to address the migrant crisis. The government has already invested in facial recognition and geotracking technologies to process new migrants at the border and surveil those inside the country.

Migrant advocates said there are obvious problems with the app. It’s only available in Spanish and English, and some migrants don’t have a smartphone or reliable internet access. Without those, they can’t sign up for an asylum screening.

“Whether or not you get an appointment is based on the strength of your internet connection and chance,” said Erika Pinheiro, executive director of Al Otro Lado, a migrant support organization.

CBP One essentially makes seeking asylum like trying to buy tickets to a Taylor Swift concert, she added.


“It works like Ticketmaster,” she said. “So when you have a concert that is going to sell out, everyone presses the button at the same time, and some people get tickets and some people don’t. That’s basically what CBP One has reduced the asylum system to.”

[Photo: Maria, a Haitian migrant, struggled to get the CBP One app to recognize her face. (Matthew Bowler/KPBS)]

Pinheiro and civil liberties advocates also have concerns about the app’s facial recognition feature. They point to studies showing that kind of technology tends to do a poor job of identifying people of certain races and ethnicities.

“There are really high error rates with certain races, especially Black and Asian applicants,” Pinheiro said. “So we would expect that people who are not white are going to have a harder time with the facial recognition feature.”

This played out on a recent day outside Tijuana’s City Hall. An elderly woman named Maria struggled with the app. We’re only using her first name because she said she was fleeing persecution in Haiti.

CBP One failed to distinguish her dark skin tone from the dark background. She tried to face a window for better lighting, but it still didn’t work.

Even if Maria eventually gets the app to work, she will still have to wait at least a couple of weeks: all of the January appointments are already booked.

“Marketplace Tech” reached out to Customs and Border Protection about the CBP One app. A spokesperson said the agency is aware of some of its technical challenges and will “continue to enhance the mobile application as additional improvement opportunities arise.”

There's a video of Gustavo Solis and his colleague Matthew Bowler's reporting at the border on the website of member station KPBS.


As for migrants having trouble with facial recognition, that problem is unfortunately not unique to the CBP One app. A growing body of research finds bias embedded in facial recognition systems.

A 2018 study by researchers at the Massachusetts Institute of Technology and Microsoft Research found that facial recognition programs perform best on light-skinned men, with an error rate of less than 1%.

The same systems performed worst on dark-skinned women, getting it wrong more than a third of the time.

One of the study’s authors, Joy Buolamwini, was on this show discussing the issue in 2021. She said then that the communities most likely to be subject to surveillance with technology like facial recognition are the same communities whose members are most likely to be identified inaccurately.

She stressed the importance of getting feedback, as these apps are being developed, from the people she called the “excoded” — those who stand to suffer the most when the technology goes wrong.
