
February 5, 2026

Bipartisan bill would ban ICE and CBP from using facial recognition


NEC Corporation's app matches field photos against 1.2 billion faces for ICE

The ICE Out of Our Faces Act was introduced on February 5, 2026 as a bicameral bill. The Senate sponsors are Edward Markey (D-MA), Jeff Merkley (D-OR), and Ron Wyden (D-OR); the House sponsor is Pramila Jayapal (D-WA). Co-sponsors include Angela Alsobrooks (D-MD) and Bernie Sanders (I-VT). The bill would ban ICE and CBP entirely from acquiring or using facial recognition and biometric surveillance technology.

Every piece of biometric data already collected through these programs would be destroyed under the bill. Unlike passwords or ID numbers, faces cannot be changed — making collected biometric templates a permanent privacy liability. The private right of action created by the bill allows individuals to sue ICE and CBP directly in federal court, and grants state attorneys general standing to sue on behalf of their residents.

NEC Corporation built Mobile Fortify under a $23.9 million DHS contract awarded in 2025. The app lets ICE and CBP agents photograph anyone they encounter in the field and instantly match that photo against a database of approximately 1.2 billion facial images, returning real-time identification results. Since its deployment in June 2025, agents have used it more than 100,000 times — making this a fully operational surveillance system, not a pilot.

DHS deployed Mobile Fortify without completing the Privacy Impact Assessments mandated by the E-Government Act of 2002, which requires such reviews before any federal agency deploys technology collecting personally identifiable information. The requirement exists precisely to catch risks like racial bias and mission creep before deployment. DHS has not disputed that the assessments were skipped.

NIST's Face Recognition Vendor Test evaluated 189 algorithms from 99 developers and documented higher false-positive rates for Black and Asian faces compared to white faces in most systems. MIT researcher Joy Buolamwini's Gender Shades study found error rates of 34.7% for darker-skinned women versus 0.8% for lighter-skinned men. In immigration enforcement, a false positive can trigger detention or deportation proceedings against a misidentified citizen or lawful resident.

The bill faces steep odds with Republicans controlling both chambers. As of February 2026, no Republican has co-sponsored either the House or Senate version. Republican leadership has consistently supported expanding ICE enforcement capabilities, and a bill restricting immigration enforcement tools is unlikely to receive committee hearings or floor votes under current leadership.

The bill is endorsed by the Electronic Frontier Foundation, Electronic Privacy Information Center, ACLU, and Fight for the Future. EFF senior policy analyst Matthew Guariglia warned that facial recognition is "dangerous, invasive, and an inherent threat to civil liberties" and called the bill the right response to a system that was "deployed on the American public before we could stop it."

Tags: AI Governance, Civil Rights, Digital Rights, Technology, Legislative Process

People, bills, and sources

Edward Markey

U.S. Senator (D-MA), Lead Senate Sponsor

Pramila Jayapal

U.S. Representative (D-WA), Lead House Sponsor

Jeff Merkley

U.S. Senator (D-OR), Senate Co-Sponsor

Ron Wyden

U.S. Senator (D-OR), Senate Co-Sponsor and Ranking Member, Senate Finance Committee

Bernie Sanders

U.S. Senator (I-VT), Co-Sponsor

Kristi Noem

U.S. Secretary of Homeland Security

Joy Buolamwini

AI Researcher and Founder, Algorithmic Justice League; MIT Media Lab

Matthew Guariglia

Senior Policy Analyst, Electronic Frontier Foundation

Angela Alsobrooks

U.S. Senator (D-MD), Co-Sponsor