Council seeks curb on facial recognition

Commissioner says police will not use technology in the near future

By Kenneal Patterson
A Boston Police Department surveillance camera on Province Street in downtown Boston. BANNER PHOTO

Boston city councilors are pushing to ban facial recognition, a surveillance software that disproportionately misidentifies people of color and may violate civil rights and basic privacy.

“When it comes to facial recognition tech, it doesn’t work,” said Councilor Ricardo Arroyo during a hearing on June 9. “It’s not good. It’s been proven through the data to be less accurate for people with darker skin.”

A recent MIT examination of facial analysis software revealed that the technology has an error rate of 0.8% for light-skinned men, but 34.7% for dark-skinned women.

Cities nationwide have already banned facial recognition technology. San Francisco implemented its ban in May 2019. In Massachusetts, Cambridge, Brookline, Northampton, Somerville and Springfield have since adopted similar bans. The technology company IBM recently announced that it would no longer offer, develop or research facial-recognition technology. The company’s decision was motivated by protests against discriminatory police practices and the murder of George Floyd, a Minneapolis resident who was suffocated by a white police officer.

In Boston, Councilor Michelle Wu originally scheduled the hearing a month ago and noted that the issue has only grown more urgent.

“It just so happened that the timing of it now has lined up with a moment of great national trauma,” she said. “And in this moment the responsibility is on each one of us to step up to truly address systemic racism and systemic oppression.”

Councilor Julia Mejia agreed.

“We’re in a time where our technology is outpacing our morals,” she said. “We’ve got a 2020 technology with a 1620 state of mind.”

The Boston Police Department does not currently use facial recognition software. The ordinance would prevent city officials from using the technology in the future without community consent. It would prohibit mass surveillance of the millions of protesters who have taken to the streets since Floyd’s death.

“The department, for the record, does not currently have the technology for facial recognition,” said Police Commissioner William G. Gross. “As technology advances, however, many vendors have and will continue to incorporate automated recognition abilities.”

Gross said that the department would not use the technology until it becomes more accurate. He said that as technology advances and becomes more reliable, the police department would like to consider using it to respond to specific crimes and emergency situations.

Although the BPD has no desire to broadly surveil Boston’s residents, Gross said, the department draws a distinction between facial surveillance systems and facial recognition technology. He said that facial recognition technology may be useful to the city with the right safeguards and community input.

“Video has been proven to be one of the most effective tools for collecting evidence of criminal offenses, resolving crimes and locating missing and exploited individuals,” he said. “Any prohibitions on these investigative tools without a full understanding of potential uses under strict protocols could be harmful and impede our ability to protect the public.”

Gross said that he wants to work with the City Council to define language permitting certain uses. He added that facial recognition technology with well-established guidelines and under strict review would be helpful to the city. But he noted that the technology still falls short.

“I’m telling you right now, as an African American male, the technology that is in place today does not meet the standards of the Boston Police Department, nor does it meet my standards,” he said.

The facial recognition bans in several Massachusetts cities followed a campaign spearheaded by the American Civil Liberties Union (ACLU). Kade Crockford, director of the ACLU’s Technology for Liberty Program, cited the technology’s racial and gender bias issues during a press conference prior to the hearing.

“In the absence of any regulations protecting basic civil rights, civil liberties and constitutional rights, we do not believe that any city agency in the city of Boston ought to be using the technology,” she said. Advocates of the ban have noted that facial recognition technology is frequently used by totalitarian governments to spy on residents.

Crockford said that the technology is largely unregulated, and that cities across the nation are using surveillance without community consent.

“What we’re doing here at the city level is drawing a firm line in the sand to say that this technology is not going to creep into government use in Boston without democratic debate and oversight,” she said.

Facial surveillance technology also raises concerns over monitoring students in schools. Jessica Tang, president of the Boston Teachers Union (BTU), said that the technology fails to accurately identify children because their faces change as they grow.

“We need less policing in schools, not more,” she said.

“Face surveillance in schools will contribute to the school-to-prison pipeline, threatening children’s welfare, educational opportunities and life trajectories,” agreed BTU’s Erik Berg. He added that the technology also misclassifies transgender people and will have a harmful impact on transgender youth.

Facial surveillance software also harms immigrant families, he added.

“Immigrants are already fearful of engagement with public institutions, and face surveillance systems would further chill student and parent participation in our schools,” he said.

Karina Ham, a member of the Student Immigrant Movement, told the Banner that facial recognition would make it easier for schools to racially profile, target and surveil immigrant students.

“The school system, at least in BPS, [is] already in collaboration with ICE and other federal agencies,” Ham said.

Tang added, “It’s really important that schools are safe places to learn, where families can go without fear.”

The BPD’s contract with the surveillance software company BriefCam expired last month. The new contract could include updated surveillance features, but Gross said the department would not yet use facial recognition surveillance.