Digital Policing: Facial Recognition Software and Community Resistance

Featuring: Tawana Petty (Data for Black Lives), Deborah Raji (Mozilla), Ann Cavoukian (Metropolitan Toronto). Hosted by: Luke Stark (Western).

25 February 2021, 7PM

How can we come together to push back on the deployment of facial recognition technologies by police forces, schools, and other civic institutions? What are the best strategies for the successful abolition of these and other carceral technologies? 

The second event in our Big Data at the Margins series examines how the digitization and datafication of the criminal justice system have intersected with the development and deployment of AI-driven technologies like facial recognition and predictive policing. Police forces in Canada have been eager to use facial recognition to identify and arrest suspects, raising major concerns about data privacy and the civil rights of the accused. Civil society activists, from the Water Protectors of Standing Rock to the Black Lives Matter activists of this past summer’s uprisings against police brutality and the carceral state, have likewise been targeted for FRT surveillance by law enforcement authorities. And algorithms used in the US criminal justice system to predict recidivism have drawn international condemnation for their potential bias against Black defendants. This intensification of policing via digital tools has been met with stiff resistance from communities across North America, who are calling not only for many of these technologies to be banned, but also for the broader dismantling of the irredeemably racist elements of the carceral state.

Our internationally recognized panelists will address the impacts of facial recognition technologies on individual privacy, the perpetuation of algorithmic bias against already disadvantaged groups, and the struggle for community data justice. The award-winning work of Deborah Raji, incoming Mozilla Fellow and collaborator with the Algorithmic Justice League, has highlighted the racial bias intrinsic to computer vision systems. Ann Cavoukian, former Information & Privacy Commissioner of Ontario and one of the world’s leading privacy experts, pioneered Privacy by Design, a framework for embedding privacy protections into digital technologies from the outset. And Tawana “Honeycomb” Petty, co-founder of Our Data Bodies and former director of the Data Justice Program at the Detroit Community Technology Project, is one of North America’s foremost community advocates and social justice activists, pushing both for the abolition of carceral technologies and for grassroots, community-led digital life.

This event is funded with the generous assistance of the Faculty of Information & Media Studies, Western Research, and the Social Sciences and Humanities Research Council of Canada. 

Biographies

Deborah Raji is an incoming Mozilla Fellow whose work focuses on algorithmic auditing and evaluation. She has worked closely with the Algorithmic Justice League initiative on several award-winning projects highlighting cases of bias in computer vision. She has also worked with Google’s Ethical AI team and has been a research fellow at the Partnership on AI and at the AI Now Institute at New York University, working on various projects to operationalize ethical considerations in machine learning engineering practice.

Ann Cavoukian is recognized as one of the world’s leading privacy experts. She served an unprecedented three terms as the Information & Privacy Commissioner of Ontario, Canada, where she created Privacy by Design, a framework that proactively embeds privacy into the design specifications of information technologies, networked infrastructure, and business practices, thereby achieving the strongest protection possible. In 2010, international privacy regulators unanimously passed a resolution recognizing Privacy by Design as an international standard; it has since been translated into 39 languages. Dr. Cavoukian has received numerous awards recognizing her leadership in privacy, including being named one of the Top 25 Women of Influence in Canada, one of the Top 10 Women in Data Security and Privacy, one of Canadian Business’s ‘Power 50’, and one of the Top 100 Leaders in Identity. Most recently, she was awarded the Meritorious Service Medal (May 2017) for her outstanding work in creating Privacy by Design and taking it global.

Tawana “Honeycomb” Petty is a mother, social justice organizer, youth advocate, poet, and author. She is deeply involved in water rights advocacy, data and digital privacy rights education, and racial justice and equity work. She is a former director of the Data Justice Program at the Detroit Community Technology Project, a co-founder of Our Data Bodies, a convening member of the Detroit Digital Justice Coalition, an anti-racism facilitator with the Detroit Equity Action Lab, a Digital Civil Society Lab fellow at the Stanford Center on Philanthropy and Civil Society (PACS), and director of Petty Propolis, a Black woman-led artist incubator focused on cultivating visionary resistance through poetry, literacy and literary workshops, anti-racism facilitation, and social justice initiatives.

Resources

“With AI and Criminal Justice, the Devil is in the Data” by Vincent Southerland (American Civil Liberties Union)

“Facebook doesn’t seem to mind that facial recognition glasses would endanger women” by Arwa Mahdawi (The Guardian)

“What Clearview AI did was illegal, but don’t play down the RCMP’s role in it” by Jamie Duncan and Alex Luscombe (Huffington Post)

“Machine Bias: There’s software used across the country to predict future criminals. And it’s biased against blacks” by Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner (ProPublica)

“ICE just signed a contract with facial recognition company Clearview AI” by Kim Lyons (The Verge)

“To surveil and predict: A human rights analysis of algorithmic policing in Canada” by Kate Robertson, Cynthia Khoo, and Yolanda Song (Citizen Lab, University of Toronto)

“Use of facial recognition technology by police growing in Canada, as privacy laws lag” by David Burke (CBC News)

“Canadians can now opt out of Clearview AI facial recognition, with a catch: Controversial U.S. firm requests picture to identify other images of any user in database” by Thomas Daigle (CBC News)

“Wrongfully accused by an algorithm” by Kashmir Hill (The New York Times, behind paywall)

“Garbage in, garbage out: Face recognition on flawed data” by Clare Garvie (Georgetown University Law’s Center on Privacy and Technology)

“Racial discrimination in face recognition technology” by Alex Najibi (Harvard University’s Science in the News)

“There is a crisis of face recognition and policing in the US” by Tate Ryan-Mosley (MIT Technology Review)

“From facial recognition, to predictive technologies, big data policing is rife with technical, ethical and political landmines” by John Lorinc (Toronto Star)

“Your face is not your own” by Kashmir Hill (The New York Times Magazine)

“Bias in facial recognition isn’t hard to discover, but it’s hard to get rid of”, interview with Joy Buolamwini, MIT Media Lab (Marketplace)

“Facing bias in facial recognition technology” by Brianna Rauenzahn, Jamison Chung, and Aaron Kaufman (University of Pennsylvania Law School’s The Regulatory Review)

“Automated anti-Blackness: Facial recognition in Brooklyn, New York” by Mutale Nkonde (Harvard Kennedy School Journal of African American Policy)

“AI technologies – like police facial recognition – discriminate against people of colour” by Jane Bailey, Jacquelyn Burkell, and Valerie Steeves (The Conversation)
