By the definition given by the 115th Congress of the United States, AI appears to be working for the people, for it is "designed to act rationally." But if AI is to act rationally, can we give it the power to take over consequential jobs like policing our cities?
Facing the Facts is a collection of web experiences that informs the public about the racial biases in facial recognition programs. Few people realize that cities have implemented AI systems in their police departments. Predictive policing uses facial recognition to point officers toward locations to patrol, people to watch, and people to prosecute. Machine learning finds recurring patterns and makes predictions based on previous data. That data is historical, and because policing has long been racially biased, the records it produced carry that bias, so the predictions reproduce it.
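To make that feedback loop concrete, here is a minimal Python sketch. It is entirely hypothetical, not the model behind any real predictive-policing system: two neighborhoods have identical true offense rates, but one has historically been patrolled twice as heavily, so a classifier trained on the resulting arrest records scores it as roughly twice as risky.

    # Hypothetical sketch: a predictor trained on historically skewed
    # arrest records reproduces that skew. All numbers are invented.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000

    # Two neighborhoods with the SAME true offense rate (5%), but
    # neighborhood 1 has historically been patrolled twice as heavily,
    # so offenses there were twice as likely to end in a recorded arrest.
    neighborhood = rng.integers(0, 2, size=n)
    true_offense = rng.random(n) < 0.05
    patrol_rate = np.where(neighborhood == 1, 0.8, 0.4)
    arrested = true_offense & (rng.random(n) < patrol_rate)

    # Train on recorded arrests, the only "ground truth" the system sees.
    X = neighborhood.reshape(-1, 1).astype(float)
    model = LogisticRegression().fit(X, arrested)

    # Neighborhood 1 is scored as roughly twice as "risky,"
    # even though actual behavior is identical in both.
    print(model.predict_proba([[0.0], [1.0]])[:, 1])  # approx [0.02, 0.04]

The model never sees race or behavior, only where arrests were recorded, yet it faithfully reproduces the enforcement bias baked into its training data.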
"What Would I Score" tells the story of a victim wrongly accused by an algorithm. It places users in front of the camera to make them look at themselves and reflect on what the percentages overlaid on their faces mean for them and for others. "Facing the Facts" is an interactive experience built from research gathered over the past year. The 3D helix includes interactive websites and videos that advocate for removing facial recognition from policing. We have a right to privacy and equal opportunity. If we can learn how those rights are being obstructed, we can make a change.