Blog Post 2 & Dirty Data: The Revolution Will Be Digitized (Week 2)
Dirty Data, Bad Predictions was an exceptionally interesting read for me, as it was my first real exposure to and insight into the policing system in the United States.
This text examines how predictive policing is conducted using ‘dirty data’: data generated during times in history when policing was marked by corruption, misreporting of cases, racism, arrest quotas, false charges, and arrests made in pursuit of public funding or other political benefits. This data is produced and subsequently used throughout the criminal justice system without adequate transparency, accountability, oversight, or public engagement, which makes it difficult even to gauge the extent to which dirty data feeds into the resulting dirty predictions.
The authors examined 13 jurisdictions to analyze the extent to which dirty data had been used in deploying predictive technologies, which forecast either where a crime may occur in a given time window or who will be involved in a crime as a victim or perpetrator. The paper covers three of the 13 jurisdictions in further detail: Chicago, New Orleans, and Maricopa County.
Chicago: The city has seen repeated investigations and legal challenges, including evidence of over one hundred cases of CPD officers torturing Black men between 1972 and 1991, and a lawsuit challenging CPD’s inequitable deployment of police to emergency calls in neighborhoods with higher minority populations. During these years of investigations into and challenges to CPD practices and policies, CPD developed the Strategic Subject List (SSL), a computerized assessment tool that incorporates numerous sources of information to analyze crime and to identify and rank individuals at risk of becoming a victim or possible offender in a shooting or homicide. This deployment of the SSL was a recipe for dirty predictions, as the predictions were derived from data that was distorted, manipulated, and biased.
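The paper’s argument is easy to see mechanically. The actual SSL model has never been fully made public, so the following is only a toy sketch of how a tool might rank individuals with a weighted risk score; the features, weights, and records are all invented for illustration.

```python
# Toy sketch of a weighted risk-score ranking, loosely in the spirit of a
# tool like the SSL. Features, weights, and records are invented here;
# CPD has not fully disclosed the actual model.

people = [
    # (id, prior_arrests, prior_victimizations, age)
    ("person_a", 4, 1, 19),
    ("person_b", 0, 2, 34),
    ("person_c", 2, 0, 22),
]

def risk_score(prior_arrests, prior_victimizations, age):
    # Hypothetical linear weights: arrests and victimizations raise the
    # score, and youth adds a bonus. None of this reflects CPD's math.
    return 10 * prior_arrests + 8 * prior_victimizations + max(0, 30 - age)

# Rank highest-risk first. If prior_arrests encodes biased enforcement
# rather than behavior, that bias flows straight into the ranking.
for pid, *features in sorted(people, key=lambda p: -risk_score(*p[1:])):
    print(pid, risk_score(*features))
```

If an input like prior arrests reflects discriminatory enforcement rather than actual behavior, the ranking reproduces that discrimination no matter how neutral the arithmetic looks, which is exactly the dirty-data problem the authors describe.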
New Orleans: The Department of Justice’s investigation report documented evidence of “dirty data.” The report identified inconsistencies in NOPD arrest records and field interviews, raising concerns about NOPD policies that encouraged unwarranted data collection. In 2012, the City of New Orleans signed a contract with Palantir to use its data analysis for police prediction. There is limited information about the Palantir system, as the contract was signed without the knowledge of key government officials or the public. Evidence also suggests that the Palantir system relied on NOPD’s dirty data, because the system’s predictions reflected biases similar to NOPD’s own.
Maricopa County: The Department of Justice released an investigation documenting MCSO’s discriminatory behavior toward Latino residents, including unlawful stops and arrests. It also noted that MCSO created a “wall of distrust” that “substantially compromised effective policing by limiting the willingness of witnesses and victims to report crimes and speak to the police about criminal activity.” Here, too, a lack of transparency makes it difficult to know whether this data was used in any local predictive policing system. The Mesa Police Department in Maricopa County entered a three-year contract with the predictive policing software company PredPol, which required the police department to provide local crime data.
Confirmation feedback loops: This term stood out to me because it captures how predictive systems can manufacture their own evidence. When police are deployed to neighborhoods flagged by historical data, the increased presence produces more recorded incidents there, which feeds back into the data and reinforces existing assumptions and stereotypes about marginalized communities.
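A tiny simulation, with entirely invented numbers, makes the loop concrete: patrols are allocated by past recorded crime, and more patrols produce more recorded crime, even when the underlying crime rate is identical everywhere.

```python
# Minimal simulation of a confirmation feedback loop; all numbers are
# invented. Two neighborhoods have identical underlying crime, but one
# starts with more *recorded* incidents due to heavier past policing.

recorded = {"neighborhood_a": 60, "neighborhood_b": 40}
TOTAL_PATROLS = 100
RECORDED_PER_PATROL = 0.5  # same detection rate everywhere

for year in range(5):
    total = sum(recorded.values())
    for hood in recorded:
        # Patrols are allocated in proportion to past recorded crime...
        patrols = TOTAL_PATROLS * recorded[hood] / total
        # ...and more patrols mean more incidents get recorded there,
        # independent of the true underlying crime rate.
        recorded[hood] += patrols * RECORDED_PER_PATROL
    print(year, {h: round(v) for h, v in recorded.items()})
```

Even though the two neighborhoods are identical by construction, the initial recording gap never corrects itself: it compounds every year, and the system reads its own output as confirmation.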
Can you make AI fairer than a judge? made me realize how distortion lies not only in algorithms and data-analytics software, but also in the biases embedded in the communities and systems within which these algorithms and data exist. The criminal justice system is just one example of how predictive systems are relied upon for big-picture decisions, and the inaccuracy of their results cannot be attributed to the algorithms alone.
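The article centers on recidivism risk scores like COMPAS, where a model can look equally accurate overall while failing one group far more often. Here is a minimal sketch of one such check, comparing false positive rates across two groups; the records below are made up and are not the article’s data.

```python
# Minimal sketch of one fairness check discussed around risk scores like
# COMPAS: comparing false positive rates across groups. Records are
# fabricated for illustration only.

# Each record: (group, flagged_high_risk, actually_reoffended)
records = [
    ("group_a", True, False), ("group_a", True, True),
    ("group_a", False, False), ("group_a", True, False),
    ("group_b", True, True), ("group_b", False, False),
    ("group_b", False, False), ("group_b", True, True),
]

def false_positive_rate(group):
    # Among people who did NOT reoffend, what share was flagged high risk?
    negatives = [r for r in records if r[0] == group and not r[2]]
    return sum(1 for r in negatives if r[1]) / len(negatives)

for g in ("group_a", "group_b"):
    print(g, false_positive_rate(g))
```

As the article explains, several reasonable fairness criteria of this kind cannot all be satisfied at once when groups have different base rates, so choosing among them is a policy judgment rather than a purely technical fix.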
The Google thesis reading was equally insightful for understanding how a large-scale design is presented globally and what techniques go into presenting data of such magnitude. What really intrigued me about this reading is the question of how one can present so much information without giving it an order or some form of hierarchy.
I picked Plato’s The Laws for my blog on a utopian philosophy. In the book, Plato lays out the laws and policies of a fictional city called Magnesia. In Plato’s view, effective lawmakers need to persuade the citizens of Magnesia rather than simply impose commands and policies on them, and this emphasis on persuasion is one of the major innovations in the political theory of the Laws. This brings us to the relationship between the Laws and the Republic. The Republic has traditionally been understood as Plato’s statement of the ideally best city (his utopia), whereas the Laws describes the best city under less optimistic assumptions about what human nature is capable of. Plato seems to have thought that the ideal city of the Republic placed too great a demand on its citizens, far beyond what they could psychologically bear, and Magnesia is his answer to that limit.