Keywords: police discrimination, police accountability, predictive policing, auditing, algorithms, disparate impact
Predictive policing technology has been adopted by police departments in major US cities, and while its use continues to spread, its legal implications have not been thoroughly examined. Predictive policing algorithms may be biased against minorities and other vulnerable groups, potentially leading police departments to pursue policing strategies that harm those groups. This article proposes that legislatures fill the current gap in legal oversight by requiring external audits of the predictive policing algorithms police departments use. These audits would examine the algorithms for bias using up-to-date technological tools and publish the results in a public report, thereby holding predictive policing companies and police departments accountable while providing transparency and protection for the public.
BYU ScholarsArchive Citation
"Auditing Predictive Policing,"
Brigham Young University Prelaw Review: Vol. 33, Article 4.
Available at: https://scholarsarchive.byu.edu/byuplr/vol33/iss1/4