As this Article sets forth, once a computerized algorithm is used by the government, constitutional rights may attach. And, at the very least, those rights require that algorithms used by the government as evidence in criminal trials be made available—both to litigants and the public. Scholars have discussed how the government’s refusal to disclose such algorithms runs afoul of defendants’ constitutional rights, but few have considered the public’s interest in these algorithms—or the widespread impact that public disclosure and auditing could have on ensuring their quality.

This Article aims to add to that discussion by setting forth a theory of the public’s First Amendment right of access to algorithms used as evidence in criminal trials. This Article uses probabilistic genotyping programs as an illustrative example, largely because the creators of these algorithms have pushed most aggressively to keep them secret. Section I begins by defining the relevant terms, including computerized algorithms, probabilistic genotyping programs, machine learning, and source code.

Section II describes the roles that humans play in designing, building, operating, and communicating the results of such algorithms—and the variety of errors that almost inevitably result. Section III summarizes the case law articulating the public’s First Amendment right of access and explains how and why that right should attach to computerized algorithms used as evidence in criminal trials.