Text Mining for Bias: A Recommendation Letter Experiment
Publication Title
American Business Law Journal
Document Type
Article
Publication Date
Spring 2022
Abstract
This article uses computational text analysis to study the form and content of more than 3,000 recommendation letters submitted on behalf of applicants to a major U.S. anesthesiology residency program. The article finds small differences in form and larger differences in content. Women applicants' letters were more likely to contain references to acts of service, for example, whereas men's letters were more likely to describe professionalism and technical skills. Some differences persisted when controlling for standardized aptitude test scores, on which women and men scored equally on average, and for other applicant and letter-writer characteristics. Even when all explicit gender-identifying language was stripped from the letters, a machine learning algorithm was able to predict applicant gender at a rate better than chance. Gender-stereotyped language in recommendation letters may infect the entirety of an employer's hiring or selection process, implicating Title VII of the Civil Rights Act of 1964. Not all gendered language differences were large, however, suggesting that small changes may remedy the problem. The article closes by proposing a computationally driven system that may help employers identify and eradicate bias, while also prompting a rethinking of our gendered, racialized, ableist, ageist, and otherwise stereotyped occupational archetypes.
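To make the abstract's central methodological claim concrete, the sketch below illustrates the general technique of stripping explicit gender-identifying words from letter text and then testing whether a classifier can still predict applicant gender better than chance. It is a minimal illustration only: the word list, the placeholder letters and labels, and the TF-IDF plus logistic regression pipeline are assumptions for demonstration, not the article's actual data or modeling choices.

```python
# Illustrative sketch (not the article's pipeline): remove explicit gender
# terms, then check whether a simple classifier still beats chance.
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical, non-exhaustive list of gender-identifying words.
GENDERED_TERMS = r"\b(she|her|hers|he|him|his|ms|mr|mrs|woman|man)\b"

def strip_gender_terms(text: str) -> str:
    """Remove explicit gender-identifying words before modeling."""
    return re.sub(GENDERED_TERMS, " ", text, flags=re.IGNORECASE)

# Hypothetical placeholder letters and labels (1 = woman applicant, 0 = man);
# a real analysis would load the full letter corpus instead.
letters = [
    "She was tireless in her service to the department and its committees.",
    "He demonstrated outstanding technical skill and professionalism.",
    "Ms. Doe volunteered for every outreach activity we organized.",
    "Mr. Roe showed exceptional procedural competence in the operating room.",
] * 25  # repeated only so cross-validation has enough rows to run
labels = [1, 0, 1, 0] * 25

cleaned = [strip_gender_terms(t) for t in letters]

# TF-IDF features feeding a logistic regression classifier.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
scores = cross_val_score(model, cleaned, labels, cv=5, scoring="accuracy")
print(f"Mean cross-validated accuracy: {scores.mean():.2f} (chance is roughly 0.50)")
```

On this toy data, accuracy above 0.50 after the gendered terms are removed would indicate that implicit word-choice patterns, rather than explicit gender markers, carry the predictive signal, which is the kind of result the abstract describes.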
Recommended Citation
Charlotte S. Alexander, Text Mining for Bias: A Recommendation Letter Experiment, 59 Am. Bus. L.J. 5 (2022).
Institutional Repository Citation
Charlotte Alexander, Text Mining for Bias: A Recommendation Letter Experiment, Faculty Publications By Year 3270 (2022), https://readingroom.law.gsu.edu/faculty_pub/3270.
Volume
59
First Page
5
Comments
Lexis