Charlotte Alexander, Georgia State University – Text Mining for Bias: A Recommendation Letter Experiment
How do we hire a more diverse workforce?
Charlotte Alexander, associate professor of legal analytics at Georgia State University, details how to avoid bias when looking for new employees.
Charlotte S. Alexander is an associate professor of legal analytics at Georgia State University’s Robinson College of Business and director of its Legal Analytics Lab, a joint initiative of the Robinson College of Business and the university’s College of Law. Alexander received the 2016 Distinguished Early Career Faculty Award from the Academy of Legal Studies in Business and was named to the Fastcase 50 list of Legal Innovators. Prior to her academic career, she worked as an employment lawyer representing women facing discrimination and harassment on the job.
Text Mining for Bias: A Recommendation Letter Experiment
How can organizations identify and prevent gender bias in hiring? What are some actionable tips for people who frequently give recommendations?
My research involved manual and computational text analysis of more than 3,000 recommendation letters submitted on behalf of applicants to a major U.S. anesthesiology residency program, and it revealed key differences in how men and women were described. Letters in support of men emphasized achievement, professionalism, excellence, and technical skill, while letters in support of women focused on communal qualities such as compassion, caring for others, and acts of service.
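The study’s own pipeline isn’t reproduced here, but a minimal sketch of this kind of group comparison could look like the following. Everything in it is invented for illustration: the tiny agentic and communal word lists, the one-sentence “letters,” and the per-1,000-words rate metric are hypothetical stand-ins, not the study’s actual lexicons or data.

```python
import re

# Hypothetical mini-lexicons; real analyses use validated word inventories.
AGENTIC = {"achievement", "excellent", "skilled", "leader", "technical"}
COMMUNAL = {"compassionate", "caring", "helpful", "warm", "kind"}

def term_rate(letters, lexicon):
    """Average per-letter rate of lexicon hits, in hits per 1,000 words."""
    rates = []
    for text in letters:
        words = re.findall(r"[a-z']+", text.lower())
        hits = sum(1 for w in words if w in lexicon)
        rates.append(1000 * hits / max(len(words), 1))
    return sum(rates) / len(rates)

# Toy corpora standing in for letters written for men vs. women.
letters_men = ["An excellent, skilled leader with real technical depth."]
letters_women = ["A warm, caring, and compassionate colleague, always helpful."]

for group, letters in [("men", letters_men), ("women", letters_women)]:
    print(f"{group}: agentic {term_rate(letters, AGENTIC):.0f}, "
          f"communal {term_rate(letters, COMMUNAL):.0f} per 1,000 words")
```

Published work in this area would pair such counts with statistical tests across thousands of letters; the sketch only shows the shape of the comparison.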
These gendered language differences can have broad implications for job applicants, employers, and people who write recommendation letters. Gendered language in recommendation letters may influence an employer’s hiring decisions, which could implicate Title VII of the Civil Rights Act of 1964.
The findings also prompt a rethinking of our gendered, racialized, ableist, ageist, and otherwise stereotyped occupational archetypes.
The problem can be remedied at the employer level by screening recommendation letters for gendered language before it influences the rest of the hiring process, for instance with a computationally driven system that flags biased wording for review. Telephone reference checks can be monitored the same way: transcribe the calls, then review the transcripts for the same gendered language patterns that appear in letters.
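As one hedged illustration of what such screening might look like, the lexicon-rate idea above can be turned into a simple per-document flag that works on a letter or a call transcript alike. The word list and threshold here are hypothetical placeholders; an employer would substitute validated lexicons and calibrate the cutoff on its own corpus.

```python
import re

# Hypothetical communal-language lexicon; not the study's actual word list.
COMMUNAL = {"compassionate", "caring", "helpful", "warm", "kind", "service"}

def flag_for_review(text, threshold=15.0):
    """Flag a letter or transcript whose communal-term rate
    (hits per 1,000 words) exceeds an illustrative threshold."""
    words = re.findall(r"[a-z']+", text.lower())
    rate = 1000 * sum(w in COMMUNAL for w in words) / max(len(words), 1)
    return rate > threshold, rate

letter = "She is a warm and caring mentor whose acts of service stand out."
flagged, rate = flag_for_review(letter)
print(f"communal-term rate: {rate:.1f} per 1,000 words; flag: {flagged}")
```

A flag like this would only route a document to a human reviewer; deciding what counts as bias, and what to do about it, remains a judgment call.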
As for people who write recommendation letters, academics in particular may want to rethink wording that reinforces stereotyped occupational archetypes, for example, using adjectives tied to communal skills and acts of service to describe female candidates but not male candidates.