Investigating gender and racial-ethnic biases in sentiment analysis of language
By: Steven Zhou, Arushi Srivastava
Format: Article
Published: Taylor & Francis Group, 2024-12-01
Description
Recently, there has been an increase in the use of text analysis and natural language processing in both research and applied practice, especially to quantify emotions in language (i.e. sentiment analysis). Building on theories of how language and emotions interact, and how these interactions differ by gender and race/ethnicity, our study assesses bias in common sentiment analysis tools (e.g. AFINN, NRC). Specifically, we examine measurement bias and predictive bias across genders and races/ethnicities using a novel real-world dataset of participant interviews from a simulated multi-day team-based competition. We found no evidence of measurement bias by race/ethnicity, but some bias by gender: females tended to express higher mean levels of emotion and more variance in emotion. There was no evidence of predictive bias by gender or race/ethnicity, though the latter was marginally significant. We hope this study paves the way towards more inclusive and accurate analytical tools that help researchers reduce demographic biases in their work. These findings also matter for organizations seeking equitable tools to better understand the needs of their diverse customers and employees.
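Tools like AFINN score sentiment by summing per-word valence values from a fixed lexicon; biases can arise when groups express emotion with different word choices or intensities. As a minimal sketch of how such lexicon-based scoring works (the tiny word list below is a hypothetical excerpt, not the actual AFINN lexicon, and this is not the authors' analysis pipeline):

```python
# Minimal lexicon-based sentiment scoring in the style of AFINN
# (word -> integer valence from roughly -5 to +5).
# NOTE: this toy lexicon is illustrative only, not the real AFINN list.
AFINN_LIKE = {
    "good": 3, "great": 3, "happy": 3,
    "bad": -3, "sad": -2, "terrible": -3,
}

def sentiment_score(text: str) -> int:
    """Sum the valence of every known word; unknown words score 0."""
    return sum(AFINN_LIKE.get(word, 0) for word in text.lower().split())

print(sentiment_score("the team felt great but the loss was terrible"))  # 0
```

Because the score is a simple sum over matched words, any demographic difference in vocabulary or emotional expressiveness feeds directly into the measured sentiment, which is the kind of measurement bias the study probes.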