A Field-specific Analysis of Gender and Racial Biases in Generative AI
Presenter(s)
Joseph K. West and Meagan Kaufenberg-Lashua (student co-presenter)
Abstract
Four artificial intelligence (AI) image generators were used to produce "faces" of chemists with varying occupational titles. Images were analyzed for representational biases and compared to data available from the National Science Foundation (NSF). Presentational biases were analyzed within the American Chemical Society (ACS) Diversity, Equity, Inclusion, and Respect guidelines, with a particular focus on the presentation of diversity and disability. Amplification of both representational and presentational biases was observed for all four AI generators, despite overall alignment of demographic trends with NSF and ACS reporting. The influence of occupationally tied prompts (specific to chemistry) on the demographic distributions of AI-generated images was investigated. At least one AI image generator assigned women and racial minorities to "assistant" positions while men and white individuals occupied the "top" positions in the field. Our data also demonstrate the erasure of people with visible disabilities in the AI-generated outputs.
Perspectives of current students in chemistry classes at Winona State University on "what a chemist looks like" were collected and analyzed. A disturbing prevalence of a "white male" image of a chemist was evident, even among students identifying as female or as a person of color.
College
College of Science & Engineering
Department
Chemistry
Campus
Winona
First Advisor/Mentor
Joseph West
Location
Oak Rooms E/F - Kryzsko Commons
Start Date
4-18-2024 11:40 AM
End Date
4-18-2024 11:59 AM
Presentation Type
Oral Presentation
Format of Presentation or Performance
In-Person