Mason student presents on AI bias at international conference

Kevin Kuck, a George Mason University mechanical engineering senior, recently presented a paper on generative artificial intelligence (AI) bias at an international conference in Monterrey, Mexico. The paper was an offshoot of work that Kuck completed for A Seat at the Table, a class examining diversity, equity, and inclusion (DEI) issues in engineering.

Kuck presented at the World Engineering Education Forum (WEEF) and Global Engineering Deans Council (GEDC) Conference, which is organized annually by the International Federation of Engineering Education Societies (IFEES, housed in the Mason College of Engineering and Computing). The conference attracted hundreds of engineering educators, administrators, students, and other professionals interested in engineering pedagogy and research. 

Kuck said that being exposed to DEI topics got him excited about research possibilities. "It was very eye-opening, and for my final project, I did a case study on the over-sexualization of women who use a popular AI-powered TikTok filter that converts people into anime-stylized characters. I realized the filter was also unintentionally misrepresenting people's gender and race, and it made me see what a big problem this is across many AI systems." He was so fascinated by the topic and its pervasive problems that his paper, "Generative Artificial Intelligence: A Double-Edged Sword," grew from the assigned two pages to 80.

The paper's central finding is that bias stems from the datasets AI developers use to shape their models' language and output, which are often culturally, ethnically, and racially skewed. The companies that create these datasets gather data through massive "scrapes" of the Internet that are not adequately reviewed.

Professors Leigh McCue-Weil and Christopher Carr encouraged Kuck to submit his paper to the conference. After he whittled it down to 10 pages, the paper was accepted and he was invited to present. "I originally had 10 minutes," he said. "But one person had to drop out, and the moderators didn't stop me, so I hit all my major points."

Kuck said that there was tremendous interest in AI bias among attendees.

The event was Kuck's first international trip. He enjoyed Monterrey's architecture and nightlife and found the city very attractive. "All the food was delicious," he added. The networking opportunities were also significant. "I approached Paul Gilbert, the CEO of Quanser, and he already knew who I was and said that he had read up on my paper. We had a great conversation about AI bias." Kuck also discussed his research with a panel of AI professionals and many attendees and presented a poster summarizing his work.

During the conference, Kuck befriended the leadership team of the Student Platform for Engineering Education Development (SPEED); he joined the organization and was appointed webmaster and photographer, a role that may include an expense-paid trip to the 2024 WEEF and GEDC Conference in Sydney, Australia.

Regarding the future of AI bias, Kuck said, "Developers haven't been tackling it logically. They need to return to where the datasets are first implemented instead of trying to attack the problem after it's been embedded. For example, some AI developers have tried filtering the datasets to mitigate the problem. Still, those filters can cut out good examples of diversity and, in turn, have been shown to increase bias." 

As part of his research, Kuck developed a novel framework for addressing bias in AI systems and proposed ways in which we can begin to solve the problems. "If we begin tackling these issues now and commit to changing how datasets are created and implemented, we can cut down on the severity and frequency of bias being exhibited by AI systems so that they can be more equitable and reflective of the diversity of our world."