Decolonizing Tech: Interrogating the Impacts of Generative AI on BIPOC Communities
Description

What are the implications of a Black AI-generated Instagram model run by a non-Black creator? How might voters in communities of color be influenced or manipulated in elections through deepfake images and videos? Over the last decade, the emergence and refinement of artificial intelligence technologies have opened the door to new possibilities. But with this vision of augmented futures and streamlined work processes comes the potential for harm, especially for communities most susceptible to the negative impacts of generative AI tools. Interdisciplinary researchers have revealed the harms of algorithmic negligence on communities of color and the devastating impacts of racial bias in AI tools, as seen in the use of facial recognition as a policing tool and in synthetic media. Although the future holds the promise of being the final frontier in which BIPOC communities are free, there remain necessary questions to ask (as users, developers, researchers, and technologists) about the ways generative AI technologies have the potential to engender new harms.

In this Birds of a Feather session, we invite both panelist experts and participants to engage with the following questions: What is generative AI? What are its positive implications and negative impacts? How has it affected BIPOC communities thus far, and what is our role in helping to mitigate the negative impacts of these tools? Our panelists will include researchers and domain experts across academia and industry whose backgrounds center on AI fairness research and algorithmic bias. As generative AI is increasingly used to influence millions, it is necessary to contend with the harms it may present to racially minoritized communities, and we anticipate that this panel will help facilitate productive conversations around ways to mitigate those harms.
Event Type
Birds of a Feather
Time
Thursday, September 19th, 1:45pm - 2:45pm PDT