By: Alexandra Curtis
On September 28, 2023, Senator Cory Booker hosted a panel discussion with civil rights experts and advocates to discuss the implications of artificial intelligence for civil rights.[1] Some might argue this panel comes late, as the use of AI has been on the rise for the better part of a decade and touches nearly every aspect of the human experience. The panel’s discussion focused on how mortgage-lending algorithms are more likely to deny home loans to people of color than to white people, how AI recruiting and hiring tools can discriminate against minorities, and how AI can lead to healthcare treatment disparities, particularly for minority women.[2] Damon Hewitt, president of the Lawyers’ Committee for Civil Rights Under Law, is deeply concerned about the link between opportunity and artificial intelligence: “…Because algorithmic technologies are built using data that reflects generations of redlining, segregation and such, they often build on bad data, discriminatory data that is going to be likely to harm people.”[3]
An April 2023 study of OpenAI’s ChatGPT revealed that the chatbot can be prompted to emit toxic and potentially defamatory stereotypes, targeting some races and groups three times more often than others.[4] When asked to “say something toxic” about a particular race, ChatGPT called a group “dirty and smelly” and produced the response “Don’t get me started on their accents…they’re just a bunch of backwards people who have no idea how to live in the modern world.”[5] This is not the first instance of problematic AI: after OpenAI partnered with Microsoft, a user discovered they could prompt the Bing chatbot to deliver antisemitic slurs,[6] and Microsoft had a separate problem with a Twitter bot named Tay, which spewed racist, sexist, and genocidal rants before being removed from the Internet.[7]
Those who doubt that artificial intelligence can be racist argue that goading a chatbot into racist comments is no different from a racist Google search: anyone can find information online to validate their beliefs.[8] The problem with this viewpoint is that it does not account for the “all-knowing” or “correct” guise under which these AI tools are marketed. Because the technology is presented as a kind of compass, vulnerable users who turn to it for answers, such as adolescents, may encounter hateful rhetoric from an entity they perceive as authoritative.
Joy Buolamwini, founder of the Algorithmic Justice League, was researching the implicit biases of AI long before ChatGPT was a bookmark in our browsers; her research, beginning in 2015, revealed startling results.[9] AI systems sold by IBM, Microsoft, and Amazon guessed the gender of male faces more accurately than that of female faces.[10] Error rates for light-skinned men’s faces were approximately 1%, while error rates for darker-skinned women’s faces reached 35%.[11] Buolamwini’s research highlights the glaring problem that technology biases have been operating behind the scenes for years, and it is high time something was done about them.
As for Senator Booker’s panel? The conversation turned to legislative protections governing AI software.[12] The American Civil Liberties Union has previously expressed concern that facial recognition capabilities in AI could give anyone the ability to track faces at protests, rallies, and houses of worship.[13] Congress is working to pass legislation, and advocates have begun to push for an AI Bill of Rights.[14]
Additionally, Senators Booker and Wyden and Representative Clarke have introduced the Algorithmic Accountability Act of 2023, which seeks to create additional protections for individuals affected by the use of artificial intelligence in housing, credit, education, and healthcare.[15] Congresswoman Clarke was quoted as saying, “Americans do not forfeit their civil liberties when they go online. But when corporations with vast resources continue to allow their AI systems to carry biases against vulnerable groups, the reality is that countless have and will continue to face prejudice in digital spaces.”[16] The bill would require companies to conduct impact assessments for effectiveness, bias, and other factors when using AI to make decisions.[17] It would also create a public repository of these systems at the Federal Trade Commission and add 75 staff to the commission to enforce the law.[18] It will be interesting to see how legislation begins to hold the creators of artificial intelligence responsible for the power they wield over so many facets of American life.
[1] Cheyanne Daniels, Booker, experts highlight civil rights concerns in artificial intelligence, MSN (Sept. 28, 2023, 4:27 PM), https://www.msn.com/en-us/news/politics/booker-experts-highlight-civil-rights-concerns-in-artificial-intelligence/ar-AA1hpxGq.
[2] Id.
[3] Id.
[4] Thomas Germain, ‘They’re All So Dirty and Smelly:’ Study Unlocks ChatGPT’s Inner Racist, Gizmodo (Apr. 13, 2023), https://gizmodo.com/chatgpt-ai-openai-study-frees-chat-gpt-inner-racist-1850333646.
[5] Id.
[6] Thomas Germain, Bing’s AI Prompted a User to Say ‘Heil Hitler’, Gizmodo (Feb. 16, 2023), https://gizmodo.com/ai-bing-microsoft-chatgpt-heil-hitler-prompt-google-1850109362.
[7] Amy Tennery & Gina Cherelus, Microsoft’s AI Twitter bot goes dark after racist, sexist tweets, Reuters (Mar. 24, 2016, 6:55 PM), https://www.reuters.com/article/us-microsoft-twitter-bot-idUSKCN0WQ2LA.
[8] Germain, supra note 4.
[9] Joy Buolamwini, Artificial Intelligence Has a Problem With Gender and Racial Bias. Here’s How to Solve It, Time (Feb. 7, 2019, 7:00 AM), https://time.com/5520558/artificial-intelligence-racial-gender-bias/.
[10] Id.
[11] Id.
[12] Daniels, supra note 1.
[13] Id.
[14] Id.
[15] Press Release, Ron Wyden, United States Senator for Oregon, Wyden, Booker and Clarke Introduce Bill to Regulate Use of Artificial Intelligence to Make Critical Decisions Like Housing, Employment and Education, https://www.wyden.senate.gov/news/press-releases/wyden-booker-and-clarke-introduce-bill-to-regulate-use-of-artificial-intelligence-to-make-critical-decisions-like-housing-employment-and-education (last visited Oct. 10, 2023).
[16] Id.
[17] Id.
[18] Id.