Denied by Design: How Algorithms Reinforce Housing Inequality

By Ashley Romay

Artificial intelligence (AI) is a transformative and versatile tool that replicates aspects of human decision-making at incredible speed. But like any technology, its impact depends on how it is used. At first glance, AI promises efficiency and precision, yet its widespread application, particularly in the housing industry, should raise caution. Data-driven tools should be free of bias and should promote equitable, inclusive housing opportunities; in practice, algorithms built on biased data reinforce racial and socioeconomic inequities behind a veneer of “objectivity.” AI tenant screening tools collect and analyze housing applications by evaluating an applicant’s credit history, employment, rental history, and civil and criminal records, and by generating proprietary risk scores.[1] Housing providers then use these reports to decide whether to accept or deny an applicant. Although it is undisputed that housing providers may select tenants who will lawfully comply with the terms of a lease, providers also have a responsibility to run a fair and transparent process. AI screening tools that rely on “imprecise or overbroad criteria” unjustifiably exclude people from housing opportunities and thus raise legal challenges under the Fair Housing Act.[2]

The Fair Housing Act prohibits discrimination by direct providers of housing, explicitly barring facially discriminatory screening practices.[3] The use of AI, however, introduces a more subtle challenge: rather than discriminating intentionally, AI algorithms create an unjustified discriminatory effect by imposing a disparate impact on applicants, amplifying longstanding systemic inequalities in credit, employment, and criminal justice. Historically, marginalized communities have had unequal access to socioeconomic opportunities and resources, factors that directly shape credit scores, employment status, and rental history. On top of these barriers, these communities face yet another economic obstacle: error-ridden credit reports. Individuals in Black and Hispanic neighborhoods are “far more likely to have disputes of inaccurate information appear on their credit reports”[4] or to have “experiences that resulted in low or no credit scores.”[5] Eviction history, another common screening criterion, disproportionately affects Black and Hispanic renters, who “have eviction cases filed against them at higher rates than White renters, with women [in] those groups bearing even higher disparities.”[6] When AI systems generate reports from data that is a “mirror of inequalities of the past,” an unjustified discriminatory impact arises.[7]
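Disparate impact is, at bottom, a statistical claim: a facially neutral rule produces meaningfully worse outcomes for a protected group. To make that concrete, the minimal Python sketch below shows the kind of outcome audit such a claim rests on. The application figures are entirely hypothetical, and the “four-fifths” benchmark is a rule of thumb borrowed from EEOC employment guidance, not a threshold the Fair Housing Act itself prescribes:

    # Hypothetical audit of a tenant screening tool's outcomes.
    # All figures are invented for illustration; they are not drawn
    # from any of the cases or reports cited in this article.

    applications = {
        # group: (applications received, applications approved)
        "Group A": (1000, 800),
        "Group B": (1000, 560),
    }

    # Approval (selection) rate for each group.
    rates = {g: approved / total for g, (total, approved) in applications.items()}

    # Adverse impact ratio: each group's rate relative to the highest rate.
    # The "four-fifths rule" flags ratios below 0.8 as possible evidence
    # of disparate impact.
    best = max(rates.values())
    for group, rate in rates.items():
        ratio = rate / best
        flag = "potential disparate impact" if ratio < 0.8 else "ok"
        print(f"{group}: approval rate {rate:.0%}, impact ratio {ratio:.2f} ({flag})")

Run on these invented numbers, the sketch reports an impact ratio of 0.70 for Group B, below the 0.8 benchmark; in litigation, of course, such a disparity is only the starting point of the analysis, not its end.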

The difficulty of proving an unjustified discriminatory impact, however, is rooted in the opaque nature of AI tools and the lack of transparency behind their decisions. Yet this has not stopped cases from emerging in the courts. In an ongoing case, Jacksonville Area Legal Aid filed suit challenging an AI screening tool for disproportionately affecting Black renters by issuing blanket denials to housing applicants with previous eviction filings.[8] Similarly, in Open Communities v. Harbor Group Management Co., a settlement was reached after plaintiffs alleged violations of the Fair Housing Act, claiming an AI screening tool discriminated by automatically denying applicants who used Housing Choice Vouchers.[9] As AI use becomes widespread, there is little doubt that such cases will continue to make their way to the courts.

So, where does this leave us? By erecting invisible barriers that further restrict marginalized communities’ access to housing, AI screening tools circumvent the protections of the Fair Housing Act. Although AI is becoming the backbone of critical decision-making and abandoning the tool altogether is impractical, reform is necessary. Access to stable housing creates better educational opportunities, contributes to healthier habits, and builds stronger economic foundations; denying individuals this access is a critical issue of social injustice.[10] To that end, stronger oversight, more representative data, and regular audits of algorithmic tools are essential to increase transparency, detect bias in AI models, and ensure that tenant screening policies are accurate and fair. Without change, these tools will continue to perpetuate cycles of housing inequality for those who need stable housing the most.


[1] Consumer Financial Protection Bureau, Tenant Background Checks Market Report (Nov. 2022), https://files.consumerfinance.gov/f/documents/cfpb_tenant-background-checks-market_report_2022-11.pdf.

[2] U.S. Dep’t of Hous. & Urb. Dev., Guidance on Screening of Applicants for Rental Housing, https://www.hud.gov/sites/dfiles/FHEO/documents/FHEO_Guidance_on_Screening_of_Applicants_for_Rental_Housing.pdf.

[3] U.S. Dep’t of Justice, The Fair Housing Act, https://www.justice.gov/crt/fair-housing-act-1.

[4] Consumer Financial Protection Bureau, CFPB Finds Credit Report Disputes Far More Common in Majority Black and Hispanic Neighborhoods (Nov. 2, 2021), https://www.consumerfinance.gov/about-us/newsroom/cfpb-finds-credit-report-disputes-far-more-common-in-majority-black-and-hispanic-neighborhoods/.

[5] Consumer Financial Protection Bureau, supra note 1.

[6] Sophie Beiers, Sandra Park & Linda Morris, Clearing the Record: How Eviction Sealing Laws Can Advance Housing Access for Women of Color, ACLU (Jan. 10, 2020), https://www.aclu.org/news/racial-justice/clearing-the-record-how-eviction-sealing-laws-can-advance-housing-access-for-women-of-color.

[7] Natalie Campisi, From Inherent Racial Bias to Incorrect Data: The Problems with Current Credit Scoring Models, Forbes (Feb. 26, 2021), https://www.forbes.com/advisor/credit-cards/from-inherent-racial-bias-to-incorrect-data-the-problems-with-current-credit-scoring-models/.

[8] Charlie McGee, Judge Rules JWB Must Face Trial Over Alleged Use of Algorithms Against Black Renters, Jacksonville Area Legal Aid (June 10, 2024), https://www.jaxlegalaid.org/2024/06/10/judge-rules-jwb-must-face-trial-over-alleged-use-of-algorithms-against-black-renters/.

[9] Cheryl Lawrence & Dominic Voz, Press Release: Open Communities Reaches Resolution in Case Alleging AI Discrimination, Open Communities (Jan. 31, 2024), https://www.open-communities.org/post/press-release-open-communities-reaches-accord-in-case-addressing-artificial-intelligence-communicat.

[10] Veronica Gaitán, How Housing Can Determine Educational, Health, and Economic Outcomes, Housing Matters, an Urban Institute initiative (Sept. 19, 2018), https://housingmatters.urban.org/articles/how-housing-can-determine-educational-health-and-economic-outcomes.