Enhancing Fairness and Accuracy in Tenant Screening: Navigating the Challenges and Opportunities of AI-Driven Processes

Tenant screening processes increasingly rely on artificial intelligence (AI) algorithms to evaluate potential renters. These algorithms analyze various data points, including criminal background checks, eviction histories, credit scores, and rental payment history. As Lauren Kirchner and Matthew Goldstein report in "How Automated Background Checks Freeze Out Renters" in the New York Times, the rapid expansion of the tenant screening industry, now valued at $1 billion, has been fueled by the growing data economy and the rise in American rentership since the 2008 financial crisis. According to a 2022 Consumer Financial Protection Bureau report, around 68% of renters pay application fees that cover tenant screening, making these reports a pervasive gatekeeper in the U.S. rental market.

The rise of corporate landlords and centralized property management systems has driven reliance on AI-based tenant screening services that prioritize efficiency over accuracy. This technological shift raises significant concerns about the fairness and accuracy of the reports produced. AI-driven background checks often contain errors, such as criminal records attributed to the wrong person, which can unjustly disqualify potential renters from securing housing. These errors cause significant life disruptions, forcing individuals into substandard living conditions or homelessness. Moreover, these systems disproportionately affect marginalized groups, particularly people of color, by absorbing and reinforcing existing societal biases and racial disparities in housing access. The lack of stringent regulation of these AI tools compounds these problems.

Despite longstanding protections, including the Fair Housing Act of 1968 and the Fair Credit Reporting Act of 1970, and oversight by agencies such as the Federal Trade Commission (FTC), the Consumer Financial Protection Bureau (CFPB), and the Department of Housing and Urban Development (HUD), discrimination remains rampant in the housing market. These laws and agencies do not adequately protect consumers from the consequences of errors that AI data-gathering practices introduce into tenant reports. Furthermore, the tenant screening industry lacks standardized procedures to ensure data accuracy.

The use of AI algorithms in housing therefore calls for regulation by state and local governments to promote transparency and accuracy in tenant screening and to safeguard tenants' right to correct inaccurate data. This paper draws best practices from the relevant literature to mitigate the policy problems associated with AI-driven tenant screening, with the aim of improving accuracy, fairness, and transparency.

Tenant screening commonly relies on AI algorithms that evaluate potential renters using data points such as criminal background checks, eviction histories, credit scores, and rental payment history. The growing reliance on automated screening services, particularly among large corporate landlords and centralized property management systems, prioritizes efficiency over accuracy. Two consequences make this a pressing policy problem:

  • Errors in Reports: AI-driven background checks often contain errors, including criminal records matched to the wrong person, which can unjustly disqualify potential renters from securing housing, disrupt lives, and force individuals into substandard living conditions or homelessness.

  • Disproportionate Impact on Marginalized Groups: These systems disproportionately affect marginalized groups, particularly people of color, because AI algorithms reflect and reinforce existing racial disparities in housing access. For example, Abby Boshart notes that "Black women were more likely to have an eviction filing that ultimately resulted in dismissal," yet those filings still appear on their rental histories. Kathryn A. Sabbeth likewise documents how even dismissed eviction cases leave a lasting mark, a "Scarlet E," on rental records, further disadvantaging affected individuals.

Cleo Bluthenthal’s report, "The Disproportionate Burden of Eviction on Black Women," underscores the stakes: 68% of Black mothers are the sole breadwinners of their households, and the impact of evictions and housing instability on their children is profound, leading to academic decline, emotional trauma, and long-term health consequences.

The lack of stringent regulation overseeing the accuracy and use of these AI tools exacerbates these issues. Current laws do not sufficiently protect consumers from the consequences of mistaken reports, and the industry lacks standardized procedures to ensure information accuracy.
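Much of the reporting cited above traces these errors to loose matching between applicants and public records, often on name alone. The following minimal Python sketch uses invented records and hypothetical matching rules to illustrate the failure mode: name-only matching attaches strangers' records to an applicant, while requiring a second identifier eliminates the false positives. It is an illustration, not any vendor's actual logic, which remains proprietary.

# Hypothetical illustration of loose vs. strict record matching.
# All names and records are invented.

from dataclasses import dataclass

@dataclass
class CourtRecord:
    name: str
    date_of_birth: str  # "YYYY-MM-DD"
    state: str
    description: str

# Two different people who happen to share a common name.
records = [
    CourtRecord("Maria Garcia", "1961-04-02", "TX", "felony theft conviction"),
    CourtRecord("Maria Garcia", "1988-09-17", "CA", "eviction filing (dismissed)"),
]

# The applicant is a third person with the same name.
applicant = {"name": "Maria Garcia", "date_of_birth": "1995-01-23", "state": "NY"}

def name_only_matches(applicant, records):
    # Loose rule: attach every record whose name matches the applicant's.
    return [r for r in records if r.name.lower() == applicant["name"].lower()]

def multi_field_matches(applicant, records):
    # Stricter rule: name AND date of birth must both agree.
    return [
        r for r in records
        if r.name.lower() == applicant["name"].lower()
        and r.date_of_birth == applicant["date_of_birth"]
    ]

print(len(name_only_matches(applicant, records)))   # 2 -- both false positives
print(len(multi_field_matches(applicant, records))) # 0 -- the correct result

Because screening companies rarely disclose which rule they apply, a rejected tenant cannot tell whether a report reflects their own history or a stranger's.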

Rebecca Burns, in her article “Artificial Intelligence Is Driving Discrimination in the Housing Market,” quotes Hannah Holloway of the TechEquity Collaborative: “We don’t know what their data sources are, or how often they’re scrubbing that information and updating it.” And while some characteristics may be fairly objective, Holloway notes, “If I’m a tenant, I have no idea how they’re using that information to come up with a prediction about whether I’ll damage the property or miss a payment.”

The problem of AI in tenant screening is substantial, growing, and critical due to its broad impacts on fairness, equality, and access to housing. The lack of regulation and increasing reliance on this technology make it an urgent area for policy intervention and reform.

To improve accuracy, fairness, and transparency in tenant screening processes, several best practices are recommended:

  1. Implement Stricter Regulations: Require screening companies to match records to applicants using multiple identifiers rather than name alone (as the matching sketch above illustrates), and guarantee tenants the right to dispute and correct errors.

  2. Increase Transparency: Enhance tenants' rights to access and contest the data used to screen them. Cities such as San Francisco and Newark have adopted "ban the box" regulations that restrict the use of criminal history in tenant screening, reducing discriminatory impacts.

  3. Standardize Screening Guidelines: Develop and enforce standardized guidelines that prioritize data relevance, for example by excluding dismissed filings and stale records, and minimize reliance on potentially discriminatory information (a sketch of such a filter follows this list).

  4. Enhance Oversight: Conduct regular audits of tenant screening agencies and hold them accountable for errors. Audits can pair accuracy reviews with statistical disparate-impact checks (a sketch appears below); increased oversight can ensure adherence to fairness standards.

  5. Support Tenant Education: Implement programs to help tenants understand their rights and navigate disputes about tenant screening.

  6. Employ Advanced Data Analytics: Use analytics to detect stale, duplicate, or mismatched records, ensuring that only relevant and verified data influence rental decisions.
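As a concrete illustration of items 1 and 3, the sketch below shows what a standardized relevance filter could look like in Python. The record fields and thresholds are assumptions for illustration only: the filter drops dismissed eviction filings (the records Boshart and Sabbeth flag as misleading) and adverse records older than the roughly seven-year window the Fair Credit Reporting Act applies to most negative information.

# Hypothetical relevance filter for tenant screening records.
# Field names and rules are illustrative assumptions, not a vendor schema.

from datetime import date

SEVEN_YEARS_IN_DAYS = 7 * 365

def is_reportable(record, today):
    # Exclude eviction filings that were dismissed: no judgment was
    # entered against the tenant, so the filing should not count.
    if record["type"] == "eviction_filing" and record["outcome"] == "dismissed":
        return False
    # Exclude stale records beyond an FCRA-style seven-year window.
    if (today - record["filed_on"]).days > SEVEN_YEARS_IN_DAYS:
        return False
    return True

records = [
    {"type": "eviction_filing", "outcome": "dismissed",
     "filed_on": date(2023, 5, 1)},   # dismissed -- excluded
    {"type": "eviction_filing", "outcome": "judgment",
     "filed_on": date(2012, 3, 15)},  # stale -- excluded
    {"type": "eviction_filing", "outcome": "judgment",
     "filed_on": date(2023, 8, 20)},  # recent judgment -- kept
]

reportable = [r for r in records if is_reportable(r, today=date(2024, 6, 1))]
print(len(reportable))  # 1

Writing rules like these into regulation, rather than leaving them to each vendor's discretion, is what standardized guidelines would mean in practice.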

Implementing these best practices involves legislative action, industry standards, and community support. Together, these strategies can address current deficiencies in tenant screening processes, making them fairer and more transparent, ultimately improving housing access for all potential renters.
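The audits proposed in item 4 can include simple statistical checks. One widely used heuristic, borrowed from employment discrimination law, is the four-fifths rule: a tool's approval rate for any protected group should be at least 80% of the most-approved group's rate. The minimal sketch below uses invented counts; applying the rule to tenant screening is an assumption about how an audit could work, not an existing legal requirement.

# Hypothetical disparate-impact audit using the four-fifths rule.
# Applicant and approval counts are invented for illustration.

def adverse_impact_ratios(approvals, applicants):
    # Each group's approval rate relative to the highest-rate group.
    rates = {g: approvals[g] / applicants[g] for g in applicants}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

applicants = {"group_a": 500, "group_b": 480}
approvals = {"group_a": 400, "group_b": 280}

for group, ratio in adverse_impact_ratios(approvals, applicants).items():
    flag = "FLAG: below 0.80" if ratio < 0.80 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} ({flag})")
# group_a: impact ratio 1.00 (ok)
# group_b: impact ratio 0.73 (FLAG: below 0.80)

A failing ratio is not proof of discrimination, but it is an auditable signal that regulators could require screening agencies to compute and report.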

The integration of AI in tenant screening processes presents significant policy challenges due to inaccuracies and biases that can unjustly affect potential renters, especially marginalized groups. Current regulations do not sufficiently address the complexities of AI-driven data inaccuracies in tenant screening, perpetuating inequality. Implementing best practices in tenant screening not only improves fairness and accuracy but also stabilizes rental markets, benefiting both landlords and tenants.

The negative impacts of AI algorithms on housing and data protection require urgent regulatory intervention by state and local governments. These interventions should enhance transparency, ensure accuracy in tenant screening processes, and safeguard tenants' rights to correct mistaken data. By adopting these best practices and strategic steps, policymakers can create a fairer housing system, ensuring that the benefits of technology in the rental market are realized without sacrificing fairness or accuracy.

References

Consumer Financial Protection Bureau. (2022). CFPB reports highlight problems with tenant background checks. Retrieved from https://www.consumerfinance.gov/about-us/newsroom/cfpb-reports-highlight-problems-with-tenant-background-checks

TechEquity. (2022). Tech, bias, and housing initiative: Tenant screening. Retrieved from https://techequitycollaborative.org/2022/02/23/tech-bias-and-housing-initiative-tenant-screening/

Kirchner, L., & Goldstein, M. (2020). How automated background checks freeze out renters. Retrieved from https://www.nytimes.com/2020/05/28/business/renters-background-checks.html

Sabbeth, K. A. (2021). Erasing the “Scarlet E” of eviction records. Retrieved from https://theappeal.org/the-lab/report/erasing-the-scarlet-e-of-eviction-records/

Boshart, A., Tajo, M., & Choi, J. H. (2022). How tenant screening services disproportionately exclude renters of color from housing. Retrieved from https://housingmatters.urban.org/articles/how-tenant-screening-services-disproportionately-exclude-renters-color-housing

Bluthenthal, C. (2023). The disproportionate burden of eviction on Black women. Retrieved from https://www.americanprogress.org/article/the-disproportionate-burden-of-eviction-on-black-women/

Wu, C. C., Nelson, A., Kuehnhoff, A., Cohn, C., Sharpe, S., & Cabañez, N. (2023). Digital denials. Retrieved from https://www.nclc.org/wp-content/uploads/2023/09/202309_Report_Digital-Denials.pdf

Burns, R. (2023). Artificial intelligence is driving discrimination in the housing market. Retrieved from https://jacobin.com/2023/06/artificial-intelligence-corporate-landlords-tenants-screening-crime-racism

