Background
In Louis v SafeRent Solutions, two renters and a housing-advocacy organization alleged that SafeRent’s tenant-screening algorithm produced disproportionately low “SafeRent Scores” for Black and Hispanic applicants who used housing vouchers, in violation of the Fair Housing Act and Massachusetts law. The court denied SafeRent’s motion to dismiss in July 2023 and later entered a Final Approval Order and Judgment approving a class settlement that included both monetary relief and injunctive measures requiring changes to the scoring model to mitigate discriminatory effects.
AI interaction
The Department of Justice described the dispute as involving “an algorithm-based tenant screening system” that may “discriminate against Black and Hispanic rental applicants” in part through “the use of housing vouchers.” (U.S. Department of Justice, Statement of Interest, 13 Jan 2023). This framing directly linked automated risk-scoring to potential disparate impact under fair-housing law. The court’s ruling and the subsequent settlement signaled that algorithmic decision-making in housing is subject to the same anti-discrimination obligations as traditional screening practices.
Note:
The Final Approval Order and Judgment (20 Nov 2024) follows the U.S. Department of Justice’s Statement of Interest (13 Jan 2023), which outlined the government’s interpretation of algorithmic disparate impact under the Fair Housing Act. Together these documents provide a benchmark for federal oversight of automated screening systems in the rental market.