Federal Agency Settles with Meta in First-of-its-Kind Suit that Challenged Algorithmic Bias in Housing Ads


The U.S. Department of Justice (DOJ) announced on Tuesday that it has settled a discrimination suit with Meta Platforms Inc. over an algorithm that illegally used race, national origin, sex, and other criteria to determine which Facebook users were shown housing ads. Under the terms of the settlement, Meta has agreed to reform its advertising tools, including sunsetting its Special Ad Audience tool, which allegedly relied on a discriminatory algorithm.

The lawsuit, filed on behalf of the Department of Housing and Urban Development (HUD), alleged, among other things, that Meta uses algorithms that differentiate users based on characteristics protected under the Fair Housing Act (FHA), including race, color, religion, sex, disability, familial status, and national origin, adversely impacting marginalized communities.

In particular, the federal government took issue with several aspects of Meta’s ad targeting and delivery system, including how Meta encouraged advertisers to target Facebook audiences based on their demographics. Relatedly, HUD said that Meta’s algorithms used protected characteristics to help determine which subset of an advertiser’s targeted audience would actually receive a housing ad.

The same day that news of the settlement broke, Meta posted about the changes it intends to make. In addition to eliminating its Special Ad Audience tool, the company said it will incorporate a new method into its ad system toward “a more equitable distribution of ads process.” Meta noted that while HUD raised concerns about personalized housing ads, the company also plans to revise its ad methods related to employment and credit.

The settlement, which must first receive court approval, also requires Meta to pay a civil penalty of just over $115,000, the maximum penalty available under the FHA.

Commenting on the agreement, Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division remarked that “[a]s technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner. This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit.”