The Justice Department has settled a case in which Meta was accused of discriminatory advertising practices that violated the Fair Housing Act.
The department alleged that Meta used a discriminatory algorithm to determine which Facebook users should receive advertisements for housing, according to the June 21 press release. The algorithm relied in part on characteristics protected under the Fair Housing Act, including age, color, religion, sex, disability, familial status and national origin, to decide whether or not to show users housing ads, according to the lawsuit.
As part of the settlement, filed in the U.S. District Court for the Southern District of New York, Meta will stop using its "special ad audience" tool to distribute housing advertisements. It also has until December 2022 to develop a new tool that addresses biases in its advertising algorithms.
"When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the Fair Housing Act," said U.S. Attorney Damian Williams for the Southern District of New York. "Because of this ground-breaking lawsuit, Meta will — for the first time — change its ad delivery system to address algorithmic discrimination."