San Francisco, June 22
Meta (formerly Facebook) has agreed to settle a lawsuit with the US government that accused the social network of engaging in discriminatory advertising for housing on its platform.
The housing discrimination lawsuit, filed by the government in 2019, alleged that Meta enabled and encouraged advertisers to target their housing ads by relying on race, colour, religion, sex, disability, familial status, and national origin to decide which Facebook users would be eligible, and ineligible, to receive housing ads.
Under the settlement, Meta will stop using an advertising tool for housing ads which relies on a discriminatory algorithm to find users who "look like" other users based on Fair Housing Act (FHA)-protected characteristics.
Meta also will develop a new system over the next six months to address racial and other disparities caused by its use of personalisation algorithms in its ad delivery system for housing ads, the US Department of Justice said in a statement late on Tuesday.
If the US concludes that the new system adequately addresses the discriminatory delivery of housing ads, then Meta will implement the system, which will be subject to Department of Justice approval and court oversight.
However, if the government finds that the new system is insufficient to address algorithmic discrimination in the delivery of housing ads, then the settlement agreement will be terminated.
"Because of this ground-breaking lawsuit, Meta will -- for the first time -- change its ad delivery system to address algorithmic discrimination," said US Attorney Damian Williams.
"But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation," Williams added.
Meta's ad delivery system used machine-learning algorithms that rely in part on FHA-protected characteristics -- such as race, national origin, and sex -- to help determine which subset of an advertiser's targeted audience will actually receive a housing ad.
"As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner," said Assistant Attorney General Kristen Clarke.
The settlement marks the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit.
"The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalised communities," Clarke added.