Facebook and the US agree to stop discriminatory housing ads

Facebook Ethiopia Hate Speech (Copyright 2021 The Associated Press. All rights reserved.)

Facebook will change its algorithms to prevent discriminatory housing advertising, and its parent company will submit to court oversight, under a settlement announced Tuesday resolving a US Department of Justice lawsuit.

In a press release, U.S. government officials said Meta Platforms Inc., formerly known as Facebook Inc., agreed Tuesday to settle the lawsuit, which was filed in federal court in Manhattan the same day.

According to the press release, it was the Justice Department’s first case challenging algorithmic discrimination under the Fair Housing Act. Facebook is now subject to Department of Justice approval and court oversight for its ad targeting and delivery system.

US Attorney Damian Williams called the lawsuit “groundbreaking”. Assistant Attorney General Kristen Clarke called it “historic”.

Ashley Settle, a Facebook spokeswoman, said in an email that the company is “developing a novel machine learning method within our ads system” that will change the way housing ads are delivered to US residents across different demographic groups.

She said the company will also extend the new method to employment- and credit-related ads in the United States.

“We’re excited to advance this effort,” Settle added in an email.

Williams said the lawsuit alleges that Facebook’s technology violates the Fair Housing Act online, “just as when companies engage in discriminatory advertising using more traditional advertising methods.”

Clarke said, “Companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner.”

Under the terms of the settlement, Facebook will stop using an advertising tool for housing ads that, the government said, relies on a discriminatory algorithm to find users who “look like” other users based on characteristics protected by the Fair Housing Act. By December 31, Facebook must stop using the tool, once called “Lookalike Audience,” which the US says discriminates based on race, gender and other characteristics, the Justice Department said.

Facebook will also develop a new system over the next six months to address racial and other disparities caused by the use of personalization algorithms in its housing ad delivery system, the department said.

If the new system is inadequate, the settlement agreement can be terminated, the Justice Department said. Under the settlement, Meta must also pay a penalty of just over $115,000.

The announcement comes after Facebook agreed in March 2019 to overhaul its ad targeting systems to prevent discrimination in housing, credit and job ads as part of a legal settlement with a group that included the American Civil Liberties Union and the National Fair Housing Alliance.

The changes announced at the time barred advertisers running housing, job, or loan ads from targeting people by age, gender, or zip code.

The Justice Department said Tuesday the 2019 settlement reduced potentially discriminatory targeting options for advertisers but didn’t resolve other issues, including Facebook’s discriminatory delivery of housing ads through machine learning algorithms.