Meta Agrees to Alter Ad-Targeting Technology in Settlement With the United States

SAN FRANCISCO – Meta agreed on Tuesday to change its ad-targeting technology and pay a $115,054 penalty in a settlement with the Justice Department over allegations that the company engaged in housing discrimination by letting advertisers restrict who could see ads on its platform based on their race, gender and ZIP code.

Under the agreement, Meta, the company formerly known as Facebook, said it would change its technology and adopt a new computer-assisted approach that regularly checks whether the audiences who are targeted for and eligible to receive housing ads are in fact seeing those ads. The new approach, which Meta calls a “variance reduction system,” relies on machine learning to ensure that advertisers are delivering housing-related ads to specific protected classes of people.

Meta also said it will no longer use a feature called “Special Ad Audience,” a tool it had developed to help advertisers expand the groups of people their ads would reach. The company said the tool was an early effort to fight bias, and that its new methods would be more effective.

“We’re going to be occasionally taking a snapshot of marketers’ audiences, seeing who they are targeting, and removing as much variance as we can from those audiences,” Roy L. Austin, Meta’s vice president of civil rights and a deputy general counsel, said in an interview. He called it “a significant technological advancement for how machine learning is used to deliver personalized ads.”

Facebook, which built a huge business by collecting its users’ data and letting advertisers target ads based on the characteristics of an audience, has faced complaints for years that some of those practices are biased and discriminatory. The company’s ad systems have allowed marketers to choose who saw their ads by using thousands of different characteristics, which have also let those advertisers exclude people who fall under a number of protected categories.

While Tuesday’s settlement covers housing ads, Meta said it also plans to apply its new system to check the targeting of employment- and credit-related ads. The company has previously faced criticism for allowing bias against women in job ads and for excluding certain groups of people from seeing credit card ads.

“Because of this groundbreaking lawsuit, Meta will – for the first time – change its ad delivery system to address algorithmic discrimination,” Damian Williams, a U.S. attorney, said in a statement. “But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”

The issue of biased ad targeting has been especially contentious in housing ads. In 2018, Ben Carson, who was then the secretary of the Department of Housing and Urban Development, announced a formal complaint against Facebook, accusing the company of having ad systems that “unlawfully discriminated” based on categories such as race, religion and disability. Facebook’s potential for ad discrimination was also revealed in a 2016 investigation by ProPublica, which showed that the company’s technology made it simple for marketers to exclude specific ethnic groups for advertising purposes.

In 2019, HUD sued Facebook for engaging in housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems did not deliver ads to “a diverse audience,” even if an advertiser wanted the ad to be seen broadly.

“Facebook is discriminating against people based upon who they are and where they live,” Mr. Carson said at the time. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”

The HUD suit came amid broader pressure from civil rights groups, which argue that the vast and complicated ad systems underpinning some of the largest internet platforms have inherent biases built into them, and that tech companies like Meta, Google and others should do more to counter those biases.

The area of study, known as “algorithmic fairness,” has become a significant topic of interest among computer scientists in the field of artificial intelligence. Leading researchers, including former Google scientists such as Timnit Gebru and Margaret Mitchell, have sounded the alarm over such biases for years.

In the years since, Facebook has restricted the types of categories that marketers can choose from when buying housing ads, cutting the number down to hundreds and eliminating options to target based on race, age and ZIP code.

Meta’s new system, which is still in development, will periodically check who is being served ads for housing, employment and credit, and make sure those audiences match up with the people marketers want to target. If the ads being served begin to skew heavily toward white men in their 20s, for example, the new system will theoretically recognize this and shift the ads to be served more equitably across broader and more varied audiences.
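To make that description concrete, the short Python sketch below illustrates the general kind of check the article describes: comparing the demographic breakdown of people who were actually served an ad with the breakdown of the eligible audience, and reporting how far delivery has drifted. Meta has not published how its system is implemented; the group names, sample data and functions here are invented for illustration only.

```python
# Hypothetical illustration only, not Meta's actual "variance reduction system".
# It compares who was served an ad against who was eligible, per demographic group.

from collections import Counter

def demographic_shares(impressions):
    """Return each group's share of a list of impression records."""
    counts = Counter(impressions)
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}

def measure_skew(served, eligible):
    """For each group, compute served share minus eligible share."""
    served_shares = demographic_shares(served)
    eligible_shares = demographic_shares(eligible)
    return {
        group: served_shares.get(group, 0.0) - share
        for group, share in eligible_shares.items()
    }

# Invented snapshot data: who was eligible for a housing ad vs. who saw it.
eligible_audience = ["group_a"] * 500 + ["group_b"] * 300 + ["group_c"] * 200
served_audience = ["group_a"] * 420 + ["group_b"] * 60 + ["group_c"] * 20

for group, gap in measure_skew(served_audience, eligible_audience).items():
    # A positive gap means the group is over-served relative to eligibility;
    # a negative gap means it is under-served and delivery should shift toward it.
    print(f"{group}: served minus eligible share = {gap:+.2f}")
```

In a real delivery system, a gap like the one this sketch prints would presumably feed back into how later impressions are allocated rather than simply being reported.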

Meta said it would work with HUD over the coming months to incorporate the technology into its ad-targeting systems, and it has agreed to a third-party audit of the new system’s effectiveness.

The penalty Meta is paying in the settlement is the maximum amount available under the Fair Housing Act, the Justice Department said.
