Technology

Meta agrees to change targeted advertising technology in settlement with U.S.

SAN FRANCISCO – Meta agreed to overhaul its targeted advertising technology and pay a $115,054 fine on Tuesday, in a settlement with the Justice Department over claims that the company had engaged in housing discrimination by letting advertisers restrict who could see ads on its platform based on their race, gender and ZIP code.

Under the agreement, Meta, the company formerly known as Facebook, said it would change its technology and use new computer-assisted methods aimed at regularly checking whether the audiences that are targeted and eligible to receive housing ads are, in fact, seeing those ads. The new approach, referred to as a “variance reduction system,” relies on machine learning to ensure that advertisers are delivering housing-related ads to the specific groups of people they intend to reach.
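
Meta has not published the implementation details of the variance reduction system, but the kind of check the article describes, comparing who actually saw a housing ad with who was eligible and targeted to see it, can be illustrated with a brief sketch. The demographic groups, the audience counts and the total-variation measure below are all hypothetical choices for illustration, not Meta’s actual method.

```python
# Hypothetical sketch of the comparison described in the article: how far the
# demographic mix of people who actually saw a housing ad drifts from the
# demographic mix of the eligible, targeted audience. All data is made up.

def share_by_group(counts: dict[str, int]) -> dict[str, float]:
    """Convert raw counts per demographic group into proportions."""
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def total_variation_distance(eligible: dict[str, int],
                             delivered: dict[str, int]) -> float:
    """One common way to measure skew: half the sum of absolute differences
    between the two distributions (0 = identical, 1 = completely disjoint)."""
    p, q = share_by_group(eligible), share_by_group(delivered)
    groups = set(p) | set(q)
    return 0.5 * sum(abs(p.get(g, 0.0) - q.get(g, 0.0)) for g in groups)

# Made-up example: the eligible audience is balanced, but delivery skews.
eligible_audience = {"group_a": 5000, "group_b": 5000}
delivered_impressions = {"group_a": 4200, "group_b": 1800}

skew = total_variation_distance(eligible_audience, delivered_impressions)
print(f"measured variance between audiences: {skew:.2f}")  # 0.20 in this example
```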

Meta also said it would no longer use a feature called “Special Ad Audiences,” a tool it had developed to help advertisers expand the groups of people their ads would reach. The company said the tool was an early effort to fight bias, and that its new approach would be more effective.

“We’ll occasionally be taking a snapshot of marketers’ audiences, seeing who they target, and removing as much variance as we can from that audience,” Roy L. Austin, Meta’s vice president of civil rights and deputy general counsel, said in an interview. He called it “a significant technological advancement for how machine learning is used to deliver personalized ads.”

Facebook, which built its business by collecting data on its users and letting advertisers target ads based on the characteristics of an audience, has faced complaints for years that some of those practices are biased and discriminatory. The company’s ad systems have allowed marketers to choose who saw their ads by using thousands of different characteristics, which also let those advertisers exclude people who fall under a number of protected categories.

While Tuesday’s settlement involved housing ads, Meta said it also planned to apply its new system to check the targeting of employment- and credit-related ads. The company has previously faced accusations of allowing bias against women in job ads and of excluding certain groups of people from seeing credit card ads.

“Because of this groundbreaking lawsuit, Meta will – for the first time – change its ad delivery system to address algorithmic discrimination,” U.S. attorney Damian Williams said in a statement. “But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”

The issue of biased ad targeting has been debated particularly in housing ads. In 2018, Ben Carson, who was then secretary of the Department of Housing and Urban Development, filed a formal complaint against Facebook, accusing the company of having ad systems that “unlawfully discriminated” based on categories such as race, religion and disability. Facebook’s potential for ad discrimination was also revealed in a 2016 investigation by ProPublica, which showed that the company made it easy for marketers to exclude specific ethnic groups for advertising purposes.

In 2019, HUD sued Facebook for engaging in housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems did not deliver ads to “a diverse audience,” even when advertisers wanted their ads to be seen broadly.

“Facebook is discriminating against people based upon who they are and where they live,” Carson said at the time. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”

The HUD suit came amid a broader push from civil rights groups claiming that the vast and complicated advertising systems that underpin some of the largest internet platforms have biases built into them, and that tech companies like Meta, Google and others should do more to counter those biases.

The field of study, known as “algorithmic fairness,” has been a significant topic of interest among computer scientists working on artificial intelligence. Leading researchers, including the former Google scientists Timnit Gebru and Margaret Mitchell, have been warning about such biases for years.

In recent years, Facebook has narrowed the list of categories that marketers can choose from when buying housing ads, cutting the number down to hundreds and eliminating options to target based on race, age and ZIP code.

Meta’s new system, which is still in development, will periodically check who is being served ads for housing, employment and credit, and make sure those audiences match the people marketers want to target. If the ads being served begin to skew heavily toward white men in their 20s, for example, the new system will theoretically recognize this and shift the ads to be served more equitably across broader and more varied audiences.
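
The article describes this corrective step only at a high level, and Meta has not disclosed how its machine-learning models actually adjust delivery. As a purely illustrative sketch, under the assumption that the system periodically reweights delivery toward under-served groups, the routine below nudges per-group delivery weights so the served audience drifts back toward the intended one. The group names, skew threshold and update rule are all hypothetical.

```python
# Hypothetical sketch of a periodic check-and-adjust loop like the one the
# article describes: compare the served audience with the intended one and
# boost delivery weight for groups that are under-served. All numbers and
# the update rule are illustrative, not Meta's actual system.

SKEW_THRESHOLD = 0.05  # assumed tolerance before any adjustment kicks in

def adjust_weights(intended_share: dict[str, float],
                   served_share: dict[str, float],
                   weights: dict[str, float],
                   step: float = 0.5) -> dict[str, float]:
    """Move each group's delivery weight toward its intended share."""
    new_weights = {}
    for group, target in intended_share.items():
        gap = target - served_share.get(group, 0.0)
        if abs(gap) > SKEW_THRESHOLD:
            # Under-served groups (positive gap) get a higher weight,
            # over-served groups a lower one.
            new_weights[group] = max(
                0.0, weights[group] * (1 + step * gap / max(target, 1e-9))
            )
        else:
            new_weights[group] = weights[group]
    # Renormalize so the weights still sum to 1.
    total = sum(new_weights.values())
    return {g: w / total for g, w in new_weights.items()}

# Made-up snapshot: housing ad delivery is skewing toward one group.
intended = {"men_20s": 0.25, "women_20s": 0.25, "men_40s": 0.25, "women_40s": 0.25}
served   = {"men_20s": 0.55, "women_20s": 0.15, "men_40s": 0.20, "women_40s": 0.10}
weights  = {group: 0.25 for group in intended}

print(adjust_weights(intended, served, weights))
```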

Meta said it would work with HUD over the coming months to incorporate the technology into its ad targeting systems, and it agreed to a third-party audit of the new system’s effectiveness.

The penalty Meta is paying in the settlement is the maximum available under the Fair Housing Act, the Justice Department said.
