
Facebook (Still) Letting Housing Advertisers Exclude Users by Race

After ProPublica revealed last year that Facebook advertisers could target housing ads to whites only, the company announced it had built a system to spot and reject discriminatory ads. We retested and found major omissions.

Facebook CEO Mark Zuckerberg speaks in San Jose, California, in October 2016. (David Paul Morris/Bloomberg via Getty Images)

In February, Facebook said it would step up enforcement of its prohibition against discrimination in advertising for housing, employment or credit.

But our tests showed a significant lapse in the company’s monitoring of the rental market.

Last week, ProPublica bought dozens of rental housing ads on Facebook, but asked that they not be shown to certain categories of users, such as African Americans, mothers of high school kids, people interested in wheelchair ramps, Jews, expats from Argentina and Spanish speakers.

All of these groups are protected under the federal Fair Housing Act, which makes it illegal to publish any advertisement “with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin.” Violators can face tens of thousands of dollars in fines.

Every single ad was approved within minutes.

The only ad that took longer than three minutes to be approved by Facebook sought to exclude potential renters “interested in Islam, Sunni Islam and Shia Islam.” It was approved after 22 minutes.

Under its own policies, Facebook should have flagged these ads, and prevented the posting of some of them. Its failure to do so revives questions about whether the company is in compliance with federal fair housing rules, as well as about its ability and commitment to police discriminatory advertising on the world’s largest social network.

Facebook’s advertising portal lets users choose their audiences based on specific traits, demographics and behavior profiles. Our ad for an apartment rental excluded African Americans, Asian Americans and Spanish-speaking Hispanic audiences. It was approved in under a minute.

Housing, employment and credit are the three areas in which federal law prohibits discriminatory ads. However, the U.S. Department of Housing and Urban Development — the agency responsible for enforcing fair housing laws — told us that it has closed an inquiry into Facebook’s advertising policies, reducing pressure on the company to address the issue. In a 2015 newspaper column, Ben Carson, now HUD secretary, criticized “government-engineered attempts to legislate racial equality” in housing.

Facebook’s failure to police discriminatory rental ads flies in the face of its promises in February that it would no longer approve ads for housing, employment or credit that targeted racial categories. For advertising aimed at audiences not selected by race, Facebook said it would require housing, employment and credit advertisers to “self-certify” that their ads were compliant with anti-discrimination laws.

Based on Facebook’s announcement, the ads purchased by ProPublica that were aimed at racial categories should have been rejected. The others should have prompted a screen to pop up asking for self-certification. We never encountered a self-certification screen, and none of our ads were rejected by Facebook.

“This was a failure in our enforcement and we’re disappointed that we fell short of our commitments,” Ami Vora, vice president of product management at Facebook, said in an emailed statement. “The rental housing ads purchased by ProPublica should have but did not trigger the extra review and certifications we put in place due to a technical failure.”

Vora added that Facebook’s anti-discrimination system had “successfully flagged millions of ads” in the credit, employment and housing categories and that Facebook will now begin requiring self-certification for ads in all categories that choose to exclude an audience segment. “Our systems continue to improve but we can do better,” Vora said.

About 37 percent of U.S. households rented in 2016, representing a 50-year high, according to the Joint Center for Housing Studies of Harvard University. On average, renters earn about half as much as homeowners, and the percentage of families with children that rent rather than buy has increased sharply in the past decade, the study said. Minority renters have long faced pervasive housing discrimination. A 2013 study by HUD found that real estate agents show more units to whites than to African Americans, Asians and Latinos.

Facebook has long been a popular destination for rental listings, on pages hosted by real estate brokers, property owners and building managers. Earlier this month, Facebook announced that it had added two large providers of rental listings to its Facebook Marketplace service. “Marketplace is a popular place for people to look for a home to rent,” Facebook product manager Bowen Pan said in a press release.

Facebook warns rental advertisers in its Marketplace section that “listings that discriminate against a protected class can be reported and will be removed from Facebook.”

Facebook’s anti-discrimination initiative was prompted by an article published last year by ProPublica. For that story, we bought a Facebook ad targeting house hunters. We were able to use Facebook’s features to block the ad from being shown to anyone with an “affinity” for African American, Asian American or Hispanic people. Our ability to narrow the audience based on race raised the question of whether such ads violated the Fair Housing Act.

After ProPublica’s article appeared in the fall of 2016, HUD, then under the Obama administration, began examining Facebook’s practices. Facebook then said it would build an automated system to spot ads that discriminate illegally. “We take these issues seriously,” Facebook Vice President Erin Egan wrote in a blog post. “Discriminatory advertising has no place on Facebook.”

In February, Facebook announced it had built its system and was rolling it out. The press lauded the announcement: “Facebook cracks down on ads that discriminate” was the Washington Post’s headline.

Facebook has been under fire for other aspects of its automated ad buying system as well. Two months ago, the company disclosed that it had discovered $100,000 worth of divisive political ads placed by “inauthentic” Russian accounts. And in September, ProPublica reported that Facebook’s ad targeting system allowed buyers to reach people who identified themselves as “Jew haters” and other anti-Semitic categories. Facebook pledged to remove the offending categories and to hire thousands more employees to enforce its ad policies.

“We’re adding additional layers of review where people use potentially sensitive categories for targeting,” Facebook General Counsel Colin Stretch said during Senate testimony earlier this month.

After Stretch’s public statement, we wondered whether the ability to buy discriminatory housing ads had really been addressed. So we set out to buy an advertisement with the exact same targeting parameters as the ad we bought last year. The ad promoted a fictional apartment for rent and was targeted at people living in New York, ages 18–65, who were house hunting and likely to move. We asked Facebook not to show the ad to people categorized under the “multicultural affinity” of Hispanic, African American or Asian American.
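For illustration only, the targeting choices described in the preceding paragraph can be restated as a simple data structure. This is a hypothetical sketch in Python, not Facebook’s actual ad-buying interface or its Marketing API; every field name below is invented solely to summarize the selections we made in the portal.

```python
# Hypothetical restatement of the ad's targeting parameters.
# Field names are illustrative and do NOT reflect Facebook's real
# Marketing API; they simply mirror the choices made in the portal.
ad_targeting = {
    "location": "New York",          # people living in New York
    "age_min": 18,
    "age_max": 65,
    "interests": ["house hunting", "likely to move"],
    # Audience segments the ad was asked NOT to be shown to:
    "exclusions": {
        "multicultural_affinity": [
            "Hispanic",
            "African American",
            "Asian American",
        ]
    },
}
```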

(ProPublica generally forbids impersonation in news gathering. We felt in this instance that the public interest in Facebook’s ad system justified the brief posting of a fake ad for non-existent housing. We deleted each ad as soon as it was approved.)

The only changes from last year that we could identify in Facebook’s ad buying system were that the category called “Ethnic Affinity” had been renamed “Multicultural Affinity” and was no longer part of “Demographics.” It is now designated as part of “Behaviors.”

Our ad was approved within minutes.

Left: A screenshot of ad targeting categories ProPublica submitted and Facebook approved in 2016. Right: Categories ProPublica submitted and Facebook approved in 2017, raising questions about what the social network has done to police discriminatory ads.

Then we decided to test whether we could purchase housing ads that discriminated against other protected categories of people under the Fair Housing Act.

We placed ads that sought to exclude members of as many of the protected categories as we could find in Facebook’s self-service advertising portal. In addition to those mentioned above, we bought ads that were blocked from being shown to “soccer moms,” people interested in American sign language, gay men and Christians.

We also tested whether it was possible to use geography as a way to target racial groups — a practice known as redlining. We bought a housing ad that targeted ZIP codes in Brooklyn whose residents are more than 50 percent non-Hispanic white people, according to the U.S. Census Bureau. By definition, that meant the ad was not shown to Facebook users living in Brooklyn neighborhoods where minorities are a majority of the residents.

Facebook drew blue lines around our target neighborhoods and told us our “audience selection is great!” It approved the ad.
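The ZIP code selection step in that test can be reproduced from public Census data. Here is a minimal sketch, assuming a CSV file of Census population estimates for Brooklyn ZIP codes; the file name and column names are hypothetical, and any table with equivalent fields would work the same way.

```python
import csv

# Minimal sketch of the ZIP code screen described above: keep Brooklyn
# ZIP codes whose population is more than 50 percent non-Hispanic white.
# The file name and column names are hypothetical stand-ins for any
# table of Census population estimates with these fields.
def majority_white_zips(path="brooklyn_census_estimates.csv"):
    selected = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total = int(row["total_population"])
            nh_white = int(row["non_hispanic_white"])
            if total > 0 and nh_white / total > 0.5:
                selected.append(row["zip_code"])
    return selected

print(majority_white_zips())
```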


Julia Angwin

Julia Angwin is a senior reporter at ProPublica. From 2000 to 2013, she was a reporter at The Wall Street Journal, where she led a privacy investigative team that was a finalist for a Pulitzer Prize in Explanatory Reporting in 2011 and won a Gerald Loeb Award in 2010.


Ariana Tobin

Ariana is the crowdsourcing and engagement team editor at ProPublica, where she works to cultivate communities to inform our coverage.
