
We Have Some Follow-Ups for Facebook — And We Want Your Help

Senators held Facebook’s Mark Zuckerberg to account today, grilling him while often citing our investigations. You can help keep Facebook accountable, too.

Facebook CEO Mark Zuckerberg arrives to testify before a joint hearing of the Senate Judiciary and Commerce committees on the protection of user data on April 10, 2018. (Tom Williams/CQ Roll Call)

Mark Zuckerberg just gave almost five hours’ worth of testimony on Capitol Hill. This was a sit-down senators had long requested, and he had to answer years’ worth of built-up concerns.

The lawmakers appear to have been reading our work.

They grilled the Facebook founder about our investigations involving abusive ad targeting, housing and employment discrimination and hate speech.

Here are some takeaways — and follow-up questions we still need the public’s help to answer:

Political Ads

The senators pressed Zuckerberg about Russia, of course, and asked whether Facebook would support the Honest Ads Act, which would require disclosures for political ads online.

What we need is your help keeping political ads on Facebook honest right now. You can do that with our Facebook Political Ad Collector.

The project is important because Facebook’s advertising tools make it easy for politicians and interest groups to show ads just to a specific target audience without necessarily telling the rest of us. This makes it very difficult to fact-check what the ad-buyers are saying — or even to make sure they’re complying with the federal laws governing elections.

We’ve built the tool to help us monitor ad transparency on Facebook. It works by copying the ads you see on Facebook, so that anyone, on any part of the political spectrum, can see them.
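For readers curious about the mechanics, here is a minimal, hypothetical sketch in TypeScript of how a collector-style browser extension can work: a content script watches the feed for posts labeled “Sponsored” and sends a copy of each to a collection server. The selector, endpoint and payload shape below are illustrative assumptions, not the Ad Collector’s actual code.

```typescript
// Hypothetical content-script sketch of a political ad collector.
// Assumptions: sponsored posts are <div role="article"> elements containing the
// word "Sponsored", and COLLECTOR_ENDPOINT is a placeholder, not a real URL.

const COLLECTOR_ENDPOINT = "https://example.org/ads"; // placeholder

function findSponsoredPosts(): HTMLElement[] {
  const posts = Array.from(document.querySelectorAll<HTMLElement>("div[role='article']"));
  return posts.filter((post) => post.innerText.includes("Sponsored"));
}

async function submitAd(post: HTMLElement): Promise<void> {
  // Copy the rendered ad so anyone, on any part of the political spectrum, can review it.
  const payload = {
    html: post.outerHTML,
    collectedAt: new Date().toISOString(),
  };
  await fetch(COLLECTOR_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
}

// Re-scan periodically as the user scrolls and new posts load.
const seen = new WeakSet<HTMLElement>();
setInterval(() => {
  for (const post of findSponsoredPosts()) {
    if (!seen.has(post)) {
      seen.add(post);
      submitAd(post).catch(console.error);
    }
  }
}, 5000);
```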

We want to make sure we’re collecting ads in every part of the world where politicians might be up to something (a.k.a. everywhere). All you need is a desktop computer running the Chrome or Firefox browser. We just put up a guide explaining how to install it here.

If you hit any hurdles installing it, please let us know at [email protected]! And then: Please encourage your friends and family to install it. Get your grandparents to do it. Your brothers-in-law. Your high school classmates you haven’t talked to in years. The more people in more parts of the country, the better.

The ads can shed important light on candidates’ strategies — and who interest groups think their supporters are. Here are some ads we’ve found already:

  • This inflammatory ad about “chain migration” that Donald Trump’s re-election campaign shows to people in an “audience of people they want to reach,” according to Facebook’s explanation.

An ad from President Donald Trump’s re-election campaign with inflammatory messaging around so-called “chain migration” — and which was targeted to an opaque subset of Americans, according to Facebook’s explanation. The ad was submitted by a ProPublica reader as part of our Facebook Political Ad Collector project.

  • Candidates targeting their ads with data sourced from data brokers — perhaps using information from users’ offline lives.

Left: An ad from Sen. Martin Heinrich, D-N.M. Right: An ad from Cathy Myers, who is seeking the Democratic nomination to run against Rep. Paul Ryan of Wisconsin. Both ads were targeted to people who are, according to Facebook, “part of an audience created based on data provided by Experian,” which is a data broker.

An ad from a Florida interest group seeking to limit the expansion of casino gambling in that state, targeted to men at least 65 years old, according to Facebook’s explanation. The ad was submitted by a ProPublica reader as part of our Facebook Political Ad Collector project.

If you see ads we should look into, we’d love to hear about them. You can see our full set here. Stay tuned for more.

Hate Speech and Content Moderation

Sen. Ben Sasse, R-Neb., asked Mark Zuckerberg to define hate speech. Zuckerberg said it’s “a really hard question and I think it’s one of the reasons why we struggle with it.”

Facebook’s content moderators are actually required to use a very specific definition of hate speech. The company’s policies are spelled out in training materials breaking groups into “protected categories” and “non-protected categories.” Last year, we revealed that the rules protected white men, but not black children because age, unlike race and gender, was not a protected category. (In response to our article, Facebook added age to its protected characteristics.)
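To make that rule concrete, here is a rough TypeScript illustration based on our reporting; it is hypothetical code, not Facebook’s, and the category list is abbreviated. A group counted as protected only if every attribute describing it fell into a protected category, so mixing in a non-protected attribute such as age stripped the protection.

```typescript
// Hypothetical illustration of the moderation rule as reported; not Facebook's code.
// A group is shielded from hate speech only when *every* attribute describing it
// belongs to a protected category.

const PROTECTED_CATEGORIES = new Set([
  "race", "sex", "gender identity", "religious affiliation",
  "national origin", "ethnicity", "sexual orientation", "serious disability or disease",
]);

type Attribute = { value: string; category: string };

function isProtectedGroup(attributes: Attribute[]): boolean {
  return attributes.every((a) => PROTECTED_CATEGORIES.has(a.category));
}

// "White men": race + sex, both protected, so the group was protected.
isProtectedGroup([
  { value: "white", category: "race" },
  { value: "men", category: "sex" },
]); // true

// "Black children": race + age; age was not a protected category at the time,
// so the group was not protected.
isProtectedGroup([
  { value: "black", category: "race" },
  { value: "children", category: "age" },
]); // false
```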

While the company’s rules are highly specific, Facebook doesn’t share them with users. Moreover, enforcement isn’t consistent.

We know because we collected examples from nearly 1,000 Facebook users, selected 49 posts and asked the company to explain why each had been left up or taken down. The company admitted that reviewers had made mistakes in 22 cases.

Take a look at what we found. (Warning: some posts are disturbing.)

When we talked with Facebook representatives in December, they told us that abuse in the private Messenger feature is handled differently than comments or posts in the News Feed. They said this was in part due to privacy concerns.

Last month, however, Bloomberg reported that the company scans what you send to other people on its Messenger app. So if you have a story about a Messenger-based hate speech attack from the past few months, please let us know.

Fair Housing and Employment Law

Other senators asked Zuckerberg about advertisers potentially violating housing, employment and credit laws using the site’s ad targeting system.

It’s illegal to post a housing ad that excludes certain categories of people. Thanks to the Fair Housing Act, you can’t get away with putting an ad in a newspaper that says “no black people allowed.” But on Facebook? Until recently, there was nothing stopping you.

We know, because we tried. Way back in 2016, we discovered that Facebook allowed advertisers to exclude users by race, even in housing ads. Facebook apologized and promised to add more safeguards to the system. Then, in 2017, we managed to buy the same ad. Exactly the same ad.

Fair housing groups have said they could still buy discriminatory ads without a hitch as recently as February.

Ad Targeting in General

We also managed to buy ads that were just flat-out discriminatory, targeting categories of users labeled “Jew hater,” “How to burn jews” or “History of ‘why jews ruin the world.’” Until we flagged these categories, Facebook enabled advertisers to direct their pitches to the news feeds of almost 2,300 people who had expressed interest in those topics.

Zuckerberg says the solution is “to develop more AI tools.” But in response to our reporting, the company promised to hire more human reviewers.

We know a lot of our readers have spent time in Facebook’s advertising portal. If you notice something, we’d like to hear about it.

Nearly two years ago, we published a series explaining the algorithms we encounter every day. We created a browser extension that showed users the various ad categories Facebook put them in. It also allowed those users to share the categories with us. We collected more than 52,000 unique attributes that Facebook used — everything from “Pretending to Text in Awkward Situations” to “Breastfeeding in Public.”

The crowdsourced project allowed us to figure out that while Facebook shows users how it categorizes them, it doesn’t reveal the data it buys about their offline lives.

You can download all the ad categories we crowdsourced.
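If you want to poke around in that data yourself, here is a minimal Node/TypeScript sketch for loading it and counting the unique attributes. The file name and the one-category-per-line layout are assumptions; adjust them to match the format of the actual download.

```typescript
// Minimal sketch for exploring the crowdsourced ad-category data.
// Assumption: a plain-text export named "facebook-ad-categories.txt" with one
// category per line; the real download may be structured differently.
import { readFileSync } from "fs";

const categories = readFileSync("facebook-ad-categories.txt", "utf8")
  .split("\n")
  .map((line) => line.trim())
  .filter((line) => line.length > 0);

const unique = new Set(categories);
console.log(`${unique.size} unique ad categories`);

// Spot-check a couple of the oddly specific attributes mentioned above.
for (const needle of ["Pretending to Text in Awkward Situations", "Breastfeeding in Public"]) {
  console.log(needle, unique.has(needle) ? "present" : "not found in this export");
}
```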
