Not Shutting Up

Occasional notes about the issues facing journalism and American democracy, from ProPublica’s leadership.


How platforms moderate speech (or don’t).
By Stephen Engelberg

Welcome to Not Shutting Up, a newsletter from ProPublica’s leadership. You’re receiving this because you’ve supported ProPublica’s journalism; we’re grateful for that, and we hope to give you some context on how our newsroom works. If this email was forwarded to you, you can sign up to receive it here.


Like many of you, I’ve spent a lot of time during the past few months frustrated by the spread of disinformation on social media. As 2020 crawls to a conclusion at the speed of a Georgia vote count, I thought it might be useful to ask an expert to explain what passes for rules on the major platforms. Fortunately, we have just the right person on staff at ProPublica’s virtual headquarters:

I’m Kengo Tsutsumi, ProPublica’s platform editor and a member of our audience team. I spend my time thinking about how best to get our journalism in front of people and where we should focus to make that happen. There are lots of ways to do this, and social media is one of them. I’m as exhausted as the rest of you after the past few weeks, but to help make things less confusing, I thought it might be helpful to share some things I know to be true about social platforms.

It might seem like social media is the Wild West, with hardly any sheriffs and rules that seemingly change from day to day. That’s not quite true, though it has certainly looked that way over the past four years. In fact, the biggest platforms — Twitter, Facebook, Instagram — do have rules, which are enforced ... extremely selectively. And therein lies the problem.

Twitter, for example, has “terms of use” that, among other things, bar people from threatening violence or “hateful conduct.” For example, in November, Steve Bannon was permanently banned from Twitter (but not Facebook!) for calling for Anthony Fauci’s beheading on his podcast. Rapper Talib Kweli’s account was likewise permanently suspended over the summer for targeted harassment. In October, Charlie Kirk, a conservative activist, was temporarily suspended from Twitter for spreading false voter information when he shared a ProPublica and Philadelphia Inquirer story, incorrectly stating that it showed Pennsylvania had rejected 372,000 mail-in ballots. (We reported that 372,000 ballot requests, most of which were duplicates, were rejected, not ballots themselves.)

Up to this point, it all probably sounds pretty logical. But even setting aside the fact that lots of tweets flagged for breaking the rules stay posted, the system really collapses when someone with, say, 85 million followers who happens to be president of the United States starts tweeting outright lies. By any of Twitter’s definitions (even under the public service exceptions Twitter outlines for world leaders), President Donald Trump could have been kicked off the platform many months ago: for example, when he called for shooting Black Lives Matter protesters (the tweet was flagged but remained on the platform), when he explicitly threatened other countries and their leaders, or any of the number of times he insulted or attacked individuals.

There was clearly little stomach for removing the nation’s leader from social media. Instead, by the 2020 election both Facebook and Twitter chose to label presidential posts that spread clearly untrue election information as “disputed.” At times, the platforms appended factual information completely at odds with the president’s version of reality. That left us with a number of posts that looked like this:

"I won the election!"

You might be asking yourself, Why can Twitter and Facebook just make up rules and enforce them inconsistently? For one thing, they are companies, and courts have given corporations broad latitude to set rules of behavior on their own property, which in this case is their websites. The basic parameters were spelled out in the foundational statute of the modern internet: the Communications Decency Act of 1996, specifically Section 230. The internet was in its infancy back then; Mark Zuckerberg was 12 years old and wouldn’t write the first code for Facebook for nearly another decade. Much of the web involved message boards on which people posted whatever came to mind, including the sorts of lies that can get a newspaper or magazine sued for libel.

Entire books have been written about Section 230, but essentially what it says is that the provider of an interactive computer service (e.g., Facebook and Twitter) is not responsible for what users of the service post. As the owner of the service, though, the company is allowed to create and enforce certain standards. This protects the owners of the service from being responsible for content that breaks the law while allowing them the freedom to suppress or remove content that they do not want to host for a wide variety of reasons. In other words, the platforms have a lot of leeway.

Similarly, the platforms are entirely free to make rules that suppress or boost stories written by media organizations. If they decide the best way to make money is to show you fewer news stories and more political posts from your friends and relatives, they’re entirely free to do so. A few years ago, that’s exactly what they did.

Independent research shows that Facebook does not suppress conservative content. But complaints about “censorship” — which I think you’ll continue to hear in the coming years — miss the crucial point: these companies make decisions about what you see all the time, with little transparency about what they choose or why.

Earlier this year, the computer algorithm that powers Facebook decided that ProPublica was posting clickbait, stories with flashy headlines that promise salacious or highly newsworthy content but don’t deliver either. Our Facebook account was suppressed for weeks while the reps we work with at Facebook tried to find a human who could help. Ultimately, Facebook could never explain the source of the problem to us and we had no recourse.

Over the past year, my colleagues on ProPublica’s audience team and I have pursued a strategy of using social media platforms to share the “receipts” that underlie our often-complex investigative stories. Sometimes these documents are actual receipts; sometimes they’re images, letters or videos. It’s my hope that you’ll continue to see that material on Facebook, Twitter and elsewhere in the coming months and years. Unfortunately, what you see is ultimately not up to people like me.

Have a great week,

Kengo
