
Machine Bias

There’s software used across the country to predict future criminals. And it’s biased against blacks.

All 28 Stories

California to Investigate Racial Discrimination in Auto Insurance Premiums

The state’s insurance department is following up on our findings that eight auto insurers charge more in minority neighborhoods than in other neighborhoods with similar risk.

Lawmakers Seek Stronger Monitoring of Racial Disparities in Car Insurance Premiums

In response to our report that minority neighborhoods pay higher premiums than white areas with the same risk, six members of Congress and two Illinois state senators are pushing for closer scrutiny of insurance practices.

Minority Neighborhoods Pay Higher Car Insurance Premiums Than White Areas With the Same Risk

Our analysis of premiums and payouts in California, Illinois, Texas and Missouri shows that some major insurers charge minority neighborhoods as much as 30 percent more than other areas with similar accident costs.

Chicago Area Disparities in Car Insurance Premiums

Some car insurers charge higher premiums in Chicago’s minority neighborhoods than in predominantly white neighborhoods with similar risk of accidents.

How We Examined Racial Discrimination in Auto Insurance Prices

Read our methodology.

Bias in Criminal Risk Scores Is Mathematically Inevitable, Researchers Say

ProPublica’s analysis of bias against black defendants in criminal risk scores has prompted research showing that the disparity can be addressed — if the algorithms focus on the fairness of outcomes.

Facebook Doesn’t Tell Users Everything It Really Knows About Them

The site shows users how Facebook categorizes them. It doesn’t reveal the data it is buying about their offline lives.

Facebook Says It Will Stop Allowing Some Advertisers to Exclude Users by Race

Facebook says it will build a system to prevent advertisers from buying credit, housing or employment ads that exclude viewers by race.

Where Traditional DNA Testing Fails, Algorithms Take Over

Powerful software is solving more crimes and raising new questions about due process.

Facebook Lets Advertisers Exclude Users by Race

Facebook’s system allows advertisers to exclude users with black, Hispanic, and other “ethnic affinities” from seeing ads.

Breaking the Black Box: How Machines Learn to Be Racist

Artificial intelligence is only as good as the patterns we teach it. To illustrate the sensitivity of AI systems, we built an AI engine that deduced synonyms from news articles published by different types of news organizations.

Breaking the Black Box: When Machines Learn by Experimenting on Us

As we enter the era of artificial intelligence, machines regularly conduct experiments on human behavior. Here’s a look at how software used by the New York Times and New York Post uses you to test their headlines.

Breaking the Black Box: When Algorithms Decide What You Pay

The phone you use, the computer you own and the ZIP code you live in can all be factors in what prices you see when shopping online. Welcome to the world of mass customization.

Breaking the Black Box: What Facebook Knows About You

We live in an era of increasing automation. But as machines make more decisions for us, it is increasingly important to understand the algorithms that produce their judgments.

Amazon Defends Its Pricing Algorithm, But Leaves Out Billions in Sales

Amazon says the vast majority of its sales don’t obscure shipping costs. The remainder still adds up to a lot of sales.

Amazon Says It Puts Customers First. But Its Pricing Algorithm Doesn’t

Amazon bills itself as “Earth’s most customer-centric company.” Yet its algorithm is hiding the best deal from many customers.

How We Analyzed Amazon’s Shopping Algorithm

We examined the listings for 250 bestselling products across a wide range of categories, from electronics to household supplies, over a period of several weeks this summer.

Making Algorithms Accountable

As algorithms control more aspects of our lives, we need to be able to challenge them.

ProPublica Responds to Company’s Critique of Machine Bias Story

Northpointe asserts that a software program it sells that predicts the likelihood a person will commit future crimes is equally fair to black and white defendants. We re-examined the data, considered the company’s criticisms, and stand by our conclusions.

Technical Response to Northpointe

Wisconsin Court: Warning Labels Are Needed for Scores Rating Defendants’ Risk of Future Crime

The court said judges can look at the scores – so long as their limitations are made clear.

The Senate’s Popular Sentencing Reform Bill Would Sort Prisoners By ‘Risk Score’

Federal prisoners would be scored according to their risk of recidivating. Those who got high scores would be ineligible for sentence reduction.

How We Decided to Test Racial Bias in Algorithms

Podcast: Our reporters talk about how they uncovered racial bias in software used to predict future criminals.

What Algorithmic Injustice Looks Like in Real Life

A computer program rated defendants’ risk of committing a future crime. These are the results.

Machine Bias

There’s software used across the country to predict future criminals. And it’s biased against blacks.

How We Analyzed the COMPAS Recidivism Algorithm

What We Know About the Computer Formulas Making Decisions in Your Life

Algorithms make a lot more decisions in our lives than what news we read on Facebook. Here’s a reading guide.

Uber’s Surge Pricing May Not Lead to a Surge in Drivers

Uber’s surge pricing doesn’t necessarily increase the availability of rides. It just makes them more expensive.

When Big Data Becomes Bad Data

Corporations are increasingly relying on algorithms to make business decisions, and that raises new legal questions.

The Tiger Mom Tax: Asians Are Nearly Twice as Likely to Get a Higher Price from Princeton Review

One unexpected effect of the company’s geographic approach to pricing is that Asians are almost twice as likely to be offered a higher price than non-Asians, an analysis by ProPublica shows.
